
356 points joaovcoliveira | 19 comments

Hello everyone!

At a company I worked for, we needed to develop an MVP (basically a web page) and apply certain business logic to a Google Drive spreadsheet that was frequently updated by the Sales team.

In this case, we had two options:

1. Develop a backend to replace the current spreadsheet and have the Sales team use it as a new "backoffice". This would take a very long time, and if the hypothesis we were testing was wrong, it would be time wasted.

2. Create the web page and use Google's SDK to extract data from the spreadsheet.

We chose to go with the second option because it was quicker. Indeed, it was much faster than creating a new backoffice. But not as quick as we imagined. Integrating with Google's SDK requires some effort, especially to handle the OAuth logic, configure it in the console, and understand the documentation (which is quite shallow, by the way).
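For a sense of the shape of that integration: the Sheets API returns a range as a list of row lists, and turning that into usable records is typically hand-rolled. The helper below is my own illustration, not part of Google's SDK:

```python
# The Sheets API's spreadsheets.values.get returns a range as a list of
# row lists, e.g. {"values": [["name", "email"], ["Ada", "ada@example.com"]]}.
# A hypothetical helper to turn that into records keyed by the header row:

def rows_to_records(values):
    """Convert a 'values' payload (header row first) into a list of dicts."""
    if not values:
        return []
    header, *rows = values
    # The API omits trailing empty cells, so pad short rows.
    return [dict(zip(header, row + [""] * (len(header) - len(row))))
            for row in rows]

# With google-api-python-client, `values` would come from something like:
#   service.spreadsheets().values().get(
#       spreadsheetId=SHEET_ID, range="Sheet1!A:Z").execute().get("values", [])
print(rows_to_records([["name", "email"], ["Ada", "ada@example.com"], ["Bob"]]))
```

The OAuth setup and console configuration mentioned above sit in front of that `service` object and are where most of the effort goes.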

Anyway! We did the project and I realized that maybe other devs might have encountered similar issues. Therefore, I developed a tool that transforms Google spreadsheets into "realtime APIs" with PATCH, GET, POST, and DELETE methods.

Since it's a product for devs, I think it would be cool to hear your opinions. It's still quite primitive, but the basic features already work.

https://zerosheets.com

1. ctrlaltdylan ◴[] No.40016131[source]
Is the Google Sheets API rate limit open enough for actual production use?

I thought it was pretty restrictive, no more than 60 writes per minute, but I'm not sure about the read limits.

replies(4): >>40016321 #>>40016486 #>>40018172 #>>40018854 #
2. r00fus ◴[] No.40016321[source]
Couldn't you cache the reads? Not many usages really require real-time from their data store.
replies(1): >>40016343 #
3. RockRobotRock ◴[] No.40016343[source]
Do you really want to deal with caching logic for what should be a simple API call? Sounds like a convincing argument to use whatever this product is.
replies(2): >>40017430 #>>40018874 #
4. joaovcoliveira ◴[] No.40016486[source]
For now I'm setting no restrictions. Since it's an MVP, I'm trying to understand what a basic user and a heavy user look like. After a while, I'll figure out how to charge for it and what limitations free and paid users should have.

My Google API rate limit is much higher than 60/minute.

5. internetter ◴[] No.40017430{3}[source]
client = async (call) => await redis.get(sha1(call)) ||
  api(call).then(res => { redis.set(sha1(call), res); return res })

Not that hard. Like 10 lines of code to get a decentish cache going.

replies(1): >>40018106 #
6. hot_gril ◴[] No.40018106{4}[source]
Assuming you have Redis
replies(2): >>40018214 #>>40018984 #
7. stephenbez ◴[] No.40018172[source]
I used Google Sheets as a data source that business people could update, but eventually we moved away from it as we found it unreliable. We would get an occasional error (maybe a 429) even though we were polling the sheet once a minute (we had a few other sheets that polled once every few minutes).

This manifested as an issue during deploys, when we were unable to fetch critical data. We added retries and the like, but it seemed unwise to run a business off something that isn't designed for this purpose.

replies(1): >>40020584 #
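A retry-with-backoff wrapper of the kind mentioned above can stay small; this is a generic sketch, with the flaky function standing in for a Sheets API call that raises on 429/500:

```python
import random
import time

def with_backoff(call, retries=5, base=0.5, sleep=time.sleep):
    """Retry `call` on exceptions (e.g. 429/500 responses raised as errors)
    with exponential backoff plus jitter; re-raise after the last attempt."""
    for attempt in range(retries):
        try:
            return call()
        except Exception:
            if attempt == retries - 1:
                raise
            sleep(base * 2 ** attempt + random.uniform(0, 0.1))

# A stand-in for a flaky Sheets API call: fails twice, then succeeds.
attempts = []
def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise RuntimeError("429 rate limited")
    return "ok"

print(with_backoff(flaky, sleep=lambda s: None))  # → ok
```

Retries paper over transient errors, but as the comment notes, they don't fix a data source that wasn't designed to be one.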
8. internetter ◴[] No.40018214{5}[source]
Redis uses like 5mb of baseline RAM and can be deployed in a few lines of docker-compose.
replies(2): >>40018307 #>>40018327 #
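For reference, the "few lines of docker-compose" might look like this (image tag and port mapping are illustrative; Redis releases through 7.2 were BSD-licensed, so check the license on newer tags):

```yaml
services:
  cache:
    image: redis:7.2-alpine
    ports:
      - "6379:6379"
```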
10. hot_gril ◴[] No.40018327{6}[source]
I'm not allowed to do that where I work. License is a no-no, can't run jobs without red tape, and there's no Docker either.
replies(1): >>40018336 #
11. internetter ◴[] No.40018336{7}[source]
Ok? I'm surprised your work lets you build a whole product on top of Google Sheets, then. Also, why did you delete your original comment about not having a server?
replies(1): >>40018430 #
12. hot_gril ◴[] No.40018430{8}[source]
I deleted it cause I realized this thing has a server (probably). Was mixing it up with other people's projects that didn't have one.

They're internal tools, but big ones. And I'm surprised too. You won't hit too much resistance doing things the well-supported ways, but for some reason there's no well-supported way to run a cache.

13. elondaits ◴[] No.40018854[source]
Yes. I used Google Sheets as a database to build a website and ran into this issue. The worst part is that if you hit the limit, there's not much you can do but wait or rate-limit yourself.

Another problem I had was an API change one year in.

I would not use Google Sheets again. Maybe I’d try Airtable, Notion, or some other similar platform where the API access is more of a priority to the company.

replies(2): >>40020449 #>>40020480 #
14. kellpossible2 ◴[] No.40018874{3}[source]
It's at most a couple of hours' work to cache in a local database like SQLite, or in memory.
15. randomdata ◴[] No.40018984{5}[source]
Hell, just stick the data in memory.
replies(1): >>40019218 #
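A minimal in-memory version of that idea; the TTL, key, and fetch function here are all illustrative, and the clock is injectable purely for testability:

```python
import time

class TTLCache:
    """Tiny in-memory cache: serve a cached value until it is older than
    `ttl` seconds, then refetch via the supplied `fetch` function."""
    def __init__(self, fetch, ttl=60, clock=time.monotonic):
        self.fetch, self.ttl, self.clock = fetch, ttl, clock
        self.store = {}  # key -> (expiry, value)

    def get(self, key):
        entry = self.store.get(key)
        if entry and entry[0] > self.clock():
            return entry[1]  # still fresh: serve from memory
        value = self.fetch(key)
        self.store[key] = (self.clock() + self.ttl, value)
        return value

# Usage: wrap a (hypothetical) sheet-reading call so repeated reads within
# a minute hit memory instead of the Sheets API:
calls = []
cache = TTLCache(lambda k: calls.append(k) or f"rows for {k}", ttl=60)
cache.get("Sheet1!A:Z")
cache.get("Sheet1!A:Z")
print(len(calls))  # → 1  (second read served from cache)
```

The obvious trade-off versus Redis is that the cache is per-process and empties on restart, which is usually fine for read-mostly sheet data.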
16. hot_gril ◴[] No.40019218{6}[source]
Valid strategy
17. yawnxyz ◴[] No.40020449[source]
For reading sheets, it's better to use the "share as CSV" option since that gets cached pretty well w/o limits
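A sketch of that approach, assuming a sheet that is shared or published to the web; the export URL pattern is the commonly used one (verify it against your sharing settings), and parsing the result is pure stdlib:

```python
import csv
import io
# from urllib.request import urlopen  # for the actual fetch

# A sheet published to the web can be downloaded as CSV from an export URL
# of this commonly used form:
#   https://docs.google.com/spreadsheets/d/<SHEET_ID>/export?format=csv&gid=0

def parse_sheet_csv(text):
    """Parse exported CSV text into a list of dicts keyed by the header row."""
    return list(csv.DictReader(io.StringIO(text)))

print(parse_sheet_csv("name,email\r\nAda,ada@example.com\r\n"))
```

This path is read-only, of course, so it complements rather than replaces the Sheets API for writes.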
18. gofreddygo ◴[] No.40020480[source]
I've resisted the temptation to integrate with Google APIs for these two specific reasons: rate limits and API changes.
19. coderintherye ◴[] No.40020584[source]
Perhaps the dreaded 503 Internal Error?

I'm convinced most of the people in this thread haven't tried working much with the Google Sheets API at scale. Most of the time it's fine, but then it will have days where 30-40% of calls (as measured by Google Cloud console API monitoring) throw an internal error, for which Google's advice is to "try again later". There are also API calls that take up to 4 minutes (?!) to return (again, as measured by their own API monitoring tools in the Cloud console).

It's too bad because I otherwise really like this approach.