
356 points joaovcoliveira | 11 comments

Hello everyone!

At a company I worked for, we needed to develop an MVP (basically a web page) and apply certain business logic to a Google Drive spreadsheet that was frequently updated by the Sales team.

In this case, we had two options:

1. Develop a backend to replace the current spreadsheet and have the Sales team use it as a new "backoffice". This would take a very long time, and if the hypothesis we were testing was wrong, that time would be wasted.

2. Build the web page and use Google's SDK to extract data from the spreadsheet.

We chose the second option because it was quicker. It was indeed much faster than building a new backoffice, but not as quick as we imagined. Integrating with Google's SDK takes real effort, especially handling the OAuth logic, configuring it in the console, and working through the documentation (which is quite shallow, by the way).
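For anyone weighing the same trade-off: reading a sheet through the plain Sheets v4 REST endpoint avoids most of the SDK setup. A minimal sketch of the call involved, assuming a public sheet readable with just an API key (the IDs below are placeholders; a private sheet needs an OAuth token instead):

```javascript
// Build the URL for the Sheets v4 "values.get" REST endpoint.
// The range (e.g. "Sheet1!A1:C10") must be URL-encoded.
const sheetsValuesUrl = (spreadsheetId, range, apiKey) =>
  `https://sheets.googleapis.com/v4/spreadsheets/${spreadsheetId}` +
  `/values/${encodeURIComponent(range)}?key=${apiKey}`;

// Usage (placeholder IDs; assumes Node 18+ for global fetch):
// const res = await fetch(sheetsValuesUrl("SPREADSHEET_ID", "Sheet1!A1:C10", "API_KEY"));
// const { values } = await res.json(); // values is a 2-D array of cell strings
```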

Anyway! We finished the project, and I realized other devs might have run into similar issues. So I built a tool that turns Google spreadsheets into "realtime APIs" with GET, POST, PATCH, and DELETE methods.

Since it's a product for devs, I think it would be cool to hear your opinions. It's still quite primitive, but the basic features already work.

https://zerosheets.com

ctrlaltdylan ◴[] No.40016131[source]
Is the Google Sheets API rate limit open enough for actual production use?

I thought it was pretty restrictive, no more than 60 writes per minute, but I'm not sure about the read limits.

replies(4): >>40016321 #>>40016486 #>>40018172 #>>40018854 #
r00fus ◴[] No.40016321[source]
Couldn't you cache the reads? Not many use cases really require real-time data from their data store.
replies(1): >>40016343 #
1. RockRobotRock ◴[] No.40016343[source]
Do you really want to deal with caching logic for what should be a simple API call? Sounds like a convincing argument to use whatever this product is.
replies(2): >>40017430 #>>40018874 #
2. internetter ◴[] No.40017430[source]
client = async (call) =>
  (await redis.get(sha1(call))) ??
  (async () => { const res = await api(call); await redis.set(sha1(call), res); return res; })();

Not that hard. Like 10 lines of code to get a decentish cache going.

replies(1): >>40018106 #
3. hot_gril ◴[] No.40018106[source]
Assuming you have Redis
replies(2): >>40018214 #>>40018984 #
4. internetter ◴[] No.40018214{3}[source]
Redis uses like 5mb of baseline RAM and can be deployed in a few lines of docker-compose.
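For reference, the few lines of docker-compose in question might look something like this (the image tag and port mapping are assumptions, not anything from the thread):

```yaml
services:
  redis:
    image: redis:7-alpine   # small Alpine-based Redis image
    ports:
      - "6379:6379"         # expose the default Redis port locally
```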
replies(2): >>40018307 #>>40018327 #
5. ◴[] No.40018307{4}[source]
6. hot_gril ◴[] No.40018327{4}[source]
I'm not allowed to do that where I work. License is a no-no, can't run jobs without red tape, and there's no Docker either.
replies(1): >>40018336 #
7. internetter ◴[] No.40018336{5}[source]
Ok? I'm surprised your work lets you build a whole product on top of Google Sheets, then. Also, why did you delete your original comment on not having a server?
replies(1): >>40018430 #
8. hot_gril ◴[] No.40018430{6}[source]
I deleted it cause I realized this thing has a server (probably). Was mixing it up with other people's projects that didn't have one.

They're internal tools, but big ones. And I'm surprised too. You won't hit too much resistance doing things the well-supported ways, but for some reason there's no well-supported way to run a cache.

9. kellpossible2 ◴[] No.40018874[source]
It's a couple of hours' work at most to cache in some local database like SQLite, or just in memory.
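A sketch of the in-memory variant, with a TTL so reads stay fresh without hammering the quota (fetchFn and the TTL value are stand-ins for the real Sheets call and refresh interval):

```javascript
// Wrap an async fetch function in an in-memory cache with a time-to-live.
// Repeated reads of the same key within ttlMs hit the cache, not the API.
function makeCachedReader(fetchFn, ttlMs = 60000) {
  const cache = new Map(); // key -> { value, expires }
  return async (key) => {
    const hit = cache.get(key);
    if (hit && hit.expires > Date.now()) return hit.value;
    const value = await fetchFn(key);
    cache.set(key, { value, expires: Date.now() + ttlMs });
    return value;
  };
}
```

Evicting stale entries and invalidating on writes is extra work, but for a read-mostly sheet this alone keeps you well under the write-per-minute quota discussed above.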
10. randomdata ◴[] No.40018984{3}[source]
Hell, just stick the data in memory.
replies(1): >>40019218 #
11. hot_gril ◴[] No.40019218{4}[source]
Valid strategy