At a company I worked for, we needed to develop an MVP (basically a web page) and apply certain business logic to a Google Drive spreadsheet that was frequently updated by the Sales team.
In this case, we had two options:
Develop a backend to replace the current spreadsheet and have the sales team use it as a new "backoffice" - This would take a very long time, and if the hypothesis we were testing was wrong, it would be time wasted.
Create the web page and use Google's SDK to extract data from the spreadsheet.
We chose to go with the second option because it was quicker. Indeed, it was much faster than creating a new backoffice. But not as quick as we imagined. Integrating with Google's SDK requires some effort, especially to handle the OAuth logic, configure it in the console, and understand the documentation (which is quite shallow, by the way).
Anyway! We did the project and I realized that maybe other devs might have encountered similar issues. Therefore, I developed a tool that transforms Google spreadsheets into "realtime APIs" with PATCH, GET, POST, and DELETE methods.
Since it's a product for devs, I think it would be cool to hear your opinions. It's still quite primitive, but the basic features already work.
I thought it was pretty restrictive, no more than 60 writes per minute, but I'm not sure about the read restrictions.
Do you have a sense for how much usage Google will allow on a given sheet or user? I.e. will Google shut down the API after a certain usage level?
(Side-note, your SSL cert seems to be having trouble, i.e. www.zerosheets.com is correctly encrypted, zerosheets.com is not.)
Last week I was looking for a place to store random data with API access, and was looking at making a Google Sheets backend, but PocketBase was easy and didn't have a 60 rpm quota [1].
Deploying to a cheap VPS was very easy with CapRover.
300 read requests per minute, 300 write requests per minute.
Since I've just launched it, that number is far, far away from being reached. (If it happens, the limit can be increased.)
Regarding the SSL cert, some other users pointed out the problem, but I can't replicate it. I tried with many different browsers. I'm not sure what's happening; it's always working for me, but I will keep digging.
Thank you for your comment bro :)
My infrastructure is 100% focused on scale, so I think we can work together; we just need to share the costs.
Please send me a message here https://www.zerosheets.com/contact so I can get your contact.
My Google API rate limit is way bigger than 60/minute.
From what I see, the Common Name for the cert at https://zerosheets.com is `*.ip-66-228-52-143.cloudezapp.io` rather than zerosheets.com. It's issued by Let's Encrypt. www.zerosheets.com is issued by Amazon. Hope that helps.
The solution? Export them all to a Google Sheet with one tag per column. Edit and apply-on-save. Filter, sort, summarize, pivot as much as you want. It's incredible.
To others seeing this - has anyone else ever tried this approach? Are there players out there who do this and have SOC 2 etc. data security measures?
- [0] https://thetechenabler.substack.com/i/142898781/making-a-sim...
Hope things have improved over that time.
I'm also surprised Google hasn't tried to kill this use-case. That's my main hesitation for using it on anything semi-serious.
Not that hard. Like 10 lines of code to get a decentish cache going.
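Something like this rough sketch, for instance (in-memory with a TTL; fetchSheet is just a stand-in for whatever call actually reads the sheet):

    // Rough sketch of an in-memory read-through cache with a TTL.
    // fetchSheet is a stand-in for whatever call actually reads the sheet.
    const cache = new Map();
    const TTL_MS = 60 * 1000; // serve cached data for up to a minute

    async function getSheetCached(sheetId, fetchSheet) {
      const hit = cache.get(sheetId);
      if (hit && Date.now() - hit.at < TTL_MS) return hit.data;
      const data = await fetchSheet(sheetId); // only hit Google on a miss or expiry
      cache.set(sheetId, { at: Date.now(), data });
      return data;
    }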
Based on the OP's description, I wonder if this company was spun up while already doing business with those companies.
Super easy for prototyping and getting the job done. Google Sheets as a backend is nice too, but I need authentication, etc.
I've used it for a lot of system data that needs to be modified by a few people at most. With a little bit of careful code and caching (I like validating and syncing to S3) you can easily use it as a CRUD frontend for important system data.
It can also make great ad hoc dashboards - plug into APIs (even private ones if you add custom Apps Script code) or push data updates to the sheets. I've had rather large reports auto-updated on a schedule with multiple views of the data (pivot tables, queries, lookups, etc.). Fully focused, customized views into the data needed to solve specific internal problems, with really quick delivery times.
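For example, a small Apps Script sketch along these lines (the internal endpoint, token property, and tab name are just placeholders) can refresh a dashboard tab from a private API on a time-driven trigger:

    // Apps Script sketch: pull rows from a (hypothetical) internal API into a
    // "raw_data" tab; attach it to a time-driven trigger to refresh on a schedule.
    function refreshDashboard() {
      const token = PropertiesService.getScriptProperties().getProperty('API_TOKEN');
      const response = UrlFetchApp.fetch('https://internal.example.com/api/report', {
        headers: { Authorization: 'Bearer ' + token },
      });
      const rows = JSON.parse(response.getContentText()); // expects an array of arrays
      const sheet = SpreadsheetApp.getActive().getSheetByName('raw_data');
      sheet.clearContents();
      sheet.getRange(1, 1, rows.length, rows[0].length).setValues(rows);
    }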
Sure, a custom developed option should be substantially better than Google Sheets, but you won't develop it quicker. By the time you actually need something bigger/better, you should be at a place where your needs are better defined and you can afford the resources to develop something better.
I haven't started yet because (1) I still haven't finished writing up my last user script experience blog post, and (2) I'm terrified of the auth nightmare. It might be easier or harder in the user script context—I am in the context of a web page, so maybe that means I can do a normal oauth flow from there or something?
Do you have any plans to open source?
They seem pretty powerful.
This manifested as an issue when doing a deploy but being unable to get critical data. We added retries and things like that, but it seemed not great to run a business off something that isn't designed for this purpose.
They're internal tools, but big ones. And I'm surprised too. You won't hit too much resistance doing things the well-supported ways, but for some reason there's no well-supported way to run a cache.
So not just Google Sheets as the backend - Google Sheets as your personal backend for your instance of the app.
Being able to get data joined across sheets is priceless.
Another problem I had was an API change one year in.
I would not use Google Sheets again. Maybe I’d try Airtable, Notion, or some other similar platform where the API access is more of a priority to the company.
We're in the process of migrating away from Google Sheets now as it's a little painful to set up and it's impossible to completely prevent users from editing the wrong fields in our use case. But it has served very well so far and would highly recommend this approach to start with for anyone!
It's a community association membership management thing, and I wanted the (non-technical) association board members to be able to easily inspect the data and do something useful with it if I moved away. It has mostly been successful, except that sometimes someone will manually edit the sheet and break something.
A couple of suggestions/questions from using Google Sheets to enter data that was changed manually via a back-office admin, which we then showed on a website accessed by thousands:
1. Will I hit rate limits if I access this API often via, say, GET requests from a public-facing site? (What we did was use Redis as an intermediate cache.)
2. Let's say the data on the sheet is changed; is there a webhook to notify of the change? (What we did incorrectly at first was rely on polling for change detection, which was a bad idea, especially as we grew increasingly reliant on it. The better way is to put an 'Update' button in the sheet that triggers an API call to an intermediate server to bust the cache mentioned in the previous point; this also kicks off a fresh fetch of the sheet data. A more user-friendly way would be to use Apps Script's sheet change event handlers, which can be leveraged for this if you don't want to force an update button on users, though I haven't used them myself. [1])
In your case, I guess the way to do 2 would be to ask users to clone a sheet that has this Apps Script code embedded in it (roughly like the sketch below) when they create a new sheet to be served by your API. Not sure if the UX of that is acceptable, though.
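Roughly, the embedded script could be an installable onChange trigger along these lines (just a sketch; the webhook URL is a placeholder for the intermediate server that busts the cache):

    // Apps Script sketch (installable onChange trigger): ping an intermediate
    // server whenever the sheet changes so it can bust its cache and re-fetch.
    function onSheetChange(e) {
      UrlFetchApp.fetch('https://your-server.example.com/hooks/sheet-updated', {
        method: 'post',
        contentType: 'application/json',
        payload: JSON.stringify({
          sheetId: SpreadsheetApp.getActive().getId(),
          changeType: e.changeType,
        }),
      });
    }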
Wish your product success.
[1] https://developers.google.com/apps-script/reference/script/s...
I built this page with several Google Sheets as the data source ;)
- Needed to manage vocab learning content for 60 world languages, mostly being updated by a handful of university students and a bunch of remote volunteers. Google Sheets was our source of truth, with a nightly sync into MariaDB. We even wrote some callable scripts that could detect ambiguities and other common issues. Saved us a ton of tooling effort, and stakeholders found it cool that we could update content so quickly.
- Working on a new product, my boss found a Figma plugin that lets you wire up prototypes to a Google Sheet. It's report-heavy and we're still building the real app, so to dogfood it internally I'm running a wizard-of-oz setup: collecting data with a different tool we already built, dumping the results into Sheets, and then sorting/filtering/calculating for all the needed views. Internal stakeholders get a real, clickable prototype with real values that closely approximates what the real product will look like. It's been amazing!
I think I'm failing to see the point here :-/
Good luck!
Good memories
It's a free Google Sheets / Apps Script you just paste into your sheet, and it turns the sheet into a full CRUD API. It's kind of rate-limited, but completely free!
Edit: I've thought about creating a company around Sheets before, and the problem is that once you get to the "willing to pay" stage, you also kind of outgrow Sheets. I'd rather migrate to Turso, Cloudflare D1, or Pocketbase instead of staying with Sheets or SpreadAPI, because of the limitations.
I'm convinced most of the people in this thread haven't tried working much with the Google Sheets API at scale. Most of the time it's fine, but then it will have days where 30-40% of the calls (as measured by Google Cloud console API monitoring) throw an internal error for which Google's advice is to "try again later". There are also API calls that take up to 4 minutes (?!) to return (again, as measured by their own API monitoring tools in the Cloud console).
It's too bad because I otherwise really like this approach.
With a few plugins and extensions you can really do amazing things with this type of setup. The problem is that your spreadsheets basically become an unmaintainable and impenetrable nightmare, and all your business logic is held hostage in various people's personal sheets. Making wide changes becomes hard or impossible, and something that would be easy in a traditional software framework ("Let's change the frobnicator to twizzle first and then twozzle afterwards, instead of the other way round as it does now") becomes very difficult, requiring many, many changes across many, many sheets, with a lot of risk and a lot of diligence to be sure you have actually made a particular critical fix everywhere.
I can see that Google Sheets has higher availability but fewer built-in dashboards.
Now you have effectively created bottlenecks for all critical processes because simultaneously working on spreadsheets is about the most dangerous thing you can do. Data integration becomes a perilous and time consuming affair; there’s no easy way to guarantee conformity between spreadsheets.
You paper over these problems with more code than you ever would have had if you switched to the proper tools earlier.
Triggers/Apps Script - not that I am aware of right now...
Everyone I know doing stuff with GSheets (including Glide) is using a full-sync approach with an optimistic write layer of some sort.
The GSheets API frequently reports being unable to do things: sometimes because it's calculating (common for huge sheets), sometimes just because it's down, other times because of rate limits.
My impression is it's not a great API to build on directly.
A trigger based system that captured writes and wrote them to an outbox sheet could drastically speed up the sync process however.
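For illustration, a sketch of what that capture trigger might look like (hypothetical; assumes an "outbox" tab exists and an installable onEdit trigger is set up):

    // Apps Script sketch (installable onEdit trigger): append every edit to an
    // "outbox" tab so a sync job only has to read the outbox instead of
    // diffing the whole spreadsheet.
    function captureEdit(e) {
      const outbox = SpreadsheetApp.getActive().getSheetByName('outbox');
      outbox.appendRow([
        new Date(),                           // when the edit happened
        e.range.getSheet().getName(),         // which tab was touched
        e.range.getA1Notation(),              // which cells changed
        e.value !== undefined ? e.value : ''  // new value (empty for multi-cell edits)
      ]);
    }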
The reason we don't use that approach is that there aren't sufficient APIs to set this sort of thing up automatically (last time I checked, at least), and it makes changes to the users' sheets, which isn't impossible to live with but is distasteful.
To add to this - no revision control.
OK, maybe you can store your spreadsheets in an RCS, but can you get diffs of the changes, so you can confirm the revision you're about to push is what you meant to do? Can you review the history of diffs, so you can see how the system has changed over time? If multiple people make changes, can you do a merge? Do you even have a separate "working copy" where you can try stuff out, or are you just editing the production copy with no safety harness?
https://www.levels.fyi/blog/scaling-to-millions-with-google-...
I'm currently in a bit of the same problem. Make an MVP of a webapp. What makes my predicament extra annoying is that my toolbox is primarily Microsoft. I have Power Automate, SharePoint, Office365, etc available to me. I do not have Azure unfortunately.
The answer is spin up a flask app with SQL and do things right. Does something like Zerosheets exist for the MS environment?
Really cool product! I'm bookmarking it for future products that don't need to live in MS land.
Of course, just because you told yourself you aren't doing a software project doesn't mean you're not prone to the standard problems of managing those, but the incubation period for many of them is long enough to catch a lot of non-practitioners unaware. I don't think telling people to leave it to the professionals is the answer, for what it's worth; I just don't see how we get from here to a world where one could interpolate between systems and end-user software more gracefully.
I'm not claiming this is a sufficient solution, but Google Sheets does keep a version history and you can see changes. Doesn't solve the problem of working copies and merging though.
Not a full IDE or, god forbid, pip modules - just hello world. And that's one of the better experiences thus far. Some finance shops simply don't give you any options.
It's par for the course to give office workers access to only Excel, and then people are surprised when they build abominations in Excel...
Highly recommended.
Some are even worse: they require the data to be in Excel, since that's all they know.
Please keep in mind that you can always post your improvement ideas or PR's here https://github.com/ziolko/spreadapi.
Personally, I think good refactoring support would be the way to go - make it easier for a programmer to go in and "extract method" or similar to clean things up if they have gotten out of hand, without necessarily having to rewrite the whole thing from scratch.
Can you share the tool that transforms the spreadsheet into a realtime API? Sounds creative and interesting!
I don't understand why it doesn't seem like a big deal, knowing the limitations of spreadsheets. The spreadsheet experts have to know that it will be spaghetti in a decade, right?
I still think the solution for unifying those is to use a database as a backend, with the spreadsheets merely being a frontend for that backend. It does require the people to come somewhat closer to the software, though. I don't know that software can reasonably work with the ill-defined schema of spreadsheets.
See https://github.com/ziolko/spreadapi/blob/master/spread-api.j...
As an example:
curl -L \
--data '[{ "method": "POST", "sheet": "Sheet1", "payload": { "username": "John", "age": 25 } }, { "method": "GET", "sheet": "Sheet1" }]' \
"https://script.google.com/macros/s/$APP_ID/exec"
A single curl request that POSTs a user and then GETs the data. You can do multiple POSTs to add multiple rows too.
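If you'd rather call it from code than curl, here's a rough Node equivalent (just a sketch: assumes Node 18+ with ES modules for top-level await, the same $APP_ID deployment and payload shape as above, and that the endpoint returns JSON):

    // Node 18+ sketch: the same batch request as the curl example above.
    const APP_ID = process.env.APP_ID; // your Apps Script deployment id
    const response = await fetch(`https://script.google.com/macros/s/${APP_ID}/exec`, {
      method: 'POST',
      body: JSON.stringify([
        { method: 'POST', sheet: 'Sheet1', payload: { username: 'John', age: 25 } },
        { method: 'GET', sheet: 'Sheet1' },
      ]),
    });
    console.log(await response.json());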