The web hosting costs basically nothing. Most of the cost comes from the database.
Modern computers are mind-bogglingly powerful. An old laptop off eBay can probably handle the business needs of all but the very largest corporations.
As someone who is literally using old laptops to host things from my basement on my consumer line (personal, non-commercial) and a business line (commercial)...
I can host this for under 50 bucks a year, including the domain and power costs, and accounting for offsite backup of the data.
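For what it's worth, the back-of-envelope goes something like this (every number here is a rough assumption of mine; your registrar, hardware, and electricity rate will differ):

    # Rough yearly cost of basement hosting -- illustrative numbers only.
    domain = 12.00                 # .com renewal, registrar-dependent
    idle_watts = 10                # old laptop idling around 10 W
    kwh_price = 0.15               # USD per kWh, varies a lot by region
    power = idle_watts / 1000 * 24 * 365 * kwh_price   # ~13 USD/year
    backup = 12.00                 # cheap offsite object-storage tier
    print(f"total: {domain + power + backup:.0f} USD/year")   # ~37 USD/year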
I wish people understood just how much the "cloud" is making in pure profit. If you're already a software dev, you can absolutely manage the complexity of hosting things yourself for FAR cheaper. You won't get five 9s of reliability (not that you're getting that from any major cloud vendor anyway without paying through the nose for a real SLA), but a small UPS will easily get you to 99% uptime - which is absolutely fine for something like this.
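To put actual numbers on the nines, since people rarely do - the arithmetic is trivial:

    # Allowed downtime per year at a given availability level.
    for avail, label in [(0.99, "two nines"), (0.999, "three nines"),
                         (0.99999, "five nines")]:
        minutes = (1 - avail) * 365 * 24 * 60
        print(f"{label}: {minutes:,.0f} minutes of downtime/year")
    # two nines  -> ~5,256 min (~3.7 days) -- a UPS and a cron job get you here
    # five nines -> ~5 min -- nobody's basement (and few clouds) delivers this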
It seems like computers are getting more capable, but developers are becoming less capable at roughly the same pace.
And that makes perfect sense. Why should humans inconvenience themselves to please the machine? If anyone’s at fault, it’s the database for not being smart enough to optimize the query on its own.
I heard we had a 7-figure annual compute spend, and IIRC we only had a few hundred requests per second at peak, plus some batch jobs for a few million accounts. If we hadn't gone down that particular road to insanity, a single $160 N100 mini PC could probably have handled the workload with better reliability than we actually had.
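Quick sanity check with made-up but plausible numbers - the CPU-per-request figure is the big assumption here, so measure your own:

    # Back-of-envelope: how busy would a small box be at that load?
    peak_rps = 300         # assumed peak requests per second
    cpu_ms_per_req = 5     # assumed CPU time per request (typical CRUD)
    cores = 4              # the N100 has 4 cores
    busy = peak_rps * cpu_ms_per_req / 1000    # 1.5 cores' worth of work
    print(f"{busy:.1f}/{cores} cores busy ({busy / cores:.0%} at peak)")  # 38%

Even if I'm off by 2-3x on the per-request cost, it still fits on one cheap box.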
Heh, reminds me of a discussion I had with a coworker roughly 6 months ago. I tried to explain to them that the ability to scale each microservice separately almost never improves the actual performance of the platform as a whole - after all, you still need network calls between the services, and you could've just started the monolith twice instead. That would most likely have needed less RAM too, even if each instance consumes more - after all, you now have fewer processes running to serve the same request.
This discussion took place in the context of a B2E SaaS platform with very moderate usage, almost everything plain CRUD - something like 10-15k simultaneous users making data entries and so on.
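To make the RAM point concrete, here's the shape of the argument with assumed numbers (the per-process overhead and working set are illustrative, not measured - and I'm even being generous by letting the microservices partition the working set while the two monolith instances duplicate it):

    # Same traffic, two deployments; the fixed per-process cost is the point.
    runtime_mb = 300   # assumed per-process baseline (JVM/Node/.NET etc.)
    data_mb = 400      # assumed aggregate working set
    monolith_x2 = 2 * (runtime_mb + data_mb)   # 1400 MB, working set duplicated
    services_12 = 12 * runtime_mb + data_mb    # 4000 MB, working set partitioned
    print(monolith_x2, services_12)

Each monolith instance is bigger, but you pay the runtime baseline twice instead of twelve times.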
I'm always unsure how I should feel after such discussions. On the one hand, I'm pretty sure he thinks I'm dumb for not getting microservices. On the other hand... Well... ( ꈍ ᴗ ꈍ )
"Andy giveth, and Bill taketh away."
Computers keep getting faster (personified by Andy Grove of Intel), and software keeps getting slower (Bill Gates of Microsoft).
If you can understand programming, you can understand Linux. Might take a while to be really confident, but do you need incredible confidence when you have backups? :)
Especially with that meme he showed about Vercel's AWS +500% markup, lmaoo
Don't be afraid of computers, don't be the pink elephant!
It's a systemic problem. You're going to lose the battle against human nature: ever noticed how, after moving from a college dorm into a house, people suddenly manage to fill all the space with things? It's not like the dorm was too small to fit everything they ever needed - they just had to accommodate themselves to it. That constraint is artificial, exhausting to keep up, and, once gone, no longer adhered to.
If a computer suddenly becomes more powerful, developers aren't going to keep up their good performance-optimisation habits, because they only had those out of necessity in the first place.
I agree with this statement for normal people, not for software developers. You're just begging for stagnation. Your job is literally dealing with computers and making them do neat stuff. When you refuse to do that because "computers should be making my life easier", you should really find another line of employment where you're a consumer of software, not a producer.