
1479 points | sandslash
mkw5053 ◴[] No.44322386[source]
This DevOps friction is exactly why I'm building an open-source "Firebase for LLMs." The moment you want to add AI to an app, you're forced to build a backend just to securely proxy API calls—you can't expose LLM API keys client-side. So developers who could previously build entire apps backend-free suddenly need servers, key management, rate limiting, logging, deployment... all just to make a single OpenAI call. Anyone else hit this wall? The gap between "AI-first" and "backend-free" development feels very solvable.
replies(5): >>44322896 #>>44323157 #>>44323224 #>>44323300 #>>44323451 #
jeremyjh ◴[] No.44323224[source]
Do you think Firebase and Supabase are working on this? Good luck, but to me it sounds like a platform feature, not a standalone product.
replies(2): >>44323309 #>>44323319 #
mkw5053 ◴[] No.44323319[source]
Probably, in some form. In the meantime it doesn't exist yet, and I want it for myself. I also feel like something open source that lets you bring your own LLM provider might still be useful.