This DevOps friction is exactly why I'm building an open-source "Firebase for LLMs."
The moment you want to add AI to an app, you're forced to stand up a backend just to proxy API calls securely, since LLM API keys can't be exposed client-side.
So developers who could previously build entire apps backend-free suddenly need servers, key management, rate limiting, logging, deployment... all just to make a single OpenAI call.
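That forced backend often reduces to one function: take the client's request and re-send it upstream with the secret key attached server-side. A minimal sketch in Python (the `build_upstream_request` helper name is mine; the URL is OpenAI's chat completions endpoint):

```python
import json
import urllib.request

OPENAI_URL = "https://api.openai.com/v1/chat/completions"

def build_upstream_request(client_payload: dict, api_key: str) -> urllib.request.Request:
    """Build the server-side request: same body the client sent,
    but with the secret key injected as an Authorization header,
    so the key never ships to the browser or mobile app."""
    return urllib.request.Request(
        OPENAI_URL,
        data=json.dumps(client_payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
```

Everything else (rate limiting, logging, per-user quotas) ends up wrapped around this one call, which is why it feels like overkill to run a whole deployment for it.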
Anyone else hit this wall? The gap between "AI-first" and "backend-free" development feels very solvable.
replies(5):