
1479 points | sandslash | 1 comment
mkw5053 ◴[] No.44322386[source]
This DevOps friction is exactly why I'm building an open-source "Firebase for LLMs." The moment you want to add AI to an app, you're forced to build a backend just to securely proxy API calls—you can't expose LLM API keys client-side. So developers who could previously build entire apps backend-free suddenly need servers, key management, rate limiting, logging, deployment... all just to make a single OpenAI call. Anyone else hit this wall? The gap between "AI-first" and "backend-free" development feels very solvable.
replies(5): >>44322896 #>>44323157 #>>44323224 #>>44323300 #>>44323451 #
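The proxy pattern described above can be sketched in a few lines. This is an illustrative sketch, not a real product: the endpoint URL and model name assume OpenAI's chat completions API, the `OPENAI_API_KEY` environment variable is a convention, and `build_proxied_request` is a hypothetical helper. The point is that the secret key is attached server-side, so the client payload never contains it.

```python
import json
import os
import urllib.request

# Assumed upstream endpoint (OpenAI chat completions); swap for your provider.
UPSTREAM_URL = "https://api.openai.com/v1/chat/completions"

def build_proxied_request(client_payload: dict) -> urllib.request.Request:
    """Build the outbound request, attaching the secret key server-side.

    The client only ever sends its messages to *your* backend; the API key
    lives in the server environment and never ships to the browser.
    """
    api_key = os.environ.get("OPENAI_API_KEY", "sk-test-placeholder")
    body = json.dumps({
        "model": "gpt-4o-mini",  # illustrative model name
        "messages": client_payload.get("messages", []),
    }).encode()
    return urllib.request.Request(
        UPSTREAM_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Everything else the comment lists (rate limiting, logging, per-user quotas) would wrap around this one function, which is exactly the incidental backend the commenter is objecting to.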
sockboy ◴[] No.44323157[source]
Yeah, hit this exact wall building a small AI tool. Ended up spinning up a whole backend just to keep the keys safe. Feels like there should be a simpler way, but haven’t seen anything that’s truly plug-and-play yet. Curious to see what you’re working on.
replies(1): >>44323271 #
dieortin ◴[] No.44323271[source]
It’s very obvious this account was just created to promote your product…
replies(1): >>44323334 #
mkw5053 ◴[] No.44323334[source]
I don't even have a product although I'd love people to work on something open source together. Also, I'm not nearly cool enough to earn a green username.
replies(2): >>44323425 #>>44323567 #
shwaj ◴[] No.44323425[source]
I think they were replying to the person with the green username :-)