
1480 points | sandslash | 1 comment
mkw5053 | No.44322386
This DevOps friction is exactly why I'm building an open-source "Firebase for LLMs." The moment you want to add AI to an app, you're forced to build a backend just to proxy API calls securely, since you can't expose LLM API keys client-side. So developers who could previously build entire apps backend-free suddenly need servers, key management, rate limiting, logging, deployment... all just to make a single OpenAI call. Anyone else hit this wall? The gap between "AI-first" and "backend-free" development feels very solvable.
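To make the friction concrete: the minimal backend in question is little more than a request-forwarder that attaches the secret key server-side. A sketch in Python using only the standard library (endpoint and model name are illustrative assumptions, not part of any particular product):

```python
# Minimal sketch of the server-side piece a client app can't avoid:
# the LLM API key is attached here, so it never ships to the browser.
# Assumes the OpenAI chat-completions HTTP shape; model name is illustrative.
import json
import os
import urllib.request

OPENAI_URL = "https://api.openai.com/v1/chat/completions"

def build_upstream_request(client_payload: dict) -> urllib.request.Request:
    """Turn an untrusted client payload into an authenticated upstream call."""
    api_key = os.environ.get("OPENAI_API_KEY", "sk-placeholder")
    body = json.dumps({
        "model": "gpt-4o-mini",  # illustrative; pin whatever model you use
        # Forward only the fields you choose to allow, not the raw payload.
        "messages": client_payload.get("messages", []),
    }).encode()
    return urllib.request.Request(
        OPENAI_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Everything else the comment lists (rate limiting, logging, per-user quotas) then has to wrap this one function, which is why even a "single OpenAI call" drags in real backend work.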
replies(5): >>44322896 #>>44323157 #>>44323224 #>>44323300 #>>44323451 #
1. smpretzer | No.44322896
I think this lines up with Apple’s thesis that on-device models are a useful feature for developers who don’t want to deal with calling out to the OpenAI API.

https://developer.apple.com/documentation/foundationmodels