Acquiring them gives OAI:
- a ready-made team that understands the IDE plumbing and the developer UX at a deep level
- a head start in a platform ecosystem that's hard to crack
- a team that knows how to push the models into productized, developer-ready experiences
So it's not the prompts. It's engineering the scaffolding and UX around the model so it feels like magic to the user. That's what OpenAI is buying.
It's also possible that the Alex team solved a lot of UX and integration challenges that could translate to other Apple contexts (productivity apps, design tools, even consumer-facing AI on macOS/iOS).
Without having used Alex myself (I don't do iOS or macOS development), I would guess that all the retrieval, context slicing, editor integration, yada yada, that they've built isn't necessarily unique to coding. The same scaffolding could support things like AI-driven writing, design, or general productivity in an Apple-native way.
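To make that concrete, here's a minimal sketch of what content-agnostic "slice, score, budget, assemble" scaffolding could look like. This is purely illustrative: the types (`Sliceable`, `ContextSlice`, `retrieveContext`) and the keyword-overlap scoring are my own hypothetical stand-ins, not anything Alex actually ships, and a real system would presumably use embeddings and much smarter chunking.

```swift
import Foundation

/// A chunk of user content (a Swift function, a paragraph, a design note...)
/// plus a relevance score against the current query.
struct ContextSlice {
    let source: String   // e.g. file path or document title
    let text: String
    let score: Double
}

/// Anything that can be split into retrievable chunks: source files,
/// markdown docs, meeting notes, etc.
protocol Sliceable {
    var source: String { get }
    func chunks() -> [String]
}

/// A source file, naively chunked on blank lines.
struct SourceFile: Sliceable {
    let source: String
    let contents: String
    func chunks() -> [String] {
        contents.components(separatedBy: "\n\n").filter { !$0.isEmpty }
    }
}

/// A prose document, chunked by paragraph — same interface, different content.
struct ProseDocument: Sliceable {
    let source: String
    let contents: String
    func chunks() -> [String] {
        contents.components(separatedBy: "\n\n").filter { !$0.isEmpty }
    }
}

/// Toy retriever: score chunks by keyword overlap with the query, then greedily
/// keep the best ones under a rough character budget before prompting the model.
func retrieveContext(for query: String,
                     from items: [Sliceable],
                     budget: Int = 2000) -> [ContextSlice] {
    let queryTerms = Set(query.lowercased().split(separator: " ").map(String.init))
    var slices: [ContextSlice] = []

    for item in items {
        for chunk in item.chunks() {
            let chunkTerms = Set(chunk.lowercased().split(separator: " ").map(String.init))
            let overlap = Double(queryTerms.intersection(chunkTerms).count)
            if overlap > 0 {
                slices.append(ContextSlice(source: item.source, text: chunk, score: overlap))
            }
        }
    }

    // Pack the highest-scoring slices first until the budget runs out.
    var remaining = budget
    var picked: [ContextSlice] = []
    for slice in slices.sorted(by: { $0.score > $1.score }) where slice.text.count <= remaining {
        picked.append(slice)
        remaining -= slice.text.count
    }
    return picked
}

// The same call works whether the items are source files or prose documents,
// which is the point: nothing in the scaffolding is coding-specific.
let items: [Sliceable] = [
    SourceFile(source: "NetworkClient.swift",
               contents: "func fetchUser() {\n  // ...\n}\n\nfunc fetchPosts() {\n  // ...\n}"),
    ProseDocument(source: "launch-notes.md",
                  contents: "Draft the launch email for the new posts feed.\n\nSchedule the design review.")
]
for slice in retrieveContext(for: "posts feed", from: items) {
    print("[\(slice.source)] \(slice.text.prefix(40))…")
}
```

Swap the keyword overlap for embeddings and the blank-line chunking for syntax- or layout-aware splitting and you have the same basic pipeline whether the payload is code, email drafts, or design specs.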