
255 points tbruckner | 1 comment
m3kw9 No.37421727
OpenAI's moat will soon be largely UX. Anyone can do plugins, code, etc., but once LLMs become commodified and everyday users are the ones operating them, the best UX wins. Just look at stand-alone digital cameras vs. mobile phone cameras from Apple.
replies(3): >>37422623 #>>37423170 #>>37424079 #
smoldesu No.37422623
> when everyday users are the ones operating them, the best UX wins

Isn't that exactly why OpenAI is ahead right now? For free, you get access to powerful AI on anything with a web browser. You don't need to wait for your SSD to load the model, page it into memory, and swap out your existing processes the way a local machine would. You don't need to worry about battery drain, heat, memory constraints, or hardware limitations. If you can read Hacker News, you can use AI.

Given the current performance of local models, I bet OpenAI feels pretty comfortable where it's standing. Most people don't have mobile devices with enough RAM to load a 13B-parameter, 4-bit Llama quantization. Running a 180B model (much less a GPT-4-scale model) on consumer hardware is financially infeasible; running it at scale in the cloud costs pennies on the dollar.
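Back-of-the-envelope on that RAM claim (a weights-only sketch; real usage is higher once you add the KV cache, activations, and runtime overhead):

```python
# Rough weights-only memory estimate for a quantized LLM:
# bytes = num_params * bits_per_weight / 8.
# Ignores KV cache, activations, and runtime overhead, which all add more.
def model_memory_gib(num_params: float, bits_per_weight: float) -> float:
    return num_params * bits_per_weight / 8 / 2**30

# A 13B model at 4 bits per weight needs ~6.1 GiB for weights alone --
# already more free RAM than most phones can spare.
print(f"13B @ 4-bit: {model_memory_gib(13e9, 4):.1f} GiB")
```

Scale the same arithmetic to 180B parameters at 4 bits and you're past 80 GiB of weights, which is why that tier stays in the cloud.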

I'm not fond of OpenAI in the slightest, but if you've followed the state of local models recently it's clear why they keep coming out ahead.

replies(1): >>37422761 #
anurag6892 No.37422761
This advantage isn't specific to OpenAI though, right? Any big cloud provider like Amazon or Google can host these open LLM models.
replies(2): >>37422795 #>>37424905 #
nerbert No.37424905
OpenAI's got the first-mover advantage. That's everything if you don't fuck up.