163 points wmf | 4 comments | | HN request time: 0.27s | source
smcleod ◴[] No.45367136[source]
Their top model still only has "Up to 228 GB/s" bandwidth, which places it in the low-end category for anything AI-related. For comparison, Apple Silicon goes up to 800 GB/s and Nvidia cards around 1800 GB/s, and there's no word on whether it supports 256-512 GB of memory.
replies(2): >>45367166 #>>45367523 #
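The bandwidth figures matter because LLM decode is typically memory-bandwidth-bound: each generated token has to stream essentially all model weights from memory, so tokens/s is capped at roughly bandwidth divided by model size. A rough sketch of that ceiling, using the thread's bandwidth numbers and an assumed 35 GB model (about a 70B-parameter model at 4-bit quantization; the model size is an illustrative assumption, not from the thread):

```python
# Back-of-envelope decode throughput: each generated token must read
# all model weights from memory, so tokens/s <= bandwidth / model size.
def max_tokens_per_sec(bandwidth_gb_s: float, model_gb: float) -> float:
    """Memory-bandwidth-bound upper limit on tokens generated per second."""
    return bandwidth_gb_s / model_gb

# Assumed model size: ~35 GB of weights (illustrative, not from the thread).
MODEL_GB = 35
for label, bw in [("228 GB/s", 228), ("800 GB/s", 800), ("1800 GB/s", 1800)]:
    print(f"{label}: ~{max_tokens_per_sec(bw, MODEL_GB):.1f} tokens/s ceiling")
```

This is an upper bound only; real throughput is lower due to KV-cache reads, compute limits, and software overhead, but it shows why 228 GB/s puts a hard cap on local-inference speed.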
piskov ◴[] No.45367166[source]
Most consumers don’t care about local LLMs anyway.
replies(1): >>45367237 #
1. alphabettsy ◴[] No.45367237[source]
Yet the apps top the App Store charts. Considering that these are not upgradable, I think the specs are relevant, just as I thought Apple shipping systems with 8 GB minimums was poor future-proofing.
replies(2): >>45367250 #>>45367341 #
2. piskov ◴[] No.45367250[source]
What apps with local LLMs top the App Store charts?
replies(1): >>45370954 #
3. p_ing ◴[] No.45367341[source]
Looking at the Mac App Store in the US, no they don't. There's not an LLM app in sight (local or otherwise).
4. happymellon ◴[] No.45370954[source]
They asked ChatGPT.