
163 points wmf | 1 comment
smcleod ◴[] No.45367136[source]
Their top model still only has "Up to 228 GB/s" of memory bandwidth, which places it in the low-end category for anything AI related. For comparison, Apple Silicon goes up to 800 GB/s and Nvidia cards reach around 1800 GB/s, and there's no word on whether it supports 256-512 GB of memory.
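Why bandwidth matters here: LLM token generation is typically memory-bandwidth bound, since every generated token requires streaming the model's weights from memory. A rough upper bound on decode speed is bandwidth divided by model size. A minimal sketch (the 35 GB figure for a ~70B 4-bit-quantized model is an illustrative assumption, not from the thread):

```python
def decode_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound on decode throughput for a memory-bandwidth-bound
    LLM: each token requires reading all weights once from memory."""
    return bandwidth_gb_s / model_size_gb

# Illustrative: ~70B model quantized to 4 bits, roughly 35 GB of weights.
MODEL_GB = 35.0

for name, bw in [("228 GB/s", 228.0), ("Apple 800 GB/s", 800.0),
                 ("Nvidia 1800 GB/s", 1800.0)]:
    print(f"{name}: ~{decode_tokens_per_sec(bw, MODEL_GB):.1f} tok/s ceiling")
```

By this back-of-the-envelope estimate, 228 GB/s caps such a model at roughly 6-7 tokens/s, versus ~23 tok/s at 800 GB/s and ~51 tok/s at 1800 GB/s, which is the gap the comment is pointing at.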
replies(2): >>45367166 #>>45367523 #
piskov ◴[] No.45367166[source]
Most consumers don’t care about local LLMs anyway.
replies(1): >>45367237 #
alphabettsy ◴[] No.45367237[source]
Yet these apps top the App Store charts. Since these systems are not upgradable, I think the specs are relevant, just as I thought Apple shipping systems with 8 GB minimums was poor future-proofing.
replies(2): >>45367250 #>>45367341 #
piskov ◴[] No.45367250[source]
What apps with local LLMs top the App Store charts?
replies(1): >>45370954 #
happymellon ◴[] No.45370954[source]
They asked ChatGPT.