
172 points by marban | 2 comments
bearjaws No.40052158
The focus on TOPS seems a bit out of line with reality for LLMs. TOPS doesn't matter for LLMs if your memory bandwidth can't keep up. Since quad-channel memory isn't mentioned, I assume it's still dual channel?

Even top-of-the-line DDR5 is around 128 GB/s vs an M1 at 400 GB/s.
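
A rough sketch of why bandwidth ends up being the ceiling (a back-of-the-envelope estimate, assuming decode is purely bandwidth-bound and a hypothetical ~4 GB set of quantized weights streamed once per generated token):

    # Back-of-the-envelope: at batch size 1, LLM decode is typically
    # memory-bandwidth-bound, since each generated token has to stream the
    # full weight set through memory. Illustrative numbers only.
    WEIGHT_BYTES = 4e9  # hypothetical ~7B-parameter model at 4-bit quantization

    def max_tokens_per_sec(bandwidth_gb_s, weight_bytes=WEIGHT_BYTES):
        """Upper bound on decode speed if every token reads all weights once."""
        return bandwidth_gb_s * 1e9 / weight_bytes

    for name, bw in [("dual-channel DDR5", 128), ("400 GB/s unified memory", 400)]:
        print(f"{name}: ~{max_tokens_per_sec(bw):.0f} tokens/s ceiling")
    # dual-channel DDR5: ~32 tokens/s ceiling
    # 400 GB/s unified memory: ~100 tokens/s ceiling

No amount of extra TOPS moves that ceiling; only more bandwidth does.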

At the end of the day, it still seems like AI in consumer chips is chasing a buzzword: what is the killer feature?

On mobile there are image-processing benefits, voice-to-text, translation... but on desktop those are nowhere near common use cases.

replies(3): >>40052204, >>40052260, >>40052353
postalrat No.40052204
https://www.neatvideo.com/blog/post/m3

That says the M1 is 68.25 GB/s.

replies(3): >>40052258, >>40052307, >>40052631
1. givinguflac No.40052258
The OP was obviously talking about the M1 Max.
replies(1): >>40052494
2. postalrat No.40052494
How is it obvious? Anyone reading that could assume that any M1 gets that bandwidth.
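
For what it's worth, Apple's published memory bandwidth differs a lot across the M1 family, which is where the ambiguity comes from. Using the same rough bandwidth-bound estimate as above (hypothetical ~4 GB of quantized weights read per token):

    # Published memory bandwidth per M1 variant (GB/s): 68.25 GB/s is the
    # base M1, 400 GB/s is the M1 Max.
    M1_FAMILY_GB_S = {"M1": 68.25, "M1 Pro": 200, "M1 Max": 400, "M1 Ultra": 800}

    for chip, bw in M1_FAMILY_GB_S.items():
        print(f"{chip}: ~{bw * 1e9 / 4e9:.0f} tokens/s ceiling")
    # M1: ~17 tokens/s ceiling
    # M1 Pro: ~50 tokens/s ceiling
    # M1 Max: ~100 tokens/s ceiling
    # M1 Ultra: ~200 tokens/s ceiling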