
486 points dbreunig | 4 comments
jsheard ◴[] No.41863390[source]
These NPUs are tying up a substantial amount of silicon area, so it would be a real shame if they end up not being used for much. I can't find a die analysis of the Snapdragon X that isolates the NPU specifically, but AMD's equivalent with the same ~50 TOPS performance target can be seen here, and it takes up about as much area as three high-performance CPU cores:

https://www.techpowerup.com/325035/amd-strix-point-silicon-p...
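
For a rough sense of what a ~50 TOPS target implies in silicon, here is a back-of-the-envelope sketch in Python (the clock speed and the MAC-counting convention are assumptions on my part, not figures from the die analysis):

    # Rough estimate: how many parallel INT8 MAC units does ~50 TOPS imply?
    # Illustrative assumptions only; real NPUs differ in clock and datapath width.
    clock_hz = 1.5e9        # assumed NPU clock, 1.5 GHz
    ops_per_mac = 2         # one multiply + one accumulate counted as two ops
    target_ops = 50e12      # ~50 trillion ops/s marketing target

    macs_needed = target_ops / (clock_hz * ops_per_mac)
    print(f"{macs_needed:,.0f} parallel MAC units")  # ~16,667

That is a lot of fixed-function arithmetic sitting on the die whether or not anything feeds it work.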

replies(4): >>41863880 #>>41863905 #>>41864412 #>>41865466 #
ezst ◴[] No.41863905[source]
I can't wait for the LLM fad to be over so we get some sanity (and efficiency) back. I personally have no use for this extra hardware ("GenAI" doesn't help me in any way, nor does it support any work-related task). Worse, most people have no use for it (and recent surveys even show predominant hostility towards AI creep). We shouldn't be paying extra for this; it should be opt-in, and then it would become clear (by looking at the sales, and at how few are willing to pay a premium for "AI") how overblown and unnecessary this is.
replies(6): >>41863966 #>>41864134 #>>41865168 #>>41865589 #>>41865651 #>>41875051 #
renewiltord ◴[] No.41864134[source]
I was telling someone this and they gave me a link to a laptop with higher battery life and better performance than my own, but I kept explaining to them that the feature I cared most about was die size. They couldn't understand it so I just had to leave them alone. Non-technical people don't get it. Die size is what I care about. It's a critical feature and so many mainstream companies are missing out on my money because they won't optimize die size. Disgusting.
replies(5): >>41864304 #>>41864691 #>>41864921 #>>41866254 #>>41866907 #
1. ezst ◴[] No.41866907[source]
I'm fine with the mockery, I genuinely hadn't realized that "wanting to pay for what one needs" was such a hot and controversial take.
replies(1): >>41867899 #
2. ginko ◴[] No.41867899[source]
The extra cost of the area spent on NPU cores is pretty hard to quantify. I guess removing it would allow for higher yields and more chips per wafer, but then you'd need to set up tooling for two separate runs (one with the NPU and one without). Add to that that most of the cost is actually the design of the chip, and it's clear why manufacturers just always add the extra features. Maybe they could sell a chip with the NPU permanently disabled, but I guess that wouldn't be what you want either?
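
To put toy numbers on that trade-off, here's a hedged sketch (every figure below, wafer cost, die and NPU area, mask-set cost, is an assumption for illustration, not real foundry or Strix Point data):

    # Toy cost model: does dropping the NPU pay for a second tapeout?
    # All numbers are illustrative assumptions, not real foundry figures.
    wafer_cost = 17_000.0      # assumed cost per leading-edge wafer, USD
    wafer_area = 70_686.0      # area of a 300 mm wafer, mm^2 (pi * 150^2)
    die_area   = 225.0         # assumed SoC die area, mm^2
    npu_area   = 25.0          # assumed NPU share of that die, mm^2
    mask_set   = 20_000_000.0  # assumed extra tapeout cost for an NPU-less variant

    def cost_per_die(area_mm2: float) -> float:
        # Ignores edge loss and defect yield to keep the point simple.
        return wafer_cost / (wafer_area // area_mm2)

    saving = cost_per_die(die_area) - cost_per_die(die_area - npu_area)
    print(f"~${saving:.2f} saved per die")                   # ~$6
    print(f"break-even at ~{mask_set / saving:,.0f} units")  # millions of chips

Under those assumptions you'd have to ship millions of the NPU-less variant before the per-die saving covers the second tapeout, which is roughly why vendors don't bother.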

Fwiw there should be no power downside to having an unused unit. It’ll just not be powered.

replies(1): >>41870106 #
3. ezst ◴[] No.41870106[source]
The argument boils down to "since it's there, better to keep it, because making a version without it would defeat economies of scale and not save much, if anything", and that's a sensible take… under the assumption that there's a general demand for NPUs, which I contest.

In practice, everyone is paying a premium for NPUs that only a minority desires, and only a fraction of that minority actually does "something" with them.

This thread really helps to show that the use-cases are few and non-essential, and that the general application landscape hasn't adopted NPUs and has very little incentive to do so (because of the alien programming model, because of poor hardware compatibility across vendors, because the ecosystem is a moving target with little stability in sight, and because of the high effort and low reward in general).
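
To illustrate the compatibility point: even behind a common layer like ONNX Runtime, each vendor's NPU sits behind its own execution provider, so you end up writing (and testing) something like the sketch below per platform. The provider names come from ONNX Runtime's documented EP list; the model path is a placeholder:

    # Sketch: pick whichever NPU execution provider this machine actually has.
    # Which providers exist depends entirely on the vendor's driver/SDK stack.
    import onnxruntime as ort

    preferred = [
        "QNNExecutionProvider",      # Qualcomm Hexagon NPU
        "VitisAIExecutionProvider",  # AMD XDNA NPU
        "CoreMLExecutionProvider",   # Apple Neural Engine
        "CPUExecutionProvider",      # universal fallback
    ]
    available = ort.get_available_providers()
    providers = [p for p in preferred if p in available]

    session = ort.InferenceSession("model.onnx", providers=providers)
    print("Running on:", session.get_providers()[0])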

I do want to be wrong, of course. Tech generally is exciting because it offers new tools to crack old problems, opening new avenues and opportunities in the process. Here it looks like we have a solution in search of a problem, one set up by marketing departments.

replies(1): >>41872019 #
4. Miraste ◴[] No.41872019{3}[source]
Modern SoCs already have all kinds of features with use-cases that are few and non-essential. Granted they don't take as much space as NPUS, but manufacturers are betting that if NPUs are available, software will evolve to use them regularly. If it doesn't, they'll probably go away in a few generations. But at a minimum, Microsoft and Apple seem highly committed to using them.