

292 points kaboro | 55 comments
1. parsimo2010 ◴[] No.25059497[source]
I accept that the performance of Apple's chips has increased rapidly in the last few years, but the benchmarks they are using to compare against various x86 CPUs make me suspicious that they are cherry-picking benchmarks and aren't telling the whole story (either in the Stratechery article or the Anandtech article they got the figures from).

Why am I suspicious? THERE IS ABSOLUTELY NO WAY THAT A 5W PART LIKE THE A14 IS FASTER THAN A 100W PART LIKE THE i9-10900k! I understand they are comparing single-threaded speed. I'll accept that the A14 is more power efficient. I'll acknowledge that Intel has been struggling lately. But to imply that a low-power mobile chip is straight up faster than a high-power chip in any category makes me extremely suspicious that the benchmark isn't actually measuring speed (maybe it's normalizing by power draw), or that the ARM and x86 versions of the benchmark have different reference values (such that a score of 1000 on ARM does not represent the same speed of calculation as a score of 1000 on x86). It just can't be true that a tablet with a total price of $1k can hang with a $500 CPU that has practically unlimited size, weight, and power compared to the tablet, especially when the total price to make the desktop comparable in features (motherboard, power supply, monitor, etc.) makes it the more expensive system.

Regardless of whether it's an intentional trick or an oversight, I don't think that the benchmark showing the mobile chip is better than a desktop chip in RAW PERFORMANCE is true. And that means that a lot of the conclusions they draw from the benchmark aren't true. There is no way that the A14 (or the M1) is going to be faster in any raw-performance category than a latest-generation, top-spec desktop system.

replies(11): >>25059551 #>>25059579 #>>25059583 #>>25059690 #>>25059897 #>>25059901 #>>25060075 #>>25060410 #>>25060485 #>>25063022 #>>25064162 #
2. zepto ◴[] No.25059551[source]
Anandtech says you are wrong:

https://www.anandtech.com/show/16226/apple-silicon-m1-a14-de...

replies(2): >>25059718 #>>25061489 #
3. gsnedders ◴[] No.25059579[source]
> THERE IS ABSOLUTELY NO WAY THAT A 5W PART LIKE THE A14 IS FASTER THAN A 100W PART LIKE THE i9-10900k! I understand they are comparing single threaded speed.

FWIW, the 100W i9-10900k isn't even Intel's fastest single-threaded chip: that's the i7-1165G7 (a 28W part). Intel's desktop stuff is ancient: they're effectively still just Skylake (2015) derivatives; for single-threaded work, the more modern designs they're shipping on mobile (Ice Lake and later) beat the desktop parts because their IPC is that much higher.

Power doesn't really help single-threaded performance, aside from wide vector units, nowadays.

replies(3): >>25060307 #>>25060384 #>>25065460 #
4. mrtksn ◴[] No.25059583[source]
Apple’s chip is not just a general purpose CPU, it is designed for specific workloads.

We have similar performance jumps in cryptocurrency mining: GPUs are orders of magnitude faster than CPUs, and ASICs are orders of magnitude faster than GPUs, for the same power consumption.

replies(2): >>25059678 #>>25059861 #
5. norswap ◴[] No.25059678[source]
It's supposed to run a Mac, why isn't it a general purpose CPU? What specific workload do you have in mind? I could use it for web browsing, programming, media editing...
replies(2): >>25059931 #>>25059934 #
6. klelatti ◴[] No.25059690[source]
I'm not sure anyone is saying that the 5W A14 is faster than a 100W i9 (at least for sustained performance).

And I don't see why there's 'no way' an M1 (with appropriate cooling etc.) is going to outperform a desktop-class system: if you read the Anandtech review, it's clearly superior to competing architectures in many respects and is built on a better process technology.

7. parsimo2010 ◴[] No.25059718[source]
I read that article too and mentioned it in my first sentence. That's the article that Stratechery pulled their figures from. The validity of that article and those benchmarks is what I'm doubting.

Will the new MacBooks with M1 chips compare favorably against Intel laptops with low-power, fanless designs? Yeah.

Is the existing A14 chip faster than a 10900k (even in single-threaded performance)? No way. There is something in the benchmark that is messed up to the point where you can't compare them.

replies(2): >>25059869 #>>25064113 #
8. klelatti ◴[] No.25059861[source]
But the M1 is a general purpose CPU and it is faster (without any help from Neural Engines etc) than competing devices.
replies(1): >>25059977 #
9. yzmtf2008 ◴[] No.25059869{3}[source]
See the neighbor comment. If the i7-1165G7 (a 28W part) can be faster than a 10900k, there's no reason the M1 couldn't be faster.
replies(1): >>25060398 #
10. sooheon ◴[] No.25059897[source]
> THERE IS ABSOLUTELY NO WAY THAT A 5W PART LIKE THE A14 IS FASTER THAN A 100W PART LIKE THE i9-10900k

"RIM thought iPhone was impossible in 2007": https://web.archive.org/web/20150517013510/http://www.macnn....

replies(1): >>25061083 #
11. anomaloustho ◴[] No.25059901[source]
Maybe I just need more education in this realm but I’m not sure why a difference in electrical wattage makes it physically impossible for one processor to produce a better result than another processor.

In the presentation, Johny Srouji seemed to place a bigger emphasis on the reduced power consumption than on the speed, saying things like, “this is a big deal” and “this is unheard of”.

In my mind, the argument of wattage seems analogous to saying, “There is no way a low wattage LED bulb will ever outshine a high wattage filament bulb.” I have assumed that we’ve been able to make leaps and bounds in CPU technology since the dawn of computers while also reducing power consumption.

But maybe there is some critical information I am missing here. I’m certainly no expert and would love to hear more about why the wattage aspect holds weight.

replies(2): >>25060209 #>>25060529 #
12. kridsdale1 ◴[] No.25059931{3}[source]
Perhaps the answer is that x86 is far more general purpose than that. Many of the instructions are as old as the Millennial workforce at Intel that builds the chips, from an era when the web wasn't a concern and GUIs were uncommon. A vast number of transistors are dedicated to a different optimization than what the software of this decade cares about.

OTOH, ARM and Apple have tailored their chips to the workload of the last ~10 years (running JavaScript at single-digit watts), which is far closer to what actually gets used these days.

13. mrtksn ◴[] No.25059934{3}[source]
Anything that has to do with audio, video, graphics processing, and so on.

There are videos on YouTube demonstrating that you can edit large video files on an iPhone connected to an external monitor, and do it more smoothly than on a much larger PC.

Here is an example of an iPhone SE strapped to an external screen, editing 4K footage: https://www.youtube.com/watch?v=LmbrOUPFDvg

Notice how smooth everything is.

replies(1): >>25066699 #
14. mrtksn ◴[] No.25059977{3}[source]
Why without any help from the Neural Engine etc.? They, the workload-specific processors, are included in the M1.
replies(2): >>25060044 #>>25060255 #
15. klelatti ◴[] No.25060044{4}[source]
They are but they don't account for the CPU performance jumps being quoted - these are the result of better standalone CPU performance.

Your parent comment seemed to imply that the leap was due to an architecture shift like CPU -> GPU. It's not, it's just better CPU design.

replies(1): >>25060179 #
16. dev_tty01 ◴[] No.25060075[source]
> There is no way that the A14 (nor the M1) is going to be faster in any raw performance category than a latest generation and top-spec desktop system.

Well, no point in arguing here. You may be right, but the machines will be in the hands of users next week. It would be stupid for Apple to make those claims if they weren't true. We'll see soon enough.

Assuming the claims are true, we shouldn't forget that Intel per core performance improvements have been incremental at best for several years. They've really run into some major problems with their fab process development in recent years. TSMC (Apple Silicon foundry) is well ahead. It has been kind of hard to watch since that has historically been such a strength for Intel. They're a strong company, they'll get it together.

replies(1): >>25063143 #
17. mrtksn ◴[] No.25060179{5}[source]
Well, there's no standalone CPU to talk about, is there?

Purpose built architecture could mean many things, like having efficient cores and high performance cores, codec specific hardware, the way that the memory is accessed, cache configuration, co-processors, signal processors.

Everything counts.

18. johncolanduoni ◴[] No.25060209[source]
So there is technically such a known limit due to a link between information-theoretic entropy and thermodynamic entropy, which provides a lower bound on energy usage for a particular digital circuit via the second law of thermodynamics. In simpler terms, there is an unavoidable generation of heat when you "throw bits away" like AND and OR gates do. However we are several orders of magnitude away from that efficiency bound in today's chips, so your analogy to LED bulbs is more apt than you may realize: LED bulbs are still far away from their theoretical maximum efficiency, but they're still a massive improvement over incandescent bulbs.

If you want to know more about this limitation, I suggest looking at a way of organizing computation that avoids this issue called "reversible computing"[1]. As I said, it won't be of practical significance for classical computing for a long while, but it's actually pretty fleshed out theoretically.

[1]: https://en.wikipedia.org/wiki/Reversible_computing
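The "several orders of magnitude" claim is easy to put into rough numbers. A back-of-envelope sketch of the Landauer bound, assuming room temperature and an entirely made-up erasure rate for a hypothetical 5 W chip (the clock and bit counts are illustrative, not measurements of any real part):

```python
import math

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

landauer_joules_per_bit = k_B * T * math.log(2)   # ~2.87e-21 J

# Hypothetical workload: 3 GHz clock, ~1e9 bit erasures per cycle
# (assumed figures for illustration only).
bits_erased_per_second = 3e9 * 1e9
minimum_power_watts = landauer_joules_per_bit * bits_erased_per_second

print(f"Landauer limit: {landauer_joules_per_bit:.2e} J/bit")
print(f"Thermodynamic floor for this workload: {minimum_power_watts:.4f} W")
```

Even with these generous assumptions the floor comes out in single-digit milliwatts, far below a 5 W budget, which is the point made above: today's chips are nowhere near the thermodynamic limit.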

19. wmf ◴[] No.25060255{4}[source]
Plenty of workloads like SPEC and compiling don't use GPUs, neural networks, etc. They just use CPU cores, cache, and memory. Fortunately Apple has gotten the basics right and they have also added accelerators.
replies(1): >>25060897 #
20. ◴[] No.25060307[source]
21. muro ◴[] No.25060384[source]
Is it sustainably faster, or only for a few seconds until it throttles? Searching for "10900k vs i7-1165G7" showed the 10900k as mostly faster with a few exceptions.

I'm curious about next week's launch and benchmarks. Apple's claims compare it to a 1.2GHz i7, which I expect to throttle quickly. That's why I also expect the parent comment to be right: current desktop CPUs will still be faster.

replies(1): >>25061190 #
22. muro ◴[] No.25060398{4}[source]
the question is "for how long" - if it heats up and throttles after a few seconds, desktop CPUs will still have a major advantage for longer tasks.
replies(1): >>25060670 #
23. p3ndrag0n ◴[] No.25060410[source]
> Regardless of whether it's an intentional trick or an oversight, I don't think that the benchmark showing the mobile chip is better than a desktop chip in RAW PERFORMANCE is true. And that means that a lot of the conclusions that they draw from the benchmark aren't true. There is no way that the A14 (nor the M1) is going to be faster in any raw performance category than a latest generation and top-spec desktop system.

The benchmark is true but misleading. It compares 'Intel vs Apple Top Performance', meaning essentially the maximum speed each could hit. It is not a real-world number and exists purely in a vacuum. If your phone ran at that speed for an extended period of time, I guarantee it would melt. I think the only conclusion to be drawn is that Apple's mobile CPUs are very capable and well designed, and that ARM has a lot of untapped potential.

24. reaperducer ◴[] No.25060485[source]
THERE IS ABSOLUTELY NO WAY THAT A 5W PART LIKE THE A14 IS FASTER THAN A 100W PART LIKE THE i9-10900k!

An A14 is both faster and lower power than a 6502.

Also, why are you shouting? It's just computers. It's not important.

replies(2): >>25063564 #>>25092762 #
25. yazaddaruvala ◴[] No.25060529[source]
(Apologies I’m not sure what your educational background is)

Basically, higher wattage makes chips create more heat in a shorter time.

Heat in general is destructive, as with cooking, camp fires, or a car that “overheats” and stops working. Silicon chips are very detailed, and any small change could make them stop working, so the heat applied needs to stay below some threshold (i.e., don't let the chip get too hot).

If the chip needs more wattage it creates more heat, and that heat needs a “heat sink” and fan to protect the chip from degrading.

Heat sinks and fans require a lot of space and high surface-area-to-volume ratios. Take a look at the PS5 teardown: 90% of the insides are heat sink. Laptops and phones don't have a lot of room for heat sinks or fans.

Therefore, if the chip can use less wattage it will get less hot, meaning it can work better in fanless devices like the MacBook Air, the iPhone, and the iPad.
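The wattage-to-heat chain described above can be sketched with the standard first-order thermal model: steady-state die temperature is ambient temperature plus power times thermal resistance. All the theta values below are made-up illustrations, not measured figures for any real device:

```python
def junction_temp(power_w: float, theta_c_per_w: float, ambient_c: float = 25.0) -> float:
    """Steady-state die temperature for a given power draw and cooling solution.

    theta_c_per_w is the thermal resistance of the cooling path in deg C per watt:
    high for a passive, cramped enclosure; low for a big heat sink with a fan.
    """
    return ambient_c + power_w * theta_c_per_w

# 5 W in a fanless enclosure (high assumed theta): stays comfortable.
print(junction_temp(5, theta_c_per_w=8.0))    # 65.0

# 100 W in the same fanless enclosure: wildly over any safe threshold.
print(junction_temp(100, theta_c_per_w=8.0))  # 825.0

# 100 W with a desktop tower cooler (low assumed theta): fine again.
print(junction_temp(100, theta_c_per_w=0.3))  # 55.0
```

This is why the same 100 W part that is fine in a tower simply cannot live in a phone or a fanless laptop: the enclosure sets theta, and theta sets the power budget.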

26. DeRock ◴[] No.25060670{5}[source]
2/3 devices announced yesterday come with active cooling (MacBook Pro 13" and Mac mini).
replies(1): >>25061530 #
27. mrtksn ◴[] No.25060897{5}[source]
I would guess that the tight proximity of the components, as well as the way their communication is designed, also brings something to the table. People used to complain that the RAM is soldered to the board; now it is part of the same package.
28. gamblor956 ◴[] No.25061083[source]
They thought the iPhone was "impossible" in the sense that it couldn't offer everything it claimed to offer without having terrible battery life.

And they were absolutely correct: battery life on the original iPhone was abysmal. But it turns out that consumers didn't care.

replies(2): >>25063007 #>>25064282 #
29. hombre_fatal ◴[] No.25061190{3}[source]
You can see a single-thread benchmark for both of those here: https://www.cpubenchmark.net/singleThread.html (comparing desktop vs laptop tabs), they get a similar score.

The perf vs power charts on that website also put to rest the mistake of thinking perf simply increases with watt consumption.

As for performance vs heat, well, you'd expect even better results from the chip consuming much less power. How does that 100W chip perform with a phone-miniaturized heat sink? Or the power-sipping chip with a double-tower fan cooler?

replies(1): >>25067535 #
30. barkingcat ◴[] No.25061489[source]
at the end of the article:

"This moment has been brewing for years now, and the new Apple Silicon is both shocking, but also very much expected. In the coming weeks we’ll be trying to get our hands on the new hardware and verify Apple’s claims."

Anandtech can't say either way yet, since they are also going by the marketing data. Once they benchmark it for real they'll be able to prove it one way or the other, but your statement is premature.

replies(2): >>25062112 #>>25063358 #
31. barkingcat ◴[] No.25061530{6}[source]
Apple active cooling systems are also not that great, with the six core MBP15" heating and throttling when it was first released, and the MBP 16" also having heating/throttling issues.

The Mac Mini also isn't a paragon of active cooling. I've worked with one of the current-gen Intel Mac Minis, and that thing gets really hot! Like 60-80 degrees Celsius. The insides must be cooking if the outside is that hot.

The 2013 Mac Pro also had heating design issues, only corrected with the current gen Mac Pro.

I'd say active cooling is a consistent weakness of the entire modern Mac hardware design division.

replies(1): >>25065542 #
32. zepto ◴[] No.25062112{3}[source]
If my statement is premature the parent’s all caps statements are barely more than a zygote.
33. donor20 ◴[] No.25063007{3}[source]
I used / bought the original iPhone and thought it was fine, and the phone was magic.

The next one or two, though, had really bad battery life (iPhone 3G?). I mean 3-5 hours of active use, down from 7-8 (which meant you usually needed to charge it in the morning or evening).

Remember, the original iPhone had data, but it was pretty slow (still amazing, though).

replies(1): >>25063174 #
34. alwillis ◴[] No.25063022[source]
I think you should prepare yourself for quite a shock.
35. oblio ◴[] No.25063143[source]
Apple is the company that was cherry picking benchmarks for YEARS as PowerPC was being crushed by Intel. Apple has been making false or at least misleading claims forever.

Not all of them, mind you, but you need a boulder of salt.

replies(2): >>25065480 #>>25066193 #
36. oblio ◴[] No.25063174{4}[source]
Well, you overlooked the battery side because of other features. But you have to remember that battery life for the average phone back then was measured in tens of hours. I used to leave for the weekend with my Nokia without a charger. A decade later and smartphones aren't there yet. We've just learned to accept the pain :-)
replies(1): >>25066658 #
37. parsimo2010 ◴[] No.25063358{3}[source]
That comment was about the M1 chip, which hasn't been benchmarked outside of Apple. The A14 has been around long enough (reviewers have had them for at least a few days) for people to run the (flawed/misleading) benchmarks on, which is the chip I was talking about in my original comment.
38. trimbo ◴[] No.25063564[source]
Same as a Cray X-MP! That baby used 345 kilowatts.

But comparing against 40 year old technology has nothing to do with comparing against current offerings.

We'll just have to wait a week to see how it fares compiling Chrome.

replies(2): >>25063751 #>>25063754 #
39. npunt ◴[] No.25063751{3}[source]
While it's Intel's latest, the i9-10900k is hardly a current offering: it's yet another spin of Skylake, a 5-year-old CPU design, built on a variant of Intel's 14nm process, a 6-year-old litho node.

The i9 has a density of ~44 MTr/mm2 versus the M1's ~134 MTr/mm2 (~3x).

The i9 has ~9.2B transistors, compared to the M1's 16B (~174%).

The M1 is two generations ahead on lithography and has a more sophisticated CPU design than Intel. It'll do fine.
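The ratios quoted above check out if you plug in the figures as given in the comment:

```python
# Figures as quoted in the comment above (approximate public numbers).
i9_density, m1_density = 44.0, 134.0          # million transistors per mm^2
i9_transistors, m1_transistors = 9.2e9, 16e9  # total transistor counts

print(f"density ratio: {m1_density / i9_density:.1f}x")           # ~3.0x
print(f"transistor count: {m1_transistors / i9_transistors:.0%}") # ~174%
```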

40. MrRadar ◴[] No.25063754{3}[source]
> Same as a Cray XMP!

That reminds me of this classic bit of technology humor, the Apple Product Cycle[1]. It doesn't ring quite as true today as when it was first posted, but the broad strokes are still similar. Specifically it appears we're on the stage where "The haters offer their assessment. The forums are ablaze with vitriolic rage. Haters pan the device for being less powerful than a Cray X1 while zealots counter that it is both smaller and lighter than a Buick Regal. The virtual slap-fight goes on and on, until obscure technical nuances like, “Will it play multiplexed Ogg Vorbis streams?” become matters of life and death."

[1] https://web.archive.org/web/20061028040301/http://www.mister...

41. kergonath ◴[] No.25064113{3}[source]
So you have no idea why, but you are certain they are wrong? There are several sibling comments with lots of reasons why a lower-power core could perform better than an outdated desktop one.
42. dreamcompiler ◴[] No.25064162[source]
The RAM is inside the M1 package. That has to make a huge difference in memory access time; it probably saves about a nanosecond in each direction just because it's closer to the CPU. There's probably other stuff going on like little or no microcode compared to the x86 ISA. So yeah, it's plausible that the M1 is really faster in absolute terms than a desktop PC.
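The proximity argument is easy to sanity-check. A rough sketch, assuming signals on a PCB travel at about half the speed of light and using illustrative (not measured) trace lengths:

```python
# Back-of-envelope propagation delay: on-package DRAM vs. a DIMM on the board.
c = 3e8                  # speed of light, m/s
v = 0.5 * c              # assumed signal propagation speed on PCB traces

dimm_trace_m = 0.08      # ~8 cm CPU-to-DIMM trace (assumed)
on_package_m = 0.005     # ~5 mm inside the package (assumed)

saving_ns = (dimm_trace_m - on_package_m) / v * 1e9
print(f"one-way propagation saving: {saving_ns:.2f} ns")
```

This lands at roughly half a nanosecond each way, the same order as the estimate above, though propagation is only one contributor to DRAM latency alongside the memory controller and signal-integrity margins.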
43. rconti ◴[] No.25064282{3}[source]
I don't have the stats, but I know for a fact I've never owned an iPhone (back to the original) that needed to be charged midday. Not much more matters to the average user.

And I'm sure there are power users who killed their original iPhone by noon in 2007, and I'm sure there are power users who do the same today.

replies(1): >>25065337 #
44. nvrspyx ◴[] No.25065337{4}[source]
This was in the time of dumbphones, which had battery life measured in days, not hours. Compared to cell phones back then, needing to charge your phone once a day was relatively abysmal. Even BlackBerrys didn't require daily charging back then, IIRC.
replies(1): >>25066732 #
45. m463 ◴[] No.25065460[source]
power heat perf - pick one

desktops allow all of them

46. fomine3 ◴[] No.25065480{3}[source]
To be fair, all consumer chip companies' PR is hype-ish. Anyway, we should wait for independent benchmarks rather than the official ads.
47. fomine3 ◴[] No.25065542{7}[source]
Apple's cooling design isn't great for cooling (maybe great for looks?), but it should be an advantage for Apple Silicon compared to the Intel CPU in the same Mac.
replies(1): >>25067898 #
48. snowwrestler ◴[] No.25066193{3}[source]
PowerPC chips had some advantages over Intel, which was why they were used to create supercomputers for a while, and were picked to power the first Xbox.

PowerPC’s biggest flaw was power efficiency, a situation which became critical as Apple’s sales skewed toward laptops. The G5 was a beast but it ate power like a beast too; it was never going to work in a laptop, and so Apple had to switch.

replies(1): >>25069040 #
49. photojosh ◴[] No.25066658{5}[source]
Put it on low power mode and only use it to make a few calls and send a few texts and an iPhone will easily last a weekend. The real issue is that many people actually use their phones constantly, and it's not just for calls/texts.
50. photojosh ◴[] No.25066699{4}[source]
I thought the mouse support was iPad only... thanks for the TIL.
51. rconti ◴[] No.25066732{5}[source]
For sure. I remember my Ericsson T39 was multi-days, and I think the extended battery (which made it thicker) took it to... 9 days? Maybe more? Insane. Of course, the screen was never on because there was nothing to do with it :)
52. muro ◴[] No.25067535{4}[source]
Agreed, I think you make different design decisions: a mobile CPU is not just an underclocked desktop CPU. Apple also introduced a new CPU, not just put an A14 inside.

I'm looking forward to benchmarks and seeing how well the new machines work: compiling speed, Lightroom, responsiveness, how well Chrome works with a max of 16GB RAM :).

53. the_lucifer ◴[] No.25067898{8}[source]
I mean, the whole point of Apple Silicon is that, since it's all made by Apple, they can control how much heat their MacBooks generate now.
54. noisem4ker ◴[] No.25069040{4}[source]
It's the Xbox 360 which had a PowerPC CPU, named Xenon. The first one used a standard Intel x86 chip.
55. ksec ◴[] No.25092762[source]
>Also, why are you shouting?

Exactly. It is very annoying, especially coming from recently registered accounts. Instead of posting a question asking why, so I could have explained it to him like he was 5, it is now a massive rant.