This is the only thing stopping me from getting a Framework laptop right now. I'd pay a premium for it as well.
- no Management Engine
- chips that don't turbo boost themselves into throttling
- not supporting a company with a toxic approach to business
I believe AMD outperforms Intel when you're targeting mobile performance/battery life, rather than "moar CPU" workloads. Though that might change now that Intel is using their own approach to performance cores. Still, given the last decade of Intel development, they don't exactly have my trust that they'll execute performance cores without serious hiccups.
Better integrated graphics, especially with the upcoming line, if what AMD says holds true.
Non-toxic approach to business.
Dr. Lisa Su has done incredible things with that company, and I'll happily support a group that recognizes the need for experience in top tech positions vs. MBAs/Lawyers/Fund Managers/etc...
The modularity of some components can be assumed because they are industry standards, like wifi modules, I suppose. For other components, Framework has perhaps designed its own range of modules with a common form factor, but it must be very expensive to engineer a compatible mainboard in the same form factor with a different chipset, unless they are using an existing standardized design.
Mainly it's just out of principle and voting with my wallet.
AMD is important for multiple reasons.
First, it shows that they listened to feedback. From way over here in the corner it seems like AMD has been the most requested feature for the Framework.
Second, many people perceive that AMD outperforms Intel.
Third, many people think it is extremely important to reward positive competition in the market place.
Eighth, it would truly, truly prove the upgradeability and versatility of the Framework. Then we could move on to imagining dual^H^H^H^Hquad-Arm boards and RISCV boards and other fantasies.
I would argue one of the most glaring problems with selling Framework laptops was that they were "still" on Intel 11th Gen hardware, which is often perceived as "not so great" of a choice.
I'm sure they would love to also ship AMD-based mobos (and Arm too), but it needs to be profitable, i.e. the additional sales gained through also supporting AMD must outweigh the higher logistics cost as well as the higher development cost. This might not seem like a big deal, but from the little experience I have with logistics, and considering things like maintaining both Intel and AMD BIOS support while still facing pressure to also ship a faster Intel motherboard, I highly doubt it makes sense at this point in time.
Also, yes, many people perceive that AMD outperforms Intel, but many also perceive the opposite! Sure, competition is great, but Framework is not yet a well-established company. Lastly, I don't think they need to technically prove that upgrading to AMD or ARM is possible; the problem is not technology but logistics, resources (BIOS maintenance, testing, etc.), supply chains, and potentially shitty contracts and practices by Intel (and other companies).
So IMHO they need to first establish themselves well, and then branch out.
I bought an ASUS ZenBook earlier this year because as much as I like Framework's product, I don't want to give Intel another dollar after they bent me over a barrel for a decade.
Unfortunately it seems the pendulum swings on this one, at least a bit. Unless you want a flagship CPU, you'll wait a good half year to a year for half as much choice of budget CPUs, often with a rather extreme handicap (less cache).
Also half of them are OEM only.
Try to find a good current gen CPU for a small to mid sized NAS in their lineup, it's not easy.
https://media.ccc.de/v/36c3-10942-uncover_understand_own_-_r...
Not the GP, but here's my reason:
For dGPUs, I strongly prefer AMD over nVidia because of Linux driver support. In recent years, most laptops with an AMD dGPU have AMD CPUs.
It's possible that my calculus will change in the next few years. E.g., if any of these things come to market:
- good laptop with Intel CPU and AMD dGPU
- AMD CPU with a fast iGPU. (I know these are in the pipeline, but I'm waiting for benchmarks.)
- Intel's upcoming laptop iGPUs / dGPUs perform well and have good Linux drivers.
- nVidia's efforts to open-source parts of their Linux drivers address my personal pain points.
I have heard of these things before, but I'm not quite sure what the possibilities are. Do you have a link that summarizes what this actually means in terms of security concerns?
https://www.cpu-monkey.com/en/compare_cpu-apple_a14_bionic-v...
https://www.cpu-monkey.com/en/compare_cpu-apple_a14_bionic-v...
For people who need to use their devices on the go, I think it's a no-brainer to prefer a Ryzen 6000 over Intel.
The RDNA2-based Radeon 680M iGPU also significantly outperforms the (admittedly, much improved) Intel Xe iGPUs in 3D rendering. In synthetics, the new Radeon iGPUs are going head to head with Nvidia 1650 Max-Q dGPUs. This probably doesn't much matter if you aren't doing any gaming, but if you are, it means you can play most modern titles reasonably on the road in a thin and light form factor without giving up any battery life when you aren't.
Even if you want a flagship CPU: see, e.g., the newest 5xxx-series Threadrippers, which were only released after a year and a half, and even then are only available in overpriced e-waste systems from Lenovo where the CPU is locked down to the motherboard and won't work anywhere else.
AMD is not your friend. Just like every other huge corporation.
> where the CPU is locked down to the motherboard
Don't quote me on this, but I think I heard that this wasn't on by default?
https://www.cpu-monkey.com/en/compare_cpu-apple_a15_bionic_5...
There was a good thread here the other day on the subject of ARM hardware and the difficulties of things such as device trees and odd boot processes
Your level of understanding about how CPUs control their frequency, voltage, and power is evidently "none". Why spread comments like this which only serve to confuse and mislead readers?
Which is why you should reward behavior and not branding. Buy because they're doing/selling the right thing now, not because you've got loyalty towards a multinational conglomerate.
One signal I want to send, for instance, is "I buy from whoever has good Linux support." If you stop supporting it well, I look for the competition.
Still "no" level of understanding? If there's something incorrect about my statements, feel free to correct me -- I do want to learn more, and I'm certainly no expert in CPUs. But it's just flat out rude (and against the contributor guidelines, I believe) to comment like this. Build other people up, don't tear them down.
Yes, this is optimal and what literally everyone wants.
An airplane takes off at full power, reaches cruising speed, and reduces power to maintain cruising speed.
A CPU uses max power until it reaches its max operating temperature, then it maintains that temperature operating at lower power.
Why does the latter offend you when it's exactly the same as the former?
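That "max power until the thermal limit, then hold the line" behavior can be sketched as a toy control loop. All the numbers here are made up for illustration; real CPUs use far more sophisticated power management than this:

```python
# Toy model: run at max power until the die hits its thermal limit,
# then drop to a sustained power budget that holds temperature steady.
MAX_POWER_W = 45.0        # hypothetical boost power
SUSTAINED_POWER_W = 15.0  # hypothetical long-term power budget
T_LIMIT_C = 100.0         # junction temperature limit
T_AMBIENT_C = 25.0
HEAT_PER_WATT = 1.2       # degrees gained per watt per step (invented)
COOLING = 0.3             # fraction of excess heat shed per step (invented)

def simulate(steps: int) -> list[tuple[float, float]]:
    """Return a (power, temperature) pair for each time step."""
    temp = T_AMBIENT_C
    trace = []
    for _ in range(steps):
        # Boost while cool, back off once the limit is reached.
        power = MAX_POWER_W if temp < T_LIMIT_C else SUSTAINED_POWER_W
        temp += power * HEAT_PER_WATT * 0.1           # heating from the package
        temp -= (temp - T_AMBIENT_C) * COOLING * 0.1  # cooling toward ambient
        trace.append((power, temp))
    return trace

trace = simulate(200)
```

In this sketch the simulated CPU boosts for a while, crosses the limit, then settles into an oscillation around it at the lower power level, which is roughly the steady-state behavior the comment describes.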
> Still "no" level of understanding?
Sadly, yes.
> don't tear them down.
This conversation started with you tearing down thousands of expert electrical engineers who make Intel CPUs.
A better analogy:
An airplane takes off at full power, reaches cruising speed, but its engines have overtaxed themselves and can't maintain altitude. The plane descends to a suboptimal altitude until the engines recover, then climbs back to the altitude it's supposed to cruise at.
Your CPU explanation is technically correct:
> A CPU uses max power until it reaches its max operating temperature, then it maintains that temperature operating at lower power.
Yep, this is a very high-level explanation of what CPUs do. The trouble with Intel processors today is that they use max power for too long, and have to throttle so heavily to "maintain that temperature operating at lower power" that you can notice the latency when the CPU downclocks. An ideal operating curve wouldn't use max power for so long that it causes obvious latency issues to an end user.

That's why I have Turbo Boost disabled on my laptops -- the few seconds of "max power" it yields just aren't worth the massive downclock while the CPU cools down. Better to set a more conservative power level that doesn't get in my way. This is especially noticeable if you use emulation or a beefy IDE like Android Studio that turbo boosts your computer to a high temperature in the first few seconds of use, then turns text editing and code suggestions into a sluggish slideshow for the next few minutes because the CPU has downclocked. Or maybe I'm just imagining that?
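For anyone who wants to try the same thing: on Linux with the intel_pstate driver, turbo can be toggled through sysfs (the path only exists on Intel systems using that driver, and writing requires root; this is a sketch of one common setup, not universal advice):

```shell
# Disable Turbo Boost (intel_pstate driver, Intel CPUs on Linux)
echo 1 | sudo tee /sys/devices/system/cpu/intel_pstate/no_turbo

# Check the current state: 1 = turbo disabled, 0 = turbo enabled
cat /sys/devices/system/cpu/intel_pstate/no_turbo

# On the generic acpi-cpufreq driver the knob is inverted:
# echo 0 | sudo tee /sys/devices/system/cpu/cpufreq/boost
```

Note this setting does not persist across reboots; you'd need a udev rule or a systemd unit to reapply it at boot.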
> This conversation started with you tearing down thousands of expert electrical engineers who make Intel CPUs.
Did I say anything bad about the engineers? I have lots of disparaging things to say about the way Intel works as a business, mostly based around how product and sales operate. I think the engineers at Intel do the best they can under the constraints of a poorly run company. But there's a reason engineering talent has been fleeing for the better part of a decade.