—————
What does not work: Keyboard, mouse, TB & USB-C ports, thermal/freq mgt.
Conclusion: Highly recommended
For what it's worth, the majority of mechanical RGB keyboards and mice are USB-A anyway, so if you're fine with a very powerful machine that won't have working internal keyboard support for a few weeks, it sounds like good advice anyway!
That’s actually a pretty big escape hatch for early development. It explains how you’d be able to get past having a nonfunctional keyboard pretty easily, for example.
Everyone is buying the tool that does the job, or building that tool if they want to make that large investment...
Honestly, when my current Helix 2 finally starts to die on me, I'll be looking for a tablet or hybrid replacement, since I neither want nor need an attached keyboard+mouse anyway; in my normal usage they're mostly just something that takes up desk space.
Obviously there are also plenty of people with preferences entirely incompatible with this approach, but so it goes.
I'm updating my wiring and air conditioning for a 7x5090 workstation because having that power for experiments under the desk is worth the cost (and fire hazard).
If I had to build 10,000 of those I'd be banned by NVidia from ever buying their hardware again.
Is it even possible to use Linux on desktop without ever having to edit config files or run commands in the terminal?
Also of note, even the most premium keyboards and mice are Full Speed USB 1.1, running at up to 12Mbps. You can verify this yourself via the Apple menu > About This Mac > System Report > USB: look for your external USB keyboard or mouse and check its reported speed.
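If you'd rather grab that from a script than click through System Report, here's a minimal Python sketch of the same check. It assumes macOS (where the built-in system_profiler tool is available) and relies on the usual text layout of its SPUSBDataType output, so treat it as a rough illustration rather than anything robust:

    # Rough equivalent of Apple menu > About This Mac > System Report > USB.
    # Assumes macOS and the usual SPUSBDataType text layout, where device names
    # end with ":" and each device has a line like "Speed: Up to 12 Mb/s".
    import subprocess

    report = subprocess.run(
        ["system_profiler", "SPUSBDataType"],
        capture_output=True,
        text=True,
        check=True,
    ).stdout

    current_device = None
    for raw_line in report.splitlines():
        line = raw_line.strip()
        if line.endswith(":") and "Bus" not in line:
            current_device = line.rstrip(":")  # heading for a device entry
        elif line.startswith("Speed:") and current_device:
            print(f"{current_device} -> {line}")  # e.g. "Magic Keyboard -> Speed: Up to 12 Mb/s"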
Compare that 12Mbps to the USB-C USB4 ports, which are capable of up to 40000Mbps. And, to be full-featured, they need to accept up to 100W or more of power input, as well as provide output well in excess of what a USB-A USB3 5000Mbps or even 10000Mbps port would require. Which is to say, for the cost of a single USB-C port, a manufacturer can easily provide 4 or more USB-A ports, with change to spare. That would avoid unnecessary adapters and improve compatibility.
Not to mention that most memory sticks are still USB-A, too, and there are no Fit-style USB sticks for USB-C at all, only for USB-A. Which means it's far easier to semi-permanently extend the storage of a USB-A laptop than of a USB-C one, which you may want to do to try out a system without messing up your main Windows installation.
It's basically a jab at Apple's decision to remove a useful port, especially on the M4 Mac mini, where the USB-C ports aren't even all fully featured anymore: the ports at the front have a different spec than the ones at the back. We all remember that now, right after the announcement, but imagine dealing with it several years down the line, when you'll have to look up specs and troubleshoot why things aren't working the way you expect. They could have easily avoided that by NOT including the 2 slow USB-C ports and including something like 4 USB-A ones instead.
The point was that even the more "premium" products are still USB-A, not USB-C.
USB-A simply isn't going anywhere.
Personally, I find USB-A more useful than HDMI, since HDMI is kind of inferior to USB-C in every possible way. I've tried using a 43" UHD TV as a monitor, since they're as cheap as $149.99 USD brand new, but it had noticeable delay even at 4k@60Hz and just didn't feel right. The UHD resolution at 43" itself would actually be perfect, since 1080p at 20.5in used to be my favorite a few years ago (before QHD at 23" started reaching the sub-$200 range), but, alas, the specs of a proper monitor (brightness, matte finish, USB-C with PD and a hub) make it better suited as a display than a TV is, even if the resolution itself may seem ideal.
For mobile Linux in particular, I've found that it's quite the opposite: I see projects like Phosh and KDE Plasma Mobile constantly showing UI and UX improvements (albeit at a slower pace than the desktop projects), while basic hardware support remains non-functional.
Of course I'm not expecting every UX/UI dev to abandon their project to jump into low-level kernel development and bring support for more devices, but it feels like the desktop environments are developing for a device that doesn't exist.
> Is it even possible to use Linux on desktop without ever having to edit config files or run commands in the terminal?
On a modern Linux distro (that isn't one of the "advanced" ones), the answer is yes. If you install something like Mint or Ubuntu, you have a graphical app store and driver manager (which AFAIK you only need for NVIDIA GPUs).
Often they're both, with detachable cables. Again, it was just weird that you brought up RGB specifically, something not at all central to your original point, a gimmick marketed at 'gamers' that's pointless in a well-lit room; as well as mechanical keyboards specifically; and that you've carried on like this in reply. Please don't take the time to reply again.
This is typically due to default settings on TVs enabling various high-latency post-processing options.
Most TVs have "game mode" settings that disable all these at once, at which point latency is comparable to monitors.
Case in point: at both 60 Hz and 120 Hz*, 4K latency on my LG C4 is within a millisecond of the lowest-latency monitors listed here:
https://www.rtings.com/monitor/tests/inputs/input-lag
I fully agree that HDMI is inferior to USB-C, if only because quality USB-C-to-HDMI adapters are widely available, and mini/micro HDMI connectors commonly used on small devices (not including this laptop) are garbage.
* Probably also true at 144 Hz, but the linked table doesn't have a dedicated column for 144 Hz, complicating comparisons.
Which is what a lot of users have.