—————
What does not work: Keyboard, mouse, TB & USB-C ports, thermal/freq mgt.
Conclusion: Highly recommended
No idea about power consumption.
It still scares me.
I basically did the following on trunk:
$ CPP=/usr/bin/clang MAKEOBJDIRPREFIX=/private/var/tmp/obj ./tools/build/make.py TARGET=arm64 TARGET_ARCH=aarch64 --host-compiler-type clang --debug -j17 --clean buildworld
You can probably follow build(7) from FreeBSD hosts instead. NetBSD is similar, but you need to edit `tools/llvm/Makefile` and make sure that you use the following target for `support-modules` instead:
support-modules: module-test.cpp Makefile
- if ${HOST_CXX} -stdlib=libc++ -c -fmodules -fcxx-modules -fmodules-cache-path=./module.cache \
- ${.CURDIR}/module-test.cpp > /dev/null 2>&1; then \
- echo HOST_SUPPORTS_MODULES=yes > ${.TARGET}; \
- else \
- echo HOST_SUPPORTS_MODULES=no > ${.TARGET}; \
- fi
+ # Just don't use pre-C++20 modules. Some compilers do not support them.
+ echo HOST_SUPPORTS_MODULES=no > ${.TARGET};
You can further speed up NetBSD builds by editing `share/mk/bsd.sys.mk` and removing the workaround for SunPro's cc. The repeated invocation of /bin/mv for each object file really does add up. I have not tried cross builds of OpenBSD from other operating systems.
For what it's worth, the majority of mechanical RGB keyboards and mice are USB-A anyways, so, if you're fine with a very powerful machine that won't have working internal keyboard support for a few weeks, it sounds like good advice!
ThinkPad X13s Snapdragon was fanless, but it's a bit old now, plus, only 2x USB-C, without any USB-A ports, and a screen that doesn't open 180°, unlike any other ThinkPad, meh.
The CPU is pretty fast as well. I did no real benchmarks, but C++ std::sort() on the Snapdragon runs just 10-20% slower than on my 4 year old Ryzen 5 5600X desktop. Also, the base model T14s comes with 32G of memory, which is very nice.
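The comparison above was C++ std::sort, not a formal benchmark. For anyone wanting a similarly rough single-core comparison between two machines, a throw-away micro-benchmark (sketched here in Python for brevity; best-of-N wall time so background noise is filtered out) could look like:

```python
import random
import time

def sort_throughput(n: int = 1_000_000, repeats: int = 5) -> float:
    """Return the best-of-N wall time (seconds) to sort n random floats."""
    data = [random.random() for _ in range(n)]
    best = float("inf")
    for _ in range(repeats):
        work = list(data)  # fresh copy so every run sorts unsorted input
        start = time.perf_counter()
        work.sort()
        best = min(best, time.perf_counter() - start)
    return best

if __name__ == "__main__":
    print(f"best sort time for 1M floats: {sort_throughput():.3f}s")
```

Run it on both machines and compare the two numbers; as with the comment's 10-20% figure, this only measures single-core integer/branch performance, not memory bandwidth or multi-core throughput.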
On the other hand, I dropped mine in the street, damaging the upper right corner of the display (physically intact, but dead pixels in the corner). Even though the case material is nice, the laptop seems to be more fragile than older Thinkpads. (I've dropped my T480 and T450 numerous times, and never had issues other than cosmetic.) So the $35 accidental damage protection was worth it.
A bit more works on the T14s Gen 6 too, such as the keyboard! ;-)
Besides all the crappy Linux desktop software today (I have been trying multiple recent distros on multiple new laptops... all the Linux desktop stuff now is buggy, features that were there 10 years ago are gone... it's annoying as hell), the ARM experience is one of being a second-class citizen. A ton of apps are released as AppImages or Snaps/Flatpaks, but they have to be built for both x86_64 and ARM64, and extremely few are built for the latter. Even when they are, they have their own bugs to be worked around, and there are fewer users, so you wait longer for a bugfix. The end result is fewer choices, worse compatibility, and worse support.
I love the idea of an ARM desktop. But it's going to cause fragmentation of available developer (and corporate/3rd-party) resources. ARM devices individually are even more unique than X86_64 gear is, so each one requires more integration. I'm sticking to X86_64 so I don't have to deal with another set of problems.
I wonder where Poul-Henning, who is based in Denmark, got that price. Perhaps he managed to get US pricing.
Lenovo EU are notorious for charging a ton of money for new models with limited supply. And poor after-market support, as everything is outsourced.
(Typing this on a T14-Gen5-AMD, under Linux, which is still not really stable, with the amdgpu driver crashing at least weekly)
I recently bought a Mac mini M4 to experiment with this setup, and am strongly considering getting a MBP if it works as advertised. As a longtime ThinkPad user and F/LOSS enthusiast, it feels awful giving money to Apple to run Linux as a second-class citizen, but honestly there is just no comparable hardware that does what Apple has accomplished.
I am as general purpose, regular person as you're going to find, in this world at least. I stare at a sentence like "In a functional programming language, everything is a function" and just blink. But a few months of blood and suffering to learn Nix/NixOS and I am managing the family's computers from a single repository and working faster than ever.
Short story: good but compatibility issues.
(the comment you replied to was clearly arguing quality rather than quantity, so that's what I'm asking too)
The usuals are there, like Libreoffice, though I use browser-based MS Office too. Firefox and all my plugins Just Work.
I do a ton of photo work with Darktable, which I have come to appreciate after years of fighting it. Writing tools. Software development tools. It's arguably overpowered for my needs, but that also translates into 16-hour battery life (less than macos, but plenty), dead quiet, and a machine that does everything I ask without complaint.
For the kiddo, it's mainly about configuring and locking down the machine ... and getting it back up and running quickly if he breaks something. I've been using off-lease, years-old Thinkpads for him. No games to speak of, but he's more of an xbox kid anyway. I should probably do parental controls, but I have that largely handled at the DNS level anyway.
That’s actually a pretty big escape hatch for early development. It explains how you’d be able to get past having a nonfunctional keyboard pretty easily, for example.
Anything CAD related, because there’s next to no professionally used CAD software on Linux.
Audio stuff. How many DAWs have a significant Linux user base?
And even beyond that, how many website devs are on Linux? Most people making product pages aren't on Linux, because not a lot of designers work on Linux, and it's better to have a monoculture of devices for office management.
And your question is what one would rather do on macOS/Windows rather than Linux which again is subjective even if I scope it to stuff you can do on both Linux and macOS and windows.
Flip that around, why would someone use Linux to develop when they could use macOS? Can you give a non-subjective answer? Then try the opposite.
Even if you’re developing for Linux deployments, you can still do most of it local and then spin up docker in the VM on demand.
The number of software developers who need to run a Linux VM on their Mac/Windows are a vast minority.
Happy with my gen 11 x1 carbon (the one before they put the power button on the outside edge like a tablet ?!?)
Everyone is buying the tool that does the job, or building that tool if they want to make that large investment...
* Inventor (Mechanical analysis)
* PLAXIS (Geotechnical finite element analysis)
* Aspen HYSYS (Chemical process simulation)
Probably a pile more that I don't know off the top of my head.
You should bring your distro fully up to date, and also make sure you have the latest amdgpu firmware files (most likely not yet available in your distro) https://gitlab.com/kernel-firmware/linux-firmware/-/commit/9...
Eventually I moved into VMWare Workstation, for GNU/Linux stuff on the desktop, with an aging Asus netbook on the side.
Nowadays, the netbook is dead, its use replaced by a tablet, and my desktops are Windows/WSL (for Linux containers only, started on demand).
At work our workstations are a mix of Windows and macOS, leaving GNU/Linux on the servers.
Not even x86 is 100% usable on laptops, meaning supporting every single feature, and late nights to fix stuff eventually gets old.
1: https://psref.lenovo.com/Product/ThinkPad/ThinkPad_T14s_Gen_...
On the other hand electronics CAD had been run mainly on Solaris decades ago, but for the last 20 years Linux has been the most likely host, including for the most expensive commercial professional solutions.
I have never heard of anyone using macOS for any kind of electronics design.
For the first time on Linux I feel better, like I am not just making sacrifices for my values, but like the actual whole all-around experience is better in most ways compared to my work Mac (an M2 Pro, so fans abound, and not as aesthetically pleasing as the Airs IMO). It's instantly snappy, I have a nice large SSD, I've already swapped out RAM, no issues with key software, I have a theme with a desktop experience I prefer over the Mac one, and I can go to a presentation and type without fans stressing me out. As someone in AI for a while, personally, I don't value GPUs or NPUs, but that would be a difference. That's really leaps and bounds over Linux from 2016 or 2010 on laptops.
Was planning on getting one.
Also, for Lenovo - https://psref.lenovo.com/ shows their products without marketing filler.
The only notable exception here is Apple with their absolutely bonkers RAM upgrade prices, which is why I would never buy a Macbook.
EDIT: I just HAD to look. MacBook Pro(ha!) comes with 16GB unified memory by default; it will set you back $400 to go to 32GB, so more than 4x what Lenovo takes (64GB not even possible, of course).
> M1 MacBook was 30 times faster at generating tokens.
Apples and oranges (pardon the pun). llama.cpp (and in turn LMStudio) use Metal GPU acceleration on Apple Silicon, while they currently only do CPU inference on Snapdragon.
It’s possible to use the Adreno GPU for LLM inference (I demoed this at the Snapdragon Summit), which performs better.
I still think it would be beneficial for us to keep memory swappable at all costs. And if the connector is the problem, they should address that, rather than just accepting these tactics that _enable_ manufacturers to set their own prices. I'm not saying they all do this, but plenty of them do, and Apple is the perfect example, like you say.
I'm using an X1 Carbon Gen 11, and for my purposes at least, it's an improvement over every previous generation.
I'd love to switch to a Framework one day, but I'm not willing to use a laptop without mouse buttons. (I don't care about the TrackPoint at all; I do care about having physical mouse buttons.)
https://en.wikipedia.org/wiki/Apple_M3#Variants
Similarly, the M4 Pro is available with 12 or 14 cores, the M4 Max with 14 or 16:
https://en.wikipedia.org/wiki/Apple_M4#Comparison_with_other...
What an absolute shitshow. I'm surprised Lenovo sells laptops in Europe with these prices.
Just to be clear, that applies to the base M4. M4 Pro and Max can go up. (To 128GB I think).
And I believe these are LPDDR5X 7500 on the base M4 model. So it is more expensive than Lenovo even though it is slower.
M4Pro and Max get much faster RAM though.
If Qualcomm continues to actually work on Linux, rather than let enthusiasts do all of the work for them like Apple, I think ARM on Linux is going to be all Qualcomm, with Macs yet again being a second-class citizen in the Linux world. For Windows, it's already a choice of "do I want to be forced into using Windows for a couple hundred dollars of savings".
Parallels has some glitches (graphical flicker when it runs low on guest memory, less than stellar trackpad scrolling) but is otherwise very stable. I like that I have access to Linux and macOS at the same time, the other side is just a 3 finger swipe away, and cut-n-paste and shared folders work. Sound and video all work, though for things like zoom calls I tend to use the macOS side. All runs happily together with 16GB RAM for each side (and I often have both xcode and android studio open on the macOS side while compiling large projects on the Linux side).
I use the top half of a Helix 2 with a Thinkpad Tablet 2 Bluetooth Keyboard because I'm one of the three people in the world who actively likes the optical trackpoint.
If you almost always use the machine on a desk rather than literally on your lap (for which I use the same keyboard paired with an 8" tablet) it's not even -much- additional hassle.
So maybe it wouldn't work for you, but "basically useless" is silly.
Honestly when my current Helix 2 finally starts to die on me I'll be looking for a tablet or hybrid replacement since I neither want nor need an attached keyboard+mouse anyway, in my normal usage they're mostly just something that takes up desk space.
Obviously there are also plenty of people with preferences entirely incompatible with this approach, but so it goes.
I went with the ASUS Zenbook. It's not perfect in terms of Linux drivers or support but they are built solidly. I would pick them again over Dell, HP or the Chinese rebrands.
I'm updating my wiring and air conditioning for a 7x5090 workstation because having that power for experiments under the desk is worth the cost (and fire hazard).
If I had to build 10,000 of those I'd be banned by NVidia from ever buying their hardware again.
The X1C6 had the potential to be a great laptop, except it was plagued by charging issues from the beginning and was limited to 16GB of RAM.
The X1Y6 is perfect; I can't find a single issue with it.
There hasn't been a decent one since the t440 and the only way to get that to a good standard was by modding the hell out of it.
The t61 was the last good one.
Framework is not quite at that quality but it's better than any other laptop made in the last 10 years.
When VAT incentivises people to essentially take their holidays outside the EU - not even incentivises, subsidises(!) - VAT's too high.
Yup, they are awesome; I'm on my second decade of driving them, and should be able to get another decade out of the supply.
Meanwhile, ARM is a complete disaster, a mess and mangled by profit considerations when looking at the complete lack of platform standardisation and issues around compatibility. Issues which require significant engineering effort to bring a single ARM device in line with any random x86-based device that came out this year.
Asus may have deplorable predatory customer service, but if I buy the thing from a local reseller they have to deal with that instead of me if something goes wrong, so it doesn't really affect me haha.
Actually I may be getting the numbers wrong, it could be the t430 that I was thinking about. It's been rather a long time since I did any brain surgery on thinkpads.
Is it even possible to use Linux on desktop without ever having to edit config files or run commands in the terminal?
I rewrote part of their camera stack once to find that they hadn't managed cache coherence for the MIPI DMA, and didn't connect the coherency domains to handle it in hardware. Ticket probably still not being worked in their support portal.
Very rarely I see a little horizontal strip of corruption in my camera photos and roll my eyes.
The additional cost is obviously not just from VAT.
I'm curious what you mean by that? Just that it's usually a large fraction overall? At least it's lit per pixel, instead of blasting LED strips and subtracting the light we don't want, as with LCDs.
Also of note, even the most premium keyboards and mice are Full Speed USB 1.1, running at up to 12Mbps. You can verify this yourself through the Apple menu, About This Mac, System Report, USB, and look for your external USB keyboard or mouse.
Compare that to the USB-C USB4 ports, which are capable of up to 40000Mbps. And, to be full-featured, they need to support 100W or more of power input, as well as power output well in excess of what a USB-A USB3 5000Mbps or even 10000Mbps port would require. Which is to say, for the cost of a single USB-C port, a manufacturer can easily provide 4 or more USB-A ports, with change to spare. That would avoid unnecessary adapters and improve compatibility.
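To put those line rates in perspective, the ideal-case arithmetic (ignoring protocol overhead, which in practice eats a good chunk of USB 1.1's 12 Mbps) works out as follows:

```python
def transfer_seconds(size_bytes: int, link_mbps: float) -> float:
    """Ideal (overhead-free) time to move size_bytes over a link of link_mbps."""
    return size_bytes * 8 / (link_mbps * 1_000_000)

GIB = 1024**3
usb11 = transfer_seconds(GIB, 12)       # Full Speed USB 1.1: keyboard/mouse-class link
usb4 = transfer_seconds(GIB, 40_000)    # full-featured USB-C USB4 port

print(f"1 GiB over Full Speed USB 1.1: {usb11 / 60:.0f} min")
print(f"1 GiB over USB4:               {usb4:.2f} s")
print(f"ratio: {usb11 / usb4:.0f}x")
```

The roughly 3333x gap is exactly why a 12 Mbps keyboard gains nothing from a 40 Gbps port.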
Not to mention that most memory sticks are still USB-A, too, and there are no Fit-style USB sticks for USB-C at all, only for USB-A. This means it's far easier to semi-permanently extend the storage of a USB-A laptop than of a USB-C one, which you may want to do to try out a system without messing with your main Windows installation.
It's basically a jab against Apple's decision to remove a useful port, especially on the M4 Mac mini, where the USB-C ports are no longer all fully featured — the ports at the front have a different spec than the ones at the back. We all remember that now, right after the announcement, but imagine dealing with it several years down the line, when you'll have to be looking up specs and troubleshooting your things not working the way you expect. They could have easily avoided it by NOT including the 2 slow USB-C ports, and including 4 USB-A ones instead.
This applies to pretty much every manufacturer. I worked in a computer store for many years, and every manufacturer had some models with high return rates, but also models with low/normal return rates.
Saying "I will never buy $x because I had a bad experience with a bad model" is almost always a mistake IMHO.
What does matter is how they deal with the bad models. Some brands were definitely a lot better than others, and I generally advised customers based on that.
Most laptops IME mostly get used on a desk whether with additional paraphernalia or not; 'laptop' gets used to describe the class of machine more than whether it's touching your legs or not.
If you meant literal in-lap use it probably would've been better to specify it, and if you didn't mean 'useless' entirely it would probably have been worth clarifying that since your questioning whether the recommendation was sarcasm rather suggested you -did- mean useless entirely.
Language is a pain, sadly, but I don't think lojban is going to win any time soon.
Web dev is painful on Windows. NTFS really can't deal with 50k+ files in node_modules; I think it's still something like 50-100x slower. Not to mention that Windows by default _still_ tries to index your entire node_modules and silently poops the bed. This is one of the main reasons behind WSL's popularity, but that only works if you don't have your code on a mounted NTFS volume. Make the mistake of reopening the same repo you started working on in Windows in WSL2, thinking that you'll see an improvement, and you'll be disappointed - silently. You have to move your React project out of NTFS entirely and let the virtualised Linux kernel manage those file descriptors if you want to get work done.
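The small-file pathology described above is easy to probe with a throw-away script. The absolute number is machine-dependent, but running the same probe on ext4 versus an NTFS or 9p/virtiofs mount shows the gap, since a node_modules-style tree is dominated by per-file metadata operations rather than raw bandwidth. A rough sketch:

```python
import os
import tempfile
import time

def small_file_churn(count: int = 2_000) -> float:
    """Time creating, stat-ing, and deleting `count` tiny files.

    Per-file metadata cost is what dominates node_modules-style trees,
    so this is the number that differs wildly between ext4, NTFS, and
    virtualised mounts. Point TMPDIR at the filesystem you want to test.
    """
    with tempfile.TemporaryDirectory() as root:
        start = time.perf_counter()
        for i in range(count):
            path = os.path.join(root, f"mod_{i}.js")
            with open(path, "w") as f:
                f.write("module.exports = {};\n")
            os.stat(path)
        for i in range(count):
            os.remove(os.path.join(root, f"mod_{i}.js"))
        return time.perf_counter() - start

if __name__ == "__main__":
    print(f"{small_file_churn():.2f}s for 2000 tiny files")
```

Setting the TMPDIR environment variable to a directory on the mount under test (e.g. a /mnt/c path inside WSL2 versus the ext4 home directory) makes the NTFS-versus-native comparison directly visible.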
macOS has a similar problem - not with NTFS, but with Docker / virtiofs. Web development is generally awesome on macOS if you're working natively, but if you want to work inside a devcontainer (to maintain consistency with prod) and depend on Docker, it slows down considerably. I've heard that OrbStack and Colima have recently made this much better on macOS, but I've not tried them recently. And other more serious software development scenarios, where you might want a local k8s environment, or you're writing lambda functions that you want to test locally? You have to use Docker and take the hit. On Linux it has always just worked (podman aside). Not to mention that Chrome's memory management is way better there than on Windows (thanks, ChromeOS).
For the rest of these please keep in mind that I explicitly said I was talking about quality of experience rather than quantity of people having to suffer through it, so the whole 'how many X people are on Linux when their manager makes them use Windows' argument is one I was specifically trying to avoid. With that said, I'll try to answer the rest of your qs:
> Anything graphics programming related. D3D and Metal are significantly more prevalent than OpenGL or Vulkan.
Agree almost completely - you wouldn't be building most graphics to run on Linux, so why not develop where your target platform is best supported. I disagree with your assertions around opengl or vulkan (see android), but UE5/Unity support in Windows vs elsewhere proves your point.
> Anything CAD related, because there’s next to no professionally used CAD software on Linux.
Agree, again obviously. In my case I love Onshape, and it works really well on Linux (apart from spacemouse driver support, which is a spacemouse issue not an Onshape one - there's no such thing as a class-compliant spacemouse interface and direct usb access for chrome would need them to implement a full driver; they invested heavily in getting a hack working on Windows but obviously not worth it for Linux, if for no reason other than that their target userbase will be extremely accustomed to Windows because of historical software availability and compatibility). But yeah, Onshape is an exception.
> Audio stuff. How many DAWs have a significant Linux user base?
Ardour supposedly has some recent converts, the kernel is supposed to be acceptably good at realtime stuff now, and PulseAudio/JACK are supposedly better now. Regardless, you're right - it's too little, too late. FWIW, last time I did any real audio work, absolutely nothing came close to CoreAudio (even on Intel hackintoshes, or _especially_ on Intel hackintoshes vs Windows on the same hardware). I don't think that has changed much since. RME drivers make a difference on Windows, but WDM still sucks and ASIO on Windows still isn't as stable as on a Mac. Reaper (a genius piece of software, on par with Samplitude 7 and Logic 5.5.1) officially supports Linux (<3 Justin F), but it still depends on Wine and yabridge for plugins, i.e. it will probably never be on par.
> And even beyond that, how many website devs are on Linux
Almost all of the ones I know, with a few of them still on Mac but curious. Literally none on Windows.
> Flip that around, why would someone use Linux to develop when they could use macOS? Can you give a non-subjective answer? Then try the opposite.
Hopefully I did that when I explained myself above.
> Even if you’re developing for Linux deployments, you can still do most of it local and then spin up docker in the VM on demand.
> The number of software developers who need to run a Linux VM on their Mac/Windows are a vast minority.
I think I already answered this, but for my (admittedly ignorant initial) definition of "actual engineering", unless you're targeting iOS or desktop development, _everyone_ is developing for linux as their primary target. Everyone.
I directly disagree with your final two statements, and that's kind of the point I was trying to make. For modern cloud/infra/saas/product/platform/web dev, i.e. the arguably subjective definition of "actual engineering", everything else is a compromise. Docker is, VMs are, WSL is.
Also why is that the “arguable subjective definition”? Why are we trying to define “actual engineering” at all, even if subjective?
On what scale are you defining that? Users? Complexity?
Not to be rude, but it feels like “people in a Linux bubble are just reluctant to admit that there’s a wider non-Linux world out there”.
You said “most of the engineering discussed on here” but there’s tons of posts about graphics engineering. It just happens that web/saas is the most approachable end of software engineering. Again, and not trying to be rude, I think this is a case of being too close to the trees to see the forest. Are you perhaps only clicking on the links to subjects that are relevant to your domain knowledge?
Why are CPUs being built if not for the market? Who are they building CPUs for? And who decided what kind of CPU the world needs?
I like the idea of alternative architectures as much as any other geek, but the kind of thinking that has lately permeated the subject comes across as academic arrogance.
1. The laptop overheats easily. It is usually hot to the point of being painful to touch. It has melted the adhesive of the rubber strip on the bottom, which has fallen off.
2. The trackpoint is malfunctioning. Several times a day, the mouse cursor will jump to the top of the screen, and be stuck there until I wiggle the trackpoint fully in all directions.
3. There's coil whine and clicking from some part of the power intake.
4. Battery life is extremely poor, usually on the order of ~2 hours.
5. Sometimes the trackpad buttons will stop working. You have to put the laptop to sleep and wake it up again to get them back.
6. Plenty of random freezes and black screens.
The point was that even the more "premium" products are still USB-A, not USB-C.
USB-A simply isn't going anywhere.
Personally, I find USB-A more useful than HDMI, since HDMI is kind of inferior to USB-C in every possible way. I've tried using a 43" UHD TV as a monitor, since they're as cheap as $149.99 USD brand new, but it had noticeable delay even at 4k@60Hz, and just didn't feel right. The UHD resolution at 43" itself would actually be perfect, since 1080p at 20.5in used to be my fav a few years ago (before QHD at 23" started reaching the sub-$200 range), but, alas, the specs of a proper monitor (brightness, matte, USB-C with PD and a hub) are just better suited for a display compared to a TV, even if the resolution itself may seem ideal.
https://lenovo.com/us/en/p/21n1cto1wwus1
It's $1184.40 for the default CTO with 32 GB LPDDR5X-8448MHz, and the upgrade to 64GB LPDDR5X brings the total cost of T14s to $1297.20 USD.
Even though the upgrade is listed at $193.00, that's actually the MSRP before the near-permanent discounts that Lenovo is very famous for, because 1297.20 - 1184.40 = 112.80. That is, the extra 32GB of LPDDR5X-8448MHz — actually a faster variant of LPDDR5X than the one used in the base M4 — costs only a net $112.80 USD!
All together, that's $1297.20 for a machine with more AND faster RAM, at a cheaper price, than an M4 MacBook Pro that has a starting price of $1599.00 USD in the US, for just 16GB of the slower LPDDR5X-7500, compared to 64GB of the faster LPDDR5X-8448MHz with the Snapdragon ThinkPad.
Also, Apple is the only manufacturer in this form factor and price category to solder the storage in their laptops; nearly everyone else uses standard 2280, 2230 or 2242 NVMe instead. Lenovo generally uses 2242 NVMe in their ultraportables, which is also compatible with the cheaper and smaller 2230. The 2230 format is more popular thanks to its use in the Steam Deck handheld gaming console and its clones, and hence has lower prices, because there's more competition in that form factor.
2TB NVMe drives in the 2230 form factor retail at about $150 right now (more expensive than 2280, but cheaper than 2242); compare that to the $600 Apple charges for a 2TB upgrade from 512GB on a MacBook Pro! (It's actually $800 on a MacBook Air or Mac mini to go from 256GB to 2TB!)
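For what it's worth, the net-price arithmetic quoted above is easy to sanity-check (the figures are the thread's USD prices, not anything official):

```python
# CTO prices quoted in the thread (USD)
cto_32gb = 1184.40            # T14s CTO with 32 GB LPDDR5X-8448
cto_64gb = 1297.20            # same configuration with 64 GB
net_upgrade = cto_64gb - cto_32gb

# Storage comparison quoted above
apple_2tb_upcharge = 600.00   # MacBook Pro 512 GB -> 2 TB
retail_2230_2tb = 150.00      # typical street price of a 2 TB 2230 NVMe drive

print(f"net cost of the extra 32 GB RAM: ${net_upgrade:.2f}")
print(f"listed upgrade MSRP: $193.00, effective discount: ${193.00 - net_upgrade:.2f}")
print(f"Apple 2 TB upcharge vs a retail 2230 drive: {apple_2tb_upcharge / retail_2230_2tb:.0f}x")
```

Comparing configured-system totals rather than line-item MSRPs is what exposes the real upgrade price here.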
Not to mention that Snapdragon does support DP MST for daisy-chaining the monitors through DisplayPort Multi-Stream Transport, whereas MacOS still doesn't — unless you're using Windows on an Intel Mac, that is!
But, hey, at least Apple has finally given us support for 2 external monitors with the base M4 chip, without having to close the lid!
For mobile Linux in particular, I found that it's quite the opposite: I see projects like Phosh and KDE Plasma Mobile constantly showing UI and UX improvements (albeit at a slower pace than the desktop projects), while basic hardware support is non-functional.
Of course I'm not expecting every UX/UI dev to abandon their project to jump into low-level kernel development and bring support for more devices, but it feels like the desktop environments are developing for a device that doesn't exist.
> Is it even possible to use Linux on desktop without ever having to edit config files or run commands in the terminal?
On a modern Linux distro (that isn't one of the "advanced" ones), the answer is yes. If you install something like Mint or Ubuntu, you have a graphical app store and driver manager (which AFAIK you only need for NVIDIA GPUs).
As pointed out in the other comment, the true price at Lenovo for this upgrade is only $112.80 — not as good as you'd get with the DDR5 SODIMM, but it's actually cheaper than what Crucial supposedly charges for their 32GB of LPCAMM2, which isn't even as fast as what Lenovo includes.
https://www.crucial.com/memory/ddr5/ct32g75c2lp5xg — $174.99 for Crucial 32GB LPCAMM2 LPDDR5X-7500 — compare to a net $112.80 difference at Lenovo for an extra 32 GB LPDDR5X-8448MHz (Soldered).
Here's my situation:
At home, I develop on a very beefy x86_64 desktop machine running Ubuntu.
When travelling overseas for work, I have an x86_64 laptop which provides me decent performance but only lasts 3 hours or so. All my colleagues are rocking Macbooks which last a whole day and I can't even take a piss without thinking about a power outlet.
What is the battery life for a setup where you only use the outside world Mac as a shell for a Linux VM? Can you run X11 applications remotely with `ssh -X`?
I sincerely HATE the whole Mac OS ecosystem but right now, they are the best in terms of battery life for a mobile device, and I need that.
Thank you for any information you can provide!
Often they're both, with detachable cables. Again, it was just weird that you brought up RGB specifically, something not at all central to your original point, a gimmick marketed at 'gamers' which is pointless in a well lit room; as well as mechanical keyboards specifically; as well that you've carried on like this in reply. Please don't take the time to reply again.
They have two different higher tiers of protection for "next day" repairs. I'm thinking about upgrading.
This is typically due to default settings on TVs enabling various high-latency post-processing options.
Most TVs have "game mode" settings that disable all these at once, at which point latency is comparable to monitors.
Case in point: at both 60 Hz and 120 Hz*, 4K latency on my LG C4 is within a millisecond of the lowest-latency monitors listed here:
https://www.rtings.com/monitor/tests/inputs/input-lag
I fully agree that HDMI is inferior to USB-C, if only because quality USB-C-to-HDMI adapters are widely available, and mini/micro HDMI connectors commonly used on small devices (not including this laptop) are garbage.
* Probably also true at 144 Hz, but the linked table doesn't have a dedicated column for 144 Hz, complicating comparisons.
Which is what a lot of users have.
Used a t480s for 5 years and it was awesome. And now my t14 gen3 is doing pretty fine as well. What are the issues you are running into?
I'm subscribed to Parallels for Mac and run Fedora with i3 in it; it's graphically accelerated and really, really fast, and I just use it like I would anything else, except now I have the option to use macOS.
What you're going to hate even more is that I find myself trying to make macOS more like i3, finding tools that try to get there: yabai and others.
In the end I am almost entirely in a terminal with tmux, with a Safari or Chrome window open for docs and Slack for all the work distractions.
I couldn’t for the life of me get ssh -X working on macOS so I went the full os route :shrug_emoji: