
1725 points taubek | 1 comments
PrimeMcFly No.35323525
I don't want anything, any type of news being pushed by my OS. It simply isn't its job. Maybe as an option or optional add-on, but not the way MS does it.

I use 10 now, as locked down and 'fixed' as I was able to make it (custom ISO via NTLite with a bunch of crap removed and some fixes steamrolled in), but really I look forward to ditching it altogether - which is a shame. For all the MS hate in the OSS community, I always thought Windows did a lot of stuff well (when it was good at least).

The telemetry, the changing of things for the sake of changing things, and the forced crap constantly being added are enough. I'm so in love with awesomewm at this point, and with the fact that I can customize and program every part of my UI, allowing me to have something absolutely perfect and tailor-made.

jgaa No.35324818
> I don't want anything, any type of news being pushed by my OS.

Then, how is Microsoft supposed to properly track your interests and sell that information to their "partners"?

It's been a long time since Microsoft made an operating system. What they make today is basically a spyware platform on which you can run applications if you are really disciplined and persistent. I don't understand how people put up with it.

I've used Linux on my desktops and laptops for decades now.

ftl64 No.35325246
It's just more stable, at least this has been my experience. I've tried hard to become a full-time workstation Linux user for years, daily driving Ubuntu, Mint, and Fedora for months at a time, but I always had to come back to Windows. Nvidia and Intel driver issues, package manager bugs, reduced laptop battery life, general UI clunkiness, and times when GRUB suddenly decided not to boot have taken so many hours of troubleshooting that could've been spent doing something actually productive.

Windows has many issues, but it never decided to break on me in the middle of the day. For me, an OS is not a religious affiliation but a tool, and Windows performs much better as one.

thomastjeffery No.35329930
Linux driver support was hell from roughly 2010 to 2016. Both major GPU manufacturers had awful proprietary drivers (with even worse packaging), and most wifi chipsets required proprietary firmware blobs to work at all, which were very tricky to package because of copyright bullshit.

This was also the era of major desktop environments playing fast and loose with their UX. GNOME3 was released in 2011. Ubuntu started defaulting to Unity in 2010, and started their Wayland competitor (Mir) in 2013. KDE Plasma 5 (2014) defaulted to fancy compositing, and felt really bloated relative to the others. The only desktop environments that really kept true to the good old days (~2008) are XFCE4 and MATE (the GNOME2 fork). KDE5 isn't bad, either, but it's still a bit too bloated.

The other problem caused by proprietary video drivers was package versioning. It's tricky to have the right kernel version and Xorg version necessary to run a proprietary video driver blob; and keep the rest of your system up-to-date. Ubuntu found its initial success by creating a generally stable package repository roughly as up-to-date as Debian unstable. Unfortunately, Ubuntu became a bloated mess with strange things like Unity and Mir bundled in. Archlinux has been a good alternative, but it does expect a level of familiarity with shell utilities. Linux Mint (an Ubuntu or Debian fork) is still my first recommendation to casual users. One of these days, it will be NixOS, which is a giant leap in stability and package versioning.

The last change of that era to keep breaking the Linux experience was the switch from BIOS/MBR to UEFI/GPT. This shift was slow and messy, with most hardware adoption following the release of Windows 8 in 2012. GRUB used to break in one predictable way: Windows overwrites the MBR, replacing GRUB with its own bootloader. Now, with UEFI, boot entries are saved directly to the motherboard's NVRAM, and the bootloader itself lives in the ESP partition. The Windows installer will put its bootloader in the first ESP it can find, and you don't get to choose which one that is. Now you have to worry about the ESP running out of space, but that's about it: everything else has been generally resolved, and the UEFI bootloader experience is very solid (apart from the Windows installer caveat).
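For anyone debugging that ESP caveat, both the firmware boot table and the ESP's free space can be inspected from a shell. A minimal sketch, assuming the efibootmgr tool is installed and the ESP is mounted at /boot/efi (some distros use /efi or /boot instead):

```shell
# Sketch: inspect UEFI boot state on a typical Linux install.
# Assumption: the ESP is mounted at /boot/efi (varies by distro).

# List the boot entries stored in firmware NVRAM (needs root and efivars;
# fails harmlessly on a legacy-BIOS boot or without efibootmgr installed).
efibootmgr -v 2>/dev/null || echo "no UEFI boot entries readable here"

# Check how much free space the EFI System Partition has left.
esp=/boot/efi
if [ -d "$esp" ]; then
    df -h "$esp"
else
    echo "no ESP mounted at $esp"
fi
```

If `df` shows the ESP nearly full, old kernels or leftover bootloader directories under `EFI/` are the usual culprits.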

Now that AMDGPU is mature, and NVIDIA's drivers are relatively well maintained (and packaged), the Linux desktop experience is even more stable than in its heyday back in 2008. If you install a distro that targets relatively recent package versions, like Archlinux, Linux Mint, or even Fedora, and you use a solid, familiar desktop environment like MATE or XFCE4, you can avoid most UI/UX clunkiness and have very little need to fiddle with your package manager. Boot issues are pretty unlikely now, so long as you install in UEFI mode (not legacy BIOS emulation) and completely avoid MBR.
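Checking whether an installer (or installed system) actually booted in UEFI mode is straightforward: the kernel exposes /sys/firmware/efi only when it booted via UEFI. A small sketch:

```shell
# Sketch: detect whether the running kernel booted via UEFI or legacy BIOS.
# /sys/firmware/efi exists only when the kernel was started under UEFI.
if [ -d /sys/firmware/efi ]; then
    boot_mode="UEFI"
else
    boot_mode="legacy BIOS"
fi
echo "Booted in $boot_mode mode"
```

Running this from the live installer environment before partitioning is the easy way to make sure you don't accidentally do a legacy/MBR install.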