Today I bump into limitations of machines that were put there by manufacturers trying to assert ownership of the device after the purchase. In the "before times" limitations were either a fact of the hardware (e.g. you only have so much RAM, storage, CPU cycles, etc.) or of your own ability (you don't know how to crack the protection, defeat the anti-debug tricks, etc.). Today you're waging a nearly unwinnable battle against architectures of control baked into the hardware at a level below anything the average end user has any hope of subverting.
The machine isn't trying to master me. The people who made the machine are. I wish people in the tech industry wouldn't be party to taking away computing freedom. It pays well, though, and they can console themselves with "It's not a computer, it's a phone"-type delusions (at least until the day "the man" comes for their PCs).
Even in the "before times" we had such limitations: the 486SX was shipped as a cheaper version of the DX with a functional but disabled math coprocessor. There are meaningful differences in practical terms, but I definitely see it as a clear predecessor of this behavior.
While "guns don't kill people, people kill people" is a cliché, I think there's still considerable meaning behind it, and I'd say the same holds in the "machines don't do anything to people" sense. Sure, a lot of decision-making and faceless authority is outsourced to machines, but it's still people who are doing that outsourcing, and if those people stopped deciding to put so much weight on the output of (intentionally and unintentionally) black-boxed algorithms then that power of the machines would vanish instantly.
Regular people being able to commit contempt of companies' business models en masse seems to work well to keep them in check, but it's becoming ever harder with so much of everything becoming mobile-centric. And with all smartphones being locked down at the level of someone else's public keys being burned into the SoC at the factory, you can't do shit. They literally have technological supremacy over the rest of humanity. And we're somehow okay with that.
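To make the "someone else's public keys burned into the SoC" point concrete, here's a minimal sketch of how this kind of vendor lock typically works in a boot ROM. All names and structure here are illustrative, not any specific vendor's design: a hash of the vendor's public key lives in one-time-programmable fuses, and the mask ROM refuses to run any firmware whose embedded key doesn't match.

```python
# Illustrative sketch: a boot ROM that only accepts firmware signed with
# the key whose hash was fused in at the factory. Hypothetical names.
import hashlib

# Burned into OTP fuses at the factory; the device owner cannot change it.
FUSED_KEY_HASH = hashlib.sha256(b"vendor-public-key").hexdigest()

def rom_accepts(firmware_pubkey: bytes, signature_valid: bool) -> bool:
    """Boot ROM check: the public key embedded in the firmware image must
    hash to the fused value, AND the image's signature must verify."""
    if hashlib.sha256(firmware_pubkey).hexdigest() != FUSED_KEY_HASH:
        return False          # owner-supplied key: rejected outright
    return signature_valid    # vendor key: still needs a valid signature
```

Since `FUSED_KEY_HASH` is write-once hardware state, swapping in your own key is impossible no matter what software you control; only the key holder can ever produce bootable firmware.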
In modern times they also do this because semiconductor manufacturing is imperfect and some parts of a chip come out damaged. IIRC this happens to GPUs a lot, so they tend to have spare cores.
In a hypothetical scenario where I somehow "unlocked" the FPU functionality, Intel couldn't have pushed out a mandatory firmware update to blow an e-fuse in my chip, fixing the "vulnerability" that let me access the FPU and simultaneously preventing me from ever loading "vulnerable" firmware again (as Nintendo does with the Switch).
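The e-fuse anti-rollback trick mentioned above can be sketched in a few lines. This is a simplified model of the general mechanism (which the Switch is publicly known to use), not any vendor's actual implementation: each mandatory update blows another one-way fuse, and the boot ROM rejects any firmware that expects fewer blown fuses than the chip already has.

```python
# Simplified model of e-fuse anti-rollback. Hypothetical names.

def rollback_allowed(fuses_blown: int, firmware_fuse_count: int) -> bool:
    # Boot ROM check: older ("vulnerable") firmware carries a lower
    # expected fuse count than the chip's current state, so it's rejected.
    return firmware_fuse_count >= fuses_blown

def install_update(fuses_blown: int, new_fuse_count: int) -> int:
    # Blowing fuses is physically one-way: the count only ever increases.
    return max(fuses_blown, new_fuse_count)
```

The asymmetry is the whole point: `install_update` moves the chip's state forward irreversibly, so once the vendor ships an update, the firmware you were running yesterday is locked out forever.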
I'll take consumer protection regulation, at least in the short term.
I wish manufacturers were required to clearly inform consumers which products are sold versus rented, self-hostable versus tied to hosted services, or crippled from running Free software by firmware locks. That would allow a market for freedom-respecting products to actually develop to a reasonable size, and not just to be a fringe thing.
The stakes were low, because nobody could use your computer to drain your bank account. And anyone who "pranked" your computer beyond the social norm would get a stern talking-to.
Computers these days have to support your grandma making hotel reservations online without her entire financial information being sent to hackers in Eastern Europe. They're doing jobs that 70s OS designers never thought about. It's a different world.
It used to be the case that people valued freedom and the lack of it was something blatantly apparent.
When somebody was a slave, it was a very explicit interpersonal relationship that was very obviously abusive. Even today, some cultures, such as America's, are so ashamed of their slaver past that they censor the word on YouTube.
When somebody worked for a company that compensated him not with money but with company scrip that could only be exchanged for goods in company stores, it obviously created a relationship of unequal power that over time put the weaker side at a bigger and bigger disadvantage. People were able to see and understand this, and it was outlawed.
But these days the power dynamics are so complex, with so many steps and intermediaries, that people don't even know what is being taken away from them. It's a salami-slicing attack, too. There are minor outrages here and there, but nothing ever changes: two steps forward, one step back to appease them.
---
Bottom line: if a company claims it "sells" you something, the precedent is that you own it fully. If you don't, that's theft. Theft, even multi-step theft, should be punished in full. That means the company should pay a fine proportional to how much money they made from their abuse of power, multiplied by a punitive constant.
Additionally, all people involved in the decision making process should also be punished according to how much they stole.
Also, as a note: unlike modern chips, where broken cores are fused off and the parts sold at lower specs[1] as part of the binning process, the early 486SX had its FPU disabled before any testing/binning, so Intel wasn't selling broken DX dice as SXs.
[1] Or in some cases fusing off working silicon, if the supply/demand curve works that way; see the infamous triple-core AMD Phenom.
Not if you’re a mainframe customer. Capacity based licensing has been a standard practice in the mainframe world for around 50 years.
I’m not saying that’s how it -should- be. But IBM wasn’t rug-pulling.
> ...that could be achieved and that we were roughly on-track on.
I think there's a strong lateral connection to this quote:
“I've come up with a set of rules that describe our reactions to technologies:
1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
3. Anything invented after you're thirty-five is against the natural order of things.”
― Douglas Adams, The Salmon of Doubt: Hitchhiking the Galaxy One Last Time
I think this has interesting implications wrt the perception of nostalgia, because nostalgia seems to be able to apply at any age to any event that happened far enough back in time; while the above theoretical model maps roughly to specific ages.
So I wonder what things are actually a partially overlapped Venn diagram of the above maxim and nostalgia.
In this case I think it's possible the idea that we were "roughly on-track on" with certain technologies - the notion of an emergent sense of structure that was certain to unfold - could map to some point in between points (1) and (2) in the maxim above. An objective analysis would instead recognize "success" as the survivorship bias that it is; but we're not objective :) - and I find that endlessly fascinating!
So that suggests it's a user education issue.