139 points | obscurette | 1 comment
floppyd:
"Old man yells at cloud", but in so-so-so many words
worldsayshi:
> we’ve traded reliability and understanding for the illusion of progress.

I wish there were a somewhat rigorous way to quantify the reliability of tech products, so we could tell whether tech on average was about as buggy in the past as it is now.

I mean I also feel that things are more buggy and unreliable, but I don't know if there's any good way to measure it.

palata:
If you look at websites, for example, I think we can measure that the average website today is slower to load than it was 15 years ago, even though hardware is orders of magnitude faster.
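For what it's worth, the "slower to load" claim is at least measurable in principle. Here's a minimal sketch of the idea: time an operation repeatedly and take the median to smooth out noise. The function name and parameters are made up for illustration; a real study would measure full page render time, not just one action.

```python
import time

def median_time(action, attempts=5):
    """Run `action` several times and return the median duration in seconds.

    Using the median rather than the mean makes the measurement less
    sensitive to one-off outliers (cold caches, GC pauses, etc.).
    """
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        action()
        timings.append(time.perf_counter() - start)
    timings.sort()
    return timings[len(timings) // 2]
```

You could feed it a lambda that fetches a page, run it against archived snapshots of a site from different years, and compare the medians.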

Another thing is that today you receive updates. Bugs get fixed and new bugs get introduced, which makes reliability harder to track beyond "well, there are always bugs". Back then, a bug was there for life, burnt onto your CD-ROM. I'm pretty sure software shipped on CD-ROM was tested a lot more than what we deploy now: you probably wouldn't burn thousands of CD-ROMs with a "beta" version. Now the norm is to ship beta stuff and use 0-based versioning [1] because we can't make anything stable.

Lastly, hardware today is a lot cheaper than it was 25 years ago. Now you buy a smartphone, it breaks after a year, you complain for five minutes, and you buy a new one. Those devices are almost disposable. 25 years ago you had to spend a lot of money on a PC, so it had to last a while and be repairable.

[1]: https://0ver.org/