
838 points turrini | 5 comments
caseyy No.43972418
There is an argument to be made that the market buys bug-filled, inefficient software about as well as it buys pristine software. And one of them is the cheapest software you could make.

It's similar to the "Market for Lemons" story. In short, sellers present all goods as high-quality but quietly cut quality to reduce marginal costs. The buyer cannot tell high-quality goods from low-quality goods before buying, so demand for the two is artificially equal. The cause is asymmetric information.

This is already true and will only become more true for AI. The user cannot differentiate between sophisticated machine learning applications and a washing machine spin cycle calling itself AI. The AI label itself commands a price premium. The user overpays significantly for a washing machine[0].

It's fundamentally the same thing when a buyer overpays for crap software, thinking it's designed and written by technologists and experts. But IC1-3s write 99% of software, and the 1 QA guy in 99% of tech companies is the sole measure taken to improve quality beyond "meets acceptance criteria". Occasionally, a flock of interns will perform an "LGTM" incantation in hopes of improving the software, but even that is rarely done.

[0] https://www.lg.com/uk/lg-experience/inspiration/lg-ai-wash-e...

replies(27): >>43972654 #>>43972713 #>>43972732 #>>43973044 #>>43973105 #>>43973120 #>>43973128 #>>43973198 #>>43973257 #>>43973418 #>>43973432 #>>43973703 #>>43973853 #>>43974031 #>>43974052 #>>43974503 #>>43975121 #>>43975380 #>>43976615 #>>43976692 #>>43979081 #>>43980549 #>>43982939 #>>43984708 #>>43986570 #>>43995397 #>>43998494 #
mjr00 No.43973105
> the market sells as if all goods were high-quality

The phrase "high-quality" is doing work here. The implication I'm reading is that poor performance = low quality. However, the applications people are mentioning in this comment section as low performance (Teams, Slack, Jira, etc) all have competitors with much better performance. But if I ask a person to pick between Slack and, say, a fast IRC client like Weechat... what do you think the average person is going to consider low-quality? It's the one with a terminal-style UI, no video chat, no webhook integrations, and no custom avatars or emojis.

Performance is a feature like everything else. Sometimes, it's a really important feature; the dominance of Internet Explorer was destroyed largely because Chrome was so much faster than IE when it was released, and Python devs are quickly migrating to uv/ruff due to the performance improvement. But once you're in the territory of "it takes Slack 5 seconds to start up instead of 10ms", you're in the realm where very few people care.

replies(5): >>43973152 #>>43973337 #>>43974116 #>>43974554 #>>43977067 #
dgb23 No.43973337
You are comparing applications with wildly different features and UI. That's neither an argument for nor against performance as an important quality metric.

How fast you can compile, start and execute some particular code matters. The experience of using a program that performs well matters if you use it daily.

Performance is not just a quantitative issue. It leaks into everything, from architecture to delivery to user experience. Bad performance has expensive secondary effects, because we introduce complexity to patch over it, like horizontal scaling, caching or eventual consistency. It limits our ability to make things immediately responsive and reliable at the same time.
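
A toy sketch of that secondary cost (Python; the slow lookup, its timing, and the function names are all invented for illustration): a cache bolted on to mask a slow path drags invalidation and staleness concerns along with it.

    import time
    from functools import lru_cache

    def slow_profile_lookup(user_id: int) -> dict:
        # Stand-in for an unindexed query or a chatty service call.
        time.sleep(0.5)
        return {"id": user_id, "name": f"user-{user_id}"}

    # The "patch": a cache hides the latency on repeat hits...
    @lru_cache(maxsize=1024)
    def cached_profile_lookup(user_id: int) -> dict:
        return slow_profile_lookup(user_id)

    # ...but now every write path has to remember to invalidate,
    # and readers can see stale data in between.
    def update_profile(user_id: int, name: str) -> None:
        # persist the change to the datastore here (omitted)
        cached_profile_lookup.cache_clear()  # blunt, global invalidation

A lookup that returned in a few milliseconds in the first place would need none of that machinery.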

replies(2): >>43973579 #>>43973817 #
mjr00 No.43973579
> You are comparing applications with wildly different features and UI. That's neither an argument for nor against performance as an important quality metric.

I never said performance wasn't an important quality metric, just that it's not the only quality metric. If a slow program has the features I need and a fast program doesn't, the slow program is going to be "higher quality" in my mind.

> How fast you can compile, start and execute some particular code matters. The experience of using a program that performs well if you use it daily matters.

Like any other feature, whether or not performance is important depends on the user and context. Chrome being faster than IE8 at general browsing (rendering pages, opening tabs) was very noticeable. uv/ruff being faster than pip/poetry is important because of how the tools integrate into performance-sensitive development workflows. Does Slack taking 5-10 seconds to load on startup matter? -- to me not really, because I have it come up on boot and forget about it until the next system update forces a reboot. Do I use LibreOffice or Word and Excel, even though LibreOffice is faster? -- I use Word/Excel because I've run into annoying compatibility issues with LO enough times to not bother. LibreOffice could reduce its startup and file load times to 10 picoseconds and I would still use MS Office, because I just want my damn documents to keep the same formatting my colleagues using MS Office set on their Windows computers.

Now of course I would love the best of all worlds: programs that are fast and have all the functionality I want! In reality, though, companies can't afford to build every feature, performance included, and need to pick and choose what's important.

replies(1): >>43974364 #
Retric No.43974364
> If a slow program has the features I need and a fast program doesn't, the slow program is going to be "higher quality" in my mind.

That's irrelevant here: the fully featured product can also be fast. The overwhelming majority of software is slow because the company simply doesn't care about efficiency. Google actively penalized slow websites, and many companies still didn't make it a priority.

replies(1): >>43974425 #
1. mjr00 No.43974425
> That’s irrelevant here, the fully featured product can also be fast.

So why is it so rarely the case? If it's so simple, why hasn't anyone recognized that Teams, Zoom, etc. are all bloated and slow and made a hyper-optimized, feature-complete competitor that dominates the market?

Software costs money to build, and performance optimization doesn't come for free.

> The overwhelming majority of software is slow because the company simply doesn’t care about efficiency.

Don't care about efficiency at all, or don't consider it as important as other features and functionality?

replies(2): >>43974548 #>>43978493 #
2. Retric No.43974548
Not being free upfront isn't the same thing as being expensive.

Zoom's got 7,412 employees; a small team of, say, 7 engineers could make a noticeable difference here. And the investment wouldn't disappear; it would help drive further profits.

> Don't care about efficiency at all

Doesn't care beyond basic functionality. Obviously they care if something takes an hour to load, but rarely do you see consideration for people running on lower-end hardware than the kind of machines you see at a major software company.

replies(1): >>43974644 #
3. mjr00 No.43974644
> Zoom’s got 7,412 employees a small team of say 7 employees could make a noticeable difference here

What would those 7 engineers specifically be working on? How did you pick 7? What part of the infrastructure would they focus on, and what kind of performance gains, in which part of the system, would result from their work?

replies(1): >>43974881 #
4. Retric No.43974881
What consumers care about are the customer-facing aspects of the business. As such, you'd benchmark Zoom on the various clients/plugins (Windows, Mac, Android, iOS) and create a never-ending priority list of issues weighted by market share.

Seven people was a rough number, chosen to cover the relevant skills while still being a tiny fraction of the workforce. Such efforts run into diminishing returns, but the company is going to keep creating low-hanging fruit.
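
A back-of-the-envelope sketch of that weighting (Python; the issues, per-session costs, and market-share figures are all hypothetical, not Zoom data): each issue's priority is its measured impact scaled by the share of users who actually hit it.

    # Hypothetical benchmark findings: (issue, seconds lost per session, platform)
    issues = [
        ("cold start",         6.0, "windows"),
        ("screen-share lag",   2.5, "mac"),
        ("join-meeting delay", 4.0, "android"),
    ]

    # Hypothetical client market share by platform.
    share = {"windows": 0.55, "mac": 0.20, "android": 0.15, "ios": 0.10}

    # Rank issues by seconds lost, weighted by how many users feel them.
    ranked = sorted(issues, key=lambda i: i[1] * share[i[2]], reverse=True)
    for name, cost, platform in ranked:
        print(f"{name}: weighted cost {cost * share[platform]:.2f}")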

5. dgb23 No.43978493
> Software costs money to build, and performance optimization doesn't come for free.

Neither do caching, operational/architectural overhead, slow builds, and all the hoops we jump through in order to satisfy stylistic choices. All of this stuff introduces complexity and often demands specialized expertise on top.

And it's typically not about optimization, but about not doing things that you don't have to do in the first place. A little bit of frugality goes a long way, often leading to simpler code and fewer dependencies.
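
A minimal illustration of that kind of frugality (Python; the tag_records example and its config file are invented): the second version isn't "optimized" in any clever sense, it just stops repeating invariant work inside the loop, and ends up both faster and simpler.

    import json
    import re

    # Wasteful: re-reads and re-parses the same config for every record.
    def tag_records_wasteful(records, config_path):
        out = []
        for rec in records:
            with open(config_path) as f:
                config = json.load(f)                 # repeated I/O and parsing
            pattern = re.compile(config["pattern"])   # repeated compilation
            out.append(bool(pattern.search(rec)))
        return out

    # Frugal: do the invariant work once, outside the loop.
    def tag_records(records, config_path):
        with open(config_path) as f:
            pattern = re.compile(json.load(f)["pattern"])
        return [bool(pattern.search(rec)) for rec in records]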

The hardware people are (actually) optimizing, trying hard to make computers fast, to the degree that it introduces vulnerabilities (like the Apple CPU prefetcher pulling in memory from arrays of pointers, which opened it up to timing attacks, or the branch prediction vulnerabilities on Intel chips). Meanwhile, we software people are piling more and more stuff into programs that isn't needed, from software patterns and paradigms to unnecessary dependencies.

There's also the issue of programs feeling entitled to resources. When I'm running a video game or a data migration, I obviously want to give it as many resources as possible. But it shouldn't be necessary to provide gigabytes of memory for utility programs and everyday applications.