Rather annoyed by the hype around LLMs myself, but I notice, both in this article and more generally, that some of the criticism attributes to LLMs specifically issues that are neither unique to them nor caused by them.
> Why would someone using a really cool tool that makes them more productive… feel compelled to sneer and get defensive at the mere suggestion that someone else isn’t doing the same?
It sounds like it is about people seeing others doing things in a way they view as inefficient or wrong, and then trying to help. "Sneer and get defensive" does not sound like trying to be helpful, but they probably would not describe themselves as sneering and getting defensive, either.
> I might have strong opinions about what code looks like, because I might have to read it, but why would I — why would anyone — have such an intense reaction to the hypothetical editor setup of a hypothetical stranger?
As above, but even closer to this particular question, see the editor war.
> But the Bitcoin people make more money if they can shame everyone else into buying more Bitcoin, so of course they’re gonna try to do it. What do programmers get out of this?
Apart from "helping" others, a benefit of promoting technologies one uses and prefers is a wider user base, which leads to better support and to the proliferation of technologies they view as good and useful.
> We’ve never had a machine that can take almost any input and just do Whatever.
Well, there is Perl. That is a joke (the statement, not the language), but the previous points actually made me think of programming languages with dynamic and weak typing, which similarly allow one to pretend that some errors do not happen, at the cost of being less correct, and which do whatever when things go wrong. Ambiguities in natural languages come to mind, too.
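To illustrate the "do whatever when things go wrong" point (my own sketch, not from the article): JavaScript is a handy example of weak dynamic typing, where mismatched types get silently coerced instead of raising an error.

```javascript
// Weak typing in action: type mismatches are coerced, never reported.
const qty = "5";          // arrived as a string, e.g. from user input

console.log(qty + 2);     // "52" -- '+' concatenates strings, no error
console.log(qty - 2);     // 3    -- '-' coerces the string to a number
console.log([] + {});     // "[object Object]" -- still no error
```

The program keeps running and produces *something*, much as an LLM produces a plausible answer for almost any input; whether that something is what you meant is a separate question.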
> That means they aren’t even reading the words they claim as their own!
Both homework and online discussions featured that before LLMs, too. For instance, people sometimes link (not necessarily copy) materials to prove a point, only for those materials to contradict it. Carelessness, laziness, and a lack of motivation to spend time and effort are all old things.
> I can’t imagine publishing a game with, say, Midjourney-generated art, even if it didn’t have uncanny otherworldly surfaces bleeding into each other. I would find that humiliating. But there are games on the Switch shop that do it.
I have heard "AI-generated game" mentioned as a curiosity or a novelty, apparently making it a selling point. Same as with all the "AI-powered" stuff before LLMs. Much of that is marketing: blockchain and "big data" were added everywhere when those were hyped, and similarly silly things are added to products outside of computing whenever they have the potential to sound cool to at least some audience (e.g., fad diets, audiophile hardware).
> But I think the core of what pisses me off is that selling this magic machine requires selling the idea that doing things is worthless.
This also sounds like yet another point in the clash between prevalent business requirements and more enthusiastic human aspirations. The economic and social systems, and the surrounding cultures, probably have more to do with it than any particular technology. Pretty much any bureaucracy-, corporate-, or enterprise-focused technology tends to lessen the fun and enjoyment.