If I have to do extensive, subtle prompt engineering and use a lot of mental effort to solve my problem... I'll just solve the problem instead. Programming is a mental discipline - I don't need help typing, and if using an AI means putting in more brainpower, it's fundamentally failed at improving my ability to engineer software.
Syntax, even before LLMs, is just an implementation detail. It's for computers to understand. Semantics is what humans care about.
And so if syntax is just an implementation detail and semantics is what matters, then someone who understands the semantics but uses AI to handle the syntax implementation is still programming.
Sure, maybe, but it's a lossy conversion both ways. And that lossiness is what programming actually is. We gather and formulate requirements from business owners, but translating those into code isn't trivial.
...or generate 10 random numbers from 1 to 100 and sort them in ascending order.
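For contrast, here's one way that sentence might come out at the syntax level (a quick Python sketch; the language and library choices are just for illustration):

```python
import random

# "generate 10 random numbers from 1 to 100 and sort them in ascending order"
numbers = sorted(random.randint(1, 100) for _ in range(10))
print(numbers)
```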
I know which one of these is closer to, if not identical to, the thoughts in my mind before any code is written.
I know which one of these can be communicated to every single stakeholder in the organization.
I know which one of these the vast majority of readers will ask an AI to explain.