If I have to do extensive, subtle prompt engineering and expend a lot of mental effort to solve my problem... I'll just solve the problem instead. Programming is a mental discipline - I don't need help typing, and if using an AI means putting in more brainpower, it's fundamentally failed at improving my ability to engineer software.
The only way to use AI successfully is to have sufficient skill to review the code it generates for correctness - a task at least as skilful as simply writing the code yourself.
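To make that concrete, here is the kind of plausible-looking code an assistant might emit. The example is hypothetical, but the failure mode is representative: spotting the bug means re-deriving the loop invariant, which is exactly the reasoning needed to write the function correctly in the first place.

    # Hypothetical assistant output: claims to return the index of the
    # first element >= target in a sorted list (a "lower bound" search).
    def lower_bound(xs, target):
        lo, hi = 0, len(xs) - 1   # BUG: hi should start at len(xs), so that
        while lo < hi:            # "insert past the end" is a reachable answer
            mid = (lo + hi) // 2
            if xs[mid] < target:
                lo = mid + 1
            else:
                hi = mid
        return lo

    print(lower_bound([1, 3, 5], 4))  # 2: looks right
    print(lower_bound([1, 3, 5], 9))  # 2: wrong, the correct answer is 3

A casual review waves this through; only a reviewer who could have written it notices the unreachable boundary.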
If someone doesn't understand, even conceptually, how requirements become working software, they tend to believe two things:
That natural language can only be ambiguous. But legal contracts, technical specs, and scientific papers are all written in precise natural language.
And that AI interaction is one-shot, where ambiguous input produces ambiguous output. But LLM programming is iterative: you clarify and deliver on requirements through conversation, testing, and debugging until you reach a precise, accepted solution.
Traditional programming can also start with ambiguous natural-language requirements from stakeholders. The difference is that you iterate toward precision through conversation with the AI rather than by writing the syntax yourself, as the sketch below illustrates.
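That loop fits in a page of Python. This is a minimal sketch, assuming a hypothetical generate() function standing in for whatever model client you actually use; the generate-test-clarify driver is the point, not any particular API.

    import subprocess
    import sys
    import tempfile

    def generate(prompt: str) -> str:
        """Hypothetical stand-in for a model call; swap in your real client."""
        raise NotImplementedError

    def run_tests(code: str, tests: str) -> subprocess.CompletedProcess:
        """Write the generated code plus acceptance tests to a file and run it."""
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write(code + "\n\n" + tests)
            path = f.name
        return subprocess.run([sys.executable, path], capture_output=True, text=True)

    def refine(requirement: str, tests: str, max_rounds: int = 5) -> str:
        """Generate, test, and feed failures back as clarification until accepted."""
        prompt = requirement
        for _ in range(max_rounds):
            code = generate(prompt)
            result = run_tests(code, tests)
            if result.returncode == 0:
                return code  # tests pass: the ambiguity has been iterated away
            # The failure output is the clarification for the next round.
            prompt = (requirement + "\n\nYour last attempt failed:\n"
                      + result.stderr + "\nPlease fix it.")
        raise RuntimeError("no convergence: the requirement itself needs clarifying")

Here tests is just a string of assert statements encoding the accepted behavior; the loop terminates exactly when the requirement has been made precise enough to satisfy them, which is the iteration described above.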