
451 points imartin2k | 4 comments
bsenftner No.44479706
It's like talking into a void. The issue with AI is that it is too subtle: it is too easy to get acceptable junk answers, and the majority don't realize we've made a universal crib sheet. Software developers are included in that majority, perhaps as one of the worst populations given how weak their communication is as a community. To be repeatedly successful with AI, one has to exert mental effort to prompt it effectively, but pretty much nobody is willing to even consider that. Attempts to discuss the language aspects of using an LLM get ridiculed as 'prompt engineering is not engineering' and dismissed, while that is exactly what it is: prompt engineering in a new software language, natural language, which the industry refuses to take seriously but which is in fact an extremely technical programming language, so subtle that few if any realize it or the power embodied within LLMs through it. They are incredible, they are subtle, to the degree that the majority think they are a fraud.
replies(3): >>44479916 #>>44479955 #>>44480067 #
20k No.44479955
The issue is that you have to put in more effort to solve a problem using AI than to just solve it yourself.

If I have to do extensive, subtle prompt engineering and put in a lot of mental effort to solve my problem... I'll just solve the problem myself instead. Programming is a mental discipline - I don't need help typing, and if using an AI means putting in more brainpower, it has fundamentally failed at improving my ability to engineer software.

replies(3): >>44480012 #>>44480014 #>>44481015 #
handfuloflight No.44481015
This overlooks a new category of developer who operates in natural language, not in syntax.
replies(3): >>44481406 #>>44481411 #>>44483668 #
1. 20k No.44481406
Natural language is inherently a bad programming language. No developer, even with the absolute best AI tools, can avoid understanding the code the AI generates for very long.

The only way to use AI successfully is to have sufficient skill to review the code it generates for correctness - a task that takes at least as much skill as simply writing the code.
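
A toy illustration of what that review skill looks like (hypothetical Python, not from anyone's actual project): the generated helper reads as plausible, but it leaks state between calls, and only a reviewer who knows the idiom will catch it.

    # Plausible-looking generated code with a subtle bug: the mutable
    # default argument is created once and shared across every call.
    def add_tag(item, tags=[]):
        tags.append(item)
        return tags

    print(add_tag("a"))   # ['a']
    print(add_tag("b"))   # ['a', 'b']  <- state leaks between calls

    # The fix the reviewer has to know to ask for:
    def add_tag_fixed(item, tags=None):
        tags = [] if tags is None else list(tags)
        tags.append(item)
        return tags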

replies(1): >>44481891 #
2. handfuloflight No.44481891
You assume natural language programming only produces code. It is also used to read it.
replies(1): >>44486145 #
3. obirunda No.44486145
I don't think you understand why context-free languages are used for programming. If you provide a requirement with any degree of ambiguity, the outcome will be non-deterministic. Do you want software that works, or software that kind of works?
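
To make that concrete, here is a hypothetical example (invented data and field names): one sentence of requirements, two implementations that both satisfy it, and two different results.

    # Requirement as stated: "show the 3 most recent posts".
    posts = [
        {"id": 1, "created": 3, "edited": 9},
        {"id": 2, "created": 7, "edited": 7},
        {"id": 3, "created": 5, "edited": 6},
        {"id": 4, "created": 1, "edited": 8},
    ]

    # Reading 1: "recent" means creation time.
    by_created = sorted(posts, key=lambda p: p["created"], reverse=True)[:3]

    # Reading 2: "recent" means last activity, edits included.
    by_edited = sorted(posts, key=lambda p: p["edited"], reverse=True)[:3]

    print([p["id"] for p in by_created])  # [2, 3, 1]
    print([p["id"] for p in by_edited])   # [1, 4, 2]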

If someone doesn't understand, even conceptually, how requirements

replies(1): >>44486382 #
4. handfuloflight No.44486382
You're making two false assumptions:

That natural language can only be ambiguous: in fact, legal contracts, technical specs, and scientific papers are all written in precise natural language.

And that AI interaction is one-shot, with ambiguous input producing ambiguous output. In practice, LLM programming is iterative: you clarify and deliver on requirements through conversation, testing, and debugging until you reach a precise, accepted solution.

Traditional programming can also start with ambiguous natural language requirements from stakeholders. The difference is you iterate toward precision through conversation with AI rather than by writing syntax yourself.
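
As a sketch of what iterating toward precision can look like (parse_price is a hypothetical function invented for illustration): each clarification from the conversation gets captured as an executable check, so the accepted requirement is no longer ambiguous prose.

    # Function the model is asked to produce, refined over several rounds.
    def parse_price(text: str) -> int:
        """Return the price in cents; raise ValueError on malformed input."""
        cleaned = text.strip().lstrip("$").replace(",", "")
        dollars, _, cents = cleaned.partition(".")
        if not dollars.isdigit() or (cents and not cents.isdigit()):
            raise ValueError(f"not a price: {text!r}")
        return int(dollars) * 100 + int((cents or "0").ljust(2, "0")[:2])

    # Each clarification becomes a test; together they pin the behaviour down.
    assert parse_price("$1,299.99") == 129999   # thousands separators allowed
    assert parse_price("50") == 5000            # bare dollar amount, no symbol
    assert parse_price("$0.5") == 50            # single-digit cents
    try:
        parse_price("free")                     # words are rejected, not guessed
    except ValueError:
        pass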