
451 points | imartin2k | 1 comment
bsenftner ◴[] No.44479706[source]
It's like talking into a void. The issue with AI is that it is too subtle: too easy to get acceptable junk answers, and too subtle for the majority to realize we've made a universal crib sheet. Software developers are included, and are perhaps one of the worst populations due to their extremely weak communication as a community. To be repeatedly successful with AI, one has to exert mental effort to prompt it effectively, but pretty much nobody is willing to even consider that. Attempts to discuss the language aspects of using an LLM get ridiculed and dismissed with 'prompt engineering is not engineering', while that is exactly what it is: prompt engineering in a new software language, natural language, which the industry refuses to take seriously. It is in fact an extremely subtle, technical programming language, and few realize that, or the power embodied by it within LLMs. They are incredible, and subtle, to the degree that the majority think they are fraud.
replies(3): >>44479916 #>>44479955 #>>44480067 #
20k ◴[] No.44479955[source]
The issue is that you have to put in more effort to solve a problem using AI than to just solve it yourself.

If I have to do extensive, subtle prompt engineering and expend a lot of mental effort to solve my problem... I'll just solve the problem myself. Programming is a mental discipline - I don't need help typing, and if using an AI means putting in more brainpower, it's fundamentally failed at improving my ability to engineer software.

replies(3): >>44480012 #>>44480014 #>>44481015 #
handfuloflight ◴[] No.44481015[source]
This overlooks a new category of developer who operates in natural language, not in syntax.
replies(3): >>44481406 #>>44481411 #>>44483668 #
const_cast ◴[] No.44483668[source]
Does this new category actually exist? Because, I would think, if you want to be successful at a real company you would need to know how to program.
replies(1): >>44483827 #
handfuloflight ◴[] No.44483827[source]
Knowing how to program is not limited to knowing how to write syntax.
replies(2): >>44484108 #>>44486307 #
const_cast ◴[] No.44484108[source]
Yes, but knowing how to read and write syntax is a prerequisite.

Syntax, even before LLMs, is just an implementation detail. It's for computers to understand. Semantics is what humans care about.

replies(1): >>44484131 #
handfuloflight ◴[] No.44484131[source]
*Was* a prerequisite. Natural language can be translated bidirectionally with any programming syntax.

And so if syntax is just an implementation detail and semantics is what matters, then someone who understands the semantics but uses AI to handle the syntax implementation is still programming.

replies(2): >>44484922 #>>44485678 #
DrillShopper ◴[] No.44485678[source]
char * const (*(* const bar)[5])(int )
replies(1): >>44485759 #
handfuloflight ◴[] No.44485759[source]
{⍵[⍋⍵]}⍨?10⍴100

...or generate 10 random numbers from 1 to 100 and sort them in ascending order.

I know which one of these is closer to, if not identical to, the thoughts in my mind before any code is written.

I know which one of these can be communicated to every single stakeholder in the organization.

I know which one of these the vast majority of readers will ask an AI to explain.
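[Editor's note: for readers who don't read APL, a rough Python sketch of what the one-liner above does - generate 10 random integers from 1 to 100, then sort ascending. The names are illustrative, not from the thread:]

```python
import random

# ?10⍴100 in APL: roll 10 random integers, each from 1 to 100
nums = [random.randint(1, 100) for _ in range(10)]

# {⍵[⍋⍵]} in APL: index the array by its ascending grade, i.e. sort it
result = sorted(nums)

print(result)
```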