Getting AI to write good SQL

(cloud.google.com)
478 points richards | 5 comments
1. levocardia ◴[] No.44011492[source]
In one of Stephen Boyd's lectures on convex optimization, he has some quip like "if your optimization problem is computationally intractable, you could try really hard to improve the algorithm, or you could just go on vacation for a few weeks and by the time you get back, computers will be fast enough to solve it."

I feel like that's actually true now with LLMs -- if some query I write doesn't get one-shotted, I don't bother with a galaxy-brain prompt; I just shelve it 'til next month and the next big OpenAI/Anthropic/Google model will usually crush it.

replies(4): >>44013660 #>>44013784 #>>44013877 #>>44015598 #
2. user3939382 ◴[] No.44013660[source]
Try getting it to write a codepen sim of 3 rectangles parallel parking.
3. AbstractH24 ◴[] No.44013784[source]
Has the pace of this slowed down, or have I just lost track of the narrative?

Feels like innovation in AI is rapidly changing from paradigm-shifting to incremental.

4. owebmaster ◴[] No.44013877[source]
> I just shelve it 'til next month and the next big OpenAI/Anthropic/Google model will usually crush it.

One month to write some code with an LLM: that's quite the opposite of the promised productivity gain.

5. th0ma5 ◴[] No.44015598[source]
Except here the core functionality changes day to day and hinges on specific word usage.