
323 points timbilt | 2 comments
1. throttlebody No.42130052
Looking up stuff with any efficiency requires a significant amount of prior knowledge just to ask the right question.
replies(1): >>42132842 #
2. kamaal No.42132842
The prior knowledge part becomes important once you realise that verifying an LLM's output requires that same knowledge.

In fact, you can only ask for the smallest possible increment, so that the answer can be verified with the least possible effort, and then build from there.

The same issue happens with code. It's not like total beginners will be able to write a replacement for the Linux kernel in their first 5 minutes of use, or that a product manager will just write a product spec and a billion-dollar product will be produced magically.

You will still do most of the code work; AI will just do the smart typing for you.
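
To make that increment-and-verify loop concrete, here is a minimal sketch (the slugify helper and its checks are hypothetical, purely for illustration): you ask the model for one small, self-contained function, paste it in, and run a couple of quick checks you wrote yourself before building anything on top of it.

    # Hypothetical example: a small helper as an LLM might produce it.
    # The increment is small enough to verify in seconds.
    import re

    def slugify(title: str) -> str:
        """Lowercase, drop punctuation, and join words with hyphens."""
        words = re.findall(r"[a-z0-9]+", title.lower())
        return "-".join(words)

    # Checks you write yourself before trusting or extending the code.
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  Multiple   spaces ") == "multiple-spaces"
    assert slugify("") == ""
    print("slugify verified, safe to build on")

The point is that verifying this takes almost no effort precisely because the increment is tiny; that only works if you already know what correct output looks like.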

Perhaps it all comes down to the fact that you have to verify the output of the process. And to do that, you need to be aware of what you are doing at a very fundamental level.