LLM Inevitabilism

(tomrenner.com)
1611 points by SwoopsFromAbove | 2 comments
Animats ◴[] No.44568076[source]
There may be an "LLM Winter" as people discover that LLMs can't be trusted to do anything. Look for frantic efforts by companies to offload responsibility for LLM mistakes onto consumers. We've got to have something with solid "I don't know" and "I don't know how to do this" outputs. We're starting to see reports of LLM usage having negative value for programmers, even while the programmers themselves believe it's helping. Too much effort goes into cleaning up LLM messes.
replies(5): >>44568232 #>>44568321 #>>44568785 #>>44570451 #>>44578122 #
Buttons840 ◴[] No.44570451[source]
We need to put the LLMs inside systems that ensure they can only do correct things.

Put an LLM on documentation or man pages. Tell the LLM to output a range of line numbers, and have the system actually look up those lines and quote them. The overall effect is that the LLM can produce some free-form output but is expected to back its claims with a citation, and the citation can't be hallucinated: the LLM doesn't generate the quoted text, a plain old computer program does.
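
A minimal sketch of that quoting step, in Python. Everything here (the function name, the pre-rendered man-page file, the bracketed citation format) is my own assumption for illustration, not anything specified above:

    # Sketch: the model proposes a citation as a line range; plain code,
    # not the model, fetches and prints the quoted text.

    def quote_lines(path: str, start: int, end: int) -> str:
        """Return lines start..end (1-indexed, inclusive) of a text file."""
        with open(path, encoding="utf-8") as f:
            lines = f.read().splitlines()
        if not (1 <= start <= end <= len(lines)):
            raise ValueError(f"range {start}-{end} not in 1-{len(lines)}")
        return "\n".join(lines[start - 1 : end])

    # Suppose the model answered: "fsync flushes data to disk [lines 120-124]".
    # The system renders the citation itself, so the quoted text can't be
    # hallucinated; at worst the range is irrelevant, which a reader can see.
    print(quote_lines("fsync.2.txt", 120, 124))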

And we haven't seen LLMs integrated with type systems yet. There are very powerful type systems, like dependent types, that can prove things like "this function returns a sorted list of numbers", and the type system ensures that is ALWAYS true [0], at compile time. You have to write a lot of proof code to help the compiler do these checks, but if an LLM can write those proofs, we can trust they are correct, because only correct proofs will compile.

[0]: Or rather, almost always true. There's always the possibility of running out of memory or the power going out.
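
As a toy illustration of the idea, here is a sketch in Lean 4 (the names and the decidable-predicate encoding are mine, chosen for brevity rather than full dependent types): a value of the type below can only be constructed together with a proof, and the compiler rejects anything else, no matter whether a human or an LLM wrote it.

    -- Sketch (hypothetical names): a list bundled with a compile-time
    -- proof that it really is sorted.
    def isSorted : List Nat → Bool
      | [] => true
      | [_] => true
      | x :: y :: rest => decide (x ≤ y) && isSorted (y :: rest)

    abbrev SortedList := { xs : List Nat // isSorted xs = true }

    -- The proof obligation evaluates away, so `decide` closes it.
    def ok : SortedList := ⟨[1, 2, 3], by decide⟩

    -- This would be a compile error, whoever wrote it:
    -- def bad : SortedList := ⟨[3, 1], by decide⟩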

replies(2): >>44571120 #>>44571171 #
1. digianarchist ◴[] No.44571120{3}[source]
Are models capable of generating citations? Every time I've asked ChatGPT for citations, they either don't exist or are incorrect.
replies(1): >>44571490 #
2. Buttons840 ◴[] No.44571490[source]
They can't pull citations out of their own weights, but if you give them tools to look up man pages (possibly annotated with line numbers), they could cite the lines that support their claims.
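
One sketch of what such a tool could look like, in Python. The helper name, the reliance on the system `man` command, and the numbered rendering are all my assumptions, not anything established in the thread:

    # Sketch: a plain program renders a man page with line numbers; the
    # model cites ranges, and the same program quotes them back verbatim.
    import os
    import subprocess

    def man_page_numbered(name: str) -> list[str]:
        """Render a man page as plain text, one entry per line, 1-indexed."""
        out = subprocess.run(
            ["man", name], capture_output=True, text=True, check=True,
            env={**os.environ, "MANPAGER": "cat", "MANWIDTH": "80"},
        ).stdout
        return out.splitlines()

    # Show the model a numbered view; it answers with claims plus line
    # ranges, and the cited text is always reproduced by this code,
    # never generated by the model.
    for i, line in enumerate(man_page_numbered("fsync")[:10], start=1):
        print(f"{i:4}  {line}")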