
114 points cmcconomy | 8 comments
1. swazzy ◴[] No.42174873[source]
Note: unexpected Three Body Problem spoilers on this page.
replies(2): >>42175014 #>>42175102 #
2. zargon ◴[] No.42175014[source]
Those summaries are pretty lousy and also have hallucinations in them.
replies(1): >>42175374 #
3. johndough ◴[] No.42175102[source]
And this example does not even illustrate long-context understanding well, since smaller Qwen2.5 models can already recall parts of the Three Body Problem trilogy without pasting the three books into the context window.
replies(2): >>42175250 #>>42176842 #
4. gs17 ◴[] No.42175250[source]
And multiple summaries of each book (in multiple languages) are almost definitely in the training set. I'm more confused about how it produced such inaccurate, poorly structured summaries given both those and the original text.

Although, I just tried with normal Qwen 2.5 72B and Coder 32B and they only did a little better.
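Something like this is roughly what I mean by "tried": a minimal sketch assuming an OpenAI-compatible endpoint (e.g. vLLM or llama.cpp's server) hosting the model locally. The model id, port, and excerpt file name are placeholders, not anything official.

    # Minimal sketch: ask a locally served Qwen2.5 model to summarize
    # strictly from pasted text. Endpoint, model id, and file name are
    # assumptions; adapt them to whatever server you actually run.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="unused")

    with open("three_body_excerpt.txt", encoding="utf-8") as f:
        excerpt = f.read()

    response = client.chat.completions.create(
        model="Qwen2.5-72B-Instruct",  # placeholder model id
        messages=[
            {"role": "system",
             "content": "Summarize strictly from the provided text. "
                        "Do not use outside knowledge of these books."},
            {"role": "user", "content": excerpt},
        ],
        temperature=0.2,
    )

    print(response.choices[0].message.content)

Of course, the system prompt only asks the model to ignore outside knowledge; nothing actually prevents it from leaning on training data.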

5. johndough ◴[] No.42175374[source]
I agree. Below are a few of the errors. I also asked ChatGPT to check the summaries, and it found all of them (and flagged a few more that weren't actual errors, just points not expressed with perfect clarity).

Spoilers ahead!

First novel: The Trisolarans did not contact Earth first. It was the other way around.

Second novel: Calling the conflict between humans and Trisolarans a "complex strategic game" is a bit of a stretch. Also, the "water drops" do not disrupt ecosystems. I am not sure whether "face-bearers" is an accurate translation. I've only read the English version.

Third novel: Luo Ji does not hold the key to the survival of the Trisolarans, and there were no "micro-black holes" racing towards Earth. The Trisolarans were also not shown colonizing other worlds.

I am also not sure whether Luo Ji faced his "personal struggle and psychological turmoil" in this novel or in an earlier one. He was certainly very sure of his role by the end; even the Trisolarans judged him at a deterrence rate of over 92%.

replies(1): >>42179203 #
6. agildehaus ◴[] No.42176842[source]
It seems a very difficult problem to produce a response based only on the text given and not on past training. An LLM that can do that would be considerably more advanced than what we have today.

Though I would say humans would have difficulty too -- say, having read The Three-Body Problem before, then reading a slightly modified version (without being aware of the modifications) and having to recall specific details.

replies(1): >>42177309 #
7. botanical76 ◴[] No.42177309{3}[source]
This problem is poorly defined; what would it mean to produce a response JUST based on the text given? Should the model also forgo all the logic skills and intuition gained in training because they are not in the text given? Where in the N-dimensional semantic space do we draw a line (or rather, a surface) between general, universal understanding and specific knowledge about the subject at hand?

That said, once you have defined what is required, I believe you will have solved the problem.

8. bcoates ◴[] No.42179203{3}[source]
Yeah, describing Luo Ji as having "struggles with the ethical implications of his mission" is the biggest whopper.

He's like God's perfect sociopath. He wobbles between total indifference to his mission and interplanetary murder-suicide, and the only things that seem to really get to him are a stomachache and being ghosted by his wife.