
359 points by FromTheArchives | 2 comments
oceanhaiyang No.45293700
No one who understands AI can rely on it to help them learn. I provided one with 100 citations I wanted to standardize, and it deleted 10 and made up 10 to replace them. Can’t imagine this being used to replace a textbook, or even to explain one.
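
One way to catch exactly that failure mode is to treat the model’s output as untrusted and diff the citation identifiers before and after the pass. A minimal sketch in Python, assuming the citations carry DOIs (the regex and function names are illustrative, not from any particular tool):

    import re

    # Illustrative: extract DOIs so the model's output can be checked
    # against the input. Citations without DOIs would need another key.
    DOI_RE = re.compile(r"10\.\d{4,9}/\S+")

    def doi_set(bibliography: str) -> set[str]:
        return set(DOI_RE.findall(bibliography))

    def check_citations(original: str, llm_output: str) -> None:
        before, after = doi_set(original), doi_set(llm_output)
        dropped = before - after    # citations silently deleted
        invented = after - before   # citations made up
        if dropped or invented:
            raise ValueError(
                f"model altered citations: dropped={dropped}, invented={invented}")

Running check_citations on the raw input and the model’s reply before accepting the result won’t catch a citation whose DOI survived while its other fields were mangled, but it does catch deletions and fabrications like the ones described above.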
replies(3): >>45293802 >>45294049 >>45296961
criddell No.45293802
> explain a textbook

I've had very good luck using LLMs to do this. I paste the part of the book that I don't understand and ask questions about it.
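
For anyone who wants to script this workflow rather than use a chat UI, here is a minimal sketch with the official openai Python client (the model name, prompts, and grounding instruction are illustrative assumptions; the instruction reduces, but does not eliminate, hallucination):

    from openai import OpenAI  # assumes the openai package and an OPENAI_API_KEY

    client = OpenAI()

    passage = "...the paragraph from the book you don't understand..."
    question = "What is the author claiming in the second sentence?"

    # Ask the model to answer only from the pasted text, so answers
    # stay anchored to the passage rather than the model's priors.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {"role": "system",
             "content": "Answer only from the passage provided. If the passage "
                        "does not contain the answer, say so."},
            {"role": "user",
             "content": f"Passage:\n{passage}\n\nQuestion: {question}"},
        ],
    )
    print(response.choices[0].message.content)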

replies(3): >>45293911 >>45293968 >>45294577
bigfishrunning No.45293911
But the problem is that you don’t understand the passage, so how can you vet the answers? Hallucinations seem like they’d be very damaging in this use case.
replies(4): >>45294006 >>45294052 >>45294616 >>45295154
OtherShrezzing No.45294616
I think your mileage will vary by subject and level.

If you’re a complete novice reading a niche graduate-level textbook on Tolstoy’s critique of the Russian war effort in War and Peace, you’re going to get some wild hallucinations, and you’ll have no way to separate fact from fiction.

If you’re reading a high school textbook about the history of pre-revolution Russia, the models will have pretty comprehensive coverage of every concept you’re likely to come across.

replies(1): >>45298175
palmotea No.45298175
> If you’re reading a high school textbook about the history of pre-revolution Russia, the models will have pretty comprehensive coverage of every concept you’re likely to come across.

Even in that case, it can still get its wires crossed, creating connections between those concepts that aren't true.