
223 points | benkaiser | 1 comment
jccalhoun No.42546170
It is fun and frustrating to see what LLMs can and can't do. Last week I was trying to find the name of a movie, so I typed a description of a scene into ChatGPT and said "I think it was from the late 70s or early 80s, and even though it is set in the USA, I'm pretty sure it is European," and it correctly told me it was The House by the Cemetery.

Then last night I saw a video about the Parker Solar Probe and how, at 350,000 mph, it is the fastest-moving man-made object. So I asked ChatGPT how long it would take, at that speed, to get to Alpha Centauri, which is 4.37 light years away. It said it would take 59.8 million years. I knew that was way too long, so I had it convert mph to miles per year, and then it was able to give me the correct answer of 6817 years.
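The arithmetic here is easy to sanity-check by hand. A minimal sketch, assuming one light year is about 5.8786 trillion miles; note that 6,817 years actually corresponds to a speed of roughly 430,000 mph (close to the probe's reported peak), while at 350,000 mph the trip works out to roughly 8,400 years:

```python
# Sanity check of the Alpha Centauri travel-time arithmetic.
LY_MILES = 5.8786e12          # miles in one light year (approx.)
DISTANCE = 4.37 * LY_MILES    # distance to Alpha Centauri in miles
HOURS_PER_YEAR = 24 * 365.25

def years_to_alpha_centauri(speed_mph: float) -> float:
    """Travel time in years at a constant speed in mph."""
    return DISTANCE / speed_mph / HOURS_PER_YEAR

print(round(years_to_alpha_centauri(350_000)))  # ~8373 years at 350,000 mph
print(round(years_to_alpha_centauri(430_000)))  # ~6815 years at ~430,000 mph
```

Either way, the first answer of 59.8 million years is off by four orders of magnitude, which is the kind of unit-conversion slip the follow-up prompt fixed.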

replies(3): >>42547089 #>>42547339 #>>42547721 #
QuantumG No.42547089
Whereas previously (for your first example) you would have had a conversation with the guy at the video store, and he'd not only tell you the movie but also recommend something else you might like.
replies(1): >>42547275 #
sadeshmukh No.42547275
So instead, you'd drive there to ask someone who probably doesn't know either, make conversation (which they might not want), and then you might get a recommendation. You can also ask ChatGPT for recommendations. This isn't a case where I would return to pre-LLM times.
replies(1): >>42547763 #
QuantumG No.42547763
One day you'll understand how valuable it was to have a person with knowledge and their own thoughts.
replies(1): >>42551112 #
sadeshmukh No.42551112
Isn't that called socializing? And there have never been more people out there interested in whatever you're interested in.