321 points | distantprovince | 5 comments
1. crazygringo No.44617589
>> "I asked ChatGPT and this is what it said: <...>".

> Whoa, let me stop you right here buddy, what you're doing here is extremely, horribly rude.

How is it any different from "I read book <X> and it said that..."? Or "Book <X> has the following quote about that:"?

I definitely want to know where people are getting their info. It helps me understand how trustworthy it might be. It's not rude, it's providing proper context.

replies(4): >>44617662 >>44617672 >>44617825 >>44617906
2. toast0 No.44617662
Because published books, depending on genre, have earned a presumption of being based on reality. And it's easy to reproduce a book lookup, and see if they link to sources. I might have experience with that book and know of its connection with reality.

ChatGPT and similar have not earned a presumption of reality for me, and the same question may get many different answers, and afaik, even if you ask it for sources, they're not necessarily real either.

IMHO, it's rude to use ChatGPT and share it with me as if it's informative; it disrespects my search for truth. It's better that you mention it, so I can disregard the whole thing.

3. Arainach No.44617672
A book is a credentialed source that can be referenced. A book is also something that not everyone may have on hand, so a pointer can be appreciated. LLMs are not that. If I wanted to know what an LLM said, I'd ask it myself. I'm asking you/the team because I want to understand what YOU think. Unfortunately, it's becoming increasingly clear that certain people and coworkers often don't actually think at all - particularly the ones who take any question and just throw it at the planet-burning machine.
4. mook No.44617825
To me, it's different because having read a book, remembered it, and pulled out the quote means you spent time on it. Pasting a response from ChatGPT means you didn't even bother to read the output, understand it, think it through to make sure it makes sense, and resynthesize it.

It mostly means you don't respect the other person's time and you're making them do the vetting. And that's the rude part.

5. scarface_74 No.44617906
I assume a book is correct, or at least that the author thought it was correct, when it comes to non-ideological topics.

But you can’t assume positive intent or any intent from an LLM.

I always test the code, review it for corner cases, remove unnecessary comments, etc., just like I would with a junior dev.

For facts, I ask it to verify whatever it says against web sources. I might then use it to summarize them. But even then, I have my own writing style I steer it toward, and then I edit the result.