
504 points puttycat | 2 comments
michaelcampbell ◴[] No.46182585[source]
After watching a recent interview with Cory Doctorow, I'm going to stop anthropomorphizing these things by calling them "hallucinations". They're computers, so these incidents are simply errors.
replies(5): >>46182654 #>>46182851 #>>46183122 #>>46183153 #>>46183590 #
grayhatter ◴[] No.46182654[source]
I'll continue calling them hallucinations. That's a much more fitting term when you account for how reasonable it is for people to believe them. There's also a huge breadth of error types that don't pattern-match to "made-up bullshit" the way "hallucination" does. There's no need to introduce that ambiguity when discussing something narrow.

There's nothing wrong with anthropomorphizing genai: its source material is human-sourced, and humans are going to use human-like pattern matching when interacting with it. I.e., this isn't the river I want to swim upstream in. I assume you wouldn't complain if someone anthropomorphized a rock... up until they started to believe it was actually alive.

replies(1): >>46182734 #
1. vegabook ◴[] No.46182734[source]
Given that an (incompetent or even malicious) human put their name(s) to this stuff, "bullshit" is an even better and more fitting anthropomorphization.
replies(1): >>46182792 #
2. grayhatter ◴[] No.46182792[source]
> incompetent or even malicious

Sufficiently advanced incompetence is indistinguishable from actual malice... and thus should be treated the same.