
745 points by melded | source
Boogie_Man ◴[] No.45946127[source]
I'm reminded of the time GPT4 refused to help me assess the viability of parking a helium zeppelin an inch off the ground to bypass health department regulations because, as an aircraft in transit, I wasn't under their jurisdiction.
replies(5): >>45946304 #>>45946311 #>>45946466 #>>45946596 #>>45946774 #
Aurornis ◴[] No.45946774[source]
The other side of this problem is the never-ending media firestorm that erupts any time a crime or tragedy occurs and a journalist tries to link it to the perpetrator's ChatGPT history.

You can see why the LLM companies are overly cautious around any topics that are destined to be weaponized against them.

replies(5): >>45946866 #>>45946902 #>>45947036 #>>45947352 #>>45947531 #
JohnMakin ◴[] No.45946866[source]
I mean, when kids are making fake chatbot girlfriends that encourage suicide and then they do so, do you 1) not believe there is a causal relationship there, or 2) believe it shouldn't be reported on?
replies(1): >>45947030 #
ipaddr ◴[] No.45947030[source]
It should not be reported on. Kids are dressing up as wizards. A fake chatbot girlfriend is something they make fun of. Kids like to pretend; they want to try out things they aren't.

I'm more concerned with the 40-year-old who won't date a real girl because he's in love with a bot.

Bots encouraging suicide are more of a teen or adult problem. A little child doesn't have teenage (or adult) hormones, which can create these highs and lows. Toddler suicide is a non-issue.

replies(3): >>45947458 #>>45947494 #>>45964837 #
Wowfunhappy ◴[] No.45947458[source]
> I'm more concerned with the 40-year-old who won't date a real girl because he's in love with a bot.

Interestingly, I don't find this concerning at all. Grown adults should be able to love whomever and whatever they want. Man or woman, bot or real person, it's none of my business!