
755 points MedadNewman | 4 comments
teeth-gnasher No.42891613
I have to wonder what “true, but x-ist” heresies^ western models will only say in b64. Is there a Chinese forum where everyone’s laughing about circumventing the censorship regimes of the west?

^ https://paulgraham.com/heresy.html

1. chris12321 No.42891800
ChatGPT won't tell you how to do anything illegal; for example, it won't tell you how to make drugs.
2. teeth-gnasher No.42891900
Sure, but I wouldn’t expect deepseek to either. And if any model did, I’d damn sure not bet my life on it not hallucinating. Either way, that’s not heresy.
3. riskable No.42892335
> I’d damn sure not bet my life on it not hallucinating.

One would think that if you asked it to help you make drugs, you'd want hallucination as an outcome.

4. lukan No.42893671
Very funny.

But no. Only a very, very small percentage of drug users want hallucinations.

Hallucinations usually happen when something has gone wrong.

(So a hallucinating LLM giving drug advice might well result in real hallucinations for the user, but also in permanent kidney damage.)