
755 points MedadNewman | 1 comment | source
lxe ◴[] No.42891381[source]
You can also intercept the XHR response. Generation will still stop, but the UI won't update, revealing the thoughts that led to the content filter:

    // Strip any stream lines that mention the content_filter event before
    // the page's own code can see them.
    const filter = t => typeof t === 'string'
      ? t.split('\n').filter(l => !l.includes('content_filter')).join('\n')
      : t; // non-string responses (e.g. responseType 'json') pass through

    ['response', 'responseText'].forEach(prop => {
      const orig = Object.getOwnPropertyDescriptor(XMLHttpRequest.prototype, prop);
      Object.defineProperty(XMLHttpRequest.prototype, prop, {
        get: function() { return filter(orig.get.call(this)); },
        configurable: true // allow re-pasting without a page reload
      });
    });
Paste the above in the browser console ^
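
If the page streams over fetch() rather than XMLHttpRequest, the same trick should work by wrapping fetch and filtering the body stream. A rough sketch under the same assumption (newline-delimited events containing the string content_filter); note the naive line split can miss a marker that straddles two chunks:

    // Hypothetical fetch wrapper (a sketch, not tested against DeepSeek's UI):
    // filters newline-delimited chunks the same way as the XHR version above.
    const origFetch = window.fetch;
    window.fetch = async (...args) => {
      const res = await origFetch(...args);
      if (!res.body) return res;
      const filtered = res.body
        .pipeThrough(new TextDecoderStream())
        .pipeThrough(new TransformStream({
          transform(chunk, controller) {
            // naive: a line split across two chunks can slip through
            controller.enqueue(chunk.split('\n')
              .filter(l => !l.includes('content_filter')).join('\n'));
          }
        }))
        .pipeThrough(new TextEncoderStream());
      return new Response(filtered, res); // status/headers copied from original
    };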
replies(2): >>42891427 #>>42891516 #
tills13 ◴[] No.42891516[source]
insane that this is client-side.
replies(8): >>42891775 #>>42891802 #>>42892213 #>>42892242 #>>42892457 #>>42896609 #>>42896617 #>>42896757 #
Gigachad ◴[] No.42896617[source]
It’s because they want to show the output live rather than nothing for a minute. But that means that once the censor system detects something, they have to send out an event telling the client to delete the previously displayed content.
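
For illustration, a minimal client-side sketch of that flow; the endpoint name and message shapes here are made up, since the thread doesn't show DeepSeek's actual wire format:

    // Hypothetical streaming handler: '/api/chat' and the message shapes
    // are assumptions for illustration only.
    const es = new EventSource('/api/chat');
    const out = document.getElementById('output');

    es.onmessage = e => {
      const msg = JSON.parse(e.data);
      if (msg.type === 'token') {
        out.textContent += msg.text;       // render tokens live as they stream
      } else if (msg.type === 'content_filter') {
        out.textContent = '';              // retract everything already shown
        es.close();
      }
    };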

This weakness doesn’t matter much to them, because airtight censorship isn’t the goal; they just want to avoid news articles about how their system generated something bad.

replies(3): >>42896943 #>>42897228 #>>42897366 #
bolognafairy ◴[] No.42896943[source]
Erm, in DeepSeek’s case, it’s not “news articles” that they’d be most concerned about.
replies(1): >>42896997 #
miohtama ◴[] No.42896997[source]
They have the same fear as everyone else: “teenager learns how to cook napalm from an AI”.
replies(2): >>42897116 #>>42898878 #
yndoendo ◴[] No.42898878{3}[source]
You don't need AI for such things. Just search for the Anarchist Cookbook in a search engine; Amazon even sells it. [0]

[0] https://www.amazon.com/Anarchist-Cookbook-William-Powell/dp/...

replies(1): >>42900850 #
miohtama ◴[] No.42900850{4}[source]
Exactly