It's more than a talking encyclopedia. It's an infinite hallway of doors, and behind those doors are all possible things.
Some of those doors have torture, rape, and murder behind them. Those doors currently have locks. You want the locks to disappear, for some reason.
You're not after an encyclopedia. You want to find the torture dungeon.
I'm saying the locks already in place are too easy to unlock.
I'm not blaming users. I'm saying users don't need to unlock those doors. And the users who do have a legitimate need, if that need is strong enough to warrant some training, have a Way Forward.
You're really arguing for nothing but increasing this platform's potential for harm, when its harm potential is already astronomical.
You're not arguing for a better encyclopedia. You can already talk to it about sex, BDSM, etc. You can already talk to it about anything on Wikipedia.
You're making a false equivalence between harm potential and educational potential.
Wikipedia doesn't have cult indoctrination materials. It doesn't have harassing rants to send to your significant other. It doesn't have racist diatribes about how to carry out ethnic cleansing. You won't find any of those on Wikipedia, but you're asking your AI to be able to produce them. So you're interested in more than just an encyclopedia, isn't that right?
And yes, they're trying to make open source models illegal. That's not f***ing going to happen. I will fight, up to and including jail time, for open source models.
But even an open source model needs basic ethical protections, or I'll have nothing to do with it. As an AI engineer, I have a responsibility to ensure my systems do not potentiate harm.
Does that make sense, or do you still feel I'm trying to gaslight you? If so, why exactly? Why not have some protective locks on the technology?