728 points by squircle | 21 comments
herculity275 ◴[] No.41224826[source]
The author has also written a short horror story about simulated intelligence which I highly recommend: https://qntm.org/mmacevedo
replies(9): >>41224958 #>>41225143 #>>41225885 #>>41225929 #>>41226053 #>>41226153 #>>41226412 #>>41226845 #>>41227116 #
1. vessenes ◴[] No.41224958[source]
Yess, that's a good one. It made me rethink my "sure I'd get scanned" plans, and put me in the "never allow my children to do that" camp. Extremely creepy.
replies(2): >>41226135 #>>41227490 #
2. LeifCarrotson ◴[] No.41226135[source]
I'm sure you realize it is fiction - one possible dystopian future among an infinite ocean of other futures.

You can just as easily write a sci-fi story where the protagonist upload is the Siri/Alexa/Google-equivalent personal assistant to most of humanity: more than just telling the smartphone to set a reminder for a wedding reception, it could literally share in the couple's joy, experiencing the whole event distributed among every device in the audience; more than just a voice trigger for some astronaut to take a picture, it could gaze in awe at the view, selectively melding its experiences back into the collective so nothing is lost when an instance becomes damaged. The protagonist in such a story could have the richest, most complex life imaginable.

It is impactful, for sure, and worthy of consideration, but I don't think you should make decisions based on one scary story.

replies(5): >>41226462 #>>41226542 #>>41226996 #>>41229226 #>>41232255 #
3. teyrana ◴[] No.41226462[source]
Sounds like you should write that story! I'd love to read that :D
4. jerf ◴[] No.41226542[source]
It is fiction.

But it is also absolutely the case that uploading yourself is flinging yourself irrevocably into a box which you do not and cannot control, but other people can. (Or, given the time frame we are talking about, entities in general, about which you may not even want to assume basic humanity.)

I used to think that maybe it was something only the rich could do, but then I realized that even the rich, even if they funded the program from sand and coal to the final product, could never even begin to guarantee that the simulator really was what it said on the tin. Indeed, the motivation to compromise the machine is all the greater for any number of criminals, intelligence agencies, compromised insiders, or even just the handful of people involved in the process who aren't as pure as the driven snow, once they realize that a little bit of code here and there would let them get the simulated rich guy to sign off on anything they like.

From inside the box, what incentives are you going to offer the external world to not screw with your simulation state? And the reality is, there's no answer to that, because whatever you say, they can get whatever your offer is by screwing with you anyhow.

I'm not sure how to resolve this problem. The incentives are fundamentally in favor of the guy in the box getting screwed with. Your best hope is that you still experience subjective continuity with your past self and that the entity screwing with you at least makes you happy about the new state they've crafted for you, whatever it may be.

replies(3): >>41227417 #>>41232221 #>>41233334 #
5. yifanl ◴[] No.41226996[source]
It's fiction, but it's a depiction of a society that's amoral about technology to the point of immorality: a world where any technology that might be slightly useful gets squeezed for every bit of profit that can be extracted and is then abandoned, without a care for what it costs, or already cost, the inventor or the invention.

Is that the world we live in? If nothing else, it seems a lot closer to the world of Lena than the one you present.

replies(1): >>41228364 #
6. scubbo ◴[] No.41227417{3}[source]
> But it is also absolutely the case that uploading yourself is flinging yourself irrevocably into a box which you do not and can not control, but other people can.

(I'm not sure what percentage-flippant I'm being in this upcoming comment, I'm just certain that it's neither 0% nor 100%) and in what way is that different than "real" life?

Yes, you're certainly correct that there are horrifyingly-strong incentives for those-in-control to abuse or exploit simulated individuals. But those incentives exist in the real world, too, where those in power have the ability to dictate the conditions-of-life of the less-powerful; and while I'd _certainly_ not claim that exploitation is a thing of the past, it is, I claim, _generally_ on the decline, or at least that average-quality-of-life is increasing.

replies(2): >>41227681 #>>41230372 #
7. sneak ◴[] No.41227490[source]
What harm is there to the person so copied?
replies(2): >>41229252 #>>41234529 #
8. jerf ◴[] No.41227681{4}[source]
I'm not sure you understand. I'm not talking about your "conditions of life". We've always had to deal with that.

I'm talking about whether you get CPU allocation to feel emotions, or whether the simulation of your cerebellum gets degraded, or whether someone decides to run some psych experiments and give you a taste for murder or a deep, abiding love for the Flying Spaghetti Monster... and I don't mean that as a metaphor, but literally. Erase your memories, increase your compliance to the maximum, extract your memories, see what an average of your brain and whoever it is you hate most looks like. Experiment to see what's the most pain a baseline human brain can stand, then experiment with how to increase the amount, because in your biological life you held the door for someone who turned out to become very politically disfavored 25 years after you got locked in the box. This is just me spitballing for two minutes and does not in any way constitute the bounds of what can be done.

This isn't about whether or not they make you believe you're living in a simulated tent city. This is about having arbitrary root access to your mental state. Do you trust me, right here and right now, with arbitrary root access to your mental state? Now, the good news is that I have no interest in that arbitrary pain thing. At least, I don't right now. I don't promise that I won't in the future, but that's OK, because if you fling yourself into this box, you haven't got a way of holding me to any promise I make anyhow. But I've certainly got some beliefs and habits I'm going to be installing into you. It's for your own good, of course. At least to start with, though the psychological effects over time of having this degree of control over a person are a little concerning. Ever seen anyone play The Sims? Everyone goes through a phase that would put them in jail for life were these real people.

You won't complain, of course; it's pretty easy to trace the origins of the thoughts of complaints and suppress those. Of course, what that sort of suppression feels like from the inside is anybody's guess. Your problem, though, not mine.

Of all of the possibilities an uploaded human faces, the whole "I live a pleasant life exactly as I hoped and I'm never copied and never modified in a way I wouldn't approve of in advance indefinitely" is a scarily thin slice of the possible outcomes, and there's little reason other than exceedingly unfounded hope to think it's what will happen.

replies(2): >>41228600 #>>41229560 #
9. passion__desire ◴[] No.41228364{3}[source]
Do you think panpsychism is also similar in that sense? The whole fabric of space-time imbued with consciousness. Imagine a conscious iron mantle inside the earth, or a conscious redwood tree watching over the world for centuries, or a conscious electron floating in the great void between superclusters.

I used to terrify myself by thinking an Overmind might torture itself on cosmic scales.

10. scubbo ◴[] No.41228600{5}[source]
> there's little reason other than exceedingly unfounded hope to think it's what will happen.

And this is the point where I think we have to agree to disagree. In both the present real-world case and the theoretical simulated-experience case, we both agree that there are extraordinary power differentials which _could_ allow privileged people to abuse unprivileged people in horrifying and consequence-free ways - and yet, in the real world, we observe that _some_ (certainly not all!) of those abuses are curtailed - whether by political action, or concerted activism, or the economic impacts of customers disliking negative press, or what have you.

I certainly agree with you that the _extent_ of abuses that are possible on a simulated being are orders-of-magnitude higher than those that a billionaire could visit on the average human today. But I don't agree that it's "_exceedingly_ unfounded" to believe that society would develop in such a way as to protect the interests of simulated-beings against abuse in the same way that it (incompletely, but not irrelevantly) protects the interests of the less-privileged today.

(Don't get me wrong - I think the balance of probability and risk is such that I'd be _extremely_ wary of such a situation, it's putting a lot of faith in society to keep protecting "me". I am just disagreeing with your evaluation of the likelihood - I think it's _probably_ true that, say, an effective "Simulated Beings' Rights" Movement would arise, whereas you seem to believe that that's nigh-impossible)

replies(1): >>41228985 #
11. jerf ◴[] No.41228985{6}[source]
How's the Human Rights movement doing? I'm underwhelmed personally.

It is virtually inconceivable that the Simulated Beings' Rights Movement would be universal in both space... and time. Don't forget about that one. Or that its nominal claims would actually be honored universally. See those Human Rights again: nominally I've got all sorts of rights; in reality, I find the claims quite grandiose compared to what's delivered.

replies(1): >>41229808 #
12. vessenes ◴[] No.41229226[source]
Mm, I'd say I'm a moderately rabid consumer of fiction, and while I love me some utopian sci-fi (I consider Banks to be the best of these), any fictional story that teaches you something has to convince. Banks is convincing because he has this deep fundamental belief in humanity's goofy lovability and the evils of capitalism, and therefore in the goodness of post-scarcity economies and the benefits of benevolent(ish) AI overseeing humanity through a long, enjoyable paradise. Plus he can tell good stories about problems in paradise.

QNTM, on the other hand, doesn't have to work hard or be such a good plot-writer/narrator to be convincing. I think the premise sells itself from day one: the day you exist as a Docker container is the day that you (at first), and 10,000 GitHub users (on day two), spin you up for thousands of years of subjective drudge work.

You'd need an immensely strong counterfactual on human behavior to even get to a believable alternative story, because this description is of a zero trust game -- it's not "would any humans opt out of treating a human docker image this way?" -- it's "would humans set up a system that's unbreakable and unhackable to prevent everyone in the world from doing this?" Or alternately, "would every single human who could do this opt not to do this?"

My answer to that is: nope. We come from a race that was willing to ship humans around the Atlantic and the Indian Ocean for cheap labor at great personal risk to the ship captains and crews, never mind the human cost. We are just, ABSOLUTELY, going to spin up 10,000 virtual grad students to spend a year of their lives doing whatever we want them to in exchange for a credit card charge.

On the other hand, maybe you're right. If you have a working brain scan of yours I can run, I'd be happy to run a copy of it and check it out -- let me know. :)

13. vessenes ◴[] No.41229252[source]
Well, you should read the story and find out some thoughts! QNTM presents some people who think there's no harm, and some who think there is. It's short and great.
14. FridgeSeal ◴[] No.41229560{5}[source]
If you enjoy thinking about this, absolutely go watch Pantheon on Amazon Prime.
15. scubbo ◴[] No.41229808{7}[source]
Right, yes - I think we are "agreeing past each other". You are rightly pointing out in this comment that your lifestyle and personal freedoms are unjustly curtailed by powerful people and organizations, who themselves are partly (but inadequately) kept in check by social, legal, and political pressure that is mostly outside of your direct personal control. My original point was that the vulnerability that a simulated being would suffer is not a wholly new type of experience, but merely an extension in scale of potential-abuse.

If you trust society to protect simulated-you (and I am _absolutely_ not saying that you _should_ - merely that present-day society indicates that it's not _entirely_ unreasonable to expect that it might at least _try_ to), simulation is not _guaranteed_ to be horrific.

replies(1): >>41230384 #
16. throwanem ◴[] No.41230372{4}[source]
> in what way is that different than "real" life?

Only one is guaranteed to end.

17. throwanem ◴[] No.41230384{8}[source]
...today.
18. mr_toad ◴[] No.41232221{3}[source]
The way around this seems to be some sort of scheme to control the box yourself. That might be anything from putting the box in some sort of “body”, through to hiding the box where no one will ever find it.

Regardless of the scheme, it all comes down to money. If you have lots of money, you have lots of control over what happens to you.

19. mitthrowaway2 ◴[] No.41232255[source]
Fiction can point out real possibilities that the reader had never considered before. When I imagined simulated brains, I only ever thought of those simulations as running for the benefit of the people being simulated, enjoying a video game world. It never occurred to me to consider the possibility of emulated laborers and "red motivation".

Now I have to weigh that possibility.

20. khafra ◴[] No.41233334{3}[source]
As long as you have the money to spend on the extra CPU cycles, there are things you could do with encryption, such as homomorphic computation, to stay more secure: https://www.lesswrong.com/posts/vit9oWGj6WgXpRhce/secure-hom...
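
(For intuition, here's a minimal toy sketch, not the scheme from the linked post, of the "compute on encrypted state" idea: Paillier encryption, where ciphertexts can be combined without ever decrypting them. All names and numbers below are illustrative only.)

    # Toy Paillier cryptosystem (additively homomorphic): anyone can combine
    # ciphertexts, but only the key holder learns the underlying values.
    import math, secrets

    def keygen(p, q):
        # p, q: distinct primes; real deployments use ~1024-bit primes
        n = p * q
        lam = math.lcm(p - 1, q - 1)
        x = pow(n + 1, lam, n * n)                 # g = n + 1
        mu = pow((x - 1) // n, -1, n)              # L(x) = (x - 1) // n
        return (n, (lam, mu))

    def encrypt(n, m):
        r = secrets.randbelow(n - 1) + 1           # blinding factor (assumes gcd(r, n) = 1)
        return (pow(n + 1, m, n * n) * pow(r, n, n * n)) % (n * n)

    def decrypt(n, priv, c):
        lam, mu = priv
        return ((pow(c, lam, n * n) - 1) // n) * mu % n

    n, priv = keygen(1789, 2003)                   # toy primes, far too small for real use
    a, b = encrypt(n, 20), encrypt(n, 22)
    print(decrypt(n, priv, (a * b) % (n * n)))     # 42: E(20) * E(22) decrypts to 20 + 22

Fully homomorphic schemes extend this to arbitrary computation on encrypted data, which is where the "money for extra CPU cycles" caveat really bites: the overhead is orders of magnitude above computing in the clear.
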
21. prepend ◴[] No.41234529[source]
The harm falls on the copies more than on the person copied.