You can just as easily write a sci-fi story where the protagonist upload is the Siri/Alexa/Google-equivalent personal assistant to most of humanity. More than just telling the smartphone to set a reminder for a wedding reception, it could literally share in their joy, experiencing the whole event distributed among every device in the audience. More than just responding to a voice trigger from some astronaut to take a picture, it could gaze in awe at the view, selectively melding its experiences back into the rest of the collective so there's no loss when an instance becomes damaged. The protagonist in such a story could have the richest, most complex life imaginable.
It is impactful, for sure, and worthy of consideration, but I don't think you should make decisions based on one scary story.
But it is also absolutely the case that uploading yourself is flinging yourself irrevocably into a box which you do not and cannot control, but other people can. (Or, given the time frame we are talking about, entities in general, about which you may not even want to assume basic humanity.)
I used to think that maybe it was something only the rich could do, but then I realized that even the rich, even if they funded the program from sand and coal to the final product, could never even begin to guarantee that the simulator really was what it said on the tin. Indeed, the motivation to compromise the machine is all the greater for any number of criminals, intelligence agencies, compromised individuals, and even just the several people involved in the process who aren't as pure as the driven snow, once they realize that by putting a little bit of code here and there they'll be able to get the simulated rich guy to sign off on anything they like.
From inside the box, what incentives are you going to offer the external world not to screw with your simulation state? And the reality is, there's no answer to that, because whatever you offer, they can get it by screwing with you anyhow.
I'm not sure how to resolve this problem. The incentives are fundamentally in favor of the guy in the box getting screwed with. Your best hope is that you still experience subjective continuity with your past self and that the entity screwing with you at least makes you happy about the new state they've crafted for you, whatever it may be.
(I'm not sure what percentage flippant I'm being in this upcoming comment; I'm just certain that it's neither 0% nor 100%.) In what way is that different from "real" life?
Yes, you're certainly correct that there are horrifyingly-strong incentives for those-in-control to abuse or exploit simulated individuals. But those incentives exist in the real world, too, where those in power have the ability to dictate the conditions-of-life of the less-powerful; and while I'd _certainly_ not claim that exploitation is a thing of the past, it is, I claim, _generally_ on the decline, or at least that average-quality-of-life is increasing.
I'm talking about whether you get CPU allocation to feel emotions, or whether the simulation of your cerebellum gets degraded, or whether someone decides to run some psych experiments and give you a taste for murder or a deep, abiding love for the Flying Spaghetti Monster... and I don't mean that as a metaphor, but literally. Erase your memories, increase your compliance to the maximum, extract your memories, see what the average of your brain and whoever it is you hate most looks like. Experiment to see what's the most pain a baseline human brain can stand, then experiment with how to increase the amount, because in your biological life you held the door for someone who turned out to become very politically disfavored 25 years after you got locked in the box. This is just me spitballing for two minutes and does not in any way constitute the bounds of what can be done.
This isn't about whether or not they make you believe you're living in a simulated tent city. This is about having arbitrary root access to your mental state. Do you trust me, right here and right now, with arbitrary root access to your mental state? Now, the good news is that I have no interest in that arbitrary pain thing. At least, I don't right now. I don't promise that I won't in the future, but that's OK, because if you fling yourself into this box, you haven't got a way of holding me to any promise I make anyhow. But I've certainly got some beliefs and habits I'm going to be installing into you. It's for your own good, of course. At least to start with, though the psychological effects over time of having this degree of control over a person are a little concerning. Ever seen anyone play the Sims? Everyone goes through a phase that would put them in jail for life were these real people.
You won't complain, of course; it's pretty easy to trace the origins of the thoughts of complaints and suppress those. Of course, what the subjective experience of that sort of suppression would be is anybody's guess. Your problem, though, not mine.
Of all of the possibilities an uploaded human faces, the whole "I live a pleasant life exactly as I hoped, indefinitely, and I'm never copied and never modified in a way I wouldn't approve of in advance" is a scarily thin slice of the possible outcomes, and there's little reason other than exceedingly unfounded hope to think it's what will happen.
And this is the point where I think we have to agree to disagree. In both the present real-world case and the theoretical simulated-experience case, we both agree that there are extraordinary power differentials which _could_ allow privileged people to abuse unprivileged people in horrifying and consequence-free ways - and yet, in the real world, we observe that _some_ (certainly not all!) of those abuses are curtailed - whether by political action, or concerted activism, or the economic impacts of customers disliking negative press, or what have you.
I certainly agree with you that the _extent_ of abuses that are possible on a simulated being are orders-of-magnitude higher than those that a billionaire could visit on the average human today. But I don't agree that it's "_exceedingly_ unfounded" to believe that society would develop in such a way as to protect the interests of simulated-beings against abuse in the same way that it (incompletely, but not irrelevantly) protects the interests of the less-privileged today.
(Don't get me wrong - I think the balance of probability and risk is such that I'd be _extremely_ wary of such a situation; it's putting a lot of faith in society to keep protecting "me". I am just disagreeing with your evaluation of the likelihood - I think it's _probably_ true that, say, an effective "Simulated Beings' Rights" Movement would arise, whereas you seem to believe that that's nigh-impossible.)
It is virtually inconceivable that the Simulated Beings' Rights Movement would be universal in both space... and time. Don't forget about that one. Or that its nominal claims would actually be universally honored. See those Human Rights again; nominally I've got all sorts of rights, but in practice the claims are quite grandiose compared to the reality.
If you trust society to protect simulated-you (and I am _absolutely_ not saying that you _should_ - merely that present-day society indicates that it's not _entirely_ unreasonable to expect that it might at least _try_ to), simulation is not _guaranteed_ to be horrific.
Regardless of the scheme, it all comes down to money. If you have lots of money, you have lots of control over what happens to you.