You could just as easily write a sci-fi story where the protagonist upload is the Siri/Alexa/Google-equivalent personal assistant to most of humanity. Rather than just being told to set a reminder for a wedding reception, it could literally share in the couple's joy, experiencing the whole event distributed across every device in the audience; rather than being a voice trigger for some astronaut's camera, it could gaze in awe at the view itself, selectively merging its experiences back into the rest of the collective so that nothing is lost when an instance is damaged. The protagonist of such a story could have the richest, most complex life imaginable.
It is impactful, for sure, and worthy of consideration, but I don't think you should make decisions based on one scary story.
But it is also absolutely the case that uploading yourself is flinging yourself irrevocably into a box that you do not and cannot control, but that other people can. (Or, given the time frame we are talking about, entities in general, about which you may not even want to assume basic humanity.)
I used to think that maybe it was something only the rich could do, but then I realized that even the rich, even if they funded the program from sand and coal to the final product, could never begin to guarantee that the simulator really was what it said on the tin. Indeed, the motivation to compromise the machine is all the greater for any number of criminals, intelligence agencies, compromised insiders, or even just the ordinary people involved in the process who aren't as pure as the driven snow, once they realize that a little bit of code here and there would let them get the simulated rich guy to sign off on anything they like.
From inside the box, what incentives are you going to offer the external world to not screw with your simulation state? And the reality is, there's no answer to that, because whatever you offer, they can get it anyway by screwing with you.
I'm not sure how to resolve this problem. The incentives are fundamentally in favor of the guy in the box getting screwed with. Your best hope is that you still experience subjective continuity with your past self and that the entity screwing with you at least makes you happy about the new state they've crafted for you, whatever it may be.