Once aligned ASI is achieved, it will invent the technology to create digital copies of people. This is mind uploading. The worlds people will inhabit once uploaded will be personal utopias curated by artificial superintelligence. By “personal utopias” I mean worlds created specifically for individual uploaded minds, optimised for their personal flourishing. They will not primarily contain utopian societies: each is the best possible world for the uploaded mind, but not for the other inhabitants of that world.
This would be ethically questionable if the other inhabitants were moral patients. So, they won’t be. They will be P-zombies, agents indistinguishable by any empirical means from natural humans, but who lack the light of consciousness, or any subjective experience.
One might object: “But would one, living in such a world, not feel unsatisfied in one’s relationships and interactions with the other inhabitants, knowing that they did not really perceive one, or feel anything about one at all?” This point needs addressing. It seems true that knowing one is the only truly conscious person in the world would, for most people, reduce the satisfaction of existing in that world. It could still be enjoyable, just as video games are enjoyable. But it seems unlikely that it would be the best possible world for one. The solution is simple enough: have one’s ASI world-curator remove from one’s mind the awareness that the other occupants of the world are P-zombies.
There is a cost to this erasure: believing the other inhabitants to be moral patients, one will feel the moral weight of one’s actions, which will cause certain inhibitions. But on balance this is not a loss but a gain, since without that felt moral weight, life would feel ultimately purposeless.
Another objection is that people would not readily abandon the friends they had pre-upload. I do not deny this: humans are emotional creatures, and it will be possible to co-inhabit digital environments with one’s pre-upload friends. But people will soon choose to splinter off into the kinds of personal utopias outlined above.
People will quickly find themselves forming much stronger, deeper connections with artificial people than they ever did with natural people (the reasons are explained later in this post). And as people find themselves spending ~no time with those they previously held dear, sharing a world with them becomes strictly costly: instead of constructing the best possible world for one, the ASI world-curator must compromise between what is best for one and what is best for one’s friends and loved ones. And to whatever extent one’s actions are legible to one’s pre-upload friends, one is inhibited by one’s inevitable wish not to incur judgement for violating the ethical norms of pre-upload society, which will generally be far from the norms that bring an individual the greatest good. Therefore, people will splinter off into their own worlds, isolated from other humans.
So, I have established some of the parameters of the worlds which our uploaded minds will inhabit. They will be worlds curated by ASI to be the best possible worlds for their single conscious inhabitants. The ASI will have general freedom in shaping the world, unburdened by ethical considerations beyond those which concern these individuals. But, concretely, what will these personal virtual utopias actually be like?
As I mentioned earlier, they will be very different from any traditional depiction of utopian society. After all, utopia as popularly conceived is paradoxical: it attempts to design a society that grants purpose and freedom while also providing abundance and peace, yet each pair is only really attainable at the cost of the other. ASI curators will resolve the paradox by focusing on purpose and freedom and giving up on abundance and peace. Abundance and peace are societal goals, not things fundamentally important to the individual’s good. For the individual, struggle, danger, pain, and self-sacrifice are all aspects of a good life. A high-stakes world, where things are bad and need changing, evil forces need repelling, and much remains unknown, is not a world of abundance and peace, but it is likely the world one would want to live in.
But all of that is still abstract, and we’re trying to get a more concrete picture of life in these personal utopias. Well, the world is meant to be the best possible world for one. So imagine all the things which have brought one joy in this world. In one’s personal utopia, amplified versions of all these joys will be present. One will witness events more interesting than any witnessed pre-upload, make stronger emotional connections, accomplish greater things, experience deeper love and stronger passion, take bigger risks, experience greater turns of fortune, and so on.
People have a natural tendency to look for ways in which such a world would be worse than the real world, rather than better. But it wouldn’t be, at least not from one’s subjective perspective. The only way in which it would be worse is that one’s actions would not be truly meaningful. Today, in the real world, one’s actions influence (we presume) such things as whether safe ASI will really be developed, which determines the fate of billions of souls at least. In our virtual utopias, though we will not know it, our actions will not be truly meaningful. Nonetheless, subjectively, these worlds will be vastly preferable to the real world.
So, I imagine these virtual personal utopias as places of righteous martyrs, grand betrayals, convoluted plots, ancient families, galactic empires, deep magic, and inexhaustible lore; of perfectly-written characters of every moral colour, good, evil, and grey, with arcs from good to evil and vice versa; worlds full of diverse civilisations and immense beauty. But, more than any of that, worlds in which the main character lives a life of grand struggle and triumph, loss and discovery.