I can't take credit for this idea; it has been 'in the water' of the EA/rationalist community for a while, for example in Answer to Job. It is the kind of thing I tend to assume everyone in the community understands, and yet people constantly seem surprised when I explain it. And apparently the Repugnant Conclusion is still considered an 'unsolved' problem by the philosophical community.
The key insights are:
1) Multiple identical instances of an entity matter morally exactly as much as a single instance, and
2) Identity is a continuum, not a discrete thing.
If point 1 is not obvious to you, consider this thought experiment:
Imagine two people uploaded into computer brains. One of them is uploaded onto a normal computer, and the other onto a giant computer where all the wires and circuits have 100 times the cross-section, so that 100 times as many electrons flow through them. Does the person in the giant computer have 100 times as much moral worth as the person in the normal computer?
Of course not, that would be ridiculous. So consider the following sequence of events:
1) An uploaded person is running on a flat computer chip.
2) The computer chip is made twice as thick.
3) A barrier of insulation is placed halfway through the chip, so that the flow of electrons on each side is exactly the same as before.
4) The two sides are moved apart. All of the inputs and outputs remain exactly the same; the flow of electrons is completely unchanged from step 2.
It should be clear that no new person has been created. The moral worth of the thing in step 4 is exactly the same as the moral worth of the thing in step 1.
Of course, if the inputs begin to diverge, at some point the two entities will become two different people, each with full moral worth. Even with identical inputs, random chance could cause them to remember or think about different things, so that they diverge over time. But as long as they are identical, they are just like someone running on a computer with really thick wires.
This insight clears out a lot of nonsense. A universe tiled with hedonium (identical things experiencing bliss) has exactly as much moral worth as one unit of hedonium. There is no value in wasting any resources or making any sacrifices to instantiate multiple copies of anything for utilitarian reasons.
Now, consider the question of when the diverging entities start to count as a unique individual. It seems silly that a single electron taking a different path makes a copy count as a unique individual, jumping its marginal moral worth from 0 to 1. But you can say that about every electron. At what point does it become a new person?
Consider a thought experiment where you take a drug that will wipe out your memories at the end of the day. You will go to sleep, and then wake up as if that day never happened. How would you feel at the end of the day? It would probably be weird and unpleasant, but very few people would treat this as though they were about to die (and if you do count this as a death, then you must also believe that every time someone drinks to the point of memory loss, someone dies). And very few people would think that giving a person such a drug should count as murder.
So we already implicitly treat an entity that is only very slightly different, by the equivalent of a few hours of memories, as less than a full extra person in the moral calculation. Two slightly-divergent ems differ in much the same way that the person with an extra day of memories who will be lost differs from the person who remains. I do not know exactly how much less the near-copy counts for. Maybe the pair counts as 1.001 people, maybe 1.5 people. The main point is that there is some function that takes as input the difference between an entity and a similar one, outputs the marginal moral value of that entity existing, and is continuous.
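To make the shape of that claim concrete, here is a minimal sketch in Python. Everything specific in it is an assumption for illustration: the argument only commits to a continuous function that outputs 0 for a bit-identical copy and 1 for a fully distinct person, so the clamped-linear form below is just one possible shape.

```python
def marginal_moral_value(divergence: float) -> float:
    """Marginal moral value of one more copy, given its divergence
    from an existing entity.

    `divergence` is a number in [0, 1]: 0 means a bit-identical copy,
    1 means a fully distinct person. The post only argues that this
    function is continuous, with f(0) == 0 and f(1) == 1; the clamped
    linear shape here is an illustrative assumption, not a claim.
    """
    return max(0.0, min(1.0, divergence))

# A bit-identical em adds nothing; a slightly divergent one adds a little.
print(1 + marginal_moral_value(0.0))    # 1.0 -- the "thick wires" case
print(1 + marginal_moral_value(0.001))  # 1.001
print(1 + marginal_moral_value(0.5))    # 1.5
print(1 + marginal_moral_value(1.0))    # 2.0 -- two fully distinct people
```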
An important implication of this is that near-identical copies of AIs or ems doing near-identical tasks matter only slightly more than one of them. So fears of a moral catastrophe in this area are probably exaggerated.
There is also a limit to the total moral value of a population, based on how distinct its members are. Once you are talking about vast numbers of people, it gets harder and harder to add a new member who is actually distinct: anyone new will be a minor variation on an existing person, and will therefore count for less. This means you cannot create arbitrary amounts of value simply by expanding the population, which makes the Repugnant Conclusion much less of a problem.
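As a toy model of this saturation, here is a hedged sketch that reuses the greedy rule above: each new person adds the marginal value of their divergence from the nearest person already counted. The tiny 8-bit 'person space' (only 256 fully distinct possible people) and the normalized Hamming distance are both invented for illustration; nothing in the argument pins down the real geometry.

```python
import random

L = 8  # assumption: people are L-bit strings, so only 2**L distinct people exist

def divergence(a: int, b: int) -> float:
    """Normalized Hamming distance between two L-bit 'people'."""
    return bin(a ^ b).count("1") / L

def total_moral_value(people: list) -> float:
    """Greedy total: each person adds the divergence to the nearest
    person already counted, so duplicates and near-duplicates add ~0."""
    counted = {people[0]}
    value = 1.0  # the first person counts fully
    for p in people[1:]:
        value += min(divergence(p, q) for q in counted)
        counted.add(p)  # a set, so exact duplicates are stored once
    return value

random.seed(0)
for n in (10, 100, 10_000):
    crowd = [random.randrange(2**L) for _ in range(n)]
    print(n, round(total_moral_value(crowd), 1))
```

The printed totals grow far more slowly than n and flatten out once every region of the (tiny, assumed) person space is occupied, which is the sense in which population size alone stops adding value.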
1 comment:
This doesn't really remove repugnant conclusions, it just blunts them a bit. There are enough possible humans with lives barely worth living for those lives to be meaningfully distinct from each other (unless you think all medieval peasants were just repetitions of the same one person). Although it may limit the scale: 10 really happy humans vs a million lives barely worth living feels different from 10 billion really happy humans vs 10^15 lives barely worth living.
Also, we can create situations where the "ethical" thing to do makes everyone worse off. Suppose a million people are living lives not worth living in a dystopian hellscape, but each person has a distinct little glimmer of joy, a different little nice thing that makes the hellscape slightly more bearable. If we took all those glimmers of joy away, their lives would become almost identical, so under this theory the total disvalue would shrink, even though every individual is worse off.