Poor one-boxers

Imagine you're a hedonist who doesn't care about other people, or about your past or your distant future. All you care about is how much money you can spend today. Fortunately, you're on a pension that pays either $100 or $1000 every day, plus an optional bonus. How much you get is determined as follows. Every morning, a psychologist shows up to study your brain. Then he puts two boxes in front of you, one opaque, the other transparent. You can choose to take either both boxes or only the opaque one. The transparent box contains a $10 bill. The opaque box contains nothing if the psychologist has predicted that you will take both boxes; if he has predicted that you will take only the opaque box, it contains $100. The psychologist's predictions are about 99% accurate. The contents of the boxes are your bonus payment. In addition, you get your ordinary payment, which is either $100 or $1000 depending on how many boxes you took the previous day: if you took both, you now get $1000; otherwise you get $100. The ordinary payment is handed over before the psychologist studies your brain, so by the time you choose between the two boxes, you already know how much you have received today. What do you do?

Evidential decision theory says that you should take only the opaque box, while causal decision theory says you should take both. Since you already know whether your ordinary payment is $100 or $1000, your decision between the boxes has neither causal nor evidential impact on today's ordinary payment. It does have an impact on how much you will get tomorrow, but you don't care about tomorrow. What's left is a standard Newcomb problem.

The upshot is that if you follow evidential decision theory, you have $200 to spend most days ($100 ordinary payment plus $100 from the opaque box), and occasionally only $100, when the prediction went wrong and the opaque box was empty. If you follow causal decision theory, you have $1010 most days ($1000 ordinary payment plus the $10 bill from the transparent box), and occasionally $1110, when the prediction went wrong and the opaque box contained $100 as well.
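To see where these numbers come from, here is a minimal simulation sketch in Python. It assumes an agent who follows the same decision theory every day and a predictor that is right 99% of the time; the function and variable names are illustrative, not part of the story.

    import random

    def simulate(policy, days=10000, accuracy=0.99, seed=0):
        """Return the list of daily totals for an agent who always
        follows `policy` ('one-box' or 'two-box')."""
        rng = random.Random(seed)
        totals = []
        previous_choice = policy  # assume the agent chose the same way yesterday
        for _ in range(days):
            # The ordinary payment depends on yesterday's choice.
            ordinary = 1000 if previous_choice == 'two-box' else 100
            # The psychologist predicts today's choice with 99% accuracy.
            if rng.random() < accuracy:
                prediction = policy
            else:
                prediction = 'one-box' if policy == 'two-box' else 'two-box'
            # The opaque box contains $100 iff one-boxing was predicted.
            opaque = 100 if prediction == 'one-box' else 0
            transparent = 10
            bonus = opaque if policy == 'one-box' else opaque + transparent
            totals.append(ordinary + bonus)
            previous_choice = policy
        return totals

    for policy in ('one-box', 'two-box'):
        totals = simulate(policy)
        print(policy, 'average daily money:', round(sum(totals) / len(totals)))

Averaged over many days, this comes out at roughly $199 for the one-boxer and roughly $1011 for the two-boxer, which matches the figures above: $200 versus $1010 on most days, with the occasional mispredicted day.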

In this scenario, the two-boxers might start to ridicule the one-boxers. "Look how much money we got again today, and how many beautiful things we could buy. You, on the other hand, have barely enough for food and rent. Wouldn't you rather be one of us?"

The ridicule is, of course, unfair. When the two-boxers take their two boxes, they don't care what effect this has on their payment the following day. They choose two boxes only because this is guaranteed to give them $10 more today than whatever they would get if they chose only one box. It just happens that their choice also makes them get $1000 the following day. The payment setup rewards two-boxers and punishes one-boxers -- but not because two-boxers are making the better choice. The setup could just as well have been the other way round, so that one-boxers would have been much better off than two-boxers.

Nevertheless, the ridicule is quite similar to the ridicule two-boxers often get from one-boxers in the standard Newcomb problem. "If you're so smart", say the one-boxers, "why ain'cha rich? Look how much money we got; wouldn't you rather be one of us?" We two-boxers reply that we're poor not because we made the wrong choice, but because the Newcomb setup rewards one-boxers and punishes two-boxers: one-boxers mostly get to choose between $100 (in the opaque box) and $110 (in both boxes together), while two-boxers are given a choice between $0 (in the opaque box) and $10 (in both boxes together). That's why one-boxers end up with more money -- not because they made the better choice. (Indeed, if the one-boxers had taken both boxes, they would be even richer.)

It would be nice if we could add that the setup might just as well have been the other way round, rewarding two-boxers rather than one-boxers. However, it is difficult to come up with such a reversed setup. Lewis (in "Why ain'cha rich?") conjectured that it can't be done. The story above at least seems to come close.

(The story can also be adapted to several agents rather than a single agent at different times. Suppose infinitely many copies of you have been created and assigned to the numbers ..., -3, -2, -1, 1, 2, 3, ..., with the number 0 reserved for yourself. Each of you first gets either $100 or $1000. Then you face a Newcomb problem as above. If player i takes both boxes, player i+1 gets $1000; otherwise player i+1 gets $100. All you care about is your own personal profit, and so each of your copies likewise cares only about their own personal profit. If you're a one-boxer (or more generally an evidential decision theorist), you end up with either $200 or $100. If you're a two-boxer (a causal decision theorist), you get either $1010 or $1110.)

Comments

on 20 February 2012, 14:23

This is really nice.
