An argument against some causal decision theories

Here is an attempt at an argument against formulating causal decision theory in terms of counterfactuals (loosely following up on the discussion in the previous post). The point seems rather obvious, so it is probably old. Does anyone know a reference?

Suppose you would like to go for a walk, but only if it's not raining. Unfortunately, it is raining heavily, so you have almost decided to stay inside. Then you remember Gibbard and Harper's paper "Counterfactuals and two kinds of expected utility".

According to Gibbard and Harper, the right action is the one that maximises counterfactual expected utility, defined as

CEU(A) = Σ_i P(A []-> S_i) U(A & S_i),

where P is your probability function, U your utility function, and S_i ranges over some suitable partition of states. A sensible partition in the present case would be { Rain, not Rain }. This is indeed the partition we choose, but for fun we express it in more complicated (but equivalent) words:

S_1: it rains and it is not the case that [you go for a walk iff you don't actually go for a walk].
S_2: either it doesn't rain or [you go for a walk iff you don't actually go for a walk].
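As a quick sanity check (a Python sketch, not part of the original argument): evaluated at the actual world, where "actually Walk" has the same truth-value as Walk, the clause "Walk iff not actually Walk" is false, so S_1 reduces to Rain and S_2 to not Rain:

```python
from itertools import product

# At the actual world, "actually Walk" is true iff Walk is true, so the
# clause "Walk iff not actually Walk" is false however things stand.
for walk, rain in product([True, False], repeat=2):
    actually_walk = walk  # actual-world evaluation
    weird = (walk == (not actually_walk))  # "Walk iff not actually Walk"
    s1 = rain and not weird
    s2 = (not rain) or weird
    assert s1 == rain
    assert s2 == (not rain)
print("At the actual world, S_1 reduces to Rain and S_2 to not Rain")
```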

Let's say you assign utility 5 to going for a walk when it doesn't rain, utility 1 to staying inside when it rains, 0 to staying inside when it doesn't rain, and -5 to walking in the rain. Since utilities do not vary between logically equivalent propositions, this allows us to calculate the utilities we need in the Gibbard and Harper formula. For instance, "Walk & S_1" is "Walk & Rain & not [Walk iff not actually Walk]", which is logically equivalent to "Walk & Rain", so the utility is -5. Overall, we get

U(Walk & S_1) = U(Walk & Rain) = -5.
U(not Walk & S_1) = U(not Walk & Rain) = 1.
U(Walk & S_2) = U(Walk & not Rain) = 5.
U(not Walk & S_2) = U(not Walk & not Rain) = 0.

Now your probability that you will go for a walk is presumably low; let's say it is 1/5. Moreover, "not Walk" is logically equivalent to "Walk []-> (Walk iff not actually Walk)". (What you do in the closest Walk worlds comes apart from what you actually do iff you don't actually go for a walk.) So

P(Walk []-> Walk iff not actually Walk) = P(not Walk) = 4/5.
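The equivalence can be verified in a toy closest-world model (a sketch under assumptions not spelled out in the post: worlds are Walk/Rain pairs, each world is closest to itself, and the closest Walk-world to a non-Walk world keeps Rain fixed):

```python
from itertools import product

worlds = list(product([True, False], repeat=2))  # (walk, rain) pairs

def closest_walk_world(actual):
    """Closest world where Walk is true: the actual world itself if Walk
    holds there (centering), else flip Walk and keep Rain fixed."""
    walk, rain = actual
    return actual if walk else (True, rain)

for actual in worlds:
    actually_walk = actual[0]
    w = closest_walk_world(actual)
    # "Walk iff not actually Walk", evaluated at the closest Walk-world:
    consequent = (w[0] == (not actually_walk))
    # The counterfactual holds exactly when you don't actually walk:
    assert consequent == (not actually_walk)
print("Walk []-> (Walk iff not actually Walk) is equivalent to not Walk")
```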

Note that S_2 contains "Walk iff not actually Walk" as a disjunct. Since in general A []-> B entails A []-> (B v C), your probability for Walk []-> S_2 must therefore be at least 4/5:

P(Walk []-> S_2) >= 4/5.

At this point, we can stop calculating probabilities, because it is clear that when we plug the numbers into the Gibbard and Harper formula, going for a walk will easily win:

CEU(Walk) = P(Walk []-> S_1) U(Walk & S_1)
+ P(Walk []-> S_2) U(Walk & S_2)
= [at most 1/5] * -5 + [at least 4/5] * 5
>= 3
CEU(not Walk) = P(not Walk []-> S_1) U(not Walk & S_1)
+ P(not Walk []-> S_2) U(not Walk & S_2)
= [some mixture of 0 and 1]
<= 1

So you go for a walk in the rain.
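Here is a small computational sketch of the whole calculation (the value P(Rain) = 9/10, the independence of Walk and Rain, and the similarity ordering in which the closest world differing on Walk keeps Rain fixed are all illustrative assumptions of the sketch, not claims from the post):

```python
from fractions import Fraction
from itertools import product

# Illustrative credences: P(Walk) = 1/5 as in the text; P(Rain) = 9/10
# and the independence of Walk and Rain are assumptions of this sketch.
P_WALK, P_RAIN = Fraction(1, 5), Fraction(9, 10)

worlds = list(product([True, False], repeat=2))  # (walk, rain) pairs
prob = {(w, r): (P_WALK if w else 1 - P_WALK) * (P_RAIN if r else 1 - P_RAIN)
        for w, r in worlds}

def closest(actual, walk_value):
    """Closest world where Walk has the given truth-value: the actual world
    itself if Walk already has that value (centering); otherwise flip Walk
    and keep Rain fixed (an assumption about the similarity ordering)."""
    walk, rain = actual
    return actual if walk == walk_value else (walk_value, rain)

def s1(world, actually_walk):
    """S_1: it rains and not [Walk iff not actually Walk]."""
    walk, rain = world
    return rain and not (walk == (not actually_walk))

def s2(world, actually_walk):
    """S_2: either it doesn't rain or [Walk iff not actually Walk]."""
    return not s1(world, actually_walk)

def p_counterfactual(act, state):
    """P(act []-> S): total credence in candidate actual worlds at which
    S holds in the closest world where Walk has the value act."""
    return sum(p for w, p in prob.items() if state(closest(w, act), w[0]))

# Utilities as stated in the text: walking in the rain is -5, a dry walk
# is 5, staying in while it rains is 1, staying in on a dry day is 0.
ceu_walk = p_counterfactual(True, s1) * -5 + p_counterfactual(True, s2) * 5
ceu_stay = p_counterfactual(False, s1) * 1 + p_counterfactual(False, s2) * 0

print(float(ceu_walk), float(ceu_stay))  # 3.2 0.72
```

With these credences P(Walk []-> S_1) comes out as P(Walk & Rain) = 9/50, giving CEU(Walk) = 3.2 against CEU(not Walk) = 0.72, so the formula indeed tells you to walk in the rain.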

(Part of what led to this outcome is the assumption that you probably won't go for a walk. If you're inclined to follow Gibbard and Harper, doing the calculations will raise P(Walk), thereby lowering CEU(Walk). The equilibrium state lies slightly below P(Walk) = 1/2. But the problem remains: this isn't the right state of mind; you shouldn't be indifferent between going and not going.)
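The equilibrium figure can be checked with a little arithmetic (a sketch that treats rain as certain, a simplification of the scenario): with P(Rain) = 1 and p = P(Walk), the counterfactual Walk []-> S_1 is true just in case you actually walk, and not Walk []-> S_1 just in case you don't, so CEU(Walk) = 5 - 10p while CEU(not Walk) = 1 - p. The two cross at p = 4/9, slightly below 1/2:

```python
from fractions import Fraction

# Simplifying assumption of this sketch: rain is certain, P(Rain) = 1.
# Then both CEU values are linear in p = P(Walk).

def ceu_walk(p):
    return -5 * p + 5 * (1 - p)   # = 5 - 10p

def ceu_stay(p):
    return 1 * (1 - p) + 0 * p    # = 1 - p

# 5 - 10p = 1 - p has the solution p = 4/9:
p_star = Fraction(4, 9)
assert ceu_walk(p_star) == ceu_stay(p_star)
assert p_star < Fraction(1, 2)  # slightly below 1/2
print(f"indifference at P(Walk) = {p_star} ≈ {float(p_star):.3f}")
```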

The problem is that substituting logically equivalent sentences in the consequent of a counterfactual can affect the counterfactual's truth-value. As a result, the recommendations of Gibbard and Harper are not only 'partition dependent', but also dependent on the exact way partitions are picked out: equivalent ways of referring to the same partition lead to different recommendations. And that seems bad.
