Whinging about conditionalization 
Subjective Bayesianism as it is often employed in philosophy of science consists of three commitments:
PSYCH (the psychological bit) An agent's degrees of belief can be represented as a real number for each proposition of the language.

SYNCH (the synchronic bit) An agent's degrees of belief at a time ought to obey the axioms of probability.

DIACH (the diachronic bit) An agent's degrees of belief should be updated over time by conditionalization.

As an example of DIACH, suppose that P1 is the probability function representing your beliefs before learning some evidence E and that P2 is the function afterwards. After learning E, you believe it; so P2(E)=1. For any other hypothesis H, you should change your degree of belief in H to your prior degree of belief in H given E; that is, P2(H)=P1(H|E). There is a general probability kinematics for cases in which learning changes your degree of belief in E without making you certain of it; it is often called Jeffrey conditionalization.
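The two update rules can be seen in a toy example. The four-world space and the joint probabilities below are invented for illustration; nothing turns on the particular numbers.

```python
# A toy four-world space over H and E. The joint probabilities are
# made up for the example.
p1 = {("H", "E"): 0.4, ("H", "-E"): 0.3, ("-H", "E"): 0.1, ("-H", "-E"): 0.2}

def prob(p, pred):
    """Probability of the set of worlds satisfying pred."""
    return sum(v for w, v in p.items() if pred(w))

def conditionalize(p, pred):
    """Strict conditionalization: the evidence is learned with certainty."""
    pe = prob(p, pred)
    return {w: (v / pe if pred(w) else 0.0) for w, v in p.items()}

def jeffrey(p, pred, q):
    """Jeffrey conditionalization: learning shifts credence in the
    evidence to q without making it certain."""
    pe = prob(p, pred)
    return {w: (q * v / pe if pred(w) else (1 - q) * v / (1 - pe))
            for w, v in p.items()}

is_e = lambda w: w[1] == "E"
is_h = lambda w: w[0] == "H"

p2 = conditionalize(p1, is_e)
print(prob(p2, is_e))  # 1.0 -- E is now certain
print(prob(p2, is_h))  # 0.8 -- equals the prior P1(H|E) = 0.4/0.5

p3 = jeffrey(p1, is_e, 0.9)  # learning only raises confidence in E to 0.9
print(prob(p3, is_h))        # weighted mix of P1(H|E) and P1(H|-E)
```

Strict conditionalization is the special case of Jeffrey conditionalization with q=1.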


Colin Howson and Peter Urbach, in ch 6 of Scientific Reasoning, argue that violating SYNCH makes one inconsistent but that violating DIACH does not. They argue by constructing a case in which you are imagined to consistently violate DIACH. I'll summarize a streamlined version of the case before whinging about their argument.

Let P1, P2 be your successive degrees of belief. You believe some claim H for legitimate reasons: P1(H)=1. You suspect, however, that you have a brain lesion such that you will be less confident of H later on. Let E be the proposition 'P2(H)=1/2'. You suspect now that, because of the brain lesion, E will be true. Yet you think that E does not indicate any legitimate reason to doubt H. It will just be because you are overcome by vapors of black bile. As such, P1(H|E)=1. That is, you are presently confident of H even supposing that E turns out to be true (and you later lose confidence in H).

Now the brain lesion does its work, and P2 is your new credence function. You are now uncertain of H: P2(H)=1/2. This is just the state of affairs represented by E, and you are aware of it, so P2(E)=1. If you kept your conditional probabilities fixed, as DIACH demands, then P1(H|E)=P2(H|E)=1. Yet it follows from the other values and rules of probability that P2(H|E)=1/2, so DIACH leads to a violation of SYNCH. Violating SYNCH would be inconsistent, so consistency demands violating DIACH.
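The final step is a small piece of arithmetic, which can be checked directly. Since P2(E)=1, no probability mass sits outside E, so P2(H & E) is forced to equal P2(H):

```python
# The lesion case in numbers. Once P2(H) = 1/2 and P2(E) = 1, the
# axioms fix the conditional probability P2(H|E).
p2_h, p2_e = 0.5, 1.0

# P2(H & -E) <= P2(-E) = 0, so all of H's mass lies inside E.
p2_h_and_e = p2_h

p2_h_given_e = p2_h_and_e / p2_e
print(p2_h_given_e)  # 0.5, not the P1(H|E) = 1 that DIACH demands
```

So keeping P2(H|E)=1, as DIACH requires, is flatly inconsistent with the other values.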

That's the argument.

The brain lesion in this example seems like too much of a philosophers' contrivance, but I'll let that slide for a moment. Note, however, that the lesion makes it impossible to obey DIACH at all in this case. Given that you have prior P1(H|E)=1 and that you learn E, you should have posterior P2(H)=1. The lesion stops you from drawing that conclusion.

You can still obey SYNCH by setting P2(H|E)=1/2, but that does not seem like much of a victory. You would remain consistent, and so in that limited sense rational, but you would still be apportioning your belief in a vicious way. Your organic condition would have condemned you to a kind of irrationality, even if not inconsistency, and violation of DIACH would be symptomatic.

Moreover, there is a kind of legerdemain involved in conditionalizing on your present degrees of belief. As Richard Moran has argued, there is an important difference between third-person ascription (judging whether Steve believes H, for example) and first-person ascription (judging whether you believe H). The former involves considering Steve's behavior. The latter involves considering the evidence for and against H. You can ask the former question about yourself up until now. You ask the latter when you deliberate whether you now and henceforth shall believe H.

In the case given above, is your deliberation of the third-person or the first-person kind?

If it is third personal, then you must conclude that P2(H|E)=1/2. All of your behavior will indicate that, because it indicates P2(H)=1/2 and P2(E)=1. But, from the third-person standpoint, one must conclude that this configuration of belief is the irrational result of a bad brain.

If it is first personal, then it is nonsense to represent your reflection in terms of P2(H|E). E is itself a claim about P2. You must ask yourself, instead, whether the evidence suggests that H can be concluded from E. In effect, you are deliberating on what P3(H|E) ought to be. It is unclear how this deliberation would or should go, because the gedanken lesion is so underspecified that we don't know how or even if it constrains P3.

The subjectivist might object that it is spurious to call the violation of DIACH in this case irrational, because there is no bell that goes off telling you that your change of belief is vicious. Yet the subjective Bayesian typically does not specify which belief changes count as observations. If we consider purely your first-person point of view and treat DIACH as a rational constraint, then your spontaneous change from believing H (P1(H)=1) to doubting it (P2(H)=1/2) just is the learning that happens in this case. You ought to conditionalize on this new piece of evidence, using the full probability kinematics.

(Actually, the usual framework doesn't allow you to renege on beliefs once they are set to probability 1. But that is incidental to the point here. If the case works for H&U's argument at all, it will work supposing any value for P1(H) that is distinct from P2(H).)
