All Probabilities Matter
If they're well-grounded, at least
The idea that one should strictly ignore tiny probabilities (no matter the stakes) is both extremely widespread and demonstrably false. (Perhaps it is motivated by the thought that we shouldn’t be vulnerable to Pascal’s muggings. But better solutions to that worry are available.)
Many people think you should simply ignore—round down to zero—probabilities smaller than, say, 0.00000001 (or one in 100 million). To see why this is false, suppose that:
(i) a killer asteroid is on track to wipe out all life on Earth, and
(ii) a billion distinct moderately-costly actions could each independently reduce the risk of extinction by 1 in a billion.
Now, it would clearly be well worth saving the world at just moderate cost to a billion individuals. And, since each of the relevant actions is independent of the others, if all billion are worth performing then any one is equally worthwhile, no matter how many others are performed. So it is extremely worthwhile to perform one of the risk-mitigating actions oneself, despite the moderate cost and tiny probability of making a difference.
We can generalize the case to make the probability arbitrarily small, so long as the stakes are correspondingly increased. For arbitrarily large X, suppose that a population of X individuals is threatened with extinction, and each individual has the opportunity to independently reduce the risk of mass extinction by 1/X. This action is clearly worth taking. So it is demonstrably false that probabilities as tiny as 1/X can be categorically ignored, no matter what number you put in for X.
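To make the arithmetic behind this vivid, here is a minimal sketch (my own toy model, not from the post) of the expected value of one risk-mitigating action in the generalized case. The key point: the tiny probability (1/X) and the huge stakes (X lives) exactly offset, so the expected value of acting is constant no matter how small the probability gets.

```python
def expected_lives_saved(population: int) -> float:
    """Each action independently cuts extinction risk by 1/population,
    so its expected value is population * (1/population) lives saved."""
    risk_reduction = 1 / population
    return population * risk_reduction

# No matter how large the population (and hence how tiny the probability
# of making a difference), one action is worth ~one expected life saved:
for x in (10**9, 10**12, 10**15):
    assert abs(expected_lives_saved(x) - 1.0) < 1e-9
```

So a policy of "round probabilities below 1/X down to zero" would assign zero value to an action whose expected value stays fixed however large X grows, which is exactly the mistake the asteroid case exposes.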
What about Pascalian muggers?
Pascal’s mugger threatens to torture some unimaginably huge number of sentient beings in another galaxy, unless you hand over your wallet. Seems pretty unlikely that he’s telling the truth. But, the worry goes, you should give it some non-zero probability (however tiny — call it 1/Y), and then he simply needs to add that there are more than Y lives at stake in order for it to be “worth” handing over your wallet, in terms of expected value. Since it would actually be clearly irrational to hand over your wallet on the basis of such a ludicrous claim, the critic concludes that we shouldn’t be moved by expectational reasoning based on small probabilities.
This case is very different from the asteroid case we previously considered. A couple of notable differences:
(1) The asteroid case involved objective chances, rather than made-up subjective credences.
(2) In the asteroid case, each act made a definite difference to the resulting objective probability, with vastly different outcomes guaranteed depending on whether no one acts or all X people do. In the mugger case, it's not true that Y people handing over their wallets would be guaranteed to do any good. So your one act doesn't even make a definite probabilistic difference. It's a different (more epistemic) sort of gamble.
Either of these features (or some third possibility) could explain what’s wrong with Pascal’s Muggings, without implying that you can generally (i.e. even in our asteroid case) ignore tiny probabilities.
For what it’s worth, my diagnosis would be that the problem with Pascal’s mugging lies in epistemology rather than decision theory. If you grant even 1/Y credence to the mugger’s claims to influence Y lives, you’re being duped. However large a population Y he claims to affect, your credence should be some vastly smaller 1/Z. If he then claims to affect this larger number Z instead, your credence in that should be some vastly smaller 1/Z*. Whenever complying with the mugger would be irrational, you shouldn’t accept credences that would rationalize compliance.
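The epistemic diagnosis can be illustrated with a toy model (my own, hypothetical, not the author's): if your credence in the mugger's claim falls faster than the claimed stakes grow, then the expected stakes of complying shrink as the mugger escalates, so escalation never rationalizes handing over the wallet.

```python
def credence(claimed_lives: float) -> float:
    """Hypothetical credence function: trust decays with the square of
    the claimed stakes, so credence in 'Y lives' is vastly smaller
    than 1/Y. (The square is an arbitrary illustrative choice.)"""
    return 1 / claimed_lives**2

def expected_lives_at_stake(claimed_lives: float) -> float:
    """Expected stakes of the mugger's claim: stakes times credence."""
    return claimed_lives * credence(claimed_lives)

# Escalating the claim only lowers the expected stakes:
prev = float("inf")
for y in (10.0**6, 10.0**9, 10.0**12):
    ev = expected_lives_at_stake(y)
    assert ev < prev
    prev = ev
```

The particular decay rate is inessential; what matters is just that well-calibrated credences can (and, on this diagnosis, should) shrink faster than the mugger's numbers grow.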
It’s probably a reasonable heuristic to just treat the mugger’s baseless claims as if they have literally zero chance of being true. But it doesn’t follow from this that you can similarly “round down” other small probabilities. It’s especially clear that we should not “round down” small objective chances, or probabilities that are well-grounded in our knowledge of the situation.
You should ignore Pascal’s mugger. You shouldn’t ignore all tiny probabilities. Whether a tiny risk should be ignored or not cannot be determined solely on the basis of how small a number it is. It also depends on whether the assigned probability is robust and well-grounded (e.g. in objective chances) or is a subjectively made-up number that could easily be off by orders of magnitude.
That’s not to say that you should ignore all subjective made-up numbers, either. Yes, we can be very confident that the Mugger was lying. But we can’t always be so confident about our situation. As I argue in X-Risk Agnosticism, it seems most reasonable to assign non-trivial subjective credence to things like AI risk. There are, after all, genuine reasons for concern there, even if it’s hard to know how to quantify it and there’s also a pretty good chance that there will turn out to be no problem after all. Given such uncertainty, moderate precautionary measures seem prudent.
The upshot, then, is that we should take seriously risks of great harm that are either (i) non-trivial in subjective likelihood, supported by genuine—non-ludicrous—reasons for concern, or (ii) objectively well-grounded, no matter how small (“trivial”) the probability in question.
We should only dismiss high-stakes risks (where the immense value at stake seems proportionate to the low chance) that are both (i) trivially tiny in probability, and (ii) lacking any robust or objectively well-grounded basis for that probability estimate, so that there's reason to expect a vastly lower probability is actually warranted.