Aug 13, 2022

You should consider the diminishing value of goods, and maybe of life, but not of population in these doubling experiments. More food has diminishing returns, but doubling the population doubles utility itself, unless you reject the total view, and you seemed sympathetic to that view on utilitarianism.net.
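
To put that in symbols (just a sketch, assuming the total view with average utility held fixed, and a strictly concave utility function for goods):

```latex
% Total view: overall value is the sum of individual utilities,
% so doubling the population (at fixed average utility) doubles value:
W(n) = \sum_{i=1}^{n} u_i = n\bar{u}, \qquad W(2n) = 2n\bar{u} = 2\,W(n)
% Goods, by contrast, have diminishing marginal utility
% (u strictly concave with u(0) \ge 0):
u(2x) < 2\,u(x)
```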

I don't think the partiality move works because, like you said, it should at least be permissible to take the double-or-nothing bet. Even if we were to say "we care more about people we know," that wouldn't necessarily mean we should take it, or that we have the strongest moral reason to.

This is a tricky problem in my view. I think a true utilitarian should probably just take the bet. But then they should take the bet again...and again...until we cease to exist. Which seems incredibly wrong. I don't really have a solution, and it would seem a bit ad hocish if I did; all the population ethics work-arounds seem so incredibly ad hocish. This problem, which I first heard from Cowen, honestly seems like one of the better arguments against utilitarianism, better than Huemer's in your debate (even though I liked his). I was feeling more sympathetic to utilitarianism recently, but this issue pushed me away a bit more...sorry!
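
To make the worry concrete (a sketch, assuming the standard version of Cowen's bet: each round gives probability p = 0.51 of doubling total utility and 0.49 of annihilation):

```latex
% Each round has positive expected value, and it compounds without bound:
\mathbb{E}[U_k] = (2p)^k U_0 = (1.02)^k U_0 \;\to\; \infty
% But the probability of surviving k consecutive rounds vanishes:
\Pr[\text{survive } k \text{ rounds}] = p^k = (0.51)^k \;\to\; 0
% So the expected-utility maximizer keeps betting while all but
% guaranteeing extinction in the long run.
```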


I bite the bullet! I once found this really unintuitive, but on reflection it seemed like irrational risk aversion, and now it no longer seems unintuitive.


Such thought experiments are obviously not meant to be realistic, but to illustrate a point. What is the point here? I take it to be that it is possible to find circumstances where utility maximizing will give bad advice. I’m tempted to think this is true of all decision procedures: they have an appropriate scope and an error rate.

Nassim Taleb comes to mind. He is always on about the risk involved in using an algorithm that makes assumptions about the statistical distribution we observe, which can lead to a form of overconfidence that serves well on average but eventually kills you if you don’t take precautions (or maybe even if you do).
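
A toy simulation of that point (all numbers hypothetical: a strategy that collects a small premium in normal times but ignores a rare tail event that can bankrupt it):

```python
import random

def run_strategy(n_periods=100_000, premium=1.0, tail_p=0.001,
                 tail_loss=800.0, seed=0):
    """Positive expected value per period (1.0 - 0.001 * 800 = +0.2),
    but an early tail event bankrupts the strategy for good."""
    rng = random.Random(seed)
    wealth = 100.0                      # starting capital
    for t in range(n_periods):
        wealth += premium               # small, steady gains in normal times
        if rng.random() < tail_p:       # rare event outside the assumed model
            wealth -= tail_loss
        if wealth < 0:
            return f"ruined at period {t}"
    return f"survived with wealth {wealth:.0f}"

# Short backtests usually look profitable; long horizons reveal the blow-ups.
for s in range(5):
    print(run_strategy(seed=s))
```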


“…doubled-up life (or world) doesn’t intuitively strike me as truly twice as good as the single life (world), making risk aversion explanatorily redundant.”

Isn’t risk aversion just a formalization of that intuition? Whatever your reason for thinking gambles are worth less than “risk neutral equivalent” sure things is the reason for acting with risk aversion. There is the behavior, and there is its label. What's redundant? Maybe the explanation? But risk aversion is a diagnosis, not a prescription or an explanation. It doesn’t explain the behavior we observe; we need evolution and psychology for that, not decision theory (or however we should classify risk aversion).
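
For what it's worth, the textbook formalization (a sketch, via expected utility theory and Jensen's inequality) makes the point that the label and the behavior coincide by definition:

```latex
% An agent with concave utility u is risk averse: by Jensen's inequality,
u(\mathbb{E}[X]) \;\ge\; \mathbb{E}[u(X)]
% i.e., the sure amount E[X] is weakly preferred to the gamble X itself.
% Concavity restates the preference; it doesn't explain where it comes from.
```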
