6 Comments

I'm agnostic about whether you're right in this article, so my higher-order agnosticism about x-risk agnosticism leads me to take x-risks seriously.

Jun 8, 2023 · Liked by Richard Y Chappell

Existential risk is very important in my view. I'm really unsure about anthropic arguments like the doomsday argument; they sound very possible, I'm just not sure which one is right. It seems like this weird, very counterintuitive type of thinking might be really important in evaluating x-risk.

Oct 6, 2023 · edited Oct 6, 2023 · Liked by Richard Y Chappell

Out of interest, what are your views on people who say that not only should you be agnostic about the probability of x-risk, but you should also be agnostic about whether human extinction would be a bad thing? Some of my vegan friends (although not *only* the vegans) tend towards the view that, given that factory farming is so bad, stopping it entirely may outweigh the costs of human extinction. When you add in possible future s-risks to human beings (and non-human animals), and the possibility that humans so far have had lives that aren't worth living, is agnosticism *so* irrational here?

Jun 9, 2023 · Liked by Richard Y Chappell

I think x-risk is like factory farming, in that worrying about it is pretty overdetermined and doesn't depend much on the details of your worldview or your ethical principles. Like, I don't know how I feel about the "astronomical potential" stuff, but that doesn't change my concern about x-risks much at all.
