And what academic philosophers can expect from the book
My personal thought is that it might be orders of magnitude more important to focus on securing the existence of a small population for the long-term future than on ensuring that something like an 80% reduction in human population doesn't occur. The possibility of repopulating after a catastrophic event seems very important. I think it might be good to keep a few hundred people in a bunker deep in the earth, and that might be worth a lot more than keeping coal in the ground. Did he discuss any ideas like this? I think I will read the book.
Re "Would they really be okay with, say, burying landmines under a children’s playground on the condition that the mines are set to be inert for a century?"
No, but that is because, in doing so, you would kill those children. And people against longtermism are obviously against killing children. ("Blatantly indecent," indeed.)
And there's no obvious step from that prohibition to longtermism, to neutrality, etc. I'm not sure there's even an unobvious step.
I look forward to reading MacAskill's whole book.
Here is a critique. I hope it comes across as constructive; that's what I'm aiming for.
Your post ignores the perspective of suffering-focused ethics (a range of views from "reducing suffering matters somewhat more" to "reducing suffering is all that matters"). Ignoring or short-changing that perspective seems to be a pattern in your writing here and on utilitarianism.net. It also seems pervasive among many others in the EA top tier. By "top tier" I mean people with the highest status in the EA community and/or people who hold positions of power in the most established and well-funded EA organizations, like Open Phil and 80,000 Hours.
I find Magnus Vinding's argument on this very revealing and convincing.
It seems MacAskill's book continues that problematic pattern, at least judging from a skim of his chapter 8 on population ethics. There appears to be no mention of Boonin, Vinding, or other suffering-focused writers in the book.
It seems to me that you and people in the EA top tier could take a step toward "building a morally exploratory world" by giving suffering-focused ethics more space and resources.
As a final note, it appears that in this text you yourself mostly appeal to intuitions about preventing harms or suffering, rather than about creating beings with positive wellbeing or improving already positive states. For example, your discussion of climate change, "broken glass left on a hiking trail," and "burying landmines."