There’s a general recipe that underlies the organ-harvesting case and similar standard “counterexamples” to utilitarianism:
(1) Imagine an action that violates important (utility-promoting) laws or norms, and—in real-world circumstances—would be disastrous in expectation.
(2) Add some fine print stipulating that, contrary to all expectation, the act is somehow guaranteed to turn out for the best.
(3) Note that our moral intuitions rebel against the usually-disastrous act. Checkmate, utilitarians!
This is a bad way to do moral philosophy. Our intuitions about real-world situations may draw upon implicit knowledge about what those situations are like, and you can’t necessarily expect our intuitions to update sufficiently based on fine print stipulating that actually it’s an alien situation nothing like what it seems. If you want a neat, tidy, philosophical thought experiment where all else is held equal—unlike in real life—then it’s best to avoid using real-life settings. Rather than asking us to assess an alien situation masquerading as a real-life one, it would be much clearer to just honestly describe the alien situation as such. Literally.
Consider Martian Harvest:
Martians, it turns out, are rational, intelligent gelatinous blobs whose sensory organs are incapable of detecting humans. They generally live happy and peaceful lives, except for practicing ritual sacrifice. Upon occasion, a roving mob will impound six of their fellow citizens and treat them as follows. Five have a portion of their “vital gel” removed and ritually burned. As a result, they die within a day, unless the sixth chooses to sacrifice himself—redistributing the entirety of his vital gel to the other five. Whoever is alive at the end of the day is released back into Martian society, and lives happily ever after.
You find yourself within the impounding facility (with a note on your smartwatch indicating that you will be automatically teleported back to Earth within an hour). Five devitalized Martians are wallowing in medical pods, while the sixth is studiously ignoring the ‘sacrifice’ button. You could press the button, and no-one would ever be the wiser. If you do, the Martian medical machinery will (quickly and painlessly) kill the sixth, and use his vital gel to save the other five.
Should you press the button?
This is a much more philosophically respectable and honest thought experiment. Believers in deontic constraints will naturally conclude that you should not press the button, as that would kill one unconsenting (Martian) person as a means to saving five others. Still, in this situation where it’s clear that all else truly is equal, I find it intuitively obvious that one ought to press the button, and expect that many others will agree. It’s not any sort of costly “bullet” to bite.
That is to say, Martian Harvest is not a “counterexample” to utilitarianism, but simply a useful test case for diagnosing whether you’re intuitively drawn to the utilitarian account of what fundamentally matters.
Now, given that Martian Harvest more transparently describes the structural situation that the familiar Transplant scenario aspires to model, and yet our intuitions rebel far more against killing in the Transplant case, it seems safe to conclude that extraneous elements of the real-world setting are distorting our intuitions about the latter. (And understandably enough: as even utilitarians will insist, real-world doctors really shouldn’t murder their patients! This is not a verdict that differentiates utilitarians from non-utilitarians, except in the fevered imaginations of their most rabid critics.)
That is, Transplant is, philosophically speaking, an objectively worse test case for differentiating moral theories. It’s an alien case masquerading as a real-life one, which builds in gratuitous confounds and causes completely unnecessary confusion. That’s bad philosophy, and anyone who appeals to Transplant as a “counterexample” to utilitarianism is making a straightforward philosophical mistake. Encourage them to consider a transparently alien case like Martian Harvest instead.
As written, the Martian case has a wrinkle that may sneak in some extra game-theoretic moral intuitions: the sixth Martian is avoiding the button, choosing to let five of its fellows die so that it might live. It is thus morally culpable (if to an understandable degree!), which makes "punishing" it feel less bad, perhaps especially to a certain kind of traditional deontologist; but it also introduces (as in the Transplant case) game-theoretic considerations that utilitarians might weigh. In the other direction, there may be hesitancy about interfering with a (literally) alien cultural practice whose purposes you don't understand.
Of course, if you specify that the sixth Martian has not yet awoken, the punishment intuition presumably disappears; and if you specify that the ritual sacrifice came about for some bad reason (a cruel Martian emperor decreed it for its amusement, or whatever), the cultural-practice worry disappears too. So this is all a quibble that doesn't detract from your main point.
This is great! I've been thinking about this issue a lot, and thinking that we need to do a better job of constructing real-world cases where the stipulations actually line up with our intuitions, but I suppose that making the case alien can help too!