2012-02-04 (Saturday)
Okay, I was wrong. Suffering isn’t the only thing that matters to me morally. Happiness does too. But not as much as suffering. The thing that made this clear to me was the thought experiment about a button that would instantly and painlessly wipe out every suffering-capable entity (including me). I couldn’t get myself to say I would push that button. If worst individual non-consensual suffering (WINCS) is my only moral metric, then I should push that button even if only one entity somewhere is suffering just a tiny bit. It would guarantee a WINCS of zero henceforth and forevermore. That’s as low as it goes. Since I don’t feel that a universe with no conscious entities at all is morally better than one with a really low but nonzero WINCS, WINCS can’t be the only relevant moral metric for me.
Below is a representation of how much each thing matters to me, using my favorite intellectual tool, the indifference curve. The curved line represents all the outcomes that are equally morally acceptable to me. The idea is that as the WINCS increases, it requires a higher and higher greatest individual happiness (GIH) to compensate for it morally. Any point under the curve is morally acceptable to me, and any point above it is morally unacceptable.
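The acceptability rule above can be sketched in a few lines of code. The specific curve shape here is my own stand-in, not anything from the graph: it just has the required property that the maximum tolerable WINCS rises as GIH rises.

```python
# A minimal sketch of the indifference curve as an acceptability test.
# The curve shape is a hypothetical stand-in: tolerable WINCS rises
# with GIH but saturates toward a cap.

def max_tolerable_wincs(gih, cap=10.0, halfway=5.0):
    """Hypothetical curve: approaches `cap` as GIH grows; at
    gih == halfway it allows exactly half the cap."""
    return cap * gih / (gih + halfway)

def morally_acceptable(wincs, gih):
    """A point on or below the curve is acceptable; above it is not."""
    return wincs <= max_tolerable_wincs(gih)

# At GIH = 5 this stand-in curve allows WINCS up to 5.0.
print(morally_acceptable(4.0, 5.0))   # True: under the curve
print(morally_acceptable(6.0, 5.0))   # False: above the curve
```

Any increasing function of GIH would do here; the point is only that the acceptable region is everything on or below the curve.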
I made this graph with an awesome open-source vector image editor.
One thing left to specify is how much suffering and happiness (in admittedly vague terms) a given length on each axis represents.
Another is whether the curve merely approaches a horizontal asymptote as GIH increases, rising indefinitely at a continually decreasing rate, or actually reaches a slope of zero (flattens out). If it does flatten out, that means there’s some amount of individual suffering such that no amount of individual happiness can make it okay to me. I’m inclined to say there is.
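The difference between the two possibilities can be made concrete with two toy curves, both of my own invention. One keeps creeping upward forever; the other stops moving entirely past a certain GIH.

```python
# Two hypothetical curve shapes for the asymptote question:
# (a) asymptotic: slope shrinks toward zero but never reaches it;
# (b) flattening: slope is exactly zero past a finite GIH ("knee").

def asymptotic_curve(gih, cap=10.0, k=5.0):
    # Rises forever, approaching `cap` but never touching it.
    return cap * gih / (gih + k)

def flattening_curve(gih, cap=10.0, knee=20.0):
    # Rises linearly, then is exactly flat past `knee`: suffering
    # above `cap` is never compensable, however large GIH gets.
    return min(cap, cap * gih / knee)

# The asymptotic curve keeps inching upward...
print(asymptotic_curve(100.0) < asymptotic_curve(1000.0))   # True
# ...while the flattening curve has stopped moving entirely.
print(flattening_curve(100.0) == flattening_curve(1000.0))  # True
```

Either way there is a cap on compensable suffering; the flattening version just reaches it at a finite GIH instead of forever approaching it.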
Now about that universal euthanasia button. With the graph the way I’ve set it up, the origin point (zero WINCS, zero GIH) is morally equivalent to any point on or below the curve, but morally better than any point above it. That means if I have the button in front of me and I’m in a situation represented by any point above the curve, I should push the button, since doing so will result in a morally better situation. I think I’m okay with that. I can imagine a situation where some entity is suffering so badly that I’d rather instantly extinguish all conscious life than allow that suffering to continue.
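The button logic reduces to a one-line decision rule. Again the curve here is a purely illustrative stand-in (a linear one this time); only the comparison matters.

```python
# A sketch of the button decision rule, using a stand-in curve.

def tolerable_wincs(gih):
    return 2.0 * gih  # illustrative stand-in for the real curve

def should_push_button(wincs, gih):
    """Push only when the current point lies strictly above the curve,
    i.e. when the post-button origin (0, 0) is a moral improvement
    rather than merely equivalent."""
    return wincs > tolerable_wincs(gih)

print(should_push_button(3.0, 5.0))   # False: 3 <= 10, under the curve
print(should_push_button(12.0, 5.0))  # True: 12 > 10, above the curve
```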