Friday, February 25, 2005

Artificial Desires

I’ve been doing a lot of thinking about this topic lately, and have been inspired by Hugo’s post to write some thoughts down about it.

We all have different utility equations, and a good little pragmatic ethicist like me just tries to raise overall utility. One problem comes up often: when someone’s desires differ from yours, it’s hard to generate sympathy, and that can lead to rationalizing away and ignoring someone’s pain. It’s even harder in a world of socialization and culture, where we judge some desires as “fake”, or created.

Clearly there are some things we all share that substantively hurt us: physical pain, hunger, losing a loved one. There are other things we accept as substantive even if we don’t necessarily feel the same desire ourselves, like loneliness or a craving for a certain food. Any of us would be quick to help someone with desires regarding these things.

But beyond that, there are things it’s easy to believe people “choose” to get upset over. A person who thinks everyone should give them huge presents and attention on their birthday. A fan who’s crushed at his sports team’s loss. In my own life, I’ve had a decent amount of “look, it’s your choice to be upset at X, stop demanding sympathy from me.” I’ve seen this take place in political-cultural arguments, emotional discussions of relationships, and quibbles over religion.

What is the proper utilitarian response to that (not that consequentialism is the be-all and end-all of moral systems, but maximizing happiness is a common ground most people can at least relate to)? If someone values something in a way that a) causes them net pain and b) could be made not-true, then what responsibility do we have to affect that value system? Do we respect it as a desire as integral as any other, do we ignore the pain in this instance so we don’t encourage that value system at all, or something else?

I think I should try to separate the harm being caused at the moment, and what I can do about that, from my distaste for an artificial desire. I can work to remove that desire in general, but mathematically speaking, ignoring the person’s pain now only makes sense when Ev × Pv > S, where
Ev = expected pain of the value system
Pv = probability that this action removes the value system (or the proportional reduction it causes)
S = sadness caused by ignoring the person’s pain here

In general people are good about this (I don’t know the last time I’ve seen an atheist at a funeral yell “there is no better place”), but certainly not always. And I think the calculations behind Ev and Pv are so difficult and fuzzy (particularly once we realize that our rationalizations tilt towards self-indulgence) that we’re better off always assigning a higher value to S.
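The trade-off above can be sketched as a few lines of code. This is only an illustration of the decision rule, not anything from the original post: the variable names Ev, Pv, and S follow the definitions given earlier, and the extra weight on S is a made-up parameter standing in for the post’s conclusion that fuzzy, self-serving estimates should bias us toward sympathy.

```python
def should_challenge_value_system(Ev, Pv, S, s_weight=1.5):
    """Return True only if the expected reduction in future pain from
    weakening the value system outweighs the sadness caused by ignoring
    the person's pain right now.

    s_weight > 1 is a hypothetical knob encoding the post's point: since
    Ev and Pv are fuzzy and rationalizations tilt toward self-indulgence,
    weight S more heavily than the raw math would.
    """
    expected_gain = Ev * Pv  # expected pain removed by acting now
    return expected_gain > s_weight * S

# With modest odds of changing anyone's values, the weighted S wins:
print(should_challenge_value_system(Ev=10.0, Pv=0.05, S=2.0))   # 0.5 vs 3.0 -> False
# Only an overwhelming expected gain justifies withholding sympathy:
print(should_challenge_value_system(Ev=100.0, Pv=0.5, S=2.0))   # 50.0 vs 3.0 -> True
```

Under almost any realistic guess at Pv, the first case is the typical one, which matches the rule of thumb above: sympathize now, work on the value system later.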

In the case of Hugo’s post, it means that it’s important that men turn away from viewing any sexually stimulating woman as distracting and content-less, but the time and place to do that is not when they are trying to worship.

(Amusingly, this means I actually agree rather strongly with Hugo in his post saying “manners are to make people feel comfortable”, I just don’t think he does.)

