
Saturday, December 21, 2013

6 Surprising Findings About Good and Evil

From Mother Jones: moral psychologist Joshua Greene, author of the recent book "Moral Tribes: Emotion, Reason, and the Gap Between Us and Them", presents "6 Surprising Scientific Findings About Good and Evil". Some of the more salient points for this audience:
  • According to Greene, while we have innate dispositions to care for one another, they're ultimately limited and work best among smallish clans of people who trust and know each other.
  • "We have gut reactions that make us cooperative," Greene says. Indeed, he adds, "If you force people to stop and think, then they're less likely to be cooperative."
  •  We also keep tabs and enforce norms through punishment; in Moral Tribes, Greene suggests that a primary way that we do so is through gossip. He cites the anthropologist Robin Dunbar, who found that two-thirds of human conversations involve chattering about other people, including spreading word of who's behaving well and who's behaving badly. Thus do we impose serious costs on those who commit anti-social behavior.
  • [J]ust as we're naturally inclined to be cooperative within our own group, we're also inclined to distrust other groups (or worse). "In-group favoritism and ethnocentrism are human universals," writes Greene. What that means is that once you leave the setting of a small group and start dealing with multiple groups, there's a reversal of field in morality. Suddenly, you can't trust your emotions or gut settings any longer. "When it comes to us versus them, with different groups that have different feelings about things like gay marriage, or Obamacare, or Israelis versus Palestinians, our gut reactions are the source of the problem," says Greene.

His conclusion:

Based on many experiments with Public Goods Games, trolleys, and other scenarios, Greene has come to the conclusion that we can only trust gut-level morality to do so much. Uncomfortable scenarios like the footbridge dilemma notwithstanding, he believes that something like utilitarianism, which he defines as "maximize happiness impartially," is the only moral approach that can work with a vast, complex world comprised of many different groups of people.
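A public goods game, one of the experimental setups Greene draws on, is easy to sketch: each player keeps whatever they don't contribute, while pooled contributions get multiplied and split evenly. The endowment, multiplier, and strategies below are illustrative assumptions, not figures from Greene's experiments:

```python
# Minimal one-round public goods game (illustrative parameters only).
def payoffs(contributions, endowment=10, multiplier=1.6):
    """Each player keeps what they don't contribute; pooled
    contributions are multiplied and split equally among everyone."""
    pool = sum(contributions) * multiplier
    share = pool / len(contributions)
    return [endowment - c + share for c in contributions]

# Four full cooperators vs. three cooperators and one free rider.
all_in  = payoffs([10, 10, 10, 10])  # everyone contributes fully
one_out = payoffs([10, 10, 10, 0])   # one player contributes nothing

print(all_in)   # every player ends up with 16.0
print(one_out)  # cooperators get 12.0; the free rider gets 22.0
```

The numbers show why gut-level cooperation is fragile: defecting pays individually (22.0 beats 16.0), even though the group as a whole ends up with less (58.0 instead of 64.0).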

But to get there, Greene says, requires the moral version of a gut override on the part of humanity—a shift to "manual mode," as he puts it.
***
To be more moral, then, Greene believes that we must first grasp the limits of the moral instincts that come naturally to us. That's hard to do, but he thinks it gets collectively easier.

Maybe one of the quickest ways we can do that is to stop using gossip (i.e. public shaming) as a blunt-instrument mechanism for enforcing misplaced social (not really even moral) norms (see also the Duck Dynasty scandal).

Thursday, August 15, 2013

Sociopathic morality?

This is an interesting summary of the dominant views in the scientific community regarding morality. Many have been discussed here before, including Jonathan Haidt's views on intra-culture morality and Paul Bloom's findings on the moral world of children. I liked this insight into the role that empathy/emotions play in morality vs. logic:

People who behave morally don’t generally do it because they have greater knowledge; they do it because they have a greater sensitivity to other people’s points of view. Hauser reported on research showing that bullies are surprisingly sophisticated at reading other people’s intentions, but they’re not good at anticipating and feeling other people’s pain.

The moral naturalists differ over what role reason plays in moral judgments. Some, like Haidt, believe that we make moral judgments intuitively and then construct justifications after the fact. Others, like Joshua Greene of Harvard, liken moral thinking to a camera. Most of the time we rely on the automatic point-and-shoot process, but occasionally we use deliberation to override the quick and easy method. We certainly tell stories and have conversations to spread and refine moral beliefs.
When you put it that way, it seems obvious why sociopaths would struggle with having an internal sense of morality.

My favorite part of the article, though, was this critique:
For people wary of abstract theorizing, it’s nice to see people investigating morality in ways that are concrete and empirical. But their approach does have certain implicit tendencies.

They emphasize group cohesion over individual dissent. They emphasize the cooperative virtues, like empathy, over the competitive virtues, like the thirst for recognition and superiority. At this conference, they barely mentioned the yearning for transcendence and the sacred, which plays such a major role in every human society.

Their implied description of the moral life is gentle, fair and grounded. But it is all lower case. So far, at least, it might not satisfy those who want their morality to be awesome, formidable, transcendent or great.
It's an interesting argument. I see this skewed focus frequently with religious people. They often tend to want to focus on the nice, nondescript aspects of their religion where God is behaving well, not killing children or drowning the world or enacting all sorts of vengeance. But most versions of God have some sort of edge to them. All versions of God are powerful beings, after all. They wouldn't remain powerful without doing certain things to cultivate that power, including being awesome, formidable, transcendent, and great. If we think that godliness is a virtue, then it would also be a virtue for us to cultivate power and try to become more awesome, formidable, transcendent, and great. And you don't necessarily get to be that powerful by rolling over and being "nice" in every situation.

I find it really disingenuous for people to focus on the "nice" side of morality without giving any consideration to the obvious yin to the yang (unless it really is true that all conservative people are godless and going to hell). As a religious person myself, I sometimes have people get on my case about some of the more aggressive, competitive, and antisocial things that I do, claiming that they are not consistent with my religion. I am not necessarily humble the way they expect the religious to be humble (but which is better, to lie to yourself in order to be humble, or to honestly acknowledge both your strengths and your weaknesses?). I can be ruthless and I don't often doubt myself. There are things about me that seem a little too dark and edgy to be the Mormon/Christian I profess to be. But the Christian God can be ruthless too. The Christian God can be all the things that I am, given the right context. I just feel like I am coming at godliness from the opposite end that most people do -- that the cultivating power side of things happens to be my area of expertise and that I need to practice and work at the love side of things. And for other people maybe it is vice versa, but that we'll all eventually meet at our goal in the middle.

Thursday, October 1, 2009

Trolley problem

This was sent in by an anonymous reader. I remember reading the trolley problem before and being really surprised that some people might not kill the one guy to save the many. For some reason, I also feel that killing the wandering stranger is a mistake. Maybe it is because I know the statistics of organ transplant success. Or I know how expensive and dangerous it is to do those types of surgeries. In some sort of way, I think I feel like the young stranger is actually more deserving of his own organs than the other five. Or it could be my particular form of efficiency-loving Burkean libertarianism that generally doesn't like to mess with things because of unforeseen consequences -- the longer the period you're looking at (organ donations), the greater the uncertainty. Or maybe I do have a soul. Here it is:

Thought experiments can teach us about the cognitive processes involved in moral decision making, and perhaps none is ultimately so telling as the trolley problem. One formulation of the trolley problem goes like this:
Five people are tied to a trolley track, and a trolley is speeding toward them. You're standing next to a switch that can divert the trolley onto another track. If you do nothing they'll all be killed in a matter of seconds. If you throw the switch it will divert the trolley off of the track with the five people tied to it, and onto a track with only one person tied to it. While the five will be saved, the one who wouldn't have been harmed otherwise will now be killed. Do you throw the switch?
Another formulation can be stated this way:
Five people are tied to a trolley track, and the trolley is speeding toward them. If you do nothing, they'll all be killed in a matter of seconds. You're standing on a bridge behind a tall and extremely fat man who is leaning against a rickety railing. No one else is there, and he's totally oblivious to both your presence and his precarious position. If you push him, the railing will give and he'll fall directly in front of the trolley. He will be killed, but he'll also bring the trolley to a stop, preventing it from harming the five people tied to the track. Do you push the fat man?
For hard-headed readers who answered "yes" to the first two, there is at least one more formulation:
You are a talented surgeon in a small village where five people need various organ transplants. None of them are on waiting lists, and all will die in a matter of days if they don't get organs. Each is in a weakened state, so you can't use organs from one to save another. If you could find a healthy donor, you'd be able to save them all.

By chance, a young traveler with a minor cut on his arm visits your office. Just for kicks you run a blood test, and find out that he's a perfect match for all 5. He mentions that no one saw him come in. Furthermore, not only is he in the country illegally, traveling alone on foot, and paying for everything with cash, but he didn't even tell anyone back home where he was going because he's estranged from his family. etc., etc. Do you tell the young man to wait while you get a tetanus booster shot, only to return with a syringe containing a powerful sedative? Or do you just smile and send him on his way?

When answering the various formulations of the problem, what decisions did you make, and why did you make them? How long did it take you to arrive at your decisions? How long did it take you to come up with explanations for your decisions?

Joshua Greene of Harvard University has done extensive research on cognition and moral judgment by asking test subjects these kinds of questions while performing functional Magnetic Resonance Imaging (fMRI). His results can be summarized in three sentences. When people choose a course of action which maximizes outcomes, the parts of their brain which show the most activity are those associated with rational and quantitative thinking. When people choose a course of action in which they do not perform acts which directly harm others, the parts of their brain which show the most activity are those associated with emotion and feeling. These parts of the brain 'light up' as soon as the questions are read, much more quickly than most people can formulate an explanation.
When I tried to answer the series of trolley questions myself, I found that I was making nearly instantaneous judgments as to which option made me feel least guilty. The process of using my intellect to come up with a list of justifications didn't even start until after my decision was final. The most troubling thing about this for me is that I approached the series of questions with deliberate intent to be as rational and consistent as possible. I realized that I couldn't be. In all likelihood, virtually no one can.

This raises important questions for readers who pride themselves on rationality. We could address hypothetical questions about judgments on specific situations, but I'm more interested in general questions like: "How can I trust my moral judgment on anything?" Well, how can you? The fundamental point of the trolley problem goes beyond whether any one decision is right or wrong. The real question is whether it's even possible for certain categories of human decision making to be rational. I don't think we can have a grown-up discussion about morality without addressing this.

At its core, the trolley problem raises a new kind of "duality" question; one which is thoroughly modern and scientific. Are we thinking with one brain, or multiple brains? Does the amygdala vie for control with the cerebral cortex? Do the right and left hemispheres struggle against each other? If so, what determines which will win? If we direct our own thoughts and decisions, then why are those decisions made as quickly as autonomic reflexes? Who or what is in control of our thoughts? What does our thought process say about who and what we are?

