Is this wrong? If so, why?
I'm guessing your knee-jerk answer to the first question is, "Well, yeah, it's wrong!!" But your second answer is probably more like "Umm, because...well....".
If you thought it's wrong to eat Bubba, but found it tough to say exactly why, then you've been "morally dumbfounded". This is the term used by the social psychologist Jonathan Haidt, whose research I'm going to discuss in this post. By asking people questions like the one above, Dr. Haidt has gained some fascinating insights about human morality, how it might have evolved, and how it varies between groups of people.
In the story above, the point is not whether eating a pet pig is wrong or not. The point is that many people have the gut reaction that it is, but then have trouble saying exactly why they think so. That's moral dumbfounding. The gut reaction comes first. More often than not, when the person finally articulates why they think it was wrong to eat Bubba, their reasoned response will match their gut reaction. Most people don't respond to moral conundrums by reasoning through the arguments on each side, and then making an impartial decision. They think, "How awful!", or "How nice!", and then the reasoning part of their brain kicks in. Sometimes the reasoning part will decide, "Well, maybe I don't really have good reasons for that gut reaction". But really, how often does that happen? Mostly, people use their "higher order" thinking skills to find ways to justify their initial impulses.
Dr. Haidt sees the mind as having two levels, which he compares to an elephant and a rider on its back. The elephant is the older, more intuitive and emotional part of the mind, shaped by evolution over tens of millions of years. The rider is the biologically newer part of the mind, which is able to formulate thoughts in higher order concepts and words. The elephant is far more powerful than the rider, and its reactions are automatic, visceral, and fast. The rider knows he's capable of thinking deeper thoughts than the elephant. He also imagines he's the boss, but he's usually wrong. Most of the time the elephant decides to move, and then the rider thinks, "I told him to do that".
In the case of moral dumbfounding, the elephant encounters a moral shock, and bolts righteously in the opposite direction. This happens before the rider can even articulate the situation. Most of the time, the rider acts more like the elephant's PR person than its master. The rider says "The bushes were rustling. It was probably a cobra. They're common around here. Yeah, it's a good thing I decided to stay away". Occasionally, though, the rider tries to change the elephant's behavior. He reins the beast in and says, "Wait, let's think about whether there was really anything bad in those bushes". Sometimes, the rider can even exert his will, convincing the elephant--against its will--that it's all clear. Dropping the metaphor of elephant and rider, the higher-order part of the mind convinces the lower order part to reconsider its reactions, and perhaps even change them. But it's not easy, and it's not common.
Another important insight from Haidt's research is the suggestion that, for most people throughout history, morality is a far more complex thing than modern westerners may realize. Haidt's thinking about morality was transformed when he spent time in India, where he had to learn to navigate a more complex moral world than he was used to in the United States. People in India have an elaborate mental geography, mapping out what is clean and unclean, and go to great lengths to ensure that the two don't mix. They eat with their right hand, not their unclean left hand. They walk on unclean streets, so when they get home they remove their shoes to keep from polluting their household. Traditional Hindus, of course, also have a deep sense of social hierarchy. They believe it is only right that different people are born at different places on the social scale, and that those near the bottom should treat those near the top with proper respect. Even the low-status people are likely to think this way. Haidt talks about trying to engage household servants in a friendly conversation, and discovering that it made both them and their bosses deeply uncomfortable.
Based on these sorts of insights, and a lot of empirical research, Haidt and his colleagues have come up with what is known as Moral Foundations Theory. According to this theory, evolution has given the human mind not just a single moral sense, but a suite of at least six moral foundations, which evolved to help us function as social animals. We don't just have one moral elephant--we have six, at least. Rather than try to explain them and get them wrong, I'm going to take the liberty of quoting at length from MoralFoundations.org, a website run by Haidt and his colleagues. The six moral foundations of the theory are:
"1) Care/harm: This foundation is related to our long evolution as mammals with attachment systems and an ability to feel (and dislike) the pain of others. It underlies virtues of kindness, gentleness, and nurturance.
2) Fairness/cheating: This foundation is related to the evolutionary process of reciprocal altruism. It generates ideas of justice, rights, and autonomy. [Note: In our original conception, Fairness included concerns about equality, which are more strongly endorsed by political liberals. However, as we reformulated the theory in 2011 based on new data, we emphasize proportionality, which is endorsed by everyone, but is more strongly endorsed by conservatives]
3) Liberty/oppression: This foundation is about the feelings of reactance and resentment people feel toward those who dominate them and restrict their liberty. Its intuitions are often in tension with those of the authority foundation. The hatred of bullies and dominators motivates people to come together, in solidarity, to oppose or take down the oppressor.
4) Loyalty/betrayal: This foundation is related to our long history as tribal creatures able to form shifting coalitions. It underlies virtues of patriotism and self-sacrifice for the group. It is active anytime people feel that it's "one for all, and all for one."
5) Authority/subversion: This foundation was shaped by our long primate history of hierarchical social interactions. It underlies virtues of leadership and followership, including deference to legitimate authority and respect for traditions.
6) Sanctity/degradation: This foundation was shaped by the psychology of disgust and contamination. It underlies religious notions of striving to live in an elevated, less carnal, more noble way. It underlies the widespread idea that the body is a temple which can be desecrated by immoral activities and contaminants (an idea not unique to religious traditions)."

According to Moral Foundations Theory, most psychologically normal humans are born with a basic moral sense, which predisposes them to have preferences along each of these six dimensions. Depending on their personality, upbringing, and culture, some dimensions may be valued more than others. Haidt compares our six-dimensional, adjustable moral sense to a stereo equalizer with six sliding knobs. The setting for each knob is determined by who you are, and how and where you were raised.
One of Haidt's most interesting ideas is that westerners, especially educated, affluent, and liberal westerners, have dialed some of the knobs way down compared to most people in traditional societies. Western liberals tend to value the first three foundations far more than the second three. Western conservatives, however, view all six as important aspects of morality, and in this they're more like people in traditional societies around the world. They value in-group loyalty, which helps explain why they are much more comfortable than liberals with patriotic displays, and far less comfortable with suggesting that their country might be in the wrong. They value authority, which explains why they are more likely to embrace tradition, and traditional authority figures. Their heightened sense of sanctity and degradation explains why they are more likely to think some sexual practices are wrong, even if they take place between consenting adults who both enjoy them. In contrast to conservatives, liberals generally think that if you are being fair, and not hurting or oppressing anyone, you should be able to do what you want.
So, liberals and conservatives have the knobs on their moral equalizers set at different levels. But it's actually more complicated than that. For each dimension of morality, liberals and conservatives seem to have different ideas about what is required to be moral. It's as though each knob could slide sideways, as well as up and down. For example, liberals tend to value fairness in the sense of equality of outcome, so they're far more likely than conservatives to think it's OK for wealth to be redistributed. Conservatives think of fairness as proportionality and just deserts--you get what you deserve (whether you deserve reward or punishment), and you shouldn't get what you haven't earned. On the liberty/oppression dimension, liberals and conservatives are both likely to feel oppressed and think their liberties are threatened. But liberals are more likely to feel oppressed when freedom of expression, sexuality, or choice is threatened, while conservatives are more likely to feel oppressed when their economic or religious freedom is threatened. Both sides are concerned with liberty and oppression--they just have different ideas of what those things mean.
Haidt and his colleagues are unusual in that they're willing to admit that their theory is probably incomplete, and sure to be revised. On their website, they even invite people to challenge the theory, and suggest other foundations of morality that they may have missed. One such suggestion led to the expansion of the theory to include Liberty/oppression as a foundation. Other people's suggestions for universal dimensions of human morality seem pretty plausible to me, including truthfulness, wisdom, and self-control.
I can think of a couple of suggestions, myself. I notice that conservatives place a very high value on responsibility/accountability, and I'm not sure how well that fits into any of the six current foundations of the theory. I also think courage is important, and probably needs to be accounted for. This morning, I asked some friends on Facebook to list qualities they see as characteristic of a moral person. They listed some that don't fit into Moral Foundations Theory in an obvious way, including: genuineness/authenticity, simplicity, faithfulness to one's belief system, modesty/humility/awareness of our fallibility, and tolerance. Tolerance is an especially interesting one, because I doubt very much that it's innate. Cultures throughout history have not been especially tolerant of each other. The modern liberal idea of tolerance as a virtue is a recent thing; a product of culture, not nature.
I suspect that as the theory is refined, its founders will come to realize that liberals value the last three dimensions more than they thought, but in different ways. Many liberals, for example, have a clear sense of loyalty/betrayal, when it comes to their own liberal ideology. Try showing up at a Sierra Club meeting with a big greasy hamburger in a styrofoam box. You will likely be treated as a bit of a turncoat. Almost all humans seem to have a sense of tribal loyalty, and they wear the badges of their tribe with pride, whether that badge is "Celebrate Diversity", or "National Rifle Association". The subconscious elephant of tribal morality is an especially powerful one. How many people embrace a belief because they belong to a certain group, instead of the other way around?
Haidt has also pointed out that liberals do have a sense of the sanctity/degradation dimension. Whereas conservatives are concerned with the sanctity of sex and marriage, liberals are often concerned with the sanctity of nature, and many of them have a vision of nature as pure and harmonious. This extends to the food they choose to eat, which must be free of "unnatural" ingredients, as well as any other taint of ecological or social impropriety. Many liberals are developing dietary restrictions that are nearly as moralized and strict as those of Hindus or Orthodox Jews.
The theories I've described here seem extremely important to me, especially in such divisive times. First, I think they give us a clearer picture of ourselves and the world we live in, and that's good in and of itself (I'm one of those people who thinks truth is a moral issue). But they're also important because they make us stop and think about our own morality. If most moral reactions are based more on gut-level emotion than careful reasoning, then we should stop and consider whether we've really thought through our beliefs, or just rationalized the ones we preferred in the first place.
Liberals and conservatives are constantly accusing each other of immorality, but Moral Foundations Theory reminds us that almost everyone thinks morality (or ethics, if you prefer that term) is important. Of course, some people on each side are immoral (with slick and powerful politicians, maybe it's more than some). This is a problem, but the bigger issue is that each side conceives morality differently. If we acknowledge that the other side is just as concerned with morality as we are, it's easier to break the mutually destructive cycle of demonization. Then we might start trying to understand each other, and at least agree to disagree.
I'm not saying we should be relativists. We can try to understand the other side without agreeing with them, and we can be civil while resisting an agenda we see as harmful. Also, just because nature has given most people a moral sense, that doesn't mean their moral sense is right. Nature may predispose us to treat those who look, act, and talk like we do better than we treat outsiders. But that doesn't mean it's right to do so. Similarly, just because people in some cultures think some people are born inferior to others doesn't mean they are right. I believe some settings on the moral equalizer are better than others. As a secular, relatively liberal westerner, I distrust automatic in-group loyalty, respect for authority, and traditional ideas of purity...and I think I'm right to do so. But I still think it's important for me to understand that most people on Earth live in a more complex moral world than I do. Besides, I could be wrong.
Haidt, J. (2007). The new synthesis in moral psychology. Science, 316, 998-1002.
Wade, Nicholas. "Is 'Do Unto Others' Written Into Our Genes?" New York Times.
Haidt, Jonathan. "The moral roots of liberals and conservatives" (TED video).
Pinker, Steven. "The Moral Instinct." New York Times.
Haidt, Jonathan. The Righteous Mind: Why Good People Are Divided by Politics and Religion.