Wednesday, October 17, 2012

Critical Thinking: What It Is, and Why It's Not Just a Catchphrase

I know no safe depository of the ultimate powers of the society, but the people themselves: and if we think them not enlightened enough to exercise their control with a wholesome discretion, the remedy is, not to take it from them, but to inform their discretion by education. 
- Thomas Jefferson
Once I had a bumper sticker that said, “Think for yourself, or you're not thinking at all.”  I admired this sentiment a great deal, which is not surprising, since I had come up with it myself. Yes, I'm a little embarrassed to say it now, but I made my own bumper sticker.  I even bought some special paper to print it on. But the ink wasn't made for life in the elements, so it faded pretty quickly, and I didn't replace it. It's not that I didn't still believe it; it's just that I thought it was incomplete. After all, what good does it do to tell people to think for themselves, when so many people are terrible at thinking? I briefly considered making a twin bumper sticker that said, “...but think carefully.”  And then I thought, “That's ridiculous!”, and peeled the first one off.

As silly as my twin bumper stickers would have been, I do think both ideas are important. We live in a country where people are allowed, and even expected (in theory), to think for themselves. That being the case, you would think we would put more effort into teaching people how to think; not what to think, but how to think, and how to think clearly and effectively. This sounds like a good idea, but it's pretty vague.  What exactly would we teach?  One of the first steps in clear thinking is to define terms, so let me start by doing that.  The word “thinking” covers a huge range of mental processes, including concept formation, memory, decision-making, visual thinking, creative thinking, and so on.  I'm talking about something more specific. I'm talking about the kind of careful thinking that's aimed at deciding what to believe. This kind of thinking proceeds by carefully and honestly weighing the evidence for beliefs before accepting them. I'm tempted to call this “rationality”, but that term, like “thinking”, is also a little ambiguous. In economics, rationality is used to mean something like “optimal decision-making or action.”  Economists imagine ideal worlds in which there is an optimum way for “rational agents” to proceed in order to “maximize their utility”. This optimal strategy could easily include lying, and need not have much to do with what's true or morally right. That's not the kind of rationality I'm talking about here. What I'm interested in is honest, deliberate thinking aimed at finding what is really true, or what is really right...or at least getting as close to those things as possible.

Critical Thinking

This kind of deliberate thinking is already taught in school, though not nearly enough. It was once known as informal logic or reasoning, but these days it's usually called critical thinking. Most people have heard that term, because it's become an educational buzzword.  That's unfortunate, because it means it's in danger of being emptied of meaning by people who repeat it--parrotlike--because they like the sound of it.  But critical thinking isn't just some new educational fad that will soon go the way of New Math. It has roots going back at least as far as Socrates, and it still has some very important lessons to offer.



Critical thinking is based on the idea that people can learn to be more rational, effective thinkers. They do this by learning concepts of logic, inference, and evidence that apply across academic fields, and beyond them to everyday challenges like evaluating news stories and sales pitches.  One important point about critical thinking is that the word “critical” doesn't necessarily mean “negative”. It means something closer to “evaluative”. To be a critical thinker is to carefully evaluate everybody's thinking, including your own.

Most textbooks on critical thinking are written by philosophers or psychologists. The philosophers tend to focus on logic and logical fallacies, while the psychologists say more about cognitive biases, and their books may also talk about things like decision making, problem solving, memory, and creative thinking. Critical thinking is closely associated with, but not identical to, the skeptic movement, which focuses on promoting science and questioning paranormal and pseudoscientific ideas like astrology and homeopathy. Because a lot of prominent skeptics are atheists, critical thinking is sometimes associated with atheism, particularly the aggressive “new atheism” promoted by people like Richard Dawkins and Christopher Hitchens. In fact, you often see websites about atheism and critical thinking, as if those two things were identical. They're not. You can be a good critical thinker without being an atheist.

Knowledge, Skills, and Attitude

Now, what are the particular skills and habits one has to learn to be a good critical thinker? People write rather large textbooks on that topic, so obviously I can only scratch the surface in a blog post. But I do think there are some central ideas about critical thinking that can be summed up without turning this post into a dissertation.

Edward Glaser, a psychologist who designed the first psychological test of critical thinking, gave a good overview of the characteristics of a critical thinker:
The ability to think critically...involves three things: (1) an attitude of being disposed to consider in a thoughtful way the problems and subjects that come within the range of one's experiences, (2) knowledge of the methods of logical inquiry and reasoning, and (3) some skill in applying those methods. Critical thinking calls for a persistent effort to examine any belief or supposed form of knowledge in the light of the evidence that supports it and the further conclusions to which it tends.
Notice that the very first thing Glaser mentions isn't knowledge or skill, but attitude. It's no good having the ability to think critically if you don't have the motivation to use it. Someone who knows all about logic and reasoning, but is too lazy or apathetic to actually apply it to real world issues, is not a critical thinker. Even worse than apathy is intellectual dishonesty. Someone who knows logic and rhetoric, but applies them to proving his point, instead of finding what is actually true, isn't a good critical thinker either, though he might be an effective debater (or politician). Some other attitudes essential for critical thinking are: an ability to balance open-mindedness and skepticism, a willingness to be self-critical and realize you're prone to biases and mistakes, and dedication to fair-mindedness—to listening to what others are saying, and accepting that they might actually have a point. It may sound strange, but good critical thinking isn't just about cold logic. It also has ethical and emotional dimensions, because it requires you to be fair-minded and honest, and to actually care about what's true.

The third part of Glaser's definition emphasizes that critical thinking is a skill. And like any skill, it can be improved with practice. You can't learn to be a good critical thinker just by reading about it, any more than you can learn to drive by reading the driver's education book. It will certainly help, but you're eventually going to have to get behind the wheel.  

Building Arguments

Glaser's second criterion is where we get into the actual stuff people have to learn in order to be critical thinkers. At the core of critical thinking is the idea of argument. In philosophy and logic, the word “argument” (like “critical”) means something much less contentious than in everyday use. An argument is simply a set of statements in support of a claim or idea.  It consists of one or more premises and a conclusion. The premises are offered as evidence for the conclusion. For example, if I say, “Now is a good time to buy a house, because interest rates are low”, that's an argument with one premise and one conclusion. In this case, the conclusion (now is a good time to buy a house) comes before the premise (because interest rates are low). The word “because” is a hint that the second phrase is offered as a reason for accepting the first. Any argument has to have both premises and a conclusion. If I just say, “Now is a good time to buy a house”, I haven't made an argument. I've just made a claim, and given you no reason to believe it. If you look at debates online, it's amazing how seldom people actually make arguments. Most of the time, they're just making claims (e.g., “You're an idiot!!”), but offering no evidence to back them up. Critical thinking is really all about evidence.  One of its fundamental tenets is that if you're going to believe something, you need evidence that it's actually true.*

Of course, some arguments are much more complex than the one above.  Arguments are modular--simple arguments can be combined to form more and more complex arguments.  This works because conclusions have a dual nature--the conclusion of a simple argument can become a premise in a larger argument.  But no matter how complex an argument is, the first step in carefully evaluating it is to identify the premises (often accompanied by words like “since”, or “because”) and the conclusions (usually heralded by words like “therefore”, “so”, or “consequently”).
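For the programmers out there, here's one way to picture that modularity. This is just a toy sketch of my own (the Argument class isn't from any real library), but it shows how a conclusion can do double duty as a premise:

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass
class Argument:
    # The statements offered as evidence: plain claims, or whole sub-arguments.
    premises: List[Union[str, "Argument"]]
    # The claim those premises are meant to support.
    conclusion: str

# A simple one-premise argument:
simple = Argument(
    premises=["Interest rates are low."],
    conclusion="Now is a good time to buy a house.",
)

# Modularity: the conclusion of the simple argument becomes a premise here.
larger = Argument(
    premises=[simple, "I can afford the monthly payments."],
    conclusion="I should start house-hunting.",
)
```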

But now I need to define my terms again. What does it mean to evaluate an argument? Let's start with the simple example above: "Now is a good time to buy a house, because interest rates are low."  This argument is actually pretty weak, because it's incomplete. There are a lot of other things to consider before buying a house, like whether you can afford it, whether resale values are falling, what the closing costs would be, and so on. It's not enough just to make an argument. If you want to make a reasonable claim, you have to make a good argument. But what exactly makes an argument good?

Good and Bad Reasoning

Howard Kahane, who wrote a classic book on critical thinking called Logic and Contemporary Rhetoric, gives a good summary of the characteristics of good reasoning:
Reasoning can be either cogent (good) or fallacious (bad). We reason cogently when we satisfy the following conditions:
  1. The premises of our reasoning are believable (warranted, justified), given what we already know or believe.
  2. We consider all likely relevant information.
  3. Our reasoning is valid, or correct, which means that the premises we employ provide good grounds for accepting the conclusion we draw.
The argument about buying a house is weak because it doesn't meet the second criterion: it doesn't include all relevant information. Leaving out information is a favorite tactic of politicians and advertisers, of course, because doing so allows them to give the wrong impression without actually being accused of lying (perhaps the definition of lying needs to be expanded).
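To connect this back to my toy sketch above: Kahane's three conditions are judgments a human has to make, but once made, they combine in a completely straightforward way. A hypothetical checker (my own invention, not Kahane's) might look like this:

```python
def cogent(premises_believable: bool,
           all_relevant_info_considered: bool,
           conclusion_follows: bool) -> bool:
    # Kahane: reasoning is cogent only when all three conditions hold.
    return (premises_believable
            and all_relevant_info_considered
            and conclusion_follows)

# The house-buying argument: believable premise, reasonable inference,
# but relevant information (affordability, resale values...) was left out.
print(cogent(True, False, True))  # False -- the argument isn't cogent
```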

Besides leaving out relevant information, there are two other ways an argument can be bad. The first happens when the premises aren't true, or at least aren't plausible. For example, if it isn't true that interest rates are low, then that would make the house-buying argument fallacious. An argument is also bad when the conclusion doesn't follow logically from the premises. If I say, “There will be a lunar eclipse this month, so it's a good time to buy a house”, that's a bad argument, because there's no reason to think a lunar eclipse has any bearing on the wisdom of house-buying. Even if all the premises are true, that's not enough. The conclusion also has to follow from those premises (and all the relevant premises have to be considered).

Deductive Reasoning

Time to define terms again. What does it mean for a conclusion to logically follow from a set of premises? To really understand this idea, we need to look at two kinds of logical arguments: deductive arguments and inductive arguments. The following syllogism illustrates deductive logic.
If I went to Graceland, then I went to Memphis.
I went to Graceland. 
Therefore, I went to Memphis.
In this case, the argument is said to be valid. In fact, any syllogism that takes this form is valid, just by virtue of its structure. But valid doesn't mean true. All that “valid” means is that the conclusion must be true if the premises are true. If I replaced Memphis with New Orleans, the syllogism would still be valid, because if the premises were true, then the conclusion would be true as well.  But the conclusion would still be false, because the first premise is false: Graceland is not in New Orleans.

Now, even if the premises are true, an argument can be invalid, because the conclusion doesn't follow from the premises. Consider this syllogism:
If I went to Graceland, then I went to Memphis.
I went to Memphis.
Therefore, I went to Graceland.
This looks reasonable at first glance, but it's not valid.  The syllogism specifies that if I went to Graceland, then I went to Memphis, not vice versa. And in fact, you can go to Memphis without going to Graceland.  I've committed a fallacy here, which logicians call the fallacy of affirming the consequent.

In deductive logic, all you're really doing is deciding what you can conclude based on the information already contained in the premises, and in the relationship between them. It's quite mechanical--a certain kind of syllogism always has a certain set of valid conclusions, no matter what the content of the syllogism (remember that “valid” is not the same as “true”). I could substitute letters, and write:
If A then B.
A.
Therefore, B.
In this kind of syllogism (known as modus ponens), if the premises are true, then the conclusion is always true, no matter what A and B stand for.
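If you want to see just how mechanical this is, here's a quick brute-force check in Python (a sketch I wrote for this post, not a standard library feature). It tries every combination of truth values for A and B, confirms that modus ponens never fails, and hunts for a counterexample to affirming the consequent:

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    # The material conditional: "if p then q" is false only when p is true and q is false.
    return (not p) or q

for a, b in product([False, True], repeat=2):
    # Modus ponens: premises "if A then B" and "A"; conclusion "B".
    if implies(a, b) and a:
        assert b  # never fails: the form is valid in every case

    # Affirming the consequent: premises "if A then B" and "B"; conclusion "A".
    if implies(a, b) and b and not a:
        print(f"Counterexample: A={a}, B={b} makes both premises true but the conclusion false")
```

The check prints exactly one counterexample (A false, B true: I went to Memphis but not to Graceland), and one counterexample is all it takes to show that the second form is invalid.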

Inductive Reasoning

The other kind of logical argument--inductive logic--isn't nearly as mechanical, because true premises don't guarantee a true conclusion. With inductive reasoning, as Kahane puts it, the conclusions “go beyond what is contained in the premises.” In this kind of reasoning, you're taking a set of premises (often a set of facts or observations) and trying to find regularities, resemblances, or patterns you can use to draw conclusions--conclusions that extend to situations that aren't covered by the premises. This is pretty abstract, so maybe an example is in order.  If you've seen a million crows in your life, and all of them were black, then you might conclude inductively that all crows are black. But just one albino crow (and they do exist) would prove you wrong.  All you can conclude with certainty from a lifetime of seeing black crows is that most crows, at least in the places you've lived, are black.
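The crow example is easy to play out in code. Here's a little sketch (again, just an illustration of my own) showing how a million confirming observations can't secure a generalization that one new observation can destroy:

```python
# Induction by enumeration: generalizing from the cases we've observed.
observed_crows = ["black"] * 1_000_000  # a lifetime of crow sightings

# The inductive leap: every crow I've seen is black, so "all crows are black".
print(all(color == "black" for color in observed_crows))  # True -- for the sample

# One albino crow is enough to falsify the generalization.
observed_crows.append("white")
print(all(color == "black" for color in observed_crows))  # False
```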

Since inductive conclusions aren't necessarily true, even if the premises are true, logicians say that good inductive arguments are strong, instead of valid. This might make it sound like deductive arguments are the way to go, but that's not necessarily true, because deductive arguments are only certain if the premises are certain.  And how do we know the premises are certain? Oftentimes, the premises in deductive arguments are either taken as basic assumptions or axioms (as in geometry) or they are based on induction.  Either way, we can seldom be absolutely sure that they're true.

There are several kinds of inductive reasoning, some of which take place almost unconsciously. Associative learning--such as learning that bacon tends to co-occur with eggs--is a kind of induction.  We learn such associations automatically, without really having to think. Some other examples of induction include induction by enumeration (as in the crow example above), analogical reasoning (coyotes can swim, so foxes can probably swim too), causal reasoning (I took an aspirin and my headache went away, therefore the aspirin made the headache go away), and statistical reasoning (85% of people surveyed support Smee for president, therefore Smee will win). Science is mostly inductive, because it's concerned with looking at different instances of phenomena (for example, an apple falling from a tree and the moon orbiting the earth) and deriving general laws** to explain all those instances in a unified way (i.e., Newton's law of gravity).  One definitive lesson we've learned over the past few hundred years is that you can only get so far on logic alone.  Philosophers argued over ideas based on nothing but logic for centuries, and they didn't reach a whole lot of consensus.  When natural philosophers (who came to be known as scientists) started making careful observations, forming hypotheses, and then putting those hypotheses to experimental tests--that's when human knowledge really began to take off. We didn't get far reasoning from purely logical first principles, because a lot of those first principles (Aristotle's assumptions about physics, for example) turned out to be wrong.

Reasoning vs. Rationalizing

Whether we're reasoning inductively or deductively, the whole point of reasoning is inference--the act of taking a set of facts or claims and extrapolating from them to gain new knowledge. Of course, what we usually do is lay out a conclusion first, and then start putting premises under it. That's not reasoning; that's rationalization. It puts the cart before the horse in two ways: by putting conclusions before premises, and by mistaking defending one's beliefs for seeking the truth. The point of reasoning is to decide which conclusions are true, not to support the ones you have come to take on faith. That's why critical thinking demands that we relax our grip on our beliefs, in case the evidence demands that we change them.  As John Maynard Keynes said when he was criticized for changing his mind, "When my information changes, I alter my conclusions. What do you do?"  

Fallacies and Biases

With all reasoning and argument--inductive or deductive--what we're doing is looking at the evidence for conclusions, to see if they're justified.  As Kahane's quote above illustrates, conclusions can have insufficient support for at least three reasons: the premises are untrue or irrelevant, the premises are insufficient, or the conclusion doesn't follow from the premises. When you offer up a conclusion, but make one of these mistakes, you've committed a fallacy (I say "mistakes" because nobody would do this on purpose, right?).  Since there are many more ways for reasoning to be bad than good, there are all kinds of ways for arguments to be fallacious.  In fact, philosophers have identified hundreds of fallacies. They've also tried various schemes for classifying those fallacies into categories.  One way is Howard Kahane's, which divides fallacies into the categories of questionable premise, suppressed evidence, and invalid inference, corresponding to his criteria for judging arguments.  Other common schemes divide fallacies into formal and informal types (roughly corresponding to deductive and inductive logic), or into fallacies of presumption, relevance, and ambiguity.  This post certainly isn't the place to get into the pros and cons of these classifications.  Besides, it turns out that the human mind makes predictable mistakes that don't exactly correspond with logical categories.  This means it may make more sense for beginners to learn the most common fallacies, instead of worrying about how to classify them all.

I'll link to some good websites that discuss fallacies below.  But logical fallacies aren't the only enemies of good reasoning.  Overlapping with fallacies is a whole galaxy of cognitive biases.  While most of us go around thinking we see the world pretty much as it is, the human mind is really quite...warped. We're biased, in all sorts of ways.  There are biases of perception, of reasoning and decision-making, of memory...just about everything the mind does is subject to error.  One major category of errors is egocentric biases, which make us think we're smarter, nicer, and more likely to be right than most other people.  That's especially true when it comes to people who aren't like us, which brings us to sociocentric or ethnocentric biases, which lead us to assume that our culture or subculture is obviously better than any other.  Now, I'm not arguing for radical cultural relativism here.  It's not necessarily irrational to conclude that one culture is better than another in some way.  But it is irrational to conclude that our culture is better without ever really questioning the idea.

The Pursuit of Reason

In this post, I've argued that there's a core set of critical thinking ideas and skills that everyone can learn and apply toward becoming better thinkers. I've tried to give a basic outline of those core ideas--a rough sketch of the vast, complex territory of critical thinking.  The complexity of that territory is revealed by the hundreds of logical fallacies and cognitive biases we can fall prey to.  Some people have argued that the study of critical thinking would be simplified if, instead of focusing on these pitfalls--on how reasoning can go wrong--we focused on what constitutes good reasoning.  That makes some sense, because, as I said, there are far fewer types of good reasoning than bad.  However, some biases and fallacies are so common that everyone needs to learn them. Besides, even good reasoning can take a wide variety of forms.  There are many kinds of deductive reasoning, and even more kinds of inductive reasoning.

The point is that, while it's easy enough to discuss the main ideas of critical thinking in a longish blog post, learning to be a good critical thinker is never going to be as simple as, say, learning to play checkers.  It's more like chess. Even the basic rules are rather complicated, and it takes a lifetime of practice to get really good at playing the game.  (If I'm going to be honest--and I'd better be in a post like this--I'm really just a beginner. I may seem like I know what I'm talking about, but I've only begun to understand the rules of the game, and this post is my way of testing that understanding.  I don't kid myself that I'm good at playing it yet.)

The mathematician and philosopher Gottfried Wilhelm Leibniz believed that one day, all good reasoning would be standardized into a perfectly accurate, mechanical process whose results could be agreed upon by every rational person.  As he put it, "The only way to rectify our reasonings is to make them as tangible as those of the mathematicians, so that we can find our error at a glance, and when there are disputes among persons, we can simply say: Let us calculate, without further ado, to see who is right."  Well, he miscalculated.  Human reasoning is inescapably messy.  Real thinking is a chaotic affair: a tangle of hunches, memories, images, and emotions, which shifts back and forth between deduction and induction.  Perhaps the best we can do is try to tame it to some extent--to impose some standards and structure on our thinking.  One way to do this is by learning about good reasoning and bad reasoning, and trying to make sure we stay on the good side as much as possible.  That's easier said than done, of course, because the small tribe of good arguments is outnumbered by a vast nation of bad ones.

Personally, I try to remember one big idea: everything I accept as true needs to be supported by some sort of evidence.  The evidence for it should outweigh the evidence against it, and if it doesn't, then I shouldn't believe it. This brings me back to the idea that critical thinking has as much to do with attitude as with knowledge or skill.  If I'm going to go beyond just thinking for myself, as my bumper sticker advised, and actually try to think well, then I have to approach the task of thinking with a certain set of values. I have to care what's actually true more than I care about winning arguments.  I have to be honest and tenacious enough to consider all the evidence for and against claims.  Finally, if I really want to be right, I have to remember how easy it is to be wrong.

The whole idea of teaching critical thinking is based on the proposition that people can learn to be better thinkers.  The evidence suggests that they can (see the book by Diane Halpern below).  Good thinking is just like any other skill. With knowledge, dedication, and a whole lot of practice, it's possible to get better at it.  It's not easy, but it's possible.  In a democratic country like ours, where the quality of our society depends on our wisdom as citizens, it's absolutely essential.  


_____________________________________________________


* If you want to point out that I didn't offer any reasons why this tenet should be accepted...then you have an excellent point. But the idea that the evidence suggests evidence isn't really necessary...that would be a strange one.  Maybe that's a topic for another post, one of these days.

**However, induction is not necessarily about generalizing. You often hear that inductive reasoning moves from the specific to the general, while deductive reasoning moves from the general to the specific. That's often true, but not always. For example, if I think to myself, “All the snakes I've ever poked in the eye have bitten me, therefore this snake will bite me if I poke it in the eye”, then I'm reasoning inductively to a particular case.


Some Good Books

A Rulebook for Arguments - Anthony Weston

Bad Science: Quacks, Hacks, and Big Pharma Flacks - Ben Goldacre

Critical Thinking - Brooke Noel Moore and Richard Parker

The Demon-Haunted World: Science as a Candle in the Dark - Carl Sagan

Influence: Science and Practice - Robert Cialdini

Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life - Howard Kahane and Nancy Cavender

Thought and Knowledge: An Introduction to Critical Thinking - Diane Halpern

Why People Believe Weird Things - Michael Shermer


Websites and Infographics

http://www.informationisbeautiful.net/visualizations/rhetological-fallacies/

www.yourlogicalfallacyis.com



