Thursday, December 5, 2013

For the Bible Tells Me So

Castle of the Pyrenees, by René Magritte
The other day I came home to find a pamphlet the Jehovah's Witnesses had tucked into my door frame. It's entitled Can the Dead Really Live Again?, and its answer is a resounding yes. I normally just throw those pamphlets away, but I've kept this one because of an interesting argument it uses to defend its position on life after death. It starts with a rhetorical question: "Can we really believe what the Bible says?" Then it answers, "Yes, for at least three reasons." And here's the striking thing: all three reasons are supported solely with Bible verses.

Now, I'm no logician, but I'm pretty sure that's circular reasoning. It's basically saying, "We can believe what the Bible says because of what the Bible says." In other words, it tries to support its conclusion with statements that assume the conclusion is true. The argument hovers in midair. What's needed to anchor it to solid ground is firm, verifiable evidence that the Bible is true, beyond what the Bible itself says. But no such evidence is offered.

My point here isn't to debate whether there's life after death or to bash Jehovah's Witnesses. I've known many of them, and they're usually very nice people. Nor do I want to stomp on an easy target like a cheaply printed pamphlet. I know there are lots of Christians out there far more logically sophisticated than whoever wrote that. What I'm trying to do is point out this kind of circular reasoning, and the real harm it can do.

I don't just see such reasoning in throwaway pamphlets. I see it used all the time by certain conservative Christians to justify attitudes and laws that hurt real people in real ways. The clearest examples these days are assertions that homosexuality is immoral and that gay marriage should be illegal (or remain illegal). This country is full of people who think it's fully justified to tell two consenting adults they shouldn't be able to love or marry each other. They're willing to deny them what most people consider one of the main sources of happiness and meaning in life, and they justify this attitude, and those laws, by citing Bible verses.

That's a pretty serious stance to take. So when I hear people taking it, I always try to ask, as nicely as possible, "OK, but how do you know the Bible is right?" Responses vary, but one I've heard several times is, "Because it's the word of God." So then I ask, "But how do you know that?" Once again, responses vary, but I've actually heard people say, "Because it says so in the Bible."

So we're back to circular reasoning: arguments built on floating boulders instead of bedrock evidence. And that's just not good enough, especially if those arguments are being used to dictate how people can live their lives. Unless there's clear and undeniable proof that (1) there is a God, (2) God is the ultimate judge of what is right, and (3) God dictated those Bible verses, pointing to them doesn't count for much. If there isn't clear evidence for number 3 in particular, the simpler explanation for those verses is that they were written by plain old human beings...people just like you and me, except that they lived in a far more violent, sexist, ignorant, and superstitious time. If they were written by such people, without divine inspiration, why should we listen to them? Haven't we made some intellectual and moral progress since then? After all, it's no longer considered acceptable to massacre whole cities, to stone people to death for adultery, or to attribute mental illness to a legion of demons. Why should we put stock in ancient opinions about other things?

Of course, the ancients were probably right about some things. "Thou shalt not kill" seems like a pretty good moral maxim (even if it's widely ignored). So, I'm not necessarily saying they weren't right. I'm just saying that if they were, you can't prove it by saying, "It's written in this book." Anybody can write a book. If you add, "and God wrote or inspired that book," that would certainly add more weight to the argument, but only if it's true. And if someone says it's true, then they should be able to tell me how they know that. "Because the Bible says so" is not an acceptable answer, because it just takes us back where we started. What's the evidence that takes us outside the logical circle? If you ask an astronomer why she thinks the universe began in a Big Bang, she'll start citing measurable, independently verifiable evidence: leftover radiation predicted before it was discovered, galaxies flying away from each other, predictions from general relativity and particle physics, Hubble observations of young galaxies, and so on. If she couldn't offer any such evidence, we would have no reason to take her seriously. Why should it be any different for someone quoting the Bible?

Another assertion that can't stand on its own is, "God says this is wrong." If someone says that, then surely it's fair to ask: Why? Why does God say it's wrong? Does it cause harm? If so, what? If not, then what else is his reason? Surely God doesn't disapprove of things for no reason? If someone can explain why something is wrong--by saying what harm it does, for example--then they're actually giving me a reason to consider their argument. Alternatively, if they say, "I don't exactly know why God says it's wrong, but I know he says so," then we're back where we were before, and they should be able to tell me how they know he actually says that.

If people can offer evidence for those things, then they're making an actual argument. It's not necessarily a valid one, if the evidence is unconvincing or doesn't logically support their conclusions. But at least it's an honest effort. What isn't a real argument is saying, "It says it in the Bible," or "God says it's wrong." Such statements might possibly begin a convincing argument, but they certainly can't end one, despite what the bumper stickers say. By themselves, they just hover in midair, resting on nothing.

Tuesday, December 3, 2013

Newton's Weird World

If you go to a bookstore and look through the popular science section, the books on physics will mostly be about the most mind-bending modern theories: quantum mechanics, relativity, big bang cosmology, black holes, and so on. You won't find many books about the basic classical physics you learn in school. Newton's laws of motion, prisms and rainbows, magnets and electric motors--these things don't seem to fire the popular imagination. I think that's because modern physics seems so exotic (and because reading about it makes you look smart). Space and time bend and meld, single particles go through two holes at the same time (unless you try to catch them at it), black holes slow time, capture light, and may even lead to other universes...that is freaky stuff.

Lately, though, I've been realizing how freaky and counter-intuitive the old-school physics of Newton and Galileo can be. People have an intuitive understanding of physics, and it's good enough to let us navigate the surface of this particular planet most of the time, but in the grand scheme of things, it's wrong. Sometimes dramatically wrong. Most of the people who ever lived went to their graves thinking the earth is flat. They were wrong about the shape of the surface they lived on every day of their lives. I would be too, if I hadn't been taught differently. It's a humbling thought.

People also assumed for thousands of years that heavy rocks fall faster than light rocks. After all, doesn't a feather fall more slowly than a boulder? It's just common sense. But Galileo thought he would test the idea anyway, and it turned out common sense was wrong. Light things may fall more slowly on Earth because of air resistance, but the deeper law of nature is that in a vacuum, feathers fall just as fast as boulders. That's well known these days, of course, so we don't really feel how surprising it is. But for people living at the time, it was earth-shattering.

It's shocking to discover that natural law runs counter to our intuition, even for things we see every day. When you start thinking about all the ways that's true, dusty old textbook physics starts to gleam a little more. Here are some examples I like:*

When I go for one of my brief, agonizing runs, I always think about how strongly the earth is pulling down on me. But I never consider how I'm pulling up on it with the same amount of force. Every object with mass creates a gravitational field, and I assuredly have mass. And every force comes in pairs--nature is symmetrical that way. I tug on the Earth just as hard as it tugs on me. It's just that the earth is so much more massive that it accelerates me a lot more than I accelerate it.
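
To make that concrete, here's a quick back-of-the-envelope sketch in Python. The 70 kg runner is just a number I'm assuming; the point is that the same force, divided by two wildly different masses, produces two wildly different accelerations.

# A rough sketch of Newton's third law at work: the runner and the Earth
# pull on each other with the same force, but respond very differently.
# The 70 kg runner mass is just an assumed example figure.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24     # mass of the Earth, kg
R_EARTH = 6.371e6      # radius of the Earth, m
m_runner = 70.0        # assumed mass of the runner, kg

# Newton's law of gravitation gives one force, acting on both bodies.
force = G * M_EARTH * m_runner / R_EARTH**2

# Newton's second law (a = F/m) gives each body's acceleration.
a_runner = force / m_runner    # ~9.8 m/s^2 -- ordinary gravity
a_earth = force / M_EARTH      # ~1e-22 m/s^2 -- utterly negligible

print(f"Force on each body: {force:.0f} N")
print(f"Runner's acceleration: {a_runner:.2f} m/s^2")
print(f"Earth's acceleration:  {a_earth:.2e} m/s^2")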

I just said the earth is pulling "down" on me. That's because I go around thinking there's a real up and a real down. But of course there isn't. And north certainly isn't up, no matter how hard it is to think otherwise. "North is up" is just a convention, and early cartographers often drew maps "upside down" before that convention was established, as in this map of Europe and North Africa from 1459. I can look at that map and tell myself it's just as valid as "right side up" ones, but it still looks wrong. It's not, though. I am.


The most basic rules of motion can be totally surprising. For example, if you hold a rifle five feet above the ground and fire it horizontally, and I hold a bullet in my hand at the same height and drop it at the same moment you shoot, the two bullets will hit the ground at the same time (disregarding air resistance, etc.). The high-velocity bullet falls just as fast as the low-velocity bullet. You would think the lateral motion of the bullet from the gun would somehow interfere with its downward motion ("downward," I should say), but it doesn't. The two motions do combine to create a curved path, but their magnitudes are independent.
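
Here's a rough sketch of that in Python, ignoring air resistance. The 400 m/s muzzle velocity is just an assumed figure; the punchline is that the horizontal speed never appears in the equation for how long the fall takes.

import math

g = 9.81                 # gravitational acceleration, m/s^2
height = 1.52            # five feet, in meters
muzzle_velocity = 400.0  # horizontal speed of the fired bullet (assumed), m/s

# Neither bullet starts with any vertical velocity, so the same equation,
# h = (1/2) g t^2, gives the fall time for both of them.
fall_time = math.sqrt(2 * height / g)

# The only difference is how far each one travels horizontally in that time.
print(f"Both bullets hit the ground after {fall_time:.3f} seconds")
print("The dropped bullet lands at my feet")
print(f"The fired bullet lands {muzzle_velocity * fall_time:.0f} m downrange")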

Speaking of falling objects, the moon is falling. It's dropping like the giant rock it is. It's just that its lateral motion is balanced with its "downward" motion in such a way that it falls around the earth instead of into it. It's been plummeting for billions of years, but it's never managed to land. We've been falling around the sun all that time, too. It makes me a little queasy thinking about it.
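
A rough Python sketch, assuming a circular orbit, shows what that balancing act looks like: every second the moon drops about a millimeter toward us while sliding roughly a kilometer sideways, and those two motions together trace out the orbit.

import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24     # mass of the Earth, kg
r = 3.844e8            # average Earth-moon distance, m

# Gravitational acceleration at the moon's distance from Earth
a = G * M_EARTH / r**2                      # about 0.0027 m/s^2

# In one second the moon falls this far toward the Earth...
drop_per_second = 0.5 * a * 1.0**2          # about 1.4 mm

# ...while its orbital speed carries it this far sideways.
orbital_speed = math.sqrt(G * M_EARTH / r)  # about 1 km/s

print(f"Drop toward Earth each second: {drop_per_second * 1000:.1f} mm")
print(f"Sideways travel each second:   {orbital_speed / 1000:.2f} km")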

Here's another illusion I have: when I throw a rock, I feel like I'm giving it a certain amount of energy, and I imagine this energy fades as the rock progresses, so it finally slows down and lands. But that's not what's happening at all. When the rock leaves my hand, it's going at a particular velocity, and it would keep going at the same velocity indefinitely in the absence of other forces. As Newton taught us, objects in motion tend to stay in motion. The rock slows and falls because air resistance exerts a force that decelerates it, while gravity pulls it back toward the earth. Energy is always conserved, so the energy I give to the rock doesn't fade away. Some of it is transformed into heat in the air that slows the rock down, but none of it disappears.
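
You can watch that play out in a toy simulation. Here's a rough Python sketch (the drag constant is a made-up number, and the integration is crude) comparing a thrown rock with and without air resistance: take away the drag force and nothing "fades"--the rock keeps its sideways speed until gravity brings it down.

# Crude Euler-step simulation of a thrown rock, with and without air drag.
# The drag constant k is an assumed number, chosen just to make the effect
# visible; the point is that drag, not "fading energy," shortens the flight.

g = 9.81     # gravity, m/s^2
k = 0.05     # assumed drag constant per unit mass, 1/m
dt = 0.01    # time step, s

def horizontal_range(vx, vy, drag=True):
    """Distance traveled before the rock (thrown from 2 m up) lands."""
    x, y = 0.0, 2.0
    while y > 0:
        speed = (vx**2 + vy**2) ** 0.5
        ax = -k * speed * vx if drag else 0.0
        ay = -g - (k * speed * vy if drag else 0.0)
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x

print(f"Range with air resistance:    {horizontal_range(20, 10):.1f} m")
print(f"Range without air resistance: {horizontal_range(20, 10, drag=False):.1f} m")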

As for heat, it's funny, counterintuitive stuff, too. In fact, it's not even stuff. It's molecular motion, and it behaves in unexpected ways. When I step out of the shower and put one foot on the tile floor and the other on a bath mat, I could swear the bath mat is warmer than the floor. But it can't be--they're both at the same temperature as the rest of the room. It's just that the tile conducts heat better than the fibers in the mat, so it sucks heat away from that foot more efficiently. That's why it feels colder, even though it isn't.
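
Here's a back-of-the-envelope sketch of that in Python, using Fourier's law of heat conduction. The conductivities and dimensions are rough, assumed figures, but they show how tile and a mat at exactly the same temperature can pull heat out of a foot at very different rates.

def heat_flow_watts(k, area_m2, dT, thickness_m):
    """Rate of heat conduction through a layer (Fourier's law: q = k*A*dT/d)."""
    return k * area_m2 * dT / thickness_m

foot_area = 0.01   # m^2 of skin in contact (assumed)
dT = 13.0          # skin at ~33 C, room and floor at ~20 C
layer = 0.005      # ~5 mm of material carrying the heat away (assumed)

k_tile = 1.0       # ceramic tile, W/(m*K), approximate
k_mat = 0.04       # fiber bath mat, W/(m*K), approximate

print(f"Heat drawn away by the tile: {heat_flow_watts(k_tile, foot_area, dT, layer):.0f} W")
print(f"Heat drawn away by the mat:  {heat_flow_watts(k_mat, foot_area, dT, layer):.1f} W")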

Of course, I'm speaking metaphorically when I say the tile "sucks heat," even if I don't realize it. That's not really what happens. In fact, the idea of suction is an illusion. If I take a drink through a straw, I'm not exerting a "force of suction." I'm lowering the air pressure in the straw, and that allows the pressure of the atmosphere (a surprisingly high 14.7 pounds per square inch) to push the drink up the straw. It doesn't sound right, does it? It goes against common sense.
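
That outside pressure is doing all the work, and it has a hard limit, which a quick Python sketch makes plain: even with a perfect vacuum in the straw, the atmosphere can only push water up about ten meters.

P_ATM = 101325.0    # atmospheric pressure, Pa (about 14.7 psi)
RHO_WATER = 1000.0  # density of water, kg/m^3
g = 9.81            # gravitational acceleration, m/s^2

# Pressure balance: P_atm = rho * g * h, so h = P_atm / (rho * g)
max_height = P_ATM / (RHO_WATER * g)
print(f"Tallest straw that could work, even with a perfect vacuum: {max_height:.1f} m")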

And that's the problem with common sense.

_________________________________________________

* You probably know these things as well as I do. My point isn't to say "Did you know that....", but to say, "We both know this; let's stop and think about how amazing it really is." That's how most of my posts are intended, actually. 

Thursday, November 21, 2013

The Light Fantastic

"My own suspicion is that the universe is not only queerer than we suppose, but queerer than we can suppose" - JBS Haldane

Have you ever gotten a new camera and decided to finally learn something about how photography works? You know, f-stops, exposure, ISO, all that stuff? I have. Multiple times, actually, because I can't seem to remember it. I'm learning again now. I was trying to figure out how aperture--the size of the hole light passes through in the lens--affects focus. I learned (once again) that the smaller the hole is, the greater your depth of field will be. Not only will your subject be in focus, but the background will be too. Make the hole bigger, and the background will be all blurry. But then I started wondering why that actually happens. Why does light that goes through a little hole stay more focused than light that goes through a big hole? That ignited my curiosity about light and lenses, and before long I had gotten out an old physics book from college to try to wrap my head around all that stuff.

That got me thinking about an idea I've always found fascinating. When you really start looking at how light behaves, it almost starts to seem intelligent. I mean, it's not really, but it gives that impression. Consider this: lenses work because they refract light, which just means they bend light rays. The reason they do is that light travels more slowly in glass than in air. If you shine a beam of light through a thick piece of glass (at an angle) and onto a wall, it will bend and slow down when it enters the glass, and then bend again when it leaves. And here's where it gets weird--if a beam of light goes from point A to point B, as shown below, somehow it's able to choose the one path, among the many possible, that takes the least amount of time. That path isn't necessarily a straight line, because if light moves more slowly in glass, the quickest route makes the distance through the air longer and the distance through the glass shorter.


The classic analogy is with a lifeguard rushing to save a drowning swimmer. Imagine you're that lifeguard, standing on the beach. You look out to your left and see the swimmer in trouble, as shown below. Should you make a beeline--running, and then swimming, straight at him? No...sometimes the fastest route between two points isn't a straight line. You can run faster than you can swim, so you should run a little farther down the beach, so you don't have to swim so far. There's an optimal route that's faster than any other. Most lifeguards won't hit on it exactly. But light is smarter than a lifeguard, at least in that sense. It finds the fastest route, and it does so at, well, the speed of light. It doesn't matter if it has to bend and straighten its way through several layers of air and glass--the path it chooses will be the one that takes the least time.
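
Here's a toy version of the lifeguard problem in Python, with made-up speeds and distances. It simply tries entry points along the waterline, keeps the fastest one, and compares it with the straight "beeline" path.

import math

RUN_SPEED = 7.0    # m/s on sand (assumed)
SWIM_SPEED = 1.5   # m/s in water (assumed)

lifeguard = (0.0, 30.0)   # 30 m up the sand from the waterline (y = 0)
swimmer = (40.0, -20.0)   # 40 m down the beach, 20 m out in the water

def total_time(entry_x):
    """Time to run to (entry_x, 0) on the waterline, then swim to the swimmer."""
    run = math.dist(lifeguard, (entry_x, 0.0))
    swim = math.dist((entry_x, 0.0), swimmer)
    return run / RUN_SPEED + swim / SWIM_SPEED

# Try entry points every centimeter along the waterline and keep the best one.
best_x = min((i / 100.0 for i in range(0, 4001)), key=total_time)

# The straight-line path crosses the waterline at its own particular point.
beeline_x = 40.0 * 30.0 / (30.0 + 20.0)

print(f"Best entry point: {best_x:.1f} m down the beach "
      f"({total_time(best_x):.1f} s)")
print(f"Beeline entry point: {beeline_x:.1f} m down the beach "
      f"({total_time(beeline_x):.1f} s)")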


How does it do that? How does it "know" which path to take? Christiaan Huygens came up with a plausible answer back in 1678, by thinking of light as moving in a wave, and of a wave front as the combination of many smaller waves. He was able to model refraction that way, as in this image. As the wave fronts approach the glass at an angle, each one changes direction as it slows down and enters. The change in direction will be at an angle that minimizes the time it takes for light to get from one point to another. Huygens was also able to model reflection this way, and a guy named Fresnel applied the idea to other "wavy" phenomena, like diffraction and interference. Nature is full of things that move in waves, and thus behave in all these ways, including water waves, sound waves in the air, and even seismic waves through the earth. Applying the idea to light helped turn the tide of scientific opinion away from Newton's theory that light was made of particles, and got physicists thinking of light as ripples propagating through space.

The Huygens-Fresnel idea does seem to provide a mechanism for how light "chooses" the shortest path, and it accurately predicts what that path will be. The problem is, light doesn't really work that way. At least, it's a lot more complicated than Huygens realized. Since the early 20th century, physicists have known that light is as much particle as wave. A beam of light is composed of countless discrete particles called photons. Each one has a frequency, which is what determines the color of visible light. Physicists once thought that brightness (amplitude) was analogous to the height of water waves--brighter light was thought to have higher crests and lower troughs. But it turns out that all photons of a particular wavelength carry the same amount of energy. A bright light is just spewing out more photons per second than a dim light.
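
That last point is easy to put numbers on. Here's a quick Python sketch using E = hf (equivalently hc divided by wavelength): every green photon carries the same tiny quantum of energy, so a brighter green light just means more of them per second. The one-watt output is just an assumed example figure.

PLANCK = 6.626e-34      # Planck's constant, J*s
C = 2.998e8             # speed of light, m/s
wavelength = 550e-9     # green light, m
power = 1.0             # assumed light output, W (J/s)

# Every photon of this wavelength carries exactly this much energy...
energy_per_photon = PLANCK * C / wavelength      # about 3.6e-19 J

# ...so a brighter beam just means more photons per second.
photons_per_second = power / energy_per_photon

print(f"Energy per green photon: {energy_per_photon:.2e} J")
print(f"Photons per second at {power:.0f} W: {photons_per_second:.2e}")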

This gums up the works for Huygens' wave theory. Even if you send one photon at a time through a piece of glass, light will still pick the fastest path. That's pretty crazy when you think about it. How can a single particle do that? The simplistic wave theory also falls apart when you look closely at reflection. If you shine a light on a pane of glass, most of it will go straight through. But up to 16% of the photons will bounce back, which is why you see a dim reflection even in transparent glass. Light reflects from both the front side and the back side of the glass, and how much it reflects depends on how thick it is. And that's where it gets crazy once again. If you put a light detector inside a very thick pane of glass and aim a stream of photons at it, you can measure how many photons are reflected by just one surface--the front--and you'll find it's about 4% in most kinds of glass. But if you send the light all the way through the glass, and measure how much is reflected, you find a strange trend. As you keep making the glass thicker, you find that the amount of light reflected will rise and fall regularly--rising gradually from zero up to 16 percent, falling back toward zero, and then rising again. Think about that zero percent. That means, even though four percent of photons bounce off the front surface of glass, if you add a back surface it can cut that percentage to zero. What?? How does the light "know" how far away the back of the glass is when it's just getting to the front? Somehow it does, even if the glass is several meters thick. It's like light isn't just smart...it's psychic.
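
Here's a toy version of that in Python, in the spirit of Feynman's "little arrows" picture: one small amplitude for the front surface, one for the back with a phase that depends on the thickness, added together and squared. It's a big simplification (a single wavelength, one bounce per surface), but it reproduces the swing between zero and roughly sixteen percent.

import cmath
import math

r = 0.2               # reflection amplitude at each surface (about 4% each)
wavelength = 550e-9   # green light, m
n_glass = 1.5         # refractive index of glass

def reflectance(thickness_m):
    """Fraction of photons reflected from a pane of the given thickness."""
    # Extra phase picked up by the path that crosses the glass and comes back.
    phase = 4 * math.pi * n_glass * thickness_m / wavelength
    front = -r                         # the front-surface arrow is flipped
    back = r * cmath.exp(1j * phase)   # the back-surface arrow lags behind
    return abs(front + back) ** 2

for nm in (0, 45, 92, 140, 185):       # a few thicknesses, in nanometers
    print(f"{nm:4d} nm of glass: {reflectance(nm * 1e-9) * 100:5.1f}% reflected")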

Of course, it's not really, but the truth is just about that weird. All this stuff is explained by the theory of quantum electrodynamics, or QED. Let's go back to light taking the fastest path through a piece of glass. If I understand it correctly, QED says that each photon takes every path from one point to another--even long ones, where it goes way off to the side and then back again. It spreads out all over the place in a very un-particle-like way, but then arrives as a particle. Each of those paths has a certain "probability amplitude" (the quantum world is all about probabilities), and oddly enough, the amplitudes for most paths are about equally large. But they point in different directions, so they mostly cancel each other out--except for the paths very close to the one that takes the least time, which all point roughly the same way and reinforce each other. So that's the direction the light goes. A similar interplay of amplitudes determines how much light will reflect off glass of various thicknesses. That all sounds crazy, and I certainly don't understand it in any deep sense, but apparently the math works just fine. Even physicists who have mastered the equations can't really explain what's going on--they don't understand why those equations work.
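
Here's one more toy Python sketch of that "every path" idea: light getting from A to B by bouncing off a mirror, with one little arrow per possible bounce point. The geometry and the absurdly long wavelength are made-up numbers chosen so the effect shows up with a coarse sampling of paths, but the pattern is the real one--arrows from paths far from the least-time path spin around and cancel, while the ones near it line up and supply almost the whole sum.

import cmath
import math

A = (-5.0, 1.0)      # source, 1 m above the mirror
B = (5.0, 1.0)       # detector, 1 m above the mirror
WAVELENGTH = 0.01    # toy wavelength, m (visible light is closer to 5e-7 m)

def arrow(x):
    """Unit amplitude ('little arrow') for the path that bounces at (x, 0)."""
    path_length = math.dist(A, (x, 0.0)) + math.dist((x, 0.0), B)
    return cmath.exp(2j * math.pi * path_length / WAVELENGTH)

bounce_points = [-5.0 + 10.0 * i / 20000 for i in range(20001)]

total = sum(arrow(x) for x in bounce_points)
near_center = sum(arrow(x) for x in bounce_points if abs(x) < 1.0)
near_edge = sum(arrow(x) for x in bounce_points if x > 4.0)

print(f"All paths together:            |sum| = {abs(total):7.0f}")
print(f"Paths near the least-time one: |sum| = {abs(near_center):7.0f}")
print(f"Paths out near one edge:       |sum| = {abs(near_edge):7.0f}")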

But they do work, smashingly. QED describes the quantum basis for most of the physical processes we see all around us; how light moves, how electrons orbit nuclei, how atoms bond together to create different materials...basically everything except nuclear physics, gravity, and certain kinds of radioactivity. Not only that, it's one of the most accurate theories in science. One of the originators of the theory, Richard Feynman, compared its accuracy to measuring the distance across the United States with a margin of error smaller than the width of a human hair. But the theory is also completely mind-boggling. As Feynman also said, "The theory of quantum electrodynamics describes Nature as absurd from the point of view of common sense. And it agrees fully with experiment. So I hope you accept Nature as She is--absurd." Light, and everything else at the quantum level, is pretty freaky stuff. But Feynman had a deep reverence for nature, and I think his point was more about common sense than nature itself. Light is more fundamental and ubiquitous than we are--it's pervaded space and time since the universe began. Looking at nature at a wide angle, we're the anomaly, not light or any other quantum phenomenon. If we find that it's absurd from the point of view of common sense, then we've discovered yet another flaw in common sense. Who are we to say what's absurd?

______________________________________________

QED: The Strange Theory of Light and Matter / Richard Feynman

I never explained how depth of field works. Here's a video that does.

Good video introduction to QED

An even more mind-blowing, but more widely known, phenomenon of quantum physics is demonstrated by the double slit experiment. Good video on it here.