What Folding Chairs Taught Me About Skepticism

02/03/2013 07:06 pm ET | Updated Apr 05, 2013


At the Baptist church in which I was raised, my Sunday school teacher liked to explain faith with the analogy of a folding chair. No matter how many times you've sat in a folding chair before, she would explain, the only way to be sure a new chair will support your weight is to have faith and take a seat. "We have faith in chairs to support us," she'd say, "and we have faith in Christ to save us."

Even as I grew out of my fundamentalist upbringing, this analogy still gnawed at the back of my mind. After all, we all play our hunches in day-to-day decision-making -- often without realizing we're doing it. We trust drivers not to run us down in the street; we trust (some) news sources not to knowingly lie to us; we trust doctors and scientists to explain conclusions whose details we don't understand.

But this is just everyday pattern recognition. As Michael Shermer explains in his TEDTalk, our brains evolved to pick out patterns even where none exist, for the simple reason that false positives are often harmless, while missed danger can be fatal. Once our ancestors developed the capacity for abstract thought, though, false positives often did turn out to be fatal, especially for those who sacrificed themselves -- or got sacrificed -- to appease personified natural forces.
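The logic behind Shermer's point is an expected-cost calculation, and it can be made concrete with a small sketch. The numbers below are purely illustrative assumptions of my own, not figures from the talk: they show that when a missed predator costs vastly more than a needless flight, believing in a probably-imaginary pattern is the cheaper bet.

```python
# Illustrative sketch of the error-management argument: with asymmetric
# costs, acting on a weak pattern beats ignoring it in expectation.

def expected_cost(p_real, cost_false_negative, cost_false_positive, believe):
    """Expected cost of acting on (or ignoring) an ambiguous signal.

    p_real: probability the pattern is real (e.g. a rustle really is a predator)
    cost_false_negative: cost of ignoring a real danger
    cost_false_positive: cost of reacting to mere noise
    believe: whether we act as if the pattern is real
    """
    if believe:
        # We flee every time, so we only pay when the signal was noise.
        return (1 - p_real) * cost_false_positive
    # We ignore the signal, so we only pay when the danger was real.
    return p_real * cost_false_negative

# Assumed numbers: a rustle in the grass is a predator only 5% of the time;
# fleeing needlessly costs 1 unit of energy, getting caught costs 1000.
cost_if_we_flee = expected_cost(0.05, 1000, 1, believe=True)    # 0.95
cost_if_we_ignore = expected_cost(0.05, 1000, 1, believe=False)  # 50.0

print(cost_if_we_flee, cost_if_we_ignore)
```

Under these assumed costs, the jumpy ancestor who "sees" a predator in every rustle pays a tiny ongoing tax, while the skeptic's occasional fatal miss dominates -- which is why selection plausibly tuned our pattern detectors toward over-firing.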

So what is it that separates everyday trust -- everyday leaps of faith, even -- from the kinds of ironclad beliefs that inspire jihads, crusades, leper-kissing, and cathedral-building? How does one become so utterly sure of an idea that rests on so many unproven assumptions?

The answer -- or, at least, a part of it -- lies in the concept of "proof." If you'd asked a bronze-age farmer for proof that the earth sits fixed in space and the stars rotate around it, he'd tell you to look up at the sky; the truth is there for anyone to see. Ask him about the gods and he'd likely give you a similar answer: "Every year we spill blood for the wheat goddess, and every year the wheat grows." Though his anterior cingulate cortex would've been just as adept at picking out patterns as ours is, the patterns he saw would've been distinctly different from the ones that seem so obvious to us.

Proof and logic, in other words, don't exist in a vacuum -- they sprout in a substrate of preexisting assumptions. Mathematicians realized this, to their horror, in the early twentieth century, when thinkers like Kurt Gödel demonstrated that any consistent formal system rich enough to describe arithmetic contains true statements it can never prove -- that mathematics, in other words, can never be fully self-contained. We're left, then, with the rather uncomfortable idea that truth and falsehood -- and even proof -- are inherently relative concepts.

This sort of freshman-level existential angst might seem about as distant as possible from the metaphysical surety of the religious believer, but the two are united by an unavoidable aspect of human thought: At some point, whether consciously or unconsciously, each one of us has to decide which propositions he or she is willing to take on trust. That's not to say, of course, that all propositions are created equal. To believe in a wheat goddess today, for instance, you'd have to ignore or reinterpret acres of physical and biological data. Even more importantly, your wheat goddess wouldn't predict future crops with anywhere near the accuracy of a bioassay or a soil survey. In short, the more data we gather, the more aspects of our beliefs we have to revise.

And here we come to the real crux of the belief engine. The chief survival value of a brain as disproportionately huge and calorically expensive as ours is that it makes reliable predictions about complex situations. The more useful predictions a given belief generates, the more inclined we are to trust it -- perhaps even to build a life upon it.

Our desire for foresight becomes problematic when we consider that a belief doesn't need to be empirically valid in order to make useful predictions. Take, for instance, an athlete who believes that spiritual forces guide his bat-swing or jump shot -- and consistently hits home runs or makes baskets while he prays. We can tell him he's just tuning out distractions and relinquishing motor control to semi-conscious muscle memory -- that his interpretation of the process is flawed, in other words -- but what we can't deny is that when he thinks certain thoughts, certain useful results are statistically more likely to ensue. What we're really shooting down is his belief's range of useful predictiveness.

Non-mathematicians have learned to trust math and physics because properly applied theorems produce accurate predictions about certain ranges of natural phenomena. And even as we discover more and more cross-applicable principles -- the mathematics of neural network dynamics, for instance, makes surprisingly accurate predictions about the behavior of social groups -- we still find that applying a theorem in the wrong way, or to the wrong system, produces inaccurate predictions. In fact, that's essentially how we're able to "back into" the scientific definition of "wrong" itself. The really strange part of all this isn't that proof and prediction are so limited -- it's that most of us get along pretty contentedly this way.

Faith, in short, is simply a tool. Applied too broadly, it can lead to dogmatism; applied judiciously, it can cement societies together. And doubt is a tool too -- one with the power to shave away old superstitions or to bring innovation to a screeching stop. The goal of a skeptical mind, then, isn't to stop seeing patterns altogether, but to understand what to do with the patterns we see.
