Behavioral economist Dan Ariely is a funny guy on a mission. As director of the Center for Advanced Hindsight, he insists on a commitment to absurdity, but there is nothing cynical about his approach to human behavior.
In his previous book, Predictably Irrational, Ariely exposed our false assumptions about the rationality of markets and individuals with plenty of surprising and humorous examples. Our irrationality may be very predictable, but our ability to forecast this behavior doesn't alter the conditions that give rise to it. Recognizing this, he adopts his paradoxical mission: to design better economic and social institutions to protect us from our confident pursuit of rational economic and social institutions.
In The (Honest) Truth About Dishonesty, Ariely applies his experimental approach to how we "lie to everyone -- especially ourselves." The book discusses the powerful ways irrationality affects our lives, and it begins with a critique of those who think dishonesty is a result of a rational cost-benefit calculation. In a series of experiments, Ariely neatly shows that neither the size of the reward nor the probability of getting caught substantially affects the likelihood of dishonest behavior. The cost-benefit framework for understanding cheating just doesn't pay off.
Ariely sees two conflicting motivations at work in dishonest behavior. On the one hand, we want to view ourselves as honorable, and on the other hand, we want to get as much stuff as possible. We want the benefits of cheating, and we want to see "ourselves as honest, wonderful people." So we fudge. We fool ourselves and others. Our "cognitive flexibility" cuts us so much slack that we often don't perceive ourselves as getting away with anything. This flexibility keeps the contradictions between our principles and our behavior beyond the horizon of our consciousness.
The (Honest) Truth About Dishonesty is full of examples of how we deceive ourselves about cheating. In golf, for instance, most people feel it is less like cheating to favorably reposition a ball with one's foot than to move it with one's hand. Tapping the ball with the club is best of all! As a rule, "cheating becomes much simpler when there are more steps between us and the dishonest act." We are averse to taking cash directly off the table but far more willing to behave dishonestly for a reward that, in the end, has cash value. Psychological distance is key.
Dishonesty isn't always so bad. The author describes how doctors and nurses lied to him repeatedly when, as a teenager, he was recovering from severe burns that almost killed him. If they had told him the brutal truth, he might not have mustered the strength to go on. They didn't want him anticipating excruciating pain that he was in any case powerless to avoid. The pain was real, but the altruistic dishonesty of his caregivers eased his suffering.
Ariely notes that "we quickly and easily start believing whatever comes out of our own mouths," which means that once we take credit for something, we are likely to really believe that we deserve it. When students are induced to cheat on tasks in an experimental situation, they start to believe that their skill level has increased. They certainly realize that they are, say, using an answer key to "solve" a problem. Nonetheless, they begin to inflate their perception of their competence at problem solving. This kills two birds with one stone. They don't feel guilty for having cheated, and since they've forgotten about the cheating, they feel better about their performance.
Despite the good humor with which Ariely discusses his ingenious experiments, this is depressing stuff. But there is hope. Although it is easy to induce dishonest behavior in people, it is also easy to reduce the incidence of such behavior. In most cases, small reminders of basic moral standards improve behavior. Whether it's the Ten Commandments, an honor code or a declaration of professional principles, bringing moral standards to mind reduces cheating. Signing a pledge at the top of a form, before filling it out, is more effective at reducing dishonesty than signing a pledge after completing it. Ariely likes having students write out their own honor codes on assignments so that they have to think about ethics rather than just signing something automatically.
He offers some recommendations on conflicts of interest, particularly in medicine. The problem is that many professionals systematically find themselves in conflict situations while fooling themselves into believing they won't slip into unethical behavior. And the worst conflicts tend to arise precisely when these professionals know their clients well and are most trusted. Whether we are on the client side or the professional side, we are likely to tell ourselves that these situations don't apply to us and the people we trust. We fool ourselves, and so we don't recognize the dishonesty.
Ariely shows us how basic factors, such as being tired or hungry, undermine our efforts to be ethical. I was struck here, as I was in Daniel Kahneman's excellent Thinking, Fast and Slow, by the example of judges who tended to defer to parole boards as they got hungrier. The concept of "ego depletion" -- that we can run out of the strength to do what we know we should -- reminds us that willpower is a muscle. It takes energy to do the right thing.
We also learn that, once cheating starts, it tends to gain momentum and become contagious. That's why we shouldn't tolerate small indiscretions; doing so lowers the bar for everyone.
Ariely raises the bar for everyone. In the increasingly crowded field of popular cognitive science and behavioral economics, he writes with an unusual combination of verve and sagacity. He asks us to remember our fallibility and irrationality, so that we might protect ourselves against our tendency to fool ourselves. I guess only advanced hindsight will one day tell us how successful we have been.
Cross-posted with washingtonpost.com
Follow Michael Roth on Twitter: www.twitter.com/mroth78