How We Make Decisions on Issues Like Ferguson

It is interesting that in assessing the situation in Ferguson, individuals tend to retreat to their separate opinion corners. It appears you have to be either pro Mike Brown or pro Darren Wilson. More commonly opinions are expressed in terms of which team (or individual) is a demon and which is blameless. Why is this?

The reason, I think, lies in the science of decision-making. Most decisions and opinions are not arrived at thoughtfully, but through habit, group pressure, or how well they conform to a previously established worldview.

Everyone makes decisions all day long, and every once in a great while we have to make a critically important decision. Generally, we are fairly confident about our logic in arriving at most decisions, even if we do not like the consequences. Surprisingly, behaviorists point out that most decisions are made without conscious thought. That's right; we are mostly on automatic pilot!

Given the number and complexity of decisions confronting us, we tend to unconsciously utilize tools called heuristics. A heuristic is simply a rule of thumb or generalization that simplifies decision-making. We rely on heuristics for expediency and because they do work for the most part. Over time you might have noticed that A usually occurs with B, that X means Y must have occurred, and so on.

Another common shortcut: "How does my group (race, gender, political party, country, Facebook friends) feel about this issue?"

But, being generalizations, heuristics are rife with the potential for errors and biases. For one thing, heuristics are often developed from prior experience (actually, the memory of experiences). Important research by psychologist Dr. Elizabeth Loftus (Memory: Surprising New Insights into How We Remember and Why We Forget) showed that memory is not a mental bank into which objective experiences are deposited, then withdrawn in their original state at a later date.

Rather, memory is reconstructed when needed, colored by subsequent experiences, influenced by current vested interests and tweaked by context. Think of the differences among various eyewitness testimonies in the Mike Brown-Darren Wilson incident.

Interestingly, memory is laid down with personal biases from the start. Two individuals observing a single event from the same vantage point often have very different recollections. Selective memory and false memory are both powerful illusions, often treated as valid and used as the basis for current decisions.

Since heuristics, while convenient, are often based on faulty memory and unexamined biases, why do we continue to use them? Because they are easy, comfortable, and serve us well for the most part, being readily accessible. In fact, the most common type of heuristic is called the availability heuristic: overestimating the frequency of vivid, extreme, or recent events and causes. There is a strong tendency to judge the frequency or likelihood of an event, or to explain its cause, by how easily something similar can be retrieved from memory.

Think how vivid some childhood experiences (good or bad) still are. Not only can you recall the details, but even when you make decisions as an adult, those memories can come to the forefront in an uncanny way, especially if you perceive that the current situation promises substantial stress or reward.

The representativeness heuristic reflects the tendency to predict or prejudge the likelihood of an event from limited prior experience. This might play out, for example, after an unfavorable experience with a policeman or a person of color: each subsequent encounter then seems like a likely opportunity for another negative outcome. Even news coverage, by the way, is incorporated by many as prior experience.

Differences can be threatening. As the demographics of the country change, we are increasingly exposed to individuals of different cultures, and it is tempting to ascribe any individual difference in personality or behavior to a quality (often undesirable) inherent to an entire culture. We might even come to regard a lack of intellect, a poor work ethic, or a tendency toward violence as representative of an entire group.

Remember also that a previous encounter with a policeman or a young black man, whether positive or negative, is not necessarily predictive or representative of subsequent experiences with other members of either group. While tempting, lazy generalizations can be erroneous and even dangerous.

The anchoring and adjustment heuristic is the common tendency to make decisions by adjusting from some initial base, or anchor. Salesmen use this technique very effectively when they quote a retail price and then bargain down from it. The buyer automatically perceives any price below that figure as a "deal," even though the sticker price may have been inflated to begin with. You can probably think of countless decisions you have made based on this rule. Giving disproportionate weight to the first information you receive is a classic mistake.

One form of fallacious reasoning is the trap. As in common parlance, a trap is easy to fall into and difficult to get out of. The confirming evidence trap leads us to seek out information that confirms our existing point of view while ignoring contrary information: "my mind is made up, don't confuse me with the facts." This sounds comical, but it reflects the way the majority of us make decisions. It is common both to predict the outcome of interactions and to explain outcomes based on unfounded preconceptions.

Scan the social media sites for "black thug" and "unarmed black men." Often the poster does not seek to understand the evidence but jumps to explaining the outcome based on a confirming evidence trap. Most come to a decision, then search for "proof" that confirms their position.

Sadly, confirming evidence traps are perpetuated because they are comforting and spare us the difficult work of critical thinking and evidence gathering. More significantly, confirmation biases can be remarkably self-fulfilling: you observe what you expect, after all.

In most decision-making situations, the key is to be more critically aware of our thoughts and actions. Once we decide to engage that awareness, it is easier to discern whether we are dealing with a unique situation, gathering evidence to make a logical and fair decision, or simply trying to prove our preconceptions.

Heuristics are not necessarily bad. But since both our thoughts and actions are often on automatic pilot, we sometimes make even important decisions uncritically.

Three safe rules of thumb: commit to being more consciously aware of your decision-making style; be willing to entertain different perspectives; and be willing to change your mind if new evidence presents itself.

Oh, and own your daily contribution to societal stereotyping, as well as the consequences of the decisions you make each day.
