It is remarkable to think that behind almost every effortful physical action we take lies a decision, whether we know it or not. More remarkable still is the fact that the same can likely be said for each of the effortful mental actions we generate (e.g., thoughts).
In recent years my colleagues, psychologists Wouter Kool and Matthew Botvinick, have provided exciting new evidence for a principle that has been referred to as the "law of least mental effort." They have shown that individuals treat mental effort much as they treat physical effort. All else being equal, they tend to prefer less over more thinking. This should all seem reasonable given our intuition that mental effort is experienced as costly. More importantly, though, they have also shown that decisions regarding mental effort are sensitive to incentives in ways remarkably similar to decisions regarding physical effort. People tend to shift their allocation of mental effort depending on the manner in which income and wages are dispensed, just as predicted by economic models of labor allocation (models that tell you, for instance, how much it will take to get your employee to work a few extra hours this week).
Their findings contribute to a growing body of research that urges us to think about thinking the same way we think about any laborious task, as something we do despite the costs and because of the payoffs. And yet we tend not to reflect on the fact that we are constantly faced with a decision as to whether it is worth the effort to do any of the things that research has shown to be mentally demanding. These include choosing between conflicting options, generating useful information from memory, switching between tasks, or down-regulating automatic emotional responses that interfere with our longer-term goals (e.g., anger at a client or customer, laughter at a funny incident you are reminded of). You may be familiar with these as some of the basic components of your daily mental expenditure, and may also recognize the fact that some of these can coincide with one another to multiplicatively increase our mental load. For instance, when you switch from a familiar to an unfamiliar task you face what is referred to as a "task-switch cost" in addition to the cost of resolving the conflicting possibilities for how to approach the new task.
The law of least mental effort explains not only why we are sensitive to such costs, but also why we might often take advantage of the myriad cost-saving tricks we have at our disposal. It turns out that the vast majority of decisions we make, consciously or not, offer us at least one relatively easy option from the perspective of mental effort: an "easy way out." Rather than generate a lot of information from memory, why not just use the first thing that comes to mind? Rather than regulate emotions, why not express them (and/or pretend you didn't by rationalizing after the fact)? Rather than resolve conflict/unfamiliarity, why not stick with a known quantity? For instance, we can avoid conflict entirely by going with the same choice we made before, sticking with the default options (e.g., the box that has already been checked), or relying on an answer someone else has already thought through for us (whether a contemporary scholar, founding father, ancient philosopher, or theologian).
Occasionally we might employ some of these strategies because it just feels right, and we think of these cases as times that we have relied on our "intuition" or "gut instinct." As I've written previously, these intuitions can influence not only our behaviors but also our basic beliefs about the world around us. Intuitions are appealing because they are typically born of something that we have experienced often enough that we have gradually come to simplify and automatize the way we process or respond to it. In the classic example, chess masters do not take each turn to consciously process each possible move, as they might have as amateurs. The same is true of musicians and athletes, and it is well known that forcing these folks to abandon their intuitions and consciously reflect on each note or each pitch can greatly worsen their performance. We can therefore rely on intuitions to often give us the correct answer (especially when applied to our area of expertise) but to always give us an easy answer.
What this all boils down to is the following: When we find that we have chosen the default option, maintained the status quo, stood on principle, or otherwise relied on our automatic emotional or cognitive response (i.e., intuition), there will always be some chance that we did it in order to spare some cognitive expense. There may also be an independent and well-reasoned argument for having made the decision we did, and that may or may not have gone into our choice process. But to the extent our choice was already tempting enough for its effort-saving qualities, our brain needn't have known what that other argument was.
There is a great deal we still have to understand about the idea of cognitive costs -- like what it is exactly about thinking that makes it so costly/effortful -- and my colleagues and many others continue to spearhead such research. But from the pieces I've laid out above it is clear that these costs may play a huge role in the type and amount of effort we put into thinking when confronted with problems ranging from multiplication to understanding the cause of unexpected and devastating events (for instance, in the weeks past and those to come, consider how often terms like "evil" and "criminality" are used to better clarify one's stance on the causes of gun violence). In each case we have mental shortcuts available to us; the question is whether we choose to take advantage of them. If and when we do, it's not because we're lazy. Just economical.