BinCam's users buy garbage cans that photograph their trash, analyze it, and inform Facebook friends of their habits. Its creators and customers hope to tap into the power of peer pressure to improve behavior. Technology skeptic Evgeny Morozov wrote in the Wall Street Journal that BinCam's smart technology falls somewhere between "bad smart" and "good smart." Bad devices limit choices by making some things hard or impossible - say, breathalyzers that prevent drunk drivers from starting their cars. Good technologies leave us in charge, offering only information to aid our decisions. BinCam could, for instance, become "good smart" if it benchmarked our recycling behavior against that of other people in our demographic group. But it is half bad because it appeals to "base instincts" like earning rewards, competition and impressing others.
But making choices involves more than pondering the alternatives. Morozov's picture of autonomy and choice hearkens back to decayed ideas from centuries past. We are not disembodied, perfectly rational actors or computers like Star Trek's Data. Most of our daily decisions defy this assumption. Reflection has little to do with where we step, when we eat, or how we move from home to work. Instincts guide our gait, which prevents cognitive overload. Our (often malfunctioning) hormones and the schedules set by our employers decide when we will eat. Cities planned around the automobile push us toward sedentary commutes, even if we prefer a refreshing walk before facing eight hours at a desk.
More important, most people already know they want to eat better and exercise and recycle more, and they know what is involved. They don't need knowledge; they need the motivation and the means for achieving their goals, and these intertwine with their emotions, instincts and environment. Just as a prosthetic helps an amputee walk upright, smart technologies can augment our will and help us overcome the many external and internal barriers that constrain us. Smart technology takes reality as we find it and empowers us. We need more than information to choose - we need sustainable motivation. Better to work with our impulses and bring them into alignment with the rational choices we want to make. If anything, smart technology increases autonomy - if autonomy means doing what we would thoughtfully prefer to do. Morozov's naïve notions about the stuff of human choices lead him to miss the obvious.
Morozov's rational actor famously performed center stage in classical economic theory, which assumed that people maximize their own well-being when making purchases. Ironically, though, he scolds engineers for curtailing choice by offering consumers the choices they crave. Rather than follow his own advice and provide unbiased information about smart products, he imports his idiosyncratic ideas about choice, labels these products "bad," and suggests they make us dumb. They appeal to "base instincts," as if Runkeeper users are baboons. By his own confused standards, his words "fall somewhere between good smart and bad smart." Even Morozov's modest suggestion that BinCam should benchmark customers' recycling habits is a known tactic for creating comparison and appealing to "base" instincts like people's pride and their desire to keep up with others.
Perhaps Morozov fails to recognize that all technologies have biases, as Rushkoff has argued. Guns and pillows can both kill, but their bias toward doing so differs enormously. We can pick any kind of car to get to work - gas, diesel, or electric - and choose among hundreds of models, but "this sense of choice blinds us to the fundamental bias of the automobile toward distance, commuting, suburbs, and energy consumption." Morozov has strangely singled out smart technology for criticism, though bias is common to all products, and at least smart technologies make many of their nudges explicit. People buy them because they want to change their behavior. It's hard to say the same for the biases that come with products like ramen and chips and fast food. People don't purchase these with chronic disease and early death in mind, though unfortunately that's often part of the package. Morozov misses the forest and then cuts down the wrong trees.
This doesn't mean all products are evil. But bias and appeals to base instincts are unavoidable. Technologies frame our decisions. Marketers influence our choices by bombarding us with messages that appeal to impulses like pride, fear, sex, exclusivity and security. Animal urges - like starvation avoidance - push us to overeat. City planning, or the lack thereof, constrains our choices about where we live and how we travel. In other words, we are up against all kinds of intentional and unintentional forces that we didn't ask for, all of which conspire to curtail our autonomy.
And, in rarer cases, we should circumscribe choice. Even libertarians have long acknowledged that freedom should be limited when our actions harm others. This is why we have a criminal code. Breathalyzers that keep drunks from driving should limit their liberty to injure innocents, just as the threat of prison or an officer's pistol strongly encourages people to avoid violent altercations. Appropriate autonomy is more than mere license.
Morozov raises more serious concerns, like privacy and the possibility of people being forced to turn over information or engage with coercive technology. But he offers nothing new here: these worries are widely shared, and decent legislation can protect citizens from overzealous technocrats and rapacious businesses. More important, Morozov's rubric for "good" and "bad" relies on discredited ideas about human autonomy and choice, causing him to miss the mark so wildly that he criticizes smart technology as autonomy-limiting when it actually offers us the chance to play a greater role in choosing our own destinies.