It's hard to escape the trap of Big Data these days. In the current issue of The Week, for example, the cover headline reads: "Trapped in Big Data: The debate over surveillance, privacy and national security." The National Security Agency (NSA) revelations have brought into relief just how much personal data is routinely mined not only by federal, state and local governments, but also by corporations. The seemingly limitless records of our e-mails, phone calls, purchases, travels and current locations are used to pitch products and services. If we are to believe the NSA, these data have also been used to foil terrorist plots.
The NSA revelations have unleashed a spirited political debate about the costs and benefits of Big Data collection and analysis. NSA defenders, including President Obama, suggest that the state must tread "lightly" on our rights to privacy in order to keep us safe. Critics suggest that government thirst for Big Data is the first step toward an authoritarian state in America.
Beyond the important debates about privacy and the constitutionality of NSA snooping, there is one issue that has received less attention -- the usefulness of Big Data. In a recent Ethnography Matters blog post, cultural sociologist Tricia Wang, who writes about the human dimensions of technology, describes the use and misuse of Big Data, which she defines as "a term often used to describe the quantitative data that is produced through analysis of enormous data sets." She argues that ...
...Big Data tends to place a huge value on quantitative results, while devaluing the importance of qualitative results. This leads to the dangerous idea that statistically normalized and standardized data is more useful and objective than qualitative data, reinforcing the notion that qualitative data is small data.
These two problems, in combination, reinforce and empower decades of corporate management decision-making based on quantitative data alone. Corporate management consultants have long been working with quantitative data to create more efficient and profitable companies.
With statistically sound analysis, consultants advise companies to downsize, hire, expand, merge, sell, acquire, shut down, and outsource all based on numbers (e.g. McKinsey, Bain & Company, BCG, and Deloitte).
Without a counterbalance the risk in a Big Data world is that organizations and individuals start making decisions and optimizing performance for metrics -- metrics that are derived from algorithms. And in this whole optimization process, people, stories, actual experiences, are all but forgotten...
Wang's useful essay brings into the Big Data debate an age-old problem -- how should we approach the analysis of the world -- how decisions are made in institutions, how people interact, what motivates personal preferences and so on. In one camp are folks who want to lump data. They gather enormous amounts of the stuff -- even Big Data -- and derive from their analyses clear-cut principles, metrics, protocols and algorithms, which can then be applied to decision-making -- no matter the context. In the other camp are folks who want to split data. They usually believe that you have to pay attention to the nagging complexities of, to use Wang's terms, "people, stories, and actual experiences..."
The late Clifford Geertz, perhaps the most influential American anthropologist of the 20th century, believed that you could use what he called "thick description" to make sense of the seemingly infinite array of data in the world. Contemporary analysts who employ thick description, which does not ignore quantitative data, would attempt to give massive arrays of Big Data a social or institutional context. Put another way, thick description would give nameless and formless Big Data some degree of analytical meaning, creating sets of what Wang calls "Thick Data," the stuff of ethnography, the ground-level description and analysis of social, cultural, economic and political phenomena.
The critical assessment of Big Data, including, of course, the unimaginable amounts of private information that the NSA has uncritically mined and stored, is more than an arcane discussion of scholarly method. It has institutional and political implications. Large organizations, including the federal government, like Big Data because it holds the promise that social, cultural, economic and political complexity can be reduced to a set of hermetically sealed formulae, protocols and algorithms. These are politically expedient because they are neat and tidy, a feature that presents the illusion that you can control events or, in the case of the NSA, prevent terrorist incidents. Such a narrative builds public confidence -- always a political plus.
If you are blind to context, however, your analysis is bound to reinforce your long-held assumptions about the world. Thirty years ago Geertz called these assumptions "home truths." Among decision-makers, such blindness is likely to lead to unanticipated outcomes, collateral damage and terrible mistakes. In other words, the leaders of large organizations, including President Obama, are trapped under ever-increasing piles of Big Data, which means they have little knowledge of what's happening on the ground, let alone in the streets. Using Big Data protocols, metrics and algorithms, they tend to make decisions in isolation from experiential context.
The problem of Big Data is here to stay, which means that in the coming months and years we'll need a legion of ethnographically trained analysts to produce "Thick Data" -- to save us from ourselves.
Follow Paul Stoller on Twitter: www.twitter.com/Sohanci