12/20/2013 11:00 am ET Updated Feb 19, 2014

Enough With Business Models Exploiting User Data -- Why 23andMe and Google Are Not Users' Friends

Last month, the Food and Drug Administration told upstart 23andMe to stop marketing its genetic testing kits to consumers. These kits had been telling consumers what genes they carried, including flagging potentially serious propensities for various diseases. The FDA acted because the company had been stonewalling the agency's requests for assurance that it was not providing health advice to customers based on that genetic testing.

Pundits picked sides in the controversy, with libertarians promoting 23andMe as the next step in individual control of personal health and critics seeing it as one more faddish health care scam spreading misleading data preying on consumer fears. Most health care analysts argued that the kind of genomic testing that 23andMe provides is of little value to most doctors and patients in making health care decisions, although it might start a conversation with medical professionals.

So far this could be the garden-variety debate we've seen over nutritional supplements whose makers claim amazing properties that medical authorities dispute. But the kicker for 23andMe is that it's not really expecting to make its money on the bargain-priced $99 genetic testing kits it processes for customers. Those are just the lure to get user medical data into the company's own databases, which will become a cash cow marketed to a whole range of corporate clients.

As 23andMe board member Patrick Chung told Fast Company, "The long game here is not to make money selling kits, although the kits are essential to get the base level data. Once you have the data, [the company] does actually become the Google of personalized health care." Given that 23andMe's founder and CEO, Anne Wojcicki, is married to Sergey Brin, the fact that 23andMe has Google's business model is not surprising.

And Google's business model is based on gaining an almost unassailable dominance of user data online in order to market the users of its search, video, email, cell phone and other services to its advertising clients. As I've written in more depth in this academic article, Google's users have seen their data used to target ads from some of the most exploitative sources possible: subprime mortgages at the height of the financial crisis (then one of Google's largest sources of revenue), fake and illegal pharmaceutical drugs (for which the company was fined $500 million by the government), and scam artists advertising "help" to families threatened with foreclosure (ads the company took down only when ordered to by the U.S. Treasury Department).

Even companies using Google to sell legal products draw on a whole variety of behavioral and contextual data to deliver different offers and discounts to people based on that profiling. This online version of price discrimination delivers higher prices in aggregate than traditional marketing, since each person ideally gets a price tailored to their highest price point -- and what Google's chief economist Hal Varian calls the "myopic consumers" will pay more, since they never hear about the lower prices paid by other customers.
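To make the arithmetic behind that claim concrete, here is a toy sketch -- with purely hypothetical numbers, not drawn from any real ad platform -- of why charging each profiled buyer their personal maximum extracts more total revenue than any single posted price:

```python
# Toy model of personalized pricing vs. a single posted price.
# Each buyer has a private maximum willingness to pay (hypothetical numbers).
willingness_to_pay = [40, 60, 80, 100, 120]

def uniform_revenue(price, wtp):
    """Revenue at one posted price: only buyers willing to pay it buy."""
    return price * sum(1 for w in wtp if w >= price)

# The best the seller can do with a single price for everyone.
best_uniform = max(uniform_revenue(p, willingness_to_pay)
                   for p in willingness_to_pay)

# With profiling, the seller charges each buyer exactly their maximum,
# so every buyer pays their highest price point.
personalized = sum(willingness_to_pay)

print(best_uniform)   # 240 -- e.g., posting a price of 80, three buyers pay
print(personalized)   # 400 -- consumers collectively pay far more
```

In this sketch the best uniform price collects 240, while perfect profiling collects 400 from the same buyers -- the gap is the extra consumer spending that tailored pricing captures.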

Now marry that business model to the most intimate genetic data for each person buying a genetic test from 23andMe. The possibilities are chilling -- and obviously lucrative for the investors in the company. Given that those shelling out $99 for a hot new genetic test are likely to have a larger than normal share of hypochondriacs among them, 23andMe can "pass on" information about services and products tailored to the medical fears of each 23andMe customer based on the genetic risks identified by the company. The company promises not to directly sell user data to third parties -- but then so does Google, which makes $50 billion per year in indirectly monetizing its user data.

On top of that, 23andMe is explicitly piling up a critical mass of user genetic data to identify patterns and aggregate information to sell to medical researchers, insurers and pharmaceutical companies. The company hopes to convert that data into its own patents as well, further monetizing user data. Medical data is valuable, and many people buying these tests don't even know they are handing over valuable personal information -- just part of the broader trend in the economy of "big data" shifting wealth into a narrow set of corporate hands, with little compensation for the users who are the source of that data-driven wealth.

Business models based on grabbing user data quickly and monetizing later are the toast of Silicon Valley, but they are just the latest version of ripping off people who don't know the value of what they are selling. 23andMe has just extended the concept by getting people to actually pay the company for the privilege of fattening 23andMe's genetic data bank.

What we need is a broader public debate on the intertwined issues of how we are going to safeguard the privacy of personal genetic data and how we are going to ensure that any aggregation of that data strengthens publicly available research for all medical researchers, not just the proprietary patent interests of a few select corporate players. 23andMe is moving in the wrong direction on both counts.

We should be glad the FDA put a temporary stop to its marketing, but we need a far broader public debate on how to ensure that consumers don't give up valuable personal data without being fully informed of its value -- and on creating the option for consumers to opt out of data sharing and aggregation altogether. And Silicon Valley needs a better model than building its future on exploiting user data.