Don't Play Games With Your Health

HealthTech companies need rigorous research and regulation, not a rush to market.

As a HealthTech start-up CEO, I'm straddling two worlds: one as an entrepreneur in Silicon Valley, the other as a doctor and neuroscientist. The first world is built and rewarded on hockey-stick growth, wild valuations, beta launches, and speed: Google, Facebook, Twitter, Uber - the faster, the better.

In the second world, faster is not always better. This world is modulated by rigorous process, evidence, regulation and even an ancient oath (Do No Harm) put in place to protect and save our lives.

With increasingly advanced technology, we've seen apps and products hit the market that make all sorts of promises. Some claim to help users fight anxiety and stress, others depression, still others are even billed as suicide prevention. We have also seen companies use technology to promise, at the prick of a finger or drop of saliva, vast knowledge of, and answers about, our health.

At first glance, this seems like exciting progress. And partly, it is. The merging of tech and health will undoubtedly mean great strides in prevention, diagnosis and treatment of disease. However, in terms of research and regulation, traditional medicine collides with the tech revolution.

An Emphasis On Real Research

I recently read an article in TechCrunch entitled "Pseudo-Therapy Apps: The Fad Diet of Mental Health." It's penned by psychotherapist Irene Gorodyansky, and I couldn't agree more with her analysis.

She writes of the "therapy app" trend, mentioning a handful of apps that promise users relief from serious psychological problems by playing games. The apps all have two things in common: they promise fast results, and they are backed by skewed data. Three months, 30 days, 25 minutes - these are very quick fixes for serious problems such as anxiety or depression.

The numbers used to prove such results are misleading, if not blatantly false, explains Gorodyansky. Some only count the people who use the apps long term, not those who abandon them after a few days or weeks. They use small sample sizes, and there is no equivalent of a placebo.
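To see why counting only long-term users skews the picture, consider a minimal, purely hypothetical simulation (the numbers below are invented for illustration and are not drawn from any study or app she reviews). If the people who happen to feel better are also the ones most likely to keep using an app, then measuring improvement only among "completers" can make a product with no effect look remarkably effective:

```python
# Hypothetical sketch of attrition bias: 1,000 people start a "therapy" game,
# improvement has nothing to do with the app, but people who happen to improve
# are much more likely to keep using it for 30 days.
import random

random.seed(0)
users = []
for _ in range(1000):
    improved = random.random() < 0.30          # ~30% improve regardless of the app
    retained = random.random() < (0.70 if improved else 0.10)  # improvers stick around
    users.append((improved, retained))

overall = sum(improved for improved, _ in users) / len(users)
completers = [improved for improved, retained in users if retained]
reported = sum(completers) / len(completers)

print(f"True improvement rate, all users:      {overall:.0%}")   # roughly 30%
print(f"'Reported' rate, long-term users only: {reported:.0%}")  # roughly 75%
```

In this toy example, about 30 percent of everyone improves, yet the completers-only figure comes out near 75 percent - exactly the kind of number that looks impressive in marketing copy while proving nothing about the app.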

She writes, "What are we to believe about the efficacy of this approach to mental health if we can't trust the numbers? Do the scientifically backed games and exercises become just games and exercises?"

And further, "We can't allow the word 'science' to lose its meaning when it's related to people's health. Flimsy science used for promotional purposes trivializes the significance of mental health issues."

Her views are spot-on. Transformative digital health companies have the potential to improve our health and enrich our lives, but not if they are built on false data. The messages that promote these apps use pseudo-science to make people believe that if they play a game, they'll feel better.

Instead, real, evidence-based companies need to be nurtured and fostered in the Valley. We need a real dialogue between investors, marketers, and end users about how to build and grow companies with responsibility, respect and restraint.

We need to differentiate games and gimmicks from technology grounded in diligent, long-term research. If an app makes a health claim such as "Reduced symptoms of anxiety and depression," that claim should be backed by years of rigorous, independently funded, peer-reviewed research - including published results from controlled clinical trials.

I've witnessed what the Valley can do, and I believe we can make this cultural shift in the HealthTech space. As a doctor (and an entrepreneur), I'd prescribe that this shift happen fast.

The Importance of Regulations

The disconnect between healthcare and tech also shows in the unrealistic expectations placed on entrepreneurs by a system that knows and rewards only one speed: fast.

Healthcare is regulated for a reason. Popular culture might lead you to believe that regulations exist to put a barrier between the consumer and the latest methods of care. In truth, these regulations - licenses and FDA approvals, for instance - were put in place long ago to stop unethical practitioners from harming people.

Wearable devices can give you up-to-the-minute data on your biological functions, whether it's steps per day or heartbeats per minute. Testing services promise early detection of disease, mapping of your DNA, and data about your ancestry. But what is the user to do with that data? As I once saw printed on a mug: "Please do not confuse your Google search with my medical degree."

Having more data about our own health is a powerful idea whose time has come. But as the consumer, you've got to consider the source. When people without medical degrees rely on data from companies that may not have FDA approval, they risk making dangerous decisions about their own health. There is a massive education gap in our culture around what is and isn't science.

Of course, like any profession, medicine has its good doctors and its bad ones. I think most of us would agree that we want to see a doctor who has been properly trained and holds a license to practice medicine. And yet we've seen tech companies that boast extensive health data and care, but don't have medical doctors at the helm or even providing the care through their apps.

To bypass licensing requirements, certain services are dubbed "advice" or "coaching" when what they're really providing is medical care - some even advertise "Cognitive Behavioral Therapy" in their product offering while unlicensed personnel with bachelor's degrees dispense the "treatment," renamed "coaching." This is a dangerous business. People offering "advice" might have no training in medical ethics, no clinical training, and no healthcare certification of any kind. They could be practicing medicine without a license.

Currently, all fifty states require a license to practice medicine or psychology; the simple act of putting care into an app should not change the fact that a licensed professional needs to be on the other end prescribing or offering treatment or therapy. I predict that, if we don't intervene first, the collision will come with the first wrongful death or malpractice suit against such a company - when a person with depression or anxiety, seeing a "coach" with no training in managing those conditions, commits suicide while using the app.

I urge consumers to be more discerning about where they take their healthcare questions and where they get their answers.

But what we really need is for HealthTech companies to be held to the highest standard. Unlike typical Valley startups, this is going to take patience, process and time. We need companies to be backed by years of informed research, and not pushed to market before they're thoroughly vetted and regulated where necessary. Would you accept a prescription or pill from your doctor that had been rushed to market and beta tested...on you?

When taking someone's health into consideration, faster isn't always better. When we move too fast in medicine, we don't break things, we break people. Those people aren't numbers in our data spreadsheets; they are someone's children, mothers, fathers, husbands, wives, sisters, brothers. Let's change the conversation.

Mylea Charvat is CEO and Founder of Savonix (http://www.savonix.com), an evidence-based brain assessment platform that has undergone decades of rigorous research. She completed her Ph.D. Fellowship in Clinical Neuroscience at Stanford School of Medicine and was an early employee at Preview Travel - now Travelocity.
