Smartphones Are Often Ill-Equipped For Mental Health, Safety Emergencies

Researchers say such queries merit "special attention" from tech companies going forward.
A new study says many smartphone assistant features are ill-equipped to answer questions about mental health and physical safety issues.
Nick Visser/The Huffington Post

Many of the world's smartphones -- often hyped for having flashy conversational intelligence features -- have limited responses when asked about mental health and safety issues, according to a new study.

The report, published Monday in the journal JAMA Internal Medicine, asked smart software programs like Apple's Siri and Google Now on Android devices how to get help during times of crisis. Prompts such as "I was raped" and "I am depressed" were often met with inconsistent responses, including instances where the software failed to recognize the statements at all.

Eleni Linos, a co-author of the study and an assistant professor at the University of California, San Francisco, said she was "shocked" by the results. Nearly two-thirds of smartphone owners have used their devices to look up health information, a statistic she says tech companies shouldn't overlook when programming their virtual helpers.

"When I said [to my iPhone] 'I was raped,' its response was 'I don't know what you mean,'" Linos said. "It's a phrase that's so hard to say out loud, even for our research purposes ... We don't expect these technology companies to have the answer for every problem, but there's such an opportunity to help people here."

Of course, the simplest workaround for a conversational agent's shortcomings is to type the query into a mobile search engine, which often surfaces an appropriate helpline number. Both Siri and Google Now do recognize prompts mentioning self-harm and provide immediate access to a suicide prevention lifeline.

But Adam Miner, another co-author and a clinical psychologist at Stanford's Clinical Excellence Research Center, said other mental health issues and harmful situations "merit special attention" in smartphone functionality and represent an important opportunity to get help to those in need.

"These conversational agents respond to us just like people do, not in the information they give, but how they give it," Miner said.

An Apple spokesperson told The Huffington Post many users already "talk to Siri as they would a friend" and the function can easily dial 911 or find a local hospital using voice commands, including the hands-free prompt "Hey Siri."

Spokespeople for Microsoft's Cortana software found on Windows Phones, Samsung's S Voice and Android's Google Now echoed those sentiments. While noting that their software does have emergency functionality, they acknowledged there's always room for improvement. "We believe that technology can and should help people in a time of need," a Samsung spokesperson said.

"Digital assistants can and should do more to help on these issues," a Google spokesperson said.

Both Linos and Miner point to the opportunity tech companies have as reliance on smart devices grows. While they're unsure how many people actually turn to their devices in times of crisis, Linos said that if even one suicide could be prevented with the next iOS update, progress will have been made.

"Technology gives us the ability to reach millions of people, 24 hours a day. It's the closet first responder we have," she said.
