Emotions Analytics to Transform Human-Machine Interaction

Our devices are quite smart. They know what we type and touch, what we say and where we are; they even know what we look like. But they are quite clueless when it comes to how we feel and what we mean. This still-absent bond between humans and machines is also a chief theme of Gartner's 2013 Hype Cycle for Emerging Technologies, which suggests that "machines are becoming better at understanding humans and the environment -- for example, recognizing the emotion in a person's voice." We can all agree that human emotions are complicated, and arguably, the human voice might be the most personal and revealing "emotional designator." But for now, this remains the next big revolution waiting to happen -- the most important, non-existent interface out there. Or is it?

We all know that words alone don't always tell the whole story. There is no doubt that to truly understand one another, we need to reach beyond the verbal. In many cases, it's not what we say, but how we say it. We know this intuitively, and studies in neuropsychology over the last 50 years have demonstrated that body language and vocal intonation have a bigger impact than the actual choice of words.

When you first meet someone, less than 10 seconds after he or she starts talking, you've already formed an opinion about this person. As reported by Carol Kinsey Goman at Forbes, researchers from NYU found that it takes just seven seconds to make a first impression. Watch the versatile Bryan Cranston in the AMC drama Breaking Bad -- his voice alternates between the feeble, stomped-upon schoolteacher Walter White and the meth-kingpin persona known as "Heisenberg" -- and you instinctively know which one to steer clear of.

"Emotions Analytics" is a new field that focuses on identifying and analyzing the full spectrum of human emotions including mood, attitude and emotional personality. Now imagine Emotions Analytics embedded inside mobile applications and devices -- opening up a new dimension of human-machine interfaces. Picture machines (and their users) -- finally -- understanding who we are, how we feel and what we really mean. Can you envision a world, where people are also more in touch with their own emotional sides? We believe this world is practically just around the corner.

5 Steps to Emotionally Power Your App

1. Rethink your entire app. Emotions Analytics changes both the flow and the value proposition of the app; it will require a total redesign. In the mid-'60s, E.A. Johnson at the Royal Radar Establishment in Malvern, UK, created the first touch screen, a breakthrough that has since sparked completely new user experiences via a plethora of innovative applications. Within just five years, the Apple App Store has grown to include close to 900,000 mobile apps.

The introduction of the gyroscope built into smartphones further revolutionized mobile gaming, which, in turn, is now being introduced into Google Glass. In all these examples, those who understood the value of the new interface, resisted the urge to make quick fixes to their current solutions and got it right made it big.

We think that introducing Emotions Analytics promises an equally game-changing impact. It's more than adding a few extra features -- we see a totally revised and emotionally aware interface. And with a fresh interface, you also need to rethink your value proposition. Moreover, imagine the adoption and market-share potential across industries -- likely to further drive up Gartner's estimate of 300 billion mobile app downloads annually by 2016.

2. Words are overrated. Cognitive language is a poor emotional yardstick, yet most of the sentiment-analysis industry is focused on words. Think of emotions as your car's spark plugs -- small and hidden, but responsible for the combustion that ultimately powers the car. Similarly, emotions summon the words in your prefrontal cortex; we dress them up by applying cultural filters and social norms and run them through our personalized cognition. The result is an almost indistinguishable mix in which emotions are just a small and diluted component.

Speaking of "communications of feelings and attitudes," the widely quoted formula of nonverbal-communications pioneer Albert Mehrabian in "Silent Messages" suggests that only 7 percent of our communicational impact pertaining to feelings and attitudes is based on verbal language; the bulk is delivered by body language and vocal modulations (55 percent and 38 percent, respectively). Our intonations are literally tuned by our emotions -- happiness or sadness, excitement or depression, anger or anxiety. Free from language, the music of our vocal expression is universal and rings true across races and cultures. And not just for humans -- just think of the family dog.

Ironically, most sentiment-analysis solutions are focused on figuring out that 7 percent, with mixed results. One can, of course, choose to use an MRI brain scan to crack the mystery of human language. Using MRI, Dr. Sophie Scott at University College London has done just that, showing how the brain takes speech and separates it into words and "melody": her studies suggest words are shunted over to the left temporal lobe for processing, while the melody is channeled to the right side of the brain, a region more stimulated by music. Interesting as it may be, donning a Lady Gaga-like contraption on our heads to identify emotions in everyday conversations would certainly not meet with "applause."

3. It's not what you say, but how you say it. Emotions live mainly in intonation and body language; it's the way we are wired. You might have heard the French saying "c'est le ton qui fait la musique," which translates to "it is the tone that makes the music" -- it's not what you say but how you say it. It's the combination of our vocal modulations and body posture (both activated by the same part of our brain that produces emotions), together with the memory that the listener holds for that melody, that makes the person we are talking to either listen up or shut down. This is why we can quickly detect the deteriorating emotion fueling Walter White's growls in Breaking Bad, or reflect on the poor fate of Dirty Harry's antagonists when Clint pops the question, "Do I feel lucky?"
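To show what that "melody" looks like to a machine, here is a minimal sketch, assuming only the numpy library, that estimates a per-frame pitch contour via autocorrelation. Production emotions-analytics engines use far richer intonation features, so treat this purely as an illustration.

import numpy as np

def pitch_contour(signal, sample_rate, frame_ms=40, fmin=75, fmax=400):
    # Estimate the fundamental frequency of each frame via autocorrelation.
    # The resulting sequence of F0 values is a crude stand-in for the
    # "melody" of speech that carries emotional intonation.
    frame_len = int(sample_rate * frame_ms / 1000)
    lags = np.arange(int(sample_rate / fmax), int(sample_rate / fmin))
    contour = []
    for start in range(0, len(signal) - frame_len, frame_len):
        frame = signal[start:start + frame_len]
        frame = frame - frame.mean()
        ac = np.correlate(frame, frame, mode="full")[frame_len - 1:]
        if ac[0] <= 0:
            contour.append(0.0)  # silent frame: no detectable pitch
            continue
        best_lag = lags[np.argmax(ac[lags])]
        contour.append(sample_rate / best_lag)
    return contour

# Example: a rising tone, like the upward inflection of excitement.
sr = 16000
t = np.arange(sr) / sr
rising = np.sin(2 * np.pi * (120 + 120 * t) * t)  # sweeps upward in pitch
print([round(f) for f in pitch_contour(rising, sr)][:5])

Roughly speaking, a contour that climbs or swings widely reads as excitement, while a flat, slow one reads as the low-energy monotone we instinctively associate with a bad day.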

Vocal communication and body language go hand in hand. Posture, gait and facial expression further broadcast what's happening inside of us. Sue Shellenbarger, writer of the WSJ's "Work & Family" column, recently reported that new research shows posture has a bigger impact on body and mind than we might think: a "powerful and expansive pose actually changes a person's hormones and behavior." So striking a pose might also enhance your impact on your listeners. In other words (literally), it looks like both can become great emotional yardsticks -- far superior to language and text. In principle they are. But reality is different.

4. Looking is good. Listening is better. Gaining understanding from the visual clues offered by body language and facial expressions is highly complex. Researchers at the Hebrew University of Jerusalem, New York University and Princeton University found that the body language of tennis players provided better cues than their facial expressions when judging whether an observed player had "undergone strong positive or negative experiences."

Nonetheless, we are visually wired. Facial-recognition apps are everywhere, albeit with a catch -- getting your subject to gaze directly into the camera while guaranteeing perfect lighting and adequate shading is tricky. Moreover, the "subject" is aware and might "turn it on," knowing he is on camera. Wouldn't you?

We are surrounded by voice. It's present in everything we do -- even when we pay no attention, talking while letting our minds drift. Listening is effortless and so much easier, as we do it naturally; even in the dark, we have no problem listening, while a camera might. Moreover, we can intuitively discern a speaker's authenticity -- even better in dim light. Our ears listen for those "vocal indicators" and literally perform an auditory double-take when we hear an increase in speech rate or a decrease in pause duration, which are powerful indicators of mood improvement in patients suffering from depression, for example[1]. We are wired to detect the emotions "resonating" in the voice of the person we are speaking with. And since emotions -- positive and/or negative -- are paramount to our decision-making processes, applying a vocal-intonation-driven solution is both easier and superior. Translated into your daily life: wouldn't it be so much more enriching to develop apps and prime machines to do what we do intuitively all of the time?
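As an illustration of those two indicators, here is a minimal sketch, assuming a mono audio array and only the numpy library, that separates speech from pauses by simple energy thresholding; real systems are far more sophisticated, and the threshold here is an arbitrary assumption.

import numpy as np

def pause_stats(signal, sample_rate, frame_ms=30, energy_thresh=0.02):
    # Split the signal into short frames and label each as speech or pause
    # by its root-mean-square energy. Tracking how the pause ratio changes
    # over time is one crude vocal indicator of mood.
    frame_len = int(sample_rate * frame_ms / 1000)
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt(np.mean(frames.astype(float) ** 2, axis=1))
    is_speech = rms > energy_thresh  # arbitrary illustrative threshold
    speech_sec = is_speech.sum() * frame_ms / 1000
    pause_sec = (~is_speech).sum() * frame_ms / 1000
    return {"speech_seconds": speech_sec,
            "pause_seconds": pause_sec,
            "pause_ratio": pause_sec / max(speech_sec + pause_sec, 1e-9)}

# Example: two seconds of synthetic "speech" followed by one second of silence.
sr = 16000
voiced = 0.1 * np.sin(2 * np.pi * 150 * np.arange(2 * sr) / sr)
print(pause_stats(np.concatenate([voiced, np.zeros(sr)]), sr))

On real recordings, a falling pause ratio and a rising speech rate across sessions would be the kind of signal the depression research describes.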

5. Think bold. Net sum: Emotions Analytics allows you to gain deeper context and meaning in everything you do. Emotions Analytics can transform business, games, media and marketing, and impact your health, your dating life and much more -- no matter where you (and your customers) are and no matter what language they speak. In life, it's not about what you say, but how you say it. Imagine this initial layer of emotional understanding embedded in every voice-assisted and voice-enabled device, allowing you to gain a better hold of context and meaning in practically everything you do. The potential is boundless and the probable deployments and applications are countless. Could this really be the "ultimate technology"?

Thinking bold -- how would you use it? Where would you expect it? Get creative. Don't try to conform to existing paradigms. Elevate your app. Light up your mobile app's emotional blind spot with Emotions Analytics that cracks the code of the human emotional dimension. Let's transform how we interact with machines and with each other.
