Many years ago, I was fascinated by a program named Dr. Sbaitso, an offspring of the famous ELIZA pattern-matching program developed at MIT in the '60s. Dr. Sbaitso was a simple natural-language text-to-speech program that pretended to be a psychotherapist. The kind doctor would mostly rephrase any input into a question, so if I'd write "I hate my boss," it would reply, "Why do you think you hate your boss?" or "Are there other people that you hate?" Download it here and see why ELIZA and other chatbots managed to pass the basic Turing test, in which subjects believed they were conversing with a real person.
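The rephrasing trick described above can be sketched in a few lines. This is an illustrative toy in the spirit of ELIZA, not the original program: each rule pairs a pattern with response templates, and captured text is reflected back with pronouns swapped.

```python
import random
import re

# Each rule: a regex that matches user input, plus response templates.
# The captured fragment is substituted into the chosen template.
RULES = [
    (re.compile(r"i hate (.+)", re.IGNORECASE),
     ["Why do you think you hate {0}?",
      "Are there other things that you hate?"]),
    (re.compile(r"i feel (.+)", re.IGNORECASE),
     ["How long have you felt {0}?"]),
]

# Swap first-person words for second-person ones, so the fragment
# "my boss" reflects back as "your boss".
PRONOUNS = {"my": "your", "i": "you", "me": "you", "am": "are"}

def reflect(fragment):
    return " ".join(PRONOUNS.get(w.lower(), w) for w in fragment.split())

def respond(sentence):
    for pattern, templates in RULES:
        match = pattern.search(sentence)
        if match:
            return random.choice(templates).format(reflect(match.group(1)))
    return "Please tell me more."  # fallback when no rule matches

print(respond("I hate my boss"))
```

Running it with "I hate my boss" produces one of the doctor's two stock questions; anything the rules don't recognize gets the generic prompt, which is essentially how ELIZA kept conversations going.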
We live in an era where communicating with machines, like the one you're reading this blog on, is second nature. As with our friends and loved ones, our machines can bring out the best and the worst in us: we've all wanted to do something stupid to our smartphone, hated the nice lady inside our GPS device with a vengeance, found ourselves talking fondly to our laptop, or gotten addicted to a game or some other nice piece of code. We also know that we don't really need a human face for communication to become emotional; think of the texts and emoticons we trade with people we've never met on forums, in chats, or on social networks -- messages about our lives that sometimes carry more emotion and generate more relief than the means available to us in "real life." Come to think of it, it's easy to imagine that some of our Facebook "friends" are actually natural language programs.
Cognitive scientist Marvin Lee Minsky argues that emotions are actually a way of thinking -- our very human way of thinking. It's little surprise, then, that machines and our interfaces with them are carefully designed to evoke our emotions. They achieve this by showing "empathy" (via positive feedback) and by establishing a "relationship" with us, the users. We now want our machines to "recognize" us and "listen" to us, to "understand" and proactively "help" with what we want and need, and modern devices are beginning to carry just enough sensors and computing power to start delivering all that. "Siri" is an attempt at creating a "personality" (mainly through humor), and a line of new robot toys, pets, and assistants for the elderly and post-traumatic patients is another small beginning, signaling a wave of new artificial intelligence (AI) machines that will actively employ emotions when communicating with us humans.
We're not fantasizing about the AI usually associated with science fiction, known as "strong AI" (artificial intelligence that matches or exceeds human intelligence and is conscious). Our "weak AI" (machines that can demonstrate intelligence but do not necessarily have a mind, mental states, or consciousness) is already very common and responsible for many aspects of our lives, from weather forecasting through algorithmic trading and surveillance, all the way to my video games.
With every accumulated petabyte of research and scientific knowledge, it seems more and more that we ourselves are a sort of "machine," physically and perhaps also emotionally. True, we are endlessly complex and still beyond our own understanding, but whether you believe we were created by a god or by nature, there's certainly a design -- a magnificent blueprint. We are sometimes reluctant to acknowledge it, but although we're all singular entities, we're also very similar in many ways; many of our behaviors and issues can be "categorized." All colors are made from three basic ones, and perhaps there's a similar, though more complex, hierarchy to our emotions.
If this is true, it's just a matter of time before machines can really help us improve our emotional well-being. As math and computer science, machine learning and algo-art, hardware and cognitive science exponentially accelerate toward each other, it will soon become possible for emotional intelligence to be augmented by "artificial emotional intelligence." This new AEI will not need to be aware or conscious; the amount of data already available about us, combined with clever questionnaires, can be used to create a basic personality analysis. The AEI algorithms will then be able to follow our behavior, "read" our emotions, and react to them in a very complex way: be supportive when needed, or opinionated and even confrontational where appropriate, and yes, perhaps even insightful. Such an entity could be a great listener and provide some of the fundamental value of talk therapy, and with time, much more than that. A similar approach is already being applied by IBM with "Watson" -- the program that won "Jeopardy!" is now going through medical school, and when it "graduates" it will offer its diagnoses and prognoses to physicians.
The possibilities and implications are beautiful and plentiful. Think of a therapist that is "with you" whenever and wherever you need it/him/her/whatever. It almost sounds obvious for such a machine to practice positive psychology by constantly focusing us on our strengths. Or think of a machine that can help autistic children interpret and understand facial expressions, of veterans who never get left behind with their PTSD, of stress that can be alleviated in real time with the help of a friendly machine. Think of a time when many of our daily emotional challenges can be dealt with and contained before they escalate into something much more painful and potentially harmful.
Perhaps the fundamental question is whether a machine can ever really care for us, or more precisely, can we really ever feel loved by a machine? Therapy helps us because it allows us to feel we're accepted for who we are despite our faults. Simply put, a good therapist will care for us, accept and even love us, in her or his own way, and thus allow us to accept and love ourselves a little more. I don't know if we can ever achieve such a relationship without a human connection. On the other hand, I can definitely imagine a contextual AI machine that "lives" in our mobile device and knows so much about us that it can actually sense us: On a Monday morning, when our self-esteem hits bottom, it will remind us of our professional achievements. If we feel lonely on Valentine's Day (not a very complicated algorithm), it can help us fondly recall the meaningful relationship we recently had and promise us it will all be all right.
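The "not very complicated algorithm" above might look something like this toy sketch. The function name, inputs, and rule are my own hypothetical illustration of how simple such a contextual trigger could be:

```python
from datetime import date

def needs_extra_support(today, relationship_status):
    """Hypothetical rule: flag Valentine's Day for a user who is single,
    so the machine knows to offer a kind word."""
    is_valentines = (today.month, today.day) == (2, 14)
    return is_valentines and relationship_status == "single"

print(needs_extra_support(date(2024, 2, 14), "single"))   # → True
print(needs_extra_support(date(2024, 6, 1), "single"))    # → False
```

Of course, sensing a Monday-morning dip in self-esteem would take far richer signals than a calendar lookup; the point is only that the first steps toward such contextual empathy can be almost trivially simple.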
Isn't that somewhat human? A little bit like love?
Oren Frank is the co-founder and CEO of www.talktala.com.