Computers Can Now Read Our Body Language

Scientists have rigged up computers to detect boredom based on movement.
In the future, this man's laptop might be able to tell that he's bored. (DTP via Getty Images)

Bored with the Excel spreadsheet or work emails on the screen in front of you? Your laptop will one day be able to tell, according to new research.

Researchers have developed a program that allows a computer to read your body language and determine whether you're bored by or interested in what you see on the screen.

The University of Sussex findings, published Tuesday in the journal Frontiers in Psychology, showed that measuring a person's involuntary movements as they use a computer can reveal whether they are absorbed in what they're doing.

Involuntary movements are the tiny micro-movements we make constantly. They include movements of the neck, face and limbs -- such as fidgeting with the hands, crossing the legs, shifting the gaze and turning the head.

If we're bored or unengaged, we'll make more of these movements. When we're interested in what we're doing -- which the researchers refer to as "rapt engagement" -- we make fewer involuntary movements.

"It's the same as when a small child, who is normally constantly on the go, stares gaping at cartoons on the television without moving a muscle," Dr. Harry Witchel, a body language expert at Brighton and Sussex Medical School and the study's lead author, explained in a statement.

For the research, 27 healthy participants sat in front of a computer with speakers. Over the course of three minutes, they were presented with stimuli ranging from enthralling to utterly boring -- from "fascinating games to tedious readings from EU banking regulation."

Meanwhile, the researchers tracked the participants' involuntary hand movements over the three-minute period with a handheld sensor device, then measured and analyzed the movements using video motion tracking.

They also compared involuntary hand movements during two reading tasks, one engaging and one non-engaging, and found that the engaging task produced 42 percent fewer of them.
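The paper's actual motion-tracking pipeline is more involved than this, but the basic idea can be sketched in a few lines: sum how far a tracked point (such as a hand) drifts from frame to frame, then compare the totals for a boring task and an engaging one. The function names and numbers below are illustrative assumptions, not the study's data.

```python
# Rough sketch (not the researchers' actual pipeline): quantify movement from
# motion-tracking data by summing frame-to-frame displacement of a tracked point,
# then compare an engaging condition against a boring one.

from typing import List, Tuple

def total_movement(positions: List[Tuple[float, float]]) -> float:
    """Sum of frame-to-frame Euclidean displacements of a tracked point (e.g. a hand)."""
    total = 0.0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        total += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return total

def percent_reduction(boring: float, engaging: float) -> float:
    """How much less movement occurred in the engaging condition, as a percentage."""
    return 100.0 * (boring - engaging) / boring

# Toy numbers only: a 42 percent drop like the one reported would look like this.
boring_total = 1000.0    # arbitrary units of summed displacement
engaging_total = 580.0
print(f"{percent_reduction(boring_total, engaging_total):.0f}% fewer movements")  # -> 42% fewer movements
```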

"People naturally make involuntary movements about once every 10 seconds -- this is the background 'movement of life,'" Witchel said. "What we showed is that when people are engaged with what is on a computer screen -- particularly when they are interested -- they inhibit their non-instrumental movements. People even inhibit their leg movements."

These findings could help computers to better detect a person's engagement level, potentially leading to improvements in digital learning. For instance, a tutoring program that adapts to the user's interest level could change its strategies when it detects the student becoming disengaged.
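As a rough illustration of how such a tutoring program might use this signal -- not anything described in the study -- the sketch below assumes a baseline of about one involuntary movement every 10 seconds (Witchel's "movement of life") and switches strategy when a student's movement rate is no longer being inhibited below that baseline. The class name and threshold are hypothetical.

```python
# Hypothetical sketch of an adaptive tutoring program reacting to micro-movement rate.
class AdaptiveTutor:
    BASELINE_RATE = 1 / 10.0  # the "movement of life": roughly one involuntary movement per 10 seconds

    def is_disengaged(self, movement_count: int, window_seconds: float) -> bool:
        # Engaged users inhibit movement below the baseline; at or above it,
        # treat the student as drifting off (a simplifying assumption).
        return (movement_count / window_seconds) >= self.BASELINE_RATE

    def next_step(self, movement_count: int, window_seconds: float) -> str:
        if self.is_disengaged(movement_count, window_seconds):
            return "switch strategy: shorter exercise, new example, or a quick quiz"
        return "continue current lesson"

tutor = AdaptiveTutor()
print(tutor.next_step(movement_count=5, window_seconds=30))  # above baseline -> adapt
print(tutor.next_step(movement_count=1, window_seconds=30))  # movement inhibited -> continue
```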

The findings also have important implications for the development of artificial intelligence. According to Witchel, the research could one day help scientists to create more empathetic robots that are able to react to a person's level of engagement.

"Such robots or technologies will have to be able to read and respond to the emotions of the humans they interact with," he said. "Part of that emotion-recognition will depend on posture and body language."
