By Larry Greenemeier
In popular fiction, humanoid robots have no rhythm—look no further than the “robot dance” for evidence of this. Yet rhythm—or the neurophysiological processes that enable humans to produce patterns of recurring movement—is the key to creating bots that move more like people. So says a team of University of Arizona engineers who claim to have built a set of robotic legs that mimic the human gait better than any other artificial life form to date.
Indeed, the video above makes a compelling case. Although the gait is a bit stiff, the robot legs flex and even have some swagger. M. Anthony Lewis, director of Arizona’s Robotics and Neural Systems Laboratory, and Theresa Klein, a Ph.D. student at the lab, are publishing a study on Friday in the Journal of Neural Engineering detailing how they were able to accomplish this.
Like many roboticists, Lewis and Klein looked to nature for inspiration. Humans have a central pattern generator (CPG) in the lumbar region of the spinal cord, a neural network that produces the rhythmic signals driving the step cycle of locomotion. The CPG creates and controls these signals based on feedback from the legs, which report, for example, the slope and solidity of the surface underfoot.
Lewis and Klein’s robot features the simplest form of a CPG: just two neurons that fire alternately to produce a rhythm, plus load sensors that measure the force in each limb as it presses against the stepping surface. This setup resembles the neural mechanism by which human babies learn to walk: a pair of neurons lets their little legs settle into rhythm with practice.
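The alternating two-neuron rhythm can be sketched in a few lines of code. The following is a minimal simulation of a Matsuoka-style "half-center" oscillator, a standard textbook model of this kind of circuit; the parameters and code are illustrative and are not taken from Lewis and Klein's paper. Two mutually inhibiting neurons, each with a slow fatigue variable, settle into an alternating burst pattern much like the left-right alternation of walking.

```python
# Minimal half-center oscillator sketch (illustrative; not the authors' code).
# Each neuron inhibits the other; a slow "fatigue" (adaptation) variable
# eventually silences the active neuron, letting its partner take over.

def simulate_cpg(steps=8000, dt=0.001):
    tau, tau_v = 0.05, 0.6            # fast membrane vs. slow fatigue time constants (s)
    beta, w, drive = 2.5, 2.0, 1.0    # fatigue gain, mutual inhibition, tonic input
    x = [0.1, -0.1]                   # membrane states; asymmetry breaks the initial tie
    v = [0.0, 0.0]                    # fatigue states
    outputs = []
    for _ in range(steps):
        y = [max(0.0, xi) for xi in x]          # rectified firing rates
        for i in (0, 1):
            j = 1 - i
            dx = (-x[i] - beta * v[i] - w * y[j] + drive) / tau
            dv = (-v[i] + y[i]) / tau_v
            x[i] += dx * dt                     # forward-Euler integration
            v[i] += dv * dt
        outputs.append((y[0], y[1]))
    return outputs

if __name__ == "__main__":
    ys = simulate_cpg()
    # Over the run, each neuron leads at different times: the pair alternates
    # rather than one side winning outright.
    print(any(a > b for a, b in ys), any(b > a for a, b in ys))
```

With the inhibition weight above the instability threshold but below the winner-take-all regime, neither neuron can stay active forever, so the only stable behavior is alternation.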
Each leg of the university’s robot consists of a hip, knee and ankle driven by nine muscle actuators. Muscle contraction is mimicked by motors that rotate to pull on Kevlar straps. Each strap carries a load sensor that plays the role of a tendon in a human leg, sensing tension when the muscle contracts and reporting how much force is being exerted, and where.
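Load sensing matters because, in animal locomotion studies, limb loading gates the transitions of the step cycle: a leg does not begin its forward swing until it has been unloaded and the hip is extended behind the body. A toy version of that gating rule might look like the following; the function name, thresholds and sign convention are hypothetical, not taken from the paper.

```python
# Illustrative stance-to-swing gate (hypothetical thresholds, not from the paper).
# A leg swings forward only once it bears little weight and the hip is extended;
# load sensors let a robot apply the same rule animals appear to use.

def may_start_swing(load_newtons: float, hip_angle_deg: float,
                    unload_threshold: float = 20.0,
                    hip_extension_deg: float = -10.0) -> bool:
    """Return True when the leg is unloaded AND the hip is extended past the
    threshold (negative angles mean extension behind the body here)."""
    unloaded = load_newtons < unload_threshold
    extended = hip_angle_deg <= hip_extension_deg
    return unloaded and extended

# A loaded stance leg keeps supporting the body...
assert not may_start_swing(load_newtons=300.0, hip_angle_deg=-15.0)
# ...and swings only once body weight has shifted onto the other leg.
assert may_start_swing(load_newtons=5.0, hip_angle_deg=-15.0)
```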
Of course one of the primary goals of this research is to create more human-like movement in robots. But the researchers also hope their work helps better explain how humans walk and how spinal-cord-injury patients can recover the ability to walk if properly stimulated in the months following their injury.
Images and video courtesy of the University of Arizona
Meet Jules, the newest and most realistic humanoid robot yet from David Hanson and the team at Hanson Robotics.
A robot that looks just like its creator (www.newscientist.com).
Engineers at Kagawa University in Japan are developing a talking robotic version of the human mouth. To give the robot its speaking ability, they used an air pump, artificial vocal cords, a resonance tube, a nasal cavity and a microphone attached to a sound analyzer as substitutes for the human vocal organs.
ACTROID-F at the AIST Open Lab, 2010.
Robot modeled after Albert Einstein. Einstein mimics the facial expressions he detects in others. Smile at him, and he'll smile back.
Cybernetic human dance demo at DCEXPO, 2010.
Humanoid face created by Hanson Robotics (www.hansonrobotics.com). Robotics scientists at Hanson previously created animatronic puppets for Disney studios.
Animatronic baby mechanism for anonymous TV series. Built by Chris Clarke for CNFX Workshop.
Taiwanese kissing robots (NTUST Robot) were exhibited at AutoRob2009 in Gwangju, Korea. They were developed by Prof. Chyi-Yeu Lin's research team at the National Taiwan University of Science and Technology.
Robot girl with silicone skin.