Imagine Machines That Can See
Mark Baard 06.04.03

Joseph Ayers' invention is a biomimetic robot lobster. He hopes it will be able to vary the levels of chaos in its neural network so that it can complete complex tasks like clearing minefields or sniffing out dangerous substances.

BOSTON -- Robotics experts are turning to nature for guidance in making machines that see, hear, smell and move like living creatures.

Inspired by the neurobiology of small animals, they're learning to make robot lobsters and other critters that might be able to clear minefields or sniff out dangerous substances.

But mimicking lobsters and bugs is one thing. Making robots that can match the intelligence and physical agility of humans is quite another.

Scientists are working in the emerging field of biomimetics, in which machines are designed to function like biological systems. They have only the foggiest idea of how the human brain perceives and acts on information from the body's sense organs, even though they've known the mechanics of those organs for many years.

"We have computer models of how the vision works in the (primary visual cortex)," said Galle Desbordes, a researcher at the Active Perception Lab at Boston University. "Beyond that, everything becomes quite a bit more mysterious."

Still, the Active Perception Lab is applying some new knowledge about human vision to a system that will provide valuable 3-D visual information to robots.

The system imitates small eye movements that humans use to gather information about objects in their visual fields.

"The system," said the lab's director, Michele Rucci, "can be used by robots for depth perception, which will help them better navigate and manipulate objects within their environments."

Iguana Robotics' bot walks with sight.

Rucci and Desbordes used computers and an eye-tracking device to confirm that the slight jittering of the eyes contributes not only to the gathering of three-dimensional information in the human brain, but to overall visual sensitivity as well. By using the eye tracker to shift an on-screen image within 1 millisecond of each eye movement, effectively stabilizing it on the retina, they found that visual sensitivity declined by as much as 20 percent when the small eye movements were canceled out.
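
The logic of that stabilization experiment can be sketched in a few lines. This toy simulation (the stimulus and jitter statistics are invented, not the lab's actual setup) shows why stabilization matters: jitter converts a static spatial pattern into a temporal signal at each retinal location, and canceling the jitter removes that signal:

```python
import numpy as np

rng = np.random.default_rng(0)

x = np.linspace(0, 2 * np.pi, 1000)
grating = np.sin(5 * x)                        # a static luminance pattern

jitter = np.cumsum(rng.normal(0, 0.002, 500))  # random-walk eye position
receptor = 500                                 # one retinal location

# Luminance seen by that receptor as the eye jitters over the pattern:
moving = np.interp(x[receptor] + jitter, x, grating)

# "Stabilized" condition: the display follows the eye, so the retinal
# image never moves and the receptor sees a constant value.
stabilized = np.full_like(moving, grating[receptor])

print("temporal modulation with jitter:", moving.std())      # nonzero
print("temporal modulation stabilized:", stabilized.std())   # zero
```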

The Active Perception Lab presented its findings at last week's Conference on Cognitive and Neural Systems, a meeting of cognitive and neural scientists and roboticists sponsored by Boston University's Department of Cognitive and Neural Systems and Center for Adaptive Systems, or CNS, and the Office of Naval Research.

M. Anthony Lewis, another researcher who attended the conference, is trying to teach robots to respond in a more natural way to obstacles in their environments.

"Getting limbs to behave without conscious thought and under visual guidance, as they do in humans, remains a challenge," said Lewis, CEO of Iguana Robotics. The company is building a walking robot that runs on a network of artificial neurons, densely packed computer chips that can process data more quickly than conventional chips.

Iguana's robot uses a navigation system that mimics the way human beings guide their movements by sight. For example, the robot senses an object it trips over, associates the bump with an image of the object, and remembers to step over it the next time.

"Where the robot bumps into something is where the learning should take place," Lewis said.

Unlike conventional robotic designs, which specify where a robot should be at each moment in its trajectory, Iguana's robot stumbles around and learns from its environment.
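
One plausible reading of that learning rule, sketched here with an invented BumpLearner class and made-up feature vectors (Iguana's actual controller runs on neural hardware, not Python), is to store the view associated with each bump and step higher whenever a similar view reappears:

```python
import numpy as np

class BumpLearner:
    def __init__(self, match_threshold=0.9):
        self.obstacle_memory = []            # views seen at past bumps
        self.match_threshold = match_threshold

    def see(self, feature):
        """Lift the foot higher if the current view resembles
        something the robot once tripped over."""
        for remembered in self.obstacle_memory:
            similarity = feature @ remembered / (
                np.linalg.norm(feature) * np.linalg.norm(remembered))
            if similarity > self.match_threshold:
                return "high_step"
        return "normal_step"

    def bump(self, feature):
        """Where the robot bumps into something is where learning happens."""
        self.obstacle_memory.append(feature)

robot = BumpLearner()
curb = np.array([0.9, 0.1, 0.4])   # an illustrative visual feature
print(robot.see(curb))   # normal_step: nothing learned yet
robot.bump(curb)         # tripped; associate the view with the bump
print(robot.see(curb))   # high_step: steps over it the next time
```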

Walking bot learns from mistakes.

"It is similar to when you stumble, or slip or activate any of those low-level reflexes that keep you walking when you encounter something unexpected," Lewis said.

He hopes Iguana can make a robot that will be able to adapt spontaneously to any situation.

"The robot should be able to run into a burning building, climb in dangerous regions to bring in medical supplies or be able to hang out at grandma's and take her for her morning stroll," he said.

That day may come, but probably not right away. Robots will have to think very quickly -- indeed chaotically -- to extricate themselves safely from fires, or perhaps to get away from grandma after her morning constitutional.

Robotic Lobster's Think Tank.

Conventional robots are deterministic and tend to bump into the same obstacle over and over until their batteries run out. But animals vary their movements in an attempt to avoid repeating their mistakes. An animal trapped in a box, for example, might scratch and gnaw and flail against all of the box's surfaces until it happens upon the best way out.
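
The contrast is easy to caricature in code. In this toy escape problem (the walls and action set are invented for illustration), a controller that always repeats the same move never gets out, while one that varies its movements stumbles onto the exit:

```python
import random

random.seed(1)

WALLS = {"forward"}              # the one move that always hits the wall
ACTIONS = ["forward", "left", "right", "back"]

def try_to_escape(pick_action, tries=20):
    """Return the attempt number on which an unblocked move is found."""
    for t in range(1, tries + 1):
        if pick_action() not in WALLS:   # any unblocked move escapes
            return t
    return None                          # still stuck when we gave up

# A deterministic controller repeats its one failed move forever:
print(try_to_escape(lambda: "forward"))                # None
# Varying the movement finds a way out within a few tries:
print(try_to_escape(lambda: random.choice(ACTIONS)))   # a small number
```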

"The difference between robots and animals is that if we get stuck, we can wriggle out of it," said Joseph Ayers, director of the Biomimetic Underwater Robot Program at Northeastern University and co-editor of Neurotechnology for Biomimetic Robots.

Ayers is on sabbatical at the Institute for Nonlinear Science at the University of California at San Diego, where he is trying to give his own invention, a biomimetic robot lobster, the ability to vary the levels of chaos in its neural network.
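
What "varying the levels of chaos" might mean can be illustrated with a textbook stand-in: the logistic map, whose single parameter tunes it between regular and chaotic behavior. This is not Ayers' electronic neural network, just a sketch of the kind of knob he describes:

```python
import numpy as np

def logistic_series(r, x0=0.3, n=200):
    """Iterate x -> r * x * (1 - x); r sets the level of chaos."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return np.array(xs)

regular = logistic_series(r=2.8)   # settles into a fixed rhythm
chaotic = logistic_series(r=3.9)   # never repeats: varied "movements"

# The spread of recent values distinguishes the two regimes:
print("regular tail std:", regular[-50:].std())   # near zero
print("chaotic tail std:", chaotic[-50:].std())   # large
```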

"Robots need this ability," Ayers said. "Because if they can't do this out in the real world, they're toast."