This article surveys research and development efforts aimed at building agile, bioinspired robots that borrow unusual animal senses, from the electric fields of the black ghost knifefish to the ultraviolet vision of desert ants.
In the desperate weeks after the Deepwater Horizon drilling platform exploded, as thousands of barrels of oil spewed from a mile-deep well into the Gulf of Mexico, BP sent down remotely operated vehicles and tried one strategy after another to try to stanch the flow. And for two months, those efforts failed—in part because the people piloting those ROVs could not see what they were doing.
Not only did oil cloud the water, but every time the pilots tried to maneuver the ROVs, the thrusters stirred up sea-floor sediment. “The ROVs were so difficult to maneuver that they sometimes slammed into the oil wellhead and delayed capping operations for weeks at a time,” said Malcolm MacIver, a professor of mechanical engineering and neurobiology at Northwestern University. “There is a very big need in technology for vehicles that can move with higher agility.”
Armed with extraordinary agility and electrical sensors that show the location of insect larvae, the black ghost knifefish hunts at night.
MacIver is building a prototype of such an agile ROV in his laboratory. But he's not designing it from scratch. Nature, he said, has already provided the model, in the form of a dusky and decidedly carnivorous fish that not only moves through the water differently from most other fish but also senses the world differently.
Animals perceive the world in ways we cannot. Bald eagles can see fish in the water from several hundred feet above. A great gray owl can hear rodents scurrying under two feet of snow. Bloodhounds can smell so well that what they sniff out is admissible in court.
But some animals use senses we don’t even have, senses that could give ROVs and robots much needed new abilities. Fish detect flowing water with a lateral line sensing system, which senses movement and pressure changes as water moves by. Flies have a biological gyroscope that gives them exquisite balance and responsiveness in mid-air. And the black ghost knifefish, which lives in murky streams in the Amazon, uses electricity to navigate its world.
The knifefish, on which MacIver works, emerges at night to hunt for insect larvae and small crustaceans. It emits an electrical field around its body. Nearby objects alter that field. The fish uses thousands of receptors on its skin to detect the changes, forming what amounts to an electrical image of the object. This, along with its phenomenal agility in the water, enables the knifefish to avoid head-on collisions and nab small critters rather than small rocks.
MacIver aims to build an equally agile swimming robot with similar electrosensing skills, which he said will let it do tasks no underwater robot can manage today. If he succeeds, new robots could scope out deep-sea oil wells, help police divers find bodies in murky waters, swim fish-like through sewer pipes to find leaks, or explore cabins inside a sunken ship.
And that's just the start, said Tom McKenna, who directs the biorobotics program at the Office of Naval Research. “If you’re looking for innovative solutions,” McKenna said, “you have principles available in nature that are the results of eons of evolution and clearly show superior capability.”
In MacIver's lab, a blind robot that resembles a large white lozenge signals a gantry above, which steers it slowly through an underwater obstacle course of closely packed pylons. Then the robot tries again, but this time the water is muddy and opaque. The course mimics the complex underwater environments that the electric fish—and an underwater vehicle—must sometimes navigate. Yet the robot, which uses only active electrosensing and no video cameras, glides smoothly through the maze.
This is MacIver's SensorPod, and it was years in the making.
As an undergraduate, MacIver double-majored in philosophy and computer science, which sparked a burgeoning interest in artificial intelligence. After a stint trying to diagnose jet engine faults for the Canadian government, he decided to pursue a cognitive science doctorate at the University of Illinois, Urbana-Champaign. Two years later, he switched into neuroscience.
That led him to the black ghost knifefish and its extraordinary agility. A knifefish can move forward and backward like other fish, but it can also do full body rolls like a kayaker. What's more, it can swim sideways, as if it were a car that could slide sideways into a tight parking spot.
To understand the basis for such superb water acrobatics, MacIver began examining how knifefish swim. Most fish swim either by undulating their bodies, or by wiggling a pair of fins, one on each side. In contrast, knifefish have a single bottom fin called a ribbon fin along their midline, and they swim by waving it.
MacIver noticed that as the ribbon fin waves, two waves propagate toward each other from each end of the fin. This makes the animals more stable and more agile at the same time. “It turns out to be the secret of the animals’ maneuverability,” MacIver said.
MacIver proved that idea by building a foot-long robotic fish. Called GhostBot, it has an artificial ribbon fin consisting of 32 spines covered with Lycra. The fin undulates just like the knifefish's ribbon fin. The first time MacIver's team put it in a flow tunnel, which laboratory researchers use to observe fish swimming, GhostBot swam forward and backward. The team fine-tuned the frequency, amplitude, and speed of the undulations to better control the fish's swimming speed. But the robot still needed a way to avoid things that go bump in the night.
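The undulating fin described above can be sketched as a simple angle schedule for the spines: a sinusoidal traveling wave, plus the superposition of two counter-propagating waves for hovering. The parameter values and function names below are illustrative, not GhostBot's actual control code.

```python
import math

def spine_angles(t, n_spines=32, amplitude_deg=30.0,
                 freq_hz=3.0, wavelengths_on_fin=2.0):
    """Angle command for each spine of a ribbon fin at time t.

    Traveling wave: angle_i = A * sin(2*pi*(f*t - k*x_i)),
    where x_i is the spine's normalized position along the fin.
    All parameter values are illustrative.
    """
    angles = []
    for i in range(n_spines):
        x = i / (n_spines - 1)  # 0 at one end of the fin, 1 at the other
        phase = 2 * math.pi * (freq_hz * t - wavelengths_on_fin * x)
        angles.append(amplitude_deg * math.sin(phase))
    return angles

def counterprop_angles(t, **kw):
    """Superpose head-to-tail and tail-to-head waves (the knifefish's
    trick for hovering stably while staying ready to dart either way)."""
    fwd = spine_angles(t, **kw)
    rev = list(reversed(spine_angles(t, **kw)))
    return [0.5 * (a + b) for a, b in zip(fwd, rev)]
```

Tuning `freq_hz`, `amplitude_deg`, and `wavelengths_on_fin` plays the same role as the team's fine-tuning of frequency, amplitude, and undulation speed.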
That's where electrosensing came in. Black ghost knifefish employ electricity much as a bat uses sonar. Bats navigate and hunt by shrieking ultrasonic screams that humans can’t hear and listening for echoes bouncing off inanimate objects and flying insects.
The black ghost knifefish instead emits a weak (1 mV), high-frequency (1 kHz) oscillating electric field from a specialized electric organ in its trunk. Each object disrupts the field in a unique way. By combining data from thousands of tiny electroreceptor organs in its skin, its brain forms an electrical image of the object.
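A toy model conveys how such an electrical image forms: the perturbation each skin receptor sees falls off steeply with distance to the object (modeled here with a commonly cited inverse-fourth-power approximation for small objects), so the pattern across the receptor array peaks nearest the object. The receptor count, units, and function names are assumptions for illustration, not a physical simulation.

```python
def electric_image(receptor_xs, obj_x, obj_y, contrast=1.0):
    """Perturbation each receptor along the flank sees from a small
    object at (obj_x, obj_y); ~1/d^4 falloff, illustrative only."""
    image = []
    for rx in receptor_xs:
        d2 = (rx - obj_x) ** 2 + obj_y ** 2
        image.append(contrast / d2 ** 2)  # (d^2)^2 = d^4
    return image

def locate(receptor_xs, image):
    """Estimate object position as the receptor with the peak reading."""
    peak = max(range(len(image)), key=lambda i: image[i])
    return receptor_xs[peak]

# 200 receptors spaced 0.01 body-lengths apart along the flank:
receptors = [i * 0.01 for i in range(200)]
img = electric_image(receptors, obj_x=1.2, obj_y=0.3)
```

The steep falloff is why the fish's electric sense, as the next paragraph notes, reaches only a few centimeters, while still covering every direction at once.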
The fish senses only a few centimeters away from its body, but it senses in every direction, as if it has eyes on the back of its head and the top and bottom of its body. The knifefish also repositions itself to better perceive signals, using its body as a tunable antenna.
“It's like an electrical bat,” MacIver said.
As MacIver thought about how to incorporate electrosense into a swimming robot, he started talking with engineers who build ROVs. ROVs typically use video cameras and sonar to avoid obstacles. Both technologies are good at sensing long distances underwater, but struggle up close, MacIver said. Even short-wave sonar, which works better in tight quarters, fails in water cluttered with objects.
Lighting the way for the video cameras presents other problems. “Huge wattage goes to powering lights so you can see a tiny space in front of the robots,” he said.
Active electrosense would help ROVs and underwater robots overcome all these problems. “That became part of the motivation,” MacIver said.
Active electrosensing is not new to engineering. Physicians diagnose lung cancer with electrical impedance tomography, which injects a small current into the body and uses surface electrodes to pick up abnormalities in the chest cavity. Geologists inject current into the ground and use surface electrodes to detect subsurface features, including saline aquifers and oil reservoirs.
MIT engineers have also built robotic fingers with electrosense that detect an object before the robot hand grasps it. People do this routinely—for example, we position our hand like a “C” around a cup before we grasp it—but most robots today can’t. Active electrosense would let them.
Combine that ability with good contact sensors, which are available today, and robot hands could move with unprecedented deftness. An underwater robot that “sees” with active electrosense could manipulate objects with equal dexterity.
Active electrosense could do more for ROVs and robots, MacIver said. It may soon be able to distinguish living organisms, including divers, from inanimate objects by sensing capacitance—the ability of a material to induce a phase lag between voltage and current—MacIver reported in 2012 at the International Conference on Intelligent Robots and Systems.
Living organisms, and little else in nature, possess this electrical property. Sensing capacitance would be like adding color to black-and-white images, MacIver said.
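The phase-lag idea can be illustrated with basic circuit math: drive a load at roughly the fish's 1 kHz and look at the current's phase relative to the voltage. A purely resistive target (a rock) gives essentially zero lag; a capacitive target (living tissue) shifts the phase. The component values below are illustrative, not measured tissue parameters.

```python
import cmath
import math

def current_phase_deg(r_ohm, c_farad, freq_hz):
    """Phase of the current relative to the drive voltage for a
    parallel RC load. Resistive-only: ~0 degrees. Capacitive:
    current leads voltage by a measurable angle."""
    omega = 2 * math.pi * freq_hz
    z = 1 / (1 / r_ohm + 1j * omega * c_farad)  # impedance of R || C
    # With V at phase 0, I = V / Z, so the current's phase is -angle(Z).
    return math.degrees(-cmath.phase(z))

rock = current_phase_deg(10_000, 0.0, 1_000)      # resistive: ~0 deg
tissue = current_phase_deg(10_000, 50e-9, 1_000)  # capacitive: large lead
```

Thresholding on that phase angle is, in miniature, the living-versus-inanimate discrimination the text describes.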
Over the next few years, MacIver plans to build a robot that combines SensorPod's electrosensing skills and GhostBot's agility. Such a robot would be slow, like the black ghost knifefish, but it would swim forward and backward, roll, and parallel park like the real thing, and it would extend the range of underwater robots to the murkiest of environments. If it is commercialized, it could one day plug pipelines, seal wells, supply frogmen, and find drowned bodies.
While electrosensing could help robots navigate at sea, other unusual animal senses could help them find their way on land. Barbara Webb, a bioroboticist at the University of Edinburgh, started her career exploring artificial intelligence, and soon began building computational models to reproduce aspects of human perception. She quickly gave up on humans and turned to ants.
“They have a simpler brain than humans,” she said, “and we can actually have some hope of understanding the connections between perceptual systems and motor systems, and how that controls behavior.”
She and Michael Mangan, a postdoc in her laboratory, study how European desert ants navigate. Many ant species do this by laying down a trail of pheromones, then following their scent. But these desert ants live in hot, semiarid southern Spain, where ground temperatures exceed 50 °C during the day—hot enough to quickly destroy pheromones. Unlike most desert creatures, European desert ants head out at mid-day, in their case to find tasty, dying insects baking in the sun. They then head back to the nest, and quickly.
European desert ants see visible and ultraviolet light. This makes the sky bright and everything else dark, enabling them to recognize local landmarks by their silhouettes and make their way through complex terrain.
Biologists suspected that desert ants navigated with a form of vision that rivals Superman's, though rather than X-rays, they see ultraviolet light. Webb and Mangan investigated. They replaced the glass lens and optics in an ordinary digital camera with components made of quartz, which is transparent to both visible and UV light, and added some off-the-shelf algorithms. Then they took pictures on the ants’ home turf—an abandoned field with grassy tussocks in a semi-industrial area on the outskirts of Seville.
Sure enough, in these images the sky looked bright and everything else looked dark. “This allows them to distinguish sky from ground very clearly,” Webb said.
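One minimal way to turn such an image into a compact silhouette signature: threshold each pixel (bright sky versus dark terrain) and record, per image column, how far down the sky extends. The threshold value and image encoding here are assumptions for illustration, not the researchers' actual pipeline.

```python
def skyline(uv_image, threshold=128):
    """Extract a skyline profile from a UV intensity image.

    uv_image: rows x cols grid of brightness values, row 0 at the top.
    In the UV band the sky is bright and terrain dark, so a fixed
    threshold separates them. Returns, per column, the fraction of
    the column that is sky.
    """
    rows = len(uv_image)
    profile = []
    for col in range(len(uv_image[0])):
        sky_rows = 0
        for row in range(rows):
            if uv_image[row][col] >= threshold:
                sky_rows += 1
            else:
                break  # first dark pixel marks the top of the terrain
        profile.append(sky_rows / rows)
    return profile
```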
A graduate student in the lab then walked around Edinburgh, testing the UV camera to see if it could help a person navigate like a desert ant. The UV images highlighted the contrast between sky and buildings, just as the setting sun creates a silhouette of the Manhattan skyline. Software compared these UV skyline images to reference images they’d obtained several months earlier in different weather conditions. Silhouettes for nearly all points on the path matched. The results showed that UV sensing alone would be enough to help a person find the way around the city successfully, they reported in a peer-reviewed publication presented at the 2014 Robotics: Science and Systems Conference.
Mangan and Webb suspected that the ants navigated by checking skyline images against those in their memory, continuing straight if they matched, and turning if they didn’t.
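That match-and-turn strategy can be sketched as rotational matching over a panoramic skyline profile: try every rotation of the current view against the stored one, and the best-matching rotation tells the agent how far to turn. This is a simplified illustration of the idea, not Webb and Mangan's code.

```python
def best_heading(current, memory):
    """Rotational matching of a panoramic skyline against a stored view.

    current, memory: panoramic skyline profiles, one value per bearing.
    Returns (turn_steps, mismatch): rotate by turn_steps bearings to
    best align with memory; mismatch near zero means "keep going
    straight after turning that much".
    """
    n = len(current)

    def mismatch(shift):
        return sum(abs(current[(i + shift) % n] - memory[i])
                   for i in range(n)) / n

    best = min(range(n), key=mismatch)
    return best, mismatch(best)
```

With a profile that is simply the stored view rotated by one bearing, the function recovers that rotation with zero residual mismatch.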
To test that idea, they built a ground-hugging ant robot with a panoramic UV lens and a skyline recognition algorithm. “It's actually a mobile phone on wheels,” Webb said. It navigated the abandoned field in Spain just fine, according to results the researchers reported during the Living Machines conference in August.
Both actual ants and Webb's ant robots screen out information on contrast, color, and texture, and detect only the silhouette of the surrounding skyline. Less information to process means a robot brain would need less in the way of hardware, software, and the electricity to run. “Where we can gain insights for engineering would be where we have similar constraints in terms of power and computation. Maybe you want to have robots in the field a long time,” Mangan said.
Down the road, the bioroboticists hope to incorporate another ant navigational skill into a robot—a neat ability to detect polarized sunlight and use it to determine compass direction, Mangan said.
Nor are bioroboticists limiting themselves to the earthly plane. Mark Willis, an insect biologist at Case Western Reserve University in Cleveland, has used Air Force funding to build a contraption he calls Robo-Moth, modeled on moths’ ability to follow plumes of pheromones as they fly. Once they can model the sense of smell better, researchers could build an artificial nose on a stick to help military forces avoid having to guess where terrorists are hiding in a building. “If we gave special ops guys a flying moth that could fly down the hallway and tell them what's down there, they would use it tomorrow,” Willis said.
Meanwhile, Jessica Fox, a Case Western bioroboticist, is investigating a unique sensing system that allows flies to fly forward and backward, turn, and hover in mid-air like helicopters. As flies carry out these relatively complicated maneuvers, they rely on club-shaped vestigial hindwings, called halteres, that function as a gyroscope does in airplanes and helicopters—detecting the Coriolis force that indicates whether the flying machine is rotating.
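The gyroscopic principle at work is just the Coriolis term: a mass vibrating in a rotating frame feels an acceleration of −2ω × v, at right angles to its stroke, which strain sensors at the base of the haltere can pick up. A minimal sketch with illustrative numbers:

```python
def coriolis_accel(omega, velocity):
    """Coriolis acceleration a = -2 * (omega x v).

    omega: body rotation rate (rad/s); velocity: velocity of the
    vibrating haltere tip (m/s); both as (x, y, z) tuples. With no
    body rotation the term vanishes; any rotation produces an
    acceleration perpendicular to the stroke. Numbers illustrative.
    """
    wx, wy, wz = omega
    vx, vy, vz = velocity
    cross = (wy * vz - wz * vy,
             wz * vx - wx * vz,
             wx * vy - wy * vx)
    return tuple(-2 * c for c in cross)

# Haltere beating along y at 1 m/s while the body yaws about z
# at 1 rad/s: the Coriolis acceleration points along x.
a = coriolis_accel((0.0, 0.0, 1.0), (0.0, 1.0, 0.0))
```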
“They have some of the best flying behavior that we know, so we might be able to improve flying behavior for some of our vehicles,” Fox said.
According to Webb, “Insects tell you a lot of things we’d like robots to do.” That includes flying, swimming, crawling, and scrambling across uneven ground. “They’re incredibly energy efficient, so they don’t need huge amounts of battery power,” Webb said.
Webb's admiring words about insects could also hold for animals in general. As she put it, “They’re amazingly more competent than the best robots we have.”