When Russell Taylor first started working on surgical robots in the late 1980s, one of the most cumbersome tasks was aligning the device to the patient's body. The joystick controller used to move the business end of IBM's robotic hip-replacement machine, called Robodoc, to the hip was plenty precise. But trying to move the tip of the instrument through x-y-z coordinates was unnatural.
"In many cases, we found that the most convenient way for the surgeon to interact with the robot was to grab it and manipulate it;' Taylor said. "That motion was very smooth and accurate-a little like moving through molasses."
After discovering the joys of grabbing the robot, the surgeon didn't want to touch a joystick again. Taylor and his colleagues realized that they could take advantage of the surgeon's intuitive motion through three-dimensional space and the machine's uncanny precision to do more sophisticated things.
Over the past decade, Taylor, who is now a professor of computer science at Johns Hopkins University in Baltimore, has moved from macroscale robotic surgeons to microscale machines. One of his latest experimental devices is sufficiently small and precise to inject a microneedle into a minuscule blood vessel in the human eye. But the key to its laboratory success isn't simply its scale. It's the computer interface that enables a human surgeon to work intuitively with the microelectromechanical device.
This marriage of human spatial intuition and mechanical accuracy doesn't come naturally. The Johns Hopkins team has had to develop a set of computer interfaces that can turn a surgeon's reflexes into smooth robotic motion. But their success at it points to a potential breakthrough in how engineers interact with the microscale, and even nanoscale, world.
The advantages of robotically performed surgery over the traditional human-guided operation aren't obvious to the layman. We want to have an expert working on us, not a machine. But for many tasks in sensitive parts of the body, the human touch may be too clumsy. Taylor recalled an experiment pitting human versus robot in the delicate job of endoscopically clearing excess blood from brain tissue (with white gelatin standing in for gray matter).
"We used the same tool, one time held by a surgeon and the robot, one time held just by a surgeon," Taylor said. "It took about four minutes for the guy working freehand, and there was about 10 or 15 percent excess material"which in a real procedure would have been brain. The robot, on the other hand, took two minutes longer, but sucked up only one-tenth the amount of gelatin.
Even so, guiding machines into the crannies of the human body isn't easy. Robots have traditionally worked in so-called Cartesian space, the three-axis coordinate system that enables mathematicians to specify any point with just three numbers. Give a computer the starting coordinates and the final coordinates, and it can calculate the trajectory a robotic arm must travel with ease.
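The calculation itself is straightforward linear interpolation between the two coordinate triples. A minimal sketch of the idea (the function name and step count here are illustrative, not taken from any actual controller):

```python
def linear_trajectory(start, end, steps):
    """Waypoints along the straight line from start to end in x-y-z space.

    Given only the starting and final coordinates, the controller can
    generate every intermediate point the robotic arm must pass through.
    """
    path = []
    for i in range(steps):
        t = i / (steps - 1)  # interpolation parameter: 0 at start, 1 at end
        path.append(tuple(s + t * (e - s) for s, e in zip(start, end)))
    return path

waypoints = linear_trajectory((0.0, 0.0, 0.0), (10.0, 4.0, 2.0), steps=5)
```

Real arms interpolate in joint space and respect velocity limits, but the principle is the same: three numbers in, three numbers out, and the machine fills in the path.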
Computers have grown incredibly powerful over the past decade, but that three-axis view of the world is still common in guiding machines. The common joystick and ubiquitous computer mouse track motion over two axes and send that signal to computers.
"A joystick can be very disruptive," Taylor said. "Instead, if it feels like you're holding on to the handle of an instrument and it's just following your hand; it's completely intuitive."
This issue has become even more acute when human operators have tried to control microscale devices. Since there's no direct physical connection between the operator and the device, it's been difficult to create a controller that feels intuitive. The joystick or button box has been the default.
When Taylor joined the faculty of Johns Hopkins in the mid-1990s, one of the first projects he got involved in was scaling down the Robodoc system to a size capable of repairing blood vessels or operating inside the ear.
The problem with tele-operated robotic surgical systems to date, Taylor said, is that the machine can be very disruptive in an operating room. "You're making the surgeon sit someplace else, and there's all this equipment," Taylor said. A much more direct way of doing the work would be to have both the human and the robot hold the tool, with the surgeon holding the handle. When the surgeon pulls on the handle, the computer translates this macroscale motion into commands to be executed by the device at the microscale.
"The robot is very precise and its hand doesn't shake," Taylor said.
This way of operating takes advantage of the highly trained hands of the surgeon. Rather than teach the surgeon a new way of thinking about moving his tools, the computer uses his intuitive motions to control the machine.
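The two ingredients of this scheme, scaling the hand's motion down and filtering out its tremor, can be sketched in a few lines. Everything here is an assumption for illustration: the article does not describe the actual Johns Hopkins control law, so a fixed scale factor and a simple moving average stand in for it.

```python
from collections import deque

class SteadyHandController:
    """Illustrative sketch of steady-hand control: scale the surgeon's
    hand displacement down and smooth it over recent samples.
    Both the scale factor and the filter are hypothetical."""

    def __init__(self, scale_factor=100.0, window=5):
        self.scale = scale_factor
        self.history = deque(maxlen=window)  # recent hand-motion samples

    def command(self, hand_delta_um):
        """Map one hand-motion sample (micrometers) to a tool command."""
        self.history.append(hand_delta_um)
        n = len(self.history)
        # Average the recent samples to damp hand tremor...
        avg = [sum(s[i] for s in self.history) / n for i in range(3)]
        # ...then divide by the scale factor so a centimeter of hand
        # travel becomes a fraction of a millimeter at the tool tip.
        return tuple(a / self.scale for a in avg)

ctrl = SteadyHandController()
move = ctrl.command((10_000.0, 0.0, 0.0))  # a 10 mm pull on the handle
```

With these made-up numbers, a 10-millimeter pull on the handle becomes a 100-micrometer motion at the tool: the surgeon's gesture is preserved, only shrunk and steadied.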
"Surgeons are very physical people," Taylor said, so such systems are easier for them to learn.
The Johns Hopkins steady hand system isn't just a souped-up haptic feedback device, Taylor said. For one thing, it doesn't try to present data on the forces on the tip of the MEMS device. But other kinds of data can be imparted to the operator through the controller.
Taylor envisions creating a well-defined, three-dimensional space within the patient's body using data from CT scanners and other imaging systems. As the surgeon moves the microsurgery device during an operation, the computer that mediates between the surgeon and the MEMS system keeps track of the system's position in real time. Try to move the microscalpel outside the boundary set up for the operation-say, toward a blood vessel or a nerve-and not only would an alarm be sent to the surgeon, but the device would stop short.
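The stop-short behavior amounts to clamping each commanded position to a surgeon-defined safe region and raising an alarm when the clamp engages. A sketch under assumed shapes (the article describes the behavior, not the interface, so the function and its rectangular-bounds representation are hypothetical):

```python
def clamp_to_workspace(position, bounds):
    """Stop the tool at the edge of a surgeon-defined safe region.

    bounds is ((xmin, xmax), (ymin, ymax), (zmin, zmax)); any axis that
    would leave the region is clamped, and an alarm flag is set so the
    surgeon can be warned.
    """
    clamped = []
    alarm = False
    for p, (lo, hi) in zip(position, bounds):
        c = min(max(p, lo), hi)       # pull the coordinate back inside
        alarm = alarm or (c != p)     # flag any boundary violation
        clamped.append(c)
    return tuple(clamped), alarm

bounds = ((0.0, 50.0), (0.0, 50.0), (0.0, 20.0))
pos, alarm = clamp_to_workspace((55.0, 10.0, 5.0), bounds)  # x exceeds limit
```

A real system would use anatomy-shaped regions built from the CT data rather than a box, but the principle is the same: the computer in the middle refuses to pass along a command that crosses the line.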
Measured in Micrometers
The steady-hand system is still a laboratory curiosity, to be sure. But it has shown great promise in the experiments Taylor and his colleagues have performed to date. A microdevice that the team has built has displayed five-micrometer precision.
One area that Taylor sees as a potential application is in microsurgery on the eye. Blood can clot in the minuscule vessels in the retina; if left unchecked, such clots can lead to partial blindness. Dissolving the clots could save vision in many older patients, and such procedures have a greater likelihood of success the closer to the blockage doctors can inject medicine. But operating on the eye is a delicate proposition at best, and using human fingers to pierce those tiny blood vessels is a recipe for disaster.
A microscale device could make this near-impossible task routine by injecting blood thinner directly into the affected vessel. Already, a version of a microsurgery robot developed by Taylor and colleagues Greg Hager and Louis Whitcomb has injected liquid into the tiny arteries of a chicken embryo. One issue that would have to be worked out, however, is immobilizing the patient: even motion calibrated to mere micrometers could do damage if the patient's eyeball moves.
This issue would be less of a concern with another delicate microscale procedure-injecting DNA directly inside the nucleus of a cell. Taylor said the technique, which involves operating tiny glass needles while observing through a powerful microscope, has a relatively low success rate, meaning the action has to be performed again and again. The work is so demanding to learn and perform that most technicians (biology graduate students, generally) burn out on the work in about a year. A MEMS-based device that could automate certain steps would not only have a greater success rate, but would likely be easier to learn and less stressful to do.
The techniques could even be adapted beyond the bounds of biology to nanoscale machines. A MEMS device that can grab and manipulate very small bits of matter could have a profound effect on how micromachines, or even nanomachines, are assembled. To do this successfully, though, will require tools that are not only unfathomably precise but also simple to use. An x-y-z arcade-style joystick won't be up to the task.
But a system that incorporates the steady-hand interface would make manipulating a nanotube as natural as picking up a pencil.
Taylor said the key was to mate human coordination and machine precision. "And because we have a computer in the middle, you can start to do much more powerful things with the control."