This article provides details of various smart prosthetics developed in the last few years. Behind smart prosthetics are many of the technologies powering consumer products, such as faster microprocessors, more powerful batteries, and wireless technology, as well as new control systems that use the body's nerves and muscles. A new bionic hand enables users to move the thumb and index finger independently of the remaining three fingers, a significant advance in dexterity over prior claw-like mechanisms. The bionic ear transmits sound and power wirelessly to electrodes implanted deep in the ear, allowing the deaf to hear. The Rheo Knee uses force and position sensors that monitor speed and load more than 1,000 times per second. This information goes to an artificial intelligence program that emulates the feedback a natural knee would receive from the body's central nervous system. Despite outstanding progress, however, researchers believe the future holds more challenges, among them limb attachment and sensory input. At Brown University, neuroscience professor John Donaghue has teamed with Peckham to develop ways to activate nerves from within the brain itself. Their goal is to develop within five years a brain-controlled system that will let a tetraplegic take a glass of water, lift it, and bring it to his or her mouth.
“Any sufficiently advanced technology is indistinguishable from magic,” the science fiction writer Arthur C. Clarke once wrote. If that is true, then a new generation of smart prosthetics—systems that combine mechanical devices with electronic intelligence to replace body parts—has begun to edge into the realm of the magical.
This appeared to be the case when Michael Callahan cast a spell over National Instruments Corp.’s annual users meeting in Austin, Texas, this past August. The 25-year-old president of a startup, Ambient Corp. of Champaign, Ill., joined National Instruments’ senior vice president of research, Tim Dehne, on the stage.
Before more than 4,000 people, Callahan attached a black collar around Dehne’s neck. He then told Dehne to think of some words. On two huge screens behind them, a laptop attached to the collar registered activity. A second or two later, the computer voiced Dehne’s thoughts: “This is really neat stuff.”
Callahan then pointed to Ambient’s cofounder, Thomas Coleman, seated in a wheelchair at the back of the stage. Using thought alone—while holding his hands in the air—Coleman maneuvered his wheelchair to the front of the stage, spun it around, and rode back to his previous position.
Callahan and Coleman call their speech-capturing technology the Audeo. They hope it will let the mute speak and that application of the same principles will give people who cannot move their hands or legs the ability to wheel from room to room, move a computer cursor, turn on the lights, and switch television channels at home.
The Audeo works by intercepting nerve signals as they move through the neck to the vocal cords. “The brain is sending a signal to the proper place, even if those muscles are not working,” Callahan explained.
The collar around Dehne’s neck contained transducers that pick up those signals. “It’s like I’m talking in a conference room and you press your ear to the glass,” Callahan said. “It’s muffled, but you can still hear what I’m saying. We try to do that with the electrical signal made by your nerves. We capture it, then process and condition it to turn it into something useful.”
This is similar to how an electrocardiogram works, but with one very significant difference: An EKG measures the electrical impulses created by a contracting muscle. These are relatively strong signals. The Audeo measures nerve pulses that are orders of magnitude smaller.
“The challenge,” Callahan said, “is getting a clean, reliable signal and then filtering out the noise from the body’s other nerves, and then doing something with it in a robust way.”
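Callahan does not describe Ambient's actual signal chain, but the generic approach he outlines, band-limiting a weak recording and then extracting a usable amplitude envelope, can be sketched with standard DSP tools. The function name, sampling rate, and band edges below are illustrative assumptions, not Ambient's specification.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def extract_nerve_signal(raw, fs=2000.0, band=(20.0, 450.0)):
    """Band-pass filter a raw surface recording, then return a smoothed
    amplitude envelope that downstream logic could threshold."""
    b, a = butter(4, band, btype="bandpass", fs=fs)  # 4th-order Butterworth
    filtered = filtfilt(b, a, raw)                   # zero-phase filtering
    # Moving-RMS envelope over a 50 ms window
    win = int(0.05 * fs)
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(filtered ** 2, kernel, mode="same"))
```

Thresholding the returned envelope is one simple way to turn the conditioned signal into "something useful," such as an on/off control event.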
(Ambient in Illinois, by the way, is totally unrelated to the Ambient Corp. in Newton, Mass., which is involved in technology for broadband communication over power lines.)
Riding a Wave
Ten or even five years ago, locating such a small signal and isolating it from the surrounding noise in real time would have been impossible. Processors were too slow, algorithms and filters less developed, and transducers not precise enough. Today, however, Ambient and other prosthetics developers are surfing a wave of emerging technologies that are changing the way man-made devices interact with the human body.
In fact, perhaps the most amazing thing about Ambient’s thought-powered wheelchair is how typical it is of this new generation of smart prosthetics. Today, bionic ears let the deaf hear and retinal implants give the blind limited sight. Powerful prosthetic knees enable users to climb stairs with something like a natural gait. Artificial hands clasp keys or forks, while thoughts direct the movement of mechanical arms.
What makes this possible? Is it faster microprocessors? Better sensors? Improved materials? More powerful batteries? Wireless technology? More accurate electromechanical and physiological modeling tools? Innovative implants that reach into muscles, nerves, and even the brain? New surgical procedures?
“All of the above,” said Hunter Peckham, who organized a session on smart prosthetics last February at the annual meeting of the American Association for the Advancement of Science. Peckham, a professor of biomedical engineering and orthopedics who started out in mechanical engineering, is executive director of the Functional Electrical Stimulation Center at Case Western Reserve University in Cleveland. His work enables people with severed nerves to move their limbs again.
According to Peckham, “We’ve taken advantage of breakthroughs in all those enabling technologies.” “Think of all the factors you need to emulate the speed, torque, and power of an able-bodied limb,” he said. “You need small, energy-efficient motors and powerful batteries. And you need lightweight materials. If you have ever walked for a mile with a one-pound weight in your hand, you know how much additional energy it requires you to put out.”
Many of these advances had their genesis in consumer electronics. According to Warren Grill, an associate professor of biomedical engineering and neurobiology at Duke University: “We now have all these small, powerful computational devices that don’t use very much power. We’ve been able to shrink lab racks of equipment and multiple computers down to a small package that someone can wear.”
Funding has also played a role. Grill notes that the 1990s were the Decade of the Brain. Starting in the early ’90s, the federal government doubled the budget of the National Institutes of Health, which focused much of its newfound largesse on fundamental brain research. “Now we’re taking the results of those federal dollars and turning them into products, treatments, and tools for diagnostics and delivery,” Grill said.
Although NIH funding has leveled off, the Department of Defense is spending tens of millions of dollars to develop better prosthetics for veterans injured in Iraq. “This is attracting the attention of the most clever designers in the country, and they are incorporating new concepts and new technologies into our products,” Peckham said.
Replacing an Arm
Bill Hanson, another former mechanical engineer and now president of Liberating Technologies Inc. of Holliston, Mass., gladly admits to piggybacking on advances made by larger industries.
The company introduced its Boston Digital Arm six years ago. It is built around a motorized elbow capable of lifting 10 pounds (and 50 pounds, if users lock it in place and use it as a lever).
“Ten pounds doesn’t sound like much, but it’s enough for most common activities, like picking up a gallon of milk or carrying an attaché case to work,” Hanson noted. The arm’s microprocessor controls the elbow plus four other prosthetic devices, such as hands, grippers, wrist rotators, and shoulder lock actuators.
“We were doing some of this 10 years ago, but the controllers were less sophisticated,” Hanson said. “Back then, we might have offered three control strategies and they had to fit all patients. For some, it was not the optimal system.”
Hanson took advantage of new processors, such as the high-speed digital signal processors developed by Texas Instruments Co. for consumer electronics. “Now we have about 32 control strategies for patients,” he said. Patients can control their arms by pushing switches, tensing muscles, or with sensors over muscles. “We can customize the arm to each patient, and if the clinician has done the job right, the patient will learn faster and be more proficient with it.”
The original controller has been overtaken by development. “When we designed it in 2000, it could run four motors plus the elbow. We didn’t have enough motors to run,” Hanson related. “Few of the peripherals attached to the arm could take advantage of the controller. But since 2004, devices like wrists and grippers have included motors. One company has even designed a hand with moving fingers. Now we need to redo the controller to include more inputs and outputs.”
Behind the profusion of motors is a relatively new technology originally developed for cell phones: lithium-ion batteries. “They have twice the capacity and half the weight of nickel-cadmium batteries,” Hanson noted. “They charge faster, don’t lose capacity, seem to last as long, and don’t have environmental problems.” They have allowed designers to add motors where they would have been impractical before.
One reason that Hanson’s controller needs new inputs is the increased function of prosthetic hands. He points to the i-LIMB Hand from Touch EMAS Ltd. of Edinburgh, Scotland. Touch EMAS claims that i-LIMB is the world’s first commercial “multi-articulating” bionic hand. In other words, it has five independently powered digits.
In the past, prosthetic hands opened and closed like claws. The i-LIMB, however, has a rotating thumb, an independent index finger, and three remaining fingers that operate in unison. These powered digits can hold a key (between the thumb and side of the index finger), grip a glass (hand surrounding the object), extend the index finger (for dialing a number), or hold a knife or fork (index finger and thumb meet).
A stall detection system tells the hand when to stop powering. The fingers remain locked in position until the user flexes a muscle that triggers an open signal.
Users with arms but no hands control the i-LIMB through electrodes attached near muscles in the forearm that would ordinarily help position the hand. When the mind wants the hand to move, it sends a signal to these muscles. When the muscles contract, they emit an electrical signal. These myoelectrical signals are strong and easy to detect, like the heart muscle during an EKG. The signals go to a microprocessor, which then tells the motors in the hand what to do.
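In its simplest form, the control loop just described, muscle contraction to myoelectric signal to motor command, reduces to thresholding a feature of the recorded signal. The sketch below assumes a two-threshold scheme with made-up threshold values; production controllers such as the i-LIMB's are proprietary and far more sophisticated.

```python
import numpy as np

def classify_emg(window, open_thresh=0.15, close_thresh=0.35):
    """Map the mean rectified amplitude of a short myoelectric window
    to a hand command: 'rest', 'open', or 'close'."""
    level = np.mean(np.abs(window))   # mean absolute value (MAV) feature
    if level >= close_thresh:
        return "close"                # strong contraction -> grip
    if level >= open_thresh:
        return "open"                 # light contraction -> release
    return "rest"
```

The microprocessor would run this classification on each new window of samples and forward the resulting command to the finger motors.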
That works as long as part of the limb remains. But what happens when the entire arm has been amputated? Hanson is teaming with Todd Kuiken of the Rehabilitation Institute of Chicago to develop a more natural way to control arms when all muscles are gone.
Kuiken reasoned that the nerves going to the severed arm still work (the reason for phantom sensation). His solution was to reroute those nerves to the chest. He sliced the chest muscle into distinct bands and attached the nerves. Now, when the brain tells the arm to rise, the nerves cause the chest muscles to contract. The myoelectric electrodes sense the contractions and send the information to the microcontroller, which activates the arm. The same approach may work with hands, as well.
Walk the Walk
Using arms and hands to manipulate objects requires complex control. Legs have a simpler range of motion. This is one of the reasons that even a simple wooden peg leg is enough to help someone stand and walk.
Today’s smarter prosthetic legs and feet, such as those commercialized by Iceland’s Ossur hf, go well beyond such precarious support. Their sensors and high-speed processors enable them to respond immediately to changes in speed, load, walking style, and terrain. Ossur’s new Power Knee even provides the power to walk up a staircase foot over foot.
Although Ossur has made prosthetic parts for more than 30 years, it released its first bionic system, the Rheo Knee, in early 2005.
The Rheo Knee uses force and position sensors that monitor speed and load more than 1,000 times per second. This information goes to an artificial intelligence program that emulates the feedback a natural knee would receive from the body’s central nervous system.
While the control algorithms are based on laboratory analysis of normal subjects, they must also adapt to the walking style of the individual wearing the device, explained Ian Fothergill, Ossur Americas’ senior clinical marketing manager. Initially, users input settings until the system feels right. The artificial intelligence program then collects data and reprograms itself to optimize the settings.
“The device does a series of controlled experiments with different gait parameters and observes the outcome,” Fothergill said. “Upon observing the optimal gait pattern, it will select the corresponding parameters. If there are any perturbations from this pattern, it recognizes them instantly and repeats the procedure.”
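Ossur has not published the Rheo Knee's algorithm, but the procedure Fothergill describes, trying candidate settings and keeping whichever produces the best-scoring gait, amounts to a search over gait parameters. A minimal sketch, with hypothetical names throughout:

```python
def tune_gait(parameters, trial_cost):
    """Try each candidate damping setting for a few strides, score the
    resulting gait against a reference pattern, and keep the best one.
    `trial_cost(p)` returns a gait-deviation score (lower is better)."""
    best_p, best_cost = None, float("inf")
    for p in parameters:
        cost = trial_cost(p)          # run a few strides at setting p
        if cost < best_cost:
            best_p, best_cost = p, cost
    return best_p
```

On detecting a perturbation from the optimal pattern, such a controller would simply rerun the search.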
The Rheo Knee gets its name from its use of magnetorheological fluids, which contain suspended iron particles that align instantly when exposed to a magnetic field. The effect is like someone stepping on a brake. MR fluids let the knee go from firm when standing to resistance-free when turning a corner in just 0.001 second. Unlike the best previous approach, hydraulic fluids, MR fluids do not develop unwanted resistance when users walk at moderate or high speeds.
Ossur followed up the smart but passive Rheo Knee with the Power Knee, which it describes as “the world’s first powered prosthesis for above-knee amputees.” Thanks to improved battery technology, it provides enough power for users to walk up a staircase foot over foot. It also enables users to sit down or rise from a chair more easily, or simply go for a long walk without feeling exhausted.
The Power Knee replaces concentric muscle with motorized power. “The torque output from the Power Knee through its angular range is matched to the human knee,” Fothergill explained. “Maximum torque is applied between 45 and 50 degrees of knee flexion. This is managed through modulation of the motor power and the geometric location of the actuator and knee axis. This has proven a challenge, as the knee’s design must be fast for certain portions of gait and powerful for others.”
The knee’s processor uses data from gyrometers, pressure cells, and load cells to measure the motion, position, and velocity of the user’s sound leg. It then matches the motion of the Power Knee to the presumed gait of the user.
This type of smart design is apparent in the company’s latest product, the Proprio Foot. This smart foot does things most people never consider when they walk. It lifts the toe when it leaves the ground so the foot clears curbs or irregularities in the terrain. It positions itself depending on whether the user is sitting or standing. It adjusts to improve balance on steps or uneven ground. In other words, it uses high-speed artificial intelligence to act just like a natural foot.
Seeing Is Believing
Hands, arms, legs, and feet are not the only smart prosthetics. The first cochlear implant (better known as the bionic ear) was implanted more than 20 years ago. Today, more than 100,000 people, some as young as four months old, wear implants made by the world’s four manufacturers.
Unlike hearing aids, cochlear implants do not amplify sound. Instead, they use electrical pulses to stimulate the ear’s auditory nerves directly. Combined with training afterward, this is enough to give deaf people the ability to hear and understand speech.
The implants capture sound with a microphone. A processor breaks the sounds into their component frequencies. A wireless radio transmitter outside the skull then broadcasts these signals to a receiver embedded in the skull behind the ear.
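The frequency-decomposition step can be illustrated with a simple FFT filterbank. The channel count, band edges, and logarithmic spacing below are illustrative assumptions; commercial processors use their own proprietary coding strategies.

```python
import numpy as np

def channel_energies(frame, fs=16000, n_channels=8, fmin=200.0, fmax=7000.0):
    """Split one audio frame into logarithmically spaced frequency bands
    and return the energy in each band -- one value per electrode."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
    edges = np.geomspace(fmin, fmax, n_channels + 1)  # log-spaced band edges
    return np.array([
        spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for lo, hi in zip(edges[:-1], edges[1:])
    ])
```

Each returned band energy would then set the stimulation strength on the corresponding electrode.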
Beaming a signal to the receiver eliminates the possibility of an infection working its way along wires going into the skull. Radio transmission also powers the receiver, the same way scanners send power to RFID tags. Like RFID tags, the receiver can also send back information to verify its integrity. Some receivers, such as the HiRes 90K from Advanced Bionics Corp., accept up to 90,000 updates per second.
The receiver distributes the electrical pulses to an array of electrodes implanted in the cochlea, where it parallels the path of the auditory nerve. In a healthy ear, this region is lined with tiny hair cells. They vibrate when sound reaches them, causing an electrical disturbance that the auditory nerves interpret as sound. In bionic ears, direct electrical stimulation replaces the electrical fluctuations created by moving hairs.
Conceptually, this is similar to the workings of the bionic eye being developed by Mark Humayun, a professor of ophthalmology and biomedical engineering at the University of Southern California, and Second Sight Medical Products Inc. of Sylmar, Calif. Both eye and ear rely on small, fast microprocessors and the ability to transmit power and data wirelessly. Instead of replacing damaged ear hairs, though, the bionic eye uses electrodes to replace damaged rods and cones in the retina.
“Rods and cones convert light to electrical impulses in the nervous system,” explained Scott Dunbar, Second Sight’s patent attorney and counsel. “Retinitis pigmentosa and macular degeneration attack the rods and cones, but leave the rest of the retina relatively untouched. What we do is put a small array of electrodes on the retina. These electrodes stimulate the nerve cells that are still viable to create perception of light.”
Like cochlear implants, bionic eyes have taken advantage of smaller, more powerful processors developed for consumer electronics. They also capture images with charge-coupled devices, specialized light-sensitive chips used in digital cameras. They also have benefited from wireless technology that allows surgeons to embed receivers near the nose and run electrodes to the retina. “It’s important to get the electrode close to the retina,” Humayun said. “That lets you decrease the currents needed to send information. You need to keep currents low to keep from electrolyzing the electrode.”
Humayun and Second Sight implanted the first artificial retinas in six subjects in 2002. The good news, according to Dunbar, is that not one implant has failed yet. The bad news is that they provide only 16 pixels of vision. This is minuscule compared with the 768,000 pixels on a high-resolution computer monitor or the 4 million pixels in a healthy eye.
“Still, that’s enough to identify a place setting or a door,” Dunbar said. “One patient was so excited because when blind people walk across the street, they tend to curve, and now he can see the white line in the crosswalk. Another patient has learned to shoot baskets. The longer you have it, the more you can learn to do.”
Additional help is on the way. Earlier this year, Humayun implanted the first 60-channel bionic eye in a patient. He believes it will provide a dramatic improvement in the patient’s ability to navigate the world.
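To make those pixel counts concrete: mapping a camera image onto a 16-electrode array amounts to a drastic downsampling. The block-average reduction below is only a rough stand-in; how Second Sight's device actually maps camera pixels to electrodes is not disclosed.

```python
import numpy as np

def to_electrode_array(image, grid=(4, 4)):
    """Downsample a grayscale image to a coarse electrode grid by
    averaging each block -- one output value per electrode."""
    h, w = image.shape
    gh, gw = grid
    trimmed = image[: h - h % gh, : w - w % gw]      # make divisible
    blocks = trimmed.reshape(gh, trimmed.shape[0] // gh,
                             gw, trimmed.shape[1] // gw)
    return blocks.mean(axis=(1, 3))                  # average each block
```

A 4x4 grid preserves only the coarsest structure of a scene, which is consistent with patients recognizing a doorway or a crosswalk line rather than fine detail.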
Still More to Control
Despite outstanding progress, the future holds more challenges. For example, even the highest-performance bionic limb is suspended by straps, suction, or a combination of the two. “If you really want high performance, you need a way to attach it right to the bone,” said Case Western’s Peckham. “And that means you need to create a viable interface between the natural tissue of the skin and the artificial limb.”
A second looming issue involves sensory input. Today, a person with a prosthetic hand cannot pick up a cup without looking directly at it. This is because the hand provides no sense of feeling. Nor can it gauge the temperature of a cup of coffee or whether clothes coming out of a dryer are still damp. Liberating Technologies has a grant to add vibrating sensors to the Boston Arm, Hanson said. While they promise to give users feedback on position and grip strength, they will not restore a user’s sense of touch.
Yet true sensory input may not be so far away, and it may come as a byproduct of research into prosthetics control. “The generic problem with today’s controls,” Peckham explained, “is that you have more things that you need to control than things you can control them with. What we need to figure out is where to get those signals from, and what is the minimum number of signals needed to control limbs and digits.”
Peckham’s solution is to take the information directly from nerves that would ordinarily operate those appendages. This involves placing electrodes on nerves and extracting a signal. This is not a trivial undertaking. “Nerves are hugely complex bundles of fibers,” Peckham said, “and any of those fibers could be going to the muscle you want.” He said he has made progress on isolating nerves and filtering out the noise generated by surrounding nerve fibers to find the signals he wants.
Not only does Peckham plan to use nerves to control prosthetics, but he also hopes to use the same nerves to carry sensory input back to the brain. “It’s easy to embed sensors today, but what do you do with that information? One of the ways to deliver it is through a neural interface,” he said.
At Brown University, neuroscience professor John Donaghue has teamed with Peckham to develop ways to activate those nerves from within the brain itself. Their goal, they say, is to develop within five years a brain-controlled system that will let a tetraplegic take a glass of water, lift it, and bring it to his or her mouth.
The technology would go far beyond controllers now used for prosthetics. A single system—the brain rather than some external switch—could control prosthetic devices, an exoskeleton for tetraplegics who need more support, or the natural muscles of people with severed spines or other neurological damage. While the brain is incredibly noisy, Donaghue says modern signal processors are fast enough to filter out noise while capturing the signals he wants.
This may sound more like magic than science. Yet more than 30,000 people have electrodes implanted deep in their skulls to combat the tremors and shaking caused by Parkinson’s disease. The electrodes produce regular electrical pulses that normalize the abnormal pattern of brain pulses characteristic of the disease, according to Grill at Duke University. He is now investigating the use of similar implants to regulate severe incontinence.
The extreme edge of prosthetics and bionic research is possible because it builds on other advances, from microprocessors and batteries to new surgical techniques and the sheer audacity of researchers—some originally trained as mechanical engineers—who seek to rewire the human body as if it were another piece of advanced machinery.
No, it is not magic. But like all encounters with the scarcely known, it should inspire awe in all but the most leaden souls.