This paper describes the development of an intelligent mapping from a haptic user interface to a remote manipulator to assist individuals with disabilities in performing manipulation tasks. This mapping, referred to as an assistance function, is determined on the basis of an environmental model or sensory data and guides the motion of a telerobotic manipulator during a given task. Human input is enhanced rather than superseded by the computer. Three manual dexterity assessment tests commonly used in the occupational therapy field were chosen to implement several forms of assistance functions designed to augment human performance. The test bed for these tasks consisted of a six-degree-of-freedom force-reflecting haptic interface device, the PHANToM, with the GHOST SDK software. One of the tests was implemented on a real telerobotic system consisting of the haptic device as the master and the Robotics Research Corporation manipulator (RRC K-2107), equipped with a vision system and laser range data, as the slave. The results demonstrate that the forms of assistance provided reduced execution times and improved performance on the chosen tasks. In addition, the results suggest that haptic rendering capabilities, including force feedback, offer particular benefit to motion-impaired users by augmenting their performance on job-related tasks.
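The core idea of an assistance function, blending the user's commanded motion with guidance derived from sensor data so that the computer enhances rather than supersedes human input, can be sketched as a simple velocity-blending rule. This is a hypothetical illustration only: the function name, the projection-based guidance, and the blending parameter `alpha` are assumptions, not the paper's actual task-specific mappings.

```python
import numpy as np

def assistance_function(user_velocity, target_direction, alpha=0.5):
    """Blend the user's commanded velocity with a sensor-derived
    guidance direction (illustrative sketch, not the paper's method).

    alpha = 0.0 passes the human command through unchanged;
    alpha = 1.0 fully constrains motion to the guidance direction.
    """
    # Normalize the direction suggested by the environmental model
    # or sensor data (e.g., vision or laser range measurements).
    d = np.asarray(target_direction, dtype=float)
    d = d / np.linalg.norm(d)

    # Project the user's command onto the guidance direction,
    # attenuating components that stray from the sensed target.
    v = np.asarray(user_velocity, dtype=float)
    guided = np.dot(v, d) * d

    # Blend: human input is enhanced, not superseded.
    return (1.0 - alpha) * v + alpha * guided
```

With `alpha = 0` the manipulator follows the master device exactly; raising `alpha` progressively filters out off-axis motion, which is the kind of guidance that can shorten execution times on structured dexterity tasks.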
