In robotics research, electroencephalography (EEG)-based brain-computer interfaces (BCIs) have been used as control inputs in the design of prostheses, wheelchairs, and virtual navigation systems. This paper presents research on the development of a BCI that communicates between an operator and a robotic gripping device. Control of the BCI robotic hand is divided into two main subsystems. The first subsystem acquires a signal from the brain through the Emotiv EPOC EEG headset, extracts features, and translates them into an input to the control system. The second subsystem incorporates kinematics and sensor feedback to control the gripping device's multiple degrees of freedom according to the action specified by the higher-level BCI control. The BCI is trained to filter the data sets and extract features corresponding to the different hand motions. Machine learning is used in conjunction with data filtering, feature extraction, and feature classification techniques to create a more accurate and personalized BCI hand control system. The system analyzes the EEG data and compares it with EEG data patterns from previous attempts. The test results demonstrate the movement functions of the gripper using the BCI, and the success rate for each function is presented in this paper.
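A minimal sketch of the two-stage pipeline the abstract describes: filter the raw EEG samples, extract a feature, and classify the result against templates learned from previous attempts. The specific choices here (a crude smoothing filter in place of a real band-pass filter, log-variance as the feature, a nearest-template classifier) are illustrative assumptions, not the paper's actual method; all function names are hypothetical.

```python
import math

def bandpass_stub(samples, window=3):
    # Crude moving-average smoothing, a stand-in for a real band-pass filter.
    return [sum(samples[max(0, i - window + 1):i + 1]) /
            len(samples[max(0, i - window + 1):i + 1])
            for i in range(len(samples))]

def log_variance(samples):
    # Log-variance is a common feature for EEG motor-related signals.
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    return math.log(var + 1e-12)

def classify(feature, templates):
    # Nearest-template classifier: compare the new feature against
    # per-action features learned from earlier training attempts.
    return min(templates, key=lambda label: abs(templates[label] - feature))

# Toy usage with synthetic "EEG" traces for two hand actions.
training = {
    "open":  [0.1, -0.2, 0.15, -0.1, 0.05, -0.12],
    "close": [1.0, -1.2, 1.1, -0.9, 1.3, -1.1],
}
templates = {label: log_variance(bandpass_stub(x))
             for label, x in training.items()}

trial = [0.9, -1.0, 1.2, -1.1, 1.0, -0.8]
print(classify(log_variance(bandpass_stub(trial)), templates))  # → close
```

In the system described, the label produced by this higher-level classification stage would be passed to the lower-level kinematic controller that actuates the gripper's degrees of freedom.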
