A study is presented on a brain-computer interface (BCI) that uses motor imagery (MI) and facial expressions to control a mobile robot. Traditionally, only MI signals are used in BCI applications. In this paper, a hybrid approach using both MI and facial expression stimulations for BCI is proposed. Electroencephalography (EEG) signals were acquired using a sensor system and processed for several MI tasks and facial expressions to extract characteristic features. The features were used to train support vector machine (SVM) based classifiers, and the trained classifiers were used to recognize test signals for correct identification of MI and facial expressions. A system was developed to implement the BCI, using MI and facial expressions to control a mobile robot. Training results using MI and facial expressions, individually and in combination, are presented for comparison. The combined features from MI and facial expression stimulations were found to give performance similar to facial expressions alone but better than MI alone.
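The classification pipeline described in the abstract (extract feature vectors from EEG, train an SVM, then recognize held-out test signals) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature-extraction step, the number of classes, and all dimensions are hypothetical assumptions, with synthetic data standing in for EEG-derived features.

```python
# Hypothetical sketch of the feature -> SVM -> recognition pipeline.
# Synthetic Gaussian feature vectors stand in for EEG-derived features;
# class count and feature dimension are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Three imagined classes (e.g., two MI tasks and one facial expression).
n_per_class, n_features = 40, 8
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
               for c in range(3)])
y = np.repeat(np.arange(3), n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = SVC(kernel="rbf")               # SVM classifier with RBF kernel
clf.fit(X_train, y_train)             # train on labeled feature vectors
accuracy = clf.score(X_test, y_test)  # recognize held-out test signals
print(f"test accuracy: {accuracy:.2f}")
```

In practice the feature vectors would come from processed EEG recordings rather than random draws, and a separate classifier (or a multiclass one, as here) would cover the MI and facial-expression categories.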
Dynamic Systems and Control Division
Brain Computer Interface Using Motor Imagery and Facial Expressions to Control a Mobile Robot
Kuffuor, J, & Samanta, B. "Brain Computer Interface Using Motor Imagery and Facial Expressions to Control a Mobile Robot." Proceedings of the ASME 2018 Dynamic Systems and Control Conference. Volume 1: Advances in Control Design Methods; Advances in Nonlinear Control; Advances in Robotics; Assistive and Rehabilitation Robotics; Automotive Dynamics and Emerging Powertrain Technologies; Automotive Systems; Bio Engineering Applications; Bio-Mechatronics and Physical Human Robot Interaction; Biomedical and Neural Systems; Biomedical and Neural Systems Modeling, Diagnostics, and Healthcare. Atlanta, Georgia, USA. September 30–October 3, 2018. V001T13A006. ASME. https://doi.org/10.1115/DSCC2018-9234