The aim of this paper is to explore a new multimodal Computer Aided Design (CAD) platform based on brain-computer interfaces and touch-based systems. The paper describes experiments and algorithms for manipulating geometric objects in CAD systems using touch-based gestures and movement imagery detected through brain waves. Gestures on touch-based systems are subject to ambiguity because they are inherently two-dimensional. Brain signals are considered here as the main source for resolving these ambiguities. The brainwaves are recorded as electroencephalogram (EEG) signals. Users wear a neuroheadset and attempt to move and rotate a target object on a touch screen. As they perform these actions, the EEG headset collects brain activity from 14 locations on the scalp. The data are analyzed in the time-frequency domain to detect desynchronization of certain frequency bands (3–7 Hz, 8–13 Hz, 14–20 Hz, 21–29 Hz, and 30–50 Hz) in the temporal cortex as an indication of motor imagery.
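The band-wise desynchronization detection described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it assumes a 128 Hz sampling rate (typical of consumer 14-channel neuroheadsets), uses Welch's method for band power as one possible time-frequency estimate, and the 20% power-drop threshold is an arbitrary placeholder.

```python
import numpy as np
from scipy.signal import welch

FS = 128  # Hz; assumed sampling rate, not stated in the paper

def band_power(signal, fs, lo, hi):
    """Average power spectral density within [lo, hi] Hz via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def is_desynchronized(baseline, task, fs, band=(8, 13), drop=0.2):
    """Flag event-related desynchronization: band power during the task
    window falls by more than `drop` (fraction) relative to baseline."""
    p_base = band_power(baseline, fs, *band)
    p_task = band_power(task, fs, *band)
    return (p_base - p_task) / p_base > drop

# Synthetic demo: a strong 10 Hz rhythm at rest, attenuated during motor imagery.
t = np.arange(0, 4, 1 / FS)
rng = np.random.default_rng(0)
baseline = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
task = 0.4 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

print(is_desynchronized(baseline, task, FS))  # → True
```

In a multichannel setting, the same comparison would be repeated per electrode and per band, with the per-band results used to disambiguate the concurrent touch gesture.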
