Vehicles with better usability have become increasingly popular because they are easier and safer to operate. However, the methods used to study the usability of in-vehicle system user interfaces still need improvement. This paper concerns how to use advanced computational, neurophysiology-based, and psychology-based tools and methodologies to determine the affective (emotional) states and behavioral data of an individual in real time, and in turn how to adapt the human-vehicle interaction to meet the user’s cognitive needs based on this real-time assessment. Specifically, we set up a suite of neurophysiological equipment that collects EEG, facial EMG (electromyography), skin conductance response, and respiration data, together with motion-sensing and tracking equipment that captures eye movements and the objects with which the user interacts. All hardware components and software are integrated into a cohesive augmented sensor platform that performs as “one coherent system,” enabling multi-modal data processing and information inference for context-aware analysis of affective and cognitive states based on a rough set inference engine. Subjective data are also recorded for comparison. A usability study of an in-vehicle system UI demonstrates the potential of the proposed methodology.
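The abstract mentions a rough set inference engine for classifying affective and cognitive states from multi-modal sensor data. As a minimal sketch of that idea, the snippet below computes rough-set lower and upper approximations of an affective state over discretized sensor features; the attribute names (EEG, EMG, skin conductance), state labels, and sample values are illustrative assumptions, not data from the paper.

```python
# Rough-set-style inference over discretized multi-modal sensor features.
# Each row of the decision table is one time window of discretized
# readings plus a labeled affective state (illustrative values only).
from collections import defaultdict

table = [
    ({"eeg": "high", "emg": "high", "scr": "high"}, "stressed"),
    ({"eeg": "high", "emg": "high", "scr": "high"}, "stressed"),
    ({"eeg": "low",  "emg": "low",  "scr": "low"},  "calm"),
    ({"eeg": "high", "emg": "low",  "scr": "low"},  "calm"),
    ({"eeg": "high", "emg": "low",  "scr": "low"},  "stressed"),  # conflicts with row 3
]

def partitions(rows, attrs):
    """Group row indices into indiscernibility classes over `attrs`."""
    groups = defaultdict(list)
    for i, (cond, _) in enumerate(rows):
        groups[tuple(cond[a] for a in attrs)].append(i)
    return list(groups.values())

def lower_upper(rows, attrs, label):
    """Lower/upper approximations of the set of rows labeled `label`."""
    target = {i for i, (_, d) in enumerate(rows) if d == label}
    lower, upper = set(), set()
    for block in partitions(rows, attrs):
        b = set(block)
        if b <= target:
            lower |= b   # every window in this class is certainly `label`
        if b & target:
            upper |= b   # some window in this class is possibly `label`
    return lower, upper

lower, upper = lower_upper(table, ["eeg", "emg", "scr"], "stressed")
print(sorted(lower))  # windows certainly "stressed"
print(sorted(upper))  # windows possibly "stressed"
```

Windows in the lower approximation can be classified with certainty; windows in the boundary region (upper minus lower) are ambiguous given the chosen attributes, which is where additional modalities or subjective data could help disambiguate.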
