ASME Press Select Proceedings

International Conference on Computer and Automation Engineering, 4th (ICCAE 2012)

By Jianhong Zhou
ISBN: 9780791859940
No. of Pages: 460
Publisher: ASME Press
Publication date: 2012

Nowadays, new forms of interaction are no longer limited to Graphical User Interfaces (GUIs), making Human-Computer Interaction (HCI) more natural. The development of humanoid robots for natural interaction is a challenging research topic; with a gesture-based humanoid, a system can be operated simply by gestures. Inter-human communication is very complex and offers a variety of interaction possibilities. Although speech is often seen as the primary channel of information, psychologists claim that 60% of the information is transferred non-verbally. Besides body pose and facial expressions, gestures such as pointing or hand waving are commonly used. In this paper, the gesture detection and control system of the humanoid robot CHITTI is presented using a predefined dialog situation. The whole information flow, from gesture detection to the reaction of the robot, is presented in detail.
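The chapter itself is only available as a PDF, so the following is only a rough, illustrative sketch of the kind of information flow the abstract describes (camera frame, gesture detection, predefined dialog, robot reaction). All class and function names here (GestureDetector, DialogManager, process_frame, and so on) are hypothetical stand-ins, not the actual API of the CHITTI system.

    # Minimal sketch of a gesture-to-reaction pipeline, loosely following the
    # flow described in the abstract. Names and thresholds are illustrative only.

    from dataclasses import dataclass
    from typing import Optional


    @dataclass
    class Gesture:
        name: str          # e.g. "pointing", "hand_wave"
        confidence: float  # detector confidence in [0, 1]


    class GestureDetector:
        """Stands in for the vision-based gesture recognition stage."""

        def detect(self, frame) -> Optional[Gesture]:
            # A real detector would run pose estimation / classification here.
            raise NotImplementedError


    class DialogManager:
        """Maps a recognized gesture to a robot reaction within a predefined dialog."""

        REACTIONS = {
            "hand_wave": "wave_back",
            "pointing": "look_at_target",
        }

        def react_to(self, gesture: Gesture) -> Optional[str]:
            if gesture.confidence < 0.6:  # ignore uncertain detections
                return None
            return self.REACTIONS.get(gesture.name)


    def process_frame(frame, detector: GestureDetector, dialog: DialogManager) -> Optional[str]:
        """One pass of the information flow: camera frame -> gesture -> robot reaction."""
        gesture = detector.detect(frame)
        if gesture is None:
            return None
        return dialog.react_to(gesture)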

Abstract
Key Words
Introduction
1. Marking for Gesture Control
2. Pie and Marking Menus
3. The Prototype
4. The Recognition Algorithms
5. UBI Hand
6. Integration
7. Conclusion
Acknowledgement
References