A key objective of gesture-based computer-aided design (CAD) interfaces is to enable humans to manipulate 3D models in virtual environments much as they would manipulate physical objects in real life. In this paper, we outline the development of a novel real-time, gesture-based conceptual CAD tool that enables intuitive hand-gesture interaction with a given design interface. Recognized hand gestures, together with hand-position information, are converted into commands for rotating, scaling, and translating 3D models. In the presented system, gestures are identified solely from the depth information obtained via an inexpensive depth-sensing camera (SoftKinetic DepthSense 311). Because gesture recognition is based entirely on depth images, the developed system is robust and insensitive to variations in lighting conditions, hand color, and background noise. The difference between the input hand shape and its nearest neighbor in the database is used as the criterion for recognizing gestures. Extensive experiments with a design interface are also presented to demonstrate the accuracy, robustness, and effectiveness of the system.
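The nearest-neighbor criterion described above can be illustrated with a minimal sketch. The function names, the flattened hand-shape descriptors, and the Euclidean distance metric are all assumptions for illustration, not the paper's actual feature representation:

```python
import numpy as np

def recognize_gesture(input_shape: np.ndarray,
                      templates: np.ndarray,
                      labels: list) -> str:
    """Return the label of the database template nearest to the input.

    input_shape: 1-D descriptor of the segmented hand from a depth image.
    templates:   (n_templates, d) array of stored hand-shape descriptors.
    labels:      gesture label for each template row.
    """
    # Difference between the input shape and each stored shape;
    # the smallest distance picks the recognized gesture.
    dists = np.linalg.norm(templates - input_shape, axis=1)
    return labels[int(np.argmin(dists))]

# Toy usage: three hypothetical 4-D descriptors standing in for
# depth-image hand-shape features, one per manipulation command.
db = np.array([[0.0, 0.0, 0.0, 0.0],
               [1.0, 1.0, 1.0, 1.0],
               [2.0, 2.0, 2.0, 2.0]])
names = ["rotate", "scale", "translate"]
print(recognize_gesture(np.array([0.9, 1.1, 1.0, 0.95]), db, names))  # → scale
```

In a real pipeline the descriptor would be extracted from the segmented depth image of the hand, but the matching step reduces to exactly this kind of minimum-distance search over the gesture database.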