This paper presents a natural yet low-cost way for humans to interact with electronic devices indoors: a human-following mobile robot capable of controlling other electrical devices for the user based on the user's gesture commands. The overall experimental setup consists of a skid-steered mobile robot, a Kinect sensor, a laptop, a wide-angle camera, and two lamps. The OpenNI middleware is used to process data from the Kinect sensor, and OpenCV is used to process data from the wide-angle camera. A new human-following algorithm based on human motion estimation is proposed. The human-following control system consists of two feedback control loops, one for linear motion and one for rotational motion. A lead-lag controller and a lead controller are developed for the linear and rotational motion control loops, respectively. Experimental results show that the tracking algorithm is robust and reduces the distance and angular errors by 40% and 50%, respectively. The system's response exhibits small delays (0.5 s for linear motion and 1.5 s for rotational motion) and steady-state errors (0.1 m for linear motion and 1.5° for rotational motion). These delays and errors are acceptable, however, since they do not drive the tracking distance or angle outside the desirable range (±0.05 m and ±10° of the reference input). Four gestures are designed for the user to control the robot: two mode-switching gestures, along with lamp-creation, lamp-selection, and color-change gestures. Gesture-recognition success rates exceed 90% within the detectable range of the Kinect sensor.
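The abstract describes a lead-lag compensator for the linear-motion loop and a lead compensator for the rotational loop, but gives no transfer functions. The sketch below shows one common way such controllers are realized in discrete time: each lead or lag section is a first-order compensator C(s) = K(s + z)/(s + p) discretized with the Tustin (bilinear) transform, and a lead-lag is two sections in cascade. All gains, zeros, poles, the 0.05 s sample period, and the function names (`linear_cmd`, `angular_cmd`) are illustrative assumptions, not values from the paper.

```python
from dataclasses import dataclass


@dataclass
class FirstOrderCompensator:
    """Discrete-time realization of C(s) = K (s + z) / (s + p).

    Discretized with the Tustin transform s -> (2/T)(1 - q^-1)/(1 + q^-1),
    which yields the difference equation implemented in step().
    All numeric values used below are placeholders, not from the paper.
    """
    K: float
    z: float          # compensator zero
    p: float          # compensator pole (z < p gives lead, z > p gives lag)
    T: float          # sample period [s]
    e_prev: float = 0.0
    u_prev: float = 0.0

    def step(self, e: float) -> float:
        # (2 + pT) u[k] = K[(2 + zT) e[k] + (zT - 2) e[k-1]] - (pT - 2) u[k-1]
        num = self.K * ((2 + self.z * self.T) * e
                        + (self.z * self.T - 2) * self.e_prev)
        u = (num - (self.p * self.T - 2) * self.u_prev) / (2 + self.p * self.T)
        self.e_prev, self.u_prev = e, u
        return u


T = 0.05  # assumed 20 Hz control rate

# Linear-motion loop: lead-lag = lead section cascaded with a lag section.
lead = FirstOrderCompensator(K=1.0, z=1.0, p=10.0, T=T)    # z < p -> lead
lag = FirstOrderCompensator(K=1.0, z=0.5, p=0.05, T=T)     # z > p -> lag

# Rotational-motion loop: a single lead section.
lead_rot = FirstOrderCompensator(K=2.0, z=2.0, p=20.0, T=T)


def linear_cmd(distance_error: float) -> float:
    """Forward-velocity command from the distance error (lead-lag)."""
    return lag.step(lead.step(distance_error))


def angular_cmd(angle_error: float) -> float:
    """Turn-rate command from the heading error (lead only)."""
    return lead_rot.step(angle_error)
```

At each control step, the robot would measure the distance and heading errors to the tracked person (from the Kinect skeleton data) and feed them to `linear_cmd` and `angular_cmd`; a steady-state gain of K·z/p per section is what lets the lag section shrink steady-state error while the lead section adds phase margin.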
