Abstract
End-effector tracking for a mobile manipulator is achieved through sensor fusion, implemented with a visual-inertial sensor suite and an Extended Kalman Filter. The suite consists of an Optitrack motion capture system and a Honeywell HG4930 MEMS IMU, for which a detailed analysis of the mathematical noise model is reported. The filter is constructed so that its complexity remains constant and independent of the visual algorithm, and additional sensors can be inserted to further improve the estimation accuracy. Real-time experiments have been performed with the 12-DOF KUKA VALERI robot, extracting the position and orientation of the end-effector and comparing the estimates with the raw sensor measurements. Along with the experimental results, issues related to calibration, working frequency, and physical mounting are discussed.