Motion data for instruments or body parts are often needed in medical applications such as image-guided interventions, surgical robotics, human–machine interface design, virtual reality (VR)- or augmented reality (AR)-based training, and motion tracking for patients with movement disorders.
Existing tracking tools used in our previous VR work [1,2] have shown room for improvement: electromagnetic tracking suffers from interference and recalibration overhead, optical/video tracking is challenged by occlusion, and mechanically attached encoders restrict object motion and workspace.
The inertial measurement unit (IMU) has become a popular device for orientation tracking. With an accelerometer, gyroscope, and magnetometer assembled on a single chip, low-cost IMUs can be found in most smartphones and in various wearable electronics, including VR/AR displays. When used for position tracking, however, an IMU rapidly accumulates error, because position must be obtained by double-integrating acceleration, so even a small sensor bias grows into a large positional drift over time. To tackle this problem, a few studies [3–5] used Kalman filters [6]...
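To make the drift problem concrete, the following is a minimal sketch (not any published method) of dead-reckoning under an assumed constant accelerometer bias; the bias value of 0.01 m/s² and the 100 Hz sampling rate are illustrative assumptions, not measured figures.

```python
# Minimal sketch: double-integrating a small, constant accelerometer
# bias (an illustrative assumption) yields position error that grows
# roughly quadratically with time.

def position_drift(bias, dt, steps):
    """Dead-reckon position error from a constant acceleration bias
    using simple Euler integration."""
    velocity = 0.0
    position = 0.0
    for _ in range(steps):
        velocity += bias * dt      # integrate acceleration -> velocity
        position += velocity * dt  # integrate velocity -> position
    return position

# Assumed 0.01 m/s^2 bias, sampled at 100 Hz (dt = 0.01 s):
drift_1s = position_drift(0.01, 0.01, 100)    # after 1 s: ~5 mm
drift_10s = position_drift(0.01, 0.01, 1000)  # after 10 s: ~0.5 m
print(drift_1s, drift_10s)
```

Under these assumed numbers, a few millimeters of drift after one second balloons to roughly half a meter after ten, which is why pure inertial dead reckoning is impractical for position tracking without a corrective filter.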