Abstract
Manufacturers currently struggle to assess the accuracy degradation of machines and robots, which limits their efficiency in high-precision applications. Current best practice in industry is to inspect the final products or to add redundancies (e.g., local calibration) during the process to determine a machine's accuracy and performance. These measures add complexity to the process and increase maintenance costs in applications such as high-precision robot operations (welding, robotic drilling/riveting, and composite material layup), in-process metrology, and machines in mobile applications. Faster, more precise control of position and orientation is required to remedy these complexities. A novel smart target was designed at the National Institute of Standards and Technology (NIST) to integrate with a vision system and acquire high-accuracy, real-time 6-D (six-dimensional: x, y, and z position; roll, pitch, and yaw orientation) information. This paper presents the development of the smart target and the image-processing algorithm that outputs the 6-D information. A use case with Universal Robots (UR3 and UR5) manipulators demonstrates the feasibility of using the smart target to perform robot accuracy assessment.