Abstract

While a Computer Numerical Control (CNC) machine automates most of the machining process, the pre- and post-processes are still performed manually and inefficiently by a human operator. In particular, failure to completely remove chips from the worktable can adversely affect machining, leading to incorrect clamping and cutting of the workpiece and inaccurate internal dimension measurements. To remove this debris, an operator blows high-pressure air or coolant at various angles and positions in a haphazard manner; however, the chips often disperse in unintended directions. Furthermore, blind spots such as corners or areas covered by other parts hinder complete inspection and cleaning. Optimizing the air-blowing direction according to the features of the chips and devising a vision system that inspects the inside of a CNC machine from diverse angles and locations are therefore essential for an autonomous and robust chip-removal algorithm. However, simulating diverse conditions in a physical CNC machine to optimize air-blowing directions consumes substantial resources and may damage the machine. To tackle this, this preliminary study developed a digital twin (DT) environment to train an autonomous chip-removal deep learning model for a collaborative robot (cobot). In a DT environment containing a transplanted virtual CNC machine, a chip-cluster localization deep learning model was trained on a dataset synthetically generated by scattering chip models across the virtual CNC model. To annotate the dataset, Gray Level Co-occurrence Matrix (GLCM) energy, a texture analysis measure, was employed, since it yields distinct values for regions with and without chip clusters. The YOLOv8 algorithm was used to build the deep learning model for localizing chip clusters. The model correctly predicted the coordinates of chip clusters in new cases in the DT environment, and the vector from the center of the worktable to the predicted location of a chip cluster was calculated to estimate the air-blowing direction. Next, coordinates were converted from image space to cyber space to visualize the estimated air-blowing direction. Lastly, a virtual reality (VR) interface was used to record the skills of human operators cleaning chips from the worktable.
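The abstract does not give implementation details, but the two core computations it names, GLCM energy for chip-cluster annotation and the center-to-cluster vector for estimating the air-blowing direction, can be sketched as below. This is a minimal illustration assuming scikit-image's GLCM utilities and the ultralytics YOLOv8 inference API; the weights file name, the 32-level gray quantization, and the single-distance, single-angle GLCM are illustrative assumptions, not choices stated in the paper.

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from ultralytics import YOLO

    def glcm_energy(patch, levels=32):
        """GLCM energy of a grayscale uint8 patch; per the paper, regions
        with and without chip clusters show different energy values."""
        q = (patch // (256 // levels)).astype(np.uint8)  # quantize gray levels
        glcm = graycomatrix(q, distances=[1], angles=[0], levels=levels,
                            symmetric=True, normed=True)
        return graycoprops(glcm, "energy")[0, 0]

    def blow_directions(image_path, table_center, weights="chip_cluster.pt"):
        """Unit vectors from the worktable center toward each detected chip
        cluster, mirroring the paper's air-blowing-direction estimate.
        ("chip_cluster.pt" is a hypothetical trained-weights file.)"""
        result = YOLO(weights)(image_path)[0]              # YOLOv8 inference
        centers = result.boxes.xywh[:, :2].cpu().numpy()   # (x, y) box centers
        v = centers - np.asarray(table_center, dtype=float)
        return v / np.linalg.norm(v, axis=1, keepdims=True)

In practice, glcm_energy would be evaluated on sliding-window patches of rendered DT images to label chip-cluster regions for training, and the resulting direction vectors would still need the image-to-cyber-space coordinate conversion the abstract describes before they can drive the cobot.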
