Tool status monitoring is a fundamental aspect of the evolution of production techniques. Because the quality of the cutting tool is directly related to the quality of the product, the state of the tool should be kept under control during machining operations. An attempt is made here to extract maximum information from images captured by a machine vision system and from Acoustic Emission (AE) signals acquired during turning of Inconel 718 nickel alloy. Inconel 718 is a high-strength, thermally resistant nickel-based superalloy. Because of its excellent mechanical properties, it has played an important role in recent years in the aerospace, petroleum, and nuclear energy industries. Due to the extreme toughness and work-hardening characteristics of the alloy, machining Inconel 718 is a problem of ever-increasing magnitude. Experiments were conducted for different combinations of cutting speed and feed. An image processing method, the blob analysis technique, was used to extract parameters, called features, representing the state of the cutting tool. Wear area and perimeter from the machine vision images, and AE RMS and AE count from the AE signals, were studied as features and found to be effective for tool condition monitoring. Once these features are extracted after preliminary processing of the images and AE signals, tool status, whether worn out or not worn out (serviceable), is decided on the basis of the extracted features. In this study, theoretical estimation using an Artificial Neural Network (ANN) is carried out for the machine vision parameters (wear area and perimeter) and the acoustic emission parameters (AE RMS and AE count). In estimating a vision parameter, i.e. wear area, the perimeter, machining time, AE RMS, and AE count are considered as independent variables, and vice versa, so that the approach performs well in multi-sensor situations. To identify the tool status from the measured signals, an Artificial Neural Network trained with a feed-forward back-propagation algorithm has been adopted.
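The paper itself gives no code; the feature extraction described above can be sketched minimally in numpy, assuming a binarized wear-region image for the blob analysis and a raw AE voltage trace for the AE features (the threshold value and 4-connectivity perimeter definition are illustrative assumptions, not the authors' exact procedure):

```python
import numpy as np

def blob_features(mask):
    """Blob analysis on a binary wear mask: wear area (pixel count) and
    perimeter (blob pixels with at least one 4-connected background neighbour)."""
    mask = np.asarray(mask, dtype=bool)
    area = int(mask.sum())
    padded = np.pad(mask, 1, constant_values=False)
    # A pixel is interior if all four 4-connected neighbours are blob pixels.
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    perimeter = int((mask & ~interior).sum())
    return area, perimeter

def ae_features(signal, threshold):
    """AE RMS and AE count (rising threshold crossings) of a raw AE signal."""
    signal = np.asarray(signal, dtype=float)
    rms = float(np.sqrt(np.mean(np.square(signal))))
    above = np.abs(signal) > threshold
    count = int(np.sum(~above[:-1] & above[1:]))  # rising crossings only
    return rms, count
```

For example, a 3x3 wear blob gives area 9 and perimeter 8 (every blob pixel except the centre touches the background).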
The input parameters used for estimation in this study were found to vary nonlinearly with the desired output. Training and estimation produced outputs close to the wear area observed with the machine vision approach and the AE RMS obtained from the acoustic emission approach. The artificial neural network estimates show better correlation at higher feed rates: under these conditions the vision and AE parameters take larger values, which may account for the improved correlation.
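The abstract does not state the network topology or training settings; the feed-forward back-propagation regression it describes can be sketched as a one-hidden-layer numpy network (hidden size, learning rate, and epoch count below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_ffbp(X, y, hidden=8, lr=0.5, epochs=5000):
    """Minimal one-hidden-layer feed-forward network trained by
    back-propagation (batch gradient descent on MSE), sigmoid hidden
    units and a linear output for regression. Returns a predictor."""
    n_in = X.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1));    b2 = np.zeros(1)
    for _ in range(epochs):
        h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))  # forward: sigmoid hidden layer
        out = h @ W2 + b2                         # linear output
        err = out - y                             # gradient of 0.5*MSE w.r.t. out
        # back-propagate the error through both layers
        gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * h * (1.0 - h)         # sigmoid derivative
        gW1 = X.T @ dh / len(X);  gb1 = dh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    return lambda Xn: (1.0 / (1.0 + np.exp(-(Xn @ W1 + b1)))) @ W2 + b2
```

In the study's setup, X would hold features such as perimeter, machining time, AE RMS, and AE count, with wear area as the target (and vice versa for the AE-parameter estimation).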
