High-temperature corrosion is a major operating problem in Waste-to-Energy (WTE) plants: it causes unscheduled shutdowns and accounts for a significant fraction of total operating cost. Because of the heterogeneous nature of municipal solid waste (MSW) fuel and the presence of aggressive elements such as sulfur and chlorine, WTE plants suffer higher corrosion rates than coal-fired power plants, even though the latter operate at higher temperatures. Reducing corrosion rates while maximizing heat recovery efficiency has long been a critical task for WTE operators. Past researchers have focused on high-temperature corrosion mechanisms and identified important factors that affect the corrosion rate [1–4], and many laboratory tests have sought to quantify the effects of these factors. However, most tests were performed under isothermal conditions, in which the flue gas and the metal surface were at the same temperature, and therefore did not capture the synergistic effect of the thermal gradient between the environment (flue gas) and the metal surface. This paper presents a corrosion resistance test using an apparatus that maintains a well-controlled thermal gradient between the environment and the surface of the metals tested. Two commercial substrates (steels SA213-T11 and NSSER-4) were tested in different corrosive environments. The post-test investigation consisted of mass loss measurement of the tested coupons, observation of cross-sectional morphology by scanning electron microscopy (SEM), and elemental analysis of corrosion products by energy dispersive spectrometry (EDS). The stainless steel NSSER-4 showed good corrosion resistance over the metal temperature range of 500 °C to 630 °C. The alloy steel SA213-T11 had acceptable corrosion resistance at metal temperatures up to 540 °C, but its performance decreased dramatically at higher temperatures.
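Coupon mass-loss measurements of the kind described above are conventionally converted to a corrosion rate using the standard ASTM G1-style relation, rate (mm/yr) = 8.76 × 10⁴ · W / (A · T · D), where W is mass loss in grams, A the exposed area in cm², T the exposure time in hours, and D the alloy density in g/cm³. The sketch below illustrates this conversion; the function name and the sample values are illustrative, not taken from the paper:

```python
def corrosion_rate_mm_per_year(mass_loss_g: float,
                               area_cm2: float,
                               exposure_hours: float,
                               density_g_cm3: float) -> float:
    """Convert coupon mass loss to a corrosion rate in mm/yr (ASTM G1-style).

    The constant 8.76e4 combines unit conversions: hours -> years (8760)
    and cm -> mm (10).
    """
    return (8.76e4 * mass_loss_g) / (area_cm2 * exposure_hours * density_g_cm3)


# Illustrative example: a 0.10 g loss on a 10 cm^2 low-alloy steel coupon
# (density ~7.85 g/cm^3) after 1000 h of exposure.
rate = corrosion_rate_mm_per_year(0.10, 10.0, 1000.0, 7.85)
print(f"Corrosion rate: {rate:.3f} mm/yr")
```

Comparing rates computed this way at different controlled metal temperatures is what reveals the kind of threshold behavior reported for SA213-T11 near 540 °C.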
