Abstract
Hypoeutectic white cast irons containing 25% Cr are used in highly demanding environments that require high resistance to erosive wear, such as the crushing and processing of minerals or the manufacture of cement. This high chromium content also favors corrosion resistance. A Design of Experiments (DoE) was applied to analyze the effects of modifying certain factors related to the heat treatments applied to these alloys. Among these factors, the influence of a prior softening treatment, intended to facilitate machining of these cast irons, and the influence of the factors governing the destabilization of austenite during both quenching and tempering were analyzed. The precipitated phases were identified by X-ray diffraction (XRD), and their weight percentages were determined by the Rietveld structural refinement method. Erosive wear resistance was evaluated using the ASTM G76 standard test method. It is concluded that the softening heat treatment, consisting of 2 h at 1000 °C followed by 24 h at 700 °C, does not soften the material further compared to its as-cast state. Furthermore, wear resistance is influenced not only by the eutectic carbides but also, significantly, by the matrix constituent. The tempering treatment is likewise shown to play a decisive role in wear resistance: a tempering temperature of 500 °C and a tempering time of 6 h increase both the wear resistance and the hardness of the matrix constituent, whereas a tempering temperature of 200 °C increases the retained austenite content and favors M3C carbides over mixed M7C3 and M23C6 carbides. The quench cooling medium is not found to have a significant influence on either hardness or wear resistance.
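As a minimal sketch of the kind of two-level factorial DoE analysis described above, the Python fragment below enumerates a full-factorial design over heat-treatment factors and estimates their main effects on a measured response such as the ASTM G76 erosion rate. The factor names, levels, and zero-valued placeholder responses are illustrative assumptions, not the authors' actual experimental matrix or data.

```python
from itertools import product
from statistics import mean

# Hypothetical two-level factors (low, high) inspired by the abstract;
# the real study's factor set and levels may differ.
factors = {
    "prior_softening":  ("as-cast", "2h@1000C + 24h@700C"),
    "quench_medium":    ("air", "oil"),
    "tempering_temp_C": (200, 500),
    "tempering_time_h": (2, 6),
}

# Full-factorial design: one run per combination of factor levels.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

def main_effect(runs, responses, factor):
    """Two-level main-effect estimate: mean response at the high level
    minus mean response at the low level of the given factor."""
    low, high = factors[factor]
    hi_mean = mean(r for run, r in zip(runs, responses) if run[factor] == high)
    lo_mean = mean(r for run, r in zip(runs, responses) if run[factor] == low)
    return hi_mean - lo_mean

# 'responses' would hold one measured value per run (e.g. erosion rate
# from ASTM G76 testing), in the same order as 'design'; placeholders here.
responses = [0.0] * len(design)

for name in factors:
    print(f"{name}: main effect = {main_effect(design, responses, name):+.3f}")
```

With measured erosion rates substituted for the placeholders, the sign and magnitude of each main effect indicate which heat-treatment factors shift wear performance most, which is the comparison the DoE in the abstract is built around.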
Subject
General Materials Science, Metals and Alloys