The effect of microstructure on the corrosion behavior of a 6% chromium (Cr) white iron containing 5.5 to 6% manganese (Mn) and 1% copper (Cu) was studied to evaluate its suitability as a corrosion-resistant material. Heat treatments involved holding at 900, 950, 1000, and 1050°C for 4, 6, 8, and 10 h, followed by oil quenching. Corrosion resistance was assessed by the weight loss method in a 5% NaCl solution under stagnant conditions at ambient temperature, for exposure periods of 168 and 720 h. Heat treating at 900 and 950°C increased the corrosion rate relative to the as-cast state (which exhibited multiple matrix microconstituents), even though the heat-treated matrix was predominantly austenitic. Increasing the soaking period at 900 and 950°C had little effect on the corrosion rate: a marginal increase followed by a decrease was observed despite a reduction in the level of massive carbides, pointing to an adverse role of the dispersed carbides. On heat treating at 1000°C, the expected improvement in corrosion resistance with soaking period was stalled by the unfavorable morphology of the newly forming M7C3 carbide. This effect persisted on heat treating at 1050°C up to a soaking period of 4 h. With longer soaking, the volume fraction of the second phase decreased, leading to improved corrosion resistance.
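The weight-loss assessment mentioned above reduces to a standard conversion from coupon mass loss to a corrosion rate (as in ASTM G1/G31). A minimal sketch follows; the function name and all input values are illustrative assumptions, not data from the study:

```python
def corrosion_rate_mm_per_year(mass_loss_g, area_cm2, hours, density_g_cm3):
    """Corrosion rate in mm/year from coupon mass loss (weight loss method).

    CR = K * W / (A * T * D), with K the unit-conversion constant
    for grams, cm^2, hours, and g/cm^3 giving mm/year.
    """
    K = 8.76e4  # unit-conversion constant for mm/year
    return K * mass_loss_g / (area_cm2 * hours * density_g_cm3)


# Hypothetical coupon: 0.05 g lost over a 168 h immersion,
# 10 cm^2 exposed area, density ~7.6 g/cm^3 (typical of a
# high-alloy white iron; assumed value, not from the study).
rate = corrosion_rate_mm_per_year(0.05, 10.0, 168.0, 7.6)
print(f"{rate:.3f} mm/y")  # ≈ 0.343 mm/y
```

Comparing such rates across the 168 h and 720 h exposures, and across heat-treatment conditions, is how the matrix and carbide effects described above would be quantified.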
