The corrosion of metallic alloys underneath fused salt films - hot corrosion - is an important degradation problem in the internal components of aero-engines and in land-based, marine, and industrial turbines. Chromium is a key alloying element in resisting hot corrosion because of its ability to form a protective, adherent oxide layer. The role of chromium in the hot corrosion resistance of binary nickel-chromium alloys exposed to molten sodium sulfate was studied by systematically varying the chromium content of the alloys. Alloy samples were coated with a thin film of molten salt and exposed for various times at 900°C in stagnant air. Wastage rates were determined by destructive metallographic analysis and compared with those obtained from samples fully immersed under identical experimental conditions. Microstructural analysis of the corrosion front obtained with both testing methods is reported.

The immersion test method failed to reproduce features reported as characteristic of Type I hot corrosion. The oxygen available in immersion tests appears to be insufficient to support the formation of nickel oxide scales.

Salt drip tests provided a better simulation of Type I hot corrosion for the alloys studied. However, the rate of attack did not follow the trend predicted by the current understanding. A more reproducible test approach is evidently needed before lifetime-prediction tools for alloy selection in turbine applications can be developed. Subsequent publications by this group will address the development of a more accurate and reproducible corrosion test methodology.
