Abstract
Tubes within bubbling fluidized bed combustors have in many instances suffered wastage. The wastage can be quite high at temperatures near 300 °C, but it typically shows an abrupt decrease at approximately 400 °C. Superheater tubes, which operate at higher temperatures, generally do not experience wastage. It is widely believed that this decrease in wastage with temperature is due to the development of a continuous oxide layer that protects the metal substrate by virtue of its hardness and resistance to spalling. In this study, the temperature effect is examined using a wear rig specially designed to simulate the impact conditions relevant to in-bed tubes. It was discovered that wastage for mild steel can decrease from a relatively high value to essentially zero within the temperature range of 400 to 430 °C. This decrease was attributable not to the presence of an oxide scale but to the development of a protective deposit layer consisting of agglomerated sub-micron bed material particles. The sub-micron dust is created through the normal attrition process and tends to form an adherent coating on the bulk bed particles. Deposition on the specimen occurs by transfer of agglomerated material from bulk particles during impact, and subsequent impacts compact the deposit into a continuous protective layer.