Evaluation of corrosion inhibitors for high temperature (HT) upstream oilfield applications can be challenging because of the fixed fluid volumes typical of laboratory testing. A series of laboratory testing methodologies was conducted to further elucidate the factors that affect corrosion inhibitor performance under high temperature conditions. Under certain HT conditions, apparent inhibitor performance may be skewed by artifacts of closed-cell testing, such as Fe2+ saturation and/or scaling of the test fluids, which can artificially lower the overall general corrosion rate. This testing program was designed to minimize these effects and to ensure that the corrosion inhibition observed in laboratory testing is attributable solely to the performance of the inhibitor. For these studies, corrosion measurements were performed by linear polarization resistance (LPR) in stirred autoclaves or by weight loss in rotating cage autoclaves (RCA). Surface morphology of corrosion products, scale deposition, and the extent of localized attack were evaluated microscopically. Factors affecting inhibited and uninhibited general corrosion rates in laboratory test environments were evaluated, including brine composition, inclusion of a scale inhibitor, metal surface area to fluid volume ratio, and method of acid gas charging.
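As background for the two measurement techniques named above, the standard conversions from raw measurements to a general corrosion rate can be sketched as follows: coupon mass loss via the ASTM G1 relation, and LPR polarization resistance via the Stern-Geary equation with the Faraday conversion of ASTM G102. The Tafel slopes, equivalent weight, density, and all example inputs below are illustrative assumptions for carbon steel, not values from the paper.

```python
# Illustrative corrosion-rate conversions (standard ASTM G1 / G102 forms).
# All numeric inputs are hypothetical example values, not from the paper.

def cr_from_weight_loss(w_g, area_cm2, hours, density_g_cm3):
    """Corrosion rate in mm/y from coupon mass loss (ASTM G1, K = 8.76e4)."""
    return 8.76e4 * w_g / (area_cm2 * hours * density_g_cm3)

def cr_from_lpr(rp_ohm_cm2, beta_a_v=0.12, beta_c_v=0.12,
                eq_weight=27.92, density_g_cm3=7.87):
    """Corrosion rate in mm/y from polarization resistance.

    Stern-Geary: i_corr = B / Rp, with B = ba*bc / (2.303*(ba + bc));
    Faraday conversion per ASTM G102 (carbon steel defaults assumed).
    """
    b = beta_a_v * beta_c_v / (2.303 * (beta_a_v + beta_c_v))  # V
    i_corr_ua = b / rp_ohm_cm2 * 1e6                           # uA/cm2
    return 3.27e-3 * i_corr_ua * eq_weight / density_g_cm3

# Example: 50 mg loss on a 10 cm2 coupon over a 7-day (168 h) RCA exposure
print(round(cr_from_weight_loss(0.050, 10.0, 168.0, 7.87), 3))  # mm/y
# Example: Rp = 500 ohm*cm2 measured by LPR in a stirred autoclave
print(round(cr_from_lpr(500.0), 3))                             # mm/y
```

The weight-loss form averages attack over the whole exposure, while the LPR form gives an instantaneous rate, which is why the paper pairs them with microscopic evaluation to catch localized attack that neither average captures.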