Oilfield corrosion inhibitors undergoing performance testing for field applications are often first assessed in the laboratory by means of step-wise, incremental dosing. In this assessment, the corrosion inhibitor is initially dosed at a low rate and the dose is then increased until a rate that mitigates corrosion to acceptable levels is reached. Weldment corrosion tests on a 1% nickel weld specimen were performed to investigate the advantages and drawbacks of this methodology for determining dose rates. Coupled Linear Polarisation Resistance (LPR) measurements, galvanic current measurements and microscopic analysis were performed on specimens exposed to step-wise, incremental dosing of corrosion inhibitor, and the results were compared with those from specimens dosed with a single aliquot at the required rate. The LPR measurements showed that step-wise, incremental dosing is useful for determining the inhibitor dose rate required to reduce general corrosion rates to an acceptable level. However, the weld testing showed that severe localised corrosion occurred in the first phase of testing, while the system was still in an 'under-dosed' state; this was not seen when dosing at higher rates. Consequently, sequential dosing can indicate localised corrosion problems that would not arise in the field, or in the laboratory, if the inhibitor were dosed effectively from the outset. Microscopic analysis confirmed these limitations of sequential dosing.
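
The sketch below is not from the paper; it is a minimal illustration of the step-wise dosing logic described above, assuming LPR readings are converted to a general corrosion rate via the Stern-Geary relation for carbon steel. The dose steps, Stern-Geary constant, specimen properties and acceptance threshold are assumed example values, and `read_rp` / `add_inhibitor` are hypothetical callbacks standing in for the test rig.

```python
# Illustrative sketch only: step-wise, incremental inhibitor dosing driven by
# LPR-derived corrosion rates. All numerical values below are assumptions.

def corrosion_rate_mm_per_year(rp_ohm_cm2, b_mv=26.0, eq_weight=27.92, density=7.87):
    """Convert polarisation resistance (ohm.cm^2) to a corrosion rate (mm/y)
    via the Stern-Geary relation i_corr = B / Rp and Faraday's law.
    B (mV), equivalent weight (g/eq) and density (g/cm^3) are typical
    carbon-steel values used here purely for illustration."""
    i_corr_ua_cm2 = 1000.0 * b_mv / rp_ohm_cm2          # corrosion current density, uA/cm^2
    return 3.27e-3 * i_corr_ua_cm2 * eq_weight / density # mm per year


def stepwise_dose(read_rp, add_inhibitor, dose_steps_ppm, target_mm_per_year=0.1):
    """Raise the inhibitor dose one step at a time until the LPR-derived
    general corrosion rate falls below the target; return the dose that
    achieved it (or None if no step was sufficient)."""
    rate = None
    for dose in dose_steps_ppm:
        add_inhibitor(dose)                  # bring residual concentration up to this step
        rate = corrosion_rate_mm_per_year(read_rp())
        if rate <= target_mm_per_year:
            return dose, rate
    return None, rate


# Example dose schedule (ppm), assumed for illustration:
# dose, rate = stepwise_dose(read_rp, add_inhibitor, [5, 10, 25, 50, 100])
```

Note that a loop of this kind only tracks general corrosion: as the abstract points out, the specimen may still suffer localised attack during the early, under-dosed steps, which is why the single-aliquot comparison and microscopic analysis matter.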
