Abstract
The development of corrosion prediction software systems over the years has concentrated on scalar analysis, that is, using fixed input values to produce a single output parameter (corrosion rate). This is consistent with the origins of the majority of prediction systems, which are based on the results of a series of laboratory tests using a standard changing-variable approach to develop an algorithm that derives a corrosion rate from the varying inputs.
However, it has also been recognised that in real-life situations input parameters are not fixed, and that any algorithm will have a degree of uncertainty (prediction error). In order to account for variable conditions, a series of sensitivity calculations is often carried out, covering expected changes in (for example) flow rates, operating temperatures & pressures, process chemistry variations, etc., with the resulting spread of predicted corrosion rates taken to define an expected operating range.
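As a rough illustration of this sensitivity-style workflow, the sketch below evaluates a corrosion rate correlation at low/base/high values of two inputs and reports the resulting spread. The simplified de Waard-Milliams CO2 correlation is used purely as a stand-in for whichever algorithm a given package actually implements, and the input ranges are invented for illustration.

import itertools
import math

def corrosion_rate_mm_per_yr(temp_c, pco2_bar):
    """Simplified de Waard-Milliams CO2 correlation, used here only as a
    stand-in for a real prediction algorithm (rate in mm/year)."""
    temp_k = temp_c + 273.15
    log_rate = 5.8 - 1710.0 / temp_k + 0.67 * math.log10(pco2_bar)
    return 10.0 ** log_rate

# Hypothetical low / base / high operating cases for two of the inputs.
temperatures_c = [50.0, 60.0, 70.0]
pco2_bars = [0.5, 1.0, 2.0]

rates = [corrosion_rate_mm_per_yr(t, p)
         for t, p in itertools.product(temperatures_c, pco2_bars)]

print(f"min {min(rates):.2f} mm/y, max {max(rates):.2f} mm/y")

Each additional varied input multiplies the number of cases, which is one reason the approach becomes time-consuming and the resulting range remains loosely defined.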
Modelling (prediction) uncertainty is normally handled by treating the actual expected corrosion rate as a distribution, with the predicted scalar value taken as an input to that distribution.
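One way of expressing that idea, assuming purely for illustration that the model error can be described by a lognormal factor about the predicted value, is sketched below; the predicted rate and error spread are invented numbers, not values from the paper.

import numpy as np

rng = np.random.default_rng(seed=1)

predicted_rate = 4.6   # scalar output of the algorithm, mm/y (illustrative)
model_error_sd = 0.2   # assumed std. dev. of log10(actual / predicted)

# Treat the actual rate as lognormally distributed about the prediction.
actual_rates = predicted_rate * 10.0 ** rng.normal(0.0, model_error_sd, 100_000)

p10, p50, p90 = np.percentile(actual_rates, [10, 50, 90])
print(f"P10 {p10:.2f}  P50 {p50:.2f}  P90 {p90:.2f} mm/y")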
Both approaches are valid but can be time-consuming. In particular, the resulting predicted corrosion rate range is not necessarily well defined (especially with respect to the minimum & maximum values). In order to provide a more efficient and more rigorous approach, one of the leading corrosion prediction software packages used in the oil & gas industry (for downhole, pipeline and pipework applications) has been expanded to include Monte Carlo analysis.
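The package's internal implementation is not reproduced here; the sketch below only indicates the general shape of a Monte Carlo corrosion calculation, again using the simplified correlation and assumed input distributions, in which input variability and model error are sampled together so that the output is a distribution rather than a single number.

import numpy as np

def corrosion_rate_mm_per_yr(temp_c, pco2_bar):
    # Same illustrative correlation as above (vectorised with numpy).
    temp_k = temp_c + 273.15
    return 10.0 ** (5.8 - 1710.0 / temp_k + 0.67 * np.log10(pco2_bar))

rng = np.random.default_rng(seed=2)
n = 100_000

# Assumed input distributions (normal temperature, lognormal CO2 partial pressure).
temp_c = rng.normal(60.0, 5.0, n)
pco2_bar = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=n)

# Assumed lognormal model-error factor applied to the algorithm output.
model_error = 10.0 ** rng.normal(0.0, 0.2, n)

rates = corrosion_rate_mm_per_yr(temp_c, pco2_bar) * model_error

p10, p50, p90 = np.percentile(rates, [10, 50, 90])
print(f"P10 {p10:.2f}  P50 {p50:.2f}  P90 {p90:.2f} mm/y")

Reporting percentiles of the sampled output is one way the minimum/maximum ambiguity of a sensitivity range can be replaced by a defined probability statement.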
This paper will describe the Monte Carlo approach adopted, consider the practical (engineering) implications of the distribution results and compare the output to more traditional sensitivity approaches.