Owners and restorers of old cars know that blistering paint on a door or fender is a tell-tale sign that rust has penetrated through the metal and a hole will soon form. Industrial maintenance teams also look for signs of corrosion, but often they aren’t visible until it’s too late.
There are various ways to infer how much metal has been lost from pipes and vessels, letting operators know when a failure may be approaching, but these provide only an educated guess. Just one technology today can quantify precisely how much metal has been lost and how much remains intact. The difference between inferring and measuring is the topic of my article at automation.com, Real-Time Corrosion and Erosion Monitoring Enhances Maintenance Planning.
The first traditional method used weight-loss coupons as a proxy:
This method entailed placing a small, pre-weighed specimen of material identical to the piping or vessel directly into the process stream. After a set period, typically several months or more, the coupon was retrieved, cleaned and re-weighed. The weight loss was then used to calculate an average corrosion rate over the exposure period. However, this determination was a lagging indicator, with no way to identify the impact of short-term process upsets.
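The coupon arithmetic is straightforward. A minimal sketch of the standard weight-loss calculation (the ASTM G1 form, with the unit-conversion constant for mm/year) is below; all the input values are illustrative, not taken from the article:

```python
# Average corrosion rate from a weight-loss coupon (ASTM G1 style formula).
# Inputs: weight loss in grams, exposed area in cm^2, exposure time in hours,
# and alloy density in g/cm^3. All example values below are hypothetical.

def coupon_corrosion_rate_mm_per_yr(weight_loss_g, area_cm2, hours, density_g_cm3):
    """Average corrosion rate in mm/year inferred from coupon weight loss."""
    K = 8.76e4  # unit-conversion constant giving mm/yr (per ASTM G1)
    return (K * weight_loss_g) / (area_cm2 * hours * density_g_cm3)

# Example: a carbon steel coupon (density ~7.86 g/cm^3) with 20 cm^2 exposed,
# losing 0.25 g over a 90-day (2160-hour) exposure.
rate = coupon_corrosion_rate_mm_per_yr(0.25, 20.0, 2160, 7.86)
print(f"{rate:.3f} mm/yr")  # roughly 0.064 mm/yr
```

Note that the result is only an average over the whole exposure window, which is exactly the limitation described above: a week-long process upset that doubles the corrosion rate is invisible in the final number.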
This was a very labor-intensive practice, and retrieving the coupon often required a shutdown. The second approach monitored conditions conducive to corrosion:
Permanently installed electrical resistance and linear polarization resistance probes provided more frequent data, but they were intrusive to operations. Inserting a probe into a process created a potential leak path, which is a major safety concern. These probes were also susceptible to flow damage, and they provided only a highly localized measurement that could be rendered inaccurate by fouling, potentially missing more severe corrosion occurring elsewhere in the system.
While the second approach could indicate when conditions were capable of accelerating corrosion, it still could not measure actual metal loss. Manual spot readings with portable ultrasonic thickness gauges could fill some gaps, but they offered little in the way of useful historical data. So what’s the answer?
Rosemount™ Wireless Corrosion and Erosion Transmitters continuously monitor assets by measuring wall thickness directly in real time. This quantifies metal loss from corrosion and erosion, sensing changes as small as 10 microns (0.0005 inch). These non-intrusive devices monitor the wall thickness of a pipe or vessel, providing reliable, accurate data to make more informed decisions and keep your operations running at their peak potential.
The transmitters can monitor wall thicknesses up to 100 mm (4 inches) at temperatures up to 600°C (1112°F). They’re quick and easy to install, can be relocated if needed, and deliver their data via a WirelessHART® network.
The move from traditional to modern corrosion monitoring techniques is part of a broader asset management shift throughout industry, favoring continuous, automated and repeatable data over sporadic, manual and variable measurements. These new measurement methodologies enable replacing reactive maintenance with predictive servicing, maintaining efficient production at lower operational expense.
By integrating these real-time data streams into digital ecosystems, facilities can also predict assets’ remaining useful life with greater accuracy by leveraging historical data and process trends. Driven by non-intrusive corrosion and erosion monitoring technology that provides previously unattainable insights, companies are maximizing reliability, operating more safely and more profitably in today’s hyper-competitive industrial markets.
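To make the remaining-useful-life idea concrete, here is a minimal sketch of one common approach: fit a straight line to a series of wall-thickness readings and extrapolate to the minimum allowable thickness. The readings, dates and threshold below are hypothetical, and production systems use far richer statistical models than a simple least-squares fit:

```python
# Sketch: remaining useful life (RUL) from a wall-thickness trend, assuming
# roughly linear thinning. All readings and thresholds are hypothetical.

def estimate_rul_days(days, thickness_mm, t_min_mm):
    """Fit thickness = a + b*day by least squares, extrapolate to t_min_mm.

    Returns days remaining after the last reading, or None if no thinning
    trend is present.
    """
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(thickness_mm) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, thickness_mm)) / \
        sum((x - mean_x) ** 2 for x in days)
    a = mean_y - b * mean_x
    if b >= 0:
        return None  # wall is not measurably thinning
    day_at_min = (t_min_mm - a) / b  # day the fitted line crosses t_min_mm
    return day_at_min - days[-1]

# Hypothetical transmitter readings: ~0.01 mm lost every 30 days on a 10 mm
# wall, with a 9.0 mm minimum allowable thickness.
days = [0, 30, 60, 90, 120]
thickness = [10.00, 9.99, 9.98, 9.97, 9.96]
rul = estimate_rul_days(days, thickness, 9.0)
print(f"~{rul:.0f} days until minimum allowable thickness")  # ~2880 days
```

This is exactly where continuous, high-resolution readings pay off: with only an annual manual spot check, a trend like this would take years to establish instead of months.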
For more information, visit Emerson’s Corrosion & Erosion Monitoring pages at Emerson.com. You can also connect and interact with other engineers at the Emerson Exchange 365 community.