What is My Pressure Transmitter’s Calibration Frequency?

by | Apr 2, 2014 | Measurement Instrumentation, Pressure | 0 comments

Emerson's Burcu Karakulak discusses ways to optimize calibration frequency for pressure transmitters. The abstract:

Save time and resources spent performing routine calibration on your pressure transmitters by calculating the expected calibration frequency. You might find that your routine maintenance has you calibrating too frequently or not frequently enough. This workshop will present a method to estimate the required calibration frequency.

Burcu opened with how pressure measurement technology is used to measure not only pressure but also flow and level. Its use crosses all the process industries. The definition of calibration is comparing an instrument's accuracy against a known standard and adjusting it to meet that standard.

Every transmitter needs calibration over time due to many factors including the age of the electronics, the service in which the device is installed, and the stability & accuracy of the transmitter.

It is important to know when not to calibrate a transmitter. Burcu pointed to a study where calibration actually made things worse 10% of the time. The calibration equipment must be calibrated and traceable to known standards such as NIST.

The goal is to be more cost efficient, time efficient, and resource efficient. Recalibration takes approximately two hours per device. Manufacturers typically provide minimum calibration intervals. Newer electronics technologies are more stable than older ones, increasing the overall stability of the transmitter. Rosemount 3051 calibration is stable for 5 years, Rosemount 3051S is stable for 10 years, and this stability is guaranteed. Specifications for 15-year stability are coming soon.

As part of the test practices, the devices are overpressure cycled, static line pressure cycled, and dynamic line pressure cycled. The dynamic pressure cycles reflect real world operations.

One pharmaceutical manufacturer with a history of calibration logs did a random inspection of calibrations over a three year history. In every instrument checked, no calibration was required–all were within the stability specifications.

Burcu noted that the application determines the calibration interval. Another factor is the performance required of the measurement. How critical is the accuracy for the process? Step 1 is to determine the required performance. Step 2 is to determine the operating conditions.

Step 3 is to calculate the total probable error using the root mean squared (RMS) method. Step 4 is to determine the stability of the output. Step 5 is to calculate the calibration frequency. The Rosemount team has a spreadsheet to calculate these calibration intervals for Rosemount transmitters. Contact the Emerson team managing the Rosemount products to get a copy of the spreadsheet.
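To make steps 3 and 5 concrete, here is a minimal sketch of the calculation. The RMS combination of error terms is standard practice; the specific error sources, percentage values, and the interval formula (accuracy margin divided by drift per year) are illustrative assumptions, not figures from the Rosemount spreadsheet.

```python
import math

def total_probable_error(*error_terms):
    """Step 3: combine independent error sources (each in % of span)
    with the root-mean-squared method."""
    return math.sqrt(sum(e ** 2 for e in error_terms))

def calibration_interval_years(required_performance, tpe, drift_per_year):
    """Step 5: one common way to frame the estimate -- years until the
    transmitter's drift could consume the margin between the required
    performance and the total probable error."""
    margin = required_performance - tpe
    if margin <= 0:
        raise ValueError("TPE already exceeds the required performance")
    return margin / drift_per_year

# Illustrative numbers, all in % of span (hypothetical, not from a data sheet):
# reference accuracy, ambient temperature effect, static pressure effect
tpe = total_probable_error(0.05, 0.08, 0.04)

interval = calibration_interval_years(
    required_performance=0.50,  # accuracy the process actually needs
    tpe=tpe,
    drift_per_year=0.10,        # assumed stability drift per year
)
```

With these example figures the TPE works out to roughly 0.10% of span, leaving a margin that supports a multi-year calibration interval rather than an annual one, which is the kind of result the workshop describes.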
