Recommended Verification Practice for
Rosemount 3144P HART Temperature Transmitters
And
Resistance Temperature Detectors
March 2010
Table of Contents
Section 1: Introduction and Temperature Measurement Overview
Section 2: Model 3144P Temperature Transmitter Overview
Section 3: Transmitter Verification Guidelines
Section 4: Transmitter Verification Frequency
Section 5: Temperature Sensor Overview
Section 6: Temperature Sensor Verification Guidelines and Procedure
Appendix A: Glossary of Terms
Appendix B: Sample Methodology To Determine Verification Frequency
Appendix C: Sensor Verification Background
Appendix D: Transmitter Troubleshooting
Appendix E: Notes to Transmitter and Sensor Verification
Section 1: Introduction and Temperature Measurement Overview
This guideline is intended to help users of Rosemount Model 3144P HART
Temperature Transmitters and Temperature Sensors develop specific verification
(and re-verification) procedures for their particular installations. This
guideline is not intended to address verification guidelines for Rosemount's
Fieldbus version of the 3144P family of Temperature transmitters.
This document provides guidelines and standards as recommended by
Rosemount. It is the end-user’s responsibility to develop their own individualized
procedures incorporating the policies, procedures and standards common to their
own company and industry sector for all instrumentation.
It is assumed that the transmitters and sensors used are correctly
specified, sized, and installed by the end-user. This document will focus solely
on verification practices.
The measurement of temperature is extremely common in today’s
industrial environment. Industries such as Oil & Gas, Chemicals,
Biopharmaceuticals, Power Generation, Metals & Mining, Automotive, HVAC,
Plastics and Aerospace all depend on the measurement and/or control of
temperature to some degree. Moreover, many modern industrial processes in
these industries are extremely sensitive to changes in temperature, which
directly affects process success, yields, and safety.
As a result, temperature measurement technology has also developed to
keep pace with demand. Modern electronics, software, materials, and
manufacturing processes have combined to provide a broad array of temperature
measurement solutions available to end-users of all types.
In modern industrial process industries (e.g. Chemical, Oil & Gas,
Biopharmaceuticals, etc..) the preferred choice of Temperature measurement
equipment is usually composed of a sensing unit, either an RTD (Resistance
Temperature Detector) or a thermocouple, and oftentimes a temperature
transmitter to read, interpret, and relay the temperature signal back to where it
can be utilized to full advantage. Additional equipment such as thermowells,
extension hardware, and indicating meters can also be specified as required.
In order to ascertain the effectiveness of any temperature measurement
systems, one must take into account the performance criteria for each relevant
component. In most cases, this will require analysis of both the temperature
sensor and temperature transmitter. To focus on one at the expense of the
other could result in substantial misinterpretation of a system's performance
or suitability.
Detailed information on Model 3144P HART Temperature transmitter
operation, installation and maintenance can be referenced in Rosemount
document 00809-0100-4021, “Model 3144P Smart Temperature Transmitter,
Product Manual”.
For additional questions, please contact Rosemount technical support
personnel at www.rosemount.com.
Section 2: Model 3144P Temperature Transmitter Overview
The scope of this guideline from the Temperature Transmitter perspective
includes the HART versions of the Model 3144P transmitter. The Model 3144P
is a single- and dual-sensor input transmitter. The dual-input capability allows
for the measurement of differential temperature or average temperature, or for
redundant temperature measurements.
These transmitters utilize a dual-compartment housing design that
enables high reliability and stability in harsh environments. This configuration
isolates the sensitive electronic components from environmental effects (such as
humidity) in the terminal compartment. Additionally, these transmitters are
capable of being matched to a calibrated sensor, thereby increasing
measurement system accuracy. In order to lessen the need for future transmitter
verification and limit operating costs, the Model 3144P transmitter carries
stability specifications of up to 5 years in duration.
In the Model 3144P, a microprocessor-based sensor board is used to
complete the analog-to-digital conversion from the temperature sensor (RTD or
thermocouple) to the transmitter. One hundred percent of production units are
fully temperature characterized to compensate for the temperature-dependency
of electronic components and increase accuracy and stability.
The sensor output is fed into the Output Board, which carries the transmitter-
specific data and completes the HART/4-20 mA output conversion. The Model
375 and 475 Universal HART communicators are used to communicate locally
with the Model 3144P.
3144P Functional Block Diagram (summary): Sensor 1 and Sensor 2 inputs, together with a
reference input and cold junction compensation, feed the analog-to-digital conversion
stage. The microprocessor then performs sensor linearization, reranging, damping,
diagnostics, engineering unit conversion, communications, temperature correction,
averaging, Hot Backup, and differential temperature functions. A galvanically isolated
digital-to-analog conversion stage produces the 4-20 mA signal, on which the digital
HART communications are superimposed.
Diagnostic ALERTS and ALARMS:
Model 3144P transmitters are designed with Diagnostics that initiate an ALERT or
ALARM when the appropriate condition exists.
ALERTS – cover diagnostics that are determined not to affect the device's ability to
output the correct signal and thus will not disrupt the 4-20 mA output. For example,
"Process Variable out of Range". To read an ALERT, one must use the indicating
meter, a 375/475 Communicator, or an Asset Management System that can read
HART information.
ALARMS – cover diagnostics that are determined to affect the device's ability to output
the correct signal. Detected alarms will drive the transmitter output high or low as
specified by the user. Alarms can be read on the indicating meter, a Model 375/475
Communicator, or any asset management system that can read HART information.
In the event of a transmitter failure or malfunction, the Model 3144P is designed with
various diagnostics. If the device detects any failure that affects the transmitter output,
the device will drive the Hardware Alarm (high or low as selected by the user). The
alarm output levels are < 3.75 mA low and > 21.75 mA high. An option for NAMUR
levels is also available (< 3.6 mA and > 22.75 mA). Alarms will also be indicated on the
transmitter meter and via the Model 375/475 communicator.
Output level diagram (summary): normal operating range 4 to 20 mA; process variable
out-of-range saturation at 3.9 mA and 20.8 mA; hardware alarm levels at < 3.75 mA (low)
and > 21.75 mA (high).
HART Field Communications Protocol
Communications Signal
Physical Layer
The HART protocol uses a communication physical layer called Bell 202
Frequency Shift Keying (FSK). A very small sinewave signal with an amplitude
of around 1.2 volts peak-to-peak is superimposed on the 4-20 mA analog
signal.
Since the average value of the HART digital signal is zero, the value of the
analog signal is basically unchanged. This allows for simultaneous analog and
digital communication on the same set of loop wires.
Ones and Zeros
The HART protocol, like most digital communications, uses a string of zeros (0)
and ones (1) to transfer a message between the host device and a slave
device. A frequency of 1200 Hertz (Hz) corresponds to a digital value of "one",
while a frequency of 2200 Hz corresponds to a digital value of "zero".
Frequency Shift Keying (based on Bell 202) diagram summary: the digital signal is a
+/- 0.5 mA modulation, where "1" = 1200 Hz and "0" = 2200 Hz.
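As a rough illustration of the FSK scheme described above, the following sketch (a
simplified model only, not a HART implementation; the bit pattern, sample rate, and
12 mA analog level are arbitrary assumptions) builds the +/- 0.5 mA, 1200/2200 Hz
signal riding on a steady analog current.

import numpy as np

def hart_fsk_waveform(bits, analog_ma=12.0, bit_rate=1200, sample_rate=48000):
    """Superimpose a +/-0.5 mA Bell 202-style FSK signal on a steady analog level.
    '1' bits are encoded at 1200 Hz, '0' bits at 2200 Hz."""
    samples_per_bit = sample_rate // bit_rate
    t = np.arange(samples_per_bit) / sample_rate
    pieces = []
    for bit in bits:
        freq = 1200 if bit == 1 else 2200
        pieces.append(0.5 * np.sin(2 * np.pi * freq * t))
    fsk = np.concatenate(pieces)
    # The digital signal averages to roughly zero, so the analog value is unchanged.
    return analog_ma + fsk

loop_current = hart_fsk_waveform([1, 0, 1, 1, 0])
print(round(loop_current.mean(), 3))   # stays near the 12.0 mA analog value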
Communications
A HART 4-20 mA loop is similar to a 4-20 mA analog loop. One of the most
important items is to ensure the correct amount of loop resistance is in place, to
generate the HART digital signal. A HART loop needs a minimum of 250 ohms
and a maximum of 1100 ohms of loop resistance. This loop resistance is required
to generate a digital signal with enough voltage and current to be read by both the
HART master (375/475) and the HART slave (3144P transmitter).
Loop diagram summary: the power supply, transmitter, and loop resistance form a series
circuit governed by Ohm's law (E = IR), with a minimum of 250 ohms of loop resistance
required.
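The loop-resistance requirement can be sanity-checked with simple Ohm's-law
arithmetic. The sketch below is a generic illustration, not a Rosemount tool; the
24 V supply and 250 ohm resistor are example assumptions, and the 12.0 V minimum
terminal voltage is taken from the troubleshooting table in Appendix D.

def check_hart_loop(supply_v, loop_resistance_ohm, min_terminal_v=12.0):
    """Check a HART loop for sufficient resistance (250-1100 ohms) and for
    enough voltage left at the transmitter terminals at alarm-level current."""
    resistance_ok = 250 <= loop_resistance_ohm <= 1100
    # Worst-case lift-off check at roughly 22 mA (high alarm current).
    terminal_v = supply_v - 0.022 * loop_resistance_ohm
    voltage_ok = terminal_v >= min_terminal_v
    return resistance_ok, voltage_ok, terminal_v

print(check_hart_loop(24.0, 250))   # (True, True, 18.5)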
Section 3: Transmitter Verification Guidelines
The Need For Verification
While it is true that most of a temperature point's drift over time can be
attributed to the temperature sensor, the temperature transmitter may also drift.
The electrical components that comprise a temperature transmitter are
designed and specified for their accuracy and stability, but it is a known fact that
all electrical components drift. Even though Rosemount-specified resistors,
capacitors, inductors and other components undergo rigorous testing, they all
experience some amount of performance drift over time due to temperature-related
stresses and other effects.
Individual component drift over time may be small, but the combined
effects of these drifts may eventually affect the performance of
various transmitter subsystems, and the transmitter as a whole.
Due to this drift, temperature transmitters will periodically be “calibrated”.
This is accomplished through the use of an INPUT Trim (or Sensor Trim) as
described next.
Transmitter Verification Overview – Transmitter Trimming
To calibrate a 3144P transmitter the user may use one or more “trim”
functions. All 3144P transmitters are “factory-characterized”, which means they
are shipped with an industry-standard sensor (resistance versus temperature in
the case of an RTD) curve in the transmitter’s memory. In actual operation, the
transmitter uses this stored information to continuously interpret the sensor input
and output the process variable in the appropriate engineering units. Verification
of a 3144P transmitter entails attaching the transmitter to known resistance (or
millivolt) sources at one or more points, and adjusting or “trimming” the
transmitter to the correct output reading. This adjustment will change the
transmitter’s interpretation of the sensor input across the entire temperature
range, resulting in a calibrated transmitter. Two types of Trimming functions are
available for transmitter verification. A Single-Point Trim, as its name implies,
entails calibrating the transmitter at a single temperature (i.e. resistance) point
and shifting the entire sensor curve appropriately. A Two-Point Trim entails
calibrating the transmitter at two different temperatures (i.e. resistances) and
then adjusting the sensor curve by both shifting and changing the slope. The
Two-Point Trim is recommended in most applications. See Appendix C for
additional information.
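To illustrate the difference between the two trim types, the following sketch applies
a "shift only" and a "shift plus slope" correction to hypothetical reference readings.
It is an illustration of the concept only, not the transmitter's internal trim
algorithm.

def single_point_trim(reading, reference):
    """Return a correction that shifts the whole curve by the error at one point."""
    offset = reference - reading
    return lambda t: t + offset

def two_point_trim(readings, references):
    """Return a correction that both shifts and re-slopes the curve from two points."""
    (r1, r2), (c1, c2) = readings, references
    slope = (c2 - c1) / (r2 - r1)
    return lambda t: c1 + slope * (t - r1)

# Hypothetical verification data: the device reads slightly off at both ends.
offset_only = single_point_trim(reading=0.2, reference=0.0)
shift_and_slope = two_point_trim(readings=(0.2, 99.4), references=(0.0, 100.0))
print(round(offset_only(50.0), 2))       # 49.8  (curve translated only)
print(round(shift_and_slope(50.0), 2))   # 50.2  (curve translated and rotated)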
Section 4: Transmitter Verification Frequency
In general, the verification frequency for a transmitter is very application
dependent; it is driven by the accuracy and relative importance of the measurement
as determined by the user. For temperature measurement points intended for
process control, critical systems (i.e. safety shutdown, etc.), or extremely high
value processes, more frequent verification checks, on the order of one per month
initially, may be required. This interval may be gradually increased to 3 or 6
months if the temperature transmitter readings are holding true. Please
reference Appendix B for a sample methodology for determining verification
frequency.
The 3144P transmitter has short-term and 5-year stability specifications as
follows.
1) +/- 0.1% of reading or 0.1°C, whichever is greater, for 24 months for RTDs
2) +/- 0.1% of reading or 0.1°C, whichever is greater, for 12 months for
thermocouples
3) +/- 0.25% of reading or 0.25°C, whichever is greater, for 5 years for RTDs
4) +/- 0.5% of reading or 0.5°C, whichever is greater, for 5 years for
thermocouples
This information is generated from accelerated life testing involving
temperature cycles and gives a very good indication of transmitter drift in actual
operation.
In general, the verification interval for a given application is
determined by two criteria:
1) Installed performance and stability of the transmitter
2) Accuracy requirements of the application.
By calculating the installed performance of the transmitter and comparing
this against the application performance requirements, the verification interval
can be approximated.
4.1 Calculating Transmitter Installed Performance
To calculate the performance of a transmitter installed in a specific application,
one must factor in all potential aspects that could have a detrimental impact.
These typically include:
Digital Accuracy (of electronics)
Digital/Analog accuracy (of D/A conversion)
Ambient Temperature Effects (changes in ambient temperature affecting
transmitter electronics).
The total performance of the transmitter will be a RSS (Root Sum Squares) of
these factors:
Total Performance = (Digital^2 + DA^2 + TE^2)^(1/2)
where DA is the Digital/Analog accuracy and TE is the Ambient Temperature
Effects accuracy. Full explanations and listings of these accuracies can be found
in the 3144/3244 Product Data Sheets (Rosemount document 00813-0100-4724
or 00813-0100-4021 for the 3144P).
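The RSS combination can be computed directly, as in the sketch below; the accuracy
values shown are placeholders, and the actual figures should be taken from the product
data sheet for the specific model and range.

def installed_performance(digital, da, te):
    """Root-sum-square combination of digital, D/A, and ambient temperature effects."""
    return (digital**2 + da**2 + te**2) ** 0.5

# Placeholder values in percent of reading; see the Product Data Sheet for actuals.
print(round(installed_performance(digital=0.10, da=0.02, te=0.05), 3))   # ~0.114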
4.2 Comparing Transmitter Performance and Application Requirements
When the Application requirements are known, and the transmitter total
performance and stability has been determined, one can set the required
verification interval.
For example, if the application requires +/- 0.5% of reading performance (for safety,
product quality, etc.) and the transmitter performance is +/- 0.10% plus stability of
+/- 0.10% per year, the maximum interval could theoretically be set at 3+ years.
Note: Verification intervals can be shortened for critical measurement points,
and lengthened for non-critical measurement points as determined by user
experience.
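A minimal sketch of the interval arithmetic used in the example above (the simple
linear-drift model and the numbers are taken from that example; it is an approximation,
not a guaranteed interval):

def max_verification_interval(required_pct, installed_pct, stability_pct_per_year):
    """Years until installed performance plus accumulated drift exceeds the requirement."""
    margin = required_pct - installed_pct
    return margin / stability_pct_per_year if margin > 0 else 0.0

# Example from the text: 0.5% requirement, 0.10% performance, 0.10%/year stability.
print(max_verification_interval(0.5, 0.10, 0.10))   # 4.0 -> consistent with "3+ years"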
4.3 General Recommended Calibration Practices for Temperature Transmitters
In lieu of end-user-determined, application-specific verification intervals, the
following general guidelines may be used as "manufacturer recommended
practices" for calibration of temperature transmitters. (Note: this may be used to
assist users in their compliance with EPA regulations governing Greenhouse Gas
measurement points.)
Type of Installation              Recommended Calibration Interval
"Benign" installations            Once every 3 years
"Severe" service installations    Once every year or end-user determined
                                  (whichever is less)
"Benign" installations are defined as measurement points where the transmitter is
not exposed to high levels of shock, vibration, or repeated ambient temperature
extremes.
Section 5: Temperature Sensor Overview
Overview
The two most common methods of industrial temperature measurement
today are with thermocouples and resistance temperature detectors (RTDs).
While the installed base of industrial temperature sensors is composed mostly of
thermocouples, mainly due to lower costs and wider temperature ranges, new sensor
installations are gradually migrating to RTD usage due to improved manufacturing
technologies (and therefore lower costs),
better accuracy/stability, and less susceptibility to electromagnetic interference.
To keep with the scope of this document, this section will be limited to a
discussion of basic RTD principles.
Operating Principles (RTDs)
RTD’s operate on the basic principle that the resistance of a metal
increases with temperature, i.e. thermoresistivity. This electrical resistance
property of any given type of metal is termed resistivity, and is denoted by “r”.
The overall resistance R, of a quantity of metal (a piece of wire for example) will
be proportional to its length L, and inversely proportional to its cross-sectional
area A, i.e.
R = (r X L) / A
Note that the resistivity "r" will change with temperature for any given
metal. A plot of a metal's Resistance versus Temperature will be a curve that is
"almost" linear in nature.
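As a quick numeric illustration of the R = (r X L) / A relationship (the wire
dimensions are arbitrary example values; the resistivity shown is an approximate
room-temperature value for platinum):

def wire_resistance(resistivity_ohm_m, length_m, area_m2):
    """R = (r x L) / A for a uniform conductor."""
    return resistivity_ohm_m * length_m / area_m2

# Platinum at roughly 20 C has a resistivity of about 1.06e-7 ohm-m.
print(round(wire_resistance(1.06e-7, length_m=1.0, area_m2=1e-9), 1))   # ~106 ohms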
Many types of RTD metals are in existence, including platinum, copper,
nickel, and others. Metals are usually chosen on the basis of many factors,
including: a) whether the temperature coefficient of resistance (i.e. the slope of
the Resistance vs. Temperature curve) is large enough to give resistance changes
that can be easily measured as temperature changes; b) whether the metal will
react with materials used during the sensor fabrication process (thereby
negatively affecting the measurement readings); and c) whether the metal can be
easily and economically fabricated into form factors that permit easy
manufacture and use.
In today’s industrial environment, platinum is the most commonly utilized
RTD metal. Platinum possesses the desirable properties of high accuracy, high
repeatability, and a relatively high resistance change per degree of temperature
change. Additionally, platinum RTD’s are fairly linear throughout their
temperature range. A common abbreviation found in industry for platinum RTD’s
is PRT for Platinum Resistance Thermometer.
In order to practically use RTD’s in industrial applications, temperature
resistance formulas for platinum have been devised. A common equation, the
Callendar-Van Dusen equation compensates for the small deviation from linearity
inherent in the platinum resistance versus temperature curve. The Callendar-Van
Dusen equation is:
Rt = R0 + R0·α[t − δ(0.01t − 1)(0.01t) − β(0.01t − 1)(0.01t)³]
where:
t = Temperature in °C
Rt = Resistance of the sensor at temperature t
R0 = Resistance of the sensor at 0°C
α, δ, β = Unique, sensor-specific constants
The most typical nominal value of alpha for industrial PRTs is 0.00385
although other values are also available. The Callendar-Van Dusen equation
can be used to easily convert from the RTD’s resistance value to Temperature
and vice versa.
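The equation can be evaluated directly. The sketch below uses nominal IEC-type
constants as example defaults; a calibrated sensor would use its own unique constants.

def cvd_resistance(t_c, r0=100.0, alpha=0.00385055, delta=1.4999, beta=0.10863):
    """Callendar-Van Dusen: Rt = R0 + R0*alpha*[t - delta*(0.01t - 1)(0.01t)
    - beta*(0.01t - 1)(0.01t)^3].  The beta term applies only below 0 C."""
    b = beta if t_c < 0 else 0.0
    x = 0.01 * t_c
    return r0 + r0 * alpha * (t_c - delta * (x - 1) * x - b * (x - 1) * x**3)

print(round(cvd_resistance(0.0), 3))     # 100.0 ohms at the ice point
print(round(cvd_resistance(100.0), 3))   # ~138.5 ohms for a nominal Pt100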
RTD Drift
As a general rule, all sensors, RTDs included, will drift with time. RTD drift
can be induced physically or chemically, but is always present. Most
manufacturers will provide some quantitative measure of expected RTD drift with
time, but these numbers should only be used as a rough guideline for users.
Most sensor drift specifications are under lab or “simulated” use conditions, and
will not be representative of actual installed conditions in real industrial
processes.
Physical effects on RTD stability include both mechanical and thermal
factors. Mechanical effects such as vibration (whether process or
environmentally-induced) and shock (due to improper handling) introduce
defects in the crystalline structure of an RTD’s underlying Platinum sensing
element. This in turn will affect the RTD’s resistance versus temperature
relationship.
Thermal effects are primarily caused by differing thermal coefficients of
expansion between a sensor’s Platinum sensing element and the material
(typically ceramic) used to house or support the platinum. As temperature rises
and falls, this imbalance (however minute) will introduce stresses in the platinum,
thereby affecting performance.
Chemical phenomena (i.e. contamination or poisoning) of an RTD’s
Platinum sensing element will also have a negative effect on a sensor’s
performance with time. This shift in resistance versus temperature performance
will accelerate with elevated temperatures, i.e. above 400 or 500°C.
RTD Verification
The general goal of RTD verification is to either experimentally determine
the actual, unique verification constants (α, δ, β) for each individual RTD or to
check the RTD’s resistance reading at one or more finely controlled
temperatures.
The former is accomplished by finding the actual resistance versus
temperature points for an RTD at 3 or 4 points. This information is then used to
mathematically find the constants by solving for 3 (or 4) unknowns. The latter is
accomplished by simply immersing the sensor into baths, ovens, or calibrators
operating at known (and controlled) temperatures, and measuring the RTD’s
resistance reading. This reading can be compared with the ideal or factory
reading.
Callendar-Van Dusen Constants and Sensor Accuracy
Standard RTD’s conform to industry-defined resistance versus
temperature curves (such as those prescribed by ASTM or IEC). In general,
RTD’s can be manufactured to conform to these industry-standard curves to
some level of accuracy and industrial temperature transmitters (including the
3144P family) have these industry standard curves embedded in memory.
Higher levels of accuracy may be accomplished through the use of the
Callendar-Van Dusen constants described above. These constants are unique
for each individual RTD, and once known, can accurately reproduce the RTD’s
unique Resistance versus Temperature relationship through the Callendar-Van
Dusen equation. Practical implementation of this entails the “loading” of an
RTD’s unique set of constants into a 3144P temperature transmitter, and
thereafter having the transmitter interpret resistance and temperature values
from these constants.
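In operation, the transmitter works in the other direction, converting a measured
resistance into a temperature using the stored constants. A simple way to illustrate
this (not the transmitter's internal method) is a numeric inversion of the curve,
restricted here to temperatures above 0°C for brevity.

def cvd_resistance(t_c, r0=100.0, alpha=0.00385055, delta=1.4999):
    """Callendar-Van Dusen curve for t >= 0 C (beta term omitted above the ice point)."""
    x = 0.01 * t_c
    return r0 + r0 * alpha * (t_c - delta * (x - 1) * x)

def temperature_from_resistance(r_ohm, lo=0.0, hi=850.0):
    """Invert the curve by bisection over the positive Pt100 range."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if cvd_resistance(mid) < r_ohm:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(round(temperature_from_resistance(138.506), 2))   # ~100.0 C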
Section 6: Temperature Sensor Verification Guidelines and
Procedure
Background
Sensor verification practices may vary depending on the level of accuracy
desired from the measurement, the criticality of the measurement, the equipment
used, etc. For the purposes of this report, RTD verification shall be segmented
by the criticality of the measurement. The following is a general guideline for
determining the level of verification required for a given temperature sensor
measurement, as well as some recommended procedures, notes on equipment,
and other guidelines. Actual practices will be dependent on individual user
needs and preferences, and corporate or industry best practices.
Tolerance Level    Accuracy
Low                +/- 1.0 to 2.0°C
Medium             +/- 0.5 to 1.0°C
High               +/- 0.5°C or better
*Note: Accuracy levels are recommended minimums.
Sensor Verification Practices and Measurement Criticality
Low-Level of Criticality
While it is assumed that all measurements need to have some level of
accuracy and reliability, there will be some applications where the failure or gross
inaccuracy of the measurement point does not present an immediate or severe
consequence. Points such as these can generally be classified as purely
monitoring applications, where it is beneficial to periodically see
the general status of the process in question.
RTD verification for these types of points should comprise a single point
verification check. The most common, but by no means exclusive, methodology
is that of an ice-point check as described next.
An RTD's ice point, also called R0, is the most fundamental
measurement point, providing the user an opportunity to glean much information
regarding the relative health of the sensor. The ice point is popular both because
it is relatively easy to achieve and maintain, and because it is included in
every major sensor/verification standard in the world.
As an RTD drifts due to mechanical or thermal forces, its Resistance
versus Temperature curve will shift or translate. Ice Point checks (over time) will
allow the user to determine the severity of this drift. For instance, if the ice point
for a given sensor remains within Class B (an IEC-specified tolerance level),
then the user has a reasonable certainty that the sensor performs to
Class B across its operating range. The ice point check is a basic indication that
the sensor is structurally and materially sound. A check of ice-point shift followed
by an appropriate trim to the entire sensor curve (i.e. translating the curve by the
amount of ice-point shift) is the best type of sensor re-verification for low-criticality applications.
Other temperatures may be used in place of an ice-point check provided
the verification equipment setup is adequate for the accuracy level required.
Medium Level Criticality
For a more critical process where accuracy is of a higher concern, it is
recommended that the user conduct periodic two point verification checks. One
of these points should be the ice point. The second point can be 100°C (i.e. the
boiling point of water) or some other easily reproducible point. A two point check
allows the determination of two pieces of information. First, the Ro or ice point
will be known. Secondly, the user can calculate the RTD’s Alpha value, which is
a general indication of the 1st order slope of the RTD’s Resistance versus
Temperature curve. This gives another level of indication of the quality of the
sensor. While mechanical or thermal degradation may be reflected in the ice
point drifting, a chemical/material contamination of an RTD’s platinum sensing
element is often accompanied by a shift in the RTD’s Alpha property. If a history
is kept and Alpha is degrading, this is an indication that the sensor is degrading.
NOTE: As a rule of thumb, typically as R0 drifts up, the Alpha property
decreases. This may be an indication that the purity or quality of platinum is
diminishing.
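A minimal sketch of the arithmetic behind a two-point check (the resistance readings
are hypothetical; Alpha is computed here as defined in the glossary, from the
resistances at 0°C and 100°C):

def two_point_check(r_at_0c, r_at_100c):
    """Derive R0 (ice point) and Alpha from 0 C and 100 C verification readings."""
    alpha = (r_at_100c - r_at_0c) / (100.0 * r_at_0c)
    return r_at_0c, alpha

# Hypothetical readings from a Pt100 under test.
r0, alpha = two_point_check(100.04, 138.49)
print(r0, round(alpha, 6))   # compare against history: R0 drifting up, Alpha drifting down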
High Level Criticality
For the most important processes, where measurement accuracy and
stability is of the utmost concern, it is recommended that a “full” verification or
recharacterization of the RTD be performed. Processes where this may be
relevant are various applications involving very fine process control where yield
may be negatively impacted if temperature is not tightly regulated.
A full span verification or characterization is basically a redetermination of
an RTD’s unique Resistance versus Temperature relationship. This is typically
accomplished by comparison verification of the RTD in question at either 3 or 4
verification points (4 is required if the verification temperature range bridges 0°C).
As a result, the most accurate picture of the RTD’s health and performance will
be accomplished.
Comparison of Verification Results (How to Handle Verification Results)
For non-calibrated sensors (i.e. sensors without uniquely known constants
such as IEC Class A or Class B sensors) the user should compare the
verification results with the resistance versus temperature information found in
industry standard specifications (e.g. IEC-751). These standards publish the
nominal resistance versus temperature information for a “standard” industrial
RTD, as well as the tolerance information around this nominal.
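For a standard (non-calibrated) Pt100, the comparison can be made against the
published tolerance bands. The sketch below uses the commonly quoted IEC 60751
Class A and Class B temperature tolerances as an illustration; the standard itself
remains the authoritative reference.

def iec_tolerance_c(t_c, grade="B"):
    """IEC 60751 temperature tolerance in degrees C for Class A or Class B elements."""
    if grade == "A":
        return 0.15 + 0.002 * abs(t_c)
    return 0.30 + 0.005 * abs(t_c)

def within_tolerance(measured_t_c, reference_t_c, grade="B"):
    """Compare a verification reading against the reference temperature."""
    return abs(measured_t_c - reference_t_c) <= iec_tolerance_c(reference_t_c, grade)

print(within_tolerance(100.4, 100.0, grade="B"))   # True: Class B allows +/-0.8 C at 100 C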
For sensors originally “calibrated” at the factory (i.e. those requiring very
high accuracy), the new verification check data may be compared with the actual
accuracy desired for the specific application or the original factory-provided
information.
For High Level Criticality applications, the sensor should be fully reverified or re-characterized, and the new Resistance versus Temperature data
should be used moving forward. For Low and Medium-Level Criticality
applications, the sensors are simply reused with the standard factory assumed
Resistance versus Temperature data. Significant deviation of the temperature
instrumentation from the initial or theoretical values (as defined above) should
flag the user to a potential failure or replacement condition. As a minimum step,
the sensor should be replaced with a new unit. Gross deviations in readings
should warrant an investigation of both the process and the instrumentation for
possible fault conditions.
Sensor Verification Equipment and Procedure Overview
For the most part, the same basic procedures and equipment are
employed regardless of the complexity of the verification scheme. Typically, the
verification media will consist of either a fluid bath or a dry-block calibrator. If a
free-standing fluid bath is utilized, then some type of accurate, standard
thermometer will be required to monitor the actual verification bath temperature.
Lastly, a Digital Multimeter (DMM) will be required to determine the resistance
(temperature) of the sensor to be validated.
As a rule of thumb, the uncertainty of the equipment utilized in the verification
setup should be at least 4 times smaller than the tolerance of the measurement
being verified.
Sensor Verification Equipment Guidelines
DMM: A wide range and variation (in capability and cost) are available on
the market. Ideally, a DMM accurate to at least 4 decimal places is
recommended. The user should match the DMM’s capability to the level of
verification to be completed. Entry-level DMMs may not have the EMF
compensation required for high accuracy measurements. If long wires or
multiple wire junctions are utilized, then EMF compensation may be a
requirement. Otherwise, for lower-criticality measurements, such as just an ice-
point check, an entry-level device may be adequate. Manufacturers such as
Keithley, Hewlett Packard, and others offer a broad range of devices. Examples
include (but are not limited to):
HP3478A (4-decimal, entry to mid-level)
Keithley 2001 (mid-level)
Keithley 2002 (mid-to-high level)
HP3458A (5 or 6 decimal, high-level).
Fluid Baths / Dry-Block Calibrator: Again, a wide range of fluid baths and
dry-block calibrators are available. The level of accuracy of the fluid bath or
dry-block calibrator will be determined by the verification to be performed.
Typically, a good fluid bath will provide an uncertainty of approximately 0.01°C;
dry-block calibrators are typically 2x or 3x this level. In general,
fluid baths are harder to maintain (special facilities, fluid storage and refilling,
special training, etc.) and more costly, but can be more accurate. Fluid baths also
require a separate Standard thermometer. Dry-block calibrators have internal
thermometers, eliminating the need for a separate Standard. Regardless, both a
fluid bath's Standard thermometer and the dry-block calibrator will need to be
periodically re-calibrated. For field verification, the only practical alternatives are
typically portable Dry-Block calibrators. NOTE: Some calibrators will provide a
separate specification for Uniformity and/or other contributors to uncertainty. The
user should take all effects into account when determining the overall uncertainty
of the calibrator.
Standards: Temperature Standards are available at varying levels of
uncertainty depending on requirements. Typically, Standards provide up to a 0.01°C
level of uncertainty and are traceable back to NIST or some other nationally or
internationally recognized agency. Certified digital thermometers are also
suitable for sensor verification provided that the accuracy is suitable, and the
certification is current.
Sensor Verification Procedure Overview
Regardless of the level of verification to be performed, the procedure
utilized will be a comparison verification. In other words, the fluid bath’s
Standard or the Dry Block calibrator’s thermometer will define temperature and
the user collects the resistance at that temperature. At the highest level of
verification (i.e. full characterization of the resistance versus temperature
relationship) the temperature and resistance values will be utilized to calculate
the RTD’s unique Callendar-Van Dusen equation. Otherwise the resistance and
temperature data will be compared with past readings.
Sensor Verification Procedure: General Steps
Step 1: Set up fluid bath or dry block calibrator per manufacturer’s
instructions.
Step 2: Set up DMM per manufacturer’s instructions. Enable EMF
compensation if available. NOTE: Most DMM manufacturers do NOT set EMF
compensation to “On” for a default setting.
Step 3: For fluid bath, clean sensor before insertion into bath. This will
minimize contamination of bath fluid.
Step 4: Insert sensor(s) into bath or calibrator. If required, use an
equilibrium block to lessen effects of bath or calibrator non-uniformities.
Step 5: Connect sensors to DMM. Recheck for good electrical
connections. It is preferable to utilize a 4-wire measurement (or 3-wire with dual
element sensors). This will allow for leadwire compensation.
Step 6: Set fluid bath or dry block calibrator to desired temperature.
Step 7: Monitor the output of the bath or calibrator and allow time for bath
or calibrator to stabilize at temperature. Typically, once the medium has reached
some level of stability, there will be some perturbation around the setpoint. A
dry block will typically cycle on and off at the achieved temperature.
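One simple way to judge stability is to watch the spread of the most recent readings,
as in the sketch below (the window size and threshold are assumptions to be set to suit
the equipment and the accuracy level required):

from statistics import pstdev

def is_stable(readings, window=10, max_stdev=0.01):
    """Consider the medium stable when the last `window` readings vary by less
    than `max_stdev` (degrees or ohms, matching the units of the readings)."""
    if len(readings) < window:
        return False
    return pstdev(readings[-window:]) <= max_stdev

print(is_stable([100.03, 100.02, 100.03, 100.02, 100.03,
                 100.02, 100.03, 100.03, 100.02, 100.03]))   # True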
NOTE 1: At higher temperatures (above 400°C), if the resistance value of the
target sensor appears to continuously decrease after the bath or calibrator is
stable, the user may want to verify that the sensor's Insulation Resistance (IR)
value is not compromised (versus the RTD manufacturer's specifications). IR will
always degrade somewhat with temperature, but sensors that are wet or structurally
compromised will be excessively and negatively affected. In these instances,
the addition of heat results in moisture redistribution. This moisture reduces the
sensor's effective resistance as current starts to shunt. (An IR of
> 10 megohms is considered satisfactory.)
NOTE 2: For FIELD VERIFICATION, the general steps and guidelines outlined
above are valid with the exception of the use of a portable dry block calibrator
and a portable DMM. The user should pay special attention to the accuracy
levels associated with this equipment in order to achieve a good verification. The
user should also be vigilant of extremes in environmental conditions that could
affect the verification process.
NOTE 3: Many industrial Temperature Verification practices involve 3, 5 or
another number of point checks around a setpoint or across a temperature
range. Many of these practices may be offshoots of verification practices for
capacitive pressure devices. In general, the platinum elements found in
industrial RTDs are usually more predictable (when operating normally, and
when failing) and fairly linear. In most instances, the user will receive satisfactory
results for Temperature RTD’s without doing multiple point verifications,
especially around a set point. If two or three points are “behaving” correctly, then
temperatures in between these points are in all likelihood also “good” (although
RTD readings can deviate at higher temperatures, i.e. > 600°C).
SYSTEM VERIFICATIONS (TRANSMITTER AND SENSOR COMBINATION)
Many installations utilize “System Verifications” or the combined
verification of a temperature transmitter and sensor assembly. This type of a
verification is based on the sound principle that the transmitter/sensor
measurement is a temperature SYSTEM measurement, and therefore should be
treated as such. This methodology may also be more readily accomplished with
less time and fewer resources than individual RTD and Transmitter verifications.
Note: Investigations into deviations in measurement output of a system should
include determination of the specific cause of the inaccuracy (i.e. element,
transmitter, wiring, terminations, etc).
Two general methodologies exist for system verifications. In the first, the
transmitter effectively takes the place of the DMM in a standard RTD verification
as described previously in this section. In this case, the DMM will still be used,
but to measure the transmitter's 4-20 mA output.
The second methodology involves the Trimming functions described in
Section 3. In this instance, the sensor and verification bath effectively take the
place of the decade box as the resistance source. The transmitter is then
“trimmed” per the procedure in Section 3.
Appendix A: Glossary of Terms
Alpha: An RTD constant. Calculated from the RTD’s resistance at 0°C and
100°C.
Analog Output Trim: Analog Output trim is the process used to adjust the analog
output (transmitter’s current output at the 4 and 20 mA points) to match the plant
standard or the control loop.
ASTM: American Society for Testing and Materials. A consortium based in the
United States that, among other things, defines guidelines for industrial RTD’s
(ASTM E644 and ASTM E1137).
Beta: An RTD constant.
Callendar-Van Dusen: 1) The set of constants (Alpha, Delta, Beta and ice point)
that fully characterize an RTD’s resistance versus temperature relationship. 2)
The equation used to characterize an RTD’s resistance versus temperature
relationship.
Rt = R0 + R0·α[t − δ(0.01t − 1)(0.01t) − β(0.01t − 1)(0.01t)³]
Configuration: Process of setting parameters that determine how the transmitter
functions.
Delta: An RTD constant.
IEC: International Electrotechnical Commission. An international consortium
that, among other things, defines guidelines for industrial RTD’s (IEC-751).
Class A and Class B: RTD tolerance grades as defined by IEC-751.
Grade A and Grade B: RTD tolerance grades as defined by ASTM.
Primary Standards: Laboratory grade temperature sensors that are calibrated at
nationally or internationally recognized calibration agencies. These sensors have
the highest accuracies available.
R0 or Ice Point: An RTD’s resistance reading at 0°C.
Secondary Standards: Laboratory grade temperature sensors that are calibrated
and traceable to Primary Standards.
Sensor Trim: Sensor trim is the process used to adjust the position of the factory
characterization curve to optimize the transmitter performance over a specified
temperature range or to adjust for mounting effects.
Verification: A general term for the checking of a temperature sensor or
transmitter performance in the field.
Appendix B: Sample Methodology To Determine Verification
Frequency
In order to maintain and verify the accuracy of any measurement system,
it is necessary to conduct regular verification checks and to re-calibrate (or
replace) a measurement system component if it is deemed to be outside of
acceptable limits.
Values for instrument stability (or lack thereof) are commonly provided for
instruments of all types. While it is possible to calculate a recommended
verification interval from these specifications (i.e. by factoring in how much drift is
acceptable for a given application), the results may be misleading. Most
instrument stability information is based upon either laboratory conditions (e.g. an
RTD soaking in a temperature oven) or some arbitrary environmental conditions
(i.e. nominal vibration and elevated temperature) meant to simulate a “typical”
installation. Regardless, the most accurate determination of a verification interval
would factor in actual, installed conditions. This is due to the fact that differing
levels of process-induced shock and vibration, differing ambient temperature
levels, shock due to use and handling, etc. will impact the performance of
measurement instruments, and hence their associated verification intervals.
Sensors in particular may have vastly different drift rates, and therefore
verification intervals depending upon installation conditions.
Simply "checking" a measurement instrument's accuracy at an arbitrary
interval (i.e. every week, month, 6 months, etc.) may result in a system that
operates out of specification for a period of time before the verification
check that reveals the problem, or it may lead to additional costs due
to needless (or too frequent) verifications of subsystems that may not drift
appreciably in the time interval used.
Both of these eventualities may be avoided if an analysis is completed
that results in a verification period shorter than the earliest point at which a
system or component could drift out of acceptable specifications. This maximum
period of time for verification can be part of a preventative maintenance program.
The following general method is a common way to determine verification
intervals.
Step 1: For a given instrument (an RTD for example) utilized in one “grouping” of
installations (e.g. all pipe flow temperature measurements of one velocity and
temperature) perform verification checks of as many systems as possible for a
given period of time (e.g. monthly). For an installation grouping where the
differences between actual and expected readings are relatively constant (i.e.
small standard deviation of differences) a relatively small number of
measurements (e.g. 3 to 5) is required. For installations with larger variations
from reading to reading (i.e. larger standard deviations of differences) a larger
number of readings (>5) would be desirable.
Step 2: Calculate the mean and standard deviation of all differences from the
expected readings for the given grouping. Plot the standard deviations
against the given verification interval as in Figure 1. Repeat Step 1 for
each grouping at the time interval chosen to complete Figure 1.
Figure 1: Standard deviations of the differences between actual and expected
instrument readings (error in degrees or ohms), plotted against the verification
period in months.
Step 3: Generate a graph similar to Figure 2 where the plot of one, two, and
three standard deviations is presented, as well as a line with the maximum
acceptable error for the given installation.
Figure 2: Curves of one, two, and three standard deviations of error (degrees or
ohms) versus verification period (months), plotted together with a line representing
the maximum acceptable error.
Assuming a normal distribution of the errors, the intersection of the 2
standard deviation curve with the line of maximum error would provide the
recommended verification period for a 95% certainty that the system will remain
within specification between verification periods (in this instance, 9 months). The
intersection of the 3 standard deviation curve with the line of maximum error
would provide a recommended verification period with a 99% certainty that the
system will remain within specification between verification periods (in this
instance, 4 months).
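A minimal sketch of Steps 1 through 3 (the drift data is hypothetical; the function
simply finds the longest interval at which the chosen number of standard deviations
of the observed differences stays within the maximum acceptable error):

from statistics import pstdev

def verification_interval(differences_by_month, max_error, sigmas=2):
    """differences_by_month: {interval_months: [actual - expected readings]}.
    Returns the longest interval whose sigmas * standard deviation stays
    within the maximum acceptable error."""
    acceptable = 0
    for months in sorted(differences_by_month):
        spread = sigmas * pstdev(differences_by_month[months])
        if spread <= max_error:
            acceptable = months
        else:
            break
    return acceptable

# Hypothetical grouping: the spread of drift grows with time between checks.
data = {3: [0.05, -0.04, 0.06], 6: [0.12, -0.10, 0.15], 9: [0.20, -0.25, 0.22]}
print(verification_interval(data, max_error=0.5, sigmas=2))   # 9 months (~95% certainty)
print(verification_interval(data, max_error=0.5, sigmas=3))   # 6 months (~99% certainty)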
Note that if there is no clear trend in the standard deviation graphs, then
there may be an intrinsic problem with the system in question or with the
procedure utilized. This can range from measurement error during the
experiment, to extreme system fluctuations, to a process fluctuating outside of
normal conditions.
The use of this methodology will have to be balanced by the criticality of
the system or reading provided by the instrument in question. If many different
processes (with many different process conditions) exist in an installation, it may
be cost prohibitive to perform the above methodology.
For instance, if non-critical or benign processes exist, these may be checked
every 6 to 12 months. If very critical or very severe processes (high flowrates,
severe process shutdown/startup conditions, shocks due to process surges, etc.)
are present, then the above-mentioned methodology is highly recommended for those
particular points. The cost of performing the method for these applications is
outweighed by the benefits of knowing when the system/subsystem is predicted
to go “out-of-spec”. Another variation is to utilize different intersection points
depending on process criticality, i.e. 2 sigma line for less critical systems, and the
3 sigma line for more critical systems, etc.
It is also relevant to mention that once this methodology is completed, it
does not need to be duplicated for a given installation unless a relevant variable
changes (i.e. flow rate, temperature, vibration level, etc..). A library of
experimentally determined verification intervals may be generated once and
reused as required throughout the plant (or in other plants as applicable). Please
note that a change in sensor design or vendor source may require a new
experimentally-determined verification interval. This necessity is dependent on
how different the sensor designs are from each other. This overall methodology
is very similar to other industrial (and governmental) methods used as part of
Preventative Maintenance (PM) programs.
Appendix C: Sensor Verification Background
Sensor Trim: Verification Overview
Sensor trims may be accomplished with either a single point trim or a 2
point trim. A single point trim is fairly typical and in most cases adequate, but 2
point trims can provide better ultimate accuracy for critical measurement points.
1 or 2 Point Trim
- A one point trim is sufficient if the transmitter and sensor will be
operating for the majority of its uptime around a singular setpoint or a very small
temperature range.
- A one point trim shifts the transmitter’s resistance versus temperature
curve up or down based on the actual reading of the singular point. This is
equivalent to a vertical translation of the sensor curve. The shape and general
slope of the sensor curve is not changed.
- Note that the one point trim is less suitable if the accuracy across wide
temperature ranges is important. Certain skewness may be introduced at the
temperature extremes which could negatively impact measurement readings.
- A two point trim is appropriate when the accuracy of a temperature range
is important (e.g. 50-150°C). This type of a transmitter trim accepts the actual
reading at two different points and appropriately adjusts the sensor curve to
compensate. The resultant effect is the translation AND rotation of the sensor
curve to account for drift.
1) The best method of calibrating a Temperature measurement point is to verify the
instruments as a complete system (i.e. Transmitter + Sensor). This type of
verification setup ensures that the entire system is characterized as it will be
used in practice. A possible methodology is to take the Temperature assembly
and immerse the sensor (sans thermowell or protection tube) into a well-maintained
verification bath or a suitable dry block verification cell. The user
should then verify the 4-20 mA points and trim the upper and lower points as
required.
2) An alternate methodology is to calibrate the Temperature Transmitter as an
individual unit. This method is also valid, and may be a more practical alternative
to system verification. In this instance, the user applies a well-characterized
input (such as a decade box or other resistance source) and again, trims the
upper and lower points of the transmitter.
Appendix D: Transmitter Troubleshooting
SYMPTOM: Transmitter Does Not Communicate with HART Communicator
POTENTIAL SOURCE: Loop Wiring

SYMPTOM: High Output
POTENTIAL SOURCE: Sensor Input Failure or Connection
CORRECTIVE ACTION:
- Connect a HART communicator and enter the transmitter test mode to isolate a
sensor failure.
- Check for a sensor open circuit.
- Check the process variable to see if it is out of range.
POTENTIAL SOURCE: Loop Wiring
CORRECTIVE ACTION:
- Check for dirty or defective terminals, interconnecting pins, or receptacles.
POTENTIAL SOURCE: Power Supply
CORRECTIVE ACTION:
- Check the output voltage of the power supply at the transmitter terminals. It
should be 12.0 to 42.4 VDC (over the entire 3.90 to 20.5 mA operating range).
POTENTIAL SOURCE: Electronics Module
CORRECTIVE ACTION:
- Connect a HART communicator and enter the transmitter test mode to isolate
module failure.
- Connect a HART communicator and check the sensor limits to ensure verification
adjustments are within the sensor range.

SYMPTOM: Erratic Output
POTENTIAL SOURCES: Loop Wiring; Electronics Module
CORRECTIVE ACTION:
- Connect a HART communicator and enter the transmitter test mode to isolate
module failure.

SYMPTOM: Low Output or No Output
POTENTIAL SOURCES: Sensor Element; Loop Wiring; Electronics Module
CORRECTIVE ACTION:
- Connect a HART communicator and enter the transmitter test mode to isolate a
sensor failure.
- Check the process variable to see if it is out of range.
- Connect a HART communicator and enter the transmitter test mode to isolate
module failure.
- Connect a HART communicator and check the sensor limits to ensure verification
adjustments are within the sensor range.
Appendix E: Notes to Transmitter and Sensor Verification
Potential Verification Issues
- The use of portable “calibrators” is very widespread in today’s industrial
environment. These devices are often able to provide a very accurate, simulated
sensor input into a Temperature transmitter (whether resistance for RTD’s or
millivolts for Thermocouples). For Thermocouples, the millivolts mode of many
calibrators will also enable a built-in CJC (cold junction compensation) feature.
It should be noted that while Rosemount Temperature transmitters operate under
a “pulse current” environment, many commercially available Calibrators will only
operate when fed with “constant current”. If this is the case with a particular
verification setup, the operator may switch the Temperature transmitter into
“Active Calibrator Mode” to engage the constant current feature of the
transmitter. This may be accomplished with a Model 375/475 Communicator or
through the AMS (Asset Management Solutions) system. Also note that even if
constant current mode is enabled, your particular calibrator may still not operate
correctly if there are discrepancies in current between the transmitter and the
calibrator. Unstable transmitter readings may result.
- For system verifications (i.e. sensor attached to transmitter), users must be
cognizant of accuracy issues. When a sensor and transmitter combination is
calibrated (for instance in a Calibrator) unwanted EMFs (i.e. voltages) may be
introduced, which will affect the verification accuracy. Errors of approximately
0.1°C for a 0 to 100°C span have been detected. Under normal operation the
transmitter will compensate for these EMFs, but during the verification trim
operation the transmitter will automatically engage Active Calibrator Mode. This
mode does not allow the transmitter to compensate for EMFs.
General Note on Transmitter / Sensor Verification
In general, the sensor portion of a temperature measurement system will
display more drift than the transmitter. Therefore, while transmitter verification
and reverification is important, special care and attention should also be focused
on the sensor.
While it may or may not be less time consuming to recalibrate a sensor
and transmitter as a singular unit, more accurate results can oftentimes be
obtained if the units are calibrated separately.
Verification Equipment - General
In general, the combined uncertainty of a verification setup should be at
least 4x smaller than the tolerance being verified. This total verification
measurement uncertainty can be calculated by RSS'ing (Root Sum of Squares)
the component uncertainties such as: verification bath, dry block calibrator,
standard thermometers, meters, etc.
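A minimal sketch of this uncertainty budget (the component values are placeholders;
substitute the specifications of the actual bath, standard thermometer, and meter used):

def setup_uncertainty(*component_uncertainties):
    """Root-sum-square combination of the verification setup's component uncertainties."""
    return sum(u**2 for u in component_uncertainties) ** 0.5

# Placeholder values in degrees C: bath, standard thermometer, DMM contribution.
total = setup_uncertainty(0.01, 0.01, 0.005)
print(round(total, 4), "degC; should be several times smaller than the tolerance verified")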
Verification Equipment – Input Accuracy
A common issue with single or 2 point trims is the use of verification
sensor inputs that are not accurate enough for the overall
verification/measurement accuracy desired. Use of furnaces, industrial sensors,
and active calibrators may introduce measurement errors (such as unwanted
EMFs) or may just not be accurate enough for the intended application. High
quality decade boxes or fixed resistors may be used to avoid these issues.