
Basic Principles of Instrument Calibration

instrumentationtoolbox.com/2012/12/understanding-instrument-calibration.html

Every instrument has at least one input and one output. For a pressure sensor, the input would
be some fluid pressure and the output would (most likely) be an electronic signal. For a variable-
speed motor drive, the input would be an electronic signal and the output would be electric
power to the motor. To calibrate an instrument means to check and adjust (if necessary) its
response so the output accurately corresponds to its input throughout a specified range.

Calibration is one exercise that is often taken for granted within an industrial plant. Even the
most important industrial equipment will become useless if it is not calibrated. Through the
process of calibration, adjustments are made to a piece of equipment or device to ensure that it
performs as expected and delivers predictable, accurate and reliable results that meet quality
standards. Adjustments made during calibration must fall within certain tolerances. Such
tolerances represent very small, acceptable deviations from the equipment’s specified accuracy.

Definition of Calibration

Instrument calibration can be defined in several ways. Put simply, calibration is the process of
adjusting an instrument or equipment to meet the manufacturer’s specifications.

Calibration can also be defined as the process of issuing data including a report or certificate of
calibration that assures an end user of a product’s conformance with its specifications.

To the instrument engineer or technician, calibration is the process of determining the
relationship between the values of the quantity being measured and those indicated on a
measuring instrument. The calibration of an instrument can be carried out by comparing the
readings on the instrument with those given by a reference instrument or calibrator. From time
to time, the manufacturer’s reference instruments are sent to a calibration center to be
calibrated against national standards.

When an instrument is purchased, the manufacturer’s calibration data is generally supplied.
Most instrument manufacturers have sets of reference instruments against which all
instruments they produce are calibrated.

Why Calibrate an Instrument?

Virtually all equipment degrades in some fashion over time, and electronic equipment, a
mainstay of today’s manufacturing process, is no exception. As components age, they lose
stability and drift from their published specifications. Even normal handling can adversely affect
calibration, and rough handling can throw a piece of equipment completely out of calibration
even though it may appear physically undamaged. Ongoing calibration assures that the
equipment continues to meet the specification required at installation, so it should be checked
regularly thereafter. Calibration is also required after any maintenance to ensure that the
equipment still conforms to the required calibration data.

A well designed and organized calibration program often leads to benefits in quality and
productivity, and to increased revenue.

How Often Should We Calibrate?

This can vary greatly within an industry or a plant. The manufacturer usually performs the initial
calibration on its equipment. Subsequent calibrations are done either by the end user or by the
manufacturer. The frequency of recalibration varies with the type of equipment and the
prevailing conditions where the equipment is applied. Deciding when to recalibrate an
instrument depends mainly on how well the equipment performs in the application.

As a rule, however, re-calibration should be performed at least once a year. In more critical
applications, the frequency will be much higher.

Common Terms Used in Instrument Calibration:

Calibration Range

The calibration range of an instrument is defined as the region between the limits within which a
quantity is measured, received or transmitted, expressed by stating the lower range value (LRV)
and the upper range value (URV). These limits are defined by the zero and span values: the zero
value is the lower end of the range, or LRV, and the upper end of the range is the URV. For
example, if an instrument is to be calibrated to measure pressure in the range 0 psig to 400 psig,
then LRV = 0 psig and URV = 400 psig. The calibration range is therefore 0 to 400 psig.

Span

Span is defined as the algebraic difference between the upper and lower range values:

Span = URV – LRV

For the example considered above, where the calibration range is 0 to 400 psig, the span =
400 – 0 = 400 psig.

Instrument Range

The instrument range refers to the capability of the instrument. It is often the nameplate rating
of the instrument. For example, an instrument nameplate rating may read: Instrument range
0 – 800 psig; Output 4 to 20 mA.

Never confuse the instrument range with the calibration range. They are two different things.
Although our instrument range is 0 – 800 psig, we may decide to calibrate it to a range of
0 – 400 psig, or even to 0 – 800 psig for an application with high input pressure, in which case the
instrument range becomes the calibration range of the device.
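
As a simple check of this distinction, here is a hypothetical Python sketch (not part of the
original article) that verifies a chosen calibration range falls within the instrument's nameplate
range before the device is ranged:

# Hypothetical sanity check: the calibration range must lie within the instrument range.
INSTRUMENT_LRV, INSTRUMENT_URV = 0.0, 800.0      # nameplate rating, psig

def check_calibration_range(cal_lrv, cal_urv):
    """Return the span of the chosen calibration range, or raise if it is invalid."""
    if not (INSTRUMENT_LRV <= cal_lrv < cal_urv <= INSTRUMENT_URV):
        raise ValueError("calibration range must fall within the instrument range")
    return cal_urv - cal_lrv

print(check_calibration_range(0.0, 400.0))       # 400.0 -> valid calibration range
print(check_calibration_range(0.0, 800.0))       # 800.0 -> calibration range equals instrument range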

Ranging an Instrument

To range an instrument means to set the lower and upper range values so it responds with the
desired sensitivity to changes in input. Suppose we want to use a pressure transmitter to
measure pressure in the range 0 – 100 bar and give an output of 4 – 20 mA. To range this
transmitter, we simply set:

0 bar = 4 mA

100 bar = 20 mA

Closely related to ranging is re-ranging, which simply means resetting the lower and upper range
values to a different measurement range. For example, suppose we want to re-range the above
transmitter to measure pressure in the range 50 – 150 bar; we simply reset as follows:

50 bar = 4 mA

150 bar = 20 mA
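
To make the ranging arithmetic concrete, here is a hypothetical Python sketch (not from the
original article) that maps a measured pressure onto the 4 – 20 mA output for a given LRV and
URV; re-ranging simply means supplying different LRV and URV values:

# Hypothetical sketch of linear 4-20 mA ranging for a pressure transmitter.
def pressure_to_current(pressure, lrv, urv):
    """Convert a pressure reading to the corresponding 4-20 mA output."""
    span = urv - lrv                         # Span = URV - LRV
    fraction = (pressure - lrv) / span       # fraction of the calibration span
    return 4.0 + 16.0 * fraction             # 4 mA at the LRV, 20 mA at the URV

# Ranged 0 - 100 bar: 0 bar = 4 mA, 100 bar = 20 mA
print(pressure_to_current(50.0, lrv=0.0, urv=100.0))     # 12.0 mA at mid-scale

# Re-ranged 50 - 150 bar: 50 bar = 4 mA, 150 bar = 20 mA
print(pressure_to_current(100.0, lrv=50.0, urv=150.0))   # 12.0 mA at mid-scale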

Zero and Span Adjustments

Zero and span adjustments are commonly done on analog and smart instruments. By adjusting
both zero and span, we may set the instrument for any range of measurement within the
manufacturer’s limits. For most analog instruments, zero and span adjustments are interactive.
That is, adjusting one has an effect on the other. Specifically, changes made to the span
adjustment almost always alter the instrument’s zero point. An instrument with interactive zero
and span adjustments requires much more effort to calibrate accurately, as one must switch
back and forth between the lower- and upper-range points repeatedly to adjust for accuracy.

For smart instruments, however, there is no interaction between the zero and span adjustments.
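
To illustrate why interactive adjustments force this back-and-forth procedure, the hypothetical
sketch below models an analog instrument whose span adjustment also disturbs its zero, and
alternates between the two adjustments until both range points read within tolerance. The
starting values and the coupling model are illustrative assumptions, not taken from the article.

# Hypothetical model of an analog instrument with interacting zero and span adjustments.
# The instrument's reading is gain * input + offset; turning the span adjustment also
# disturbs the offset, so zero and span must be adjusted alternately until both agree.

LRV, URV = 0.0, 400.0            # calibration range, psig
OUT_LO, OUT_HI = 4.0, 20.0       # desired output at LRV and URV, mA
TOL = 0.01                       # acceptable error, mA

gain, offset = 0.043, 3.6        # out-of-calibration starting point (illustrative)
COUPLING = 5.0                   # how strongly the span adjustment leaks into the zero

def reading(x):
    return gain * x + offset

for passes in range(1, 21):
    offset += OUT_LO - reading(LRV)                    # zero adjustment at the LRV
    correction = (OUT_HI - reading(URV)) / (URV - LRV)
    gain += correction                                 # span adjustment at the URV ...
    offset += COUPLING * correction                    # ... which also shifts the zero
    if abs(reading(LRV) - OUT_LO) < TOL and abs(reading(URV) - OUT_HI) < TOL:
        print(f"both range points within tolerance after {passes} passes")
        break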

Five Point Calibration

When calibrating an instrument, as a general rule, the instrument data points should include
readings taken at 0%, 25%, 50%, 75% and 100% of the calibration range of the instrument. This
is often referred to as a five-point calibration. During a five-point calibration exercise, both
upscale (increasing) and downscale (decreasing) testing should be done to determine the
repeatability and hysteresis of the particular instrument.
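
As an illustration, here is a hypothetical Python sketch (assuming the 0 – 400 psig, 4 – 20 mA
transmitter used earlier) that lists the upscale and downscale test points and the ideal output
expected at each:

# Hypothetical sketch: test points for a five-point up/down calibration check
# of a 0 - 400 psig transmitter with a 4 - 20 mA output.

LRV, URV = 0.0, 400.0
PERCENT_POINTS = [0, 25, 50, 75, 100]

def ideal_output(pressure):
    """Ideal 4-20 mA output for a given applied pressure."""
    return 4.0 + 16.0 * (pressure - LRV) / (URV - LRV)

upscale = [LRV + p / 100 * (URV - LRV) for p in PERCENT_POINTS]    # increasing inputs
downscale = list(reversed(upscale))                                # decreasing inputs

for pressure in upscale + downscale:
    print(f"apply {pressure:6.1f} psig -> expect {ideal_output(pressure):5.2f} mA")

# Hysteresis at each point is the difference between the reading taken on the
# upscale pass and the reading taken at the same input on the downscale pass.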

Field Calibration

In field calibration, the instrument is not removed from the process; in fact, it remains in its
mounting brackets. Field calibration allows the field instrument to be tested or calibrated at the
true process and ambient conditions. Calibration done under field conditions often differs from
calibration done under shop conditions, and the two can even produce different calibration
results. Most field instruments have an isolating valve manifold that makes it easy to disconnect
them from the process. After disconnection, the instrument is vented to the atmosphere before
the test or calibration signal is applied.

In-Shop or Bench Calibration

A bench calibration is a procedure where the instrument is calibrated at a calibration bench
using calibration devices to simulate the process, rather than calibrating the device in the field
using the actual process itself as the input. Here, the instrument is disconnected from the
process, cleaned and taken to the shop, where it is mounted on a test stand at a calibration
bench.

Bench Tester

A bench tester is used for carrying out bench calibration of an instrument or device. It consists of
a highly accurate standard gauge and a pressure source for producing the test pressure required
for testing the instrument. Most bench testers are fabricated on the job site by instrument
technicians, while some are ordered as complete systems from vendors. A standard bench
should have various hoses and pumps that are well labelled and organized to aid technicians in
the calibration process.

Calibrators

Calibrators are the reference devices used to carry out instrument calibration. They vary in form
and function with the equipment or device they are designed to calibrate. Typical calibrators include:

(a) Block calibrators and fluidized baths are used to calibrate temperature probes such as RTDs,
thermocouples, etc.

(b) A signal reference is used to calibrate panel meters and temperature controllers. It is a type of
calibrator that can generate a known electrical signal. There are voltage, current, and frequency
signal references. Once a signal from one of these calibrators is fed into the equipment in
question, the display or output value of the equipment can be adjusted until it matches the
known signal.

The simulator, a special kind of signal reference, generates sensor output. Signal references and
simulators can often read as well as generate signals.

(c) Pneumatic calibrators. These are calibrators that provide the regulated pressure required to
test or calibrate pressure instruments. They are often used in conjunction with a pressure
source.

Calibration Records

Calibration records are the documentation kept to ensure that the history of the device or
instrument is not lost. They also aid in troubleshooting any drift in the instrument’s performance
over time. Calibration records should show:

(a) The as found data

(b) The current calibration date

(c) The final calibration or as left data

(d) The name or initials of the technician who did the calibration

(e) The date the instrument is due for the next calibration

As Found Data

The as found data of an instrument to be calibrated is the response (reading) from the device at
the points of calibration (0%, 25%, 50%, 75% and 100%) before the actual calibration exercise
begins.

As Left Data

The as left data of an instrument is the response (reading) from the device at the points of
calibration (0%, 25%, 50%, 75% and 100%) after the instrument has been calibrated.
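
To show how as found and as left data are typically compared, here is a hypothetical Python
sketch for the 0 – 400 psig, 4 – 20 mA device used earlier; the readings are made-up illustrative
numbers, not actual calibration data:

# Hypothetical sketch: comparing as found and as left readings (mA) with the
# ideal output at the five calibration points. All readings are illustrative.

points = [0, 25, 50, 75, 100]                       # % of calibration range
ideal = [4.00, 8.00, 12.00, 16.00, 20.00]           # ideal 4-20 mA output
as_found = [4.12, 8.09, 12.05, 16.02, 19.95]        # readings before adjustment
as_left = [4.01, 8.00, 12.00, 15.99, 20.00]         # readings after adjustment

for pct, want, found, left in zip(points, ideal, as_found, as_left):
    print(f"{pct:3d}%  ideal {want:5.2f} mA  as found {found:5.2f} mA (err {found - want:+.2f})"
          f"  as left {left:5.2f} mA (err {left - want:+.2f})")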

Traceability

All calibrations should be performed traceable to a nationally or internationally recognized
standard. Traceability is defined as the property of a result of a measurement whereby it can be
related to appropriate standards, generally national or international standards, through an
unbroken chain of comparisons. In the U.S., national measurement standards are maintained by
NIST. The National Institute of Standards and Technology (NIST), part of the U.S. Department of
Commerce, oversees the development of measurement standards and technology consistent
with the International System of Units (SI).

Traceability is achieved by ensuring that the test standards we use for calibration operations are
regularly calibrated by higher-level reference standards. Typically, the measurement standards
we use in a workshop are sent out periodically to a standards laboratory, which has more
accurate test equipment. The standards from the calibration laboratory are in turn periodically
checked against higher-level standards, and so on, until eventually the standards are tested
against primary standards maintained by NIST or another internationally recognized body.
