_____________________________________________________________________________
Module 9: PRINCIPLES OF METROLOGY
_____________________________________________________________________________
GENERAL METROLOGY
Metrology is the science of measurement. The word metrology derives from two
Greek words: metron (meaning measure) and logia (meaning study). In today's
sophisticated industrial climate, the measurement and control of products and
processes are critical to the total quality effort. Metrology encompasses
several key elements, which are described in the sections that follow.
BASE SI UNITS
While most other countries of the world have completely accepted and converted to
the metric system, the United States has not. Some fields, such as science
and medicine, are almost completely metric. The automobile industry
uses a great deal of metric measurement in its internal manufacturing
systems, but communicates with the public in inches and pounds. The beverage
bottling industry is committed to both systems, offering spirits in both
liter and ounce containers. Land measurements are exclusively in English units.
Currently, many manufacturing companies must deal with both English units and the
International System of Units (SI). Certain machinery operations use metric
measurements while other areas use English units. Engineers, inspectors, and
technicians must therefore be able to convert measurements from metric to English and from
English to metric. The seven SI base units of measure are shown in Table 1.
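As a simple illustration of the kind of conversion an inspector or technician performs routinely, the sketch below converts a few common length and mass values between English and SI units. The conversion factors are exact by international definition; the function names are illustrative, not from any standard library.

```python
# Exact conversion factors (fixed by definition in 1959):
# 1 inch = 25.4 mm, 1 pound (avoirdupois) = 0.45359237 kg
MM_PER_INCH = 25.4
KG_PER_POUND = 0.45359237

def inches_to_mm(inches):
    """Convert a length in inches to millimeters."""
    return inches * MM_PER_INCH

def mm_to_inches(mm):
    """Convert a length in millimeters to inches."""
    return mm / MM_PER_INCH

def pounds_to_kg(pounds):
    """Convert a mass in pounds to kilograms."""
    return pounds * KG_PER_POUND

print(inches_to_mm(1.0))    # 25.4
print(mm_to_inches(50.8))   # 2.0
print(pounds_to_kg(10))     # approximately 4.5359237
```

Because the factors are exact, converting a value to metric and back recovers the original number.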
______________________________________________________________________________________
Rev-2007
Instrumentation & Control Accelerated Training Program page 2 of 9
_____________________________________________________________________________
Most of the world is committed to using the SI units. Further description of the
seven fundamental SI units is as follows:
1. Length (meter) - The meter is the length of the path traveled by light in
vacuum during a time interval of 1/299 792 458 of a second. The speed of
light is thereby fixed at 299 792 458 m/s.
2. Time (second) - The second is defined as the duration of 9 192 631 770
periods of the radiation corresponding to the transition between the two
hyperfine levels of the ground state of the cesium-133 atom.
3. Mass (kilogram) - The kilogram is equal to the mass of the international
prototype of the kilogram, a platinum-iridium artifact kept at the
International Bureau of Weights and Measures (BIPM).
4. Electric current (ampere) - The ampere is that constant current which, if
maintained in two straight parallel conductors of infinite length, of
negligible circular cross section, and placed one meter apart in vacuum,
would produce between these conductors a force equal to 2 x 10^-7 newton
per meter of length.
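The force figure in the ampere definition follows from the Ampere force law for parallel conductors, F/L = u0 * I1 * I2 / (2 * pi * d). A quick numerical check (plain Python, no external libraries) confirms that two 1 A currents one meter apart experience 2 x 10^-7 N per meter of length:

```python
import math

# Permeability of free space; exactly 4*pi*1e-7 N/A^2 in the pre-2019 SI
MU_0 = 4 * math.pi * 1e-7

def force_per_meter(i1_amps, i2_amps, distance_m):
    """Ampere force law: force per unit length (N/m) between two long,
    thin, parallel conductors carrying currents i1 and i2."""
    return MU_0 * i1_amps * i2_amps / (2 * math.pi * distance_m)

print(force_per_meter(1.0, 1.0, 1.0))  # approximately 2e-07
```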
DERIVED SI UNITS
In addition to the seven base units, a large number of other useful units have
been defined in the SI. These derived units are based on elementary physics and
can be calculated from ordinary formulas relating the base quantities.
For additional unit details, please refer to IEEE/ASTM SI 10 (1997) and NIST
Special Publication 811 (Taylor, 1995).
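One way to see how derived units are "calculated from ordinary formulas" is to represent each unit as a set of exponents on the base units; multiplying quantities then simply adds exponents. The sketch below (illustrative code, not from any standard library) derives the newton and the joule from the meter, kilogram, and second:

```python
# Represent a unit as a dict of base-unit exponents, e.g. m/s^2 -> {"m": 1, "s": -2}
def multiply_units(a, b):
    """Combine two units by adding their base-unit exponents."""
    out = dict(a)
    for base, exp in b.items():
        out[base] = out.get(base, 0) + exp
        if out[base] == 0:
            del out[base]          # drop bases that cancel out
    return out

meter = {"m": 1}
kilogram = {"kg": 1}
per_second_squared = {"s": -2}

# F = m * a  ->  newton = kg * m / s^2
newton = multiply_units(kilogram, multiply_units(meter, per_second_squared))
print(newton)  # {'kg': 1, 'm': 1, 's': -2}

# E = F * d  ->  joule = newton * meter
joule = multiply_units(newton, meter)
print(joule)   # {'kg': 1, 'm': 2, 's': -2}
```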
The value of this process is that it confirms that the unit under test (UUT) is
working correctly and that the results it measures are correct.
What does it mean for the measurement result to be correct? Earlier sections
discussed the SI base and derived units and how they are standardized, either
based on a physical experiment or based on consensus that a certain object
(artifact) is defined as the standard. Either of those approaches is called a
realization of the unit.
A correct measurement result is one that is the same as would be found when
using the realization of the units as a reference. If every laboratory had access to
the grand master prototype kilogram, to a Josephson array for voltage, or to an
iodine-stabilized laser for length, and if those standards were used for every
measurement, traceability would be achieved automatically.
This approach is neither practical nor necessary. Most calibrations do not require
the extremely low uncertainty associated with using a national or intrinsic
standard. Such devices are also often difficult to use and not portable. Some of
them require liquid nitrogen or liquid helium for cooling. None of them are cheap.
What has developed to address this issue is a calibration pyramid. The
realizations of units and grand master standards are kept at and maintained by the
National Measurement Institute (NMI) in each country, and sometimes also by a
few private or other government organizations. These realizations are used to
calibrate master standards that constitute the everyday working standards of the
NMI. Customers with the highest level of routine reference standards have them
calibrated at the NMI, and those devices are then used as masters in the
calibration of equipment further down the structure.
This is seen as a pyramid because the lower levels of the structure are larger.
Each level can be used to calibrate dozens or hundreds of other devices, so there are
many more working standards or tools lower down, and fewer of them higher up.
In this way the relationship between local measurements and the top of the pyramid is
definitively established as to both value and uncertainty.
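The pyramid's "established as to both value and uncertainty" can be illustrated numerically: each level's total uncertainty combines the uncertainty inherited from its reference with the uncertainty added by the comparison at that level (root-sum-of-squares for uncorrelated contributions). The level names follow the pyramid; the numeric values are invented for illustration:

```python
import math

# Hypothetical chain from the national standard down to a working standard.
# Each tuple: (level name, standard uncertainty contributed by the comparison
# at that level, in parts per million). Values are invented for illustration.
chain = [
    ("National standard",  0.01),
    ("Reference standard", 0.05),
    ("Master standard",    0.2),
    ("Working standard",   1.0),
]

total_variance = 0.0
for level, added_ppm in chain:
    # Uncorrelated uncertainties combine in quadrature (root-sum-of-squares),
    # so uncertainty grows monotonically as we move down the pyramid.
    total_variance += added_ppm ** 2
    print(f"{level}: cumulative uncertainty = {math.sqrt(total_variance):.3f} ppm")
```

Note that the cumulative uncertainty at the bottom is dominated by the last, largest contribution, which is why modest working standards can still be traceable to a far more accurate national realization.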
Laboratories at each level of the pyramid take care of ensuring that their
results and reports are correct; no evaluation by their customers is needed.
[Figure: the calibration pyramid, from top to bottom - Intrinsic or
International Standards; National Standards; Reference Standards; Master
Standards; Working Standards]
The process by which standards at each level are quantitatively related to the
next higher level is called traceability. Each industry, and each situation, may have
slightly different definitions of each type of standard and of the meaning of the
various designations.
The equipment that realizes the value of a standard does so in some sort of
scientific experiment, usually in physics. It is possible, of course, for the
equipment to malfunction or to be operated incorrectly and therefore to yield a
different value. All intrinsic standards therefore include as part of their definition
various indicators, such as a pattern on an oscilloscope, that can be inspected to
determine correct operation. In addition, labs that operate intrinsic standards are
still required to subject their measurements to interlaboratory comparisons, just
to make sure that their use of the intrinsic standards is comparable to that of other labs
attempting the same measurement.
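Interlaboratory comparisons are commonly scored with the normalized error statistic En = (x_lab - x_ref) / sqrt(U_lab^2 + U_ref^2), where the U values are expanded (k = 2) uncertainties and |En| <= 1 is considered satisfactory. This scoring convention comes from general proficiency-testing practice (e.g. ISO/IEC 17043), not from the text above; the numbers below are invented. A minimal sketch:

```python
import math

def normalized_error(x_lab, x_ref, u_lab, u_ref):
    """En statistic: lab result versus reference value, using expanded (k=2)
    uncertainties. |En| <= 1 means the lab agrees with the reference within
    the claimed uncertainties."""
    return (x_lab - x_ref) / math.sqrt(u_lab ** 2 + u_ref ** 2)

# Invented example: a lab measures 10.003 V against a reference of 10.000 V.
en = normalized_error(x_lab=10.003, x_ref=10.000, u_lab=0.004, u_ref=0.003)
print(f"En = {en:.2f}, satisfactory: {abs(en) <= 1}")
```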
Practical considerations dictate that only the highest quality standards and
artifacts will be accepted by an NMI for direct comparison. Usually, further
dissemination of measurement values to lower levels in the pyramid is left to
private or other government organizations.
Reference standard - A large laboratory will typically keep one reference standard
carefully stored, and use it only to maintain the traceability relationship between
the work of that lab and the next higher echelon. Such a standard (say, a set of
gage blocks) is expensive and of high quality, and will be used infrequently to
minimize damage from wear, handling, and accidents. Since higher-echelon
calibration may be time consuming, the reference standard may not always be on
site.
Master standard - This is the next level down from the reference standard. It is used for the best
routine work the lab performs, and is also used within the lab to calibrate the next
lower level of standard. The master set is not typically “sent out” but rather is
compared to the lab's reference standard for its traceability. A smaller
organization, or one with a smaller workload for a parameter, will skip this step and
use its reference standard as the master.
Working standard (or working master) - The working standard is used every day
for direct comparison to each customer's unit under test. A laboratory will
calibrate its own working standards against its master or reference standard.
Working standards experience the most wear and abuse, since they are
constantly being handled or transported. They are also generally of lower quality
(higher uncertainty) than the higher echelons, but must still be good enough to
provide an effective standard for the UUT.
CALIBRATION SYSTEM
Writing a calibration procedure is not difficult if the unit being calibrated (often
called the TI, for test instrument) is always the same and if the reference standard
and other test equipment being used are always the same. When this is true, the
instruction can be very specific, referring to each control and display and
accompanied by pictures, so that little uncertainty remains about exactly how to
carry out the work repeatedly.
In practice, however, there are so many different test instruments (TIs) and so many
possible configurations of calibration standards and instruments that it would be
practically impossible to write detailed instructions for every combination. Procedures are
therefore often written in a more general fashion so they will apply properly to a range of
possible equipment. Most of the effort that goes into writing calibration
procedures is concentrated on designing methods that can be described
accurately and still cover a wide range of circumstances.
ISO/IEC 17025 specifies several possible sources for test and calibration
methods. It is quite clear that there is an order of preference when choosing a
source. This list is expanded here to include other possibilities, in descending
order of precedence.
The following order is correct for United States laws and practices ONLY. Each
country will have a different priority.
There are some important rules that apply to the use of any procedure:
- The latest applicable edition of the procedure must be used (there may
be later revisions that do not apply to the specific instrument or method).
- All measurements must be traceable to national standards where
possible.
- Many times, a regulatory agency will specify the details of a test method
when the result affects public health and safety.
The American Society for Testing and Materials (ASTM) is the big name in
this field. It publishes thousands of methods, many of which include
[Figure: factors affecting calibration quality]
Standards: calibration, design, stability, bias, thermal equilibrium,
recalibration, reported uncertainty, damage and cleanliness
Facility: cleanliness, space, air currents, security, vibration,
environment (T, P, H), power
Equipment: correction equipment, heat source, maintenance, suitability