
Instrumentation & Control Accelerated Training Program page 1 of 9

_____________________________________________________________________________
Module 9: PRINCIPLES OF METROLOGY
_____________________________________________________________________________

GENERAL METROLOGY

Metrology is the science of measurement. The word metrology derives from two
Greek words: metron (meaning measure) and logia (meaning study). In today's
sophisticated industrial climate, the measurement and control of products and
processes are critical to the total quality effort. Metrology encompasses the
following key elements:

 Establishment of measurement standards and procedures for their use.
 Establishment of national and international agreements for these
standards.
 Design of measurement equipment and processes to allow comparison of
unknown items with established standards, and design of equipment and
procedures for the regular calibration, monitoring, and maintenance of
these processes.
 Design of measurement processes and systems to support commerce and
trade, and deployment of those efforts to regulatory and service
agencies, to businesses, and to the public in general.
 Design of measurement processes and systems to support inspection and
test efforts in manufacturing, distribution, and trade.
 Education of practitioners and of the general public in the principles of
and need for metrology.

BASE SI UNITS

While most other countries of the world have completely accepted and converted
to the metric system, the United States has not. Some fields, such as scientific
and medical practice, are almost completely metric. The automobile industry
uses a great deal of metric measurement in its internal manufacturing
systems, but communicates with the public in inches and pounds. The beverage
bottling industry is committed to both systems, offering spirits in both
liter and ounce containers. Land measurements are exclusively in English units.

Currently, many manufacturing companies must deal with both English units and
the International System of Units (SI). Certain machinery operations use metric
measurements and other areas use English units. Engineers, inspectors, and
technicians must be able to convert measurements from metric to English and from
English to metric. The seven base SI units of measure are shown in Table 1.

Characteristic              Fundamental Unit

Length                      Meter
Mass                        Kilogram
Time                        Second
Electric current            Ampere
Thermodynamic temperature   Kelvin
Luminous intensity          Candela
Amount of substance         Mole
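The metric-to-English conversions mentioned above rest on exact defined factors: by international agreement, 1 inch = 25.4 mm and 1 pound = 0.45359237 kg. A minimal sketch in Python (the helper names are illustrative, not from the source):

```python
# Exact conversion factors, fixed by international agreement (1959).
MM_PER_INCH = 25.4
KG_PER_POUND = 0.45359237

def inches_to_mm(inches: float) -> float:
    """Convert a length in inches to millimeters."""
    return inches * MM_PER_INCH

def mm_to_inches(mm: float) -> float:
    """Convert a length in millimeters to inches."""
    return mm / MM_PER_INCH

def pounds_to_kg(lb: float) -> float:
    """Convert a mass in pounds to kilograms."""
    return lb * KG_PER_POUND

print(inches_to_mm(1.0))   # 25.4
print(pounds_to_kg(1.0))   # 0.45359237
```

Because the factors are exact by definition, a round-trip conversion loses only floating-point precision, never defined accuracy.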

______________________________________________________________________________________
Rev-2007

Most of the world is committed to using the SI units. Further descriptions of
the seven fundamental SI units follow:

1. Length (meter) – The meter is the length of the path traveled by light in
vacuum during a time interval of 1/299 792 458 of a second. The speed of
light is thus fixed at exactly 299 792 458 m/s.

2. Time (second) – The second is defined as the duration of 9 192 631 770
periods of the radiation corresponding to the transition between the two
hyperfine levels of the ground state of the cesium-133 atom.

3. Mass (kilogram) – The kilogram is equal to the mass of the international
prototype, a cylinder of platinum-iridium alloy kept by the International
Bureau of Weights and Measures at Sèvres, France.

4. Electric current (ampere) – The ampere is that constant current which, if
maintained in two straight parallel conductors of infinite length, of
negligible circular cross section, and placed one meter apart in vacuum,
would produce between these conductors a force equal to 2 x 10^-7 newton
per meter of length.

5. Temperature (kelvin) – The kelvin is the fraction 1/273.16 of the
thermodynamic temperature of the triple point of water. The triple point of
water is 273.16 K (0.01 °C).

6. Light (candela) – The candela is defined as the luminous intensity, in a
given direction, of a source that emits monochromatic radiation of frequency
540 x 10^12 hertz and that has a radiant intensity in that direction of
1/683 watt per steradian.

7. Amount of substance (mole) – The mole is the amount of substance which
contains as many elementary entities as there are atoms in 0.012 kilogram of
carbon-12.

DERIVED SI UNITS

In addition to the seven base units, a large number of other useful units have
been defined in the SI. These derived units are based on elementary physics and
can be calculated from ordinary formulas relating the base units of the
quantities involved.

For additional unit details, please refer to IEEE/ASTM SI 10 (1997) and NIST
Special Publication 811 (Taylor, 1995).
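Because every derived SI unit is a product of powers of the seven base units, a unit can be represented as a tuple of exponents over (m, kg, s, A, K, cd, mol) and derived units built by adding exponents. A small illustrative sketch (the representation and names are assumptions, not from the source):

```python
# Order of exponents: (m, kg, s, A, K, cd, mol)
BASE = ("m", "kg", "s", "A", "K", "cd", "mol")

def combine(*terms):
    """Multiply unit-exponent tuples by adding exponents component-wise."""
    result = [0] * len(BASE)
    for t in terms:
        for i, e in enumerate(t):
            result[i] += e
    return tuple(result)

METER    = (1, 0, 0, 0, 0, 0, 0)
KILOGRAM = (0, 1, 0, 0, 0, 0, 0)
PER_S2   = (0, 0, -2, 0, 0, 0, 0)   # s^-2

# The newton (force) is kg * m / s^2, built from base-unit exponents:
NEWTON = combine(KILOGRAM, METER, PER_S2)
print(NEWTON)  # (1, 1, -2, 0, 0, 0, 0)
```

The same combination rule yields any derived unit, e.g. the pascal as newton with an added m^-2.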

PRINCIPLES AND PRACTICE OF TRACEABILITY

The International Vocabulary of Basic and General Terms in Metrology defines
traceability as the "property of the result of a measurement or the value of a
standard whereby it can be related to stated references, usually national or
international standards, through an unbroken chain of comparisons all having
stated uncertainties" (ISO VIM, 1993, definition 6.10).

So what is traceability? A typical measurement in a calibration laboratory is
performed by making a comparison between a reference standard and an
unknown unit under test (sometimes called a UUT or TI). A narrow view of
this process is that it confirms that the UUT is working correctly and that the
result it measures is correct.

What does it mean for the measurement result to be correct? Earlier we discussed
the SI base and derived units and how they are standardized, either based on a
physical experiment or based on consensus that a certain object (artifact) is
defined as the standard. Either of those approaches is called a realization of
the unit.

A correct measurement result is one that is the same as would be found when
using the realization of the unit as a reference. If every laboratory had access
to the grand master prototype kilogram, to a Josephson array for voltage, or to
an iodine-stabilized laser for length, and if those standards were used for
every measurement, traceability would be achieved automatically.

This approach is neither practical nor necessary. Most calibrations do not
require the extremely low uncertainty associated with using a national or
intrinsic standard. Such devices are also often difficult to use and not
portable. Some of them require liquid nitrogen or liquid helium for cooling.
None of them are cheap.

What has developed to address this issue is a calibration pyramid. The
realizations of units and grand master standards are kept at and maintained by
the National Measurement Institute (NMI) in each country, and sometimes also by
a few private or other government organizations. These realizations are used to
calibrate master standards that constitute the everyday working standards of the
NMI. Customers with the highest level of routine reference standards have them
calibrated at the NMI, and those devices are then used as masters in the
calibration of equipment further down the structure.

This is seen as a pyramid because the lower levels of the structure are larger.
Each level can be used to calibrate dozens or hundreds of other devices, so
there are many more working standards or tools lower down, and fewer of them
higher up.

If a calibration takes place without using a grand master or intrinsic standard
as the reference, how is the reference value known? It is known through
traceability.

Traceability is achieved by a formal description of how the value of the local
working master is established in reference to national standards. There are
three basic requirements:
 an unbroken chain of comparisons,
 that connects the local reference to national standards where possible,
 with each comparison in the chain having a stated measurement value and
its associated uncertainty.

So the relationship between local measurements and the top of the pyramid is
definitely established as to both value and uncertainty.
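The requirement that each comparison carry a stated uncertainty is what lets the overall chain be quantified. A minimal sketch, assuming the link errors are independent so their standard uncertainties combine in quadrature (root-sum-square); the link values are purely illustrative:

```python
import math

# Stated standard uncertainties (mm) for each link in an unbroken
# calibration chain, from the NMI down to the unit under test.
# All values are illustrative, not taken from the source.
chain_uncertainties_mm = [
    0.00005,  # NMI realization vs. national standard
    0.0002,   # national standard vs. lab reference standard
    0.0008,   # reference standard vs. working standard
    0.003,    # working standard vs. unit under test
]

# Root-sum-square combination for independent error sources.
combined = math.sqrt(sum(u**2 for u in chain_uncertainties_mm))
print(round(combined, 6))
```

Note how the largest (lowest-echelon) link dominates the total: this is why the grand master standards at the top of the pyramid need not be used directly for routine work.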

CERTIFICATES ISSUED BY NMI

Any calibration result provided directly by an NMI is automatically considered
traceable without further validation, as long as it is accompanied by a
statement of uncertainty. Note also that even though some clients may attempt to
require it, National Measurement Institutes cannot be accredited, and most will
not allow audits from their customers. NMIs have a worldwide system of
interlaboratory comparisons, international meetings, and scientific visiting
committees that ensures their results and reports are correct. No evaluation by
their customers is needed.

TYPES OF MEASUREMENT STANDARDS

There is an inherent hierarchy of standards, in which a few standards of the
highest accuracy and stability are used to periodically calibrate standards of
slightly lesser properties. One highest-level standard may serve as the
reference for tens or even hundreds of standards at the next lower level. This
process is repeated as necessary, so that by the time the pyramid is traversed
down to the level of the customer's equipment (the unit under test) being
calibrated, serviced, or repaired, there are enough standards available that
every laboratory, even every test bench or traveling suitcase, can have one.

Intrinsic or International Standards
        ↓
National Standards
        ↓
Reference Standards
        ↓
Master Standards
        ↓
Working Standards
        ↓
Unit under Test

The process by which standards at each level are quantitatively related to the
next higher level is called traceability. Each industry, and each situation, may
have slightly different definitions of each type of standard and of the meaning
of the various designations.

International Standards – A standard agreed upon by many countries to serve as
their mutual highest-level reference. Most international standards are accepted
by all industrialized countries in the world. These standards are kept at the
International Bureau of Weights and Measures in Sèvres, France, a suburb of
Paris. To date, mass is the only SI base unit represented by a physical artifact
maintained at Sèvres. Typically, international standards are used only for
intercomparison with national standards.

Intrinsic Standards – A standard that is realized as the result of the correct
operation of a scientific apparatus. The value of an intrinsic standard is an
inherent property of nature and is therefore unchanging over time and space.
Intrinsic standards are defined – that is, the numerical value assigned to the
physical quantity is a matter of agreement, and therefore may change if the
governing body decides to do so.

An intrinsic standard (e.g., the Josephson array voltage standard and the
iodine-stabilized helium-neon laser length standard) is based on
well-characterized laws of physics, fundamental constants of nature, or
invariant properties of materials, and makes an ideal, stable, precise, and
accurate measurement standard if properly designed, characterized, operated,
monitored, and maintained.

The equipment that realizes the value of the standard does so in some sort of
scientific experiment, usually in physics. It is possible, of course, for the
equipment to malfunction or to be operated incorrectly and therefore to yield a
different value. All intrinsic standards have, as part of their definition,
various indicators, such as a pattern on an oscilloscope, that can be inspected
to determine correct operation. In addition, labs that operate intrinsic
standards are still required to subject their measurements to interlaboratory
comparisons – just to make sure that their use of the intrinsic standard is
comparable to that of other labs attempting the same measurement.

National Standards – A standard maintained as the agreed reference for a
country, or occasionally for a group or consortium of countries. National
standards are typically maintained by a government organization known as a
National Measurement Institute (NMI). In the Philippines, the NMI is the
Industrial Technology Development Institute (ITDI-DOST). An NMI will maintain
national standards as both artifacts and intrinsic standards, and typically
provides calibration services to its citizens and corporations by comparison to
these national standards.

Practical considerations dictate that only the highest-quality standards and
artifacts will be accepted by an NMI for direct comparison. Usually, further
dissemination of measurement values to lower levels in the pyramid is left to
private or other government organizations.

Reference Standards – An item of the highest metrological quality for a given
parameter, located at a site where calibration is being done. A reference
standard is defined as "a standard, generally of the highest metrological
quality available at a given location, from which measurements made at that
location are derived" (ISO VIM, 1993, definition 6.08).

A commercial, industrial, or government lab will have one or more reference
standards for each parameter, and these are usually of very high quality. Note,
though, that the definition includes field calibration work; as such, the
reference standard at a job site does not have to be, and may not be, something
recognizable as a top-quality device. In some cases, a steel ruler might be the
best length standard available, at which point it is the reference standard at
that time and place.

Master Standards – A large laboratory will typically keep one reference standard
carefully stored, and use it only to maintain the traceability relationship
between the work of that lab and the next higher echelon. Such a standard (say,
a set of gage blocks) is expensive and of high quality, and will be used
infrequently to minimize damage from wear, handling, and accidents. Since
higher-echelon calibration may be time consuming, the reference standard may not
always be on site. The master standard is the next level down from the reference
standard. It is used for the best routine work the lab performs, and is also
used within the lab to calibrate the next lower level of standards. The master
set is not typically "sent out" but rather is compared to the lab's reference
standard for its traceability. A smaller organization, or one with a smaller
workload for a parameter, will skip this step and use its reference standard as
the master.


Working standard (or working master) – The working standard is used every day
for direct comparison to each customer's unit under test. A laboratory will
calibrate its own working standards against its master or reference standard.
Working standards experience the most wear and abuse, since they are constantly
being handled or transported. They are also generally of lower quality (higher
uncertainty) than the higher echelons, but must still be good enough to provide
an effective standard for the UUT.
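The daily working-standard comparison described above can be sketched as a simple check of the UUT reading against the standard's value and a tolerance. The function and numbers below are illustrative, not from the source:

```python
# Sketch of a single-point comparison between a working standard and a UUT.
def calibrate(uut_reading: float, standard_value: float,
              tolerance: float) -> dict:
    """Return the UUT error and whether it is within tolerance."""
    error = uut_reading - standard_value
    return {
        "error": error,
        "in_tolerance": abs(error) <= tolerance,
    }

# Illustrative values: UUT reads 10.012 against a 10.000 standard,
# with an allowed tolerance of +/- 0.025.
result = calibrate(uut_reading=10.012, standard_value=10.000, tolerance=0.025)
print(result["in_tolerance"])
```

A real calibration would compare several points across the range and account for the standard's own stated uncertainty, but the pass/fail logic at each point follows this shape.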

CALIBRATION SYSTEM

CALIBRATION PROCEDURES – The international standard for calibration labs,
ISO/IEC 17025 (1999), states: "The laboratory shall use appropriate methods
and procedures for all tests and/or calibrations within its scope." In fact,
this requirement does not state that the method must be documented, although
nearly everyone uses written procedures. The United States military, in
particular, has operated a highly successful calibration program for many years
that depends on having good documented procedures and ensuring that they are
followed. This produces a high degree of uniformity in the resulting work and
also makes it possible to develop a corps of trained technicians.

In its simplest form, a calibration procedure is a step-by-step description of
what must be done in order to carry out a proper calibration. It should be
possible, following the instructions, to come out with repeatable results when
the procedure is conducted many times.
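Repeatability can be checked quantitatively by running the procedure several times and examining the spread of the results. A minimal sketch with illustrative readings:

```python
import statistics

# Illustrative readings from five repeated runs of the same procedure
# on the same test instrument (TI).
readings = [10.003, 10.001, 10.004, 10.002, 10.003]

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)  # sample standard deviation = repeatability
print(round(mean, 4), round(stdev, 5))
```

A small standard deviation relative to the instrument's tolerance indicates the written procedure yields repeatable results; a large one points to ambiguity in the instructions or variability in the setup.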

This requirement is not difficult to meet if the unit being calibrated (often
called the TI, for test instrument) is always the same, and if the reference
standard and other test equipment being used are always the same. When this is
true, the instructions can be very specific – referring to each control and
display and accompanied by pictures – so that little uncertainty remains about
exactly how to carry out the work repeatably.

In practice, there are so many different test instruments (TIs) and so many
possible configurations of calibration standards and instruments that it would
be practically impossible to write detailed instructions for every combination.
Procedures are therefore often written in a more general fashion so they will
apply properly to a range of possible equipment. Most of the effort that goes
into writing calibration procedures is concentrated on designing methods that
can be described accurately and still cover a wide range of circumstances.

INDUSTRY PRACTICE AND REGULATIONS

ISO/IEC 17025 specifies several possible sources for test and calibration
methods, and makes clear that there is an order of preference when choosing a
source. The list is expanded here to include other possibilities, in descending
order of precedence.

The following order is correct for United States law and practice ONLY. Each
country will have a different priority.

From a legal perspective, the order is:


 OIML standards, if adopted by law (OIML is the International Organization
of Legal Metrology; it writes standards mostly for weights and measures)
 NIST standards, if adopted by law

 ASTM standards, if adopted by law
 Individual state laws
 Methods required by a regulatory agency

From a technical perspective, the order is:

 Methods required by treaty, law, or organization
 Methods specified by the client purchasing the service
 Methods specified by a National Measurement Institute
 Methods published in international, regional, or national standards
 Methods published by reputable technical organizations
 Methods published in relevant scientific texts or journals
 Methods specified by the manufacturer of the equipment
 Laboratory-developed methods, or methods adopted by the laboratory, if
they are appropriate for the intended use and are validated

There are some important rules that apply to the use of any procedure:
 The latest applicable edition of the procedure must be used (there may
be later revisions that do not apply to the specific instrument or method).
 All measurements must be traceable to national standards where
possible.

Following are some additional details about each method source:

 Methods specified by international treaty

The International Organization of Legal Metrology (OIML) publishes a
range of standard practices, mostly in weights and measures.

 Methods required by a regulatory agency

Many times, a regulatory agency will specify the details of a test method
when the results affect public health and safety.

 Methods specified by the client purchasing the services

Most calibration work is purchased by clients who expect the laboratory
to be expert in the relevant areas, so this requirement applies mostly to
testing. Still, there are cases where an organization has written its own
procedures and will expect its own laboratory employees, or those from a
contracting laboratory, to carry them out. A contracting laboratory shall
inform the client when the method proposed by the client is considered to
be inappropriate or out of date.

 Methods published by reputable technical organizations

There are many standards-writing bodies. Often, these committees are
convened by a professional society such as ASQ. Sometimes, procedures
written by these bodies become national or even international standards;
sometimes they remain simply as published by the organization.

The American Society for Testing and Materials (ASTM) is the big name in
this field. It publishes thousands of methods, many of which include
calibration methods. Of special note to metrologists is their temperature
handbook, which lists a wide range of thermometric standards.

 Methods specified by the manufacturer of the equipment

Many labs consider the procedures published by the manufacturer of a
test instrument or system to be the best available. Yet the standards and
accreditation community has placed them next to last: at times,
manufacturer instructions are nonexistent, and sometimes they have proven
incomplete or even incorrect. For this reason, it is inappropriate to have
a ruling that automatically accepts manufacturers' procedures.

 Laboratory-developed methods, or methods adopted by the laboratory,
may also be used if they are appropriate for the intended use and if they
are validated.

Validation is a potentially complex process by which the suitability of a
procedure is evaluated by a defined, documented, scientific approach.

Very few calibrations require a laboratory to generate its own procedures,
since an extraordinarily wide variety of published procedures is
available.

It is permitted, however, for a lab to copy published procedures into its
own format as long as the technical content and the heart of the
procedure are unchanged. The lab can modify the method with
instructions for completing paperwork, cleaning and preparation, etc., and
place the whole thing on its own stationery and forms.

SOURCES OF VARIABILITY IN MEASUREMENT – There are a number of sources of
variability in measurements. It is important to identify, eliminate, or
minimize these sources in calibration labs in order to make measurements as
accurate and as precise as possible. Some of them are as follows:

Artifacts (items to be calibrated)
 Cleanliness
 Stability
 Design
 Packaging
 Handling
 Source

Standards
 Calibration
 Design
 Stability
 Bias
 Thermal Equilibrium
 Recalibration
 Reported Uncertainty
 Damage and Cleanliness
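One concrete example of the thermal-equilibrium and environment items listed here: dimensional standards are normally specified at 20 °C, and a deviation from that temperature changes the artifact's length. A minimal sketch, assuming a typical textbook linear expansion coefficient for steel (all numbers are illustrative):

```python
# Approximate linear thermal expansion coefficient of steel, per °C.
# This is a typical textbook value, not taken from the source.
ALPHA_STEEL = 11.5e-6

def length_correction(nominal_mm: float, temp_c: float) -> float:
    """Amount (mm) by which a steel artifact exceeds its nominal
    length when measured at temp_c instead of the 20 °C reference."""
    return nominal_mm * ALPHA_STEEL * (temp_c - 20.0)

# A 100 mm steel gage block measured in a lab at 23 °C:
print(round(length_correction(100.0, 23.0), 5))  # about 0.00345 mm
```

A few microns of error from a 3 °C offset can exceed the entire tolerance of a precision gage block, which is why calibration labs control temperature and allow artifacts to reach thermal equilibrium before measuring.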

Facility
 Cleanliness

 Space
 Air Current
 Security
 Vibration
 Environment (T,P,H)
 Power

Equipment
 Correction Equipment
 Heat Source
 Maintenance
 Suitability

Staff and Procedures
 Choice, documented procedures, deviation
 Integrated Measurement Control
 Uncertainty
 Method of Calculation
 Mistakes
 Care
 Attitude
 Management
 Capability and experience

--- end ---

