


ELEMENTS OF A MEASURING SYSTEM

1. Primary sensing element
2. Variable conversion element
3. Variable manipulation element
4. Data transmission element
5. Data processing element
6. Data presentation element
Accuracy / Precision
Precision vs Accuracy
Factors affecting the accuracy of a measuring system

There are various factors affecting the accuracy of the measuring system. They are:
1. Factors affecting the standard of measurement
2. Factors affecting the work piece to be measured
3. Factors affecting the inherent characteristics of the instrument
4. Factors affecting the person
5. Factors affecting the environment
1. Factors affecting the standard of measurement:
a) Co-efficient of thermal expansion
b) Elastic properties
c) Stability with time
d) Geometric compatibility
2. Factors affecting the work piece to be measured:
a) Co-efficient of thermal expansion
b) Elastic properties
c) Arrangement of supporting work piece
d) Hidden geometry
e) Surface defects such as scratches, waviness, etc.
3. Factors affecting the inherent characteristics of the instrument:
a) Repeatability & readability
b) Calibration errors
c) Effect of friction, backlash, etc
d) Inadequate amplification for accuracy objective
e) Deformation in handling or use
4. Factors affecting person:
a) Improper training / skill
b) Inability to select proper standards /instruments
c) Lack of attention to personal accuracy
5. Factors affecting environment:
a) Temperature, humidity, atmospheric pressure, etc.
b) Cleanliness
c) Adequate illumination
d) Heat radiation from lights / heating elements
Precision vs Accuracy

Repeatability Problem: The same person (or station) can't get the same result twice on the same subject (within-inspector error).

Reproducibility Problem: Different people (or stations) can't agree on the result obtained on the same subject (between-inspector error).

Repeatability: Ability of the same gage to give consistent measurement readings no matter how many times the same operator of the gage repeats the measurement process.

Reproducibility: Ability of the same gage to give consistent measurement readings regardless of who performs the measurements. The evaluation of a gage's reproducibility therefore requires measurement readings to be acquired by different operators under the same conditions.

Of course, in the real world, there are no existing gages or measuring devices
that give exactly the same measurement readings all the time for the same
parameter. Ex: Cloning
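The two concepts above can be separated numerically: repeatability is the spread of one operator's repeated readings, while reproducibility is the disagreement between different operators' averages. A minimal sketch in Python, where all operator names and readings are made-up illustrative values:

```python
# Illustrative (made-up) repeated readings of the same part, in mm.
# Repeatability: spread of repeated readings by ONE operator.
# Reproducibility: disagreement BETWEEN different operators' averages.
from statistics import mean, stdev

readings = {
    "operator_A": [10.01, 10.02, 10.00, 10.01],
    "operator_B": [10.05, 10.04, 10.06, 10.05],
}

# Repeatability: within-operator standard deviation of the readings.
for op, vals in readings.items():
    print(op, "mean =", round(mean(vals), 3), "stdev =", round(stdev(vals), 3))

# Reproducibility: spread of the per-operator mean readings.
op_means = [mean(vals) for vals in readings.values()]
print("between-operator spread =", round(max(op_means) - min(op_means), 3))
```

Here each operator individually repeats well (small stdev), yet the operators disagree with each other, which is a reproducibility problem rather than a repeatability problem.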
Performance Characteristics
Static / Dynamic Characteristics
Selection of Measuring Instruments
Accuracy: Quantitative measure of the degree of
conformance to recognized national or
international standards of measurement.
Precision: The ability of the instrument to reproduce
its readings or observations again and again for a
constant input signal.
Resolution: It is the smallest change in a measured
value to which the instrument can respond.
Sensitivity: It is the ratio of the change in output of
the instrument to the change in input (the measured variable).
Threshold: It is the minimum value of the input signal
required to produce a detectable change in output, starting from zero.
Drift: If an instrument does not reproduce the same
reading at different times of measurement for the same
input signal, it is said to exhibit drift.
True Value: The value measured without any error
Measured Value: The value obtained from the instrument.
Error: The deviation of the measured value from the true value.
True size: Theoretical size of a dimension which is free from errors.
If the precision measuring instrument is highly calibrated for its error of measurement &
the constant error of measurement is known in advance, then the accurate (true) value
can be obtained as follows ;
True value = Measured value − Error
Hence, calibrated & precision measuring instrument is more reliable and hence is used in
metrological laboratories.
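As a quick numerical sketch of the relation above (the reading and the calibration error are made-up illustrative values):

```python
# Illustrative values: a calibrated instrument with a known constant error.
measured_value = 25.42   # mm, reading taken from the instrument
known_error = 0.02       # mm, constant error found during calibration (reads high)

# True value = Measured value - Error
true_value = measured_value - known_error
print(round(true_value, 2))  # prints 25.4
```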
Actual size: Size obtained through measurement with permissible error.

Error: Error in measurement is the difference between the measured value and
the true value of the measured dimension.
Error in measurement = Measured value − True value
Error = Vm − Vt, where Vm is the measured value and Vt is the true value.
Precision Error: The error created by the limitation
of the scale reading is known as precision error.
Example: Standard value = 29.705; Measured value = 29.70
(the scale cannot resolve the third decimal place).
Repeatability: Ability of the same gage to give
consistent measurement readings no matter how many
times the same operator of the gage repeats the
measurement process.
Reproducibility: Ability of the same gage to give
consistent measurement readings regardless of the
operator who performs the measurements.
Dead Zone: It is the largest change in the physical
variable to which the measuring instrument does not respond.
Backlash: The maximum distance through which one
part of the instrument is moved without disturbing the
other part
Range: The limit of measurement values that an
instrument is capable of reading. The dimension being
measured must fit inside this range. The physical
variable is measured between two values: the higher
calibration value (Hc) and the lower calibration value (Lc).
Span: The algebraic difference between the higher and
lower calibration values: Span = Hc − Lc. For example,
an instrument calibrated from 20 °C to 120 °C has a span of 100 °C.
Bias: It is the characteristic of a measuring
instrument to give indications of a measured quantity
whose average value differs from the true value.
Response Time: The time at which the instrument
begins its response to a change in the measured quantity.
Magnification: The magnitude of the output signal of
the measuring instrument is increased many times to make
it more readable.
Hysteresis: Not all the energy put into a stressed
component during loading is recovered upon unloading,
so the output of a measurement partially depends on the
input history; this effect is called hysteresis.
The hysteresis error of a pressure sensor is the maximum
difference in output at any measurement value within
the sensor's specified range when approaching the
point first with increasing and then with decreasing pressure.
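The hysteresis-error definition above can be sketched numerically: record outputs at the same set points on an increasing and then a decreasing sweep, and take the largest gap (all readings below are made-up illustrative values):

```python
# Hysteresis error: max output difference at the same set point when that
# point is approached with increasing vs decreasing input (made-up data).
set_points = [0, 25, 50, 75, 100]             # % of span
increasing = [0.0, 24.8, 49.7, 74.9, 100.0]   # outputs, input rising
decreasing = [0.3, 25.4, 50.2, 75.1, 100.0]   # outputs, input falling

hysteresis_error = max(abs(up - down) for up, down in zip(increasing, decreasing))
print(round(hysteresis_error, 2))  # largest up/down gap over the range
```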
Linearity: The ability of the instrument to reproduce the input
characteristics symmetrically and linearly, i.e. an output of the form
y = mx + c
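One common way to quantify linearity, sketched here with made-up readings, is to fit y = mx + c by least squares and report the worst deviation of the readings from that line:

```python
# Linearity sketch (made-up data): fit y = m*x + c by least squares,
# then report the worst deviation of the readings from that line.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]   # inputs
ys = [0.1, 1.0, 2.1, 2.9, 4.0]   # instrument outputs

n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
# Least-squares slope and intercept.
m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
c = mean_y - m * mean_x
# Linearity error: largest deviation of any reading from the fitted line.
linearity_error = max(abs(y - (m * x + c)) for x, y in zip(xs, ys))
print(round(m, 3), round(c, 3), round(linearity_error, 3))
```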
Tolerance: It is the maximum allowable error in the measurement.
Stability: The ability of a measuring instrument to retain
its calibration over a long period of time. Stability
determines an instrument's consistency over time.
Correction: Correction is defined as a value which is
added algebraically to the uncorrected result of the
measurement to compensate for an assumed systematic error.
Standard: A recognized true value. Calibration must
compare measurement values to a known standard.
Uncertainty: The range about the measured value
within which the true value of the measured quantity is likely
to lie, at the stated level of confidence.
Reliability: It is defined as the probability that a
given system will perform its function adequately
for its specified period of lifetime under specified
operating conditions.
Readability is a word frequently used in analog
measurement. Readability depends on both the
instrument and the observer.
Readability is defined as the closeness with which the scale
of an analog instrument can be read.
It is the susceptibility of a measuring instrument to having
its indications converted to a meaningful number. It implies
the ease with which observations can be made accurately.
For better readability, the magnification of the instrument
scale should be as high as possible.
A known input is given to the measurement system;
if the output deviates from the given input,
corrections are made in the instrument and
the output is measured again.
This process is called calibration.
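The calibration procedure above can be sketched as: feed known standard inputs, record the instrument's outputs, and derive a correction for each point (all values are made-up and illustrative):

```python
# Calibration sketch (made-up data): apply known standard inputs,
# record the instrument's readings, and derive a correction per point.
standard_inputs = [10.0, 20.0, 30.0]      # known true values (standards)
instrument_outputs = [10.1, 20.2, 30.2]   # what the instrument actually reads

# Correction = value added algebraically to the reading to offset the
# systematic error, i.e. Correction = True value - Measured value.
corrections = [round(s - o, 2) for s, o in zip(standard_inputs, instrument_outputs)]
print(corrections)  # prints [-0.1, -0.2, -0.2]
```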
Traceability: It is establishing a calibration chain by
step-by-step comparison with better standards.
Ability of a measuring instrument's
metrological characteristics to remain constant
over time (paraphrased from the ISO
International vocabulary of basic and general terms
in metrology, 1993; item 5.14).
From this, stability is a property of an individual
measuring instrument: its variation over time.
Selection of Measuring Instruments:
Accuracy: the degree of agreement of the measured dimension
with its true magnitude.
Magnification (amplification).
Resolution: the smallest dimension that can be read on the instrument.
Rule of 10 (gage maker's rule): the instrument should be at least
10 times more accurate than the tolerance to be measured.
Stability (drift): capability to maintain calibrated status.
Sensitivity of the instrument is defined as the
ratio of the magnitude of the output signal to
the magnitude of the input signal.
It denotes the smallest change in the measured
variable to which the instrument responds.
Sensitivity has no unique unit; it has a wide range
of units which depend on the
instrument or measuring system.
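As a small sketch of the ratio definition above (the sensor and its values are made-up, purely for illustration):

```python
# Sensitivity sketch (made-up data): ratio of output change to input change.
# Illustrative thermocouple-like sensor: input in degC, output in mV.
input_change = 10.0    # degC applied change in the measured variable
output_change = 0.41   # mV observed change in the instrument output

sensitivity = output_change / input_change
print(round(sensitivity, 3), "mV/degC")  # units depend on the instrument
```

Note that the units of sensitivity (here mV/degC) follow directly from the output and input quantities, which is why sensitivity has no unique unit.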