Definition of Terms
Readability
Readability indicates the closeness with which the scale of the instrument may be read.
It depends on the scale length, spacing of graduations, size of the pointer, and parallax effects.
Least count
Least count is the smallest difference between two indications that can be detected on the
instrument scale.
Like readability, it depends on the scale length, spacing of graduations, size of the pointer, and parallax effects.
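As an illustrative sketch (the function name and values are not from the text), a reading limited by least count is effectively quantized to the nearest scale division:

```python
# Illustrative: the least count limits resolution, so an indicated value is
# effectively rounded to the nearest scale division.
def indicated(true_value, least_count):
    return round(true_value / least_count) * least_count

# An instrument with a least count of 0.05 units cannot distinguish
# 12.337 from the nearest 0.05 division:
print(indicated(12.337, 0.05))
```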
Hysteresis
Hysteresis is the difference between the indications of a measuring instrument when the same value
of the measured quantity is reached by increasing or by decreasing that quantity.
An instrument is said to exhibit hysteresis when there is a difference in reading depending
on whether the value of the measured quantity is approached from above or from below.
Hysteresis may result from mechanical friction, magnetic effects, elastic deformation,
or thermal effects.
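As a minimal sketch (the function name and readings are illustrative, not from the text), hysteresis can be quantified as the largest difference between up-scale and down-scale readings taken at the same input values:

```python
# Illustrative sketch: quantify hysteresis as the maximum difference between
# readings taken while increasing the input and while decreasing it.
def hysteresis(up_readings, down_readings):
    """Both lists correspond to the same sequence of input values."""
    return max(abs(u - d) for u, d in zip(up_readings, down_readings))

# Hypothetical pressure-gauge readings (kPa) at inputs 0, 25, 50, 75, 100 kPa:
up = [0.0, 24.6, 49.5, 74.3, 99.8]    # input approached from below
down = [1.2, 25.9, 50.8, 75.1, 99.8]  # input approached from above

print(hysteresis(up, down))  # largest up-scale/down-scale discrepancy
```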
Accuracy
Accuracy indicates the deviation of a reading from a known input. For example, if a
temperature-measuring device reads 102 K, 103 K, 99 K, and 102 K for a known input of 100 K,
the maximum deviation is 3 K; the accuracy is then 3 percent, i.e., the device is accurate
within 3 K over its entire range.
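The accuracy figure in the example above can be sketched in a few lines (the function name is illustrative):

```python
# Sketch of the accuracy calculation used in the example above: the maximum
# deviation from the known input, expressed as a percentage of that input.
def accuracy_percent(readings, known_input):
    max_dev = max(abs(r - known_input) for r in readings)
    return 100.0 * max_dev / known_input

readings = [102.0, 103.0, 99.0, 102.0]  # K, for a known 100 K input
print(accuracy_percent(readings, 100.0))  # 3.0, i.e. accurate within 3 K
```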
Precision
Precision of an instrument indicates its ability to reproduce a certain reading with a given
accuracy.
Precision is a term that describes an instrument's degree of freedom from random errors.
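Since precision reflects freedom from random error, the scatter of repeated readings is one common way to compare it; the readings below are illustrative, not from the text:

```python
import statistics

# Illustrative: the sample standard deviation of repeated readings of the same
# input is one measure of an instrument's precision (smaller = more precise).
readings_a = [100.1, 99.9, 100.0, 100.1, 99.9]  # tightly grouped: high precision
readings_b = [101.5, 98.2, 100.9, 99.0, 100.4]  # scattered: lower precision

print(statistics.stdev(readings_a))
print(statistics.stdev(readings_b))
```

Note that neither set need be accurate: readings can cluster tightly around a value offset from the true input, which is precise but inaccurate.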
Uncertainty
In many experimental situations we may not have a known value with which to compare
instrument readings and yet we feel fairly confident that the instrument is within a certain
plus or minus range of the true value. In such cases we say that the plus or minus range
expresses the uncertainty of the instrument readings.
Calibration
The calibration of every instrument is important, for it affords the opportunity to check the
instrument against a known standard and subsequently to reduce errors in accuracy.
Calibration procedures involve a comparison of the particular instrument with either
(1) a primary standard
(2) a secondary standard with a higher accuracy than the instrument to be calibrated, or
(3) a known input source.
Standards
In order that investigators in different parts of the country and different parts of the world
may compare the results of their experiments on a consistent basis, it is necessary to
establish certain standard units of length, weight, time, temperature, and electrical
quantities.
NIST has the primary responsibility for maintaining these standards in the United States.
The meter and the kilogram are considered fundamental units upon which, through
appropriate conversion factors, the English system of length and mass is based.
At one time, the standard meter was defined as the length of a platinum-iridium bar
maintained under very accurate conditions at the International Bureau of Weights and
Measures in Sevres, France.
Similarly, the kilogram was defined in terms of a platinum-iridium mass maintained at the
same bureau.
1 meter = 39.37 inches
1 pound-mass = 453.59237 grams
Standards of length and mass are maintained at NIST for calibration purposes.
In 1960 the General Conference on Weights and Measures defined the standard meter in
terms of the wavelength of the orange-red light of a krypton-86 lamp.
The standard meter was thus
1 meter = 1,650,763.73 wavelengths
In 1983 the definition of the meter was changed to the distance light travels in
1/299,792,458 of a second.
For the measurement, light from a helium neon laser illuminates iodine which fluoresces at
a highly stable frequency.
The inch is exactly defined as
1 inch = 2.54 centimeters
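The conversion factors quoted above can be applied directly; the constant and function names below are illustrative:

```python
# Conversion factors quoted in the text (names are illustrative).
INCHES_PER_METER = 39.37          # 1 meter = 39.37 inches
GRAMS_PER_POUND_MASS = 453.59237  # 1 pound-mass = 453.59237 grams
CM_PER_INCH = 2.54                # exact, by definition

def meters_to_inches(m):
    return m * INCHES_PER_METER

def pounds_to_grams(lbm):
    return lbm * GRAMS_PER_POUND_MASS

print(meters_to_inches(1.0))   # 39.37
print(pounds_to_grams(2.0))    # 907.18474
print(5.0 * CM_PER_INCH)       # centimeters in 5 inches
```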
Experiment Planning
The key to success in experimental work is to ask continually:
What am I looking for?
Why am I measuring this? Does the measurement really answer any of my questions?
What does the measurement tell me?
3. Determine as nearly as possible the accuracy which may be required in the primary
measurements and the number of such measurements which will be required for
proper data analysis.
4. Set up the data-reduction calculations before conducting the experiments to be sure
that adequate data are being collected to meet the objectives of the experiment.
5. Analyze the possible errors in the anticipated results before the experiments are
conducted so that the accuracy requirements on the various measurements may be
modified if necessary.
6. Modify the instrumental apparatus and /or procedure in accordance with the findings in
item 5.
7. Collect the bulk of experimental data and analyze the result.
8. Organize, discuss, and publish the findings and results of the experiments, being sure to
include information pertaining to all of items 1 to 7 above.
[Flowchart: generalized experiment-planning sequence. The main path of boxes reads:
establish need for experiment → establish time and financial limitations → estimate scope of
accompanying analytical work → carefully review previous work in the field → establish
general feasibility within original budget and time limitations → specify required
instruments (modify in accordance with budget limitations) → arrange for design and
construction of equipment; purchase instruments → proceed with analytical work; collect a
few preliminary data points → analyze uncertainty in data → compare preliminary data with
available theories (if comparison not favorable, possibly modify experiment and/or
analysis) → continue analysis, including possible computer runs → correlate with need for
experiment → discuss and publish results of experiments.]
Repeatability and reproducibility
The terms repeatability and reproducibility mean approximately the same thing but are
applied in different contexts: repeatability describes the closeness of output readings when
the same input is applied repetitively over a short period, with the same measurement
conditions, instrument, observer, and location; reproducibility describes the closeness of
output readings for the same input when there are changes in the method of measurement,
observer, instrument, location, or time.
Tolerance
Tolerance is a term that is closely related to accuracy and defines the maximum error that
is to be expected in some value.
The accuracy of an instrument is sometimes quoted as a tolerance figure. When used
correctly, tolerance describes the maximum deviation of a manufactured component from
some specified value.
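A tolerance check is a simple comparison against the maximum allowed deviation; the function name and the shaft dimensions below are illustrative, not from the text:

```python
# Illustrative: tolerance as the maximum allowed deviation of a manufactured
# component from its specified (nominal) value.
def within_tolerance(measured, nominal, tolerance):
    return abs(measured - nominal) <= tolerance

# A hypothetical shaft specified as 25.00 mm +/- 0.05 mm:
print(within_tolerance(25.03, 25.00, 0.05))  # True  (deviation 0.03 mm)
print(within_tolerance(25.07, 25.00, 0.05))  # False (deviation 0.07 mm)
```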
Range or span
The range or span of an instrument defines the minimum and maximum values of a
quantity that the instrument is designed to measure.
Dead space
Dead space is defined as the range of different input values over which there is no change
in output value.
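Dead space can be modeled as an output that does not respond until the input change exceeds a dead band; the class name, dead-band width, and readings below are assumptions for illustration:

```python
# Illustrative model of dead space: input changes smaller than the dead band
# produce no change in output (all names and values are assumptions).
class StickyInstrument:
    def __init__(self, dead_band):
        self.dead_band = dead_band
        self.last_input = 0.0
        self.output = 0.0

    def read(self, value):
        # Output only moves once the input change exceeds the dead band.
        if abs(value - self.last_input) > self.dead_band:
            self.last_input = value
            self.output = value
        return self.output

gauge = StickyInstrument(dead_band=0.5)
print(gauge.read(0.3))  # 0.0 -- change within dead band, no response
print(gauge.read(1.0))  # 1.0 -- change exceeds dead band, output follows
```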