
Experimental Planning

Definition of Terms
Readability

Readability indicates the closeness with which the scale of the instrument may be read.
It depends on the scale length, the spacing of graduations, the size of the pointer, and parallax effects.

Least count
Least count is the smallest difference between two indications that can be detected on the
instrument scale.
Like readability, it depends on the scale length, the spacing of graduations, the size of the pointer, and parallax effects.

Definition of Terms (contd.)


Sensitivity
The sensitivity of an instrument is the ratio of the linear movement of the pointer on an analog
instrument to the change in the measured variable causing this motion.
Sensitivity refers to the ability of a measuring device to detect small differences in a
quantity.
For example, a 1-mV recorder might have a 25 cm scale length. Its sensitivity would be 25
cm/mV, assuming that the measurement is linear all across the scale.
For a digital instrument readout the term sensitivity does not have the same meaning,
because different scale factors can be applied at the push of a button.
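
As a minimal arithmetic sketch of this definition, using the recorder example above (the helper function is ours, not from the text):

```python
# Sensitivity of an analog instrument: pointer (or pen) movement per
# unit change in the measured variable, assuming a linear scale.
def sensitivity(scale_movement_cm, input_change_mv):
    return scale_movement_cm / input_change_mv

# 1-mV recorder with a 25 cm scale length
print(sensitivity(25.0, 1.0), "cm/mV")  # 25.0 cm/mV
```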

Definition of Terms (contd.)


Hysteresis

Hysteresis is the difference between the indications of a measuring instrument when the same value
of the measured quantity is reached by increasing or by decreasing that quantity.
An instrument is said to exhibit hysteresis when there is a difference in readings depending
on whether the value of the measured quantity is approached from above or from below.
Hysteresis may be the result of mechanical friction, magnetic effects, elastic deformation,
or thermal effects.

Definition of Terms (contd.)


Accuracy
Accuracy of an instrument indicates the deviation of the reading from a known input.

For example, if a temperature-measuring device reads 102 K, 103 K, 99 K, and 102 K for a
known input of 100 K, then the accuracy of the device is ±3 %; that is, it is accurate within
±3 K over the entire range of the device.
Precision
Precision of an instrument indicates its ability to reproduce a certain reading with a given
accuracy.
Precision is a term that describes an instrument's degree of freedom from random errors.
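
A minimal sketch contrasting the two terms with the temperature readings quoted above; accuracy is taken as the worst deviation from the known input, precision as the scatter of the readings about their own mean:

```python
readings = [102.0, 103.0, 99.0, 102.0]  # K, for a known input of 100 K
known = 100.0

# Accuracy: worst-case deviation from the known input
accuracy = max(abs(r - known) for r in readings)   # 3.0 K -> +/-3 %

# Precision: scatter about the readings' own mean (random error only)
mean = sum(readings) / len(readings)
precision = max(abs(r - mean) for r in readings)   # 2.5 K

print(f"accurate within +/-{accuracy} K, precise within +/-{precision} K")
```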

Definition of Terms (contd.)


Deviation
Deviation is the departure of an instrument reading from a known value.
Error
The deviation of a reading from the known value is called the error.

Uncertainty
In many experimental situations we may not have a known value with which to compare
instrument readings, and yet we feel fairly confident that the instrument is within a certain
plus-or-minus range of the true value. In such cases we say that this plus-or-minus range
expresses the uncertainty of the instrument readings.

Calibration
The calibration of all instruments is important, for it affords the opportunity to check the
instrument against a known standard and subsequently to reduce errors in accuracy.
Calibration procedures involve a comparison of the particular instrument with either
(1) a primary standard
(2) a secondary standard with a higher accuracy than the instrument to be calibrated, or
(3) a known input source.
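
A hypothetical sketch of option (3), calibration against a known input source: tabulate the errors and remove the average systematic bias. The numbers and names are illustrative, not from the text:

```python
known_inputs = [0.0, 25.0, 50.0, 75.0, 100.0]   # from the standard
readings = [1.0, 26.1, 51.0, 76.2, 100.9]       # what the instrument shows

errors = [r - k for r, k in zip(readings, known_inputs)]
bias = sum(errors) / len(errors)                # average systematic error

def corrected(reading):
    # apply the calibration: remove the bias found above
    return reading - bias

print(f"bias {bias:+.2f}, corrected 51.0 -> {corrected(51.0):.2f}")
```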

Standards
In order that investigators in different parts of the country and in different parts of the world
may compare the results of their experiments on a consistent basis, it is necessary to
establish certain standard units of length, weight, time, temperature, and electrical
quantities.

NIST has the primary responsibility for maintaining these standards in the United States.
The meter and the kilogram are considered fundamental units upon which, through
appropriate conversion factors, the English system of length and mass is based.
At one time, the standard meter was defined as the length of a platinum-iridium bar
maintained under very accurate conditions at the International Bureau of Weights and
Measures in Sèvres, France.

Standards (contd.)
Similarly, the kilogram was defined in terms of a platinum-iridium mass maintained at this
same bureau.
1 meter = 39.37 inches
1 pound-mass = 453.59237 grams
Standards of length and mass are maintained at NIST for calibration purposes.
In 1960 the General Conference on Weights and Measures defined the standard meter in
terms of the wavelength of the orange-red light of a krypton-86 lamp.
The standard meter is thus
1 meter = 1,650,763.73 wavelengths

Standards (contd.)
In 1983 the definition of the meter was changed to the distance light travels in
1/299,792,458 of a second.
For the measurement, light from a helium-neon laser illuminates iodine, which fluoresces at
a highly stable frequency.
The inch is defined exactly as
1 inch = 2.54 centimeters
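
The definitions quoted above are mutually consistent, which a few lines of arithmetic can confirm (the wavelength printed is that of the krypton-86 orange-red line):

```python
c = 299_792_458                 # speed of light in m/s, exact by definition

# 1983 definition: distance light travels in 1/299,792,458 s
print(c * (1 / 299_792_458))    # 1.0 m

# 1960 definition: 1,650,763.73 krypton-86 wavelengths per meter
print(1 / 1_650_763.73 * 1e9)   # ~605.78 nm, the orange-red line

# inch-meter conversion
print(1 / 0.0254)               # ~39.37 inches per meter
```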

Experiment Planning
The key to success in experimental work is to ask continually:
What am I looking for?

Why am I measuring this? Does the measurement really answer any of my questions?
What does the measurement tell me?

Experiment Planning (contd.)


Some particular questions which should be asked in the initial phase of experiment
planning are:
1. What primary variables shall be investigated?
2. What control must be exerted on the experiment?
3. What ranges of the primary variables will be necessary to describe the phenomena
under study?
4. How many data points should be taken in the various ranges of operation to ensure good
sampling of data considering instrument accuracy and other factors?

Experiment Planning (contd.)


5. What instrument accuracy is required for each measurement?
6. If a dynamic measurement is involved, what frequency response must the instrument
have?
7. Are the instruments available commercially, or must they be constructed especially for the
particular experiment?
8. What safety precautions are necessary if some kind of hazardous operation is involved in
the experiment?

Experiment Planning (contd.)


9. What financial resources are available to perform the experiment, and how do the
various instrument requirements fit into the proposed budget?
10. What provisions have been made for recording the data?
11. What provisions have been made for either on line or subsequent computer reduction
of data?
12. If the data reduction is not of a research nature, where manipulations and calculations
depend somewhat on the results of measurements, what provisions are made to have the
direct output of a data acquisition system available for the final report?

Generalized experimental procedure


1. Establish the need for the experiment.
2. Establish the optimum budgetary, manpower, and time requirements, including time
sequencing of the project. Modify the scope of the experiment to fit the budget, manpower,
and time schedule that are actually allowable.
3. Begin detailed planning for the experiment: clearly establish the objectives of the experiment. If
the experiments are similar to those of previous investigators, be sure to make use of the
experience of the previous workers. Never overlook the possibility that the work may
have been done before and reported in the literature.

Generalized experimental procedure (contd.)


4. Continue planning by performing the following steps:
Establish the primary variables which must be measured.

Determine as nearly as possible the accuracy which may be required in the primary
measurements and the number of such measurements which will be required for
proper data analysis.
Set up the data-reduction calculations before conducting the experiments, to be sure
that adequate data are being collected to meet the objectives of the experiment.
Analyze the possible errors in the anticipated results before the experiments are
conducted, so that the accuracy requirements on the various measurements may be
modified if necessary; a sketch of such an error analysis follows this list.
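
The root-sum-square (Kline-McClintock) method is the standard way to carry out the pre-experiment error analysis mentioned in the last item; the power-measurement numbers below are illustrative assumptions:

```python
import math

def rss_uncertainty(partials, uncertainties):
    # u_R = sqrt(sum((dR/dx_i * u_i)^2)) over all measured variables x_i
    return math.sqrt(sum((p * u) ** 2 for p, u in zip(partials, uncertainties)))

# Example: power P = E * I measured at E = 100 V, I = 10 A
E, I = 100.0, 10.0
uE, uI = 2.0, 0.2          # anticipated instrument uncertainties
uP = rss_uncertainty([I, E], [uE, uI])   # dP/dE = I, dP/dI = E

print(f"P = {E * I:.0f} W +/- {uP:.1f} W")
# If uP is too large, tighten the accuracy requirement on the
# dominant instrument before running the experiment.
```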

Generalized experimental procedure (contd.)


5. Select instrumentation for the various measurements to match the anticipated accuracy
requirements.

6. Modify the instrumentation and/or procedure in accordance with the findings in
item 5.
7. Collect the bulk of the experimental data and analyze the results.
8. Organize, discuss, and publish the findings and results of the experiments, being sure to
include information pertaining to all items 1 to 7 above.

Primary Stages of Experimental Planning


[Flowchart: the boxes and feedback paths read as follows]
1. Establish the need for the experiment.
2. Establish time and financial limitations.
3. Estimate the scope of the accompanying analytical work (this scope may later be modified).
4. Carefully review previous work in the field (this may modify the original notion of need).
5. Establish general feasibility within the original budget and time limitations; if not feasible, modify the budget and/or time schedule, or discontinue the effort.
Intermediate Stages of Experimental Planning


[Flowchart: the boxes and feedback paths read as follows]
1. Begin preliminary analytical work in order to determine the variable ranges and accuracies required.
2. Specify the required instruments, modifying the specification in accordance with budget limitations.
3. Purchase instruments; arrange for design and construction of equipment.
4. Collect a few preliminary data points and analyze the uncertainty in the data.
5. Compare the preliminary data with the theories available; if the comparison is not favorable, possibly modify the experiment and/or the analysis.
6. Proceed with the analytical work.

Final Stage of Planning


[Flowchart: the boxes read as follows]
1. Collect the bulk of the data.
2. Continue the analysis, including possible computer runs.
3. Match theories with the experiment, correlate the data, and create new theories to explain the data, etc., correlating the results with the need for the experiment.
4. Discuss and publish the results of the experiments.

Static Characteristics of Instruments


Accuracy of measurement is one consideration in the choice of instrument for a particular
application. Other parameters, such as sensitivity, linearity, and the reaction to ambient
temperature changes, are further considerations. These attributes are collectively known as the
static characteristics of instruments.
Accuracy/inaccuracy (measurement uncertainty)
Accuracy of an instrument is a measure of how close the output reading of the instrument
is to the correct value.
In practice, it is more usual to quote the inaccuracy figure rather than the accuracy figure for an
instrument.
Inaccuracy is the extent to which a reading might be wrong, and it is often quoted as a
percentage of the full-scale reading of an instrument.
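
A small sketch of why a percentage-of-full-scale inaccuracy matters at low readings (the values are illustrative):

```python
full_scale = 10.0       # bar: full-scale reading of the instrument
inaccuracy = 1.0        # +/-1 % of full scale, as quoted by the maker

max_error = full_scale * inaccuracy / 100        # +/-0.1 bar everywhere
for reading in (10.0, 5.0, 1.0):
    print(f"at {reading} bar: +/-{100 * max_error / reading:.0f} % of reading")
# The relative error grows at the low end of the scale, which is why
# instruments should be chosen so readings fall near full scale.
```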

Static Characteristics of Instruments (contd.)


Precision/repeatability/reproducibility
Precision is a term that describes an instrument's degree of freedom from random errors.
High precision does not imply anything about measurement accuracy.
A high-precision instrument may have a low accuracy.
Low-accuracy measurements from a high-precision instrument are normally caused by a
bias in the measurements, which is removable by recalibration.

The terms repeatability and reproducibility mean approximately the same thing but are applied in
different contexts, as given below.

Static Characteristics of Instruments (contd.)


Repeatability describes the closeness of output readings when the same input is applied
repetitively over a short period of time, with the same measurement conditions, the same
instrument and observer, the same location, and the same conditions of use maintained
throughout.
Reproducibility describes the closeness of output readings for the same input when there
are changes in the method of measurement, the observer, the measuring instrument, the location,
the conditions of use, and the time of measurement.

Static Characteristics of Instruments (contd.)


Tolerance

Tolerance is a term that is closely related to accuracy and defines the maximum error that
is to be expected in some value.
The accuracy of some instruments is sometimes quoted as a tolerance figure. When used
correctly, tolerance describes the maximum deviation of a manufactured component from
some specified value.
Range or span
The range or span of an instrument defines the minimum and maximum values of the
quantity that the instrument is designed to measure.

Static Characteristics of Instruments (contd.)


Linearity
It is normally desirable that the output reading of an instrument be linearly proportional to
the quantity being measured.
Sensitivity of measurement
The sensitivity of measurement is a measure of the change in instrument output that
occurs when the quantity being measured changes by a given amount.
For example, if a pressure of 2 bar produces a deflection of 10 degrees in a pressure
transducer, the sensitivity of the instrument is 5 degrees/bar.
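
In practice the sensitivity is obtained as the slope of the output-versus-input calibration line; a sketch extending the transducer example above with assumed calibration points:

```python
pressures = [0.0, 1.0, 2.0, 3.0, 4.0]        # bar (assumed calibration inputs)
deflections = [0.0, 5.0, 10.0, 15.0, 20.0]   # degrees of output deflection

# Least-squares slope of deflection vs. pressure = sensitivity
n = len(pressures)
sx, sy = sum(pressures), sum(deflections)
sxy = sum(x * y for x, y in zip(pressures, deflections))
sxx = sum(x * x for x in pressures)
sensitivity = (n * sxy - sx * sy) / (n * sxx - sx * sx)
print(sensitivity, "degrees/bar")  # 5.0
```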

Static Characteristics of Instruments (contd.)


Threshold
If the input to an instrument is increased gradually from zero, there is a minimum value of
input below which no change in the output can be detected. This minimum value is known as the threshold.
Resolution
The minimum change in the input signal that is required to cause an appreciable change in the
output is known as the resolution.

Static Characteristics of Instruments (contd.)


Sensitivity to disturbance
All calibrations and specifications of an instrument are valid only under controlled
conditions of temperature, pressure, etc.
These standard ambient conditions are usually defined in the instrument specification.
As variations occur in the ambient temperature, certain static instrument characteristics
change, and the sensitivity to disturbance is a measure of the magnitude of this change.
Such environmental changes affect instruments in two main ways, known as zero drift
and sensitivity drift.

Static Characteristics of Instruments (contd.)


Zero drift or bias
Zero drift or bias describes the effect whereby the zero reading of an instrument is modified
by a change in ambient conditions.
For example, a bathroom scale that shows a nonzero reading with no load on it exhibits zero drift (bias).

Static Characteristics of Instruments (contd.)


Sensitivity drift
Sensitivity drift defines the amount by which an instrument's sensitivity of measurement
varies as ambient conditions change.
For example, the stiffness of the spring in a spring balance varies with temperature, so the
deflection produced by a given weight drifts with the ambient temperature.
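
A sketch of how the two drifts are quantified from calibrations taken at two ambient temperatures (all numbers are assumptions):

```python
# Calibration line: output = zero + sensitivity * input
zero_20, sens_20 = 0.0, 1.00    # calibrated at 20 C
zero_30, sens_30 = 0.2, 1.02    # same instrument at 30 C

dT = 30.0 - 20.0
zero_drift = (zero_30 - zero_20) / dT    # 0.02 output units per deg C
sens_drift = (sens_30 - sens_20) / dT    # 0.002 (output/input) per deg C
print(zero_drift, sens_drift)
```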

Static Characteristics of Instruments (contd.)


Hysteresis effects
Not all of the energy put into a stressed component during loading is recovered upon
unloading.
The output of a measurement therefore depends partly on the history of the input; this effect is called hysteresis.
It is defined as the magnitude of the error caused in the output for a given value of input
when this value is approached from opposite directions, i.e., in ascending and then
descending order.
It is caused by backlash and elastic deformation, but is mainly due to frictional
effects.

Static Characteristics of Instruments (contd.)


Hysteresis effects are best reduced by taking observations in both directions, i.e., with
ascending and then descending values of the input, and taking the arithmetic mean, as in the sketch below.
Hysteresis is most commonly found in instruments that contain springs, such as the passive
pressure gauge and the Prony brake.
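
A sketch of this averaging procedure with assumed up-scale and down-scale readings:

```python
inputs = [0, 25, 50, 75, 100]
ascending = [0.0, 24.0, 49.0, 74.0, 100.0]   # readings while loading
descending = [0.0, 26.0, 51.0, 76.0, 100.0]  # readings while unloading

averaged = [(a + d) / 2 for a, d in zip(ascending, descending)]
max_hyst = max(abs(a - d) for a, d in zip(ascending, descending))
print(f"max hysteresis: {max_hyst}; averaged readings: {averaged}")
```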
Dead space

Dead space is defined as the range of different input values over which there is no change
in the output value.
