
Accuracy and precision

Accuracy, in science, engineering, industry and statistics, is the degree of conformity of a measured/calculated quantity to its actual (true) value. Precision (also called reproducibility or repeatability) is the degree to which further measurements or calculations will show the same or similar results.
The results of a measurement or calculation can be accurate but not precise, precise but not accurate, neither, or both; if a result is both accurate and precise, it is called valid.
The related terms in surveying are error (random variability in research) and bias (nonrandom or directed effects caused by a factor or factors unrelated to the independent variable).

The target analogy

[Figure: two target diagrams, one showing high accuracy but low precision, the other high precision but low accuracy]


By far the most common analogy used to explain the concept is the target comparison. Repeated measurements are compared to arrows fired at a target. Accuracy describes the closeness of the arrows to the bullseye at the target center; arrows that strike closer to the bullseye are considered more accurate. The closer a system's measurements are to the accepted value, the more accurate the system is considered to be.
To continue the analogy, if a large number of arrows are fired, precision is the size of the arrow cluster. (When only one arrow is fired, precision is the size of the cluster one would expect if the shot were repeated many times under the same conditions.) When all arrows are grouped tightly together, the cluster is considered precise, since they all struck close to the same spot, if not necessarily near the bullseye. The measurements are precise, though not necessarily accurate.

However, it is not possible to reliably achieve accuracy in individual measurements without precision: if the arrows are not grouped close to one another, they cannot all be close to the bullseye. (Their average position might be an accurate estimate of the bullseye, but the individual arrows are inaccurate.)
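A quick numerical illustration of this point (a hypothetical Python simulation, not part of the original text): unbiased but widely scattered arrows average out near the bullseye, even though each individual arrow lands far from it.

    import random

    random.seed(1)

    # Simulate 1000 imprecise arrows: unbiased aim at (0, 0), large scatter.
    arrows = [(random.gauss(0.0, 5.0), random.gauss(0.0, 5.0))
              for _ in range(1000)]

    # The mean position is close to the bullseye (accurate on average)...
    mean_x = sum(x for x, _ in arrows) / len(arrows)
    mean_y = sum(y for _, y in arrows) / len(arrows)
    print(f"mean position: ({mean_x:.2f}, {mean_y:.2f})")  # near (0, 0)

    # ...yet the typical individual arrow misses by a wide margin (imprecise).
    mean_miss = sum((x * x + y * y) ** 0.5 for x, y in arrows) / len(arrows)
    print(f"average miss distance: {mean_miss:.2f}")  # about 6.3 units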
Accuracy is the degree of veracity while precision is the degree of reproducibility.
See also Circular error probable for an application of precision to the science of ballistics.

Quantifying accuracy and precision


Ideally a measurement device is both accurate and precise, with measurements all close
to and tightly clustered around the known value.
The accuracy and precision of a measurement process are usually established by repeatedly measuring some traceable reference standard. Such standards are defined in the International System of Units and maintained by national standards organizations such as the National Institute of Standards and Technology.

Precision is usually characterised in terms of the standard deviation of the measurements, sometimes called the measurement process's standard error. The interval defined by the standard deviation is the 68.3% ("one sigma") confidence interval of the measurements. If enough measurements have been made to accurately estimate the standard deviation of the process, and if the measurement process produces normally distributed errors, then it is likely that 68.3% of the time the true value of the measured property will lie within one standard deviation, 95.4% of the time within two standard deviations, and 99.7% of the time within three standard deviations of the measured value.
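The following snippet (an illustrative sketch with assumed values, not taken from the article) simulates a normally distributed measurement process and verifies these coverage figures empirically:

    import random

    random.seed(0)

    TRUE_VALUE = 10.0  # assumed true value of the measured property
    SIGMA = 0.5        # assumed standard deviation of the process

    measurements = [random.gauss(TRUE_VALUE, SIGMA) for _ in range(100_000)]

    for k in (1, 2, 3):
        hits = sum(abs(m - TRUE_VALUE) <= k * SIGMA for m in measurements)
        print(f"within {k} sigma: {hits / len(measurements):.1%}")
    # Prints approximately 68.3%, 95.4% and 99.7%.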

This also applies when measurements are repeated and averaged. In that case, the term
standard error is properly applied: the precision of the average is equal to the known
standard deviation of the process divided by the square root of the number of
measurements averaged. Further, the central limit theorem shows that the probability
distribution of the averaged measurements will be closer to a normal distribution than
that of individual measurements.
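A short sketch of this relationship (with assumed numbers): averaging N measurements shrinks the spread by a factor of the square root of N.

    import random
    import statistics

    random.seed(0)

    SIGMA = 0.5  # assumed standard deviation of a single measurement
    N = 25       # measurements per average

    # Compute 10,000 averages of N measurements each.
    averages = [statistics.fmean(random.gauss(10.0, SIGMA) for _ in range(N))
                for _ in range(10_000)]

    # The spread of the averages matches sigma / sqrt(N).
    print(f"observed:  {statistics.stdev(averages):.4f}")  # about 0.1
    print(f"predicted: {SIGMA / N ** 0.5:.4f}")            # exactly 0.1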
With regard to accuracy we can distinguish:

the bias, i.e. the difference between the mean of the measurements and the reference value (establishing and correcting for bias is necessary for calibration); and
the combined effect of bias and precision, illustrated in the sketch below.
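One standard way to quantify the combined effect is the root-mean-square error, which satisfies RMSE^2 = bias^2 + (standard deviation)^2. The snippet below (an illustrative sketch with made-up readings) computes all three quantities:

    import statistics

    REFERENCE = 100.0  # hypothetical traceable reference value
    readings = [100.8, 101.1, 100.9, 101.3, 100.7]  # made-up measurements

    bias = statistics.fmean(readings) - REFERENCE  # systematic offset
    spread = statistics.pstdev(readings)           # precision (population sd)
    rmse = statistics.fmean((r - REFERENCE) ** 2 for r in readings) ** 0.5

    print(f"bias = {bias:.3f}, precision = {spread:.3f}, rmse = {rmse:.3f}")
    # rmse**2 equals bias**2 + spread**2, up to floating-point rounding.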

A common convention in science and engineering is to express accuracy and/or precision implicitly by means of significant figures. Here, when not explicitly stated, the margin of error is understood to be one-half the value of the last significant place. For instance, a recording of '8430 m' would imply a margin of error of 5 m (the last significant place is the tens place), while '8000 m' would imply a margin of 500 m. To indicate a more accurate measurement that just happens to lie near a round number, one would use scientific notation: '8.000 x 10^3 m' indicates a margin of 0.5 m. However, reliance on this convention can lead to false precision errors when accepting data from sources that do not obey it.
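For example (an illustrative snippet, not part of the original text), Python's format specifiers can make the intended number of significant figures, and hence the implied margin of error, explicit:

    value = 8000.0  # metres

    print(f"{value:.0f} m")   # '8000 m'      -> implied margin of 500 m
    print(f"{value:.3e} m")   # '8.000e+03 m' -> implied margin of 0.5 m

    # The implied margin of error is half the value of the last
    # significant place, e.g. 5 m for '8430 m' (tens place).
    def implied_margin(last_place: float) -> float:
        return last_place / 2

    print(implied_margin(10.0))  # 5.0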
Looking at this in another way, a value of 8 would mean that the measurement has been made with a precision of '1' (the measuring instrument was able to measure only up to the ones place), whereas a value of 8.0 (though mathematically equal to 8) would mean that the value at the first decimal place was measured and was found to be zero. (The measuring instrument was able to measure the first decimal place.) The second value is more precise. Neither of the measured values may be accurate (the actual value could be 9.5 but measured inaccurately as 8 in both instances).
Precision is sometimes stratified into:

Repeatability - the variation arising when all efforts are made to keep conditions constant by using the same instrument and operator, and repeating measurements during a short time period; and
Reproducibility - the variation arising when the same measurement process is used among different instruments and operators, and over longer time periods (see the sketch after this list).

With respect to a set of independent devices of the same design, precision is the ability of these devices to produce the same value or result, given the same input conditions and operating in the same environment. With respect to a single device, put into operation repeatedly without adjustments, precision is the ability to produce the same value or result, given the same input conditions and operating in the same environment. (Both as defined by Federal Standard 1037C and MIL-STD-188.)
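A minimal sketch of the distinction, using made-up readings (operator names and numbers are purely illustrative): repeatability reflects the spread within a single operator-and-instrument combination, while reproducibility also absorbs the differences between combinations.

    import statistics

    # Hypothetical readings of the same part, grouped by operator.
    readings = {
        "operator_a": [10.01, 10.02, 10.00, 10.02],
        "operator_b": [10.11, 10.09, 10.10, 10.12],
    }

    # Repeatability: average within-group spread (conditions held constant).
    repeatability = statistics.fmean(
        statistics.pstdev(group) for group in readings.values())

    # Reproducibility: spread of all readings pooled across operators,
    # which also picks up the systematic offset between them.
    pooled = [x for group in readings.values() for x in group]
    reproducibility = statistics.pstdev(pooled)

    print(f"repeatability:   {repeatability:.4f}")    # small
    print(f"reproducibility: {reproducibility:.4f}")  # larger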

accuracy (ăk′yər-ə-sē)


n.
1. Conformity to fact.
2. Precision; exactness.
3. The ability of a measurement to match the actual value of the quantity being
measured.
The noun accuracy has 2 meanings:
Meaning #1: the quality of nearness to the truth or the true value
Synonym: truth
Antonym: inaccuracy (meaning #1)
Meaning #2: (mathematics) the number of significant figures given in a number
