Accuracy and precision

In science, engineering, industry and statistics, accuracy is the degree of conformity of a measured or calculated quantity to its actual, nominal, absolute, or some other reference value. Precision characterises the degree of mutual agreement or repeatability among a series of individual measurements, values, or results.

A useful analogy

In a common analogy illustrating the difference between accuracy and precision, repeated measurements are compared to arrows that are fired at a target. Accuracy describes the closeness of the arrows to the bullseye at the target centre. Arrows that strike closer to the bullseye are considered more accurate. The closer a system's measurements are to the accepted value, the more accurate the system is considered to be.

To continue the analogy, precision would be the size of the arrow cluster. When all arrows are grouped tightly together, the cluster is considered precise, since they all struck close to the same spot, if not necessarily near the bullseye. The measurements are precise, though not necessarily accurate.

However, it is not possible to reliably achieve accuracy without precision: if the arrows are not grouped close to one another, they cannot all be close to the bullseye.
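
To make the analogy concrete, the short Python sketch below (with invented arrow coordinates) treats each arrow as an (x, y) hit on a target whose bullseye is at the origin: the distance from the cluster's centre to the bullseye reflects inaccuracy, while the spread of the hits around their own centre reflects imprecision.

  import math

  # Hypothetical arrow positions; the bullseye is at (0, 0).
  hits = [(2.1, 1.8), (2.4, 2.0), (1.9, 2.2), (2.2, 1.9)]

  # Centre of the cluster (mean x, mean y).
  cx = sum(x for x, _ in hits) / len(hits)
  cy = sum(y for _, y in hits) / len(hits)

  # Accuracy: how far the cluster centre lies from the bullseye.
  accuracy_offset = math.hypot(cx, cy)

  # Precision: average distance of each hit from the cluster centre.
  precision_spread = sum(math.hypot(x - cx, y - cy) for x, y in hits) / len(hits)

  print(f"offset from bullseye (inaccuracy): {accuracy_offset:.2f}")
  print(f"spread of the cluster (imprecision): {precision_spread:.2f}")

With these sample coordinates the cluster is tight but sits well away from the bullseye, so the readings are precise but not accurate.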

Quantifying accuracy and precision

Ideally a measurement device is both accurate and precise, with measurements all close to and tightly clustered around the known value.

The accuracy and precision of a measurement process are usually established by repeatedly measuring some traceable reference standard. Such standards are defined in the International System of Units and maintained by national standards organisations such as the National Institute of Standards and Technology.

[Image: Accuracy and precision]

Precision is usually characterised in terms of the standard deviation of the measurements, sometimes called the measurement process's standard error.

With regard to accuracy we can distinguish:

  • the bias, i.e. the difference between the mean of the measurements and the reference value. Establishing and correcting for bias is necessary for calibration; and
  • the combined effect of bias and precision (a sketch estimating both follows this list).
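
As a rough illustration of these two components, the following Python sketch uses invented readings taken against a hypothetical reference value: the bias is estimated as the difference between the sample mean and the reference, and precision is summarised by the sample standard deviation.

  import statistics

  reference_value = 100.0  # known value of the traceable reference standard (assumed)
  measurements = [100.4, 100.6, 100.5, 100.3, 100.5, 100.4]  # invented repeated readings

  mean = statistics.mean(measurements)
  bias = mean - reference_value               # accuracy component: systematic offset
  precision = statistics.stdev(measurements)  # precision: spread of the repeated readings

  print(f"mean = {mean:.3f}, bias = {bias:+.3f}, standard deviation = {precision:.3f}")

  # Calibration would correct future readings by subtracting the estimated bias.
  corrected = [m - bias for m in measurements]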

A common convention in science and engineering is to express accuracy and/or precision implicitly by means of significant figures. Here, when not explicitly stated, the margin of error is understood to be one-half the value of the last significant place. For instance, a recording of '8430 m' would imply a margin of error of 5 m (the last significant place is the tens place), while '8000 m' would imply a margin of error of 500 m. To indicate a more accurate measurement that just happens to lie near a round number, one would use scientific notation: '8.000 x 10^3 m' indicates a margin of error of 0.5 m. However, reliance on this convention can lead to false precision errors when accepting data from sources that do not obey it.
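
The hypothetical Python helper below sketches this convention: given a recorded number (without units), it returns the implied margin of error as one-half the value of the last significant place, treating the trailing zeros of a plain integer as non-significant placeholders, as in the examples above.

  def implied_margin(recorded: str) -> float:
      """Half the value of the last significant place of a recorded number."""
      text = recorded.strip().lower()
      mantissa, _, exp_part = text.partition('e')
      exponent = int(exp_part) if exp_part else 0

      if '.' in mantissa:
          # Digits written after a decimal point are all significant.
          decimals = len(mantissa.split('.')[1])
          place = exponent - decimals
      else:
          # Plain integer: trailing zeros are treated as placeholders.
          digits = mantissa.lstrip('+-')
          trailing_zeros = len(digits) - len(digits.rstrip('0'))
          place = exponent + trailing_zeros

      return 0.5 * 10 ** place

  print(implied_margin('8430'))     # 5.0   (last significant place: tens)
  print(implied_margin('8000'))     # 500.0 (last significant place: thousands)
  print(implied_margin('8.000e3'))  # 0.5   (scientific notation: ones place)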

Precision is sometimes stratified into:

  • Repeatability - the variation arising when all efforts are made to keep conditions constant by using the same instrument and operator, and repeating during a short time period; and
  • Reproducibility - the variation arising when using the same measurement process among different instruments and operators, and over longer time periods (a simplified sketch follows this list).
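
As a simplified illustration (not a full gauge repeatability and reproducibility study, which would use an ANOVA-based model), the Python sketch below assumes the same quantity is measured several times by each of several operators: repeatability is estimated as the pooled within-operator standard deviation, and the spread of the operator means stands in for reproducibility. The readings are invented.

  import statistics

  readings_by_operator = {
      "operator_A": [10.02, 10.03, 10.01, 10.02],
      "operator_B": [10.06, 10.05, 10.07, 10.06],
      "operator_C": [9.99, 10.00, 9.98, 10.00],
  }

  # Repeatability: pool the within-operator variances (equal group sizes assumed).
  within_variances = [statistics.variance(r) for r in readings_by_operator.values()]
  repeatability = (sum(within_variances) / len(within_variances)) ** 0.5

  # Reproducibility (simplified): spread of the per-operator means.
  operator_means = [statistics.mean(r) for r in readings_by_operator.values()]
  reproducibility = statistics.stdev(operator_means)

  print(f"repeatability (within-operator spread): {repeatability:.4f}")
  print(f"reproducibility (between-operator spread): {reproducibility:.4f}")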

The contents of this article are licensed from Wikipedia.org under the GNU Free Documentation License.