Difference between Accuracy and Precision

Accuracy and precision are two important factors to consider when taking data measurements. Accuracy reflects how close a measurement is to a known or accepted value, while precision reflects how reproducible measurements are, even if they are far from the accepted value.
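As a quick illustration, here is a minimal Python sketch (the reference value, the readings, and the code itself are made-up examples, not from the original article) comparing two hypothetical sets of repeated measurements of a 10.0 g reference mass: the first set is accurate on average but not precise, the second is precise but not accurate.

from statistics import mean, stdev

TRUE_VALUE = 10.0  # hypothetical accepted value, in grams

# Hypothetical repeated measurements of the same reference mass.
accurate_not_precise = [9.2, 10.9, 9.5, 10.6, 9.8]    # mean near 10.0, widely scattered
precise_not_accurate = [11.2, 11.3, 11.2, 11.1, 11.2]  # tightly clustered, but far from 10.0

for name, readings in [("accurate but not precise", accurate_not_precise),
                       ("precise but not accurate", precise_not_accurate)]:
    avg = mean(readings)
    spread = stdev(readings)
    print(f"{name}: mean = {avg:.2f} g "
          f"(off by {avg - TRUE_VALUE:+.2f} g), spread = {spread:.2f} g")

Running it shows the first set's mean lands on the accepted value but with a large spread, while the second set clusters tightly around the wrong value.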

Definition

There are two common definitions of the term accuracy. In math, science, and engineering, it refers to how close a measurement is to the true value.

The ISO applies a more rigid definition, in which accuracy refers to a measurement with both true and consistent results. Under the ISO definition, an accurate measurement has no systematic error and no random error. Essentially, the ISO advises that the term accurate be used only when a measurement is both accurate and precise.

Precision is how consistent results are when measurements are repeated. Precise values differ from each other because of random error, which is a form of observational error.
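To make the connection between the two error types and the two terms concrete, the following sketch (an illustrative Python example with assumed readings and a hypothetical describe() helper) estimates systematic error as the bias of repeated readings, which limits accuracy, and random error as their standard deviation, which limits precision.

from statistics import mean, stdev

def describe(readings, true_value):
    """Estimate systematic error (bias) and random error (spread) of repeated readings."""
    bias = mean(readings) - true_value   # systematic error -> limits accuracy
    spread = stdev(readings)             # random error -> limits precision
    return bias, spread

# Hypothetical thermometer readings of a water bath held at 37.0 degrees C.
readings = [37.4, 37.5, 37.3, 37.5, 37.4]
bias, spread = describe(readings, true_value=37.0)
print(f"bias = {bias:+.2f} C (accuracy), spread = {spread:.2f} C (precision)")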

Difference between Accuracy and Precision

Difference | Accuracy | Precision
Meaning | The level of agreement between the measured value and the true or accepted value | The level of variation among several measurements of the same quantity
Represents | How closely a result agrees with the standard value | How closely the results agree with one another
Degree | Degree of conformity | Degree of reproducibility
Factor | Single factor | Multiple factors
Measurement of | Statistical bias | Statistical variability
Concerned with | Systematic error | Random error
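
The last two rows of the table can be demonstrated with a small simulation. The sketch below (illustrative Python with made-up parameters) adds a constant offset to every simulated reading to mimic systematic error and Gaussian noise to mimic random error; the bias of the results tracks the offset, while the spread tracks the noise.

import random
from statistics import mean, stdev

random.seed(42)
TRUE_VALUE = 100.0        # hypothetical true value of the quantity being measured
SYSTEMATIC_OFFSET = 2.5   # constant bias added to every reading (systematic error)
RANDOM_NOISE_SD = 0.8     # standard deviation of the scatter (random error)

# Simulate 1000 repeated measurements with both error types present.
readings = [TRUE_VALUE + SYSTEMATIC_OFFSET + random.gauss(0.0, RANDOM_NOISE_SD)
            for _ in range(1000)]

print(f"bias = {mean(readings) - TRUE_VALUE:.2f} (dominated by the systematic offset)")
print(f"spread = {stdev(readings):.2f} (dominated by the random noise)")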