# Accuracy, Precision, and Measurement Uncertainty

## Accuracy vs Precision

Accuracy and precision are two terms often used interchangeably. For scientists, however, the distinction between the two is critical. **Accuracy** describes how close a measurement is to the actual (true) value. **Precision**, on the other hand, describes how close repeated measurements are to each other.

Precision is independent of accuracy: measurements can be accurate without being precise, or precise without being accurate. The easiest way to illustrate the difference is with a dartboard, where the center (bull's eye) represents the actual (true) value.

If the darts land close to the center, they are accurate; if they land close to one another, regardless of where they are on the board, they are precise.

The accuracy and precision of measurements depend mainly on the precision of the measuring tool. In general, a precise measuring tool is one that can measure values in very small increments. For example, a standard ruler can measure length to the nearest millimeter, while a caliper can measure length to the nearest 0.01 millimeter. The caliper is a more precise measuring tool because it can measure extremely small differences in length.

Because science is built on observation and experiment, both accuracy and precision are needed to obtain reliable results.

## Measurement Uncertainty

If you were to give one hundred people a measuring tape and ask them to measure the same wooden board, you would get a multitude of different measurements. That is mainly because no measurement is ever perfect, and all measurements are subject to some degree of doubt or measurement uncertainty.

We can define the **uncertainty of measurement** as a quantitative measure of how much your measured values deviate from a standard or expected value. If your measurements are not very accurate or precise, then the uncertainty of your values will be very high. In more general terms, uncertainty can be thought of as a disclaimer for your measured values.

The uncertainty in a measurement A is often denoted δA ("delta A"), so the measurement result is recorded as A ± δA. For instance, a length could be expressed as 10 cm ± 0.2 cm.
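As a quick illustration (a minimal Python sketch, not part of the original text), the A ± δA notation can be captured in a small helper class; `Measurement` is a hypothetical name introduced here:

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    value: float        # measured value A
    uncertainty: float  # absolute uncertainty δA

    def __str__(self) -> str:
        # Report the result in the A ± δA form.
        return f"{self.value} ± {self.uncertainty}"

length = Measurement(10.0, 0.2)  # the 10 cm ± 0.2 cm example above
print(length)  # 10.0 ± 0.2
```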

**Uncertainty Factors include:**

- Limitations of the measuring device.
- The skill of the person making the measurement.
- Irregularities in the object being measured.
- Any other factors that might affect the outcome (highly dependent on the situation).

### A Real-Life Example of Uncertainty

Uncertainty is a critical piece of information, both in physics and in many other real-life applications. Imagine you are caring for a sick child. You suspect the child has a fever, so you check his or her temperature with a thermometer. What if the uncertainty of the thermometer were 3°C?

It means that if the child's temperature reading was 37°C, which is normal body temperature, the actual temperature could be anywhere from a hypothermic 34°C to a dangerously high 40°C. Thus a thermometer with an uncertainty of 3°C would be quite useless.

### Percent Uncertainty

One way of expressing uncertainty is as a percent of the measured value. If a measurement A is expressed with uncertainty, δA, the percent uncertainty (%unc) is defined as:

**Percent Uncertainty Formula**

%unc = (δA / A) × 100%
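The definition translates directly into a one-line Python function (an illustrative sketch; `percent_uncertainty` is a name assumed here, not a standard library call):

```python
def percent_uncertainty(value: float, uncertainty: float) -> float:
    """Percent uncertainty of a measurement: (δA / A) × 100%."""
    return uncertainty / value * 100

# For the earlier length example, 10 cm ± 0.2 cm:
print(percent_uncertainty(10.0, 0.2))  # 2 percent
```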

### Percent Uncertainty Example

Let's say you buy a car advertised to run 15 km per liter, and you want to test the accuracy of that claim. You drive a distance of 45 km and record the fuel consumption for each 15 km segment, ending up with the following measurements:

- First 15 km: 0.95 liters
- Second 15 km: 1.10 liters
- Third 15 km: 1.06 liters

You determine that the uncertainty in the fuel consumption per 15 km is ±0.15 liter. What is the percent uncertainty?

**Solution:**

First compute the average consumption per 15 km segment, which serves as the measured value: A = (0.95 + 1.10 + 1.06) / 3 ≈ 1.04 liters. Then substitute into the formula with δA = 0.15 liters: %unc = (0.15 / 1.04) × 100% ≈ 14%.
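The same arithmetic can be checked with a short Python sketch of this example (variable names are illustrative):

```python
# Fuel consumed over each 15 km segment, in liters.
consumptions = [0.95, 1.10, 1.06]
delta_a = 0.15  # estimated uncertainty, in liters

# Average consumption per 15 km segment (the measured value A).
a = sum(consumptions) / len(consumptions)

# Percent uncertainty: (delta_a / A) x 100%.
percent_unc = delta_a / a * 100

print(f"average = {a:.2f} L, percent uncertainty = {percent_unc:.0f}%")
# average = 1.04 L, percent uncertainty = 14%
```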

### Frequently Asked Questions

- Are accuracy and precision the same thing?
- No, accuracy refers to how close your measurements are to the actual "true" value, whereas precision is how close the measurements are to one another regardless of how close they are to the actual "true" value.

- Are accuracy and precision dependent on each other?
- No, accuracy is independent of precision. For instance, measurements can be accurate without being precise, or they can be very precise but not accurate at all.

- Is accuracy or precision more important?
- Both accuracy and precision are important for the most reliable results: you want to be as close as possible to the actual "true" value of what you are measuring (accuracy) and to get consistent results each time (precision).

- What causes measurement uncertainty?
- There are many factors that might affect a measurement's uncertainty. For instance, the accuracy and precision of the measuring device, the skill of the person making the measurement, irregularities in the object being measured, and any other factors that might affect the outcome.

**References:**

Urone, Paul Peter. "College Physics - Accuracy, Precision, and Significant Figures." OpenStax, June 21, 2012. https://openstax.org/books/college-physics/pages/1-3-accuracy-precision-and-significant-figures.