“Measure twice, cut once” is great advice you can apply in many different situations. But what if you can’t trust your measurements? Suddenly, it doesn’t matter how carefully you work; everything is off. An equipment calibration program is how you ensure measurement devices give you accurate readings within predetermined tolerance levels. With regular, systematic calibration, you get measurements you can trust, and that means consistent, safe processes and products.

What is equipment calibration?

Equipment calibration is a two-step process. First, you measure your equipment’s accuracy by determining the variance between the results and an objective, established standard. You then fine-tune the equipment to be within an acceptable range.

The size of the range depends on what you are measuring and the equipment you are using. The scales you use in a pharmaceutical research lab need to be much more accurate than the ones you use at the warehouse distribution center when loading a truck.

What types of instruments are you most likely calibrating? It depends on your industry, but generally the most common are instruments for measuring pressure, temperature, flow, mechanical properties, and electrical values.

Pressure instruments include sensors and barometers. For temperature, you have thermometers and thermocouples. Mechanical measurements cover volume, mass, force, torque, and vibration. For electrical, you can measure voltage, current, resistance, and frequency.

What is the difference between accuracy and precision?

Calibration is how you ensure your measurement devices are both accurate and precise. Which leads us to an important question when talking about calibration: What’s the difference between accuracy and precision?

An accurate definition of precision

Precision tells you about the distance between results. If you were playing a game of darts, precision measures how close those darts are to one another on the board. It’s unrelated to how close any of them are to the bullseye. If your darts are all close together, your aim is precise. But if they’re far apart (even if some of them are close to or even directly on the bullseye), your aim is imprecise.

A precise definition of accuracy

Accuracy tells you about the distance between a result and the true value. So, in that same game of darts, you’re accurate if you’re hitting close to or right inside the bullseye.

More examples: If you’re at sea level, water should boil at 100°C. If your thermometer shows 99.9°C, it’s accurate. If your scale says a one-pound weight is three pounds, it’s inaccurate. If the amp draw is value X, and your multimeter reads value X, it’s accurate.
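The darts analogy can be made concrete with a little arithmetic. The sketch below (a hypothetical example; the throw positions are made up) treats each throw as an (x, y) offset from the bullseye, then measures accuracy as the distance from the average landing point to the bullseye, and precision as how tightly the throws cluster around that average:

```python
import math

# Hypothetical dart positions as (x, y) offsets from the bullseye at (0, 0).
throws = [(0.9, 1.0), (1.1, 0.9), (1.0, 1.1)]

# Accuracy: how close the average landing point is to the bullseye.
mean_x = sum(x for x, _ in throws) / len(throws)
mean_y = sum(y for _, y in throws) / len(throws)
accuracy_error = math.hypot(mean_x, mean_y)

# Precision: how far throws scatter from their own average landing point.
spread = max(math.hypot(x - mean_x, y - mean_y) for x, y in throws)

print(f"distance from bullseye: {accuracy_error:.2f}")  # large -> inaccurate
print(f"cluster spread: {spread:.2f}")                  # small -> precise
```

Here the throws land in a tight cluster (precise) that sits well away from the bullseye (inaccurate), which is exactly the combination calibration is meant to catch and correct.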

Whether you’re checking precision or chasing accuracy, there are two types of calibration.

Relative calibration

If you have an already calibrated instrument, you can use it to calibrate a second one. For example, you can calibrate a flow gauge by comparing its results to those from an already calibrated gauge.
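A relative calibration check can be sketched as a simple comparison: take paired readings from the reference instrument and the device under test, then see whether the worst relative error stays inside a chosen tolerance. The readings and the 2% tolerance below are illustrative assumptions, not industry values:

```python
# Paired readings: a calibrated reference flow gauge vs. the gauge under test.
reference_readings = [10.1, 20.0, 30.0, 40.0]
device_readings    = [10.2, 20.3, 30.5, 40.6]
tolerance = 0.02  # allow up to 2% relative error (assumed for illustration)

# Relative error of the device at each reference point.
errors = [abs(d - r) / r for d, r in zip(device_readings, reference_readings)]
worst = max(errors)

if worst <= tolerance:
    print(f"PASS: worst error {worst:.1%} is within tolerance")
else:
    print(f"FAIL: worst error {worst:.1%}, fine-tuning needed")
```

If the check fails, you adjust the device and repeat, which is the “fine-tune until it’s within an acceptable range” half of the two-step process described earlier.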

Absolute calibration

Here, you compare the equipment’s results against an established value. In the case of the kilogram, for example, there’s a whole history of how we defined that unit of mass. In 1795, the kilogram was defined as 1,000 grams, with the gram being the mass of one cubic centimeter of water.

From 1889 to 2019, everyone used the International Prototype of the Kilogram, a golf-ball-sized mass made from an alloy of platinum and iridium. Today, we define the kilogram using a combination of the Planck constant, the speed of light, and the cesium standard.

Why is equipment calibration important?

“Measure twice, cut once” only works if you trust your measurements. If you’re getting inaccurate numbers, you’re going to make mistakes. And depending on your industry, those mistakes can range from frustrating to dangerous.

Calibration is critical for quality control. Every product in a food processing plant has a strict recipe that ensures uniform taste and texture. If you’re baking bread, you need to know there are the right amounts of flour and yeast. If your scales are off, so is every loaf.

Calibration also counts in process control. In a manufacturing setting, for example, if you’re not heating parts to a high enough temperature, you can’t trust the paint to bond correctly. In the chemical, oil and gas, food, pharmaceutical, agrichemical, and materials industries, where high-pressure processes are common, faulty readings can lead to lower yields and explosive failures.

How can you set up an equipment calibration program?

ISO 9001, from the International Organization for Standardization, contains many of the most common criteria for quality management, including procedures for calibrating equipment.

Start by making a complete list of the equipment you need to keep calibrated. Include names, descriptions, and locations. Next, determine appropriate frequencies, making sure to consider the equipment’s:

  • Purpose
  • Specifications
  • Use
  • Reliability
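The equipment list and calibration frequencies above can be tracked in something as simple as a structured register. The sketch below is a minimal, hypothetical example (the items, locations, intervals, and dates are all made up) showing how you might flag which instruments are due:

```python
from datetime import date, timedelta

# Hypothetical calibration register: name, location, interval, last calibration.
equipment = [
    {"name": "Lab scale A", "location": "QC lab",
     "interval_days": 90, "last_calibrated": date(2023, 1, 10)},
    {"name": "Line thermometer 3", "location": "Oven line",
     "interval_days": 30, "last_calibrated": date(2023, 3, 1)},
]

def due_items(register, today):
    """Return equipment whose next calibration date is on or before today."""
    return [
        item for item in register
        if item["last_calibrated"] + timedelta(days=item["interval_days"]) <= today
    ]

for item in due_items(equipment, today=date(2023, 4, 15)):
    print(f'{item["name"]} ({item["location"]}) is due for calibration')
```

In practice this information usually lives in a CMMS or calibration management system rather than a script, but the underlying record — what, where, how often, and when last done — is the same.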

Remember that requirements differ depending on the industry. For example, in a calibration program for food processing, according to the U.S. Code of Federal Regulations 21 CFR Part 113, “Each thermometer should have a tag, seal or other means of identity that includes the date on which it was last tested for accuracy.” And under the Hazard Analysis and Critical Control Points (HACCP) regulations for meat, poultry, seafood, and juice, you need “Records that document the calibration of process monitoring instruments.”

Now that you have the what and when, you need to consider the who. Choose and train your calibration team carefully. Ideally, you can find people with both the right technical knowledge and the right approach to their work. If training is required, remember that it’s always easier to teach someone a process than to convince them to work carefully or pay attention to small details.


Calibration is important to ensure consistency for both processes and products. It’s a two-step process, where you first test for accuracy and then fine-tune the equipment so it’s within an acceptable range. Although they’re used interchangeably in daily conversation, there are important differences between accuracy and precision in a calibration program.

Accuracy is how close your results are to a defined standard. Precision is how close your results are to one another. When setting up a program, start by looking at what you need to keep calibrated. From there, determine the frequencies. Depending on your industry, there might be related regulations.

Food processing, for example, has many calibration requirements. When choosing who will participate in the program, you want people with both the right knowledge and mindset.

About the author

Jonathan Davis
