Sentinel 3D Scanning

When I first got into metrology, accuracy wasn’t always at the top of my mind. Instead, I was more focused on figuring out how to use new tools and how I could produce the measurements my customers needed in a timely fashion. But as I learned and grew in my role as a metrology engineer, I became more aware of the concept of accuracy and why it mattered. Of course, my original oversight may have been acceptable for jobs with larger tolerances, but as soon as jobs started coming in with 20-micron tolerances, I was forced to ask myself, “Can my equipment actually measure this?”

If you’re reading this, hopefully you’ve already done the work to gain a better understanding of how accurate your equipment is. And if you haven’t, hopefully this article and video will inspire you to do that work! Whether you’re using a simple tape measure for inspection or something as complicated as a multisensor CMM, it is important to understand your instrument’s accuracy so that you know its capabilities and limitations.

Unfortunately, there isn’t one right way to evaluate your tool’s accuracy. The method that works best for one person, part, or tool may not work well for someone else’s application. For instance, checking the accuracy of a CT scan may be as simple as comparing the measured data to data collected on a regularly calibrated CMM. But for someone who doesn’t have access to a CMM, another solution may work better. In the sections below, we will explore a few different methods you can use to better understand the accuracy of your measurement process.

Specifications

The first place you should check when investigating your tool’s accuracy is its specifications. Manufacturers usually provide accuracy specifications in their advertising materials or product literature. But this isn’t always the case. Some manufacturers actually shy away from providing specifications for accuracy.

Be warned: even if you are able to find accuracy specifications for your tool, don’t take them at face value. Instead, think of them as a goal rather than a guarantee. Your measurement application may not be ideal for your measurement system, which could prevent you from achieving the specified accuracy.
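
To make that concrete, here is a hypothetical example of how a length-dependent accuracy specification might read. The formula and numbers below are invented purely for illustration and do not describe any particular product; your own instrument’s literature will state its actual form.

```python
# Hypothetical length-dependent accuracy spec of the form MPE = A + L/K,
# a common way accuracy is stated for CMMs and scanners.
# The numbers here are invented for illustration only.
def max_permissible_error_um(length_mm, a_um=5.0, k=200.0):
    """Largest error the spec allows, in microns, for a feature of this length."""
    return a_um + length_mm / k

for length_mm in (10, 100, 500):
    print(f"{length_mm:>4} mm feature: spec allows up to "
          f"{max_permissible_error_um(length_mm):.1f} um of error")
```

Comparing that number against the tolerances you actually need to hold (many shops look for something like a 4:1 or 10:1 ratio between part tolerance and gage error) is a quick sanity check before trying the more involved methods below.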

Compare with Another Gage

If you are lucky enough to have access to another measurement tool that you already know and trust, try comparing measurements from that tool against the tool you are trying to understand.

As an example, maybe you just purchased a new laser micrometer, but you’re not sure how it will perform on reflective materials. It would be a fairly quick test to compare your measurements against the same measurements taken with a recently calibrated non-laser micrometer. This will start to paint a picture of how accurate your new gage really is.
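
As a rough sketch of what that comparison might look like in a spreadsheet or script, here is a minimal example. The gage names and feature values are invented for illustration; only the idea of comparing paired readings carries over to your own test.

```python
import statistics

# Paired measurements of the same features: one set from a trusted, recently
# calibrated micrometer, one from the new laser micrometer being evaluated.
# All values are invented for illustration (mm).
trusted  = [12.700, 12.698, 12.701, 12.699, 12.700]
new_gage = [12.703, 12.702, 12.705, 12.701, 12.704]

diffs = [n - t for n, t in zip(new_gage, trusted)]
bias = statistics.mean(diffs)      # systematic offset between the two gages
spread = statistics.stdev(diffs)   # how consistent that offset is

print(f"Average bias:  {bias * 1000:+.1f} um")
print(f"Spread (1 sd): {spread * 1000:.1f} um")
```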

Measure Calibrated Standards

A great way to establish traceability to national and international standards, and to learn about your tool’s accuracy at the same time, is to measure calibrated artifacts. This method requires that you have calibrated artifacts on hand, such as gage blocks, ball bars, or even parts that have been measured by another lab.

Before springing for this method, understand that it is important that the other lab be higher on the traceability food chain than yours. In other words, you should be sending your artifacts to a national or primary lab. Otherwise, you may be asking a lab less accurate than your own to measure your parts, which defeats the purpose.
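
Once the certified values come back from the higher-level lab, the comparison itself is simple: measure each artifact yourself and look at the difference. Here is a minimal sketch with invented numbers; the artifacts and values are purely illustrative.

```python
# Certified values from the higher-level lab vs. your own readings of the same
# gage blocks. All numbers are invented for illustration (mm).
certified = {"10 mm block": 10.0002, "25 mm block": 25.0001, "50 mm block": 49.9997}
measured  = {"10 mm block": 10.0038, "25 mm block": 25.0044, "50 mm block": 50.0031}

for name, cert_value in certified.items():
    error_um = (measured[name] - cert_value) * 1000
    print(f"{name}: measured error = {error_um:+.1f} um")
```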

Calculate Measurement Uncertainty

If you really want to take a deep dive and rigorously quantify your tool’s accuracy for a specific measurement, then perhaps you should try calculating measurement uncertainty. Although this method is fairly involved, it is probably the most comprehensive way to estimate your tool’s accuracy.

If you want to learn more about calculating uncertainty, check out JCGM 100, the Guide to the Expression of Uncertainty in Measurement (GUM). This standard is a good guide for uncertainty calculation, and the best part is that it is available for free online!
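
To give a flavor of what a GUM-style calculation involves, here is a heavily simplified sketch of an uncertainty budget for a single length measurement. The contributors and their values are invented for illustration; a real budget would account for every significant influence on your particular measurement.

```python
import math
import statistics

# Repeated readings of the same feature (mm); invented for illustration.
repeats = [25.0021, 25.0018, 25.0024, 25.0019, 25.0022]

# Type A: standard uncertainty of the mean, from the repeated readings.
u_repeatability = statistics.stdev(repeats) / math.sqrt(len(repeats))

# Type B: contributors taken from other knowledge, converted to standard
# uncertainties (values invented for illustration).
u_calibration = 0.0010 / 2               # certificate quotes 1.0 um at k = 2
u_resolution  = 0.0005 / math.sqrt(12)   # 0.5 um resolution, rectangular distribution

# Combine in quadrature, then expand with a coverage factor of k = 2 (~95 %).
u_combined = math.sqrt(u_repeatability**2 + u_calibration**2 + u_resolution**2)
U_expanded = 2 * u_combined

print(f"Combined standard uncertainty: {u_combined * 1000:.2f} um")
print(f"Expanded uncertainty (k = 2):  {U_expanded * 1000:.2f} um")
```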

Perform a Gage R&R Study

While gage R&R studies don’t measure accuracy directly (they’re typically used to determine whether your measurements are repeatable and reproducible), they can be useful for identifying shortcomings in your measurement system due to variability. Just because two gages agree on one set of measurements does not mean they will agree every time.

For more information about gage R&R studies, check out Measurement Systems Analysis by the Automotive Industry Action Group. This book details several methods for performing a gage R&R study, while also providing the necessary equations to do so.
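
To give a feel for the arithmetic, here is a heavily simplified variance-components sketch, not the full average-and-range or ANOVA procedure from the AIAG manual. The study data (two operators, three parts, three trials each) are invented for illustration.

```python
import statistics

# Invented study data: (operator, part) -> repeated trial readings (mm).
data = {
    ("op A", "part 1"): [10.02, 10.03, 10.02],
    ("op A", "part 2"): [10.11, 10.10, 10.12],
    ("op A", "part 3"): [ 9.98,  9.97,  9.98],
    ("op B", "part 1"): [10.04, 10.05, 10.04],
    ("op B", "part 2"): [10.13, 10.12, 10.13],
    ("op B", "part 3"): [ 9.99, 10.00,  9.99],
}

# Repeatability (equipment variation): pooled spread of repeated trials
# within each operator/part cell.
repeatability = statistics.mean(
    statistics.variance(trials) for trials in data.values()
) ** 0.5

# Reproducibility (appraiser variation): spread between operator averages.
op_means = [
    statistics.mean(x for (op, _), trials in data.items() if op == o for x in trials)
    for o in ("op A", "op B")
]
reproducibility = statistics.stdev(op_means)

gage_rr = (repeatability**2 + reproducibility**2) ** 0.5
print(f"Repeatability (EV):   {repeatability:.4f} mm")
print(f"Reproducibility (AV): {reproducibility:.4f} mm")
print(f"Combined gage R&R:    {gage_rr:.4f} mm")
```

The AIAG manual then compares numbers like these against the part tolerance or total process variation to judge whether the measurement system is acceptable for the job.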

Final Points

Although evaluating your measurement system’s accuracy may seem daunting at first, you will soon find that, after performing even the easiest of these tests, you will begin to feel more comfortable with your tool and its limits. If you aren’t confident in the results from just one of these methods, try more than one, or maybe even all of them!

But what if your findings reveal that your tool isn’t as accurate as it should be? Don’t panic quite yet! It could be that the tool simply needs to be calibrated, or that you need more experience with it. Just because your findings don’t validate your expectations doesn’t mean your tool isn’t capable of measuring your part; it may just mean you have some additional work to do before it is.

We hope you found this article useful. If you want to see an example of one of the methods we discussed, check out our video below.