
Measurement Standards: Why Are They Important?

We all know that test equipment is used to measure a parameter. Whether it is the temperature of a pipe, the electrical current running through a wire or even the moisture content in a building’s walls, specialist test equipment can accurately determine a measurement specific to that application.

But how do we know we can trust this measurement? Outside factors such as drift, knocks, bumps and even dirt can affect the accuracy of a piece of test equipment, and that’s exactly why we define a measurement standard – so that the tester you are using can be checked and shown to generate results within a specified tolerance.

This means there are two parts to every measurement – the actual measurement taken, and the standard that this measurement is compared against. Without this standard, the measurement has no point of reference, and the chance of it being completely incorrect increases dramatically.

The device you’re using to make measurements relies on a calculation built into its internal components – the device uses this calculation to turn what its sensor detects into a result, and without the standard behind that calculation, there can be no meaningful result.
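To make this concrete, here is a minimal sketch of the idea in Python. It assumes a simple linear conversion (gain and offset) from a raw sensor value to a reading; the class and constant names are purely illustrative and not any real device’s internals.

```python
# Illustrative only: a tester converts a raw sensor value into a reading
# using constants fixed when the device was calibrated against a standard.
# The linear model and the names here are assumptions for the sketch.

class Thermometer:
    def __init__(self, gain: float, offset: float):
        # gain and offset are set at calibration time, by reference
        # to a measurement standard.
        self.gain = gain
        self.offset = offset

    def read_celsius(self, raw_adc_value: int) -> float:
        # Without the calibration constants there is no meaningful way
        # to turn the raw value into a temperature.
        return self.gain * raw_adc_value + self.offset


# Example: a device calibrated so that raw value 512 reads as 25.0 degrees C.
meter = Thermometer(gain=0.0625, offset=-7.0)
print(meter.read_celsius(512))  # 25.0
```

If those stored constants drift away from the standard, every reading the device produces drifts with them – which is what the rest of this article is about.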

Standards and Calibration

Whenever a piece of test equipment is constructed by a manufacturer, it will be calibrated to the standard (using the appropriate calculation) that the manufacturer has set for the product.

This means that each new product should, as standard, generate as accurate a result as possible. There are instances where this isn’t the case – a flawed initial calibration or a fault inside the tester might render the measurements unreliable – but these are generally few and far between.

This delicate balance depends on the state of the components inside the device and how well they are functioning in relation to the calculation. As the tester is used, its performance gradually degrades, and the measurements it takes can be affected by outside factors.

This effect can be negligible, but it can also be completely catastrophic to the readings taken by your device. In the case of something like a piece of electrical test equipment, generating incorrect readings as a result of the degradation compromises the safety of whatever you’re testing and the people who might be working around it or directly with it.

This is why regular calibration, where your equipment is checked against an existing standard and adjusted accordingly if necessary, is required. The calibration process refers back to a manufacturer’s standard: a piece of equipment many times more accurate than the product you’re testing with is used to perform the calibration check, and adjustments are made to the tester as necessary to ensure that the meter is generating measurements that are as accurate as possible.
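The sketch below illustrates one way such a check could work, under the assumption that a simple linear correction is enough: readings from the device under test are compared against values from a far more accurate reference, a corrected gain and offset are fitted, and the residual error is compared against an acceptance limit. The function names, data points and tolerance figure are all hypothetical.

```python
# Hedged sketch of a calibration check, assuming a linear correction model.
# The numbers and the 0.5 tolerance are illustrative, not a real standard.

def fit_correction(device_readings, reference_values):
    """Least-squares fit of: reference = gain * device_reading + offset."""
    n = len(device_readings)
    mean_d = sum(device_readings) / n
    mean_r = sum(reference_values) / n
    cov = sum((d - mean_d) * (r - mean_r)
              for d, r in zip(device_readings, reference_values))
    var = sum((d - mean_d) ** 2 for d in device_readings)
    gain = cov / var
    offset = mean_r - gain * mean_d
    return gain, offset


# Device under test vs. a far more accurate reference at three points.
device = [9.8, 50.3, 99.9]
reference = [10.0, 50.0, 100.0]

gain, offset = fit_correction(device, reference)
max_error = max(abs(r - (gain * d + offset))
                for d, r in zip(device, reference))

TOLERANCE = 0.5  # example acceptance limit only
print(f"gain={gain:.4f}, offset={offset:.4f}, residual={max_error:.3f}")
print("within tolerance" if max_error <= TOLERANCE else "adjust or repair")
```

In practice the corrected constants would be written back into the tester (or the unit sent for repair if the error is too large), which is exactly the adjustment step described above.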

It is highly recommended that the period between calibrations is no more than around a year, although more specialist, high-accuracy testers may require more frequent calibration.

Written by Sofia