Taking measurements is one of the first procedures we learn in middle school science class, and an important part of taking measurements is understanding how the measuring equipment works. Scales need to be tared; a liquid in a beaker, test tube, or graduated cylinder will have a meniscus; and the ends of rulers are the most vulnerable to warping from time and temperature. If you don’t know how your equipment influences the measurement, you can’t understand the accuracy of the measurement.
Using a level or pressure sensor of any kind is no different. Industrial sensors have electronic outputs, so the difficulty of reading a meniscus is eliminated (phew!). But with the dependence on electromechanical devices to make the reading and translate it into an electronic signal, a quantifiable amount of uncertainty has been introduced to the reading. Understanding that quantifiable uncertainty is what we’re going to look at today.
There are three ways we talk about the uncertainty in a reading: percent of reading, percentage of full scale, and ± actual units.
Percent of Reading
Although the phrase “percent of reading” is a little awkward, the concept is simple: the amount of error (or limitation of accuracy) in an instrument’s reading is a percentage of either that reading or the range over which the reading was taken. For example, the accuracy of ultrasonic sensors is usually stated in terms of detected range.
Let’s say we have an ultrasonic sensor with an operating range of 25’, we have defined a ±2’ window at 10’, and the stated accuracy of the sensor is ±0.25% of detected range. It’s easy to think that our ±2’ window is the “detected range” in question, but the detected range is actually the 10’ from the sensor to the liquid surface being measured. So, the accuracy is 10’ * 0.0025 = 0.025’, or just less than 5/16”.
What happens when the level being measured is 15’ from the sensor? The same equation applies, but instead of a 10’ detected range, we have a 15’ detected range. The accuracy then becomes 15’ * 0.0025 = 0.0375’, or just less than ½”.
Has the accuracy of the sensor really degraded by about 50%? No. The comparative accuracy of the two readings appears to have changed only because we increased the detected range of the sensor. We increased the amount of uncertainty in the measurement, but the sensor is still accurate to ±0.25% of the detected range.
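The arithmetic above can be captured in a minimal Python sketch. The function name and signature are illustrative, not from any sensor vendor’s API:

```python
def reading_error_ft(detected_range_ft: float, pct_of_reading: float) -> float:
    """Error band (± feet) for a percent-of-reading accuracy spec.

    The band scales with the detected range: a more distant surface
    means a wider band, even though the percentage stays the same.
    """
    return detected_range_ft * pct_of_reading / 100.0

# The two cases from the text, with a ±0.25% of detected range spec:
print(reading_error_ft(10.0, 0.25) * 12)  # ±0.3 inches (just under 5/16")
print(reading_error_ft(15.0, 0.25) * 12)  # ±0.45 inches (just under 1/2")
```

Note that the 15’ case produces a band 50% wider than the 10’ case, exactly tracking the 50% increase in detected range.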
Percent of Full-scale
Percent of Full-scale seems straightforward as a concept, but there is a small twist that is easily overlooked. The accuracy of the instrument is defined as a percentage of the entire range (i.e., full scale) of the instrument, no matter how much of that range is being used.
The accuracy of submersible pressure transducers is often stated in terms of percent of span. So, let’s put one in a reservoir where the water can reach a depth of 75’. If the transducer’s stated accuracy is 0.10% of span, then the accuracy of any reading will be within 75’ * .001 = .075’, or just more than 7/8”, at all times. Even if the water level never strays more than 5’ from the 20’ mark, the uncertainty of every reading will still be about 7/8”, because we calibrated the transducer for a 75’ span.
But, if we put the same submersible pressure transducer in a tank with a maximum submerged depth of 20’, the accuracy “improves” to 20’ * .001 = .02’, or about ¼”. That looks like more than a 70% improvement! But it’s not, really, because in both cases the accuracy of the reading is based on the full operating scale of the sensor. Again, limiting the range of the sensor—in this case the full-scale range, rather than the secondarily defined sub-range of the ultrasonic example above—limits the uncertainty in the measurement.
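The contrast with the percent-of-reading case can be sketched the same way. Again, the function name is illustrative only:

```python
def full_scale_error_ft(full_scale_ft: float, pct_of_span: float) -> float:
    """Error band (± feet) for a percent-of-full-scale (percent-of-span) spec.

    The band depends only on the calibrated span, so it is constant
    no matter where in that span the actual level sits.
    """
    return full_scale_ft * pct_of_span / 100.0

# The two installations from the text, both with a 0.10% of span spec:
print(full_scale_error_ft(75.0, 0.10) * 12)  # ±0.9 inches (just over 7/8")
print(full_scale_error_ft(20.0, 0.10) * 12)  # ±0.24 inches (about 1/4")
```

The water level never appears in the calculation: only recalibrating the transducer to a smaller span tightens the band.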
± Actual Units
A sensor reading with an accuracy stated in terms of actual units is the most intuitive to understand. No matter what the reading, no matter what the range of the instrument, the error range is always the same.
For instance, let’s look at a resistive chain level sensor with a stated accuracy of 3 mm (1/8”). The stem of the sensor could be 1’-6” long or 6’ long, and the depth of the liquid being measured could be 2’ or 4’. In any of these cases, the reading given by the sensor will be within 1/8” of the actual level of the liquid.
Application Is King
We’ve looked at three ways level-measuring instruments quantify the uncertainty in their measurements. For two of those ways, Percent of Reading and Percent of Full-scale, the amount of uncertainty in a reading is directly related to the operating range of the instrument. The third, ± Actual Units, is constant across the range of the instrument.
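Putting the three spec styles side by side makes the difference concrete. This is a hedged sketch: the 25’ span, the ±0.25% and ±1/8” figures, and the two example levels are borrowed from the scenarios above, not from any particular datasheet:

```python
IN_PER_FT = 12.0

def pct_of_reading_in(level_ft: float, pct: float) -> float:
    """± inches; band grows with the detected level."""
    return level_ft * pct / 100.0 * IN_PER_FT

def pct_of_full_scale_in(span_ft: float, pct: float) -> float:
    """± inches; band fixed by the calibrated span."""
    return span_ft * pct / 100.0 * IN_PER_FT

ACTUAL_UNITS_IN = 0.125  # ±1/8", constant by definition

for level in (10.0, 15.0):
    print(f"level {level:>4.0f}': "
          f"±{pct_of_reading_in(level, 0.25):.2f}\" (0.25% of reading) | "
          f"±{pct_of_full_scale_in(25.0, 0.25):.2f}\" (0.25% of 25' span) | "
          f"±{ACTUAL_UNITS_IN:.3f}\" (actual units)")
```

Only the percent-of-reading column changes between the two levels; the other two stay fixed once the instrument is chosen and calibrated.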
Each sensor technology has advantages over the others in certain situations. And, as we say often in this space, the more you know about your measuring situation, the better we can help you find the instrument that is best suited for your needs.
Accuracy is important. And it is directly affected by the several variables in your application. Talk to us about those variables. We want to help you understand which sensor is right for your unique situation, and how you can expect that sensor to perform.
Want to keep your sensor as accurate as possible? Ask us about our calibration and NIST certification services. Contact us to learn more: