The resolution, accuracy and precision of sensors

A deep dive into three key concepts from industrial automation with sensors: resolution, accuracy and precision.

In this article we discuss a number of terms related to sensors that are often used interchangeably, usually incorrectly. They are important specifications that are decisive for how a sensor functions: the resolution, accuracy and precision of sensors.

The resolution of sensors

The resolution of the sensor is decisive in this application.

The term resolution is used in many contexts, but when it comes to sensors it has a specific meaning. Simply put, the resolution is the smallest possible change that a sensor can perceive. For a laser light grid, for example, this is a shift in position.

A sensor with a lower resolution will, for example, only detect or report displacements in whole centimetres; a sensor with a higher resolution can do so down to millimetres. Of course, this is only of use when the application requires it.
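To make the difference concrete, here is a minimal sketch in Python, with made-up numbers since resolution specifications vary per sensor: the same true displacement is quantised at a whole-centimetre and a whole-millimetre resolution. The function reported_value is purely illustrative, not a real sensor interface.

```python
def reported_value(true_mm: float, resolution_mm: float) -> float:
    """Round the true displacement to the nearest step the sensor can perceive."""
    return round(true_mm / resolution_mm) * resolution_mm

true_displacement_mm = 23.7  # hypothetical true displacement

print(reported_value(true_displacement_mm, 10.0))  # 1 cm resolution -> 20.0 mm
print(reported_value(true_displacement_mm, 1.0))   # 1 mm resolution -> 24.0 mm
```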

It can also be said that the application is decisive for the required resolution. A critical application, such as monitoring components on a PCB, as seen in the image, requires a high resolution. In other words, a sensor with a higher resolution responds to a smaller change in the signal it measures.

The accuracy and precision of sensors

A frequently asked question when selecting a sensor is: how accurate is it? When it comes to sensor accuracy, one tends to think of the difference between a measured distance and the actual distance. For sensors there are two types of accuracy: absolute accuracy and precision, the latter also known as the repeatability of a measurement.

Absolute accuracy is usually what is meant when someone talks about accuracy: the deviation of a single measurement from the true value. Think, for example, of measuring the distance of a truck backing up towards a loading bay. To prevent a collision, it is of the utmost importance to know whether the sensor might indicate 1 metre remaining when in reality only 50 cm is left. When a sensor with an absolute accuracy of ±10 cm reads 50 cm, the real value lies somewhere between 40 and 60 cm.
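The arithmetic behind that interval is simple; the sketch below merely restates the ±10 cm example in Python, with all values taken from the example above.

```python
reading_cm = 50.0            # the value the sensor reports
absolute_accuracy_cm = 10.0  # the ±10 cm from the example

lower = reading_cm - absolute_accuracy_cm
upper = reading_cm + absolute_accuracy_cm
print(f"True distance lies between {lower:.0f} cm and {upper:.0f} cm")
# -> True distance lies between 40 cm and 60 cm
```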

Repeatability, or precision, is the difference between two measurements taken under exactly the same circumstances. When a first measurement indicates a distance of 101 mm and a second measurement indicates 102 mm, we say that the repeatability of the sensor is ±1 mm. In many (but definitely not all) applications, the repeatability is more important than the absolute accuracy.
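As a rough sketch, repeatability could be estimated from a series of readings taken under identical conditions. The readings below are invented, and the ± convention follows the 101 mm / 102 mm example above, where a 1 mm spread is quoted as ±1 mm.

```python
# Invented readings, taken under identical circumstances.
readings_mm = [101.0, 102.0, 101.5, 101.0, 102.0]

# Worst-case spread between any two readings.
spread_mm = max(readings_mm) - min(readings_mm)
print(f"Repeatability: ±{spread_mm:.1f} mm")  # -> Repeatability: ±1.0 mm
```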

A high-accuracy sensor here prevents trucks from stopping too early or too late at the loading bay.

But...

However, knowing these two values is still not enough to make the right sensor choice. People naturally want the best specifications, so it is tempting to go for the best repeatability and the best possible absolute accuracy. As an example of an application in which this is not the best choice, consider measuring the thickness of a product on a conveyor belt using a displacement laser.

A laser with an absolute or repeat accuracy of ±0.01 mm cannot function properly along or above a conveyor belt: the belt vibrates constantly and thereby introduces a deviation of, for example, at least 1 mm. Because every measurement resolves changes down to 0.01 mm, the sensor faithfully measures the vibration as well, and you end up consistently detecting 'errors' that are caused by the belt rather than the product.
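The small simulation below (all numbers invented, assuming a simple sinusoidal belt vibration) illustrates the point: even with a ±0.01 mm sensor, the belt's roughly ±1 mm movement dominates the thickness reading.

```python
import math

true_thickness_mm = 5.00      # hypothetical product thickness
sensor_accuracy_mm = 0.01     # hypothetical high-end displacement laser
vibration_amplitude_mm = 1.0  # hypothetical belt vibration

for step in range(5):
    vibration = vibration_amplitude_mm * math.sin(step)  # belt moving up and down
    measured = true_thickness_mm + vibration  # the sensor tracks the belt as well
    error = measured - true_thickness_mm
    print(f"sample {step}: measured {measured:.2f} mm "
          f"(error {error:+.2f} mm vs the ±{sensor_accuracy_mm} mm spec)")
```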

Too high an accuracy is not useful in applications where objects vibrate a lot: the sensor will detect even the slightest movement.