
thermometer resolution, accuracy and tolerance


Resolution should not be confused with accuracy. The resolution of an instrument is the smallest increment that is shown on the display. Thus an instrument with 0.1°C resolution will read to the nearest 0.1°C (perhaps 46.6°C), whereas a 1°C resolution instrument will only read to the nearest 1°C (i.e. 47°C). The finer the resolution, the better the measurement display capability of the instrument. This does not necessarily mean that the instrument is going to be more accurate, although it does usually indicate a superior instrument.
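The rounding described above can be sketched in a few lines of code. This is a hypothetical illustration, not any instrument's actual firmware; `displayed_reading` is a name invented here.

```python
def displayed_reading(true_temp_c, resolution_c):
    """Round a temperature to the nearest multiple of the display resolution."""
    steps = round(true_temp_c / resolution_c)
    return steps * resolution_c

# A true temperature of 46.63°C on a 0.1°C-resolution instrument
# displays as 46.6, while a 1°C-resolution instrument displays 47.
fine = displayed_reading(46.63, 0.1)
coarse = displayed_reading(46.63, 1.0)
```

Note that the displayed value says nothing about accuracy: the same function applies whether the sensor behind it is perfect or badly out of calibration.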

accuracy & tolerance
Accuracy is the degree of conformity with an established standard; hence accuracy = zero means the display corresponds exactly to the true value.

Accuracy describes how close an instrument's reading is to the known temperature. It is usually accompanied by a reference to a tolerance, as it is very unlikely that any instrument will be exactly accurate, i.e. accuracy = zero.

Tolerance is the amount of inaccuracy that is inherent in the instrument by virtue of its manufacture or capabilities.

Thus accuracy will usually be stated as a tolerance about, or either side of, an absolute, or exact, temperature. This tolerance may be stated as a fixed amount, i.e. ±0.5°C, or as a percentage of the reading, i.e. ±2% (at 50°C the tolerance will be ±1.0°C, but at 100°C it will be ±2.0°C). The tolerance may also be accompanied by a reference to the final digit of the reading, and will therefore be an additional 1° for a 1° resolution instrument, 0.1° for a 0.1° instrument, and so on.

Often this will also be combined with an indication of the range of temperatures over which the accuracy is appropriate.


Therefore accuracy may be described as follows: ±0.5°C ±1 digit over the range −30°C to 100°C, or perhaps ±1% over the range 100°C to 250°C.
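Turning a stated specification into an actual tolerance band is simple arithmetic. The sketch below is a hypothetical helper (the function name and parameters are invented for illustration) that combines the fixed, percentage, and final-digit terms described above.

```python
def tolerance_band(reading_c, fixed_c=0.0, percent=0.0, digits=0, resolution_c=0.1):
    """Return (low, high) bounds for a reading given an accuracy specification.

    fixed_c      -- fixed tolerance, e.g. 0.5 for +/-0.5 degrees C
    percent      -- percentage-of-reading tolerance, e.g. 2.0 for +/-2%
    digits       -- number of final-digit counts, e.g. 1 for "+/-1 digit"
    resolution_c -- display resolution, so 1 digit = resolution_c degrees
    """
    tol = fixed_c + abs(reading_c) * percent / 100.0 + digits * resolution_c
    return reading_c - tol, reading_c + tol

# +/-2% of reading: +/-1.0 degree at 50 degrees C, +/-2.0 at 100 degrees C.
band_50 = tolerance_band(50.0, percent=2.0)
band_100 = tolerance_band(100.0, percent=2.0)

# +/-0.5 degrees C +/-1 digit on a 0.1-degree-resolution instrument reading 46.6.
band_spec = tolerance_band(46.6, fixed_c=0.5, digits=1, resolution_c=0.1)
```

The percentage example reproduces the figures given in the text: a ±2% instrument reading 50°C may be anywhere between 49.0°C and 51.0°C.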

which is the most accurate - a sundial or a watch?
The difference between resolution and accuracy may be considered with reference to time. A sundial may have dial marks at every half hour, so its resolution is strictly half an hour (although you could estimate more closely). However, its accuracy is absolute, i.e. zero error (if properly positioned, of course, and provided it's sunny!).

A watch, on the other hand, may have a second hand and therefore a resolution of 1 second, and yet the time it shows, and therefore its accuracy, may be totally wrong. As the saying goes, “A stopped watch is absolutely accurate twice a day!”

Tolerances should be taken into account when measuring temperature, as with any other measurement. If you have a table that is going to be placed in the centre of a large room, it won't matter if it is a little bigger or smaller than the ideal size. If it is a cupboard that fits into an alcove, its width may be slightly less than the alcove width, but no more. A gap for a dishwasher in a kitchen range needs to be as close as possible to the actual width to prevent gaps, i.e. a very small plus tolerance, but no minus.

Tolerances for temperature measurement will depend on a number of factors:
The stability and accuracy of the instrument.
Understanding of calibration tolerances and the need to apply the correction factors to the actual readings that you are taking.
The importance of the process temperature - i.e. if you need to store food at a temperature of 1 to 5°C, set the targets at 2 to 4°C and give yourself a tolerance of ±1°C.
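The food-storage example above can be made concrete. This is a minimal sketch of that reasoning, assuming the limits and targets given in the text; the names are invented for illustration.

```python
# Legal storage limits of 1-5 degrees C with working targets of 2-4 degrees C
# leave a +/-1 degree margin to absorb instrument tolerance.
LIMIT_LOW, LIMIT_HIGH = 1.0, 5.0
TARGET_LOW, TARGET_HIGH = 2.0, 4.0

def within_target(reading_c):
    """True if the reading is inside the self-imposed target band."""
    return TARGET_LOW <= reading_c <= TARGET_HIGH

def within_limits(reading_c):
    """True if the reading is inside the actual storage limits."""
    return LIMIT_LOW <= reading_c <= LIMIT_HIGH

# A reading of 4.5 degrees C breaches the target but not the limit:
# the margin flags a problem before the food is actually at risk.
warning = within_limits(4.5) and not within_target(4.5)
```

Setting targets inside the limits means an out-of-target reading is a warning, not yet a failure.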

But the most important thing is to keep records, so that you can analyse the performance of both the thermometer and the process. Looking at trends enables an educated assessment of what settings are required and provides the ability to be as effective and efficient as possible.