Thermometer resolution should not be confused with accuracy. The resolution of an instrument is the smallest increment that can be shown on its display. An instrument with 0.1°C resolution will read to the nearest 0.1°C (e.g. 46.6°C), whereas a 1°C-resolution instrument will only read to the nearest 1°C (e.g. 47°C). The finer the resolution, the better the display capability of the instrument. This does not necessarily mean the instrument is more accurate, although it does usually indicate a superior instrument.
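The effect of resolution on the displayed reading can be sketched as a simple rounding step. This is an illustration only, with hypothetical values; real instruments quantise internally before display.

```python
def displayed(value_c: float, resolution_c: float) -> float:
    """Round a true temperature to the nearest display step.

    Illustrative only: models how a display of a given resolution
    shows the nearest representable value.
    """
    steps = round(value_c / resolution_c)
    return round(steps * resolution_c, 10)

# A true temperature of 46.64°C on two different instruments:
print(displayed(46.64, 0.1))  # 0.1°C resolution shows 46.6
print(displayed(46.64, 1.0))  # 1°C resolution shows 47.0
```

Note that both instruments may be equally accurate; the 0.1°C instrument simply displays more of the measurement.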
accuracy & tolerance
Accuracy is the degree of conformity with an established standard: a measure of how closely the instrument's reading matches the known temperature. Perfect accuracy (an error of zero) would mean the display corresponds exactly to the true value, but since it is very unlikely that any instrument will be exactly accurate, a stated accuracy is usually accompanied by a tolerance.
Tolerance is the amount of inaccuracy inherent in the instrument by virtue of its manufacture or capabilities.
Thus accuracy will usually be stated as a tolerance either side of an absolute, or exact, temperature. This tolerance may be stated as a fixed amount, e.g. ±0.5°C, or as a percentage of the reading, e.g. ±2% (at 50°C the tolerance is ±1.0°C, but at 100°C it is ±2.0°C). The tolerance may also include an allowance for the final digit of the reading: an additional 1° for a 1°-resolution instrument, 0.1° for a 0.1°-resolution instrument, and so on.
Often this will also be combined with a statement of the temperature range over which the quoted accuracy applies.
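The arithmetic behind a "±percentage plus one digit" specification can be sketched as follows. The formula and values are illustrative assumptions, not taken from any particular instrument's datasheet.

```python
def tolerance_c(reading_c: float, pct: float, resolution_c: float) -> float:
    """Total tolerance for a spec of ±pct% of reading plus one final digit.

    Assumed form: percentage-of-reading term plus one count of the
    instrument's display resolution. Illustrative only.
    """
    return reading_c * pct / 100.0 + resolution_c

# A hypothetical ±2% + 1 digit specification:
print(tolerance_c(50.0, 2.0, 0.1))   # ±1.1°C at 50°C on a 0.1° instrument
print(tolerance_c(100.0, 2.0, 1.0))  # ±3.0°C at 100°C on a 1° instrument
```

This shows why the "plus one digit" term matters more on a coarse-resolution instrument: at 100°C the extra 1° digit adds half as much again to the ±2.0°C percentage term.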