
Thermometers - a brief history

Water expands and contracts with temperature. So does air. So do metals like mercury.


Before Galileo - who understood that science is based on precise measurement - some had worked out that it should be possible to invent a device that would measure the gradations from cold to hot and back again. These early devices were called thermoscopes. They used a column of air in a tube with one end in a container of coloured water. In 1610 Galileo tried it with wine, and so is credited with inventing the first alcohol thermometer.


The first sealed thermometer, designed in 1641 for the Grand Duke of Tuscany, used alcohol, and it had degree marks. But the man credited with using the freezing point of water as the zero, or starting point, was an Englishman from London, Robert Hooke, in 1664. An astronomer called Roemer in Copenhagen chose ice and the boiling point of water as his two reference points and started keeping weather records, but there were still uncertainties about how to devise an accurate scale that would be reliable everywhere.


In 1724, the German instrument maker Gabriel Fahrenheit judged mercury the most suitable liquid for measuring temperature. He calibrated his first thermometer using a mixture of ice and water with sea salt as his zero. But salt water has a much lower freezing point than ordinary water, so for his purposes he set the freezing point of ordinary water at 30, and the temperature inside the mouth of a healthy human at 96. With those reference points he established the boiling point of water at 212, and later adjusted his freezing point of water to 32. That way, he could count 180 degrees between freezing and boiling, at sea level.


But 180 is an awkward number. So two decades later, Linnaeus - who invented the taxonomic system for naming species - and the Swedish astronomer Anders Celsius separately worked out a scale of one hundred degrees between the freezing and boiling points of water. Because there were 100 steps between the two states, it was called the centigrade scale.


A little more than a century later - in 1848 - Lord Kelvin started contemplating the theory of heat and, of course, a much greater range of temperature. He used the centigrade scale, but started from absolute zero, the point at which all molecular motion stops, the lowest conceivable temperature in the universe. This turned out to be -273.15C. There is an absolute-temperature version of Fahrenheit, called the Rankine scale, but hardly anybody uses it. In 1948, an international conference on weights and measures adopted the Celsius scale as the standard measure, but Fahrenheit is still used in the United States.
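The relationships between the scales described above reduce to simple arithmetic. The sketch below, in Python, converts a Celsius reading to the other scales mentioned in this article; the function names are illustrative, not from any standard library.

```python
def celsius_to_fahrenheit(c):
    # 100 Celsius degrees span the same interval as 180 Fahrenheit degrees,
    # and Fahrenheit puts the freezing point of water at 32.
    return c * 9 / 5 + 32

def celsius_to_kelvin(c):
    # Kelvin uses Celsius-sized degrees but starts at absolute zero (-273.15C).
    return c + 273.15

def celsius_to_rankine(c):
    # Rankine uses Fahrenheit-sized degrees, starting at absolute zero.
    return (c + 273.15) * 9 / 5

print(celsius_to_fahrenheit(100))  # boiling point of water: 212.0
print(celsius_to_kelvin(-273.15))  # absolute zero: 0.0
```

Note that the factor 9/5 is exactly Fahrenheit's 180 degrees divided by the 100 centigrade steps between freezing and boiling.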


Eventually, scientists found other physical properties that respond reliably to heat and cold. Dial thermometers depend upon the expansion and contraction of metal. Electronic thermometers, such as thermistor and thermocouple instruments, use the effect of temperature on electrical properties - the resistance of a thermistor, or the small voltage generated at a thermocouple junction - to calculate temperature. Infrared thermometers measure the emission of infrared radiation. Still other thermometers measure the effect of heat and cold on sound waves, photoluminescence, fluorescence, magnetism, gamma rays, and many other physical phenomena.
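As a concrete illustration of how an electronic thermometer turns a reading into a temperature, the sketch below applies the Beta-parameter approximation, one common way of converting a thermistor's measured resistance to temperature. The part values (a 10 kilohm thermistor at 25C with a Beta constant of 3950) are illustrative assumptions, not figures from this article.

```python
import math

def thermistor_celsius(resistance_ohms, r0=10_000.0, t0_c=25.0, beta=3950.0):
    """Beta-parameter model: 1/T = 1/T0 + (1/B) * ln(R/R0), with T in kelvin.

    r0 is the nominal resistance at reference temperature t0_c, and beta is
    the part's Beta constant; all defaults here are illustrative assumptions.
    """
    t0_kelvin = t0_c + 273.15
    inv_t = 1.0 / t0_kelvin + math.log(resistance_ohms / r0) / beta
    return 1.0 / inv_t - 273.15

# At the nominal resistance, the model returns the reference temperature.
print(round(thermistor_celsius(10_000.0), 2))  # 25.0
```

For a typical NTC (negative temperature coefficient) thermistor like the one modelled here, resistance falls as temperature rises, which is why a reading below the nominal 10 kilohms yields a temperature above 25C.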