from Wikipedia
Mercury thermometer (mercury-in-glass thermometer) for measurement of room temperature.[1]

A thermometer, from Ancient Greek θερμός (thermós), meaning "warmth", and μέτρον (métron), meaning "measure", is a device that measures temperature (the hotness or coldness of an object) or temperature gradient (the rates of change of temperature in space). A thermometer has two important elements: (1) a temperature sensor (e.g. the bulb of a mercury-in-glass thermometer or the pyrometric sensor in an infrared thermometer) in which some change occurs with a change in temperature; and (2) some means of converting this change into a numerical value (e.g. the visible scale that is marked on a mercury-in-glass thermometer or the digital readout on an infrared model). Thermometers are widely used in technology and industry to monitor processes, in meteorology, in medicine (medical thermometer), and in scientific research.

A standard scale

While an individual thermometer is able to measure degrees of hotness, the readings on two thermometers cannot be compared unless they conform to an agreed scale. Today there is an absolute thermodynamic temperature scale. Internationally agreed temperature scales are designed to approximate this closely, based on fixed points and interpolating thermometers. The most recent official temperature scale is the International Temperature Scale of 1990. It extends from 0.65 K (−272.5 °C; −458.5 °F) to approximately 1,358 K (1,085 °C; 1,985 °F).
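The relations among the kelvin, Celsius, and Fahrenheit values quoted above are exact by definition, so they can be cross-checked with a few lines of code. This is a minimal sketch; the function names are illustrative:

```python
# Exact-by-definition conversions between the kelvin, Celsius, and
# Fahrenheit scales, used to verify the ITS-90 endpoint values above.

def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9.0 / 5.0 + 32.0

def kelvin_to_fahrenheit(k: float) -> float:
    return celsius_to_fahrenheit(kelvin_to_celsius(k))

# The ITS-90 lower endpoint, 0.65 K:
print(round(kelvin_to_celsius(0.65), 2))     # -272.5
print(round(kelvin_to_fahrenheit(0.65), 2))  # -458.5
```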

History

Sparse and conflicting historical records make it difficult to pinpoint the invention of the thermometer to any single person or date with certitude. In addition, given the many parallel developments in the thermometer's history and its many gradual improvements over time, the instrument is best viewed not as a single invention, but an evolving technology.

Ancient developments

Early pneumatic devices and ideas from antiquity provided inspiration for the thermometer's invention during the Renaissance period.

Philo of Byzantium

Fludd's figure of Philo's experiment

In the 3rd century BC, Philo of Byzantium documented his experiment with a tube submerged in a container of liquid on one end and connected to an air-tight, hollow sphere on the other. When air in the sphere is heated with a candle or by exposing it to the sun, expanding air exits the sphere and generates bubbles in the vessel. As air in the sphere cools, a partial vacuum is created, sucking liquid up into the tube. Any changes in the position of the liquid will now indicate whether the air in the sphere is getting hotter or colder.

Translations of Philo's experiment from the original ancient Greek were utilized by Robert Fludd sometime around 1617 and used as the basis for his air thermometer.[2]: 15 

Hero of Alexandria

In his book, Pneumatics, Hero of Alexandria (10–70 AD) provides a recipe for building a "Fountain which trickles by the Action of the Sun's Rays," a more elaborate version of Philo's pneumatic experiment but which worked on the same principle of heating and cooling air to move water around.[3] Translations of the ancient work Pneumatics were introduced to late 16th century Italy and studied by many, including Galileo Galilei, who had read it by 1594.[2]: 5 

First temperature scale with a fixed point

Hasler's temperature scale showing degrees of body temperature based on an individual's latitude.

The Greco-Roman physician Galen is given credit for introducing two concepts important to the development of a scale of temperature and the eventual invention of the thermometer. First, he had the idea that hotness or coldness may be measured by "degrees of hot and cold." He also conceived of a fixed reference temperature, a mixture of equal amounts of ice and boiling water, with four degrees of heat above this point and four degrees of cold below. The 16th-century physician Johann Hasler developed body temperature scales based on Galen's theory of degrees to help him mix the appropriate amount of medicine for patients.[2]: 3 

Late Renaissance developments

Thermoscope

In the late 16th and early 17th centuries, several European scientists, notably Galileo Galilei[4] and Italian physiologist Santorio Santorio,[5] developed devices with an air-filled glass bulb, connected to a tube, partially filled with water. As the air in the bulb warms or cools, the height of the column of water in the tube falls or rises, allowing an observer to compare the current height of the water to previous heights to detect relative changes of the heat in the bulb and its immediate environment. Such devices, with no scale for assigning a numerical value to the height of the liquid, are referred to as thermoscopes because they provide an observable indication of sensible heat (the modern concept of temperature was yet to arise).[2]

Air thermometer

The difference between a thermoscope and a thermometer is that the latter has a scale.[6][2]: 4 

A thermometer is simply a thermoscope with a scale. ... I propose to regard it as axiomatic that a “meter” must have a scale or something equivalent. ... If this is admitted, the problem of the invention of the thermometer becomes more straightforward; that of the invention of the thermoscope remains as obscure as ever.

— W. E. K. Middleton, A history of the thermometer and its use in meteorology

Given this, Middleton claimed that the possible inventors of the thermometer are Galileo, Santorio, Dutch inventor Cornelis Drebbel, or English physician Robert Fludd.[2]: 5  Though Galileo is often said to be the inventor of the thermometer, no surviving document shows that he actually produced any such instrument.

The first clear diagram of a thermoscope was published in 1617 by Giuseppe Biancani (1566–1624);[2]: 10  the first showing a scale and thus constituting a thermometer was by Santorio Santorio in 1625.[5] This was a vertical tube, closed by a bulb of air at the top, with the lower end opening into a vessel of water. The water level in the tube was controlled by the expansion and contraction of the air, so it was what we would now call an air thermometer.[7]

Coining of thermometer

The word thermometer (in its French form) first appeared in 1624 in La Récréation Mathématique by Jean Leurechon, who describes one with a scale of 8 degrees.[8] The word comes from Ancient Greek θερμός (thermós), meaning "warmth", and μέτρον (métron), meaning "measure".

Sealed liquid-in-glass thermometer

Fifty-degree thermometers from the mid-17th century on exhibit at the Museo Galileo, with black dots representing single degrees and white dots representing 10-degree increments; used to measure atmospheric temperatures

The above instruments suffered from the disadvantage that they were also barometers, i.e. sensitive to air pressure. In 1629, Joseph Solomon Delmedigo, a student of Galileo and Santorio in Padua, published what is apparently the first description and illustration of a sealed liquid-in-glass thermometer. It is described as having a bulb at the bottom of a sealed tube partially filled with brandy. The tube had a numbered scale. Delmedigo did not claim to have invented this instrument. Nor did he name anyone else as its inventor.[9] In about 1654, Ferdinando II de' Medici, Grand Duke of Tuscany (1610–1670) did produce such an instrument, the first modern-style thermometer, dependent on the expansion of a liquid and independent of air pressure.[8] Many other scientists experimented with various liquids and designs of thermometer. However, each inventor and each thermometer was unique — there was no standard scale.

Early attempts at standardization

Early attempts at standardization added a single reference point such as the freezing point of water. The use of two references for graduating the thermometer is said to have been introduced by Joachim Dalence in 1668,[10]: 7–8  although Christiaan Huygens (1629–1695) in 1665 had already suggested the use of graduations based on the melting and boiling points of water as standards[11] and, in 1694, Carlo Rinaldini (1615–1698) proposed using them as fixed points along a universal scale divided into degrees.[12][13][10]: 56  In 1701, Isaac Newton (1642–1726/27) proposed a scale of 12 degrees between the melting point of ice and body temperature.[10]: 57–60 

Precision thermometry

A medical mercury-in-glass maximum thermometer.
An alcohol thermometer.
Thermometer with Fahrenheit (symbol °F) and Celsius (symbol °C) units.

In 1714, scientist and inventor Daniel Gabriel Fahrenheit invented a reliable thermometer, using mercury instead of alcohol and water mixtures. In 1724, he proposed a temperature scale which now (slightly adjusted) bears his name. In 1742, Anders Celsius (1701–1744) proposed a scale with zero at the boiling point and 100 degrees at the freezing point of water,[14] though the scale which now bears his name has them the other way around.[15] In 1730, French entomologist René Antoine Ferchault de Réaumur invented an alcohol thermometer and temperature scale that ultimately proved to be less reliable than Fahrenheit's mercury thermometer.

Very Slippy-Weather
A caricature by James Gillray, 1808

The first physician to use thermometer measurements in clinical practice was Herman Boerhaave (1668–1738).[16] In 1866, Sir Thomas Clifford Allbutt (1836–1925) invented a clinical thermometer that produced a body temperature reading in five minutes as opposed to twenty.[17] In 1999, Dr. Francesco Pompei of the Exergen Corporation introduced the world's first temporal artery thermometer, a non-invasive temperature sensor which scans the forehead in about two seconds and provides a medically accurate body temperature.[18][19]

Registering

Traditional thermometers were all non-registering thermometers. That is, the thermometer did not hold the temperature reading after it was moved to a place with a different temperature. Determining the temperature of a pot of hot liquid required the user to leave the thermometer in the hot liquid until after reading it. If the non-registering thermometer was removed from the hot liquid, then the temperature indicated on the thermometer would immediately begin changing to reflect the temperature of its new conditions (in this case, the air temperature). Registering thermometers are designed to hold the temperature indefinitely, so that the thermometer can be removed and read at a later time or in a more convenient place. Mechanical registering thermometers hold either the highest or lowest temperature recorded until manually re-set, e.g., by shaking down a mercury-in-glass thermometer, or until an even more extreme temperature is experienced. Electronic registering thermometers may be designed to remember the highest or lowest temperature, or to remember whatever temperature was present at a specified point in time.

Thermometers increasingly use electronic means to provide a digital display or input to a computer.

Physical principles of thermometry

Various thermometers from the 19th century.
Comparison of the Celsius and Fahrenheit scales

Thermometers may be described as empirical or absolute. Absolute thermometers are calibrated numerically by the thermodynamic absolute temperature scale. Empirical thermometers are not in general necessarily in exact agreement with absolute thermometers as to their numerical scale readings, but to qualify as thermometers at all they must agree with absolute thermometers and with each other in the following way: given any two bodies isolated in their separate respective thermodynamic equilibrium states, all thermometers agree as to which of the two has the higher temperature, or that the two have equal temperatures.[20] For any two empirical thermometers, this does not require that the relation between their numerical scale readings be linear, but it does require that relation to be strictly monotonic.[21] This is a fundamental character of temperature and thermometers.[22][23][24]

As it is customarily stated in textbooks, taken alone, the so-called "zeroth law of thermodynamics" fails to deliver this information, but the statement of the zeroth law of thermodynamics by James Serrin in 1977, though rather mathematically abstract, is more informative for thermometry: "Zeroth Law – There exists a topological line which serves as a coordinate manifold of material behaviour. The points of the manifold are called 'hotness levels', and [the line] is called the 'universal hotness manifold'."[25] To this information there needs to be added a sense of greater hotness; this sense can be had, independently of calorimetry, of thermodynamics, and of properties of particular materials, from Wien's displacement law of thermal radiation: the temperature of a bath of thermal radiation is proportional, by a universal constant, to the frequency of the maximum of its frequency spectrum; this frequency is always positive, but can have values that tend to zero. Another way of identifying hotter as opposed to colder conditions is supplied by Planck's principle, that when a process of isochoric adiabatic work is the sole means of change of internal energy of a closed system, the final state of the system is never colder than the initial state; except for phase changes with latent heat, it is hotter than the initial state.[26][27][28]
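The proportionality in Wien's displacement law can be sketched numerically. In the frequency form, the peak of the blackbody spectrum sits at roughly 5.879 × 10^10 Hz per kelvin; the constant and function names below are illustrative:

```python
# Sketch of the frequency form of Wien's displacement law: the peak
# frequency of a blackbody spectrum is proportional to absolute
# temperature. B_FREQ is the approximate displacement constant.

B_FREQ = 5.879e10  # Hz per kelvin (approximate)

def peak_frequency(temp_k: float) -> float:
    """Peak of the blackbody frequency spectrum at temperature temp_k."""
    return B_FREQ * temp_k

def temperature_from_peak(nu_hz: float) -> float:
    """Invert the proportionality: hotter baths peak at higher frequency."""
    return nu_hz / B_FREQ

# A 5800 K bath (roughly the solar surface) peaks near 3.4e14 Hz:
print(f"{peak_frequency(5800):.3e} Hz")
```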

There are several principles on which empirical thermometers are built, as listed in the section of this article entitled "Primary and secondary thermometers". Several such principles are essentially based on the constitutive relation between the state of a suitably selected particular material and its temperature. Only some materials are suitable for this purpose, and they may be considered as "thermometric materials". Radiometric thermometry, in contrast, can be only slightly dependent on the constitutive relations of materials. In a sense then, radiometric thermometry might be thought of as "universal". This is because it rests mainly on a universality character of thermodynamic equilibrium, that it has the universal property of producing blackbody radiation.

Thermometric materials

Bi-metallic stem thermometers used to measure the temperature of steamed milk
Bi-metallic thermometer for cooking and baking in an oven

There are various kinds of empirical thermometer based on material properties.

Many empirical thermometers rely on the constitutive relation between pressure, volume and temperature of their thermometric material. For example, mercury expands when heated.

If it is used for its relation between pressure and volume and temperature, a thermometric material must have three properties:

(1) Its heating and cooling must be rapid. That is to say, when a quantity of heat enters or leaves a body of the material, the material must expand or contract to its final volume or reach its final pressure and must reach its final temperature with practically no delay; some of the heat that enters can be considered to change the volume of the body at constant temperature, and is called the latent heat of expansion at constant temperature; and the rest of it can be considered to change the temperature of the body at constant volume, and is called the specific heat at constant volume. Some materials do not have this property, and take some time to distribute the heat between temperature and volume change.[29]

(2) Its heating and cooling must be reversible. That is to say, the material must be able to be heated and cooled indefinitely often by the same increment and decrement of heat, and still return to its original pressure, volume and temperature every time. Some plastics do not have this property;[30]

(3) Its heating and cooling must be monotonic.[21][31] That is to say, throughout the range of temperatures for which it is intended to work,

(a) at a given fixed pressure,
either (i) the volume increases when the temperature increases, or else (ii) the volume decreases when the temperature increases;
but not (i) for some temperatures and (ii) for others; or
(b) at a given fixed volume,
either (i) the pressure increases when the temperature increases, or else (ii) the pressure decreases when the temperature increases;
but not (i) for some temperatures and (ii) for others.

At temperatures around 4 °C, water does not have property (3) and is said to behave anomalously in this respect; thus water cannot be used as a material for this kind of thermometry for temperature ranges near 4 °C.[23][32][33][34][35]
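Water's failure of property (3) can be seen in a short numerical check. The density values below are approximate textbook figures for pure water near 4 °C; the specific volume first falls and then rises, so it does not determine temperature uniquely:

```python
# Monotonicity check for property (3): specific volume of water near
# 4 C first decreases, then increases, so volume-vs-temperature is not
# strictly monotonic there. Densities (g/cm^3) are approximate.

densities = {0: 0.99984, 2: 0.99994, 4: 0.99997, 6: 0.99994, 8: 0.99985}
volumes = [(t, 1.0 / rho) for t, rho in sorted(densities.items())]

def is_strictly_monotonic(values):
    increasing = all(a < b for a, b in zip(values, values[1:]))
    decreasing = all(a > b for a, b in zip(values, values[1:]))
    return increasing or decreasing

vols = [v for _, v in volumes]
print(is_strictly_monotonic(vols))  # False: water fails property (3)
```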

Gases, on the other hand, all have properties (1), (2), and (3)(a)(i) and (3)(b)(i). Consequently, they are suitable thermometric materials, and that is why they were important in the development of thermometry.[36]

Constant volume thermometry

According to Preston (1894/1904), Regnault found constant pressure air thermometers unsatisfactory, because they needed troublesome corrections. He therefore built a constant volume air thermometer.[37] Constant volume thermometers do not provide a way to avoid the problem of anomalous behaviour like that of water at approximately 4 °C.[35]
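The constant-volume principle can be sketched with the ideal-gas relation: at fixed volume, pressure is proportional to absolute temperature, so a reading follows from a single calibration point. This is a simplified sketch assuming ideal-gas behaviour; the pressure values are illustrative:

```python
# Constant-volume gas thermometer sketch: for an ideal gas at fixed
# volume, P/T is constant, so T = T_ref * (P / P_ref), referenced here
# to the triple point of water (273.16 K).

def constant_volume_temperature(p_pa: float, p_ref_pa: float,
                                t_ref_k: float = 273.16) -> float:
    """Temperature inferred from gas pressure at constant volume."""
    return t_ref_k * p_pa / p_ref_pa

# A 10% pressure rise above the reference implies a 10% temperature rise:
print(round(constant_volume_temperature(1.10e5, 1.00e5), 2))  # 300.48
```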

Radiometric thermometry

Planck's law very accurately quantitatively describes the power spectral density of electromagnetic radiation, inside a rigid walled cavity in a body made of material that is completely opaque and poorly reflective, when it has reached thermodynamic equilibrium, as a function of absolute thermodynamic temperature alone. A small enough hole in the wall of the cavity emits near enough blackbody radiation of which the spectral radiance can be precisely measured. The walls of the cavity, provided they are completely opaque and poorly reflective, can be of any material indifferently.

Primary and secondary thermometers

A thermometer is called primary or secondary based on how the raw physical quantity it measures is mapped to a temperature. As summarized by Kauppinen et al., "For primary thermometers the measured property of matter is known so well that temperature can be calculated without any unknown quantities. Examples of these are thermometers based on the equation of state of a gas, on the velocity of sound in a gas, on the thermal noise voltage or current of an electrical resistor, and on the angular anisotropy of gamma ray emission of certain radioactive nuclei in a magnetic field."[38]

In contrast, "Secondary thermometers are most widely used because of their convenience. Also, they are often much more sensitive than primary ones. For secondary thermometers knowledge of the measured property is not sufficient to allow direct calculation of temperature. They have to be calibrated against a primary thermometer at least at one temperature or at a number of fixed temperatures. Such fixed points, for example, triple points and superconducting transitions, occur reproducibly at the same temperature."[38]

Calibration

Mercury-in-glass thermometer

Thermometers can be calibrated either by comparing them with other calibrated thermometers or by checking them against known fixed points on the temperature scale. The best known of these fixed points are the melting and boiling points of pure water. (Note that the boiling point of water varies with pressure, so this must be controlled.)

The traditional way of putting a scale on a liquid-in-glass or liquid-in-metal thermometer was in three stages:

  1. Immerse the sensing portion in a stirred mixture of pure ice and water at atmospheric pressure and mark the point indicated when it had come to thermal equilibrium.
  2. Immerse the sensing portion in a steam bath at standard atmospheric pressure and again mark the point indicated.
  3. Divide the distance between these marks into equal portions according to the temperature scale being used.
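The three-step graduation above amounts to a two-point linear calibration. In this sketch, raw_ice and raw_steam stand in for hypothetical instrument readings (for example, liquid-column heights in millimetres):

```python
# Two-point graduation: mark the ice point and the steam point, then
# divide the interval linearly into 100 Celsius degrees.

def make_celsius_scale(raw_ice: float, raw_steam: float):
    """Return a function mapping a raw reading to degrees Celsius by
    linear division of the ice-point/steam-point interval."""
    span = raw_steam - raw_ice
    return lambda raw: 100.0 * (raw - raw_ice) / span

to_celsius = make_celsius_scale(raw_ice=12.0, raw_steam=212.0)
print(to_celsius(112.0))  # 50.0 -- halfway up the column reads 50 degrees
```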

Other fixed points used in the past are the body temperature (of a healthy adult male) which was originally used by Fahrenheit as his upper fixed point (96 °F (35.6 °C) to be a number divisible by 12) and the lowest temperature given by a mixture of salt and ice, which was originally the definition of 0 °F (−17.8 °C).[39] (This is an example of a frigorific mixture.) As body temperature varies, the Fahrenheit scale was later changed to use an upper fixed point of boiling water at 212 °F (100 °C).[40]

These have now been replaced by the defining points in the International Temperature Scale of 1990, though in practice the melting point of water is more commonly used than its triple point, the latter being more difficult to manage and thus restricted to critical standard measurement. Nowadays manufacturers will often use a thermostat bath or solid block where the temperature is held constant relative to a calibrated thermometer. Other thermometers to be calibrated are put into the same bath or block and allowed to come to equilibrium, then the scale is marked, or any deviation from the instrument scale is recorded.[41] For many modern devices, calibration consists of supplying a stored value that is used when processing an electronic signal to convert it to a temperature.

Precision, accuracy, and reproducibility

The "Boyce MotoMeter" radiator cap on a 1913 Car-Nation automobile, used to measure temperature of vapor in 1910s and 1920s cars.
Separated columns are often a problem in both alcohol and mercury thermometers, and they can make a temperature reading inaccurate.

The precision or resolution of a thermometer is the fraction of a degree to which a reading can be made. For high-temperature work it may only be possible to measure to the nearest 10 °C or more. Clinical thermometers and many electronic thermometers are usually readable to 0.1 °C. Special instruments can give readings to one thousandth of a degree.[42] However, this precision does not mean the reading is true or accurate; it only means that very small changes can be observed.

A thermometer calibrated to a known fixed point is accurate (i.e. gives a true reading) at that point. The invention of the technology to measure temperature led to the creation of scales of temperature.[43] In between fixed calibration points, interpolation is used, usually linear.[41] This may give significant differences between different types of thermometer at points far away from the fixed points. For example, the expansion of mercury in a glass thermometer is slightly different from the change in resistance of a platinum resistance thermometer, so these two will disagree slightly at around 50 °C.[44] There may be other causes due to imperfections in the instrument, e.g. in a liquid-in-glass thermometer if the capillary tube varies in diameter.[44]
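Interpolation between fixed calibration points can be sketched as a piecewise-linear mapping. The (raw reading, true °C) pairs below are hypothetical; a real instrument would use its certified fixed points:

```python
# Piecewise-linear interpolation between fixed calibration points,
# the usual way readings between fixed points are obtained.

from bisect import bisect_right

CAL = [(10.0, 0.0), (55.0, 40.0), (120.0, 100.0)]  # (raw, true deg C)

def interpolate(raw: float) -> float:
    """Map a raw reading to degrees C using the calibration table."""
    xs = [p[0] for p in CAL]
    i = min(max(bisect_right(xs, raw) - 1, 0), len(CAL) - 2)
    (x0, y0), (x1, y1) = CAL[i], CAL[i + 1]
    return y0 + (y1 - y0) * (raw - x0) / (x1 - x0)

print(interpolate(55.0))   # 40.0 at the middle fixed point
print(interpolate(87.5))   # 70.0, halfway between 55 and 120 raw
```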

For many purposes reproducibility is important. That is, does the same thermometer give the same reading for the same temperature (or do replacement or multiple thermometers give the same reading)? Reproducible temperature measurement means that comparisons are valid in scientific experiments and industrial processes are consistent. Thus if the same type of thermometer is calibrated in the same way its readings will be valid even if it is slightly inaccurate compared to the absolute scale.

An example of a reference thermometer used to check others to industrial standards would be a platinum resistance thermometer with a digital display to 0.1 °C (its precision) which has been calibrated at 5 points against national standards (−18, 0, 40, 70, 100 °C) and which is certified to an accuracy of ±0.2 °C.[45]

According to British Standards, correctly calibrated, used and maintained liquid-in-glass thermometers can achieve a measurement uncertainty of ±0.01 °C in the range 0 to 100 °C, and a larger uncertainty outside this range: ±0.05 °C up to 200 or down to −40 °C, ±0.2 °C up to 450 or down to −80 °C.[46]

Indirect methods of temperature measurement

Thermal expansion
Utilizing the property of thermal expansion of various phases of matter.
Pairs of solid metals with different expansion coefficients can be used for bi-metal mechanical thermometers. Another design using this principle is Breguet's thermometer.
Some liquids possess relatively high expansion coefficients over useful temperature ranges, thus forming the basis for an alcohol or mercury thermometer. Alternative designs using this principle are the reversing thermometer and the Beckmann differential thermometer.
As with liquids, gases can also be used to form a gas thermometer.
Pressure
Vapour pressure thermometer
Density
Galileo thermometer[47]
Thermochromism
Some compounds exhibit thermochromism at distinct temperature changes. Thus by tuning the phase transition temperatures for a series of substances the temperature can be quantified in discrete increments, a form of digitization. This is the basis for a liquid crystal thermometer.
Band edge thermometry (BET)
Band edge thermometry (BET) takes advantage of the temperature-dependence of the band gap of semiconductor materials to provide very precise optical (i.e. non-contact) temperature measurements.[48] BET systems require a specialized optical system, as well as custom data analysis software.[49][50]
Blackbody radiation
An infrared thermometer is a kind of pyrometer (bolometer).
All objects above absolute zero emit blackbody radiation, whose spectrum is determined by the temperature. This property is the basis for the pyrometer, infrared thermometer, and thermography. It has the advantage of remote temperature sensing; unlike most thermometers, it does not require contact or even close proximity. At higher temperatures, blackbody radiation becomes visible and is described by the colour temperature, for example in a glowing heating element or in estimating a star's surface temperature.
Fluorescence
Phosphor thermometry
Optical absorbance spectra
Fiber optical thermometer
Electrical resistance
Resistance thermometers, which use materials such as Balco alloy
Thermistor
Coulomb blockade thermometer
Electrical potential
Thermocouples are useful over a wide temperature range, from cryogenic temperatures to over 1000 °C, but typically have an error of ±0.5 to ±1.5 °C.
Silicon bandgap temperature sensors are commonly found packaged in integrated circuits with an accompanying ADC and interface such as I2C. Typically they are specified to work within about −50 to 150 °C with accuracies in the ±0.25 to ±1 °C range, which can be improved by binning.[51][52]
Electrical resonance
Quartz thermometer
Nuclear magnetic resonance
Chemical shift is temperature dependent. This property is used to calibrate the thermostat of NMR probes, usually using methanol or ethylene glycol.[53][54] This can potentially be problematic for internal standards, which are usually assumed to have a defined chemical shift (e.g. 0 ppm for TMS) but in fact exhibit a temperature dependence.[55]
Magnetic susceptibility
Above the Curie temperature, the magnetic susceptibility of a paramagnetic material exhibits an inverse temperature dependence. This phenomenon is the basis of a magnetic cryometer.[56][57]
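As a worked example for the electrical-resistance entries above, a thermistor reading is commonly converted to temperature with the simple beta-parameter model; this is one common simplified model (the Steinhart–Hart equation is more accurate), and R0, T0, and BETA below are assumed typical NTC datasheet values:

```python
# Beta-parameter model for an NTC thermistor:
# 1/T = 1/T0 + ln(R/R0)/BETA, with T in kelvin.

import math

R0 = 10_000.0   # ohms at the reference temperature (assumed datasheet value)
T0 = 298.15     # reference temperature, 25 C in kelvin
BETA = 3950.0   # beta parameter from a typical NTC datasheet (assumed)

def thermistor_temp_c(resistance_ohm: float) -> float:
    """Temperature in degrees C from measured NTC resistance."""
    inv_t = 1.0 / T0 + math.log(resistance_ohm / R0) / BETA
    return 1.0 / inv_t - 273.15

print(round(thermistor_temp_c(10_000.0), 2))  # 25.0 at the reference point
```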

Applications

Thermometers utilize a range of physical effects to measure temperature. Temperature sensors are used in a wide variety of scientific and engineering applications, especially measurement systems. Temperature systems are primarily either electrical or mechanical, occasionally inseparable from the system which they control (as in the case of a mercury-in-glass thermometer). Thermometers are used in roadways in cold-weather climates to help determine whether icing conditions exist. Indoors, thermistors are used in climate control systems such as air conditioners, freezers, heaters, refrigerators, and water heaters.[58] Galileo thermometers, with their limited measurement range, are used to measure indoor air temperature.

Liquid crystal thermometers (which use thermochromic liquid crystals) are also used in mood rings and to measure the temperature of water in fish tanks.

Fiber Bragg grating temperature sensors are used in nuclear power facilities to monitor reactor core temperatures and avoid the possibility of nuclear meltdowns.[59]

Nanothermometry

Nanothermometry is an emergent research field concerned with measuring temperature at the sub-micrometre scale. Conventional thermometers cannot measure the temperature of an object smaller than a micrometre, so new methods and materials must be used; nanothermometry is used in such cases. Nanothermometers are classified as luminescent thermometers (if they use light to measure temperature) and non-luminescent thermometers (systems whose thermometric properties are not directly related to luminescence).[60]

Cryometer

A cryometer is a thermometer used specifically for low temperatures.

Medical

A Kinsa QuickCare smart thermometer.

Various thermometric techniques have been used throughout history, ranging from the Galileo thermometer to thermal imaging.[47] Medical thermometers such as mercury-in-glass thermometers, infrared thermometers, pill thermometers, and liquid crystal thermometers are used in health care settings to determine if individuals have a fever or are hypothermic.

Food and food safety

Thermometers are important in food safety: food held between 41 and 135 °F (5 and 57 °C) can be prone to potentially harmful levels of bacterial growth after several hours, which can lead to foodborne illness. This includes monitoring refrigeration temperatures and maintaining temperatures in foods being served under heat lamps or in hot water baths.[58] Cooking thermometers are important for determining if a food is properly cooked. In particular, meat thermometers are used to aid in cooking meat to a safe internal temperature while preventing overcooking. They commonly use either a bimetallic coil, or a thermocouple or thermistor with a digital readout. Candy thermometers are used to aid in achieving a specific water content in a sugar solution based on its boiling temperature.

Environmental

Alcohol thermometers, infrared thermometers, mercury-in-glass thermometers, recording thermometers, thermistors, and Six's thermometers (maximum-minimum thermometer) are used in meteorology and climatology in various levels of the atmosphere and oceans. Aircraft use thermometers and hygrometers to determine if atmospheric icing conditions exist along their flight path. These measurements are used to initialize weather forecast models. Thermometers are used in roadways in cold weather climates to help determine if icing conditions exist and indoors in climate control systems.

from Grokipedia
A thermometer is an instrument designed to measure temperature by detecting and quantifying changes in physical properties, such as the expansion or contraction of liquids like mercury or alcohol, or variations in electrical resistance or infrared radiation emitted by an object. The development of thermometers traces back to the early 17th century, evolving from rudimentary thermoscopes—devices that indicated temperature changes without numerical scales—to precise instruments with standardized scales. Key milestones include Galileo Galilei's 1610 invention of an alcohol-based thermoscope, Ferdinand II de’ Medici's 1654 sealed alcohol thermometer, and Gabriel Fahrenheit's mercury thermometer (invented 1714) with the Fahrenheit scale (proposed 1724), which marked the transition to reliable quantitative measurement. Later advancements, such as Anders Celsius's 1742 centigrade scale and Thomas Clifford Allbutt's 1867 clinical thermometer, expanded their utility in medicine and science. Thermometers operate on diverse principles and come in various types to suit different applications, from everyday use to specialized scientific measurements. Liquid-in-glass thermometers, historically common, rely on the of liquids within a capillary tube, though they have largely been replaced due to hazards like mercury . Digital thermometers, using thermistors or thermocouples, convert resistance or voltage changes into readings and offer advantages like higher accuracy (up to ±0.05°C) and faster response times. Non-contact options, such as thermometers, detect for surface measurements, while specialized variants like fiber-optic sensors enable distributed monitoring in challenging environments. These devices are essential across fields including , , and industry, where accurate temperature data informs everything from to diagnosing fevers (normal is approximately 37°C or 98.6°F). 
Modern standards, such as the Kelvin, Celsius, and Fahrenheit scales, ensure global consistency, with the Kelvin scale defining absolute zero at 0 K for thermodynamic applications.

Introduction

Definition and Purpose

A thermometer is an instrument designed to measure temperature by detecting and quantifying changes in the physical properties of a substance or system in response to thermal variations, converting these changes into a numerical value on a calibrated scale. This device enables the objective assessment of thermal states, distinguishing it from subjective evaluations based on human sensation and providing instead a standardized measure essential for consistency across observations.

The core purpose of a thermometer is to facilitate the precise quantification of hotness or coldness in diverse contexts, including scientific experiments, industrial monitoring, medical assessments, and routine environmental checks, thereby supporting informed decision-making and safety protocols. By translating thermal phenomena into reproducible numerical data, thermometers underpin advancements in fields ranging from quantum physics to climate science, while also aiding everyday tasks like cooking or fever tracking.

At its foundation, a thermometer relies on the predictable variation of an observable property—such as volume expansion, electrical resistance, or spectral emission—with temperature, allowing the correlation of these changes to a defined scale. Essential components include a sensing element that responds to thermal input, a graduated scale for numerical interpretation, and a display for user-readable output, ensuring the device's functionality across applications. These elements produce readings aligned with established scales, such as Celsius or Fahrenheit.

Temperature Scales

Temperature scales provide standardized systems for measuring temperature, enabling consistent quantification of thermal states across scientific, industrial, and everyday applications. These scales are defined relative to fixed points, such as phase transitions of water, and absolute references like absolute zero. The primary scales in use today are the kelvin and Celsius scales in the International System of Units (SI), alongside the Fahrenheit scale in certain regions, with historical scales like Rankine and Réaumur offering additional context for thermodynamic measurements.

The kelvin is the SI base unit of thermodynamic temperature, defined such that the Boltzmann constant is exactly 1.380 649 × 10⁻²³ J/K, establishing 0 K as absolute zero—the theoretical point where molecular motion ceases. The degree size matches that of the Celsius scale, with the triple point of water at 273.16 K serving as a fundamental reference for calibration. This absolute scale avoids negative values and is essential for equations in physics and chemistry involving thermodynamic temperature.

The Celsius scale, denoted °C, is a relative scale originally defined by assigning 0 °C to the freezing point of water at standard atmospheric pressure and 100 °C to its boiling point, dividing the interval into 100 equal degrees. Since 2019, it is formally tied to the kelvin scale, where 0 °C equals 273.15 K, maintaining the same interval size as one kelvin. This scale's practical fixed points facilitate everyday and laboratory use, though modern calibrations rely on the kelvin definition for precision.

The Fahrenheit scale, denoted °F, sets the freezing point of water at 32 °F and the boiling point at 212 °F under standard atmospheric pressure, creating 180 divisions between these points—thus, one Fahrenheit degree is 5/9 the size of a Celsius degree. Developed for empirical consistency in early thermometry, it remains prevalent in the United States for non-scientific contexts.

Other scales include the Rankine scale (°R), an absolute counterpart to Fahrenheit where 0 °R corresponds to absolute zero and the degree size equals one Fahrenheit degree; for instance, the freezing point of water is 491.67 °R. The Réaumur scale (°Re or °Ré), a historical scale, defines water's freezing point as 0 °Ré and its boiling point as 80 °Ré, with each Réaumur degree equal to 1.25 Celsius degrees; once used in European engineering, it is now obsolete.
Fixed points are critical for defining and calibrating these scales, with the triple point of water—where solid, liquid, and vapor phases coexist in equilibrium at 0.01 °C (273.16 K or 32.018 °F)—serving as the modern primary reference due to its reproducibility and independence from pressure variations. This point replaced earlier reliance on the ice point (0 °C) and steam point (100 °C) for greater accuracy in the International Temperature Scale of 1990 (ITS-90).

Conversions between scales derive from their interval ratios and zero-point offsets. For Celsius to kelvin, add 273.15, as the scales share identical degree sizes and 0 °C is defined as 273.15 K:

T(K) = t(°C) + 273.15

This offset stems from the triple-point assignment, where 0.01 °C = 273.16 K, approximating the historical ice-point relation. The Fahrenheit-to-Celsius conversion accounts for the 1.8:1 degree ratio (from 180 °F spanning 100 °C) and the 32 °F offset at the ice point. Subtract 32 °F to align zeros, then divide by 1.8:

t(°C) = (t(°F) − 32) / 1.8

Conversely, multiply by 1.8 and add 32 for Celsius to Fahrenheit:

t(°F) = t(°C) × 1.8 + 32

These derive directly from the fixed points: the freezing-to-boiling interval yields the ratio (212 − 32) °F = 180 °F for 100 °C, so 9/5 = 1.8 °F/°C. For Rankine, add 459.67 to Fahrenheit values, as 0 °F = 459.67 °R from absolute-zero alignment. Réaumur conversions use its 0.8:1 ratio to Celsius (80 °Ré for 100 °C), so multiply Celsius by 0.8:

t(°Ré) = t(°C) × 0.8

These transformations ensure interoperability across scales in thermometric applications.
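The conversion formulas above can be collected into a few plain functions; this is a minimal sketch, with function names chosen for illustration rather than taken from any standard library.

```python
def celsius_to_kelvin(t_c: float) -> float:
    """K = degC + 273.15 (identical degree size, offset zero point)."""
    return t_c + 273.15

def fahrenheit_to_celsius(t_f: float) -> float:
    """degC = (degF - 32) / 1.8 (align zeros at the ice point, rescale)."""
    return (t_f - 32) / 1.8

def celsius_to_fahrenheit(t_c: float) -> float:
    """degF = degC * 1.8 + 32."""
    return t_c * 1.8 + 32

def fahrenheit_to_rankine(t_f: float) -> float:
    """degR = degF + 459.67 (absolute counterpart of Fahrenheit)."""
    return t_f + 459.67

def celsius_to_reaumur(t_c: float) -> float:
    """degRe = degC * 0.8 (80 Reaumur degrees span 0-100 degC)."""
    return t_c * 0.8

# Fixed-point sanity checks: water freezes at 0 degC = 32 degF = 273.15 K
# and boils at 100 degC = 212 degF = 80 degRe.
assert abs(celsius_to_kelvin(0.0) - 273.15) < 1e-9
assert abs(celsius_to_fahrenheit(100.0) - 212.0) < 1e-9
assert abs(celsius_to_reaumur(100.0) - 80.0) < 1e-9
```

Because every scale is an affine transform of the others, round-trips such as `fahrenheit_to_celsius(celsius_to_fahrenheit(x))` recover the input up to floating-point rounding.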

History

Ancient and Early Developments

The earliest efforts to conceptualize and observe temperature changes date back to ancient civilizations, where qualitative assessments predominated before the development of quantitative devices. In metallurgy, practitioners in ancient China and India relied on visual cues, such as the color of heated metals, to gauge hotness during forging and smelting processes; for instance, terms like "red heat" indicated specific temperature ranges suitable for working wootz steel in India. Evaporative cooling techniques, such as using wet materials to lower ambient heat, also served as rudimentary methods to sense and manage temperature differences in these contexts.

In the Hellenistic period, Greek engineers made initial strides toward instrumental measurement. Philo of Byzantium (c. 280–220 BC) described a thermoscope-like device that exploited air expansion: a hollow lead sphere connected by a tube to a vessel of water, where heating caused the air to expand and displace the water level, demonstrating temperature-induced volume changes. This apparatus, detailed in Philo's Pneumatica, marked an early recognition of thermal expansion as a detectable phenomenon. Hero of Alexandria (c. 10–70 AD) refined such concepts in his Pneumatica, employing a similar open tube system with water to make temperature variations more visible through fluid displacement, though without numerical calibration. These devices functioned as qualitative indicators, showing relative hotness or coldness via mechanical effects rather than precise measurement.

By the late 16th century, progress shifted toward quantification. Galileo Galilei (c. 1593) developed a water-filled thermoscope—a glass tube with a bulb inverted into a basin—allowing observation of temperature variations through liquid level shifts. He introduced one of the first fixed-point scales, marking approximately 100 arbitrary divisions between the ice point (as a cold reference) and a warmer fixed point, enabling comparative readings despite inconsistencies.
These ancient and early prototypes shared critical limitations as open systems: they were highly sensitive to atmospheric pressure fluctuations, which altered fluid levels independently of temperature, rendering them unreliable for absolute measurements and distinguishing them from true sealed thermometers.

Renaissance and Standardization Efforts

Building upon the qualitative thermoscopes of antiquity, the Renaissance period marked a pivotal shift toward quantitative thermometry through the development of sealed instruments and the introduction of numerical scales. In 1612, Italian physician Santorio Santorio adapted the thermoscope for clinical use, applying the first numerical scale to track fever in patients. This innovation transformed the instrument from a mere indicator of thermal expansion into a tool for precise medical observation, emphasizing its role in quantifying bodily temperatures.

A key advancement in reliability came with the invention of the sealed liquid-in-glass thermometer by Ferdinand II de' Medici, Grand Duke of Tuscany, in 1654. By enclosing alcohol within a bulb and stem and hermetically sealing both ends, Ferdinand eliminated the influence of atmospheric pressure variations that plagued open thermoscopes, enabling more consistent readings across different conditions. Concurrently, early gas-based thermometers emerged, with French physicist Guillaume Amontons developing an air thermometer in the late 17th century—around 1699—that measured temperature via pressure changes in a constant volume of air, laying groundwork for later constant-volume gas thermometry.

Efforts toward standardization began in the early 18th century, as physicians and scientists sought uniform scales to facilitate comparable measurements. French doctor Jean Rey constructed the first liquid-expansion thermometer using water around 1631, representing an initial step toward scalable designs, though it remained unsealed and lacked a formalized division. By 1714, German instrument maker Daniel Gabriel Fahrenheit introduced the mercury-in-glass thermometer, which offered greater precision due to mercury's uniform expansion, and in 1724 he proposed his scale with fixed points at 32° for water's freezing and 212° for boiling, calibrated against an ice-salt mixture at 0°.
Swedish astronomer Anders Celsius advanced this further in 1742 by devising the centigrade scale for his mercury thermometers, initially setting 0° at water's boiling point and 100° at freezing (later inverted), using the ice and steam points as anchors for reproducibility. Despite these innovations, early standardization faltered without international consensus, as varying fixed points and arbitrary gradations hindered widespread adoption until later refinements.

Modern Precision Advancements

In the mid-19th century, precision thermometry advanced significantly with William Thomson's (Lord Kelvin's) proposal of an absolute temperature scale in 1848, based on Carnot's thermodynamic principles, which defined temperature independently of material properties and established absolute zero as the point of no thermal motion. This scale provided a theoretical foundation for accurate measurements, influencing subsequent instrument designs by emphasizing reproducibility and thermodynamic consistency.

A key practical innovation came in 1887 when Hugh Longbourne Callendar developed the platinum resistance thermometer at the Cavendish Laboratory, demonstrating that platinum's electrical resistance varies predictably and nearly linearly with temperature, enabling stable and reproducible measurements up to 500 °C with precision to 1 part in 10,000. Callendar's design, detailed in his experiments on resistance as a temperature measure, proved superior to gas thermometers for industrial applications due to its portability and minimal drift, facilitating widespread adoption in industry by the early 20th century.

The 20th century saw further milestones in thermoelectric thermometry, building on Thomas Seebeck's 1821 discovery of the Seebeck effect, where a temperature difference across dissimilar metals generates voltage. By the early 20th century, quantitative characterization of thermocouple alloys enabled practical thermocouples for industrial use, with commercial standardization for high-temperature monitoring in manufacturing and power generation.

Non-contact methods advanced with pyrometers in the 1920s and 1930s; Hungarian Kálmán Tihanyi's 1929 patent for an infrared-sensitive camera laid groundwork for thermal imaging, while the first dedicated infrared thermometer emerged in 1931, allowing remote measurement of hot objects without physical contact, crucial for industrial and wartime applications. Ratio pyrometers, developed commercially by 1939, improved accuracy by comparing radiation intensities at multiple wavelengths, reducing errors from emissivity variations.
Post-2000 developments integrated micro-electro-mechanical systems (MEMS) into digital thermometers, enabling compact, low-power devices with resolutions below 0.1 °C for biomedical and consumer uses, as reviewed in advancements leveraging microstructures for sensing in healthcare monitoring. Quantum advancements in the 2020s have introduced nitrogen-vacancy (NV) centers in diamond as nanoscale thermometers, offering sub-micron spatial resolution and sensitivities down to millikelvin changes via temperature-dependent spin-resonance shifts, with applications in cellular thermometry and microelectronics thermal mapping. Emerging in the 2010s, fiber-optic thermometers for Internet of Things (IoT) applications utilize fluorescence decay or Raman scattering in optical fibers to enable distributed, EMI-resistant sensing over kilometers, supporting applications such as smart grids with accuracies of ±0.5 °C. Complementing these, IoT temperature sensors, driven by low-power wide-area networks like LoRaWAN, proliferated for remote data logging, achieving battery lives exceeding five years while integrating with cloud analytics for real-time alerts.

Physical Principles

Thermometric Properties of Materials

Thermometric properties refer to the measurable physical characteristics of materials that vary predictably and reproducibly with temperature, serving as the foundation for temperature sensing in thermometers. These properties include changes in volume, length, electrical resistance, voltage generation, and phase transitions, which allow materials to indicate temperature through observable or quantifiable alterations. Selection of materials depends on factors such as sensitivity (the magnitude of property change per unit temperature), operational range, hysteresis (discrepancy in readings during heating versus cooling), and long-term stability, with solids often preferred for mechanical robustness and gases for high accuracy in idealized conditions despite challenges in thermal equilibration.

Thermal expansion is a key thermometric property exploited in liquid-based thermometers, where substances like mercury and alcohol increase in volume approximately linearly with temperature. The change in length ΔL of a column is given by

ΔL = αLΔT,

where α is the expansion coefficient (on the order of 10⁻⁴ K⁻¹ for liquids), L is the original length, and ΔT is the temperature change; this volumetric expansion in confined liquids produces a visible rise in a capillary tube.

Electrical properties provide precise thermometric responses, particularly through resistance variations in metals. For platinum, widely used due to its stability, resistance R changes as

R = R₀(1 + αΔT),

where R₀ is the resistance at a reference temperature and α ≈ 0.00385 K⁻¹; this positive temperature coefficient enables accurate resistance temperature detectors (RTDs). The Seebeck effect in thermocouples generates a voltage ΔV across junctions of dissimilar metals proportional to the temperature difference, expressed as

ΔV = αΔT,

with α (the Seebeck coefficient) around 40 μV/K for common types like chromel-alumel, allowing measurement over wide ranges from cryogenic to high temperatures.

Phase changes offer visual or mechanical indications of temperature through structural alterations.
Bimetallic strips consist of two bonded metals with differing expansion coefficients, such as steel and brass, causing bending upon heating due to differential expansion rates, which can deflect a pointer or trigger a switch. Liquid crystals exhibit thermochromism, changing color reversibly as temperature alters their molecular helical structure and light-reflecting properties, enabling non-contact displays for surface temperature mapping. Material selection prioritizes high sensitivity for fine resolution (e.g., thermocouples at 40 μV/K), broad range (e.g., −200 to 1300 °C for certain alloys), low hysteresis to ensure repeatability, and stability against aging or drift; solids like platinum provide excellent long-term consistency, while gases excel in theoretical precision for constant-volume applications but require careful handling due to lower thermal conductivity.
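The linearized sensor models above translate directly into code. This is an illustrative sketch only: the coefficient values are the approximate figures quoted in the text (0.00385 K⁻¹ for platinum, 40 μV/K for chromel-alumel), not calibration data, and the function names are invented for this example.

```python
PT_ALPHA = 0.00385      # K^-1, platinum temperature coefficient of resistance
SEEBECK_ALPHA = 40e-6   # V/K, approximate chromel-alumel Seebeck coefficient

def rtd_resistance(r0: float, delta_t: float) -> float:
    """R = R0 * (1 + alpha * dT): linearized platinum RTD model."""
    return r0 * (1 + PT_ALPHA * delta_t)

def rtd_temperature_rise(r: float, r0: float) -> float:
    """Invert the linear RTD model to recover dT from measured resistance."""
    return (r / r0 - 1) / PT_ALPHA

def thermocouple_voltage(delta_t: float) -> float:
    """dV = alpha * dT: idealized Seebeck voltage across a junction pair."""
    return SEEBECK_ALPHA * delta_t

# A Pt100 element (100 ohm at the reference point) reading 138.5 ohm
# implies a rise of about 100 K under the linear model.
dt = rtd_temperature_rise(138.5, 100.0)
```

Real RTDs and thermocouples deviate from these straight-line models over wide ranges; standards tabulate higher-order polynomial corrections, but the linear form captures the working principle.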

Constant-Volume and Gas Thermometry

Constant-volume gas thermometry is a primary method for measuring temperature based on the pressure changes of a gas confined to a fixed volume. According to the ideal gas law, PV = nRT, where P is pressure, V is volume, n is the number of moles, R is the gas constant, and T is the absolute temperature, temperature is directly proportional to pressure when volume and the amount of gas are held constant. Thus, by monitoring pressure variations in a sealed bulb, the temperature can be determined with high precision, making this technique fundamental to thermodynamic temperature scales.

In operation, a constant-volume gas thermometer typically employs low-density gases such as helium or hydrogen to minimize deviations from ideal behavior. The apparatus consists of a rigid bulb connected to a pressure gauge, often a manometer, immersed in the environment whose temperature is to be measured. As temperature changes, the gas pressure adjusts accordingly, and readings are taken relative to reference points like the triple point of water (273.16 K). For practical measurement, the temperature is calculated using the formula

T = (P − P₀) / (P_tp − P₀) × 273.16 K,

where P is the measured pressure, P_tp is the pressure at the triple point, and P₀ is the extrapolated pressure at absolute zero. To define the thermodynamic temperature rigorously, measurements are extrapolated to the limit of zero gas density, where T is proportional to the limiting ratio P/P_tp, ensuring independence from the specific gas used. Helium is particularly favored for low-temperature applications due to its inertness and behavior close to ideality even near absolute zero.

This method played a pivotal historical role in establishing the Kelvin scale, as it allowed metrologists to extrapolate to absolute zero, defining the scale's foundation in the late 19th century. Its advantages include exceptional accuracy, often achieving uncertainties below 0.001 K in controlled settings, and reliability across a wide range, particularly with helium for measurements approaching absolute zero where other thermometers fail.
However, constant-volume gas thermometers are inherently bulky due to the need for large bulbs and precise pressure-measurement systems, and they exhibit slow thermal response times, limiting their use to standards laboratories rather than routine applications.
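The working equation above reduces to a one-line calculation once the triple-point reading is known; the sketch below assumes an ideal gas, for which the extrapolated pressure at absolute zero, P₀, vanishes.

```python
T_TRIPLE = 273.16  # K, triple point of water, the scale's reference point

def gas_thermometer_temperature(p: float, p_tp: float, p0: float = 0.0) -> float:
    """T = (P - P0) / (P_tp - P0) * 273.16 K.

    p    : measured pressure in the bulb (any consistent unit)
    p_tp : pressure measured with the bulb at the triple point of water
    p0   : extrapolated pressure at absolute zero (0 for an ideal gas)
    """
    return (p - p0) / (p_tp - p0) * T_TRIPLE

# A reading 10% above the triple-point pressure implies roughly 300.5 K.
t = gas_thermometer_temperature(1.1e5, 1.0e5)
```

Only the pressure ratio matters, so the gauge's units cancel; in metrological practice the result is further extrapolated to zero gas density to remove residual non-ideality.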

Radiometric and Optical Methods

Radiometric and optical methods for temperature measurement rely on the principles of thermal radiation emitted by objects, enabling non-contact sensing across a wide range of temperatures and distances. These techniques are grounded in blackbody radiation theory, which describes the radiation emitted by an idealized body that absorbs all incident radiation. The total emissive power J of a blackbody is given by the Stefan-Boltzmann law:

J = σT⁴,

where σ is the Stefan-Boltzmann constant (5.6704 × 10⁻⁸ W m⁻² K⁻⁴) and T is the absolute temperature in kelvins. This law quantifies how the total radiated energy scales with the fourth power of temperature, forming the basis for radiometric thermometry. Additionally, Wien's displacement law states that the wavelength λ_max at which the spectral radiance peaks is inversely proportional to temperature:

λ_max T = b,

with b ≈ 2898 μm·K. This relation shifts the peak emission to shorter wavelengths as temperature increases, guiding the selection of detection wavelengths in optical systems.

Pyrometry utilizes these radiation laws to infer temperature from the intensity and spectral distribution of emitted radiation. In optical pyrometers, such as the disappearing filament type, the brightness of a heated filament is visually matched to the target's glow through an optical system, with the filament current calibrated to temperature via the Planck radiation law approximation in the visible range. When the filament "disappears" against the background, their radiances are equal, allowing direct temperature estimation for high-temperature sources like furnaces. For lower temperatures, infrared thermometers detect radiation in the 8–14 μm band, where atmospheric absorption by water vapor and CO₂ is minimal, enabling accurate measurement of thermal emission from surfaces. These devices apply the Stefan-Boltzmann law, adjusted for the target's emissivity (a measure of how closely it approximates a blackbody), to convert detected radiance to temperature.

Advanced optical methods extend these principles using light interactions for precise, localized sensing.
Fiber-optic sensors based on fluorescence decay employ phosphorescent materials, such as chromium-doped crystals, in which the excited-state lifetime inversely correlates with temperature: excitation light is transmitted via optical fibers to the sensor tip, and the decay time of the returned fluorescence is analyzed, providing immunity to fiber losses and enabling measurements up to 700 °C or higher. Raman spectroscopy offers remote sensing by probing molecular vibrations in the target; the Stokes-to-anti-Stokes intensity ratio in scattered light varies with temperature, allowing non-invasive profiling in gases or liquids over distances. This technique is particularly suited for environmental or industrial monitoring, as the Raman shift provides a direct spectroscopic thermometer independent of emissivity.

In the 2020s, hyperspectral imaging has advanced remote thermometry by capturing narrow spectral bands across the infrared spectrum, enabling precise discrimination of surface temperatures in complex scenes. These systems, often deployed on satellites or drones, leverage Wien's law to map temperature variations for climate monitoring, such as tracking sea surface temperatures or vegetation stress with sub-degree accuracy. By integrating multiple wavelengths, hyperspectral approaches mitigate emissivity uncertainties and enhance accuracy in dynamic environments.
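The two radiation laws above can be applied numerically as a rough graybody model of a non-contact reading; this is a simplified sketch (constants from the text, a flat emissivity correction, no atmospheric or detector effects), not a full radiometer model.

```python
SIGMA = 5.6704e-8   # W m^-2 K^-4, Stefan-Boltzmann constant
WIEN_B = 2898.0     # um*K, Wien displacement constant

def radiant_exitance(t: float, emissivity: float = 1.0) -> float:
    """J = eps * sigma * T^4 (graybody form of the Stefan-Boltzmann law)."""
    return emissivity * SIGMA * t**4

def brightness_temperature(j: float, emissivity: float = 1.0) -> float:
    """Invert J = eps * sigma * T^4 to recover T from measured power."""
    return (j / (emissivity * SIGMA)) ** 0.25

def peak_wavelength_um(t: float) -> float:
    """Wien's law: lambda_max = b / T, in micrometres."""
    return WIEN_B / t

# A room-temperature (~300 K) surface peaks near 9.7 um, which is why
# infrared thermometers work in the 8-14 um atmospheric window.
lam = peak_wavelength_um(300.0)
```

Note the emissivity sensitivity: assuming emissivity 1.0 for a target whose true emissivity is 0.9 biases the recovered temperature low, which is the main error source the ratio-pyrometer and hyperspectral approaches mentioned above are designed to suppress.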

Types of Thermometers

Primary Thermometers

Primary thermometers are devices that measure temperature by directly realizing the thermodynamic temperature scale, independent of prior calibration against other thermometers, typically relying on fundamental physical laws such as the ideal gas law or blackbody radiation laws. A prominent example is the constant-volume gas thermometer, which operates by enclosing a fixed volume of gas, often helium or another near-ideal gas, in a bulb connected to a pressure-measuring system; as temperature changes, the gas pressure varies proportionally according to the ideal gas law PV = nRT, where at constant volume V, P is directly proportional to absolute temperature T, allowing T to be determined from measured P relative to a reference point like the triple point of water. Another example is the acoustic gas thermometer, which determines temperature from the speed of sound in a gas, such as argon, confined in a resonant cavity; the speed of sound c follows c ∝ √T.
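The acoustic relation c ∝ √T means an unknown temperature follows from the square of a speed-of-sound ratio against a reference measurement; a minimal sketch, with the function name invented for illustration:

```python
def acoustic_temperature(c: float, c_ref: float, t_ref: float) -> float:
    """T = T_ref * (c / c_ref)^2, from c proportional to sqrt(T).

    c     : measured speed of sound (any consistent unit)
    c_ref : speed of sound measured at the known reference temperature
    t_ref : reference temperature in kelvins
    """
    return t_ref * (c / c_ref) ** 2

# Sound travelling 5% faster than at a 273.16 K reference implies ~301 K.
t = acoustic_temperature(1.05, 1.0, 273.16)
```

As with the pressure-ratio formula for gas thermometry, only the ratio enters, so the speed's units cancel; practical acoustic thermometry adds corrections for the gas's real (non-ideal) equation of state.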