Mercury thermometer (mercury-in-glass thermometer) for measurement of room temperature.[1]
A thermometer, from Ancient Greek θερμός (thermós), meaning "warmth", and μέτρον (métron), meaning "measure", is a device that measures temperature (the hotness or coldness of an object) or temperature gradient (the rates of change of temperature in space). A thermometer has two important elements: (1) a temperature sensor (e.g. the bulb of a mercury-in-glass thermometer or the pyrometric sensor in an infrared thermometer) in which some change occurs with a change in temperature; and (2) some means of converting this change into a numerical value (e.g. the visible scale that is marked on a mercury-in-glass thermometer or the digital readout on an infrared model). Thermometers are widely used in technology and industry to monitor processes, in meteorology, in medicine (medical thermometer), and in scientific research.
While an individual thermometer is able to measure degrees of hotness, the readings on two thermometers cannot be compared unless they conform to an agreed scale. Today there is an absolute thermodynamic temperature scale. Internationally agreed temperature scales are designed to approximate this closely, based on fixed points and interpolating thermometers. The most recent official temperature scale is the International Temperature Scale of 1990. It extends from 0.65 K (−272.5 °C; −458.5 °F) to approximately 1,358 K (1,085 °C; 1,985 °F).
Sparse and conflicting historical records make it difficult to pinpoint the invention of the thermometer to any single person or date with certitude. In addition, given the many parallel developments in the thermometer's history and its many gradual improvements over time, the instrument is best viewed not as a single invention, but an evolving technology.
In the 3rd century BC, Philo of Byzantium documented his experiment with a tube submerged in a container of liquid on one end and connected to an air-tight, hollow sphere on the other. When air in the sphere is heated with a candle or by exposing it to the sun, expanding air exits the sphere and generates bubbles in the vessel. As air in the sphere cools, a partial vacuum is created, sucking liquid up into the tube. Any changes in the position of the liquid will now indicate whether the air in the sphere is getting hotter or colder.
Translations of Philo's experiment from the original ancient Greek were utilized by Robert Fludd sometime around 1617 and used as the basis for his air thermometer.[2]: 15
In his book, Pneumatics, Hero of Alexandria (10–70 AD) provides a recipe for building a "Fountain which trickles by the Action of the Sun's Rays," a more elaborate version of Philo's pneumatic experiment but which worked on the same principle of heating and cooling air to move water around.[3] Translations of the ancient work Pneumatics were introduced to late 16th century Italy and studied by many, including Galileo Galilei, who had read it by 1594.[2]: 5
Hasler's temperature scale showing degrees of body temperature based on an individual's latitude.
The Greek physician Galen, who practiced in the Roman Empire, is credited with introducing two concepts important to the development of a scale of temperature and the eventual invention of the thermometer. First, he had the idea that hotness or coldness may be measured by "degrees of hot and cold." He also conceived of a fixed reference temperature, a mixture of equal amounts of ice and boiling water, with four degrees of heat above this point and four degrees of cold below it. The 16th-century physician Johann Hasler developed body temperature scales based on Galen's theory of degrees to help him mix the appropriate amount of medicine for patients.[2]: 3
In the late 16th and early 17th centuries, several European scientists, notably Galileo Galilei[4] and Italian physiologist Santorio Santorio,[5] developed devices with an air-filled glass bulb, connected to a tube, partially filled with water. As the air in the bulb warms or cools, the height of the column of water in the tube falls or rises, allowing an observer to compare the current height of the water to previous heights to detect relative changes of the heat in the bulb and its immediate environment. Such devices, with no scale for assigning a numerical value to the height of the liquid, are referred to as thermoscopes because they provide an observable indication of sensible heat (the modern concept of temperature was yet to arise).[2]
The difference between a thermoscope and a thermometer is that the latter has a scale.[6][2]: 4
A thermometer is simply a thermoscope with a scale. ... I propose to regard it as axiomatic that a “meter” must have a scale or something equivalent. ... If this is admitted, the problem of the invention of the thermometer becomes more straightforward; that of the invention of the thermoscope remains as obscure as ever.
— W. E. K. Middleton, A history of the thermometer and its use in meteorology
Given this, Middleton claimed that the possible inventors of the thermometer were Galileo, Santorio, Dutch inventor Cornelis Drebbel, or English physician Robert Fludd.[2]: 5 Though Galileo is often said to be the inventor of the thermometer, no surviving document shows that he actually produced any such instrument.
The first clear diagram of a thermoscope was published in 1617 by Giuseppe Biancani (1566 – 1624);[2]: 10 the first showing a scale and thus constituting a thermometer was by Santorio Santorio in 1625.[5] This was a vertical tube, closed by a bulb of air at the top, with the lower end opening into a vessel of water. The water level in the tube was controlled by the expansion and contraction of the air, so it was what we would now call an air thermometer.[7]
The word thermometer (in its French form) first appeared in 1624 in La Récréation Mathématique by Jean Leurechon, who describes one with a scale of 8 degrees.[8] The word comes from Ancient Greek θερμός (thermós), meaning "warmth", and μέτρον (métron), meaning "measure".
Fifty-degree thermometers from the mid-17th century on exhibit at the Museo Galileo, with black dots representing single degrees and white dots representing 10-degree increments; used to measure atmospheric temperatures
The above instruments suffered from the disadvantage that they were also barometers, i.e. sensitive to air pressure. In 1629, Joseph Solomon Delmedigo, a student of Galileo and Santorio in Padua, published what is apparently the first description and illustration of a sealed liquid-in-glass thermometer. It is described as having a bulb at the bottom of a sealed tube partially filled with brandy. The tube had a numbered scale. Delmedigo did not claim to have invented this instrument. Nor did he name anyone else as its inventor.[9] In about 1654, Ferdinando II de' Medici, Grand Duke of Tuscany (1610–1670) did produce such an instrument, the first modern-style thermometer, dependent on the expansion of a liquid and independent of air pressure.[8] Many other scientists experimented with various liquids and designs of thermometer. However, each inventor and each thermometer was unique — there was no standard scale.
Early attempts at standardization added a single reference point such as the freezing point of water. The use of two references for graduating the thermometer is said to have been introduced by Joachim Dalence in 1668,[10]: 7–8 although Christiaan Huygens (1629–1695) in 1665 had already suggested the use of graduations based on the melting and boiling points of water as standards[11] and, in 1694, Carlo Rinaldini (1615–1698) proposed using them as fixed points along a universal scale divided into degrees.[12][13][10]: 56 In 1701, Isaac Newton (1642–1726/27) proposed a scale of 12 degrees between the melting point of ice and body temperature.[10]: 57–60
The first physician to use thermometer measurements in clinical practice was Herman Boerhaave (1668–1738).[16] In 1866, Sir Thomas Clifford Allbutt (1836–1925) invented a clinical thermometer that produced a body temperature reading in five minutes as opposed to twenty.[17] In 1999, Dr. Francesco Pompei of the Exergen Corporation introduced the world's first temporal artery thermometer, a non-invasive temperature sensor which scans the forehead in about two seconds and provides a medically accurate body temperature.[18][19]
Traditional thermometers were all non-registering thermometers. That is, the thermometer did not hold the temperature reading after it was moved to a place with a different temperature. Determining the temperature of a pot of hot liquid required the user to leave the thermometer in the hot liquid until after reading it. If the non-registering thermometer was removed from the hot liquid, then the temperature indicated on the thermometer would immediately begin changing to reflect the temperature of its new conditions (in this case, the air temperature). Registering thermometers are designed to hold the temperature indefinitely, so that the thermometer can be removed and read at a later time or in a more convenient place. Mechanical registering thermometers hold either the highest or lowest temperature recorded until manually re-set, e.g., by shaking down a mercury-in-glass thermometer, or until an even more extreme temperature is experienced. Electronic registering thermometers may be designed to remember the highest or lowest temperature, or to remember whatever temperature was present at a specified point in time.
Thermometers increasingly use electronic means to provide a digital display or input to a computer.
Various thermometers from the 19th century.
Comparison of the Celsius and Fahrenheit scales
Thermometers may be described as empirical or absolute. Absolute thermometers are calibrated numerically by the thermodynamic absolute temperature scale. Empirical thermometers are not in general necessarily in exact agreement with absolute thermometers as to their numerical scale readings, but to qualify as thermometers at all they must agree with absolute thermometers and with each other in the following way: given any two bodies isolated in their separate respective thermodynamic equilibrium states, all thermometers agree as to which of the two has the higher temperature, or that the two have equal temperatures.[20] For any two empirical thermometers, this does not require that the relation between their numerical scale readings be linear, but it does require that relation to be strictly monotonic.[21] This is a fundamental character of temperature and thermometers.[22][23][24]
As it is customarily stated in textbooks, taken alone, the so-called "zeroth law of thermodynamics" fails to deliver this information, but the statement of the zeroth law of thermodynamics by James Serrin in 1977, though rather mathematically abstract, is more informative for thermometry: "Zeroth Law – There exists a topological line M which serves as a coordinate manifold of material behaviour. The points of the manifold M are called 'hotness levels', and M is called the 'universal hotness manifold'."[25] To this information there needs to be added a sense of greater hotness; this sense can be had, independently of calorimetry, of thermodynamics, and of properties of particular materials, from Wien's displacement law of thermal radiation: the temperature of a bath of thermal radiation is proportional, by a universal constant, to the frequency of the maximum of its frequency spectrum; this frequency is always positive, but can have values that tend to zero. Another way of identifying hotter as opposed to colder conditions is supplied by Planck's principle, that when a process of isochoric adiabatic work is the sole means of change of internal energy of a closed system, the final state of the system is never colder than the initial state; except for phase changes with latent heat, it is hotter than the initial state.[26][27][28]
There are several principles on which empirical thermometers are built, as listed in the section of this article entitled "Primary and secondary thermometers". Several such principles are essentially based on the constitutive relation between the state of a suitably selected particular material and its temperature. Only some materials are suitable for this purpose, and they may be considered as "thermometric materials". Radiometric thermometry, in contrast, can be only slightly dependent on the constitutive relations of materials. In a sense then, radiometric thermometry might be thought of as "universal". This is because it rests mainly on a universality character of thermodynamic equilibrium, that it has the universal property of producing blackbody radiation.
Bi-metallic stem thermometers used to measure the temperature of steamed milk
Bi-metallic thermometer for cooking and baking in an oven
There are various kinds of empirical thermometer based on material properties.
Many empirical thermometers rely on the constitutive relation between pressure, volume and temperature of their thermometric material. For example, mercury expands when heated.
To be used for its relation between pressure, volume, and temperature, a thermometric material must have three properties:
(1) Its heating and cooling must be rapid. That is to say, when a quantity of heat enters or leaves a body of the material, the material must expand or contract to its final volume or reach its final pressure and must reach its final temperature with practically no delay; some of the heat that enters can be considered to change the volume of the body at constant temperature, and is called the latent heat of expansion at constant temperature; and the rest of it can be considered to change the temperature of the body at constant volume, and is called the specific heat at constant volume. Some materials do not have this property, and take some time to distribute the heat between temperature and volume change.[29]
(2) Its heating and cooling must be reversible. That is to say, the material must be able to be heated and cooled indefinitely often by the same increment and decrement of heat, and still return to its original pressure, volume and temperature every time. Some plastics do not have this property;[30]
(3) Its heating and cooling must be monotonic.[21][31] That is to say, throughout the range of temperatures for which it is intended to work,
(a) at a given fixed pressure,
either (i) the volume increases when the temperature increases, or else (ii) the volume decreases when the temperature increases;
but not (i) for some temperatures and (ii) for others; or
(b) at a given fixed volume,
either (i) the pressure increases when the temperature increases, or else (ii) the pressure decreases when the temperature increases;
but not (i) for some temperatures and (ii) for others.
At temperatures around 4 °C, water does not have property (3) and is said to behave anomalously in this respect; thus water cannot be used as a material for this kind of thermometry for temperature ranges near 4 °C.[23][32][33][34][35]
Gases, on the other hand, all have properties (1), (2), and (3)(a)(i) and (3)(b)(i). Consequently, they are suitable thermometric materials, and that is why they were important in the development of thermometry.[36]
According to Preston (1894/1904), Regnault found constant pressure air thermometers unsatisfactory, because they needed troublesome corrections. He therefore built a constant volume air thermometer.[37] Constant volume thermometers do not provide a way to avoid the problem of anomalous behaviour like that of water at approximately 4 °C.[35]
Planck's law very accurately quantitatively describes the power spectral density of electromagnetic radiation, inside a rigid walled cavity in a body made of material that is completely opaque and poorly reflective, when it has reached thermodynamic equilibrium, as a function of absolute thermodynamic temperature alone. A small enough hole in the wall of the cavity emits near enough blackbody radiation of which the spectral radiance can be precisely measured. The walls of the cavity, provided they are completely opaque and poorly reflective, can be of any material indifferently.
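As a numerical illustration of the law described above, the following Python sketch evaluates the Planck spectral radiance of a blackbody cavity at a chosen wavelength and temperature; the function name and the example values are illustrative, not drawn from any standard library.

```python
import math

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def planck_spectral_radiance(wavelength_m: float, temperature_k: float) -> float:
    """Spectral radiance B(lambda, T) of a blackbody, in W * sr^-1 * m^-3."""
    prefactor = 2.0 * H * C**2 / wavelength_m**5
    exponent = H * C / (wavelength_m * K_B * temperature_k)
    return prefactor / math.expm1(exponent)

# Example: radiance of a 1000 K cavity at a wavelength of 3 micrometres
print(planck_spectral_radiance(3e-6, 1000.0))
```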
A thermometer is called primary or secondary based on how the raw physical quantity it measures is mapped to a temperature. As summarized by Kauppinen et al., "For primary thermometers the measured property of matter is known so well that temperature can be calculated without any unknown quantities. Examples of these are thermometers based on the equation of state of a gas, on the velocity of sound in a gas, on the thermal noise voltage or current of an electrical resistor, and on the angular anisotropy of gamma ray emission of certain radioactive nuclei in a magnetic field."[38]
In contrast, "Secondary thermometers are most widely used because of their convenience. Also, they are often much more sensitive than primary ones. For secondary thermometers knowledge of the measured property is not sufficient to allow direct calculation of temperature. They have to be calibrated against a primary thermometer at least at one temperature or at a number of fixed temperatures. Such fixed points, for example, triple points and superconducting transitions, occur reproducibly at the same temperature."[38]
Thermometers can be calibrated either by comparing them with other calibrated thermometers or by checking them against known fixed points on the temperature scale. The best known of these fixed points are the melting and boiling points of pure water. (Note that the boiling point of water varies with pressure, so this must be controlled.)
The traditional way of putting a scale on a liquid-in-glass or liquid-in-metal thermometer was in three stages:
Immerse the sensing portion in a stirred mixture of pure ice and water at atmospheric pressure and mark the point indicated when it had come to thermal equilibrium.
Immerse the sensing portion in steam above boiling water at standard atmospheric pressure and mark the point indicated when it had again come to thermal equilibrium.
Divide the distance between these two marks into equal portions according to the temperature scale being used.
Other fixed points used in the past include body temperature (of a healthy adult male), which Fahrenheit originally used as his upper fixed point (96 °F (35.6 °C), chosen to be a number divisible by 12), and the lowest temperature given by a mixture of salt and ice, which was originally the definition of 0 °F (−17.8 °C).[39] (This is an example of a frigorific mixture.) As body temperature varies, the Fahrenheit scale was later changed to use an upper fixed point of boiling water at 212 °F (100 °C).[40]
These have now been replaced by the defining points in the International Temperature Scale of 1990, though in practice the melting point of water is more commonly used than its triple point, the latter being more difficult to manage and thus restricted to critical standard measurement. Nowadays manufacturers will often use a thermostat bath or solid block where the temperature is held constant relative to a calibrated thermometer. Other thermometers to be calibrated are put into the same bath or block and allowed to come to equilibrium, then the scale is marked, or any deviation from the instrument scale is recorded.[41] For many modern devices, calibration consists of supplying a set of values used to process an electronic signal into a temperature reading.
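As a minimal sketch of the two-fixed-point procedure and the linear interpolation described above, the Python function below maps a raw instrument reading (a column height, voltage, or resistance) to degrees Celsius; the function name and the example voltages are hypothetical.

```python
def calibrate_two_point(raw_ice: float, raw_steam: float):
    """Return a converter from a raw reading to degrees Celsius, assuming
    linear behaviour between the ice point (0 °C) and the steam point (100 °C)."""
    span = raw_steam - raw_ice
    def to_celsius(raw: float) -> float:
        return 100.0 * (raw - raw_ice) / span
    return to_celsius

# Example: a sensor reads 0.53 V in ice water and 1.87 V in steam
to_c = calibrate_two_point(0.53, 1.87)
print(round(to_c(1.20), 1))  # a reading between the two fixed points
```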
The "Boyce MotoMeter" radiator cap on a 1913 Car-Nation automobile, used to measure temperature of vapor in 1910s and 1920s cars.Separated columns are often a problem in both alcohol and mercury thermometers, and they can make a temperature reading inaccurate.
The precision or resolution of a thermometer is simply the fraction of a degree to which a reading can be made. For high-temperature work it may only be possible to measure to the nearest 10 °C or more. Clinical thermometers and many electronic thermometers are usually readable to 0.1 °C. Special instruments can give readings to one thousandth of a degree.[42] However, this precision does not mean the reading is true or accurate; it only means that very small changes can be observed.
A thermometer calibrated to a known fixed point is accurate (i.e. gives a true reading) at that point. The invention of the technology to measure temperature led to the creation of scales of temperature.[43] In between fixed calibration points, interpolation is used, usually linear.[41] This may give significant differences between different types of thermometer at points far away from the fixed points. For example, the expansion of mercury in a glass thermometer is slightly different from the change in resistance of a platinum resistance thermometer, so these two will disagree slightly at around 50 °C.[44] There may be other causes due to imperfections in the instrument, e.g. in a liquid-in-glass thermometer if the capillary tube varies in diameter.[44]
For many purposes reproducibility is important. That is, does the same thermometer give the same reading for the same temperature (or do replacement or multiple thermometers give the same reading)? Reproducible temperature measurement means that comparisons are valid in scientific experiments and industrial processes are consistent. Thus if the same type of thermometer is calibrated in the same way its readings will be valid even if it is slightly inaccurate compared to the absolute scale.
An example of a reference thermometer used to check others to industrial standards would be a platinum resistance thermometer with a digital display to 0.1 °C (its precision) which has been calibrated at 5 points against national standards (−18, 0, 40, 70, 100 °C) and which is certified to an accuracy of ±0.2 °C.[45]
According to British Standards, correctly calibrated, used and maintained liquid-in-glass thermometers can achieve a measurement uncertainty of ±0.01 °C in the range 0 to 100 °C, and a larger uncertainty outside this range: ±0.05 °C up to 200 or down to −40 °C, ±0.2 °C up to 450 or down to −80 °C.[46]
Some compounds exhibit thermochromism, changing colour at distinct temperatures. By tuning the phase transition temperatures of a series of substances, the temperature can be quantified in discrete increments, a form of digitization. This is the basis for the liquid crystal thermometer.
Band edge thermometry (BET)
Band edge thermometry (BET) takes advantage of the temperature-dependence of the band gap of semiconductor materials to provide very precise optical (i.e. non-contact) temperature measurements.[48] BET systems require a specialized optical system, as well as custom data analysis software.[49][50]
Blackbody radiation
An infrared thermometer is a kind of pyrometer (bolometer). All objects above absolute zero emit blackbody radiation, whose intensity and spectrum depend on temperature. This property is the basis for the pyrometer, the infrared thermometer, and thermography. It has the advantage of remote temperature sensing: unlike most thermometers, it does not require contact or even close proximity. At higher temperatures, blackbody radiation becomes visible and is described by the colour temperature; examples include a glowing heating element or an approximation of a star's surface temperature.
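The inference made by a simple pyrometer can be sketched with the Stefan-Boltzmann law: total radiated power grows as the fourth power of temperature, so a measured irradiance can be inverted to a temperature once an emissivity is assumed. The Python snippet below is an idealized illustration (the names and the fixed emissivity are assumptions) that ignores reflected background radiation and detector band limits.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W * m^-2 * K^-4

def brightness_temperature(radiated_power_w_per_m2: float, emissivity: float = 0.95) -> float:
    """Invert M = emissivity * sigma * T**4 to estimate surface temperature in kelvin."""
    return (radiated_power_w_per_m2 / (emissivity * SIGMA)) ** 0.25

# Example: a surface radiating about 460 W/m^2 with emissivity 0.95 -> roughly 304 K
print(round(brightness_temperature(460.0)))
```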
Thermocouples are useful over a wide temperature range, from cryogenic temperatures to over 1000 °C, but typically have an error of ±0.5 to 1.5 °C.
Silicon bandgap temperature sensors are commonly found packaged in integrated circuits with an accompanying ADC and an interface such as I2C. Typically they are specified to work within about −50 to 150 °C with accuracies in the ±0.25 to 1 °C range, which can be improved by binning.[51][52]
The NMR chemical shift is temperature dependent. This property is used to calibrate the thermostat of NMR probes, usually using methanol or ethylene glycol.[53][54] This can potentially be problematic for internal standards, which are usually assumed to have a defined chemical shift (e.g. 0 ppm for TMS) but in fact exhibit a temperature dependence.[55]
Thermometers utilize a range of physical effects to measure temperature. Temperature sensors are used in a wide variety of scientific and engineering applications, especially measurement systems. Temperature systems are primarily either electrical or mechanical, occasionally inseparable from the system which they control (as in the case of a mercury-in-glass thermometer). Thermometers are used in roadways in cold weather climates to help determine if icing conditions exist. Indoors, thermistors are used in climate control systems such as air conditioners, freezers, heaters, refrigerators, and water heaters.[58] Galileo thermometers are used to measure indoor air temperature, due to their limited measurement range.
Nanothermometry is an emerging research field dealing with the measurement of temperature on the sub-micrometric scale. Conventional thermometers cannot measure the temperature of an object smaller than a micrometre, so new methods and materials have to be used; nanothermometry is used in such cases. Nanothermometers are classified as luminescent thermometers (if they use light to measure temperature) and non-luminescent thermometers (systems where thermometric properties are not directly related to luminescence).[60]
Thermometers are important in food safety, where food kept at temperatures between 41 and 135 °F (5 and 57 °C) can be prone to potentially harmful levels of bacterial growth after several hours, which could lead to foodborne illness. This includes monitoring refrigeration temperatures and maintaining temperatures in foods being served under heat lamps or in hot water baths.[58]
Cooking thermometers are important for determining whether a food is properly cooked. In particular, meat thermometers are used to aid in cooking meat to a safe internal temperature while preventing overcooking. They commonly use either a bimetallic coil, or a thermocouple or thermistor with a digital readout.
Candy thermometers are used to aid in achieving a specific water content in a sugar solution based on its boiling temperature.
^ Knake, Maria (April 2011). "The Anatomy of a Liquid-in-Glass Thermometer". AASHTO re:source, formerly AMRL (aashtoresource.org). Retrieved 4 August 2018. For decades mercury thermometers were a mainstay in many testing laboratories. If used properly and calibrated correctly, certain types of mercury thermometers can be incredibly accurate. Mercury thermometers can be used in temperatures ranging from about −38 to 350 °C. The use of a mercury-thallium mixture can extend the low-temperature usability of mercury thermometers to −56 °C. (...) Nevertheless, few liquids have been found to mimic the thermometric properties of mercury in repeatability and accuracy of temperature measurement. Toxic though it may be, when it comes to LiG [Liquid-in-Glass] thermometers, mercury is still hard to beat.
^ Benedict, Robert P. (1984). Fundamentals of Temperature, Pressure, and Flow Measurements. Wiley. pp. 4–5. ISBN 978-0-471-89383-7.
^ Howarth, R.J. (2024). "The Heat of the Earth". The Emergence of Geophysics: Journeys into the Twentieth Century. Vol. 60. London: Geological Society Memoirs. p. 300. doi:10.1144/M60-2022-32.
^ R.P. Benedict (1984). Fundamentals of Temperature, Pressure, and Flow Measurements, 3rd ed., ISBN 0-471-89383-8, page 6.
^ Mach, E. (1900). Die Principien der Wärmelehre. Historisch-kritisch entwickelt, Johann Ambrosius Barth, Leipzig, section 22, pages 56–57. English translation edited by McGuinness, B. (1986), Principles of the Theory of Heat, Historically and Critically Elucidated, D. Reidel Publishing, Dordrecht, ISBN 90-277-2206-4, section 5, pp. 48–49, section 22, pages 60–61.
^ a b Truesdell, C.A. (1980). The Tragicomical History of Thermodynamics, 1822–1854, Springer, New York, ISBN 0-387-90403-4.
^ Serrin, J. (1986). Chapter 1, 'An Outline of Thermodynamical Structure', pages 3–32, especially page 6, in New Perspectives in Thermodynamics, edited by J. Serrin, Springer, Berlin, ISBN 3-540-15931-2.
^ Serrin, J. (1978). The concepts of thermodynamics, in Contemporary Developments in Continuum Mechanics and Partial Differential Equations. Proceedings of the International Symposium on Continuum Mechanics and Partial Differential Equations, Rio de Janeiro, August 1977, edited by G.M. de La Penha, L.A.J. Medeiros, North-Holland, Amsterdam, ISBN 0-444-85166-6, pages 411–451.
^ Planck, M. (1926). Über die Begründung des zweiten Hauptsatzes der Thermodynamik, S.-B. Preuß. Akad. Wiss. phys. math. Kl.: 453–463.
^ Buchdahl, H.A. (1966). The Concepts of Classical Thermodynamics, Cambridge University Press, London, pp. 42–43.
^ Truesdell, C., Bharatha, S. (1977). The Concepts and Logic of Classical Thermodynamics as a Theory of Heat Engines. Rigorously Constructed upon the Foundation Laid by S. Carnot and F. Reech, Springer, New York, ISBN 0-387-07971-8, page 20.
^ Ziegler, H. (1983). An Introduction to Thermomechanics, North-Holland, Amsterdam, ISBN 0-444-86503-9.
^ Landsberg, P.T. (1961). Thermodynamics with Quantum Statistical Illustrations, Interscience Publishers, New York, page 17.
^ Maxwell, J.C. (1872). Theory of Heat, third edition, Longmans, Green, and Co., London, pages 232–233.
^ Lewis, G.N., Randall, M. (1923/1961). Thermodynamics, second edition revised by K.S. Pitzer, L. Brewer, McGraw-Hill, New York, pages 378–379.
^ Thomsen, J.S.; Hartka, T.J. (1962). "Strange Carnot cycles; thermodynamics of a system with a density extremum". Am. J. Phys. 30 (1): 26–33. Bibcode:1962AmJPh..30...26T. doi:10.1119/1.1941890.
^ a b Truesdell, C., Bharatha, S. (1977). The Concepts and Logic of Classical Thermodynamics as a Theory of Heat Engines. Rigorously Constructed upon the Foundation Laid by S. Carnot and F. Reech, Springer, New York, ISBN 0-387-07971-8, pages 9–10, 15–18, 36–37.
^ a b R.P. Benedict (1984). Fundamentals of Temperature, Pressure, and Flow Measurements, 3rd ed., ISBN 0-471-89383-8, chapter 11, "Calibration of Temperature Sensors".
^ "Band-edge thermometry". Molecular Beam Epitaxy Research Group. 2014-08-19. Retrieved 2019-08-14.
^ Johnson, Shane (May 1998). "In situ temperature control of molecular beam epitaxy growth using band-edge thermometry". Journal of Vacuum Science & Technology B: Microelectronics and Nanometer Structures. 16 (3): 1502–1506. Bibcode:1998JVSTB..16.1502J. doi:10.1116/1.589975. hdl:2286/R.I.27894.
^ Findeisen, M.; Brand, T.; Berger, S. (February 2007). "A 1H-NMR thermometer suitable for cryoprobes". Magnetic Resonance in Chemistry. 45 (2): 175–178. doi:10.1002/mrc.1941. PMID 17154329. S2CID 43214876.
Middleton, W.E.K. (1966). A history of the thermometer and its use in meteorology. Baltimore: Johns Hopkins Press. Reprinted ed. 2002, ISBN 0-8018-7153-0.
A thermometer is an instrument designed to measure temperature by detecting and quantifying changes in physical properties, such as the expansion or contraction of liquids like mercury or alcohol, or variations in electrical resistance or infrared radiation emitted by an object.[1][2] The development of thermometers traces back to the early 17th century, evolving from rudimentary thermoscopes—devices that indicated temperature changes without numerical scales—to precise instruments with standardized scales. Key milestones include Galileo Galilei's 1610 invention of an alcohol-based thermoscope, Ferdinand II de' Medici's 1654 sealed alcohol thermometer, and Gabriel Fahrenheit's mercury thermometer (invented 1714) with the Fahrenheit scale (proposed 1724), which marked the transition to reliable quantitative measurement.[1] Later advancements, such as Anders Celsius's 1742 centigrade scale and Thomas Clifford Allbutt's 1867 clinical thermometer, expanded their utility in medicine and science.[1]

Thermometers operate on diverse principles and come in various types to suit different applications, from everyday use to specialized scientific measurements. Liquid-in-glass thermometers, historically common, rely on the thermal expansion of liquids within a capillary tube, though they have largely been replaced due to hazards like mercury toxicity.[3][2] Digital thermometers, using thermistors or thermocouples, convert resistance or voltage changes into temperature readings and offer advantages like higher accuracy (up to ±0.05 °C) and faster response times.[3] Non-contact options, such as infrared thermometers, detect thermal radiation for surface measurements, while specialized variants like fiber-optic sensors enable distributed monitoring in challenging environments.[3][2]

These devices are essential across fields including meteorology, medicine, and industry, where accurate temperature data informs everything from weather forecasting to diagnosing fevers (normal human body temperature is approximately 37 °C or 98.6 °F).[1] Modern standards, such as the Celsius, Fahrenheit, and Kelvin scales, ensure global consistency, with the Kelvin scale defining absolute zero at 0 K for thermodynamic applications.[2]
Introduction
Definition and Purpose
A thermometer is an instrument designed to measure temperature by detecting and quantifying changes in the physical properties of a substance or system in response to thermal variations, converting these changes into a numerical value on a calibrated scale.[1] This device enables the objective assessment of thermal states, distinguishing it from subjective empirical evaluations based on human sensation and providing instead a standardized, absolute measurement essential for consistency across observations.[4]

The core purpose of a thermometer is to facilitate the precise quantification of hotness or coldness in diverse contexts, including scientific experiments, industrial monitoring, medical assessments, and routine environmental checks, thereby supporting informed decision-making and safety protocols.[5][6] By translating thermal phenomena into reproducible data, thermometers underpin advancements in fields ranging from quantum physics to manufacturing, while also aiding everyday tasks like cooking or weather tracking.[5]

At its foundation, a thermometer relies on the predictable variation of an observable property—such as volume expansion, electrical resistance, or spectral emission—with temperature, allowing the correlation of these changes to a defined thermal scale.[7] Essential components include a sensing element that responds to thermal input, a graduated scale for numerical interpretation, and a display for user-readable output, ensuring the device's functionality across applications.[8] These elements produce readings aligned with established temperature scales, such as Celsius or Kelvin.[8]
Temperature Scales
Temperature scales provide standardized systems for measuring thermal energy, enabling consistent quantification of temperature across scientific, industrial, and everyday applications. These scales are defined relative to fixed points, such as phase transitions of water, and absolute references like zero kinetic energy. The primary scales in use today are the Kelvin and Celsius scales in the International System of Units (SI), alongside the Fahrenheit scale in certain regions, with historical scales like Rankine and Réaumur offering additional context for thermodynamic measurements.[9]

The Kelvin scale is the SI base unit of thermodynamic temperature, defined such that the Boltzmann constant is exactly 1.380 649 × 10⁻²³ J/K, establishing 0 K as absolute zero—the theoretical point where molecular motion ceases. The degree size matches that of the Celsius scale, with the triple point of water fixed at exactly 273.16 K, serving as a fundamental reference for calibration. This absolute scale avoids negative values and is essential for equations in physics and chemistry involving temperature.[10][9]

The Celsius scale, denoted °C, is a relative scale originally defined by assigning 0 °C to the freezing point of water at standard atmospheric pressure and 100 °C to its boiling point, dividing the interval into 100 equal degrees. Since 2019, it is formally tied to the Kelvin scale, where 0 °C equals 273.15 K, maintaining the same interval size as one kelvin. This scale's practical fixed points facilitate everyday and laboratory use, though modern calibrations rely on the triple point for precision.[9]

The Fahrenheit scale, denoted °F, sets the freezing point of water at 32 °F and the boiling point at 212 °F under standard pressure, creating 180 divisions between these points—thus, one Fahrenheit degree is 5/9 the size of a Celsius degree. Developed for empirical consistency in early thermometry, it remains prevalent in the United States for non-scientific contexts.[9]

Other scales include the Rankine scale (°R), an absolute counterpart to Fahrenheit where 0 °R corresponds to absolute zero and the degree size equals one Fahrenheit degree; for instance, the freezing point of water is 491.67 °R.[11] The Réaumur scale (°Re or °Ré), a historical system, defines water's freezing point as 0 °Ré and its boiling point as 80 °Ré, with each degree being 1.25 Celsius degrees, once used in European engineering but now obsolete.[12][13]

Fixed points are critical for defining and calibrating these scales, with the triple point of water—where solid, liquid, and vapor phases coexist in equilibrium at 0.01 °C (273.16 K or 32.018 °F)—serving as the modern international standard due to its reproducibility and independence from pressure variations. This point replaced earlier reliance on the ice point (0 °C) and steam point (100 °C) for greater accuracy in the International Temperature Scale of 1990 (ITS-90).[9]

Conversions between scales derive from their interval ratios and zero-point offsets. For Celsius to Kelvin, add 273.15, as the scales share identical degree sizes and 0 °C is defined as 273.15 K:

T(K) = t(°C) + 273.15

This offset stems from the triple point assignment, where 0.01 °C = 273.16 K, approximating the historical ice-point relation.[9]

The Fahrenheit-to-Celsius conversion accounts for the 1.8:1 degree ratio (from 180 °F spanning 100 °C) and the 32 °F offset at the ice point. Subtract 32 °F to align zeros, then divide by 1.8:

t(°C) = (t(°F) − 32) / 1.8

Conversely, multiply by 1.8 and add 32 for Celsius to Fahrenheit:

t(°F) = t(°C) × 1.8 + 32

These derive directly from the fixed points: the boiling-to-freezing difference yields the ratio (212 − 32) °F = 180 °F for 100 °C, so 9/5 = 1.8 °F/°C.[9]

For Rankine, add 459.67 to Fahrenheit values, as 0 °F = 459.67 °R from absolute-zero alignment. Réaumur conversions use its 0.8:1 ratio to Celsius (80 °Ré for 100 °C), so multiply Celsius by 0.8:

t(°Ré) = t(°C) × 0.8

These transformations ensure interoperability across scales in thermometric applications.[11][13][12]
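The conversions above can be collected into a small set of Python helpers; the function names are illustrative and the constants are exactly those quoted in this section.

```python
def celsius_to_kelvin(t_c: float) -> float:
    return t_c + 273.15

def fahrenheit_to_celsius(t_f: float) -> float:
    return (t_f - 32.0) / 1.8

def celsius_to_fahrenheit(t_c: float) -> float:
    return t_c * 1.8 + 32.0

def fahrenheit_to_rankine(t_f: float) -> float:
    return t_f + 459.67

def celsius_to_reaumur(t_c: float) -> float:
    return t_c * 0.8

# Fixed-point check: water boils at 100 °C = 212 °F = 373.15 K = 80 °Ré
print(celsius_to_fahrenheit(100.0), celsius_to_kelvin(100.0), celsius_to_reaumur(100.0))
```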
History
Ancient and Early Developments
The earliest efforts to conceptualize and observe temperature changes date back to ancient civilizations, where qualitative assessments predominated before the development of quantitative devices. In metallurgy, practitioners in ancient China and India relied on visual cues, such as the color of heated metals, to gauge hotness during forging and smelting processes; for instance, terms like "red heat" indicated specific temperature ranges suitable for working wootz steel in India.[14] Evaporative cooling techniques, such as using wet materials to lower ambient heat, also served as rudimentary methods to sense and manage temperature differences in these contexts.[15]

In the Hellenistic period, Greek engineers made initial strides toward instrumental measurement. Philo of Byzantium (c. 280–220 BC) described a thermoscope-like device that exploited air expansion: a hollow lead sphere connected by a tube to a water vessel, where heating caused the air to expand and displace the water level, demonstrating temperature-induced volume changes.[1] This apparatus, detailed in Philo's Pneumatics, marked an early recognition of thermal expansion as a detectable phenomenon.[16]

Hero of Alexandria (c. 10–70 AD) refined such concepts in his Pneumatica, employing a similar open tube system with water to make temperature variations more visible through fluid displacement, though without numerical calibration.[17] These devices functioned as qualitative indicators, showing relative hotness or coldness via mechanical effects rather than precise measurement.

By the late 16th century, progress shifted toward quantification in Europe. Galileo Galilei (c. 1593) developed a water-filled thermoscope—a glass tube with a bulb inverted into a water basin—allowing observation of temperature variations through liquid level shifts.[18] He introduced one of the first fixed-point scales, marking approximately 100 arbitrary divisions between the ice point (as a cold reference) and human body temperature (as a warm reference), enabling comparative readings despite inconsistencies.[19]

These ancient and early prototypes shared critical limitations as open systems: they were highly sensitive to atmospheric pressure fluctuations, which altered fluid levels independently of temperature, rendering them unreliable for absolute measurements and distinguishing them from true sealed thermometers.[18]
Renaissance and Standardization Efforts
Building upon the qualitative thermoscopes of antiquity, the Renaissance period marked a pivotal shift toward quantitative temperature measurement through the development of sealed instruments and the introduction of numerical scales. In 1612, Italian physician Santorio Santorio adapted the thermoscope for clinical use, applying the first numerical scale to track fever in patients.[20] This innovation transformed the instrument from a mere indicator of heat expansion into a tool for precise medical observation, emphasizing its role in quantifying bodily temperatures.

A key advancement in reliability came with the invention of the sealed liquid-in-glass thermometer by Ferdinando II de' Medici, Grand Duke of Tuscany, in 1654. By enclosing alcohol within a glass bulb and stem, hermetically sealing both ends, Ferdinando eliminated the influence of atmospheric pressure variations that plagued open thermoscopes, enabling more consistent readings across different conditions.[1] Concurrently, early gas-based thermometers emerged, with French physicist Guillaume Amontons developing an air thermometer in the late 17th century—around 1699—that measured temperature via pressure changes in a constant volume of air, laying groundwork for later constant-volume gas thermometry.[21]

Efforts toward standardization began in the early 17th century, as physicians and scientists sought uniform scales to facilitate comparable measurements. French doctor Jean Rey constructed the first liquid-expansion thermometer using water around 1631, representing an initial step toward scalable designs, though it remained unsealed and lacked a formalized division.[22] By 1714, German instrument maker Daniel Gabriel Fahrenheit introduced the mercury-in-glass thermometer, which offered greater precision due to mercury's uniform expansion, and in 1724 proposed his scale with fixed points at 32° for water's freezing and 212° for boiling, calibrated against a brine mixture at 0°.[23] Swedish astronomer Anders Celsius advanced this further in 1742 by devising the centigrade scale for his mercury thermometers, initially setting 0° at water's boiling point and 100° at freezing (later inverted), using the ice and steam points as anchors for reproducibility. Despite these innovations, early standardization faltered without international consensus, as varying fixed points and divisions—such as those based on human body temperature or arbitrary gradations—hindered widespread adoption until later refinements.
Modern Precision Advancements
In the mid-19th century, precision thermometry advanced significantly with William Thomson's (later Lord Kelvin) proposal of an absolute temperature scale in 1848, based on Carnot's thermodynamic principles, which defined temperature independently of material properties and established zero as the point of no thermal motion. This scale provided a theoretical foundation for accurate measurements, influencing subsequent instrument designs by emphasizing reproducibility and thermodynamic consistency.[24]

A key practical innovation came in 1887 when Hugh Longbourne Callendar developed the platinum resistance thermometer at the Cavendish Laboratory, demonstrating that platinum's electrical resistance varies predictably and nearly linearly with temperature, enabling stable and reproducible measurements up to 500 °C with precision to 1 part in 10,000. Callendar's design, detailed in his experiments on resistance as a temperature measure, proved superior to gas thermometers for industrial applications due to its portability and minimal hysteresis, facilitating accurate calibration and widespread adoption in engineering by the early 20th century.[25][26]

The 20th century saw further milestones in thermoelectric thermometry, building on Thomas Seebeck's 1821 discovery of the thermoelectric effect, where a temperature difference across dissimilar metals generates a voltage. By 1910, quantitative characterization of bismuth alloys such as those with antimony, tin, and tellurium enabled practical thermocouples for industrial use, with commercial standardization for high-temperature monitoring in manufacturing and power generation.[27]

Non-contact methods advanced with infrared pyrometers in the 1920s and 1930s; Hungarian physicist Kálmán Tihanyi's 1929 patent for an infrared camera laid groundwork for thermal imaging, while the first dedicated infrared thermometer emerged in 1931, allowing remote measurement of hot objects without physical contact, crucial for metallurgy and wartime applications. Ratio pyrometers, developed commercially by 1939, improved accuracy by comparing infrared intensities at multiple wavelengths, reducing errors from emissivity variations.[28]

Post-2000 developments integrated microelectromechanical systems (MEMS) into digital thermometers, enabling compact, low-power devices with resolutions below 0.1 °C for biomedical and consumer uses, as reviewed in advancements leveraging silicon microstructures for thermal sensing in healthcare monitoring.[29]

Quantum advancements in the 2020s have introduced nitrogen-vacancy (NV) centers in diamond as nanoscale thermometers, offering sub-micron spatial resolution and sensitivities down to millikelvin changes via optically detected magnetic resonance shifts, with applications in cellular biology and microelectronics thermal mapping.[30]

Emerging in the 2010s, fiber-optic thermometers for Internet of Things (IoT) applications utilize fluorescence decay or interferometry in optical fibers to enable distributed, EMI-resistant sensing over kilometers, supporting smart grids and environmental monitoring with accuracies of ±0.5 °C. Complementing these, wireless IoT temperature sensors, driven by low-power wide-area networks like LoRaWAN, proliferated for remote data logging in agriculture and logistics, achieving battery lives exceeding five years while integrating with cloud analytics for real-time alerts.[31]
Physical Principles
Thermometric Properties of Materials
Thermometric properties refer to the measurable physical characteristics of materials that vary predictably and reproducibly with temperature, serving as the foundation for temperature sensing in thermometers. These properties include changes in volume, length, electrical resistance, voltage generation, and phase transitions, which allow materials to indicate temperature through observable or quantifiable alterations. Selection of materials depends on factors such as sensitivity (the magnitude of property change per unit temperature), operational range, hysteresis (discrepancy in readings during heating versus cooling), and long-term stability, with solids often preferred for mechanical robustness and gases for high accuracy in idealized conditions despite challenges in thermal equilibration.[32][33]

Thermal expansion is a key thermometric property exploited in liquid-based thermometers, where substances like mercury and alcohol increase in volume nearly linearly with temperature. The change in length ΔL of a material is given by the formula

ΔL = αLΔT,

where α is the linear thermal expansion coefficient (approximately 10⁻⁴ K⁻¹ for liquids), L is the original length, and ΔT is the temperature change; this volumetric expansion in confined liquids produces a visible rise in a capillary tube.[32][33]

Electrical properties provide precise thermometric responses, particularly through resistance variations in metals. For platinum, widely used due to its stability, the resistance R changes as

R = R0(1 + αΔT),

where R0 is the resistance at a reference temperature and α ≈ 0.00385 K⁻¹; this positive temperature coefficient enables accurate resistance temperature detectors (RTDs).[32] The Seebeck effect in thermocouples generates a voltage ΔV across junctions of dissimilar metals proportional to the temperature difference, expressed as

ΔV = αΔT,

with α (the Seebeck coefficient) around 40 μV/K for common types like chromel-alumel, allowing measurement over wide ranges from cryogenic to high temperatures.[32][33]

Phase changes offer visual or mechanical indications of temperature through structural alterations. Bimetallic strips consist of two bonded metals with differing expansion coefficients, such as invar and brass, causing bending upon heating due to differential expansion rates, which can deflect a pointer or trigger a switch.[34] Liquid crystals exhibit thermochromism, changing color reversibly as temperature alters their molecular helical structure and light diffraction properties, enabling non-contact displays for surface temperature mapping.[35]

Material selection prioritizes high sensitivity for fine resolution (e.g., thermocouples at 40 μV/K), broad range (e.g., −200 to 1300 °C for certain alloys), low hysteresis to ensure repeatability, and stability against aging or contamination; solids like platinum provide excellent long-term consistency, while gases excel in theoretical precision for constant-volume applications but require careful handling due to lower thermal conductivity.[32][33]
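As a sketch of the three relations quoted above (thermal expansion, platinum resistance, and Seebeck voltage), the Python snippet below uses the representative coefficients given in the text; the names and sample values are illustrative, not calibration-grade data.

```python
def length_change(length_m: float, alpha_per_k: float, delta_t_k: float) -> float:
    """Linear thermal expansion: delta_L = alpha * L * delta_T."""
    return alpha_per_k * length_m * delta_t_k

def platinum_resistance(r0_ohm: float, delta_t_k: float, alpha: float = 0.00385) -> float:
    """First-order RTD relation: R = R0 * (1 + alpha * delta_T)."""
    return r0_ohm * (1.0 + alpha * delta_t_k)

def seebeck_voltage(delta_t_k: float, seebeck_uv_per_k: float = 40.0) -> float:
    """Thermocouple output in microvolts: delta_V = S * delta_T."""
    return seebeck_uv_per_k * delta_t_k

# Examples using the representative values quoted in the text
print(platinum_resistance(100.0, 50.0))  # a Pt100 warmed by 50 K -> about 119 ohm
print(seebeck_voltage(50.0))             # about 2000 microvolts across a 50 K difference
```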
Constant-Volume and Gas Thermometry
Constant-volume gas thermometry is a primary method for measuring temperature based on the pressure changes of a gas confined to a fixed volume. According to the ideal gas law, PV = nRT, where P is pressure, V is volume, n is the number of moles, R is the gas constant, and T is the absolute temperature, temperature is directly proportional to pressure when volume and the amount of gas are held constant. Thus, by monitoring pressure variations in a sealed bulb, the temperature can be determined with high precision, making this technique fundamental to thermodynamic temperature scales.[36]

In operation, a constant-volume gas thermometer typically employs low-density gases such as hydrogen or helium to minimize deviations from ideal behavior. The apparatus consists of a rigid bulb connected to a pressure gauge, often a manometer, immersed in the environment whose temperature is to be measured. As the temperature changes, the gas pressure adjusts accordingly, and readings are taken relative to reference points like the triple point of water (273.16 K). For practical calibration, the temperature is calculated using the formula

T = 273.16 K × (P − P0) / (Ptp − P0),

where P is the measured pressure, Ptp is the pressure at the triple point, and P0 is the extrapolated pressure at absolute zero. To define the thermodynamic temperature rigorously, measurements are extrapolated to the limit of zero gas density, where the gas behaves ideally and the pressure ratio becomes independent of the particular gas used. Helium is particularly favored for low-temperature applications due to its inertness and behavior close to ideality even near absolute zero.[36][37][38]

This method played a pivotal historical role in establishing the Kelvin scale, as it allowed metrologists to extrapolate to absolute zero, defining the scale's foundation in the late 19th century. Its advantages include exceptional accuracy, often achieving uncertainties below 0.001 K in controlled settings, and reliability across a wide range, particularly with helium for measurements approaching absolute zero where other thermometers fail. However, constant-volume gas thermometers are inherently bulky due to the need for large bulbs and precise pressure measurement systems, and they exhibit slow thermal response times, limiting their use to laboratory standards rather than routine applications.[39][36]
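A minimal numerical sketch of the constant-volume relation above, assuming the bulb pressure has been recorded at the triple point of water: the mapping from measured pressure to temperature is linear. The function name and the sample pressures are hypothetical.

```python
T_TRIPLE_K = 273.16  # triple point of water, kelvin

def gas_thermometer_temperature(p_measured: float, p_triple: float, p_zero: float = 0.0) -> float:
    """Constant-volume gas thermometer: T = 273.16 K * (P - P0) / (Ptp - P0).

    For an ideal gas the extrapolated pressure at absolute zero, P0, is zero,
    so the relation reduces to T = 273.16 K * P / Ptp.
    """
    return T_TRIPLE_K * (p_measured - p_zero) / (p_triple - p_zero)

# Example: a bulb pressure 1.10 times its triple-point value -> about 300 K
print(round(gas_thermometer_temperature(1.10, 1.00), 2))
```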
Radiometric and Optical Methods
Radiometric and optical methods for temperature measurement rely on the principles of thermal radiation emitted by objects, enabling non-contact sensing across a wide range of temperatures and distances. These techniques are grounded in blackbody radiation theory, which describes the electromagnetic radiation emitted by an idealized body that absorbs all incident radiation. The total emissive power J of a blackbody is given by the Stefan-Boltzmann law:

J = σT⁴,

where σ is the Stefan-Boltzmann constant (5.6704 × 10⁻⁸ W m⁻² K⁻⁴) and T is the absolute temperature in kelvin.[40] This law quantifies how the total radiated energy scales with the fourth power of temperature, forming the basis for radiometric thermometry.[41] Additionally, Wien's displacement law states that the wavelength λmax at which the spectral radiance peaks is inversely proportional to temperature:

λmax × T = b,

with b ≈ 2898 μm·K.[42] This relation shifts the peak emission to shorter wavelengths as temperature increases, guiding the selection of detection wavelengths in optical systems.[43]

Pyrometry utilizes these radiation laws to infer temperature from the intensity and spectral distribution of emitted light. In optical pyrometers, such as the disappearing-filament type, the brightness of a heated filament is visually matched to the target's glow through an optical system, with the filament current calibrated to temperature via the Planck radiation law approximation in the visible range.[44] When the filament "disappears" against the background, their radiances are equal, allowing direct temperature estimation for high-temperature sources like furnaces.[45] For lower temperatures, infrared thermometers detect radiation in the 8–14 μm atmospheric window, where atmospheric absorption by water vapor and CO₂ is minimal, enabling accurate measurement of thermal emission from surfaces.[46] These devices apply the Stefan-Boltzmann law, adjusted for the target's emissivity (a measure of how closely it approximates a blackbody), to convert detected irradiance to temperature.[47]

Advanced optical methods extend these principles using light interactions for precise, localized sensing. Fiber-optic sensors based on fluorescence decay employ phosphorescent materials, such as chromium-doped sapphire, in which the excited-state lifetime τ inversely correlates with temperature (τ ∝ 1/T).[48] Light is transmitted via optical fibers to the sensor tip, and the decay time of the returned fluorescence is analyzed, providing immunity to fiber losses and enabling measurements up to 700 °C or higher.[49] Raman spectroscopy offers remote temperature sensing by probing molecular vibrations in the target; the Stokes-to-anti-Stokes intensity ratio in scattered light varies with temperature, allowing non-invasive profiling in gases or liquids over distances.[50] This technique is particularly suited for environmental or industrial remote sensing, as the Raman shift provides a direct spectroscopic thermometer independent of emissivity.[51]

In the 2020s, hyperspectral imaging has advanced remote thermometry by capturing narrow spectral bands across the infrared, enabling precise discrimination of surface temperatures in complex scenes. These systems, often deployed on satellites or drones, leverage Wien's law to map thermal variations for climate monitoring, such as tracking sea surface temperatures or vegetation stress with sub-degree accuracy.[52] By integrating multiple wavelengths, hyperspectral approaches mitigate emissivity uncertainties and enhance spatial resolution in dynamic environments.[53]
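Because Wien's displacement law gives a one-to-one mapping between the peak emission wavelength and temperature, a spectral peak located by a hyperspectral or multi-band instrument can be converted directly. The sketch below assumes an ideal blackbody; the function name and the example wavelength are illustrative.

```python
WIEN_B_M_K = 2.898e-3  # Wien displacement constant, metre-kelvin

def temperature_from_peak(peak_wavelength_m: float) -> float:
    """Wien's displacement law: lambda_max * T = b, so T = b / lambda_max."""
    return WIEN_B_M_K / peak_wavelength_m

# Example: a surface whose emission peaks near 9.7 micrometres is close to 300 K
print(round(temperature_from_peak(9.7e-6)))
```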
Types of Thermometers
Primary Thermometers
Primary thermometers are devices that measure temperature by directly realizing the thermodynamic temperature scale, independent of prior calibration against other thermometers, typically relying on fundamental physical laws such as the ideal gas law or statistical mechanics.[54]
A prominent example is the constant-volume gas thermometer, which encloses a fixed volume of gas, often helium or another nearly ideal gas, in a bulb connected to a pressure-measuring system. According to the ideal gas law, PV = nRT, at constant volume V the pressure P is directly proportional to the absolute temperature T, so T can be determined from the measured pressure relative to a reference point such as the triple point of water.[55][56] Another example is the acoustic gas thermometer, which determines temperature from the speed of sound in a monatomic gas, such as argon, confined in a resonant cavity; the speed of sound c scales as c ∝ √T, a relation derived from the ideal gas law for adiabatic sound propagation, enabling precise thermodynamic temperature measurement through acoustic resonance frequencies.[57][58] Johnson noise thermometry provides a solid-state alternative, measuring the mean-square voltage fluctuations ⟨V²⟩ = 4kTRΔf across a resistor of resistance R, where k is Boltzmann's constant, T is the temperature, and Δf is the bandwidth; these thermal fluctuations, known as Johnson-Nyquist noise, directly yield T without reliance on intermediate calibrations.[59][60]
These primary methods offer absolute accuracy traceable to fundamental constants and are essential for defining international temperature standards, such as those used in realizing the kelvin.[61][62]
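The short sketch below illustrates two of these primary relations numerically; the pressure, resistance, bandwidth, and noise values are invented for demonstration and do not represent any particular instrument.

```python
# Minimal sketch of two primary-thermometry relations (illustrative values only).
K_BOLTZMANN = 1.380649e-23  # J/K, exact in the 2019 SI
T_TRIPLE_WATER = 273.16     # K

def gas_thermometer_temperature(pressure_pa: float, pressure_at_triple_pa: float) -> float:
    """Constant-volume gas thermometer: at fixed V, T scales linearly with P."""
    return T_TRIPLE_WATER * pressure_pa / pressure_at_triple_pa

def johnson_noise_temperature(mean_square_voltage: float, resistance_ohm: float,
                              bandwidth_hz: float) -> float:
    """Invert <V^2> = 4 k T R Δf to obtain the noise temperature."""
    return mean_square_voltage / (4.0 * K_BOLTZMANN * resistance_ohm * bandwidth_hz)

if __name__ == "__main__":
    # Hypothetical readings
    print(gas_thermometer_temperature(112_000.0, 101_325.0))        # ~302 K
    print(johnson_noise_temperature(1.66e-11, 10_000.0, 100_000.0)) # ~300 K
```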
Secondary Thermometers
Secondary thermometers are temperature-measuring devices that are calibrated against primary thermometers to ensure traceability to the thermodynamic scale, enabling practical and reproducible measurements across a wide range of applications without requiring direct computation from fundamental physical laws.[63] These instruments rely on well-characterized empirical relationships between a measurable property and temperature, offering high sensitivity and convenience for industrial, laboratory, and environmental monitoring, though they demand periodic recalibration to maintain accuracy.[64] Unlike primary methods, secondary thermometers prioritize portability and response time over absolute precision, with uncertainties typically on the order of 0.01°C to 1°C depending on the type and calibration.[65]
A classic example of a secondary thermometer is the liquid-in-glass type, where thermal expansion of a liquid within a capillary tube indicates temperature; mercury-filled versions operate reliably from -39°C to 357°C, providing visual readability and low hysteresis when calibrated against fixed points like the ice and steam points.[66] For broader or more precise needs, resistance temperature detectors (RTDs) use the predictable change in electrical resistance of a metal wire with temperature; platinum RTDs, valued for their stability and linearity, function from -200°C to 850°C and follow the Callendar-Van Dusen equation for temperatures above 0°C: R(T) = R₀(1 + AT + BT²), where R(T) is the resistance at temperature T (in °C), R₀ is the resistance at 0°C (typically 100 Ω), and A and B are material-specific coefficients (e.g., A = 3.9083×10⁻³ °C⁻¹ and B = −5.775×10⁻⁷ °C⁻² for industrial-grade platinum).[67] This quadratic approximation ensures accuracy within ±0.05°C over wide ranges when calibrated.[64]
Thermocouples represent another key category of secondary thermometer, exploiting the Seebeck effect to generate a voltage at the junction of two dissimilar metals; the output emf follows approximately ΔE = αΔT, where α is the Seebeck coefficient (specific to the material pair) and ΔT is the temperature difference from a reference junction.[68] Type K thermocouples, composed of chromel (nickel-chromium) and alumel (nickel-aluminum), are widely used for their robustness and cover 0°C to 1260°C with α ≈ 41 μV/°C, making them suitable for high-temperature processes like furnace monitoring after calibration at multiple points.[69]
Beyond these, bimetallic thermometers employ the differential thermal expansion of two bonded metal strips (e.g., brass and invar) to produce a mechanical deflection proportional to temperature, offering simple, cost-effective indication from -70°C to 500°C without electrical power, though with coarser resolution of around ±1°C.[70] Semiconductor-based thermistors, particularly negative temperature coefficient (NTC) types made from metal oxides such as manganese-nickel, provide high sensitivity over narrow ranges (e.g., -50°C to 150°C) via an exponential change in resistance; their behavior is characterized by the beta parameter β = ln(R₁/R₂) / (1/T₁ − 1/T₂), where R₁ and R₂ are the resistances at absolute temperatures T₁ and T₂ (in K), typically yielding β values of 3000–4000 K for precise curve fitting after two-point calibration.[71]
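As an illustration of the Callendar-Van Dusen relation quoted above, the following minimal Python sketch converts a Pt100 resistance reading back to temperature by solving the quadratic; the coefficients are the standard industrial values cited in the text, and the example resistance is hypothetical.

```python
import math

# Standard industrial Pt100 coefficients (valid above 0 °C), as quoted in the text.
R0 = 100.0      # ohms at 0 °C
A = 3.9083e-3   # 1/°C
B = -5.775e-7   # 1/°C^2

def pt100_resistance(t_c: float) -> float:
    """Callendar-Van Dusen (t ≥ 0 °C): R(T) = R0 * (1 + A*t + B*t^2)."""
    return R0 * (1.0 + A * t_c + B * t_c**2)

def pt100_temperature(r_ohm: float) -> float:
    """Invert the quadratic B*t^2 + A*t + (1 - R/R0) = 0, taking the physical root."""
    c = 1.0 - r_ohm / R0
    disc = A * A - 4.0 * B * c
    return (-A + math.sqrt(disc)) / (2.0 * B)

if __name__ == "__main__":
    r = pt100_resistance(100.0)         # ~138.5 ohm at 100 °C
    print(r, pt100_temperature(r))      # recovers ~100.0 °C
```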
Registering and Recording Devices
Registering and recording devices integrate temperature-sensing elements with mechanisms that automatically capture and log data over time, generating outputs such as graphical charts or digital files for analyzing temporal variations.[72]
Mechanical registering thermometers, such as chart recorders, employ a rotating drum or circular chart driven by a clock mechanism, with a stylus tracing temperature changes on paper. These devices often use bimetallic strips, which consist of two bonded metals that expand differentially with heat, producing the mechanical movement that drives the stylus.[72][73] The development of mechanical thermographs dates to the mid-19th century, with early photographic recording methods introduced in 1845 by Francis Ronalds and Charles Brooke, followed by bimetallic strip designs in the 1860s by inventors such as Heinrich Wild and Daniel Draper.[73] Another mechanical variant, the mercury-in-steel thermometer, fills a steel bulb and capillary tube with mercury under pressure; temperature-induced expansion transmits pressure through the tube to a remote Bourdon tube or diaphragm that actuates a pointer or recorder, allowing industrial logging over distances of up to 100 meters.[74][75]
Digital recording devices, or data loggers, incorporate microcontrollers that periodically sample data from secondary sensors such as thermocouples or resistance temperature detectors, storing readings in internal memory accessible via USB, SD cards, or serial ports; a minimal logging loop is sketched below.[76] These emerged in the late 1960s as electronic successors to analog strip-chart systems, enabling higher sampling rates and larger storage capacities without physical media.[76] Since around 2010, wireless IoT variants have proliferated, using Bluetooth Low Energy (BLE) protocols derived from the original Bluetooth standard introduced in 1998 to transmit temperature data from battery-powered sensors to gateways or mobile devices for real-time remote logging.[77]
These devices provide the advantage of unattended continuous monitoring, capturing detailed time-series data essential for processes requiring oversight without constant intervention; in meteorology, for instance, thermographs based on bimetallic mechanisms have recorded ambient temperatures automatically since the establishment of permanent observatories in the 1860s.[73][72]
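The sketch below is a minimal, hypothetical data-logger loop in Python: the read_sensor_celsius() function is a stand-in for a real thermocouple or RTD driver, readings are taken at a fixed interval, and each sample is appended with a timestamp to a CSV file.

```python
import csv
import random
import time
from datetime import datetime, timezone

def read_sensor_celsius() -> float:
    """Placeholder for a real sensor driver (RTD, thermocouple ADC, etc.)."""
    return 21.0 + random.uniform(-0.2, 0.2)  # simulated room-temperature reading

def log_temperature(path: str, interval_s: float, samples: int) -> None:
    """Append timestamped readings to a CSV file at a fixed sampling interval."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for _ in range(samples):
            writer.writerow([datetime.now(timezone.utc).isoformat(),
                             f"{read_sensor_celsius():.2f}"])
            f.flush()                 # keep the log usable if power is lost mid-run
            time.sleep(interval_s)

if __name__ == "__main__":
    log_temperature("temperature_log.csv", interval_s=1.0, samples=5)
```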
Calibration and Standards
Calibration Techniques
Calibration of thermometers typically involves fixed-point methods that exploit well-defined phase transitions in pure substances to establish reference temperatures with high precision. One fundamental fixed point is the triple point of water, defined at exactly 273.16 K (0.01 °C), where the solid, liquid, and vapor phases coexist in equilibrium; it is realized using a sealed cell containing high-purity water, with the thermometer inserted into the cell's reentrant well for measurement.[78] Another common fixed point is the ice point at 0 °C, achieved by immersing the thermometer in a well-stirred bath of crushed ice and water, with continuous agitation keeping the mixture at the melting point and preventing supercooling or stratification.[79] These fixed points provide absolute temperature references for calibrating secondary thermometers, such as platinum resistance thermometers (PRTs), with uncertainties as low as 1 mK at the water triple point.[80]
For broader temperature ranges, comparison calibration methods are employed, in which the thermometer under test is immersed alongside a reference standard in a controlled environment to measure deviations. Stirred liquid baths, filled with fluids such as water, silicone oil, or alcohols, maintain uniform temperatures from -80 °C to 300 °C by continuous circulation, minimizing gradients and enabling simultaneous calibration of multiple devices.[78] Dry-block calibrators offer a portable alternative for field use, inserting the thermometer into a heated metal block with interchangeable inserts to simulate temperatures up to 650 °C, though they generally provide somewhat lower uniformity than liquid baths because there is no convective mixing.[81] For resistance-based sensors such as resistance temperature detectors (RTDs), calibration often uses Wheatstone bridge circuits to precisely measure resistance changes, compensating for lead-wire effects in three- or four-wire configurations.[82] Thermocouples, meanwhile, are calibrated by comparison in similar baths, with voltage outputs referenced against standard tables while accounting for cold-junction compensation.[83]
Calibration procedures generally require multi-point measurements to characterize the thermometer's response across its operating range, followed by fitting a calibration curve to correct readings. For instance, data from several fixed points or comparison temperatures are collected, and a polynomial of the form t = a₀ + a₁R + a₂R² is fitted to relate temperature t to resistance R (or voltage for thermocouples), with the coefficients a₀, a₁, a₂ determined by least-squares regression to minimize residuals; a short fitting example follows below.[84] This approach ensures the device's output aligns with the reference, with higher-order polynomials used for non-linear responses over extended ranges.
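A minimal sketch of such a fit is shown below; the (resistance, reference temperature) calibration pairs are hypothetical, and NumPy's polynomial least-squares routine stands in for whatever fitting software a calibration laboratory actually uses.

```python
import numpy as np

# Hypothetical calibration data: reference temperatures (°C) and measured resistances (ohm).
t_ref = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
r_meas = np.array([100.02, 119.45, 138.55, 157.36, 175.90])

# Fit t = a0 + a1*R + a2*R^2 by least squares (np.polyfit returns highest order first).
a2, a1, a0 = np.polyfit(r_meas, t_ref, deg=2)

def resistance_to_temperature(r_ohm: float) -> float:
    """Apply the fitted calibration curve to convert a raw resistance reading to °C."""
    return a0 + a1 * r_ohm + a2 * r_ohm**2

if __name__ == "__main__":
    print(resistance_to_temperature(128.0))   # corrected temperature for a raw reading
```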
Uncertainty in the calibration is estimated according to ISO/IEC 17025 guidelines, incorporating contributions from reference standards, environmental stability, and repeatability through methods such as the Guide to the Expression of Uncertainty in Measurement (GUM), typically yielding expanded uncertainties of 0.05 °C to 0.5 °C depending on the device and range.[85]
All calibrations must ensure traceability to the International Temperature Scale of 1990 (ITS-90) through accredited national metrology institutes, such as the National Institute of Standards and Technology (NIST), which realizes ITS-90 fixed points using standard platinum resistance thermometers (SPRTs) calibrated against primary cells like the water triple point.[80] This chain of comparisons, documented in calibration certificates, guarantees that industrial and scientific thermometers align with global standards, supporting applications from laboratory research to regulatory compliance.[78]
International Temperature Scales
The International Temperature Scale of 1927 (ITS-27) was the first formally adopted global standard for temperature measurement, established by the 7th General Conference on Weights and Measures (CGPM) in 1927. It defined temperatures from 0°C (the ice point) to approximately 1600°C using a set of fixed points, such as the boiling point of sulfur at 444.60°C, with interpolation via platinum resistance thermometers, Pt-10%Rh/Pt thermocouples, and optical pyrometers. This scale aimed to approximate thermodynamic temperatures through reproducible physical states but was limited in its lower range and accuracy.[86]
The International Practical Temperature Scale of 1968 (IPTS-68), promulgated by the International Committee of Weights and Measures (CIPM) in 1968 following the 13th CGPM, extended the range downward to 13.81 K (the triple point of hydrogen) and refined the fixed points, adding six new ones, such as the triple point of argon at 83.80 K, while removing the sulfur boiling point. It improved alignment with the thermodynamic scale through updated interpolation formulas for resistance thermometers but revealed non-uniqueness issues in certain ranges, prompting further revisions.[86]
The current standard, the International Temperature Scale of 1990 (ITS-90), was adopted by the CIPM in 1989 and took effect on January 1, 1990, as recommended by the 18th CGPM. It extends the measurable range down to 0.65 K using helium vapor-pressure equations and, above the silver freezing point, uses the Planck radiation law, superseding IPTS-68 and the 1976 Provisional 0.5 K to 30 K Temperature Scale (EPT-76). ITS-90 enhances thermodynamic fidelity by specifying 17 defining fixed points—phase transitions of high-purity substances—and range-specific interpolation procedures, primarily using standard platinum resistance thermometers (SPRTs) for contact thermometry between 13.8033 K and 1234.93 K. Key fixed points include the triple point of equilibrium hydrogen at 13.8033 K, the triple point of neon at 24.5561 K, the triple point of water at 273.16 K (0.01°C), the melting point of gallium at 29.7646°C, the freezing point of indium at 156.5985°C, the freezing point of tin at 231.928°C, the freezing point of zinc at 419.527°C, the freezing point of aluminum at 660.323°C, the freezing point of silver at 961.78°C, the freezing point of gold at 1064.18°C, and the freezing point of copper at 1084.62°C. These points anchor the scale, with deviations from thermodynamic temperatures estimated to be less than 0.1% above 1000 K and smaller at lower temperatures.[86][87]
ITS-90 employs non-linear interpolation equations tailored to each subrange to derive temperatures between fixed points, ensuring high reproducibility with SPRTs. For the subrange from 0 °C to 660.323 °C—covering the triple point of water and the freezing points of tin, zinc, and aluminum—the formulation uses resistance ratios W(T₉₀) = R(T₉₀)/R(273.16 K), where R is the thermometer resistance. The interpolation uses a cubic deviation function ΔW(T₉₀) = a(W − 1) + b(W − 1)² + c(W − 1)³, where ΔW(T₉₀) = W(T₉₀) − W_r(T₉₀), W_r(T₉₀) is a reference resistance-ratio function, and the coefficients a, b, c are determined by calibration at the fixed points. This form accounts for the non-linear resistance-temperature relationship of platinum, minimizing deviations across the range.
Other subranges use similar polynomial or rational approximations, such as cubic deviation functions of the form ΔW = a(W − 1) + b(W − 1)² + c(W − 1)³ for specific calibrations, or vapor-pressure formulations at cryogenic temperatures.[87]
In 2019, the 26th CGPM revised the International System of Units (SI), redefining the kelvin by fixing the Boltzmann constant at exactly k = 1.380649×10⁻²³ J/K, effective May 20, 2019. This anchors the kelvin to a fundamental physical constant rather than to the water triple point alone, introducing a relative uncertainty of about 3.7×10⁻⁷ to the ITS-90 water triple point value while preserving the scale's practical realization. The update enhances the ITS-90's alignment with thermodynamic temperatures without altering its fixed points or equations, and it allows primary thermometry, for example via acoustic gas or dielectric-constant methods, at any temperature.[88]
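A small numerical sketch of the deviation-function idea is given below; the coefficients a, b, c are invented purely for illustration (in practice they come from calibration at the defining fixed points), and converting the resulting reference ratio W_r to T₉₀ would use the ITS-90 reference function, which is omitted here.

```python
# Illustrative ITS-90-style deviation function for an SPRT resistance ratio.
# Coefficients a, b, c below are invented for demonstration only.

def resistance_ratio(r_t_ohm: float, r_tpw_ohm: float) -> float:
    """W(T90) = R(T90) / R(273.16 K)."""
    return r_t_ohm / r_tpw_ohm

def deviation(w: float, a: float, b: float, c: float) -> float:
    """Cubic deviation function: ΔW = a(W-1) + b(W-1)^2 + c(W-1)^3."""
    x = w - 1.0
    return a * x + b * x**2 + c * x**3

def reference_ratio(w: float, a: float, b: float, c: float) -> float:
    """Wr(T90) = W(T90) - ΔW(T90); Wr would then be converted to T90 via the reference function."""
    return w - deviation(w, a, b, c)

if __name__ == "__main__":
    w = resistance_ratio(35.128, 25.555)              # hypothetical SPRT readings
    print(reference_ratio(w, a=-1.2e-4, b=3.0e-6, c=0.0))
```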
Measurement Quality
Precision and Accuracy
In thermometry, precision refers to the closeness of agreement between independent measurements of the same quantity under the same conditions, typically quantified as the standard deviation of repeated readings. Accuracy, by contrast, describes how closely a measured value approaches the true value of the temperature, and is influenced primarily by systematic errors rather than random variations. These distinctions are essential for evaluating thermometer performance, as high precision does not guarantee accuracy, and vice versa.
Several factors affect precision and accuracy in thermometric measurements. Resolution, the smallest change in temperature that can be detected, varies between instrument types; for instance, some coarser analog thermometers are limited to 1°C increments, while digital thermometers often achieve resolutions of 0.1°C, allowing finer discrimination.[89] Hysteresis in sensing materials, such as platinum resistance thermometers, introduces discrepancies in which the output differs depending on whether the temperature is increasing or decreasing, degrading both repeatability (precision) and alignment with true values (accuracy).[90]
Quantification of these qualities follows the Guide to the Expression of Uncertainty in Measurement (GUM), which outlines uncertainty budgets combining random and systematic components into a combined standard uncertainty. For example, precision clinical thermometers calibrated using standards like NIST SRM 934 can achieve uncertainties as low as 0.03°C at calibration points in the physiological range (24–38°C).[91] Improvements can be achieved by averaging multiple readings to reduce random errors, in line with the central limit theorem, enhancing precision, or through environmental controls such as shielding from drafts and heat sources to minimize systematic biases.[92][93]
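The following minimal sketch, using made-up repeated readings, shows how averaging n readings reduces the random component of uncertainty by roughly a factor of 1/√n, which is the statistical basis of the averaging strategy mentioned above.

```python
import statistics

# Hypothetical repeated readings (°C) of the same stable bath with a digital thermometer.
readings = [36.98, 37.02, 37.05, 36.99, 37.01, 37.03, 36.97, 37.04]

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)                  # spread of single readings (precision)
standard_error = stdev / len(readings) ** 0.5       # uncertainty of the mean shrinks as 1/sqrt(n)

print(f"mean = {mean:.3f} °C, s = {stdev:.3f} °C, standard error = {standard_error:.3f} °C")
```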
Reproducibility and Error Analysis
Reproducibility in thermometry refers to the consistency of temperature measurements obtained from the same or different devices over repeated uses under identical conditions. It is influenced by factors such as material stability and environmental exposure, with long-term degradation potentially leading to drift in readings. For instance, liquid-in-glass thermometers exhibit noticeable drift when subjected to steady rather than intermittent use or to elevated temperatures, compromising their ability to yield consistent results over time.[94] In resistance temperature detectors (RTDs), reproducibility can be assessed through repeated calibrations at fixed points, where deviations in resistance values indicate instability, as observed in platinum RTDs tested up to the aluminum point.[95]
Thermometers encounter two primary error types: random and systematic. Random errors arise from unpredictable fluctuations, such as electrical noise in digital sensors or minor variations in ambient conditions, and are quantified by the standard deviation σ of repeated measurements under controlled settings. Systematic errors, in contrast, produce consistent biases; a common example is stem conduction, where heat transfers along the thermometer's stem from the immersion point to the exposed portion, causing the sensor to read lower than the true temperature when the ambient temperature differs from that of the source. This error is exacerbated by shallow immersion depths and large temperature gradients, with magnitudes depending on the probe design, immersion depth, and gradient (e.g., as observed in tests at 80°C).[96][97] To correct stem conduction, immersion tables provide adjustments based on stem temperature and depth, as specified in standards like ASTM E77 for partial-immersion liquid-in-glass thermometers, ensuring the emergent stem temperature aligns with reference conditions.[98]
Error analysis in thermometry involves combining individual uncertainty components to estimate overall measurement reliability. The combined standard uncertainty u_c is calculated using the root-sum-square method, assuming independent contributions: u_c = √(u₁² + u₂² + ⋯ + u_n²), where each u_i represents the standard uncertainty from a source such as random noise or a systematic correction. For temperature measurements, this might combine thermometer resolution (u₁ = 0.1 °C) and calibration drift (u₂ = 0.05 °C), yielding u_c ≈ 0.11 °C; a short worked example follows below.[99] Drift testing evaluates long-term reproducibility by monitoring calibration check points over intervals, such as quarterly verifications against fixed points, to detect deviations exceeding specified tolerances.[100]
Mitigation strategies focus on maintaining stability through proactive measures. Regular recalibration against traceable standards restores accuracy, with frequency determined by usage intensity—e.g., annually for intermittent industrial thermometers or more often for continuous processes. For thermocouples, cold-junction compensation addresses systematic errors from non-zero reference temperatures by measuring the cold junction with an auxiliary sensor (e.g., an RTD) and adding the equivalent thermoelectric voltage, enabling accurate hot-junction calculations without an ice bath. Material stability testing, such as accelerated aging simulations, further ensures reproducibility by identifying degradation-prone components early.[94][101]
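The worked example below applies the root-sum-square combination to a small hypothetical uncertainty budget; the component values echo the resolution and drift figures mentioned above, with an extra term added for the reference standard, and the expanded uncertainty is reported with a coverage factor of k = 2.

```python
import math

def combined_standard_uncertainty(components_c: list[float]) -> float:
    """Root-sum-square of independent standard uncertainties (uncorrelated GUM case)."""
    return math.sqrt(sum(u * u for u in components_c))

if __name__ == "__main__":
    # Hypothetical budget: resolution, calibration drift, reference standard (°C).
    budget = [0.10, 0.05, 0.02]
    u_c = combined_standard_uncertainty(budget)
    print(f"u_c = {u_c:.3f} °C, expanded (k=2): {2 * u_c:.3f} °C")
```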
Indirect Methods
Thermography and Pyrometry
Thermography and pyrometry are non-contact, indirect temperature measurement techniques that rely on detecting the thermal radiation emitted by objects, enabling surface temperature mapping without physical interaction. These methods are particularly useful for imaging extended areas or high-temperature environments where traditional sensors are impractical. The underlying principle is blackbody radiation, governed by Planck's law, which gives the spectral radiance B(λ,T) of an ideal blackbody at temperature T and wavelength λ as B(λ,T) = (2hc²/λ⁵) · 1/(e^(hc/(λkT)) − 1), where h is Planck's constant, c is the speed of light, and k is Boltzmann's constant.[102]
Infrared thermography employs specialized cameras to capture thermal images by detecting infrared radiation in the long-wave infrared band, typically 7–14 μm, where most terrestrial objects at ambient temperature emit their peak radiation. These cameras convert the detected photon flux into temperature distributions, producing visual maps of surface temperatures. Accurate measurements require emissivity correction, since real surfaces emit less than a perfect blackbody; the surface temperature T is estimated from the Stefan-Boltzmann law approximation T = [J/(εσ)]^(1/4), where J is the measured total radiance, ε is the surface emissivity (0 < ε ≤ 1), and σ is the Stefan-Boltzmann constant. Without correction, errors can exceed 10–20 K for low-emissivity materials like metals.[103][104]
Pyrometry, a related technique, measures temperature by analyzing emitted radiation intensity, often at specific wavelengths, and is suited to high-temperature scenarios above 500°C, such as metallurgy or combustion processes. Ratio pyrometers improve accuracy by simultaneously measuring radiation at two distinct wavelengths and computing the intensity ratio, which minimizes errors from unknown or varying emissivity since the ratio depends primarily on temperature. This dual-wavelength approach assumes a graybody model in which emissivity is wavelength-independent, reducing the systematic biases that affect single-wavelength pyrometers.[102][105]
Both techniques face limitations from atmospheric absorption, particularly by water vapor and carbon dioxide in bands such as 5–7.5 μm and 13–19 μm, which attenuates signals over distances greater than a few meters and necessitates corrections for path length and humidity. Recent advances in the 2020s include drone-mounted infrared systems for remote inspections, enabling high-resolution thermographic surveys of infrastructure such as bridges to detect subsurface defects via temperature anomalies, with improved autonomy and data processing for real-time analysis.[103][106]
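The sketch below illustrates the ratio-pyrometer principle under the graybody assumption: two spectral radiances are simulated with the Wien approximation to Planck's law at a hypothetical furnace temperature and emissivity, and the temperature is then recovered from their ratio alone; the wavelengths, temperature, and emissivity are assumed values for demonstration.

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K
C2 = H * C / K       # second radiation constant, m K

def wien_radiance(wavelength_m: float, temperature_k: float) -> float:
    """Wien approximation to Planck's law (valid when hc/(λkT) >> 1)."""
    return (2.0 * H * C**2 / wavelength_m**5) * math.exp(-C2 / (wavelength_m * temperature_k))

def ratio_pyrometer_temperature(l1: float, l2: float, wl1_m: float, wl2_m: float) -> float:
    """Invert the two-wavelength radiance ratio for temperature, assuming a graybody."""
    ratio = l1 / l2
    return C2 * (1.0 / wl2_m - 1.0 / wl1_m) / (math.log(ratio) - 5.0 * math.log(wl2_m / wl1_m))

if __name__ == "__main__":
    wl1, wl2 = 0.85e-6, 1.05e-6            # detection wavelengths, m (assumed)
    t_true = 1800.0                        # hypothetical furnace temperature, K
    eps = 0.4                              # graybody emissivity; cancels in the ratio
    l1, l2 = eps * wien_radiance(wl1, t_true), eps * wien_radiance(wl2, t_true)
    print(ratio_pyrometer_temperature(l1, l2, wl1, wl2))   # recovers ~1800 K
```

Note that the emissivity factor divides out of the ratio, which is exactly why the two-wavelength method is insensitive to unknown but wavelength-independent emissivity.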
Thermocouples and Resistance-Based Sensing
Thermocouples function as indirect temperature sensors by exploiting the Seebeck effect, in which a temperature gradient across the junction of two dissimilar metals produces a measurable electromotive force (emf).[107] This voltage, typically in the millivolt range, is proportional to the temperature difference between the measuring junction (exposed to the environment) and a reference junction maintained at a known temperature.[107] Standardized letter-designated types, such as Type J (iron-constantan), are defined with reference tables that correlate emf to temperature across specific ranges.[108] For instance, Type J thermocouples operate from -210°C to 1200°C, making them suitable for a variety of industrial applications.[107]
These tables, developed by organizations such as NIST, ensure consistent calibration and interpolation for accurate readings.[108]
Resistance-based sensors, including thermistors and resistance temperature detectors (RTDs), measure temperature indirectly through changes in electrical resistance. Thermistors, typically made from semiconductor materials, display a steep, nonlinear resistance-temperature curve described by the equation R = R₀·exp(B(1/T − 1/T₀)), where R is the resistance at temperature T (in kelvin), R₀ is the resistance at the reference temperature T₀, and B is a material constant reflecting sensitivity.[109] This exponential behavior provides high sensitivity over narrow ranges, often -50°C to 150°C, which is ideal for precise monitoring; the sketch below converts a thermistor reading to temperature using this relation.[109] RTDs, in contrast, offer a nearly linear increase in resistance with temperature—approximately 0.385 Ω/°C for platinum-based Pt100 elements—and are referenced in secondary thermometer standards for their stability and reproducibility.
In indirect configurations, both thermocouple and resistance-based probes are housed in protective sheaths, such as metal or ceramic tubes, enabling immersion in harsh or inaccessible environments without direct exposure of the sensing element.[110] Signals from these probes, which are low-level and prone to noise, undergo conditioning via amplifiers, filters, and cold-junction compensation circuits to yield reliable outputs for data acquisition systems.[110] These methods excel in industrial settings due to their rugged construction, wide operational ranges, and ability to withstand vibration and corrosive conditions.[110] Developments in the 2010s introduced wireless variants, leveraging metamaterials and integrated circuits for battery-free, remote sensing in dynamic processes.[111]
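The following minimal sketch inverts the thermistor beta equation quoted above; the R₀, T₀, and β values are typical NTC figures assumed for illustration rather than the parameters of any specific device.

```python
import math

# Typical NTC thermistor parameters (assumed for illustration).
R0 = 10_000.0     # ohms at the reference temperature
T0 = 298.15       # reference temperature, K (25 °C)
BETA = 3950.0     # material constant, K

def thermistor_resistance(temperature_k: float) -> float:
    """R = R0 * exp(B * (1/T - 1/T0))."""
    return R0 * math.exp(BETA * (1.0 / temperature_k - 1.0 / T0))

def thermistor_temperature_c(resistance_ohm: float) -> float:
    """Invert the beta equation: 1/T = 1/T0 + ln(R/R0)/B, returning °C."""
    inv_t = 1.0 / T0 + math.log(resistance_ohm / R0) / BETA
    return 1.0 / inv_t - 273.15

if __name__ == "__main__":
    r = thermistor_resistance(310.15)          # resistance at 37 °C
    print(r, thermistor_temperature_c(r))      # recovers ~37.0 °C
```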
Applications
Medical and Biological Uses
In medical and biological applications, thermometers are essential for monitoring body temperature to assess health status, detect infections, and support physiological research. Traditional clinical thermometers, such as oral and rectal glass models using mercury, were designed as maximum-registering devices in which the liquid column expands to the peak temperature and remains visible until shaken down, allowing reliable fever detection without continuous monitoring.[112] These mercury-based thermometers have been largely phased out in clinical settings due to the environmental and health risks associated with mercury exposure, with regulatory efforts promoting safer alternatives since the early 2000s; phase-out timelines vary by region, including EU compliance under the Minamata Convention by 2020 and US EPA recommendations since 2001.[6] The normal human core body temperature is approximately 37°C, with deviations indicating potential issues such as fever, typically defined as exceeding 38°C, which triggers immune responses and requires prompt evaluation in patients.[113]
Non-invasive options like tympanic infrared thermometers have become standard for rapid clinical assessments, measuring temperature in the ear canal to approximate core body heat with an accuracy of ±0.2°C when used correctly, making them suitable for pediatric and adult care without discomfort.[114] In veterinary medicine, rectal thermometers—often digital for livestock such as cattle—provide accurate internal temperature readings to diagnose illnesses like infections or heat stress, with normal ranges varying by species but typically around 38–39.5°C for bovines.[115] Standards such as ASTM E1112 ensure the reliability of electronic medical thermometers by specifying performance criteria for intermittent patient monitoring, with maximum permitted errors ranging from ±0.1°C to ±0.3°C over the 35–42°C range depending on the sub-range (e.g., ±0.1°C in 37–39°C).[116] Digital basal body thermometers, precise to 0.01°C, enable fertility tracking by charting subtle daily temperature shifts after ovulation, aiding natural family planning, with rises of about 0.2–0.5°C indicating the luteal phase.[117]
Recent advances in the 2020s include wearable temperature sensors, such as skin-contact patches and smartwatch-integrated devices, which continuously monitor peripheral temperatures to infer trends in core body heat, inflammation, or ovulation without manual intervention, enhancing biological studies and remote health monitoring.[118] These innovations, often adhering to medical standards for accuracy, support applications from fever surveillance in pandemics to longitudinal tracking of circadian rhythms in research settings.[119]
Industrial and Environmental Monitoring
In industrial settings, resistance temperature detectors (RTDs), which rely on resistance-based sensing, are widely employed for precise contact measurements in pipelines, particularly in oil refining processes where temperatures can range from -200°C to 600°C, to monitor fluid flows and prevent overheating.[120][121] These rugged sensors provide high accuracy and stability, essential for process control in harsh environments like petrochemical plants. For non-contact applications in high-temperature zones, pyrometers are commonly used to measure furnace interiors, enabling real-time monitoring of molten metals and combustion processes to optimize energy efficiency and ensure product quality.[122][123]
Environmental monitoring uses platinum resistance thermometers, often integrated into automated weather stations, to deliver accurate air temperature readings shielded from solar radiation for reliable climate data collection.[124][125] In marine environments, conductivity-temperature-depth (CTD) sensors deployed on ocean buoys measure seawater temperature alongside salinity and depth, supporting long-term observations of ocean currents and heat distribution critical for climate modeling.[126][127]
Safety applications in food processing adhere to Hazard Analysis and Critical Control Points (HACCP) guidelines, requiring thermometers to verify that poultry reaches an internal temperature of 74°C to eliminate pathogens such as Salmonella.[128] In heating, ventilation, and air conditioning (HVAC) systems, industrial thermometers such as bimetallic or digital models monitor air and fluid temperatures to maintain optimal indoor conditions and prevent equipment failures.[129][130]
Modern advancements include satellite-based infrared thermometry, such as NASA's Atmospheric Infrared Sounder (AIRS), which generates three-dimensional global temperature maps by detecting thermal radiation from Earth's surface and atmosphere.[131] In the 2020s, AI-enhanced systems have emerged for predictive temperature monitoring in industrial facilities, using machine learning algorithms to forecast anomalies from sensor data and reduce downtime through proactive maintenance.[132][133]
Scientific and Specialized Measurements
In scientific research, nanothermometry enables precise temperature measurements at the nanoscale, crucial for understanding thermal phenomena in materials like semiconductors. Scanning thermal microscopy (SThM) achieves spatial resolutions below 10 nm by using a heated probe to detect local heat fluxes, allowing mapping of temperature variations in self-heated nanostructures. For instance, SThM has been applied to identify hot spots in metal interconnects, providing insights into thermal management in nanoelectronics with sub-nanowatt sensitivity.[134] Fluorescent nanodiamonds, containing nitrogen-vacancy (NV) centers, offer another approach, leveraging shifts in optically detected magnetic resonance (ODMR) for thermometry with millikelvin sensitivity and nanoscale resolution. These biocompatible probes have been used for intracellular temperature mapping in living cells, such as HeLa cells, revealing gradients of up to several kelvins during biological processes.[135]
Cryometry addresses temperature measurement in extreme low-temperature regimes, below 1 K, essential for superconductivity and quantum studies. Vapor-pressure thermometers operate on the principle that the saturated vapor pressure of cryogenic fluids like helium-3 or helium-4 correlates uniquely with temperature, providing primary calibration standards from 0.65 K to 5 K with realization uncertainties typically below 1 mK. These devices are particularly valuable in dilution refrigerators, which achieve millikelvin temperatures (down to 5–10 mK) through phase separation of helium isotopes, enabling continuous cooling for experiments on quantum materials.[136] In such systems, thermometry integrates vapor-pressure gauges alongside resistance sensors to monitor the mixing chamber, ensuring stable conditions for low-noise measurements.[137]
At the opposite extreme, high-temperature thermometry exceeding 2000°C is vital for plasma physics, where conventional sensors fail due to harsh conditions. Optical fiber sensors, often based on sapphire fibers or fluorescence decay, withstand these regimes by transmitting light signals immune to electromagnetic interference, measuring temperatures in plasma deposition processes with resolutions around 1°C. For example, blackbody cavity designs at fiber tips enable non-contact pyrometry in fusion plasmas, capturing rapid transients without material degradation.[138]
Noise thermometry provides dissipation-free temperature sensing in quantum computing environments, exploiting Johnson-Nyquist noise in resistors to infer electron temperatures at millikelvin scales. In dilution refrigerators housing superconducting qubits, voltage noise cross-correlation techniques achieve precisions below 10 mK, independent of external calibration, by analyzing thermal fluctuations in integrated circuits. This method is scalable for multi-channel monitoring, mitigating decoherence from thermal gradients at the classical-quantum interface.[139]
Specialized applications include food-safety probes using thermocouples to verify pasteurization temperatures, ensuring pathogen inactivation without overprocessing. Type T or Type K thermocouples, with thin probes for rapid response (under 5 seconds), measure core temperatures of around 72°C held for 15 seconds in dairy products, complying with regulatory standards for microbial safety.[140] Recent advances in 2025 feature quantum sensors, such as cryo-CMOS systems for qubit control and sensing, delivering millikelvin precision in dilution systems for enhanced quantum device control.
These systems operate below 70 mK, supporting scalable qubit arrays with reduced wiring heat loads.[141]