Celsius
from Wikipedia

degree Celsius
A thermometer calibrated in degrees Celsius, showing a temperature of 0 °C in a glass of ice and water

General information
Unit system: SI
Unit of: temperature
Symbol: °C
Named after: Anders Celsius

Conversions
x °C in the Kelvin scale corresponds to (x + 273.15) K
x °C in the Fahrenheit scale corresponds to (9/5 x + 32) °F

The degree Celsius is the unit of temperature on the Celsius temperature scale[1] (originally known as the centigrade scale outside Sweden),[2] one of two temperature scales used in the International System of Units (SI), the other being the closely related Kelvin scale. The degree Celsius (symbol: °C) can refer to a specific point on the Celsius temperature scale or to a difference or range between two temperatures. It is named after the Swedish astronomer Anders Celsius (1701–1744), who proposed the first version of it in 1742. The unit was called centigrade in several languages (from the Latin centum, which means 100, and gradus, which means steps) for many years. In 1948, the International Committee for Weights and Measures[3] renamed it to honor Celsius and also to remove confusion with the term for one hundredth of a gradian in some languages. Most countries use this scale (the Fahrenheit scale is still used in the United States, some island territories, and Liberia).

Throughout the 19th and the first half of the 20th centuries, the scale was based on 0 °C for the freezing point of water and 100 °C for the boiling point of water at 1 atm pressure. (In Celsius's initial proposal, the values were reversed: the boiling point was 0 degrees and the freezing point was 100 degrees.)

Between 1954 and 2019, the precise definitions of the unit degree Celsius and the Celsius temperature scale used absolute zero and the temperature of the triple point of water. Since 2007, the Celsius temperature scale has been defined in terms of the kelvin, the SI base unit of thermodynamic temperature (symbol: K). Absolute zero, the lowest temperature, is now defined as being exactly 0 K and −273.15 °C.[4]
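The fixed relations above are exact by definition and can be sketched in a few lines of Python; the helper names here are illustrative, not from any standard library:

```python
# Exact conversions from Celsius, per the SI definitions:
# x °C corresponds to (x + 273.15) K and to (9/5 x + 32) °F.

def celsius_to_kelvin(c: float) -> float:
    return c + 273.15

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9 / 5 + 32

# Absolute zero is exactly 0 K, i.e. -273.15 °C.
assert celsius_to_kelvin(-273.15) == 0.0
# Nominal freezing and boiling points of water at 1 atm.
assert celsius_to_fahrenheit(0) == 32.0
assert celsius_to_fahrenheit(100) == 212.0
```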

Countries by usage (map legend): Celsius (°C) only; both Celsius (°C) and Fahrenheit (°F); Fahrenheit (°F) only.

History

Anders Celsius's original thermometer used a reversed scale, with 100 as the freezing point and 0 as the boiling point of water.

In 1742, Swedish astronomer Anders Celsius (1701–1744) created a temperature scale that was the reverse of the scale now known as "Celsius": 0 represented the boiling point of water, while 100 represented the freezing point of water.[5] In his paper Observations of two persistent degrees on a thermometer, he recounted his experiments showing that the melting point of ice is essentially unaffected by pressure. He also determined with remarkable precision how the boiling point of water varied as a function of atmospheric pressure. He proposed that the zero point of his temperature scale, being the boiling point, would be calibrated at the mean barometric pressure at mean sea level. This pressure is known as one standard atmosphere. The BIPM's 10th General Conference on Weights and Measures (CGPM) in 1954 defined one standard atmosphere to equal precisely 1,013,250 dynes per square centimeter (101.325 kPa).[6]

In 1743, the French physicist Jean-Pierre Christin, permanent secretary of the Academy of Lyon, inverted the Celsius temperature scale so that 0 represented the freezing point of water and 100 represented the boiling point of water. Some credit Christin for independently inventing the reverse of Celsius's original scale, while others believe Christin merely reversed Celsius's scale.[7][8] On 19 May 1743 he published the design of a mercury thermometer, the "Thermometer of Lyon" built by the craftsman Pierre Casati that used this scale.[9][10][11]

In 1744, coincident with the death of Anders Celsius, the Swedish botanist Carl Linnaeus (1707–1778) reversed Celsius's scale.[12] His custom-made "Linnaeus-thermometer", for use in his greenhouses, was made by Daniel Ekström, Sweden's leading maker of scientific instruments at the time, whose workshop was located in the basement of the Stockholm observatory. As often happened in this age before modern communications, numerous physicists, scientists, and instrument makers are credited with having independently developed this same scale;[13] among them were Pehr Elvius, the secretary of the Royal Swedish Academy of Sciences (which had an instrument workshop) and with whom Linnaeus had been corresponding; Daniel Ekström [sv], the instrument maker; and Mårten Strömer (1707–1770) who had studied astronomy under Anders Celsius.

The first known Swedish document[14] reporting temperatures in this modern "forward" Celsius temperature scale is the paper Hortus Upsaliensis dated 16 December 1745 that Linnaeus wrote to a student of his, Samuel Nauclér. In it, Linnaeus recounted the temperatures inside the orangery at the University of Uppsala Botanical Garden:

... since the caldarium (the hot part of the greenhouse) by the angle of the windows, merely from the rays of the sun, obtains such heat that the thermometer often reaches 30 degrees, although the keen gardener usually takes care not to let it rise to more than 20 to 25 degrees, and in winter not under 15 degrees ...

Centigrade vis-à-vis Celsius


Since the 19th century, the scientific and thermometry communities worldwide have used the phrase "centigrade scale", and temperatures have often been reported simply as "degrees" or, when greater specificity was desired, as "degrees centigrade", with the symbol °C.

In the French language, the term centigrade also means one hundredth of a gradian, when used for angular measurement. The term centesimal degree was later introduced for temperatures[15] but was also problematic, as it means gradian (one hundredth of a right angle) in the French and Spanish languages. The risk of confusion between temperature and angular measurement was eliminated in 1948 when the 9th meeting of the General Conference on Weights and Measures and the Comité International des Poids et Mesures (CIPM) formally adopted "degree Celsius" for temperature.[16][a]

While "Celsius" is commonly used in scientific work, "centigrade" is still used in French and English-speaking countries, especially in informal contexts. The frequency of the usage of "centigrade" has declined over time.[17]

Due to metrication in Australia, after 1 September 1972 weather reports in the country were exclusively given in Celsius.[18] In the United Kingdom, it was not until February 1985 that forecasts by BBC Weather switched from "centigrade" to "Celsius".[19]

Common temperatures

Equivalent temperatures in kelvin (K), Rankine (R), Celsius (°C), and Fahrenheit (°F)

All phase transitions are at standard atmosphere. Figures are either by definition, or approximated from empirical measurements.

Key temperature scale relations
Kelvin Celsius Fahrenheit Rankine
Absolute zero[A] 0 K −273.15 °C −459.67 °F 0 °R
Boiling point of liquid nitrogen 77.4 K −195.8 °C[20] −320.4 °F 139.3 °R
Sublimation point of dry ice 195.1 K −78 °C −108.4 °F 351.2 °R
Intersection of Celsius and Fahrenheit scales[A] 233.15 K −40 °C −40 °F 419.67 °R
Melting point of ice[21] 273.1499 K −0.0001 °C 31.9998 °F 491.6698 °R
Common room temperature[B][22] 293 K 20 °C 68 °F 528 °R
Average normal human body temperature[23] 310.15 K 37.0 °C 98.6 °F 558.27 °R
Boiling point of water[b] 373.1339 K 99.9839 °C 211.971 °F 671.6410 °R
[A] Exact value, by SI definition of the kelvin
[B] NIST common reference temperature, provided as round numbers

Name and symbol typesetting


The "degree Celsius" has been the only SI unit whose full unit name contains an uppercase letter since 1967, when the SI base unit for temperature became the kelvin, replacing the capitalized term degrees Kelvin. The plural form is "degrees Celsius".[24]

The general rule of the International Bureau of Weights and Measures (BIPM) is that the numerical value always precedes the unit, and a space is always used to separate the unit from the number, e.g. "30.2 °C" (not "30.2°C" or "30.2° C").[25] The only exceptions to this rule are for the unit symbols for degree, minute, and second for plane angle (°, ′, and ″, respectively), for which no space is left between the numerical value and the unit symbol.[26] Other languages, and various publishing houses, may follow different typographical rules.

Unicode character


Unicode provides the Celsius symbol at code point U+2103 DEGREE CELSIUS. However, this is a compatibility character provided for roundtrip compatibility with legacy encodings. It easily allows correct rendering for vertically written East Asian scripts, such as Chinese. The Unicode standard explicitly discourages the use of this character: "In normal use, it is better to represent degrees Celsius '°C' with a sequence of U+00B0 ° DEGREE SIGN + U+0043 C LATIN CAPITAL LETTER C, rather than U+2103 DEGREE CELSIUS. For searching, treat these two sequences as identical."[27]
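The equivalence the standard describes is visible through Unicode normalization; a quick check using only Python's standard unicodedata module:

```python
import unicodedata

# U+2103 DEGREE CELSIUS carries a compatibility decomposition to
# U+00B0 DEGREE SIGN + U+0043 LATIN CAPITAL LETTER C, so NFKC
# normalization maps the legacy character onto the preferred sequence.
legacy = "\u2103"       # single code point '℃'
preferred = "\u00b0C"   # '°' followed by 'C'

assert unicodedata.normalize("NFKC", legacy) == preferred
# Treating both spellings as identical for search, as the standard
# advises, falls out of normalizing both sides before comparison.
```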

Temperatures and intervals


The degree Celsius is subject to the same rules as the kelvin with regard to the use of its unit name and symbol. Thus, besides expressing specific temperatures along its scale (e.g. "Gallium melts at 29.7646 °C" and "The temperature outside is 23 degrees Celsius"), the degree Celsius is also suitable for expressing temperature intervals: differences between temperatures or their uncertainties (e.g. "The output of the heat exchanger is hotter by 40 degrees Celsius", and "Our standard uncertainty is ±3 °C").[28] Because of this dual usage, one must not rely upon the unit name or its symbol to denote that a quantity is a temperature interval; it must be unambiguous through context or explicit statement that the quantity is an interval.[c] This is sometimes solved by using the symbol °C (pronounced "degrees Celsius") for a temperature, and C° (pronounced "Celsius degrees") for a temperature interval, although this usage is non-standard.[29] Another way to express the same is "40 °C ± 3 K", which can be commonly found in literature.
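The point-versus-interval distinction matters as soon as one converts to another scale; a minimal sketch (helper names are ours, not standard):

```python
# Converting to Fahrenheit: the +32 offset applies to a specific
# temperature (a point on the scale) but not to a temperature
# interval (a difference between two points).

def point_c_to_f(c: float) -> float:
    return c * 9 / 5 + 32

def interval_c_to_f(dc: float) -> float:
    return dc * 9 / 5

assert point_c_to_f(40) == 104.0      # "the temperature is 40 °C"
assert interval_c_to_f(40) == 72.0    # "hotter by 40 degrees Celsius"
# An interval of 1 °C equals exactly 1 K, which is why "±3 °C"
# and "±3 K" state the same uncertainty.
```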

Celsius measurement follows an interval system but not a ratio system; and it follows a relative scale, not an absolute scale. For example, an object at 20 °C does not have twice the thermal energy of the same object at 10 °C; and 0 °C is not the lowest Celsius value. Thus, degrees Celsius is a useful interval measurement but does not possess the characteristics of ratio measures like weight or distance.[30]

Coexistence with Kelvin


In science and in engineering, the Celsius and Kelvin scales are often used in combination in close contexts, e.g. "a measured value was 0.01023 °C with an uncertainty of 70 μK". This practice is permissible because the magnitude of the degree Celsius is equal to that of the kelvin. Notwithstanding the official endorsement provided by decision no. 3 of Resolution 3 of the 13th CGPM,[31] which stated "a temperature interval may also be expressed in degrees Celsius", the practice of simultaneously using both °C and K remains widespread throughout the scientific world as the use of SI-prefixed forms of the degree Celsius (such as "μ°C" or "microdegrees Celsius") to express a temperature interval has not been widely adopted.

Melting and boiling points of water

Celsius temperature conversion formulae
Scale        from Celsius                      to Celsius
Fahrenheit   x °C ≘ (x × 9/5 + 32) °F          x °F ≘ ((x − 32) × 5/9) °C
Kelvin       x °C ≘ (x + 273.15) K             x K ≘ (x − 273.15) °C
Rankine      x °C ≘ ((x + 273.15) × 9/5) °R    x °R ≘ ((x − 491.67) × 5/9) °C
For temperature intervals rather than specific temperatures,
1 °C = 1 K = 9/5 °F = 9/5 °R
Conversion between temperature scales

The melting and boiling points of water are no longer part of the definition of the Celsius temperature scale. In 1948, the definition was changed to use the triple point of water.[32] In 2005, the definition was further refined to use water with precisely defined isotopic composition (VSMOW) for the triple point. In 2019, the definition was changed to use the Boltzmann constant, completely decoupling the definition of the kelvin from the properties of water. Each of these formal definitions left the numerical values of the Celsius temperature scale identical to the prior definition to within the limits of accuracy of the metrology of the time.

When the melting and boiling points of water ceased being part of the definition, they became measured quantities instead. This is also true of the triple point.

In 1948 when the 9th General Conference on Weights and Measures (CGPM) in Resolution 3 first considered using the triple point of water as a defining point, the triple point was so close to being 0.01 °C greater than water's known melting point, it was simply defined as precisely 0.01 °C. However, later measurements showed that the difference between the triple and melting points of VSMOW is actually very slightly (< 0.001 °C) greater than 0.01 °C. Thus, the actual melting point of ice is very slightly (less than a thousandth of a degree) below 0 °C. Also, defining water's triple point at 273.16 K precisely defined the magnitude of each 1 °C increment in terms of the absolute thermodynamic temperature scale (referencing absolute zero). Now decoupled from the actual boiling point of water, the value "100 °C" is hotter than 0 °C – in absolute terms – by a factor of exactly 373.15/273.15 (approximately 36.61% thermodynamically hotter). When adhering strictly to the two-point definition for calibration, the boiling point of VSMOW under one standard atmosphere of pressure was actually 373.1339 K (99.9839 °C). When calibrated to ITS-90 (a calibration standard comprising many definition points and commonly used for high-precision instrumentation), the boiling point of VSMOW was slightly less, about 99.974 °C.[33]
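The 36.61 % figure follows directly from the two absolute temperatures:

```python
# 100 °C versus 0 °C in absolute (thermodynamic) terms:
ratio = 373.15 / 273.15
assert round((ratio - 1) * 100, 2) == 36.61   # ~36.61 % hotter
```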

This boiling-point difference of 16.1 millikelvins between the Celsius temperature scale's original definition and the previous one (based on absolute zero and the triple point) has little practical meaning in common daily applications because water's boiling point is very sensitive to variations in barometric pressure. For example, an altitude change of only 28 cm (11 in) causes the boiling point to change by one millikelvin.[citation needed]

from Grokipedia
The Celsius scale is a system in which the freezing point of water under standard atmospheric pressure is defined as 0 °C and the boiling point as 100 °C, dividing the interval between these points into 100 equal degrees. In the International System of Units (SI), the degree Celsius (°C) is a derived unit equal in magnitude to the kelvin (K), with 0 °C exactly equivalent to 273.15 K, making it suitable for both absolute thermodynamic measurements and practical applications. Named after Swedish astronomer Anders Celsius (1701–1744), the scale originated from his 1742 proposal presented to the Swedish Royal Academy of Sciences, where he described experiments using mercury thermometers to establish fixed points based on water's phase changes. Celsius's original version inverted the modern convention, assigning 0 °C to water's boiling point and 100 °C to its freezing point; this was reversed shortly after his death in 1744 by colleagues such as Carl Linnaeus, establishing the direct scale still used today. Initially known as the centigrade scale, reflecting its division into 100 grades, it was officially renamed Celsius in 1948 by the International Committee for Weights and Measures, which selected "degree Celsius" from the proposed names to standardize the unit. The scale is the standard for temperature in the SI system and is employed globally in science, industry, and daily life, with water's triple point at 0.01 °C long serving as a precise reference for calibrating thermometers. It facilitates intuitive readings for human-perceived temperatures, such as body temperature around 37 °C, and is used in most countries for weather reporting and everyday use, though the Fahrenheit scale persists in the United States for non-scientific contexts. This widespread adoption underscores Celsius's role in standardizing temperature measurement across disciplines, from weather monitoring to chemical reactions.

Fundamentals

Definition

The degree Celsius, denoted by the symbol °C, is the International System of Units (SI) derived unit for expressing Celsius temperature, a measure of offset from the Kelvin scale. It is defined by the equation t/°C = T/K − 273.15, where t is the Celsius temperature and T is the thermodynamic temperature in kelvins, establishing 0 °C as equivalent to 273.15 K. This definition ensures that absolute zero, the theoretical lowest temperature at which thermal motion ceases, corresponds to exactly −273.15 °C. The interval of 1 °C is equal in magnitude to 1 K, meaning temperature differences are identical on both scales, with the Celsius scale divided into 100 equal parts between the approximate freezing point (0 °C) and boiling point (100 °C) of water at standard atmospheric pressure. Prior to the 2019 SI revision, the scale was anchored to the triple point of water (the temperature at which water's solid, liquid, and gas phases coexist in equilibrium) at exactly 0.01 °C, providing a fixed reference for calibration. Since the 26th General Conference on Weights and Measures in 2018 (effective 20 May 2019), the Celsius scale has been tied to the kelvin through a definition based on fundamental physical constants, independent of water's properties, for greater universality and precision. The kelvin is now defined by fixing the Boltzmann constant k at exactly 1.380649 × 10⁻²³ J/K, where the joule is the SI unit of energy, ensuring the scale's stability without reliance on material artifacts or phase changes. Under this framework, the triple point of water is measured to be approximately 0.01 °C, maintaining practical continuity with prior calibrations. The degree Celsius is the primary unit for everyday and scientific use in nearly all countries worldwide. The United States is the only major country where the Fahrenheit scale predominates for everyday non-scientific purposes; a few other countries and territories use Fahrenheit alongside Celsius.

Scale Properties

The Celsius scale is an interval scale, characterized by its linear and additive nature, where differences remain constant regardless of the absolute level. This stems from its definition as t/°C = T/K − 273.15, where T is the thermodynamic temperature in kelvins, ensuring that equal intervals represent equal changes in temperature across the scale. For instance, the physical magnitude of a 10 °C rise from 15 °C to 25 °C is identical to that from −5 °C to 5 °C, allowing reliable comparisons of variations without dependence on the reference point. A key property of the Celsius scale is the equality of its intervals with those on the Kelvin scale, as the degree Celsius (°C) is by definition equal in magnitude to the kelvin (K). Consequently, the numerical value of any temperature difference or interval is the same when expressed in either unit; for example, a change of 5 °C corresponds exactly to 5 K. This equivalence permits direct addition and subtraction of temperature intervals in computations without requiring unit conversion, which is particularly advantageous in scientific and engineering applications. Negative temperatures on the Celsius scale, denoted by values below 0 °C, represent conditions colder than the freezing point of water under standard atmospheric pressure, equivalent to thermodynamic temperatures below 273.15 K. These readings indicate sub-freezing states, such as those encountered in winter climates or refrigeration processes, where water transitions to ice and substances exhibit reduced molecular motion. The scale's ability to accommodate negative values facilitates the description of such low-temperature phenomena without altering the interval structure. For precise measurements, the Celsius scale employs decimal subdivisions, enabling fine-grained reporting such as 18.7 °C, which aligns with the metric system's decimal framework for consistency and ease of calculation.
This precision is essential in meteorology, where the World Meteorological Organization specifies resolutions of 0.1 °C for air temperature observations to ensure accurate interval comparisons in reports and data. In scientific contexts, the scale's decimal structure supports uncertainties as low as 0.1 °C, promoting standardized and reproducible interval reporting across global research efforts.
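The interval-scale properties described above can be verified numerically; a small Python sketch (the tolerance guards against floating-point rounding):

```python
# Equal intervals are equal anywhere on the scale, and a Celsius
# interval has the same numerical value expressed in kelvins.

def to_kelvin(c: float) -> float:
    return c + 273.15

# The same 10-degree rise, regardless of starting point:
assert (25 - 15) == (5 - (-5)) == 10

# Interval in °C equals interval in K (up to rounding error):
assert abs((to_kelvin(25) - to_kelvin(15)) - 10) < 1e-9

# Ratios of Celsius values are NOT meaningful: 20 °C is not twice
# as hot as 10 °C; on the absolute scale the ratio is ~1.035.
assert round(to_kelvin(20) / to_kelvin(10), 3) == 1.035
```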

Historical Development

Invention and Early Proposals

The development of the Celsius temperature scale built upon earlier efforts to standardize thermometry, particularly the work of Danish astronomer Ole Rømer, who in 1701 proposed one of the first practical scales using a liquid-in-glass thermometer. Rømer set zero at the freezing point of a brine solution, with the boiling point of water at 60 degrees and pure water's freezing at 7.5 degrees, establishing a framework of fixed reference points based on observable physical changes rather than arbitrary markers. This approach influenced subsequent scientists by emphasizing reproducible intervals, though Rømer's scale used different references and a smaller range compared to later centesimal systems. In 1742, Swedish astronomer Anders Celsius presented his proposal to the Swedish Royal Society of Sciences in Uppsala, introducing a centesimal scale divided into 100 equal parts between two fixed points: the boiling point of water at 0 degrees and the freezing point at 100 degrees. Celsius based this on meticulous experiments using a mercury thermometer, demonstrating that the freezing point remained consistent across varying atmospheric pressures and latitudes, while the boiling point varied with pressure but could be standardized to sea-level conditions, thus aiming for an international standard to replace the proliferation of inconsistent scales like those of Réaumur and Delisle. He was particularly influenced by Joseph-Nicolas Delisle's reversed scale, which placed zero at the boiling point of water, and chose the inversion, with higher numbers for colder temperatures, to minimize negative readings in typical conditions, reflecting a practical consideration for scientific measurements. Celsius's scale gained early traction in Sweden following its publication in the paper "Observations of two persistent degrees on a thermometer", with adoption by 1744 through instruments like those crafted by local makers such as Ekström and Strömer, and notably by the botanist Carl Linnaeus for precise environmental recordings.
Celsius himself conducted supporting tests, including measurements of variations in boiling points under different pressures, which aligned closely with modern values (e.g., a decrease of approximately 1 °C for every 300 m increase in altitude) and underscored the scale's reliability for meteorological applications. This initial implementation in Swedish scientific circles laid the groundwork for further refinements, though the inversion persisted until shortly after Celsius's death in 1744.

Naming Conventions and Adoption

The inverted form of the temperature scale, with 0° as the freezing point of water and 100° as the boiling point, was independently proposed in 1743 by French physicist Jean-Pierre Christin and adopted in Sweden in 1744 by the botanist Carl Linnaeus shortly after Anders Celsius's death, establishing the configuration that became the international standard. Originally termed the "centigrade" scale, from the Latin words for "hundred" and "steps", reflecting its division into 100 equal intervals between the reference points, the name led to confusion with the centigrade unit of angular measurement (one hundredth of a grade, or gradian). This ambiguity prompted the 9th General Conference on Weights and Measures (CGPM) to officially rename it the "Celsius" scale in 1948, honoring the original proposer while supporting the SI system's emphasis on clarity. By the 19th century, the centigrade (later Celsius) scale had become the de facto standard for temperature measurement in scientific literature and experiments across Europe, facilitating consistent international collaboration in fields like physics and chemistry. Its adoption accelerated globally through 20th-century metrication efforts, with the United Kingdom formally endorsing the metric system, including Celsius, in 1965 via government policy, leading to widespread implementation in education, industry, and weather reporting by the 1970s. Australia followed suit, with the Bureau of Meteorology switching weather forecasts from Fahrenheit to Celsius on September 1, 1972, as part of broader metric conversion. By the 1980s, Celsius had achieved near-universal integration as the SI unit for temperature in over 190 countries, supporting standardized scientific and trade practices worldwide.
In the United States, Fahrenheit persists alongside Celsius due to entrenched cultural familiarity and the inertia of the customary measurement system, which was inherited from British colonial practices and never fully supplanted despite federal metrication initiatives in the 1970s. Dual usage continues in sectors like health care, where Fahrenheit's finer gradations (180 divisions versus Celsius's 100 between water's phase transitions) aid monitoring of body temperatures, such as the normal range around 98.6 °F.

Notation and Representation

Symbol Usage

The standard symbol for the Celsius temperature unit is °C, consisting of the degree sign (°) immediately followed by a capital letter C with no intervening space, as defined in the International System of Units (SI). This notation reflects the unit's status as the degree Celsius, a special name for a temperature interval equal in magnitude to the kelvin. According to SI recommendations, a space must separate the numerical value from the symbol, as in 23 °C, to ensure clarity in expressing Celsius temperatures. In formal typesetting, the symbol °C is rendered in roman (upright) type, while any associated quantity symbols (such as t for Celsius temperature) are italicized; the degree sign itself is positioned as a raised glyph for proper alignment. In plain text or non-technical contexts, the degree sign may appear at full size rather than raised. Formal SI usage advises against redundancy, such as writing "degrees °C", preferring simply the symbol or the full unit name "degree Celsius" (with lowercase "degree" and uppercase "Celsius"). Contextual applications of the °C symbol vary slightly by domain. In programming and digital encoding, it is often represented using Unicode as the degree sign (U+00B0) followed by "C", or the combined glyph ℃ (U+2103). Engineering drawings and technical specifications adhere to SI spacing rules, placing the symbol after values like 100 °C for boiling-point annotations. Weather reports and meteorological contexts universally employ °C with a space before it, such as 15 °C, to denote ambient temperatures. Common errors in °C usage include employing a lowercase "c" instead of capital "C", inserting a space between the degree sign and "C" (e.g., ° C), or omitting the required space between the number and symbol (e.g., 20°C). Another frequent mistake is confusing the temperature symbol with the angular degree (also °), though context typically distinguishes them; additionally, the obsolete "degree centigrade" should be avoided in favor of "degree Celsius".
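A tiny formatter enforcing the SI spacing rule described above (the function name is ours, not part of any standard library):

```python
# Number, then a space, then the degree sign attached to capital 'C'.
def format_celsius(value: float) -> str:
    return f"{value:g} \u00b0C"   # U+00B0 DEGREE SIGN + 'C'

assert format_celsius(23) == "23 °C"       # not "23°C" or "23 ° C"
assert format_celsius(-40.5) == "-40.5 °C"
```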

Typesetting and Unicode

The Celsius degree symbol is typically represented in Unicode as the combination of the degree sign U+00B0 (°) followed by the Latin capital letter C (U+0043), forming °C. Although a dedicated compatibility character exists at U+2103 (℃), the Unicode Standard recommends using the sequence U+00B0 U+0043 in normal text for better compatibility and semantic clarity, with fonts encouraged to render this ligature-like combination tightly for optimal appearance. The degree sign U+00B0 was introduced in Unicode version 1.1 in June 1993, within the Latin-1 Supplement block. In mathematical and scientific typesetting, the degree sign is rendered as a small superscript circle immediately preceding the 'C', ensuring precise alignment and no space between the elements. For example, in LaTeX this is achieved with the notation ^{\circ}\mathrm{C}, where the degree is raised and the 'C' is in upright roman font to distinguish it as a unit symbol. Kerning adjustments are often applied in professional typography to optically center the superscript degree relative to the 'C', compensating for the degree's smaller width and preventing visual gaps. Legacy systems lacking full Unicode support may approximate the symbol using ASCII text such as "deg C", which avoids encoding issues but sacrifices precision. In HTML, the degree sign is commonly inserted via the named entity &deg; or the numeric entity &#176;, followed by 'C', as in &deg;C, ensuring consistent rendering on the web. Subsequent Unicode versions, such as 4.0 (2003) and later, have improved font and rendering support for these characters through enhanced normalization and compatibility decompositions, facilitating consistent display across platforms.

Conversions and Equivalences

Relation to Kelvin Scale

The Celsius and Kelvin scales are directly linked through the International System of Units (SI), where the kelvin (K) serves as the base unit for thermodynamic temperature and the degree Celsius (°C) is a derived unit equal in magnitude to the kelvin but offset for practical reference. The conversion between the two is given by the formula T/K = t/°C + 273.15, where T is the thermodynamic temperature in kelvins and t is the Celsius temperature; this relation holds exactly, as the numerical value of a temperature interval is identical in both units. In physics and engineering, adding 273.15 to a Celsius value shifts measurements to the absolute scale without altering the size of temperature intervals, enabling applications like thermodynamic calculations where negative temperatures are invalid. Prior to the 2019 SI redefinition, the kelvin was defined as exactly 1/273.16 of the thermodynamic temperature of the triple point of water (assigned 273.16 K, corresponding to 0.01 °C), which established the 273.15 K offset between 0 °C and absolute zero through the ice point reference. The 2019 revision, effective 20 May, fixed the Boltzmann constant at exactly 1.380649 × 10⁻²³ J/K to define the kelvin, decoupling it from the water triple point for greater precision and universality while preserving the exact Celsius-Kelvin relation. In scientific practice, the kelvin is preferred for absolute thermodynamic quantities, such as in the ideal gas law PV = nRT, where temperature must be on an absolute scale to ensure physical consistency. Conversely, Celsius is commonly used for temperature differences and everyday measurements, such as weather reporting, due to its alignment with familiar points like water's freezing and boiling. This coexistence supports versatile applications across disciplines, from chemistry (reaction kinetics) to meteorology (climate data).
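A worked example of why the absolute scale is required in PV = nRT; the values below are the usual textbook constants, and the result is approximate:

```python
# Molar volume of an ideal gas at 25 °C and one standard atmosphere.
R = 8.314        # J/(mol K), molar gas constant (rounded)
P = 101_325.0    # Pa, one standard atmosphere
n = 1.0          # mol

t_celsius = 25.0
T = t_celsius + 273.15   # Celsius must be shifted to kelvins first

V = n * R * T / P        # volume in cubic metres
assert round(V * 1000, 1) == 24.5   # about 24.5 L

# Plugging 25 (the raw Celsius number) in for T instead would give
# a physically meaningless volume roughly 12x too small.
```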

Relation to Fahrenheit Scale

The Fahrenheit scale, proposed by Daniel Gabriel Fahrenheit in 1724, differs from the Celsius scale in both its reference points and interval size, leading to conversions that must account for the offset in reference points and the difference in interval sizes. Fahrenheit initially defined 0 °F as the freezing point of a brine mixture of equal parts ice, water, and ammonium chloride (or salt in some accounts), and later set 32 °F as the freezing point of pure water and 96 °F as the approximate normal human body temperature, with the boiling point of water at 212 °F; this created 180 divisions between freezing and boiling, compared to the 100 divisions in Celsius. These arbitrary zero points, rooted in Fahrenheit's early thermometry experiments rather than a universal standard like water's phase changes, necessitate an offset in conversions, unlike the direct interval scaling between Celsius and Kelvin. The standard conversion formulas account for this offset and the interval ratio. To convert from Fahrenheit to Celsius: t(°C) = (t(°F) − 32) / 1.8 = (t(°F) − 32) × 5/9. This derives from aligning the scales: subtracting 32 °F shifts the Fahrenheit zero to match Celsius's 0 °C (water's freezing point), and dividing by 1.8 (or multiplying by 5/9) adjusts for the finer Fahrenheit intervals, as 180 °F spans the same range as 100 °C. Conversely, to convert from Celsius to Fahrenheit: t(°F) = t(°C) × 1.8 + 32 = t(°C) × 9/5 + 32. Here, multiplying by 9/5 expands the smaller Celsius intervals to Fahrenheit's scale, and adding 32 realigns the zero point. For temperature differences (intervals), the conversion is linear and scale-invariant: a change of 1 °C equals a change of 1.8 °F, so differences in Celsius are simply multiplied by 9/5 to obtain Fahrenheit equivalents, without the offset.
In the United States, where Fahrenheit remains the predominant scale for everyday use, many weather applications and forecasts display both scales simultaneously to assist users familiar with Celsius, such as in scientific contexts or for international audiences. A common mental approximation for converting Fahrenheit to Celsius is to subtract 30 from the Fahrenheit value and divide by 2, providing reasonable accuracy for typical room and weather temperatures (e.g., 86 °F ≈ (86 - 30)/2 = 28 °C, close to the exact 30 °C).
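The exact formulas and the mental shortcut above can be sketched in Python (a minimal illustration; the function names are ours):

```python
def fahrenheit_to_celsius(t_f: float) -> float:
    """Exact conversion: remove the 32 °F offset, then rescale by 5/9."""
    return (t_f - 32) * 5 / 9

def celsius_to_fahrenheit(t_c: float) -> float:
    """Exact inverse: rescale by 9/5, then restore the 32 °F offset."""
    return t_c * 9 / 5 + 32

def approx_fahrenheit_to_celsius(t_f: float) -> float:
    """Mental shortcut from the text: subtract 30, then halve."""
    return (t_f - 30) / 2

# 86 °F: exact conversion gives 30 °C; the shortcut gives 28 °C.
```

The shortcut trades the 32-offset and the 5/9 factor for round numbers, so its error grows away from everyday temperatures; near 10 °C (50 °F) it happens to be exact.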

Key Reference Points

Water's Phase Transitions

The Celsius scale was historically defined using the phase transitions of water as fixed reference points, with the freezing point of pure water set at 0 °C and the boiling point at 100 °C under a standard atmospheric pressure of 1 atm (101.325 kPa). These points anchored the scale, dividing the interval between them into 100 equal degrees. Precise measurements show, however, that the boiling point of pure water (using Vienna Standard Mean Ocean Water, VSMOW) at exactly 1 atm is 99.9839 °C, very close to the nominal 100 °C used for calibration.

In the modern SI system, adopted in 2019, these water-based points are secondary: the Celsius scale is now derived directly from the Kelvin scale through fixed physical constants, and the freezing point of water remains very close to 0 °C as a measured property rather than a defining one. Before 2019, the most precise reference for the scale was the triple point of water, where solid, liquid, and vapor phases coexist in equilibrium at 0.01 °C and a pressure of 611.657 Pa. This point served as the primary reproducible standard for calibrating thermometers, offering higher accuracy than the freezing or boiling points, which can be affected by slight variations in pressure or purity. The triple point temperature was set at exactly 273.16 K on the Kelvin scale, linking Celsius directly to absolute temperature measurements.

The sublimation curve of water, representing the transition from ice to vapor without melting, extends downward toward absolute zero at -273.15 °C, where molecular motion theoretically ceases, though practical sublimation rates diminish exponentially at low temperatures. Environmental factors also shift these phase transitions away from their idealized values.
Impurities, such as salts in seawater with a salinity of about 35 parts per thousand, depress the freezing point through colligative properties, lowering it to approximately -1.8 °C at 1 atm, which prevents surface oceans from freezing solid in polar regions. Similarly, the boiling point decreases with altitude due to reduced atmospheric pressure; at 2000 m, where pressure is roughly 0.8 atm, water boils at about 93 °C, affecting cooking and sterilization processes in high-elevation areas. These variations highlight the scale's reliance on standardized conditions for consistency.
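The altitude effect on the boiling point can be estimated with the Clausius–Clapeyron relation. This is a rough sketch under assumed constants (a fixed molar enthalpy of vaporization of about 40 660 J/mol, which is only a fair approximation near 100 °C), not a reference calculation:

```python
import math

R = 8.314          # J/(mol·K), molar gas constant
DH_VAP = 40660.0   # J/mol, enthalpy of vaporization of water (assumed constant)
T0, P0 = 373.15, 101325.0  # nominal boiling point (K) at standard pressure (Pa)

def boiling_point_celsius(pressure_pa: float) -> float:
    """Estimate water's boiling point (°C) at a given ambient pressure."""
    # Integrated Clausius–Clapeyron: 1/T = 1/T0 - (R/ΔHvap) * ln(P/P0)
    inv_t = 1.0 / T0 - (R / DH_VAP) * math.log(pressure_pa / P0)
    return 1.0 / inv_t - 273.15

# At roughly 0.8 atm (about 2000 m elevation) the estimate lands near the
# "about 93 °C" figure quoted in the text.
```

The constant-enthalpy assumption breaks down far from 100 °C, but over the pressure range of habitable altitudes the estimate tracks tabulated values to within about a degree.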

Common Everyday Temperatures

In human physiology, the normal core body temperature is about 37.0 °C, which maintains optimal metabolic function and enzyme activity. A fever, indicating an immune response to infection or inflammation, is typically defined as a temperature exceeding 38 °C. Conversely, hypothermia sets in when core temperature falls below 35 °C, impairing bodily functions and requiring immediate medical intervention.

Everyday environmental temperatures on the Celsius scale span a wide range relevant to comfort and activity. Room temperature, often considered the comfort zone for indoor living and working, falls between 20–25 °C, balancing comfort with energy efficiency in occupied spaces. Boiling water for cooking or sterilization reaches 100 °C at standard atmospheric pressure, a fixed reference point for heat-based processes. At the theoretical extreme, absolute zero represents the coldest possible temperature at -273.15 °C, where molecular motion ceases, foundational to thermodynamics and cryogenics.

Weather extremes highlight the Celsius scale's utility in meteorology. The hottest surface air temperature officially recorded by the WMO is 56.7 °C in Death Valley, California, on July 10, 1913, though recent 2025 studies have questioned its accuracy; it illustrates hyper-arid conditions that challenge human survival. The coldest verified reading is -89.2 °C at Vostok Station in Antarctica on July 21, 1983, demonstrating the radiative cooling of the polar night in a high-elevation environment. For broader context, Earth's average global surface temperature is approximately 15 °C, serving as a baseline for climate monitoring and variability assessments.

In food preparation and storage, the Celsius scale standardizes processes for safety and quality. Household freezers are set to -18 °C to inhibit microbial growth and preserve perishables long-term. Typical baking in ovens occurs at 180 °C, allowing even heat distribution for items like cakes and breads without excessive drying or burning. These values, roughly equivalent to 0 °F for freezing and 350 °F for baking, underscore the scale's practicality in daily applications.
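The Fahrenheit equivalences cited above are easy to verify directly (a minimal sketch; the benchmark labels are illustrative):

```python
def c_to_f(t_c: float) -> float:
    """Standard Celsius-to-Fahrenheit conversion."""
    return t_c * 9 / 5 + 32

# Everyday Celsius benchmarks from the text, with Fahrenheit equivalents
benchmarks = {
    "normal body temperature": 37.0,
    "fever threshold": 38.0,
    "household freezer": -18.0,
    "typical baking oven": 180.0,
}
for name, t_c in benchmarks.items():
    print(f"{name}: {t_c} °C ≈ {c_to_f(t_c):.1f} °F")
# -18 °C comes out near 0 °F, and 180 °C comes out at 356 °F,
# which cooks commonly round to 350 °F.
```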

Temperature Intervals

Measuring Differences

Temperature intervals on the Celsius scale are calculated by direct subtraction of two readings, expressed as Δt = t₂ − t₁ in degrees Celsius (°C); the interval size is identical to that on the Kelvin scale, since one Celsius degree equals one kelvin. This method focuses on changes rather than absolute values, making it suitable for quantifying thermal variations without reference to the scale's zero point.

These intervals carry direct physical meaning, representing proportional changes in thermal energy; for instance, a 1 °C temperature rise in water corresponds to an energy input of approximately 4.184 J per gram, based on its specific heat capacity under standard conditions. In applications like thermal expansion, the change in length of a material is given by ΔL = α L Δt, where α is the linear expansion coefficient and Δt is the Celsius interval, illustrating how such differences drive dimensional adjustments in engineering contexts. In climate analysis, Celsius intervals are essential for tracking trends, such as annual warming rates or diurnal variations, providing a basis for assessing environmental impacts without scale-dependent biases.

Representative examples include typical daily air temperature fluctuations of 10–15 °C in many mid-latitude regions, reflecting the diurnal cycle driven by solar heating, as observed in global meteorological datasets. In human physiology, a fever often involves a rise of about 2 °C above the normal body temperature of approximately 37 °C, signaling an immune response without implying proximity to absolute limits. These differences are reference-independent, avoiding complications from absolute zero, as the interval magnitude depends solely on the scale's uniformity rather than its origin. When reporting Celsius intervals, precision should align with the measuring instrument's accuracy to ensure reliability; in laboratory settings, this often means resolutions of ±0.1 °C for high-precision thermometers calibrated against traceable standards.
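The two interval-based relations above can be sketched as small helpers. Water's specific heat of 4.184 J/(g·°C) comes from the text; the expansion coefficient default of 23 × 10⁻⁶ per °C is an assumed illustrative value (roughly that of aluminium):

```python
def heating_energy_j(mass_g: float, delta_t_c: float, c_p: float = 4.184) -> float:
    """Energy (J) to warm `mass_g` grams of water by `delta_t_c` °C (Q = m·c·Δt)."""
    return mass_g * c_p * delta_t_c

def linear_expansion_m(length_m: float, delta_t_c: float,
                       alpha: float = 23e-6) -> float:
    """Length change ΔL = α · L · Δt for a bar with expansion coefficient
    `alpha` (per °C); the default alpha is an assumed aluminium-like value."""
    return alpha * length_m * delta_t_c

# Because one Celsius degree equals one kelvin, the same Δt value can be
# used whether the readings were taken in °C or K.
```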

Practical Applications of Intervals

In climate science, Celsius intervals are essential for quantifying and communicating temperature anomalies and trends, such as the observed warming of approximately 1.5 °C above pre-industrial levels (1850–1900): according to the World Meteorological Organization (WMO), 2024 was the warmest year on record and the first in which the annual average exceeded 1.5 °C above that baseline. This interval-based reporting allows scientists to track changes like the accelerated warming rate of about 0.2 °C per decade since 1970, facilitating comparisons across datasets and regions without reliance on absolute values. Projections under high-emissions scenarios indicate potential global warming of up to 5 °C by the end of the century, underscoring the scale of climate impacts in long-term projections and seasonal outlooks.

In engineering, Celsius intervals guide the design and testing of systems under thermal stress, particularly in assessing durability through controlled temperature differentials. For example, in aerospace and automotive applications, thermal-cycling tests often sweep materials over 40 °C intervals, such as from 20 °C to 60 °C, to evaluate creep and fatigue, ensuring compliance with standards like those from the American Society for Testing and Materials (ASTM). In heating, ventilation, and air conditioning (HVAC) systems, engineers target comfort differentials of around 10–15 °C between indoor conditions (typically 20–24 °C) and outdoor conditions to optimize energy efficiency and occupant well-being, as outlined in ASHRAE Standard 55 for thermal environmental conditions.

Medical applications leverage Celsius intervals for precise monitoring and therapeutic control of body temperature variations. In hyperthermia treatments for cancer, clinicians induce controlled rises of about 1 °C per hour to reach tumor temperatures of 40–43 °C, enhancing radiotherapy efficacy while minimizing risk to healthy tissue, as demonstrated in established clinical protocols.
Similarly, in cooking and food science, intervals inform time adjustments; as a rule of thumb, a 10 °C increase in oven temperature can roughly halve the cooking duration needed for equivalent doneness.

In climate policy, Celsius intervals define critical thresholds for international agreements, emphasizing differences from baseline periods rather than fixed points. The Paris Agreement, adopted under the United Nations Framework Convention on Climate Change, commits nations to limit global warming to well below 2 °C above pre-industrial levels, with efforts to cap it at 1.5 °C, using these intervals to frame emission-reduction targets and strategies. This approach enables projections of interval-based impacts, such as a 2 °C threshold triggering irreversible sea-level rise, informing policy decisions on mitigation and adaptation without absolute temperature references.
