Unit of measurement
from Wikipedia

The former Weights and Measures office in Seven Sisters, London
Units of measurement, Palazzo della Ragione, Padua

A unit of measurement, or unit of measure, is a definite magnitude of a quantity, defined and adopted by convention or by law, that is used as a standard for measurement of the same kind of quantity.[1] Any other quantity of that kind can be expressed as a multiple of the unit of measurement.[2]

For example, a length is a physical quantity. The metre (symbol m) is a unit of length that represents a definite predetermined length. For instance, when referencing "10 metres" (or 10 m), what is actually meant is 10 times the definite predetermined length called "metre".

The definition, agreement, and practical use of units of measurement have played a crucial role in human endeavour from early ages up to the present. A multitude of systems of units used to be very common. Now there is a global standard, the International System of Units (SI), the modern form of the metric system.

In trade, weights and measures are often a subject of governmental regulation, to ensure fairness and transparency. The International Bureau of Weights and Measures (BIPM) is tasked with ensuring worldwide uniformity of measurements and their traceability to the International System of Units (SI).[3]

Metrology is the science of developing nationally and internationally accepted units of measurement.[4]

In physics and metrology, units are standards for measurement of physical quantities that need clear definitions to be useful. Reproducibility of experimental results is central to the scientific method. A standard system of units facilitates this. Scientific systems of units are a refinement of the concept of weights and measures historically developed for commercial purposes.[5]

Science, medicine, and engineering often use larger and smaller units of measurement than those used in everyday life. The judicious selection of the units of measurement can aid researchers in problem solving (see, for example, dimensional analysis).

History


A unit of measurement is a standardized quantity of a physical property, used as a factor to express occurring quantities of that property. Units of measurement were among the earliest tools invented by humans. Primitive societies needed rudimentary measures for many tasks: constructing dwellings of an appropriate size and shape, fashioning clothing, or bartering food or raw materials. Before the establishment of the decimal metric system in France during the French Revolution in the late 18th century,[6] many units of length were based on parts of the human body.[7]

The earliest known uniform systems of measurement seem to have all been created sometime in the 4th and 3rd millennia BC among the ancient peoples of Mesopotamia, Egypt and the Indus Valley, and perhaps also Elam in Persia.

Weights and measures are mentioned in the Bible (Leviticus 19:35–36). It is a commandment to be honest and have fair measures.

In the Magna Carta of 1215 (the Great Charter), put before him by the barons of England and bearing his seal, King John agreed in Clause 35: "There shall be one measure of wine throughout our whole realm, and one measure of ale and one measure of corn—namely, the London quart;—and one width of dyed and russet and hauberk cloths—namely, two ells below the selvage..."

As of the 21st century, the International System of Units is used predominantly throughout the world. Other unit systems remain in use in many places, such as the United States customary system and the imperial system. The United States is the only industrialized country that has not yet at least mostly converted to the metric system.[8] The systematic effort to develop a universally accepted system of units dates back to 1790, when the French National Assembly charged the French Academy of Sciences with devising such a system. This system was the precursor to the metric system, which was developed quickly in France but did not gain widespread acceptance until 1875, when the Metre Convention was signed by 17 nations. The treaty also established the General Conference on Weights and Measures (CGPM), which produced the current SI, building on base units adopted in 1954 at the 10th CGPM and formally establishing the system in 1960 at the 11th CGPM. Currently, the United States is a dual-system society which uses both the SI and the US customary system.[9][10]

Systems of units


The use of a single unit of measurement for some quantity has obvious drawbacks. For example, it is impractical to use the same unit for the distance between two cities and the length of a needle. Thus, historically such units developed independently. One way to make large numbers or small fractions easier to read is to use unit prefixes.
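
As a rough illustration of how prefixes act as power-of-ten multipliers, the short Python sketch below rescales a value so it prints with a convenient prefix; the helper name with_prefix and the prefix table are illustrative, not part of any standard library.

    # Illustrative sketch: pick an SI prefix so the numeric part stays readable.
    _PREFIXES = [(1e9, "G"), (1e6, "M"), (1e3, "k"), (1.0, ""),
                 (1e-3, "m"), (1e-6, "µ"), (1e-9, "n")]

    def with_prefix(value, unit):
        for factor, prefix in _PREFIXES:
            if abs(value) >= factor:
                return f"{value / factor:g} {prefix}{unit}"
        factor, prefix = _PREFIXES[-1]          # fall back to the smallest prefix
        return f"{value / factor:g} {prefix}{unit}"

    print(with_prefix(0.000012, "m"))   # 12 µm
    print(with_prefix(4500.0, "g"))     # 4.5 kg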

At some point in time though, the need to relate the two units might arise, and consequently the need to choose one unit as defining the other or vice versa. For example, an inch could be defined in terms of a barleycorn. A system of measurement is a collection of units of measurement and rules relating them to each other.

As science progressed, a need arose to relate the measurement systems of different quantities, such as length, weight and volume. The effort to relate different traditional systems to one another exposed many inconsistencies and brought about the development of new units and systems.

Systems of units vary from country to country. Some of the different systems include the centimetre–gram–second, foot–pound–second, metre–kilogram–second systems, and the International System of Units, SI. Among the different systems of units used in the world, the most widely used and internationally accepted one is SI. The base SI units are the second, metre, kilogram, ampere, kelvin, mole and candela; all other SI units are derived from these base units.[11][12]: 132 

Systems of measurement in modern use include the metric system, the imperial system, and United States customary units.

Traditional systems


Historically, many of the systems of measurement which had been in use were to some extent based on the dimensions of the human body. Such units, which may be called anthropic units, include the cubit, based on the length of the forearm; the pace, based on the length of a stride; and the foot and hand.[13]: 25  As a result, units of measure could vary not only from location to location but from person to person. Units not based on the human body could be based on agriculture, as is the case with the furlong and the acre, both based on the amount of land able to be worked by a team of oxen. Another measure of land used in the Domesday Book was the "carucate".[14]

Metric systems


Metric systems of units have evolved since the adoption of the original metric system in France in 1791. The current international standard metric system is the International System of Units (abbreviated to SI). An important feature of modern systems is standardization. Each unit has a universally recognized size.

An example of metrication in 1860, when Tuscany became part of modern Italy (e.g. one "libbra" = 339.54 grams)

Both the imperial units and US customary units derive from earlier English units. Imperial units were mostly used in the British Commonwealth and the former British Empire. US customary units are still the main system of measurement used in the United States outside of science, medicine, many sectors of industry, and some areas of government and the military, despite Congress having legally authorised metric measure on 28 July 1866.[15] Some steps towards US metrication have been made, particularly the redefinition of basic US and imperial units to derive exactly from SI units. Since the international yard and pound agreement of 1959, the US and imperial inch has been defined as exactly 0.0254 m, and the US and imperial avoirdupois pound as exactly 0.45359237 kg.[16]

Natural systems


While the above systems of units are based on arbitrary unit values, formalised as standards, natural units in physics are based on physical principle or are selected to make physical equations easier to work with. For example, atomic units (au) were designed to simplify the wave equation in atomic physics.[17]

Some unusual and non-standard units may be encountered in sciences. These may include the solar mass (2×10³⁰ kg), the megaton (the energy released by detonating one million tons of trinitrotoluene, TNT) and the electronvolt.

Legal control of weights and measures

To reduce the incidence of retail fraud, many national statutes have standard definitions of weights and measures that may be used (hence "statute measure"), and these are verified by legal officers.[citation needed]

Informal comparison to familiar concepts


In informal settings, a quantity may be described as multiples of that of a familiar entity, which can be easier to contextualize than a value in a formal unit system. For instance, a publication may describe an area in a foreign country as a number of multiples of the area of a region local to the readership. The propensity for certain concepts to be used frequently can give rise to loosely defined "systems" of units.[18][19]

Base and derived units


For most quantities a unit is necessary to communicate values of that physical quantity. For example, conveying to someone a particular length without using some sort of unit is impossible, because a length cannot be described without a reference used to make sense of the value given.

But not all quantities require a unit of their own. Using physical laws, units of quantities can be expressed as combinations of units of other quantities. Thus only a small set of units is required. These units are taken as the base units and the others are derived units. Base units are therefore the units of quantities that are independent of other quantities: the units of length, mass, time, electric current, temperature, luminous intensity and amount of substance. Derived units are the units of quantities that are derived from the base quantities; examples include the units of speed, work, acceleration, energy and pressure.[11]

Different systems of units are based on different choices of a set of related units including fundamental and derived units.

Physical quantity components


Following ISO 80000-1,[20] any value or magnitude of a physical quantity is expressed as a comparison to a unit of that quantity. The value of a physical quantity Z is expressed as the product of a numerical value {Z} (a pure number) and a unit [Z]:

Z = {Z} × [Z]

For example, let Z be "2 metres"; then {Z} = 2 is the numerical value and [Z] = metre is the unit. Conversely, the numerical value expressed in an arbitrary unit can be obtained as:

{Z} = Z / [Z]

The multiplication sign is usually left out, just as it is left out between variables in the scientific notation of formulas. The convention used to express quantities is referred to as quantity calculus. In formulas, the unit [Z] can be treated as if it were a specific magnitude of a kind of physical dimension: see Dimensional analysis for more on this treatment.

Dimensional homogeneity


Units can only be added or subtracted if they are of the same kind; however, units can always be multiplied or divided, as George Gamow used to explain. Let Z₁ be "2 metres" and Z₂ be "3 seconds"; then

Z₁ × Z₂ = (2 m) × (3 s) = 6 m⋅s.

There are certain rules that apply to units:

  • Only like terms may be added. When a unit is divided by itself, the division yields a dimensionless number (one). When two different units are multiplied or divided, the result is a new unit, referred to by the combination of the units. For instance, in SI, the unit of speed is metre per second (m/s). See dimensional analysis. A unit can be multiplied by itself, creating a unit with an exponent (e.g. m²/s²). Put simply, units obey the laws of indices. (See Exponentiation.)
  • Some units have special names; however, these should be treated like their equivalents. For example, one newton (N) is equivalent to 1 kg⋅m/s². Thus a quantity may have several unit designations; for example, the unit for surface tension can be referred to as either N/m (newton per metre) or kg/s² (kilogram per second squared). A minimal code sketch of these rules follows this list.
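
The following Python sketch is a hedged illustration of the two rules above rather than any established library: addition requires matching units, while multiplication combines unit exponents (so metres per second times seconds leaves metres). The Quantity class and its unit-tuple representation are assumptions made for the example.

    # Illustrative sketch of the rules above: like terms add, units multiply.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Quantity:
        value: float
        units: tuple          # e.g. (("m", 1), ("s", -1)) means m/s

        def __add__(self, other):
            if dict(self.units) != dict(other.units):
                raise ValueError("only like terms may be added")
            return Quantity(self.value + other.value, self.units)

        def __mul__(self, other):
            exponents = dict(self.units)
            for symbol, power in other.units:
                exponents[symbol] = exponents.get(symbol, 0) + power
            new_units = tuple((s, p) for s, p in exponents.items() if p != 0)
            return Quantity(self.value * other.value, new_units)

    speed = Quantity(3.0, (("m", 1), ("s", -1)))     # 3 m/s
    duration = Quantity(2.0, (("s", 1),))            # 2 s
    print(speed * duration)    # Quantity(value=6.0, units=(('m', 1),)) -- 6 m
    # speed + duration         # would raise ValueError: only like terms may be added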

Converting units of measurement


Conversion of units is the conversion of the unit of measurement in which a quantity is expressed, typically through a multiplicative conversion factor that changes the unit without changing the quantity. This is also often loosely taken to include replacement of a quantity with a corresponding quantity that describes the same physical property.

Unit conversion is often easier within a metric system such as the SI than in others, due to the system's coherence and its metric prefixes that act as power-of-10 multipliers.
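
A minimal Python sketch of such a multiplicative conversion, assuming a small hand-written table of factors relative to the metre; the TO_METRE table and the convert helper are illustrative, not a real library API.

    # Illustrative sketch: convert by multiplying with a ratio of factors
    # expressed relative to a common base unit (here, the metre).
    TO_METRE = {"m": 1.0, "km": 1000.0, "cm": 0.01, "in": 0.0254, "ft": 0.3048}

    def convert(value, src, dst):
        # The quantity is unchanged; only its numerical value and unit change.
        return value * TO_METRE[src] / TO_METRE[dst]

    print(convert(10, "in", "cm"))   # 25.4 (up to floating-point rounding)
    print(convert(5, "km", "m"))     # 5000.0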

In software development


Software developers in a wide variety of fields including scientific, healthcare and financial applications have sought to adopt approaches that reduce bugs and mistakes involving units of measurement. In object-oriented programming, this is often achieved using the Quantity pattern to pair together the value and the unit.[21] (In financial applications, it is common to represent monetary values by storing them with the currency—this is often known as the 'Money pattern'.)[22] The programming language F# has syntactic support for representing units of measure, converting between them, and checking their type safety at compile-time.[23]
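
As a hedged illustration of the "Money pattern" mentioned above (the class below is a sketch written for this article, not the F# feature or any particular library), a monetary value can be stored together with its currency so that mixing currencies without an explicit conversion fails loudly:

    # Illustrative sketch of the Money pattern: amount and currency travel together.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Money:
        cents: int           # store minor units to avoid floating-point drift
        currency: str

        def __add__(self, other):
            if self.currency != other.currency:
                raise ValueError(f"cannot add {other.currency} to {self.currency}")
            return Money(self.cents + other.cents, self.currency)

    price = Money(1999, "EUR")
    shipping = Money(450, "EUR")
    print(price + shipping)          # Money(cents=2449, currency='EUR')
    # Money(500, "USD") + price      # would raise ValueError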

Real-world implications


One example of the importance of agreed units is the failure of the NASA Mars Climate Orbiter, which was accidentally destroyed on a mission to Mars in September 1999 (instead of entering orbit) due to miscommunications about the value of forces: different computer programs used different units of measurement (newton versus pound force). Considerable amounts of effort, time, and money were wasted.[24][25]
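
For scale, using the exact pound definition given earlier and the conventional standard gravity of 9.80665 m/s², the two force units differ substantially:

    1 lbf = 0.45359237 kg × 9.80665 m/s² ≈ 4.448 N

so force values computed in one unit but read in the other are off by a factor of about 4.45.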

On 15 April 1999, Korean Air cargo flight 6316 from Shanghai to Seoul was lost due to the crew confusing tower instructions (in metres) and altimeter readings (in feet). Three crew and five people on the ground were killed. Thirty-seven were injured.[26][27]

In 1983, a Boeing 767 (which thanks to its pilot's gliding skills landed safely and became known as the Gimli Glider) ran out of fuel in mid-flight because of two mistakes in figuring the fuel supply of Air Canada's first aircraft to use metric measurements.[28] This accident was the result of both confusion due to the simultaneous use of metric and Imperial measures and confusion of mass and volume measures.

When planning his journey across the Atlantic Ocean in the 1480s, Columbus mistakenly assumed that the mile referred to in the Arabic estimate of 56⅔ miles for the size of a degree was the same as the Italian mile of 1,480 metres, which is in fact much shorter than the Arabic mile. His estimate for the size of the degree and for the circumference of the Earth was therefore about 25% too small.[29]
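
A quick check of the 25% figure, using only the numbers above and the modern value of roughly 40,000 km for the Earth's circumference:

    56⅔ miles/degree × 1,480 m/mile ≈ 83.9 km per degree
    83.9 km/degree × 360 degrees ≈ 30,200 km

which is roughly three-quarters of the true circumference, i.e. about 25% too small.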

from Grokipedia
A unit of measurement is a particular physical quantity, defined and adopted by convention, with which other particular quantities of the same kind are compared to express their value. These units provide standardized references for quantifying attributes such as length, mass, time, and other physical properties, enabling precise communication, scientific inquiry, and consistent trade practices across societies. The International System of Units (SI), established in 1960 and revised in 2019, serves as the contemporary global standard, comprising seven base units—metre for length, kilogram for mass, second for time, ampere for electric current, kelvin for thermodynamic temperature, mole for amount of substance, and candela for luminous intensity—defined invariantly via fundamental constants like the speed of light and Planck's constant to ensure reproducibility independent of artifacts or environmental factors. Derived units, such as the newton for force or joule for energy, follow coherently from base units through multiplication and division without additional numerical factors. Historically, units emerged in ancient civilizations from practical needs like agriculture and construction, relying on variable human-scale references such as the cubit or foot, which prompted ongoing efforts toward uniformity culminating in the metric system's inception during the French Revolution for decimal-based rationality and the SI's international adoption to facilitate empirical precision in physics and engineering. While the SI predominates worldwide, non-metric systems like the imperial persist in select contexts, underscoring challenges in full global standardization despite evident advantages in interoperability and calculability.

Definition and Fundamentals

Core Concept and Role in Quantification

A unit of measurement constitutes a specific, conventionally defined magnitude of a physical quantity, serving as the reference standard against which other magnitudes of the same quantity are compared to express their numerical value. This definition ensures that measurements yield reproducible results grounded in observable, empirical phenomena rather than subjective estimation. Physical quantities, such as length or mass, represent invariant properties of the universe describable through causal interactions, while units provide the scalar framework to quantify them without inherent variability. Though arbitrary in their chosen magnitude, units achieve universality through rigorous standardization, often linking to fundamental physical constants for stability and invariance. For instance, the metre is defined via the fixed speed of light in vacuum at 299792458 m/s, and the kilogram via the Planck constant at 6.62607015 × 10^{-34} kg⋅m²⋅s^{-1}, thereby rooting quantification in first principles of relativity and quantum mechanics. This approach minimizes drift from material artifacts, enhancing traceability to empirical invariants like electromagnetic propagation or energy quantization. In quantification, units enable the unambiguous encoding of causal relationships, allowing scientists and engineers to formulate predictive models, test hypotheses via numerical consistency, and replicate outcomes across independent experiments. Without such standardized scales, comparisons of magnitudes would devolve into ambiguity, impeding advancements in fields reliant on precise measurement, from gravitational dynamics to thermodynamic efficiency. By distinguishing the unit (the measure) from the quantity (the phenomenon), this framework supports causal realism, where quantified invariants underpin verifiable predictions rather than interpretive narratives.
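
Stated as inverted relations (a restatement of the definitions above, not an additional claim), and taking the second as already fixed by the caesium transition frequency, these constants pin down the units themselves:

    1 m  = the distance light travels in vacuum in 1/299 792 458 s, i.e. 1 m = c × (1/299 792 458) s
    1 kg = h / (6.62607015 × 10^{-34} m² s^{-1}), once the metre and second are defined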

Relation to Physical Quantities

Physical quantities form the basis for units of measurement, categorized into fundamental (or base) quantities that are independent and not definable in terms of others, such as length, mass, and time, and derived quantities that arise from mathematical combinations of these bases, exemplified by velocity as displacement divided by time or force as mass times acceleration. These distinctions ensure that all measurable aspects of the physical world can be systematically quantified, with base quantities selected for their empirical independence and role in expressing natural laws through reproducible operations. The empirical foundation of these quantities relies on direct observation and comparison to standards derived from invariant natural phenomena, such as defining length via the path traveled by light in vacuum or mass through gravitational interaction with prototypes calibrated against atomic properties. This grounding enables precise, intersubjectively verifiable measurements, as operations like timing periodic events or balancing forces yield ratios that hold across contexts, contrasting with quantities lacking such physical realizability. In metrology, this structure underscores the superiority of physical quantities for scientific rigor, as derived metrics like energy—computed as force times distance—inherit the reproducibility of bases, whereas non-physical metrics in fields like social science often depend on subjective indices or proxies without equivalent causal invariance, rendering them vulnerable to interpretive variability and reduced falsifiability. Such distinctions highlight why units tied to physical quantities facilitate predictive models in physics, while alerting to limitations in applying measurement paradigms beyond empirically anchored domains.

Historical Development

Ancient and Empirical Origins

The ancient Egyptian royal cubit, standardized circa 3000 BCE at approximately 524 mm, originated from the practical length of the forearm from elbow to the tip of the middle finger, etched on durable granite rods to facilitate precise construction of monuments like pyramids and obelisks, as well as agricultural and trade apportionments scaled to human bodily proportions. In Mesopotamia, Sumerian and Babylonian systems similarly drew from empirical references, employing a cubit of about 530 mm divided into 30 finger breadths (kus) for building and surveying, while adopting a sexagesimal (base-60) numeral framework from the third millennium BCE to partition time into 60 subunits per hour and angles into 360 degrees per circle, leveraging 60's multiple divisors (1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30) for efficient fractional divisions in astronomy and commerce without centralized decree. Roman measurement units extended these body-centric origins into codified scales suited to engineering and legions, defining the pes (foot) as the base linear unit subdivided into 12 unciae (inches), with the passus (pace) encompassing five pedes to mirror average marching strides, aggregating to the mille passus (mile) of 1,000 paces for road-building and territorial mapping, thereby anchoring quantification to human locomotion and manual labor. Medieval European systems inherited Roman foundations but devolved into fragmented locales, where units like the foot or ell varied by town—often debased from classical precedents to reflect regional body averages or tools—yielding inconsistencies such as differing perch lengths for land division, which sparked trade frictions as varying bushels or yards across markets undermined equitable exchange, prompting ad hoc local prototypes in agoras or guilds to mitigate disputes absent overarching authority.

Transition to Systematic Standards

During the late 18th century, Enlightenment thinkers and revolutionary governments sought to supplant localized, arbitrary units with systems derived from invariant natural features, spurred by scientific advancements and the need for uniform trade amid expanding commerce. In France, the National Assembly tasked the Academy of Sciences in 1790 with devising such standards; by 1791, they defined the meter as one ten-millionth of the Earth's meridian quadrant from pole to equator through Paris, measured via expeditions led by Jean-Baptiste Delambre and Pierre Méchain. This geodesic basis aimed to yield a universal length invariant to human artifacts, with initial surveys completing in 1798 despite logistical challenges from political turmoil. Complementing length, the gram—prototype for the kilogram—was set in 1795 as the mass of pure water occupying a cubic centimeter at 4°C (its density maximum), linking mass to volume and a reproducible substance without reliance on variable local prototypes. These decimal-multiplied units, formalized by law on April 7, 1795, addressed the proliferation of over 800 regional variants in pre-revolutionary France, which inflated transaction costs and enabled fraud; adoption was mandatory by 1801, though resistance persisted due to cultural entrenchment. Britain responded with the Weights and Measures Act of 1824, which abolished disparate local standards in favor of imperial definitions: the yard as 0.9144 meters (brass standard), avoirdupois pound as 0.453592 kilograms, and imperial gallon as 4.54609 liters, verified against national prototypes. Unlike French decimal purity, imperial refinements preserved customary subdivisions—such as 16 ounces per pound and 12 inches per foot—for compatibility with engineering and artisanal workflows, prioritizing empirical continuity over radical restructuring amid the Industrial Revolution's demands for precise machinery. Efforts to decimalize time, decreed in 1793 to divide the solar day into 10 "hours" of 100 "minutes" each (with 100-second minutes), collapsed by 1795 owing to discordance with astronomical periodicity and human physiology, where circadian entrainment to ~24-hour light-dark cycles resisted recalibration, complicating synchronization in labor, navigation, and horology. The scheme's abstraction from these causal anchors—Earth's rotation yielding fixed diurnal length—exemplified pitfalls of imposing rational bases unmoored from observable invariances, as evidenced by minimal clock adaptations and swift reversion to duodecimal norms.

Modern International Standardization

The International System of Units (SI) was formally established in 1960 by the 11th General Conference on Weights and Measures (CGPM), building on the metre-kilogram-second (MKS) system to create a coherent, decimal-based framework for international scientific and practical measurements. This system defined seven base units—metre, kilogram, second, ampere, kelvin, mole, and candela—intended to promote uniformity in global trade, technology, and research by replacing disparate national standards with interdependent quantities where derived units follow from products or quotients of base units without conversion factors. The SI's adoption accelerated metrication worldwide, with over 90% of countries legally mandating its use by the late 20th century, driven by treaties like the 1875 Metre Convention under the International Bureau of Weights and Measures (BIPM). A pivotal advancement occurred with the 2019 revision of the SI, effective May 20, 2019, which redefined four base units—the kilogram, ampere, kelvin, and mole—in terms of fixed numerical values of fundamental physical constants: the Planck constant (h) for the kilogram, elementary charge (e) for the ampere, Boltzmann constant (k_B) for the kelvin, and Avogadro constant (N_A) for the mole. This eliminated reliance on physical artifacts, such as the International Prototype Kilogram, which had exhibited measurable mass drift over time due to surface contamination and environmental factors, ensuring definitions invariant to human-made objects and reproducible anywhere with sufficient precision instruments. The second, metre, and candela retained prior definitions tied to caesium hyperfine transition frequency, speed of light, and luminous efficacy, respectively, maintaining continuity while extending the system's foundation to quantum-level constants. The redefinition enhances metrological precision by anchoring units to universal constants, reducing uncertainties in calibrations from parts per billion to levels enabling advanced applications like quantum computing and nanotechnology, where artifact-based standards previously introduced propagation errors. It facilitates seamless integration with emerging technologies, such as watt balances and Josephson junctions for electrical metrology, by prioritizing empirical verification over historical prototypes, thereby minimizing long-term drift risks observed in pre-2019 kilogram comparisons that showed variations up to 50 micrograms since 1889. In 2022, the CGPM extended SI prefixes to accommodate exascale data storage and high-energy physics, introducing ronna (R, 10^{27}) and quetta (Q, 10^{30}) for multiples, alongside ronto (r, 10^{-27}) and quecto (q, 10^{-30}) for submultiples, the first additions since 1991. These reflect empirical demands from digital petabyte-to-zettabyte scales in genomics and cosmology, where yotta (10^{24}) proved insufficient, standardizing notations like ronnabytes for global data metrics without ad hoc multipliers.

Classification of Measurement Systems

Traditional and Customary Units

Traditional and customary units comprise non-coherent measurement systems derived from historical English practices, emphasizing empirical standardization for commerce and daily use rather than decimal or multiplicative consistency. The British Imperial system was codified by the Weights and Measures Act of 1824, which unified disparate local standards into national prototypes, including the yard for length—initially the distance between etched lines on a brass bar maintained at Parliament—and the avoirdupois pound for mass, represented by a platinum cylinder weighing 7,000 grains. Length units hierarchically subdivide the yard into 3 feet of 12 inches each, with inches further partitioned binarily into halves, quarters, eighths, and sixteenths, reflecting adaptations to manual tools and body-based approximations like the inch as a thumb's width. Mass follows suit with the pound divided into 16 ounces, suiting portioning in markets and crafts where repeated halving minimizes measurement discrepancies. United States customary units evolved from pre-1824 colonial English measures, retaining differences post-Independence to preserve local trade conventions; the US survey foot, for instance, diverged slightly until 1959 harmonization, while capacity units like the wine gallon—fixed at 231 cubic inches since the 1836 Act—differ from Imperial volumes. These units maintain prevalence in American construction, where tape measures calibrated in fractional inches enable rapid, error-resistant adjustments for lumber and fittings, and in specialized fields like firearms manufacturing, leveraging inherited tooling precision over metric retrofits. Such systems prioritize historical continuity and divisibility tailored to human-scale tasks, with base-12 structures in length (e.g., 12 inches per foot) allowing clean fractions like 1/3 or 1/4 without decimals, which empirical use in trades demonstrates reduces cognitive load in iterative divisions compared to base-10 equivalents. This binary and duodecimal emphasis, rooted in ancient divisions for sharing goods evenly, supports practical efficiency in non-scientific applications despite requiring memorized conversions between units.

Coherent Metric Systems Including SI

The International System of Units (SI), established in 1960 by the 11th General Conference on Weights and Measures, exemplifies a coherent metric system, characterized by its decimal structure and multiplicative derivation of units without extraneous scaling factors. In such systems, base units combine directly via the equations of physics to yield derived units; for instance, the unit of force, the newton (N), equals one kilogram-meter per second squared (kg·m/s²), reflecting Newton's second law without additional constants. This coherency extends across SI's seven base units—metre for length, kilogram for mass, second for time, ampere for electric current, kelvin for temperature, candela for luminous intensity, and mole for amount of substance—enabling seamless algebraic manipulation in scientific equations. Coherent metric systems like SI facilitate global scientific collaboration by minimizing conversion errors and cognitive overhead in computations, as evidenced by their exclusive adoption in peer-reviewed physics and engineering literature since the mid-20th century. Technical advantages include streamlined dimensional analysis and reduced miscalculations in international projects, such as particle accelerator designs or satellite trajectories, where non-coherent systems introduce persistent factors like 4π or g_c, complicating derivations. Empirical outcomes from metric standardization, including SI, correlate with efficiency gains in industries transitioning to it, though direct causation remains tied to broader decimal consistency rather than coherency alone. Despite these strengths, SI's emphasis on universal applicability overlooks domain-specific inefficiencies, such as in human-scale ergonomics or legacy engineering where intuitive scaling (e.g., binary prefixes in computing) better aligns with practical workflows, potentially increasing error risks in non-scientific contexts. Critiques highlight that no single system fully accommodates all scales of phenomena, from quantum to cosmological, rendering SI's rigidity suboptimal for specialized fields despite its dominance in foundational science.
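
As a brief illustration of what coherence buys (the gravitational-system form is the standard textbook one, included here only for contrast with the g_c factor mentioned above):

    SI (coherent):               F = m·a          so 1 N = 1 kg × 1 m/s²
    English engineering units:   F = m·a / g_c    with g_c = 32.174 lb·ft/(lbf·s²)

In the second form the constant g_c must be carried through every derivation, which is exactly the kind of persistent factor a coherent system eliminates.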

Natural and Fundamental Units

Natural units and fundamental units represent systems of measurement derived exclusively from universal physical constants, such as the speed of light c, the reduced Planck constant ℏ, the gravitational constant G, and sometimes the Boltzmann constant k_B, without reliance on human artifacts or empirical prototypes. These frameworks set key constants to unity (e.g., c = 1, ℏ = 1) to normalize scales in theoretical physics, simplifying equations and emphasizing dimensionless ratios inherent to nature's laws. By anchoring measurements to invariants of spacetime, matter, and quantum mechanics, they avoid anthropocentric biases in scale selection, revealing empirical hierarchies like the immense disparity between quantum gravity regimes and observable phenomena. Planck units, proposed by Max Planck in 1899, form a foundational example, constructed dimensionally from c, G, and ℏ to yield base quantities of length, time, mass, and charge. The Planck length l_P = √(ℏG/c³) ≈ 1.616255 × 10^{-35} m.