Microsecond
From Wikipedia

microsecond
Unit system: SI
Unit of: time
Symbol: μs
Conversions: 1 μs in SI units = 10⁻⁶ s

A microsecond is a unit of time in the International System of Units (SI) equal to one millionth (0.000001, 10⁻⁶, or 1/1,000,000) of a second. Its symbol is μs, sometimes simplified to us when Unicode is not available.

A microsecond is to one second as one second is to approximately 11.57 days.

A microsecond is equal to 1000 nanoseconds or 1/1,000 of a millisecond. Because the next SI prefix is 1000 times larger, measurements of 10⁻⁵ and 10⁻⁴ seconds are typically expressed as tens or hundreds of microseconds.

Examples

  • 1 microsecond (1 μs) – cycle time for a frequency of 1×10⁶ hertz (1 MHz), its reciprocal. This corresponds to a radio wavelength of 300 m (AM medium wave band), as can be calculated by multiplying 1 μs by the speed of light (approximately 3.00×10⁸ m/s).
  • 1 microsecond – the length of time of a high-speed, commercial strobe light flash (see air-gap flash).
  • 1 microsecond – protein folding takes place on the order of microseconds (thus this is the speed of carbon-based life).
  • 1.8 microseconds – the amount of time subtracted from the Earth's day as a result of the 2011 Japanese earthquake.[1]
  • 2 microseconds – the lifetime of a muonium particle.
  • 2.68 microseconds – the amount of time subtracted from the Earth's day as a result of the 2004 Indian Ocean earthquake.[2]
  • 3.33564095 microseconds – the time taken by light to travel one kilometre in a vacuum.
  • 5.4 microseconds – the time taken by light to travel one mile in a vacuum (or radio waves point-to-point in a near vacuum).
  • 8 microseconds – the time taken by light to travel one mile in typical single-mode fiber optic cable.
  • 10 microseconds (μs) – cycle time for frequency 100 kHz, radio wavelength 3 km.
  • 18 microseconds – the net amount by which the length of the day lengthens per year, largely due to tidal acceleration.[3]
  • 20.8 microseconds – sampling interval for digital audio with 48,000 samples/s.
  • 22.7 microseconds – sampling interval for CD audio (44,100 samples/s).
  • 38 microseconds – discrepancy in GPS satellite time per day (compensated by clock speed) due to relativity.[4]
  • 50 microseconds – cycle time for highest human-audible tone (20 kHz).
  • 50 microseconds – the access latency of a modern solid-state drive, which holds non-volatile computer data.[5]
  • 100 microseconds (0.1 ms) – cycle time for frequency 10 kHz.
  • 125 microseconds – common sampling interval for telephone audio (8000 samples/s).[6]
  • 164 microseconds – half-life of polonium-214.
  • 240 microseconds – half-life of copernicium-277.
  • 260 to 480 microseconds – round-trip ICMP ping time, including operating-system kernel TCP/IP processing and response time, between two Gigabit Ethernet devices connected to the same local area network switch fabric.
  • 277.8 microseconds – a fourth (a 60th of a 60th of a second), used in astronomical calculations by al-Biruni and Roger Bacon in 1000 and 1267 AD, respectively.[7][8]
  • 490 microseconds – time for light at a wavelength of 1550 nm to travel 100 km in single-mode fiber optic cable (where the speed of light is approximately 200 million metres per second due to the fiber's index of refraction).
  • The average human eye blink takes 350,000 microseconds (just over 1/3 second).
  • The average human finger snap takes 150,000 microseconds (just over 1/7 second).
  • A camera flash illuminates for 1,000 microseconds.
  • Standard camera shutter speed opens the shutter for 4,000 microseconds or 4 milliseconds.
  • 584,542 years – the span of microseconds that fits in 64 bits: (2**64)/(1e6*60*60*24*365.25), as the sketch below verifies.
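
The arithmetic in several of these entries is easy to check directly. A minimal Python sketch, assuming nothing beyond the standard library, verifies the 64-bit figure and the 1 MHz wavelength entry above:

```python
# Minimal sketch verifying two entries from the list above.

MICROSECONDS_PER_SECOND = 1_000_000
SECONDS_PER_JULIAN_YEAR = 60 * 60 * 24 * 365.25
C = 299_792_458  # speed of light in vacuum, m/s

# Years of microseconds representable in an unsigned 64-bit counter.
years = 2**64 / (MICROSECONDS_PER_SECOND * SECONDS_PER_JULIAN_YEAR)
print(f"{years:,.0f} years")  # -> 584,542 years

# Wavelength corresponding to a 1 MHz signal (cycle time 1 us).
print(C * 1e-6)  # -> ~299.8 m, i.e. about 300 m (AM medium wave band)
```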

From Grokipedia
A microsecond (symbol: μs) is a unit of time in the International System of Units (SI) equal to one millionth (10⁻⁶) of a second. It is derived by applying the SI prefix micro- (μ), which denotes a factor of 10⁻⁶, to the SI base unit of time, the second (s). The second itself is defined as the duration of exactly 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom (at 0 K and at rest). Thus, one microsecond corresponds precisely to 10⁻⁶ of this duration. This unit is essential for measuring brief intervals in scientific and technological contexts, where precision at the millionth-of-a-second scale is required. In physics, for instance, light travels exactly 299.792458 meters in a vacuum during one microsecond, a distance known as one light-microsecond, which aids in applications like radar and ranging. In electronics and telecommunications, microseconds quantify critical timings such as pulse durations, propagation delays in networks, and latencies in high-speed computing, enabling sub-microsecond accuracy in protocols like IEEE 1588 for precision time synchronization. These applications span fields from high-frequency trading systems, where microsecond delays impact performance, to scientific instruments measuring fast chemical reactions or particle decays.

Definition and Notation

Formal Definition

The microsecond, denoted by the symbol μs, is a unit of time in the International System of Units (SI) equal to one millionth (1/1,000,000) of a second. It is formed by applying the SI prefix "micro-" to the base unit of time, the second, representing a factor of 10⁻⁶. Mathematically, this is expressed as 1 μs = 10⁻⁶ s. The second (s), the SI base unit of time upon which the microsecond is based, is defined as the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the unperturbed ground state of the caesium-133 atom. This definition ensures a precise and reproducible standard for all derived time units, including the microsecond. The prefix "micro-" originates from the Greek word mikros, meaning "small," and is combined with "second" to indicate this diminutive scale of measurement.

Symbol and Prefix Usage

The official symbol for the microsecond in the International System of Units (SI) is μs, where μ represents the micro prefix and s denotes the second. The micro prefix, symbol μ, indicates a factor of 10⁻⁶ and was approved for general use within the SI framework by the 11th General Conference on Weights and Measures (CGPM) in 1960. In print and electronic typesetting, the symbol μs uses the Greek letter mu (μ) in upright (roman) type, as specified by SI conventions. When the Greek μ is unavailable in plain text or certain digital formats, the lowercase Latin letter u may serve as a substitute, resulting in us, though this should be avoided to prevent potential ambiguity with abbreviations like "U.S." for the United States in mixed contexts. The standard Greek mu (U+03BC, μ) is the recommended code point for precision in formal documents, while the micro sign (U+00B5, µ) should be avoided. The unit symbol μs does not change in the plural form; for example, both one microsecond and five microseconds are denoted as 1 μs and 5 μs, respectively. According to guidelines from the International Bureau of Weights and Measures (BIPM), a normal space separates the numerical value from the unit symbol, as in "5 μs," while no space appears between the prefix and the base unit symbol itself (μs). These conventions ensure clarity and consistency in expressions involving the microsecond, which equals 10⁻⁶ seconds.
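
As an illustration of the code-point distinction above, the short sketch below (a minimal example using Python's standard unicodedata module) shows that the Greek mu and the legacy micro sign are different characters, and that NFKC normalization folds the latter into the former:

```python
# Minimal sketch: GREEK SMALL LETTER MU vs. the legacy MICRO SIGN.
import unicodedata

greek_mu = "\u03bc"    # U+03BC, the recommended character for "μs"
micro_sign = "\u00b5"  # U+00B5, a compatibility character to avoid

print(unicodedata.name(greek_mu))    # GREEK SMALL LETTER MU
print(unicodedata.name(micro_sign))  # MICRO SIGN
print(greek_mu == micro_sign)        # False: distinct code points

# NFKC normalization maps the micro sign to the Greek mu, one way to
# canonicalize "µs" to "μs" in stored text.
print(unicodedata.normalize("NFKC", micro_sign + "s") == greek_mu + "s")  # True
```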

Historical Context

Origin and Early Usage

The term "microsecond," referring to one millionth of a second, first emerged in English in 1905, primarily within early contexts to quantify the duration of brief electrical pulses. This usage aligned with growing needs to describe transient phenomena in experiments involving high-speed electrical signals, where traditional second-based measurements proved insufficient. Before the formal adoption of "microsecond," 19th-century physicists relied on ad hoc expressions for sub-second fractions in studies of and propagation, such as calculating signal delays in telegraph lines or rotation times in optical apparatus for speed-of-light determinations. These informal notations captured intervals approaching millionths of a second but lacked a standardized term, reflecting the limitations of at the time. The conceptual foundation drew from the metric system's decimal structure, with prefixes like milli- established by the in 1795 to facilitate precise scaling of units. Key early adopters of the microsecond included researchers in electromagnetic wave propagation and , who leveraged emerging cathode-ray tube devices—pioneered by Karl Ferdinand Braun in 1897—to visualize and measure short-duration events. The "micro-" prefix itself, denoting 10^{-6}, had been integrated into the centimeter-gram-second (CGS) system by 1873, extending the metric framework to finer scales and enabling the term's practical application in quantifying pulse timings in these fields.

Standardization in the 20th Century

The formal standardization of the microsecond as a unit within the International System of Units (SI) occurred during the 11th General Conference on Weights and Measures (CGPM) in 1960, when the micro prefix (symbol μ, denoting 10⁻⁶) was officially recognized alongside other decimal prefixes for forming multiples and submultiples of base SI units. This adoption integrated the microsecond (μs) into the newly named Système International d'Unités, enabling its consistent use in scientific and technical measurements globally. Prior informal usage in electrical and timing contexts was thus codified, promoting uniformity in metrology.

A pivotal advancement came in 1967 at the 13th CGPM, where the second was redefined as the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom. This definition was later clarified in 1997 to specify the atom at rest at a temperature of 0 K. The atomic definition directly enhanced the precision of microsecond-scale measurements, as the microsecond became exactly one millionth of this stable caesium-based second, facilitating accurate atomic timekeeping in clocks that achieve relative accuracies on the order of 10⁻¹⁵. Such integration allowed for microsecond resolutions in synchronizing global time standards, with caesium clocks enabling alignments within 0.5 μs across networks.

Key milestones in the twentieth century included the practical application of microseconds in radar technology during World War II, where pulse widths of 10 to 25 μs were used in systems like the British Chain Home for detecting aircraft at ranges determined by round-trip echo times (approximately 12.36 μs per radar mile). In nuclear physics of the same era, microsecond timescales became essential for describing implosion dynamics in atomic weapon development, with energy yields occurring in about 1 μs to achieve criticality. By the latter half of the century, the microsecond was incorporated into international standards for time notation, such as through ISO recommendations on SI unit presentation, further embedding it in global technical documentation.

The evolution of measurement precision for microseconds progressed from mechanical chronoscopes in the early twentieth century, which offered resolutions around 1 ms but with daily drifts of milliseconds, to quartz-crystal standards in the 1930s that stabilized to 0.1 ms per day. Atomic standards from the late 1950s onward dramatically improved this, achieving timing uncertainties below 1 ns through caesium beam techniques that underpin modern primary frequency standards.

Equivalents and Comparisons

Conversions to Other Time Units

The microsecond (μs) is a unit of time equal to one millionth of a second, or 10⁻⁶ seconds, as defined by the International System of Units (SI). This prefix-based relation allows for straightforward conversions to other decimal time units using SI multipliers. For instance, 1 μs equals 0.001 millisecond (ms), since the millisecond is 10⁻³ seconds, making the microsecond one-thousandth of a millisecond. Similarly, 1 μs equals 1,000 nanoseconds (ns), as the nanosecond is 10⁻⁹ seconds. Toward smaller scales, 1 μs equals 1,000,000 picoseconds (ps), given that the picosecond is 10⁻¹² seconds; in practice, conversions primarily emphasize adjacent SI prefixes like milli-, nano-, and pico- for precision in scientific and engineering contexts. For larger units, 1 μs is 10⁻⁶ seconds, so 1,000,000 μs equals 1 second (s). Extending to non-decimal but common units, 1 day, defined as 86,400 seconds, equals 86,400,000,000 μs, or 8.64×10¹⁰ μs. The general conversion formula between microseconds and seconds is t_μs = t_s × 10⁶, where t_s is the time in seconds; conversely, t_s = t_μs × 10⁻⁶. This formula facilitates practical calculations for extended periods. For example, 1 hour, equivalent to 3,600 seconds, converts to 3,600 × 10⁶ = 3.6 × 10⁹ μs.
Time unit          Relation to 1 μs       1 μs expressed in the unit
Second (s)         10⁶ μs = 1 s           10⁻⁶ s
Millisecond (ms)   1 μs = 0.001 ms        10⁻³ ms
Nanosecond (ns)    1 μs = 1,000 ns        10³ ns
Picosecond (ps)    1 μs = 1,000,000 ps    10⁶ ps
These conversions derive directly from SI prefix standards and the fixed length of the second, ensuring consistency across measurements.
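
As a minimal sketch of these relations (the function names below are illustrative, not from any standard library):

```python
# Minimal sketch of the SI prefix conversions tabulated above.

def seconds_to_microseconds(t_s: float) -> float:
    """t_us = t_s * 10**6."""
    return t_s * 1e6

def microseconds_to(t_us: float, unit: str) -> float:
    """Convert a value in microseconds to s, ms, ns, or ps."""
    factors = {"s": 1e-6, "ms": 1e-3, "ns": 1e3, "ps": 1e6}
    return t_us * factors[unit]

print(seconds_to_microseconds(3600))   # 1 hour -> 3.6e9 us
print(seconds_to_microseconds(86400))  # 1 day  -> 8.64e10 us
print(microseconds_to(1, "ns"))        # 1 us   -> 1000.0 ns
print(microseconds_to(1, "ms"))        # 1 us   -> 0.001 ms
```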

Relation to Physical Phenomena

In a vacuum, electromagnetic radiation such as light propagates at the speed of light, c ≈ 3×10⁸ m/s, covering a distance of approximately 300 meters in one microsecond (t = 10⁻⁶ s). This follows from the relation d = c × t, where the microsecond timescale highlights the rapid traversal of electromagnetic waves over hundreds of meters, a fundamental limit in relativistic physics. This propagation distance is directly relevant to electromagnetic waves beyond visible light, including radio signals and radar pulses, which travel the same 300 meters in vacuum per microsecond. In applications like GPS timing, microsecond-scale measurements of signal propagation enable positional accuracy on the order of 300 meters per microsecond of timing error, as the system's pseudoranges rely on the speed of light for distance calculations from satellite signals. Natural phenomena also operate on the microsecond scale, such as the formation and propagation of lightning discharge channels, where microsecond-scale pulses are associated with initial in-cloud channel development and repetitive pulse discharges during the event. Similarly, sound waves in air, propagating at approximately 343 m/s under standard conditions (20 °C), cover about 0.34 millimeters in one microsecond, illustrating the much slower acoustic dynamics compared to electromagnetic ones. In relativistic contexts, time dilation effects are negligible at everyday speeds but become measurable for microsecond-scale events in high-energy environments like particle accelerators. For instance, cosmic-ray muons, with a proper lifetime of about 2.2 microseconds, exhibit dilated decay times when accelerated to near-light speeds, allowing them to reach Earth's surface, a direct confirmation of time dilation observed in accelerator experiments.
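
All of these distances follow from d = v × t; a brief sketch under the stated assumptions:

```python
# Minimal sketch of distance covered in one microsecond, d = v * t.
import math

C_VACUUM = 299_792_458  # speed of light, m/s
V_SOUND = 343           # speed of sound in air at 20 C, m/s
T = 1e-6                # one microsecond, in seconds

print(C_VACUUM * T)  # ~299.8 m: light covers ~300 m per microsecond
print(V_SOUND * T)   # ~3.4e-4 m: sound covers ~0.34 mm in the same time

# Time dilation for a cosmic-ray muon (proper lifetime ~2.2 us):
# the lab-frame lifetime is gamma * tau.
beta = 0.9987                       # speed as a fraction of c
gamma = 1 / math.sqrt(1 - beta**2)  # ~19.6
print(gamma * 2.2e-6)               # ~4.3e-5 s, i.e. ~43 us seen from Earth
```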

Applications

In Physics and Chemistry

In nuclear physics, the lifetimes of excited nuclear states often span the microsecond range, particularly for isomeric states where gamma decay is hindered. For instance, a long-lived isomer in the radioactive nuclide sodium-32 exhibits a 24-microsecond lifetime, the longest observed among isomers with 20 to 28 neutrons decaying via gamma-ray emission, providing insights into nuclear structure and shape coexistence. Similarly, microsecond isomers have been identified in neutron-rich nuclei near the N=20 island of shape inversion, such as in sodium-32, where a 24-microsecond isomeric state highlights deformation effects in exotic nuclei.

In particle physics, microsecond time scales are characteristic of certain decay processes, exemplified by the muon, a fundamental lepton. The positive muon decays into a positron, an electron neutrino, and a muon antineutrino with a mean lifetime of 2.197 microseconds, a value precisely measured through experiments confirming theoretical predictions. This decay lifetime is crucial for studying cosmic-ray muons and accelerator experiments, where relativistic effects extend observed lifetimes. For contextual scale, light travels approximately 300 meters in a vacuum during one microsecond.

In chemistry, microsecond time scales govern ultrafast reaction dynamics, including luminescence lifetimes in excited molecules where electronic relaxation occurs. Ruthenium(II) polypyridyl complexes, for example, display emission lifetimes around 1-10 microseconds due to metal-to-ligand charge transfer states, enabling their use in probing energy transfer and quenching in solution-phase reactions. Vibrational relaxation in polyatomic molecules, following electronic excitation, can also extend into this regime in low-density environments, influencing photochemical pathways.

Spectroscopy techniques leverage microsecond resolutions to investigate electronic transitions, particularly in transient absorption experiments. Dispersive setups with microsecond time resolution allow observation of short-lived intermediates in photochemical reactions, such as biradicals or charge-transfer states, by probing absorption changes post-excitation with nanosecond-to-microsecond delays. These methods reveal dynamics in systems like laser-produced plasmas or molecular excitations, where probe durations match the timescales of radiative and non-radiative decay processes.
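
Because muon decay is exponential, the fraction surviving after time t is N/N₀ = exp(−t/τ); a minimal sketch using the 2.197 μs mean lifetime quoted above:

```python
# Minimal sketch of exponential muon decay, N/N0 = exp(-t / tau).
import math

TAU_MUON_US = 2.197  # mean muon lifetime, in microseconds

def surviving_fraction(t_us: float) -> float:
    """Fraction of an initial muon population remaining after t_us."""
    return math.exp(-t_us / TAU_MUON_US)

print(surviving_fraction(2.197))  # ~0.368: 1/e after one mean lifetime
print(surviving_fraction(10.0))   # ~0.011: about 1% remain after 10 us
```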

In Computing and Electronics

In computing and electronics, the microsecond is a fundamental unit for quantifying the timing of rapid digital operations, where delays at this scale can significantly impact system performance and responsiveness. Modern central processing units (CPUs) operate at gigahertz clock speeds, allowing thousands of instruction cycles to complete within a single microsecond, which underpins the high throughput of contemporary processors. For instance, a typical 3 GHz CPU executes approximately 3,000 clock cycles per microsecond, providing a benchmark for evaluating computational efficiency in applications ranging from general-purpose computing to high-performance simulations.

Memory hierarchies in electronic systems further highlight the microsecond's relevance, with access latencies varying by storage tier. Dynamic random-access memory (DRAM) typically incurs latencies of 50-100 nanoseconds for row activation and column access, representing a sub-microsecond scale that is crucial for avoiding bottlenecks in data-intensive workloads; one microsecond equates to 1,000 nanoseconds, enabling precise comparisons to finer-grained timings. Cache misses that propagate to main memory can extend effective latencies toward the microsecond range due to queuing and contention effects, though local DRAM hits remain in the tens to hundreds of nanoseconds. Solid-state drive (SSD) read operations, by contrast, operate squarely in the microsecond domain, with ultra-low-latency NVMe SSDs achieving sub-10-microsecond I/O times under optimal conditions, while typical reads span 10-100 microseconds depending on queue depth and flash controller overhead.

Networking protocols within electronic infrastructures also rely on microsecond-scale metrics for reliable data transfer. In Ethernet systems, frame transmission delays and jitter (variations in packet arrival times) frequently occur in the tens of microseconds per switch, influenced by buffering, queuing, and traffic-shaping mechanisms that ensure deterministic behavior in time-sensitive networks. These delays are particularly pronounced in bridged or multi-hop topologies, where cumulative latency can accumulate to impact real-time applications like industrial automation.

Real-time embedded systems demand microsecond precision for interrupt handling and control loops, where even brief delays can compromise stability in devices such as automotive controllers. Interrupt latency, the time from signal assertion to handler execution, averages around 11 microseconds in modern Linux-based real-time kernels, with handler durations extending further based on system load and priority scheduling. This granularity enables embedded processors to maintain determinism in feedback loops, such as motor control, where response times must align within microseconds to prevent errors in dynamic environments.
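
Microsecond-scale timing is directly observable from user code; a minimal sketch using Python's standard time module (actual resolution depends on the operating system's clock):

```python
# Minimal sketch: timing a small operation at microsecond resolution.
import time

start_ns = time.perf_counter_ns()
total = sum(range(10_000))  # stand-in for the operation being measured
elapsed_us = (time.perf_counter_ns() - start_ns) / 1_000
print(f"{elapsed_us:.1f} us elapsed")

# Clock cycles available per microsecond on a 3 GHz core.
print(3e9 * 1e-6)  # -> 3000.0
```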

In Engineering and Telecommunications

In engineering and telecommunications, the microsecond serves as a critical timescale for synchronization, signal timing, and delay management, enabling precise control in systems where timing errors can lead to instability or reduced performance. Synchrophasor measurements in power systems, for instance, rely on microsecond-level accuracy to monitor grid stability by capturing voltage and current phasors synchronized to a common time reference, allowing real-time detection of oscillations and phase shifts that could precipitate blackouts. In a 60 Hz power grid, one electrical degree corresponds to approximately 46 μs, underscoring the need for timing precision within this range to achieve total vector error below 1% in phasor calculations.

Radar and sonar systems utilize microsecond pulse widths to determine range resolution through echo return timing, where the pulse duration directly influences the ability to distinguish closely spaced targets. In radar, operating at electromagnetic wave speeds, a 1 μs pulse provides a range resolution of about 150 m, as the signal travels 300 m round-trip during that interval, making it essential for applications like air traffic control and weather monitoring. Sonar systems, propagating acoustic waves in water at roughly 1500 m/s, employ similar microsecond-scale pulses for high-resolution imaging in underwater navigation and mapping, achieving resolutions on the order of millimeters despite the slower medium.

In fiber optic telecommunications, propagation delays are measured in microseconds per kilometre due to the refractive index of glass, typically around 5 μs/km for single-mode fibers, which impacts latency in high-speed data networks spanning continents. This delay arises from the reduced speed of light in the medium (approximately two-thirds of its vacuum value), necessitating compensation in protocols for applications like low-latency routing and mobile fronthaul to maintain low latency. Electromagnetic wave fundamentals underpin these calculations, with delays scaling inversely to the medium's velocity.

Global Positioning System (GPS) operations depend on microsecond-precise measurements of signal travel times to compute pseudoranges, where the coarse acquisition code has a chip duration of about 1 μs, corresponding to 300 m of range ambiguity. Receivers resolve these to sub-microsecond accuracy through carrier-phase tracking, enabling positioning errors below 10 m by accounting for propagation delays of 60-80 ms from satellites at 20,000 km altitude, thus supporting applications in navigation, surveying, and synchronized networks.
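
The figures in this section reduce to simple propagation arithmetic; a short sketch under the stated assumptions (glass refractive index of roughly 1.5):

```python
# Minimal sketch of the propagation-timing relations described above.

C = 299_792_458    # speed of light in vacuum, m/s
V_FIBER = C / 1.5  # ~2e8 m/s in glass (refractive index ~1.5)

# Radar range resolution for a 1 us pulse: c * tau / 2 (round trip).
tau = 1e-6
print(C * tau / 2)           # ~150 m

# One electrical degree of a 60 Hz grid cycle, in microseconds.
print(1 / 60 / 360 * 1e6)    # ~46.3 us

# Single-mode fiber propagation delay per kilometre.
print(1000 / V_FIBER * 1e6)  # ~5 us/km
```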

Notable Examples

Everyday and Scientific Contexts

In everyday contexts, microsecond-scale processes occur in human physiology, particularly in the transmission of nerve impulses. Nerve conduction velocity in large myelinated fibers is approximately 100 m/s, meaning that an impulse travels across 1 mm of tissue in about 10 μs, calculated as distance divided by speed (0.001 m / 100 m/s = 10⁻⁵ s). These transmissions happen far below the threshold for conscious awareness, which requires around 500 ms for an experience to register, rendering such rapid neural events imperceptible to the human mind.

In audio technology familiar from daily listening, the standard compact disc (CD) format uses a sampling rate of 44.1 kHz, corresponding to a sample period of approximately 22.7 μs per audio sample (1 / 44,100 Hz ≈ 22.68 × 10⁻⁶ s). This interval captures sound waves at a resolution sufficient for human hearing up to 20 kHz, enabling high-fidelity playback in music and media without audible artifacts from the sampling process.

High-speed photography provides another relatable example, where cameras capture fleeting events like impacts on objects. Iconic images, such as a bullet piercing an apple, rely on exposure times of about 1 μs (1/1,000,000 s) to freeze the motion and reveal details invisible to the naked eye.

In scientific observation, astronomy reveals microsecond-scale phenomena in pulsar emissions, where giant radio pulses from neutron stars like the Crab Pulsar exhibit durations of just a few microseconds. These intense, short bursts represent rapid variations in the light curves of these rotating, variable stellar objects, offering insights into extreme astrophysical processes.
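
Each of the quoted figures reduces to a one-line calculation; a brief sketch verifying them:

```python
# Minimal sketch verifying the everyday figures quoted above.

# Nerve impulse crossing 1 mm of tissue at 100 m/s.
print(0.001 / 100 * 1e6)  # -> 10.0 us

# CD audio sample period at 44.1 kHz.
print(1 / 44_100 * 1e6)   # -> ~22.68 us

# Average eye blink of 350,000 us as a fraction of a second.
print(350_000 / 1e6)      # -> 0.35 s, just over 1/3 of a second
```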

Technological Milestones

During World War II, the MIT Radiation Laboratory developed the SCR-584 system, a pioneering microwave-based automatic-tracking radar that achieved microsecond-level precision in measuring distances. Operating at around 3 GHz with a pulse width of 0.8 microseconds, the SCR-584 enabled range resolutions of approximately 120 meters, corresponding to the time-of-flight measurements in microseconds for echo returns, which was crucial for accurate gun-laying against fast-moving targets. This represented a significant advancement over earlier longer-wavelength radars, allowing Allied forces to track and engage enemy aircraft with unprecedented accuracy in engagements such as the defense against V-1 flying bombs.

In the 1940s and 1950s, instrumentation for atomic bomb tests incorporated microsecond timing to capture and analyze the rapid fission events, with chain reactions in implosion-type devices unfolding over several microseconds as prompt neutrons initiated exponential fission. The Rapatronic camera, developed by Harold Edgerton for the U.S. nuclear testing program, recorded still images with exposure times averaging 3 microseconds, providing critical data on the initial fireball formation and shockwave propagation in tests like Operation Tumbler-Snapper. These tools allowed scientists to time the disassembly of the fissile core to microsecond accuracy, informing designs for subsequent thermonuclear weapons.

The Apollo missions in the late 1960s and early 1970s relied on the Apollo Guidance Computer (AGC) for microsecond-synchronized timing in spacecraft guidance and control systems. The AGC's core memory cycle time was 11.7 microseconds, enabling precise operations such as additions in 23.4 microseconds and serving as the primary source for timing signals that synchronized inertial measurement units, engine firings, and rendezvous maneuvers during lunar missions. This microsecond-level synchronization was essential for real-time navigation corrections, ensuring the success of Apollo 11's historic landing.

In modern high-performance computing, supercomputers like those using InfiniBand networks have achieved end-to-end latencies under 1 microsecond, facilitating massive parallel processing for simulations in climate modeling and molecular dynamics. Similarly, quantum computing has seen gate times approaching the microsecond scale, with trapped-ion systems demonstrating entangling gates in a few microseconds while maintaining high fidelities above 99%, as in IonQ's mixed-species implementations that push toward sub-microsecond operations for scalable error-corrected qubits. These milestones underscore the microsecond as a critical threshold for advancing computational frontiers.
