Microsecond
From Wikipedia
| microsecond | |
|---|---|
| Unit system | SI |
| Unit of | time |
| Symbol | μs |
| Conversions | |
| 1 μs in ... | ... is equal to ... |
| SI units | 10⁻⁶ s |
A microsecond is a unit of time in the International System of Units (SI) equal to one millionth (0.000001 or 10⁻⁶ or 1⁄1,000,000) of a second. Its symbol is μs, sometimes simplified to us when Unicode is not available.
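As an aside not in the original article, the symbol variants mentioned above (the Greek mu, the legacy micro sign U+00B5, and the ASCII fallback "us") can be normalized programmatically. A minimal Python sketch; the function name `normalize_us` is illustrative:

```python
import re
import unicodedata

def normalize_us(text: str) -> str:
    """Map the micro sign (U+00B5) to Greek mu (U+03BC) and rewrite the
    ASCII fallback 'us' to 'μs' when it follows a number or a space."""
    # NFKC folds the compatibility character U+00B5 into U+03BC.
    text = unicodedata.normalize("NFKC", text)
    # Rewrite '5 us' / '5us', but leave ordinary words like 'thus' alone.
    return re.sub(r"(?<=[\d\s])us\b", "\u03bcs", text)

print(normalize_us("delay of 5 \u00b5s"))  # -> delay of 5 μs
print(normalize_us("delay of 5 us"))       # -> delay of 5 μs
```

The lookbehind guard is the important design choice: a bare `replace("us", "μs")` would corrupt unrelated words.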
A microsecond is to one second as one second is to approximately 11.57 days.
A microsecond is equal to 1000 nanoseconds or 1⁄1,000 of a millisecond. Because the next SI prefix is 1000 times larger, measurements of 10⁻⁵ and 10⁻⁴ seconds are typically expressed as tens or hundreds of microseconds.
Examples
- 1 microsecond (1 μs) – cycle time for frequency 1×10⁶ hertz (1 MHz), the inverse unit. This corresponds to radio wavelength 300 m (AM medium wave band), as can be calculated by multiplying 1 μs by the speed of light (approximately 3.00×10⁸ m/s).
- 1 microsecond – the length of time of a high-speed, commercial strobe light flash (see air-gap flash).
- 1 microsecond – protein folding takes place on the order of microseconds (thus this is the speed of carbon-based life).
- 1.8 microseconds – the amount of time subtracted from the Earth's day as a result of the 2011 Japanese earthquake.[1]
- 2 microseconds – the lifetime of a muonium particle.
- 2.68 microseconds – the amount of time subtracted from the Earth's day as a result of the 2004 Indian Ocean earthquake.[2]
- 3.33564095 microseconds – the time taken by light to travel one kilometre in a vacuum.
- 5.4 microseconds – the time taken by light to travel one mile in a vacuum (or radio waves point-to-point in a near vacuum).
- 8 microseconds – the time taken by light to travel one mile in typical single-mode fiber optic cable.
- 10 microseconds (μs) – cycle time for frequency 100 kHz, radio wavelength 3 km.
- 18 microseconds – the net amount by which the length of the day lengthens per year, largely due to tidal acceleration.[3]
- 20.8 microseconds – sampling interval for digital audio with 48,000 samples/s.
- 22.7 microseconds – sampling interval for CD audio (44,100 samples/s).
- 38 microseconds – discrepancy in GPS satellite time per day (compensated by clock speed) due to relativity.[4]
- 50 microseconds – cycle time for highest human-audible tone (20 kHz).
- 50 microseconds – read-access latency for a modern solid-state drive holding non-volatile computer data.[5]
- 100 microseconds (0.1 ms) – cycle time for frequency 10 kHz.
- 125 microseconds – common sampling interval for telephone audio (8000 samples/s).[6]
- 164 microseconds – half-life of polonium-214.
- 240 microseconds – half-life of copernicium-277.
- 260 to 480 microseconds – round-trip ICMP ping time, including operating system kernel TCP/IP processing and answer time, between two Gigabit Ethernet devices connected to the same local area network switch fabric.
- 277.8 microseconds – a fourth (a 60th of a 60th of a second), used in astronomical calculations by al-Biruni and Roger Bacon in 1000 and 1267 AD, respectively.[7][8]
- 490 microseconds – time for light with a wavelength of 1550 nm to travel 100 km in a single-mode fiber optic cable (where the speed of light is approximately 200 million metres per second due to the fiber's index of refraction).
- The average human eye blink takes 350,000 microseconds (just over 1⁄3 second).
- The average human finger snap takes 150,000 microseconds (just over 1⁄7 second).
- A camera flash illuminates for 1,000 microseconds.
- A standard camera shutter speed opens the shutter for 4,000 microseconds, or 4 milliseconds.
- A 64-bit count of microseconds spans about 584,542 years: (2**64)/(1e6*60*60*24*365.25).
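Several of the figures in the list can be rechecked with a few lines of Python. A quick sanity check, not part of the original article:

```python
c = 299_792_458  # speed of light in vacuum, m/s

# Light travel time for one kilometre in vacuum (list entry: 3.33564095 μs).
km_us = 1_000 / c * 1e6

# CD-audio sampling interval at 44,100 samples/s (list entry: 22.7 μs).
cd_us = 1 / 44_100 * 1e6

# Years before a 64-bit counter of microseconds wraps (list entry: ~584,542).
wrap_years = 2**64 / (1e6 * 60 * 60 * 24 * 365.25)

print(f"{km_us:.8f} us  {cd_us:.1f} us  {wrap_years:.0f} years")
# -> 3.33564095 us  22.7 us  584542 years
```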
References
[edit]- ^ Gross, R.S. (14 March 2014). "Japan quake may have shortened Earth days, moved axis". JPL News. Jet Propulsion Laboratory. Retrieved 23 August 2019.
- ^ Cook-Anderson, Gretchen; Beasley, Dolores (January 10, 2005). "NASA Details Earthquake Effects on the Earth". NASA. Retrieved September 18, 2021.
- ^ MacDonald, Fiona. "Earth's Days Are Getting 2 Milliseconds Longer Every 100 Years". ScienceAlert. Retrieved 2017-03-08.
- ^ Richard Pogge. "GPS and Relativity". Retrieved 2011-10-01.
- ^ Intel Solid State Drive Product Specification
- ^ Kumar, Anurag; Manjunath, D.; Kuri, Joy (2008), "Application Models and Performance Issues", Wireless Networking, Elsevier, pp. 53–79, doi:10.1016/b978-012374254-4.50004-1, ISBN 978-0-12-374254-4, retrieved 2022-08-08
- ^ al-Biruni (1879). The chronology of ancient nations: an English version of the Arabic text of the Athâr-ul-Bâkiya of Albîrûnî, or "Vestiges of the Past". Translated by Sachau C Edward. W. H. Allen. pp. 147–149. OCLC 9986841.
- ^ R Bacon (2000) [1928]. The Opus Majus of Roger Bacon. translator: BR Belle. University of Pennsylvania Press. table facing page 231. ISBN 978-1-85506-856-8.
Microsecond
From Grokipedia
A microsecond (symbol: μs) is a unit of time in the International System of Units (SI) equal to one millionth (10⁻⁶) of a second.[1]
It is derived by applying the SI prefix micro- (μ), which denotes a factor of 10⁻⁶, to the base unit of time, the second (s).[2]
The second itself is defined as the duration of exactly 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom (at 0 K and at rest).[3]
Thus, one microsecond corresponds precisely to 10⁻⁶ of this duration.[4]
This unit is essential for measuring brief intervals in scientific and technological contexts, where precision at the millionth-of-a-second scale is required.[5]
In physics, for instance, light travels exactly 299.792458 meters in vacuum during one microsecond, a distance known as one light-microsecond, which aids in applications like telecommunications and radar ranging.[6]
In electronics and computing, microseconds quantify critical timings such as pulse durations, clock synchronization in networks, and latencies in high-speed data processing, enabling sub-microsecond accuracy in protocols like IEEE 1588 for precision time synchronization.[7][8]
These applications span fields from high-frequency trading systems, where microsecond delays impact performance,[9] to scientific instruments measuring fast chemical reactions or particle decays.[5][10]
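The two defining quantities mentioned above, the light-microsecond and the caesium-133 transition frequency, pin down the microsecond numerically. A small illustrative Python check, not part of the original text:

```python
c = 299_792_458               # speed of light in vacuum, m/s (exact by definition)
cs_periods = 9_192_631_770    # caesium-133 hyperfine periods per SI second (exact)

light_microsecond = c / 1_000_000        # 299.792458 m travelled per μs
periods_per_us = cs_periods / 1_000_000  # 9,192.63177 caesium periods per μs

print(light_microsecond, periods_per_us)
```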
Definition and Notation
Formal Definition
The microsecond, denoted by the symbol μs, is a unit of time in the International System of Units (SI) equal to one millionth (1/1,000,000) of a second.[1] It is formed by applying the SI prefix "micro-" to the base unit of time, representing a factor of 10⁻⁶.[1] Mathematically, this is expressed as 1 μs = 10⁻⁶ s.[1] The second (s), the SI base unit of time upon which the microsecond is based, is defined as the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the unperturbed ground state of the caesium-133 atom.[3] This definition ensures a precise and reproducible standard for all derived time units, including the microsecond.[3] The prefix "micro-" originates from the Greek word mikros, meaning "small," and is combined with "second" to indicate this diminutive scale of measurement.[11]
Symbol and Prefix Usage
The official symbol for the microsecond in the International System of Units (SI) is μs, where μ represents the micro prefix and s denotes the second.[12] The micro prefix, symbol μ, indicates a factor of 10⁻⁶ and was approved for general use within the SI framework by the 11th General Conference on Weights and Measures (CGPM) in 1960.[13][12] In scientific writing and measurement, the symbol μs uses the Greek letter mu (μ) in upright (roman) typeface, as specified by SI conventions.[14] When the Greek μ is unavailable in plain text or certain digital formats, the lowercase Latin letter u may serve as a substitute, resulting in us, though this should be avoided to prevent potential ambiguity with abbreviations like "U.S." for United States in mixed contexts.[14] The standard Greek mu (U+03BC, μ) is the recommended symbol for precision in formal typography, while the micro sign (U+00B5, µ) should be avoided.[14][12] The unit symbol μs does not change in the plural form; for example, both one microsecond and five microseconds are denoted as 1 μs and 5 μs, respectively.[14] According to guidelines from the International Bureau of Weights and Measures (BIPM), a normal space separates the numerical value from the unit symbol, as in "5 μs," while no space appears between the prefix and the base unit symbol itself (μs).[12] These conventions ensure clarity and consistency in expressions involving the microsecond, which equals 10⁻⁶ seconds.[12]
Historical Context
Origin and Early Usage
The term "microsecond," referring to one millionth of a second, first emerged in English scientific literature in 1905, primarily within early electrical engineering contexts to quantify the duration of brief electrical pulses.[15][16] This usage aligned with growing needs to describe transient phenomena in experiments involving high-speed electrical signals, where traditional second-based measurements proved insufficient.[17]

Before the formal adoption of "microsecond," 19th-century physicists relied on ad hoc expressions for sub-second fractions in studies of electricity and light propagation, such as calculating signal delays in telegraph lines or rotation times in optical apparatus for speed-of-light determinations.[18] These informal notations captured intervals approaching millionths of a second but lacked a standardized term, reflecting the limitations of instrumentation at the time. The conceptual foundation drew from the metric system's decimal structure, with prefixes like milli- established by the French Academy of Sciences in 1795 to facilitate precise scaling of units.[2]

Key early adopters of the microsecond included researchers in electromagnetic wave propagation and telegraphy, who leveraged emerging cathode-ray tube devices, pioneered by Karl Ferdinand Braun in 1897, to visualize and measure short-duration events.[19] The "micro-" prefix itself, denoting 10⁻⁶, had been integrated into the centimeter-gram-second (CGS) system by 1873, extending the metric framework to finer scales and enabling the term's practical application in quantifying pulse timings in these fields.[18]
Standardization in the 20th Century
The formal standardization of the microsecond as a unit within the International System of Units (SI) occurred during the 11th General Conference on Weights and Measures (CGPM) in 1960, when the micro prefix (symbol μ, denoting 10⁻⁶) was officially recognized alongside other decimal prefixes for forming multiples and submultiples of base SI units. This adoption integrated the microsecond (μs) into the newly named Système International d'Unités, enabling its consistent use in scientific and technical measurements globally. Prior informal usage in electrical and timing contexts was thus codified, promoting uniformity in metrology.[1][13]

A pivotal advancement came in 1967 at the 13th CGPM, where the second was redefined as the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom. This definition was later clarified in 1997 to specify the atom at rest at a temperature of 0 K. This atomic definition directly enhanced the precision of microsecond-scale measurements, as the microsecond became exactly one millionth of this stable cesium-based second, facilitating accurate atomic timekeeping in clocks that achieve relative accuracies on the order of 10⁻¹⁵. Such integration allowed for microsecond resolutions in synchronizing global time standards, with cesium clocks enabling alignments within 0.5 μs across networks.[20][21]

Key milestones in the 20th century included the practical application of microseconds in radar technology during the 1930s and 1940s, particularly amid World War II efforts, where pulse widths of 10 to 25 μs were used in systems like the SCR-270 for detecting aircraft at ranges determined by round-trip echo times (approximately 12.36 μs per radar mile).
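The radar-mile figure quoted above can be reproduced directly, assuming the conventional definition of one nautical mile (1,852 m) and a round-trip echo path. An illustrative sketch, not from the source:

```python
c = 299_792_458        # speed of light in vacuum, m/s
nautical_mile = 1_852  # metres

# A radar pulse must travel out to the target and back, so the echo
# delay per nautical mile of range is twice the one-way travel time.
radar_mile_us = 2 * nautical_mile / c * 1e6  # in microseconds

print(f"{radar_mile_us:.2f} us")  # -> 12.36 us
```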
In nuclear physics of the same era, microsecond timescales became essential for describing implosion dynamics in atomic weapon development, with energy yields occurring in about 1 μs to achieve criticality. By the 1970s, the microsecond was incorporated into international standards for time notation, such as through ISO recommendations on SI unit presentation, further embedding it in global technical documentation.[22][23]

The evolution of measurement precision for microseconds progressed from mechanical chronoscopes in the early 20th century, which offered resolutions around 1 ms but with daily drifts of milliseconds, to quartz-crystal standards in the 1940s that stabilized to 0.1 ms per day. Atomic standards from the late 1950s onward dramatically improved this, achieving microsecond accuracies to parts per billion relative to the second (equivalent to absolute uncertainties below 1 ns) through cesium beam techniques that underpin modern primary frequency standards.[24]
Equivalents and Comparisons
Conversions to Other Time Units
The microsecond (μs) is a unit of time equal to one millionth of a second, or 10⁻⁶ seconds, as defined by the International System of Units (SI).[14] This prefix-based relation allows for straightforward conversions to other decimal time units using SI multipliers. For instance, 1 μs equals 0.001 milliseconds (ms), since the millisecond is 10⁻³ seconds, making the microsecond one-thousandth of a millisecond.[14] Similarly, 1 μs equals 1,000 nanoseconds (ns), as the nanosecond is 10⁻⁹ seconds.[14] To smaller scales, 1 μs equals 1,000,000 picoseconds (ps), given that the picosecond is 10⁻¹² seconds; however, conversions primarily emphasize adjacent SI prefixes like milli-, nano-, and pico- for precision in scientific and engineering contexts.[14] For larger units, 1 μs is 10⁻⁶ seconds, so 1,000,000 μs equals 1 second (s).[14] Extending to non-decimal but common units, 1 day (defined as 86,400 seconds) equals 86,400,000,000 μs, or 8.64×10¹⁰ μs.[14] The general conversion formula between microseconds and seconds is t_s = t_μs × 10⁻⁶, where t_s is the time in seconds; conversely, t_μs = t_s × 10⁶.[14] This formula facilitates practical calculations for extended periods. For example, 1 hour, equivalent to 3,600 seconds, converts to 3.6×10⁹ μs.[14]

| Time Unit | Relation to 1 μs | Exact Value |
|---|---|---|
| Second (s) | 1,000,000 μs = 1 s | 10⁻⁶ s |
| Millisecond (ms) | 1 μs = 0.001 ms | 10⁻³ ms |
| Nanosecond (ns) | 1 μs = 1,000 ns | 10³ ns |
| Picosecond (ps) | 1 μs = 1,000,000 ps | 10⁶ ps |
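The conversion rule between seconds and microseconds (multiply by 10⁻⁶ one way, 10⁶ the other) reduces to two one-line helpers. A minimal sketch in Python; the function names are illustrative:

```python
def us_to_s(t_us: float) -> float:
    """Microseconds to seconds: t_s = t_us * 1e-6."""
    return t_us * 1e-6

def s_to_us(t_s: float) -> float:
    """Seconds to microseconds: t_us = t_s * 1e6."""
    return t_s * 1e6

print(s_to_us(3_600))       # one hour -> 3600000000.0 μs (3.6e9)
print(us_to_s(1_000_000))   # -> 1.0 second
```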
