from Wikipedia

A triangular prism dispersing a beam of white light. The longer wavelengths (red) and the shorter wavelengths (green-blue) are separated.

Light, visible light, or visible radiation is electromagnetic radiation that can be perceived by the human eye.[1][2] Visible light spans the visible spectrum and is usually defined as having wavelengths in the range of 400–700 nanometres (nm), corresponding to frequencies of 750–420 terahertz. The visible band sits adjacent to the infrared (with longer wavelengths and lower frequencies) and the ultraviolet (with shorter wavelengths and higher frequencies), called collectively optical radiation.[3][4]
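The quoted frequency band follows directly from c = λf; a quick check (illustrative only):

```python
# Convert the visible-band wavelength limits to frequencies via f = c / wavelength.
C = 299_792_458  # speed of light in vacuum, m/s

for wavelength_nm in (400, 700):
    frequency_thz = C / (wavelength_nm * 1e-9) / 1e12
    print(f"{wavelength_nm} nm -> {frequency_thz:.0f} THz")
# 400 nm -> 749 THz and 700 nm -> 428 THz, consistent with the roughly 750-420 THz band quoted above.
```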

In physics, the term "light" may refer more broadly to electromagnetic radiation of any wavelength, whether visible or not.[5][6] In this sense, gamma rays, X-rays, microwaves and radio waves are also light. The primary properties of light are intensity, propagation direction, frequency or wavelength spectrum, and polarization. Its speed in vacuum, 299792458 m/s, is one of the fundamental constants of nature.[7] All electromagnetic radiation exhibits some properties of both particles and waves. Single, massless elementary particles, or quanta, of light called photons can be detected with specialized equipment; phenomena like interference are described by waves. Most everyday interactions with light can be understood using geometrical optics; quantum optics is an important research area in modern physics.

The main source of natural light on Earth is the Sun. Historically, another important source of light for humans has been fire, from ancient campfires to modern kerosene lamps. With the development of electric lights and power systems, electric lighting has effectively replaced firelight.

Electromagnetic spectrum and visible light

The electromagnetic spectrum, with the visible portion highlighted. The bottom graph (Visible spectrum) is wavelength in units of nanometres (nm).

Generally, electromagnetic radiation (EMR) is classified by wavelength into radio waves, microwaves, infrared, the visible spectrum that we perceive as light, ultraviolet, X-rays and gamma rays. The designation "radiation" excludes static electric, magnetic and near fields.

The behavior of EMR depends on its wavelength. Higher frequencies have shorter wavelengths and lower frequencies have longer wavelengths. When EMR interacts with single atoms and molecules, its behavior depends on the amount of energy per quantum it carries.

EMR in the visible light region consists of quanta (called photons) that are at the lower end of the energies capable of causing electronic excitation within molecules, which leads to changes in the bonding or chemistry of the molecule. At the lower end of the visible light spectrum, EMR becomes invisible to humans (infrared) because its photons no longer have enough individual energy to cause a lasting molecular change (a change in conformation) in the visual molecule retinal in the human retina, a change that triggers the sensation of vision.

There exist animals that are sensitive to various types of infrared, but not by means of quantum-absorption. Infrared sensing in snakes depends on a kind of natural thermal imaging, in which tiny packets of cellular water are raised in temperature by the infrared radiation. EMR in this range causes molecular vibration and heating effects, which is how these animals detect it.

Above the frequency range of visible light, ultraviolet light becomes invisible to humans, mostly because it is absorbed by the cornea with wavelengths shorter than 360 nm and the internal lens at wavelengths shorter than 400 nm. Furthermore, the rods and cones located in the retina of the human eye cannot detect the very short (shorter than 360 nm) ultraviolet wavelengths and are in fact damaged by ultraviolet. Many animals with eyes that do not require lenses (such as insects and shrimp) are able to detect ultraviolet, by quantum photon-absorption mechanisms, in much the same chemical way that humans detect visible light.

Various sources define visible light as narrowly as 420–680 nm[8][9] to as broadly as 380–800 nm.[10][11] Under ideal laboratory conditions, people can see infrared up to at least 1,050 nm;[12] children and young adults may perceive ultraviolet wavelengths down to about 310–313 nm.[13][14][15]

Plant growth is also affected by the colour spectrum of light, a process known as photomorphogenesis.

Speed of light

Beam of sunlight inside the cavity of Rocca ill'Abissu at Fondachelli-Fantina, Sicily

The speed of light in vacuum is defined to be exactly 299792458 m/s (approximately 186,282 miles per second). The fixed value of the speed of light in SI units results from the fact that the metre is now defined in terms of the speed of light. All forms of electromagnetic radiation move at exactly this same speed in vacuum.

Different physicists have attempted to measure the speed of light throughout history. Galileo attempted to measure the speed of light in the seventeenth century. An early experiment to measure the speed of light was conducted by Ole Rømer, a Danish physicist, in 1676. Using a telescope, Rømer observed the motions of Jupiter and one of its moons, Io. Noting discrepancies in the apparent period of Io's orbit, he calculated that light takes about 22 minutes to traverse the diameter of Earth's orbit.[16] However, the size of that orbit was not known at the time. If Rømer had known the diameter of the Earth's orbit, he would have calculated a speed of 227000000 m/s.
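As a rough check of the figure quoted above, taking the 22-minute delay and the modern orbital diameter as given (a sketch, not Rømer's actual arithmetic):

```python
# Romer-style estimate: speed = orbital diameter / light-crossing time.
ORBIT_DIAMETER_M = 2 * 1.496e11   # two astronomical units, in metres (modern value)
CROSSING_TIME_S = 22 * 60         # Romer's roughly 22-minute estimate

print(f"{ORBIT_DIAMETER_M / CROSSING_TIME_S:.3e} m/s")   # ~2.27e8 m/s, the figure quoted above
```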

Another more accurate measurement of the speed of light was performed in Europe by Hippolyte Fizeau in 1849.[17] Fizeau directed a beam of light at a mirror several kilometers away. A rotating cog wheel was placed in the path of the light beam as it traveled from the source, to the mirror and then returned to its origin. Fizeau found that at a certain rate of rotation, the beam would pass through one gap in the wheel on the way out and the next gap on the way back. Knowing the distance to the mirror, the number of teeth on the wheel and the rate of rotation, Fizeau was able to calculate the speed of light as 313000000 m/s.
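The geometry of Fizeau's measurement reduces to simple arithmetic: the round trip must take exactly the time the wheel needs to advance by one tooth-plus-gap spacing. In the sketch below the distance and tooth count are the commonly cited values for his setup, while the rotation rate is back-calculated to reproduce the quoted result rather than taken from his records:

```python
# Fizeau geometry (next-gap configuration described above):
# the round-trip time 2d/c equals the time for the wheel to advance one
# tooth-plus-gap spacing, 1/(N * f_rot), so c = 2 * d * N * f_rot.
def fizeau_speed(distance_m: float, n_teeth: int, rev_per_s: float) -> float:
    return 2.0 * distance_m * n_teeth * rev_per_s

# Illustrative inputs: ~8,633 m baseline and 720 teeth are commonly cited for
# Fizeau's apparatus; the 25.2 rev/s here is an assumed value chosen to match
# the 313,000,000 m/s result quoted above.
print(f"{fizeau_speed(8_633, 720, 25.2):.3e} m/s")   # ~3.13e8 m/s
```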

Léon Foucault carried out an experiment which used rotating mirrors to obtain a value of 298000000 m/s[17] in 1862. Albert A. Michelson conducted experiments on the speed of light from 1877 until his death in 1931. He refined Foucault's methods in 1926 using improved rotating mirrors to measure the time it took light to make a round trip from Mount Wilson to Mount San Antonio in California. The precise measurements yielded a speed of 299796000 m/s.[18]

The effective velocity of light in various transparent substances containing ordinary matter is less than in vacuum. For example, the speed of light in water is about 3/4 of that in vacuum.
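This slowdown is conventionally expressed through the refractive index n = c/v; a minimal illustration using a typical value for water (assumed, not taken from this article):

```python
C = 299_792_458           # speed of light in vacuum, m/s
N_WATER = 1.33            # approximate refractive index of water for visible light (assumed)

v_water = C / N_WATER
print(f"{v_water:.3e} m/s, ratio to c = {v_water / C:.2f}")   # ~2.25e8 m/s, about 3/4 of c
```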

Two independent teams of physicists were said to bring light to a "complete standstill" by passing it through a Bose–Einstein condensate of the element rubidium, one team at Harvard University and the Rowland Institute for Science in Cambridge, Massachusetts and the other at the Harvard–Smithsonian Center for Astrophysics, also in Cambridge.[19] However, the popular description of light being "stopped" in these experiments refers only to light being stored in the excited states of atoms, then re-emitted at an arbitrary later time, as stimulated by a second laser pulse. During the time it had "stopped", it had ceased to be light.

Optics


The study of light and the interaction of light and matter is termed optics. Optics has different forms appropriate to different circumstances. Geometrical optics, appropriate for understanding things like eyes, lenses, cameras, fiber optics, and mirrors, works well when the wavelength of light is small in comparison to the objects it interacts with. Physical optics incorporates wave properties and is needed to understand diffraction and interference. Quantum optics applies when studying individual photons interacting with matter.[20]: 33 

Surface scattering


A transparent object allows light to be transmitted, or pass through. Conversely, an opaque object does not allow light to pass through, instead reflecting or absorbing the light it receives. Most objects do not reflect or transmit light perfectly specularly; they scatter the incoming light to some degree, a behaviour described by the object's glossiness. Surface scattering is caused by the surface roughness of the reflecting surfaces, and internal scattering is caused by the difference in refractive index between the particles and the medium inside the object. Like transparent objects, translucent objects allow light to pass through, but translucent objects also scatter certain wavelengths of light via internal scattering.[21]

Refraction

Due to refraction, the straw dipped in water appears bent and the ruler scale compressed when viewed from a shallow angle.

Refraction is the bending of light rays when passing through a surface between one transparent material and another. It is described by Snell's law:

n1 sin θ1 = n2 sin θ2
where θ1 is the angle between the ray and the surface normal in the first medium, θ2 is the angle between the ray and the surface normal in the second medium and n1 and n2 are the indices of refraction, n = 1 in a vacuum and n > 1 in a transparent substance.

When a beam of light crosses the boundary between a vacuum and another medium, or between two different media, the wavelength of the light changes, but the frequency remains constant. If the beam of light is not orthogonal (or rather normal) to the boundary, the change in wavelength results in a change in the direction of the beam. This change of direction is known as refraction.
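A small sketch applying Snell's law to an air-to-water boundary; the refractive indices are typical textbook values assumed for illustration:

```python
import math

def refracted_angle_deg(theta1_deg: float, n1: float, n2: float) -> float:
    """Angle of refraction from Snell's law, n1*sin(theta1) = n2*sin(theta2)."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    return math.degrees(math.asin(s))   # math.asin raises ValueError past the critical angle

# Light entering water (n ~ 1.33) from air (n ~ 1.00) at 45 degrees:
print(f"{refracted_angle_deg(45.0, 1.00, 1.33):.1f} degrees")   # ~32.1 degrees, bent toward the normal
```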

The refractive quality of lenses is frequently used to manipulate light in order to change the apparent size of images. Magnifying glasses, spectacles, contact lenses, microscopes and refracting telescopes are all examples of this manipulation.

Light sources


There are many sources of light. A body at a given temperature emits a characteristic spectrum of black-body radiation. A simple thermal source is sunlight, the radiation emitted by the chromosphere of the Sun at around 6,000 K (5,730 °C; 10,340 °F). Solar radiation peaks in the visible region of the electromagnetic spectrum when plotted in wavelength units,[22] and roughly 44% of the radiation that reaches the ground is visible.[23] Another example is incandescent light bulbs, which emit only around 10% of their energy as visible light and the remainder as infrared. A common thermal light source in history is the glowing solid particles in flames, but these also emit most of their radiation in the infrared and only a fraction in the visible spectrum.

The peak of the black-body spectrum is in the deep infrared, at about 10 micrometre wavelength, for relatively cool objects like human beings. As the temperature increases, the peak shifts to shorter wavelengths, producing first a red glow, then a white one and finally a blue-white colour as the peak moves out of the visible part of the spectrum and into the ultraviolet. These colours can be seen when metal is heated to "red hot" or "white hot". Blue-white thermal emission is not often seen, except in stars (the commonly seen pure-blue colour in a gas flame or a welder's torch is in fact due to molecular emission, notably by CH radicals emitting a wavelength band around 425 nm and is not seen in stars or pure thermal radiation).
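The temperature dependence of the peak described above is Wien's displacement law, λ_peak ≈ b/T with b ≈ 2.898 × 10^-3 m·K; a brief check of the numbers quoted in this section:

```python
WIEN_B = 2.898e-3   # Wien's displacement constant, m*K

for label, temp_k in (("human body", 310), ("'red hot' metal", 1000), ("Sun's surface", 5800)):
    peak_um = WIEN_B / temp_k * 1e6
    print(f"{label}: peak near {peak_um:.1f} micrometres")
# ~9.3 um (deep infrared), ~2.9 um (still infrared; the visible red glow is the short-wavelength tail),
# and ~0.5 um (in the visible band).
```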

Atoms emit and absorb light at characteristic energies. This produces "emission lines" in the spectrum of each atom. Emission can be spontaneous, as in light-emitting diodes, gas discharge lamps (such as neon lamps and neon signs, mercury-vapor lamps, etc.) and flames (light from the hot gas itself—so, for example, sodium in a gas flame emits characteristic yellow light). Emission can also be stimulated, as in a laser or a microwave maser.

Deceleration of a free charged particle, such as an electron, can produce visible radiation: cyclotron radiation, synchrotron radiation and bremsstrahlung radiation are all examples of this. Particles moving through a medium faster than the speed of light in that medium can produce visible Cherenkov radiation. Certain chemicals produce visible radiation by chemoluminescence. In living things, this process is called bioluminescence. For example, fireflies produce light by this means and boats moving through water can disturb plankton which produce a glowing wake.

Certain substances produce light when they are illuminated by more energetic radiation, a process known as fluorescence. Some substances emit light slowly after excitation by more energetic radiation. This is known as phosphorescence. Phosphorescent materials can also be excited by bombarding them with subatomic particles. Cathodoluminescence is one example. This mechanism is used in cathode-ray tube television sets and computer monitors.

Hong Kong illuminated by colourful artificial lighting

Certain other mechanisms can also produce light.

When the concept of light is intended to include very-high-energy photons (gamma rays), additional generation mechanisms, such as nuclear and particle processes, come into play.

Measurement


Light is measured with two main alternative sets of units: radiometry consists of measurements of light power at all wavelengths, while photometry measures light with wavelength weighted with respect to a standardized model of human brightness perception. Photometry is useful, for example, to quantify illumination intended for human use.

The photometry units are different from most systems of physical units in that they take into account how the human eye responds to light. The cone cells in the human eye are of three types which respond differently across the visible spectrum, and the cumulative response peaks at a wavelength of around 555 nm. Therefore, two sources of light which produce the same intensity (W/m2) of visible light do not necessarily appear equally bright. The photometry units are designed to take this into account and therefore are a better representation of how "bright" a light appears to be than raw intensity. They relate to raw power by a quantity called luminous efficacy and are used for purposes like determining how to best achieve sufficient illumination for various tasks in indoor and outdoor settings. The illumination measured by a photocell sensor does not necessarily correspond to what is perceived by the human eye; without filters, which may be costly, photocells and charge-coupled devices (CCDs) tend to respond to some infrared, ultraviolet or both.
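As a toy illustration of the wavelength weighting described above, the snippet below scales equal radiant powers by approximate photopic sensitivity values; the V(λ) numbers are rough CIE-like values assumed for illustration, not authoritative data:

```python
# Approximate photopic luminous efficiency V(lambda) at a few wavelengths (assumed values).
V = {470: 0.091, 555: 1.000, 650: 0.107}
MAX_EFFICACY = 683.0   # lm/W at 555 nm, the peak of the photopic curve

def luminous_flux_lm(radiant_watts: float, wavelength_nm: int) -> float:
    """Luminous flux of a monochromatic source: 683 * V(lambda) * radiant power."""
    return MAX_EFFICACY * V[wavelength_nm] * radiant_watts

# One watt of green light versus one watt of deep red light:
print(luminous_flux_lm(1.0, 555))   # 683 lm
print(luminous_flux_lm(1.0, 650))   # ~73 lm -- same radiant power, far dimmer to the eye
```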

Light pressure


Light exerts physical pressure on objects in its path, a phenomenon which can be deduced by Maxwell's equations, but can be more easily explained by the particle nature of light: photons strike and transfer their momentum. Light pressure is equal to the power of the light beam divided by c, the speed of light.  Due to the magnitude of c, the effect of light pressure is negligible for everyday objects.  For example, a one-milliwatt laser pointer exerts a force of about 3.3 piconewtons on the object being illuminated; thus, one could lift a U.S. penny with laser pointers, but doing so would require about 30 billion 1-mW laser pointers.[24]  However, in nanometre-scale applications such as nanoelectromechanical systems (NEMS), the effect of light pressure is more significant and exploiting light pressure to drive NEMS mechanisms and to flip nanometre-scale physical switches in integrated circuits is an active area of research.[25] At larger scales, light pressure can cause asteroids to spin faster,[26] acting on their irregular shapes as on the vanes of a windmill.  The possibility of making solar sails that would accelerate spaceships in space is also under investigation.[27][28]
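The 3.3 pN figure follows from the force on an absorbing target, F = P/c (a perfectly reflecting target would feel roughly twice that); a one-line check:

```python
C = 299_792_458          # speed of light, m/s

laser_power_w = 1e-3     # a 1 mW laser pointer
force_n = laser_power_w / C              # radiation force on a fully absorbing target
print(f"{force_n * 1e12:.1f} pN")        # ~3.3 pN, the figure quoted above
```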

Although the motion of the Crookes radiometer was originally attributed to light pressure, this interpretation is incorrect; the characteristic Crookes rotation is the result of a partial vacuum.[29] This should not be confused with the Nichols radiometer, in which the (slight) motion caused by torque (though not enough for full rotation against friction) is directly caused by light pressure.[30] As a consequence of light pressure, Einstein in 1909 predicted the existence of "radiation friction" which would oppose the movement of matter.[31] He wrote, "radiation will exert pressure on both sides of the plate. The forces of pressure exerted on the two sides are equal if the plate is at rest. However, if it is in motion, more radiation will be reflected on the surface that is ahead during the motion (front surface) than on the back surface. The backward-acting force of pressure exerted on the front surface is thus larger than the force of pressure acting on the back. Hence, as the resultant of the two forces, there remains a force that counteracts the motion of the plate and that increases with the velocity of the plate. We will call this resultant 'radiation friction' in brief."

Usually light momentum is aligned with its direction of motion. However, in evanescent waves, for example, the momentum is transverse to the direction of propagation.[32]

Historical theories about light, in chronological order


Classical Greece and Hellenism


In the fifth century BC, Empedocles postulated that everything was composed of four elements: fire, air, earth and water. He believed that the goddess Aphrodite made the human eye out of the four elements and that she lit the fire in the eye, which shone out from the eye making sight possible. If this were true, then one could see during the night just as well as during the day, so Empedocles postulated an interaction between rays from the eyes and rays from a source such as the sun.[33]

In about 300 BC, Euclid wrote Optica, in which he studied the properties of light. Euclid postulated that light travelled in straight lines, and he described the laws of reflection and studied them mathematically. He questioned whether sight is the result of a beam from the eye, for he asked how one sees the stars immediately if one closes one's eyes and then opens them at night. If the beam from the eye travels infinitely fast, this is not a problem.[34]

In 55 BC, Lucretius, a Roman who carried on the ideas of earlier Greek atomists, wrote that "The light & heat of the sun; these are composed of minute atoms which, when they are shoved off, lose no time in shooting right across the interspace of air in the direction imparted by the shove." (from On the nature of the Universe). Despite being similar to later particle theories, Lucretius's views were not generally accepted. Ptolemy (c. second century) wrote about the refraction of light in his book Optics.[35]

Classical India


In ancient India, the Hindu schools of Samkhya and Vaisheshika, from around the early centuries AD, developed theories on light. According to the Samkhya school, light is one of the five fundamental "subtle" elements (tanmatra) out of which emerge the gross elements. The atomicity of these elements is not specifically mentioned and it appears that they were actually taken to be continuous.[36] The Vishnu Purana refers to sunlight as "the seven rays of the sun".[36]

The Indian Buddhists, such as Dignāga in the fifth century and Dharmakirti in the seventh century, developed a type of atomism that is a philosophy about reality being composed of atomic entities that are momentary flashes of light or energy. They viewed light as being an atomic entity equivalent to energy.[36]

Descartes


René Descartes (1596–1650) held that light was a mechanical property of the luminous body, rejecting the "forms" of Ibn al-Haytham and Witelo as well as the "species" of Roger Bacon, Robert Grosseteste and Johannes Kepler.[37] In 1637 he published a theory of the refraction of light that assumed, incorrectly, that light travelled faster in a denser medium than in a less dense medium. Descartes arrived at this conclusion by analogy with the behaviour of sound waves.[citation needed] Although Descartes was incorrect about the relative speeds, he was correct in assuming that light behaved like a wave and in concluding that refraction could be explained by the speed of light in different media.

Descartes was not the first to use mechanical analogies, but because he clearly asserted that light is only a mechanical property of the luminous body and the transmitting medium, Descartes's theory of light is regarded as the start of modern physical optics.[37]

Particle theory

Pierre Gassendi

Pierre Gassendi (1592–1655), an atomist, proposed a particle theory of light which was published posthumously in the 1660s. Isaac Newton studied Gassendi's work at an early age and preferred his view to Descartes's theory of the plenum. He stated in his Hypothesis of Light of 1675 that light was composed of corpuscles (particles of matter) which were emitted in all directions from a source. One of Newton's arguments against the wave nature of light was that waves were known to bend around obstacles, while light travelled only in straight lines. He did, however, explain the phenomenon of the diffraction of light (which had been observed by Francesco Grimaldi) by allowing that a light particle could create a localised wave in the aether.

Newton's theory could be used to predict the reflection of light, but could only explain refraction by incorrectly assuming that light accelerated upon entering a denser medium because the gravitational pull was greater. Newton published the final version of his theory in his Opticks of 1704. His reputation helped the particle theory of light to hold sway during the eighteenth century. The particle theory of light led Pierre-Simon Laplace to argue that a body could be so massive that light could not escape from it. In other words, it would become what is now called a black hole. Laplace withdrew his suggestion later, after a wave theory of light became firmly established as the model for light (as has been explained, neither a particle nor a wave theory is fully correct). A translation of Newton's essay on light appears in The Large Scale Structure of Space-Time, by Stephen Hawking and George F. R. Ellis.

The fact that light could be polarized was first explained qualitatively by Newton using the particle theory. Étienne-Louis Malus in 1810 created a mathematical particle theory of polarization. Jean-Baptiste Biot in 1812 showed that this theory explained all known phenomena of light polarization. At that time, polarization was considered to be proof of the particle theory.

Wave theory


To explain the origin of colours, Robert Hooke (1635–1703) developed a "pulse theory" and compared the spreading of light to that of waves in water in his 1665 work Micrographia ("Observation IX"). In 1672 Hooke suggested that light's vibrations could be perpendicular to the direction of propagation. Christiaan Huygens (1629–1695) worked out a mathematical wave theory of light in 1678 and published it in his Treatise on Light in 1690. He proposed that light was emitted in all directions as a series of waves in a medium called the luminiferous aether. As waves are not affected by gravity, it was assumed that they slowed down upon entering a denser medium.[38] Another supporter of the wave theory was Leonhard Euler. He argued in Nova theoria lucis et colorum (1746) that diffraction could more easily be explained by a wave theory.

Christiaan Huygens
Thomas Young's sketch of water waves showing diffraction[39]

The wave theory predicted that light waves could interfere with each other like sound waves (as noted around 1800 by Thomas Young). Young showed by means of numerous diffraction experiments that light behaved as waves.[40]: 101  He first publicly stated his "general law" of interference in January 1802, in his book A Syllabus of a Course of Lectures on Natural and Experimental Philosophy:[41]

But the general law, by which all these appearances are governed, may be very easily deduced from the interference of two coincident undulations, which either cooperate, or destroy each other, in the same manner as two musical notes produce an alternate intension and remission, in the beating of an imperfect unison.[42]

He also proposed that different colours were caused by different wavelengths of light and explained colour vision in terms of three-coloured receptors in the eye.

In 1816 André-Marie Ampère gave Augustin-Jean Fresnel an idea that the polarization of light can be explained by the wave theory if light were a transverse wave.[43] Later, Fresnel independently worked out his own wave theory of light and presented it to the Académie des Sciences in 1817. Siméon Denis Poisson challenged Fresnel's model, claiming that it predicted a bright spot in the shadow behind a circular obstacle contrary to common sense. Dominique-François-Jean Arago created an experiment that showed the bright spot: Poisson's challenge became new evidence for the wave theory.[40]: 109  In 1818, Young wrote to Arago suggesting that light must be transverse waves, not the longitudinal waves characteristic of sound. Fresnel took up the idea and was able to show via mathematical methods that polarization could be explained by a transverse wave theory of light with no longitudinal vibration.[40]: 115 

The weakness of the wave theory was that light waves, like sound waves, would need a medium for transmission. The existence of the hypothetical substance luminiferous aether proposed by Huygens in 1678 was cast into strong doubt in the late nineteenth century by the Michelson–Morley experiment.

Newton's corpuscular theory implied that light would travel faster in a denser medium, while the wave theory of Huygens and others implied the opposite. At that time, the speed of light could not be measured accurately enough to decide which theory was correct. The first to make a sufficiently accurate measurement was Léon Foucault, in 1850.[44] His result supported the wave theory, and the classical particle theory was finally abandoned (only to partly re-emerge in the twentieth century as photons in quantum theory).

Electromagnetic theory

A linearly polarized electromagnetic wave traveling along the z-axis, with E denoting the electric field and B the perpendicular magnetic field

In 1845, Michael Faraday discovered that the plane of polarization of linearly polarized light is rotated when the light rays travel along the magnetic field direction in the presence of a transparent dielectric, an effect now known as Faraday rotation.[45] This was the first evidence that light was related to electromagnetism. In 1846 he speculated that light might be some form of disturbance propagating along magnetic field lines.[45] Faraday proposed in 1847 that light was a high-frequency electromagnetic vibration, which could propagate even in the absence of a medium such as the ether.[46]

Faraday's work inspired James Clerk Maxwell to study electromagnetic radiation and light. Maxwell discovered that self-propagating electromagnetic waves would travel through space at a constant speed, which happened to be equal to the previously measured speed of light. From this, Maxwell concluded that light was a form of electromagnetic radiation: he first stated this result in 1862 in On Physical Lines of Force. In 1873, he published A Treatise on Electricity and Magnetism, which contained a full mathematical description of the behavior of electric and magnetic fields, still known as Maxwell's equations. Soon after, Heinrich Hertz confirmed Maxwell's theory experimentally by generating and detecting radio waves in the laboratory and demonstrating that these waves behaved exactly like visible light, exhibiting properties such as reflection, refraction, diffraction and interference. Maxwell's theory and Hertz's experiments led directly to the development of modern radio, radar, television, electromagnetic imaging and wireless communications.

In the quantum theory, photons are seen as wave packets of the waves described in the classical theory of Maxwell. The quantum theory was needed to explain effects even with visible light that Maxwell's classical theory could not (such as spectral lines).

Quantum theory


In 1900 Max Planck, attempting to explain black-body radiation, suggested that although light was a wave, these waves could gain or lose energy only in finite amounts related to their frequency. Planck called these "lumps" of light energy "quanta" (from a Latin word for "how much"). In 1905, Albert Einstein used the idea of light quanta to explain the photoelectric effect and suggested that these light quanta had a "real" existence. In 1923 Arthur Holly Compton showed that the wavelength shift seen when low intensity X-rays scattered from electrons (so called Compton scattering) could be explained by a particle-theory of X-rays, but not a wave theory. In 1926 Gilbert N. Lewis named these light quanta particles photons.[47]

Eventually quantum mechanics came to picture light as (in some sense) both a particle and a wave, and (in another sense) as a phenomenon which is neither a particle nor a wave (which actually are macroscopic phenomena, such as baseballs or ocean waves). Instead, under some approximations light can be described sometimes with mathematics appropriate to one type of macroscopic metaphor (particles) and sometimes another macroscopic metaphor (waves).

As in the case for radio waves and the X-rays involved in Compton scattering, physicists have noted that electromagnetic radiation tends to behave more like a classical wave at lower frequencies, but more like a classical particle at higher frequencies, but never completely loses all qualities of one or the other. Visible light, which occupies a middle ground in frequency, can easily be shown in experiments to be describable using either a wave or particle model, or sometimes both.

In 1924–1925, Satyendra Nath Bose showed that light follows statistics different from those of classical particles. Together with Einstein, he generalized this result to a whole set of integer-spin particles, called bosons (after Bose), that follow Bose–Einstein statistics. The photon is a massless boson of spin 1.

In 1927, Paul Dirac quantized the electromagnetic field. Pascual Jordan and Vladimir Fock generalized this process to treat many-body systems as excitations of quantum fields, a process with the misnomer of second quantization. And at the end of the 1940s a full theory of quantum electrodynamics was developed using quantum fields based on the works of Julian Schwinger, Richard Feynman, Freeman Dyson, and Shinichiro Tomonaga.

Quantum optics


John R. Klauder, George Sudarshan, Roy J. Glauber, and Leonard Mandel applied quantum theory to the electromagnetic field in the 1950s and 1960s to gain a more detailed understanding of photodetection and the statistics of light (see degree of coherence). This led to the introduction of the coherent state as a concept which addressed variations between laser light, thermal light, exotic squeezed states, etc., as it became understood that light cannot be fully described just by referring to the electromagnetic fields describing the waves in the classical picture. In 1977, H. Jeff Kimble et al. demonstrated a single atom emitting one photon at a time, further compelling evidence that light consists of photons. Previously unknown quantum states of light with characteristics unlike classical states, such as squeezed light, were subsequently discovered.

Development of short and ultrashort laser pulses—created by Q switching and modelocking techniques—opened the way to the study of what became known as ultrafast processes. Applications for solid state research (e.g. Raman spectroscopy) were found, and mechanical forces of light on matter were studied. The latter led to levitating and positioning clouds of atoms or even small biological samples in an optical trap or optical tweezers by laser beam. This, along with Doppler cooling and Sisyphus cooling, was the crucial technology needed to achieve the celebrated Bose–Einstein condensation.

Other remarkable results are the demonstration of quantum entanglement, quantum teleportation, and quantum logic gates. The latter are of much interest in quantum information theory, a subject which partly emerged from quantum optics, partly from theoretical computer science.

Use for light on Earth


Sunlight provides the energy that green plants use to create sugars mostly in the form of starches, which release energy into the living things that digest them. This process of photosynthesis provides virtually all the energy used by living things. Some species of animals generate their own light, a process called bioluminescence. For example, fireflies use light to locate mates and vampire squid use it to hide themselves from prey.

from Grokipedia
Light is the portion of the electromagnetic spectrum that is visible to the human eye, corresponding to wavelengths between approximately 380 and 750 nanometers. This narrow band, spanning from violet (shorter wavelengths) to red (longer wavelengths), enables vision and is produced by various sources such as the sun, incandescent bulbs, and luminescent materials. Light travels through vacuum at a constant speed of exactly 299,792,458 meters per second, a fundamental constant that defines the meter in the International System of Units (SI). As a form of energy transfer, light consists of oscillating electric and magnetic fields perpendicular to its direction of propagation, manifesting wave-like properties such as interference and diffraction. Simultaneously, light exhibits particle-like behavior, behaving as discrete packets of energy called photons, each with energy proportional to its frequency, in accordance with Planck's relation. This wave-particle duality underpins quantum mechanics, explaining phenomena from the photoelectric effect to the interference behavior seen in double-slit experiments. Beyond visibility, light's broader electromagnetic context includes the infrared and ultraviolet radiation adjacent to the visible band, which underlie a wide range of scientific and technological applications.

Electromagnetic Nature

Electromagnetic Spectrum

Electromagnetic radiation is a form of energy propagated through space as coupled oscillating electric and magnetic fields that are mutually perpendicular to each other and to the direction of propagation. These waves travel at the speed of light in vacuum, c ≈ 3 × 10^8 m/s, a universal constant for all electromagnetic waves regardless of wavelength or frequency. The electromagnetic spectrum encompasses the full range of these waves, ordered by decreasing wavelength (or increasing frequency), from long-wavelength, low-energy radio waves to short-wavelength, high-energy gamma rays. The spectrum is divided into regions based on wavelength and frequency, each exhibiting distinct interactions with matter. The table below summarizes approximate ranges for the primary components, derived from standard astronomical and physical classifications.
Region         Wavelength range                         Frequency range (Hz)
Radio waves    > 1 × 10^-1 m                            < 3 × 10^9
Microwaves     1 × 10^-3 to 1 × 10^-1 m                 3 × 10^9 to 3 × 10^11
Infrared       7 × 10^-7 to 1 × 10^-3 m                 3 × 10^11 to 4 × 10^14
Visible        4 × 10^-7 to 7 × 10^-7 m (400–700 nm)    4 × 10^14 to 7.5 × 10^14
Ultraviolet    1 × 10^-8 to 4 × 10^-7 m                 7.5 × 10^14 to 3 × 10^16
X-rays         1 × 10^-11 to 1 × 10^-8 m                3 × 10^16 to 3 × 10^19
Gamma rays     < 1 × 10^-11 m                           > 3 × 10^19
The energy E of a photon in the spectrum is related to its frequency f by Planck's relation E = hf, where h is Planck's constant (6.626 × 10^-34 J·s). This implies that photon energy decreases with increasing wavelength, as frequency is inversely proportional to wavelength (f = c/λ). Thus, radio waves carry the lowest energy, while gamma rays carry the highest. The naming conventions for these regions arose historically from their discovery and initial detection methods. Infrared, meaning "below red," was named by William Herschel in 1800 after observing heating effects beyond visible red light. Ultraviolet, or "beyond violet," was identified by Johann Ritter in 1801 through its chemical effects on silver chloride. Radio waves and microwaves trace to Heinrich Hertz's 1887–1888 experiments confirming Maxwell's predictions. X-rays were so termed by Wilhelm Röntgen in 1895 for their mysterious penetrating properties, and gamma rays by Paul Villard in 1900, later confirmed as electromagnetic by further studies. Visible light occupies the narrow band perceptible to the human eye.
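A short sketch evaluating E = hf (equivalently E = hc/λ) at a few of the band edges listed in the table above; the electronvolt conversion is included only for readability:

```python
H = 6.626e-34        # Planck's constant, J*s
C = 2.998e8          # speed of light, m/s
EV = 1.602e-19       # joules per electronvolt

for region, wavelength_m in (("radio", 1e-1), ("visible (red)", 700e-9),
                             ("visible (violet)", 400e-9), ("gamma", 1e-11)):
    energy_ev = H * C / wavelength_m / EV
    print(f"{region}: {energy_ev:.3g} eV")
# radio ~1e-5 eV, visible ~1.8-3.1 eV, gamma ~1e5 eV: photon energy rises as wavelength falls.
```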

Visible Light

Visible light constitutes the segment of the electromagnetic spectrum detectable by the human eye, spanning wavelengths from approximately 400 to 700 nanometers. This range corresponds to a continuum of colors, starting with violet at the shorter wavelengths (around 400–450 nm), progressing through blue (450–495 nm), green (495–570 nm), yellow (570–590 nm), orange (590–620 nm), and ending with red at the longer wavelengths (620–700 nm). These colors emerge from the differential refraction and dispersion of light wavelengths, illustrating the spectral nature of white light. The perception of color relies on how these wavelengths interact and combine. In additive mixing, as exemplified by the RGB model employed in electronic displays and lighting, red, green, and blue primary lights are superimposed to generate secondary colors and ultimately white light when combined in equal intensities. Conversely, subtractive color mixing, utilized in printing and painting via the CMY model (cyan, magenta, yellow), works by pigments absorbing specific wavelengths from incident white light, with the mixture of all three primaries yielding black or near-black. A foundational example of visible light's chromatic composition is its decomposition into a spectrum when passed through a prism, revealing the inherent multiplicity of wavelengths in seemingly uniform white light. Human visual perception of visible light is tuned to this narrow band, with photopic (cone-based) vision, dominant in well-lit environments, peaking in sensitivity at 555 nm in the green-yellow region, where photoreceptor cells enable color discrimination through three cone types sensitive to short (blue), medium (green), and long (red) wavelengths. In dim scotopic conditions, rod cells predominate for low-light detection, providing vision without color but with heightened sensitivity to motion and shapes, peaking around 507 nm. This arrangement optimizes visual performance across lighting levels, though overall sensitivity drops sharply beyond the 400–700 nm bounds. As non-ionizing radiation, visible light lacks the photon energy to eject electrons from atoms, distinguishing it from ionizing ultraviolet (below 400 nm) and X-rays (0.01–10 nm), which can damage DNA directly. Nonetheless, it exerts photochemical effects by exciting molecules in biological systems, such as triggering melanin production in skin cells upon absorption by chromophores like melanin and opsins, leading to pigmentation and potential oxidative stress. Visible light's colors also carry cultural and symbolic weight across societies, often leveraging innate perceptual cues for communication. For example, traffic signal systems universally employ red for stop (evoking danger due to its long-range visibility), yellow for caution (signaling transition), and green for proceed (indicating safety), a standardized convention that enhances road safety through intuitive color associations.

Fundamental Properties

Speed of Light

The speed of light in vacuum, denoted c, is a fundamental physical constant exactly equal to 299,792,458 meters per second. This value has been fixed by definition in the International System of Units (SI) since 1983, when the meter was redefined in terms of the distance light travels in vacuum in 1/299,792,458 of a second, anchoring the unit to this invariant speed. Early attempts to measure c began in the 17th century. In 1676, Danish astronomer Ole Rømer provided the first quantitative estimate by observing discrepancies in the timing of Jupiter's moon Io's eclipses, attributing delays to the finite time light takes to travel varying distances across Earth's orbit around the Sun; his calculation yielded approximately 227,000 km/s, remarkably close to the modern value given the era's observational limits. Terrestrial measurements advanced in the 19th century with Hippolyte Fizeau's 1849 experiment, which used a rapidly rotating toothed wheel to interrupt and time light pulses traveling 8.6 km to a distant mirror and back, yielding a speed of about 313,000 km/s in air. Refinements continued with Albert A. Michelson's 1926 rotating-mirror apparatus at Mount Wilson Observatory, where an octagonal mirror spun at high speeds reflected light over a 35-km path, producing a value of 299,796 km/s with unprecedented precision for the time. Modern determinations, such as those using laser interferometry in the 1970s, confirmed the value to within a few parts per billion before its exact definition, employing coherent light sources to measure phase shifts over known baselines. The invariance of c underpins Albert Einstein's 1905 theory of special relativity, positing that light's speed in vacuum remains constant for all inertial observers regardless of their relative motion or the source's velocity, a postulate consistent with the null result of the Michelson–Morley experiment. This leads to profound consequences, including time dilation, in which moving clocks tick slower, and length contraction in the direction of motion, as observers reconcile the unchanging c with differing relative speeds. Contextually, these principles enable the derivation of the mass-energy equivalence E = mc^2, showing that a body's rest energy is proportional to its mass times c^2, as explored in Einstein's companion 1905 paper linking inertia to energy content. In media other than vacuum, light travels slower, with speed v related to c by the refractive index n = c/v, a dimensionless quantity greater than 1 that quantifies the medium's optical density. This invariance in vacuum establishes c as the universal speed limit for information and causal influences, ensuring that no signal or particle with mass can exceed it, thereby preserving causality across spacetime as dictated by relativistic principles.

Wave-Particle Duality

Light exhibits both wave-like and particle-like properties, a phenomenon known as wave-particle duality, which reconciles classical descriptions with quantum mechanics. This duality is not a limitation of measurement but a fundamental aspect of light's nature, revealed through experiments that highlight one behavior or the other depending on the setup. In its wave description, light propagates as transverse electromagnetic waves, with oscillating electric and magnetic fields perpendicular to the direction of travel. These waves can be polarized, meaning the electric field vector oscillates in a specific plane (linear polarization) or rotates (circular polarization), a property unique to transverse waves. This framework arises from Maxwell's equations, which describe the interdependence of electric and magnetic fields; for instance, Faraday's law states \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, and the corrected Ampère's law is \nabla \times \mathbf{B} = \mu_0 \epsilon_0 \frac{\partial \mathbf{E}}{\partial t} in free space, predicting self-sustaining waves at speed c. Conversely, light's particle nature is embodied in photons, discrete quanta that are massless bosons carrying quantized energy and momentum. Each photon's energy is given by E = hf, where h is Planck's constant and f is the light's frequency, a relation Einstein applied to light in 1905. Photons also possess momentum p = h/λ, with λ the wavelength, linking the particle's relativistic properties to wave characteristics. The photoelectric effect exemplifies light's particle behavior: when monochromatic light strikes a metal surface, electrons are emitted only if the frequency exceeds a material-specific threshold ν₀, with maximum kinetic energy K_max = hf − φ (where φ = hν₀ is the work function), independent of intensity. This quantization, defying classical wave predictions, earned Einstein the Nobel Prize and established the photon concept. Compton scattering further confirms photons as particles with momentum: X-rays incident on loosely bound electrons scatter with increased wavelength Δλ = (h / m_e c)(1 − cos θ), where m_e is the electron mass, c the speed of light, and θ the scattering angle. This shift matches conservation laws for particle collisions, not classical wave scattering, as observed in experiments reported by Compton in 1923. The double-slit experiment highlights wave properties through interference fringes formed by light passing through two slits, but in single-photon versions, detections accumulate as discrete hits that collectively build the pattern, showing particles following probabilistic wave-guided paths. Modern setups using attenuated lasers verify this duality without which-path information. The de Broglie hypothesis unifies these views by assigning a wavelength λ = h/p to any particle with momentum p, extending naturally to photons, where it equates the wave's wavelength to the particle's de Broglie wavelength. Proposed in 1924, this relation underpins the quantum description of matter and explains light's dual manifestations.
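A minimal numerical sketch of the two particle-like relations above, the photoelectric energy balance and the Compton shift; the 2.3 eV work function is an assumed, roughly alkali-metal-like value used only for illustration:

```python
import math

H = 6.626e-34          # Planck's constant, J*s
C = 2.998e8            # speed of light, m/s
EV = 1.602e-19         # joules per electronvolt
M_E = 9.109e-31        # electron mass, kg

def k_max_ev(wavelength_nm: float, work_function_ev: float = 2.3) -> float:
    """Photoelectric effect: K_max = h*f - phi, clipped to zero below threshold."""
    photon_ev = H * C / (wavelength_nm * 1e-9) / EV
    return max(0.0, photon_ev - work_function_ev)

print(k_max_ev(400.0))   # ~0.8 eV ejected-electron energy for violet light
print(k_max_ev(700.0))   # 0.0 -- below threshold, no emission regardless of intensity

# Compton shift: delta_lambda = (h / (m_e * c)) * (1 - cos(theta)), about 2.4 pm at 90 degrees.
delta_m = H / (M_E * C) * (1 - math.cos(math.radians(90)))
print(f"{delta_m * 1e12:.2f} pm")
```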

Light Propagation and Optics

Reflection and Refraction

Reflection occurs when light encounters a boundary between two media and changes direction without altering its speed or wavelength, provided the media are non-absorbing. The law of reflection states that the angle of incidence, measured from the normal to the surface, equals the angle of reflection. This principle holds for smooth surfaces and can be derived from wave considerations or Fermat's principle of least time. Reflections are classified as specular or diffuse depending on surface roughness. Specular reflection produces a clear image, as seen in mirrors, where parallel rays reflect parallel to each other at equal angles to the normal. In contrast, diffuse reflection scatters light in multiple directions from rough surfaces like asphalt, enabling visibility of objects under diffuse illumination without a distinct image. Refraction describes the bending of light as it passes from one medium to another due to a change in speed, quantified by the refractive index n, which is the ratio of the speed of light in vacuum to that in the medium. Snell's law governs this bending: n₁ sin θ₁ = n₂ sin θ₂, where θ₁ and θ₂ are the angles of incidence and refraction, respectively. This law arises from Fermat's principle, which posits that light follows the path of least time between two points. When light travels from a denser to a rarer medium (n₁ > n₂), the angle of refraction reaches a limit at the critical angle θ_c = sin⁻¹(n₂/n₁), beyond which total internal reflection occurs, with all light reflecting internally. This phenomenon is essential in fiber optics, where light is confined within a core by repeated total internal reflections. Lenses exploit refraction to focus or diverge light beams. Converging lenses, thicker at the center, bring parallel rays to a focal point, while diverging lenses, thinner at the center, spread them apart. For a thin symmetric lens in air, the focal length f is approximated by the lensmaker's formula f = R / (2(n − 1)), where R is the radius of curvature of each surface and n is the refractive index of the lens material. Prisms, typically triangular, refract light through two non-parallel faces, deviating the beam and separating wavelengths due to dispersion. Dispersion arises because the refractive index n varies with wavelength, being higher for shorter wavelengths like blue light than for longer ones like red. In a prism, this causes white light to split into a spectrum, as demonstrated by the formation of rainbows, where sunlight refracts and disperses in atmospheric water droplets. Refraction in non-uniform media can produce optical illusions such as mirages. In inferior mirages, hot ground creates a layer of low-density air near the surface; light from distant objects bends upward upon entering cooler air above, creating the appearance of water on roads.
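A brief sketch of two relations from this section, the critical angle and the symmetric thin-lens focal length; the glass index of 1.5 and the 10 cm radius are assumed example values:

```python
import math

def critical_angle_deg(n1: float, n2: float) -> float:
    """theta_c = arcsin(n2 / n1), defined only when n1 > n2."""
    return math.degrees(math.asin(n2 / n1))

def symmetric_thin_lens_focal_length(radius_m: float, n: float) -> float:
    """f = R / (2 * (n - 1)) for a thin lens with equal surface curvatures, in air."""
    return radius_m / (2.0 * (n - 1.0))

print(f"{critical_angle_deg(1.50, 1.00):.1f} deg")              # ~41.8 deg for glass-to-air
print(f"{symmetric_thin_lens_focal_length(0.10, 1.5):.2f} m")   # 0.10 m for R = 10 cm, n = 1.5
```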

Diffraction and Interference

Diffraction is a fundamental wave phenomenon in which light bends around obstacles or spreads through apertures, deviating from the straight-line propagation predicted by geometric optics. This effect arises from the wave nature of light, as described by the Huygens–Fresnel principle, which posits that every point on a wavefront acts as a source of secondary spherical wavelets, with the new wavefront formed by the superposition of these wavelets, modulated by an obliquity factor to account for directional propagation. The principle, originally proposed by Huygens in 1690 and refined by Fresnel in 1818, provides the theoretical foundation for understanding diffraction patterns observed in experiments. In single-slit diffraction, light passing through a narrow slit of width a produces an interference pattern on a screen, characterized by a central bright maximum flanked by alternating minima and secondary maxima. The positions of the minima occur where destructive interference dominates, given by the condition sin θ = mλ/a, where θ is the angle from the central axis, λ is the wavelength, m is a non-zero integer, and a is the slit width; this arises from the path difference between wavelets from opposite edges of the slit being an integer multiple of λ. For circular apertures, such as telescope objectives, diffraction limits the resolution, with the angular radius of the Airy disk (the central bright spot) approximated by θ ≈ 1.22 λ/D, where D is the aperture diameter; this Rayleigh criterion defines the minimum resolvable angle between two point sources, beyond which they blur into one. Interference occurs when two or more coherent light waves superpose, resulting in regions of enhanced (constructive) or reduced (destructive) intensity depending on their phase difference. In Thomas Young's double-slit experiment of 1801, monochromatic light passing through two closely spaced slits separated by distance d illuminates a distant screen at distance L, producing bright fringes spaced by Δy = λL/d, derived from the condition for constructive interference where the path difference is mλ (m an integer). This pattern demonstrates the wave nature of light, with fringe visibility requiring spatial and temporal coherence between the sources. Thin-film interference exemplifies this in everyday phenomena, such as the iridescent colors of soap bubbles, where light reflected from the front and back surfaces of a thin soap film of thickness t and refractive index n interferes; for constructive interference in reflection (accounting for phase shifts), the condition is 2nt = mλ for certain configurations, leading to wavelength-dependent color reinforcement. Polarization influences interference patterns, particularly when light from interfering sources has specific orientations. For polarized light incident on a polarizer at angle θ to its transmission axis, the transmitted intensity follows Malus's law, I = I₀ cos²θ, where I₀ is the incident intensity; in interference setups like crossed polarizers with a birefringent sample, this modulates the overall fringe contrast by altering the effective intensity of the superposed waves. Diffraction gratings exploit these principles in spectroscopy by dispersing light into its spectral components, enabling wavelength separation for analysis.
A grating with slit spacing d produces maxima at angles satisfying d sin θ = mλ, allowing different wavelengths to be resolved spatially based on their angular deviation, far superior to prisms for precise measurements in atomic spectra. Observable interference requires coherence, meaning the light sources must maintain a constant phase relationship over the spatial extent (transverse coherence) and duration (longitudinal coherence) of the experiment; incoherent sources, such as unfiltered thermal light, average out phase fluctuations, washing out fringes, whereas lasers provide coherence lengths exceeding meters for clear patterns.
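A small numerical sketch of the double-slit spacing and grating equations above; the wavelength, slit separation, screen distance, and groove density are assumed example values:

```python
import math

wavelength = 633e-9      # red He-Ne line, m (example value)
d_slits = 0.25e-3        # slit separation, m (assumed)
L_screen = 1.0           # slit-to-screen distance, m (assumed)

# Double slit: bright-fringe spacing delta_y = lambda * L / d.
print(f"{wavelength * L_screen / d_slits * 1e3:.2f} mm")    # ~2.5 mm between fringes

# Grating: first-order angle from d * sin(theta) = m * lambda, with 600 lines/mm.
d_grating = 1e-3 / 600
theta = math.degrees(math.asin(1 * wavelength / d_grating))
print(f"{theta:.1f} deg")                                   # ~22.3 deg first-order maximum
```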

Light Sources

Natural Sources

Natural sources of light encompass a variety of emission processes occurring without human intervention, ranging from thermal radiation in stellar and terrestrial environments to chemical and electrical excitations. These sources produce light across the electromagnetic spectrum, primarily through mechanisms that convert energy into photons via atomic, molecular, or plasma interactions. Thermal sources dominate many natural light emissions, arising from the agitation of charged particles in hot matter, which approximates blackbody radiation for ideal absorbers and emitters. The spectral distribution of this radiation is described by Planck's law, which quantifies the intensity of emitted radiation as a function of wavelength and temperature: B(\lambda, T) = \frac{2hc^2}{\lambda^5} \frac{1}{e^{hc / \lambda kT} - 1}, where h is Planck's constant, c is the speed of light, k is Boltzmann's constant, λ is the wavelength, and T is the absolute temperature. This formula, derived from quantum considerations of energy quantization, predicts a continuous spectrum peaking at wavelengths inversely proportional to temperature, as per Wien's displacement law. The Sun exemplifies such a source, with its photosphere at approximately 5800 K emitting a near-blackbody spectrum that peaks in the visible range around 500 nm, providing the primary illumination for Earth. Geological thermal sources, such as molten lava from volcanic eruptions, also produce an incandescent glow through blackbody-like radiation at temperatures typically between 1000 °C and 1200 °C for basaltic lava. This incandescence results from the high temperature of the viscous melt, visible as a dull red to orange hue at the surface, diminishing as the material cools and solidifies. Celestial sources extend thermal emission to cosmic scales, with stars like the Sun generating light primarily through nuclear fusion in their cores, where hydrogen nuclei combine to form helium, releasing vast energy that propagates outward as photons. This core energy heats the stellar surface, leading to thermal emission, though absorption and re-emission in the outer layers modify the spectrum. Non-thermal celestial phenomena include auroras, where charged particles from the solar wind, mostly electrons and protons, collide with atmospheric gases like oxygen and nitrogen near Earth's poles, exciting atoms to emit light at specific wavelengths (e.g., green light from oxygen at ~557 nm). Lightning, another atmospheric electrical discharge, ionizes air into a plasma channel at temperatures exceeding 30,000 K, producing a brief, intense flash through recombination of electrons and ions, spanning visible wavelengths with a bluish-white appearance. Bioluminescence represents a chemical emission process in living organisms, triggered by enzymatic reactions that oxidize substrates to release energy as photons. In fireflies, the enzyme luciferase catalyzes the oxidation of luciferin in the presence of oxygen and ATP, producing light primarily in the yellow-green range (500–600 nm), with peak emission around 560 nm for many species; this cold light generates minimal heat, with a quantum yield of approximately 41% and nearly all of the emitted energy released as photons rather than thermal loss. Natural light sources exhibit distinct spectral characteristics: thermal sources like the Sun and lava yield continuous spectra, with smooth intensity distributions across wavelengths due to the collective emission from dense, hot matter. In contrast, processes involving excited atoms or ions, such as in auroras, plasmas, or electrical discharges, often produce line spectra, featuring discrete emission lines at wavelengths corresponding to atomic transitions (e.g., specific colors from excited and ionized gases in auroras).
These line spectra arise from low-density gases, where individual quantum jumps dominate over the collective effects seen in dense, hot matter.
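A direct evaluation of the Planck formula above for a 5800 K source, locating the spectral peak numerically; the coarse wavelength grid is an assumption made for brevity:

```python
import math

H, C, K = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann (SI)

def planck_spectral_radiance(wavelength_m: float, temp_k: float) -> float:
    """B(lambda, T) = 2*h*c^2 / lambda^5 / (exp(h*c / (lambda*k*T)) - 1)."""
    x = H * C / (wavelength_m * K * temp_k)
    return 2.0 * H * C**2 / wavelength_m**5 / math.expm1(x)

# Scan 100 nm to 3 um for a 5800 K blackbody (roughly the solar photosphere).
grid_nm = range(100, 3000, 10)
peak_nm = max(grid_nm, key=lambda nm: planck_spectral_radiance(nm * 1e-9, 5800))
print(peak_nm)   # ~500 nm, in the visible band as stated above
```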

Artificial Sources

Artificial sources of light are engineered devices that produce illumination through controlled physical processes, enabling applications from everyday lighting to precision technologies. Unlike natural sources, these rely on supplied energy, typically electrical, to generate photons via incandescence, electrical discharge, or quantum mechanisms, with ongoing advancements improving efficiency and control. Key developments span from the late 19th century onward, transforming human environments by providing reliable, tunable light. The evolution of artificial light began with Thomas Edison's incandescent bulb in 1879, which marked the first practical electric source after extensive experimentation with filaments. This was followed by fluorescent lamps in the early 20th century and light-emitting diodes (LEDs) in the mid-20th century, culminating in the laser's invention in 1960 by Theodore Maiman using a ruby crystal. These milestones, building on principles like stimulated emission proposed by Einstein in 1917, have driven efficiency gains from under 5% to over 50% in modern designs. Incandescent bulbs operate on thermal emission, where an electric current heats a filament (typically tungsten, with a melting point of about 3420 °C) to around 2500 K, causing it to radiate visible light as blackbody-like thermal emission. However, their efficiency is low, converting only about 5% of the input energy to visible light, with the rest lost as heat. Tungsten's high melting point and resistance to evaporation at these temperatures made it ideal for filaments, enabling bulbs to last up to 1000 hours. Fluorescent lamps generate light through electrical discharge in low-pressure mercury vapor, exciting mercury atoms to produce ultraviolet radiation that is then converted to visible light by phosphors coating the tube interior. Pioneered by Peter Cooper Hewitt's mercury-vapor lamp in the early 1900s, these lamps achieve efficiencies of 20–30% by minimizing thermal losses compared to incandescents. The phosphor layer allows color tuning, making them suitable for broad illumination needs. Light-emitting diodes (LEDs) produce light via electroluminescence in a p-n junction, where electrons and holes recombine to emit photons with energy E_g = hf, matching the material's bandgap E_g. Early red LEDs used gallium arsenide phosphide in the 1960s, but blue LEDs, essential for white light, emerged in 1993 using gallium nitride (GaN) developed by Shuji Nakamura and colleagues, enabling high-efficiency white LEDs through phosphor conversion. GaN's wide bandgap of about 3.4 eV allows blue emission around 450 nm, with overall efficiencies exceeding 50% in modern devices. Lasers produce coherent, monochromatic light through stimulated emission, where incident photons trigger excited atoms to release identical photons, as described by Einstein's 1917 coefficients relating absorption, spontaneous emission, and stimulated emission rates. Achieving this requires population inversion, where more atoms are in an excited state than in a lower energy state, often via optical or electrical pumping. The first laser, Maiman's 1960 device, used a chromium-doped ruby crystal pumped by a flashlamp to emit light at 694 nm. Gas lasers like the helium-neon (He-Ne) laser, operational since 1961, use an electrical discharge in a He-Ne gas mixture for continuous output at 632.8 nm, prized for its coherence over meters. Solid-state lasers, such as neodymium-doped yttrium aluminum garnet (Nd:YAG), employ a Nd³⁺-doped crystal pumped by diodes or lamps to lase at 1064 nm in the near-infrared, valued for high power and beam quality. These properties of spatial and temporal coherence and narrow linewidth distinguish lasers from incoherent sources.

Measurement and Detection

Photometric Quantities

Photometric quantities quantify light in terms of its perception by the human visual system, weighting radiant power according to the eye's spectral sensitivity rather than physical power alone. These measures are defined by the International Commission on Illumination (CIE) and form the basis for lighting standards, display technologies, and visual comfort assessments. The core weighting function is the photopic luminosity function V(λ), which describes the average human eye's sensitivity to wavelengths of light, peaking at 555 nm in the green region of the visible spectrum.

Luminous flux, denoted Φ_v, represents the total amount of visible light emitted, transmitted, or received by a source, measured in lumens (lm). It is calculated by integrating the spectral radiant flux of the light source with the luminosity function: \Phi_v = 683 \int_0^\infty \Phi_{e,\lambda}(\lambda)\, V(\lambda)\, d\lambda, where 683 lm/W is the maximum luminous efficacy, attained for monochromatic light at 555 nm, and Φ_{e,λ}(λ) is the spectral radiant flux in watts per nanometer. This quantity captures the overall "light output" as perceived by the eye, making it essential for evaluating the efficiency of lamps and LEDs.

Luminous intensity, I_v, measures the brightness of a light source in a particular direction, defined as the luminous flux per unit solid angle, with the unit candela (cd), where 1 cd = 1 lm/sr. The candela is an SI base unit, fixed by the luminous intensity of monochromatic radiation at approximately 555 nm with a radiant intensity of 1/683 W/sr. It is particularly useful for point sources like LEDs or stars, emphasizing directional emission weighted by human vision.

Luminance, L_v, quantifies the brightness of an extended surface or source as seen by an observer, expressed as luminous intensity per unit projected area, in candela per square meter (cd/m²). For example, typical office display screens have luminance levels of 250–350 cd/m² to ensure comfortable viewing under ambient lighting. This metric is crucial for assessing the perceived brightness of screens, road signs, and illuminated surfaces, incorporating the eye's sensitivity via V(λ).

Illuminance, E_v, describes the luminous flux incident on a surface per unit area, measured in lux (lx), where 1 lx = 1 lm/m². It guides lighting design; for instance, the Illuminating Engineering Society (IES) recommends 300–500 lx for general office work to support visual tasks without fatigue. Like other photometric quantities, it is derived from the corresponding radiometric quantity weighted by V(λ), focusing on the light reaching the eye from illuminated environments.

Color metrics in photometry extend these quantities to hue and saturation using the CIE 1931 color space, which models human color perception through tristimulus values X, Y, Z. These values are obtained by integrating the light's spectral power distribution with the CIE standard observer color-matching functions x̄(λ), ȳ(λ), and z̄(λ), where Y corresponds to luminance and aligns with V(λ) since ȳ(λ) = V(λ). Chromaticity coordinates x and y are derived as x = X/(X+Y+Z) and y = Y/(X+Y+Z), plotting colors on a two-dimensional chromaticity diagram that excludes brightness, enabling precise specification of light color for applications like displays and lighting.
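For a nearly monochromatic source, the luminous-flux integral collapses to Φ_v = 683 · V(λ) · Φ_e, which the Python sketch below evaluates at a few wavelengths; the V(λ) entries are rounded values of the CIE photopic function and the function name is chosen here purely for illustration.

# Luminous flux of a (nearly) monochromatic beam: the single-wavelength case of the
# integral above, Phi_v = 683 lm/W * V(lambda) * Phi_e.
V_APPROX = {450: 0.038, 510: 0.503, 555: 1.000, 600: 0.631, 650: 0.107}  # rounded CIE photopic values
def luminous_flux_lm(radiant_flux_w, wavelength_nm):
    # Luminous flux in lumens for a monochromatic beam of the given radiant power.
    return 683.0 * V_APPROX[wavelength_nm] * radiant_flux_w
for wl in (450, 555, 650):
    print(f"1 W at {wl} nm -> {luminous_flux_lm(1.0, wl):.0f} lm")
# 1 W at 555 nm yields the maximum 683 lm; the same power near the spectrum's edges appears far dimmer.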

Radiometric Quantities

Radiometric quantities provide objective measures of electromagnetic radiation's energy content, focusing on physical properties such as power and intensity across the full electromagnetic spectrum, without regard to visual sensitivity. These quantities form the basis for quantifying light in physics, astronomy, and engineering applications, enabling precise calculations of energy transfer in optical systems. Unlike photometric measures, radiometric ones integrate over all wavelengths and directions to capture total radiant energy.

The fundamental radiometric quantity is radiant flux, denoted Φ_e, which represents the total power of radiation emitted, transmitted, or received by a source or surface, measured in watts (W). It accounts for the integrated energy over all wavelengths and solid angles, serving as the starting point for deriving other quantities in radiometry. For instance, the radiant flux from a light source determines its overall energetic output in free space.

Radiance, denoted L_e, quantifies the directional distribution of radiant flux, defined as the power per unit solid angle per unit projected area perpendicular to the direction of propagation, with units of watts per steradian per square meter (W/sr/m²). This quantity is conserved along a ray in lossless media, making it essential for analyzing light passing through optical systems without spurious changes in intensity due to imaging or focusing. Radiance thus provides a measure of a source's brightness independent of distance.

Irradiance, denoted E_e, measures the radiant flux incident on a surface per unit area, expressed in watts per square meter (W/m²). It describes the power density of incoming radiation, crucial for assessing energy delivery to detectors or materials. A key example is the solar constant, the mean irradiance from the Sun at the top of Earth's atmosphere on a surface normal to the rays, valued at approximately 1361.6 W/m².

Spectral radiometric quantities extend these definitions to specific wavelengths or frequencies, allowing analysis of radiation's distribution across the spectrum. The spectral radiance B(λ, T) of a blackbody at temperature T is given by Planck's law: B(\lambda, T) = \frac{2hc^2}{\lambda^5} \frac{1}{e^{hc/\lambda kT} - 1}, where h is Planck's constant, c is the speed of light, k is Boltzmann's constant, and λ is the wavelength; this equation describes the maximum possible spectral radiance at a given temperature, forming the basis for calibrating broadband sources. Integrating spectral quantities over wavelength yields total radiometric values, encompassing the full spectrum.

Detection of radiometric quantities relies on specialized instruments that convert radiant energy into measurable electrical signals. Photodiodes operate via the photovoltaic effect, generating a current proportional to the incident flux in the visible and near-infrared ranges, offering high speed and quantum efficiency for spectral irradiance measurements. Bolometers detect radiation through temperature-induced resistance changes in absorptive materials, suitable for broadband thermal detection across infrared wavelengths. For absolute calibration, cryogenic radiometers employ electrical substitution at low temperatures (near 5 K), equating absorbed radiant power to equivalent electrical heating with uncertainties below 0.01%, serving as primary standards traceable to SI units. These detectors ensure the accurate realization of radiometric scales in metrology.
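As a worked example of irradiance and its inverse-square falloff, the Python sketch below recovers the approximate solar constant from the Sun's total radiant flux and the mean Earth-Sun distance; both inputs are nominal figures assumed here for illustration.

import math
# Irradiance from an isotropic source falls off as the inverse square of distance:
# E_e = Phi_e / (4 * pi * d^2). Nominal solar values reproduce the ~1361 W/m^2 solar constant.
SOLAR_RADIANT_FLUX_W = 3.828e26   # total radiant flux (luminosity) of the Sun
EARTH_SUN_DISTANCE_M = 1.496e11   # mean Earth-Sun distance (one astronomical unit)
def irradiance_w_m2(radiant_flux_w, distance_m):
    # Irradiance on a surface normal to the rays at the given distance.
    return radiant_flux_w / (4.0 * math.pi * distance_m**2)
print(f"Top-of-atmosphere solar irradiance: "
      f"{irradiance_w_m2(SOLAR_RADIANT_FLUX_W, EARTH_SUN_DISTANCE_M):.0f} W/m^2")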

Light-Matter Interactions

Absorption and Scattering

When light interacts with matter, absorption occurs as photons are captured by atoms or molecules, converting the light's energy into other forms such as heat or electronic excitation. This process reduces the intensity of the transmitted light and is quantitatively described by the Beer-Lambert law, which states that the transmitted intensity I through a medium of thickness x is given by I = I_0 e^{-\alpha x}, where I_0 is the initial intensity and α is the absorption coefficient, dependent on the material and wavelength. This law, originally formulated by Pierre Bouguer in 1729 and refined by August Beer in 1852, applies to dilute solutions and homogeneous media where scattering is negligible. In absorbing media, the energy transfer often leads to thermal heating via vibrational relaxation or to excited states that may re-emit light at different wavelengths, though the primary effect is energy dissipation.

Scattering, in contrast, redirects light without net energy loss per photon but randomizes its direction, contributing to phenomena like the diffusion of light in atmospheres or tissues. For particles much smaller than the wavelength of the light (typically < 0.1 times the wavelength), Rayleigh scattering dominates, with scattered intensity proportional to 1/λ⁴; this strong wavelength dependence explains the blue color of the daytime sky, as shorter wavelengths (~450 nm) are scattered far more efficiently than longer ones (~650 nm) by molecules such as nitrogen and oxygen. For larger particles comparable to or exceeding the wavelength, such as water droplets in clouds (diameters ~10–20 μm), Mie scattering prevails, scattering all visible wavelengths more uniformly and resulting in the white appearance of clouds, though with forward-biased scattering patterns that enhance brightness when viewed from below.

A specialized form of scattering, Raman scattering, is inelastic and involves a frequency shift due to energy exchange with molecular vibrations or rotations. The shift Δν = ν_0 − ν_s, where ν_0 is the incident frequency and ν_s the scattered frequency, corresponds to vibrational energy levels (typically 50–8000 cm⁻¹), enabling non-destructive probing of molecular structures. Discovered by C. V. Raman in 1928, this effect is weak (~10⁻⁶ of the incident intensity) but crucial for spectroscopy, as the shifted light carries chemical information without requiring sample preparation.

In Earth's atmosphere, these processes combine to produce striking optical effects. During sunsets, sunlight traverses a longer path through the air, enhancing scattering of shorter wavelengths and allowing the longer red and orange wavelengths to dominate the direct beam, as red light scatters least under Rayleigh conditions. Absorption and scattering also underpin astronomical applications, such as analyzing absorption lines in stellar spectra (dark features where specific wavelengths are removed by intervening gas clouds or stellar atmospheres), revealing compositions such as hydrogen and helium in stars via spectroscopy.
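The Python sketch below gives a minimal numerical illustration of these two relations: it applies the Beer-Lambert law with an assumed absorption coefficient and compares Rayleigh scattering strengths at 450 nm and 650 nm through the 1/λ⁴ dependence; all numerical inputs are hypothetical.

import math
# Beer-Lambert attenuation I = I0 * exp(-alpha * x), plus the Rayleigh 1/lambda^4 ratio.
def transmitted_fraction(alpha_per_m, thickness_m):
    # Fraction of the incident intensity surviving an absorbing slab (I / I0).
    return math.exp(-alpha_per_m * thickness_m)
# Hypothetical medium: alpha = 2.3 m^-1 over a 1 m path -> about 10% transmission.
print(f"Transmitted fraction: {transmitted_fraction(2.3, 1.0):.2f}")
# Rayleigh scattering favours short wavelengths: compare 450 nm (blue) with 650 nm (red).
blue, red = 450e-9, 650e-9
print(f"Blue/red scattering ratio: {(red / blue) ** 4:.1f}x")   # roughly 4-5 times stronger for blue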

Radiation Pressure

Radiation pressure refers to the mechanical force exerted by electromagnetic radiation, such as light, on matter due to the transfer of momentum from photons. In the quantum description, a single photon carries momentum p = E/c = hf/c, where E is the photon's energy, c is the speed of light in vacuum, h is Planck's constant, and f is the frequency. This relation arises from the relativistic energy-momentum relation for massless particles, where E = pc. When light interacts with a surface, the momentum transfer results in pressure; for perfect absorption, the pressure P equals the intensity I divided by c, or P = I/c, while for perfect reflection it doubles to P = 2I/c. These expressions derive from classical electromagnetic theory, as predicted by James Clerk Maxwell in 1873, and are confirmed in the photon picture.

The existence of radiation pressure was first experimentally verified in the early 20th century. Russian physicist Pyotr Lebedev conducted the initial measurements in 1900 using a torsion balance with thin mica vanes suspended in a partial vacuum, detecting a small deflection due to sunlight filtered through a slit. Independently, American physicists Ernest Fox Nichols and Gordon Ferrie Hull performed more precise measurements in 1901, employing a Nichols radiometer to quantify the pressure from arc-lamp light on delicately balanced mirrors, achieving results within 1% of theoretical predictions. These experiments provided crucial empirical support for the momentum-carrying nature of light.

One natural manifestation of radiation pressure is observed in the dust tails of comets, where solar photons push micron-sized dust particles away from the Sun, forming curved, yellowish tails distinct from the ion tails driven by the solar wind. This effect is particularly evident as comets approach perihelion, with the pressure accelerating smaller grains outward while larger ones lag behind due to their greater mass. In technological applications, radiation pressure enables propulsion for solar sails, ultra-lightweight reflective sheets that harness sunlight for thrust without fuel; the acceleration a = \frac{2PA}{m} (where A is the sail area and m is its mass) allows gradual velocity increases, as demonstrated in missions like Japan's IKAROS in 2010 and NASA's Advanced Composite Solar Sail System (ACS3), launched in 2024. Another application is optical tweezers, developed by Arthur Ashkin in 1986, for which he shared the 2018 Nobel Prize in Physics; these use focused laser beams to trap and manipulate microscopic particles via gradient forces \mathbf{F} = \frac{n}{2c} \alpha \nabla E^2, where n is the refractive index of the medium, α is the particle's polarizability, and E is the electric field strength. This technique has revolutionized single-molecule studies in biology.
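The following Python sketch estimates the acceleration of an idealized, perfectly reflecting solar sail using P = 2I/c and a = PA/m (equivalent to the a = 2PA/m above with P taken as I/c); the sail area and mass are placeholder values chosen for illustration, not the parameters of IKAROS or ACS3.

# Acceleration of an idealized, perfectly reflecting solar sail facing the Sun.
SOLAR_IRRADIANCE_W_M2 = 1361.0   # near-Earth solar irradiance
C = 2.99792458e8                 # speed of light, m/s
def sail_acceleration_m_s2(area_m2, mass_kg, irradiance_w_m2=SOLAR_IRRADIANCE_W_M2):
    # Reflection pressure 2*I/c times area, divided by mass; ignores gravity and sail shape.
    return (2.0 * irradiance_w_m2 / C) * area_m2 / mass_kg
# Illustrative sail: 80 m^2 and 16 kg (placeholder values, not a specific mission).
print(f"Acceleration: {sail_acceleration_m_s2(80.0, 16.0):.2e} m/s^2")   # a few times 1e-5 m/s^2

The result, a few tens of micrometres per second squared, shows why solar sailing relies on continuous thrust accumulated over months rather than on large instantaneous forces.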

Historical Development

Ancient and Classical Theories

Ancient civilizations developed early qualitative theories of light and vision, primarily through philosophical and observational means, without quantitative experimentation or distinctions between wave and particle natures. In ancient Greece around 300 BCE, Euclid formalized geometric optics in his treatise Optica, postulating that light propagates in straight lines via visual rays emanating from the eye, which he used to explain reflection and the apparent size of objects based on the angle subtended by these rays. This extramission theory, in which sight results from rays emitted by the observer, was later endorsed by Ptolemy and Galen, contrasting with the intromission view that light enters the eye from external sources.

Aristotle, in the 4th century BCE, challenged pure extramission by proposing an intromission theory in works like De Anima, arguing that vision occurs when transparent media transmit forms or "species" from objects into the eye, facilitated by light as an active agent that actualizes potential transparency in air and other media. This sparked ongoing debates between emission (extramission) and intromission theories, with Aristotle emphasizing light's role in enabling perception without rays originating solely from the eye, though many later writers retained elements of both. Meanwhile, in ancient China during the 5th century BCE, the philosopher Mozi described pinhole imaging in the Mozi text, observing that light rays entering a small aperture in a dark room project an inverted image of external objects, demonstrating straight-line propagation without invoking theories of vision.

In ancient India, contributions to optics emerged alongside medical texts. The Sushruta Samhita, attributed to Sushruta around the 6th century BCE, detailed ophthalmological procedures including cataract surgery. In the Vaisheshika school, founded by Kanada around the 6th century BCE, light was conceived as streams of fine, high-velocity particles known as tejas (fire atoms), which propagate in straight lines and enable vision by entering the eye from luminous sources, aligning with intromission ideas.

During the Greco-Roman period, Ptolemy's Optics (2nd century CE) built on Euclid by compiling empirical tables of refraction angles for light passing from air to water and glass, using an experimental setup to measure the angles of incidence and refraction, though his data showed inaccuracies due to observational limits. This work treated light rays as straight lines bent at interfaces, focusing on refraction without resolving the emission debates. In the Islamic Golden Age, Ibn al-Haytham (Alhazen) revolutionized the field in his 11th-century Book of Optics, decisively supporting intromission by refuting extramission through experiments with the camera obscura, where light from objects forms images on screens without any involvement of the eye, establishing that vision results from rays entering the eye. Medieval European scholars synthesized these ideas, with Witelo's Perspectiva (late 13th century) providing a comprehensive Latin treatise on optics, drawing heavily on Alhazen and earlier Greek sources to explore ray propagation, refraction, and perspective in vision, treating light as quantifiable rays for geometric analysis.

These pre-modern theories remained largely qualitative and philosophical, emphasizing geometric rays and mechanisms of vision through observation and deduction, setting the stage for later experimental transitions without yet distinguishing wave or particle behaviors.

17th-19th Century Theories

In the 17th century, René Descartes proposed an emission theory of light in his work La Dioptrique (1637), positing that light consists of particles propelled instantaneously through a medium of swirling vortices composed of subtle matter, which accounted for phenomena like refraction as mechanical pressures within these cosmic eddies. This corpuscular model emphasized light's propagation as a direct emission from luminous sources, aligning with mechanistic philosophy but assuming infinite speed to explain observations without delay.

Isaac Newton advanced the particle theory in his seminal Opticks (1704), describing light as streams of minute, elastic particles that obey laws of motion similar to projectiles. Newton explained refraction not as a change in speed but as the particles' deviation due to attractive forces exerted by denser media, such as glass pulling particles toward it with varying intensity based on their inherent "sides" or properties, which also accounted for color dispersion in prisms. This framework unified reflection as elastic collisions and supported the corpuscular view by fitting empirical data from his prism experiments, though it struggled with later diffraction observations.

Countering the particle model, Christiaan Huygens introduced a wave theory in his Traité de la Lumière (written in 1678, published 1690), conceiving light as longitudinal pressure waves propagating through an elastic ether, a pervasive, subtle medium filling space. Huygens derived the laws of reflection and refraction geometrically using the concept of secondary wavelets emanating from each point on a wavefront, with the envelope of these wavelets forming the new wavefront, thus explaining light's rectilinear path as the envelope of expanding spherical pulses in the ether. His approach anticipated diffraction as interference among wavelets but lacked quantitative detail for it, emphasizing instead a finite speed of light consistent with astronomical observations of delay.

The wave theory gained empirical traction in the early 19th century through Thomas Young's double-slit experiment (1801), which demonstrated interference patterns of alternating bright and dark fringes when light passed through two closely spaced apertures, a result attributable solely to the superposition of coherent wave trains rather than particle streams. Building on this, Augustin-Jean Fresnel developed a mathematical formulation of diffraction in 1818, applying Huygens' principle with interference to predict the intensity distribution around obstacles, such as the bright spot at the center of a circular obstacle's shadow (Poisson's spot), which decisively refuted Newton's particle model through precise calculations matching observations. Fresnel's treatment of oblique incidence integrated Young's interference with wave propagation, solidifying the wave picture by quantifying how secondary sources constructively or destructively combine.

The particle-wave debate culminated in classical unification with James Clerk Maxwell's electromagnetic theory (1865), which portrayed light as transverse electromagnetic waves arising from oscillating electric and magnetic fields in the ether, governed by coupled partial differential equations. Maxwell derived the wave speed as c = \frac{1}{\sqrt{\epsilon_0 \mu_0}}, which matched the measured speed of light and led him to conclude that light itself is an electromagnetic wave.
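Maxwell's identification of light with electromagnetic waves can be checked numerically; the short Python sketch below evaluates 1/√(ε₀μ₀) from the vacuum constants and recovers the familiar 2.998 × 10⁸ m/s.

import math
# Maxwell's result: the electromagnetic wave speed follows from the vacuum constants alone.
EPSILON_0 = 8.8541878128e-12   # vacuum permittivity, F/m
MU_0 = 1.25663706212e-6        # vacuum permeability, H/m (approximately 4*pi*1e-7)
c = 1.0 / math.sqrt(EPSILON_0 * MU_0)
print(f"c = 1/sqrt(eps0*mu0) = {c:.6e} m/s")   # ~2.998e8 m/s, the measured speed of light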