A triangular prism dispersing a beam of white light. The longer wavelengths (red) and the shorter wavelengths (green-blue) are separated.

Light, visible light, or visible radiation is electromagnetic radiation that can be perceived by the human eye.[1][2] Visible light spans the visible spectrum and is usually defined as having wavelengths in the range of 400–700 nanometres (nm), corresponding to frequencies of 750–420 terahertz. The visible band sits adjacent to the infrared (with longer wavelengths and lower frequencies) and the ultraviolet (with shorter wavelengths and higher frequencies), called collectively optical radiation.[3][4]

In physics, the term "light" may refer more broadly to electromagnetic radiation of any wavelength, whether visible or not.[5][6] In this sense, gamma rays, X-rays, microwaves and radio waves are also light. The primary properties of light are intensity, propagation direction, frequency or wavelength spectrum, and polarization. Its speed in vacuum, 299,792,458 m/s, is one of the fundamental constants of nature.[7] All electromagnetic radiation exhibits some properties of both particles and waves. Single, massless elementary particles, or quanta, of light called photons can be detected with specialized equipment; phenomena like interference are described by waves. Most everyday interactions with light can be understood using geometrical optics, while quantum optics is an important research area in modern physics.

The main source of natural light on Earth is the Sun. Historically, another important source of light for humans has been fire, from ancient campfires to modern kerosene lamps. With the development of electric lights and power systems, electric lighting has effectively replaced firelight.

Electromagnetic spectrum and visible light

The electromagnetic spectrum, with the visible portion highlighted. The bottom graph (Visible spectrum) is wavelength in units of nanometres (nm).

Generally, electromagnetic radiation (EMR) is classified by wavelength into radio waves, microwaves, infrared, the visible spectrum that we perceive as light, ultraviolet, X-rays and gamma rays. The designation "radiation" excludes static electric, magnetic and near fields.

The behavior of EMR depends on its wavelength. Higher frequencies have shorter wavelengths and lower frequencies have longer wavelengths. When EMR interacts with single atoms and molecules, its behavior depends on the amount of energy per quantum it carries.
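The inverse relation between wavelength and frequency follows from c = λf, where c is the speed of light in vacuum. A short Python sketch recovers the visible-band figures quoted above:

```python
# Frequency of electromagnetic radiation from its vacuum wavelength: f = c / wavelength.
C = 299_792_458  # speed of light in vacuum, m/s (exact)

def frequency_thz(wavelength_nm: float) -> float:
    """Frequency in terahertz for a given vacuum wavelength in nanometres."""
    return C / (wavelength_nm * 1e-9) / 1e12

print(frequency_thz(400))  # violet edge of the visible band: about 750 THz
print(frequency_thz(700))  # red edge: about 430 THz
```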

EMR in the visible light region consists of quanta (called photons) that are at the lower end of the energies capable of causing electronic excitation within molecules, which leads to changes in the bonding or chemistry of the molecule. At the lower end of the visible light spectrum, EMR becomes invisible to humans (infrared) because its photons no longer have enough individual energy to cause a lasting molecular change (a change in conformation) in the visual molecule retinal in the human retina, a change that triggers the sensation of vision.

There exist animals that are sensitive to various types of infrared, but not by means of quantum-absorption. Infrared sensing in snakes depends on a kind of natural thermal imaging, in which tiny packets of cellular water are raised in temperature by the infrared radiation. EMR in this range causes molecular vibration and heating effects, which is how these animals detect it.

Above the frequency range of visible light, ultraviolet light becomes invisible to humans, mostly because it is absorbed by the cornea with wavelengths shorter than 360 nm and the internal lens at wavelengths shorter than 400 nm. Furthermore, the rods and cones located in the retina of the human eye cannot detect the very short (shorter than 360 nm) ultraviolet wavelengths and are in fact damaged by ultraviolet. Many animals with eyes that do not require lenses (such as insects and shrimp) are able to detect ultraviolet, by quantum photon-absorption mechanisms, in much the same chemical way that humans detect visible light.

Various sources define visible light as narrowly as 420–680 nm[8][9] to as broadly as 380–800 nm.[10][11] Under ideal laboratory conditions, people can see infrared up to at least 1,050 nm;[12] children and young adults may perceive ultraviolet wavelengths down to about 310–313 nm.[13][14][15]

Plant growth is also affected by the colour spectrum of light, a process known as photomorphogenesis.

Speed of light

Beam of sun light inside the cavity of Rocca ill'Abissu at Fondachelli-Fantina, Sicily

The speed of light in vacuum is defined to be exactly 299,792,458 m/s (approximately 186,282 miles per second). The fixed value of the speed of light in SI units results from the fact that the metre is now defined in terms of the speed of light. All forms of electromagnetic radiation move at exactly this same speed in vacuum.

Physicists have attempted to measure the speed of light throughout history. Galileo attempted to measure it in the seventeenth century. An early experiment to measure the speed of light was conducted by Ole Rømer, a Danish physicist, in 1676. Using a telescope, Rømer observed the motions of Jupiter and one of its moons, Io. Noting discrepancies in the apparent period of Io's orbit, he calculated that light takes about 22 minutes to traverse the diameter of Earth's orbit.[16] However, the size of that orbit was not known at the time. If Rømer had known the diameter of the Earth's orbit, he would have calculated a speed of 227,000,000 m/s.
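Rømer's 22-minute figure, combined with the modern value of the Earth–Sun distance (which he did not have), reproduces the speed estimate mentioned above:

```python
# Rømer's inference: light crosses the diameter of Earth's orbit in about 22 minutes.
AU = 1.496e11               # metres, modern mean Earth-Sun distance (not known to Rømer)
orbit_diameter = 2 * AU
travel_time = 22 * 60       # seconds

speed = orbit_diameter / travel_time
print(f"{speed:.3e} m/s")   # roughly 2.27e8 m/s, about 24% below the modern value
```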

Another, more accurate measurement of the speed of light was performed in Europe by Hippolyte Fizeau in 1849.[17] Fizeau directed a beam of light at a mirror several kilometers away. A rotating cogwheel was placed in the path of the light beam as it traveled from the source to the mirror and back to its origin. Fizeau found that at a certain rate of rotation, the beam would pass through one gap in the wheel on the way out and the next gap on the way back. Knowing the distance to the mirror, the number of teeth on the wheel and the rate of rotation, Fizeau was able to calculate the speed of light as 313,000,000 m/s.
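Fizeau's geometry can be checked numerically. The values below approximate his 1849 apparatus (assumed here: 720 teeth, a mirror 8.633 km away, and the wheel speed at which the returning beam passes through the next gap):

```python
# Fizeau's cogwheel: the round trip to the mirror takes exactly the time for the
# wheel to advance one tooth pitch (out through one gap, back through the next).
teeth = 720                # gaps/teeth on the wheel (assumed value)
distance = 8633.0          # metres from wheel to mirror (assumed value)
rotation_rate = 25.2       # revolutions per second at the next-gap condition (assumed)

pitch_time = 1 / (teeth * rotation_rate)    # seconds to advance one tooth pitch
c_estimate = 2 * distance / pitch_time
print(f"{c_estimate:.3e} m/s")              # close to Fizeau's 3.13e8 m/s
```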

Léon Foucault carried out an experiment which used rotating mirrors to obtain a value of 298,000,000 m/s[17] in 1862. Albert A. Michelson conducted experiments on the speed of light from 1877 until his death in 1931. He refined Foucault's methods in 1926 using improved rotating mirrors to measure the time it took light to make a round trip from Mount Wilson to Mount San Antonio in California. The precise measurements yielded a speed of 299,796,000 m/s.[18]

The effective velocity of light in various transparent substances containing ordinary matter is less than in vacuum. For example, the speed of light in water is about 3/4 of that in vacuum.
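The slowdown in a medium is governed by the refractive index, v = c/n. For water, with n ≈ 1.33:

```python
# Speed of light in a medium: v = c / n, where n is the refractive index.
C = 299_792_458   # m/s, speed of light in vacuum
n_water = 1.333   # refractive index of water for visible light (approximate)

v_water = C / n_water
print(f"{v_water:.3e} m/s, or {v_water / C:.2f} of c")  # about three quarters of c
```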

Two independent teams of physicists were reported to have brought light to a "complete standstill" by passing it through a Bose–Einstein condensate of the element rubidium, one team at Harvard University and the Rowland Institute for Science in Cambridge, Massachusetts and the other at the Harvard–Smithsonian Center for Astrophysics, also in Cambridge.[19] However, the popular description of light being "stopped" in these experiments refers only to light being stored in the excited states of atoms, then re-emitted at an arbitrary later time, as stimulated by a second laser pulse. During the time it had "stopped", it had ceased to be light.

Optics


The study of light and the interaction of light and matter is termed optics. Optics has different forms appropriate to different circumstances. Geometrical optics, appropriate for understanding things like eyes, lenses, cameras, fiber optics, and mirrors, works well when the wavelength of light is small in comparison to the objects it interacts with. Physical optics incorporates wave properties and is needed to understand diffraction and interference. Quantum optics applies when studying individual photons interacting with matter.[20]: 33 

Surface scattering


A transparent object allows light to be transmitted, or pass through. Conversely, an opaque object does not allow light to pass through, instead reflecting or absorbing the light it receives. Most objects do not reflect or transmit light specularly; instead they scatter the incoming light to some degree, a property related to glossiness. Surface scattering is caused by the roughness of the reflecting surface, while internal scattering is caused by differences in refractive index between the particles and the medium inside the object. Like transparent objects, translucent objects allow light to pass through, but they also scatter certain wavelengths of light via internal scattering.[21]

Refraction

Due to refraction, the straw dipped in water appears bent and the ruler scale compressed when viewed from a shallow angle.

Refraction is the bending of light rays when passing through a surface between one transparent material and another. It is described by Snell's law:

n1 sin θ1 = n2 sin θ2

where θ1 is the angle between the ray and the surface normal in the first medium, θ2 is the angle between the ray and the surface normal in the second medium, and n1 and n2 are the indices of refraction; n = 1 in a vacuum and n > 1 in a transparent substance.

When a beam of light crosses the boundary between a vacuum and another medium, or between two different media, the wavelength of the light changes, but the frequency remains constant. If the beam of light is not orthogonal (or rather normal) to the boundary, the change in wavelength results in a change in the direction of the beam. This change of direction is known as refraction.
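A minimal sketch of Snell's law in Python; the 45° air-to-water example is illustrative:

```python
import math

# Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
def refraction_angle(theta1_deg: float, n1: float, n2: float) -> float:
    """Angle of the refracted ray from the surface normal, in degrees."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1.0:
        raise ValueError("total internal reflection: no refracted ray")
    return math.degrees(math.asin(s))

# A ray entering water (n ~ 1.33) from air (n ~ 1.00) at 45 degrees
# bends toward the normal, emerging at roughly 32 degrees:
print(round(refraction_angle(45.0, 1.00, 1.33), 1))
```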

The refractive quality of lenses is frequently used to manipulate light in order to change the apparent size of images. Magnifying glasses, spectacles, contact lenses, microscopes and refracting telescopes are all examples of this manipulation.

Light sources


There are many sources of light. A body at a given temperature emits a characteristic spectrum of black-body radiation. A simple thermal source is sunlight, the radiation emitted by the photosphere of the Sun at around 6,000 K (5,730 °C; 10,340 °F). Solar radiation peaks in the visible region of the electromagnetic spectrum when plotted in wavelength units,[22] and roughly 44% of the radiation that reaches the ground is visible.[23] Another example is incandescent light bulbs, which emit only around 10% of their energy as visible light and the remainder as infrared. A common thermal light source in history is the glowing solid particles in flames, but these also emit most of their radiation in the infrared and only a fraction in the visible spectrum.

The peak of the black-body spectrum is in the deep infrared, at about 10 micrometre wavelength, for relatively cool objects like human beings. As the temperature increases, the peak shifts to shorter wavelengths, producing first a red glow, then a white one and finally a blue-white colour as the peak moves out of the visible part of the spectrum and into the ultraviolet. These colours can be seen when metal is heated to "red hot" or "white hot". Blue-white thermal emission is not often seen, except in stars (the commonly seen pure-blue colour in a gas flame or a welder's torch is in fact due to molecular emission, notably by CH radicals emitting a wavelength band around 425 nm and is not seen in stars or pure thermal radiation).
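The shift of the peak with temperature is quantified by Wien's displacement law, λ_peak = b/T with b ≈ 2.898×10⁻³ m·K; a quick sketch reproduces the figures above:

```python
# Wien's displacement law: peak wavelength of black-body emission is b / T.
WIEN_B = 2.898e-3   # Wien's displacement constant, m*K

def peak_wavelength_um(temperature_k: float) -> float:
    """Black-body peak wavelength in micrometres at the given temperature."""
    return WIEN_B / temperature_k * 1e6

print(peak_wavelength_um(310))    # human body (~310 K): deep infrared, ~9.3 um
print(peak_wavelength_um(5800))   # solar surface: ~0.5 um, in the visible band
```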

Atoms emit and absorb light at characteristic energies. This produces "emission lines" in the spectrum of each atom. Emission can be spontaneous, as in light-emitting diodes, gas discharge lamps (such as neon lamps and neon signs, mercury-vapor lamps, etc.) and flames (light from the hot gas itself—so, for example, sodium in a gas flame emits characteristic yellow light). Emission can also be stimulated, as in a laser or a microwave maser.

Deceleration of a free charged particle, such as an electron, can produce visible radiation: cyclotron radiation, synchrotron radiation and bremsstrahlung radiation are all examples of this. Particles moving through a medium faster than the speed of light in that medium can produce visible Cherenkov radiation. Certain chemicals produce visible radiation by chemiluminescence. In living things, this process is called bioluminescence. For example, fireflies produce light by this means, and boats moving through water can disturb plankton, producing a glowing wake.

Certain substances produce light when they are illuminated by more energetic radiation, a process known as fluorescence. Some substances emit light slowly after excitation by more energetic radiation. This is known as phosphorescence. Phosphorescent materials can also be excited by bombarding them with subatomic particles. Cathodoluminescence is one example. This mechanism is used in cathode-ray tube television sets and computer monitors.

Hong Kong illuminated by colourful artificial lighting

Certain other mechanisms can produce light, including electroluminescence, scintillation, sonoluminescence and triboluminescence.

When the concept of light is intended to include very-high-energy photons (gamma rays), additional generation mechanisms include radioactive decay and particle–antiparticle annihilation.

Measurement


Light is measured with two main alternative sets of units: radiometry consists of measurements of light power at all wavelengths, while photometry measures light with wavelength weighted with respect to a standardized model of human brightness perception. Photometry is useful, for example, to quantify illumination intended for human use.

The photometry units are different from most systems of physical units in that they take into account how the human eye responds to light. The cone cells in the human eye are of three types which respond differently across the visible spectrum, and the cumulative response peaks at a wavelength of around 555 nm. Therefore, two sources of light which produce the same intensity (W/m2) of visible light do not necessarily appear equally bright. The photometry units are designed to take this into account and therefore are a better representation of how "bright" a light appears to be than raw intensity. They relate to raw power by a quantity called luminous efficacy and are used for purposes like determining how to best achieve sufficient illumination for various tasks in indoor and outdoor settings. The illumination measured by a photocell sensor does not necessarily correspond to what is perceived by the human eye; without filters, which may be costly, photocells and charge-coupled devices (CCDs) tend to respond to some infrared, ultraviolet or both.
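The conversion between radiometric and photometric quantities runs through luminous efficacy; a sketch, assuming monochromatic sources and the photopic peak of 683 lm/W at 555 nm:

```python
# Luminous flux = radiant power * 683 lm/W * V(lambda), where V is the eye's
# normalized sensitivity (1.0 at 555 nm, smaller elsewhere in the visible band).
K_MAX = 683.0   # lm/W, maximum photopic luminous efficacy, at 555 nm

def luminous_flux_lm(radiant_power_w: float, v_lambda: float) -> float:
    """Luminous flux in lumens for a monochromatic source."""
    return radiant_power_w * K_MAX * v_lambda

print(luminous_flux_lm(1.0, 1.0))   # 1 W of green light at 555 nm: 683 lumens
print(luminous_flux_lm(1.0, 0.1))   # 1 W at a wavelength where the eye is 10% as sensitive
```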

Light pressure


Light exerts physical pressure on objects in its path, a phenomenon which can be deduced from Maxwell's equations but is more easily explained by the particle nature of light: photons strike and transfer their momentum. Light pressure is equal to the power of the light beam divided by c, the speed of light. Due to the magnitude of c, the effect of light pressure is negligible for everyday objects. For example, a one-milliwatt laser pointer exerts a force of about 3.3 piconewtons on the object being illuminated; thus, one could lift a U.S. penny with laser pointers, but doing so would require about 30 billion 1-mW laser pointers.[24] However, in nanometre-scale applications such as nanoelectromechanical systems (NEMS), the effect of light pressure is more significant, and exploiting light pressure to drive NEMS mechanisms and to flip nanometre-scale physical switches in integrated circuits is an active area of research.[25] At larger scales, light pressure can cause asteroids to spin faster,[26] acting on their irregular shapes as on the vanes of a windmill. The possibility of making solar sails that would accelerate spaceships in space is also under investigation.[27][28]
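The piconewton figure follows directly from dividing the beam power by c (for a fully absorbing target; a perfectly reflecting one receives twice this force):

```python
# Radiation pressure: force on an absorbing surface = beam power / c.
C = 299_792_458   # m/s, speed of light in vacuum

def radiation_force_n(power_w: float) -> float:
    """Force in newtons exerted by a fully absorbed light beam of the given power."""
    return power_w / C

force = radiation_force_n(1e-3)     # a 1 mW laser pointer
print(f"{force * 1e12:.2f} pN")     # about 3.3 piconewtons, as quoted above
```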

Although the motion of the Crookes radiometer was originally attributed to light pressure, this interpretation is incorrect; the characteristic Crookes rotation is the result of a partial vacuum.[29] This should not be confused with the Nichols radiometer, in which the (slight) motion caused by torque (though not enough for full rotation against friction) is directly caused by light pressure.[30] As a consequence of light pressure, Einstein in 1909 predicted the existence of "radiation friction" which would oppose the movement of matter.[31] He wrote, "radiation will exert pressure on both sides of the plate. The forces of pressure exerted on the two sides are equal if the plate is at rest. However, if it is in motion, more radiation will be reflected on the surface that is ahead during the motion (front surface) than on the back surface. The backward-acting force of pressure exerted on the front surface is thus larger than the force of pressure acting on the back. Hence, as the resultant of the two forces, there remains a force that counteracts the motion of the plate and that increases with the velocity of the plate. We will call this resultant 'radiation friction' in brief."

Light momentum is usually aligned with its direction of motion. In evanescent waves, however, momentum can be transverse to the direction of propagation.[32]

Historical theories about light, in chronological order


Classical Greece and Hellenism


In the fifth century BC, Empedocles postulated that everything was composed of four elements: fire, air, earth and water. He believed that the goddess Aphrodite made the human eye out of the four elements and that she lit the fire in the eye, which shone out from the eye and made sight possible. If this were true, one could see during the night just as well as during the day, so Empedocles postulated an interaction between rays from the eyes and rays from a source such as the sun.[33]

In about 300 BC, Euclid wrote Optica, in which he studied the properties of light. Euclid postulated that light travelled in straight lines, and he described the laws of reflection and studied them mathematically. He questioned whether sight is the result of a beam from the eye, asking how one sees the stars immediately if one closes one's eyes and then opens them at night. The problem disappears if the beam from the eye travels infinitely fast.[34]

In 55 BC, Lucretius, a Roman who carried on the ideas of earlier Greek atomists, wrote that "The light & heat of the sun; these are composed of minute atoms which, when they are shoved off, lose no time in shooting right across the interspace of air in the direction imparted by the shove." (from On the nature of the Universe). Despite being similar to later particle theories, Lucretius's views were not generally accepted. Ptolemy (c. second century) wrote about the refraction of light in his book Optics.[35]

Classical India


In ancient India, the Hindu schools of Samkhya and Vaisheshika, from around the early centuries AD, developed theories on light. According to the Samkhya school, light is one of the five fundamental "subtle" elements (tanmatra) out of which emerge the gross elements. The atomicity of these elements is not specifically mentioned, and it appears that they were actually taken to be continuous.[36] The Vishnu Purana refers to sunlight as "the seven rays of the sun".[36]

The Indian Buddhists, such as Dignāga in the fifth century and Dharmakirti in the seventh century, developed a type of atomism holding that reality is composed of atomic entities that are momentary flashes of light or energy. They viewed light as an atomic entity equivalent to energy.[36]

Descartes


René Descartes (1596–1650) held that light was a mechanical property of the luminous body, rejecting the "forms" of Ibn al-Haytham and Witelo as well as the "species" of Roger Bacon, Robert Grosseteste and Johannes Kepler.[37] In 1637 he published a theory of the refraction of light that assumed, incorrectly, that light travelled faster in a denser medium than in a less dense medium. Descartes arrived at this conclusion by analogy with the behaviour of sound waves.[citation needed] Although Descartes was incorrect about the relative speeds, he was correct in assuming that light behaved like a wave and in concluding that refraction could be explained by the speed of light in different media.

Descartes was not the first to use mechanical analogies, but because he clearly asserted that light is only a mechanical property of the luminous body and the transmitting medium, his theory of light is regarded as the start of modern physical optics.[37]

Particle theory

Pierre Gassendi

Pierre Gassendi (1592–1655), an atomist, proposed a particle theory of light which was published posthumously in the 1660s. Isaac Newton studied Gassendi's work at an early age and preferred his view to Descartes's theory of the plenum. He stated in his Hypothesis of Light of 1675 that light was composed of corpuscles (particles of matter) which were emitted in all directions from a source. One of Newton's arguments against the wave nature of light was that waves were known to bend around obstacles, while light travelled only in straight lines. He did, however, explain the phenomenon of the diffraction of light (which had been observed by Francesco Grimaldi) by allowing that a light particle could create a localised wave in the aether.

Newton's theory could be used to predict the reflection of light, but could only explain refraction by incorrectly assuming that light accelerated upon entering a denser medium because the gravitational pull was greater. Newton published the final version of his theory in his Opticks of 1704. His reputation helped the particle theory of light to hold sway during the eighteenth century. The particle theory of light led Pierre-Simon Laplace to argue that a body could be so massive that light could not escape from it. In other words, it would become what is now called a black hole. Laplace withdrew his suggestion later, after a wave theory of light became firmly established as the model for light (as has been explained, neither a particle nor a wave theory is fully correct). A translation of Newton's essay on light appears in The Large Scale Structure of Space-Time, by Stephen Hawking and George F. R. Ellis.

The fact that light could be polarized was first explained qualitatively by Newton using the particle theory. Étienne-Louis Malus in 1810 created a mathematical particle theory of polarization. Jean-Baptiste Biot in 1812 showed that this theory explained all known phenomena of light polarization. At that time, polarization was considered proof of the particle theory.

Wave theory


To explain the origin of colours, Robert Hooke (1635–1703) developed a "pulse theory" and compared the spreading of light to that of waves in water in his 1665 work Micrographia ("Observation IX"). In 1672 Hooke suggested that light's vibrations could be perpendicular to the direction of propagation. Christiaan Huygens (1629–1695) worked out a mathematical wave theory of light in 1678 and published it in his Treatise on Light in 1690. He proposed that light was emitted in all directions as a series of waves in a medium called the luminiferous aether. As waves are not affected by gravity, it was assumed that they slowed down upon entering a denser medium.[38] Another supporter of the wave theory was Leonhard Euler. He argued in Nova theoria lucis et colorum (1746) that diffraction could more easily be explained by a wave theory.

Christiaan Huygens
Thomas Young's sketch of water waves showing diffraction[39]

The wave theory predicted that light waves could interfere with each other like sound waves (as noted around 1800 by Thomas Young). Young showed by means of numerous diffraction experiments that light behaved as waves.[40]: 101  He first publicly stated his "general law" of interference in January 1802, in his book A Syllabus of a Course of Lectures on Natural and Experimental Philosophy:[41]

But the general law, by which all these appearances are governed, may be very easily deduced from the interference of two coincident undulations, which either cooperate, or destroy each other, in the same manner as two musical notes produce an alternate intension and remission, in the beating of an imperfect unison.[42]

He also proposed that different colours were caused by different wavelengths of light and explained colour vision in terms of three-coloured receptors in the eye.

In 1816 André-Marie Ampère gave Augustin-Jean Fresnel an idea that the polarization of light can be explained by the wave theory if light were a transverse wave.[43] Later, Fresnel independently worked out his own wave theory of light and presented it to the Académie des Sciences in 1817. Siméon Denis Poisson challenged Fresnel's model, claiming that it predicted a bright spot in the shadow behind a circular obstacle contrary to common sense. Dominique-François-Jean Arago created an experiment that showed the bright spot: Poisson's challenge became new evidence for the wave theory.[40]: 109  In 1818, Young wrote to Arago suggesting that light must be transverse waves, not the longitudinal waves characteristic of sound. Fresnel took up the idea and was able to show via mathematical methods that polarization could be explained by a transverse wave theory of light with no longitudinal vibration.[40]: 115 

The weakness of the wave theory was that light waves, like sound waves, would need a medium for transmission. The existence of the hypothetical substance luminiferous aether proposed by Huygens in 1678 was cast into strong doubt in the late nineteenth century by the Michelson–Morley experiment.

Newton's corpuscular theory implied that light would travel faster in a denser medium, while the wave theory of Huygens and others implied the opposite. At that time, the speed of light could not be measured accurately enough to decide which theory was correct. The first to make a sufficiently accurate measurement was Léon Foucault, in 1850.[44] His result supported the wave theory, and the classical particle theory was finally abandoned (only to partly re-emerge in the twentieth century as photons in quantum theory).

Electromagnetic theory

A linearly polarized electromagnetic wave traveling along the z-axis, with E denoting the electric field and perpendicular B denoting magnetic field

In 1845, Michael Faraday discovered that the plane of polarization of linearly polarized light is rotated when the light rays travel along the magnetic field direction in the presence of a transparent dielectric, an effect now known as Faraday rotation.[45] This was the first evidence that light was related to electromagnetism. In 1846 he speculated that light might be some form of disturbance propagating along magnetic field lines.[45] Faraday proposed in 1847 that light was a high-frequency electromagnetic vibration, which could propagate even in the absence of a medium such as the ether.[46]

Faraday's work inspired James Clerk Maxwell to study electromagnetic radiation and light. Maxwell discovered that self-propagating electromagnetic waves would travel through space at a constant speed, which happened to be equal to the previously measured speed of light. From this, Maxwell concluded that light was a form of electromagnetic radiation: he first stated this result in 1862 in On Physical Lines of Force. In 1873, he published A Treatise on Electricity and Magnetism, which contained a full mathematical description of the behavior of electric and magnetic fields, still known as Maxwell's equations. Soon after, Heinrich Hertz confirmed Maxwell's theory experimentally by generating and detecting radio waves in the laboratory and demonstrating that these waves behaved exactly like visible light, exhibiting properties such as reflection, refraction, diffraction and interference. Maxwell's theory and Hertz's experiments led directly to the development of modern radio, radar, television, electromagnetic imaging and wireless communications.

In the quantum theory, photons are seen as wave packets of the waves described in the classical theory of Maxwell. The quantum theory was needed to explain effects, even with visible light, that Maxwell's classical theory could not (such as spectral lines).

Quantum theory


In 1900 Max Planck, attempting to explain black-body radiation, suggested that although light was a wave, these waves could gain or lose energy only in finite amounts related to their frequency. Planck called these "lumps" of light energy "quanta" (from a Latin word for "how much"). In 1905, Albert Einstein used the idea of light quanta to explain the photoelectric effect and suggested that these light quanta had a "real" existence. In 1923 Arthur Holly Compton showed that the wavelength shift seen when low-intensity X-rays scattered from electrons (so-called Compton scattering) could be explained by a particle theory of X-rays, but not a wave theory. In 1926 Gilbert N. Lewis named these light quanta photons.[47]
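Planck's relation E = hf (equivalently E = hc/λ) makes the quantization concrete; a sketch for a visible-light photon:

```python
# Planck-Einstein relation: photon energy E = h * f = h * c / wavelength.
H = 6.62607015e-34    # J*s, Planck constant (exact in SI)
C = 299_792_458       # m/s, speed of light in vacuum
EV = 1.602176634e-19  # J per electronvolt (exact in SI)

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy in electronvolts of a photon with the given vacuum wavelength."""
    return H * C / (wavelength_nm * 1e-9) / EV

print(photon_energy_ev(550))   # green light: roughly 2.25 eV per photon
```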

Eventually quantum mechanics came to picture light as (in some sense) both a particle and a wave and (in another sense) as a phenomenon which is neither a particle nor a wave (particles and waves, strictly speaking, being macroscopic phenomena, such as baseballs or ocean waves). Instead, under some approximations light can be described sometimes with mathematics appropriate to one type of macroscopic metaphor (particles) and sometimes another macroscopic metaphor (waves).

As in the case for radio waves and the X-rays involved in Compton scattering, physicists have noted that electromagnetic radiation tends to behave more like a classical wave at lower frequencies, but more like a classical particle at higher frequencies, but never completely loses all qualities of one or the other. Visible light, which occupies a middle ground in frequency, can easily be shown in experiments to be describable using either a wave or particle model, or sometimes both.

In 1924–1925, Satyendra Nath Bose showed that light follows statistics different from those of classical particles. With Einstein, he generalized this result to a whole set of integer-spin particles, called bosons (after Bose), which follow Bose–Einstein statistics. The photon is a massless boson of spin 1.

In 1927, Paul Dirac quantized the electromagnetic field. Pascual Jordan and Vladimir Fock generalized this process to treat many-body systems as excitations of quantum fields, a process with the misnomer of second quantization. And at the end of the 1940s a full theory of quantum electrodynamics was developed using quantum fields based on the works of Julian Schwinger, Richard Feynman, Freeman Dyson, and Shinichiro Tomonaga.

Quantum optics


John R. Klauder, George Sudarshan, Roy J. Glauber, and Leonard Mandel applied quantum theory to the electromagnetic field in the 1950s and 1960s to gain a more detailed understanding of photodetection and the statistics of light (see degree of coherence). This led to the introduction of the coherent state as a concept which addressed variations between laser light, thermal light, exotic squeezed states, etc., as it became understood that light cannot be fully described just by referring to the electromagnetic fields of the classical wave picture. In 1977, H. Jeff Kimble et al. demonstrated a single atom emitting one photon at a time, further compelling evidence that light consists of photons. Previously unknown quantum states of light with characteristics unlike classical states, such as squeezed light, were subsequently discovered.

Development of short and ultrashort laser pulses—created by Q switching and modelocking techniques—opened the way to the study of what became known as ultrafast processes. Applications for solid state research (e.g. Raman spectroscopy) were found, and mechanical forces of light on matter were studied. The latter led to levitating and positioning clouds of atoms or even small biological samples in an optical trap or optical tweezers by laser beam. This, along with Doppler cooling and Sisyphus cooling, was the crucial technology needed to achieve the celebrated Bose–Einstein condensation.

Other remarkable results are the demonstration of quantum entanglement, quantum teleportation, and quantum logic gates. The latter are of much interest in quantum information theory, a subject which partly emerged from quantum optics, partly from theoretical computer science.

Uses of light on Earth


Sunlight provides the energy that green plants use to create sugars mostly in the form of starches, which release energy into the living things that digest them. This process of photosynthesis provides virtually all the energy used by living things. Some species of animals generate their own light, a process called bioluminescence. For example, fireflies use light to locate mates and vampire squid use it to hide themselves from prey.

from Grokipedia
Light is electromagnetic radiation within the portion of the electromagnetic spectrum that is visible to the human eye, corresponding to wavelengths between approximately 380 and 750 nanometers.[1] This narrow band, often spanning from violet (shorter wavelengths) to red (longer wavelengths), enables vision and is produced by various sources such as the sun, incandescent bulbs, and luminescent materials.[2] Light travels through vacuum at a constant speed of exactly 299,792,458 meters per second, a fundamental physical constant that defines the meter in the International System of Units (SI).[3] As a form of energy transfer, light consists of oscillating electric and magnetic fields perpendicular to its direction of propagation, manifesting wave-like properties such as interference and diffraction.[4] Simultaneously, light exhibits particle-like behavior, acting as discrete packets of energy called photons, each with energy proportional to its frequency, in accordance with quantum mechanics.[5] This wave-particle duality underpins modern physics, explaining phenomena from the photoelectric effect to the interference patterns of double-slit experiments.[6] Beyond visibility, light's broader electromagnetic context includes ultraviolet and infrared radiation adjacent to the visible spectrum, influencing applications in photography, telecommunications, and medical imaging.[7]

Electromagnetic Nature

Electromagnetic Spectrum

Electromagnetic radiation is a form of energy propagated through space as coupled oscillating electric and magnetic fields that are perpendicular to each other and to the direction of propagation.[8] These waves travel at the speed of light in vacuum, approximately $ c = 3 \times 10^8 $ m/s, a universal constant for all electromagnetic waves regardless of frequency or wavelength.[3] The electromagnetic spectrum encompasses the full range of these waves, ordered by decreasing wavelength (or increasing frequency), from long-wavelength, low-energy radio waves to short-wavelength, high-energy gamma rays. The spectrum is divided into regions based on wavelength and frequency, each exhibiting distinct interactions with matter. The table below summarizes approximate ranges for the primary components, derived from standard astronomical and physical classifications.[9]
| Region | Wavelength range | Frequency range (Hz) |
|---|---|---|
| Radio waves | > 1 × 10⁻¹ m | < 3 × 10⁹ |
| Microwaves | 1 × 10⁻³ to 1 × 10⁻¹ m | 3 × 10⁹ to 3 × 10¹¹ |
| Infrared | 7 × 10⁻⁷ to 1 × 10⁻³ m | 3 × 10¹¹ to 4 × 10¹⁴ |
| Visible | 4 × 10⁻⁷ to 7 × 10⁻⁷ m (400–700 nm) | 4 × 10¹⁴ to 7.5 × 10¹⁴ |
| Ultraviolet | 1 × 10⁻⁸ to 4 × 10⁻⁷ m | 7.5 × 10¹⁴ to 3 × 10¹⁶ |
| X-rays | 1 × 10⁻¹¹ to 1 × 10⁻⁸ m | 3 × 10¹⁶ to 3 × 10¹⁹ |
| Gamma rays | < 1 × 10⁻¹¹ m | > 3 × 10¹⁹ |
The energy $ E $ of a photon in the spectrum is related to its frequency $ f $ by Planck's relation $ E = h f $, where $ h $ is Planck's constant ($ 6.626 \times 10^{-34} $ J s). This implies that photon energy decreases with increasing wavelength, as frequency is inversely proportional to wavelength ($ f = c / \lambda $). Thus, radio waves carry the lowest energy, while gamma rays carry the highest.[9] The naming conventions for these regions arose historically from their discovery and initial detection methods. Infrared, meaning "below red," was named by William Herschel in 1800 after observing heating effects beyond visible red light.[10] Ultraviolet, or "beyond violet," was identified by Johann Ritter in 1801 through its chemical effects on silver chloride. Radio waves and microwaves trace to Heinrich Hertz's 1887–1888 experiments confirming Maxwell's predictions. X-rays were termed by Wilhelm Röntgen in 1895 for their mysterious penetrating properties, and gamma rays by Paul Villard in 1900, later confirmed as electromagnetic by further studies.[10] Visible light occupies the narrow band perceptible to the human eye.[9]
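As a numerical sketch of Planck's relation $ E = hf = hc/\lambda $ across these spectral regions (not from the source; constants rounded to four significant figures, example wavelengths illustrative):

```python
# Photon energy from vacuum wavelength via E = h f = h c / lambda.
H = 6.626e-34   # Planck's constant, J s
C = 2.998e8     # speed of light in vacuum, m/s
EV = 1.602e-19  # joules per electronvolt

def photon_energy_ev(wavelength_m: float) -> float:
    """Photon energy in electronvolts for a given vacuum wavelength."""
    return H * C / wavelength_m / EV

# Energy falls as wavelength grows: gamma rays >> visible >> radio.
for name, lam in [("gamma ray", 1e-12), ("X-ray", 1e-9),
                  ("green light", 555e-9), ("radio", 1.0)]:
    print(f"{name:>11}: {photon_energy_ev(lam):.3g} eV")
```

A 555 nm green photon carries about 2.2 eV, while a 1 pm gamma-ray photon carries over a million electronvolts, illustrating the energy span of the table above.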

Visible Light

Visible light constitutes the segment of the electromagnetic spectrum detectable by the human eye, spanning wavelengths from approximately 400 to 700 nanometers. This range corresponds to a continuum of colors, starting with violet at the shorter wavelengths (around 400–450 nm), progressing through blue (450–495 nm), green (495–570 nm), yellow (570–590 nm), orange (590–620 nm), and ending with red at the longer wavelengths (620–700 nm).[2] These colors emerge from the differential refraction and dispersion of light wavelengths, illustrating the spectral nature of white light.[11]

The perception of color relies on how these wavelengths interact and combine. In additive color mixing, as exemplified by the RGB model employed in electronic displays and lighting, red, green, and blue primary lights are superimposed to generate secondary colors and ultimately white light when combined in equal intensities.[12] Conversely, subtractive color mixing, utilized in printing and painting via the CMY model (cyan, magenta, yellow), works by pigments absorbing specific wavelengths from incident white light, with the mixture of all three primaries yielding black or near-black. A foundational example of visible light's chromatic composition is its decomposition into a rainbow spectrum when passed through a prism, revealing the inherent multiplicity of wavelengths in seemingly uniform white light.[13][11]

Human visual perception of visible light is tuned to this narrow band, with photopic vision, dominant in well-lit environments, peaking in sensitivity at 555 nm in the green-yellow region, where cone photoreceptor cells enable color discrimination through three types sensitive to short (blue), medium (green), and long (red) wavelengths. In dim scotopic conditions, rod cells predominate for low-light detection, providing grayscale vision without color but with heightened sensitivity to motion and shapes, peaking around 507 nm. This dual system optimizes adaptation across lighting levels, though overall sensitivity drops sharply beyond the 400–700 nm bounds.[14][15]

As non-ionizing radiation, visible light lacks the photon energy to eject electrons from atoms, distinguishing it from ionizing ultraviolet (below 400 nm) and X-rays (0.01–10 nm), which can damage DNA directly. Nonetheless, it exerts photochemical effects by exciting molecules in biological systems, such as triggering melanin production in skin cells upon absorption by chromophores like melanin and opsins, leading to pigmentation and potential oxidative stress.[16][17]

Visible light's colors also carry cultural and symbolic weight across societies, often leveraging innate perceptual cues for communication. For example, traffic signal systems universally employ red for stop (evoking danger due to its long-wavelength visibility), yellow for caution (signaling transition), and green for proceed (indicating safety), a standardized convention that enhances road safety through intuitive color associations.[18]

Fundamental Properties

Speed of Light

The speed of light in vacuum, denoted as $ c $, is a fundamental physical constant exactly equal to 299,792,458 meters per second.[3] This value has been fixed by definition in the International System of Units (SI) since 1983, when the meter was redefined in terms of the distance light travels in vacuum in 1/299,792,458 of a second, anchoring the unit to this invariant speed.

Early attempts to measure $ c $ began in the 17th century. In 1676, Danish astronomer Ole Rømer provided the first quantitative estimate by observing discrepancies in the timing of the eclipses of Jupiter's moon Io, attributing the delays to the finite time light takes to travel varying distances across Earth's orbit around the Sun; the resulting calculation yielded approximately 227,000 km/s, remarkably close to the modern value given the era's observational limits.[19] Terrestrial measurements advanced in the 19th century with Hippolyte Fizeau's 1849 experiment, which used a rapidly rotating toothed wheel to interrupt and time light pulses traveling 8.6 km to a distant mirror and back, yielding a speed of about 313,000 km/s in air.[20] Refinements continued with Albert A. Michelson's 1926 rotating-mirror apparatus at Mount Wilson Observatory, where an octagonal mirror spun at high speed reflected light over a 35-km path, producing a value of 299,796 km/s with unprecedented precision for the time.[21] Modern determinations, such as those using laser interferometry in the 1970s, confirmed the value to within a few parts per billion before its exact definition, employing coherent light sources to measure phase shifts over known baselines.[22]

The invariance of $ c $ underpins Albert Einstein's 1905 theory of special relativity, which posits that light's speed in vacuum remains constant for all inertial observers regardless of their relative motion or the source's velocity, a postulate derived from the null result of the Michelson-Morley experiment and Maxwell's equations.[23] This leads to profound consequences, including time dilation, whereby moving clocks tick slower, and length contraction in the direction of motion, as observers reconcile the unchanging $ c $ with differing relative speeds. These principles also enable the derivation of the mass-energy equivalence $ E = mc^2 $, showing that a body's rest energy is proportional to its mass times $ c^2 $, as explored in Einstein's companion 1905 paper linking inertia to energy content.

In media other than vacuum, light travels slower, with speed $ v $ related to $ c $ by the refractive index $ n = c / v $, a dimensionless quantity greater than 1 that quantifies the medium's optical density.[24] The invariance of $ c $ in vacuum also establishes it as the universal speed limit for information and causal influences, ensuring that no signal or particle with mass can exceed it, thereby preserving causality across spacetime as dictated by relativistic principles.[23]
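The relation $ v = c/n $ can be sketched numerically; the refractive indices below are typical textbook values (an assumption, since $ n $ varies with wavelength):

```python
# Light slows in a medium: v = c / n, where n is the refractive index.
C = 299_792_458.0  # m/s, exact by SI definition

def speed_in_medium(n: float) -> float:
    """Phase velocity of light in a medium of refractive index n."""
    return C / n

# Typical indices (approximate, wavelength-dependent):
for medium, n in [("vacuum", 1.0), ("water", 1.33), ("crown glass", 1.52)]:
    v = speed_in_medium(n)
    print(f"{medium:>12}: v = {v/1e8:.3f} x 10^8 m/s")
```

In crown glass, light travels at only about two-thirds of its vacuum speed, which is what drives the refraction discussed in the optics section below.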

Wave-Particle Duality

Light exhibits both wave-like and particle-like properties, a phenomenon known as wave-particle duality, which reconciles classical descriptions with quantum mechanics. This duality is not a limitation of measurement but a fundamental aspect of light's nature, revealed through experiments that highlight one behavior or the other depending on the setup.[25]

In its wave description, light propagates as transverse electromagnetic waves, with oscillating electric and magnetic fields perpendicular to the direction of travel. These waves can be polarized, meaning the electric field vector oscillates in a specific plane (linear polarization) or rotates (circular polarization), a property unique to transverse waves. This framework arises from Maxwell's equations, which describe the interdependence of electric and magnetic fields; for instance, Faraday's law states $ \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t} $, and the corrected Ampère's law in free space is $ \nabla \times \mathbf{B} = \mu_0 \epsilon_0 \frac{\partial \mathbf{E}}{\partial t} $, predicting self-sustaining waves at speed $ c $.[26]

Conversely, light's particle nature is embodied in photons, discrete quanta that are massless bosons carrying quantized energy and momentum. Each photon's energy is given by $ E = hf $, where $ h $ is Planck's constant and $ f $ is the light's frequency, a relation Einstein applied to light in 1905. Photons also possess momentum $ p = \frac{h}{\lambda} $, with $ \lambda $ the wavelength, linking the particle's relativistic properties to wave characteristics.[27][28]

The photoelectric effect exemplifies light's particle behavior: when monochromatic light strikes a metal surface, electrons are emitted only if the frequency exceeds a material-specific threshold $ \nu_0 $, with maximum kinetic energy $ K_{\max} = hf - \phi $ (where $ \phi = h\nu_0 $ is the work function), independent of intensity. This quantization, defying classical wave predictions, earned Einstein the 1921 Nobel Prize and established the photon concept.[27]

Compton scattering further confirms photons as particles with momentum: X-rays incident on loosely bound electrons scatter with increased wavelength $ \Delta\lambda = \frac{h}{m_e c}(1 - \cos\theta) $, where $ m_e $ is the electron mass, $ c $ the speed of light, and $ \theta $ the scattering angle. This shift matches conservation laws for particle collisions, not classical wave scattering, as observed in experiments reported by Compton in 1923.[28]

The double-slit experiment highlights wave properties through interference fringes formed by light passing through two slits, but in single-photon versions, detections accumulate as discrete hits that collectively build the pattern, showing particles following probabilistic wave-guided paths. Modern setups using attenuated lasers verify this duality without which-path information.[29]

The de Broglie hypothesis unifies these views by assigning a wavelength $ \lambda = \frac{h}{p} $ to any particle with momentum $ p $, extending naturally to photons, where it equates the wave wavelength to the particle's de Broglie wavelength. Proposed in 1924, this relation underpins quantum mechanics and explains light's dual manifestations.[30]
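The photoelectric and Compton relations can be evaluated directly. A minimal sketch, with rounded SI constants; the sodium work function of ~2.28 eV is a textbook value used here as an assumption, not taken from the source:

```python
import math

H = 6.626e-34    # Planck's constant, J s
C = 2.998e8      # speed of light, m/s
M_E = 9.109e-31  # electron mass, kg
EV = 1.602e-19   # joules per electronvolt

def photoelectric_kmax_ev(freq_hz: float, work_fn_ev: float) -> float:
    """Maximum electron kinetic energy K_max = h f - phi, in eV.
    A negative result means the frequency is below threshold: no emission."""
    return H * freq_hz / EV - work_fn_ev

def compton_shift_m(theta_rad: float) -> float:
    """Compton wavelength increase, (h / m_e c)(1 - cos theta), in meters."""
    return H / (M_E * C) * (1 - math.cos(theta_rad))

print(photoelectric_kmax_ev(6.0e14, 2.28))  # above threshold: positive
print(photoelectric_kmax_ev(4.0e14, 2.28))  # below threshold: negative
print(compton_shift_m(math.pi))             # backscatter: maximum shift
```

Note that for backscattering (θ = π) the Compton shift reaches twice the electron's Compton wavelength, about 4.85 pm, regardless of the incident X-ray intensity.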

Light Propagation and Optics

Reflection and Refraction

Reflection occurs when light encounters a boundary between two media and changes direction without altering its speed or wavelength, provided the media are non-absorbing. The law of reflection states that the angle of incidence, measured from the normal to the surface, equals the angle of reflection. This principle holds for smooth surfaces and can be derived from wave interference considerations or Fermat's principle.[31]

Reflections are classified as specular or diffuse depending on surface roughness. Specular reflection produces a clear image, as seen in mirrors where parallel rays reflect parallel to each other at equal angles to the normal.[32] In contrast, diffuse reflection scatters light in multiple directions from rough surfaces like paper or asphalt, enabling visibility of objects under diffuse illumination without a distinct image.[32]

Refraction describes the bending of light as it passes from one medium to another due to a change in speed, quantified by the refractive index $ n $, which is the ratio of the speed of light in vacuum to that in the medium.[33] Snell's law governs this bending: $ n_1 \sin \theta_1 = n_2 \sin \theta_2 $, where $ \theta_1 $ and $ \theta_2 $ are the angles of incidence and refraction, respectively.[33] This law arises from Fermat's principle, which posits that light follows the path of least time between two points.[31]

When light travels from a denser to a rarer medium ($ n_1 > n_2 $), refraction reaches a limit at the critical angle $ \theta_c = \sin^{-1}(n_2 / n_1) $, beyond which total internal reflection occurs, with all light reflecting internally.[34] This phenomenon is essential in fiber optics, where light is confined within a core by repeated total internal reflections.[35]

Lenses exploit refraction to focus or diverge light beams. Converging lenses, thicker at the center, bring parallel rays to a focal point, while diverging lenses, thinner at the center, spread them apart. For a thin symmetric lens in air, the focal length $ f $ is approximated by the lensmaker's formula:

$ f = \frac{R}{2(n-1)} $

where $ R $ is the radius of curvature of each surface and $ n $ is the refractive index of the lens material.

Prisms, typically triangular, refract light through two non-parallel faces, deviating the beam and separating wavelengths due to dispersion. Dispersion arises because the refractive index $ n $ varies with wavelength, being higher for shorter wavelengths like blue light than for longer ones like red. In a prism, this causes white light to split into a spectrum, as demonstrated by the formation of rainbows, where sunlight refracts and disperses in atmospheric water droplets.

Refraction in non-uniform media can produce optical illusions such as mirages. In inferior mirages, hot ground creates a layer of low-density air near the surface; light from distant objects bends upward upon entering cooler air above, creating the appearance of water on roads.[36]
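Snell's law and the critical-angle condition can be sketched as follows (refractive indices are typical textbook values, used here as assumptions):

```python
import math

def refraction_angle_deg(n1: float, n2: float, theta1_deg: float):
    """Snell's law: n1 sin(theta1) = n2 sin(theta2). Returns the refraction
    angle in degrees, or None when total internal reflection occurs."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1:
        return None  # no transmitted ray
    return math.degrees(math.asin(s))

def critical_angle_deg(n1: float, n2: float) -> float:
    """Critical angle for n1 > n2; beyond it all light reflects internally."""
    return math.degrees(math.asin(n2 / n1))

print(critical_angle_deg(1.5, 1.0))         # glass-to-air: ~41.8 degrees
print(refraction_angle_deg(1.0, 1.33, 30))  # air-to-water: bends toward normal
print(refraction_angle_deg(1.5, 1.0, 60))   # beyond critical angle: None
```

The glass-to-air critical angle of roughly 42° is what allows optical fibers (and prism reflectors) to trap light by repeated total internal reflection.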

Diffraction and Interference

Diffraction is a fundamental wave phenomenon in which light bends around obstacles or spreads through apertures, deviating from the straight-line propagation predicted by geometric optics. This effect arises from the wave nature of light, as described by the Huygens-Fresnel principle, which posits that every point on a wavefront acts as a source of secondary spherical wavelets, with the new wavefront formed by the superposition of these wavelets, modulated by an obliquity factor to account for directional propagation.[37] The principle, originally proposed by Christiaan Huygens in 1690 and refined by Augustin-Jean Fresnel in 1818, provides the theoretical foundation for understanding diffraction patterns observed in experiments.[38]

In single-slit diffraction, light passing through a narrow slit of width $ a $ produces an interference pattern on a screen, characterized by a central bright maximum flanked by alternating minima and secondary maxima. The minima occur where destructive interference dominates, at angles given by $ \sin \theta = m \lambda / a $, where $ \theta $ is the angle from the central axis, $ \lambda $ is the wavelength, and $ m $ is a non-zero integer; this arises from the path difference between wavelets from opposite edges of the slit being an integer multiple of $ \lambda $.[39] For circular apertures, such as telescope objectives, diffraction limits the resolution, with the angular radius of the Airy disk (the central bright spot) approximated by $ \theta \approx 1.22 \lambda / D $, where $ D $ is the aperture diameter; this Rayleigh criterion defines the minimum resolvable angle between two point sources, beyond which they blur into one.[40]

Interference occurs when two or more coherent light waves superpose, resulting in regions of enhanced (constructive) or reduced (destructive) intensity depending on their phase difference. In Thomas Young's double-slit experiment of 1801, monochromatic light passing through two closely spaced slits separated by distance $ d $ illuminates a screen at distance $ L $, producing bright fringes spaced by $ \Delta y = \lambda L / d $, derived from the condition for constructive interference that the path difference equal $ m \lambda $ ($ m $ an integer).[41] This pattern demonstrates the wave nature of light, with fringe visibility requiring spatial and temporal coherence between the sources.

Thin-film interference exemplifies this in everyday phenomena, such as the iridescent colors of soap bubbles, where light reflected from the front and back surfaces of a thin soap film of thickness $ t $ and refractive index $ n $ interferes; for constructive interference in reflection (accounting for phase shifts), the condition is $ 2nt = m \lambda $ for certain configurations, leading to wavelength-dependent color reinforcement.

Polarization influences interference patterns, particularly when light from interfering sources has specific orientations. For polarized light incident on a polarizer at angle $ \theta $ to its transmission axis, the transmitted intensity follows Malus's law, $ I = I_0 \cos^2 \theta $, where $ I_0 $ is the incident intensity; in interference setups like crossed polarizers with a birefringent sample, this modulates the overall fringe contrast by altering the effective amplitude of the superposed waves.[42]

Diffraction gratings exploit these principles in spectroscopy by dispersing light into its spectral components, enabling wavelength separation for analysis. A grating with slit spacing $ d $ produces maxima at angles satisfying $ d \sin \theta = m \lambda $, allowing different wavelengths to be resolved spatially based on their angular deviation, far superior to prisms for precise measurements in atomic spectra.[43]

Observable interference requires coherence: the light sources must maintain a constant phase relationship over the spatial extent (transverse coherence) and duration (longitudinal coherence) of the experiment. Incoherent sources, like unfiltered sunlight, average out phase fluctuations, washing out the fringes, whereas lasers provide coherence lengths exceeding meters for clear patterns.[44]
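The double-slit and grating formulas above can be evaluated numerically; the slit separation, screen distance, and grating pitch below are illustrative assumptions:

```python
import math

def fringe_spacing(wavelength: float, slit_sep: float, screen_dist: float) -> float:
    """Double-slit bright-fringe spacing on a distant screen, dy = lambda L / d."""
    return wavelength * screen_dist / slit_sep

def grating_maximum_deg(wavelength: float, line_spacing: float, m: int = 1) -> float:
    """Angle of the m-th order grating maximum, from d sin(theta) = m lambda."""
    return math.degrees(math.asin(m * wavelength / line_spacing))

# 633 nm He-Ne laser, slits 0.2 mm apart, screen 1 m away:
print(fringe_spacing(633e-9, 0.2e-3, 1.0))      # ~3.2 mm between fringes
# 600 lines/mm grating (pitch ~1.667 um), first order:
print(grating_maximum_deg(633e-9, 1e-3 / 600))  # ~22.3 degrees
```

Millimeter-scale fringes from sub-micron wavelengths show why interference is the standard route to measuring optical wavelengths precisely.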

Light Sources

Natural Sources

Natural sources of light encompass a variety of emission processes occurring without human intervention, ranging from thermal radiation in stellar and terrestrial environments to chemical and electrical excitations. These sources produce light across the electromagnetic spectrum, primarily through mechanisms that convert energy into photons via atomic, molecular, or plasma interactions.[45] Thermal sources dominate many natural light emissions, arising from the agitation of charged particles in hot matter, which approximates blackbody radiation for ideal absorbers and emitters. The spectral distribution of this radiation is described by Planck's law, which quantifies the intensity of emitted radiation as a function of wavelength and temperature:
$ B(\lambda, T) = \frac{2hc^2}{\lambda^5} \frac{1}{e^{hc / \lambda k T} - 1} $

where $ h $ is Planck's constant, $ c $ is the speed of light, $ k $ is Boltzmann's constant, $ \lambda $ is the wavelength, and $ T $ is the absolute temperature. This formula, derived from quantum considerations of energy quantization, predicts a continuous spectrum peaking at wavelengths inversely proportional to temperature, as per Wien's displacement law.[46][47] The Sun exemplifies such a source, with its photosphere at approximately 5800 K emitting a near-blackbody spectrum that peaks in the visible range around 500 nm, providing the primary illumination for Earth.[48]

Geological thermal sources, such as molten lava from volcanic eruptions, also produce an incandescent glow through blackbody-like radiation at temperatures typically between 1000 °C and 1200 °C for basaltic magma. This incandescence results from the high thermal energy of the viscous melt, visible as a dull red to orange hue at the surface, diminishing as the material cools and solidifies.[49][50]

Celestial sources extend thermal emission to cosmic scales, with stars like the Sun generating light primarily through nuclear fusion in their cores, where hydrogen nuclei combine to form helium, releasing vast energy that propagates outward as photons. This core energy heats the stellar surface, leading to thermal radiation, though absorption and re-emission in the outer layers modify the spectrum.[51][52]

Non-thermal celestial phenomena include auroras, where charged particles from the solar wind, mostly electrons and protons, collide with atmospheric gases like oxygen and nitrogen near Earth's poles, exciting atoms to emit light at specific wavelengths (e.g. green from oxygen at ~557 nm). Lightning, another atmospheric electrical discharge, ionizes air into a plasma channel at temperatures exceeding 30,000 K, producing a brief, intense flash through recombination of electrons and ions, spanning visible wavelengths with a bluish-white appearance.[53][54][55][56]

Bioluminescence represents a chemical emission process in living organisms, triggered by enzymatic reactions that oxidize substrates to release energy as photons. In fireflies, the enzyme luciferase catalyzes the oxidation of luciferin in the presence of oxygen and ATP, producing light primarily in the yellow-green range (500–600 nm), with peak emission around 560 nm for many species; this "cold light" generates minimal heat, with a quantum yield of approximately 41%.[57][58][59]

Natural light sources exhibit distinct spectral characteristics: thermal sources like stars and lava yield continuous spectra, with smooth intensity distributions across wavelengths due to the collective emission from dense, hot matter. In contrast, processes involving excited atoms or ions, such as in auroras, lightning plasmas, or bioluminescence, often produce line spectra, featuring discrete emission lines at wavelengths corresponding to atomic transitions (e.g. specific colors from ionized nitrogen in lightning). These line spectra arise from low-density gases where individual quantum jumps dominate over broadband thermal effects.[60][61]
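Wien's displacement law makes the temperature-to-peak-wavelength connection concrete; the 1400 K lava figure below is a rough midpoint of the basaltic range quoted above (an assumption for illustration):

```python
# Wien's displacement law: a blackbody spectrum peaks at lambda_max = b / T.
B_WIEN = 2.898e-3  # Wien's displacement constant, m K

def peak_wavelength_nm(temp_k: float) -> float:
    """Peak emission wavelength in nanometers for a blackbody at temp_k."""
    return B_WIEN / temp_k * 1e9

print(peak_wavelength_nm(5800))  # Sun's photosphere: ~500 nm (visible, green)
print(peak_wavelength_nm(1400))  # glowing basaltic lava: ~2070 nm (infrared)
```

The lava peak lies deep in the infrared, which is why molten rock looks only dull red: the eye sees just the short-wavelength tail of its thermal spectrum.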

Artificial Sources

Artificial sources of light are engineered devices that produce illumination through controlled physical processes, enabling applications from everyday lighting to precision technologies. Unlike natural sources, these rely on electrical energy to generate photons via thermal, electrical discharge, or quantum mechanisms, with ongoing advancements improving efficiency and spectral control. Key developments span from the late 19th century onward, transforming human environments by providing reliable, tunable light.

The evolution of artificial light began with Thomas Edison's incandescent bulb in 1879, which marked the first practical electric light source after extensive experimentation with filaments.[62] This was followed by fluorescent lamps in the early 20th century and light-emitting diodes (LEDs) in the mid-20th century, culminating in the laser's invention in 1960 by Theodore Maiman using a ruby crystal.[63] These milestones, building on principles like stimulated emission proposed by Albert Einstein in 1917, have driven efficiency gains from under 5% to over 50% in modern designs.[63]

Incandescent bulbs operate on thermal emission, where an electric current heats a filament, typically tungsten with a melting point of about 3420 °C, to around 2500 K, causing it to radiate visible light as blackbody radiation.[64] However, their efficiency is low, converting only about 5% of input energy to visible light, with the rest lost as heat.[64] Tungsten's high emissivity and resistance to evaporation at these temperatures made it ideal for filaments, enabling bulbs to last up to 1000 hours.[65]

Fluorescent lamps generate light through electrical discharge in low-pressure mercury vapor, exciting mercury atoms to produce ultraviolet radiation that is then converted to visible light by phosphors coating the tube interior.[66] Pioneered by Peter Cooper Hewitt's mercury vapor lamp in 1901, these lamps achieve efficiencies of 20–30% by minimizing thermal losses compared to incandescents.[66] The phosphor layer allows color tuning, making them suitable for broad illumination needs.

Light-emitting diodes (LEDs) produce light via electroluminescence in a semiconductor p-n junction, where electrons and holes recombine to emit photons with energy $ E_g = hf $, matching the material's bandgap $ E_g $.[67] Early red LEDs used gallium arsenide phosphide in the 1960s, but blue LEDs, essential for white light, emerged in 1993 using gallium nitride (GaN) developed by Shuji Nakamura and colleagues, enabling high-efficiency white LEDs through phosphor conversion.[67] GaN's wide bandgap of about 3.4 eV allows blue emission around 450 nm, with overall efficiencies exceeding 50% in modern devices.[67]

Lasers produce coherent, monochromatic light through stimulated emission, where incident photons trigger excited atoms to release identical photons, as described by Einstein's 1917 coefficients relating absorption, spontaneous emission, and stimulated emission rates.[63] Achieving this requires population inversion, where more atoms are in an excited state than in the ground state, often via optical or electrical pumping. The first laser, Maiman's 1960 ruby device, used a chromium-doped ruby crystal pumped by a flashlamp to emit red light at 694 nm.[63] Gas lasers like the helium-neon (He-Ne), operational since 1961, use an electric discharge in a He-Ne mixture for continuous red output at 632.8 nm, prized for its coherence over meters.[68] Solid-state lasers, such as neodymium-doped yttrium aluminum garnet (Nd:YAG), employ a Nd³⁺-doped crystal pumped by diodes or lamps to lase at 1064 nm in the near-infrared, valued for high power and beam quality.[68] These properties, spatial and temporal coherence with narrow linewidth, distinguish lasers from incoherent sources.
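The photon-energy/wavelength conversion underlying LED and laser emission can be sketched with the handy constant $ hc \approx 1240 $ eV·nm (a standard rounded value, used here as an assumption):

```python
# Photon energy <-> emission wavelength: lambda = h c / E, with
# h*c ~ 1239.8 eV nm (rounded). Source wavelengths are from the text above.
HC_EV_NM = 1239.8

def energy_ev(wavelength_nm: float) -> float:
    """Photon energy in eV for a given emission wavelength in nm."""
    return HC_EV_NM / wavelength_nm

print(energy_ev(694.0))   # ruby laser red line: ~1.79 eV
print(energy_ev(632.8))   # He-Ne laser line: ~1.96 eV
print(energy_ev(450.0))   # blue LED photon: ~2.76 eV
```

Note the ~2.76 eV blue-LED photon energy sits below GaN's 3.4 eV bandgap; practical blue emitters use alloyed (indium-containing) active layers with a smaller effective gap.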

Measurement and Detection

Photometric Quantities

Photometric quantities quantify light in terms of its perception by the human visual system, weighting radiant energy according to the eye's spectral sensitivity rather than physical power alone.[69] These measures are defined by the International Commission on Illumination (CIE) and form the basis for lighting standards, display technologies, and visual comfort assessments. The core weighting function is the photopic luminosity function $ V(\lambda) $, which describes the average human eye's sensitivity to wavelengths of light, peaking at 555 nm in the green region of the visible spectrum.[70] Luminous flux, denoted $ \Phi_v $, represents the total amount of visible light emitted, transmitted, or received by a source, measured in lumens (lm). It is calculated by integrating the spectral power distribution of the light source with the luminosity function:
$ \Phi_v = 683 \int_0^\infty \Phi_{e,\lambda}(\lambda)\, V(\lambda)\, d\lambda $
where 683 lm/W is the maximum luminous efficacy for monochromatic light at 555 nm, and $ \Phi_{e,\lambda}(\lambda) $ is the spectral radiant flux in watts per nanometer.[69] This quantity captures the overall "light output" as perceived by the eye, making it essential for evaluating the efficiency of lamps and LEDs.[71]

Luminous intensity, $ I_v $, measures the brightness of a light source in a particular direction, defined as the luminous flux per unit solid angle, with the unit candela (cd), where 1 cd = 1 lm/sr. The candela is an SI base unit, fixed by the luminous intensity of monochromatic radiation at approximately 555 nm with a radiant intensity of 1/683 W/sr. It is particularly useful for point sources like LEDs or stars, emphasizing directional emission weighted by human vision.[72]

Luminance, $ L_v $, quantifies the brightness of an extended surface or source as seen by an observer, expressed as luminous intensity per unit projected area, in candela per square meter (cd/m²). For example, typical office display screens have luminance levels of 250–350 cd/m² to ensure comfortable viewing under ambient lighting.[73] This metric is crucial for assessing the perceived brightness of screens, road signs, and illuminated surfaces, incorporating the eye's sensitivity via $ V(\lambda) $.[71]

Illuminance, $ E_v $, describes the luminous flux incident on a surface per unit area, measured in lux (lx), where 1 lx = 1 lm/m². It guides lighting design; for instance, the Illuminating Engineering Society (IES) recommends 300–500 lx for general office work to support visual tasks without fatigue.[74] Like other photometric quantities, it is derived from radiant flux weighted by $ V(\lambda) $, focusing on the light reaching the eye from illuminated environments.[69]

Color metrics in photometry extend these quantities to hue and saturation, using the CIE 1931 color space, which models human color perception through tristimulus values $ X, Y, Z $.
These values are obtained by integrating the light's spectral power distribution with CIE standard observer color-matching functions $ \bar{x}(\lambda) $, $ \bar{y}(\lambda) $, and $ \bar{z}(\lambda) $, where $ Y $ corresponds to luminance and aligns with $ V(\lambda) $ since $ \bar{y}(\lambda) = V(\lambda) $.[75] Chromaticity coordinates $ x $ and $ y $ are derived as $ x = X/(X+Y+Z) $ and $ y = Y/(X+Y+Z) $, plotting colors on a two-dimensional diagram that excludes brightness, enabling precise specification of light color for applications like displays and photography.[76]
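For illustration, the luminous-flux integral above can be approximated numerically. The sketch below is a minimal Python example, not a standards-grade photometric computation: it replaces the tabulated CIE function $ V(\lambda) $ with a hypothetical Gaussian peaked at 555 nm, and assumes a flat spectral radiant flux of 1 mW/nm across the visible band.

```python
import numpy as np

# Sampled wavelength grid across the visible band, 1 nm spacing
wavelengths = np.arange(380, 781, 1.0)  # nm

def v_lambda(lam):
    """Rough Gaussian stand-in for the photopic luminosity function V(lambda);
    real photometry uses the tabulated CIE 1931 values."""
    return np.exp(-0.5 * ((lam - 555.0) / 45.0) ** 2)

# Hypothetical flat spectral radiant flux: 1e-3 W/nm over the whole band
phi_e = np.full_like(wavelengths, 1e-3)  # W/nm

# Phi_v = 683 * integral of Phi_e(lambda) V(lambda) dlambda,
# approximated as a Riemann sum with dlambda = 1 nm
phi_v = 683.0 * np.sum(phi_e * v_lambda(wavelengths)) * 1.0
print(f"Approximate luminous flux: {phi_v:.1f} lm")
```

Under these assumptions the result is on the order of 80 lm, far below the roughly 270 lm that the same ~0.4 W of radiant power would yield if concentrated at 555 nm, illustrating how spectral spread reduces luminous efficacy.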

Radiometric Quantities

Radiometric quantities provide objective measures of electromagnetic radiation's energy, focusing on physical properties such as power and intensity across the full spectrum, without regard to human visual sensitivity. These quantities form the basis for quantifying light in physics, astronomy, and engineering applications, enabling precise calculations of energy transfer in optical systems. Unlike photometric measures, radiometric ones integrate over all wavelengths and directions to capture total radiant energy.[77] The fundamental radiometric quantity is radiant flux, denoted Φ_e, which represents the total power of electromagnetic radiation emitted, transmitted, or received by a source or surface, measured in watts (W). It accounts for the integrated energy over all wavelengths and solid angles, serving as the starting point for deriving other quantities in radiometry. For instance, the radiant flux from a light source determines its overall energetic output in free space.[78] Radiance, denoted L_e, quantifies the directional distribution of radiant flux, defined as the power per unit solid angle per unit projected area perpendicular to the direction of propagation, with units of watts per steradian per square meter (W/sr/m²). This quantity is conserved along a ray in lossless media, making it essential for analyzing light propagation through optical systems without changes in intensity due to distance or focusing. Radiance thus provides a measure of a source's brightness independent of distance.[79] Irradiance, denoted E_e, measures the radiant flux incident on a surface per unit area, expressed in watts per square meter (W/m²). It describes the power density from incoming radiation, crucial for assessing energy delivery to detectors or materials. 
A key example is the solar constant, the mean irradiance from the Sun at Earth's orbit on a surface normal to the rays, valued at approximately 1361.6 W/m² during solar minimum conditions.[80] Spectral radiometric quantities extend these definitions to specific wavelengths or frequencies, allowing analysis of radiation's distribution across the spectrum. Spectral radiance B(λ, T), for a blackbody at temperature T, is given by Planck's law:
B(\lambda, T) = \frac{2hc^2}{\lambda^5} \frac{1}{e^{hc / \lambda kT} - 1}
where h is Planck's constant, c is the speed of light, k is Boltzmann's constant, and λ is wavelength; this equation describes the maximum possible spectral radiance at thermal equilibrium, forming the basis for calibrating broadband sources. Integrating spectral quantities over wavelength yields total radiometric values, encompassing the full electromagnetic spectrum. Detection of radiometric quantities relies on specialized instruments that convert radiant energy into measurable electrical signals. Photodiodes operate via the photoelectric effect, generating current proportional to incident photon flux in the visible and near-infrared ranges, offering high speed and quantum efficiency for spectral irradiance measurements. Bolometers detect radiation through temperature-induced resistance changes in absorptive materials, suitable for broadband thermal detection across infrared wavelengths. For absolute calibration, cryogenic radiometers employ electrical substitution at low temperatures (near 5 K), equating absorbed optical power to equivalent electrical heating with uncertainties below 0.01%, serving as primary standards traceable to SI units. These detectors ensure accurate realization of radiometric scales in metrology.[81][82][83]
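Planck's law can be evaluated directly. The following Python sketch computes the spectral radiance of a blackbody at a roughly solar temperature (5778 K) at two wavelengths; consistent with Wien's displacement law, the value near the ~500 nm peak exceeds the value at 1000 nm.

```python
import math

h = 6.62607015e-34  # Planck constant, J*s
c = 2.99792458e8    # speed of light in vacuum, m/s
k = 1.380649e-23    # Boltzmann constant, J/K

def spectral_radiance(lam, T):
    """Planck's law B(lambda, T) in W * sr^-1 * m^-3 (wavelength in metres)."""
    return (2.0 * h * c**2 / lam**5) / math.expm1(h * c / (lam * k * T))

B_500 = spectral_radiance(500e-9, 5778.0)    # near the solar peak
B_1000 = spectral_radiance(1000e-9, 5778.0)  # near-infrared
print(B_500 > B_1000)  # the ~500 nm value is larger for a solar-temperature source
```

`math.expm1` is used instead of `exp(...) - 1` for numerical accuracy at long wavelengths, where the exponent becomes small.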

Light-Matter Interactions

Absorption and Scattering

When light interacts with matter, absorption occurs as photons are captured by atoms or molecules, converting the light's energy into other forms such as heat or electronic excitation. This process reduces the intensity of the transmitted light and is quantitatively described by the Beer-Lambert law, which states that the transmitted intensity $I$ through a medium of thickness $x$ is given by
I = I_0 e^{-\alpha x},
where $I_0$ is the initial intensity and $\alpha$ is the absorption coefficient, dependent on the material and wavelength.[84] This law, originally formulated by Pierre Bouguer in 1729 and refined by August Beer in 1852, applies to dilute solutions and homogeneous media where scattering is negligible. In absorbing media, the energy transfer often leads to thermal heating via vibrational relaxation or to excited states that may re-emit light at different wavelengths, though the primary effect is energy dissipation.[85]

Scattering, in contrast, redirects light without net energy loss per photon but randomizes its direction, contributing to phenomena like the diffusion of light in atmospheres or tissues. For particles much smaller than the light wavelength (typically < 0.1 times the wavelength), Rayleigh scattering dominates, with scattered intensity proportional to $1/\lambda^4$, where $\lambda$ is the wavelength; this strong wavelength dependence explains the blue color of the daytime sky, as shorter blue wavelengths (~450 nm) scatter more efficiently than longer red ones (~650 nm) by molecules like nitrogen and oxygen.[86] For larger particles comparable to or exceeding the wavelength, such as water droplets in clouds (diameters ~10–20 μm), Mie scattering prevails, scattering all visible wavelengths more uniformly and resulting in the white appearance of clouds, though with forward-biased patterns that enhance brightness when viewed from below.[87]

A specialized form of scattering, Raman scattering, is inelastic and involves a frequency shift due to energy exchange with molecular vibrations or rotations. The shift $\Delta\nu = \nu_0 - \nu_s$, where $\nu_0$ is the incident frequency and $\nu_s$ the scattered frequency, corresponds to vibrational energy levels (typically 50–8000 cm⁻¹), enabling non-destructive probing of molecular structures.[88] Discovered by C. V.
Raman in 1928, this effect is weak (~10⁻⁶ of incident intensity) but crucial for spectroscopy, as the shifted light carries chemical information without requiring sample preparation.[89] In Earth's atmosphere, these processes combine to produce striking visual effects. During sunsets, sunlight traverses a longer path through the air, enhancing scattering of shorter wavelengths and allowing longer red and orange wavelengths to dominate the direct beam, as red light scatters least under Rayleigh conditions.[90] Absorption and scattering also underpin spectroscopy applications, such as analyzing absorption lines in stellar spectra—dark features where specific wavelengths are removed by intervening gas clouds or stellar atmospheres—revealing compositions like hydrogen and helium in stars via Fraunhofer lines.
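The two relations above lend themselves to quick numerical checks. This minimal Python sketch applies the Beer-Lambert law with an illustrative (not measured) absorption coefficient, and compares Rayleigh scattering efficiency at the blue (~450 nm) and red (~650 nm) wavelengths cited above.

```python
import math

def transmitted_intensity(I0, alpha, x):
    """Beer-Lambert law: I = I0 * exp(-alpha * x)."""
    return I0 * math.exp(-alpha * x)

# Hypothetical medium: absorption coefficient 0.5 per metre, 2 m path length
I = transmitted_intensity(1.0, 0.5, 2.0)
print(f"Transmitted fraction: {I:.3f}")  # e^-1, about 0.368

# Rayleigh scattering ~ 1/lambda^4: blue-to-red scattered-intensity ratio
blue, red = 450e-9, 650e-9
ratio = (red / blue) ** 4
print(f"450 nm light scatters about {ratio:.1f}x more than 650 nm light")
```

The resulting ~4.4x ratio is why the clear sky appears blue, while the much longer atmospheric path at sunset removes enough blue light that the directly transmitted beam looks red.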

Radiation Pressure

Radiation pressure refers to the mechanical force exerted by electromagnetic radiation, such as light, on matter due to the transfer of momentum from photons. In the quantum description, a single photon carries momentum $ p = \frac{E}{c} = \frac{h f}{c} $, where $ E $ is the photon's energy, $ c $ is the speed of light in vacuum, $ h $ is Planck's constant, and $ f $ is the frequency. This relation arises from the relativistic energy-momentum equivalence for massless particles, where $ E = p c $. When light interacts with a surface, the momentum transfer results in pressure; for perfect absorption, the pressure $ P $ equals the intensity $ I $ divided by $ c $, or $ P = \frac{I}{c} $, while for perfect reflection, it doubles to $ P = \frac{2I}{c} $. These expressions derive from classical electromagnetic theory, as predicted by James Clerk Maxwell in 1873, and are confirmed in the photon picture. The existence of radiation pressure was first experimentally verified in the early 20th century. Russian physicist Pyotr Lebedev conducted the initial measurements in 1900 using a torsion balance with thin mica vanes suspended in a partial vacuum, detecting a small deflection due to sunlight filtered through a slit. Independently, American physicists Ernest Fox Nichols and Gordon Ferrie Hull performed more precise measurements in 1901, employing a Nichols radiometer to quantify the pressure from arc lamp light on delicately balanced mirrors, achieving results within 1% of theoretical predictions. These experiments provided crucial empirical support for the momentum-carrying nature of light. One natural manifestation of radiation pressure is observed in the dust tails of comets, where solar photons push micron-sized dust particles away from the Sun, forming curved, yellowish tails distinct from the ion tails driven by solar wind. 
This effect is particularly evident as comets approach perihelion, with the pressure accelerating smaller grains outward while larger ones lag behind due to gravity. In technological applications, radiation pressure enables propulsion for solar sails, ultra-lightweight reflective sheets that harness sunlight for thrust without fuel; the acceleration $ a = \frac{2 P A}{m} $ (where $ A $ is sail area and $ m $ is spacecraft mass) allows gradual velocity increases, as demonstrated in missions like Japan's IKAROS sail in 2010 and NASA's Advanced Composite Solar Sail System (ACS3), launched in 2024.[91] Another application is optical tweezers, developed by Arthur Ashkin in 1986, for which he shared the 2018 Nobel Prize in Physics, which use focused laser beams to trap and manipulate microscopic particles via gradient forces $ \mathbf{F} = \frac{n}{2c} \alpha \nabla E^2 $, where $ n $ is the refractive index, $ \alpha $ is polarizability, and $ E $ is the electric field strength; this technique has revolutionized single-molecule studies in biology.[92]
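The pressure and sail-acceleration formulas above combine into a short numerical sketch; the sail area and spacecraft mass below are hypothetical round numbers, not parameters of IKAROS or ACS3.

```python
c = 2.99792458e8   # speed of light, m/s
I_sun = 1361.0     # approximate solar irradiance near Earth, W/m^2

P_absorb = I_sun / c         # pressure on a perfectly absorbing surface, Pa
P_reflect = 2.0 * I_sun / c  # doubled for a perfect reflector, Pa

# Hypothetical solar sail: 200 m^2 perfectly reflective sail, 10 kg total mass
A, m = 200.0, 10.0
a = P_reflect * A / m  # same as a = 2*P*A/m with P the absorbing pressure
print(f"Reflecting pressure: {P_reflect:.2e} Pa")
print(f"Sail acceleration: {a:.2e} m/s^2")
```

The acceleration is tiny (order 10⁻⁴ m/s²), but applied continuously and without propellant it accumulates to hundreds of metres per second of velocity change per month, which is the appeal of solar sailing.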

Historical Development

Ancient and Classical Theories

Ancient civilizations developed early qualitative theories of light and vision, primarily through philosophical and observational means, without quantitative experimentation or distinctions between wave and particle natures. In ancient Greece around 300 BCE, Euclid formalized geometric optics in his treatise Optica, postulating that light propagates in straight lines via visual rays emanating from the eye, which he used to explain reflection and the apparent size of objects based on the angle subtended by these rays.[93] This extramission theory, where sight results from rays emitted by the observer, was endorsed by Plato and Euclid, contrasting with the intromission view that light enters the eye from external sources.[94] Aristotle, in the 4th century BCE, challenged pure extramission by proposing an intromission theory in works like De Anima, arguing that vision occurs when transparent media transmit forms or "species" from objects into the eye, facilitated by light as an active agent that actualizes potential transparency in air and other media.[95] This sparked ongoing debates between emission (extramission) and intromission theories, with Aristotle emphasizing light's role in enabling perception without rays originating solely from the eye, though he retained elements of both.[96] Meanwhile, in ancient China during the 5th century BCE, the philosopher Mozi described pinhole imaging in the Mozi text, observing that light rays entering a small aperture in a dark room project an inverted image of external objects, demonstrating straight-line propagation without invoking vision theories.[97] In ancient India, contributions to optics emerged alongside medical texts. 
The Sushruta Samhita, attributed to Sushruta around the 6th century BCE, detailed ophthalmological procedures including cataract surgery.[98] In the Vaisheshika school, founded by Kanada around the 6th century BCE, light was conceived as streams of fine, high-velocity particles known as tejas (fire atoms), which propagate in straight lines and enable vision by entering the eye from luminous sources, aligning with intromission ideas.[99] During the Hellenistic period, Ptolemy's Optics (2nd century CE) built on Euclidean geometry by compiling empirical tables of refraction angles for light passing from air to water and glass, using an experimental setup to measure incidence and refraction, though his data showed inaccuracies due to observational limits.[100] This work treated light rays as straight lines bent at interfaces, focusing on visual perception without resolving emission debates. In the Islamic Golden Age, Ibn al-Haytham (Alhazen) revolutionized the field in his 11th-century Book of Optics, decisively supporting intromission by refuting extramission through experiments with camera obscura, where light from objects forms images on screens without eye involvement, establishing that vision results from rays entering the eye.[101] Medieval European scholars synthesized these ideas, with Witelo's Perspectiva (late 13th century) providing a comprehensive Latin treatise on optics, drawing heavily from Alhazen and Ptolemy to explore ray propagation, refraction, and perspective in vision, treating light as quantifiable rays for geometric analysis.[102] These pre-modern theories remained largely qualitative and philosophical, emphasizing geometric rays and vision mechanisms through observation and deduction, setting the stage for later experimental transitions without yet distinguishing wave or particle behaviors.[103]

17th-19th Century Theories

In the 17th century, René Descartes proposed an emission theory of light in his work La Dioptrique (1637), positing that light consists of particles propelled instantaneously through a medium of swirling vortices composed of subtle matter, which accounted for phenomena like refraction as mechanical pressures within these cosmic eddies.[104] This corpuscular model emphasized light's propagation as a direct emission from luminous sources, aligning with mechanistic philosophy but assuming infinite speed to explain observations without delay.[105] Isaac Newton advanced the particle theory in his seminal Opticks (1704), describing light as streams of minute, elastic particles that obey laws of motion similar to projectiles.[106] Newton explained refraction not as a change in speed but as the particles' deviation due to attractive forces exerted by denser media, such as glass pulling particles toward it with varying intensity based on their inherent "sides" or properties, which also accounted for color dispersion in prisms.[107] This framework unified reflection as elastic collisions and supported the corpuscular view by fitting empirical data from his prism experiments, though it struggled with later diffraction observations. 
Countering the particle model, Christiaan Huygens introduced a wave theory in his Traité de la Lumière (written in 1678, published 1690), conceiving light as longitudinal pressure waves propagating through an elastic ether—a pervasive, subtle medium filling space.[108] Huygens derived the laws of reflection and refraction geometrically using the concept of secondary wavelets emanating from each point on a wavefront, with the tangent to these wavelets forming the new wavefront, thus explaining light's rectilinear path as the envelope of expanding spherical pulses in the ether.[109] His approach anticipated diffraction as interference among wavelets but lacked quantitative detail for it, emphasizing instead the finite speed of light consistent with astronomical delays. The wave theory gained empirical traction in the early 19th century through Thomas Young's double-slit experiment (1801), which demonstrated interference patterns of alternating bright and dark fringes when light passed through two closely spaced apertures, a phenomenon attributable solely to the superposition of coherent wave trains rather than particle streams.[110] Building on this, Augustin-Jean Fresnel developed a mathematical formulation of diffraction in 1818, applying Huygens' principle with interference to predict the intensity distribution around obstacles, such as the bright spot at the center of a circular shadow (Poisson's spot), which decisively refuted Newton's particle model through precise calculations matching observations.[111] Fresnel's equations for oblique diffraction integrated Young's interference with wave propagation, solidifying the wave nature by quantifying how secondary sources constructively or destructively combine. 
The particle-wave debate culminated in classical unification with James Clerk Maxwell's electromagnetic theory (1865), which portrayed light as transverse electromagnetic waves arising from oscillating electric and magnetic fields in the ether, governed by coupled partial differential equations.[26] Maxwell derived the wave speed as
c = \frac{1}{\sqrt{\epsilon_0 \mu_0}},
where $\epsilon_0$ and $\mu_0$ are the permittivity and permeability of free space, yielding a value matching astronomical measurements of light's velocity and thereby identifying light as an electromagnetic disturbance without invoking separate particles or longitudinal pressures.[26] This synthesis reconciled optics with electricity and magnetism, establishing a comprehensive classical framework for light's propagation and interactions.
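Maxwell's identification can be checked numerically from the modern measured constants; a one-line Python verification:

```python
import math

# CODATA values of the vacuum permittivity and permeability
eps0 = 8.8541878128e-12  # F/m
mu0 = 1.25663706212e-6   # H/m

# Maxwell's result: the wave speed follows from two electrical constants
c = 1.0 / math.sqrt(eps0 * mu0)
print(f"c = {c:.0f} m/s")  # ~299,792,458 m/s, the speed of light
```

That two constants measured in purely electrical experiments reproduce the astronomically measured speed of light was the decisive evidence for the electromagnetic nature of light.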

20th Century and Modern Theories

The advent of quantum theory in the early 20th century revolutionized the understanding of light by resolving paradoxes in classical electromagnetism, particularly through the quantization of energy. In 1900, Max Planck introduced the concept of discrete energy quanta to explain blackbody radiation, proposing that oscillators emit and absorb energy in multiples of $ h\nu $, where $ h $ is Planck's constant and $ \nu $ is the frequency, marking the birth of quantum mechanics.[112] This quantization addressed the ultraviolet catastrophe predicted by classical theory. Building on this, Albert Einstein extended the idea to light itself in 1905, interpreting the photoelectric effect as evidence for light quanta, or photons, each carrying energy $ E = h\nu $, independent of intensity, which explained why light ejects electrons only above a threshold frequency.[113] Einstein's photon hypothesis unified wave and particle descriptions, earning him the 1921 Nobel Prize.[114] Subsequent developments integrated quantum principles into atomic structure and extended duality to matter. In 1913, Niels Bohr proposed a model of the hydrogen atom where electrons occupy discrete energy levels, and light emission occurs via quantized transitions between these levels, producing spectral lines that matched observations.[115] This model incorporated Planck's quanta to stabilize the atom against classical radiation losses. In 1924, Louis de Broglie generalized wave-particle duality by hypothesizing that all matter, like light, exhibits wave properties, with wavelength $ \lambda = h/p $ where $ p $ is momentum, laying the foundation for wave mechanics.[116] These ideas culminated in the full quantum mechanical framework by the late 1920s, where light's dual nature became central. Relativity provided another pillar, redefining light's role in spacetime. 
Einstein's 1905 special relativity posited the invariance of light speed $ c $ in vacuum for all inertial observers, leading to time dilation and length contraction, and establishing light as the universal speed limit. In general relativity (1915), light follows null geodesics—paths where the spacetime interval is zero—defining light cones that delineate causal boundaries in curved spacetime, explaining gravitational lensing and time delays in light propagation.[117] Quantum electrodynamics (QED), developed in the 1940s, emerged as the relativistic quantum field theory of light and matter interactions. Richard Feynman's path integral formulation, along with contributions from Julian Schwinger and Sin-Itiro Tomonaga, described electromagnetic interactions via virtual photon exchange, achieving unprecedented precision, such as in the anomalous magnetic moment of the electron.[118] The fine structure constant $ \alpha \approx 1/137 $, a dimensionless measure of electromagnetic coupling strength, governs these processes and remains a fundamental parameter in QED.[119] Modern theories, encompassed by quantum optics, explore light's quantum states and correlations. 
Coherent states, introduced by Roy Glauber in the 1960s, describe laser light as minimum-uncertainty Gaussian wavepackets, enabling precise quantum descriptions of optical fields.[120] Squeezed states, first theoretically proposed in the 1970s and experimentally realized in the 1980s, reduce uncertainty in one quadrature below the vacuum limit at the expense of the other, enhancing precision in interferometry and gravitational wave detection.[120] Photon entanglement, demonstrating non-local correlations, was experimentally confirmed in Bell tests by Alain Aspect in 1982, violating classical inequalities and supporting quantum mechanics over local hidden variables.[121] By 2025, these frameworks remain foundational, with no paradigm-shifting theoretical advances in light's nature, though applications in quantum information continue to evolve.
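The two quantum relations central to this section, the photon energy $ E = h\nu $ and the de Broglie wavelength $ \lambda = h/p $, are straightforward to evaluate; the wavelength and electron speed below are chosen only for illustration.

```python
h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19  # joules per electronvolt

# Photon energy E = h*nu = h*c/lambda for green light at 550 nm
E_photon = h * c / 550e-9 / eV
print(f"550 nm photon energy: {E_photon:.2f} eV")  # ~2.25 eV

# de Broglie wavelength lambda = h/p for an electron moving at 1e6 m/s
m_e = 9.1093837015e-31  # electron mass, kg
lam_dB = h / (m_e * 1.0e6)
print(f"Electron de Broglie wavelength: {lam_dB * 1e9:.2f} nm")  # ~0.73 nm
```

The sub-nanometre electron wavelength is why electron microscopes resolve far finer detail than optical instruments bounded by the diffraction limit of visible light.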

Applications and Uses

Technological Applications

Optical devices harness the principles of light propagation and focusing to achieve high-resolution imaging and signal transmission. Microscopes, for instance, enable detailed examination of specimens but are constrained by the Abbe diffraction limit, which sets the smallest resolvable distance at approximately half the wavelength of visible light used for illumination, typically around 200-300 nanometers for standard optical systems.[122] Telescopes, divided into refractors that use lenses to bend incoming light rays and focus them at a focal point, and reflectors that employ curved mirrors to gather and redirect light for imaging distant celestial objects, allow astronomers to observe faint stars and galaxies by collecting light over large apertures.[123] Fiber optics rely on total internal reflection, where light signals propagate within a core of higher refractive index surrounded by a cladding of lower index, preventing leakage and enabling high-speed data transmission over long distances with minimal loss.[124] In communications, lasers serve as coherent light sources in fiber optic networks, achieving remarkably low attenuation of less than 0.2 dB per kilometer at wavelengths around 1550 nanometers, which supports terabit-per-second data rates across global infrastructures.[125] LiDAR systems, utilizing pulsed laser beams to measure distances via time-of-flight calculations, produce precise three-dimensional maps for applications such as autonomous vehicle navigation and topographic surveying, with resolutions down to centimeters over kilometers.[126] Imaging technologies exploit light's wave properties for capturing visual information. 
Photography balances light exposure through the interplay of aperture size, shutter speed, and sensor sensitivity (ISO), ensuring optimal brightness and depth of field in recorded images.[127] Holography records the interference patterns between object-scattered light and a coherent reference beam on a photosensitive medium, reconstructing three-dimensional images upon illumination that preserve parallax and depth cues.[128] In energy applications, photovoltaics convert sunlight into electricity through the photovoltaic effect in semiconductor materials, where efficiency is optimized by matching the solar spectrum to the material's bandgap energy, with record efficiencies for crystalline silicon cells reaching 27.8% as of mid-2025 under standard test conditions.[129] Medical procedures leverage light for minimally invasive interventions. Endoscopy employs flexible fiber optic bundles or rigid scopes to deliver illumination deep into the body, allowing real-time visualization of internal organs during diagnostics and surgeries.[130] Laser surgery utilizes targeted absorption of laser energy by tissue chromophores, leading to precise ablation where the irradiated material is vaporized or removed layer by layer with minimal thermal damage to surrounding areas.[131]
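The microscope resolution quoted above follows from the Abbe relation $ d = \lambda / (2\,\mathrm{NA}) $, where NA is the numerical aperture of the objective. A quick Python check with typical values (assumed here, not taken from a specific instrument):

```python
def abbe_limit(wavelength_nm, numerical_aperture):
    """Abbe diffraction limit: smallest resolvable separation, in nm."""
    return wavelength_nm / (2.0 * numerical_aperture)

# Green light (550 nm) with a high-end oil-immersion objective (NA ~ 1.4)
d = abbe_limit(550.0, 1.4)
print(f"Oil-immersion resolution limit: {d:.0f} nm")  # ~196 nm

# A dry objective (NA ~ 0.95) resolves noticeably less detail
print(f"Dry objective: {abbe_limit(550.0, 0.95):.0f} nm")  # ~289 nm
```

Both values fall in the 200-300 nm range cited above, which is why structures finer than this require electron microscopy or super-resolution techniques.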

Biological and Environmental Roles

Light plays a fundamental role in biological processes and environmental systems on Earth, enabling energy transfer, physiological regulation, and ecological balance. In living organisms, light drives essential functions such as photosynthesis and vision, while in the broader environment, it influences climate dynamics and ecosystem interactions. These roles highlight light's integration into natural cycles, from sustaining primary production to shaping evolutionary adaptations. Photosynthesis, the process by which plants, algae, and cyanobacteria convert light energy into chemical energy, relies on chlorophyll pigments that absorb specific wavelengths of light. Chlorophyll a in photosystem II (PSII) has an absorption peak at 680 nm (P680), while in photosystem I (PSI) it peaks at 700 nm (P700), allowing efficient capture of red light for electron excitation in reaction centers.[132] This light-driven process splits water molecules to produce oxygen and fuels the Calvin cycle, summarized by the equation:
6CO_2 + 6H_2O \xrightarrow{\text{light}} C_6H_{12}O_6 + 6O_2
where carbon dioxide and water yield glucose and oxygen, supporting nearly all life through oxygenic photosynthesis that originated in ancient cyanobacteria.[133] In animal vision, light regulates circadian rhythms via intrinsically photosensitive retinal ganglion cells containing melanopsin, which is particularly sensitive to blue light around 480 nm, synchronizing biological clocks to day-night cycles.[134] Disruptions from reduced winter light exposure can lead to seasonal affective disorder (SAD), characterized by depressive symptoms linked to melanopsin pathway variations and altered sleep timing.[135] Within ecosystems, ultraviolet (UV) radiation facilitates vitamin D synthesis in the skin of vertebrates upon exposure to UVB wavelengths (290-320 nm), essential for calcium metabolism and immune function across food webs.[136] Infrared radiation contributes to heat balance by warming surfaces and atmospheres, maintaining thermal equilibria in habitats from forests to oceans through absorption and re-emission.[137] However, artificial light pollution disorients nocturnal migrants, such as birds, drawing them into urban areas and increasing collision risks during seasonal journeys.[138] Solar radiation powers global weather patterns by heating the atmosphere unevenly, driving convection, winds, and precipitation cycles that distribute energy across latitudes.[139] Albedo effects amplify this in polar regions, where melting ice caps reduce reflectivity—from about 0.8 for snow to 0.1 for open water—absorbing more solar energy and accelerating warming through positive feedback.[140] Evolutionary adaptations to light have shaped diverse organisms, with light-harvesting complexes in photosynthetic species evolving to optimize fluctuating light environments for energy capture.[141] In light-scarce deep-sea habitats, bioluminescence has independently evolved in approximately 76% of metazoans, enabling predation, camouflage, and communication via luciferin-luciferase 
reactions that mimic or counter faint downwelling light.[142]

References
