Meteorology
from Wikipedia

Meteorology is the scientific study of the Earth's atmosphere and short-term atmospheric phenomena (i.e., weather), with a focus on weather forecasting.[1] It has applications in the military, aviation, energy production, transport, agriculture, construction, weather warnings, and disaster management.

Along with climatology, atmospheric physics, atmospheric chemistry, and aeronomy, meteorology forms the broader field of the atmospheric sciences. The interactions between Earth's atmosphere and its oceans (notably El Niño and La Niña) are studied in the interdisciplinary field of hydrometeorology. Other interdisciplinary areas include biometeorology, space weather, and planetary meteorology. Marine weather forecasting relates meteorology to maritime and coastal safety, based on atmospheric interactions with large bodies of water.

Meteorologists study meteorological phenomena driven by solar radiation, Earth's rotation, ocean currents, and other factors. These include everyday weather like clouds, precipitation, and wind patterns, as well as severe weather events such as tropical cyclones and severe winter storms. Such phenomena are quantified using variables like temperature, pressure, and humidity, which are then used to forecast weather at local (microscale), regional (mesoscale and synoptic scale), and global scales. Meteorologists collect data using basic instruments like thermometers, barometers, and weather vanes (for surface-level measurements), alongside advanced tools like weather satellites, balloons, reconnaissance aircraft, buoys, and radars. The World Meteorological Organization (WMO) ensures international standardization of meteorological research.

The study of meteorology dates back millennia. Ancient civilizations tried to predict weather through folklore, astrology, and religious rituals. Aristotle's treatise Meteorology sums up early observations of the field, which advanced little during early medieval times but experienced a resurgence during the Renaissance, when Alhazen and René Descartes challenged Aristotelian theories, emphasizing scientific methods. In the 18th century, accurate measurement tools (e.g., barometer and thermometer) were developed, and the first meteorological society was founded. In the 19th century, telegraph-based weather observation networks were formed across broad regions.[2] In the 20th century, numerical weather prediction (NWP), coupled with advanced satellite and radar technology, introduced sophisticated forecasting models.[3] Later, computers revolutionized forecasting by processing vast datasets in real time and automatically solving modeling equations. 21st-century meteorology is highly accurate and driven by big data and supercomputing. It is adopting innovations like machine learning, ensemble forecasting, and high-resolution global climate modeling.[4] Climate change–induced extreme weather poses new challenges for forecasting and research,[5] while inherent uncertainty remains because of the atmosphere's chaotic nature (see butterfly effect).[6]

Etymology


The word meteorology comes from the Ancient Greek μετέωρος metéōros ("lofty; raised high in the air") and -λογία -logia (-(o)logy), meaning "the study of things high in the air".[citation needed]

History


Ancient meteorology up to the time of Aristotle

Parhelion (sundog) in Savoie

Early attempts at predicting weather were often related to prophecy and divining, and were sometimes based on astrological ideas. Ancient religions believed meteorological phenomena to be under the control of the gods.[7] The ability to predict rains and floods based on annual cycles was evidently used by humans at least from the time of agricultural settlement if not earlier. Early approaches to predicting weather were based on astrology and were practiced by priests. The Egyptians had rain-making rituals as early as 3500 BC.[7]

Ancient Indian Upanishads contain mentions of clouds and seasons.[8] The Samaveda mentions sacrifices to be performed when certain phenomena were noticed.[9] Varāhamihira's classical work Brihatsamhita, written about 500 AD,[8] provides evidence of weather observation.

Cuneiform inscriptions on Babylonian tablets included associations between thunder and rain. The Chaldeans differentiated the 22° and 46° halos.[9]

The ancient Greeks were the first to make theories about the weather. Many natural philosophers studied the weather. However, as meteorological instruments did not exist, the inquiry was largely qualitative and could only be judged by more general theoretical speculations.[7]: 8  Herodotus states that Thales predicted the solar eclipse of 585 BC. He studied Babylonian equinox tables.[7]: 11  According to Seneca, he attributed the Nile's annual floods to northerly winds hindering its descent to the sea.[7]: 4  Anaximander and Anaximenes thought that thunder and lightning were caused by air smashing against the cloud, thus kindling a flame. Early meteorological theories generally held that there was a fire-like substance in the atmosphere. Anaximander defined wind as a flowing of air, but this was not generally accepted for centuries.[7]: 5  A theory to explain summer hail was first proposed by Anaxagoras. He observed that air temperature decreased with increasing height and that clouds contain moisture. He also noted that heat caused objects to rise, and therefore the heat on a summer day would drive clouds to an altitude where the moisture would freeze.[7]: 6  Empedocles theorized on the change of the seasons: he believed that fire and water opposed each other in the atmosphere, and when fire gained the upper hand, the result was summer, and when water did, it was winter. Democritus also wrote about the flooding of the Nile. He said that snow in northern parts of the world melted during the summer solstice. This would cause vapors to form clouds, which would cause storms when driven to the Nile by northerly winds, thus filling the lakes and the Nile.[7]: 8  Hippocrates inquired into the effect of weather on health. Eudoxus claimed that bad weather followed four-year periods, according to Pliny.[7]: 9

Aristotelian meteorology


These early observations would form the basis for Aristotle's Meteorology, written in 350 BC.[7]: 11 [10] Aristotle is considered the founder of meteorology.[11] One of the most impressive achievements described in the Meteorology is the description of what is now known as the hydrologic cycle. His work would remain an authority on meteorology for nearly 2,000 years.[12]

The treatise On the Universe (composed before 250 BC or between 350 and 200 BC) noted:[13]

If the flashing body is set on fire and rushes violently to the earth it is called a thunderbolt; if it be only half of fire, but violent also and massive, it is called a meteor; if it is entirely free from fire, it is called a smoking bolt. They are all called 'swooping bolts', because they swoop down upon the earth. Lightning is sometimes smoky, and is then called 'smouldering lightning'; sometimes it darts quickly along, and is then said to be 'vivid'; at other times it travels in crooked lines, and is called 'forked lightning'; when it swoops down upon some object it is called 'swooping lightning'.

After Aristotle, progress in meteorology stalled for a long time. Theophrastus compiled a book on weather forecasting, called the Book of Signs, as well as On Winds. He gave hundreds of signs for weather phenomena for periods of up to a year.[7]: 25  His system was based on dividing the year by the setting and rising of the Pleiades, halving those intervals at the solstices and equinoxes, and assuming continuity of the weather within those periods. He also divided months into the new moon, fourth day, eighth day, and full moon, according to the likelihood of a change in the weather occurring. The day was divided into sunrise, mid-morning, noon, mid-afternoon, and sunset, with corresponding divisions of the night, with change being likely at one of these divisions.[7]: 25  Applying these divisions and a principle of balance in the yearly weather, he produced forecasts such as: if a lot of rain falls in the winter, the spring is usually dry. Rules based on the actions of animals are also present in his work, such as: if a dog rolls on the ground, it is a sign of a storm. Shooting stars and the Moon were also considered significant. However, he made no attempt to explain these phenomena, referring only to the Aristotelian method.[7]: 26  The work of Theophrastus remained a dominant influence in weather forecasting for nearly 2,000 years.[14]

Meteorology after Aristotle


Meteorology continued to be studied and developed over the centuries, but it was not until the Renaissance in the 14th to 17th centuries that significant advancements were made in the field. Scientists such as Galileo and Descartes introduced new methods and ideas, leading to the scientific revolution in meteorology.

Speculation on the cause of the flooding of the Nile ended when Eratosthenes, according to Proclus, stated that it was known that man had gone to the sources of the Nile and observed the rains, although interest in its implications continued.[7]: 26 

During the era of Roman Greece and Europe, scientific interest in meteorology waned. In the 1st century BC, most natural philosophers claimed that the clouds and winds extended up to 111 miles, but Posidonius thought that they reached up to five miles, after which the air is clear, liquid, and luminous. He closely followed Aristotle's theories. By the end of the 2nd century BC, the center of science had shifted from Athens to Alexandria, home to the ancient Library of Alexandria. In the 2nd century AD, Ptolemy's Almagest dealt with meteorology, because it was considered a subset of astronomy. He gave several astrological weather predictions.[7]: 27  He constructed a map of the world divided into climatic zones by their illumination, in which the length of the day at the summer solstice increased by half an hour per zone between the equator and the Arctic.[7]: 28  Ptolemy wrote on the atmospheric refraction of light in the context of astronomical observations.[15]

In 25 AD, Pomponius Mela, a Roman geographer, formalized the climatic zone system.[16] In 63–64 AD, Seneca wrote Naturales quaestiones. It was a compilation and synthesis of ancient Greek theories. However, theology was of foremost importance to Seneca, and he believed that phenomena such as lightning were tied to fate.[7]: 29  The second book (chapter) of Pliny the Elder's Natural History covers meteorology. He states that more than twenty ancient Greek authors studied meteorology. He did not make any personal contributions, and the value of his work is in preserving earlier speculation, much like Seneca's work.[7]: 30 

Twilight at Baker Beach

From 400 to 1100, scientific learning in Europe was preserved by the clergy. Isidore of Seville devoted considerable attention to meteorology in Etymologiae, De ordine creaturum and De natura rerum. Bede the Venerable was the first Englishman to write about the weather, in De natura rerum in 703. The work was a summary of the classical sources then extant. However, Aristotle's works, including the Meteorologica, were largely lost to Europe until the 12th century. Isidore and Bede were scientifically minded, but they adhered to the letter of Scripture.[7]: 30

Islamic scholars translated many ancient works into Arabic; these were later transmitted to western Europe and translated into Latin.[7]: 31

In the 9th century, Al-Dinawari wrote the Kitab al-Nabat (Book of Plants), in which he deals with the application of meteorology to agriculture during the Arab Agricultural Revolution. He describes the meteorological character of the sky, the planets and constellations, the sun and moon, the lunar phases indicating seasons and rain, the anwa (heavenly bodies of rain), and atmospheric phenomena such as winds, thunder, lightning, and snow, as well as floods, valleys, rivers, and lakes.[17][18]

In 1021, Alhazen showed in the Opticae thesaurus that atmospheric refraction is also responsible for twilight; he estimated that twilight begins (or ends) when the Sun is 19 degrees below the horizon, and used a geometric argument based on this to estimate the maximum possible height of the Earth's atmosphere as 52,000 passus (Roman paces; about 49 miles, or 79 km).[19]
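
The geometric reasoning behind this estimate can be reconstructed in a few lines. The sketch below is a modern reconstruction rather than Alhazen's original derivation: it assumes the last twilight glow comes from air at the top of the atmosphere seen on the observer's horizon, which gives a height of roughly h = R(sec(θ/2) − 1) for a solar depression angle θ. The Earth radius used here is the modern value, so the result differs somewhat from Alhazen's figure.

```python
import math

# Hedged sketch: classical geometric reconstruction of the twilight argument,
# not Alhazen's original text. Light at the end of twilight is assumed to be
# scattered from the top of the atmosphere on the observer's horizon, giving
# an atmosphere height h ~= R * (sec(theta/2) - 1) for depression angle theta.
R_EARTH_KM = 6371.0   # modern mean Earth radius (Alhazen worked in other units)
theta_deg = 19.0      # solar depression angle at the end of twilight

half = math.radians(theta_deg / 2.0)
height_km = R_EARTH_KM * (1.0 / math.cos(half) - 1.0)
print(f"Estimated atmosphere height: {height_km:.0f} km")  # roughly 80-90 km
```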

Adelard of Bath was one of the early translators of the classics. He also discussed meteorological topics in his Quaestiones naturales. He thought dense air produced propulsion in the form of wind. He explained thunder as the result of ice colliding in clouds and melting in summer. In the 13th century, Aristotelian theories reestablished their dominance in meteorology, and for the next four centuries meteorological work was largely commentary. It has been estimated that over 156 commentaries on the Meteorologica were written before 1650.[7]: 22

Experimental evidence was less important than appeal to the classics and authority in medieval thought. In the 13th century, Roger Bacon advocated experimentation and the mathematical approach. In his Opus majus, he followed Aristotle's theory on the atmosphere being composed of water, air, and fire, supplemented by optics and geometric proofs. He noted that Ptolemy's climatic zones had to be adjusted for topography.[7]: 33 

Albertus Magnus was the first to propose that each drop of falling rain had the form of a small sphere, and that this form meant that the rainbow was produced by light interacting with each raindrop.[20] Roger Bacon was the first to calculate the angular size of the rainbow. He stated that a rainbow summit cannot appear higher than 42 degrees above the horizon.[21]

In the late 13th century and early 14th century, Kamāl al-Dīn al-Fārisī and Theodoric of Freiberg were the first to give the correct explanations for the primary rainbow phenomenon. Theodoric went further and also explained the secondary rainbow.[22]

By the middle of the 16th century, meteorology had developed along two lines: theoretical science based on the Meteorologica, and astrological weather forecasting. Pseudoscientific prediction by natural signs became popular and enjoyed the protection of the church and princes. It was supported by scientists such as Regiomontanus, Leonard Digges, and Johannes Kepler. However, there were skeptics: in the 14th century, Nicole Oresme had believed that weather forecasting was possible, but that the rules for it were unknown at the time. Astrological influence in meteorology persisted until the 18th century.[7]: 33

Gerolamo Cardano's De Subtilitate (1550) was the first work to challenge fundamental aspects of Aristotelian theory. Cardano maintained that there were only three basic elements: earth, air, and water. He discounted fire because it needed material to spread and produced nothing. Cardano thought there were two kinds of air: free air and enclosed air. The former destroyed inanimate things and preserved animate things, while the latter had the opposite effect.[7]: 36

René Descartes's Discourse on the Method (1637) typifies the beginning of the scientific revolution in meteorology. His scientific method had four principles: to never accept anything unless one clearly knew it to be true; to divide every difficult problem into small problems to tackle; to proceed from the simple to the complex, always seeking relationships; to be as complete and thorough as possible with no prejudice.[7]: 37 

In the appendix Les Météores, he applied these principles to meteorology. He discussed terrestrial bodies and the vapors which arise from them, proceeding to explain the formation of clouds from drops of water, and of winds, with clouds then dissolving into rain, hail, and snow. He also discussed the effects of light on the rainbow. Descartes hypothesized that all bodies were composed of small particles of different shapes and interwovenness, and all of his theories were based on this hypothesis. He explained rain as caused by clouds becoming too large for the air to hold, and held that clouds became snow if the air was not warm enough to melt them, or hail if they met colder wind. Like that of his predecessors, Descartes's method was deductive, as meteorological instruments had not yet been developed and extensively used. He introduced the Cartesian coordinate system to meteorology and stressed the importance of mathematics in natural science. His work established meteorology as a legitimate branch of physics.[7]: 37

In the 18th century, the invention of the thermometer and barometer allowed for more accurate measurements of temperature and pressure, leading to a better understanding of atmospheric processes. This century also saw the birth of the first meteorological society, the Societas Meteorologica Palatina in 1780.[23]

In the 19th century, advances in technology such as the telegraph and photography led to the creation of weather observing networks and the ability to track storms. Additionally, scientists began to use mathematical models to make predictions about the weather. The 20th century saw the development of radar and satellite technology, which greatly improved the ability to observe and track weather systems. In addition, meteorologists and atmospheric scientists started to create the first weather forecasts and temperature predictions.[24]

In the 20th and 21st centuries, with the advent of computer models and big data, meteorology has become increasingly dependent on numerical methods and computer simulations. This has greatly improved weather forecasting and climate prediction. Additionally, meteorology has expanded to include other areas such as air quality, atmospheric chemistry, and climatology. Advances in observational, theoretical, and computational technologies have enabled ever more accurate weather predictions and a better understanding of weather patterns and air pollution. Today, with advances in weather forecasting and satellite technology, meteorology has become an integral part of everyday life and is used for many purposes such as aviation, agriculture, and disaster management.[citation needed]

Instruments and classification scales

A hemispherical cup anemometer

In 1441, King Sejong's son, Prince Munjong of Korea, invented the first standardized rain gauge.[25] These were sent throughout the Joseon dynasty of Korea as an official tool to assess land taxes based upon a farmer's potential harvest. In 1450, Leone Battista Alberti developed a swinging-plate anemometer, regarded as the first anemometer.[26] In 1607, Galileo Galilei constructed a thermoscope. In 1611, Johannes Kepler wrote the first scientific treatise on snow crystals: "Strena Seu de Nive Sexangula (A New Year's Gift of Hexagonal Snow)".[27] In 1643, Evangelista Torricelli invented the mercury barometer.[26] In 1662, Sir Christopher Wren invented the mechanical, self-emptying, tipping-bucket rain gauge. In 1714, Gabriel Fahrenheit created a reliable scale for measuring temperature with a mercury-type thermometer.[28] In 1742, Anders Celsius, a Swedish astronomer, proposed the "centigrade" temperature scale, the predecessor of the current Celsius scale.[29] In 1783, the first hair hygrometer was demonstrated by Horace-Bénédict de Saussure. In 1802–1803, Luke Howard wrote On the Modifications of Clouds, in which he assigned Latin names to cloud types.[30] In 1806, Francis Beaufort introduced his system for classifying wind speeds.[31] Near the end of the 19th century the first cloud atlases were published, including the International Cloud Atlas, which has remained in print ever since. The April 1960 launch of the first successful weather satellite, TIROS-1, marked the beginning of the age in which weather information became available globally.

Atmospheric composition research


In 1648, Blaise Pascal rediscovered that atmospheric pressure decreases with height, and deduced that there is a vacuum above the atmosphere.[32] In 1738, Daniel Bernoulli published Hydrodynamics, initiating the kinetic theory of gases and establishing the basic laws for the theory of gases.[33] In 1761, Joseph Black discovered that ice absorbs heat without changing its temperature when melting. In 1772, Black's student Daniel Rutherford discovered nitrogen, which he called phlogisticated air, explaining it in terms of the phlogiston theory.[34] In 1777, Antoine Lavoisier discovered oxygen and developed an explanation for combustion.[35] In 1783, in his essay "Réflexions sur le phlogistique",[36] Lavoisier deprecated the phlogiston theory and proposed a caloric theory.[37][38] In 1804, John Leslie observed that a matte black surface radiates heat more effectively than a polished surface, suggesting the importance of black-body radiation. In 1808, John Dalton defended caloric theory in A New System of Chemical Philosophy and described how it combines with matter, especially gases; he proposed that the heat capacity of gases varies inversely with atomic weight. In 1824, Sadi Carnot analyzed the efficiency of steam engines using caloric theory; he developed the notion of a reversible process and, in postulating that no such thing exists in nature, laid the foundation for the second law of thermodynamics. Earlier, in 1716, Edmond Halley had suggested that aurorae are caused by "magnetic effluvia" moving along the Earth's magnetic field lines.

Research into cyclones and air flow

General circulation of the Earth's atmosphere: The westerlies and trade winds are part of the Earth's atmospheric circulation.

In 1494, Christopher Columbus experienced a tropical cyclone, which led to the first written European account of a hurricane.[39] In 1686, Edmond Halley presented a systematic study of the trade winds and monsoons and identified solar heating as the cause of atmospheric motions.[40] In 1735, George Hadley proposed an idealized explanation of the global circulation based on a study of the trade winds.[41] In 1743, when Benjamin Franklin was prevented from seeing a lunar eclipse by a hurricane, he concluded that cyclones move in a manner contrary to the winds at their periphery.[42] At first, understanding of exactly how the rotation of the Earth affects airflow was only partial. Gaspard-Gustave Coriolis published a paper in 1835 on the energy yield of machines with rotating parts, such as waterwheels.[43] In 1856, William Ferrel proposed the existence of a circulation cell in the mid-latitudes, with the air within it deflected by the Coriolis force to create the prevailing westerly winds.[44] Late in the 19th century, the motion of air masses along isobars came to be understood as the result of the large-scale interaction of the pressure gradient force and the deflecting force. By 1912, this deflecting force was named the Coriolis effect.[45] Just after World War I, a group of meteorologists in Norway led by Vilhelm Bjerknes developed the Norwegian cyclone model, which explains the generation, intensification, and ultimate decay (the life cycle) of mid-latitude cyclones, and introduced the idea of fronts, that is, sharply defined boundaries between air masses.[46] The group included Carl-Gustaf Rossby (who was the first to explain the large-scale atmospheric flow in terms of fluid dynamics), Tor Bergeron (who first determined how rain forms), and Jacob Bjerknes.

Observation networks and weather forecasting

Cloud classification by altitude of occurrence
This "Hyetographic or Rain Map of the World" was first published 1848 by Alexander Keith Johnston.
This "Hyetographic or Rain Map of Europe" was also published in 1848 as part of "The Physical Atlas".

In the late 16th century and the first half of the 17th century a range of meteorological instruments was invented – the thermometer, barometer, and hygrometer, as well as wind and rain gauges. In the 1650s natural philosophers started using these instruments to systematically record weather observations. Scientific academies established weather diaries and organised observational networks.[47] In 1654, Ferdinando II de' Medici established the first weather observing network, which consisted of meteorological stations in Florence, Cutigliano, Vallombrosa, Bologna, Parma, Milan, Innsbruck, Osnabrück, Paris and Warsaw. The collected data were sent to Florence at regular intervals.[48] In the 1660s Robert Hooke of the Royal Society of London sponsored networks of weather observers. Hippocrates's treatise Airs, Waters, and Places had linked weather to disease; thus early meteorologists attempted to correlate weather patterns with epidemic outbreaks, and the climate with public health.[47]

During the Age of Enlightenment meteorology tried to rationalise traditional weather lore, including astrological meteorology. But there were also attempts to establish a theoretical understanding of weather phenomena. Edmond Halley and George Hadley tried to explain the trade winds. They reasoned that the rising mass of heated equatorial air is replaced by an inflow of cooler air from high latitudes. A flow of warm air at high altitude from the equator to the poles in turn established an early picture of circulation. Frustration with the lack of discipline among weather observers, and the poor quality of the instruments, led the early modern nation states to organise large observation networks. Thus, by the end of the 18th century, meteorologists had access to large quantities of reliable weather data.[47] In 1832, an electromagnetic telegraph was created by Baron Schilling.[49] The arrival of the electrical telegraph in 1837 afforded, for the first time, a practical method for quickly gathering surface weather observations from a wide area.[50]

These data could be used to produce maps of the state of the atmosphere for a region near the Earth's surface and to study how these states evolved through time. Making frequent weather forecasts based on these data required a reliable network of observations, but it was not until 1849 that the Smithsonian Institution began to establish an observation network across the United States under the leadership of Joseph Henry.[51] Similar observation networks were established in Europe at this time. The Reverend William Clement Ley was key to the understanding of cirrus clouds and early understanding of jet streams.[52] Charles Kenneth Mackinnon Douglas, known as 'CKM' Douglas, read Ley's papers after his death and carried on the early study of weather systems.[53] Nineteenth-century researchers in meteorology were drawn from military or medical backgrounds rather than trained as dedicated scientists.[54] In 1854, the United Kingdom government appointed Robert FitzRoy to the new office of Meteorological Statist to the Board of Trade with the task of gathering weather observations at sea. FitzRoy's office became the United Kingdom Meteorological Office in 1854, the second oldest national meteorological service in the world (the Central Institution for Meteorology and Geodynamics (ZAMG) in Austria, founded in 1851, is the oldest weather service in the world). The first daily weather forecasts made by FitzRoy's office were published in The Times newspaper in 1860. The following year a system was introduced for hoisting storm warning cones at principal ports when a gale was expected.

FitzRoy coined the term "weather forecast" and tried to separate scientific approaches from prophetic ones.[55]

Over the next 50 years, many countries established national meteorological services. The India Meteorological Department (1875) was established to track tropical cyclones and the monsoon.[56] The Finnish Meteorological Central Office (1881) was formed from part of the Magnetic Observatory of Helsinki University.[57] Japan's Tokyo Meteorological Observatory, the forerunner of the Japan Meteorological Agency, began constructing surface weather maps in 1883.[58] The United States Weather Bureau (1890) was established under the United States Department of Agriculture. The Australian Bureau of Meteorology (1906) was established by a Meteorology Act to unify existing state meteorological services.[59][60]

Numerical weather prediction

A meteorologist at the console of the IBM 7090 in the Joint Numerical Weather Prediction Unit, c. 1965

In 1904, Norwegian scientist Vilhelm Bjerknes first argued in his paper Weather Forecasting as a Problem in Mechanics and Physics that it should be possible to forecast weather from calculations based upon natural laws.[61][62]

It was not until later in the 20th century that advances in the understanding of atmospheric physics led to the foundation of modern numerical weather prediction. In 1922, Lewis Fry Richardson published "Weather Prediction By Numerical Process",[63] after finding notes and derivations he had worked on as an ambulance driver in World War I. He described how small terms in the prognostic fluid dynamics equations that govern atmospheric flow could be neglected, and how a numerical calculation scheme could be devised to allow predictions. Richardson envisioned a large auditorium of thousands of people performing the calculations. However, the sheer number of calculations required was too large to complete without electronic computers, and the size of the grid and time steps used in the calculations led to unrealistic results; numerical analysis later showed that this was due to numerical instability.

Starting in the 1950s, numerical forecasts with computers became feasible.[64] The first weather forecasts derived this way used barotropic (single-vertical-level) models, and could successfully predict the large-scale movement of midlatitude Rossby waves, that is, the pattern of atmospheric lows and highs.[65] In 1959, the UK Meteorological Office received its first computer, a Ferranti Mercury.[66]
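
To give a sense of the quantities these single-level models dealt with, the snippet below evaluates the standard barotropic Rossby wave phase speed, c = U − β/(k² + l²). The formula is textbook material rather than something stated in this article, and the wind speed and wavelengths are illustrative assumptions.

```python
import math

# Hedged illustration (standard textbook formula, not from this article):
# phase speed of a barotropic Rossby wave, c = U - beta / (k^2 + l^2),
# the kind of large-scale motion the first single-level NWP models captured.
OMEGA = 7.292e-5                 # Earth's rotation rate (rad/s)
A = 6.371e6                      # Earth radius (m)
lat = math.radians(45.0)
beta = 2.0 * OMEGA * math.cos(lat) / A   # meridional gradient of the Coriolis parameter

U = 15.0                         # assumed mean westerly wind (m/s)
wavelength_x = 6.0e6             # assumed zonal wavelength (m)
wavelength_y = 3.0e6             # assumed meridional wavelength (m)
k = 2.0 * math.pi / wavelength_x
l = 2.0 * math.pi / wavelength_y

c = U - beta / (k**2 + l**2)
print(f"beta = {beta:.2e} 1/(m s), phase speed c = {c:.1f} m/s")
```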

In the 1960s, the chaotic nature of the atmosphere was first observed and mathematically described by Edward Lorenz, founding the field of chaos theory.[67] These advances led to the current use of ensemble forecasting in most major forecasting centers, to take into account the uncertainty arising from the chaotic nature of the atmosphere.[68] Mathematical models used to predict the long-term climate of the Earth (climate models) have also been developed; their resolution today is comparable to that of the older weather prediction models. These climate models are used to investigate long-term climate shifts, such as what effects might be caused by human emission of greenhouse gases.
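
The sensitivity Lorenz described is easy to demonstrate numerically. The sketch below integrates the Lorenz (1963) system with a crude Euler scheme for two initial states differing by one part in a billion; the growing separation between the runs is the behaviour that motivates ensemble forecasting. The integration scheme and step size are illustrative choices, not Lorenz's original method.

```python
# Hedged sketch: the Lorenz (1963) system, the idealized model in which sensitive
# dependence on initial conditions was first described. Two runs differing by 1e-9
# in the initial state diverge over time. Parameters are Lorenz's classic choices.
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)   # tiny perturbation of the initial condition
for step in range(1, 3001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        error = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
        print(f"t = {step * 0.01:5.1f}  separation = {error:.3e}")
```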

Meteorologists


Meteorologists are scientists who study and work in the field of meteorology.[69] The American Meteorological Society publishes and continually updates an authoritative electronic Meteorology Glossary.[70] Meteorologists work in government agencies, private consulting and research services, industrial enterprises, utilities, radio and television stations, and in education. In the United States, meteorologists held about 10,000 jobs in 2018.[71]

Although weather forecasts and warnings are the best-known products of meteorologists for the public, weather presenters on radio and television are not necessarily professional meteorologists. They are most often reporters with little formal meteorological training, using unregulated titles such as weather specialist or weatherman. The American Meteorological Society and the National Weather Association issue "Seals of Approval" to weather broadcasters who meet certain requirements, but such certification is not mandatory for employment in the media.

Equipment

Satellite image of Hurricane Hugo with a polar low visible at the top of the image

Each science has its own unique sets of laboratory equipment. In the atmosphere, there are many properties that can be measured. Rain, which can be observed almost anywhere and at any time, was one of the first atmospheric quantities measured historically. Two other qualities measured accurately from early on are wind and humidity; neither can be seen, but both can be felt. The devices to measure these three sprang up in the mid-15th century: respectively, the rain gauge, the anemometer, and the hygrometer. Many attempts had been made prior to the 15th century to construct adequate equipment to measure the many atmospheric variables, but most were faulty in some way or simply unreliable. Even Aristotle noted the difficulty of measuring the air in some of his work.

Sets of surface measurements are important data to meteorologists. They give a snapshot of a variety of weather conditions at a single location, usually a weather station, a ship, or a weather buoy. The measurements taken at a weather station can include any number of atmospheric observables. Usually, temperature, pressure, wind, and humidity are the variables measured by a thermometer, barometer, anemometer, and hygrometer, respectively.[72] Professional stations may also include air quality sensors (carbon monoxide, carbon dioxide, methane, ozone, dust, and smoke), a ceilometer (cloud ceiling), a falling-precipitation sensor, a flood sensor, a lightning sensor, a microphone (explosions, sonic booms, thunder), a pyranometer/pyrheliometer/spectroradiometer (IR/Vis/UV photodiodes), a rain gauge/snow gauge, a scintillation counter (background radiation, fallout, radon), a seismometer (earthquakes and tremors), a transmissometer (visibility), and a GPS clock for data logging. Upper-air data are of crucial importance for weather forecasting. The most widely used technique is the launch of radiosondes. Supplementing the radiosondes, a network of aircraft-based data collection is organized by the World Meteorological Organization.
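
As a rough illustration of how such a station report might be handled in software, the sketch below defines a simple record type for one surface observation; the field names and units are hypothetical and do not follow any official reporting format such as SYNOP or METAR.

```python
# Hedged sketch of one surface observation as a plain data record;
# field names and units are illustrative, not a standard.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SurfaceObservation:
    station_id: str
    time: datetime
    temperature_c: float      # thermometer
    pressure_hpa: float       # barometer
    wind_speed_ms: float      # anemometer
    wind_dir_deg: float       # wind vane
    relative_humidity: float  # hygrometer, percent

obs = SurfaceObservation(
    station_id="EXAMPLE01",
    time=datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc),
    temperature_c=4.2,
    pressure_hpa=1012.3,
    wind_speed_ms=6.5,
    wind_dir_deg=250.0,
    relative_humidity=78.0,
)
print(obs)
```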

Remote sensing, as used in meteorology, is the concept of collecting data from remote weather events and subsequently producing weather information. The common types of remote sensing are radar, lidar, and satellites (or photogrammetry). Each collects data about the atmosphere from a remote location and, usually, stores the data where the instrument is located. Radar and lidar are active rather than passive, because both use EM radiation to illuminate a specific portion of the atmosphere.[73] Weather satellites, along with more general-purpose Earth-observing satellites circling the Earth at various altitudes, have become an indispensable tool for studying a wide range of phenomena, from forest fires to El Niño.

Spatial scales


The study of the atmosphere can be divided into distinct areas that depend on both time and spatial scales. At one extreme of this scale is climatology. On timescales of hours to days, meteorology separates into micro-, meso-, and synoptic-scale meteorology. The geospatial size of each of these three scales relates directly to its corresponding timescale.

Other subclassifications are used to describe the unique, local, or broad effects within those subclasses.

Scales of Atmospheric Motion Systems[74]
Type of motion | Horizontal scale (meters)
Molecular mean free path | 10⁻⁷
Minute turbulent eddies | 10⁻² – 10⁻¹
Small eddies | 10⁻¹ – 1
Dust devils | 1 – 10
Gusts | 10 – 10²
Tornadoes | 10²
Cumulonimbus clouds | 10³
Fronts, squall lines | 10⁴ – 10⁵
Hurricanes | 10⁵
Synoptic cyclones | 10⁶
Planetary waves | 10⁷

Microscale


Microscale meteorology is the study of atmospheric phenomena on a scale of about 1 kilometre (0.62 mi) or less. Individual thunderstorms, clouds, and local turbulence caused by buildings and other obstacles (such as individual hills) are modeled on this scale.[75] Misoscale meteorology is an informal subdivision.

Mesoscale


Mesoscale meteorology is the study of atmospheric phenomena that have horizontal scales ranging from 1 km to 1000 km and a vertical scale that starts at the Earth's surface and includes the atmospheric boundary layer, troposphere, tropopause, and the lower section of the stratosphere. The terms meso-alpha, meso-beta, and meso-gamma, used to classify the horizontal scales of atmospheric processes, were introduced to the field of mesoscale meteorology by Isidoro Orlanski.[76] Mesoscale timescales last from less than a day to multiple weeks. The events typically of interest are thunderstorms, squall lines, fronts, precipitation bands in tropical and extratropical cyclones, and topographically generated weather systems such as mountain waves and sea and land breezes.[77]

Synoptic scale

NOAA: Synoptic scale weather analysis

Synoptic scale meteorology predicts atmospheric changes at spatial scales up to 1000 km and timescales of about 10⁵ seconds (roughly 28 hours). At the synoptic scale, the Coriolis acceleration acting on moving air masses (outside of the tropics) plays a dominant role in predictions. The phenomena typically described by synoptic meteorology include events such as extratropical cyclones, baroclinic troughs and ridges, frontal zones, and to some extent jet streams. All of these are typically shown on weather maps for a specific time. The minimum horizontal scale of synoptic phenomena is limited by the spacing between surface observation stations.[78]
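
A standard scale analysis (not taken from this article) shows why rotation dominates at this scale: the Rossby number Ro = U/(fL) compares inertial to Coriolis accelerations and comes out small for typical synoptic wind speeds and length scales, as in the illustrative sketch below.

```python
import math

# Hedged back-of-the-envelope check (standard scale analysis, not from this
# article): at synoptic scale the Rossby number Ro = U / (f * L) is small,
# which is why the Coriolis acceleration dominates the flow balance there.
OMEGA = 7.292e-5                      # Earth's rotation rate (rad/s)
lat = math.radians(45.0)
f = 2.0 * OMEGA * math.sin(lat)       # Coriolis parameter at 45 degrees latitude

U = 10.0                              # typical synoptic wind speed (m/s)
L = 1.0e6                             # typical synoptic length scale (m)
Ro = U / (f * L)
print(f"f = {f:.2e} 1/s, Rossby number = {Ro:.2f}")  # ~0.1, so rotation dominates
```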

Global scale

Annual mean sea surface temperatures

Global scale meteorology is the study of weather patterns related to the transport of heat from the tropics to the poles. Very large scale oscillations are of importance at this scale. These oscillations have time periods typically on the order of months, such as the Madden–Julian oscillation, or years, such as the El Niño–Southern Oscillation and the Pacific decadal oscillation. Global scale meteorology pushes into the range of climatology. The traditional definition of climate is pushed to larger timescales, and with an understanding of these longer-timescale global oscillations, their effect on climate and weather disturbances can be included in synoptic- and mesoscale predictions.

Numerical Weather Prediction is a main focus in understanding air–sea interaction, tropical meteorology, atmospheric predictability, and tropospheric/stratospheric processes.[79] The Naval Research Laboratory in Monterey, California, developed a global atmospheric model called Navy Operational Global Atmospheric Prediction System (NOGAPS). NOGAPS is run operationally at Fleet Numerical Meteorology and Oceanography Center for the United States Military. Many other global atmospheric models are run by national meteorological agencies.

Branches of meteorology


Based on methodological approach


Physical meteorology


Physical meteorology studies the atmosphere's physical properties, processes, and phenomena. It covers the fundamental principles of atmospheric thermodynamics and energy transfer, including solar and terrestrial radiation (absorption, reflection, and scattering). Cloud physics is another key area of investigation, alongside the study of aerosols, precipitation formation, and atmospheric moist processes. The field further examines optical, electrical, and acoustical effects within the atmosphere. Near-surface processes like mixing, turbulence, and friction, and understanding their connection to the atmosphere's physical characteristics also fall within its scope.[80][81][82]
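
One of the simplest quantitative results of this radiative budgeting is the planet's effective emission temperature. The sketch below evaluates the textbook formula T_e = [S0(1 − α)/(4σ)]^(1/4) with approximate values for the solar constant and planetary albedo; the formula and numbers are standard assumptions rather than figures taken from this article.

```python
# Hedged illustration of a basic radiative-balance estimate (textbook formula,
# not from this article): the planet's effective emission temperature from
# absorbed solar radiation, T_e = [S0 * (1 - albedo) / (4 * sigma)]**0.25.
S0 = 1361.0          # solar constant, W/m^2 (approximate)
ALBEDO = 0.30        # planetary albedo (approximate)
SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W/(m^2 K^4)

T_e = (S0 * (1.0 - ALBEDO) / (4.0 * SIGMA)) ** 0.25
print(f"Effective emission temperature: {T_e:.0f} K")  # ~255 K, vs ~288 K at the surface
```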

Dynamic meteorology


Dynamic meteorology is the study of atmospheric motions and the physical laws that govern them, using principles drawn from fluid dynamics, thermodynamics and mechanical motion. It aims to explain why the atmosphere moves and how its state evolves. Here, the fundamental analytical unit of atmospheric behavior is an air parcel, defined as an infinitesimally small region in the fluid continuum of the atmosphere. This key conceptual tool allows for abstraction from the atmosphere's discrete molecular and chemical nature. Temperature, density, pressure, etc. are considered key physical quantities with unique values within this atmospheric continuum. These characterize the state of the atmosphere.[74]
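
As a small illustration of the air-parcel abstraction, the sketch below computes a parcel's potential temperature, the temperature it would have if brought dry-adiabatically to a 1000 hPa reference level. The formula is the standard Poisson relation and the input values are illustrative.

```python
# Hedged sketch of the air-parcel abstraction described above: compute the
# potential temperature theta = T * (p0 / p)**(R_d / c_p), the temperature a
# dry parcel would have if brought adiabatically to 1000 hPa. Values are
# illustrative.
R_D = 287.0      # gas constant for dry air, J/(kg K)
C_P = 1004.0     # specific heat at constant pressure, J/(kg K)
P0 = 1000.0      # reference pressure, hPa

def potential_temperature(temp_k, pressure_hpa):
    return temp_k * (P0 / pressure_hpa) ** (R_D / C_P)

# A parcel at 500 hPa and -20 C:
theta = potential_temperature(253.15, 500.0)
print(f"theta = {theta:.1f} K")   # roughly 309 K
```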

Synoptic meteorology


Synoptic meteorology focuses on diagnosing the conditions of the atmosphere at a given moment across large regions. This branch involves the preparation of various weather maps displaying meteorological conditions observed simultaneously (hence the term "synoptic," meaning "viewed together"). Examples of such maps include upper-air charts, aerological diagrams, and satellite imagery of cloud movement. Through detailed analysis of these charts, meteorologists aim to understand large-scale wind and pressure systems, as well as the complex relationship between atmospheric circulation and the regional surface environment. The primary objective of the field is to predict atmospheric changes from these initial conditions, typically for a few hours to a few days ahead. The ultimate goal is to help produce regional or station-based forecasts.[83][84][85]

Based on scale


Boundary layer meteorology


Boundary layer meteorology is the study of processes in the air layer directly above Earth's surface, known as the atmospheric boundary layer (ABL). The effects of the surface – heating, cooling, and friction – cause turbulent mixing within this layer, and turbulent motions drive significant transport of heat, matter, and momentum on timescales of less than a day.[86] Boundary layer meteorology includes the study of all types of surface–atmosphere boundary, including ocean, lake, urban land, and non-urban land surfaces.

Applications


Weather forecasting

Forecast of surface pressures five days into the future for the north Pacific, North America, and north Atlantic Ocean

Weather forecasting is the application of science and technology to predict the state of the atmosphere at a future time and given location. Humans have attempted to predict the weather informally for millennia and formally since at least the 19th century.[87][88] Weather forecasts are made by collecting quantitative data about the current state of the atmosphere and using scientific understanding of atmospheric processes to project how the atmosphere will evolve.[89]

Once an all-human endeavor based mainly upon changes in barometric pressure, current weather conditions, and sky condition,[90][91] weather forecasting now relies on forecast models to determine future conditions. Human input is still required to pick the best possible forecast model on which to base the forecast, which involves pattern recognition skills, teleconnections, knowledge of model performance, and knowledge of model biases. The chaotic nature of the atmosphere, the massive computational power required to solve the equations that describe the atmosphere, errors involved in measuring the initial conditions, and an incomplete understanding of atmospheric processes mean that forecasts become less accurate as the difference between the current time and the time for which the forecast is being made (the range of the forecast) increases. The use of ensembles and model consensus helps narrow the error and pick the most likely outcome.[92][93][94]
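
A toy illustration of the ensemble idea follows: several equally plausible model runs are combined into a consensus value and a spread that quantifies uncertainty. The member values are invented for illustration and do not come from any real model.

```python
# Hedged toy example of the ensemble/consensus idea mentioned above:
# combine several equally plausible model runs into a consensus forecast
# and a spread that reflects uncertainty. The numbers are made up.
from statistics import mean, pstdev

member_temps_c = [11.8, 12.4, 13.1, 12.0, 12.7]   # hypothetical ensemble members
consensus = mean(member_temps_c)
spread = pstdev(member_temps_c)
print(f"Ensemble mean: {consensus:.1f} C, spread: {spread:.1f} C")
```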

There are a variety of end uses for weather forecasts. Weather warnings are important forecasts because they are used to protect life and property.[95] Forecasts based on temperature and precipitation are important to agriculture,[96][97][98][99] and therefore to commodity traders within stock markets. Temperature forecasts are used by utility companies to estimate demand over the coming days.[100][101][102] On an everyday basis, people use weather forecasts to determine what to wear. Since outdoor activities are severely curtailed by heavy rain, snow, and wind chill, forecasts can be used to plan activities around these events, and to plan ahead and survive them.

Aviation meteorology


Aviation meteorology deals with the impact of weather on air traffic management and flight operations. It is important for aircrews to understand meteorological conditions affecting flight planning and in-flight safety.[103] Weather phenomena such as turbulence, icing, thunderstorms, and reduced visibility are major hazards to aviation and are included in standardized pilot training syllabi worldwide.[104] In India, the Directorate General of Civil Aviation (DGCA) includes meteorology as a compulsory subject in pilot licensing examinations.[105]

Agricultural meteorology


Meteorologists, soil scientists, agricultural hydrologists, and agronomists are concerned with studying the effects of weather and climate on plant distribution, crop yield, water-use efficiency, the phenology of plant and animal development, and the energy balance of managed and natural ecosystems. Conversely, they are interested in the role of vegetation in climate and weather.[106]

Hydrometeorology


Hydrometeorology is the branch of meteorology that deals with the hydrologic cycle, the water budget, and the rainfall statistics of storms.[107] A hydrometeorologist prepares and issues forecasts of accumulating (quantitative) precipitation, heavy rain, heavy snow, and highlights areas with the potential for flash flooding. Typically the range of knowledge that is required overlaps with climatology, mesoscale and synoptic meteorology, and other geosciences.[108]

Nuclear meteorology


Nuclear meteorology investigates the distribution of radioactive aerosols and gases in the atmosphere.[109]

Maritime meteorology


Maritime meteorology deals with air and wave forecasts for ships operating at sea. Organizations such as the Ocean Prediction Center, Honolulu National Weather Service forecast office, United Kingdom Met Office, KNMI and JMA prepare high seas forecasts for the world's oceans.

Military meteorology


Military meteorology is the research and application of meteorology for military purposes. In the United States, the United States Navy's Commander, Naval Meteorology and Oceanography Command oversees meteorological efforts for the Navy and Marine Corps while the United States Air Force's Air Force Weather Agency is responsible for the Air Force and Army.

Environmental meteorology


Environmental meteorology mainly analyzes the physical and chemical dispersion of industrial pollution based on meteorological parameters such as temperature, humidity, and wind, under various weather conditions.

Renewable energy


Applications of meteorology in renewable energy include basic research, resource "exploration", and potential mapping of wind power and solar radiation for wind and solar energy.

from Grokipedia
Cumulus clouds in fair weather

Meteorology is the science concerned with the Earth's atmosphere and its physical processes, with a primary emphasis on understanding and predicting weather phenomena. This interdisciplinary field applies principles of physics, chemistry, and mathematics to analyze short-term atmospheric dynamics on timescales ranging from minutes to weeks, in contrast to climatology, which examines long-term patterns and averages. Meteorologists use observational data from instruments like barometers, thermometers, and radars, combined with numerical models run on high-performance computers, to forecast conditions that affect aviation, agriculture, and disaster preparedness. Significant advances include the development of weather satellites in the 1960s, such as TIROS-1, which enabled global monitoring, and subsequent improvements in computational power that have enhanced forecast accuracy beyond initial expectations, reducing errors in medium-range predictions. These achievements have facilitated timely warnings for severe events like hurricanes and floods, saving countless lives and supporting economic activities reliant on reliable weather information.

Definition and Fundamentals

Core Concepts and Physical Principles

Meteorology examines atmospheric processes through the lens of physics, treating the atmosphere as a compressible, viscous fluid subject to gravitational, pressure, and rotational forces. The fundamental governing equations derive from conservation of mass, momentum, and energy, adapted from the Navier-Stokes equations for geophysical scales. These principles enable modeling of phenomena from local convection to the global circulation, emphasizing causal mechanisms like buoyancy-driven ascent and geostrophic balance.

The Earth's atmosphere comprises approximately 78.08% nitrogen, 20.95% oxygen, and 0.93% argon by volume in dry air, with variable water vapor comprising up to 4% in humid regions. Its vertical structure features distinct layers defined by temperature gradients: the troposphere (0–12 km altitude on average, extending to 8 km at the poles and 18 km at the equator), where nearly all weather occurs due to convective mixing; the stratosphere (12–50 km), characterized by a stable temperature inversion from ozone absorption of ultraviolet radiation; the mesosphere (50–85 km), with temperatures dropping to about -90°C; and the thermosphere above, where molecular dissociation dominates. This layering reflects hydrostatic balance, wherein the vertical pressure gradient balances the gravitational force, $\frac{dp}{dz} = -\rho g$, with pressure decreasing roughly exponentially from about 1013 hPa at sea level to near zero above 100 km.

Thermodynamic processes underpin atmospheric stability and motion. The dry adiabatic lapse rate – the temperature decrease of a rising unsaturated air parcel – equals 9.8°C per kilometer, derived from $\Gamma_d = g/c_p$, where $g$ is gravitational acceleration and $c_p$ the specific heat of air at constant pressure. Moist processes reduce this to 4–6°C/km via latent heat release during condensation, fostering convective instability when environmental lapse rates exceed these values. Energy conservation manifests in the first law of thermodynamics, $dq = c_v\,dT + p\,dV$, applied to parcels assuming reversible adiabatic expansion or compression.

Large-scale dynamics incorporate the Coriolis effect, an apparent deflection arising from Earth's rotation at $\Omega = 7.29 \times 10^{-5}$ rad/s, with a force per unit mass $\mathbf{F}_c = -2\,\boldsymbol{\Omega} \times \mathbf{v}$, deflecting northward motion rightward in the Northern Hemisphere and promoting cyclonic rotation around low-pressure systems. This force interacts with pressure gradients, yielding balanced flows such as geostrophic flow, in which the Coriolis force balances the pressure-gradient force. The global energy balance sustains the circulation: incoming solar radiation averages 342 W/m² at the top of the atmosphere, balanced by outgoing longwave radiation and reflected shortwave radiation, yielding an effective blackbody temperature of 255 K despite surface averages of 288 K due to greenhouse trapping. Latitudinal imbalances drive the Hadley, Ferrel, and polar cells via poleward heat transport.

Meteorology focuses on the short-term dynamics and prediction of atmospheric conditions, such as events lasting from hours to a few weeks, in contrast to climatology, which analyzes long-term averages and variability of these conditions over decades or centuries to identify patterns. This distinction arises from meteorology's emphasis on immediate causal processes like pressure gradients and fronts driving transient phenomena, while climatology aggregates data to discern statistical norms and anomalies, often using historical records spanning 30 years or more as defined by the World Meteorological Organization. Atmospheric science serves as an overarching discipline that includes meteorology but extends to fundamental physical and chemical processes throughout the atmosphere, encompassing topics such as atmospheric chemistry and the layers above the troposphere, beyond where weather primarily occurs.
Meteorology, by comparison, prioritizes applied forecasting and synoptic-scale analysis for practical outcomes, such as weather warnings or agricultural planning, drawing on empirical models tuned for predictability rather than exhaustive theoretical exploration of all atmospheric layers. Hydrology intersects with meteorology through precipitation as a key atmospheric input to the hydrologic cycle, but diverges by concentrating on terrestrial water storage, flow, and quality in rivers, soils, and aquifers, excluding direct atmospheric dynamics. Similarly, oceanography examines marine physical processes, including currents and waves, with meteorology contributing boundary conditions via surface winds and heat exchange; however, oceanography centers on oceanic interiors and ecosystems, not the atmosphere as the primary driver. These fields collaborate in coupled models for phenomena like El Niño, yet meteorology remains atmosphere-centric, grounded in the thermodynamic and fluid-motion equations specific to air masses.
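
A short numerical sketch of the hydrostatic and adiabatic relations quoted above is given below. It uses the isothermal-atmosphere simplification (a constant scale height), so it illustrates the formulas rather than a realistic profile; the constants are standard approximate values, not figures taken from this article.

```python
import math

# Hedged numerical sketch of the relations quoted above (isothermal-atmosphere
# simplification, illustrative constants): hydrostatic pressure decay with a
# scale height H = R_d * T / g, and the dry adiabatic lapse rate g / c_p.
G = 9.81            # gravitational acceleration, m/s^2
R_D = 287.0         # gas constant for dry air, J/(kg K)
C_P = 1004.0        # specific heat at constant pressure, J/(kg K)
T_MEAN = 255.0      # rough mean temperature used for the scale height, K
P_SURFACE = 1013.0  # surface pressure, hPa

H = R_D * T_MEAN / G               # scale height, about 7.5 km
gamma_d = G / C_P * 1000.0         # dry adiabatic lapse rate, K/km (about 9.8)

for z_km in (0, 5, 10, 20, 50, 100):
    p = P_SURFACE * math.exp(-z_km * 1000.0 / H)
    print(f"z = {z_km:3d} km  p ~ {p:8.3f} hPa")
print(f"scale height H = {H / 1000:.1f} km, dry lapse rate = {gamma_d:.1f} K/km")
```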

History

Ancient Origins and Aristotelian Framework

Early meteorological observations emerged in ancient civilizations for practical purposes such as agriculture, navigation, and seasonal planning, with records dating back to Mesopotamian tablets from around 650 BC that included rudimentary short-range forecasts based on cloud patterns and celestial observations. These Babylonian efforts linked weather signs to planetary positions and winds, reflecting an empirical but astrological approach rather than causal mechanisms. Similarly, ancient Egyptians tracked the Nile's annual flooding cycles, associating them with the heliacal rising of Sirius around 3000 BC, while Chinese records from the Shang dynasty (c. 1600–1046 BC) noted rainfall and wind directions for calendrical predictions.

In ancient Greece, pre-Socratic philosophers advanced qualitative theories of atmospheric phenomena, building on Ionian naturalism. Anaximenes (c. 585–528 BC) proposed air as the primary substance, with condensation and rarefaction explaining clouds and winds, while Anaximander (c. 610–546 BC) invoked the boundless apeiron to account for meteorological changes as mixtures of opposites like hot and cold. These ideas shifted focus from mythological attributions – such as gods controlling storms – to naturalistic explanations, though lacking systematic experimentation. By the late 5th century BC, Hippocratic texts like Airs, Waters, Places (c. 400 BC) empirically correlated weather and climate with health, describing how winds and seasons influenced disease patterns based on observations in coastal regions.

Aristotle's Meteorologica, composed around 340 BC, synthesized and formalized these traditions into the first comprehensive treatise on sublunary phenomena, those occurring above the Earth's surface but below the sphere of the Moon. Spanning four books, it categorized meteora into vaporous (e.g., clouds, rain, and dew, arising from moist exhalations condensing in the atmosphere) and dry (e.g., winds and earthquakes, from subterranean combustion of earthy exhalations). Grounded in his four-element theory – earth, water, air, and fire – Aristotle explained processes like evaporation as water transforming into vapor via solar heat, rising to form clouds, and precipitating when cooled, while winds arose from uneven solar heating causing air displacements. He incorporated empirical data, such as regional flood records (e.g., the Deucalion deluge localized to ancient Hellas) and wind patterns, but relied on teleological causation, viewing phenomena as purposeful outcomes of natural tendencies rather than mechanical forces.

This framework dominated Western meteorological thought for nearly two millennia, embedding qualitative analogies over quantitative measurement and perpetuating errors like attributing earthquakes to underground winds or comets to atmospheric combustion. Aristotle critiqued predecessors for insufficient causal depth, emphasizing systematic classification – e.g., distinguishing superior (higher atmosphere) from inferior (sublunary) meteora – yet his reliance on unverified assumptions, such as the earth's centrality and finite exhalations, constrained predictive accuracy until empirical challenges in the Renaissance. Despite its inaccuracies, Meteorologica established meteorology as a deductive science, influencing Islamic scholars like Avicenna and medieval Europeans who expanded its observational base without overturning its core principles until the 17th century.

Post-Aristotelian Advances to 19th Century

In the centuries following Aristotle, meteorological inquiry progressed through empirical observations and instrumental innovations, particularly during the Scientific Revolution. Evangelista Torricelli's invention of the mercury barometer in 1643 provided the first reliable means to measure atmospheric pressure, revealing variations that correlated with weather changes and challenging earlier qualitative descriptions. Thermometers, refined in the 17th century from early designs by Galileo and Santorio, enabled precise temperature recordings, laying the groundwork for quantitative data collection.

Theoretical models emerged in the 18th century, with George Hadley proposing in 1735 a large-scale circulation driven by solar heating at the equator and cooling at higher latitudes, explaining the trade winds as part of a hemispheric cell. Benjamin Franklin's 1752 kite experiment during a thunderstorm demonstrated that lightning consists of electricity, linking electrical phenomena to atmospheric processes and informing later understandings of thunderstorms. John Dalton advanced studies of gaseous composition with his 1793 Meteorological Observations and Essays, which included systematic weather records, and his 1801 law of partial pressures, quantifying how mixed atmospheric gases exert independent pressures.

By the early 19th century, classification systems formalized observations, as Luke Howard introduced a nomenclature for clouds in 1803, categorizing them into genera like cirrus, cumulus, and stratus based on form and altitude, a framework still foundational. Mid-century empirical laws emerged from coordinated observations; Christophorus Buys Ballot formulated in 1857 the relation between wind direction and pressure gradients, stating that in the Northern Hemisphere an observer standing with their back to the wind has lower pressure to the left. These advances shifted meteorology from speculative philosophy toward data-driven science, emphasizing pressure, temperature, humidity, and wind measurements across observational networks.

Instrumentation and Early Forecasting

The invention of the mercury barometer by Evangelista Torricelli in 1643 marked a pivotal advance in meteorological instrumentation, enabling the quantitative measurement of atmospheric pressure variations that correlate with weather systems. This device, consisting of a glass tube filled with mercury and inverted in a reservoir, demonstrated that air exerts pressure on the column, with falling readings often preceding storms due to approaching low-pressure areas. Concurrently, early thermometers emerged; while Galileo developed a rudimentary water-based version around 1593, the mercury thermometer calibrated by Daniel Gabriel Fahrenheit in 1714 provided reliable temperature scales, facilitating the tracking of diurnal and seasonal thermal patterns critical to climatological analyses. Hygrometers for humidity measurement evolved from primitive designs attributed to Leonardo da Vinci in the 15th century, with significant 17th-century improvements using hair and other organic materials that expand with moisture absorption. Wind speed gauges, or anemometers, saw early mechanical forms in the 15th century, but the practical cup anemometer was invented by John Thomas Romney Robinson in 1846, allowing consistent velocity recordings. Rain gauges, documented in Korea under King Sejong as early as 1441, quantified precipitation volume, supporting hydrological correlations between rainfall, flooding, and crop yields. These instruments, deployed at fixed stations, generated empirical datasets that revealed causal links, such as pressure gradients driving winds and temperature shifts following frontal passages, though initial observations remained localized and non-synoptic.

In the early 19th century, Luke Howard, a British pharmacist and amateur meteorologist, advanced qualitative observation through his 1803 essay "On the Modifications of Clouds," classifying clouds into genera—cirrus, cumulus, stratus, and nimbus—based on form, altitude, and precipitation potential, a system refined over subsequent editions and foundational to modern nomenclature. This enabled observers to infer atmospheric stability and moisture content from visual cues, complementing instrumental data. The electric telegraph, commercialized in the 1840s, revolutionized data collection by enabling rapid transmission from remote stations, permitting the compilation of simultaneous pressure and wind maps across regions.

Early forecasting emerged in this context, with Vice-Admiral Robert FitzRoy establishing the British Meteorological Department in 1854 and issuing gale warnings from 1860 based on telegraphed barometric trends. On August 1, 1861, FitzRoy published the world's first daily public weather forecast in The Times, predicting conditions 24–48 hours ahead using isobaric analysis to identify storm paths, motivated by maritime disasters like the 1859 Royal Charter wreck. These bulletins, phrased as "probable weather" for coastal areas, relied on empirical rules linking falling pressure to approaching storms but faced skepticism due to inconsistent accuracy amid limited data density and theoretical gaps in atmospheric dynamics. In the United States, Joseph Henry at the Smithsonian Institution coordinated voluntary observations from 1849, issuing rudimentary alerts that prefigured national services. Such efforts underscored forecasting's probabilistic nature, grounded in instrumental evidence rather than deterministic models, with success tied to recognizing pressure-driven causality over folklore.

20th Century Developments in Dynamics and Networks

In the early 20th century, Vilhelm Bjerknes established the Bergen School of Meteorology in 1917, which advanced dynamical understanding through polar front theory and the Norwegian cyclone model. This framework, developed by Jacob Bjerknes and Halvor Solberg between 1918 and 1921, described cyclones as originating along the polar front—a boundary between cold polar air and warmer subtropical air—and progressing from an incipient wave through warm-sector development and occlusion to eventual dissipation. The theory integrated hydrodynamic and thermodynamic principles, explaining air-mass contrasts and frontal displacements as drivers of mid-latitude weather systems, replacing earlier vague depictions of pressure centers with causal mechanisms rooted in baroclinicity and geostrophic balance. These concepts disseminated globally, influencing synoptic analysis during the interwar period, particularly in Europe and North America.

Carl-Gustaf Rossby, building on Bergen school ideas after emigrating to the United States in 1926, formalized large-scale dynamics in the 1930s, introducing Rossby waves as planetary-scale undulations in the westerlies governed by the beta effect—the latitudinal variation in the Coriolis parameter. Rossby's work emphasized conservation of absolute vorticity, enabling predictions of long-wave patterns and their role in steering cyclones, while his establishment of meteorology programs at institutions like MIT and the University of Chicago trained forecasters in quasi-geostrophic approximations. By the 1940s, wartime demands accelerated these developments, with intensified focus on upper-air dynamics via balloon and aircraft soundings to resolve vertical wind shears and thermal structures.

Parallel advancements in observational networks supported dynamical models by providing synoptic data for verification. Surface station networks expanded rapidly; for instance, the U.S. Weather Bureau operated over 200 cooperative stations by 1900, growing to thousands by mid-century through voluntary observers and automated rain gauges, enabling daily synoptic compilation. Upper-air profiling transformed in the 1920s with radiosondes—instruments attached to balloons transmitting temperature, pressure, and humidity via radio telemetry—first operationalized in the late 1920s, with pioneering flights in 1927 and 1929, and achieving global adoption by the 1940s for routine twice-daily soundings up to 20–30 km altitudes. These complemented earlier kite and manned balloon ascents, resolving the three-dimensional atmospheric structure essential for frontal verification. International coordination via the International Meteorological Committee (formed in 1873 and evolving alongside the International Meteorological Organization) facilitated telegraphic data exchange across continents, standardizing observations for hemispheric analyses by the 1920s. This network density—exemplified by Europe's 500+ stations contributing to forecasts—allowed empirical testing of dynamical theories, revealing discrepancies like underpredicted cyclone intensification that spurred refinements in baroclinic instability concepts. By World War II, Allied meteorological services integrated these networks for operational forecasting, laying groundwork for post-war computational integration while highlighting limitations in sparse tropical coverage.

Numerical and Computational Era

The numerical era in meteorology began with Lewis Fry Richardson's 1922 publication of Weather Prediction by Numerical Process, which outlined a method to forecast weather by solving hydrodynamic equations manually. Richardson's attempt produced an erroneous pressure change of 145 hectopascals over six hours due to inconsistent initial data and the immense computational demands, rendering manual numerical prediction impractical without electronic aids. Advances in electronic computing during the late 1940s enabled the first successful numerical weather predictions. In April 1950, Jule Charney, together with Ragnar Fjørtoft and John von Neumann, utilized the ENIAC computer to integrate the barotropic vorticity equation, yielding a 24-hour forecast that required approximately 24 hours of computation time. This effort, detailed in a November 1950 Tellus paper, demonstrated modest skill in predicting large-scale patterns, marking the inception of numerical weather prediction (NWP).

The 1950s saw institutional commitment to NWP, with the formation of the Joint Numerical Weather Prediction Unit in 1954, staffed jointly by the U.S. Weather Bureau and the Air Force and Navy weather services, to develop and implement operational models. By 1955, the unit had begun producing routine 36-hour forecasts. These barotropic models evolved into more comprehensive primitive-equation systems, incorporating vertical structure and nonlinear dynamics, facilitated by increasing computational power. Operational NWP expanded globally in the 1960s; the U.K. Met Office issued its first routine computer forecasts twice daily starting November 2, 1965. Subsequent decades brought global models, data assimilation techniques, and ensemble methods to address atmospheric chaos, with supercomputers enabling higher-resolution grids and improved forecast accuracy. By the 1980s, centers like the European Centre for Medium-Range Weather Forecasts routinely produced skillful medium-range predictions, transforming meteorology from an empirical craft into a physics-based predictive science.

Observation and Equipment

Ground-Based Instruments

Ground-based instruments provide direct, in-situ measurements of atmospheric variables at the Earth's surface, forming the core of surface observation networks worldwide. These tools, deployed at fixed weather stations or portable setups, quantify parameters essential for forecasting, climate monitoring, and research, including air temperature, humidity, wind speed and direction, atmospheric pressure, precipitation amount and type, and visibility. According to World Meteorological Organization (WMO) standards outlined in its Guide to Meteorological Instruments and Methods of Observation (WMO-No. 8), surface observations must adhere to specific siting criteria—such as placement over level, open ground at least 100 meters from obstacles—to minimize errors from local effects like urban heat islands or terrain-induced turbulence. Modern automated weather stations integrate multiple sensors for real-time data transmission, achieving uncertainties typically below 0.5 °C for temperature, 0.5 hPa for pressure, and 10% for relative humidity under WMO Class 1 requirements for principal stations.

Thermometers measure air temperature by detecting expansion or resistance changes in materials. Liquid-in-glass thermometers, using mercury or alcohol, were pioneered in the early 17th century by inventors like Galileo and later standardized with scales by Daniel Fahrenheit (1714) and Anders Celsius (1742), offering resolutions down to 0.1 °C but prone to breakage and contamination. Contemporary platinum resistance thermometers (PRTs) or thermistors provide higher accuracy (±0.1 °C or better) and automation compatibility, with WMO specifying ventilation shields to reduce radiative heating errors that can exceed 5 °C in direct sunlight. These instruments are mounted 1.5 to 2 meters above ground in standard shelters to represent free-air conditions, excluding surface soil influences.

Barometers gauge atmospheric pressure, a key indicator of weather systems, since horizontal pressure gradients drive winds and pressure relates to height via the hypsometric relation. The mercury barometer, developed by Torricelli in 1643, determines pressure as the height of a supported mercury column (typically 760 mm at standard sea-level conditions), with modern aneroid versions using evacuated capsules for portability and digital readouts accurate to 0.1 hPa. WMO guidelines require corrections for temperature (mercury expansion coefficient ~0.00018/°C) and for gravity variations with latitude (equivalent to up to about 0.8 hPa between the poles and the equator), ensuring data comparability across global networks.

Wind parameters are captured by anemometers for speed and wind vanes for direction. Cup anemometers, invented by John Thomas Romney Robinson in 1846, employ rotating hemispheres calibrated to miles-per-hour or meters-per-second scales, with three-cup designs achieving ±1% accuracy at 10 m/s but responding poorly at speeds below about 1 m/s. Sonic anemometers, using ultrasonic pulse transit times, emerged in the 1990s for frictionless, high-frequency (up to 100 Hz) measurements immune to mechanical wear, ideal for turbulence studies with resolutions of 0.01 m/s. Instruments are sited at 10 meters height per WMO standards to standardize exposure over varied terrains, avoiding obstructions within 10 times their height to prevent flow distortion errors exceeding 20%.

Humidity is assessed via hygrometers, often as psychrometers combining wet- and dry-bulb thermometers, where evaporative cooling yields relative humidity from psychrometric tables or equations like Tetens' formula. Hair hygrometers, based on organic fiber elongation, date to the 18th century but suffer ±5% inaccuracies from hysteresis and drift; capacitive sensors in modern units offer ±2% precision by measuring dielectric changes in polymer films. WMO mandates aspiration at 3–5 m/s for psychrometers to ensure an accurate wet-bulb depression, critical for humidity and dew-point calculations that feed fog and precipitation forecasts.
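The wet-bulb/dry-bulb relationship above can be made concrete with a short calculation. The following is a minimal sketch rather than an operational algorithm: it assumes a Magnus/Tetens-type saturation vapor pressure formula, a generic psychrometer coefficient for an aspirated instrument, and standard sea-level pressure, and the function names are illustrative only.

```python
import math

def saturation_vapor_pressure_hpa(t_celsius: float) -> float:
    """Magnus/Tetens-type approximation for saturation vapor pressure over water (hPa)."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def relative_humidity_from_psychrometer(t_dry: float, t_wet: float,
                                        pressure_hpa: float = 1013.25,
                                        coeff: float = 6.62e-4) -> float:
    """
    Estimate relative humidity (%) from dry- and wet-bulb temperatures (deg C)
    using the simple psychrometer relation e = e_s(T_wet) - A * p * (T_dry - T_wet),
    where the coefficient A assumes a well-aspirated instrument.
    """
    e = saturation_vapor_pressure_hpa(t_wet) - coeff * pressure_hpa * (t_dry - t_wet)
    return max(0.0, 100.0 * e / saturation_vapor_pressure_hpa(t_dry))

# Example: 25 C dry bulb and 20 C wet bulb at standard pressure -> roughly 63% RH.
print(f"RH ~ {relative_humidity_from_psychrometer(25.0, 20.0):.0f}%")
```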
Rain gauges quantify precipitation through collection and measurement, with standard tipping-bucket designs recording 0.2 mm increments via electromagnetic switches, suitable for intensities up to 50 mm/h but undercatching by 5–20% in windy conditions due to aerodynamic effects. Float-type gauges provide continuous water-level records useful for snowfall melt equivalents, while WMO-recommended siphoning gauges auto-empty for long-term unattended operation; siting requires funnels 0.3–1 m above ground to avoid splash-in errors. These instruments, alongside visibility meters (e.g., transmissometers measuring 5–50 km ranges) and present-weather detectors (e.g., disdrometers classifying drop sizes), enable comprehensive synoptic reports coded in formats such as SYNOP and BUFR for international exchange.

Remote Sensing and Satellites

Remote sensing in meteorology encompasses the acquisition of atmospheric data through detection of electromagnetic radiation reflected or emitted from Earth and its atmosphere, enabling observations over vast areas inaccessible to ground-based instruments. Satellites serve as primary platforms, providing continuous global coverage critical for tracking weather systems, monitoring cloud dynamics, and deriving parameters such as sea surface temperatures and atmospheric moisture profiles. The inaugural meteorological satellite, TIROS-1, launched by NASA on April 1, 1960, demonstrated the feasibility of space-based imaging by capturing the first cloud-cover photographs from orbit, revolutionizing weather observation. Subsequent developments included the geostationary GOES (Geostationary Operational Environmental Satellite) series, with the first launched on October 16, 1975, allowing fixed-position monitoring of regional weather events.

Meteorological satellites operate in two principal orbit types: geostationary, positioned at approximately 35,786 kilometers altitude over the equator for stationary views of a fixed disk, facilitating real-time updates every 10–15 minutes; and polar-orbiting, in low-Earth orbits of 800–1,400 kilometers that traverse from pole to pole, achieving near-global coverage twice daily with resolutions down to 250 meters. Geostationary systems like GOES excel in severe weather tracking, while polar platforms such as NOAA's JPSS series provide detailed vertical soundings essential for model initialization. Key instruments include passive microwave radiometers for all-weather precipitation estimation, infrared sounders for temperature and humidity profiling up to 40 kilometers altitude, and visible/near-infrared imagers capturing spectral bands to discern cloud types and distributions. Data from these sensors, processed via retrieval algorithms and radiative transfer models, yield products like sea surface temperature maps and wind vectors derived from cloud motion, assimilated into forecast models to enhance accuracy. Recent advancements feature 16-channel imagers on the GOES-R series (GOES-16 through GOES-19), offering 2-kilometer infrared resolution and scan times under 5 minutes for continental U.S. coverage, improving convective storm nowcasting. Integration of scatterometry and altimetry further refines ocean surface wind and wave data, while emerging GNSS reflectometry and radio occultation techniques augment traditional sensors for surface wind and tropospheric delay measurements. These capabilities, validated against in-situ observations, underscore satellites' indispensable role in mitigating observational gaps in data-sparse regions.
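The quoted geostationary altitude follows directly from Kepler's third law: a satellite whose orbital period matches one sidereal day must sit at a particular radius. The sketch below, using standard constants, reproduces the ~35,786 km figure; it is illustrative arithmetic rather than part of any operational system.

```python
import math

MU_EARTH = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY = 86164.0905      # Earth's rotation period, s
EARTH_EQ_RADIUS = 6_378_137.0  # equatorial radius, m

# Kepler's third law: T^2 = 4*pi^2 * r^3 / mu  ->  r = (mu * T^2 / (4*pi^2))^(1/3)
r = (MU_EARTH * SIDEREAL_DAY**2 / (4.0 * math.pi**2)) ** (1.0 / 3.0)
altitude_km = (r - EARTH_EQ_RADIUS) / 1000.0
print(f"Geostationary altitude ~ {altitude_km:,.0f} km")   # ~35,786 km
```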

Data Networks and Assimilation

The Global Observing System (GOS), established under the World Meteorological Organization (WMO), coordinates worldwide observations of the atmosphere, ocean surface, land, and space-based platforms to support weather monitoring, forecasting, and climate analysis. It integrates data from national meteorological services, including surface stations measuring temperature, pressure, humidity, and wind; upper-air soundings via radiosondes launched twice daily from approximately 900 global sites; and marine observations from buoys and ships. Satellite constellations, such as geostationary and polar-orbiting systems operated by agencies like NOAA and EUMETSAT, provide continuous coverage of cloud patterns, moisture, and radiative fluxes, contributing over 90% of assimilated data volume in modern systems. Aircraft reports and ground-based radars further enhance resolution in populated regions, with the system's evolution guided by WMO's rolling reviews to address gaps in polar and oceanic areas.

The WMO Integrated Global Observing System (WIGOS) framework unifies these networks under standardized protocols for quality management, metadata, and real-time exchange via the WMO Information System (WIS), which leverages the Global Telecommunication System for near-instantaneous dissemination. Core components include the Global Basic Observing Network (GBON), mandating essential surface and upper-air measurements from member states to ensure baseline coverage, with compliance monitoring tools developed by ECMWF to track implementation. Despite advancements, challenges persist, such as uneven station density in developing regions and instrument biases, which necessitate quality-control procedures like automated outlier detection and inter-comparisons against model backgrounds.

Data assimilation integrates these heterogeneous observations into numerical weather prediction (NWP) models by minimizing discrepancies between measured values and short-range model forecasts, producing optimal initial conditions that account for observational errors and model uncertainties. Variational methods such as three-dimensional variational (3D-Var) analysis solve a static optimization problem at a single analysis time, while four-dimensional variational (4D-Var) extends this over a time window, implicitly using the model dynamics to propagate observational information through the window. ECMWF implemented operational 4D-Var in 1997, enabling assimilation of diverse data types like satellite radiances and GPS signals, which has helped extend medium-range forecast skill by several days since the 1970s. Ensemble Kalman filter (EnKF) approaches, used by NOAA's National Centers for Environmental Prediction (NCEP), employ Monte Carlo sampling of model perturbations to quantify uncertainty probabilistically, outperforming variational methods in handling non-Gaussian errors in some settings. These techniques rely on background error covariances derived from ensemble forecasts or historical statistics, with bias correction applied to systematic offsets in satellite data, ensuring consistency between observations and dynamical principles. For instance, ECMWF's system assimilates over 100 million observations daily, and such improvements have contributed to reducing root-mean-square errors in geopotential height forecasts by up to 20% over recent decades. Limitations include the computational demands of 4D-Var, which require supercomputing resources, and assumptions of Gaussian error distributions that may underestimate extremes, prompting hybrid ensemble-variational methods in operational centers. Ongoing developments, such as joint assimilation of atmospheric and oceanic data, aim to enhance coupled forecasting for events like tropical cyclones.
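The analysis step common to 3D-Var and Kalman-filter methods can be illustrated with a toy linear example. The sketch below assumes a two-point state, a single observation, and known Gaussian error covariances; the Kalman-gain update it computes is algebraically the minimizer of the 3D-Var cost function for a linear observation operator. Variable names are illustrative, not taken from any operational system.

```python
import numpy as np

def analysis_update(x_b, y, H, B, R):
    """
    One linear analysis step: x_a = x_b + K (y - H x_b), with Kalman gain
    K = B H^T (H B H^T + R)^{-1}.  For linear H this minimizes the 3D-Var cost
    J(x) = (x - x_b)^T B^{-1} (x - x_b) + (y - H x)^T R^{-1} (y - H x).
    """
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return x_b + K @ (y - H @ x_b)

# Toy setup: two grid points, one observation of the first point only.
x_b = np.array([10.0, 12.0])             # background (short-range forecast) state
B = np.array([[1.0, 0.5], [0.5, 1.0]])   # background error covariance (correlated)
H = np.array([[1.0, 0.0]])               # observation operator: observe point 1
y = np.array([11.0])                     # observed value
R = np.array([[0.5]])                    # observation error variance

x_a = analysis_update(x_b, y, H, B, R)
print(x_a)   # [10.67, 12.33]: the unobserved point is also nudged via B's covariance
```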

Atmospheric Scales and Phenomena

Microscale and Mesoscale Dynamics

Microscale dynamics encompass atmospheric motions on horizontal length scales typically smaller than about 4 kilometers and time scales under 1 hour, where the Coriolis force becomes negligible and local frictional and buoyancy forces dominate. Key processes include turbulent eddies in the planetary boundary layer, buoyant thermals rising from heated surfaces, and wakes formed behind topographic obstacles, all governed primarily by the Navier-Stokes equations without rotational effects. These dynamics drive small-scale phenomena such as dust devils, with updrafts reaching 10–20 m/s over diameters of 10–100 meters, and urban heat island circulations influenced by building-induced flow perturbations. Simulations of microscale flows often rely on computational fluid dynamics (CFD) models to resolve motions down to grid scales of 1–10 meters, as seen in studies of pollutant dispersion in complex terrain.

Mesoscale dynamics address phenomena spanning horizontal scales of roughly 4 to 400 kilometers and durations of hours, bridging microscale and larger synoptic patterns through ageostrophic balances and the release of convective instability. Characteristic features include organized convection in squall lines, where cold pool outflows propagate at 10–20 m/s and generate new updrafts via dynamic lifting, and sea-breeze fronts advancing inland at 5–10 km/h under differential heating. Gravity waves on mesoscales, with wavelengths of 10–100 km, propagate energy vertically and horizontally, modulating cloud formation and precipitation as observed in midlatitude systems. These processes deviate from geostrophic equilibrium due to strong vertical shears and latent heat release, necessitating full primitive-equation models like the Weather Research and Forecasting (WRF) system, which resolves features at 1–10 km grid spacing.

Interactions between microscale and mesoscale flows occur in the "gray zone" regime around 1–10 km grid spacing, where convection and turbulence are only partially resolved, challenging both subgrid parameterizations and explicit simulation and limiting predictability without nested high-resolution domains. Empirical data from field campaigns, such as measurements of boundary-layer rolls with widths of 5–20 km, underscore how surface heterogeneity amplifies mesoscale circulations, influencing local convective outbreaks. Advances in coupled modeling frameworks, integrating meso- to microscale domains, have improved forecasts of wakes extending 10–50 km downwind of large obstacles and wind farms, with velocity deficits up to 20–40% in stable conditions.

Synoptic and Global Scales

The synoptic scale refers to atmospheric disturbances with horizontal wavelengths of approximately 1,000 to 5,000 kilometers and temporal scales of 1 to 10 days. These features dominate mid-latitude weather patterns and are analyzed through simultaneous observations across large regions to depict evolving pressure systems and fronts. Key synoptic phenomena include extratropical cyclones, low-pressure systems that develop poleward of 30° via baroclinic instability, where horizontal temperature gradients provide energy for growth through vertical shear and ageostrophic circulations. These cyclones typically span 1,000 to 2,500 kilometers, feature comma-shaped cloud patterns in satellite imagery, and are associated with fronts marking boundaries between air masses of contrasting temperature and humidity, leading to organized bands of precipitation and winds exceeding 50 knots in mature stages. Anticyclones, or high-pressure systems, complement these by promoting subsidence and clear skies, often steering the migratory patterns of lows across continents. Upper-level jets, such as the polar jet stream at 200–300 hPa, modulate synoptic development by enhancing divergence aloft, which sustains surface pressure falls; maximum speeds reach 100–200 km/h in winter due to strengthened meridional temperature contrasts.

On the global scale, planetary circulation manifests through a three-cell model in each hemisphere, driven by differential solar heating and Earth's rotation, which collectively transport heat poleward at rates equivalent to about 2 petawatts. The Hadley cell spans from the equator to roughly 30° latitude, characterized by equatorward trade winds at the surface converging into rising motion at the intertropical convergence zone, with poleward upper-tropospheric flow and subtropical descent forming high-pressure belts and deserts. The Ferrel cell, an indirect circulation between 30° and 60° latitude, features prevailing westerlies at the surface resulting from eddy-driven momentum fluxes that counteract thermal forcing, facilitating mid-latitude storm tracks. The polar cell, from 60° to the poles, involves cold air subsidence over the Arctic and the Antarctic, surface outflow as polar easterlies, and equatorward flow aloft, maintaining the coldest tropospheric temperatures and influencing polar vortex dynamics. Rossby waves, large-scale undulations in the westerlies with wavelengths of 3,000 to 6,000 kilometers, propagate westward relative to the mean flow and introduce variability in global weather through teleconnections such as the North Atlantic Oscillation.

Branches of Meteorology

Physical and Dynamic Meteorology

Physical meteorology applies principles of physics to atmospheric phenomena, encompassing thermodynamics, radiative transfer, cloud microphysics, and precipitation processes. Thermodynamic concepts, such as the first law of thermodynamics and adiabatic processes, govern temperature changes and air-parcel stability, with the dry adiabatic lapse rate calculated at approximately 9.8 °C per kilometer under standard conditions. The radiation balance involves shortwave solar absorption and longwave terrestrial emission, modulated by atmospheric constituents like water vapor, carbon dioxide, and aerosols, which influence the greenhouse effect through selective absorption spectra. Cloud microphysics details hydrometeor formation via nucleation, where cloud condensation nuclei (typically 0.1–1 μm particles) allow droplet initiation at supersaturations of only a fraction of a percent, followed by growth through vapor diffusion and collision-coalescence leading to precipitation.

Dynamic meteorology studies large-scale atmospheric motions through fluid dynamics, deriving predictive models from the Navier-Stokes equations adapted for geophysical scales. The primitive equations form the core framework, consisting of the horizontal momentum equations incorporating the Coriolis force (with parameter $f = 2\Omega \sin\phi$, where $\Omega$ is Earth's angular velocity and $\phi$ is latitude), hydrostatic balance ($\partial p / \partial z = -\rho g$), continuity ($\partial \rho / \partial t + \nabla \cdot (\rho \mathbf{v}) = 0$), and the thermodynamic energy equation. These are often simplified under the anelastic or Boussinesq approximations for buoyancy-driven flows, filtering sound waves for computational efficiency. Key balances include geostrophic adjustment, where pressure gradient forces equilibrate with the Coriolis deflection, yielding geostrophic wind speeds $v_g = \frac{1}{f\rho} \frac{\partial p}{\partial n}$ (with $n$ the direction normal to the isobars) for flow directed parallel to the isobars, essential for midlatitude synoptic systems. Rossby waves, governed by conservation of potential vorticity $q = (\zeta + f)/h$, propagate westward relative to the mean flow, influencing jet-stream meanders and storm tracks with typical wavelengths of 3000–6000 km.
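As a small worked example of the geostrophic relation above, the following sketch computes the Coriolis parameter and the geostrophic wind speed implied by a given horizontal pressure gradient. It assumes a constant air density and is illustrative only; in practice density varies, curvature effects matter, and the relation breaks down near the equator where f approaches zero.

```python
import math

OMEGA = 7.2921159e-5   # Earth's angular velocity, rad/s
RHO = 1.2              # assumed constant air density, kg/m^3

def coriolis_parameter(lat_deg: float) -> float:
    """f = 2 * Omega * sin(latitude)."""
    return 2.0 * OMEGA * math.sin(math.radians(lat_deg))

def geostrophic_speed(dp_dn: float, lat_deg: float, rho: float = RHO) -> float:
    """Geostrophic wind speed v_g = (1 / (f * rho)) * dp/dn, with dp/dn in Pa/m."""
    return dp_dn / (coriolis_parameter(lat_deg) * rho)

# Example: a 4 hPa pressure difference across 500 km at 45 degrees latitude.
dp_dn = 400.0 / 500_000.0                                   # Pa/m
print(f"v_g ~ {geostrophic_speed(dp_dn, 45.0):.1f} m/s")    # a few metres per second
```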

Synoptic and Specialized Methodological Approaches

Synoptic meteorology focuses on the analysis and prediction of large-scale atmospheric phenomena by integrating simultaneous observations from diverse sources to depict weather patterns across regions spanning thousands of kilometers. This methodological approach emphasizes constructing synoptic charts—maps that summarize atmospheric conditions at a specific time, including sea-level pressure, temperature, humidity, and wind—to identify features such as extratropical cyclones, anticyclones, and fronts. Data from surface stations, radiosondes, aircraft reports, and satellites enables the portrayal of horizontal and vertical structures, facilitating the understanding of air-mass interactions and weather-system evolution.

Core methods in synoptic analysis involve plotting isobars to delineate pressure gradients driving geostrophic winds, tracing fronts via discontinuities in temperature and dew point, and evaluating vorticity and thermal advection to assess development in weather systems. Forecasters apply kinematic techniques, such as streamline and isotach analysis, to compute divergence and convergence fields from horizontal wind vectors, which indicate regions of upward motion conducive to cloud formation and precipitation. These empirical approaches rely on pattern recognition from historical analogs, supplemented by rules of thumb for system motion, like steering by upper-level winds at 500 hPa. Short-range forecasts often employ persistence or trend extrapolation, with accuracy diminishing beyond 24–48 hours due to amplification of initial errors.

Specialized methodological approaches within synoptic meteorology incorporate dynamic principles to enhance diagnostic capabilities beyond basic charting. Quasi-geostrophic theory approximates mid-latitude synoptic flows by balancing Coriolis forces with pressure gradients while accounting for weak ageostrophic components, enabling calculations of vertical velocity via the omega equation and predictions of frontogenesis through differential vorticity and thermal advection. This framework, developed in the mid-20th century, underpins diagnostic tools like the Petterssen frontogenesis function, which quantifies front sharpening from convergent flow. Kinematic methods extend to estimating vertical motions through the continuity equation, integrating horizontal divergence over layers to derive ascent rates typically on the order of 0.1–1 Pa/s in developing cyclones (see the sketch below).

In tropical contexts, synoptic methods adapt to weaker baroclinicity and dominant convective processes, emphasizing easterly waves, monsoon troughs, and tropical cyclone tracks rather than mid-latitude fronts. Analysis prioritizes moisture convergence, low vertical wind shear, and sea surface temperatures exceeding 26.5 °C for genesis potential, with specialized techniques revealing synoptic-scale flow interactions like vorticity recycling in developing disturbances. These approaches, informed by sparse observations, integrate satellite-derived winds and Dvorak enhancement techniques for intensity estimation, differing fundamentally from the extratropical reliance on thermal contrasts.
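The kinematic estimate of vertical motion mentioned above can be sketched as a vertical integration of the continuity equation in pressure coordinates, d(omega)/dp = -div(V). The example below uses a made-up divergence profile for a developing cyclone (low-level convergence, upper-level divergence) and is purely illustrative; operational diagnostics apply mass-balance corrections that this sketch omits.

```python
import numpy as np

def kinematic_omega(pressure_hpa, divergence, omega_surface=0.0):
    """
    Integrate d(omega)/dp = -div(V) upward from the surface to estimate the
    vertical velocity omega (Pa/s) at each pressure level.  Negative omega
    means ascent.  Levels must be ordered from the surface (largest p) upward.
    """
    p = np.asarray(pressure_hpa, dtype=float) * 100.0   # hPa -> Pa
    d = np.asarray(divergence, dtype=float)             # horizontal divergence, 1/s
    omega = np.empty_like(d)
    omega[0] = omega_surface
    for k in range(1, len(p)):
        layer = 0.5 * (d[k] + d[k - 1]) * (p[k] - p[k - 1])   # trapezoidal layer integral
        omega[k] = omega[k - 1] - layer
    return omega

levels = np.array([1000.0, 850.0, 700.0, 500.0, 300.0])    # hPa
div = np.array([-2e-5, -1.5e-5, -0.5e-5, 1e-5, 2e-5])      # convergence below, divergence aloft
print(kinematic_omega(levels, div))   # mid-tropospheric ascent of a few tenths of a Pa/s
```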

Scale-Based Subfields

Micrometeorology focuses on atmospheric processes at the smallest scales, typically within the planetary boundary layer up to 1–2 km in height, where surface friction generates turbulence and governs exchanges of heat, moisture, and momentum between the ground and air. This subfield employs instruments such as flux towers and sonic anemometers to measure turbulent fluxes, informing models of local evaporation, boundary-layer dynamics, and urban microclimates. Applications extend to agriculture, wind-energy siting, and the dispersion of airborne pollutants, with key research emphasizing Monin-Obukhov similarity theory for scaling turbulent statistics under varying stability conditions.

Mesometeorology, or mesoscale meteorology, investigates phenomena spanning 5 to several hundred kilometers horizontally, including thunderstorms, squall lines, sea breezes, and orographically induced flows like gap winds. These features emerge from interactions between synoptic forcing and local heterogeneities, such as terrain or differential heating, and persist for hours to a day. High-resolution observations from radar networks and numerical models with grid spacings of 1–10 km are essential, as mesoscale systems challenge predictability due to nonlinear feedbacks and upscale energy cascades. This subfield supports nowcasting of severe local storms and has advanced through initiatives like the U.S. National Weather Service's mesoscale discussions.

Synoptic meteorology analyzes large-scale circulations exceeding 1000 km, encompassing extratropical cyclones, high-pressure ridges, and frontal boundaries, using data collected at standardized times (e.g., 00Z and 12Z UTC) to construct weather maps. Originating with the Bergen school's frontal wave theory of 1918–1922, it applies quasi-geostrophic approximations to diagnose baroclinic instability and jet-stream dynamics. Observational tools include upper-air soundings and satellite-derived water vapor imagery, enabling forecasts of precipitation patterns and storm tracks days in advance. The subfield remains foundational to operational centers, though limited by sparse data in remote regions like the oceans.

Planetary-scale meteorology addresses global patterns such as the Hadley, Ferrel, and polar cells, along with monsoonal regimes and equatorial waves, operating over thousands of kilometers and weeks to months. Driven by latitudinal heating imbalances and planetary rotation, these circulations are simulated via general circulation models incorporating radiative-convective equilibrium. Reanalysis datasets, like those from the European Centre for Medium-Range Weather Forecasts spanning 1979 to the present, provide empirical validation. This subfield interfaces with climatology but emphasizes transient variability, such as El Niño-Southern Oscillation teleconnections.

Weather Forecasting Methods

Observational and Empirical Techniques

Observational and empirical techniques in weather forecasting utilize direct measurements of atmospheric conditions and statistical or pattern-based rules derived from historical data to predict future states, serving as baselines against which more complex methods are evaluated. These approaches emphasize data from surface weather stations, upper-air soundings, radar, and earlier observations, interpreted through heuristics rather than dynamical simulations. Persistence forecasting, a fundamental empirical method, assumes current conditions will persist unchanged into the forecast period, performing adequately in stable synoptic environments like prolonged high-pressure systems but yielding low skill in dynamic weather. For instance, in one regional verification during April 2025, persistence forecasts for maximum temperatures exhibited errors averaging 3.1 °C at 24 hours lead time, highlighting the method's limitations beyond short horizons. Climatological forecasting provides another baseline by predicting the long-term average conditions for a specific location and date, derived from decades of historical records, and is particularly useful for assessing seasonal anomalies or verifying long-range predictions. Trend or steady-state forecasting extrapolates recent observed changes, such as continuing a daily temperature rise of 2 °C or a system's movement at its current speed, offering simplicity for nowcasting but degrading rapidly with atmospheric variability.

Analog methods enhance empirical prediction by identifying historical patterns—often via sea-level pressure or 500 hPa height fields—that closely match current observations, then averaging outcomes from those analogs to forecast evolution, with applications in subseasonal to seasonal ranges. This technique has demonstrated improved skill when using extended analog datasets, as longer records increase the likelihood of finding robust matches. Synoptic-scale empirical rules, rooted in manual analysis of weather charts, guide forecasters in recognizing fronts, pressure centers, and cloud patterns to anticipate changes, as pioneered in early 20th-century efforts to systematize observation-based prediction. Statistical empirical models, such as multiple linear regression on pattern-level predictors like teleconnection indices, extend these techniques for probabilistic outputs, though they remain constrained by the assumption of stationarity in historical relationships. In operational settings, combinations of persistence and climatology via linear weighting can outperform individual methods for medium-range temperature forecasts, underscoring the value of blending empirical baselines. These techniques persist in modern practice for verifying numerical model performance and handling data-sparse regions where direct simulation from physics is infeasible.
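The baseline methods described above are simple enough to express directly. The following sketch, using synthetic station data rather than any real record, compares persistence and climatology forecasts of daily maximum temperature by mean absolute error; the names and numbers are illustrative only.

```python
import numpy as np

def persistence_forecast(obs: np.ndarray, lead: int = 1) -> np.ndarray:
    """Forecast each day's value as the value observed `lead` days earlier."""
    return obs[:-lead]

def climatology_forecast(obs: np.ndarray, lead: int = 1) -> np.ndarray:
    """Forecast every day as the long-term mean of the record."""
    return np.full(len(obs) - lead, obs.mean())

def mean_absolute_error(forecast: np.ndarray, verifying: np.ndarray) -> float:
    return float(np.mean(np.abs(forecast - verifying)))

# Synthetic daily maximum temperatures (deg C) standing in for a station record.
rng = np.random.default_rng(0)
days = np.arange(120)
t_max = 20 + 5 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 2, days.size)

verifying = t_max[1:]   # observations valid at a one-day lead
print("persistence MAE:", round(mean_absolute_error(persistence_forecast(t_max), verifying), 2))
print("climatology MAE:", round(mean_absolute_error(climatology_forecast(t_max), verifying), 2))
```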

Numerical Weather Prediction Models

Numerical weather prediction (NWP) models forecast atmospheric conditions by numerically solving systems of partial differential equations that govern motion, thermodynamics, and moisture in the atmosphere. These models approximate the primitive equations—derived from conservation of momentum (Navier-Stokes adapted for large-scale flows), mass, energy, and water substance—using finite-difference, spectral, or finite-volume discretization methods on a three-dimensional grid. Sub-grid-scale processes, such as convection and turbulence, are represented through parametrizations rather than explicit resolution due to computational constraints.

Early efforts in NWP date to Lewis Fry Richardson's 1922 manual calculations, which produced unstable results due to inadequate data and computational methods, highlighting the sensitivity to initial conditions later formalized by Edward Lorenz's work on deterministic chaos in 1963. Practical implementation began in 1950 with Jule Charney's team using the ENIAC computer for barotropic forecasts; routine operational prediction followed in 1955 at the U.S. Weather Bureau's Joint Numerical Weather Prediction Unit. Advancements accelerated with multi-layer models: U.S. forecasting centers introduced a three-layer hemispheric model in 1962 and a six-layer primitive-equation model in 1966, enabling global coverage by 1973.

The NWP process comprises data assimilation, initialization, and forecast integration. Data assimilation combines heterogeneous observations—surface stations, radiosondes, satellites, radars, and aircraft—with a short-range forecast background field to estimate initial conditions, minimizing errors via methods like 3D-Var, 4D-Var (incorporating observations over a 6–12 hour window), or ensemble Kalman filters. This step addresses observational sparsity and errors, producing analyses cycled into the forward integration, in which the equations are timestepped (typically every 10–30 minutes of model time) over forecast horizons of 1–10 days; a minimal illustration of such timestepping follows at the end of this section. Physics suites handle radiation, microphysics, land-surface interactions, and convective processes, tuned against observations.

Prominent operational global NWP models include the NOAA Global Forecast System (GFS), running at ~13 km horizontal resolution with 64 vertical levels updated four times daily; the ECMWF Integrated Forecasting System (IFS), at ~9 km resolution and 137 levels, renowned for medium-range accuracy up to 10 days; the UK Met Office Unified Model; Germany's ICON; and Canada's GEM. Regional models like the Weather Research and Forecasting (WRF) model nest within global outputs for higher resolution (1–5 km) over limited domains. ECMWF's IFS consistently outperforms the GFS in verification scores for hemispheric 500 hPa height anomalies, attributed to superior data assimilation, ensemble design, and resolution. Post-2020 advances leverage growing supercomputing capacity for resolutions below 3 km, improved hybrid ensemble-variational assimilation incorporating AI for bias correction, and machine-learning emulators accelerating physics parametrizations by factors of 10–100 while preserving fidelity. However, traditional deterministic NWP remains superior to pure AI models for extended-range predictability in chaotic systems, as validated by skill scores for events like tropical cyclones. Limitations persist from error growth (doubling times on the order of 2 days for synoptic scales) and parametrization uncertainties, necessitating ensembles for probabilistic outputs.
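To make the idea of grid discretization and timestepping concrete, the sketch below advects a scalar field with a first-order upwind finite-difference scheme on a periodic one-dimensional grid. It is a teaching toy under the stated CFL stability condition, not a fragment of any operational model, and the grid size, wind speed, and timestep are arbitrary choices.

```python
import numpy as np

def upwind_advection(q: np.ndarray, u: float, dx: float, dt: float, n_steps: int) -> np.ndarray:
    """
    Advect scalar q with constant wind u > 0 on a periodic 1-D grid using the
    first-order upwind scheme  q_i^{n+1} = q_i^n - (u*dt/dx) * (q_i^n - q_{i-1}^n).
    Stability requires the CFL condition u*dt/dx <= 1.
    """
    c = u * dt / dx
    if c > 1.0:
        raise ValueError("CFL condition violated: reduce dt or increase dx")
    for _ in range(n_steps):
        q = q - c * (q - np.roll(q, 1))
    return q

# 100 km periodic domain, 1 km spacing, 10 m/s wind, 50 s steps (CFL = 0.5).
x = np.arange(0.0, 100_000.0, 1_000.0)
q0 = np.exp(-((x - 30_000.0) / 5_000.0) ** 2)        # Gaussian "moisture blob"
q10h = upwind_advection(q0, u=10.0, dx=1_000.0, dt=50.0, n_steps=720)  # 10 hours
print("blob centre now near", x[np.argmax(q10h)] / 1000.0, "km")  # ~90 km (360 km mod 100 km)
```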

Ensemble and Probabilistic Forecasting

Ensemble forecasting is a numerical weather prediction technique that produces multiple simulations, or "members," by introducing controlled perturbations to initial conditions, model physics, or parameters, thereby sampling a distribution of plausible future atmospheric states to quantify forecast uncertainty. This method acknowledges the chaotic nature of atmospheric dynamics, where small errors in initial data or model representations can amplify into significant divergences, as formalized by Edward Lorenz's work on sensitivity to initial conditions in the 1960s. Unlike deterministic forecasts, which yield a single trajectory, ensembles generate a spread of outcomes whose statistical properties inform probabilistic predictions, such as the likelihood of temperature anomalies or storm tracks.

Operational ensemble systems emerged in the early 1990s to address limitations in single-model runs, with the European Centre for Medium-Range Weather Forecasts (ECMWF) launching its Ensemble Prediction System (EPS) on December 19, 1992, using 33 members at triangular truncation T63 resolution (approximately 210 km grid spacing) for 10-day forecasts. The U.S. National Centers for Environmental Prediction (NCEP) followed suit around the same period, introducing the precursor of the Global Ensemble Forecast System (GEFS). Perturbation techniques include singular vector methods to target the fastest-growing instabilities, ensemble transform Kalman filters for flow-dependent errors, and stochastic parameterizations to mimic subgrid-scale variability, ensuring members are equally likely representations of the underlying uncertainty sources.

Probabilistic forecasts derive directly from ensemble statistics, expressing outcomes as probabilities rather than point estimates; for example, the probability of precipitation exceeding 10 mm in 24 hours is the proportion of members meeting that threshold, calibrated via ensemble dressing or Bayesian model averaging to correct biases (a minimal numerical illustration follows below). In practice, the National Weather Service (NWS) employs GEFS outputs for medium-range guidance, where the ensemble mean often outperforms individual deterministic runs in anomaly correlation scores beyond 5 days, particularly for blocking patterns or tropical cyclone genesis. Verification metrics, such as the continuous ranked probability score (CRPS), demonstrate ensembles' superior reliability for high-impact events, with the ECMWF EPS achieving CRPS improvements of 10–20% over deterministic ECMWF forecasts in 500 hPa geopotential height fields from 1992 to 2017. These approaches enhance decision-making in sectors such as energy and emergency management, where underestimating spread in deterministic models has led to overconfident errors; during the 1999 U.S. Midwest floods, for example, ensembles better captured the range of possible outcomes. Computational demands remain high—modern systems like ECMWF's 51-member EPS require substantial supercomputing resources—but statistical calibration and post-processing, such as analog ensembles, further refine local probabilistic guidance without inflating computational cost. Limitations include underdispersion in some systems, where spreads are narrower than observed errors, necessitating ongoing calibration against reforecast datasets spanning decades.
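The two quantities named above, an exceedance probability read off the ensemble and the continuous ranked probability score, can both be computed directly from a list of member values. The sketch below uses a made-up ten-member precipitation ensemble; the CRPS estimator is the standard ensemble identity CRPS = E|X - y| - 0.5 * E|X - X'|, without the small-ensemble ("fair") correction.

```python
import numpy as np

def exceedance_probability(members: np.ndarray, threshold: float) -> float:
    """Event probability as the fraction of ensemble members exceeding the threshold."""
    return float(np.mean(members > threshold))

def crps_ensemble(members: np.ndarray, obs: float) -> float:
    """
    Continuous ranked probability score for one forecast-observation pair:
    CRPS = mean|X - y| - 0.5 * mean|X - X'| over ensemble members X, X'.
    Lower is better; for a single-member ensemble it reduces to absolute error.
    """
    m = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(m - obs))
    term2 = 0.5 * np.mean(np.abs(m[:, None] - m[None, :]))
    return float(term1 - term2)

ens = np.array([2.0, 5.0, 8.0, 12.0, 0.5, 7.0, 15.0, 3.0, 9.0, 11.0])  # mm per 24 h
print("P(precip > 10 mm) =", exceedance_probability(ens, 10.0))        # 0.3
print("CRPS vs obs of 6 mm =", round(crps_ensemble(ens, obs=6.0), 2))
```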

Applications

Forecasting and Public Safety

Weather forecasting enhances public safety by delivering timely warnings that facilitate evacuations, sheltering, and other protective measures, thereby mitigating loss of life and property damage during severe events. National meteorological services, such as the U.S. National Weather Service (NWS), issue alerts for phenomena including tornadoes, hurricanes, floods, and extreme heat, with studies indicating substantial reductions in casualties attributable to these systems. In the case of tornadoes, advancements in Doppler radar deployment since the 1990s have improved the probability of detection and extended average lead times, contributing to a marked decline in fatalities. U.S. tornado deaths averaged around 130 annually in the 1950s but fell to 54 per year between 1975 and 2000, a substantial per capita reduction largely attributable to enhanced warning capabilities. Studies further quantify that warnings with lead times up to 15 minutes reduce fatalities, while longer lead times account for 30–50% of injury reductions and up to 25% of fatality decreases. Tornado warnings have also been associated with over 40% fewer injuries at certain lead-time intervals.

For hurricanes, accurate track forecasts enable ordered evacuations, with geo-targeted warnings proving more effective than generic ones in increasing evacuation compliance rates. Forecast improvements, including clearer warning graphics, supported successful evacuations in events like Hurricane Hugo in 1989, where advance predictions allowed coastal preparations despite the storm's intensity. Overall, one estimate suggests that enhancing forecast accuracy by 50% could prevent approximately 2,200 deaths annually in the U.S. from weather-related extremes, underscoring the value of continued investment in prediction technologies. Broader applications include heat alerts and flood warnings, though effectiveness varies; for instance, NWS heat alerts showed no significant mortality reduction in some urban analyses, highlighting the need for integrated communication strategies. Globally, early warning systems have the potential to address the more than 2 million weather-related deaths recorded over the past 50 years, predominantly in vulnerable regions.

Sector-Specific Uses

Meteorology provides critical data and services tailored to economic sectors, enabling risk management, operational optimization, and enhanced resilience amid weather variability. Demand for such services has expanded due to rising weather-related financial losses, with applications spanning agriculture, aviation, energy, transportation, insurance, construction, and retail. In a 2005 survey by the Weather Risk Management Association, the energy sector accounted for 72% of business uses of weather-risk information, with agriculture, retail, transportation, and other sectors making up the remaining shares of roughly 5–9% each.

In agriculture, meteorological forecasts guide planting schedules, irrigation timing, pest management, and harvest operations to minimize losses from drought, floods, or frosts. The National Oceanic and Atmospheric Administration (NOAA) supplies satellite, radar, and in-situ data for assessing drought resilience and frequency, supporting decisions on input allocation and yield projections. Seasonal and subseasonal predictions add value in variable climates by informing crop selection and insurance uptake, though broad-area forecasts require downscaling for farm-level utility. Agrometeorologists also forecast wind conditions for safe pesticide spraying, reducing environmental runoff and drift while protecting yields.

Aviation depends on specialized meteorological products for safe operations, including observations of visibility, icing, turbulence, and thunderstorms that influence routing, delays, and cancellations. The Federal Aviation Administration (FAA) and National Weather Service collaborate via the Aviation Weather Center to deliver METARs (surface observations), TAFs (terminal aerodrome forecasts), and graphical tools like the Graphical Forecasts for Aviation, covering U.S. airspace and beyond. Terminal Doppler weather radar detects wind shear and gust fronts near airports, preventing accidents during takeoffs and landings. These services, disseminated through platforms like the Aviation Digital Data Service, integrate multiple data sources to support real-time pilot briefings and air traffic management.

The energy sector leverages meteorology for load forecasting, grid stability, and renewable integration, particularly wind and solar output predictions based on short-term wind speed, cloud cover, and insolation data. Climatological information aids site selection for wind farms and solar plants by evaluating long-term patterns of rainfall and wind regimes. Financial instruments like weather derivatives, traded on exchanges such as the Chicago Mercantile Exchange (which has handled roughly 108,000 contracts in a single year), hedge against temperature-driven demand spikes for heating or cooling.

Transportation beyond aviation incorporates forecasts for road de-icing, rail signaling in fog, and marine routing to evade storms, optimizing fuel use and safety. Shipping relies on wave height and gale warnings to adjust courses, while ground transport uses precipitation and visibility data for convoy planning and hazard mitigation. In insurance and construction, meteorology informs actuarial models for severe-event pricing and loss estimation, with post-event analyses refining risk maps after disasters like hurricanes. Construction schedules account for wind and rain thresholds to avoid delays, and both sectors are treated as highly weather-sensitive in weather-dependent economies such as Switzerland's. Retail applications extend to inventory adjustments for weather-influenced consumer behavior, such as seasonal apparel demand.

Military and Strategic Applications

Meteorology plays a pivotal role in military operations by informing decisions on timing, routes, and tactics, as adverse weather can degrade visibility, hinder mobility, and impair equipment efficacy, while favorable conditions enable surprise and precision. For instance, high winds or low ceilings can disrupt air sorties, ground advances, or naval maneuvers, necessitating accurate forecasts to mitigate risks and exploit opportunities. U.S. doctrine emphasizes environmental intelligence as a force multiplier, integrating meteorological data into command assessments across the land, maritime, air, and space domains.

During World War II, meteorological forecasting advanced significantly due to operational demands, with radar-based weather detection emerging from wartime radar developments to track storms and precipitation for aviation and naval campaigns. A notable example occurred in the Normandy invasion, where Allied forecasters, analyzing barometric trends and upper-air data, recommended delaying the June 5 landing to June 6 amid a brief clearing window, averting potential disaster from gale-force winds exceeding 30 knots and rough seas. In the Pacific theater, U.S. Army Air Forces used weather reconnaissance flights to identify typhoon paths, such as Typhoon Louise in October 1945, which threatened naval operations and led to the cancellation of planned strikes. In the Vietnam War era, the U.S. conducted Operation Popeye from 1967 to 1972, seeding clouds with silver iodide over the Ho Chi Minh Trail to extend monsoon rains and disrupt enemy logistics; the program flew thousands of cloud-seeding sorties and, according to internal evaluations, increased rainfall by up to 30% in targeted areas, though long-term efficacy remained contested due to natural variability. This marked one of the few documented attempts at operational weather modification for strategic denial, banned under the 1977 Environmental Modification Convention amid concerns over escalation.

Contemporary U.S. military meteorology relies on specialized units like the Air Force's 557th Weather Wing, which delivers tailored forecasts to warfighters using integrated data from ground sensors, aircraft, and satellites, supporting over 100 global sites as of 2023. The Defense Meteorological Satellite Program, operational since the early 1960s and transitioned to Space Force oversight, provides visible, infrared, and microwave imagery to detect fog, clouds, and storms over remote regions, enabling real-time tracking of phenomena like tropical cyclones with resolutions down to 0.25 km. Navy and Marine Corps Meteorological and Oceanographic (METOC) teams forecast conditions for amphibious assaults and flight operations, incorporating space weather data to predict ionospheric disruptions affecting communications and GPS accuracy during missile launches. These capabilities extend to probabilistic modeling for ensemble predictions, assessing risks like icing on aircraft at altitudes above 30,000 feet or turbulence impacting drone swarms.

Strategically, meteorological intelligence informs broader doctrines, such as synchronizing operations with seasonal patterns—e.g., avoiding winter campaigns in temperate zones where snow depths exceeding 1 meter can immobilize armored units—or leveraging polar-orbiting data for denied-access environments. In contested spaces, adversaries may target weather assets, underscoring the need for resilient, distributed sensing networks. While speculative concepts like advanced weather modification persist in doctrinal discussions, current applications prioritize predictive superiority over alteration, grounded in empirical observation rather than unproven interventions.

Limitations and Controversies

Inherent Predictability Constraints

The atmosphere exhibits chaotic dynamics, characterized by sensitive dependence on initial conditions, which imposes fundamental limits on deterministic weather prediction regardless of computational advances. This phenomenon, first demonstrated by Edward Lorenz in the early 1960s through a simplified model of atmospheric convection, reveals that minute perturbations in starting states—such as rounding errors in numerical computations—can amplify exponentially, leading to divergent trajectories in model outputs. In Lorenz's experiments, rerunning a simulation with slightly altered initial values (e.g., 0.506 instead of 0.506127) produced entirely different long-term results, illustrating deterministic nonperiodicity in nonlinear systems.

Lorenz's 1969 analysis further quantified these constraints, estimating an inherent predictability barrier for large-scale atmospheric flows on the order of two weeks, beyond which errors grow uncontrollably due to nonlinear instabilities and the vast number of degrees of freedom in the system. Empirical studies confirm this horizon: skillful deterministic forecasts for mid-latitude synoptic systems typically extend to 9–10 days, with potential extensions to 14 days under optimal conditions, but rapid error saturation occurs thereafter as small-scale uncertainties cascade upward. Medium- to long-range forecasts spanning more than one to two weeks thus carry inherent uncertainty, with details, particularly precipitation intensity and exact timing, continuing to shift in successive model runs as the forecast time approaches. For smaller mesoscale features, like thunderstorms, predictability shrinks to hours or a day, reflecting the upscale cascade of uncertainty from unresolved eddies and convection.

These limits stem from the atmosphere's Lyapunov instability, where positive Lyapunov exponents indicate exponential divergence rates, rendering long-range exactitude impossible without perfect initial data—an unattainable ideal given observational gaps and measurement errors. Consequently, operational forecasting shifts to probabilistic methods after the deterministic horizon, sampling uncertainties to estimate likelihoods rather than precise outcomes. While model resolution and data assimilation have extended practical skill within the chaos-limited window, the core barrier persists, as confirmed by reanalyses showing no secular increase in the fundamental predictability timescale.
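Lorenz's demonstration is easy to reproduce with his 1963 three-variable convection model. The sketch below integrates two copies of the system whose initial states differ by one part in a million and prints how far apart they drift; the integrator, step size, and perturbation size are arbitrary choices made for illustration.

```python
import numpy as np

def lorenz63_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz (1963) three-variable convection model."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(s, dt=0.01):
    """One classical fourth-order Runge-Kutta step."""
    k1 = lorenz63_rhs(s)
    k2 = lorenz63_rhs(s + 0.5 * dt * k1)
    k3 = lorenz63_rhs(s + 0.5 * dt * k2)
    k4 = lorenz63_rhs(s + dt * k3)
    return s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

def integrate(s, n_steps, dt=0.01):
    for _ in range(n_steps):
        s = rk4_step(s, dt)
    return s

a = np.array([1.0, 1.0, 1.0])
b = np.array([1.0 + 1e-6, 1.0, 1.0])     # perturbed by one part in a million
for n in (500, 1000, 1500, 2500):
    sep = np.linalg.norm(integrate(a, n) - integrate(b, n))
    print(f"model time {n * 0.01:5.1f}: separation = {sep:.3e}")   # grows roughly exponentially
```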

Model Limitations and Accuracy Issues

Numerical weather prediction (NWP) models rely on solving discretized equations of atmospheric dynamics, but inherent limitations arise from the chaotic nature of the atmosphere, as identified by Edward Lorenz in his 1963 computational experiments demonstrating sensitive dependence on initial conditions, where minute perturbations amplify into significant forecast divergences over time. This "butterfly effect" imposes a fundamental predictability horizon, with deterministic skill typically degrading sharply beyond 7–10 days for mid-latitude weather patterns, as small errors in initial observations—often on the order of observational noise or unresolved sub-grid processes—grow exponentially due to nonlinear instabilities. For instance, anomaly correlation coefficients (ACC) for 500 hPa geopotential height forecasts from leading models like ECMWF's Integrated Forecasting System often exceed 0.90 at 24-hour lead times but fall below 0.60 by day 5 and approach 0.50 (conventionally regarded as the limit of useful skill) by day 10, reflecting this error amplification.

Parameterization schemes for sub-grid-scale phenomena, such as convection, cloud microphysics, and boundary-layer turbulence, introduce systematic biases since these processes occur below model grid resolutions—typically 9–25 km for operational global models—necessitating empirical approximations that cannot fully capture causal interactions like convective triggering or cloud-radiation feedbacks. These approximations contribute to underprediction of extreme events; for example, during the 10 August 2020 Midwest U.S. derecho, which produced winds over 150 km/h and roughly $12 billion in damages, operational NWP models from NOAA and ECMWF failed to anticipate the event's intensity and path due to inadequate representation of convective organization and insufficient initial-condition accuracy from sparse observations. Similarly, tropical cyclone track forecasts exhibit errors averaging 100–200 km at 48 hours in global models, stemming from deficiencies in vortex initialization and ocean-atmosphere parameterizations, despite decades of refinement.

Data assimilation challenges exacerbate inaccuracies, as integrating heterogeneous observations (e.g., satellites, radiosondes) into model states via methods like 4D-Var or ensemble Kalman filters is computationally intensive and prone to imbalances, particularly in data-sparse regions like oceans or polar areas, leading to forecast busts where models diverge from reality within hours for convective-scale events. U.S. global NWP systems, such as the Global Forecast System, have lagged European counterparts in skill metrics by 1–2 days of predictability since the early 2000s, a gap attributed to resource constraints, slower adoption of advanced physics schemes, and gaps in workforce expertise for high-resolution ensemble configurations. Verification studies highlight that models often struggle with tropical convection and large-scale teleconnections, like the Madden-Julian Oscillation's interaction with the extratropics, failing to reproduce observed variability due to incomplete representation of stratospheric-tropospheric coupling. Overall, while short-range (0–3 day) forecasts achieve high accuracy for synoptic features, medium-range reliability drops, underscoring the need for probabilistic approaches to quantify uncertainty rather than overreliance on deterministic outputs.
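The anomaly correlation coefficient cited above is a simple pattern statistic. The sketch below computes an uncentered ACC on toy 500 hPa height fields (synthetic arrays, not real analyses); operational verification additionally removes area means and applies latitude weighting, which this illustration omits.

```python
import numpy as np

def anomaly_correlation(forecast: np.ndarray, analysis: np.ndarray,
                        climatology: np.ndarray) -> float:
    """
    Uncentered anomaly correlation coefficient: the pattern correlation between
    forecast and verifying-analysis anomalies, both relative to the same
    climatology.  Values near 1 indicate a well-placed pattern; ~0.5-0.6 is
    conventionally treated as the limit of useful synoptic skill.
    """
    fa = forecast - climatology
    aa = analysis - climatology
    return float(np.sum(fa * aa) / np.sqrt(np.sum(fa**2) * np.sum(aa**2)))

rng = np.random.default_rng(1)
clim = 5500.0 + rng.normal(0, 50, (10, 10))          # toy climatological heights (m)
truth = clim + rng.normal(0, 60, (10, 10))           # verifying analysis
good_fc = truth + rng.normal(0, 20, (10, 10))        # forecast with small errors
poor_fc = clim + rng.normal(0, 60, (10, 10))         # forecast no better than climatology
print("ACC (skilful):  ", round(anomaly_correlation(good_fc, truth, clim), 2))
print("ACC (unskilful):", round(anomaly_correlation(poor_fc, truth, clim), 2))
```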

Public Perception and Attribution Debates

Public perception of forecast accuracy in meteorology is generally favorable for short-term predictions, with surveys and verification statistics indicating that five-day forecasts achieve approximately 90% accuracy and seven-day forecasts around 80%, reflecting substantial improvements over decades—the UK's Met Office, for instance, reports that four-day forecasts today match the reliability of one-day forecasts from 30 years ago. However, public trust can be undermined by perceived inconsistencies across sources like apps and official outlets, with studies showing that users' assessments of forecast accuracy and consistency directly correlate with overall confidence in those tools, often leading to selective reliance on preferred providers.

Attribution debates center on linking specific extreme weather events—such as hurricanes, floods, or heatwaves—to anthropogenic climate change, where probabilistic event attribution methods estimate how global warming alters the likelihood or intensity of such occurrences, rather than establishing direct causation. Opinion surveys reveal widespread attribution to climate change, particularly for temperature-related extremes like heatwaves and wildfires, with majorities of respondents in 2025 viewing recent events as influenced by warming, though less consensus exists for precipitation-driven disasters like floods. These perceptions are amplified by media narratives, which frequently frame events as "unprecedented" or directly "caused" by human emissions, despite attribution science emphasizing increased likelihood rather than inevitability. Critics argue that such attributions can overstate anthropogenic influences due to reliance on climate models that struggle with internal variability and rare-event statistics, potentially conflating correlation with causation in ways that ignore historical precedents of extremes under cooler conditions. For instance, probabilistic approaches can yield high-confidence claims of "made more likely" for events fitting model projections, but these are vulnerable to biases in model tuning and may not robustly account for natural oscillations like the Atlantic Multidecadal Oscillation, leading to accusations of advocacy-driven interpretations that prioritize alarm over empirical caution. Ethical concerns also arise in communicating these probabilities to the public, as simplified messaging risks eroding trust when events occur without clear climate signals, fostering skepticism toward meteorology and climate science alike. Mainstream institutions, often aligned with consensus views, may underemphasize these uncertainties, contributing to polarized debates in which empirical data on long-term trends—such as the absence of a clear increase in U.S. hurricane landfalls despite warming—clash with attribution narratives.

Recent Advances

AI and Machine Learning Integration

Machine learning techniques, particularly deep neural networks, have been integrated into meteorological forecasting to emulate atmospheric dynamics, surpassing traditional numerical weather prediction (NWP) models in speed and accuracy for medium-range global forecasts. These data-driven approaches, collectively termed machine learning weather prediction (MLWP), train on extensive reanalysis datasets such as ERA5, capturing nonlinear relationships without explicit physical equations and enabling predictions in minutes on standard hardware, compared with hours on supercomputers for NWP. By 2023, models such as Google's GraphCast demonstrated superiority over the European Centre for Medium-Range Weather Forecasts' (ECMWF) Integrated Forecasting System (IFS) on 90% of 1,380 verification targets for 10-day forecasts, including improved predictions of tropical cyclone tracks and atmospheric rivers.

Operational adoption accelerated in 2025, when ECMWF launched its Artificial Intelligence Forecasting System (AIFS) as the first fully data-driven global model in production use. AIFS, which employs graph neural networks in its encoder and decoder, generates deterministic and ensemble forecasts four times daily, matching or exceeding IFS performance in upper-air variables while running about 1,000 times faster; its ensemble variant became operational on July 1, 2025, under ECMWF's open-data policy. Similarly, NVIDIA's FourCastNet evolved to version 3 in July 2025, delivering probabilistic 15-day ensemble forecasts at 0.25° resolution in 64 seconds and outperforming ECMWF ensembles on wind and other key metrics through geometric machine learning on spherical grids. These systems facilitate rapid ensemble generation for probabilistic forecasting, enhancing the uncertainty quantification critical for severe-weather alerts.

Beyond core prediction, AI/ML augments NWP via post-processing, nowcasting, and data assimilation; for instance, NOAA's Global Ensemble Forecast System (GEFS) integrates ML surrogates such as FourCastNet for improved initial conditions, yielding higher ensemble quality. Probabilistic models such as GenCast (December 2024) extend this by generating 15-day ensembles more reliably than operational counterparts, particularly for extratropical cyclones. Challenges persist in extreme-event predictability and physical interpretability, yet empirical validations show MLWP reducing errors by 10-20% in key variables over baselines, driven by scalable training on petabyte-scale data.
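
A minimal sketch of how such data-driven forecasts are typically produced and verified, assuming a generic single-step emulator rather than any real system such as GraphCast or AIFS: the emulator is rolled out autoregressively in fixed time steps and scored with a latitude-weighted error metric of the kind used in the comparisons cited above. The names toy_step, autoregressive_rollout, and latitude_weighted_rmse are illustrative, not part of any published model.

```python
import numpy as np

def latitude_weighted_rmse(forecast, truth, lats_deg):
    """RMSE over a lat-lon grid, weighted by cos(latitude) as in common
    global verification scores (e.g., 500 hPa geopotential height skill)."""
    weights = np.cos(np.deg2rad(lats_deg))[:, None]      # shape (nlat, 1)
    weights = weights / weights.mean()
    return float(np.sqrt((weights * (forecast - truth) ** 2).mean()))

def autoregressive_rollout(step_fn, initial_state, n_steps):
    """Feed each predicted state back in as the next input, the way
    data-driven global models advance a forecast in fixed increments."""
    states = [initial_state]
    for _ in range(n_steps):
        states.append(step_fn(states[-1]))
    return states

# Stand-in for a trained neural emulator: damped persistence on a toy grid.
def toy_step(state):
    return 0.99 * state

nlat, nlon = 181, 360
lats = np.linspace(-90.0, 90.0, nlat)
rng = np.random.default_rng(1)
analysis_now = rng.standard_normal((nlat, nlon))        # initial condition
verifying_analysis = rng.standard_normal((nlat, nlon))  # synthetic "truth" at day 10

forecast_states = autoregressive_rollout(toy_step, analysis_now, n_steps=40)  # 40 x 6 h
score = latitude_weighted_rmse(forecast_states[-1], verifying_analysis, lats)
print(f"Latitude-weighted RMSE of the 10-day toy forecast: {score:.3f}")
```

In real MLWP systems the single step is a trained network operating on dozens of atmospheric variables, and the same rollout-and-score procedure underlies the head-to-head skill comparisons against IFS mentioned above.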

Enhanced Resolution and Data Assimilation

Enhanced resolution in numerical weather prediction (NWP) models refers to the reduction of grid spacing, enabling the simulation of smaller-scale atmospheric phenomena, such as convective storms and orographic effects, that coarser grids cannot resolve. Recent upgrades have pushed global and regional models toward sub-10 km horizontal resolutions; for instance, the European Centre for Medium-Range Weather Forecasts (ECMWF) Integrated Forecasting System (IFS) increased its medium-range ensemble forecast resolution from 18 km (TCo639) to 9 km (TCo1279) in Cycle 48r1, implemented in 2023, unifying resolutions across the deterministic and ensemble components and improving tropical cyclone track and intensity forecasts, with position errors reduced by up to 10%. Similarly, the U.S. National Oceanic and Atmospheric Administration's (NOAA) High-Resolution Rapid Refresh (HRRR) model operates at 3 km resolution, providing hourly updates with radar data assimilation to support convection-allowing forecasts out to 18-48 hours and capturing mesoscale features such as thunderstorms more accurately than legacy models. These advancements stem from increased computational power and refined physics parameterizations, allowing models to explicitly resolve processes previously treated statistically, though they demand proportionally denser observations for initialization to avoid instability.

Data assimilation (DA) techniques integrate diverse observations, such as satellite radiances, radar reflectivities, and surface measurements, into model initial conditions, with recent progress focusing on hybrid variational-ensemble methods to handle the volume of data required by high-resolution grids. Four-dimensional variational (4D-Var) assimilation minimizes a cost function over a time window to optimize the initial state, while ensemble Kalman filters (EnKF) use ensemble statistics to update states with observations; the two show comparable performance in operational settings, with EnKF offering advantages in flow-dependent background-error covariances for nonlinear dynamics. ECMWF's operational 4D-Var system, for example, assimilates data over 12-hour windows at resolutions matching its forecast grids, incorporating innovations such as all-sky microwave radiances for cloudy regions. NOAA's 2025 ten-year DA strategy emphasizes fully coupled assimilation, integrating atmosphere, ocean, and land components continuously to enhance predictability in high-resolution contexts.

Machine learning enhancements have accelerated DA efficiency: the Artificial Intelligence Data Assimilation Framework (ADAF) leverages neural networks to generate analysis fields, improving 0-6 hour forecasts in AI-based models over traditional radar-DA initializations. Similarly, the FuXi Weather system, introduced in 2025, employs ML for global forecasting with cycling DA of multi-satellite observations, achieving sub-kilometer effective resolution in targeted downscaling while reducing computational costs compared with physics-based NWP. These methods address DA's computational scalability for resolutions approaching 1 km, where traditional approaches falter because of the "curse of dimensionality," though validation against independent observations remains essential to mitigate the overfitting risks inherent in data-driven components. Empirical evaluations indicate that coupled 4D-Var/EnKF hybrids yield 5-15% error reductions in short-range precipitation forecasts over standalone methods, particularly for severe weather events.
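
A minimal sketch of the ensemble Kalman filter analysis step described above, using a stochastic (perturbed-observation) EnKF with a linear observation operator and synthetic numbers; operational hybrid 4D-Var/EnKF systems add localization, inflation, and far more complex observation operators, none of which are shown here.

```python
import numpy as np

rng = np.random.default_rng(42)
n_state, n_ens = 60, 30
obs_idx = np.arange(0, n_state, 2)            # observe every second variable
n_obs = obs_idx.size

H = np.zeros((n_obs, n_state))                # linear observation operator
H[np.arange(n_obs), obs_idx] = 1.0
R = 0.25 * np.eye(n_obs)                      # observation-error covariance

truth = rng.standard_normal(n_state)
obs = H @ truth + rng.multivariate_normal(np.zeros(n_obs), R)

# Background ensemble: a biased forecast mean plus member-to-member spread,
# so the ensemble spread is comparable to the actual forecast error.
forecast_error = rng.standard_normal(n_state)
ensemble = (truth + forecast_error)[:, None] + rng.standard_normal((n_state, n_ens))

def enkf_update(ensemble, obs, H, R, rng):
    """Stochastic EnKF analysis: the background covariance is estimated from
    the ensemble itself, giving the flow-dependent statistics noted above."""
    members = ensemble.shape[1]
    anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
    Pf = anomalies @ anomalies.T / (members - 1)             # sample covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)           # Kalman gain
    perturbed_obs = obs[:, None] + rng.multivariate_normal(
        np.zeros(obs.size), R, size=members).T
    return ensemble + K @ (perturbed_obs - H @ ensemble)

analysis = enkf_update(ensemble, obs, H, R, rng)

def rmse(x): return float(np.sqrt(((x - truth) ** 2).mean()))
print("background-mean RMSE:", round(rmse(ensemble.mean(axis=1)), 3))
print("analysis-mean RMSE:  ", round(rmse(analysis.mean(axis=1)), 3))
```

On average the analysis mean sits closer to the truth than the background mean at the observed locations, but with only 30 members and no localization the small ensemble produces spurious long-range correlations, which is one reason operational centres combine ensemble covariances with variational methods in the hybrids mentioned above.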
