History of climate change science

John Tyndall's ratio spectrophotometer (drawing from 1861) measured how much infrared radiation was absorbed and emitted by various gases filling its central tube.[1] Such measurements furthered understanding of the greenhouse effect that underlies global warming and climate change.

The history of the scientific discovery of climate change began in the early 19th century when ice ages and other natural changes in paleoclimate were first suspected and the natural greenhouse effect was first identified. In the late 19th century, scientists first argued that human emissions of greenhouse gases could change Earth's energy balance and climate. The existence of the greenhouse effect, while not named as such, was proposed as early as 1824 by Joseph Fourier.[2] The argument and the evidence were further strengthened by Claude Pouillet in 1827 and 1838. In 1856 Eunice Newton Foote demonstrated that the warming effect of the sun is greater for air with water vapor than for dry air, and the effect is even greater with carbon dioxide.[3][4]

John Tyndall was the first to measure the infrared absorption and emission of various gases and vapors. From 1859 onwards, he showed that the effect was due to a very small proportion of the atmosphere, with the main gases having no effect, and was largely due to water vapor, though small percentages of hydrocarbons and carbon dioxide had a significant effect.[5] The effect was more fully quantified by Svante Arrhenius in 1896, who made the first quantitative prediction of global warming due to a hypothetical doubling of atmospheric carbon dioxide.

In the 1960s, the evidence for the warming effect of carbon dioxide gas became increasingly convincing. Scientists also discovered that human activities that generated atmospheric aerosols (e.g., "air pollution") could have cooling effects as well (later referred to as global dimming). Other theories for the causes of global warming were also proposed, involving forces from volcanism to solar variation. During the 1970s, scientific understanding of global warming greatly increased.

By the 1990s, as a result of improvements in the accuracy of computer models and observational work confirming the Milankovitch theory of the ice ages, a consensus position formed. It became clear that greenhouse gases were deeply involved in most climate changes and that human-caused emissions were bringing discernible global warming.

Since the 1990s, scientific research on climate change has expanded across multiple disciplines, improving understanding of causal relations, links with historical data, and the ability to measure and model climate change. Research during this period has been summarized in the Assessment Reports by the Intergovernmental Panel on Climate Change, with the First Assessment Report coming out in 1990.

Prior to the 20th century


Regional changes, antiquity through 19th century


From ancient times, people suspected that the climate of a region could change over the course of centuries. For example, Theophrastus, a pupil of the Ancient Greek philosopher Aristotle in the 4th century BC, told how the draining of marshes had made a particular locality more susceptible to freezing, and speculated that lands became warmer when the clearing of forests exposed them to sunlight. In the 1st century BC, the Roman writer and architect Vitruvius wrote about climate in relation to housing architecture and how to choose locations for cities.[6][7] Renaissance European and later scholars saw that deforestation, irrigation, and grazing had altered the lands around the Mediterranean since ancient times; they thought it plausible that these human interventions had affected the local weather.[8][9] In his book published in 1088, the Northern Song dynasty Chinese scholar and statesman Shen Kuo argued that climate changed gradually over centuries, after ancient petrified bamboos were found preserved underground in the arid northern region of Yanzhou (modern-day Yan'an, Shaanxi province), far from the warmer, wetter areas of China where bamboo typically grows.[10][11]

The 18th and 19th-century conversion of Eastern North America from forest to croplands brought obvious change within a human lifetime. From the early 19th century, many believed the transformation was altering the region's climate—probably for the better. When farmers in America, dubbed "sodbusters", took over the Great Plains, they held that "rain follows the plow".[12][13] Other experts disagreed, and some argued that deforestation caused rapid rainwater run-off and flooding, and could even result in reduced rainfall. European academics, suggesting that the temperate zones inhabited by the "Caucasian race" were naturally superior for the spread of civilization, proffered that the Orientals of the Ancient Near East had heedlessly converted their once lush lands into impoverished deserts.[14]

Meanwhile, national weather agencies had begun to compile masses of reliable observations of temperature, rainfall, and the like. When these figures were analyzed, they showed many rises and dips, but no steady long-term change. By the end of the 19th century, scientific opinion had turned decisively against any belief in a human influence on climate. And whatever the regional effects, few imagined that humans could affect the climate of the planet as a whole.[14]

Paleo-climate change and theories of its causes, 19th century

Erratics, boulders deposited by glaciers far from any existing glaciers, led geologists to the conclusion that climate had changed in the past.
Joseph Fourier
James Croll

From the mid-17th century, naturalists attempted to reconcile mechanical philosophy with theology, initially within a biblical timescale. By the late 18th century, there was increasing acceptance of prehistoric epochs. Geologists found evidence of a succession of geological ages with climate changes. There were various competing theories about these changes; Buffon proposed that the Earth had begun as an incandescent globe and was very gradually cooling. James Hutton, whose ideas of cyclic change over huge periods were later dubbed uniformitarianism, was among those who found signs of past glacial activity in places too warm for glaciers in modern times.[15]

In 1815 Jean-Pierre Perraudin described for the first time how glaciers might be responsible for the giant boulders seen in alpine valleys. As he hiked in the Val de Bagnes, he noticed giant granite rocks that were scattered around the narrow valley. He knew that it would take an exceptional force to move such large rocks. He also noticed how glaciers left stripes on the land and concluded that it was the ice that had carried the boulders down into the valleys.[16]

His idea was initially met with disbelief. Jean de Charpentier wrote, "I found his hypothesis so extraordinary and even so extravagant that I considered it as not worth examining or even considering."[17] Despite Charpentier's initial rejection, Perraudin eventually convinced Ignaz Venetz that it might be worth studying. Venetz convinced Charpentier, who in turn convinced the influential scientist Louis Agassiz that the glacial theory had merit.[16]

Agassiz developed a theory of what he termed "Ice Age"—when glaciers covered Europe and much of North America. In 1837 Agassiz was the first to scientifically propose that the Earth had been subject to a past ice age.[18] William Buckland had been a leading proponent in Britain of flood geology, later dubbed catastrophism, which accounted for erratic boulders and other "diluvium" as relics of the Biblical flood. This was strongly opposed by Charles Lyell's version of Hutton's uniformitarianism and was gradually abandoned by Buckland and other catastrophist geologists. A field trip to the Alps with Agassiz in October 1838 convinced Buckland that features in Britain had been caused by glaciation, and both he and Lyell strongly supported the ice age theory which became widely accepted by the 1870s.[15]

Before the concept of ice ages was proposed, Joseph Fourier in 1824 reasoned based on physics that Earth's atmosphere kept the planet warmer than would be the case in a vacuum. Fourier recognized that the atmosphere transmitted visible light waves efficiently to the earth's surface. The earth then absorbed visible light and emitted infrared radiation in response, but the atmosphere did not transmit infrared efficiently, which therefore increased surface temperatures. He also suspected that human activities could influence the radiation balance and Earth's climate, although he focused primarily on land-use changes. In an 1827 paper, Fourier stated,[19]

The establishment and progress of human societies, the action of natural forces, can notably change, and in vast regions, the state of the surface, the distribution of water and the great movements of the air. Such effects are able to make to vary, in the course of many centuries, the average degree of heat; because the analytic expressions contain coefficients relating to the state of the surface and which greatly influence the temperature.
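
Fourier's qualitative conclusion can be illustrated with the modern zero-dimensional energy balance, a back-of-the-envelope formalization he did not have (the solar constant and albedo values below are present-day figures). Equating absorbed sunlight with blackbody emission gives the temperature of an airless Earth:

$$
\sigma T_e^4 = \frac{S_0(1-\alpha)}{4}
\;\Rightarrow\;
T_e = \left(\frac{1361 \times (1-0.30)}{4\sigma}\right)^{1/4} \approx 255\ \mathrm{K},
$$

about 33 K colder than the observed mean surface temperature of roughly 288 K; the difference is the natural greenhouse effect that Fourier's reasoning anticipated.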

Fourier's work built on previous discoveries: in 1681 Edme Mariotte noted that glass, though transparent to sunlight, obstructs radiant heat.[20][21] Around 1774 Horace Bénédict de Saussure showed that non-luminous warm objects emit infrared heat, and used a glass-topped insulated box to trap and measure heat from sunlight.[22][23]

The physicist Claude Pouillet proposed in 1838 that water vapor and carbon dioxide might trap infrared and warm the atmosphere, but there was still no experimental evidence of these gases absorbing heat from thermal radiation.[24]

Eunice Newton Foote recognized carbon dioxide's heat-capturing effect in 1856, appreciating its implications for the planet.[3]

The warming effect of sunlight on different gases was examined in 1856 by Eunice Newton Foote, who described her experiments using glass tubes exposed to sunlight. The warming effect of the sun was greater for compressed air than for an evacuated tube, and greater for moist air than dry air. "Thirdly, the highest effect of the sun's rays I have found to be in carbonic acid gas." (carbon dioxide) She continued: "An atmosphere of that gas would give to our earth a high temperature; and if, as some suppose, at one period of its history, the air had mixed with it a larger proportion than at present, an increased temperature from its action, as well as from an increased weight, must have necessarily resulted." Her work was presented by Prof. Joseph Henry at the American Association for the Advancement of Science meeting in August 1856, was described in a brief note by the then-journalist David Ames Wells, and was published later that year in the American Journal of Science and Arts. Few noticed the paper, and it was only rediscovered in the 21st century.[25][3][26][27]

John Tyndall took Fourier's work one step further in 1859 when he built an apparatus to investigate the absorption of infrared radiation in different gases. He found that water vapor, hydrocarbons like methane (CH4), and carbon dioxide (CO2) strongly block the radiation. He understood that without these gases the planet would rapidly freeze.[28][29]

Some scientists suggested that ice ages and other great climate changes were due to changes in the amount of gases emitted in volcanism. But that was only one of many possible causes. Another obvious possibility was solar variation. Shifts in ocean currents also might explain many climate changes. For changes over millions of years, the raising and lowering of mountain ranges would change patterns of both winds and ocean currents. Or perhaps the climate of a continent had not changed at all, but it had grown warmer or cooler because of polar wander (the North Pole shifting to where the Equator had been or the like). There were dozens of theories.

For example, in the mid-19th century, James Croll published calculations of how the gravitational pulls of the Sun, Moon, and planets subtly affect the Earth's motion and orientation. The inclination of the Earth's axis and the shape of its orbit around the Sun oscillate gently in cycles lasting tens of thousands of years. During some periods the Northern Hemisphere would get slightly less sunlight during the winter than it would get during other centuries. Snow would accumulate, reflecting sunlight and leading to a self-sustaining ice age.[17][30] Most scientists, however, found Croll's ideas—and every other theory of climate change—unconvincing.

First calculations of greenhouse effect, 1896

In 1896 Svante Arrhenius calculated the effect of a doubling of atmospheric carbon dioxide to be an increase in surface temperatures of 5–6 degrees Celsius.
T. C. Chamberlin
This 1902 article attributes to Svante Arrhenius a theory that coal combustion could eventually lead to human extinction.[31]
This 1912 article, earlier published in Popular Mechanics, succinctly describes the greenhouse effect, explaining how burning coal creates carbon dioxide that causes climate change.[32]

By the late 1890s, Samuel Pierpont Langley, along with Frank W. Very,[33] had attempted to determine the surface temperature of the Moon by measuring infrared radiation leaving the Moon and reaching the Earth.[34] The angle of the Moon in the sky when a scientist took a measurement determined how much CO2 and water vapor the Moon's radiation had to pass through to reach the Earth's surface, resulting in weaker measurements when the Moon was low in the sky. This result was unsurprising given that scientists had known about infrared radiation absorption for decades.

In 1896 Svante Arrhenius used Langley's observations of increased infrared absorption where Moon rays pass through the atmosphere at a low angle, encountering more carbon dioxide (CO2), to estimate an atmospheric cooling effect from a future decrease of CO2. He realized that the cooler atmosphere would hold less water vapor (another greenhouse gas) and calculated the additional cooling effect. He also realized the cooling would increase snow and ice cover at high latitudes, making the planet reflect more sunlight and thus further cool down, as James Croll had hypothesized. Overall Arrhenius calculated that cutting CO2 in half would suffice to produce an ice age. He further calculated that a doubling of atmospheric CO2 would give a total warming of 5–6 degrees Celsius.[35]
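
The logarithmic character of Arrhenius's result is worth making explicit: each doubling (or halving) of CO2 changes the radiative forcing by roughly the same amount. In modern notation (using a present-day coefficient from later work, not Arrhenius's own numbers), the forcing is often approximated as

$$
\Delta F \approx 5.35 \,\ln\frac{C}{C_0}\ \mathrm{W\,m^{-2}},
$$

so halving CO2 ($C/C_0 = 1/2$) gives a forcing equal in magnitude and opposite in sign to doubling it, which is why the same calculation served Arrhenius for both an ice age and a warming scenario.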

Further, Arrhenius' colleague Arvid Högbom, who was quoted at length in Arrhenius' 1896 study On the Influence of Carbonic Acid in the Air upon the Temperature of the Earth,[36] had been attempting to quantify natural sources of CO2 emissions for purposes of understanding the global carbon cycle. Högbom found that estimated carbon production from industrial sources in the 1890s (mainly coal burning) was comparable with the natural sources.[37] Arrhenius saw that this human emission of carbon would eventually lead to a warming energy imbalance. However, because of the relatively low rate of CO2 production in 1896, Arrhenius thought the warming would take thousands of years, and he expected it would be beneficial to humanity.[37][38] In 1908, citing the ever-increasing rate of fuel use, he revised this prediction to hundreds of years, and he continued to expect that the warming would benefit humanity.[39]

In 1899 Thomas Chrowder Chamberlin developed at length the idea that climate changes could result from changes in the concentration of atmospheric carbon dioxide.[40] Chamberlin wrote in his 1899 book, An Attempt to Frame a Working Hypothesis of the Cause of Glacial Periods on an Atmospheric Basis:

By the investigations of Tyndall, Lecher and Pretner, Keller, Roentgen, and Arrhenius, it has been shown that the carbon dioxide and water vapor of the atmosphere have remarkable power of absorbing and temporarily retaining heat rays, while the oxygen, nitrogen, and argon of the atmosphere possess this power in a feeble degree only. It follows that the effect of the carbon dioxide and water vapor is to blanket the earth with a thermally absorbent envelope. ... The general results assignable to a greatly increased or a greatly reduced quantity of atmospheric carbon dioxide and water may be summarized as follows:

  • a. An increase, by causing a larger absorption of the sun's radiant energy, raises the average temperature, while a reduction lowers it. The estimate of Dr. Arrhenius, based upon an elaborate mathematical discussion of the observations of Professor Langley, is that an increase of the carbon dioxide to the amount of two or three times the present content would elevate the average temperature 8° or 9 °C and would bring on a mild climate analogous to that which prevailed in the Middle Tertiary age. On the other hand, a reduction of the quantity of carbon dioxide in the atmosphere to an amount ranging from 55 to 62 per cent of the present content would reduce the average temperature 4° or 5 °C, which would bring on a glaciation comparable to that of the Pleistocene period.
  • b. A second effect of increase and decrease in the amount of atmospheric carbon dioxide is the equalization, on the one hand, of surface temperatures, or their differentiation on the other. [...][41]

The term "greenhouse effect" for this warming was introduced by Nils Gustaf Ekholm in 1901.[42][43]

20th century onwards

The impact of the greenhouse effect on climate was presented to the public early in the 20th century, as succinctly described in this 1912 Popular Mechanics article.

Paleoclimates and sunspots, early 1900s to 1950s


Arrhenius's calculations were disputed and subsumed into a larger debate over whether atmospheric changes had caused the ice ages. Experimental attempts to measure infrared absorption in the laboratory seemed to show that increasing CO2 levels made little difference, and also found significant overlap between absorption by CO2 and absorption by water vapor, all of which suggested that increasing carbon dioxide emissions would have little climatic effect. These early experiments were later found to be insufficiently accurate, given the instrumentation of the time. Many scientists also thought that the oceans would quickly absorb any excess carbon dioxide.[37]

Other theories of the causes of climate change fared no better. The principal advances were in observational paleoclimatology, as scientists in various fields of geology worked out methods to reveal ancient climates. In 1929, Wilmot H. Bradley found that annual varves of clay laid down in lake beds showed climate cycles. Andrew Ellicott Douglass saw strong indications of climate change in tree rings. Noting that the rings were thinner in dry years, he reported climate effects from solar variations, particularly in connection with the 17th-century dearth of sunspots (the Maunder Minimum) noticed previously by William Herschel and others. Other scientists, however, found good reason to doubt that tree rings could reveal anything beyond random regional variations. The value of tree rings for climate study was not solidly established until the 1960s.[44][45]

Through the 1930s the most persistent advocate of a solar-climate connection was astrophysicist Charles Greeley Abbot. By the early 1920s, he had concluded that the solar "constant" was misnamed: his observations showed large variations, which he connected with sunspots passing across the face of the Sun. He and a few others pursued the topic into the 1960s, convinced that sunspot variations were a main cause of climate change. Other scientists were skeptical.[44][45] Nevertheless, attempts to connect the solar cycle with climate cycles were popular in the 1920s and 1930s. Respected scientists announced correlations that they insisted were reliable enough to make predictions. Sooner or later, every prediction failed, and the subject fell into disrepute.[46]

Milutin Milanković

Meanwhile, Milutin Milankovitch, building on James Croll's theory, improved the tedious calculations of the varying distances and angles of the Sun's radiation as the Sun and Moon gradually perturbed the Earth's orbit. Some observations of varves (layers seen in the mud covering the bottom of lakes) matched the prediction of a Milankovitch cycle lasting about 21,000 years. However, most geologists dismissed the astronomical theory, for they could not fit Milankovitch's timing to the accepted sequence, which had only four ice ages, all of them much longer than 21,000 years.[47]

In 1938 Guy Stewart Callendar attempted to revive Arrhenius's greenhouse-effect theory. Callendar presented evidence that both temperature and the CO2 level in the atmosphere had been rising over the past half-century, and he argued that newer spectroscopic measurements showed that the gas was effective in absorbing infrared in the atmosphere. Nevertheless, most scientific opinion continued to dispute or ignore the theory.[48]

Increasing concern, 1950s–1960s

Charles Keeling receiving the National Medal of Science from George W. Bush in 2001

Better spectrography in the 1950s showed that CO2 and water vapor absorption lines did not overlap completely. Climatologists also realized that little water vapor was present in the upper atmosphere. Both developments showed that the CO2 greenhouse effect would not be overwhelmed by water vapor.[49][37]

In 1955, Hans Suess's carbon-14 isotope analysis showed that CO2 released from fossil fuels was not immediately absorbed by the ocean. In 1956, Gilbert Plass published the results of his landmark research on the relationship between atmospheric CO2 and global average temperature. His results indicated that a doubling of CO2 would warm the planet by 3.6 °C.[50] In 1957, a better understanding of ocean chemistry led Roger Revelle to realize that the ocean surface layer had only a limited ability to absorb carbon dioxide; he predicted that atmospheric CO2 levels would therefore rise, as Charles David Keeling's measurements later proved.[51] By the late 1950s, more scientists were arguing that carbon dioxide emissions could be a problem, with some projecting in 1959 that CO2 would rise 25% by the year 2000, with potentially "radical" effects on climate.[37]

At the 1959 centennial celebration of the American oil industry, organized by the American Petroleum Institute and the Columbia Graduate School of Business, Edward Teller said: "It has been calculated that a temperature rise corresponding to a 10 per cent increase in carbon dioxide will be sufficient to melt the icecap and submerge New York. ... At present the carbon dioxide in the atmosphere has risen by 2 per cent over normal. By 1970, it will be perhaps 4 per cent, by 1980, 8 per cent, by 1990, 16 per cent if we keep on with our exponential rise in the use of purely conventional fuels."[52] In 1960 Charles David Keeling demonstrated that the level of CO2 in the atmosphere was in fact rising. Concern mounted year by year along with the rise of the "Keeling Curve" of atmospheric CO2.

Another clue to the nature of climate change came in the mid-1960s from analysis of deep-sea cores by Cesare Emiliani and analysis of ancient corals by Wallace Broecker and collaborators. Rather than four long ice ages, they found a large number of shorter ones in a regular sequence. It appeared that the timing of ice ages was set by the small orbital shifts of the Milankovitch cycles. While the matter remained controversial, some began to suggest that the climate system is sensitive to small changes and can readily be flipped from one stable state into another.[47]

Scientists meanwhile began using computers to develop more sophisticated versions of Arrhenius's calculations. In 1967, taking advantage of the ability of digital computers to integrate absorption curves numerically, Syukuro Manabe and Richard Wetherald made the first detailed calculation of the greenhouse effect incorporating convection (the "Manabe-Wetherald one-dimensional radiative-convective model").[53][54] They found that, in the absence of unknown feedbacks such as changes in clouds, a doubling of carbon dioxide from the current level would result in approximately 2 °C increase in global temperature. For this, and related work, Manabe was awarded a share of the 2021 Nobel Prize in Physics.[55]
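
The flavor of such a calculation can be conveyed by a far cruder zero-dimensional energy balance, a minimal sketch rather than the Manabe-Wetherald model itself (the ~3.7 W/m² forcing for doubled CO2 is a modern standard value):

    SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
    S0 = 1361.0        # solar constant, W m^-2
    ALBEDO = 0.30      # planetary albedo

    def equilibrium_temp(extra_forcing: float = 0.0) -> float:
        """Effective radiating temperature (K) that balances absorbed
        sunlight plus an imposed radiative forcing (W m^-2)."""
        absorbed = S0 * (1.0 - ALBEDO) / 4.0
        return ((absorbed + extra_forcing) / SIGMA) ** 0.25

    t0 = equilibrium_temp()        # ~255 K with no extra forcing
    t2x = equilibrium_temp(3.7)    # adding the ~3.7 W m^-2 of doubled CO2
    print(f"no-feedback warming for doubled CO2: {t2x - t0:.2f} K")  # ~1.0 K

Without feedbacks the response is only about 1 °C; the roughly 2 °C that Manabe and Wetherald obtained reflects the water-vapor and lapse-rate physics their radiative-convective treatment added.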

By the 1960s, aerosol pollution ("smog") had become a serious local problem in many cities, and some scientists began to consider whether the cooling effect of particulate pollution could affect global temperatures. Scientists were unsure whether the cooling effect of particulate pollution or warming effect of greenhouse gas emissions would predominate, but regardless, began to suspect that human emissions could be disruptive to climate in the 21st century if not sooner. In his 1968 book The Population Bomb, Paul R. Ehrlich wrote, "the greenhouse effect is being enhanced now by the greatly increased level of carbon dioxide ... [this] is being countered by low-level clouds generated by contrails, dust, and other contaminants ... At the moment we cannot predict what the overall climatic results will be of our using the atmosphere as a garbage dump."[56]

Efforts to establish a global temperature record that began in 1938 culminated in 1963, when J. Murray Mitchell presented one of the first up-to-date temperature reconstructions. His study drew on data from over 200 weather stations, collected by the World Weather Records,[57] which he used to calculate latitudinal average temperatures. In his presentation, Mitchell showed that, beginning in 1880, global temperatures increased steadily until 1940; after that, a multi-decade cooling trend emerged. Mitchell's work contributed to the overall acceptance of a possible global cooling trend.[58][59]
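
The core arithmetic of such a latitudinal reconstruction is area weighting. A minimal sketch (with placeholder values, not Mitchell's data): each latitude band's mean is weighted by the cosine of its latitude, which is proportional to the band's share of Earth's surface area.

    import math

    def global_mean(band_lats_deg, band_means):
        """Area-weighted global mean from per-latitude-band averages."""
        weights = [math.cos(math.radians(lat)) for lat in band_lats_deg]
        return sum(w * t for w, t in zip(weights, band_means)) / sum(weights)

    # Hypothetical anomalies (degC) for 10-degree bands from 85S to 85N:
    lats = list(range(-85, 90, 10))
    anomalies = [0.2 if abs(lat) > 60 else 0.1 for lat in lats]  # placeholders
    print(round(global_mean(lats, anomalies), 3))

Weighting all bands equally would overstate the influence of sparse polar stations; the cosine weights correct for this.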

In 1965, the landmark report "Restoring the Quality of Our Environment" by U.S. President Lyndon B. Johnson's Science Advisory Committee warned of the harmful effects of fossil fuel emissions:

The part that remains in the atmosphere may have a significant effect on climate; carbon dioxide is nearly transparent to visible light, but it is a strong absorber and back radiator of infrared radiation, particularly in the wave lengths from 12 to 18 microns; consequently, an increase of atmospheric carbon dioxide could act, much like the glass in a greenhouse, to raise the temperature of the lower air.[40]

The committee used the recently available global temperature reconstructions and carbon dioxide data from Charles David Keeling and colleagues to reach their conclusions. They declared the rise of atmospheric carbon dioxide levels to be the direct result of fossil fuel burning. The committee concluded that human activities were sufficiently large to have significant, global impact, extending beyond the areas where those activities take place. "Man is unwittingly conducting a vast geophysical experiment", the committee wrote.[59]

In 1966, Nobel Prize winner Glenn T. Seaborg, Chairperson of the United States Atomic Energy Commission, warned of the climate crisis: "At the rate we are currently adding carbon dioxide to our atmosphere (six billion tons a year), within the next few decades the heat balance of the atmosphere could be altered enough to produce marked changes in the climate--changes which we might have no means of controlling even if by that time we have made great advances in our programs of weather modification."[60]

1969 memorandum to Richard Nixon's White House Counsel John Ehrlichman:

    Carbon dioxide in the atmosphere has the effect of a pane of glass in a greenhouse. The CO2 content is normally in a stable cycle, but recently man has begun to introduce instability through the burning of fossil fuels. ... It is now pretty clearly agreed that the CO2 content will rise 25% by 2000. This could increase the average temperature near the earth's surface by 7 degrees Fahrenheit. This in turn could raise the level of the sea by 10 feet. Goodbye New York. Goodbye Washington, for that matter. We have no data on Seattle.

Daniel Patrick Moynihan, White House Urban Affairs Director, September 17, 1969, discussing the greenhouse effect and urging the building of a monitoring system[61]

A 1968 study by the Stanford Research Institute for the American Petroleum Institute noted:[62]

If the earth's temperature increases significantly, a number of events might be expected to occur, including the melting of the Antarctic ice cap, a rise in sea levels, warming of the oceans, and an increase in photosynthesis. ... Revelle makes the point that man is now engaged in a vast geophysical experiment with his environment, the earth. Significant temperature changes are almost certain to occur by the year 2000 and these could bring about climatic changes.

In 1969, NATO was the first international body to attempt to deal with climate change at an international level. There were plans to establish a hub for the organization's research and initiatives in the civil area, dealing with environmental topics[63] such as acid rain and the greenhouse effect. The suggestion by US President Richard Nixon found little success with the administration of German Chancellor Kiesinger, but the topics and the preparatory work done on the NATO proposal by German authorities gained international momentum (see e.g. the 1972 Stockholm United Nations Conference on the Human Environment), as the government of Willy Brandt applied them in the civil sphere instead.[63]

Also in 1969, Mikhail Budyko published a theory on the ice–albedo feedback, a foundational element of what is today known as Arctic amplification.[64] The same year a similar model was published by William D. Sellers.[65] Both studies attracted significant attention, since they hinted at the possibility of a runaway positive feedback within the global climate system.[66]
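
The mechanism can be sketched with a Budyko-style zero-dimensional energy balance (illustrative parameters, not the published 1969 model). Because albedo rises when the planet is cold enough to ice over, the same balance admits both a warm and a frozen equilibrium, which is the runaway possibility these studies highlighted.

    S0 = 1361.0          # solar constant, W m^-2
    A, B = 203.3, 2.09   # linearized outgoing longwave: A + B*T (T in degC)

    def albedo(temp_c):
        return 0.62 if temp_c < -10.0 else 0.30  # icy vs. largely ice-free

    def step(temp_c, dt=0.2):
        """Relax temperature toward energy balance (heat capacity is
        folded into the dimensionless pseudo-time step)."""
        absorbed = S0 * (1.0 - albedo(temp_c)) / 4.0
        return temp_c + dt * (absorbed - (A + B * temp_c)) / B

    for start in (20.0, -40.0):   # warm start vs. deep-freeze start
        t = start
        for _ in range(200):
            t = step(t)
        print(f"start {start:+.0f} C -> equilibrium {t:+.1f} C")

Starting warm, this toy model settles near +17 °C; starting frozen, near -35 °C: two stable states separated by the ice-albedo feedback.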

A 1969 memo from White House Urban Affairs Director Daniel Patrick Moynihan tried to impress the office of U.S. President Nixon with the projected severity of the greenhouse effect.[61] However, action was not taken, even after a 20 December 1971 initiative from the Office of Science and Technology, "Determine the Climate Change Caused by Man and Nature".[67] In the initiative, Nixon's science advisors recommended an international network for monitoring climate trends and human impact on it.[68]

Scientists increasingly predict warming, 1970s

Mean temperature anomalies during the period 1965 to 1975 with respect to the average temperatures from 1937 to 1946. This dataset was not available at the time.

In the early 1970s, evidence that aerosols were increasing worldwide and that the global temperature series showed cooling encouraged Reid Bryson and some others to warn of the possibility of severe cooling. The questions and concerns put forth by Bryson and others launched a new wave of research into the factors behind such global cooling.[59] Meanwhile, the new evidence that the timing of ice ages was set by predictable orbital cycles suggested that the climate would gradually cool over thousands of years. Several scientific panels from this time period concluded that more research was needed to determine whether warming or cooling was likely, indicating that the trend in the scientific literature had not yet become a consensus.[69][70][71] For the century ahead, however, a survey of the scientific literature from 1965 to 1979 found 7 articles predicting cooling and 44 predicting warming (many other articles on climate made no prediction), and the warming articles were cited much more often in subsequent scientific literature. With nearly six times more studies predicting warming than cooling, research into warming and greenhouse gases held the greater emphasis, suggesting that scientists' concern lay largely with warming as they turned their attention toward the greenhouse effect.[59]

John Sawyer published the study Man-made Carbon Dioxide and the "Greenhouse" Effect in 1972.[72] He summarized the scientific knowledge of the time: the anthropogenic attribution of the rise in carbon dioxide, its distribution, and its exponential growth, findings which still hold today. Additionally, he accurately predicted the rate of global warming for the period between 1972 and 2000.[73][74]

The increase of 25% CO2 expected by the end of the century therefore corresponds to an increase of 0.6 °C in the world temperature – an amount somewhat greater than the climatic variation of recent centuries. – John Sawyer, 1972

The first satellite records compiled in the early 1970s showed snow and ice cover over the Northern Hemisphere to be increasing, prompting further scrutiny into the possibility of global cooling.[59] J. Murray Mitchell updated his global temperature reconstruction in 1972, which continued to show cooling.[59][75] However, scientists determined that the cooling observed by Mitchell was not a global phenomenon. Global averages were changing, in large part due to unusually severe winters experienced by Asia and some parts of North America in 1972 and 1973, but these changes were mostly confined to the Northern Hemisphere; in the Southern Hemisphere, the opposite trend was observed. The severe winters, however, pushed the issue of global cooling into the public eye.[59]

The mainstream news media at the time exaggerated the warnings of the minority who expected imminent cooling. For example, in 1975, Newsweek magazine published a story titled "The Cooling World" that warned of "ominous signs that the Earth's weather patterns have begun to change".[76] The article drew on studies documenting the increasing snow and ice in regions of the Northern Hemisphere and concerns and claims by Reid Bryson that global cooling by aerosols would dominate carbon dioxide warming.[59] The article continued by stating that evidence of global cooling was so strong that meteorologists were having "a hard time keeping up with it".[76] On 23 October 2006, Newsweek issued an update stating that it had been "spectacularly wrong about the near-term future".[77] Nevertheless, this article and others like it had long-lasting effects on public perception of climate science.[59]

1977 Memorandum to the President:
Release of Fossil CO2 and the
Possibility of a Catastrophic Climate Change


    (within 60 years:) Because of the "greenhouse effect" of atmospheric CO2 the increased concentration will induce a global climatic warming of anywhere from 0.5 to 5 °C.
    ... The potential effect on the environment of a climatic fluctuation of such rapidity could be catastrophic and calls for an impact assessment of unprecedented importance and difficulty. A rapid climatic change may result in large scale crop failures at a time when an increased world population taxes agriculture to the limits of productivity.
    ... The urgency of the problem derives from our inability to shift rapidly to non-fossil fuel sources once the climatic effects become evident not long after the year 2000; ...

Frank Press, 7 July 1977[78]
Chief science adviser to U.S. President Carter

Such media coverage heralding the coming of a new ice age resulted in beliefs that this was the consensus among scientists, despite this not being reflected by the scientific literature. As it became apparent that scientific opinion was in favor of global warming, the public began to express doubt over how trustworthy the science was.[59] The argument that scientists were wrong about global cooling, and therefore may be wrong about global warming, has been called "the Ice Age Fallacy" by Time author Bryan Walsh.[79]

In the first two "Reports for the Club of Rome" in 1972[80] and 1974,[81] anthropogenic climate change, from CO2 increase as well as from waste heat, was mentioned. About the latter, John Holdren wrote in a study[82] cited in the first report "that global thermal pollution is hardly our most immediate environmental threat. It could prove to be the most inexorable, however, if we are fortunate enough to evade all the rest". Simple global-scale estimates[83] that have recently been updated[84] and confirmed by more refined model calculations[85][86] show noticeable contributions from waste heat to global warming after the year 2100, if its growth rates are not strongly reduced (below the average of 2% per year that has prevailed since 1973).
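
An order-of-magnitude check makes the timescale plausible (these round numbers are illustrative, not taken from the cited studies): global primary energy use of roughly 18 TW spread over Earth's 5.1 × 10^14 m² surface is a mean heat flux of about 0.035 W/m², and sustained 2% annual growth compounds as

$$
0.035 \times 1.02^{130} \approx 0.46\ \mathrm{W\,m^{-2}}
$$

after 130 years, approaching a tenth of the ~3.7 W/m² forcing of doubled CO2, which is why waste heat only becomes noticeable well after 2100 on such trajectories.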

Evidence for warming accumulated. By 1975, Manabe and Wetherald had developed a three-dimensional global climate model that gave a roughly accurate representation of the current climate. Doubling CO2 in the model's atmosphere gave a roughly 2 °C rise in global temperature.[87] Several other kinds of computer models gave similar results: it was impossible to make a model that gave something resembling the actual climate and not have the temperature rise when the CO2 concentration was increased.

In a separate development, an analysis of deep-sea cores published in 1976 by Nicholas Shackleton and colleagues showed that the dominating influence on ice age timing came from a 100,000-year Milankovitch orbital change. This was unexpected, since the change in sunlight in that cycle was slight. The result emphasized that the climate system is driven by feedbacks, and thus is strongly susceptible to small changes in conditions.[17]

A 1977 memo (see quote box) from President Carter's chief science adviser Frank Press warned of the possibility of catastrophic climate change.[78] However, other issues—such as known harms to health from pollutants, and avoiding energy dependence on other nations—seemed more pressing and immediate.[78] Energy Secretary James Schlesinger advised that "the policy implications of this issue are still too uncertain to warrant Presidential involvement and policy initiatives", and the fossil fuel industry began sowing doubt about climate science.[78]

The 1979 World Climate Conference (12 to 23 February) of the World Meteorological Organization concluded "it appears plausible that an increased amount of carbon dioxide in the atmosphere can contribute to a gradual warming of the lower atmosphere, especially at higher latitudes. ... It is possible that some effects on a regional and global scale may be detectable before the end of this century and become significant before the middle of the next century."[88]

In July 1979 the United States National Research Council published a report,[89] concluding (in part):

When it is assumed that the CO2 content of the atmosphere is doubled and statistical thermal equilibrium is achieved, the more realistic of the modeling efforts predict a global surface warming of between 2 °C and 3.5 °C, with greater increases at high latitudes. ... we have tried but have been unable to find any overlooked or underestimated physical effects that could reduce the currently estimated global warmings due to a doubling of atmospheric CO2 to negligible proportions or reverse them altogether.

One week before President Carter left office, the White House Council on Environmental Quality (CEQ) issued reports including a suggestion to limit global average temperature to 2 °C above preindustrial levels, a goal later agreed to in the 2015 Paris climate accord.[90]

Consensus begins to form, 1980–1988

James Hansen during his 1988 testimony to Congress, which alerted the public to the dangers of global warming

By the early 1980s, the slight cooling trend from 1945 to 1975 had stopped. Aerosol pollution had decreased in many areas due to environmental legislation and changes in fuel use, and it became clear that the cooling effect from aerosols was not going to increase substantially while carbon dioxide levels were progressively increasing.

Hansen and others published the 1981 study Climate impact of increasing atmospheric carbon dioxide, and noted:

It is shown that the anthropogenic carbon dioxide warming should emerge from the noise level of natural climate variability by the end of the century, and there is a high probability of warming in the 1980s. Potential effects on climate in the 21st century include the creation of drought-prone regions in North America and central Asia as part of a shifting of climatic zones, erosion of the West Antarctic ice sheet with a consequent worldwide rise in sea level, and opening of the fabled Northwest Passage.[91]

In 1982, Greenland ice cores drilled by Hans Oeschger, Willi Dansgaard, and collaborators revealed dramatic temperature oscillations in the space of a century in the distant past.[92] The most prominent of the changes in their record corresponded to the violent Younger Dryas climate oscillation seen in shifts in types of pollen in lake beds all over Europe. Evidently drastic climate changes were possible within a human lifetime.

In 1973 James Lovelock speculated that chlorofluorocarbons (CFCs) could have a global warming effect. In 1975 V. Ramanathan found that a CFC molecule could be 10,000 times more effective in absorbing infrared radiation than a carbon dioxide molecule, making CFCs potentially important despite their very low concentrations in the atmosphere. While most early work on CFCs focused on their role in ozone depletion, by 1985 Ramanathan and others showed that CFCs together with methane and other trace gases could have nearly as important a climate effect as increases in CO2. In other words, global warming would arrive twice as fast as had been expected.[93]

Since the 1980s, global average surface temperatures during a given decade have almost always been higher than the average temperature in the preceding decade.

In 1985 a joint UNEP/WMO/ICSU Conference on the "Assessment of the Role of Carbon Dioxide and Other Greenhouse Gases in Climate Variations and Associated Impacts" concluded that greenhouse gases "are expected" to cause significant warming in the next century and that some warming is inevitable.[94]

Meanwhile, ice cores drilled by a Franco-Soviet team at the Vostok Station in Antarctica showed that CO2 and temperature had gone up and down together in wide swings through past ice ages. This confirmed the CO2-temperature relationship in a manner entirely independent of computer climate models, strongly reinforcing the emerging scientific consensus. The findings also pointed to powerful biological and geochemical feedbacks.[95]

In January 1986, the German Physical Society published a warning of an impending climate catastrophe.[96][97]

In June 1988, James E. Hansen made one of the first assessments that human-caused warming had already measurably affected global climate.[98] Shortly after, a "World Conference on the Changing Atmosphere: Implications for Global Security" gathered hundreds of scientists and others in Toronto. They concluded that the changes in the atmosphere due to human pollution "represent a major threat to international security and are already having harmful consequences over many parts of the globe", and declared that by 2005 the world would be well-advised to push its emissions some 20% below the 1988 level.[99]

The 1980s saw important breakthroughs with regard to global environmental challenges. Ozone depletion was mitigated by the Vienna Convention (1985) and the Montreal Protocol (1987). Acid rain was mainly regulated on national and regional levels.

Increased consensus amongst scientists: 1988 to present

Scientific consensus on causation: Academic studies of scientific agreement on human-caused global warming among climate experts (2010–2015) reflect that the level of consensus correlates with expertise in climate science.[100] A 2019 study found scientific consensus to be at 100%,[101] and a 2021 study concluded that consensus exceeded 99%.[102] Another 2021 study found that 98.7% of climate experts indicated that the Earth is getting warmer mostly because of human activity.[103]

In 1988 the WMO established the Intergovernmental Panel on Climate Change with the support of the UNEP. The IPCC continues its work through the present day, and issues a series of Assessment Reports and supplemental reports that describe the state of scientific understanding at the time each report is prepared. Scientific developments during this period are summarized about once every five to six years in the IPCC Assessment Reports, which were published in 1990 (First Assessment Report), 1995 (Second Assessment Report), 2001 (Third Assessment Report), 2007 (Fourth Assessment Report), 2013/2014 (Fifth Assessment Report), and 2021 (Sixth Assessment Report).[104] The 2001 report was the first to state positively that the observed global temperature increase was "likely" to be due to human activities. The conclusion was influenced especially by the so-called hockey stick graph showing an abrupt historical temperature rise simultaneous with the rise of greenhouse gas emissions, and by observations of changes in ocean heat content that had a "signature" matching the pattern that computer models calculated for the effect of greenhouse warming. By the time of the 2021 report, scientists had much additional evidence. Above all, measurements of paleotemperatures from several eras in the distant past, and the record of temperature change since the mid-19th century, could be matched against measurements of CO2 levels to provide independent confirmation of supercomputer model calculations.

These developments depended crucially on weather satellites, other satellites, and huge globe-spanning observation programs. Since the 1990s, research into historical and modern climate change has expanded rapidly. International coordination was provided by the World Climate Research Programme (established in 1980) and was increasingly oriented around providing input to the IPCC reports. Measurement networks such as the Global Ocean Observing System, Integrated Carbon Observation System, and NASA's Earth Observing System enabled monitoring of the causes and effects of ongoing change. Research also broadened, linking many fields such as Earth sciences, behavioral sciences, economics, and security.

Relative importance of human activity versus natural causes


A historically important question in climate change research has regarded the relative importance of human activity and natural causes during the period of instrumental record. In the 1995 Second Assessment Report (SAR), the IPCC made the widely quoted statement that "The balance of evidence suggests a discernible human influence on global climate". The phrase "balance of evidence" suggested the (English) common-law standard of proof required in civil as opposed to criminal courts: not as high as "beyond reasonable doubt". In 2001 the Third Assessment Report (TAR) refined this, saying "There is new and stronger evidence that most of the warming observed over the last 50 years is attributable to human activities".[105] The 2007 Fourth Assessment Report (AR4) strengthened this finding:

  • "Anthropogenic warming of the climate system is widespread and can be detected in temperature observations taken at the surface, in the free atmosphere and in the oceans. Evidence of the effect of external influences, both anthropogenic and natural, on the climate system has continued to accumulate since the TAR."[106]

Other findings of the IPCC Fourth Assessment Report include:

  • "It is extremely unlikely (<5%)[107] that the global pattern of warming during the past half century can be explained without external forcing (i.e., it is inconsistent with being the result of internal variability), and very unlikely[107] that it is due to known natural external causes alone. The warming occurred in both the ocean and the atmosphere and took place at a time when natural external forcing factors would likely have produced cooling."[108]
  • "From new estimates of the combined anthropogenic forcing due to greenhouse gases, aerosols, and land surface changes, it is extremely likely (>95%)[107] that human activities have exerted a substantial net warming influence on climate since 1750."[109]
  • "It is virtually certain[107] that anthropogenic aerosols produce a net negative radiative forcing (cooling influence) with a greater magnitude in the Northern Hemisphere than in the Southern Hemisphere."[109]

Some results from scientific studies on this issue are listed below:

  • In 1996, in a paper in Nature titled "A search for human influences on the thermal structure of the atmosphere", Benjamin D. Santer et al. wrote: "The observed spatial patterns of temperature change in the free atmosphere from 1963 to 1987 are similar to those predicted by state-of-the-art climate models incorporating various combinations of changes in carbon dioxide, anthropogenic sulphate aerosol and stratospheric ozone concentrations. The degree of pattern similarity between models and observations increases through this period. It is likely that this trend is partially due to human activities, although many uncertainties remain, particularly relating to estimates of natural variability."[110]
  • A 2002 paper in the Journal of Geophysical Research says "Our analysis suggests that the early twentieth century warming can best be explained by a combination of warming due to increases in greenhouse gases and natural forcing, some cooling due to other anthropogenic forcings, and a substantial, but not implausible, contribution from internal variability. In the second half of the century we find that the warming is largely caused by changes in greenhouse gases, with changes in sulphates and, perhaps, volcanic aerosol offsetting approximately one third of the warming."[111][112]
  • A 2005 review of detection and attribution studies by the International Ad hoc Detection and Attribution Group[113] found that "natural drivers such as solar variability and volcanic activity are at most partially responsible for the large-scale temperature changes observed over the past century, and that a large fraction of the warming over the last 50 yr can be attributed to greenhouse gas increases. Thus, the recent research supports and strengthens the IPCC Third Assessment Report conclusion that 'most of the global warming over the past 50 years is likely due to the increase in greenhouse gases.'"
  • Barnett and colleagues (2005) say that the observed warming of the oceans "cannot be explained by natural internal climate variability or solar and volcanic forcing, but is well simulated by two anthropogenically forced climate models," concluding that "it is of human origin, a conclusion robust to observational sampling and model differences".[114]
  • Two papers in the journal Science in August 2005[115][116] resolved the problem, evident at the time of the TAR, of tropospheric temperature trends. The UAH version of the record contained errors, and there is evidence of spurious cooling trends in the radiosonde record, particularly in the tropics. See satellite temperature measurements for details, and the 2006 US CCSP report.[117]

Extreme event attribution

Extreme event attribution methods generally involve applying climate change models to scenarios in both the "real" world that is experiencing global warming, and a simulated world that does not suffer the drivers of global warming. Differences between the results of the two processes—especially the frequency, intensity and impacts of extreme weather events—are then analyzed to arrive at the attribution result.[118][119][120]

Extreme event attribution (EEA), also known as attribution science, was developed in the early decades of the 21st century.[121] EEA uses climate models to identify and quantify the role that human-caused climate change plays in the frequency, intensity, duration, and impacts of specific individual extreme weather events.[122][123] Results of attribution studies allow scientists and journalists to make statements such as, "this weather event was made at least n times more likely by human-caused climate change" or "this heatwave was made m degrees hotter than it would have been in a world without global warming" or "this event was effectively impossible without climate change".[124]

A common EEA approach uses model simulations to compare events in two worlds—a first world with human-caused greenhouse gas emissions and a second world without such emissions—and attributing differences to human influence.[125] Greater computing power of the 2000s allowed weather to be simulated over and over again, and conceptual breakthroughs in the early to mid 2010s[119] enabled attribution science to detect the effects of climate change on some events with high confidence.[121] Scientists use methods that have already been peer reviewed, allowing "rapid attribution" studies to be published within a "news cycle" time frame.[119]
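
At its core, many such studies reduce to a probability ratio between the two simulated worlds. A minimal sketch, assuming one already has ensembles of some event index from factual and counterfactual runs (the synthetic normal distributions below are placeholders, not real model output):

    import numpy as np

    def probability_ratio(factual, counterfactual, threshold):
        """PR = P(event | with emissions) / P(event | without)."""
        p1 = np.mean(np.asarray(factual) >= threshold)
        p0 = np.mean(np.asarray(counterfactual) >= threshold)
        return p1 / p0 if p0 > 0 else float("inf")

    rng = np.random.default_rng(0)
    factual = rng.normal(1.2, 1.0, 100_000)        # warmed-world index
    counterfactual = rng.normal(0.0, 1.0, 100_000) # baseline-world index
    pr = probability_ratio(factual, counterfactual, threshold=2.0)
    print(f"event made ~{pr:.1f}x more likely")    # ~9x for these inputs

A PR of 9 would underwrite a statement like "this event was made about nine times more likely by human-caused climate change"; real studies add uncertainty estimates from multiple models and methods.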

Terminology

Terms like "climate emergency" and "climate crisis" have often been used by activists, and are increasingly found in academic papers.[126]

Before the 1980s, it was unclear whether the warming effect of increased greenhouse gases was stronger than the cooling effect of airborne particulates in air pollution. Scientists used the term inadvertent climate modification to refer to human impacts on the climate at this time.[127] In the 1980s, the terms global warming and climate change became more common, often being used interchangeably.[128][129][130] Scientifically, global warming refers only to increased global average surface temperature, while climate change describes both global warming and its effects on Earth's climate system, such as precipitation changes.[127]

Climate change can also be used more broadly to include changes to the climate that have happened throughout Earth's history.[131] Global warming—used as early as 1975[132]—became the more popular term after NASA climate scientist James Hansen used it in his 1988 testimony in the U.S. Senate.[133] Since the 2000s, usage of climate change has increased.[134] Various scientists, politicians and media may use the terms climate crisis or climate emergency to talk about climate change, and may use the term global heating instead of global warming.[135][136]

Discredited theories and reconciled apparent discrepancies


Analogy of the greenhouse effect to the atmosphere


The early work of Joseph Fourier found that a greenhouse heats up mainly due to radiation trapping. This is analogous to radiation trapping in the atmosphere, leading to the term "greenhouse effect".[137]

An experiment performed by Prof. R. W. Wood in 1909 led him to reject radiation trapping, claiming that a greenhouse is heated merely due to convection blocking.[138] This theory became a widespread view in the scientific community.[139][140][141][142]

Moreover, Wood's theory has been used to reject the analogy, and to doubt the existence of a greenhouse effect in the atmosphere.[143][144][145][146]

Experiments have since discredited Wood's theory, confirming that radiation trapping is indeed the dominant cause of heating in a greenhouse. Hence the analogy is valid.[147][148][149]

Discussions around locations of temperature measurement stations

Exterior of a Stevenson screen used for temperature measurements at land stations.

There have been attempts to raise public controversy over the accuracy of the instrumental temperature record on the basis of the urban heat island effect, the quality of the surface station network, and assertions that there have been unwarranted adjustments to the temperature record.[150][151]

Weather stations that are used to compute global temperature records are not evenly distributed over the planet, and their distribution has changed over time. There were a small number of weather stations in the 1850s, and the number did not reach the current 3,000+ until the 1951 to 1990 period.[152]

The 2001 IPCC Third Assessment Report (TAR) acknowledged that the urban heat island is an important local effect, but cited analyses of historical data indicating that the effect of the urban heat island on the global temperature trend is no more than 0.05 °C (0.09 °F) degrees through 1990.[153] Peterson (2003) found no difference between the warming observed in urban and rural areas.[154]

Parker (2006) found that there was no difference in warming between calm and windy nights. Since the urban heat island effect is strongest for calm nights and is weak or absent on windy nights, this was taken as evidence that global temperature trends are not significantly contaminated by urban effects.[155] Pielke and Matsui published a paper disagreeing with Parker's conclusions.[156]
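
The logic of this test can be sketched simply (a minimal illustration with synthetic data, not Parker's dataset or code): split nighttime temperature series by a wind-speed threshold and compare the fitted trends; if urban heating contaminated the record, calm nights should warm faster.

    import numpy as np

    def trend_per_decade(years, temps):
        """Least-squares slope, converted to degC per decade."""
        return 10.0 * np.polyfit(years, temps, 1)[0]

    rng = np.random.default_rng(1)
    years = np.arange(1950, 2001)
    base = 0.012 * (years - 1950)   # synthetic 0.12 degC/decade warming signal
    calm = base + rng.normal(0, 0.1, years.size)    # calm-night series
    windy = base + rng.normal(0, 0.1, years.size)   # windy-night series
    print(trend_per_decade(years, calm), trend_per_decade(years, windy))
    # Similar slopes argue against a dominant urban heat island bias.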

In 2005, Roger A. Pielke and Stephen McIntyre criticized the US instrumental temperature record and adjustments to it, and Pielke and others criticized the poor-quality siting of a number of weather stations in the United States.[157][158] A 2010 study examined the siting of temperature stations and found that the poorly sited stations showed a slight cool bias rather than the warm bias which deniers had postulated.[159][160]

The Berkeley Earth Surface Temperature group carried out an independent assessment of land temperature records, which examined issues raised by deniers, such as the urban heat island effect, poor station quality, and the risk of data selection bias. The preliminary results, made public in October 2011, found that these factors had not biased the results obtained by NOAA, the Hadley Centre together with the Climatic Research Unit (HadCRUT) and NASA's GISS in earlier studies. The group also confirmed that over the past 50 years the land surface warmed by 0.911 °C, and their results closely matched those obtained from these earlier studies.[161][162][163][164]

Apparent discrepancy for tropospheric temperature increases in the tropics

[edit]

General circulation models and basic physical considerations predict that in the tropics the temperature of the troposphere should increase more rapidly than the temperature of the surface. A 2006 report to the U.S. Climate Change Science Program noted that models and observations agreed on this amplification for monthly and interannual time scales but not for decadal time scales in most observed data sets. Improved measurement and analysis techniques have reconciled this discrepancy: corrected buoy and satellite surface temperatures are slightly cooler and corrected satellite and radiosonde measurements of the tropical troposphere are slightly warmer.[165] Satellite temperature measurements show that tropospheric temperatures are increasing with "rates similar to those of the surface temperature", leading the IPCC to conclude in 2007 that this discrepancy is reconciled.[166]
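The expected amplification can be stated compactly as a ratio of linear trends; the tropical range below is the model expectation discussed in this subsection, written in an assumed modern shorthand rather than taken from the cited reports:

```latex
A \;=\; \frac{b_{\mathrm{troposphere}}}{b_{\mathrm{surface}}},
\qquad A_{\mathrm{tropics}}^{\mathrm{models}} \approx 1.2\text{--}1.5
```

where b denotes a least-squares temperature trend; the corrected datasets described above moved the observed ratio toward this range.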

Iris hypothesis

[edit]

The iris hypothesis was proposed by Richard Lindzen and colleagues in 2001. It suggested that increased sea surface temperature in the tropics would reduce cirrus clouds and thus allow more infrared radiation to escape from Earth's atmosphere. Lindzen's study of observed changes in cloud coverage, together with modeled effects on the infrared radiation released to space, seemed to support the hypothesis.[167] This suggested infrared leakage was hypothesized to be a negative feedback in which an initial warming would result in an overall cooling of the surface.

The idea that cirrus clouds act as an iris trapping outgoing radiation was reasonable, but it ignored the larger compensating effect of those clouds in blocking incoming sunlight, as well as the effects of changes in cloud altitude.[168]: 92 [169] Moreover, a number of errors were found in the papers.[170][171] For these reasons, the iris effect plays no role in the current scientific consensus on climate change.

Apparent "Antarctica cooling" discrepancy

[edit]

Antarctica is the coldest, driest continent on Earth, and has the highest average elevation.[172] Antarctica's dryness means the air contains little water vapor and conducts heat poorly.[173] The Southern Ocean surrounding the continent is far more effective at absorbing heat than any other ocean.[174] The presence of extensive, year-round sea ice, which has a high albedo (reflectivity), adds to the albedo of the ice sheets' own bright, white surface.[172] Antarctica's coldness makes it the only place on Earth where an atmospheric temperature inversion occurs every winter;[172] elsewhere on Earth, the atmosphere is at its warmest near the surface and becomes cooler as elevation increases. During the Antarctic winter, the surface of central Antarctica becomes cooler than the middle layers of the atmosphere,[173] so greenhouse gases trap heat in the middle atmosphere and reduce its flow toward the surface and toward space, rather than retarding the flow of heat from the lower atmosphere to the upper layers. The effect lasts until the end of the Antarctic winter.[173][172] Early climate models predicted that temperature trends over Antarctica would emerge more slowly and be more subtle than those elsewhere.[175]

There were fewer than twenty permanent weather stations across the continent and only two in its interior. Automatic weather stations were deployed relatively late, so their observational record was brief for much of the 20th century; satellite temperature measurements began in 1981 and are typically limited to cloud-free conditions. Thus, datasets representing the entire continent began to appear only at the very end of the 20th century.[176] The exception was the Antarctic Peninsula, where warming was pronounced and well-documented;[177] it was eventually found to have warmed by 3 °C (5.4 °F) since the mid-20th century.[178] Based on those limited data, several papers published in the early 2000s reported an overall cooling over continental Antarctica outside the Peninsula.[179][180] In particular, a 2002 analysis led by Peter Doran indicated stronger cooling than warming over Antarctica between 1966 and 2000, and found that the McMurdo Dry Valleys in East Antarctica had experienced cooling of 0.7 °C per decade.[181] The paper noted that its data were limited, and it still found warming over 42% of the continent.[181][182]

Nevertheless, the paper received widespread media coverage, as multiple journalists described those findings as "contradictory" to global warming,[183][184][185] which was criticized by scientists at the time.[186][187] The "controversy" around cooling of Antarctica received further attention in 2004 when Michael Crichton wrote the novel State of Fear. The novel featured a fictional conspiracy among climate scientists to fake evidence of global warming, and cited Doran's study as proof that there was no warming in Antarctica outside of the Peninsula.[188] That novel was mentioned in a 2006 US Senate hearing in support of climate change denial,[189] and Peter Doran published a statement in The New York Times decrying the misinterpretation of his work.[182] The British Antarctic Survey and NASA also issued statements affirming the strength of climate science after the hearing.[190][191]

By 2009, researchers had combined historical weather-station data with satellite measurements to create consistent temperature records going back to 1957, which demonstrated warming of more than 0.05 °C per decade across the continent, with cooling in East Antarctica offset by warming of at least 0.176 ± 0.06 °C per decade in West Antarctica.[192] That paper was widely reported on,[193][194] and subsequent research confirmed clear warming over West Antarctica in the 20th century, with the magnitude the only remaining uncertainty.[195] During 2012–2013, estimates based on WAIS Divide ice cores and revised temperature records from Byrd Station suggested a much larger West Antarctic warming of 2.4 °C (4.3 °F) since 1958, or around 0.46 °C (0.83 °F) per decade,[196][197][198][199] but some scientists continued to emphasize uncertainty.[200] In 2022, a study narrowed the warming of the central West Antarctic Ice Sheet between 1959 and 2000 to 0.31 °C (0.56 °F) per decade, and conclusively attributed it to increases in greenhouse gas concentrations caused by human activity.[201] Likewise, the strong cooling at the McMurdo Dry Valleys was confirmed to be a local trend.[202]
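As a rough consistency check on the per-decade figures above (the small difference from the quoted 0.46 °C per decade reflects regression fitting rather than simple endpoint division):

```latex
\frac{2.4\ ^{\circ}\mathrm{C}}{(2012-1958)\ \mathrm{yr}} \times 10\ \mathrm{yr/decade} \;\approx\; 0.44\ ^{\circ}\mathrm{C}\ \text{per decade}
```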

The Antarctica-wide warming trend continued after 2000, and in February 2020 the continent recorded its highest-ever temperature of 18.3 °C, exceeding the previous record of 17.5 °C from March 2015.[203] The East Antarctic interior also demonstrated clear warming between 2000 and 2020.[204][205] In particular, the South Pole warmed by 0.61 ± 0.34 °C per decade between 1990 and 2020, three times the global average.[206][207] On the other hand, changes in atmospheric circulation patterns such as the Interdecadal Pacific Oscillation (IPO) and the Southern Annular Mode (SAM) slowed or partially reversed the warming of West Antarctica, with the Antarctic Peninsula experiencing cooling from 2002.[208][209][210] While variability in those patterns is natural, past ozone depletion had also made the SAM stronger than at any point in the preceding 600 years. Starting around 2002, studies predicted a reversal in the SAM once the ozone layer began to recover following the Montreal Protocol,[211][212][213] and the observed changes are consistent with those predictions.[214]

Under the most intense climate change scenario, known as RCP8.5, models predict that Antarctic surface temperatures will rise by 3 °C (5.4 °F) by 2070[215] and by 4 °C (7.2 °F) on average by 2100, accompanied by a 30% increase in precipitation and a 30% decrease in sea ice by 2100.[216] Southern Ocean waters south of 50° S latitude would also warm by about 1.9 °C (3.4 °F) by 2070.[215] RCPs were developed in the late 2000s, and early-2020s research considers RCP8.5 much less likely[217] than more moderate scenarios such as RCP4.5, which lie between the worst-case scenario and the Paris Agreement goals.[218][219] If a low-emission scenario mostly consistent with the Paris Agreement goals is followed, Antarctica would experience surface and ocean warming of less than 1 °C (1.8 °F) by 2070, less than 15% of sea ice would be lost, and precipitation would increase by less than 10%.[215]

Solar variation

[edit]

Some climate change deniers have argued that solar variation is a significant contributor to the observed global warming, which would reduce the relative importance of human-made causes. However, this is not supported by scientific consensus on climate change. Scientists reject the notion that the warming observed in the global mean surface temperature record since about 1850 is the result of solar variations: "The observed rapid rise in global mean temperatures seen after 1985 cannot be ascribed to solar variability, whichever of the mechanisms is invoked and no matter how much the solar variation is amplified."[220]

The consensus position is that solar radiation may have increased by 0.12 W/m2 since 1750, compared to 1.6 W/m2 for the net anthropogenic forcing.[221]: 3  Already in 2001, the IPCC Third Assessment Report had found that, "The combined change in radiative forcing of the two major natural factors (solar variation and volcanic aerosols) is estimated to be negative for the past two, and possibly the past four, decades."[222]

Many studies say that the recent level of solar activity was historically high, as determined by sunspot activity and other factors; this is known as the "Modern Maximum". Solar activity could affect climate either through variation in the Sun's output or, more speculatively, through an indirect effect on the amount of cloud formation. Solanki and co-workers suggested that solar activity for the last 60 to 70 years may have been at its highest level in 8,000 years; however, they said "that solar variability is unlikely to have been the dominant cause of the strong warming during the past three decades", and concluded that "at the most 30% of the strong warming since [1970] can be of solar origin".[223] Although the paradigm of the Modern Maximum is broadly accepted,[224] its recurrence rate is still an open question,[225] and "solar activity reconstructions tell us that only a minor fraction of the recent global warming can be explained by the variable Sun."[226]

Solar activity

[edit]
The graph shows the solar irradiance without a long-term trend. The 11-year solar cycle is also visible. The temperature, in contrast, shows an upward trend.
Solar irradiance (yellow) plotted with temperature (red) since 1880.
Modeled simulation of the effect of various factors (including GHGs, Solar irradiance) singly and in combination, showing in particular that solar activity produces a small and nearly uniform warming, unlike what is observed.

The role of solar activity in climate change has also been calculated over longer time periods using "proxy" datasets, such as tree rings.[227] Models indicate that solar and volcanic forcings can explain periods of relative warmth and cold between AD 1000 and 1900, but human-induced forcings are needed to reproduce the late-20th century warming.[228]

Another line of evidence against the sun having caused recent climate change comes from looking at how temperatures at different levels in the Earth's atmosphere have changed.[229]

The US Environmental Protection Agency (US EPA, 2009) responded to public comments on climate change attribution.[230] A number of commenters had argued that recent climate change could be attributed to changes in solar irradiance. According to the US EPA (2009), this attribution was not supported by the bulk of the scientific literature. Citing the work of the IPCC (2007), the US EPA pointed to the low contribution of solar irradiance to radiative forcing since the start of the Industrial Revolution in 1750. Over this time period (1750 to 2005),[231] the estimated contribution of solar irradiance to radiative forcing was about 5% of the value of the combined radiative forcing due to increases in the atmospheric concentrations of carbon dioxide, methane and nitrous oxide (see graph opposite).
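The arithmetic behind the roughly 5% figure can be reproduced from the IPCC (2007) best-estimate forcings for 1750–2005 (solar +0.12 W/m²; CO2 +1.66, CH4 +0.48, and N2O +0.16 W/m²):

```latex
\frac{0.12}{1.66 + 0.48 + 0.16} \;=\; \frac{0.12}{2.30} \;\approx\; 5\%
```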

The role of the Sun in recent climate change has been looked at by climate scientists. Since 1978, output from the Sun has been measured by satellites[232]: 6  significantly more accurately than was previously possible from the surface. These measurements indicate that the Sun's total solar irradiance has not increased since 1978, so the warming during the past 30 years cannot be directly attributed to an increase in total solar energy reaching the Earth (see graph above, left). In the three decades since 1978, the combination of solar and volcanic activity probably had a slight cooling influence on the climate.[233]

Climate models have been used to examine the role of the Sun in recent climate change.[234] Models are unable to reproduce the rapid warming observed in recent decades when they take into account only variations in total solar irradiance and volcanic activity. Models are, however, able to simulate the observed 20th-century changes in temperature when they include all of the most important external forcings, both human influences and natural forcings. As already stated, Hegerl et al. (2007) concluded that greenhouse gas forcing had "very likely" caused most of the observed global warming since the mid-20th century. In reaching this conclusion, Hegerl et al. (2007) allowed for the possibility that climate models had underestimated the effect of solar forcing.[235]

Models and observations (see figure above, middle) show that greenhouse gases result in warming of the lower atmosphere (the troposphere) but cooling of the upper atmosphere (the stratosphere).[236] Depletion of the ozone layer by chemical refrigerants has also produced a cooling effect in the stratosphere. If the Sun were responsible for the observed warming, warming of the troposphere at the surface and warming at the top of the stratosphere would be expected, as increased solar activity would replenish ozone and oxides of nitrogen.[237] The stratosphere has a temperature gradient opposite to that of the troposphere: while tropospheric temperature falls with altitude, stratospheric temperature rises with altitude. Hadley cells are the mechanism by which ozone generated in the tropics (the region of highest UV irradiance in the stratosphere) is moved poleward. Global climate models suggest that climate change may widen the Hadley cells and push the jetstream northward, thereby expanding the tropics and resulting in warmer, drier conditions in those areas overall.[238]

Comparison with other planets

[edit]

Some have argued that the Sun is responsible for recently observed climate change.[239] Warming on Mars was quoted as evidence that global warming on Earth was being caused by changes in the Sun.[240][241][242] This has been discredited by scientists: "Wobbles in the orbit of Mars are the main cause of its climate change in the current era" (see also orbital forcing).[243] Also, there are alternative explanations of why warming had occurred on Triton, Pluto, Jupiter and Mars.[242]

Effect of cosmic rays

[edit]

The view that cosmic rays could provide the mechanism by which changes in solar activity affect climate is not supported by the literature.[244] Solomon et al. (2007)[245] state:

[..] the cosmic ray time series does not appear to correspond to global total cloud cover after 1991 or to global low-level cloud cover after 1994. Together with the lack of a proven physical mechanism and the plausibility of other causal factors affecting changes in cloud cover, this makes the association between galactic cosmic ray-induced changes in aerosol and cloud formation controversial

Studies in 2007 and 2008 found no relation between warming in recent decades and cosmic rays.[246][247] Pierce and Adams (2009)[248] used a model to simulate the effect of cosmic rays on cloud properties. They concluded that the hypothesized effect of cosmic rays was too small to explain recent climate change.[248] The authors of that study noted that their findings did not rule out a possible connection between cosmic rays and climate change, and recommended further research.[249]

Erlykin et al. (2009)[250] found that the evidence showed that connections between solar variation and climate were more likely to be mediated by direct variation of insolation rather than cosmic rays, and concluded: "Hence within our assumptions, the effect of varying solar activity, either by direct solar irradiance or by varying cosmic ray rates, must be less than 0.07 °C since 1956, i.e. less than 14% of the observed global warming." Carslaw (2009)[251] and Pittock (2009)[252] reviewed the recent and historical literature in this field and continue to find that the link between cosmic rays and climate is tenuous, though they encourage continued research.

Henrik Svensmark has suggested that the magnetic activity of the sun deflects cosmic rays, and that this may influence the generation of cloud condensation nuclei, and thereby have an effect on the climate.[253]

Past estimates of greenhouse gas emissions and temperature rises

[edit]

Previous estimates for the year 2020

[edit]

In 2011, the United Nations Environment Programme looked at how world emissions might develop out to the year 2020 depending on different policy decisions.[254]: 7  They convened 55 scientists and experts from 28 scientific groups across 15 countries. Projections assuming no new efforts to reduce emissions, the "business-as-usual" hypothetical trend,[255] suggested global emissions in 2020 of 56 gigatonnes CO2-equivalent (GtCO2-eq), with a range of 55–59 GtCO2-eq.[254]: 12  Under a different baseline in which the pledges to the Copenhagen Accord were met in their most ambitious form, projected global emissions by 2020 would still reach 50 GtCO2-eq.[256] Continuing the current trend, particularly in its low-ambition form, was expected to produce a temperature increase of 3 °C by the end of the century, estimated to bring severe environmental, economic, and social consequences.[257]

The report also considered the effect on emissions of policies put forward by UNFCCC Parties to address climate change. More stringent efforts to limit emissions led to projected global emissions in 2020 of between 49 and 52 GtCO2-eq, with a median estimate of 51 GtCO2-eq.[254]: 12  Less stringent efforts led to projected global emissions in 2020 of between 53 and 57 GtCO2-eq, with a median estimate of 55 GtCO2-eq.[254]: 12
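A minimal sketch, in Python, of how these projections compare; the scenario figures are those quoted above, while the 44 GtCO2-eq benchmark for an emissions "gap" is an illustrative assumption rather than a number from the report:

```python
# Comparison of the UNEP 2020 emission projections quoted above (GtCO2-eq).
# The 44 GtCO2-eq benchmark is an illustrative assumption, not from the report.
scenarios = {
    "business as usual":      {"median": 56, "range": (55, 59)},
    "less stringent pledges": {"median": 55, "range": (53, 57)},
    "more stringent pledges": {"median": 51, "range": (49, 52)},
}
BENCHMARK = 44  # assumed 2 degC-consistent level for 2020 (illustrative)

for name, s in scenarios.items():
    lo, hi = s["range"]
    gap = s["median"] - BENCHMARK
    print(f"{name:24s} median {s['median']} GtCO2-eq (range {lo}-{hi}), gap {gap:+d}")
```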

from Grokipedia
The history of climate change science chronicles the scientific inquiry into the physical processes driving variations in Earth's climate, originating in the early 19th century with foundational hypotheses on atmospheric heat retention and evolving through empirical experiments, theoretical modeling, and observational data analysis to contemporary understandings of the climate system and anthropogenic influences. Key early advancements include Joseph Fourier's 1824 proposal that the atmosphere functions analogously to glass in retaining terrestrial heat, preventing the planet from cooling to expected equilibrium temperatures based on solar input alone. This concept was empirically substantiated by John Tyndall's 1860s laboratory demonstrations that water vapor and carbon dioxide selectively absorb infrared radiation, identifying these gases as primary agents in the so-called greenhouse effect. Svante Arrhenius advanced quantitative insight in 1896 by calculating that halving atmospheric CO2 would reduce global mean temperature by approximately 4–5°C, while doubling it could raise it by 5–6°C, framing potential climatic impacts of industrial emissions within established physical principles.

Subsequent developments integrated astronomical and paleoclimatic evidence, with James Croll and Milutin Milanković elucidating cyclical ice-age patterns through variations in Earth's eccentricity, obliquity, and precession, emphasizing natural forcings over short-term anthropogenic perturbations. The mid-20th century saw systematic monitoring commence with Charles David Keeling's 1958 initiation of precise atmospheric CO2 measurements at Mauna Loa, revealing a steady rise from pre-industrial levels of about 280 ppm to over 400 ppm by the 2010s, correlating with fossil fuel combustion and land-use change. Mid-century debates highlighted tensions between aerosol-induced cooling hypotheses and CO2-driven warming projections, with analyses of 1970s literature indicating no consensus on imminent cooling—contrary to popularized narratives—but rather a predominance of studies anticipating net warming from greenhouse gases as aerosol effects were better quantified.

Modern climate science crystallized through general circulation models and attribution studies, notably James Hansen's 1988 congressional testimony linking observed warming to enhanced greenhouse forcing, supported by subsequent paleoclimate proxies and satellite data confirming tropospheric warming patterns inconsistent with solar variability alone. Defining characteristics include the field's reliance on first-principles physics, validated by laboratory experiments and spectral measurements from space, alongside persistent controversies over model sensitivities, natural variability attribution, and the weighting of empirical versus simulated outcomes in policy-relevant syntheses—issues compounded by institutional tendencies toward alarmist framing in academic and media outlets despite empirical discrepancies in extreme event linkages. Achievements encompass robust quantification of energy imbalances and radiative forcings, enabling probabilistic forecasts, though causal claims remain anchored in correlations strengthened by physical understanding rather than irrefutable experimentation at planetary scales.

Foundations Before the 20th Century

Observations of Regional Climate Variations from Antiquity to 19th Century

Ancient Greek and Roman scholars documented qualitative observations of regional weather patterns and variations, often linking them to agricultural outcomes and seasonal shifts. Aristotle's Meteorology (circa 350 BCE) cataloged phenomena such as changing wind directions in the Mediterranean, unusual frosts in Greece, and variations in Nile River flooding, which influenced Egyptian harvests and had been recorded annually by priests since at least 3000 BCE. These records noted periodic low floods leading to famines, such as during the reign of Sesostris III (1878–1839 BCE), attributed to drier conditions in the Ethiopian highlands. Roman writers like Pliny the Elder in Natural History (77 CE) described regional droughts in Italy and anomalous cold snaps affecting olive yields, while Vitruvius (circa 15 BCE) observed stable but variable Mediterranean climates conducive to viticulture in southern Gaul and Hispania.

In medieval Europe, documentary evidence from annals and chronicles revealed warmer conditions during the Medieval Warm Period (approximately 950–1250 CE), particularly in the North Atlantic region. Norse sagas and settlement records indicate mild summers enabling colonization of Greenland around 985 CE, with Erik the Red's accounts of grassy fjords supporting livestock, contrasting with later abandonment by 1350 CE due to advancing ice. In England, the Domesday Book (1086 CE) lists over 40 vineyards, reflecting extended growing seasons, while tree-ring-corroborated harvest records from central Europe show reduced frost days and summer temperatures averaging 1–2°C above those of the subsequent centuries. These variations were regional, with proxy-supported chronicles noting wetter conditions in the British Isles facilitating barley yields but occasional droughts in Iberia.

The onset of cooler conditions around 1300 CE marked the Little Ice Age (approximately 1300–1850 CE), with European records documenting intensified regional variability. The Great Famine of 1315–1317, chronicled in monastic annals across France, England, and Scandinavia, stemmed from prolonged rains and cold summers reducing grain harvests by up to 75% in some areas, leading to widespread starvation. Glacier advances in the Alps, noted in Swiss parish records from the 1580s onward, buried villages and blocked passes, while Dutch and English diaries described frozen rivers and canals, including the Thames frost fairs held between 1608 and 1814. The winter of 1709, among Europe's coldest of the early instrumental era, saw temperatures drop 2–4°C below average in France, killing livestock and vines, as recorded in parish ledgers. These events highlighted amplified winter severity in northwestern Europe, with volcanic eruptions like Huaynaputina (1600 CE) linked in chronicles to subsequent harvest failures.

By the 19th century, observations increasingly incorporated early instrumental data alongside qualitative reports, signaling the Little Ice Age's termination amid regional warming trends. In the Alps, Swiss and Italian surveys from the 1820s documented glacier retreats, such as the Mer de Glace shrinking by 500 meters between 1818 and 1830, attributed to milder winters in explorer journals. British meteorological logs, kept systematically from around 1770 via the Royal Society, noted fewer severe frosts in London and extended Thames navigation seasons after 1815, correlating with harvest improvements. In North America, U.S. Army forts' records from 1820–1892 captured variable Great Plains droughts, like the 1850s aridity affecting settler agriculture, while European diaries reported reduced ice cover on Baltic ports by the 1840s. These shifts underscored ongoing regional disparities, with Mediterranean areas experiencing drier summers per consular reports.

19th-Century Theories of Natural Climate Drivers

In the 19th century, scientists increasingly attributed long-term climate variations, such as ice ages, to natural astronomical forcings rather than to a secular cooling of the planet. The French mathematician Joseph Alphonse Adhémar proposed the first comprehensive astronomical theory in his 1842 book Révolutions de la mer, arguing that precession of the equinoxes reduced summer insolation in one hemisphere every 10,500 years, leading to alternating glaciations in the Northern and Southern Hemispheres through persistent snow accumulation. Adhémar's model linked orbital geometry to polar ice dynamics, providing a cyclic explanation for glacial periods observed in geological records.

James Croll, a self-educated Scottish scientist, advanced this framework in the 1860s by incorporating orbital eccentricity alongside precession and obliquity variations. In his 1864 paper published in Philosophical Magazine, Croll posited that phases of high orbital eccentricity, combined with precession, diminish Northern Hemisphere winter solar radiation, initiating ice buildup that is amplified by feedbacks including reduced heat transport via ocean currents like the Gulf Stream and enhanced ice-albedo effects. His 1875 book Climate and Time in Their Geological Relations expanded these ideas, predicting recurrent ice ages on timescales of tens to hundreds of thousands of years, aligning with stratigraphic evidence of multiple Pleistocene glaciations. Croll's theory emphasized causal chains in which initial insolation deficits trigger self-reinforcing climate shifts, influencing ocean circulation and continental ice sheets.

Solar variability also drew attention as a potential driver of shorter-term fluctuations. The discovery of the 11-year sunspot cycle by Heinrich Schwabe around 1843 spurred investigations into solar output's role in weather patterns, with astronomers such as William Herschel earlier (1801) having correlated sunspot minima with climatic impacts such as crop failures. By the late 19th century, figures including Balfour Stewart explored "cosmical meteorology", hypothesizing that solar cycles and planetary alignments modulate rainfall and temperature over decadal to centennial scales. These ideas, though speculative and lacking precise quantification, complemented orbital theories by suggesting variable total solar irradiance as a modulator of Earth's energy balance.

Volcanic eruptions were recognized for inducing transient cooling through stratospheric aerosols, as evidenced by the 1815 Tambora eruption's global temperature drop of about 0.5–1°C and the resultant 1816 "Year Without a Summer". However, such events were viewed primarily as episodic perturbations rather than systematic long-term drivers, with limited theoretical integration into broader climate cyclicity models until later analyses.

Initial Calculations of the Greenhouse Effect

In 1824, French mathematician and physicist Joseph Fourier proposed that the Earth's atmosphere functions analogously to the glass of a greenhouse by absorbing and re-emitting infrared radiation, thereby elevating surface temperatures beyond what solar input alone would produce. Fourier calculated that without this atmospheric effect, Earth's average temperature would be approximately -18°C, far colder than the observed 15°C, attributing the discrepancy to the selective absorption of heat rays by atmospheric constituents.

Experimental verification followed in 1856, when American scientist Eunice Newton Foote demonstrated that carbon dioxide and water vapor absorb solar heat more effectively than ordinary air. Using glass cylinders filled with different gases and exposed to sunlight, Foote recorded temperatures rising to 125°F in a CO2-filled tube versus 106°F in one with common air, concluding that increased atmospheric CO2 could warm the planet.

Building on this, Irish physicist John Tyndall conducted systematic laboratory experiments starting in 1859, quantifying the absorption of radiant heat by various gases using a thermopile and isolated gas tubes. Tyndall identified water vapor as the primary absorber but showed that CO2, ozone, and hydrocarbons also trap heat significantly, even in trace amounts, establishing the physical basis for atmospheric warming independent of conduction or convection.

The first quantitative estimate of the greenhouse effect's sensitivity to CO2 levels came in 1896 from Swedish chemist Svante Arrhenius, who modeled global temperature changes as a function of CO2 concentration using infrared absorption data derived from Samuel Langley's measurements. Arrhenius calculated that doubling atmospheric CO2 would raise Earth's average temperature by 5–6°C, while halving it would lower it by 4–5°C, factoring in water vapor feedback amplification. These figures, derived from energy balance considerations and geological evidence of past ice ages, represented the initial numerical assessment linking CO2 variations to planetary-scale climate shifts.
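Arrhenius's result is often restated today as a logarithmic rule, in which equal ratios of concentration produce equal temperature changes; this modern shorthand, not his original formulation, is assumed below:

```latex
\Delta T \;\approx\; S \,\log_{2}\!\left(\frac{C}{C_{0}}\right)
```

with C0 the reference concentration and S the warming per doubling (5–6 °C in his 1896 estimate); halving gives log2(1/2) = -1 and hence the corresponding cooling.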

Early to Mid-20th Century Investigations

Paleoclimate Studies and Solar Influences (1900s-1940s)

In the early 1900s, paleoclimate research advanced through the development of proxy records to reconstruct past environmental conditions, particularly focusing on glacial and interglacial periods. Gerard De Geer pioneered varve chronology, using annually layered glacial sediments to establish timelines for deglaciation in Scandinavia; he presented his initial Swedish varve chronology at the 1910 International Geological Congress, counting over 10,000 varves to date the retreat of the Scandinavian ice sheet to approximately 11,000 years ago. Concurrently, A. E. Douglass developed dendrochronology in the American Southwest, identifying annual tree-ring patterns by 1914 and linking ring widths to climatic variations, with initial applications to drought reconstruction and solar activity correlations. These proxies provided empirical data on millennial-scale climate fluctuations, emphasizing natural forcings over short-term human influences.

Milutin Milanković extended astronomical theory to paleoclimate explanation, proposing that cyclic variations in Earth's orbital eccentricity (100,000-year cycle), axial obliquity (41,000-year cycle), and precession (23,000-year cycle) modulate the distribution of insolation, driving glacial cycles. Beginning systematic calculations in 1909, he published "The Astronomical Theory of Climate Changes" in 1920, integrating mathematical models to predict glacial maxima aligned with summer insolation minima. His comprehensive "Canon of Insolation and the Ice-Age Problem" in 1941 tabulated insolation values over 650,000 years, correlating them with geological evidence of Pleistocene glaciations, though contemporary reception was mixed due to incomplete proxy data. These works privileged astronomical forcing, with Milanković estimating insolation changes of up to 100 W/m² at high latitudes as sufficient to initiate ice sheets via ice-albedo feedbacks.

Solar influences gained attention through direct measurements and proxy correlations, as researchers sought causal links to observed climate variability. Charles Greeley Abbot initiated systematic solar constant observations in 1902 at the Smithsonian Astrophysical Observatory, reporting values around 1.35 kW/m² with detected variations of 1-2% tied to sunspot cycles, arguing these modulated global temperatures. By the 1920s, Douglass's tree-ring data revealed 11-year sunspot cycles in ring widths, suggesting solar-driven precipitation and temperature shifts over centuries. Analyses of historical weather records, including Nile flood levels and European temperatures, showed positive correlations with sunspot numbers during the early 20th century, with rising solar activity from 1910-1930 coinciding with observed warming episodes. These studies underscored solar irradiance as a primary natural driver, with minimal consideration of atmospheric composition changes until later decades.

Post-WWII Atmospheric and Oceanic Research (1950s)

In the aftermath of World War II, advancements in radar, rocketry, and early computing enabled expanded atmospheric research, including routine upper-air soundings and experiments that informed broader understanding of atmospheric dynamics. These tools facilitated detailed mapping of tropospheric winds and temperature profiles, revealing patterns in global circulation such as jet streams and Hadley cells, which were critical for understanding heat transport.

A pivotal contribution came from Gilbert N. Plass, who in 1956 revived interest in the carbon dioxide theory of climate change through precise calculations. Using laboratory-measured CO2 absorption bands and accounting for atmospheric layering up to 75 km, Plass estimated that doubling atmospheric CO2 from emissions would increase global surface temperatures by approximately 3.6°C, while emphasizing that natural CO2 variations had historically influenced Pleistocene ice ages. His work integrated empirical spectral data with energy balance principles, highlighting CO2's role in trapping outgoing infrared radiation without invoking unverified feedback mechanisms.

Oceanic research advanced concurrently, with studies at the Scripps Institution of Oceanography quantifying CO2 solubility and exchange rates. In 1957, Roger Revelle and Hans E. Suess published findings that the oceans, while absorbing much of the CO2 released since the Industrial Revolution, exhibited reduced buffering capacity due to chemical equilibria involving bicarbonate and carbonate ions, limiting uptake to about half of anthropogenic emissions and allowing atmospheric accumulation over centuries. Their analysis, based on isotopic ratios and dissolution kinetics, warned that this disequilibrium represented a "large-scale experiment" with uncertain climatic consequences, as oceanic circulation timescales—estimated at 10 years for surface exchange but longer for deep waters—delayed full equilibration.

The International Geophysical Year (IGY), spanning July 1957 to December 1958, coordinated global efforts in meteorology and oceanography across 67 nations, yielding over 10,000 rocket launches for ionospheric data and extensive ship-based salinity-temperature profiles that refined models of ocean circulation. IGY datasets documented sea-ice extents and Pacific current variabilities, providing empirical baselines for assessing atmospheric-oceanic coupling, though interpretations remained focused on natural forcings like solar cycles rather than rapid anthropogenic shifts. These observations underscored the oceans' dominant role in heat storage, with surface layers absorbing 90% of shortwave radiation but mixing inefficiencies delaying deep penetration.
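The reduced buffering capacity described by Revelle and Suess was later formalized as the buffer (Revelle) factor; the notation below is a modern restatement assumed for illustration, not the 1957 paper's own:

```latex
R \;=\; \frac{\Delta p\mathrm{CO_2}/p\mathrm{CO_2}}{\Delta \mathrm{DIC}/\mathrm{DIC}} \;\approx\; 10
```

where DIC is dissolved inorganic carbon: a value of R near 10 means a 10% rise in atmospheric CO2 changes the ocean's equilibrium carbon content by only about 1%, which is why uptake lags emissions.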

Early CO2 Measurements and Simple Climate Models (1950s-1960s)

In 1956, physicist Gilbert N. Plass published calculations using improved spectroscopic data on CO2 absorption bands, estimating that a doubling of atmospheric CO2 concentration would raise Earth's average surface temperature by 3.6°C, reviving interest in CO2 as a climate driver amid debates over post-glacial warming. Plass's work incorporated radiative transfer models accounting for vertical atmospheric layers up to 75 km, predicting both upward and downward infrared fluxes influenced by CO2 variations, and suggested fossil fuel emissions could counteract natural CO2 decline from weathering processes.

The following year, oceanographer Roger Revelle and chemist Hans E. Suess analyzed the air-sea CO2 exchange, concluding that the oceans' buffering capacity—due to chemical reactions involving bicarbonate and carbonate ions—would limit absorption of anthropogenic CO2 to far less than previously assumed, with only a fraction of industrial emissions dissolving rapidly into surface waters. They estimated the atmospheric lifetime of excess CO2 molecules at around 10 years before oceanic dissolution, but emphasized that equilibrium uptake would be incomplete, potentially leading to measurable buildup; this prompted calls for precise monitoring, as prior assumptions of rapid oceanic neutralization had underestimated fossil fuel impacts.

Responding to such theoretical concerns, geochemist Charles David Keeling initiated systematic, high-precision measurements of atmospheric CO2 at the Mauna Loa Observatory in Hawaii, starting on March 29, 1958, with the first reading registering 313 parts per million (ppm) by volume. Selected for its remote, high-altitude location minimizing local contamination, the site employed non-dispersive infrared analyzers calibrated against known gas standards, revealing not only a seasonal cycle driven by Northern Hemisphere vegetation growth but also an unequivocal upward trend—rising from about 315 ppm in 1958 to over 320 ppm by the mid-1960s—confirming anthropogenic accumulation beyond natural variability.

By the late 1960s, simple climate models began incorporating these insights. In 1967, meteorologists Syukuro Manabe and Richard T. Wetherald developed one of the earliest radiative-convective atmospheric models, simulating a single vertical column of air in radiative-convective equilibrium with water-vapor feedback; it projected a global temperature increase of approximately 2°C for doubled CO2, roughly aligning with Plass's estimate while accounting for amplified effects from increased water vapor and cloud adjustments. This model represented Earth's atmosphere in discrete layers, solving equations for heat transport, lapse rates, and greenhouse trapping, marking a shift from purely radiative calculations to dynamical simulations that highlighted CO2's role in perturbing energy balance.
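A minimal zero-dimensional sketch of the kind of equilibrium calculation discussed above, using the now-standard logarithmic forcing approximation; the parameter values are illustrative assumptions, not those of Plass (1956) or Manabe and Wetherald (1967):

```python
import math

# Zero-dimensional equilibrium sketch using the logarithmic forcing
# approximation. Parameter values are illustrative assumptions; this is
# not Plass's (1956) or Manabe and Wetherald's (1967) model.
F2X = 3.7      # W/m^2 of forcing per CO2 doubling (assumed modern value)
LAMBDA = 0.8   # K of equilibrium warming per W/m^2 of forcing (assumed)

def equilibrium_warming(c_ppm: float, c0_ppm: float = 280.0) -> float:
    """Equilibrium temperature change when CO2 goes from c0_ppm to c_ppm."""
    forcing = F2X * math.log2(c_ppm / c0_ppm)  # radiative forcing, W/m^2
    return LAMBDA * forcing                    # temperature response, K

print(equilibrium_warming(560.0))  # doubling: about 3.0 K with these values
print(equilibrium_warming(315.0))  # 1958 Mauna Loa level: about 0.5 K
```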

Late 20th Century: Emergence of Anthropogenic Hypotheses

1970s: Competing Warming and Cooling Scenarios

In the 1970s, climate scientists analyzed the observed mid-20th-century cooling trend in the Northern Hemisphere, spanning roughly 1940 to 1970, alongside projections of rising anthropogenic emissions of CO2 and aerosols. This period featured competing scenarios: potential global cooling from sulfate aerosols reflecting sunlight or from land-use changes increasing dust, versus warming driven by CO2 accumulation in the atmosphere. A 2008 analysis of 71 peer-reviewed papers published between 1965 and 1979 found that only 7 explicitly predicted cooling, 44 forecasted warming primarily due to greenhouse gases, and 20 remained neutral or emphasized uncertainty, indicating no consensus favoring cooling.

Cooling scenarios often hinged on aerosols overpowering CO2 effects. In 1971, S. Ichtiaque Rasool and Stephen H. Schneider modeled that a six- to eightfold increase in stratospheric aerosols—far exceeding observed emissions—could reduce global temperatures by about 3.5°C, potentially initiating an ice age, though they noted CO2 warming as a counterforce under lower aerosol assumptions. Other contributions included Reid Bryson's work on monsoon shifts and atmospheric dust from agriculture amplifying regional cooling. These views gained media traction, exemplified by a 1975 Newsweek article warning of a new ice age, but relied on conditional assumptions about aerosol growth that did not materialize, as pollution controls later curbed sulfate emissions.

Warming projections built on prior CO2 sensitivity estimates, such as those from G.S. Callendar and Gilbert Plass, with models suggesting 1.5–4°C warming per CO2 doubling if aerosols proved transient. The 1970 Study of Critical Environmental Problems (SCEP) report highlighted CO2 as a long-term warming risk requiring monitoring. Similarly, the 1975 U.S. National Academy of Sciences (NAS) report "Understanding Climatic Change" urged expanded research into both factors but concluded that unchecked CO2 growth would likely cause significant warming, estimating a 0.5–5°C rise by 2000 under high-emission paths, while acknowledging aerosol cooling as a possible short-term offset.

These debates reflected data limitations, including sparse global temperature records and rudimentary models lacking full ocean-atmosphere coupling. By decade's end, improved aerosol measurements and the resumption of warming trends shifted emphasis toward CO2 dominance, though uncertainties persisted. Popular narratives overstated the cooling consensus, driven by press coverage rather than the peer-reviewed literature, where warming scenarios prevailed.

1980s: Equilibrium Climate Sensitivity Estimates and Early Consensus Claims

The 1979 Charney Report estimated equilibrium climate sensitivity (ECS)—the expected long-term global surface temperature increase from a doubling of atmospheric CO2 concentrations—at 1.5°C to 4.5°C, with a best estimate of 3°C; this range continued to frame discussions in the 1980s, derived primarily from early general circulation models (GCMs) incorporating water vapor and ice-albedo feedbacks while acknowledging large uncertainties in cloud responses. The range persisted due to limited computational power and observational constraints, with model refinements yielding similar spreads; for instance, Hansen et al. (1981) used a GCM to simulate ECS values aligning with the Charney bounds, projecting 2–4°C warming for CO2 doublings but highlighting ocean mixing and aerosol effects as key unknowns. Deterministic models from the era, including energy balance approaches, produced ECS estimates spanning 1–6°C, underscoring the decade's reliance on theoretical parameterizations rather than comprehensive empirical validation.

International gatherings amplified focus on these estimates, as the 1985 Villach Conference concluded that greenhouse gas-induced warming could exceed natural variability, recommending policy responses based on sensitivities within the 1.5–4.5°C range, though participants noted the need for further research on transient responses and regional patterns. James Hansen's June 23, 1988, testimony before the U.S. Senate emphasized model sensitivities around 4°C per CO2 doubling, linking recent U.S. heatwaves to enhanced greenhouse forcing and asserting 99% confidence in anthropogenic contributions, which propelled public and political attention even as model assumptions on forcing and sensitivity remained central uncertainties. Hansen's projections, calibrated to historical trends, forecasted 0.2–0.3°C/decade warming under business-as-usual emissions, but empirical temperature records through the late 1980s showed variability influenced by volcanic and solar factors, challenging immediate attribution claims.

Claims of an emerging scientific consensus crystallized around these events, with proponents citing alignment across models on positive feedbacks driving ECS above 1°C, yet the persistently wide range reflected unresolved debates over feedback strengths—particularly clouds, which could amplify or dampen sensitivity—and the role of non-CO2 forcings like sulfates. Academic institutions and media often portrayed the Charney-derived estimates as settled, potentially overlooking dissenting views on lower sensitivities inferred from satellite data or paleoclimate proxies, where institutional incentives may have favored higher-end projections to secure funding and policy relevance. By decade's end, this framework informed the IPCC's formation in 1988, but rigorous first-principles scrutiny reveals that the true consensus pertained more to directional warming risks than to precise quantification, as empirical constraints on ECS remained sparse amid competing natural drivers.
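The ECS range quoted above maps onto a net climate feedback parameter λ via the standard energy-balance relation; the doubled-CO2 forcing F₂ₓ ≈ 3.7 W/m² is the modern value, assumed here for illustration rather than drawn from the 1980s literature:

```latex
\mathrm{ECS} = \frac{F_{2\times}}{\lambda},
\qquad 1.5\text{--}4.5\ ^{\circ}\mathrm{C}
\;\Longleftrightarrow\;
\lambda \approx 2.5\text{--}0.8\ \mathrm{W\,m^{-2}\,K^{-1}}
```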

IPCC Establishment and First Assessment Reports (Late 1980s-1990s)

James Hansen testifying before Congress in 1988.

The Intergovernmental Panel on Climate Change (IPCC) was established in 1988 by the World Meteorological Organization (WMO) and the United Nations Environment Programme (UNEP) to provide governments with regular assessments of the scientific basis of climate change, its potential impacts, and options for adaptation and mitigation. The initiative followed heightened international concern, including U.S. Senator Timothy Wirth's organization of James Hansen's June 1988 congressional testimony, where Hansen stated with high confidence that observed warming was due to the enhanced greenhouse effect from human activities. The IPCC's first plenary session convened from November 9 to 11, 1988, in Geneva, Switzerland, where three working groups were formed: Working Group I on scientific assessment, Working Group II on impacts, and Working Group III on response strategies.

The IPCC's First Assessment Report (FAR) was completed in August 1990 and approved at a plenary session in Sundsvall, Sweden. Working Group I concluded that global mean surface air temperature had increased by 0.3 °C to 0.6 °C over the past century, broadly consistent with model predictions, and that carbon dioxide accounted for over half of the enhanced greenhouse effect. The report's Summary for Policymakers (SPM), approved line-by-line by governments, nonetheless held that the observed warming was also of the same magnitude as natural variability, so that unequivocal detection of a human influence was not yet possible. Projections under a "business as usual" scenario estimated 0.3 °C warming per decade into the 21st century, with sea-level rise of about 6 cm per decade, based on equilibrium climate sensitivity estimates ranging from 1.5 °C to 4.5 °C for doubled CO2. A 1992 supplement updated estimates but maintained the core findings.

The Second Assessment Report (SAR), finalized in 1995, built on the FAR with greater data accumulation and model refinements. Working Group I affirmed an observed temperature rise of about 0.3–0.6 °C since the late 19th century and projected future warming of 1–3.5 °C by 2100 under various emission scenarios, with sea-level rise of 15–95 cm. The SPM, again governmentally approved, strengthened the language to "the balance of evidence suggests a discernible human influence on global climate", a shift from the FAR that some scientists, including dissenting Working Group I authors, argued reflected political editing rather than unanimous scientific consensus, as draft chapters had emphasized unresolved detection issues. Despite such procedural critiques, the SAR informed the 1997 Kyoto Protocol negotiations by underscoring the need for emission reductions. Early IPCC reports prioritized synthesizing peer-reviewed literature but drew criticism for relying on incomplete observational records and models that later showed higher uncertainties in natural variability and aerosol effects.

1990s to Early 2000s: Observations Versus Projections

Surface and Satellite Temperature Records

Surface temperature records are derived from networks of land-based weather stations measuring air temperature typically 1.5 to 2 meters above ground, together with sea surface temperatures (SSTs) from ship and buoy observations. Major global datasets in the 1990s included NASA's Goddard Institute for Space Studies (GISS) Surface Temperature Analysis (GISTEMP), the United Kingdom's Hadley Centre–Climatic Research Unit dataset (HadCRUT), and NOAA's Global Historical Climatology Network (GHCN)-based analyses. These datasets, covering the period from the late 19th century, reported a global mean surface temperature increase of approximately 0.6°C over the 20th century, with accelerated warming of about 0.15°C per decade from the late 1970s through the early 2000s. Homogenization procedures were applied to correct for non-climatic biases such as urban heat island effects, station relocations, and instrument changes, though the impact of these adjustments on long-term trends remains debated among researchers.

Satellite-based temperature records began in December 1978 with the Microwave Sounding Unit (MSU) instruments aboard NOAA polar-orbiting satellites, providing estimates of lower tropospheric temperatures (roughly surface to 8 km altitude) via microwave emissions from atmospheric oxygen. The University of Alabama in Huntsville (UAH) dataset, developed by Roy Spencer and John Christy, was the first major analysis; their 1990 study of data from 1979 to 1988 indicated a global lower tropospheric trend near zero, at +0.01°C per decade, contrasting with surface records. Subsequent refinements, including corrections for satellite orbital decay and sensor drift, adjusted UAH lower-tropospheric trends upward to about +0.08°C per decade by the mid-1990s. An independent dataset from Remote Sensing Systems (RSS), introduced in the early 2000s, employed different calibration methods and reported higher trends, around +0.18°C per decade for the lower troposphere from 1979 through 2008.

Comparisons revealed surface trends exceeding satellite-derived tropospheric trends by roughly 0.05°C per decade since 1979, with pronounced differences in the tropics, where climate models predicted 1.2 to 1.5 times greater warming in the mid-troposphere than at the surface—a pattern termed "amplification"—but observations showed amplification ratios closer to or below 1.0, fueling debates on model reliability and data processing. These discrepancies persisted into the early 2000s, prompting multiple intercomparisons and methodological reviews, though both record types confirmed overall warming amid uncertainties in vertical temperature structure.
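The per-decade figures quoted throughout this subsection are least-squares slopes of anomaly series; a minimal sketch of that computation on synthetic data (real station and satellite datasets involve far more processing):

```python
import numpy as np

# Least-squares trend of a temperature-anomaly series, expressed in degC per
# decade, computed on synthetic data for illustration.
years = np.arange(1979, 2009)                    # 30 annual values
rng = np.random.default_rng(0)
anoms = 0.015 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

slope_per_year = np.polyfit(years, anoms, 1)[0]  # degC per year
print(f"trend: {10 * slope_per_year:+.2f} degC per decade")
```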

Model-Observation Discrepancies in the 1990s

In the early 1990s, satellite-based measurements of lower tropospheric temperatures, pioneered by Roy Spencer and John Christy using Microwave Sounding Unit (MSU) data from NOAA satellites, revealed minimal global warming trends from 1979 to 1990, with trends near zero or slightly negative in initial analyses covering the period through the mid-1990s. These observations contrasted with surface temperature records, which indicated warming of approximately 0.1–0.15°C per decade over the same satellite era, prompting questions about the vertical structure of atmospheric warming expected under greenhouse gas forcing. Radiosonde (balloon) datasets, such as those analyzed by James Angell, showed tropospheric warming of about 0.3 K from 1958 to 1987, but with much of it concentrated in the late 1970s to early 1980s, and limited evidence of sustained trends into the 1990s.

Climate models at the time, including those referenced in the IPCC's 1990 First Assessment Report, projected tropospheric warming broadly consistent with surface trends but with amplification in the tropical upper troposphere—predicting rates 1.2 to 1.4 times the surface warming due to moist convection dynamics under increased greenhouse gases. However, both satellite and homogenized radiosonde records through the 1990s failed to exhibit this "tropical hotspot", showing instead comparable or subdued warming aloft relative to the surface, particularly in the tropics, where model-observation differences reached up to 50% in trend magnitude. For instance, UAH satellite data indicated a near-zero tropical lower tropospheric trend from 1979 to 1998, diverging from model ensembles that anticipated 0.1–0.2°C per decade of warming in that layer.

Efforts to reconcile these discrepancies in the mid-1990s involved adjustments for data inhomogeneities, such as orbital decay in satellites and instrument changes in radiosondes, leading to revised datasets like HadRT that showed modest tropospheric warming aligning more closely with surface records but still falling at the low end of the IPCC 1990 projection ranges. The IPCC's 1995 Second Assessment Report attributed much of the apparent shortfall to natural variability, including volcanic eruptions (e.g., Mount Pinatubo in 1991) and ENSO events, claiming no fundamental inconsistency after such corrections, though tropical upper-air trends remained uncertain and subject to debate. Critics, including Spencer and Christy, argued that persistent gaps, especially in the tropics, suggested overestimation of model sensitivity or underappreciation of natural factors, with empirical adjustments not fully resolving the vertical-profile mismatch observed through the decade. These issues highlighted limitations in early general circulation models' representation of convective processes and cloud feedbacks, fueling ongoing scrutiny into the 2000s.

Influence of Data Adjustments and Climategate (2009)

Data adjustments in global temperature records, such as those applied by NOAA and NASA GISS, involve homogenization techniques to correct for non-climatic factors including station relocations, changes in measurement times, and urban heat island effects. These processes aim to produce consistent long-term trends from heterogeneous raw observations, but analyses have shown that adjustments often amplify recent warming relative to historical baselines. For instance, a peer-reviewed study of U.S. temperature data found that poor-quality stations, including those near heat sources, introduced a warming bias of up to 0.9°C per century after accounting for siting issues, with homogenization failing to fully mitigate this. Similarly, examination of European records in the GHCN dataset revealed that homogenization adjustments increased centennial warming trends by an average of 16% in some regions, raising questions about algorithmic over-correction for urban influences. Critics, including peer-reviewed work on urban blending, argue that pairwise homogenization inadvertently propagates urban heat signals across rural stations, artificially enhancing global trends by blending contaminated data. These controversies intensified scrutiny of surface records, particularly as raw data from untouched networks like the U.S. Climate Reference Network exhibited less pronounced warming than adjusted datasets. Independent audits, such as those of the Berkeley Earth Surface Temperature project, confirmed similar overall trends but highlighted version-dependent adjustments that could alter decadal signals, with some iterations showing exaggerated post-1990s rises due to incomplete breakpoint detection. In the U.S., post-2010 NOAA revisions cooled early 20th-century temperatures while warming recent decades, increasing the reported trend by approximately 0.3–0.5°C over the instrumental period, prompting claims of methodological bias toward confirming model projections. Such findings contributed to debates on record reliability, especially when contrasted with satellite-derived lower-troposphere data from UAH and RSS, which showed milder warming rates of about 0.13°C per decade since 1979, less affected by surface-specific adjustments. The Climategate incident in November 2009 amplified these concerns when over 1,000 emails and documents were hacked from the University of East Anglia's Climatic Research Unit (CRU) and leaked online. Key revelations included discussions among leading climate scientists, such as Phil Jones and Michael Mann, on using "Mike's Nature trick" to "hide the decline"—a reference to splicing modern instrumental temperatures onto pre-1960 tree-ring proxy data to mask a post-1960 divergence where tree-ring indices failed to track observed warming, potentially indicating proxy unreliability for recent climates. Other emails revealed resistance to sharing data with external researchers, efforts to influence peer review (e.g., blocking skeptical papers), and statistical manipulations to minimize apparent medieval warm period signals in reconstructions. These disclosures fueled accusations of gatekeeping and selective presentation, eroding trust in institutions like the IPCC, which relied heavily on CRU data for its assessments. 
Subsequent investigations, including the UK House of Commons Science and Technology Committee, the Oxburgh panel, and the Independent Climate Change Email Review, cleared scientists of fabricating data or dishonesty, attributing issues to poor communication and defensiveness rather than misconduct. However, the inquiries criticized CRU for inadequate data archiving and transparency, recommending improved openness without deeply probing the scientific validity of practices like the proxy splice. Critics noted the panels' limited scope and largely insider composition, and their failure to address FOIA evasions documented in the emails, such as deleting files to avoid disclosure. The episode spurred policy changes, including CRU's partial data releases and broader calls for raw data access under initiatives like the Global Historical Climatology Network, but it also entrenched divisions, with public skepticism toward anthropogenic warming claims rising temporarily amid perceptions of institutional bias. By highlighting potential causal disconnects between raw observations and adjusted narratives, Climategate influenced the field's trajectory toward greater empirical auditing, though mainstream syntheses continued to emphasize adjusted records as robust.

2000s to 2010s: Hiatus, Attribution, and Intensifying Debates

The Global Warming Pause (1998-2013) and Explanatory Hypotheses

The observed slowdown in global mean surface temperature (GMST) rise from 1998 to 2013, often termed the "global warming pause" or "hiatus," featured a linear trend of approximately 0.05°C per decade, markedly lower than the 0.12°C per decade from 1951 to 2012 or the accelerated rates in prior decades. This period began following the strong 1997-1998 El Niño event, which produced a temporary temperature spike, and persisted through a sequence of La Niña-dominant years that suppressed surface warming. Satellite lower troposphere records, such as those from the University of Alabama in Huntsville dataset, similarly indicated minimal warming of about 0.07°C per decade over this interval. Multiple peer-reviewed analyses attributed the hiatus primarily to internal climate variability rather than a cessation of anthropogenic forcing. A negative phase of the Interdecadal Pacific Oscillation (IPO), akin to the Pacific Decadal Oscillation, redirected heat from the surface to subsurface ocean layers, reducing tropical Pacific sea surface temperatures and enhancing trade winds that facilitated deeper ocean heat uptake. The Atlantic Multidecadal Oscillation (AMO) in a cooling phase further contributed by altering hemispheric energy redistribution. Modeling simulations, including those using coupled general circulation models, reproduced the hiatus when initialized with observed sea surface temperature patterns, underscoring the role of such decadal oscillations in modulating short-term trends atop long-term greenhouse gas-driven warming. Alternative hypotheses invoked temporary reductions in effective radiative forcing or enhanced energy sinks. Increased stratospheric aerosol loading from moderate volcanic eruptions, such as those in 2008-2010, provided minor cooling offsets, though insufficient to explain the full magnitude. Low solar irradiance during the prolonged solar minimum around 2008-2009 was proposed as a contributing factor, aligning with empirical correlations between solar cycles and decadal temperature variability. Enhanced vertical mixing in the Pacific, driven by strengthened winds, sequestered heat below 700 meters, as evidenced by Argo float observations showing accelerated ocean heat content increase in deeper layers during this era. Debates arose over the hiatus's interpretation, with some studies questioning its statistical robustness by emphasizing that short-term trends (15 years) fall below the 30-year climatological standard and are prone to sampling variability. Post-2013 data adjustments in datasets like NOAA's ERSST v4, which buoy-corrected ship measurements upward, reduced the apparent trend slowdown, prompting critiques of potential over-adjustment to align observations with models. The Fifth IPCC Assessment Report (AR5) incorporated the hiatus in its summary, noting it did not invalidate long-term warming but highlighted uncertainties in transient climate response and the influence of natural variability on decadal scales. This period underscored discrepancies between climate model ensembles, which projected stronger warming, and observations, fueling discussions on equilibrium climate sensitivity estimates.
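The trend figures quoted above are ordinary least-squares slopes fitted to annual global-mean anomaly series over a chosen window. A minimal sketch of that arithmetic follows, assuming a NumPy environment; the anomaly series is a synthetic placeholder rather than any observational dataset, and is included only to show that the estimated decadal rate depends on the start year and window length.

    # Minimal sketch (not any agency's operational code) of the decadal-trend
    # arithmetic behind figures such as the ~0.05 C/decade hiatus rate: an
    # ordinary least-squares fit to annual anomalies over a chosen window.
    # The anomaly series below is a synthetic placeholder, not real data.
    import numpy as np

    def decadal_trend(years, anomalies):
        """OLS slope converted from degrees C per year to degrees C per decade."""
        slope_per_year = np.polyfit(years, anomalies, deg=1)[0]
        return 10.0 * slope_per_year

    years = np.arange(1979, 2014)
    anoms = 0.015 * (years - 1979) + 0.1 * np.sin(0.9 * years)   # placeholder series

    print(round(decadal_trend(years, anoms), 2))            # full 1979-2013 window
    print(round(decadal_trend(years[19:], anoms[19:]), 2))   # 1998-2013 window only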

Advances in Extreme Event Attribution

Extreme event attribution seeks to quantify the influence of human-induced climate change on the probability or intensity of specific weather extremes, such as heatwaves or floods, by comparing observed events to simulations of counterfactual climates without anthropogenic forcings. The conceptual framework was proposed in 2003 by Myles Allen, who argued for probabilistic assessments of liability for climate-related damages through model-based comparisons of event likelihoods in actual versus pre-industrial conditions. This approach emphasized ensembles of climate simulations to account for natural variability, marking a shift from global-scale detection and attribution to individual events. The first peer-reviewed application appeared in 2004, when Peter Stott and colleagues analyzed the 2003 European heatwave, which caused over 70,000 excess deaths, concluding that anthropogenic greenhouse gases had at least doubled its likelihood based on Hadley Centre model ensembles. This study introduced the risk ratio metric, in which the probability of the event in the current climate divided by its probability in a counterfactual scenario yields a factor indicating increased odds due to human influence. Subsequent advances in the late 2000s refined these methods, incorporating multi-model ensembles to reduce uncertainties, though reliance on coarse-resolution models limited applicability to precipitation extremes and tropical cyclones. By the 2010s, methodological improvements enabled broader coverage, including storyline approaches that conditioned models on observed weather patterns to isolate thermodynamic responses to warming, as applied retrospectively to the 2003 heatwave to estimate a 2.5–4°C reduction in peak temperatures absent human forcings. The American Meteorological Society launched annual "Explaining Extreme Events from a Climate Perspective" supplements in 2011, compiling peer-reviewed analyses of recent extremes and revealing detectable human signals primarily in heat-related events but inconsistent results for precipitation extremes or storms due to model deficiencies in simulating variability. Rapid attribution protocols emerged around 2013–2015 with initiatives like World Weather Attribution, allowing assessments within weeks of events using pre-computed model outputs, as demonstrated in early rapid analyses of heat extremes in which anthropogenic warming was estimated to have roughly doubled the odds. Despite these developments, limitations persist: attribution depends heavily on model accuracy for rare events, where small sample sizes inflate uncertainties, and probabilistic claims cannot prove causation in individual instances, only altered risks. Critics note that over-reliance on tuned models may bias toward detecting anthropogenic signals, particularly when natural forcings like solar or volcanic activity are underrepresented, and empirical validation remains challenging absent direct counterfactual observations. By the mid-2010s, over 100 studies had been conducted, predominantly on extremes in data-rich regions, highlighting gaps in attribution for underrepresented events or locations with sparse observations.
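The risk-ratio metric described above can be illustrated with a toy calculation, sketched below under the assumption of two synthetic model ensembles (the numbers are placeholders, not output from any attribution study): the probability of exceeding an observed threshold is estimated in the "factual" and "counterfactual" ensembles and the two probabilities are compared.

    # Toy sketch of the probabilistic event-attribution metrics described above.
    # The two arrays stand in for large ensembles of simulated seasonal anomalies
    # with and without anthropogenic forcing; they are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    factual = rng.normal(loc=1.0, scale=0.5, size=10_000)         # with human forcing
    counterfactual = rng.normal(loc=0.3, scale=0.5, size=10_000)  # without human forcing
    threshold = 1.5                                               # observed extreme value

    p1 = (factual >= threshold).mean()
    p0 = (counterfactual >= threshold).mean()

    risk_ratio = p1 / p0          # >1 means the event became more likely
    far = 1.0 - p0 / p1           # fraction of attributable risk
    print(round(risk_ratio, 1), round(far, 2))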

Third and Fourth IPCC Reports: Refinements and Criticisms

The Third Assessment Report (TAR), released in May 2001, refined prior IPCC assessments by integrating expanded datasets on atmospheric greenhouse gas concentrations, which had risen to levels unprecedented in at least 420,000 years based on ice core analyses, and by updating radiative forcing estimates to show a net positive anthropogenic influence of +1.46 W/m² (range: +0.6 to +2.5 W/m²). It strengthened attribution statements compared to the 1995 Second Assessment Report, asserting "new and stronger evidence that most of the warming observed over the last 50 years is attributable to human activities," supported by detection and attribution studies that detected an anthropogenic signal amid natural variability. Refinements included a re-assessed equilibrium climate sensitivity (likely 1.5–4.5°C for doubled CO₂, unchanged from prior reports but supported by larger model ensembles) and greater emphasis on paleoclimate proxies, such as multiproxy reconstructions indicating Northern Hemisphere temperatures during the late 20th century were likely the highest in 1,000 years. Criticisms of the TAR focused on the Summary for Policymakers (SPM), which detractors argued selectively emphasized high-confidence findings while downplaying dissenting views among contributing authors; for instance, atmospheric physicist Richard Lindzen, a lead author, contended the SPM misrepresented the degree of scientific consensus on attribution by implying near-unanimity where uncertainties in natural forcings like solar variability persisted. A key contention arose over the "hockey stick" reconstruction by Mann, Bradley, and Hughes, featured prominently in the SPM, which statistical reanalyses later showed to be overly smoothed due to principal component methods that minimized medieval warm period signals, potentially understating natural variability and inflating the apparent anomaly of recent warming. These issues highlighted broader process concerns, including limited external review of certain chapters and the line-by-line governmental approval of the SPM, which some analysts viewed as introducing political pressures that prioritized alarm over balanced uncertainty ranges. The Fourth Assessment Report (AR4), published in 2007, advanced refinements through enhanced observational syntheses, such as satellite and ocean observations confirming an energy imbalance of 0.75 W/m² from 1950–2000, and improved general circulation models that better simulated tropical tropospheric amplification. It elevated confidence in anthropogenic drivers, stating it is "very likely" (>90% probability) that human activities caused most observed warming since the mid-20th century, a step up from the TAR's "likely" framing, bolstered by Bayesian attribution methods isolating greenhouse gas signals against volcanic and solar forcings. Equilibrium climate sensitivity was narrowed to a likely range of 2–4.5°C (best estimate 3°C for doubled CO₂), and projections incorporated scenario-based emissions pathways yielding global temperature rises of 1.8–4.0°C by 2100 under various socioeconomic assumptions, with reduced uncertainty in aerosol cooling aiding forward projections.
AR4 faced scrutiny for factual errors in the Working Group II volume, including unsubstantiated claims of 2035 Himalayan glacier melt (extrapolated from a 1999 magazine article rather than peer-reviewed data) and 40% Amazon rainforest die-off risk from minor drying (based on a non-peer-reviewed WWF report), which undermined credibility and prompted an Inter-Academy Council review revealing inadequate sourcing standards and overreliance on grey literature comprising up to 30% of references. Critics, including econometrician Ross McKitrick, argued the report's high attribution confidence overlooked persistent model biases, such as overprediction of tropospheric warming relative to surface trends in regions like the tropics, where radiosonde data showed discrepancies of up to 0.1–0.2°C/decade. The SPM's governmental negotiation process was faulted for diluting scientific caveats, such as on sea-level rise projections (0.18–0.59 m by 2100, excluding rapid ice sheet dynamics), to favor policy urgency despite unresolved ice physics, reflecting institutional tendencies toward consensus-building over empirical caution.

Recent Advances and Reassessments (2010s-2025)

Fifth and Sixth IPCC Reports: Updated Sensitivities and Projections

The Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), published between 2013 and 2014, maintained the equilibrium climate sensitivity (ECS)—defined as the long-term global surface temperature response to a doubling of atmospheric CO₂ concentration—as likely ranging from 1.5°C to 4.5°C, with high confidence that it exceeds 1°C. This range, unchanged from prior assessments, integrated evidence from instrumental records, paleoclimate proxies, and climate models, though it highlighted persistent uncertainties in feedbacks such as cloud responses. Transient climate response (TCR), the shorter-term warming under rising CO₂, was assessed as likely 1.0–2.5°C. Projections for global mean surface temperature increase by 2081–2100 relative to 1986–2005, under Representative Concentration Pathway (RCP) scenarios, ranged from about 0.3–1.7°C for RCP2.6 (low emissions) to 2.6–4.8°C for RCP8.5 (high emissions), emphasizing that anthropogenic forcings dominated observed warming trends. These projections incorporated multi-model ensembles from CMIP5, but noted limitations in representing regional variability and extreme events. The Sixth Assessment Report (AR6), released from 2021 to 2023, refined ECS to a likely range of 2.5–4.0°C, with a best estimate of 3.0°C, narrowing the lower bound based on updated assessments of historical energy imbalances, paleoclimate data, and feedback processes while retaining a very likely range of 2.0–5.0°C. This shift reflected contributions from CMIP6 models exhibiting higher average sensitivities than CMIP5, alongside revised estimates of effective radiative forcing (ERF) from CO₂, increased from 1.82 W m⁻² (1750–2011) in AR5 to 2.16 W m⁻² (1750–2019). TCR was updated to likely 1.4–2.2°C with a best estimate of 1.8°C, informed by improved observational constraints on transient responses. AR6 emphasized high confidence in human-induced warming but acknowledged that process-based model sensitivities remained higher than some observationally derived estimates, contributing to ongoing debates over feedback quantification. Projections in AR6 shifted to Shared Socioeconomic Pathways (SSPs), projecting global surface air temperature increases for 2081–2100 relative to 1850–1900 of 1.0–1.8°C under SSP1-1.9 (very low emissions), 1.7–2.6°C under SSP1-2.6 (low), and 3.3–5.7°C under SSP5-8.5 (very high), with continued warming projected even under net-zero CO₂ emissions due to lagged responses. These updates incorporated enhanced sea level rise projections (e.g., 0.28–0.55 m for SSP1-2.6 by 2100) and greater emphasis on near-term risks, such as a greater than 50% chance of exceeding 1.5°C warming between 2021 and 2040 under current policies. AR6 projections highlighted reduced uncertainty in low-emissions pathways but noted persistent divergences between models and observations in tropical tropospheric amplification, underscoring the need for further empirical validation of sensitivity assumptions.
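The ERF and sensitivity figures above can be related by the widely used logarithmic approximation for CO₂ forcing, F ≈ 5.35 ln(C/C₀) W/m² (Myhre et al., 1998). The sketch below applies that approximation together with illustrative ECS and TCR values; it is a simplification of the fuller ERF treatment in AR6, so the numbers only roughly reproduce the assessed values.

    # Sketch of the simplified relations behind the sensitivity figures above.
    # F = 5.35 * ln(C/C0) is the Myhre et al. (1998) approximation for CO2 forcing;
    # AR6 uses a more elaborate ERF expression, so this is illustrative only.
    import math

    F2X = 5.35 * math.log(2.0)            # ~3.7 W/m2 per CO2 doubling

    def co2_forcing(c_ppm, c0_ppm=278.0):
        return 5.35 * math.log(c_ppm / c0_ppm)

    def equilibrium_warming(c_ppm, ecs=3.0):
        return ecs * co2_forcing(c_ppm) / F2X

    def transient_warming(c_ppm, tcr=1.8):
        return tcr * co2_forcing(c_ppm) / F2X

    print(round(co2_forcing(410), 2))         # ~2.1 W/m2, roughly comparable to AR6's CO2 ERF
    print(round(equilibrium_warming(560), 1)) # 3.0 C at doubling, by construction
    print(round(transient_warming(560), 1))   # 1.8 C at doubling, by construction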

Post-2020 Insights on Aerosols, Methane, and Model Performance

Post-2020 research has highlighted the role of declining anthropogenic aerosol emissions in unmasking underlying warming, particularly following the International Maritime Organization's (IMO) 2020 regulations, which capped sulfur content in marine fuels at 0.5%, down from 3.5%, reducing ship-related SO2 emissions by approximately 80% over oceans. This abrupt decrease in sulfate aerosols, which exert a net cooling effect through direct scattering of sunlight and cloud brightening, has been estimated to produce a positive radiative forcing of 0.06–0.31 W/m² globally, equivalent to advancing anthropogenic warming by up to three years in some model simulations. Broader declines in SO2 emissions—40% globally since the mid-2000s, with China's reductions exceeding 70%—have similarly diminished aerosol masking, contributing to accelerated surface warming rates observed since 2020, though the effect is projected to be transient as emissions stabilize at lower levels. Atmospheric methane concentrations have risen more rapidly than anticipated post-2020, with NOAA measurements indicating annual increases of 14.84 ppb in 2020, peaking at 17.69 ppb in 2021, before stabilizing around 12–13 ppb annually through 2023, reaching 1921.79 ppb by 2024. Global emissions surged by 20–25 teragrams per year in 2020–2021, primarily from enhanced tropical wetland sources amid anomalous wetness, alongside fossil fuel sector contributions accounting for nearly one-third of anthropogenic totals. This acceleration, exceeding prior projections, amplifies methane's short-lived climate forcing—second only to CO2 among human-influenced gases—potentially elevating its atmospheric burden by up to 13% by 2030 if unchecked, though isotopic analyses suggest biogenic dominance over fossil origins in recent upticks. Evaluations of CMIP6 models post-2020 reveal mixed performance, with many exhibiting systematic high biases in equilibrium climate sensitivity (ECS) estimates—often 4–5°C per CO2 doubling—leading to overprojections of recent warming when unweighted ensembles are compared to observations. Weighting schemes favoring models with better historical fidelity reduce projected end-of-century warming by 0.5–1°C under high-emission scenarios, as high-ECS models underperform in simulating energy budget constraints and regional variability patterns. Incorporating post-2020 aerosol and methane forcings improves hindcasts, but persistent discrepancies in feedbacks and tropospheric amplification underscore uncertainties, prompting calls for emergent constraints from observations to refine projections amid observed warming rates aligning more closely with lower-sensitivity models after aerosol adjustments.
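The performance weighting mentioned above can be illustrated with a minimal sketch in the spirit of published skill-weighting schemes (it is not the code or data of any specific study): each model receives a weight that decays with its mismatch against an observed historical diagnostic, and the weighted mean of the projections is then compared with the unweighted one.

    # Simplified illustration of performance-weighted ensemble projections: models
    # whose historical warming departs further from the observed value receive
    # exponentially smaller weights. Real schemes also penalize model
    # interdependence; all numbers here are illustrative, not CMIP6 output.
    import numpy as np

    hist_obs = 1.1                                        # observed historical warming (C), illustrative
    hist_model = np.array([1.0, 1.2, 1.5, 1.8, 1.1])      # simulated historical warming per model
    future_model = np.array([2.5, 3.0, 4.2, 4.8, 2.8])    # simulated end-of-century warming per model
    sigma = 0.2                                           # tolerance for model-observation mismatch

    weights = np.exp(-((hist_model - hist_obs) / sigma) ** 2)
    weights /= weights.sum()

    print(round(future_model.mean(), 2))                    # unweighted ensemble mean
    print(round(float(np.sum(weights * future_model)), 2))  # weighted mean, lower here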

Empirical Evaluations of Long-Term Forecasts Up to 2025

A comprehensive analysis of 17 prominent climate model projections published between 1970 and 2007, encompassing Hansen's 1988 scenarios and early IPCC assessments, determined that these forecasts skillfully captured global mean surface temperature trends in the decades following their issuance, with predicted warming rates aligning closely with observations without systematic bias. Specifically, Hansen's intermediate Scenario B, which assumed a phase-out of chlorofluorocarbon emissions and slowing growth in other greenhouse gases, projected approximately 0.45°C of warming from 1988 to the late 2010s, a figure that matched observed increases within natural variability and forcing adjustments such as the 1991 Mount Pinatubo eruption. His high-emission Scenario A, however, overestimated warming by roughly 30-50% relative to realized temperatures through 2020, though this scenario assumed continued rapid emissions growth that did not fully materialize. IPCC First Assessment Report (1990) projections under its business-as-usual scenario anticipated a global temperature rise of 0.4°C to 1.1°C from 1990 to 2025, with a central estimate around 0.7°C; observed anomalies indicate an increase of approximately 0.9°C from 1990 (anomaly ~+0.35°C relative to 1951-1980 baseline) to 2024 (+1.28°C relative to the same baseline), placing outcomes toward the upper end of the range but within uncertainty bounds accounting for emission trajectories and internal variability. Subsequent evaluations of Coupled Model Intercomparison Project (CMIP) ensembles, including CMIP3 through CMIP6, similarly found that multi-model means for future warming up to 2020 were slightly conservative globally, underpredicting by 0.1-0.2°C in aggregate, though spatial patterns revealed overestimation over 63% of Earth's surface area, particularly in land regions and the tropics. Critiques of these evaluations emphasize persistent model biases, such as inflated equilibrium climate sensitivity (ECS) in subsets of CMIP6 models (ECS >4°C), which contribute to overstated warming projections for impacts like extremes and regional changes; for instance, high-ECS models have been shown to exaggerate projected damages when used in integrated assessment models. Evaluations incorporating tropospheric satellite data further highlight discrepancies, with models overpredicting mid-tropospheric warming rates by 1.5-2 times relative to observations from 1979-2020, suggesting over-reliance on parameterized feedbacks like water vapor amplification. Up to 2025, the 2023-2024 El Niño-driven temperature spikes have narrowed gaps for some ensemble means, but long-term trend assessments through 2024 confirm that observed decadal warming rates (0.18°C per decade from 1990-2024) remain below the median of unconstrained CMIP6 projections (0.20-0.25°C per decade under comparable forcings).
Forecast Source | Projected Warming (Key Scenario) | Observed Warming to ~2024 | Alignment Notes
Hansen 1988 (Scenario B) | ~0.45°C (1988-2019) | ~0.55°C (adjusted for baseline) | Close match; variability explains minor differences.
IPCC FAR 1990 (business-as-usual) | 0.4-1.1°C (1990-2025) | ~0.9°C (1990-2024) | Within range; upper bound realized due to sustained emissions.
CMIP5/6 Ensembles | 0.20-0.25°C/decade (1990s-2020s) | 0.18°C/decade (1990-2024) | Global mean slightly underpredicted; regional overprediction common.
These assessments underscore that while global aggregates exhibit reasonable agreement with past projections, unresolved issues in representing regional variability, feedbacks, and aerosol effects contribute to ongoing debates over model reliability for policy-relevant projections beyond aggregate temperatures.

Debates on Natural Versus Human Causal Factors

Solar Variability and Cosmic Ray Influences

Research into solar variability's influence on Earth's climate dates back to the 19th century, with early observations linking sunspot activity to weather patterns and crop yields, as noted by William Herschel, who in 1801 correlated sunspot minima with cooler periods affecting agricultural output. By the early 20th century, astronomers like Charles Greeley Abbot in 1910 proposed trends in total solar irradiance (TSI) coinciding with climatic cooling, laying groundwork for quantitative assessments. Post-World War II advancements in satellite measurements from 1978 onward provided direct TSI data, revealing an 11-year cycle variation of about 0.1% but no significant long-term upward trend since the mid-20th century, despite global temperature rises. In the 1970s and 1980s, amid emerging concerns over anthropogenic warming, scientists debated solar contributions to 20th-century temperature increases, with studies like John A. Eddy's 1976 analysis connecting multi-century solar activity variations to major climate shifts, including the Little Ice Age cooling during the Maunder Minimum (1645–1715), when sunspot numbers were anomalously low. Peer-reviewed reconstructions, such as those using cosmogenic isotopes like beryllium-10 and carbon-14, indicated solar forcing could explain up to 50% of pre-industrial temperature variance over the past millennium, though estimates varied. However, IPCC assessments from the 1990s onward attributed only a minor role to solar variability in post-1950 warming, estimating radiative forcing from solar changes at +0.12 W/m² (range -0.06 to +0.30 W/m²) for 1750–2011, far smaller than greenhouse gas forcings exceeding +2.0 W/m². Critics, citing alternative analyses of solar activity records, argue IPCC solar reconstructions underestimate activity by relying on flawed proxies, potentially overlooking amplified effects via mechanisms like ultraviolet-induced stratospheric changes. Proposed amplification mechanisms beyond direct TSI, such as solar-modulated ultraviolet radiation altering atmospheric circulation or ocean heat uptake, gained traction in the 1990s, with models suggesting they could double solar impact on surface temperatures. Empirical correlations between solar cycles and global sea surface temperatures persisted into the 2000s, prompting hypotheses that total solar irradiance plus magnetic activity might account for much of the observed warming up to the 1980s, after which divergences appeared as TSI stabilized. Despite these hypotheses, comprehensive reviews conclude solar activity's net effect on recent global warming remains small, with post-1980 temperature rises uncorrelated to declining or flat TSI trends. The cosmic ray-cloud hypothesis, advanced by Henrik Svensmark in the mid-1990s, posits that galactic cosmic rays (GCRs), modulated by solar magnetic activity, influence cloud formation through atmospheric ionization, thereby affecting planetary albedo and temperature. Building on Edward Ney's 1959 suggestion of cosmic ray-climate links, Svensmark's 1996 findings correlated GCR flux decreases during high solar activity with reduced low-level cloud cover, implying a warming influence during solar maxima and a cooling influence when cosmic ray flux is high. Laboratory experiments, including CERN's CLOUD chamber from 2009, confirmed GCRs enhance aerosol nucleation precursors like sulfuric acid clusters, supporting the microphysical pathway, though the overall climatic amplification remains debated.
Proponents argue this mechanism reconciles paleoclimate records, where GCR variations driven by solar cycles and heliomagnetic field changes align with temperature proxies over millennia, potentially explaining discrepancies in direct TSI forcing. Skeptics, including multiple peer-reviewed assessments, contend the effect's magnitude is insufficient for significant 20th-century warming attribution, citing weak correlations between GCR flux and global trends post-1980s and inconsistencies with observed stratospheric responses. The hypothesis faced marginalization in mainstream models, with IPCC reports through AR6 largely dismissing it due to insufficient evidence for large-scale cloud modulation, though acknowledging potential regional influences. Recent critiques highlight institutional resistance, noting that while early correlations held for 20th-century data, post-1990s divergences align better with anthropogenic forcings, yet alternative analyses using balanced solar proxies suggest cosmic ray influences warrant further empirical scrutiny beyond prevailing dismissals.
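The smallness of the direct TSI contribution discussed above follows from simple geometry. A back-of-envelope estimate, assuming a ~0.1% solar-cycle swing in TSI (ΔS ≈ 1.4 W/m²) and a planetary albedo of about 0.3, is

    \Delta F \;\approx\; \frac{(1-\alpha)\,\Delta S}{4}
            \;\approx\; \frac{0.7 \times 1.4\ \mathrm{W\,m^{-2}}}{4}
            \;\approx\; 0.24\ \mathrm{W\,m^{-2}},

an order of magnitude below the greenhouse gas forcings of more than 2 W/m² cited above, which is why proposed amplification mechanisms such as ultraviolet-stratosphere coupling or the cosmic ray-cloud pathway are central to arguments for a larger solar role.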

Internal Climate Oscillations and Volcanic Forcings

Internal climate oscillations, such as the El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), and Atlantic Multidecadal Oscillation (AMO), represent modes of natural variability arising from interactions within the coupled ocean-atmosphere system, redistributing heat and influencing global temperatures on interannual to multidecadal timescales. ENSO, characterized by irregular cycles of 2-7 years, was first systematically documented in the late 19th century through observations of Peruvian coastal anomalies but gained prominence in climate science during the 1960s with Jacob Bjerknes's recognition of its teleconnections to global patterns. The PDO, identified in 1996 as a pattern of Pacific sea surface temperature (SST) variability with 20-30 year phases, and the AMO, described in 1997 as North Atlantic SST fluctuations over 60-80 years, were formalized through statistical analyses of instrumental records extending back to the early 20th century. These oscillations do not generate net heat but modulate surface temperatures; for instance, strong El Niño events can elevate global mean surface temperatures by approximately 0.1-0.2°C through enhanced atmospheric heat release from the equatorial Pacific, while La Niña phases produce opposite cooling effects. In the history of climate modeling, these internal modes were increasingly incorporated starting in the 1980s and 1990s as stochastic forcings to simulate observed decadal fluctuations, with general circulation models (GCMs) reproducing ENSO-like variability but often underestimating its amplitude in projections. Debates intensified during the 1998-2013 "global warming hiatus," a period of subdued surface warming despite rising CO2 concentrations, where analyses attributed up to 50-100% of the slowdown to natural variability, including a sequence of La Niña events and negative PDO phases that enhanced Pacific subsurface heat uptake. Empirical reconstructions and model simulations indicated that such phases can mask anthropogenic warming signals for 10-30 years, challenging attributions that emphasized greenhouse gas dominance without fully accounting for phase alignments. Critics, including some climate dynamicists, argued that overreliance on ensemble averages in IPCC assessments downplayed the predictive role of these oscillations, as evidenced by discrepancies between modeled and observed multidecadal trends in the early 21st century. Recent studies have questioned the oscillatory nature of PDO and AMO, suggesting they may partly reflect forced responses to aerosols or greenhouse gases rather than purely internal cycles, though statistical evidence supports their role in modulating variance independent of long-term trends. Volcanic forcings, stemming from explosive eruptions that inject sulfate aerosols into the stratosphere, induce short-term global cooling by reflecting sunlight, providing natural experiments for validating climate sensitivity estimates. Historical awareness dates to the 1815 Tambora eruption, which caused a 0.4-0.7°C Northern Hemisphere cooling and the "year without a summer" in 1816, documented through contemporaneous accounts and tree-ring proxies. Instrumental-era studies, beginning in the mid-20th century, quantified effects from events like Agung (1963, ~0.3°C cooling) and El Chichón (1982, ~0.2°C cooling), with radiative forcing estimates derived from satellite observations of aerosol optical depth.
The 1991 Mount Pinatubo eruption, releasing 20 million tons of SO2, exemplifies this: it produced a peak global mean surface cooling of 0.5°C lasting 1-2 years, followed by gradual recovery, as confirmed by multiple datasets including HadCRUT and satellite radiometry. Peer-reviewed analyses integrated these into forcing datasets for models, revealing that volcanic aerosols also alter stratospheric dynamics, sometimes amplifying winter warming in the Northern Hemisphere via disrupted jet streams. In attribution debates, volcanic forcings highlight model deficiencies; for example, CMIP5 simulations underestimated Pinatubo's cooling by 20-50% due to inadequate aerosol microphysics, prompting refinements in subsequent ensembles. Proponents of enhanced natural roles argue that clustered eruptions in the early 20th century contributed to the 1910s-1940s warming hiatus, with total forcing equivalent to -0.1 to -0.2 W/m², rivaling early anthropogenic sulfate cooling. Conversely, mainstream syntheses maintain that while volcanoes explain transient dips, their net 20th-century forcing was minor compared to greenhouse gas trends, though uncertainties in pre-1950 reconstructions persist. Evaluations up to 2025 underscore that neglecting sporadic large eruptions in projections underestimates temperature uncertainty by 0.1-0.3°C over decades, emphasizing the need for probabilistic inclusion of these forcings alongside internal variability.

Critiques of Dominant Anthropogenic Attribution

Critiques of the dominant attribution of recent climate change primarily to anthropogenic greenhouse gas emissions, as asserted by IPCC reports, center on discrepancies between model-based projections and empirical observations, particularly regarding equilibrium climate sensitivity (ECS) and the role of natural variability. Observational estimates of ECS, which measures the long-term temperature response to doubled atmospheric CO2, have been derived using energy balance approaches incorporating instrumental data on radiative forcings, ocean heat uptake, and surface temperatures. For instance, Lewis and Curry (2018) analyzed updated datasets from 1859–2011, estimating a median ECS of 2.0°C (likely range 1.2–3.9°C), substantially lower than the 3.0°C central estimate in CMIP5 models underlying IPCC AR5 attribution. This lower sensitivity implies that human-induced forcings account for a smaller fraction of observed warming since the mid-20th century, leaving greater scope for unforced internal variability or underestimated natural forcings like solar variations. Further challenges arise from global climate models' (GCMs) systematic overestimation of warming trends relative to observations, which undermines their use in detection and attribution studies. CMIP6 models have ECS values averaging 3.9°C and, when hindcast over the satellite era (1979–present), simulate 1.1–1.6 times more warming than observed in the tropics and mid-latitudes. Attribution methods, such as optimal fingerprinting, rely on these models to isolate anthropogenic "fingerprints" (e.g., stratospheric cooling and tropospheric warming patterns) from natural variability, but critics argue this assumes model realism a priori, creating circular validation. Empirical assessments, including those using unadjusted datasets and accounting for volcanic effects, suggest models inflate ECS by underrepresenting negative feedbacks such as cloud adjustments, leading to overstated anthropogenic signal strength. Underestimation of natural variability in GCMs represents another core critique, as models poorly reproduce observed multidecadal oscillations such as the Atlantic Multidecadal Oscillation (AMO) and Pacific Decadal Oscillation (PDO), which have modulated global temperatures independently of CO2 trends. For example, the post-1998 "hiatus" in surface warming, during which tropospheric temperatures rose less than expected from anthropogenic forcings alone, coincided with a negative phase of the IPO (Interdecadal Pacific Oscillation), yet IPCC AR5 attributed it primarily to internal variability without fully quantifying its potential to mimic long-term trends. Recent analyses indicate that amplifying natural variability in attribution frameworks reduces the detectable anthropogenic component, with some reconstructions showing up to 50% of 20th-century warming attributable to solar and oceanic cycles rather than CO2. Additionally, uncertainties in historical forcings, including aerosol cooling masks and solar total irradiance reconstructions, further erode confidence in isolating human causation, as small adjustments can shift attribution fractions significantly. These methodological limitations have prompted arguments that dominant attribution paradigms prioritize consistency with high-sensitivity models over first-principles physics and unadjusted data, potentially biasing policy-relevant conclusions.
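The energy-balance approach referenced above rests on a standard relation used in studies such as Otto et al. (2013) and Lewis and Curry,

    \mathrm{ECS} \;\approx\; \frac{F_{2\times}\,\Delta T}{\Delta F - \Delta Q},

where ΔT is the observed change in global mean surface temperature between a base and a final period, ΔF the corresponding change in total radiative forcing, ΔQ the change in planetary (mostly ocean) heat uptake, and F₂ₓ ≈ 3.7 W/m² the forcing from doubled CO₂; estimates therefore fall when forcing is revised upward or heat uptake downward, which is why forcing and ocean-heat uncertainties dominate the spread among such studies.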
Peer-reviewed reassessments integrating raw radiosonde and satellite records challenge the tropospheric "hot spot" fingerprint central to IPCC claims, finding weaker amplification than modeled, which dilutes evidence for CO2 dominance. While IPCC reports acknowledge uncertainties (e.g., AR6's "very likely" >50% anthropogenic contribution to 2010–2019 warming), critics contend that the reliance on ensemble averages from divergent models—many of which fail hindcasts—systematically underweights empirical constraints favoring lower sensitivities and higher natural influence. Ongoing debates highlight the need for attribution studies to incorporate transient sensitivity metrics and paleoclimate analogs more robustly, as current approaches may conflate correlation with causation amid unresolved variability scales.

Methodological Challenges and Data Controversies

Urban Heat Island Effects and Station Siting Issues

The urban heat island (UHI) effect describes elevated temperatures in developed areas compared to rural surroundings, primarily caused by heat-absorbing materials like asphalt and concrete, reduced evapotranspiration from vegetation loss, and anthropogenic heat from energy use and vehicles. This localized warming, often 1–3°C higher in cities during nights, can bias thermometer readings at weather stations if sited improperly, potentially inflating apparent long-term trends in surface datasets used for global climate assessments. Concerns over UHI contamination in temperature records emerged prominently in the late 20th century, with early studies like Karl et al. (1988) estimating UHI contributions to U.S. urban-rural differences but arguing for minimal global impact after adjustments. Agencies such as NASA GISS and NOAA developed homogenization techniques to pairwise compare station data and correct for non-climatic shifts, including urbanization, by estimating urban-rural deltas and applying offsets. However, critics contend these methods inadequately isolate UHI signals, as rural reference stations may themselves be affected by encroaching development or homogenization algorithms blending urban biases across networks. Station siting issues compound UHI risks, as guidelines from the World Meteorological Organization and NOAA's Climate Reference Network (CRN) mandate placements at least 30 meters (100 feet) from artificial heat sources like buildings, pavement, or exhaust vents to minimize local biases. A volunteer-led survey by Anthony Watts and collaborators (the Surface Stations project), initiated in 2007, surveyed 865 U.S. Historical Climatology Network (USHCN) stations—representing 70% of the network—and classified 89% as failing CRN standards, with only 11% rated acceptable (Classes 1–2); poor sites (Classes 3–5) exhibited potential errors exceeding 1–5°C due to proximity to heat emitters. This 2009 analysis suggested siting flaws could systematically warm readings, with U.S. data—among the most reliable globally—implying broader dataset vulnerabilities. Empirical evaluations of siting impacts vary. Fall et al. (2011) analyzed USHCN stations by CRN classes, finding poorly sited locations (Class 5) showed warmer minimum temperatures (+0.7°C/century, 1895–2009) and reduced diurnal ranges compared to well-sited ones (Classes 1–2), but average temperature trends converged (~0.32°C/decade, 1979–2008) due to offsetting maximum-minimum biases; homogeneity adjustments mitigated but did not fully eliminate differences. In contrast, Connolly et al. (2023) demonstrated that homogenization induces "urban blending," propagating UHI signals into rural records—in the U.S., attributing ~20% of network warming (1895–2022) to urban blending, and up to 35% at urban stations—challenging claims of effective bias removal. Globally, peer-reviewed assessments quantify UHI's influence on land temperature datasets as modest but non-negligible, with Wickham et al. (2013) estimating a maximum 0.05°C/decade urban bias in global averages after rural-urban pairing. The IPCC's Fourth Assessment Report (2007) characterized UHI as localized and unlikely to skew hemispheric trends significantly, citing analyses comparing calm and windy nights that showed no urban-rural divergence. Yet, ongoing urbanization—now affecting ~50% of land stations—and incomplete metadata on site changes sustain debates, with some studies like Sun et al. (2020) indicating up to 0.1–0.3°C/decade biases in rapidly developing regions like China.
These controversies have prompted independent efforts, such as Berkeley Earth's raw data reanalysis, which initially downplayed UHI but later acknowledged station quality concerns. The efficacy of adjustments remains contested, as NOAA's pairwise homogenization presumes stable rural references, potentially undercorrecting amid global urban expansion; critics, drawing on reanalyses of U.S. and Japanese records, argue it inadvertently homogenizes away true rural-urban contrasts, with some analyses attributing roughly 60% of the warming in Japan's blended trends to UHI. Empirical audits and cross-validations highlight that unadjusted rural subsets often show subdued warming, underscoring the need for transparent, verifiable corrections in historical reconstructions.
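The pairwise idea at the core of the homogenization methods debated above can be illustrated with a toy sketch, below; it is a deliberately simplified stand-in, not NOAA's pairwise homogenization algorithm, and the station series are synthetic.

    # Toy illustration of pairwise breakpoint detection: climate variations shared
    # by a station and its neighbor cancel in their difference series, so a step
    # change in that difference suggests a non-climatic artifact (relocation, new
    # instrument, urban growth) at one station. Real algorithms compare many
    # neighbors, test many candidate breaks, and then decide how to adjust.
    import numpy as np

    def breakpoint_shift(candidate, neighbor, break_index):
        """Mean shift of the candidate-minus-neighbor series at a proposed break."""
        diff = np.asarray(candidate) - np.asarray(neighbor)
        return diff[break_index:].mean() - diff[:break_index].mean()

    rng = np.random.default_rng(1)
    shared_climate = np.cumsum(rng.normal(0.0, 0.1, 60))       # signal common to both stations
    neighbor = shared_climate + rng.normal(0.0, 0.2, 60)
    candidate = shared_climate + rng.normal(0.0, 0.2, 60)
    candidate[30:] += 0.8                                       # artificial 0.8 C step, e.g. a relocation

    print(round(breakpoint_shift(candidate, neighbor, 30), 2))  # recovers a shift of roughly +0.8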

Tropospheric Warming Patterns and Satellite Data Conflicts

Climate models driven by anthropogenic greenhouse gas forcings predict a characteristic pattern of tropospheric warming, with amplification in the tropical mid-to-upper troposphere relative to the surface, arising from the moist adiabatic lapse rate response and enhanced convection. This "tropical hot spot" is anticipated to warm at rates roughly 1.5 to 2 times the surface trend, a fingerprint expected to distinguish greenhouse forcing from natural variability or solar influences. Satellite observations from microwave sounding units (MSU) and advanced MSU (AMSU) instruments, operational since late 1978, provide global lower tropospheric (LT) and mid-tropospheric (MT) temperature records via datasets such as those from the University of Alabama in Huntsville (UAH) and Remote Sensing Systems (RSS). UAH version 6 reports a global LT warming trend of +0.14°C per decade from 1979 to 2023, compared to surface records showing +0.18°C per decade, with even smaller amplification in the tropics where models predict the strongest signal. RSS trends are higher, at +0.21°C per decade globally for LT, but tropical upper tropospheric (TMT) trends remain below model projections, averaging about half the simulated amplification since 1979. Radiosonde (weather balloon) datasets, including homogenized records like RAOBCORE and RICH, corroborate reduced upper-tropospheric warming in the tropics, with some analyses showing lower-tropospheric trends exceeding upper levels in recent decades, contrary to model expectations. These discrepancies have fueled debate, with multimodel ensembles from CMIP5 and CMIP6 exhibiting approximately twice the observed tropical TMT warming over the satellite era. Explanations include internal variability modulating short-term trends, potential satellite biases from orbital decay or instrument calibration (though UAH applies corrections), and modes of variability like ENSO influencing observations more than models. Some studies attribute much of the gap to natural variability alone, while others highlight persistent overestimation in models even after screening for equilibrium climate sensitivity. Critics, including analyses of raw radiosonde data, contend the muted hot spot challenges model representations of tropical convection and water vapor feedbacks, potentially implying overstated climate sensitivity. Proponents argue reanalyses and post-2000 data show emerging alignment, though raw observational records lag predictions.
Dataset | Global LT Trend (°C/decade, 1979–2022) | Tropical TMT Amplification Factor vs. Surface
UAH v6 | +0.14 | ~0.8–1.0 (reduced vs. models)
RSS v4 | +0.21 | ~1.2 (still below models)
CMIP6 Models (avg.) | +0.25–0.30 | ~1.5–2.0
The conflict persists as a methodological challenge, with ongoing refinements to datasets (e.g., UAH v6 adjustments for diurnal drift) narrowing but not eliminating gaps, underscoring uncertainties in validating model-derived fingerprints against empirical records.
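The amplification factors in the table are simply ratios of fitted linear trends. A minimal sketch follows, using synthetic placeholder series rather than UAH, RSS, or CMIP output.

    # Sketch of how the tabulated amplification factors are formed: fit linear
    # trends to tropical-mean surface and tropospheric series and take the ratio.
    # The series below are synthetic placeholders, not actual observations.
    import numpy as np

    def trend(years, series):
        return np.polyfit(years, series, deg=1)[0]        # degrees C per year

    years = np.arange(1979, 2023)
    surface = 0.015 * (years - 1979)                      # 0.15 C/decade, illustrative
    troposphere = 0.018 * (years - 1979)                  # 0.18 C/decade, illustrative

    amplification = trend(years, troposphere) / trend(years, surface)
    print(round(amplification, 2))                        # 1.2 for these placeholder numbers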

Historical Reconstructions: Proxy Reliability and Uncertainties

Paleoclimate reconstructions rely on proxy data such as tree rings, ice cores, and lake or marine sediments to infer past temperatures and atmospheric composition prior to instrumental records. These proxies provide indirect measurements, requiring calibration against modern observations, which introduces uncertainties from non-stationary relationships between proxy signals and climate variables. Statistical analyses indicate that uncertainties in proxy-based reconstructions are predominantly driven by scatter around regressions, often leading to underestimation of error margins and potential conflicts between proxy types when variability exceeds these bounds. Tree-ring width and density serve as key proxies for summer temperatures in dendroclimatology, particularly in northern high latitudes, but exhibit the "divergence problem," where post-1960 growth trends fail to capture observed warming, showing stagnation or decline despite rising temperatures. This discrepancy arises from factors including non-stationary growth responses, methodological choices in detrending, and influences like diurnal temperature shifts or drought stress, challenging the assumption of consistent proxy-climate linkages over time. Selective tree sampling can mitigate but not eliminate divergence in some northern regions, highlighting site-specific limitations and the need for cautious extrapolation. Ice core proxies, utilizing oxygen isotope ratios (δ¹⁸O) and deuterium for temperature and trapped gases for CO₂ levels, face uncertainties from spatial variability in isotope-temperature scaling, potential seasonal biases in snowfall accumulation, and paleo-elevation changes affecting baseline conditions. Reconstructing site elevations from temperature proxies yields errors on the order of hundreds of meters, reducing signal-to-noise ratios and complicating temperature histories. Advanced isotope and temperature models improve estimates but propagate uncertainties from firn and gas diffusion processes, with total errors varying by core but often exceeding 1–2°C for millennial-scale changes. Sedimentary proxies, including pollen, diatoms, and geochemical indicators from lake or ocean cores, reconstruct regional precipitation and temperature but suffer from chronological errors exceeding 400 years in over half of Holocene shifts, local environmental confounders, and ambiguous biological responses to multiple forcings. Interpretation demands disentangling site-specific effects like catchment hydrology from broader climate signals, limiting global syntheses and introducing biases in forward modeling of proxy responses. These issues underscore broader challenges in proxy networks, as seen in critiques of principal component analyses in millennial reconstructions, where methodological artifacts can amplify low-frequency trends, as identified in analyses of bristlecone pine dominance and centering procedures.
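The calibration step described above amounts to a regression against the instrumental overlap, inverted for the pre-instrumental period, with the regression scatter carried forward as a (minimum) uncertainty. A toy sketch follows; real multiproxy reconstructions use many records, more careful error models, and validation periods, so this only illustrates where the scatter-driven uncertainty enters.

    # Toy proxy calibration: fit proxy = a*T + b over the instrumental overlap,
    # invert the relation for pre-instrumental proxy values, and attach the
    # residual scatter as a minimal 1-sigma uncertainty. All values are synthetic.
    import numpy as np

    rng = np.random.default_rng(2)
    temps_overlap = rng.normal(0.0, 0.5, 80)                      # instrumental temperatures
    proxy_overlap = 1.5 * temps_overlap + rng.normal(0.0, 0.4, 80)

    a, b = np.polyfit(temps_overlap, proxy_overlap, deg=1)        # calibration coefficients
    residual_sd = np.std(proxy_overlap - (a * temps_overlap + b))

    proxy_past = np.array([0.2, -0.5, -1.0])                      # pre-instrumental proxy values
    reconstruction = (proxy_past - b) / a
    uncertainty = residual_sd / abs(a)                            # scatter mapped into temperature units

    print(np.round(reconstruction, 2), round(uncertainty, 2))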

Discredited or Marginalized Ideas and Resolutions

Early Global Cooling Alarmism (1970s)

In the 1970s, amid observed mid-20th-century temperature declines and concerns over industrial aerosols, a narrative of impending global cooling gained prominence in popular media, though it represented a minority view within the scientific community. Global surface temperatures had cooled by approximately 0.2°C from the 1940s to the early 1970s, attributed by some to sulfate aerosols reflecting sunlight and natural oscillations like the Atlantic Multidecadal Oscillation. Scientists such as Reid Bryson of the University of Wisconsin emphasized the role of atmospheric dust and pollution in exacerbating this trend, predicting potential agricultural disruptions and a shift toward cooler climates. Media outlets amplified these concerns, often framing them as an emerging consensus despite limited support in peer-reviewed literature. The April 28, 1975, Newsweek article "The Cooling World" highlighted climatologists' warnings of a reversion to cooler conditions, forecasting severe impacts on global food production within a decade due to shortened growing seasons and erratic weather. Similar alarmism appeared in Time magazine covers and books like Lowell Ponte's 1976 "The Cooling," which speculated on a new ice age driven by a human-induced veil of particulates blocking solar radiation. Proponents, including early work by J. Murray Mitchell, cited paleoclimatic analogies to past cold periods, suggesting glacial advances could begin as early as the late 20th century if trends persisted. A comprehensive review of 71 peer-reviewed climate papers published between 1965 and 1979 revealed that only 7 explicitly forecasted cooling, while 44 predicted warming from rising CO2 levels and 20 remained neutral, underscoring the absence of a consensus on cooling. Figures like Stephen Schneider initially explored cooling scenarios but soon pivoted to greenhouse warming as aerosol regulations reduced reflective particles and CO2 measurements confirmed upward trends. The alarmism waned by the early 1980s as temperatures reversed course, rising consistently thereafter and validating CO2's dominant forcing over transient effects. This episode highlighted media tendencies toward sensationalism, prioritizing dramatic narratives over balanced scientific assessment, a pattern echoed in later discourse.

Overreliance on High-Sensitivity Models and Adjustments

Climate models have historically incorporated assumptions of high equilibrium climate sensitivity (ECS), defined as the long-term global surface temperature response to a doubling of atmospheric CO2 concentration, with the influential 1979 Charney report establishing a range of 1.5–4.5°C that has persisted in subsequent IPCC assessments despite accumulating evidence for lower values from instrumental records. Observational estimates using energy balance methods, which integrate historical radiative forcings, temperature changes, and ocean heat uptake, have yielded narrower and lower ranges; for example, Lewis and Curry (2018) derived a median ECS of 1.76°C (5–95% range: 1.05–2.7°C) based on updated post-2000 data, significantly below the IPCC's "likely" range centered around 3°C. This discrepancy arises partly because coupled climate models, such as those in the CMIP6 ensemble released around 2019–2020, include a subset of "hot models" with ECS exceeding 4.5°C, which simulate excessive historical warming unless offset by strong negative aerosol forcings, leading to projections that overestimate 20th-century trends when tuned to observations. The reliance on high-sensitivity models for policy-relevant projections has been critiqued for amplifying projected warming, as these models have systematically run "hot" relative to observations; for instance, CMIP5 models projected 16% faster global surface warming than observed since 1970, with the divergence attributable in part to overstated sensitivity rather than forcing errors. During the 1998–2014 "hiatus" period, models collectively predicted 2.2 times more warming than occurred, highlighting structural biases in representing feedbacks and internal variability that favor higher ECS. IPCC AR6 (2021) acknowledged this by downweighting hot CMIP6 models in assessed projections, yet retained the broad 2.5–4°C "likely" ECS range, prioritizing model ensembles over constrained observational inferences, which some analyses argue underweights evidence from historical energy budgets suggesting ECS below 2.5°C. Parallel controversies surround adjustments to historical temperature records, where homogenization algorithms correct for non-climatic artifacts like station relocations or instrument changes, but frequently result in amplified warming trends. In datasets such as NOAA's Global Historical Climatology Network, pairwise homogenization since the 1990s has systematically cooled pre-1950 temperatures while warming recent decades, increasing the centennial trend by up to 0.1–0.3°C per decade in regional analyses; for example, Canadian homogenized records show a stronger annual mean warming rate than raw data. Critics contend these adjustments, often automated and assuming breaks induce cooling biases without independent verification, align adjusted series more closely with high-sensitivity model outputs, potentially masking lower observed sensitivity; independent rural-only station analyses, less prone to urban heat island effects, exhibit reduced trends compared to homogenized urban-inclusive records. Such methodological choices have sustained narratives of rapid anthropogenic warming, even as unadjusted or satellite-derived tropospheric data reveal slower rates inconsistent with high-ECS expectations. This dual reliance on tuned models and adjusted observations has, in historical context, deferred reconciliation with empirical constraints favoring modest sensitivity. 
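A numeric sketch of the observational energy-budget estimate discussed in this subsection (and given in equation form earlier) is below; the inputs are round illustrative numbers of the right order of magnitude, not the exact values used by Lewis and Curry, Otto et al., or the IPCC.

    # Numeric sketch of the observational energy-budget estimate:
    # ECS ~ F2x * dT / (dF - dQ). Inputs are illustrative round numbers only.
    F2X = 3.7      # W/m2 forcing per CO2 doubling
    dT = 0.8       # observed warming between base and final periods (C)
    dF = 2.3       # change in total (anthropogenic plus natural) forcing (W/m2)
    dQ = 0.6       # change in planetary, mostly ocean, heat uptake (W/m2)

    ecs = F2X * dT / (dF - dQ)
    print(round(ecs, 2))   # ~1.7 C here; a more negative aerosol forcing (smaller dF) raises the estimate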
The iris hypothesis, proposed by atmospheric physicist Richard Lindzen and colleagues in 2001, posits a mechanism in the tropics where warming sea surface temperatures reduce the areal coverage of high-altitude cirrus clouds, thereby increasing outgoing longwave radiation (OLR) to space and limiting further warming. Observations from that period supported an approximate 2 W/m² increase in OLR per degree of surface warming, implying a climate sensitivity potentially as low as 0.5°C per CO2 doubling, far below the 3°C or higher found in many general circulation models (GCMs). This challenged dominant model-based attributions by suggesting cloud feedbacks could substantially dampen anthropogenic forcing effects, but subsequent analyses, such as a 2002 study using satellite data, argued that cirrus clouds exhibit higher shortwave albedos than assumed, potentially offsetting the longwave benefits and yielding a neutral or even positive cloud feedback instead. A 2021 review of anvil cirrus research affirmed ongoing debate, with some observational datasets indicating persistent iris-like signals despite model simulations often failing to replicate the iris response. Antarctic temperature trends have shown notable divergences from GCM predictions, which anticipated continent-wide warming of 0.2–0.5°C per decade since the mid-20th century under rising greenhouse gas concentrations. Instrumental records from 1957–2006, however, revealed minimal net warming over East Antarctica (the larger portion of the continent), with some stations recording cooling trends of up to -0.1°C per decade, while the Antarctic Peninsula and West Antarctica experienced more pronounced warming linked to ocean influences. Reanalysis datasets confirm regional discrepancies, such as stable or declining temperatures in interior highlands amid global mean surface warming, attributing part of the shortfall to underestimated ozone effects or stratospheric circulation influences rather than solely CO2 forcing. Recent assessments through 2025 highlight that multi-model ensembles continue to overestimate warming by factors of 1.5–2 compared to adjusted observations, raising questions about the fidelity of polar processes in simulations and potential overreliance on tropospheric processes. Comparisons with other solar system bodies have been invoked to question Earth-centric anthropogenic explanations, noting apparent warming on Mars (e.g., polar cap recession observed in the 2000s) and changes in Jupiter's bands without comparable CO2 emissions, suggesting solar variability as a common driver. However, Mars' changes align more closely with regional dust storms and albedo shifts than long-term solar irradiance trends, which measurements from satellites like SORCE indicate have been flat or slightly declining since 1978 (by ~0.1 W/m²). Venus' extreme surface temperatures (~460°C) stem from a 92-bar CO2-dominated atmosphere and minimal heat escape, differing fundamentally from Earth's much thinner atmosphere, shaped by planetary distance, rotation, and water vapor cycles, rendering direct analogies limited. These interplanetary disparities underscore that while greenhouse gases exert forcing across bodies, local factors like orbital parameters and atmospheric thickness modulate outcomes, with Earth's observed OLR adjustments not mirrored elsewhere, thus not negating but contextualizing CO2's role amid unresolved solar and internal variability debates.

Assessment of Past Emission and Temperature Projections

Pre-2020 Forecasts for CO2 Levels and Warming

The 1979 Charney Report, commissioned by the U.S. National Academy of Sciences, estimated the equilibrium climate sensitivity to a doubling of atmospheric CO2 at 3°C, with a likely range of 1.5–4.5°C, based on radiative forcing calculations and model simulations incorporating water vapor and ice-albedo feedbacks. This sensitivity value underpinned subsequent transient warming projections by linking CO2 increases to temperature response over decades. The report did not project specific CO2 trajectories but assumed continued fossil fuel use would eventually double pre-industrial levels (from ~280 ppm to 560 ppm), implying multi-degree warming if unchecked. In 1988, NASA scientist James Hansen presented congressional testimony and a model-based forecast projecting distinguishable anthropogenic warming by the 1990s under moderate emissions (Scenario B, assuming phased-out chlorofluorocarbons but continued CO2 growth). This scenario anticipated approximately 0.7°C of global temperature rise from 1988 levels by 2019, driven by CO2 reaching around 400–420 ppm amid rising emissions from fossil fuels. Hansen's higher-emission Scenario A forecasted faster warming, up to 1.0°C or more by the early 2000s, with CO2 concentrations exceeding 430 ppm, while Scenario C (strong mitigation) projected minimal additional rise. These projections relied on a three-dimensional general circulation model emphasizing tropospheric adjustments and ocean heat uptake. The IPCC's 1990 First Assessment Report outlined business-as-usual (BAU) emissions leading to atmospheric CO2 concentrations of approximately 410 ppm by the early 2000s, escalating to 600–700 ppm by 2100 under continued high fossil fuel dependence. For warming, it projected a global mean surface temperature increase of 0.3°C per decade (range 0.2–0.5°C) over the subsequent century, implying 1.8–5.4°C total rise by 2100 relative to 1990, contingent on a climate sensitivity near 3°C and assumptions about offsetting forcings. These estimates drew from multi-model ensembles and simple climate models tuned to observed trends. Subsequent IPCC reports refined scenarios while maintaining similar sensitivity assumptions. The 1992 IS92 series (used in the 1995 Second Assessment) projected CO2 levels of 370–450 ppm by 2020 across variants, with IS92a (middle-of-the-road) anticipating ~380 ppm and associated decadal warming of 0.2–0.3°C. The 2000 Special Report on Emissions Scenarios (SRES) spanned families like A1 (high growth, CO2 to 600–1,000 ppm by 2100) and B1 (sustainable development, ~550 ppm), informing transient warming projections of 1.4–5.8°C by century's end in the Third Assessment Report. By the 2013–2014 Fifth Assessment Report, Representative Concentration Pathways (RCPs) forecasted CO2 at 421–450 ppm by 2040 under RCP4.5–8.5 (moderate-to-high emissions), with near-term warming rates of 0.15–0.25°C per decade through 2030, stabilizing sensitivity estimates at 1.5–4.5°C for doubled CO2. These pathways relied on integrated assessment models for emissions-to-concentration translation, emphasizing uncertainty in land sinks and economic drivers.
Report/Study | Key CO2 projection (near-term, ~2020) | Warming projection (decadal rate or total)
Charney (1979) | Not specified; assumed eventual doubling to 560 ppm | ECS of 3°C (1.5–4.5°C range) for doubled CO2
Hansen (1988, Scenario B) | ~400–420 ppm | ~0.24°C/decade initially; ~0.7°C total by 2019
IPCC FAR (1990, BAU) | ~410 ppm by the early 2000s | 0.3°C/decade (0.2–0.5°C range)
IPCC IS92a (1995) | ~380 ppm | 0.2–0.3°C/decade
IPCC SRES A1B (2000) | ~400–430 ppm | ~0.2°C/decade (mid-range scenario to 2100)
IPCC RCP6.0 (AR5, 2014) | ~430 ppm by 2040 | 0.15–0.25°C/decade near-term
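The decadal rates and end-point totals in the table can be cross-checked with simple arithmetic; the sketch below uses only the figures quoted above and introduces no new data:

```python
# Arithmetic cross-check of the projections in the table above.
# Converts decadal warming rates into totals over an interval and back.

def total_rise(rate_per_decade: float, start_year: int, end_year: int) -> float:
    """Total warming implied by a constant decadal rate over [start_year, end_year]."""
    return rate_per_decade * (end_year - start_year) / 10.0

# Hansen (1988) Scenario B: ~0.7 degC between 1988 and 2019 works out to
# roughly 0.23 degC per decade, close to the ~0.24 figure quoted above.
print(f"Hansen B implied rate: {0.7 * 10 / (2019 - 1988):.2f} degC/decade")

# IPCC FAR (1990) best-estimate rate of 0.3 degC/decade sustained to 2100
# implies roughly 3.3 degC of additional warming relative to 1990.
print(f"FAR BAU implied total by 2100: {total_rise(0.3, 1990, 2100):.1f} degC")
```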

Realized Outcomes Versus Scenarios Up to 2025

Observed global surface air temperatures from 1990 to 2025 increased by approximately 0.7°C, falling short of the roughly 1°C rise projected by the IPCC's 1990 First Assessment Report under its business-as-usual scenario, which assumed continued high emissions without policy intervention. This discrepancy arises despite atmospheric CO2 concentrations rising from about 355 ppm in 1990 to roughly 423 ppm by 2025, a trajectory aligning more closely with moderate emission pathways (e.g., RCP4.5 or SSP2) than with the highest-end scenarios such as RCP8.5, which projected steeper near-term increases but were calibrated for longer-term extremes.

Climate model ensembles, such as those in CMIP5 and CMIP6, have generally projected warming rates exceeding observations over the 1970–2025 period, with CMIP6's median rate at 0.221°C per decade compared to an observed rate of about 0.157–0.18°C per decade in datasets such as HadCRUT5 and GISTEMP. This overestimation, noted in peer-reviewed evaluations, suggests potential biases in model sensitivity to forcings or an underestimation of internal variability, though recent El Niño-driven spikes in 2023–2025 brought annual anomalies temporarily into alignment with higher model projections (e.g., 2024 at ~1.28°C above the 1951–1980 baseline).

For regional and extreme outcomes, realized patterns up to 2025 have diverged from some scenario expectations: Arctic amplification occurred but at rates below CMIP6 means, mid-latitude heatwaves intensified consistent with moderate projections, yet tropical cyclone frequency did not surge as anticipated in high-emission models. Sea-level rise averaged 3.4–4 mm/year over 1993–2025, matching lower-end IPCC projections rather than accelerated scenarios tied to rapid ice melt. These outcomes highlight that while forcings such as CO2 have tracked moderate paths, realized warming implies equilibrium climate sensitivity values toward the lower half of assessed ranges (2–3°C per CO2 doubling), challenging higher-sensitivity model assumptions.
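Claims that observed warming implies a sensitivity in the lower half of the assessed range typically rest on a simple energy-budget calculation, ECS ≈ F_2x · ΔT / (ΔF − ΔN). A minimal sketch with illustrative input values (assumptions chosen here for demonstration, not drawn from any single published dataset):

```python
# Minimal energy-budget estimate of equilibrium climate sensitivity (ECS):
#   ECS ~ F_2x * dT / (dF - dN)
# dT: observed warming; dF: change in effective radiative forcing;
# dN: change in top-of-atmosphere energy imbalance (ocean heat uptake).
# Input values below are illustrative assumptions, not a specific dataset.

F_2X = 3.7   # forcing from doubled CO2, W/m^2
dT = 1.2     # assumed warming since the late 19th century, degC
dF = 2.7     # assumed change in effective radiative forcing, W/m^2
dN = 0.8     # assumed change in Earth's energy imbalance, W/m^2

ecs_estimate = F_2X * dT / (dF - dN)
print(f"energy-budget ECS estimate: {ecs_estimate:.1f} degC per CO2 doubling")
```

Estimates of this kind are sensitive mainly to the assumed aerosol forcing and ocean heat uptake, which is one reason they tend to sit below many model-derived values.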

Lessons from Divergent Predictions on Extremes and Regional Changes

Climate models have historically produced a range of predictions for the frequency and intensity of extreme weather events, such as heatwaves, heavy precipitation, droughts, and tropical cyclones, often diverging because of uncertainties in parameterizations of clouds, convection, and ocean-atmosphere interactions. Early projections, including those from the 1990 IPCC assessment and subsequent CMIP ensembles, anticipated increases in the occurrence of intense tropical cyclones and agricultural droughts in subtropical regions under rising greenhouse gas concentrations. However, global observational records through 2025 reveal no statistically significant upward trend in tropical cyclone frequency or major hurricane intensity, with ambiguities in historical data attributed to sparse early observations and dominant multidecadal variability such as the Atlantic Multidecadal Oscillation. Similarly, while models projected expanding drought areas in parts of the Mediterranean and southwestern North America, empirical trends show mixed outcomes, with some regions experiencing reduced drought frequency amid increased overall precipitation variability.

Regional climate projections have exhibited even greater divergence from observations, particularly for precipitation patterns and regional contrasts, owing to inadequate resolution of mesoscale features and teleconnections. For example, CMIP5 and CMIP6 models often fail to replicate observed enhancements in Sahel rainfall since the 1980s or the stalled warming in parts of the tropical Pacific, instead overemphasizing uniform poleward shifts in storm tracks. These mismatches extend to extremes such as flooding, where global flood records display no coherent increase despite model forecasts of wetter conditions in a warmer atmosphere, largely because short-term records are overwhelmed by natural oscillations such as the El Niño-Southern Oscillation. In flood-prone basins, observed trends remain within model uncertainty bounds but highlight systematic biases, such as the underestimation of certain feedbacks.

Key lessons from these divergences include the outsized influence of internal climate variability on decadal-scale regional trends and extremes, which can produce apparent model "failures" even when global means align with projections. This underscores the need for large-ensemble simulations to better quantify internal variability, and for hindcasting exercises that validate models against paleoclimate proxies alongside instrumental data rather than relying solely on forward projections. Advances in high-resolution regional modeling have addressed some limitations in simulating convective extremes, but persistent discrepancies emphasize prioritizing empirical detection-attribution frameworks over ensemble means for policy guidance. Overall, these experiences have fostered greater emphasis on, and acknowledgment of, irreducible uncertainties at subcontinental scales, tempering deterministic claims about the anthropogenic attribution of specific events.
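The role of internal variability in masking regional trends, noted above, can be illustrated with a small synthetic Monte Carlo exercise: impose a modest linear trend on ENSO-like interannual noise and check how often an ordinary least-squares fit over a few decades flags it as significant. All numbers below are illustrative assumptions, not observational data.

```python
# Synthetic illustration: a modest trend buried in large interannual
# variability is hard to detect over a short record.
import numpy as np

rng = np.random.default_rng(0)
trend = 0.01        # assumed underlying trend, units per year
noise_sd = 0.25     # assumed ENSO-like year-to-year variability (std. dev.)
years = 30          # length of the observational record
n_trials = 10_000

t = np.arange(years)
detected = 0
for _ in range(n_trials):
    series = trend * t + rng.normal(0.0, noise_sd, size=years)
    slope, intercept = np.polyfit(t, series, 1)
    resid = series - (slope * t + intercept)
    # standard error of the fitted slope
    se = np.sqrt(resid @ resid / (years - 2) / np.sum((t - t.mean()) ** 2))
    if abs(slope) > 2 * se:   # crude ~95% significance criterion
        detected += 1

# The trend exists by construction, yet it is flagged as significant
# in only a fraction of the simulated records.
print(f"trend detected in {detected / n_trials:.0%} of {years}-year records")
```

With these illustrative settings the trend is recovered far from reliably; longer records or many realizations restore detectability, which is one motivation for the large-ensemble approach mentioned above.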
