Timeline: The History Of Climate Modelling

Carbon Brief - Leo Hickman

The climate models used by scientists today rely on some of the world’s most advanced supercomputers. It can take dozens of highly skilled people to build and then operate a modern-day climate model.
However, less than a century ago, climate models were little more than an idea; basic equations roughly sketched out on paper. After the second world war, though, the pace of development quickened dramatically, particularly in the US.
By the late 1960s, policymakers were being presented with the models’ findings, which strongly reinforced the theory that the continued rise in human-caused greenhouse gas emissions would alter the global climate in profound ways.
Carbon Brief charts more than 50 key moments in the history of climate modelling. Such moments include:
  • Guy Callendar’s seminal paper published in 1938.
  • The first computerised, regional weather forecast in 1950.
  • Norman Phillips’ first general circulation model in 1956.
  • The establishment of a modelling group at the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, in 1964.
  • Syukuro Manabe and Richard Wetherald’s seminal climate modelling study in 1967.
  • The Met Office’s first general circulation model in 1972.
  • The Charney Report in 1979.
  • James Hansen’s three scenarios published in 1988.
  • The first Intergovernmental Panel on Climate Change (IPCC) report published in 1990.
  • The Coupled Model Intercomparison Project (CMIP) launched in 1995.
  • The IPCC’s fifth assessment report published in 2013.

Introduction to Geophysical Fluid Dynamics: Physical and Numerical Aspects. Credit: Archive.org
Weather Prediction by Numerical Process
The story of climate modelling using numerical methods begins with Lewis Fry Richardson, an English mathematician and meteorologist, who publishes a book entitled "Weather Prediction by Numerical Process". The book describes his idea for a new way to forecast the weather: viewing the atmosphere as a network of gridded cells and solving differential equations for each cell. But when he applies his own method, it takes him six weeks of calculations by hand just to produce an eight-hour forecast. He imagines a stadium full of "computers" (64,000 human calculators) all working together to speed up the process. But without machines to automate the calculations, his approach proves impractical.
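Richardson's core idea – represent the atmosphere on a grid and step the governing differential equations forward in small time increments – is still the heart of numerical forecasting. The toy sketch below illustrates the method with a one-dimensional advection equation and an upwind finite-difference scheme; it is a minimal illustration of the numerical technique, not Richardson's actual primitive equations.

```python
# Toy finite-difference sketch: advect a "weather feature" along a 1D
# grid with du/dt = -c * du/dx, discretised with an upwind scheme.
c, dx, dt = 1.0, 1.0, 0.5        # wave speed, grid spacing, time step
u = [0.0] * 20                   # the gridded field
u[5] = 1.0                       # initial bump at cell 5

for _ in range(10):              # ten time steps
    # new u[i] = u[i] - (c*dt/dx) * (u[i] - u[i-1]); Python's negative
    # indexing makes the left boundary periodic, which is harmless here
    u = [u[i] - c * dt / dx * (u[i] - u[i - 1]) for i in range(len(u))]

peak = max(range(len(u)), key=lambda i: u[i])
print(peak)   # the bump has drifted downwind from cell 5 toward cell 10
```

The scheme also smears the bump out as it moves – a taste of the numerical diffusion and stability problems that occupied modellers such as Phillips and Arakawa in the decades that followed.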
Richardson builds upon the earlier ideas of the Norwegian meteorologist, Vilhelm Bjerknes, who had argued at the turn of the 20th century that atmospheric changes could be calculated from a set of seven “primitive equations”.
Before them both, in 1895, the Swedish physical chemist Svante Arrhenius had described an energy budget model that considered the radiative effects of carbon dioxide in a paper presented to the Royal Swedish Academy of Sciences.

Guy Stewart Callendar in 1934. Credit: University of East Anglia
Guy Callendar
Building on the earlier work of scientists, such as John Tyndall and Svante Arrhenius, an English steam engineer and amateur meteorologist called Guy Callendar uses a 1D radiative transfer model to show that rising CO2 levels are warming the atmosphere. His results are published in the Quarterly Journal of the Royal Meteorological Society in a paper entitled, “The artificial production of carbon dioxide and its influence on temperature”.
Writing an appreciation of Callendar’s “classic” paper 75 years later in 2013, climate scientist Ed Hawkins writes: “What is most remarkable is that he did all the calculations by hand, without the aid of a computer…Callendar had managed to measure the temperature change of the planet, from his desk, to within the modern estimates of the uncertainties that exist.”
However, Callendar’s work is largely ignored until the 1960s.

John von Neumann. Credit: Wikimedia Commons

John von Neumann

John von Neumann, a Princeton mathematician who worked on the Manhattan Project during the second world war, proposes that new computers, such as the ENIAC at the University of Pennsylvania, be used to forecast weather. He attracts grants from the US military to develop his idea at Princeton. He says that if regional weather could be forecast then the whole of the atmosphere could one day be simulated.

ENIAC programmers, via computerhistory.org
Electronic Numerical Integrator and Computer (ENIAC)
The new meteorology group formed at Princeton by von Neumann is headed by Jule G Charney, who later becomes a key figure in climate science.
The group uses ENIAC to run the first computerised, regional weather forecast in 1950.
The 2D model divides the atmosphere into grid cells in the way Richardson had proposed.
But it still takes about 24 hours of computing to produce a 24-hour forecast – with mixed accuracy.

Offices of the US weather bureau. Credit: Paul Fearn/Alamy Stock Photo
July 1, 1954
Joint Numerical Weather Prediction Unit
As Charney’s results begin to improve, the US Weather Bureau and military decide to create the Joint Numerical Weather Prediction Unit (JNWPU).
It is based in Washington DC and is tasked with developing operational forecasts.
By May of 1955, the unit is producing real-time forecasts in advance of the weather using an IBM 701 computer, but the accuracy is inconsistent.
By 1958, with advances in computing speeds, the unit is producing forecasts looking out several days.

December 1954
Credit: Liftarn/Wikimedia Commons.
A Swedish-Norwegian collaboration beats the JNWPU team by a few months to deliver the world’s first real-time numerical weather forecast. The team involves scientists based at the Institute of Meteorology at the University of Stockholm, the Royal Swedish Air Force Weather Service and the University of Oslo. They work under the guidance of Carl-Gustaf Rossby, arguably the most celebrated meteorologist of the era, who had also been working with the Princeton team. The forecasts, which focus on the North Atlantic, are performed using a Swedish computer called BESK (the Swedish acronym for "Binary Electronic Sequence Calculator").

General Circulation Research Section
Joseph Smagorinsky. NOAA
Following pressure from John von Neumann, the US Weather Bureau creates a unit called the General Circulation Research Section. It is based in Maryland and led by Joseph Smagorinsky, who has worked under both von Neumann and Charney. The goal is to create a 3D general circulation model (GCM) of the global atmosphere based on “primitive equations”. (Von Neumann is inspired by the early findings of Norman Phillips in his Princeton team, who a year later formally publishes a paper describing the first GCM - see the entry for April 1956.) This seminal research unit, which is the first with a permanent programme developing GCMs, is later renamed the General Circulation Research Laboratory in 1959 and then renamed again as the Geophysical Fluid Dynamics Laboratory (GFDL) in 1963.

Mikhail Budyko. Credit World Meteorological Organization

Mikhail Budyko
A Russian climatologist called Mikhail Budyko, who is the director of the Geophysical Observatory in Leningrad, publishes a book called (in English), “The Heat Balance of the Earth’s Surface”. Two years later, it is translated and published by the US Weather Bureau.
The seminal book influences climate scientists for many years with its method for calculating the various components of the heat balance of the entire Earth system. Using a simple energy-balance model, he calculates the Earth’s average global temperature by balancing incoming solar energy with outgoing thermal energy.
Budyko calls for “an accumulation of data on direct balance observations” from weather stations. Once this happens, he says, then it could “open up new vistas” for the “development of the theory of climate and general circulation of the atmosphere”.
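Budyko’s heat-balance approach survives today in the “zero-dimensional” energy-balance models used in teaching. A minimal sketch (using modern textbook constants, not Budyko’s own figures) balances absorbed sunlight against outgoing blackbody radiation:

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2 (a modern measured value)
ALBEDO = 0.3       # planetary albedo: fraction of sunlight reflected

# Balance absorbed solar energy against emitted thermal energy:
#   S0 * (1 - albedo) / 4 = sigma * T^4
absorbed = S0 * (1 - ALBEDO) / 4.0     # averaged over the whole sphere
T_effective = (absorbed / SIGMA) ** 0.25

print(round(T_effective, 1))  # ~255 K - about 33 K colder than the
                              # observed ~288 K surface mean; the gap
                              # is the greenhouse effect
```

The factor of four arises because the Earth intercepts sunlight over a disc but radiates from the full sphere – exactly the kind of whole-Earth bookkeeping Budyko’s book systematised.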

Syukuro Manabe. Credit: AIP Emilio Segrè Visual Archives

Syukuro Manabe
Smagorinsky invites Syukuro Manabe from the University of Tokyo to join his lab at the US Weather Bureau. It proves to be a key moment, with Manabe’s work at GFDL now seen as a vital chapter in the history of climate modelling. Smagorinsky asks Manabe to oversee the coding and development of atmospheric GCMs at the lab. They work together to gradually add complexity to the models, such as the evaporation of rainfall and the exchange of heat across ocean, land and ice.

April 1956
First general circulation model
Quarterly Journal of the Royal Meteorological Society
Norman Phillips, a member of the team at Princeton working under John von Neumann, publishes a paper entitled, “The general circulation of the atmosphere: A numerical experiment”, in the Quarterly Journal of the Royal Meteorological Society. His numerical experiment, which realistically depicts seasonal patterns in the troposphere, is later hailed as the first “general circulation model” (GCM) of the atmosphere. As a theoretical meteorologist, he is less interested in weather forecasts, more in what drives the circulation of the atmosphere and whether this can be modelled. He does this using a computer with just 5K of memory and a further 10K on a separate disc. Phillips works with von Neumann, Charney and Smagorinsky over this period.

Journal of Geophysical Research. 1 July 1963

July 1963
Fritz Möller
Fritz Möller, a University of Munich meteorologist who was visiting Manabe at GFDL at the time, publishes a paper entitled, "On the Influence of Changes in the CO2 Concentration in Air on the Radiation Balance of the Earth's Surface and on the Climate”, in the Journal of Geophysical Research. It examines the feedback effect of clouds on atmospheric temperature and, in doing so, seeks to disprove Gilbert Plass’s influential 1953 paper on the warming influence of human-caused CO2 emissions. Möller concludes that the “theory that climatic variations are affected by variations in the CO2 content becomes very questionable”. Möller’s paper has fundamental flaws, but it later leads Manabe to investigate how different CO2 levels affect climate models.

Akio Arakawa
Akio Arakawa. Credit: UCLA
Akio Arakawa, another graduate of the University of Tokyo, begins a two-year stint at the University of California Los Angeles (UCLA). He works with Yale Mintz, who had been inspired by Phillips’ work at the General Circulation Research Section in the mid-1950s. Together, Arakawa and Mintz work on developing a model that can stay computationally stable over a long period and not “blow up” after a few days, which was a problem with Phillips’ 1956 model. Their ongoing work becomes known as the “Mintz-Arakawa Model”, with the first iteration running by 1963. They are later helped by IBM's Large Scale Scientific Computation Department based in San Jose. Arakawa joins UCLA permanently in 1965 and publishes an important paper in 1966 in the Journal of Computational Physics, entitled “Computational Design for Long-Term Numerical Integration of the Equations of Fluid Motion”.

Akira Kasahara (left) and Warren Washington with his student, Brenda Chester (right)
Warren Washington and Akira Kasahara, yet another University of Tokyo graduate, establish a climate modelling group at the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. The Kasahara-Washington model offers finer resolution, but its main legacy is that it establishes NCAR as a leading climate modelling centre from the 1960s onwards.

Weather and Climate Modification Problems and Prospects

January 7, 1966
National Academy of Sciences report
The Committee on Atmospheric Sciences at the National Academy of Sciences (NAS) publishes a report called Weather and Climate Modification: Problems and Prospects. The report concludes the committee’s two-year investigation into the “recent advances in mathematical modelling of atmospheric processes”. Its focus has been to learn more about the “promise” this might offer for “weather and climate modification”. But in doing so, the committee (which includes Charney) concludes that the models developed by the likes of Mintz and Arakawa, and Smagorinsky and Manabe, do bear “some resemblance” to the observations. They recognise, though, that the modellers need more powerful, faster computers.

Bryan and Cox (1967), Tellus
February 1967
Kirk Bryan
An oceanographer based at GFDL called Kirk Bryan is the first to model a 3D circulation of the ocean. In conjunction with his colleague Michael Cox – and overseen by Manabe – Bryan publishes a paper in the journal Tellus entitled, “A numerical investigation of the oceanic general circulation”.

May 1, 1967
Manabe and Wetherald
Manabe and Wetherald (1967), Journal of the Atmospheric Sciences
Manabe publishes what is now regarded as the most influential climate modelling paper of all time.
Along with his co-author Richard Wetherald, Manabe produces the first credible prediction, using a 1D radiative-convective model, of what would happen to the atmosphere if CO2 levels were changed. The paper is published in the Journal of the Atmospheric Sciences and entitled, “Thermal Equilibrium of the Atmosphere with a Given Distribution of Relative Humidity”.
Manabe and Wetherald specifically want to know what will happen to the global average temperature if the radiative transfer of energy between the surface and the troposphere is altered by an increase in CO2 levels. This is to become a central focus for climate modellers in the decades ahead. In addition, they want to know what the potential feedbacks from water vapour and clouds might be, which they discover strongly influence the CO2 effect. They estimate the effect of doubling CO2 levels – a metric which later becomes known as “climate sensitivity” – and settle on a value of 2.4C.
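Manabe and Wetherald derived their figure from a full radiative-convective calculation. A much cruder back-of-envelope route to a number in the same range uses the logarithmic CO2-forcing approximation that later became standard; note that the 5.35 coefficient, the 300ppm baseline and the feedback parameter below are modern, illustrative values, not taken from their paper.

```python
import math

# Approximate radiative forcing from changing CO2 (a widely used
# modern simplification):  dF = 5.35 * ln(C / C0)  in W m^-2
def co2_forcing(c_ppm, c0_ppm=300.0):
    return 5.35 * math.log(c_ppm / c0_ppm)

# Convert forcing to equilibrium warming via a climate feedback
# parameter lam (W m^-2 K^-1); 1.6 is an illustrative choice that
# lands in the 2-3C range discussed in the text.
def equilibrium_warming(dF, lam=1.6):
    return dF / lam

dF_2x = co2_forcing(600.0)                     # doubling from 300 ppm
print(round(dF_2x, 2))                         # ~3.71 W m^-2
print(round(equilibrium_warming(dF_2x), 1))    # ~2.3 C
```

Because the forcing is logarithmic in concentration, each doubling of CO2 contributes roughly the same warming – which is why “the effect of doubling” became the standard yardstick.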

A Global Climatic Model Based on the Energy Balance of the Earth-Atmosphere System. Journal of Applied Meteorology and Climatology.

March 4, 1969
William D Sellers
Sellers, a meteorologist based at the University of Arizona’s Institute of Atmospheric Physics, publishes the results of his simple “energy-balance” climate model. He states: “The major conclusions of the analysis are that removing the Arctic ice cap would increase annual average polar temperatures by no more than 7C, that a decrease of the solar constant by 2–5% might be sufficient to initiate another ice age, and that man's increasing industrial activities may eventually lead to a global climate much warmer than today.”
His paper, “A Global Climatic Model Based on the Energy Balance of the Earth-Atmosphere System”, is published in the Journal of Applied Meteorology and Climatology.


April 14, 1969
Nimbus III
NASA's Nimbus III satellite is launched. It has the specific task of taking measurements of the Earth that will help to test and validate climate models. It weighs half a tonne, orbits the poles and carries infrared spectrometers and radiometers for measuring atmospheric temperatures and radiation profiles. However, a key measuring device fails after just three months and the satellite is finally retired in 1972.
However, the era of climate scientists relying on satellite data to test and validate their models has begun. Climate scientists will come to rely on satellites to carry instruments, such as microwave-sounding units, into orbit.

From S. Manabe and K. Bryan, J. Atmospheric Sciences 26 (1969): 786-89 , p. 786
May 6, 1969
Manabe and Bryan
Having largely worked separately up to this point, Manabe and Bryan now come together at GFDL’s new base at Princeton to produce the first “coupled” atmosphere-ocean GCM. Many elements are brought together and interact with each other for the first time, such as the atmosphere, oceans and ice. Their coupled AOGCM also includes the transfer of water from land to oceans and back.
However, the computing time needed to model the ocean elements is considerable and truly tests their Univac 1108 computer. It takes 1,100 hours (about 46 days) to do just one run of the model. They are forced to use a highly simplified view of “Earth”, where a globe is split into three sections, equal parts ocean and land, with the poles omitted. Their paper is published in the Journal of the Atmospheric Sciences and entitled, “Climate calculations with a combined ocean-atmosphere model”.

July 1970
Study of Critical Environmental Problems
Williams College, Massachusetts. Credit: Daderot / Wikimedia Commons
More than 100 scientists and various experts from 17 US universities gather for a month-long meeting at Williamstown, Massachusetts, to discuss the “global climatic and ecological effects of man's activities”. The meeting results in a report published by the Massachusetts Institute of Technology in October of that year called the Study of Critical Environmental Problems (SCEP).
The report is criticised as being too US-centric and a follow-up, three-week meeting involving participants from 14 countries is organised in Stockholm in 1971. The resulting output is called “Inadvertent Climate Modification: Report of the Study of Man’s Impact on Climate” (SMIC).
Joseph Smagorinsky attends both meetings. Suki Manabe and Mikhail Budyko attend the SMIC meeting. The climate change working group at both meetings is led by NCAR’s William Kellogg. Global climate models are presented as being “indispensable” for researching human-caused climate change.
The SCEP and SMIC reports both influence the landmark June 1972 meeting, also held in Stockholm, called the United Nations Conference on the Human Environment, which leads to the founding of the UN Environment Program. Human-caused climate change is now on the radar of politicians.

October 3, 1970

With the support of President Richard Nixon, a “wet NASA” is created. It is called the National Oceanic and Atmospheric Administration, or NOAA, and sits within the US Department of Commerce. Nixon says it should “serve a national need for better protection of life and property from natural hazards…for a better understanding of the total environment”. This reflects a burgeoning interest – and concern – at this time about the way humans are impacting the environment. NOAA becomes one of the world’s leading centres of climate change research, with GFDL at Princeton being one of its key research centres.

October 1972
Met Office climate model
George Corby. Credit: G. A. Corby/Met Office
UK Met Office scientists describe in a journal paper the workings of their first GCM, which they’ve been developing since 1963 when the Met Office first created its “Dynamical Climatology Branch”. They write in the abstract: “The model incorporates the hydrological cycle, topography, a simple scheme for the radiative exchanges and arrangements for the simulation of deep free convection (sub grid-scale) and for the representation of exchanges of momentum, sensible and latent heat with the underlying surface.”
Using the UK-based ATLAS computer (considered at the time to be the world’s fastest supercomputer), the team, led by George Corby, use their five-layer model to show in a follow-up paper in 1973 that “despite the deterioration in the upper part of the model, reasonable simulation was achieved of many tropospheric features…There were also, however, a number of things that were definitely incorrect.”
The Met Office becomes one of the world’s leading centres of climate modelling in the decades ahead.

Manabe and Wetherald, 1975. Journal of the Atmospheric Sciences

January 1, 1975

Doubling of CO2
Manabe and Wetherald publish another seminal paper in the Journal of the Atmospheric Sciences, entitled “The Effects of Doubling the CO2 Concentration on the Climate of a General Circulation Model”. They use a 3D GCM to investigate for the first time the effects of doubling atmospheric CO2 levels. The results reveal, among other things, disproportionate warming at the poles and a “significantly” increased intensity of the hydrologic cycle. It also shows a value for climate sensitivity of 2.9C – which is still, broadly, the mid-range consensus among climate scientists.

Manabe, Bryan and Spelman, 1975. Journal of Physical Oceanography

January 1, 1975
First coupled GCM with realistic topography
At the same time as his paper with Wetherald, Manabe publishes a separate paper with Kirk Bryan. It presents results from the first coupled atmosphere-ocean GCM (AOGCM) to include, as they put it, “realistic rather than idealized topography” – unlike their 1969 model. But it is still crude by today’s standards with a grid size of 500km. It takes 50 days of computing to simulate three centuries of atmospheric and oceanic interactions. “The climate that emerges from this integration includes some of the basic features of the actual climate,” they state. It correctly models areas of the world typified by extensive desert or high rainfall.
The paper is entitled, “A Global Ocean-Atmosphere Climate Model. Part I. The Atmospheric Circulation” and published by the Journal of Physical Oceanography.

Understanding Climatic Change: A Program for Action
Understanding Climatic Change - A Program for Action. National Academy of Sciences
The National Academy of Sciences publishes a 270-page report by the US Committee for the Global Atmospheric Research Program (GARP) entitled, “Understanding Climatic Change: A Program for Action”. The committee has a number of notable names, including Charney, Manabe, Mintz, Washington and Smagorinsky. (It also includes Wally Broecker, whose 1975 Science paper first coined the term “global warming”, and Richard Lindzen, who would later become one of the most prominent climate sceptics.)
GARP had been tasked in 1972 with producing a plan for how best to respond to the increasing realisation that “man’s activities may be changing the climate”. GARP “strongly recommends” that a “major new program of research”, with “appropriate international coordination”, is needed to “increase our understanding of climatic change and to lay the foundation for its prediction”. To support this, it also recommends the development of a “wide variety” of models, accepting that realistic climate models "may be considered to have begun”.

Layers of Earth's atmosphere. Credit: Stocktrek Images, Inc. / Alamy Stock Photo
February 1977
Modelling methodologies published
A journal called Methods in Computational Physics: Advances in Research and Applications publishes an influential special volume on the “General Circulation Models of the Atmosphere”. Various climate modelling groups, including those at UCLA, NCAR and the UK Met Office, submit papers setting out how their current models work. Their papers – particularly the UCLA paper by Akio Arakawa and Vivian Lamb – form the backbone of most climate models’ “computational domain” for years afterwards.

Ad Hoc Study Group on Carbon Dioxide and Climate. National Academy of Sciences, 1979
July 23, 1979 — July 27, 1979
The Charney Report
The US National Research Council convenes a five-day “ad hoc study group on carbon dioxide and climate” at Woods Hole, Massachusetts. Chaired by Jule Charney, the assembled panel of experts (which includes a retired representative from the Mobil oil company) sets about establishing a “consensus” position on the “implications of increasing carbon dioxide”. To help them do so, they compare two models – one of Manabe’s and one by James Hansen at NASA’s Goddard Institute for Space Studies. The panel notes how heat from the atmosphere could be temporarily absorbed by the oceans and they also settle on a range for climate sensitivity of 1.5-4.5C, which has stayed largely the same ever since.

World Climate Research Programme
The International Council of Scientific Unions and the World Meteorological Organization unite to sponsor the launch of the World Climate Research Programme (WCRP). Its main objective is “to determine the predictability of climate and to determine the effect of human activities on climate”. WCRP, which is based in Geneva, helps to organise observational and modelling projects at an international scale. Within a few years, its work helps to establish the understanding and prediction of El Niño and its associated impact on the global climate. It also leads to much better quality data collection of oceans which, in turn, helps climate modellers.

Community Climate Model

NCAR’s Stephen Schneider and William Kellogg talking about climate models in a 1981 TV documentary.
The Community Climate Model (CCM) is created by the US National Center for Atmospheric Research (NCAR) in Colorado. It aims to be a “freely available global atmosphere model for use by the wider climate research community”. Its establishment recognises that climate models are getting increasingly complex and now involve the specific skills and inputs of a range of people. All of the source codes for the CCM (a 3D global atmospheric model) are published, along with a “users' guide”. A range of international partners are invited to take part, including the European Centre for Medium-Range Weather Forecasts.

Projecting the climatic effects of increasing carbon dioxide. United States Department of Energy, Dec 1985
December 1985
US Department of Energy report
The US government decides that it is “important to take an account” of how much climate science has moved on since the mid-1970s. So the Department of Energy commissions four “state of the art” volumes, which are reviewed by the American Association for the Advancement of Science. One of the volumes focuses on “projections”, in particular the “current knowns, unknowns, and uncertainties”. One area of focus is the “controversy” that “arose in 1979” when a paper co-authored by MIT’s Reginald Newell concluded that climate models were overestimating climate sensitivity, putting it at around 0.25C or lower.
The volume concludes: “Although papers continue to be published indicating order-of-magnitude shortcomings of model results, when these arguments have been carefully considered, they have been found to be based on improper assumptions or incorrect interpretations. Although the models are by no means perfect, where it has generally been possible to compare large-scale results from model simulations with measurements, the agreement has been good.”

Hansen et al, Journal of Geophysical Research, 1988
August 20, 1988
Hansen’s three scenarios
James Hansen is the lead author on a paper in the Journal of Geophysical Research: Atmospheres which examines various scenarios using a NASA GISS 3D model. The authors attempt “to simulate the global climate effects of time-dependent variations of atmospheric trace gases and aerosols”. They do this by running three scenarios in the model over the 30-year period between 1958 and 1988 and then seeing how each affects the modelled global climate up to 2060. The first scenario “assumes continued exponential trace gas growth” and projects 4C of warming by 2060 above 1958 levels.
Two months earlier, Hansen presents the paper’s findings at his now-famous US Senate hearing, where he explains that human-caused global warming “is already happening now”. 1988 was, at that time, the warmest year on record.

Bert Bolin. Credit: KVA

November 1988
IPCC established
The United Nations Environment Programme (UNEP) and the World Meteorological Organization (WMO) establish the Intergovernmental Panel on Climate Change (IPCC). It becomes the leading international body for publishing periodic assessments of climate change. Its aim is to “provide the world with a clear scientific view on the current state of knowledge in climate change and its potential environmental and socio-economic impacts”. The first IPCC chair is Bert Bolin, a Swedish meteorologist, who had spent a year in 1950 working towards his doctorate at Princeton running early models on ENIAC, alongside the likes of Jule Charney and John von Neumann.

The Atmospheric Model Intercomparison Project launches
Lawrence Livermore National Laboratory. Credit: National Ignition Facility / Wikimedia Commons
Under the auspices of the World Climate Research Programme, the Atmospheric Model Intercomparison Project (AMIP) is launched to establish a protocol that can be used to undertake the “systematic validation, diagnosis, and intercomparison” of all atmospheric GCMs. To do this, all models are required to simulate the evolution of the climate during the period 1979-88, according to standardised metrics.
AMIP is centred at the Program for Climate Model Diagnosis and Intercomparison at the Lawrence Livermore National Laboratory in California, but is a genuinely international attempt to standardise the assessment of the world’s various climate models.

June 1989
Ronald Stouffer. Credit: John Mitchell
Around the same time, two modelling centres in the US start to publish results from their next-generation coupled atmosphere–ocean general circulation models (AOGCMs). At NCAR, Warren Washington and his colleague Gerald Meehl publish a paper in Climate Dynamics which uses their AOGCM to show how rising CO2 levels at three different rates affect global temperatures over the next 30 years.
Five months later, a paper in Nature by a team at the Geophysical Fluid Dynamics Laboratory at Princeton (led by Ronald Stouffer, but including Manabe and Bryan) sets out their AOGCM results, which highlight “a marked and unexpected interhemispheric asymmetry”. It concludes: “In the Northern Hemisphere of the model, the warming of surface air is faster and increases with latitude, with the exception of the northern North Atlantic.”

Margaret Thatcher opens the Met Office Hadley Centre. Credit: Met Office
May 25, 1990
Met Office Hadley Centre opens
Margaret Thatcher, the UK prime minister, formally opens the Met Office’s Hadley Centre for Climate Prediction and Research in Bracknell, Berkshire. Its strategic aims include “to understand the processes influencing climate change and to develop climate models”.
The first Hadley Centre coupled model boasts “11 atmospheric levels, 17 ocean levels and 2.5° × 3.75° resolution”. The Hadley Centre, now located in Exeter, remains to this day one of the world’s leading climate modelling centres.

August 27, 1990 — August 30, 1990
First IPCC report
First IPCC report. Credit: IPCC
At a meeting in Sweden, the IPCC formally adopts and publishes its first assessment report. Its summary of the latest climate model projections states that “under the IPCC business-as-usual emissions of greenhouse gases, the average rate of increase of global mean temperature during the next century is estimated to be about 0.3C per decade”.
The report carefully explains: “The most highly developed tool which we have to predict future climate is known as a general circulation model or GCM. These models are based on the laws of physics.” It says it has confidence in the models because “their simulation of present climate is generally realistic on large scales”.
But there’s a note of caution: “Although the models so far are of relatively coarse resolution, the large scale structures of the ocean and the atmosphere can be simulated with some skill. However, the coupling of [these] models reveals a strong sensitivity to small-scale errors which leads to a drift away from the observed climate. As yet, these errors must be removed by adjustments to the exchange of heat between ocean and atmosphere.”

September 20, 1990
Credit: Unsplash
The Journal of Geophysical Research publishes an important paper confirming that clouds are the main reason for the large differences – a “roughly threefold variation” – in the way the various models respond to changes in CO2 concentrations. Robert D Cess of Stony Brook University in New York and his co-authors (which include Wetherald, Washington and the Met Office’s John Mitchell) compare the climate feedback processes in 19 atmospheric GCMs.
They say their paper “emphasises the need for improvements in the treatment of clouds in these models if they are ultimately to be used as reliable climate predictors”. It follows a similar paper by Cess et al published a year earlier in Science. Together, the papers illustrate how climate scientists from across many countries are now working collectively to improve and refine their models. Clouds remain to this day a challenge for climate modellers.

June 15, 1991
Eruption of Mount Pinatubo
Mount Pinatubo erupts, Luzon, Philippines. Credit: David Hodges / Alamy Stock Photo
A volcanic eruption in the Philippines – the second largest of the 20th century – provides a golden opportunity for scientists to further test climate models. Later that year, James Hansen submits a paper to Geophysical Research Letters which uses a NASA GISS climate model to forecast a “dramatic but temporary break in recent global warming trends”.
Over the coming years, climate models prove that they can indeed accurately predict the impact of the aerosols thrown into the atmosphere by Mount Pinatubo and other large volcanic eruptions, with the model results closely matching the observed brief period of global cooling after the eruption.

January 1992
Climate System Modeling' published
Cambridge University Press publishes a book called "Climate System Modeling", edited by NCAR’s Kevin Trenberth. The book is the end result of a series of workshops at which scientists from a range of backgrounds – geography, physics, oceanography, meteorology, biology, public policy, etc – came together to set down the current state of knowledge about climate models. With 23 chapters written by 28 authors spread across almost 800 pages, the book proves to be a vital reference.

CMIP launched
Building on the development and success of AMIP in 1989, the World Climate Research Programme (WCRP) initiates the Coupled Model Intercomparison Project (CMIP). It aims to create a “standard experimental protocol” for studying the output of coupled atmosphere-ocean GCMs (AOGCMs) and, in particular, how they project climate change in the decades ahead.
The global community of modellers uses CMIP to perform "control runs" in which the climate forcing is held constant. In total, 18 models from 14 modelling groups are included. The groups compare how the various models respond to an idealised scenario of global warming, with atmospheric CO2 increasing at a rate of 1% per year until it doubles at about Year 70.
CMIP, which is currently on its sixth iteration, is still a bedrock of climate modelling today.
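The "about Year 70" figure in the idealised scenario is simple compound growth: at 1% per year, CO2 doubles after ln(2)/ln(1.01) ≈ 70 years. A quick check of the arithmetic in Python:

```python
import math

# At 1% growth per year, concentration after t years is C0 * 1.01**t.
# The doubling time solves 1.01**t = 2, i.e. t = ln(2) / ln(1.01).
doubling_time = math.log(2) / math.log(1.01)
print(round(doubling_time, 1))  # 69.7 -> "about Year 70"
```
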

August 10, 1995
Sulphate aerosols included in a GCM
Fig 4 from John Mitchell's paper. Credit: Mitchell et al/Nature
John Mitchell at the Met Office is the lead author of a much-cited paper in Nature which – for the first time in a GCM – tests what impact including sulphate aerosols has on the radiative forcing of the atmosphere. The authors find that their inclusion “significantly improves the agreement with observed global mean and large-scale patterns of temperature in recent decades” and leads them to a striking conclusion: “These model results suggest that global warming could accelerate as greenhouse-gas forcing begins to dominate over sulphate aerosol forcing.”
Mitchell’s paper builds on earlier work showing how rising quantities of aerosols might affect radiative forcing. For example, Science had published a much-discussed paper in 1971 by Ichtiaque Rasool and Stephen Schneider, both then at NASA GISS, entitled “Atmospheric carbon dioxide and aerosols: Effects of large increases on global climate”. It concluded: “An increase by only a factor of 4 in global aerosol background concentration may be sufficient to reduce the surface temperature by as much as 3.5K. If sustained over a period of several years, such a temperature decrease over the whole globe is believed to be sufficient to trigger an ice age.”

September 10, 1995
Draft of IPCC’s second assessment report
The New York Times reports that it has obtained a draft of the IPCC’s second report. The newspaper says the IPCC, in “an important shift of scientific judgment”, has concluded that “human activity is a likely cause of the warming of the global atmosphere”.
However, much of the newspaper’s focus is on the IPCC’s assessment of the accuracy of climate models’ projections: “The models have many imperfections, but the panel scientists say they have improved and are being used more effectively.”
The paper notes that the scientists’ confidence has been “boosted by more powerful statistical techniques used to validate the comparison between model predictions and observations”. However, it says that “despite the new consensus among panel scientists, some skeptics, like Dr Richard S Lindzen of the Massachusetts Institute of Technology, remain unconvinced”. Lindzen is quoted as saying that the IPCC's identification of the greenhouse signal "depends on the model estimate of natural variability being correct" – variability that, he argues, the models do not capture well.

Credit: US Government Printing Office, archive.org
November 16, 1995
US Congressional hearing into climate models

At the US Congress in Washington DC, the House Subcommittee on Energy and Environment holds a hearing into “climate models and projections of potential impacts of global climate change”. It is part of a wider inquiry into the “integrity and public trust of the science behind federal policies and mandates”. It is chaired by Dana Rohrabacher, a Republican congressman who goes on to become a prominent climate sceptic politician in the US.
A “balance” of expert witnesses is called to give evidence. They include the climate sceptic Pat Michaels and UK climate scientist Bob Watson, who later becomes the IPCC chair. Rohrabacher says he wants to “promote dialogue” among the witnesses due to the “controversy” over the “reliability” of climate models. He asks: “Are we so certain about the future climate changes that we should take action that will change the lives of millions of our own citizens at a cost of untold billions of dollars?”
The event sets the tone and template for many other similar hearings on the Hill in the years ahead.

July 4, 1996
Santer’s ‘fingerprint’ study
Ben Santer in the film Merchants of Doubt, 2014. Credit: Everett Collection Inc / Alamy Stock Photo
Ben Santer is the lead author of an influential paper in Nature which shows that “state-of-the-art models” match the “observed spatial patterns of temperature change in the free atmosphere from 1963 to 1987” when various combinations of changes in greenhouse gases and aerosols are added. They conclude from their modelling that “it is likely that this trend is partially due to human activities, although many uncertainties remain, particularly relating to estimates of natural variability”.
This attribution study – sometimes called a “fingerprint” study – comes just months after the IPCC’s second assessment report had concluded that “the balance of evidence suggests that there is a discernible human influence on global climate”. Santer had been a key author for the chapter that shaped that wording. That statement – and Santer et al’s Nature paper – are both aggressively attacked by climate sceptics, but later substantiated by scientists.

March 15, 2000
IPCC’s Special Report on Emissions Scenarios
Professor Nebojsa Nakicenovic. Credit: Silveri/IIASA
By the end of the 1990s, climate modellers are starting to work much more closely with integrated assessment modellers to help produce projections that are more relevant to policymakers eager to find the least-cost pathways to reducing emissions.
In 2000, the IPCC publishes its special report on emissions scenarios (SRES), describing the new scenarios as follows: “They include improved emission baselines and latest information on economic restructuring throughout the world, examine different rates and trends in technological change and expand the range of different economic-development pathways, including narrowing of the income gap between developed and developing countries.”
The IPCC team is led by Nebojsa Nakicenovic of the International Institute for Applied Systems Analysis (IIASA) in Austria. The SRES scenarios are used for the IPCC’s third assessment report in 2001, but are later succeeded by the Representative Concentration Pathways (RCPs) in time for the IPCC’s fifth assessment report in 2013.

November 9, 2000
Carbon cycle included in climate models

Horse-chestnut tree. Credit: imageBROKER / Alamy Stock Photo
A team of UK-based climate scientists, led by Peter Cox at the Met Office, publish in Nature the results from the first “fully coupled three-dimensional carbon-climate model”. They conclude that carbon-cycle feedbacks could “significantly accelerate” climate change over the course of the 21st century: “We find that under a ‘business as usual’ scenario, the terrestrial biosphere acts as an overall carbon sink until about 2050, but turns into a source thereafter.”
The scientists couple a dynamic global vegetation model – called “TRIFFID” – with an ocean-atmosphere model and an ocean carbon-cycle model. Adding TRIFFID means that soil carbon and “five functional types” of plant (“broadleaf tree, needleleaf tree, C3 grass, C4 grass and shrub”) are included in climate models for the first time.

October 5, 2004
‘Extreme event’ attribution study
Summer heatwave in Paris. Credit: Idealink Photography/ Alamy Stock Photo
Following the deadly European heatwave in 2003, two UK-based climate scientists, Peter Stott and Myles Allen, publish a paper (pdf) in Nature showing that it was “very likely” that “human influence” at least doubled the chances of it occurring.
Stott and Allen do not try to establish whether human-caused emissions “caused” the extreme heatwave. Rather, they use modelling to show probabilistically that emissions raised the chances of the heatwave occurring. The paper triggers many more extreme event attribution studies, many of which are now performed in near-realtime so that results can be published within days of the actual event.
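This probabilistic framing is typically expressed as the fraction of attributable risk (FAR), which compares the event's probability with and without human influence. A minimal sketch, using illustrative probabilities rather than the paper's actual values:

```python
# Fraction of attributable risk: FAR = 1 - P_natural / P_actual,
# where P_natural is the event probability without human influence
# and P_actual is the probability with it.
# Illustrative numbers only - a doubling of risk gives FAR = 0.5.
p_natural = 0.001  # chance of the heatwave in a world without human emissions
p_actual = 0.002   # chance in the actual (human-influenced) world
far = 1 - p_natural / p_actual
print(far)  # 0.5 -> at least half the risk attributable to human influence
```

A FAR of 0.5 corresponds exactly to the "at least doubled the chances" wording: if the risk has doubled, half of it is attributable to the human contribution.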

December 5, 2007
Santer vs Douglass
University of Rochester. Credit: Peter Steiner/Alamy Stock Photo
The International Journal of Climatology publishes a paper by a team of climate sceptic scientists, led by David H Douglass at the University of Rochester, which compares “tropospheric temperature trends of 67 runs from 22 ‘Climate of the 20th Century’ model simulations” with observed temperature trends in the tropical troposphere. They argue that there is “disagreement” between the two, “separated by more than twice the uncertainty of the model mean”. The results attract a lot of media attention.
Ten months later, a diverse group of climate modellers, led by Ben Santer, publish a paper in response in the same journal. It concludes: “This claim was based on use of older radiosonde and satellite datasets, and on two methodological errors: the neglect of observational trend uncertainties introduced by interannual climate variability, and application of an inappropriate statistical ‘consistency test’.”
Santer and his colleagues then become the focus of a coordinated campaign, which includes repeated freedom of information requests for their code, emails and data. The episode is further inflamed by the “Climategate” affair in 2009, when stolen emails sent by climate scientists are selectively quoted by climate sceptics and media in an attempt to undermine climate science.
The “Santer vs Douglass” episode typifies the way climate sceptics have sought to undermine and criticise climate modelling since at least the early 1990s.

February 12, 2008
Tipping elements
A collapsing ice shelf - Larsen B in 2005. Credit: NASA Earth Observatory
A team of scientists led by Tim Lenton at the University of East Anglia publish a paper in PNAS which uses climate models and palaeoclimate data to explore climatic “tipping points” - or tipping elements, as they call them – in response to rising human-caused emissions.
The paper lists six tipping elements it believes are of “policy relevance”, should emissions continue to rise: reorganisation of the Atlantic thermohaline circulation; melting of the Greenland ice sheet; disintegration of the West Antarctic ice sheet; Amazon rainforest dieback; dieback of boreal forests; and shift of the El Niño-Southern Oscillation regime to an El Niño-like mean state.
The paper offers a stark conclusion: “Society may be lulled into a false sense of security by smooth projections of global change. Our synthesis of present knowledge suggests that a variety of tipping elements could reach their critical point within this century under anthropogenic climate change. The greatest threats are tipping the Arctic sea-ice and the Greenland ice sheet.”

March 23, 2008
Black carbon
Flames leak from a gas/oil pipe on the edge of the Sahara desert, Libya. Credit: Purple Pilchards / Alamy Stock Photo
Nature publishes a paper by two US-based scientists, Veerabhadran Ramanathan and Greg Carmichael, which examines the “dimming” influence of “black carbon” on the atmosphere. Black carbon is the term for sooty aerosols thrown into the atmosphere through the burning of fuels, such as coal, diesel, wood and dung. The paper also examines how “the deposition of black carbon darkens snow and ice surfaces, which can contribute to melting, in particular of Arctic sea ice”.
“Aerosols in aggregate are either acting to, you could say, cool the atmosphere or mask the effect of CO2,” Carmichael tells the Guardian. “[Black carbon] is the only component of this aerosol mix that in and of itself is a heating element.” The authors argue that the impact of black carbon has, to date, been underestimated by the models.

September 22, 2008 — September 24, 2008
CMIP5 experiments agreed
École Normale Supérieure, Paris. Credit: Photo 12/Alamy Stock Photo
After four phases of the Coupled Model Intercomparison Project (CMIP), 20 climate modelling groups from around the world gather for a meeting at the École Normale Supérieure in Paris to discuss the fifth phase. They agree to a new set of climate model experiments which aim to address outstanding questions that arose from the IPCC’s fourth assessment report published the year before. CMIP5 becomes the foundational set of coordinated modelling experiments used for the IPCC fifth assessment report published in 2013.
CMIP5 includes decadal predictions (both hindcasts and projections), coupled carbon/climate model simulations, as well as several diagnostic experiments used for understanding longer-term simulations out to 2100 and beyond.

September 7, 2012
A National Strategy for Advancing Climate Modeling
NASA's visualisation of CO2 emissions in 2006. Credit: NASA
In the US, the National Research Council publishes a “National Strategy for Advancing Climate Modeling”. The report recognises that evolutionary changes to computing hardware and software present a challenge to climate modellers: “Indications are that future increases in computing power will be achieved not through developing faster computer chips, but by connecting far more computer chips in parallel – a very different hardware infrastructure than the one currently in use. It will take significant effort to ensure that climate modeling software is compatible with this new hardware.” To date, the recommendations have largely not been delivered.

September 23, 2013 — September 27, 2013
IPCC’s fifth assessment report
Stockholm, Sweden, 2013. Credit: Arseniy Rogov/Alamy Stock Photo
At a meeting in Stockholm, Sweden, the IPCC publishes the first report of its fifth assessment cycle (AR5). The report includes an evaluation of the models. It concludes: “The long-term climate model simulations show a trend in global average surface temperature from 1951 to 2012 that agrees with the observed trend (very high confidence). There are, however, differences between simulated and observed trends over periods as short as 10 to 15 years (eg, 1998 to 2012).”
This recent “observed reduction in surface warming trend” – sometimes labelled as a “slowdown” or, inaccurately, as a “pause” or “hiatus” – subsequently becomes a focus of study for climate modellers. Four years later, a paper published in Nature in 2017 seeks to “reconcile the controversies” and concludes that a “combination of changes in forcing, uptake of heat by the oceans, natural variability and incomplete observational coverage” were to blame. The authors state that, as a result of their findings, “we are now more confident than ever that human influence is dominant in long-term warming”.
Reflecting new understanding of radiative forcings, AR5 also slightly adjusts the IPCC’s range of equilibrium climate sensitivity to “1.5C to 4.5C (high confidence)”. It adds: “The lower temperature limit of the assessed likely range is thus less than the 2C in the AR4, but the upper limit is the same.”

September 6, 2017 — September 10, 2017
IPCC’s sixth assessment report
Valerie Masson-Delmotte, co-chair of Working Group I, at the 46th Session of the Intergovernmental Panel on Climate Change, 6 September 2017. 
Climate scientists gathered in Montreal for the 46th session of the IPCC agree the chapter outline for AR6, which is due to be published in parts over 2021-22. The Working Group I report will include various “evaluations” of how the models have developed and performed since AR5. It will incorporate modelling results from the sixth cycle of CMIP, as well as an extended set of RCP scenarios. Each RCP will be paired with one or more “Shared Socioeconomic Pathways”, or SSPs, which describe potential narratives of how the future might unfold in terms of socioeconomic, demographic and technological trends.

