20/01/2018

'Significant' Heatwave Roasts South-Eastern Australia As Global Records Melt

Fairfax - Peter Hannam

Most of NSW will continue to bake in a prolonged heatwave well into next week, the second big warm event to get the year off to a hot start.
The roasting comes after the state posted its hottest year on record in 2017, a period which was also Australia's third-hottest year and equal-second warmest globally.
Swimmers cool off at MacCallum pool at Cremorne Point on Friday. Photo: Dean Sewell
Sydney's west climbed above the 40-degree mark on Friday, with Penrith's 40.5 degrees the hottest for the city. Observatory Hill and other eastern sites were spared the worst, with sea breezes keeping the maximum to just over 28 degrees.
A cool change that moved across southern Victoria on Friday curtailed temperatures in that state but not before Melbourne - including the area where the Australian Open is being played - exceeded 40 degrees for a second day in a row.
That change, though, will stall over NSW and central Australia "due to no systems moving in to displace this hot air", said Grace Legge, a meteorologist with the Bureau of Meteorology.
The heat will continue through the weekend, with large areas of NSW expected to endure temperatures in the low- to mid-40s.
"Coastal regions of NSW will see some relief, with sea breezes expected to keep those temperatures down," Ms Legge said. "But just inland from the coast, we could see temperatures climb to the low 40s - including the western suburbs of Sydney, where over the next four days we're expecting to see temperatures above 40 degrees."
According to the bureau's heatwave service, virtually all of NSW can expect to have at least a "low-intensity" heatwave until at least Wednesday, with some regions reaching "severe" levels.
South-eastern Australia's big heatwave has sent people flocking to beaches from Adelaide to Melbourne and Sydney. 
'Significant'
Blair Trewin, a senior climatologist with the bureau, said the heatwave was "significant but not in an overwhelming sense", and may be notable more for its duration than intensity.
By contrast, the intense heat event last February - that forced NSW to curtail electricity demand to stabilise the power grid - will likely feature in the full State of the Climate report being compiled by the World Meteorological Organisation when it is released in March, said Dr Trewin, speaking in his capacity as scientific co-ordinator of the global review.
Overnight, the WMO was one of several major agencies to release an initial assessment of worldwide weather conditions in 2017.
The agency declared global temperatures last year effectively in a dead heat with 2015 as the second-hottest year since records began about 140 years ago, trailing only 2016.
"We can't split [2017 and 2015] in any meaningful sense," Dr Trewin said.
Notably, last year was the warmest on record without an El Nino event, which typically boosts temperatures as the Pacific takes up less heat than in other years.
Last year and 2015 were about 1.1 degrees above pre-industrial levels, and 17 of the 18 warmest years have happened this century, the WMO said.
Other agencies, such as the US National Oceanic and Atmospheric Administration, NASA, the UK Met Office and Japan's Meteorological Agency, all ranked last year as the second- or third-warmest on record.
Stripping out the influences of El Nino or La Nina, the background warming trend from climate change has been a consistent 0.15-0.2 degrees per decade, Dr Trewin said.
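For readers wondering what "stripping out" El Nino and La Nina involves in practice, the sketch below shows the core idea: regress annual temperature anomalies on both time and an ENSO index, so that the fitted time coefficient estimates the background trend. This is a minimal illustration with placeholder numbers, not bureau or WMO data or methods.

```python
# Minimal sketch of "stripping out" ENSO from a temperature series with a
# multiple linear regression: anomaly ~ trend*year + beta*ENSO + constant.
# All numbers below are illustrative placeholders, not observational data.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1988, 2018)                      # 30 hypothetical years
enso = rng.normal(0.0, 1.0, years.size)            # stand-in ENSO index (e.g. Nino 3.4)
anomaly = 0.018 * (years - years[0]) + 0.10 * enso + rng.normal(0, 0.05, years.size)

# Design matrix: [centred year, ENSO index, intercept]
X = np.column_stack([years - years.mean(), enso, np.ones(years.size)])
trend_per_year, enso_coeff, _ = np.linalg.lstsq(X, anomaly, rcond=None)[0]

print(f"ENSO-adjusted trend: {10 * trend_per_year:.2f} C/decade")  # ~0.18
print(f"ENSO coefficient:    {enso_coeff:.2f} C per index unit")   # ~0.10
```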
While last year was notable for its average temperatures, extremes took their toll too.
"The warmth in 2017 was accompanied by extreme weather in many countries around the world," WMO secretary-general Petteri Taalas said.
"The USA had its most expensive year ever in terms of weather and climate disasters, whilst other countries saw their development slowed or reversed by tropical cyclones, floods and drought."
An example of the greater incidence of extreme weather days was Friday's reading of 46 degrees at Hopetoun Airport in Victoria's north-west.
It was the state's 28th day recorded at 45 degrees or warmer since 2001. Such a reading only happened once in the 17 years covering the 1984-2000 period, Dr Trewin said.


Long-Term Warming Trend Continued In 2017: NASA, NOAA

NASA

This map shows Earth’s average global temperature from 2013 to 2017, as compared to a baseline average from 1951 to 1980, according to an analysis by NASA’s Goddard Institute for Space Studies. Yellows, oranges, and reds show regions warmer than the baseline. Credit: NASA’s Scientific Visualization Studio.
Earth’s global surface temperatures in 2017 ranked as the second warmest since 1880, according to an analysis by NASA.
Continuing the planet's long-term warming trend, globally averaged temperatures in 2017 were 1.62 degrees Fahrenheit (0.90 degrees Celsius) warmer than the 1951 to 1980 mean, according to scientists at NASA’s Goddard Institute for Space Studies (GISS) in New York. That is second only to global temperatures in 2016.
In a separate, independent analysis, scientists at the National Oceanic and Atmospheric Administration (NOAA) concluded that 2017 was the third-warmest year in their record. The minor difference in rankings is due to the different methods used by the two agencies to analyze global temperatures, although over the long-term the agencies’ records remain in strong agreement. Both analyses show that the five warmest years on record all have taken place since 2010.
Because weather station locations and measurement practices change over time, there are uncertainties in the interpretation of specific year-to-year global mean temperature differences. Taking this into account, NASA estimates that 2017’s global mean change is accurate to within 0.1 degree Fahrenheit, with a 95 percent certainty level.
“Despite colder than average temperatures in any one part of the world, temperatures over the planet as a whole continue the rapid warming trend we’ve seen over the last 40 years,” said GISS Director Gavin Schmidt.

Earth’s long-term warming trend can be seen in this visualization of NASA’s global temperature record, which shows how the planet’s temperatures are changing over time, compared to a baseline average from 1951 to 1980. The record is shown as a running five-year average. Credit: NASA’s Scientific Visualization Studio/Kathryn Mersmann.

The planet’s average surface temperature has risen about 2 degrees Fahrenheit (a little more than 1 degree Celsius) during the last century or so, a change driven largely by increased carbon dioxide and other human-made emissions into the atmosphere. Last year was the third consecutive year in which global temperatures were more than 1.8 degrees Fahrenheit (1 degree Celsius) above late nineteenth-century levels.
Phenomena such as El Niño or La Niña, which warm or cool the upper tropical Pacific Ocean and cause corresponding variations in global wind and weather patterns, contribute to short-term variations in global average temperature. A warming El Niño event was in effect for most of 2015 and the first third of 2016. Even without an El Niño event – and with a La Niña starting in the later months of 2017 – last year’s temperatures ranked between 2015 and 2016 in NASA’s records.
In an analysis where the effects of the recent El Niño and La Niña patterns were statistically removed from the record, 2017 would have been the warmest year on record.
Weather dynamics often affect regional temperatures, so not every region on Earth experienced similar amounts of warming. NOAA found the 2017 annual mean temperature for the contiguous 48 United States was the third warmest on record.
Warming trends are strongest in the Arctic regions, where 2017 saw the continued loss of sea ice.
NASA’s temperature analyses incorporate surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations.
These raw measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heating effects that could skew the conclusions. These calculations produce the global average temperature deviations from the baseline period of 1951 to 1980.
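NASA's actual analysis code and methodology are published at the data.giss.nasa.gov link below; as a hedged illustration of the final averaging step only, the sketch that follows area-weights gridded anomalies by the cosine of latitude, so that equal-angle cells near the poles count for less surface area. The grid and numbers are toy values, not GISS data.

```python
# Illustrative sketch (not NASA's published GISTEMP code) of the final step
# of a gridded analysis: an area-weighted average in which each latitude
# band is weighted by cos(latitude), since polar cells cover less area.
import numpy as np

def global_mean_anomaly(anomalies, lats_deg):
    """anomalies: (n_lat, n_lon) deviations from a baseline; NaN = no data."""
    weights = np.cos(np.deg2rad(lats_deg))         # area weight per latitude band
    band_means = np.nanmean(anomalies, axis=1)     # zonal means, skipping gaps
    valid = ~np.isnan(band_means)
    return float(np.average(band_means[valid], weights=weights[valid]))

# Toy 4-band x 8-column grid with amplified Arctic warming and some gaps.
lats = np.array([-67.5, -22.5, 22.5, 67.5])
grid = np.full((4, 8), 0.9)
grid[3, :] = 2.0                                   # strong Arctic anomaly
grid[0, :5] = np.nan                               # sparse high-latitude data
print(f"global mean anomaly: {global_mean_anomaly(grid, lats):.2f} C")
```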
NOAA scientists used much of the same raw temperature data, but with a different baseline period, and different methods to analyze Earth’s polar regions and global temperatures.
The full 2017 surface temperature data set and the complete methodology used to make the temperature calculation are available at:
https://data.giss.nasa.gov/gistemp
GISS is a laboratory within the Earth Sciences Division of NASA’s Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University’s Earth Institute and School of Engineering and Applied Science in New York.
NASA uses the unique vantage point of space to better understand Earth as an interconnected system. The agency also uses airborne and ground-based monitoring, and develops new ways to observe and study Earth with long-term data records and computer analysis tools to better see how our planet is changing. NASA shares this knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.
For more information about NASA’s Earth science missions, visit:
https://www.nasa.gov/earth


In-Depth: Scientists Discuss How To Improve Climate Models

Carbon Brief

While global climate models do a good job of simulating the Earth’s climate, they are not perfect.

Carbon Brief’s series on climate modelling
Despite the huge strides taken since the earliest climate models, there are some climatic processes that they do not simulate as accurately as scientists would like.
Advances in knowledge and computing power mean models are constantly revised and improved. As models become ever more sophisticated, scientists can generate a more accurate representation of the climate around us.
But this is a never-ending quest for greater precision.
In the third article in our week-long climate modelling series, Carbon Brief asked a range of climate scientists what they think the main priorities are for improving climate models over the coming decade.
These are their responses, first as sample quotes, then, below, in full:
  • Prof Pete Smith: “We can get that extra level of detail into the models and check that that’s an appropriate level of detail because a more complex model is not necessarily a better model.”
  • Dr Kate Marvel: “Higher resolution is the first priority. Right now, climate models have to approximate many physical processes that turn out to be very important.”
  • Prof John Mitchell: “The top priorities should be reducing uncertainties in climate sensitivity and reducing uncertainties in radiative forcing – particularly that associated with aerosols.”
  • Prof Daniela Jacob: “It’s important that models will be able to simulate local characteristics, so that they are able to simulate the climate in a city, in mountainous regions, along the coast.”
  • Prof Kevin Trenberth: “Precipitation. Every model does this poorly and it is socially acceptable. It has to change.”
  • Prof Piers Forster: “The biggest uncertainty in our climate models has been their inability to simulate clouds correctly. They do a really bad job and they have done ever since they first began.”
  • Dr Lesley Ott: “Understanding carbon-climate interactions. We don’t understand those processes well enough to know if they’re going to continue.”
  • Dr Syukuro Manabe: “As the models get ever more complicated – or, as some people say, sophisticated – no one person can appreciate what’s going on inside them.”
  • Prof Stephen Belcher: “Having climate models that can give us the precision around extreme weather and climate events is definitely one priority.”
  • Prof Drew Shindell: “One of the key uncertainties is clouds, understanding the physics behind clouds and how clouds interact with aerosol particles.”
  • Prof Michael Taylor: “I think there is still some difficulty in understanding the land and sea border and any advances within models would be an advantage for island states.”
  • Prof Stefan Rahmstorf: “I think a key challenge is non-linear effects, or tipping points. For example, the Gulf Stream system. We still don’t know how close we are to a threshold there.”
  • Dr James Hansen: “The fundamental issue about climate change is the delayed response of a system and that’s due to the ocean’s heat capacity.”
  • Dr Doug McNeall: “We need to be adding more processes, modelling new things, and also we need to be modelling finer detail so we can better explain the climate.”
  • Dr Ronald Stouffer: “Improving the ocean simulations, particularly in the Southern Ocean. This is a very important region for the uptake of heat and carbon from human activities.”
  • Prof Adam Scaife: “There is a signal-to-noise problem evident in climate models which means that, in some mid-latitude regions, predicted climate signals are too weak.”
  • Dr Jatin Kala: “More realistic representation of vegetation processes.”
  • Dr Katharine Hayhoe: “Natural variability is really important when we’re looking over time scales of anywhere from the next year or two to even a couple of decades.”
  • Dr Chris Jones: “I think the major gaps would include the ability to trust climate models at finer and finer scales, which, ultimately, is what people want to know.”
  • Prof Christian Jakob: “In my view, the highest priority is to have more people involved in the model development process so that more new ideas can be generated and implemented.”
  • Prof Richard Betts: “We need to represent the other aspects of the climate system that aren’t always captured in the climate models [such as] tipping points, nonlinearities.”
  • Dr Bill Hare: “One of the underdeveloped areas, including in IPCC assessment reports, is evaluating what are the avoidable impacts [of climate change].”
  • Prof Detlef van Vuuren: “I think quite a number of key Earth processes are still not very well represented, including things like the role of land use, but also pollution and nutrients.”


Prof Pete Smith
Chair in plant and soil science
University of Aberdeen

I think the ESMs [Earth system models] have a pretty good representation of many of the processes, but because they’re trying to cover the whole Earth, then you have a relatively simple description of most things in the models. There are still ESMs, for example, that have a very limited representation of nutrients – so, for example, nitrogen and phosphorus limitations on plant growth in the future.
We’ve got a great representation of these things within ecosystem models that we tend to use uncoupled and we just run those on the land surface. We’ve got a good detailed representation of some of those processes in those models – but not all of those have yet made it into the ESMs. So getting that level of detail in, I think, is important, as well as improving the regional downscaling and improving the resolution of those ESMs.
That used to be limited by computing power, but that’s no longer a limitation. So we can get that extra level of detail into the models and check that that’s an appropriate level of detail, of course – because a more complex model is not necessarily a better model.




Dr Kate Marvel
Theoretical physicist
NASA Goddard Institute for Space Studies

Higher resolution is the first priority. Right now, climate models have to approximate many physical processes that turn out to be very important; air flowing over mountain ranges, for example, or small eddies mixing water in the ocean. This is because it’s too hard to get the large and small scales right: there’s simply no computer powerful enough to keep track of very small and large scales simultaneously. Different models make different approximations and this contributes to uncertainty in their projections. But as computing power increases, we’ll be able to explicitly capture a lot of the small-scale effects that are very important to regional climate. You can think of this as sharpening the blurry picture of climate change.
Number two is better cloud simulation. Clouds are hard for models to get right and we know that different climate models don’t agree on how hot it’s going to get, in large part because they don’t agree on what clouds will do in the future. If we can get climate models to more credibly simulate current cloud patterns and observed cloud changes, this might reduce the uncertainty in future projections.
Three is better observations. Satellites have been a real game-changer for climate research, but they’re not perfect. We need to keep evaluating our models against observational data and this is difficult in the presence of observational uncertainty. Long-term global datasets are often cobbled together from many different satellite and ground-based observations, and different measurements of the same variable often disagree. Dedicated long-term measurement devices like the instruments on NASA’s Afternoon Constellation (“A-train”) of satellites will help us understand reality better and this will allow us to benchmark and re-evaluate our models.
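Marvel's first point, on resolution, comes down to brute arithmetic. As a rough, hedged sketch (a standard rule of thumb, not benchmarks from any particular model): halving the horizontal grid spacing doubles the grid points in each horizontal direction, and numerical stability typically forces the time step to halve as well, so each refinement multiplies the compute cost roughly eightfold.

```python
# Back-of-envelope cost scaling for finer model grids (a rule-of-thumb
# sketch, not benchmarks from any real model): each halving of the grid
# spacing doubles points in x and y and roughly halves the stable time
# step (CFL condition), multiplying compute cost by about 2*2*2 = 8.
def relative_cost(halvings: int) -> int:
    """Cost multiplier after `halvings` successive halvings of grid spacing."""
    return 8 ** halvings

for h in range(4):
    spacing_km = 100 / 2 ** h                      # starting from a 100 km grid
    print(f"{spacing_km:6.1f} km grid -> ~{relative_cost(h):4d}x compute cost")
```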


Prof John Mitchell
Principal research fellow
Met Office Hadley Centre

The top priorities should be reducing uncertainties in climate sensitivity, getting a better understanding of the effect of climate change on atmospheric circulation (critical for understanding of regional climate change, changes in extremes) and reducing uncertainties in radiative forcing – particularly those associated with aerosols.


Prof Daniela Jacob
Director
Climate Service Center Germany

I think from a societal point of view, it’s important that models will be able to simulate local characteristics, so that they are able to simulate the local climate in parts of a city, in mountainous regions, in valleys, along the coast. There are still limitations in the climate models. Although they’ve made a lot of progress over the last decades, we still do not really know how climate is changing on a local scale.
If you look at the scientific questions behind this, then I think the most important areas to look at are clouds, how to simulate clouds, the development of clouds, the life cycle of clouds, the land surface. The representation of the land cover and the land management is something which needs to be looked at.
Of course, there are many, many other questions. It really depends on what you want to use the model for. All climate models, global or regional, are made for a specific purpose. I think that’s important to have in mind. Not all models can do the same and they are not all good in the same way.
For us, the priority is to simulate the water cycle correctly. We are very interested in getting the precipitation amounts, locations, frequency, intensity and timing right, so that the runoff can be simulated properly.



Prof Kevin Trenberth
Distinguished senior scientist
National Center for Atmospheric Research

The top priorities over the next decade for improving climate models are:
  1. Precipitation. Every model does this poorly and it is socially acceptable. It has to change. By precipitation I mean all characteristics: frequency, intensity, duration, amount, type (snow vs rain etc) at hourly resolution.
  2. Aerosols. The indirect effects of aerosols on clouds are poorly done. Some processes are included, but all models are incomplete and the result is nothing like observations. This affects climate sensitivity.
  3. Clouds. This is more generic and relates to sub-grid scale processes.
  4. Land-surface heterogeneity: this is a resolution issue and deals also with complexity.
  5. Air-sea interaction and the oceans. This also relates to mixing in the ocean, the mixed layer depth and ocean heat storage and exchanges.


Prof Piers Forster
Director
Priestley International Centre for Climate
University of Leeds
By far the biggest uncertainty in our climate models has been their inability to simulate clouds correctly. They do a really bad job and they have done ever since they first began. And this has all sorts of knock-on effects. It gives a big uncertainty in projections going further forward in time because we don’t understand the way they work, and it also gives big uncertainty to things like extreme precipitation – so that we don’t understand rainfall extremes that well. So we have all these big uncertainties from our incorrect simulation of clouds.
It is intimately tied with observations but there’s also been a huge advance in the last 10 years in the way we can observe the way clouds work. We have unprecedented satellite instruments up there currently, that can really observe clouds in a far more sophisticated way than we ever have been able to before.
They’re fantastic, and by exploiting these wonderful observations we’ve got, I think we can really test the way these climate models work.



Dr Lesley Ott
Research meteorologist
NASA Goddard Space Flight Center


One area that’s really critical is cloud–aerosol interactions. It’s something that we really don’t know too much about. We’re seeing some tantalising evidence that there could be important effects, but on a global scale it’s very hard to understand. For us, in our office, it took a lot of work to get our model to run with the kind of cloud microphysics and aerosol microphysics that would actually allow us to study that. We’re now at the point where we are starting to do that kind of work, and I think you’re going to see a lot more research on that in the next five or ten years.
The other thing that is particularly important, which is my research area, is understanding carbon–climate interactions. Right now, one thing that not a lot of people know is that 50% of human emissions get absorbed by plants on the land and by the oceans, and that’s been a really valuable resource in limiting climate change to the effects we’re seeing today. If we didn’t have that valuable resource, we’d be seeing things progress much more quickly in terms of CO2 concentrations and global warming. The problem is we don’t understand those processes well enough to know if they’re going to continue. We’re seeing a lot of energy going into both atmospheric observations and new observations of the land surface, and I hope we’re going to continue to see progress.



Dr Syukuro Manabe
Senior meteorologist
Princeton University


As the models get ever more complicated – or, as some people say, sophisticated – no one person can appreciate what’s going on inside them.
What we have to do now is more of the things that I was doing in the old days, when I used a simpler parameterisation of the sub-grid scale processes but kept the basic physics such as the hydrodynamical equations, radiative transfer, etc. Such a model runs much faster than the so-called Earth system models which they now use for the IPCC [Intergovernmental Panel on Climate Change]. Then, using a much faster computer, you can run a large number of numerical experiments in which you change one factor at a time, as if the model were a virtual laboratory, and see how the model responds to each change.
(This is an extract taken from the Carbon Brief Interview with Manabe conducted in July 2015.)
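A hedged sketch of the "virtual laboratory" approach Manabe describes: take a fast, simple model and rerun it many times, perturbing one factor per experiment against a control run. The toy zero-dimensional energy balance below is our own illustrative stand-in, not Manabe's model, and its greenhouse term is deliberately crude.

```python
# Toy one-factor-at-a-time experiment loop in the spirit of Manabe's
# "virtual laboratory". The zero-dimensional energy balance below is an
# illustrative stand-in, not any published model.
import math

def run_simple_model(co2_ppm=280.0, solar_scale=1.0, albedo=0.30):
    """Return a rough global-mean surface temperature in kelvin."""
    sigma, s0 = 5.67e-8, 1361.0                    # Stefan-Boltzmann, solar constant
    absorbed = solar_scale * s0 * (1.0 - albedo) / 4.0
    t_bare = (absorbed / sigma) ** 0.25            # no-greenhouse temperature
    return t_bare + 33.0 + 3.0 * math.log2(co2_ppm / 280.0)  # crude GHG offset

control = run_simple_model()
experiments = {
    "2x CO2":       {"co2_ppm": 560.0},
    "+1% solar":    {"solar_scale": 1.01},
    "+0.01 albedo": {"albedo": 0.31},
}
for name, change in experiments.items():          # one factor changed per run
    print(f"{name:13s}: {run_simple_model(**change) - control:+.2f} K vs control")
```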



Prof Stephen Belcher
Chief scientist
Met Office

I think the Paris Agreement really changed the agenda for climate science. And at the Met Office, we’re really focused on two aspects of improving climate models. The first is understanding extreme events and the risks associated with extreme weather and climate events – in the current climate, but also in a future climate.
For example, the kind of heatwaves we’ve seen in Europe – we had them in 2003 and 2006 – just how severe will they become and how frequent might they become? Some of the wet winters we’ve been having in Europe as well – are they going to become the new normal, or will they just remain unusual events? So, having climate models that can really give us the precision around these extreme weather and climate events is definitely one priority.
The other priority is that in order to achieve the goals of the Paris Agreement, we’ll need to keep a very close eye on the amount of carbon we emit into the atmosphere and the amount of CO2 that remains in the atmosphere. There are other factors in the climate system that drive the concentration of CO2 and hence global warming. For example, we know that as the planet warms, permafrost might thaw and emit greenhouse gases of its own – warming the planet still further. But our quantitative estimates of that permafrost feedback and the warming it might produce are not very accurate at the moment.
Secondly, about half of the CO2 we release into the atmosphere is absorbed either by plants on land or into the ocean and tightening up those numbers is really important. As we approach the targets given in Paris, the amount of precision we need on these allowable carbon budgets – to meet the temperature changes – is going to get sharper and sharper, and so we’re going to need better climate models to address those carbon budget issues.
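The "allowable carbon budget" logic Belcher mentions can be sketched with one line of arithmetic, using the roughly linear relationship between cumulative CO2 emissions and warming (the transient climate response to cumulative emissions, TCRE). The numbers below are assumptions for illustration only; real budget estimates also adjust for non-CO2 forcing and feedbacks such as permafrost, and carry wide uncertainty ranges.

```python
# Back-of-envelope carbon budget (illustration only: real estimates also
# adjust for non-CO2 forcing, permafrost feedbacks and wide uncertainties).
TCRE_C_PER_1000_GTCO2 = 0.45   # assumed warming per 1000 GtCO2 emitted (deg C)
WARMING_SO_FAR_C = 1.1         # assumed warming since pre-industrial (deg C)

def remaining_budget_gtco2(target_c):
    """Very rough CO2 still emittable before `target_c` of total warming."""
    return max(0.0, target_c - WARMING_SO_FAR_C) / TCRE_C_PER_1000_GTCO2 * 1000

for target in (1.5, 2.0):
    print(f"{target} C target -> ~{remaining_budget_gtco2(target):.0f} GtCO2 left")
```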



Prof Drew Shindell
Nicholas professor of Earth sciences
Duke University

One of the key uncertainties is clouds, understanding the physics behind clouds and how clouds interact with aerosol particles. That has, unfortunately, also been a key uncertainty for a long time and is likely to remain one.
In particular, better computer power [is needed] because we do have some observations and some process understanding, but they happen at very fine spatial and temporal scales, and that’s the hardest thing to model because it takes an enormous amount of computer power.
We can get better observations from things like satellite data, but a lot of that is very challenging because the uppermost level of clouds blocks everything below and then you can’t see what’s really going on. You can fly airplanes and get detailed information, but for one short period of time and one short area. Those are really challenging things to improve from an observational perspective – and require immense computer power.
I would say that, as far as advancing our ability to really look at the issue of climate change, one of the things we really need to do is to make our models interact more between the physical sciences and the social and economic sciences, and to really understand a little more closely the link between climate change and its drivers and impacts.



Prof Michael Taylor
Deputy dean and senior lecturer
University of the West Indies

I think for us there is still some difficulty in understanding the land and sea border and certainly any advances in differentiating that land-sea contrast within the model would be an advantage for island states – especially small island states.
Certainly advances in representing topography at a finer scale – putting the mountains in the right place, achieving the right height for the small scale – would represent significant improvements for the small islands. And improvements in coastal processes, the dynamics of coastal climate would represent improvements for the small island community.



Prof Stefan Rahmstorf
Head of Earth systems analysis
Potsdam Institute for Climate Impact Research

I think a key challenge is non-linear effects, or tipping points. For example, the Gulf Stream system. We still don’t know how close we are to a threshold there. We know there is one, because these non-linear phenomena depend very sensitively on the exact state of the system, and so models still widely disagree on how stable or unstable the Gulf Stream system will be under global warming in the future.
There is another effect which is the changes in the atmospheric circulation, including the jet stream. That’s one area of research that we are working on currently which has a really big impact on extreme weather events and it’s this kind of phenomena that we need to understand much better.
I’ve had a longstanding interest in palaeoclimate. The last few million years have been generally colder, with ice ages, but if you go way back in time, for many millions of years, there were much warmer climates on Earth and we are very interested in modelling these. But it is quite difficult because of the long time scales that you have to deal with, so you can’t use the models that are used to simulate a hundred or two hundred years. You have to design models that are highly computationally efficient to study palaeoclimate.



Dr James Hansen
Climate scientist
Columbia University

The fundamental issue about climate change, the difficulty, is the delayed response of the system, and that’s due to the ocean’s heat capacity.
But the effective heat capacity for surface temperature depends on the rate of mixing of the ocean water, and I have presented evidence in a number of different ways that models tend to be too diffusive – for numerical reasons, because of coarse resolution, and in the way they parameterise motions in the ocean. They tend to exaggerate the mixing and, therefore, make the heat capacity more effective.



Dr Doug McNeall
Researcher in climate change impacts
Met Office Hadley Centre

As we’ve gone through time, climate models have got more complex, so that’s not only been due to increasing resolution, but also adding more processes. I think we need to continue both of those trends. We need to be adding more processes, modelling new things and also we need to be modelling finer detail so we can better explain the climate. We need to better explain the impacts of climate on the systems we care about, such as the human systems, ecological, carbon cycle systems. If you make the model better, if you make it look more like reality, it means that your knowledge of how the system will change gets better.


Dr Ronald Stouffer
Senior research climatologist and group head of the Climate and Ecosystems Group at the Geophysical Fluid Dynamics Laboratory (GFDL)
Princeton University
The top priorities over the next decade for improving climate models are:
  1. Evaluating and understanding climate response to changes in radiative forcing (greenhouse gases and aerosols).
  2. Improving the cloud simulation (3D distribution and radiative properties). This is of first importance for better estimates of the climate sensitivity.
  3. Improving the ocean simulation particularly in the Southern Ocean. Models do a fairly poor job currently and this is a very important region for the uptake of heat and carbon from human activities.
  4. Higher model resolution. This helps provide improved local information on climate change. It also reduces the influence of physical parameterisations in models (a known problem).
  5. Improving the carbon simulation and modelling in general. Modelling land carbon changes is a particular challenge due to the importance of small local scales.


Prof Adam Scaife
Physicist and head of monthly-to-decadal prediction
Met Office Hadley Centre

There is a signal-to-noise problem evident in climate models which means that, in some mid-latitude regions, predicted climate signals are too weak. This possibility was raised in the past, and the problem has probably been present in climate models for many years.
It is the top priority of my research group to try to solve this problem to improve our climate predictions and, depending on the answer, it could affect predictions on all timescales from medium range forecasts, through monthly, seasonal, decadal and even climate change projections.
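A hedged sketch of what the symptom looks like in practice (synthetic numbers, not Met Office data): the tell-tale signature of the problem is an ensemble mean that correlates better with observations than with the model's own members, implying the predictable signal inside the model is too weak relative to its internal noise.

```python
# Synthetic illustration of the ensemble "signal-to-noise paradox" Scaife
# describes (made-up numbers, not Met Office data): the model carries the
# real predictable signal too weakly relative to its internal noise.
import numpy as np

rng = np.random.default_rng(1)
n_years, n_members = 35, 20
signal = rng.normal(0.0, 1.0, n_years)             # true predictable signal
obs = signal + rng.normal(0.0, 0.8, n_years)       # observations = signal + noise
members = 0.4 * signal + rng.normal(0.0, 1.0, (n_members, n_years))
ens_mean = members.mean(axis=0)                    # averaging cancels the noise

r_obs = np.corrcoef(ens_mean, obs)[0, 1]
r_member = np.mean([np.corrcoef(ens_mean, m)[0, 1] for m in members])
print(f"ensemble mean vs observations: r = {r_obs:.2f}")
print(f"ensemble mean vs own members:  r = {r_member:.2f}  (lower = paradox)")
```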



Dr Jatin Kala
Lecturer in atmospheric science and ARC DECRA fellow
Murdoch University, Perth

The top priorities over the next decade for improving climate models are:
  1. Improving our abilities in simulating climate extremes.
  2. Improving the skill of climate models in simulating key modes of natural climate variability.
  3. Moving towards unstructured climate model domains (current models use a square/rectangular domain, but using mesh approaches is the next step).
  4. More realistic representation of vegetation processes.
  5. Improving convection parameterisations.



Dr Katharine Hayhoe
Climate Science Center director
Texas Tech University


Climate modelling is an enormous undertaking. I think few people realise just how complex these models are. As soon as there’s a new supercomputer available anywhere in the world, there’s a climate model waiting to be run on it, because we know that many of our physical processes right now are not being directly represented. They have to be “parameterised” because they occur at spatial or time scales that are smaller than the grids and the time steps that we use. So the smaller the spatial grid and the smaller the time step we use in the model, the better we’re able to explicitly resolve the physical processes in the climate.
We’re also learning that natural variability is really important when we’re looking over time scales of anywhere from the next year or two to even a couple of decades in the future. Natural variability is primarily controlled by exchange of heat between the ocean and the atmosphere, but it is an extremely complex process and if we want to develop better near-term predictive skills – which is looking not at what’s going to happen in the next three months but what’s going to happen between the next year and 10 years or 20 years or so – if we want to expand our understanding there, we have to understand natural variability better than we do today.
(This is an extract taken from the Carbon Brief Interview with Hayhoe conducted in November 2017.)


Dr Chris Jones
Lead researcher in vegetation and carbon cycle modelling
Met Office Hadley Centre

I think the major gaps would include the ability to trust climate models at finer and finer scales, which, ultimately, is what people want to know. At a global scale we understand the physics very well about how greenhouse gases trap energy in the atmosphere, and so the models do a pretty good job of the global scale energy balance and how the world as a planet warms up. We can recreate the 20th century global climate patterns pretty well and we know why that is.
When we start to get into the details that really affect people, that’s where the models are not yet perfect, and that’s partly because we can’t represent them in enough fine scale detail. There is always a big push as soon as we get a new computer to try and increase the resolution that we represent, and we’ve seen them get better and better in that respect over the years.
The other aspect, and something that I work on, is increasingly trying to look at the interactions between climate and ecosystems. What that allows us to do is to inform climate negotiations around things like carbon budgets – how much CO2 we can emit to stay within a certain target.



Prof Christian Jakob
Centre deputy director
School of Earth, Atmosphere and Environment, Monash University
In my view, the highest priority is to have more people involved in the model development process so that more new ideas can be generated and implemented. This has proven difficult.
Other priorities would be to improve the physical realism of the models, in particular the representation of precipitation and clouds, and to significantly increase the model development “workforce” in the relevant areas.



Prof Richard Betts
Head of climate impacts
University of Exeter & Met Office Hadley Centre

It’s worth saying at first that the models are already remarkably good at simulating the general patterns of climate, the general circulation of the atmosphere and the past trend of global temperatures. But we still see systematic biases in some of the models, so we often have to correct for these biases when using the models for impact studies. It would be good to be able to eliminate that, because it introduces another level of uncertainty and inconsistency. If we could have detailed, realistic regional climates that don’t require this adjustment, that would be a major victory.
The other thing we need to do is to find ways to represent the other aspects of the climate system that aren’t always captured in the climate models [such as] tipping points and non-linearities. They hardly ever emerge from the models on their own; you can artificially force the models to produce them, but we know these things have happened in the real climate in the past. We need to find ways to reproduce them in a completely realistic way so that we can do a full risk assessment of future climate change, including these surprises that may occur.



Dr Bill Hare
Director
Climate Analytics

I think one of the important issues is to be modelling the climate system consequences of full 1.5C pathways, and maybe even more than that. This would allow us to begin to understand how we could prevent some of the major tipping point problems that we can already foresee coming, even for 1.5C warming, and to try to understand what it would take to protect and sustain important natural ecosystems such as coral reefs, or to prevent ice sheet disintegration.
One of the underdeveloped areas, including in IPCC assessment reports, is evaluating what the avoidable impacts [of climate change] are. It’s very hard to find a coherent survey of avoidable impacts in an IPCC assessment report. I think we need to be getting at that so we can better inform policymakers about the benefits of taking some of the big transformational steps that, while economically beneficial, are definitely going to cause political problems as incumbent power producers and others try to defend their turf.
(This is an extract taken from the Carbon Brief Interview with Hare conducted in November 2017.)



Prof Detlef van Vuuren
Senior researcher
PBL Netherlands Environmental Assessment Agency

For me, broadening the representation of different factors would have a higher priority than deepening the existing process representation. I think quite a number of key Earth processes are still not very well represented, including things like the role of land use, but also pollution and nutrients. I would see that as a high priority. Activities are going on in this area, no doubt, but I personally think that the balance might still shift further in this direction.
Second, ensuring somehow that we keep older versions of the models “active”. The idea sounds attractive to me that, in addition to having ever-better models – which remain slow despite progress in computing power – we would also have the ability to do fast model runs. These could be used for more uncertainty runs, larger ensembles and exploring a wider range of types of scenarios.
Finally, I would expect that there will be a further representation of the human system in Earth system models (ESMs) and that integrated assessment models (IAMs) will try to be more geographically explicit – in order to better represent local processes, such as water management and the presence of renewable energy. Together, these trends might mean a growing agenda of merging ESMs and IAMs. I think this is interesting but, at the same time, also very challenging, as both communities are already rather interdisciplinary (so one would risk models based on different philosophies becoming too complex for anyone to understand the results).
