Angus Ferraro

A tiny soapbox for a climate researcher.



EUMETSAT Conference 2014: Final highlights


Headquarters of the WMO, which we visited during the conference for a discussion on the socioeconomic benefits of meteorological satellites.

The EUMETSAT Meteorological Satellites Conference 2014 featured a lot of new science. Two points in particular stood out to me: the assimilation of new products into numerical weather forecasting systems, and the use of satellite data to improve our conceptual understanding of weather systems.

Until this conference I was not aware how new it was to incorporate soil moisture into numerical weather forecasting systems. Such forecasting systems spend a good deal of resources on assimilating observational data to initialise the forecast. This is very important because, as pioneering work by Ed Lorenz showed back in the 1960s, tiny differences in the initial state of an atmospheric model (and of course the real atmosphere) can lead to huge differences in the resulting forecast, even for relatively short-range forecasts.
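As a quick illustration of Lorenz’s point, here is a minimal sketch of his famous 1963 system run twice from states differing by one part in a hundred million. The parameter values are Lorenz’s originals; the step size, run length and initial perturbation are arbitrary illustrative choices of mine.

```python
# A minimal sketch of Lorenz's result: two runs of his 1963 system started
# from near-identical states diverge until they are completely different.
# The parameters are Lorenz's originals; the step size, run length and
# initial perturbation (1e-8) are arbitrary illustrative choices.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system one step with a simple Euler scheme."""
    x, y, z = state
    deriv = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    return state + dt * deriv

a = np.array([1.0, 1.0, 1.0])       # the 'true' atmosphere
b = a + np.array([1e-8, 0.0, 0.0])  # a forecast with a tiny initial error

for step in range(1, 3001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 500 == 0:
        print(f"t = {step * 0.01:4.1f}, separation = {np.linalg.norm(a - b):.2e}")
```

Run it and the separation between the two trajectories grows explosively, from a hundred-millionth to the full size of the attractor. That is why getting the initial state right matters so much.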

Soil moisture is clearly a useful thing to know about in our forecasts. For weather forecasting its main role is to supply water to weather systems: wet surfaces feed moisture to the atmosphere, triggering or intensifying rainfall.

A few years ago soil moisture satellite products were not considered mature enough to assimilate into weather forecast systems. This is partly because our measurements were quite uncertain (we couldn’t attach very accurate numbers to them), but also because our uncertainty was poorly characterised (we didn’t know how accurate our measurements were). In a sense, the latter is more important. Like much of science, the point is not always to know things exactly, but to accept that perfect accuracy is impossible and at least to know precisely how certain we are about a measurement (Tamsin Edwards has a related blog post focusing on climate rather than weather).

After some experimental studies showed the potential for soil moisture data to improve weather forecasts, operational forecasting centres across the world began to adopt this extra data source – the ones I heard about at the conference were ECMWF and the UK Met Office, but there are probably others.

Now let’s move to something less mathematical, but equally important and exciting. On Thursday I listened to two excellent presentations on the Conceptual Models for the Southern Hemisphere (CM4SH) project. The rationale behind CM4SH is that the vast majority of weather forecasting ‘wisdom’ is derived from Northern Hemisphere perspectives, through an accident of history. But understanding the weather of the Southern Hemisphere isn’t as simple as flipping everything upside down. Although the physics of the weather is clearly the same, the actual meteorological situation in Southern Hemisphere countries is different. For example, South Africa lies in the midlatitude belt like Europe does, but it sits rather closer to the Equator, so the same weather system could have different effects. The configuration of Southern Hemisphere land masses is very different, and that leads to rather different weather behaviour.

CM4SH is a collaboration between the national meteorological services of South Africa, Argentina, Australia and Brazil. The work focused on building up a catalogue of typical meteorological situations in different regions of the Southern Hemisphere, analysing similarities and differences. The international CM4SH team used Google Drive to build a catalogue of these situations, their typical causes, behaviour and effects. Satellite imagery is obviously a major part of the catalogue, as it allows forecasters to track the flow of moisture, the presence of clouds, and the direction and strength of winds. The resulting catalogue allows Southern Hemisphere forecasters to classify meteorological situations and quickly find out the typical effects of different systems. For example, if a forecaster sees a particular meteorological configuration, they can quickly check the catalogue for the effects of similar situations in the past, and see that they need to assess the risk of, say, flooding in a vulnerable region.

I think projects like this reflect the power of the Internet to supercharge our science. Earlier this week I wrote about how the data from the new GPM mission were available and easily accessible within weeks. GPM is a huge international collaboration combining the resources of a whole constellation of satellites. CM4SH is a project which makes use of expertise from four national meteorological services to create an unprecedented collaborative resource for forecaster training and education, freely available. The CM4SH catalogue will grow over time and become more refined – the beauty of collaborative projects like this is that, as long as someone does a little pruning now and then, they can only ever become more useful.

EUMETSAT Post 1: Challenges and advances in satellite measurement

EUMETSAT Post 2: Socioeconomic benefits of meteorological satellites




EUMETSAT Conference 2014: Socioeconomic benefits of meteorological satellites

Globally, governments spend about $10 billion on meteorological satellites every year. That’s a lot of money. How do we know it’s worth it?

Last night the EUMETSAT conference branched off to the WMO for a side event asking that very question. I was impressed by the rigour of the calculations, but also by the thoughtful responses to the question of how this information should – and should not – be used.

Alain Ratier, Director-General of EUMETSAT, presented the results of a comprehensive exercise to calculate the benefit-cost ratio, to the EU, of polar-orbiting meteorological satellites. The cost of these things is relatively easy to estimate, but the benefits are a little more difficult. They approached the problem in two steps: first, what is the economic benefit derived from weather forecasts? Second, what impact do meteorological satellites have on weather forecast skill?

The resulting report contains some fascinating facts and figures. It has been estimated that as much as one third of the EU’s GDP is ‘weather-sensitive’. Of course, this isn’t the same as ‘weather forecast sensitive’, but it at least gives a sense of potential vulnerability. The report concluded that the total benefit of weather forecasts to the EU was just over €60 billion per year. Most of that comes in the form of ‘added value to the European economy’ (broadly, use of weather information to help manage transport networks, electricity generation, agricultural activities, and so on), but there are also contributions from protection of property and the value of private use by citizens.

Compared to the calculation of the economic benefits of weather forecasts, the calculation of the effects of satellite data on those forecasts is quite straightforward: one can ‘suppress’ a source of data in our weather forecasts and measure the difference. Forecasts proceed by using a numerical model of atmospheric physics to predict the future atmospheric state. Since weather prediction is a chaotic problem, it’s important that we start the forecast from as close a representation of the real atmospheric state as possible. This is called initialisation, and it’s absolutely crucial to weather forecasting. To calculate the effect of satellite information, we can simply exclude satellite data from the initialisation phase of the weather prediction.
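To make the idea concrete, here is a toy ‘data-denial’ sketch, entirely my own construction rather than the operational method: a chaotic logistic map stands in for the atmosphere, and an inverse-variance blend of a background guess with observations stands in for data assimilation. All the error values are invented for illustration.

```python
# A toy 'data-denial' experiment, entirely my own sketch (real observing
# system experiments use full forecast models and assimilation schemes).
# A chaotic logistic map stands in for the atmosphere; the 'analysis' is an
# inverse-variance blend of a background guess with observations.
import numpy as np

rng = np.random.default_rng(42)

def forecast(x, steps=8, r=3.9):
    """Run the toy 'atmosphere' (a chaotic logistic map) forward from x."""
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

truth0 = 0.4                                     # the true initial state
obs_satellite = truth0 + rng.normal(0.0, 0.001)  # accurate 'satellite' ob
obs_insitu = truth0 + rng.normal(0.0, 0.01)      # noisier 'in-situ' ob
background = 0.45                                # prior guess of the state

def analysis(obs_values, obs_errors, bg=background, bg_err=0.05):
    """Blend background and observations, weighting by inverse variance."""
    weights = [1.0 / bg_err**2] + [1.0 / e**2 for e in obs_errors]
    values = [bg] + list(obs_values)
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

truth = forecast(truth0)
with_sat = forecast(analysis([obs_satellite, obs_insitu], [0.001, 0.01]))
denied = forecast(analysis([obs_insitu], [0.01]))

print(f"forecast error with satellite data:    {abs(with_sat - truth):.4f}")
print(f"forecast error without satellite data: {abs(denied - truth):.4f}")
```

Withholding the accurate ‘satellite’ observation degrades the initial state only slightly, but the chaotic model amplifies that small degradation into a much worse forecast, which is exactly the logic of the real data-denial studies.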

(left) 5-day forecast for Superstorm Sandy, (middle) the forecast without polar-orbiting satellite data and (right) the actual conditions that occurred. Credit: ECMWF.

The results are quite astounding. Satellite data contribute 64% of the effect of initialisation in improving 24-hour forecasts (the other 36% comes from in-situ observations). This approach reveals that measurements from a single satellite, EUMETSAT’s MetOp-A, account for nearly 25% of all the improvement in 24-hour forecast accuracy derived from observations. MetOp-A is a relatively new platform, indicating that recent advances are providing huge benefits to weather forecasts.

The impact of satellite observations is vividly illustrated by considering 5-day forecasts of the track of Superstorm Sandy made with and without satellite initialisation. Without the use of polar-orbiting satellites, forecasters would not have predicted that the storm would make landfall on the US East Coast. As it was, the 5-day forecast of the storm track was remarkably close to reality, allowing forecasters to issue warnings of imminent risk of high winds and flooding.

The conclusion is that meteorological satellites provide benefits that outweigh their costs by a factor of 20. This is a conservative estimate in which high-end cost estimates have been compared with low-end benefit estimates. One reason we might expect benefit estimates to be low is that private companies are often reluctant to reveal how they use weather forecasts, either because this information is commercially sensitive or because they risk being charged more for the forecast data they receive!

It’s important to consider the limits of this approach. The obvious one is that cost-benefit estimates do not include the number of human lives that have been saved by weather forecasts. Not only is this difficult to calculate, it’s also impossible to put an economic value on. It would be very interesting to see whether the toolbox of social science research has methods to assess the ‘social’ part of the ‘socioeconomic’ benefits, moving away from attaching monetary value to things and considering those benefits which aren’t so easy to quantify. This doesn’t have to mean human life; any non-monetary social benefit of weather forecasting could be considered.

I think this is especially valuable because it’s questionable whether the cost-benefit approach is truly appropriate. Cost-benefit analyses frame things in a certain way; the WMO and EUMETSAT representatives at the meeting were well aware of this. They may imply greater certainty than is appropriate, and they may encourage a naively quantitative approach to what is fundamentally a qualitative problem: is it for better or for worse that we have meteorological satellites? Answering such a question involves value judgements that a simple quantitative approach can gloss over. As LP Riishøjgaard pointed out, although we can make this kind of cost-benefit estimate ‘frighteningly’ easily, it’s not obvious that we should.

EUMETSAT Post 1: Challenges and advances in satellite measurement.

EUMETSAT Post 3: Final highlights.



Transformational Climate Science – the future of climate research

On 15-16 May a diverse group of climate researchers gathered at the University of Exeter to discuss the state of climate change following the publication of the IPCC Fifth Assessment Report and the future of the field. In a previous post I discussed some of the key themes. Here I’m going to summarise some of what went on at the conference in terms of how we should proceed with climate research in the future. It will be biased towards physical science, since that’s my personal area of interest.

What are the outstanding challenges in climate research? What are the areas that need further investigation? Should the IPCC process function as a driver for new research efforts?

Science & policy panel (left to right): Thomas Stocker, Saffron O’Neill, Georgina Mace, Andrea Tilche, Asuncion St Clair, Chris Field. Credit: University of Exeter via Flickr.

I think the final question there is an especially interesting one. The role of the IPCC is to bring together diverse research findings and assess our state of knowledge. And yet, sometimes it is seen as an end in itself. One of the speakers at the conference noted he sometimes sees research justified as ‘important for the IPCC assessment’, and that this is a big turn-off. If that’s the best thing the researcher can say about their work it’s probably not going to be that interesting. Of course, it might be that the research is fascinating and yields new insight into some of the big challenges of contemporary climate science. In that case the authors should say so. The challenges of contemporary climate science are not challenges because the IPCC says so; they are challenges because there are scientific and policy questions that need answering. Thomas Stocker, in his remarks, noted that one of the most important things to do in future climate research is to continue with ‘curiosity-driven research’. There are many examples of pure research that did not have any obvious application spawning major advances, often with great commercial success.

I’m no science policy scholar, so I won’t discuss where the balance should lie between ‘pure’ and ‘applied’ research, but this conference provided some food for thought. Some speakers emphasised both equally, generating a tension which isn’t easily resolved. Indeed, the majority of the ‘challenges’ identified at the meeting fell on the ‘applied’ side in the sense that they were suggestions to make climate research more policy-relevant. Perhaps that is unsurprising at a meeting structured around the IPCC, with its strong emphasis on policy-relevance.

One of the main challenges identified during the meeting was moving from the robust aspects of climate theory to those phenomena which actually matter to people on the ground. Robust aspects of climate theory are largely thermodynamically driven, argued Stephen Belcher. We understand that the accumulating energy balance of the Earth will lead to warming, and that the land will warm faster than the ocean. We understand that surface warming leads to greater evaporation and consequently, on average, greater precipitation. But the things we really care about are rather smaller in scale. We experience climate through weather events, and these are influenced as much by dynamic as thermodynamic factors. Unfortunately, we have much less confidence in our understanding of these dynamical processes. They have smaller spatial scales and shorter temporal scales, and so they are much more computationally demanding to model. They involve processes which are not well understood. Ted Shepherd has spoken similarly about the need to focus on the climate dynamics of global warming. It certainly seems like a fertile area for future research, though also a very challenging one.

On the subject of things that people actually care about, Mat Collins and David Stephenson both discussed moving from simplistic averages to the broader statistics of climate. We experience climate through weather, and we care about it most of all when it’s extreme. It’s the ‘tails’ of the probability distribution of weather events that we care about. Unfortunately, said Mat Collins, we don’t really have a good idea about how to assess this. Our current batch of climate model simulations are a statistically questionable sample – they have known deficiencies, biases and interdependencies. We need to address this or develop techniques to deal with it.

On the theme of translating our physical understanding into more relevant information, there was also some discussion of modelling politico-economic systems. Integrated Assessment Models attempt to do this, but there is no coordinated intercomparison of these models like there is for climate models. Some at the meeting objected, saying we don’t have good enough theory to credibly model economics. Perhaps that’s true, but just because something is complicated and uncertain doesn’t mean we shouldn’t try to model it; in fact, perhaps it means we should! An intercomparison would at least help us know where we stand.

A final note: this continued emphasis on relevance seems to me to require a greater role for values in presenting stories about what humans care about. Simon Caney spoke about the major breakthrough of including ethicists and philosophers in WG3. More broadly, I think a move to greater policy relevance would need everyone involved to be crystal clear about what is factual and what is normative (value-based). People were mostly good at that in this meeting. A productive discussion on climate change needs a good-quality factual basis and a wide range of normative viewpoints. There was even some discussion about how it might require new forms of collaborative decision-making.

Regardless, the very necessary shift towards policy relevance will mean the potential for even greater controversies. Sam Fankhauser spoke about the need to develop very clear channels for communication to help get around this: ‘whatever we say will be used in that very emotional debate’. It’s difficult and sometimes downright unpleasant, but I think ultimately we have to embrace that.




Transformational Climate Science – approaching the problem of climate change

On 15-16 May a diverse group of climate researchers gathered at the University of Exeter to discuss the state of climate change following the publication of the IPCC Fifth Assessment Report and the future of the field. In a previous post I discussed some of the key themes. Here I’m going to summarise some of what went on at the conference in terms of how we should approach climate change.

How does the IPCC work? Is climate research doing what it should? Should it change?

Chris Field presents an overview of the AR5 WG2 report. Credit: University of Exeter via Flickr.

The Transformational Climate Science meeting had sessions structured around the three IPCC working groups (The Physical Science Basis; Impacts, Adaptation and Vulnerability; Mitigation of Climate Change). However, the IPCC is not the bottom line in climate research. It’s important to remember that its main role is to summarise our state of knowledge rather than to do new research (though it does do this as well to some extent). However, the IPCC remains a convenient ‘hook’ on which to hang our deliberations about climate change, which is presumably why the meeting was structured as it was.

As a physical scientist, I was looking forward to learning about Working Groups 2 and 3 (WG2 and WG3), which bring together an astonishingly broad group of people: physical scientists, economists, sociologists, political scientists, philosophers… I got the impression the level of ‘cohesion’ was a little lower in these working groups than in WG1. In WG1 everyone has different specialisms, but participants probably understand each other’s ways of thinking well, whereas I don’t think that would be the case for people coming from the diverse cognitive traditions of WG2 and WG3.

Aside from the need to bring together people with different expertise to cover the subject matter, there’s another benefit to this diversity. In the meeting a number of IPCC authors acknowledged their work could not be completely free of value judgements. By bringing together a diverse group of people, the hope is that at least a range of different value systems can be considered. A number of authors also made it explicit when they were trying to be objective and report ‘IPCC opinion’, and when they were talking about their own personal opinion.

One of the challenges faced by the authors of the WG2 report was the tendency of negative impacts of climate change to be reported more than positive ones. Sari Kovats, in her remarks, explicitly noted this and pointed out this was something authors were aware of and attempted to deal with as best they could. She also described what she saw as the problems in writing a report with limited quantitative research. She gave the example of the Russian heatwave and wildfires of 2010. We do not have a good idea of the impacts of this event on human health, economic productivity or food supply. In short, we lack good data. This problem becomes worse in less developed countries, which is understandable but frustrating since we might also expect such countries to be more vulnerable to climate risks.

I thought Sari’s presentation was one of the most interesting at the meeting. It described nicely what the state of the art is when it comes to studying climate impacts. She described the challenges of interpreting small-scale qualitative studies with the goal of drawing conclusions for quantitative assessments of climate risk. Then she outlined what she thought WG2 did well and what she thought it didn’t. This includes the problem that less developed countries do not have the demographic and health data needed to assess climate impacts, and that the report did much better at describing regional inequalities in impacts than it did the socioeconomic inequalities. In a globalised world, perhaps socioeconomic divides are as important as geographical ones.

Chris Field gave some thoughts on the role of WG2. He saw it as a prompt for discussion of publicly acceptable solutions – the start of a dialogue rather than its end. I found this extremely encouraging, and in line with previous discussions of the importance of considering the value systems of different stakeholders.

I admit to finding this surprising. I had rather lazily assumed that IPCC reports didn’t include discussion of normative aspects of climate science and policy. It was encouraging to see Simon Caney talk specifically about this point. For the first time the WG3 report included a section on ethics. He pointed out that ‘dangerous’ is a value judgement, and that it is vitally important to consider people’s values. He gave the example of people who say ‘we should do whatever it takes to tackle climate change’. They almost certainly don’t mean that. Caney pointed out that different people have different priorities, and that it is unlikely anyone genuinely thinks climate change is the only priority.

Such perspectives are very valuable. Caney also brought in the view that the ‘right to emit’ is an odd concept. What matters for people is the access to energy to enable them to fulfil their requirements. He argued that Amartya Sen’s perspective on serving capabilities was more relevant than considering every person’s equal right to emit greenhouse gases. The emissions are a side-effect of the requirement for energy, and we should view responses to climate change in terms of serving capabilities rather than picking out such a side-effect.

One final thought – Saffron O’Neill pointed out that media coverage of WG1 is greater than either WG2 (one third less) or WG3 (three quarters less). Interestingly, the amount of Twitter activity on the conference hashtag also seemed lower during WG2 and WG3 sessions. It’s interesting to consider why this might be the case. One simple reason might be that the WG1 report is released first. But is there something deeper here? Do we ‘value’ the explicit and factual nature of WG1 more than the difficult, fuzzy, value-laden world of WG3? Perhaps, but I think that’s a shame. It seems especially odd that those who self-identify as ‘sceptics’ focus so much on WG1, when there’s a whole lot more stuff up for legitimate debate in WG2 and WG3.



Transformational Climate Science – meeting report

On 15-16 May 2014, the University of Exeter hosted an impressive array of climate change researchers from across the world. It was a medium-sized conference discussing the state of climate change research across all three working groups of the Intergovernmental Panel on Climate Change, along with goals and challenges for the future.

I found the meeting absolutely fascinating for all manner of reasons, most of which I hope to cover in two following blog posts. This post is something of an introduction.

Conference attendees gathering in the University of Exeter’s Forum. Credit: University of Exeter via Flickr.

One of the most obvious draws for me was that it brought together people from all three IPCC working groups. As a physical scientist I am familiar with the workings and results presented by the first working group, but the other two are rather more mysterious to me. This meeting served as a great summary. In case you’re not aware, the IPCC reports are produced by three separate groups:

Working Group 1 (WG1): The Physical Science Basis
Working Group 2 (WG2): Impacts, Adaptation and Vulnerability
Working Group 3 (WG3): Mitigation of Climate Change

These working groups operate rather separately. Once they have all released their reports they are combined in a synthesis report. The synthesis report for the Fifth Assessment goes to governments in October 2014. So, where next?

In the next two blog posts I’m going to discuss two themes which I felt ran through the conference.

The first is: how should we approach climate change? What kind of discussions should we be having, and how should they work? How should decisions be made?

The second is: what is the future of climate research? What information do we need and how can we get that information?

These questions are clearly inter-related. The first question is more of a political one, but the second is clearly also politically relevant, since ultimately the choice of what information we need lies with policymakers and the public. This points to one of the over-arching topics which transcended both themes: climate research and policymaking are a mixture of facts and values. In simple terms: it is a fact that the planet has warmed, will continue to warm to a greater or lesser degree, and that this warming will have impacts. However, what we do about it (or indeed whether we do anything about it) is a question of values. It is a normative question with no single right answer.

Even though facts might be seen as ‘valueless’, many of the speakers at the meeting argued there was no such thing. Asuncion St Clair quoted Bruno Latour: ‘no knowledge is neutral’. The way facts are presented requires the imposition of some kind of value system. Ottmar Edenhofer said at the conference that he sees the role of the IPCC as akin to that of a map-maker. The map-maker doesn’t tell the user which route to take. The map-maker examines the landscape and maps out the features, obstacles and characteristics of all paths. And yet the map-maker can’t just present the ‘facts’. The choice of what goes on the map depends on what the map-maker thinks the user needs. Take, for example, the difference between political and topographic maps. One presents largely artificial boundaries between nation-states; the other presents details of the landscape. Which one you choose depends on your needs.

Even though it’s not possible to be completely neutral, then, perhaps the IPCC could try to address this problem by providing as much information as possible. Of course, this doesn’t make it very readable, which is why there are two summaries that attempt to make the main points easier to grasp: the Summary for Policymakers (the content of which has to be agreed by governments) and a Technical Summary (which doesn’t). But the choice of what goes in there might also be normative.

Given its stated goal to be ‘policy relevant, not policy prescriptive’, and the enormous complexity of its subject matter, the IPCC often makes very careful statements emphasising precisely what we do and do not know. Chris Field pointed out that this leads to something of a problem. He said that some of the statements turned out so vague that they were open to almost any interpretation. Different media outlets could make very different readings of the report and come to sometimes diametrically opposed conclusions!

This raises the issue of framings. ‘Framing is everything in this debate’, said Georgina Mace. What this means is that, given a more-or-less neutral presentation of information, there is no single implication that naturally comes out. The implications of the findings of the IPCC depend on how one views the world. At the meeting Saffron O’Neill presented the results of some of her work on media framing of AR5. Common frames included ‘settled science’, ‘unsettled science’, ‘security’ and ‘morality and ethics’. She pointed out that different frames implied very different policy options.

In the coming blog posts I hope to draw out some more detail on the two main areas of the conference: how should we approach climate change and what is the future of climate research? After all that talk of framings it’s important to say that these are my personal impressions, and not an objective report. If you want to find out exactly what went on at the meeting, you can catch up on the presentations and panel discussions on the website.




How do we decide whether geoengineering is worth it?

Citation: A J Ferraro, A J Charlton-Perez, E J Highwood (2014) PLOS ONE, doi:10.1371/journal.pone.0088849

Some have proposed we take a different approach to climate change and attempt to stop global warming by reflecting sunlight. We have a new paper out today which asks the question: how do we decide whether such geoengineering would be effective?


Maps of effectiveness of geoengineering using a risk approach. The simulation uses stratospheric aerosols to balance the surface warming from a quadrupling of carbon dioxide.

Does geoengineering have the potential to reduce climate risk?

One way to exert a cooling influence on the climate would be to pump tiny particles up into the stratosphere, where they would reflect a small amount of the Sun’s energy. Should we consider intentionally modifying our environment in this way in order to affect the climate? Some argue there is a chance of unintended side effects, and that such meddling is too risky. Others argue the opposite: that it is too risky to allow global warming to continue.

What are these risks? A basic way to think about it is that people are adapted to our present climate. They are used to a particular mix of warm and cold, wet and dry. As climate changes, this mix will also change, posing a risk to those not prepared for it. For example, a warmer climate might be seen as a risk for healthcare systems not equipped to deal with medical problems associated with heat waves. A wetter climate might increase the risk of flooding. Risks like this could be costly – which is essentially why climate change could pose a problem.

Could geoengineering be used to help? Geoengineering with stratospheric aerosols might pose risks of its own: reduced rainfall, depletion of the ozone layer. It might also produce benefits: reduced warming and enhanced agricultural productivity. We need a way to compare the risks and benefits of geoengineering with the risks and benefits of not geoengineering (here, we are assuming we don’t do a good job of reducing greenhouse gas emissions).

How do we weigh up different kinds of risk?

Consider this: you are diagnosed with a medical condition which may deteriorate in future and cause you difficulty. You are given the option of a treatment which might stop the symptoms of the disease but may also have other side-effects. Do you take the treatment? You have to weigh up the risks.


A matrix showing the different outcomes of geoengineering. On the horizontal axis is the probability of a big climate change under carbon dioxide. On the vertical axis is the probability of a big change in climate under geoengineering. [EDIT: Thanks to the reviewer who suggested this method of presentation!]

In the same way we have to weigh up the risks to decide whether geoengineering is worthwhile. We would want it to reduce climate risk compared to not geoengineering. But there’s another layer of complexity here. Perhaps the reduction in risk happens somewhere that wasn’t actually at high risk of big climate changes in the first place. So perhaps no one cares?

We looked at this by dividing climate risk into four possible outcomes, shown in the diagram on the left. The horizontal axis shows the chance of getting a substantial climate change in the first place from carbon dioxide. The vertical axis shows the chance of getting a substantial change from geoengineering. So, if geoengineering reduces climate risk but there wasn’t much risk to start with (a low chance of substantial climate change on the horizontal axis), we classify geoengineering as ‘benign’ (it hasn’t really done much). If geoengineering reduces risk where carbon dioxide increases risk, we classify geoengineering as ‘effective’. But what if geoengineering increases risk? We classify it as ‘ineffective’ if geoengineering introduces climate risk in a similar manner to carbon dioxide. Finally, if geoengineering introduces climate risk into areas which were not previously at risk from carbon dioxide-driven climate change, we classify geoengineering as ‘damaging’.
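In code, that classification might look something like the minimal sketch below. I’m assuming we have, for each grid point, the probability of a ‘substantial’ change under carbon dioxide alone and under geoengineering; the 0.5 probability threshold and the sample points are illustrative stand-ins, not the values used in the paper.

```python
# A minimal sketch of the four-way risk classification described above.
# Assumption: for each grid point we already have the probability of a
# 'substantial' climate change under CO2 alone (p_co2) and under
# geoengineering (p_geo). The 0.5 threshold and the sample points below
# are illustrative stand-ins, not the values used in the paper.

def classify(p_co2, p_geo, threshold=0.5):
    """Return the risk-matrix category for one grid point."""
    risky_without = p_co2 >= threshold  # substantial change likely under CO2 alone
    risky_with = p_geo >= threshold     # substantial change likely under geoengineering
    if risky_without and not risky_with:
        return "effective"    # geoengineering removes a risk that was there
    if not risky_without and not risky_with:
        return "benign"       # little risk either way; not much achieved
    if risky_without and risky_with:
        return "ineffective"  # the risk persists despite geoengineering
    return "damaging"         # geoengineering creates risk where there was none

for p_co2, p_geo in [(0.9, 0.1), (0.1, 0.05), (0.8, 0.7), (0.2, 0.9)]:
    print(f"p_co2={p_co2:.2f}, p_geo={p_geo:.2f} -> {classify(p_co2, p_geo)}")
```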

This way of looking at things can be used to classify climate changes. The maps in this post give an example: temperature and precipitation from a climate model. The ‘global warming’ case involves a climate with levels of carbon dioxide four times what we have now, and a climate about 4 degrees C warmer. The ‘geoengineering’ case uses stratospheric aerosols to counterbalance this warming. So as expected, if you look at temperature, geoengineering is largely effective. But rainfall looks rather different. Geoengineering is not effective in quite large parts of the globe.

Trade-offs

We have made some subjective choices here, and different choices would give quite different results as to the effectiveness of geoengineering. To further complicate things, I would expect different climate models to paint quite different pictures of regional changes.

Geoengineering isn’t necessarily good or bad. It involves a trade-off between risks, and these risks are different for different aspects of climate. As these (and many previous) results have shown, it might not be a good idea to use geoengineering to counterbalance all warming, because doing so would produce large rainfall changes. Approaches like the one described here could be used to find the optimum level of geoengineering that would minimise changes in both temperature and rainfall.



Impact of geoengineering on rainfall could be greater than we thought

Citation: A J Ferraro, E J Highwood, A J Charlton-Perez (2014) Environ. Res. Lett. 9 014001 doi:10.1088/1748-9326/9/1/014001

Aerosol layer (grey stripe in centre) produced by the 1991 eruption of Mt. Pinatubo.

I have a paper out today (with my PhD supervisors Ellie Highwood and Andrew Charlton-Perez) which suggests that the impact of geoengineering on rainfall in the tropics could be greater than we thought.

Geoengineering is a proposed response to climate warming driven by greenhouse gases. Basically, the idea is to mimic the effects of a large volcanic eruption on the Earth’s climate by injecting tiny particles called aerosols into the stratosphere. These particles would reflect a small amount of the energy coming from the Sun, cooling the planet. The basic idea makes sense, and from observing the climate following volcanic eruptions we know it could provide some cooling.

It’s also well understood that using geoengineering to counteract the warming effects of greenhouse gases and bring the surface temperature down would reduce global rainfall below the level we would have if there were neither geoengineering nor enhanced greenhouse gas levels. This is because the reduction in solar energy reaching the surface means there is less energy available to evaporate water, so the atmosphere has less water available to fall as rain.
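As a rough illustration of that energy-budget argument, here is a back-of-envelope calculation of my own, with round illustrative numbers rather than figures from the paper: global-mean rainfall of about 1000 mm per year corresponds to a latent heat flux of roughly 80 W/m², so each W/m² of energy withheld from evaporation implies a little over 1% less global rain.

```python
# A back-of-envelope sketch of the energy-budget argument above. The
# numbers are round, illustrative values of my own, not results from
# the paper.
L_V = 2.5e6              # latent heat of vaporisation of water, J/kg
RHO_W = 1000.0           # density of water, kg/m^3
SECONDS_PER_YEAR = 3.156e7

# Global-mean precipitation of ~1000 mm/yr (= 1 m/yr), as a rate in m/s.
precip = 1.0 / SECONDS_PER_YEAR

# The latent heat flux needed to evaporate that much water (~80 W/m^2).
latent_flux = L_V * RHO_W * precip
print(f"Latent heat flux sustaining ~1000 mm/yr: {latent_flux:.0f} W/m^2")

# Rainfall change implied by removing 1 W/m^2 of evaporative energy supply.
dP = 1.0 / (L_V * RHO_W) * SECONDS_PER_YEAR * 1000.0  # mm/yr per W/m^2
print(f"~{dP:.1f} mm/yr (~{dP / 1000.0 * 100.0:.1f}%) less rain per W/m^2")
```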


Tropical temperature changes from carbon dioxide and geoengineering

But my research suggests there’s another effect stratospheric aerosols have on rainfall, especially in the Tropics. Here, rain is mainly produced by towering convective clouds which transport heat energy up from the surface to the atmosphere.

Our paper shows that geoengineering aerosols in the stratosphere emit radiation (as shown in the picture above) down into the troposphere below, warming the upper troposphere and interfering with this convection. In essence, the heating from the aerosol increases the stability of the tropical troposphere.

We don’t see the increase in stability when geoengineering is represented by just turning down the Sun (right-hand panel in the picture above) because there isn’t any aerosol in the stratosphere to emit radiation downwards*.

This effect could be quite important, depending on how strongly aerosols interact with radiation in the way I just described. In my climate model simulations I used one particular type of sulphate aerosol with specific radiative properties. However, it’s possible that aerosols in the real atmosphere could behave rather differently. This research shows it’s important to get the aerosol properties right if you want to correctly predict the effects of stratospheric aerosol geoengineering on the climate.

It’s very difficult to know what the properties of geoengineering aerosols in the real atmosphere might be. It’s not clear how much the aerosols would ‘clump’ together, which would increase their size and increase the amount of energy emitted into the troposphere. This is important because the more energy emitted down into the troposphere, the weaker tropical convection (and rainfall) becomes.

Geoengineering isn’t a ‘quick fix’ for the problem of greenhouse-gas-driven climate change. We’ve known that for a long time. This research shows that there are some important side-effects of geoengineering which should be taken into account when thinking about whether or not it’s a viable option. How important these side-effects are depends on the size and properties of the aerosol, which, as I’ve said, we don’t really know. In order to work out what geoengineering does and doesn’t do, we’d have to crack the tricky problem of understanding how the aerosols behave in the atmosphere.

* EDIT: This is important. Solar dimming geoengineering to counterbalance increasing CO2 concentrations decreases rainfall from pre-industrial levels, but globally this decrease is smaller in magnitude than the increase that would happen from CO2 alone. So in that sense solar dimming geoengineering gets us closer to the pre-industrial ‘baseline’. Including the aerosol effect on tropical rainfall, however, shows that the reduction in rainfall from aerosol geoengineering to counterbalance increasing CO2 concentrations is about the same size as the increase that would happen from CO2 alone. So sulphate aerosol geoengineering to counteract CO2 takes us about as far from the ‘baseline’ as CO2 alone does.