Malte Meinshausen on Reducing Greenhouse Gas Emissions
New Hot Paper Commentary, July 2010
Article: Greenhouse-gas emission targets for limiting global warming to 2 degrees C
Authors: Meinshausen, M; Meinshausen, N; Hare, W; Raper, SCB; Frieler, K; Knutti, R; Frame, DJ; Allen, MR
Malte Meinshausen talks with ScienceWatch.com and answers a few questions about this month's New Hot Paper in the field of Geosciences.
Why do you think your paper is highly cited?
Our paper directly addressed a policy-relevant question, i.e., how much greenhouse gas emissions need to be reduced at a global scale in order to stay below 2C of global warming relative to pre-industrial levels. Taking into account the broad range of existing literature probably helped to trigger people's interest.
Furthermore, some people may have been interested in our attempt to bridge the gap between the academic geophysical insight, i.e., that it is the cumulative carbon budget that matters for CO2-induced warming, and the political realities, i.e., discussing milestones for emission levels in 2020 and up to 2050.
Sometimes, our paper is cited as a study that highlights the fact that cumulative carbon emissions matter. Here, of course, many colleagues deserve credit. They presented this point excellently in a number of papers last year (Allen et al., 2009, Nature; Matthews et al., 2009, Nature; Zickfeld et al., 2009, PNAS), or even earlier, sometimes more or less between the lines (such as Matthews and Caldeira, 2008, GRL; Solomon et al., 2009, PNAS; Hare, 1997).
Does it describe a new discovery, methodology, or synthesis of knowledge?
"...our research might have been helpful, providing a sense of the uncertainties involved and offering a tool for decision makers."
This study mainly provides a synthesis of knowledge, which is new insofar as it attempts to comprehensively capture the uncertainties along a large part of the cause-effect chain, i.e., from anthropogenic emissions to CO2 concentrations and global mean temperatures. A new methodological aspect was that our historical constraining of future climate projections took several independent uncertainties in historical temperature observations into account, which affects the width of future uncertainty ranges.
Also new, in terms of the presentation of the results, was our decision to move one step beyond the fact that the cumulative carbon budget is the key determinant of CO2-induced warming. On the one hand, we took non-CO2 emissions into account. On the other hand, we attempted to bridge the gap between the geophysical "truth" that cumulative emissions "over all times" are crucial and the political "realities" of international negotiations, in which emission milestones for 2020 and up to 2050 are discussed.
It turned out that, if the 2C target is to be met with a likely chance, this translation of scientifically robust indicators into policy milestones can work quite well: all cumulative emissions between now and the point when the peak temperature is reached determine how high that peak will be.
In the more ambitious mitigation scenarios that have a likely chance of meeting the 2C target, peak temperatures are approximately reached by 2050 or 2060. Thus, to stay below 2C with a likely chance, keeping within a 1,000 GtCO2 emission budget between 2000 and 2050 is a necessary but not sufficient condition.
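To make the budget arithmetic concrete, here is a minimal sketch in Python. The 1,000 GtCO2 budget figure is the paper's; the emitted-to-date and current-rate numbers are round illustrative assumptions of mine, not values from the study.

```python
# Cumulative-budget arithmetic for a 1,000 GtCO2 (2000-2050) budget.
# The emitted-to-date and current-rate figures below are illustrative
# assumptions, not the paper's numbers.

BUDGET_2000_2050 = 1000.0   # GtCO2, for a "likely" chance of staying below 2C
EMITTED_2000_2009 = 350.0   # assumed CO2 emitted 2000-2009, GtCO2
CURRENT_ANNUAL = 35.0       # assumed current annual emissions, GtCO2/yr

remaining = BUDGET_2000_2050 - EMITTED_2000_2009   # budget left for 2010-2050
years_left = 2050 - 2010
allowed_average = remaining / years_left           # GtCO2/yr, on average

print(f"Remaining budget 2010-2050: {remaining:.0f} GtCO2")
print(f"Allowed average annual emissions: {allowed_average:.1f} GtCO2/yr")
print(f"Implied cut vs. current rate: {1 - allowed_average / CURRENT_ANNUAL:.0%}")
```

At the assumed constant rate of 35 GtCO2/yr, the remaining 650 GtCO2 in this illustrative calculation would be exhausted before 2030, which is what makes near-term milestones such as 2020 emission levels so consequential.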
Would you summarize the significance of your paper in layman's terms?
Our habit of producing energy by burning fossil fuels causes global warming. Civil society and policy makers will have to make a choice about how much climate change is too much. Currently, the international community regards 2C as such a level, which is not safe, but which might prevent many of the more dramatic impacts.
For small island states, even 2C implies long-term sea level rise that affects them existentially, which is why these states call for a 1.5C target. In any case, once policy makers have set a temperature target, scientists can work out what is necessary to achieve it.
Of course, there are uncertainties involved. It is a bit like a policy target to reduce the number of deaths on a highway: setting a speed limit of 130 km/h will reduce the number of deaths; setting a 100 km/h limit will reduce fatal accidents quite a bit more. Neither speed limit will be safe, however.
Thus, our study attempted to provide some guidance concerning how much greenhouse gas emissions will have to be reduced over the coming decades in order to have a likely chance of keeping warming below 2C.
How did you become involved in this research, and how would you describe the particular challenges, setbacks, and successes that you've encountered along the way?
This work started a long time ago. In fact, it was part of my Ph.D. proposal 10 years ago, and I was simply missing a piece of the puzzle each time I tried to answer a question like "How much do emissions need to be reduced to avoid temperature levels of X?"
"...we attempted to bridge the gap between the geophysical "truth" that cumulative emissions "over all times" are crucial and the political "realities" of international negotiations, in which emission milestones for 2020 and up to 2050 are discussed."
During my Ph.D., I was mainly concerned with multi-gas emission pathways and the uncertain link between stabilized greenhouse gas concentration levels and temperatures. In the years thereafter, I started to look for a climate model with which I could synthesize all those uncertainties.
This was one of the motivations for reprogramming the Fortran code of the coupled carbon cycle-climate model MAGICC (developed by Tom Wigley and Sarah Raper) so that historical constraining could be done with it (see Meinshausen, 2008, ACPD). However, capturing the uncertainties in transient climate change, the historical constraining of the climate model, and combining all those puzzle pieces still took a while.
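As an illustration of what historical constraining involves, here is a minimal toy sketch, entirely my own and far simpler than MAGICC: candidate parameter values are weighted by how well their hindcast matches observed temperatures, and future projections are then run with the re-weighted ensemble. The one-parameter "model" and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "observations": a linear warming trend plus observational noise
# (all numbers hypothetical).
years = np.arange(1900, 2010)
obs = 0.007 * (years - 1900) + rng.normal(0.0, 0.1, years.size)
obs_sigma = 0.1  # assumed observational uncertainty, degrees C

# Prior ensemble: candidate values of a single warming-rate parameter,
# standing in for uncertain quantities like climate sensitivity.
candidates = rng.uniform(0.0, 0.02, 2000)

def toy_model(rate):
    """Trivial stand-in for a climate model: linear warming."""
    return rate * (years - 1900)

# Gaussian log-likelihood of each candidate's hindcast vs. the observations.
log_w = np.array([
    -0.5 * np.sum(((toy_model(r) - obs) / obs_sigma) ** 2)
    for r in candidates
])
weights = np.exp(log_w - log_w.max())  # subtract max for numerical stability
weights /= weights.sum()

# The constrained ensemble: the prior resampled by observational fit.
# Future projections run with this ensemble inherit the constrained spread.
posterior = rng.choice(candidates, size=2000, p=weights)
print(f"prior mean: {candidates.mean():.4f}  constrained mean: {posterior.mean():.4f}")
```

In the real procedure the model has many uncertain parameters and, as noted above, the likelihood also accounts for several independent uncertainties in the historical temperature observations themselves.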
Where do you see your research leading in the future?
Global mean temperatures are only an indicator for multiple regional climate impacts and their severity. Thus, continuing to model that long cause-effect chain with its many uncertainties, from the rules of climate protection agreements and national emissions to global temperatures and further to regional impacts, sea level rise, and so on, is going to keep me entertained for the coming years, I believe.
We strive to do just that in the PRIMAP research group at the Potsdam Institute for Climate Impact Research; the acronym stands for the slightly lengthy "Potsdam Real-time Integrated Model for the probabilistic Assessment of emission Pathways."
Do you foresee any social or political implications for your research?
One would hope so. Although last year's Copenhagen climate conference fell short of public expectations in many respects, the international community agreed on a common goal: to limit global warming to below 2C relative to pre-industrial levels, and to revisit a potential 1.5C goal by 2015.
The question, then, is by how much and how quickly greenhouse gas emissions have to be reduced to avoid the climate impacts associated with warming beyond 2C. In this regard, our research might have been helpful, providing a sense of the uncertainties involved and offering a tool for decision makers.
If they choose to aim for the 2C goal with a reasonable likelihood, global cumulative emissions between 2000 and 2050 would need to be limited to 1,000 GtCO2, which implies that global emissions in 2050 must be less than half of 1990 levels. However, it is relatively clear that, in its current non-binding form and given the emission reduction pledges written into it to date, the Copenhagen Accord cannot ensure that the 2C goal will be achieved (see our most recent publication, Rogelj et al., 2010, Nature).
Dr. Malte Meinshausen
PRIMAP Team Leader
Earth System Sciences
Potsdam Institute for Climate Impact Research
KEYWORDS: GREENHOUSE-GAS EMISSION TARGETS, GLOBAL WARMING LIMIT, CLIMATE SENSITIVITY, TEMPERATURE, UNCERTAINTIES, PROJECTIONS, CARBON CYCLE.