
Despite Rising Carbon Emissions - Global Mean Temperatures Have Been Flat

By Phil Covington

A couple of weeks ago, The Economist ran a report on climate change stating that over the last 15 years, air temperatures at the Earth's surface have remained flat despite greenhouse-gas emissions continuing to soar. It went on to draw attention to a recent report by James Hansen (the well-known climate scientist who retired from NASA's Goddard Institute for Space Studies this month) which asserts, "The five-year mean global temperature has been flat for a decade." The clear implication is that temperatures are not rising as fast as predicted.

The piece goes on to cite a study by Ed Hawkins at the University of Reading in the UK that tracks actual global mean temperatures against 20 climate models, and finds that the observed temperatures since 2005 are already running at the low end of those models' projected temperature range. If observed mean temperatures remain flat over the next few years, The Economist suggests, the Earth's temperature would fall below even the lowest of those projections.

So does this mean climate change is less of a problem than climate science has suggested? Not really.

The Economist argues the mismatch between increasing levels of carbon and lower-than-predicted temperature increases is "among the biggest puzzles in climate science." However, the James Hansen report it cites (co-authored by M. Sato and R. Ruedy) does not appear to share The Economist's quandary.

In fact, the quote above as it appeared in The Economist is incomplete. Hansen's report actually says, "The five-year mean global temperature has been flat for a decade, which we interpret as a combination of natural variability and a slowdown in the growth rate of net climate forcing."

Climate forcing is defined as "an imposed perturbation of the planet's energy balance that would tend to alter global temperature." A forcing can exert either a warming or a cooling effect: fluctuations in solar irradiance are one forcing agent, atmospheric aerosols (small suspended particulates, either manmade or volcanic) are another, and carbon dioxide is, of course, a third.
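To make the CO2 forcing concrete, a commonly used approximation (not from the article; it is the logarithmic fit published by Myhre et al. in 1998, with a conventional pre-industrial baseline of 280 ppm) can be sketched as:

```python
import math

def co2_forcing(c_ppm, c_ref=280.0):
    """Approximate radiative forcing (W/m^2) from a change in CO2
    concentration, using the widely cited logarithmic fit
    dF = 5.35 * ln(C / C0) (Myhre et al., 1998).
    c_ref is the conventional pre-industrial baseline of ~280 ppm."""
    return 5.35 * math.log(c_ppm / c_ref)

# A doubling of CO2 from the pre-industrial baseline yields
# roughly 3.7 W/m^2 of forcing:
print(round(co2_forcing(560), 2))
```

Note that the forcing grows with the logarithm of concentration, which is why climate sensitivity is usually quoted "per doubling" of CO2 rather than per ppm.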

The Hansen report concludes that despite the slowdown in climate forcing, background global warming is continuing. The flat five-year running mean, the report says, may largely be a consequence of the first half of the past decade having predominantly El Niño (warming) conditions, while the second half had predominantly La Niña (cooling) conditions. The report also notes that we have been in a prolonged solar minimum, which likewise has a cooling effect.

In addition, and this is important, the report points out that even though an observed flattening of temperatures has occurred, the "standstill" temperature is nonetheless at a much higher level than in any year of the prior decade except 1998 (a strong El Niño year). Bottom line: the planet is still hotter.

It is therefore dangerous and incorrect to conclude that the recent flattening of surface temperatures means climate change is over. Furthermore, the short period of observed flattening is far too brief a time scale to signify a change in trend. The University of Reading study mentioned previously shows actual temperatures clearly trending upward since 1950, when its data begins.

That said, it would also be ill-advised to dismiss the mismatch between observed temperatures and modeled projections as irrelevant. What does it say about the quality of the models? What the models attempt to estimate is "climate sensitivity" - put simply, the effect CO2 concentrations have on temperature.

Climate modeling methodologies differ, and this leads to variability. The Economist points out that the Intergovernmental Panel on Climate Change (IPCC) bases its findings partly on "General Circulation Models" (GCMs), which are very detailed and complex, and in which "the sensitivity is based upon how accurately the model describes the processes and feedbacks in the climate system." However, a potential flaw The Economist points to is that GCMs do not respond to new temperature readings: the newer (flattened) temperature data is not fed back into the models, potentially affecting their accuracy.

The other modeling methodology The Economist mentions is "energy balance models." These are less complex, but they do explicitly use temperature data to estimate the sensitivity of the climate system, thereby feeding actual observations back into the models.

Which methodology is more accurate? The IPCC's position on climate sensitivity is that a doubling of CO2 would produce a temperature increase of between 2 and 4.5 degrees Celsius, with a likely estimate of around 3 degrees Celsius. Other models The Economist cited suggest more conservative estimates, though the magazine has been criticized for relying on a flawed study in its piece.

One thing it seems safe to say, though, is that climate science continues to be subject to uncertainty. For example, the Hansen report expressly admits that "aerosol forcing is extremely uncertain," and later states, "The one major wild card in projections of future climate change is the unmeasured climate forcing due to aerosol changes and their effects on clouds." These statements do not exactly indicate a definitive understanding of at least some of the important components of climate modeling. So while a consensus exists among climate scientists, it coexists with knowledge gaps, differing assumptions, and differing methodological approaches.

It is already politically difficult to implement policies that would mitigate carbon emissions, and it will likely remain so. And no doubt it would be harder still to bring about carbon mitigation policies if today's atmospheric concentration of 392 ppm CO2 (and growing) leads to smaller-than-expected rises in temperature. What if the flattening of temperatures continues?

Given the uncertainty, however, there is no reason to be complacent. The information in The Economist doesn't let us off the hook on carbon emissions - after all, the planet is already hotter, and 2012 was the hottest year on record in the United States.

I think The Economist said it best back in 2010, when it concluded its piece "The Science of Climate Change - The Clouds of Unknowing" as follows: "The doubters are right that uncertainties are rife in climate science. They are wrong when they present that as a reason for inaction."

[image credit: Mikael Miettinen]


Phil Covington holds an MBA in Sustainable Management from Presidio Graduate School. In the past, he spent 16 years in the freight transportation and logistics industry. Today, Phil's writing focuses on transportation, forestry, technology and matters of sustainability in business.

Read more stories by Phil Covington