Category Archives: long term forecasting

CO2 Concentrations Spiral Up, Global Temperature Stabilizes – What Gives?

Predicting global temperature is challenging. This is not only because climate and weather are complex, but because carbon dioxide (CO2) concentrations continue to skyrocket, while global temperature has stabilized since around 2000.

Changes in Global Mean Temperature

The NASA Goddard Institute for Space Studies maintains extensive and updated charts on global temperature.

[Chart: annual change in global mean temperature, NASA GISS]

The chart of changes in annual mean global temperature is compiled from weather stations around the planet.

There is also hemispheric variation, with the northern hemisphere showing larger increases than the southern hemisphere.

[Chart: mean temperature changes by hemisphere]

At the same time, observations of the annual change in mean temperature have stabilized since around 2000, as the five-year moving averages show.
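The smoothing behind those curves is simple to reproduce. Here is a minimal Python sketch of a centered five-year moving average, using made-up anomaly values rather than the actual GISS series:

```python
# Centered five-year moving average, the smoothing used in the GISS charts.
# The anomaly values below are illustrative, not the actual GISS data.
def moving_average(series, window=5):
    """Return centered moving averages; the first and last window//2 points are dropped."""
    half = window // 2
    return [
        sum(series[i - half:i + half + 1]) / window
        for i in range(half, len(series) - half)
    ]

anomalies = [0.40, 0.42, 0.55, 0.62, 0.54, 0.68, 0.64, 0.66, 0.54, 0.63]
smoothed = moving_average(anomalies)
print([round(x, 3) for x in smoothed])
```

Because the average is centered, the smoothed series is shorter than the raw series by four years, which is why the moving-average curve stops short of the most recent observations in the charts.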

Atmospheric Carbon Dioxide Concentrations

The National Oceanic and Atmospheric Administration (NOAA) maintains measurements of atmospheric carbon dioxide taken at Mauna Loa in Hawaii. These show a continual increase since measurements began in the late 1950’s.

Here’s a chart showing recent monthly measurements, highlighting the consistent seasonal pattern and strong positive trend since 2010.

[Chart: recent monthly Mauna Loa CO2 measurements, since 2010]

Here’s all the data. The black line in both charts represents the seasonally corrected trend.

[Chart: full Mauna Loa CO2 record, 1958 to present]

A Forecasting Problem

This is a big problem for anyone interested in predicting the future trajectory of climate.

So, according to these measurements on Mauna Loa, carbon dioxide concentrations in the atmosphere have been increasing monotonically (with seasonal variation) since 1958, when measurements first began. Yet global temperatures have not increased on a clear trend since around 2000.

I want to comment in detail sometime on the forecasting controversies that have swirled around these types of measurements and their interpretation, but here let me just suggest the outlines of the problem.

So, it’s clear either that the relationship between atmospheric CO2 concentrations and global temperature is not linear, or that there are major intervening variables. Cloud cover may increase with higher temperatures, due to more evaporation. The oceans are still warming, so maybe they are absorbing the additional heat. Perhaps there are other complex feedback processes involved.

However, if my reading of the IPCC literature is correct, these suggestions are still anecdotal, since the big systems models seem quite unable to account for this trajectory of temperature – or at least, recent data appear as outliers.

So there you have it. As noted in earlier posts here, global population is forecast to increase by perhaps one billion by 2030. Global output, even given uncertain impacts of coming recessions, may grow to $150 trillion by 2030. Emissions of greenhouse gases, including but not limited to CO2, also will increase – especially given the paralyzing impacts of the current “pause in global warming” on coordinated policy responses. Deforestation is certainly a problem in this context, although we have not here reviewed the prospects.

One thing to note, however, is that the first two charts presented above trace out changes in global mean temperature by year. The actual level of global mean temperature surged through the 1990’s and remains high. That means that ice caps are melting, and various processes related to higher temperatures are currently underway.

Population Forecasts, 2020 and 2030

The United Nations population division produces widely-cited forecasts with country detail on a number of key metrics, such as age structure and median age.

The latest update (2012 revision) estimates 2010 base population at 6.9 billion persons, projecting global population at 7.7 billion and 8.4 billion in 2020 and 2030, respectively, in a medium fertility scenario.

The low fertility scenario projects 7.5 billion persons for 2020 and approximately 8.0 billion for 2030.

So, bottom line, global population is unlikely to peak during this forecast period to 2030, although it is likely to decline, under all fertility scenarios, for key players in the global economy – such as Japan and Germany.

Population decline is even possible, according to the 2012 revision, in a low fertility scenario for China, although not with higher birth rates, as indicated in the following chart.

[Chart: UN population projections for China and India]

Some rudimentary data analytics shows the importance of the estimate of median age in a country for its projected population growth in the 2012 revision.

For example, here is a scatter diagram of the median age within a country (horizontal or x-axis) and the percentage increase or decrease 2010-2030 in the medium fertility scenario of the UN projections. Thus, just to clarify, a 60 percent “percentage growth” on the vertical axis means 2030 population is 60 percent larger than the estimated base year 2010 population.

[Chart: scatter of median age vs. projected percentage population growth, 2010-2030]

Note that a polynomial regression fits this scatter of points with a relatively high R2. This indicates that median age is negatively related to a country’s projected population change over this period, with the polynomial capturing the drop-offs at the youngest and oldest median ages among countries.
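As a rough illustration of this kind of fit, here is a Python sketch using numpy’s polyfit on hypothetical (median age, percentage growth) pairs shaped like the UN scatter. These are illustrative numbers, not actual UN figures:

```python
import numpy as np

# Hypothetical (median age, % population growth 2010-2030) pairs mimicking
# the shape of the UN scatter; not real UN data.
median_age = np.array([18.0, 21, 25, 28, 32, 35, 38, 42, 45])
pct_growth = np.array([62.0, 55, 42, 33, 22, 14, 7, -1, -6])

# Second-degree polynomial fit, as in the chart's polynomial regression
coeffs = np.polyfit(median_age, pct_growth, deg=2)
fitted = np.polyval(coeffs, median_age)

# R-squared of the fit
ss_res = np.sum((pct_growth - fitted) ** 2)
ss_tot = np.sum((pct_growth - pct_growth.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(round(float(r_squared), 3))
```

With a scatter this well-behaved, the quadratic term mostly captures curvature at the extremes; the dominant relationship is the downward slope from young to old median ages.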

Thus, in the first chart, Chinese growth includes the possibility of a decline over this period, and India’s does not. This is related to the fact that China’s median age in 2010 is estimated at 34.6 years, while India’s is estimated at 25.5 years.

China and India, of course, are the world’s two most populous countries.

Here are some other interesting charts from the UN projections.

Russia, Japan, and Germany

[Chart: population projections for Russia, Japan, and Germany]

The comparison for these countries is between the high fertility and low fertility scenarios. The middle fertility scenario lies pretty squarely between these curves for each nation.

Indonesia, Brazil, and Nigeria

[Chart: population projections for Indonesia, Brazil, and Nigeria]

Nigeria has the highest population growth rate of any larger country for this period, again because its 2010 median age is listed as around 18 years.

Accuracy of UN Population Forecasts

The accuracy of UN population forecasts has improved over the past several decades, with improved estimates of base population (see, for example, Data Quality and Accuracy of UN Population Projections, 1950-1995). Needless to say, forecasts for industrially developed countries usually have been better than those for nations in the developing world.

Changes in migration account for significant errors in national population forecasts, as when a large contingent, some legal, some side-stepping legal immigration channels, came from Mexico and other Spanish-speaking areas “South of the Border,” changing birth patterns in the US from the early 1990’s to the years after 2000. In fact, during the early 1990’s, the Census Bureau was predicting that peak US population might occur as early as 2025. This idea went by the wayside, however, as younger, more fertile Hispanic families took their place in the country.

Current UN forecasts indicate US population should increase in the medium fertility scenario from 312 million to 338 million by 2020 and 363 million by 2030.

2020 and 2030 – Forecasts and Projections

I’d like to establish a context for discussing longer term forecasts, in this case to 2020 and 2030.

So, just below, I give you my take on 1990-2005. A lot happened that was unanticipated at the beginning of this period. One should expect, I think, the same to be true for 2015-2030.

Along those lines, I also suggest Big Picture factors that may come into play over the next fifteen or so years.

In coming posts, I want to summarize forecasts and projections I have seen for this period.

And I’m a little unusual in the technical forecasting community, since I’m equipped to do matrix programming, discuss boosting and bagging and so forth, and, on the other side of the aisle, weave together these stories and scenarios about process, causes, and factors. The quantitative is usually where I get paid, but, at the same time, I think it is easy to underestimate the benefit of trying to keep track of the Big Picture, the global dynamics, the political economy, and so forth.

1990-2005

The 1990’s rolled out with a nasty little recession in 1991 and voters throwing the first George Bush out of office, in favor of a saxophone-playing former Governor of Arkansas with a penchant for the ladies. Then, the United States experienced the longest period of economic prosperity since the 1960’s, fueled by the tech revolution and rise of the Internet. The breakup of the Soviet Union became official, with democratic forms struggling to take root in Russia and the former Soviet Republics. The US defense budget was cut about 40 percent from 1980 levels. Deregulation became a theme, and deregulation of telecoms led to burgeoning investments in telecom systems. The end of the decade saw the absurd Y2K problem, where two-digit year fields in computer clocks were supposed to stop everything at midnight at the turn of the century.

The New Millennium saw another recession in 2001, which was particularly sharp for the tech industry. Another Bush took the Presidency, after the Supreme Court intervened in the disputed General Election. Then there was 9/11 – September 11, 2001, with the destruction of the World Trade Center by large airliners being flown into the upper stories. This was a pivotal event. There was an immediate surge in the military budget and in US military action in Afghanistan, and then the invasion of Iraq, putatively because Saddam Hussein possessed “weapons of mass destruction.”

The US economy pretty much languished after the 2001-2002 recession, being stimulated to an extent by the rise in the defense budget, then by housing activity triggered by continued lowering of interest rates by the US Federal Reserve Bank under the redoubtable Alan Greenspan.

Another development that became especially noticeable after 2000 was the rise of China as a manufacturing and export power. The construction of the Shanghai skyline from the late 1990’s to the middle of the last decade was nothing less than stupendous.

The Importance of Technical Change

So what is important over a span of time? Are there underlying determinants?

I’ve got to believe technical change is an important element in historical process. If we take the fifteen year period sketched above, for example, a lot of the story is driven, at some level, by technical developments, especially in information technology (IT).

My favorite explanation of the collapse of the Soviet Union, for example, includes Silicon Valley as a key driver. The Soviet planned economy was a huge lumbering machine, compared to the nimble, change-oriented shops in the Valley, innovating new computer setups every few months. One immediate consequence was that US fighter aircraft, with their electronically guided missiles and tracking systems, came to totally dominate the old MIG planes.

And to go on in this vein, focusing on the rise of US tech and then the movement of production to China is a strategic process for understanding the past couple of decades.

Big Picture Factors

Suffice it to say – new technology will be as much a driver of change in the next fifteen years, as it has been over the past fifteen.

Indeed, according to the futurist Ray Kurzweil, something called The Singularity stalks the human future. Perhaps around 2045, somewhat outside our forecast horizon in this discussion, technology will converge to completely outperform human intelligence. Commentators ranging from Stanislaw Ulam to Kurzweil believe that it is impossible to project human history beyond this point – hence the name.

Conventionally, this will involve biotechnology, computer technology, and robotics – but also could involve nanotechnology.

In any case, hefty doses of new technology may be necessary just to keep on a level course. I’m thinking, for example, of the diminishing effectiveness of antibiotics. So we have the evolution of “superbugs,” as well as the emergence of new epidemics through mutation or disease vectors jumping species lines. Ebola is a particularly gruesome example.

And while on technology, it is fair to observe that complex technologies just at or beyond the boundary of human control present deep challenges. Deep-sea oil drilling and the Gulf of Mexico oil spill, under British Petroleum, and the Fukushima nuclear disaster, still leaking radioactivity into the Pacific, are two examples.

Population or more generally demography is another Big Picture factor. Populations are aging in the United States, Europe, and Japan, but also in China. And global population continues to grow, possibly by another billion by 2030.

Climate change is another Big Picture factor.

The global climate is a complex, dynamic system. There is lots of noise in the discussion and uncertainties, such as whether there may be a cooling interval, as carbon dioxide and methane concentrations continue to rise globally. A number of studies commissioned by US and other intelligence agencies, though, highlight the potential for massive impacts from, say, basic changes in monsoon patterns in South Asia.

In terms of geopolitics, I suspect the shift in the economic center of gravity to somewhere along the Asian rim is another Big Picture development.

There are many relevant metrics. The proportion of global output produced by the United States, according to the World Economic Outlook (WEO) of the International Monetary Fund (IMF), will continue to diminish, as Chinese growth, even in the worst case, is projected to exceed levels of economic growth in the US and, certainly, in Europe.

Then, there is the issue of the US being the policeman of the world. At some point, the cost of maintaining a global span of military bases and force readiness for multiple theatres of action will weigh heavily on the US – as one could argue is already happening to some degree.

Challenges to the global dominance of the US dollar can be predicted, also, in the next fifteen years.

Sustainability

Whether any of the above “Big Picture” factors actually come into play by 2020 or 2030 is, of course, a speculation. But I think the basic technique of long term forecasting is to inventory possible influences like these. Then, you construct scenarios.

One thing appears certain. And that is there will be surprises.

In looking at forecasts for the next five to fifteen years, I also want to give thought to sustainability. Are there institutions and arrangements which could offer a backup to the various types of instabilities which could emerge?

And there is apparently an increasing chance of a rise in the general level of warfare, perhaps with linking of action in various theatres. I have to say, too, that I am poorly equipped to comment on these conflicts, although, as they ramp up, I attempt to learn more about the players and underlying dynamics.

I’ll be using this venue as a scratch-pad to record the projections of others and some thoughts I might have in response, vis-à-vis 2020 and 2030.

Analyzing Complex Seasonal Patterns

When time series data are available in frequencies higher than quarterly or monthly, many forecasting programs hit a wall in analyzing seasonal effects.

Researchers from Monash University in Australia published an interesting paper in the Journal of the American Statistical Association (JASA), along with an R program, to handle this situation – what can be called “complex seasonality.”

I’ve updated and modified one of their computations – using weekly, instead of daily, data on US conventional gasoline prices – and find the whole thing pretty intriguing.

[Chart: weekly US conventional gasoline prices with tbats(.) forecast]

If you look at the color codes in the legend below the chart, it’s a little easier to read and understand.

Here’s what I did.

I grabbed the conventional weekly US gasoline prices from FRED. These prices are for “regular” – the plain vanilla choice at the pump. I established a start date of the first week in 2000, after looking the earlier data over. Then, I used tbats(.) in the Hyndman R Forecast package, which readers familiar with this site know can be downloaded for use in the open source matrix programming language R.

Then, I established an end date for a time series I call newGP at the first week of 2012, forecasting ahead with the results of applying tbats(.) to the historic data from 2000:1 to 2012:1, where the second number refers to weeks, which run from 1 to 52. Note that some data scrubbing is needed to shoehorn the gas price data into 52 weeks on a consistent basis. I averaged “week 53” with the nearest acceptable week (either week 52 or week 1 of the next year), and then got rid of the week 53’s.
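That week-53 scrubbing step can be sketched in a few lines of Python. This is a simplified version that folds any 53rd observation into week 52 by averaging (the description above also allows averaging into week 1 of the next year); the price values are hypothetical:

```python
# Sketch of the "week 53" scrubbing step: average any 53rd weekly
# observation into week 52, so every year has exactly 52 values.
# prices_by_year maps year -> list of weekly prices (hypothetical numbers).
def squeeze_to_52_weeks(prices_by_year):
    cleaned = {}
    for year, weeks in prices_by_year.items():
        weeks = list(weeks)
        if len(weeks) == 53:
            # fold week 53 into week 52 by averaging the two
            weeks[51] = (weeks[51] + weeks[52]) / 2.0
            weeks = weeks[:52]
        cleaned[year] = weeks
    return cleaned

sample = {2004: [1.50] * 52 + [1.60], 2005: [1.75] * 52}
out = squeeze_to_52_weeks(sample)
print(len(out[2004]), out[2004][51])
```

The payoff of this housekeeping is a series with a constant seasonal frequency of 52, which is what the modeling routine expects.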

The forecast for 104 weeks is shown by the solid red line in the chart above.

This actually looks promising, as if it might encode some useful information for, say, US transportation agencies.

A draft of the JASA paper is available as a PDF download. It’s called Forecasting time series with complex seasonal patterns using exponential smoothing and in addition to daily US gas prices, analyzes daily electricity demand in Turkey and bank call center data.

I’m only going part of the way to analyzing the gas price data, since I have not taken on daily data yet. But the seasonal pattern identified by tbats(.) from the weekly data is interesting and is shown below.

[Chart: seasonal pattern in weekly gas prices identified by tbats(.)]

The weekly frequency may enable us to “get inside” a mid-year wobble in the pattern with some precision. Judging from the out-of-sample performance of the model, this “wobble” can in some cases be accentuated and be quite significant.

Trigonometric series fit to the higher frequency data extract the seasonal patterns in tbats(.), which also offers other advanced capabilities, such as estimating ARMA (autoregressive moving average) models for the residuals.

I’m not fully optimizing the estimation, but these results are sufficiently strong to encourage exploring the toggles and switches on the routine.

Another routine which works at this level of aggregation is stlf(.). This uses the STL decomposition described in some detail in Chapter 36, Patterns Discovery Based on Time-Series Decomposition, in a collection of essays on data mining.

Thoughts

Good forecasting software elicits a sort of addictive behavior, when initial applications of routines seem promising. How much better can the out-of-sample forecasts be made with optimization of the features of the routine? How well does the routine do when you look at several past periods? There is even the possibility of extracting further information from the residuals through bootstrapping or bagging at some point. I think there is no other way than exhaustive exploration.

The payoff to the forecaster is the amazement of his or her managers, when features of a forecast turn out to be spot-on, prescient, or what have you – and this does happen with good software. An alternative, for example, to the Hyndman R Forecast package is the program STAMP I also am exploring. STAMP has been around for many years with a version running – get this – on DOS, which appears to have had more features than the current Windows incarnation. In any case, I remember getting a “gee whiz” reaction from the executive of a regional bus district once, relating to ridership forecasts. So it’s fun to wring every possible pattern from the data.

Energy Forecasts – Parting Shots

There is obviously a big difference between macro and micro, when it comes to energy forecasting.

At the micro-level – for example, electric utility load forecasting – considerable precision often can be attained in the short run, or very short run, when seasonal, daily, and holiday usage patterns are taken into account.

At the macro level, on the other hand – for global energy supply, demand, and prices – big risks are associated with projections beyond a year or so. Many things can intervene, such as the supply disruptions which occurred in 2013 in Nigeria, Iraq, and Libya. And long range energy forecasts – forget it. Even well-funded studies with star researchers from the best universities and biggest companies show huge errors ten or twenty years out (See A Half Century of Long-Range Energy Forecasts: Errors Made, Lessons Learned, and Implications for Forecasting).

Peak Oil

This makes big picture concepts such as peak oil challenging to evaluate. Will there be a time in the future when global oil production levels peak and then decline, triggering a frenzied search for substitutes and exerting pressure on the whole structure of civilization in what some have called the petrochemical age?

Since the OPEC Oil Embargo of 1974, there have been researchers, thinkers, and writers who point to this as an eventuality. Commentators and researchers associated with the Post Carbon Institute carry on the tradition.

Oil prices have not always cooperated, as the following CPI-adjusted price of crude oil suggests.

[Chart: CPI-adjusted price of crude oil]

The basic axiom is simply that natural resource reserves and availability are always conditional on price. With high enough prices, more oil can be extracted from somewhere – from deeper wells, from offshore platforms that are expensive and dangerous to erect, from secondary recovery, and now, from nonconventional sources, such as shale oil and gas.

Note this axiom of resource economics does not really say that there will never be a time when total oil production begins to decline. It just implies that oil will never be totally exhausted, if we loosen the price constraint.

Net Energy Analysis

Net energy analysis provides a counterpoint to the peak oil conversation. In principle, we can calculate the net energy contributions of various energy sources today. No forecasting is really necessary. Just a deep understanding of industrial process and input-output relationships.

Along these lines, several researchers, and again David Hughes with the Post Carbon Institute, project that the Canadian tar sands have a significantly lower net energy contribution than, say, oil from conventional wells.

Net energy analysis resembles life cycle cost analysis, which has seen widespread application in environmental assessment. Still, neither technique is foolproof, or perhaps I should say that both techniques would require huge research investments, including on-site observation and modeling, to properly implement.

Energy Conservation

Higher energy prices since the 1970’s also have encouraged increasing energy efficiency. This is probably one of the main reasons why long range energy projections from, say, the 1980’s usually look like wild overestimates by 2000.

The potential is still there, as a 2009 McKinsey study documents –

The research shows that the US economy has the potential to reduce annual non-transportation energy consumption by roughly 23 percent by 2020, eliminating more than $1.2 trillion in waste—well beyond the $520 billion upfront investment (not including program costs) that would be required. The reduction in energy use would also result in the abatement of 1.1 gigatons of greenhouse-gas emissions annually—the equivalent of taking the entire US fleet of passenger vehicles and light trucks off the roads.

The McKinsey folks are pretty hard-nosed, tough-minded, not usually given to gross exaggerations.

A Sense In Which We May Already Have Reached Peak Oil

Check this YouTube out. Steven Kopits’ view of supply-constrained markets in oil is novel, but his observations about dollar investment relative to conventional oil output seem to hit the mark. The new oil production is from the US in large part, and comes from nonconventional sources, i.e. shale oil. This requires more effort, as witnessed by the poor financials of a lot of these players, who are speculating on expansion of export markets, but who would go bust at current domestic prices.

For Kopits slides go here. Check out these graphs from the recent BP report, too.

Forecasting the Price of Gold – 3

Ukraine developments and other counter-currents, such as Janet Yellen’s recent comments, highlight my final topic on gold price forecasting – multivariate gold price forecasting models.

On the one hand, there has been increasing uncertainty as a result of Ukrainian turmoil, counterbalanced today by the reaction to the seemingly hawkish comments by Chairperson Janet Yellen of the US Federal Reserve Bank.

[Chart: SPDR Gold Shares price movement]

Traditionally, gold is considered a hedge against uncertainty. Indulge your imagination and it’s not hard to conjure up scary scenarios in the Ukraine. On the other hand, some interpret Yellen as signaling an earlier move to lift the federal funds rate off zero, increasing interest rates, and, in the eyes of the market, making gold more expensive to hold.

Multivariate Forecasting Models of Gold Price – Some Considerations

It’s this zoo of factors and influences that you have to enter, if you want to try to forecast the price of gold in the short or longer term.

Variables to consider include inflation, exchange rates, gold lease rates, interest rates, stock market levels and volatility, and political uncertainty.

A lot of effort has been devoted to proving, or attempting to refute, the proposition that gold is a hedge against inflation.

The bottom line appears to be that gold prices rise with inflation – over a matter of decades, but in shorter time periods, intervening factors can drive the real price of gold substantially away from a constant relationship to the overall price level.

Real (and possibly nominal) interest rates are a significant influence on gold prices in shorter time periods, but this relationship is complex. My reading of the literature suggests a better understanding of the supply side of the picture is probably necessary to bring all this into focus.

The Goldman Sachs Global Economics Paper 183 – Forecasting Gold as a Commodity – focuses on the supply side with charts such as the following –

[Chart: gold mine production and real interest rates, from Goldman Sachs Global Economics Paper 183]

The story here is that gold mine production responds to real interest rates, and thus the semi-periodic fluctuations in real interest rates are linked with a cycle of growth in gold production.

The Goldman Sachs Paper 183 suggests that higher real interest rates speed extraction, since the opportunity cost of leaving ore deposits in the ground increases. This is indeed the flip side of the negative impact of real interest rates on investment.

And, as noted in an earlier post, the Goldman Sachs forecast in 2010 proved prescient. Real interest rates have remained low since that time, and gold prices drifted down from higher levels at the end of the last decade.

Elasticities

Elasticities of response in a regression relationship show how percentage changes in the dependent variable – gold prices in this case – respond to percentage changes in, for example, the price level.

For gold to be an effective hedge against inflation, the elasticity of gold price with respect to changes in the price level should be approximately equal to 1.
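A quick way to see what that elasticity means: regress the log of the gold price on the log of the price level and inspect the slope. The sketch below uses synthetic data constructed with unit elasticity, so the estimate comes out near 1 by design; the numbers are illustrative, not real gold or CPI data:

```python
import numpy as np

# Synthetic price-level and gold series built with elasticity exactly 1,
# plus multiplicative noise; the log-log slope recovers it.
rng = np.random.default_rng(42)
cpi = np.linspace(100.0, 250.0, 60)                   # hypothetical price level
gold = 3.5 * cpi * np.exp(rng.normal(0.0, 0.03, 60))  # unit elasticity by construction

# Elasticity = slope of log(gold) regressed on log(CPI)
b, a = np.polyfit(np.log(cpi), np.log(gold), 1)
print(round(b, 2))   # should land near 1
```

In real data the same regression is complicated by the nonstationarity issues discussed next, which is why the careful studies work in a cointegration framework rather than a simple log-log regression.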

This appears to be a credible elasticity for the United States, based on two studies conducted with different time spans of gold price data.

These studies are Gold as an Inflation Hedge? and the more recent Does Gold Act As An Inflation Hedge in the US and Japan. Also, a Gold Council report, Short-run and long-run determinants of the price of gold, develops a competent analysis.

These studies explore the cointegration of gold prices and inflation. Cointegration of unit root time series is an alternative to first differencing to reduce such time series to stationarity.

Thus, it’s not hard to show strong evidence that standard gold price series are one type or another of a random walk. Accordingly, straight-forward regression analysis of such series can easily lead to spurious correlation.

You might, for example, regress the price of gold onto some metric of the cumulative activity of an amoeba (characterized by Brownian motion) and come up with t-statistics that are, apparently, statistically significant. But that would, of course, be nonsense, and the relationship could evaporate with subsequent movements of either series.
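That amoeba point is easy to demonstrate by simulation. Two independent random walks frequently show sizable correlation in levels, while their first differences, which are stationary, show essentially none:

```python
import numpy as np

# Two independent random walks often exhibit a sizable correlation in
# levels even though there is no relationship between them; differencing
# makes the illusion vanish. Seeded toy simulation, not real gold data.
rng = np.random.default_rng(7)
walk1 = np.cumsum(rng.standard_normal(500))
walk2 = np.cumsum(rng.standard_normal(500))

r_levels = np.corrcoef(walk1, walk2)[0, 1]                   # can look "significant"
r_diffs = np.corrcoef(np.diff(walk1), np.diff(walk2))[0, 1]  # near zero

print(round(r_levels, 2), round(r_diffs, 2))
```

This is exactly why the better gold-price studies test for unit roots first and then either difference the data or model cointegrated relationships in levels.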

So, the better research always gives consideration to the question of whether the variables in the models are, first of all, nonstationary OR whether there are cointegrated relationships.

While I am on the topic literature, I have to recommend looking at Theories of Gold Price Movements: Common Wisdom or Myths? This appears in the Wesleyan University Undergraduate Economic Review and makes for lively reading.

Thus, instead of viewing gold as a special asset, the authors suggest it is more reasonable to view gold as another currency, whose value is a reflection of the value of the U.S. dollar.

The authors consider and reject a variety of hypotheses – such as the safe haven or consumer fear motivation to hold gold. They find a very significant relationship between the price movement of gold, real interest rates, and the exchange rate, suggesting a close relationship between gold and the value of the U.S. dollar. The multiple linear regressions verify these findings.

The Bottom Line

Over relatively long time periods – one to several decades – the price of gold moves more or less in concert with measures of the price level. In the shorter term, forecasting faces serious challenges, although there is a literature on the multivariate prediction of gold prices.

One prediction, however, seems reasonable on the basis of this review. Real interest rates should rise as the US Federal Reserve backs off from quantitative easing and other central banks around the world follow suit. Thus, increases in real interest rates seem likely at some point in the next few years. This seems to indicate that gold mining will strive to increase output, and perhaps that gold mining stocks might be a play.

Forecasting – Climate Change and Infrastructure

You really have to become something like a social philosopher to enter the climate change and infrastructure discussion. I mean this several ways.

Of course, there is first the continuing issue of whether or not climate change is real, or is currently being reversed by a “pause” due to the oceans or changes in trade winds absorbing some of the increase in temperatures. So for purposes of discussion, I’m going to assume that climate change is real, and with a new El Niño this year global temperatures and a whole panoply of related weather phenomena – like major hurricanes – will come back in spades.

But then can we do anything about it? Is it possible for a developed or “mature” society to plan for an uncertain, but increasingly likely future? With this question come visions of the amazingly dysfunctional US Congress, mordantly satirized in the US TV show House of Cards.

The National Society of Professional Engineers points out that major infrastructure bills relating to funding the US highway system and water systems are coming up in Congress in 2014.

Desperately needed long-term infrastructure projects were deferred to address other national priorities or simply fell victim to the ongoing budget crisis. In fact, federal lawmakers extended the surface transportation authorization an unprecedented 10 times between 2005 and 2012, when Congress finally authorized the two-year Moving Ahead for Progress in the 21st Century Act (MAP-21). Now, with MAP-21 set to expire before the end of 2014, two of the most significant pieces of infrastructure legislation are taking center stage in Congress. The Water Resources Reform and Development Act (WRRDA) and the reauthorization of the surface transportation bill present a rare opportunity for Congress to set long-term priorities and provide needed investment in our nation’s infrastructure. Collectively, these two bills cover much, though not all, of US infrastructure. The question then becomes, can Congress overcome continuing partisan gridlock and a decades-long pattern of short-term fixes to make a meaningful commitment to the long-term needs of US infrastructure?

Yes, for sure, that is the question.

Hurricane Sandy – really, by the time it hit New Jersey and New York, a fierce tropical storm – wreaked havoc on Far Rockaway and flooded the New York City subway system in 2012. This gave rise to talk of sea walls after the event, and I assume something like that is on drawing boards somewhere on the East Coast. But the cost of “ten story tall pilings” on which giant gates would be hinged is on the order of billions of US dollars.

California

I notice interesting writing coming out of California, pertaining to the smart grid and the need to extend this concept from electricity to water.

The California Energy Commission (CEC) publishes an Integrated Energy Policy Report (IEPR – pronounced eye-per) every two years, and the 2013 IEPR was just approved… Let’s look at two climate change impacts – temperature and precipitation.  From a temperature perspective, the IEPR anticipates that as the thermometer rises, so does the demand for electricity to run AC.  San Francisco Peninsula communities that never had a need for AC will install a couple million units to deal with summer temperatures formerly confined to the Central Valley.  PG&E and municipal utilities in Northern California will notice impacts in seasonal demand for electricity in both the duration of heat waves and peak apexes during the hottest times of day.  In the southern part of the state, the demand will also grow as AC units work harder to offset hotter days.

At the same time, increased temperatures decrease power plant efficiencies, whether the plant generates electricity from natural gas, solar thermal, nuclear, or geothermal.  Their cooling processes are also negatively impacted by heat waves.  Increased temperatures also impact transmission lines – reducing their efficiency and creating line sags that can trigger service disruptions.

Then there’s precipitation.  Governor Jerry Brown just announced a drought emergency for the state.  A significant portion of California’s water storage system relies on the Sierra Mountains snowpack, which is frighteningly low this winter.  This snowpack supplies most of the water sourced within the state, and hydropower derived from it supplies about 15% of the state’s homegrown electricity.  A hotter climate means snowfall becomes rainfall, and it is no longer freely stored as snow that obligingly melts as temperatures rise.  It may not be as reliably scheduled for generation of hydro power as snowfalls shift to rainfalls.  We may also receive less precipitation as a result of climate change – that’s a big unknown right now.

One thing is certain.  A hotter climate will require more water for agriculture – a $45 billion economy in California – to sustain crops.  And whether it is water for industrial, commercial, agricultural, or residential uses, what doesn’t fall from the skies will require electricity to pump it, transport it, desalinate it, or treat it.

Boom – A Journal of California packs more punch in discussing the “worst case” –

“The choice before us is not to stop climate change,” says Jonathan Parfrey, executive director of Climate Resolve in Los Angeles. “That ship has sailed. There’s no going back. There will be impacts. The choice that’s before humanity is how bad are we going to do it to ourselves?”

So what will it be? Do you want the good news or the bad news first?

The bad news. OK.

If we choose to do nothing, the nightmare scenario plays out something like this: amid prolonged drought conditions, wildfires continuously burn across a dust-dry landscape, while potable water has become such a precious commodity that watering plants is a luxury only residents of elite, gated communities can afford. Decimated by fires, the power grid infrastructure that once distributed electricity—towers and wires—now looms as ghostly relics stripped of function. Along the coast, sea level rise has decimated beachfront properties while flooding from frequent superstorms has transformed underground systems, such as Bay Area Rapid Transit (BART), into an unintended, unmanaged sewer system.

This article goes on to the “good news” which projects a wave of innovations and green technology by 2050 to 2075 in California.

Sea Level Rise

No one knows, at this point, the extent of the rise in sea level in coming years. Interestingly, I have never seen a climate change denier also, in the same breath, deny that sea levels have been rising historically.

There are interesting resources on sea level rise, although projections of how much rise over what period are uncertain, because no one knows whether a big ice mass, such as part of the Antarctic ice sheet, will melt on an accelerated schedule sometime soon.

An excellent scientific summary of the sea level situation historically can be found in Understanding global sea levels: past, present and future.

Here is an overall graph of Global Mean Sea Level –

GMSL

This inexorable trend has given rise to map resources which suggest coastal areas which would be underwater or adversely affected in the future by sea surges.

The New York Times’ interactive What Could Disappear suggests Boston might look like this with a five-foot rise in sea level by 2100 –

Boston

The problem, of course, is that globally populations are concentrated in coastal areas.

Also, storm surges are nonlinearly related to sea level. Thus, a one (1) foot rise in sea level could be linked with significantly more than a one foot increase in the height of storm surges.

Longer Term Forecasts

Some years back, an interesting controversy arose over present value discounting in calculating impacts of climate change.

So, currently, the medium term forecasts of climate change impacts – sea level rises of maybe 1 to 2 feet, average temperature increases of one or two degrees, and so forth – seem roughly manageable. The problem always seems to come in the longer term – after 2100 for example in the recent National Academy of Sciences study funded, among others, by the US intelligence community.

The problem with calculating the impacts and significance of these longer term impacts today is that the present value accounting framework just makes things that far into the future almost insignificant.

Currently, for example, global output is on the order of 80 trillion dollars. Suppose we accept a discount rate of 4 percent. Then the discount factor 150 years from today, in 2164, works out to about 0.003. So according to this logic, the loss of 80 trillion dollars’ worth of production in 2164 has a present value of about 250 billion dollars. Thus, losing an amount of output in 150 years equal to the total productive activity of the planet today is worth a mere 250 billion dollars in present value terms, or about the current GDP of Ireland.
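The discounting arithmetic can be checked in a few lines of Python. The $80 trillion output figure, the 4 percent rate, and the 150-year horizon come from the discussion above; the 7 billion population figure for the per-person calculation is my added assumption.

```python
# Rough check of the present value arithmetic in the text.
# Inputs from the text: $80 trillion global output, 4% discount rate,
# 150-year horizon. Population of 7 billion is an added assumption.
horizon = 150           # years into the future
rate = 0.04             # annual discount rate
global_output = 80e12   # dollars
population = 7e9        # persons

discount_factor = 1 / (1 + rate) ** horizon
present_value = global_output * discount_factor
per_person = present_value / population

print(f"discount factor: {discount_factor:.4f}")            # 0.0028
print(f"present value:   ${present_value / 1e9:.0f} billion")  # $223 billion
print(f"per person:      ${per_person:.0f}")                # $32
```

The exact present value comes out closer to $220–225 billion than to $250 billion, but the order of magnitude – and the absurdity – are unchanged.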

Now, I may have rounded and glossed some of the arithmetic, but the point stands no matter how you make the computation.

This is totally absurd, because as a guide to action it implies we should not be willing to spend, on a planetary basis, more than a one-time cost of about $35 per person today to avert the loss of $80 trillion of output in a century and a half – this when global output per capita is on the order of $11,000 per person.

So we need a better accounting framework.

Of course, there are counterarguments. For example, in 150 years, perhaps science will have discovered how to boost the carbon dioxide processing capabilities of plants, so we can have more pollution. And looking back 150 years to the era of the horse and buggy, we can see that there has been tremendous technological change.

But this is a little like waiting for the amazing “secret weapons” to be unveiled in a war you are losing.

Header photo courtesy of NASA

Possibilities for Abrupt Climate Change

The National Research Council (NRC) published ABRUPT IMPACTS OF CLIMATE CHANGE recently, downloadable from the National Academies Press website.

It’s the third NRC report to focus on abrupt climate change, the first being published in 2002. NRC members are drawn from the councils of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine.

The climate change issue is a profound problem in causal discovery and forecasting, to say the very least.

Before I highlight graphic and pictorial resources of the recent NRC report, let me note that Menzie Chinn at Econbrowser posted recently on Economic Implications of Anthropogenic Climate Change and Extreme Weather. Chinn focuses on the scientific consensus, presenting graphics illustrating the more or less relentless upward march of global average temperatures and estimates (by James Stock no less) of the man-made (anthropogenic) component.

The Econbrowser Comments section is usually interesting and revealing, and this time is no exception. Comments range from “climate change is a left-wing conspiracy” and arguments that “warmer would be better” to the more defensible thought that coming to grips with global climate change would probably mean restructuring our economic setup, its incentives, and so forth.

But I do think the main aspects of the climate change problem – is it real, what are its impacts, what can be done – are amenable to causal analysis at fairly deep levels.

To dispel ideological nonsense, current trends in energy use – growing globally at about 2 percent per annum over a long period – lead to the Earth becoming a small star within two thousand years, or less – generating the amount of energy radiated by the Sun. Of course, changes in energy use trends can be expected before then, when for example the average ambient temperature reaches the boiling point of water, and so forth. These types of calculations also can be made realistically about the proliferation of the automobile culture globally with respect to air pollution and, again, contributions to average temperature. Or one might simply consider the increase in the use of materials and energy for a global population of ten billion, up from today’s number of about 7 billion.
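The “small star” remark can be verified with a back-of-envelope calculation. The figures below are my assumptions, not from the text: world power consumption of roughly 18 terawatts today, and a solar luminosity of about 3.8 × 10^26 watts.

```python
import math

# How long until energy use growing at 2% per year equals the Sun's output?
# Assumed figures (not from the text): ~18 TW world power use today,
# ~3.8e26 W solar luminosity.
current_power = 18e12   # watts
solar_output = 3.8e26   # watts
growth = 0.02           # annual growth rate in energy use

# Solve current_power * (1 + growth)**years = solar_output for years.
years = math.log(solar_output / current_power) / math.log(1 + growth)
print(round(years))     # on the order of 1,500 years
```

At 2 percent compounding, the answer comes out around 1,550 years – comfortably inside the “two thousand years, or less” bound.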

Highlights of the Recent NRC Report

It’s worth quoting the opening paragraph of the report summary –

Levels of carbon dioxide and other greenhouse gases in Earth’s atmosphere are exceeding levels recorded in the past millions of years, and thus climate is being forced beyond the range of the recent geological era. Lacking concerted action by the world’s nations, it is clear that the future climate will be warmer, sea levels will rise, global rainfall patterns will change, and ecosystems will be altered.

So because of growing CO2 (and other greenhouse gases), climate change is underway.

The question considered in ABRUPT IMPACTS OF CLIMATE CHANGE (AICH), however, is whether various thresholds will be crossed, whereby rapid, relatively discontinuous climate change occurs. Such abrupt changes – with radical shifts occurring over decades, rather than centuries – have happened before. AICH thus cites,

..the end of the Younger Dryas, a period of cold climatic conditions and drought in the north that occurred about 12,000 years ago. Following a millennium-long cold period, the Younger Dryas abruptly terminated in a few decades or less and is associated with the extinction of 72 percent of the large-bodied mammals in North America.

The main abrupt climate change noted in AICH is the rapid decline of Arctic sea ice. AICH presents a chart which is one of the clearest examples of a trend you can pull from environmental science, I would think.

ArticSeaIce

AICH also puts species extinction front and center as a near-term and certain discontinuous effect of current trends.

Apart from the melting of Arctic sea ice and species extinction, AICH lists destabilization of the Antarctic ice sheet as a nearer term possibility with dramatic consequences. Because a lot of the Antarctic ice sits below sea level, apparently, it is more at risk than, say, the Greenland ice sheet. Melting of either one (or both) of these ice sheets would raise sea levels by tens of meters – an estimated 60 meters with melting of both.

Two other possibilities mentioned in previous NRC reports on abrupt climate change are discussed and evaluated as low probability developments until after 2100. These are stopping of the ocean currents that circulate water in the Atlantic, warming northern Europe, and release of methane from permafrost or deep ocean deposits.

The AMOC is the ocean circulation pattern that involves the northward flow of warm near-surface waters into the northern North Atlantic and Nordic Seas, and the southward flow at depth of the cold dense waters formed in those high latitude regions. This circulation pattern plays a critical role in the global transport of oceanic heat, salt, and carbon. Paleoclimate evidence of temperature and other changes recorded in North Atlantic Ocean sediments, Greenland ice cores and other archives suggest that the AMOC abruptly shut down and restarted in the past—possibly triggered by large pulses of glacial meltwater or gradual meltwater supplies crossing a threshold—raising questions about the potential for abrupt change in the future.

Despite these concerns, recent climate and Earth system model simulations indicate that the AMOC is currently stable in the face of likely perturbations, and that an abrupt change will not occur in this century. This is a robust result across many different models, and one that eases some of the concerns about future climate change.

With respect to the methane deposits in Siberia and elsewhere,

Large amounts of carbon are stored at high latitudes in potentially labile reservoirs such as permafrost soils and methane-containing ices called methane hydrate or clathrate, especially offshore in ocean marginal sediments. Owing to their sheer size, these carbon stocks have the potential to massively affect Earth’s climate should they somehow be released to the atmosphere. An abrupt release of methane is particularly worrisome because methane is many times more potent than carbon dioxide as a greenhouse gas over short time scales. Furthermore, methane is oxidized to carbon dioxide in the atmosphere, representing another carbon dioxide pathway from the biosphere to the atmosphere.

According to current scientific understanding, Arctic carbon stores are poised to play a significant amplifying role in the century-scale buildup of carbon dioxide and methane in the atmosphere, but are unlikely to do so abruptly, i.e., on a timescale of one or a few decades. Although comforting, this conclusion is based on immature science and sparse monitoring capabilities. Basic research is required to assess the long-term stability of currently frozen Arctic and sub-Arctic soil stocks, and of the possibility of increasing the release of methane gas bubbles from currently frozen marine and terrestrial sediments, as temperatures rise.

So some bad news and, I suppose, good news – more time to address what would certainly be completely catastrophic to the global economy and world population.

AICH has some neat graphics and pictorial exhibits.

For example, Miami, Florida will be largely underwater within a few decades, according to many standard forecasts of increases in sea level (click to enlarge).

Florida

But perhaps most chilling of all (actually not a good metaphor here, but you know what I mean) is a graphic I have not seen before, which dovetails with my initial comments and the observations of physicists.

This chart toward the end of the AICH report projects increase in global temperature beyond any past historic level (or prehistoric, for that matter) by the end of the century.

TempRise

So, for sure, there will be species extinction in the near term, hopefully not including the human species just yet.

Economic Impacts

In closing, I do think the primary obstacle to a sober evaluation of climate change involves social and economic implications. The climate change deniers may be right – acknowledging and adequately planning for responses to climate change would involve significant changes in social control and probably economic organization.

Of course, the AICH adopts a more moderate perspective – let’s be sure and set up monitoring of all this, so we can be prepared.

Hopefully, that will happen to some degree.

But adopting a more pro-active stance seems unlikely, at least in the near term. There is a wholesale rush to bring one to several billion persons who are basically living in huts with dirt floors into “the modern world.” Their children are traveling to cities, where they will probably earn much higher incomes and send money back home. The urge to have a family is almost universal, almost a concomitant of the healthy love of a man and a woman.

Tradeoffs between economic growth and environmental quality are a tough sell when there are millions of new consumers and workers to be incorporated into the global supply chain. The developed nations – where energy and pollution output ratios are much better – are not persuasive when they suggest a developing giant like India or China should toe the line, limit energy consumption, and throttle back economic growth in order to have a cooler future for the planet. You already got yours, Jack, and now you want to cut back? What about mine?

As standards of living degrade in the developed world with slower growth there, and as the wealthy grab more power in the situation, garnering even more relative wealth, the political dialogue gets stuck when it comes to making changes for the good of all.

I could continue, and probably will sometime, but it seems to me that from a longer term forecasting perspective darker scenarios could well be considered. I’m sure we will see quite a few of these. One of the primary ones would be a kind of devolution of the global economy – the sort of thing one might expect if air travel became less feasible because of, say, a major uptick in volcanism, or if huge droughts took hold in parts of Asia.

Again and again, I come back to the personal thought of local self-reliance. Global supply chains and various centralizations, mergers, and so forth have tended toward de-skilling populations, pushing them into meaningless service sector jobs (fast food), and losing old knowledge about, say, canning fruits and vegetables, or simply growing your own food. This sort of thing has always been a sort of quirky alternative to life in the fast lane. But inasmuch as life in the fast lane involves too much energy use for too many people to pursue, I think decentralized alternatives for lifestyle deserve a serious second look.

Polar bear on ice flow at top from http://metro.co.uk/2010/03/03/polar-bears-cling-to-iceberg-as-climate-change-ruins-their-day-141656/