This year, Prof. William Nordhaus (Yale) published The Climate Casino: Risk, Uncertainty and Economics for a Warming World. The book continues Prof. Nordhaus' forty-year career investigating the economics of energy resources. The book was also recently reviewed by Paul Krugman (Princeton, here) and is interesting on many levels: the use of sophisticated macroeconomic models, the attempt to forecast far out into the future, the treatment of probability and uncertainty (even though the underlying models are purely deterministic), Prof. Nordhaus' willingness to make very specific policy recommendations (raise the price of CO2 and accelerate technological change in the energy sector), and, finally, his long-running argument about the Limits to Growth.
The initial argument about the Limits to Growth grew out of Prof. Nordhaus' work on nonrenewable resources, particularly fossil fuels. The argument is captured in the causal diagram above. A group of engineers at MIT argued that the depletion of nonrenewable resources would lead to economic collapse. Prof. Nordhaus countered that technological change would create substitutes that would prevent any such collapse. Paul Krugman points out that Nordhaus was wrong about the specifics in his long-term forecasts (real oil prices are about twice as high as the models predicted, while real coal and natural gas prices are much lower) but was right to have attempted the forecast (presumably, we should be learning from what went wrong).
In any event, the argument about nonrenewable resources has morphed into an argument about the impacts of CO2 emissions on climate change, that is, the impact of extracting fossil fuels and pumping the by-products into the atmosphere.
The new argument is summarized in the causal diagram at the left, which is taken in part from page 10 of the Climate Casino. The resource extraction issue is still lurking in the link between economic growth and CO2 emissions, but technological change is now supposed to help us reduce emissions by decreasing the carbon intensity of the economy.
Nordhaus accepts the basic causal chain Economic Growth -> CO2 Emissions -> Climate Change -> Ecological Impacts. He is less certain about what will happen as a result of observable ecological impacts (sea level rise, super typhoons, water scarcity, habitat loss, coastal flooding, ocean acidification, reduction of agricultural yields, and other impacts). His major argument is that policy change is needed, but he is also concerned with avoiding negative effects on economic growth. The policy measures he focuses on are cap-and-trade systems, carbon taxes, and environmental regulation. Nordhaus is looking for a policy that will raise the price of carbon, can be universally applied, and will not inhibit economic growth.
One causal link that is not presented in the summary diagram on page 10 of the Climate Casino is the link between Ecological Impacts and Economic Growth. If all the negative ecological impacts listed above actually happen, negative economic impacts (Limits to Growth) would seem to be highly likely. Possibly, this is just a matter of emphasis. Policy change is already happening and Prof. Nordhaus wants to shape the debate (stack the deck?). Aside from the political agenda, there are deeper issues that go back to the reasons Prof. Nordhaus' original long-range forecasts were off the mark.
The Climate Casino uses an embellished version of the same macroeconomic growth model used to make the original predictions (here) about nonrenewable resource allocation. If the initial forecasts were not very accurate, has anything fundamental changed in the new version to produce better projections? Publication of the Climate Casino gives us the opportunity to look back at Prof. Nordhaus' model and his competitors' models (including the Limits to Growth models). In future posts, I'll review chapters of the Climate Casino and look more closely at the underlying models and how they were constructed. The "deep dives" into model specifics are not particularly reassuring.
...Nobody has yet developed the mathematical equations and computer models needed to do really good economic predictions--Allan Marks, 2012.
State Space Models
All state space models are written and estimated in the R programming language. The models are available here, with instructions and R procedures for manipulating the models here.
Friday, June 21, 2013
Can Biodiversity Coexist with Food Production?
Video: "Balancing Costa Rica's Farming With Preservation with Nature" (PBS NewsHour).
To my knowledge, there are no macroeconomic models that include measures of biodiversity. From the video above, it would seem to be a major omission. It's also interesting that Costa Rica is one of the first countries to recognize that biodiversity can coexist with food production. The question for this blog is how to include such measures (how to value Nature) in macro models, a topic I'll take up in future posts.
Friday, June 14, 2013
Causal Analysis of the Neoclassical Economic Growth Model
The neoclassical economic growth model, in one form or another, encapsulates the conventional wisdom on long-run economic growth. The model captures many of the issues thought to be important: productivity, capital formation, population growth, and technology. In this post, I will present the equations for the Solow-Swan model, convert the model into a causal path diagram and explore the model's assumptions, predictions, insights and shortcomings.
The heart of the Solow-Swan model is the Cobb-Douglas production function, which models total production, Y, as a function of Total Factor Productivity Growth (TFPG) or technology, A, capital, K, and labor, L. The exponent alpha is simultaneously the output elasticity of capital and capital's share of output (with 1 - alpha being labor's share). Capital investment, I = sY, comes from savings and, net of depreciation at rate d, determines the growth of the capital stock. The growth of the labor force is a function of population growth, N (the difference between births and deaths, b - d), and the percent of the population employed, e. Technological change (TFPG), A, is exogenously given and grows at the rate g.
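For reference, here is a minimal restatement of the equations just described, in discrete time and with delta standing in for the depreciation rate so it does not collide with the death rate d (the notation, not the model, is mine):

```latex
Y(t)   = A(t)\,K(t)^{\alpha}\,L(t)^{1-\alpha}        % Cobb-Douglas production
K(t+1) = (1-\delta)\,K(t) + s\,Y(t)                  % capital accumulation from savings sY
L(t)   = e\,N(t), \quad N(t+1) = (1 + b - d)\,N(t)   % employed share e of a population growing at b - d
A(t+1) = (1 + g)\,A(t)                               % exogenous technological change at rate g
```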
Dropping the functional forms and mapping the causal connections among variables yields the directed graph above, where total output, Y, is replaced by gross domestic product, Q. The causal model makes clear that as long as population and technological change are growing exponentially, gross domestic product will also grow exponentially, forever. The model also implies that as long as technological change exceeds population growth, output per capita will increase, an idea that appeals to economists because it answers the Malthusian model, in which population growth outstrips productive growth.
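To make the unending-growth point concrete, here is a small, self-contained R simulation of the equations above. The parameter values are purely illustrative and are not taken from any calibrated model:

```r
# Illustrative Solow-Swan simulation: output Q grows without bound
# as long as technology A and labor L keep growing exponentially.
alpha <- 0.3    # capital share (illustrative)
s     <- 0.2    # savings rate
delta <- 0.05   # depreciation rate
g     <- 0.013  # growth rate of technology (TFPG)
n     <- 0.01   # growth rate of the labor force

periods <- 100
A <- (1 + g)^(0:(periods - 1))
L <- (1 + n)^(0:(periods - 1))
K <- numeric(periods); K[1] <- 1
Q <- numeric(periods)
for (t in 1:periods) {
  Q[t] <- A[t] * K[t]^alpha * L[t]^(1 - alpha)
  if (t < periods) K[t + 1] <- (1 - delta) * K[t] + s * Q[t]
}
plot(Q, type = "l", log = "y", xlab = "Period", ylab = "Output Q (log scale)")
# An (almost) straight line on the log scale is exponential growth.
```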
The assumption of uncontrolled, unending exponential growth can be modified somewhat by assuming population growth is exogenous and specifying a regression model (Wonnacott and Wonnacott, 1970: 93-95):
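The equation itself does not survive in this text. A log-linear form consistent with the surrounding discussion would look something like the following; the exact specification in Wonnacott and Wonnacott may differ:

```latex
\log Q_t = a_0 + g\,t + \alpha \log K_t + \beta \log L_t + \varepsilon_t
```

Here the intercept and trend, a_0 + g t, play the role of log A(t) (the "regression constant" referred to below), so exponential growth of technology shows up as a linear time trend in logs.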
However, technological change (TFPG, the regression constant) is still assumed to grow exponentially. For reasonable long-run projections, it can be assumed that, at some point in the future, technological progress will slow and eventually stop. For example, in the DICE model:
"Assumptions about future growth trends are as followed: The rate of growth of population was assumed to decrease slowly, stabilizing at 10.5 billion people in the 22nd century. The rate of growth of total factor productivity [the growth rate of A(t) in Eq. 3] was calculated to be 1.3% per annum in the period from 1960 to 1989. This rate was assumed to decline slowly over the coming decades. We calibrate the model by fitting the solution for the first three decades to the actual data for 1965, 1975, and 1985 and then optimizing for capital accumulation and GHG emissions in the future..." page 1317 in An Optimal Transition Path for Controlling Greenhouse Gases" William D. Nordhaus (1992).
There are many problems in applying the neoclassical growth model. For estimation, capital stock measures are unavailable, or of questionable validity, in many countries. Although the number of inputs (land, intermediate goods, etc.) is typically expanded when the model is used for growth accounting, as a transformation function the model does not make sense (neither labor nor capital is physically transformed into output through production). For cross-country comparisons, the assumption that one production function covers all countries (Somalia would have the same production function as Canada) seems unrealistic. Since the model is nonlinear, it cannot be aggregated from the individual level to the nation state or from the nation state to the world system. Finally, the assumption of decreasing technological change in the future is made only because exponential growth forever is unreasonable, not because there is any evidence for it.
A lot more can be said about the neoclassical production function, particularly that causality might run in the opposite direction from that assumed in the Cobb-Douglas production function. The other points are best made when considering other macro models, which I will do in future posts.
Saturday, January 19, 2013
Do We Need Better Econometric Models?
I have made a number of arguments in the last post (here) suggesting that we need better econometric models. One of the arguments was that econometric forecasting models did a poor job of predicting the Subprime Mortgage Crisis (the Financial Crisis of 2007-2008).
Yesterday the Federal Reserve Open Market Committee (FOMC) released transcripts of closed-door policy meetings in 2007 (here), at the start of the Subprime Mortgage Crisis. In the quote below from the Washington Post (here) we see that the models used by the FED staff economists were not predicting the upcoming crisis. The reasons why these models failed is a major concern of this blog and should be a major topic of discussion for the economics profession.
In December 2007, the month that the recession is now known to have begun, Fed officials were working from economic projections that would prove wildly inaccurate. They forecast sluggish but sustained growth in 2008 followed by a bounceback in 2009. Staff economist Dave Stockton acknowledged that his was a more optimistic view:
“Our forecast could admittedly be read as still painting a pretty benign picture: Despite all the financial turmoil, the economy avoids recession and, even with steeply higher prices for food and energy and a lower exchange value of the dollar, we achieve some modest edging-off of inflation.”
Dave Stockton, who was the Chief Economist and Director of Research and Statistics at the Federal Reserve Board from 2000 to 2011, was basing his forecasts on those being produced by the FED econometric models. The burning question is why those models performed so poorly during the financial crisis and whether anything has been done about them since.
Typically, econometric models predict from one period to the next, whether that is months, quarters, or years. The models are basically a collection of structural equations, some of which are dynamic, that is, have a time component. For example, the simple model Q(t) = A Q(t-1) + B X(t-1) + E is similar in form to the dynamic equations, where Q is a vector of endogenous variables, t represents a time point, A is a coefficient matrix for the endogenous variables, B is a coefficient matrix for the exogenous variables (X), and E represents error.
The problem here is that when the model is simulated forward to time t, the predictions for Q contain E. If E is non-random (systematic positive shocks from the liar loans in the financial sector, for example), the simulated Q keeps accumulating those errors and simply tracks the bubble. The model's ability to predict Q(t) is compromised by the accumulated errors from the developing Subprime Mortgage Bubble, and everything looks pretty rosy, as is evident from Mr. Stockton's comments.
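A toy R sketch of this argument, with made-up scalar coefficients rather than anything resembling the FED's actual model:

```r
# Toy illustration of iterating Q(t) = A*Q(t-1) + B*X(t-1) + E period by period.
# With mean-zero shocks the simulation hovers around its steady state; with
# systematically positive shocks (a developing bubble) it drifts upward and
# keeps looking "rosy".
set.seed(42)
A <- 0.9; B <- 0.5                       # illustrative scalar coefficients
periods <- 40
X <- rep(1, periods)                     # a constant exogenous input
simulate_Q <- function(shocks) {
  Q <- numeric(periods); Q[1] <- 5       # start at the no-shock steady state
  for (t in 2:periods) Q[t] <- A * Q[t - 1] + B * X[t - 1] + shocks[t]
  Q
}
Q_random     <- simulate_Q(rnorm(periods, mean = 0,   sd = 0.2))
Q_systematic <- simulate_Q(rnorm(periods, mean = 0.3, sd = 0.2))
matplot(cbind(Q_random, Q_systematic), type = "l", lty = 1,
        col = c("black", "red"), xlab = "Period", ylab = "Q")
legend("topleft", legend = c("mean-zero shocks", "systematic positive shocks"),
       col = c("black", "red"), lty = 1)
```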
The way in which econometric models are used, in this case simulating from one period to the next, is one of their major problems. The other problem is the theory (or lack of it) underlying the models. Both of these arguments will take many more posts to develop.