“The current crisis and the culpability of macroeconomic theory”
RWER contributor Paul Ormerod has an interesting paper in the current issue of Twenty-First Century Society. It can be downloaded for free at http://www.paulormerod.com/pdf/accsoct09%20br.pdf. Here is its abstract:
Abstract: The ideas at the heart of modern macroeconomics provided the intellectual justification of the economic policies of the past 10 to 15 years. It is these ideas which the financial crisis falsified. The dominant paradigm in macroeconomic theory over the past 30 years has been that of rational agents making optimal decisions under the assumption that they form their expectations about the future rationally—the rational agent using rational expectations, or RARE for short. The focus of the paper is on the way in which RARE deals with risk and uncertainty. It is this which is at the root of the problems, both for the discipline of economics and, much more importantly, for the economy itself and the financial crisis. Modern RARE macroeconomics bears a heavy burden of responsibility for the financial crisis. The discipline provided the intellectual underpinning for a world in which situations involving risk led to it being systematically underestimated, and in which situations of genuine uncertainty were not recognised for what they were.
What follows is an expansion on the theme of our knowledge of the future, based on Ormerod's erudite and illuminating blog on the bankruptcy of modern macroeconomics.
Anyone who makes a long-term loan, undertakes a project or starts a new business is venturing into the future: a world in which one moves, successively and faster than one expects, from a state of high probability into a murky mix of good estimates, fuzzy estimates, bad information, no information and some off-the-wall events.
Thus it is convenient to classify our knowledge about the future into four categories, with fuzzy boundaries, needless to say —
[1] High probability. It is reasonable to assume that several reliable suppliers with long and successful business histories will still be around five or even ten years from now.
[2] Risk. We can specify all [or nearly all] of the outcomes, estimate their probabilities within reasonable margins of error and estimate their benefits or costs in monetary terms, again with reasonable margins of error.
[3] Uncertainty. At best we can specify only some of the outcomes and some of the probabilities. A typical case would be a partial list in rank order of probability.
[4] Highly improbable events. These all have minuscule probabilities and come in three “flavors” —
[a] Outcomes of paths to failure. These arise from the innumerable “paths to failure” which are accidentally built into systems which are complicated, complex or both. For example —
communications systems, electric transmission and distribution systems, nuclear-steam electric-generating stations, semiconductors and space vehicles.
These systems have so many paths to failure that we cannot imagine all of them, much less afford the time and money to test for them. Although each one has a minuscule probability of being activated, there are so many that there is a significant probability that some one of them will! Three Mile Island may be a good example.
[b] Other HIEs which are foreseeable. Before they happen, they seem too improbable to be worth planning for. Examples would be — Chernobyl, Deepwater Horizon and the Exxon Valdez.
By contrast, one should definitely try to prevent an asteroid from crashing into the earth, because the consequences would be so horrendous.
[c] Other HIEs which are not foreseeable. For example — some of the “black swan” events described by Taleb.
[Some people claim that the NYSE crashes of 10/77 and 10/87 were black swans. But Minsky, Sornette and others have argued cogently that such events are endogenous to the market.]
The foregoing puts managers dealing with the long run on the spot. For example —
[1] If the probability of a misfortune is only one percent and its estimated cost is only one million dollars, then the certainty equivalent is an event which costs $10,000. If the anticipatory measures are expensive, one would be inclined to do nothing.
[2] However, if the cost of a misfortune is $10 billion, then the certainty equivalent has a cost of $100 million, and one had better take some precautionary measures!
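The arithmetic behind these two cases is simply probability times cost. A minimal sketch in Python (the function name is mine, and the figures are the hypothetical ones from the text):

```python
def expected_cost(probability: float, cost: float) -> float:
    """Certainty-equivalent cost of a misfortune: probability times cost."""
    return probability * cost

# A 1% chance of a $1 million loss is equivalent to a certain cost of about $10,000.
small = expected_cost(0.01, 1_000_000)
# A 1% chance of a $10 billion loss is equivalent to about $100 million.
large = expected_cost(0.01, 10_000_000_000)
print(f"${small:,.0f}  vs  ${large:,.0f}")
```

The point of the comparison is that the same probability attached to a vastly larger loss moves the certainty equivalent from "ignorable" to "act now".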
Things get even more difficult when the people involved are more important than the equipment and machinery. For example — in crude oil and stock markets, the “players” must deal not only with the limitations on their own information and skills but also with their anticipations of what the other players are going to do, taking into account imperfect knowledge of those players' limited information and skills. In many cases, this leads to a non-equilibrium situation which resembles not a market so much as a pack of unruly hound dogs chasing each others' tails.
The foregoing is complicated by the players’ incomplete and inconsistent preferences, by inconsistent attitudes towards discounting and by other behavioral anomalies.
For a more detailed discussion of these problems, see the references in the section “Perceptions of reality — knowns and unknowns” in my 2006 paper, “Notes for a new paradigm for economics”, available for the asking at . ###
This elaborates on the subject of uncertainty, mentioned briefly in my first comment on Ormerod.
A major source of uncertainty is the tendency of dynamic systems to generate runs of output data which look as though they came from some other kind of system entirely. This greatly complicates the interpretation and forecasting of system behavior.
For example — Weather systems are basically chaotic and therefore deterministic. But as an empirical matter, storms which arise off the west coast of Africa and eventually turn into hurricanes as they approach the Lesser Antilles spend most of their time in a random mode! If the random behavior is Markovian, it may even generate pseudo trends!
Although the logistic curve is purely deterministic, consecutive results, when each is used as the “seed” for the next, may exhibit oscillating, random-looking or trend behavior, as well as the signature chaotic behavior, depending on the parameter and the initial seed.
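This range of behaviors is easy to see by iterating the logistic map x → r·x·(1−x) directly. A short illustrative sketch (the function name and the particular parameter values are my own choices):

```python
def logistic_map(r: float, x0: float, n: int) -> list[float]:
    """Iterate x_{k+1} = r * x_k * (1 - x_k), feeding each result back in as the next seed."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# r = 2.5: trend-like approach to a fixed point at 1 - 1/r = 0.6
# r = 3.2: settles into a stable period-2 oscillation
# r = 4.0: fully chaotic; nearby seeds diverge rapidly
for r in (2.5, 3.2, 4.0):
    print(r, logistic_map(r, 0.2, 50)[-4:])
```

The same purely deterministic rule thus produces trend, oscillation or apparent randomness, depending only on the parameter and seed.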
For example — Complex systems occasionally go into chaotic modes [such as the NYSE in 10/77 and 10/87] or into random modes, such as a “trading range” in a stock market. Usually one has no clue as to how long these runs will last or where the system will be in its hyperspace at the end of such a run.
For example — Some random systems may generate pseudo cycles or pseudo trends which are strikingly similar to the real thing, as with the cumulative winnings or losses when one bets equal amounts on heads or tails.
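The coin-tossing example can be simulated in a few lines: a fair coin has zero drift, yet the cumulative total typically wanders far from zero and lingers on one side, which can be mistaken for a genuine trend. A minimal sketch (function name and seed are mine):

```python
import random

def coin_flip_walk(n: int, seed: int) -> list[int]:
    """Cumulative winnings from betting $1 per toss on a fair coin."""
    rng = random.Random(seed)
    total, path = 0, []
    for _ in range(n):
        total += 1 if rng.random() < 0.5 else -1
        path.append(total)
    return path

path = coin_flip_walk(1000, seed=42)
# Despite zero expected drift, the path typically strays well away from zero
# and spends long stretches on one side of it.
print(min(path), max(path), path[-1])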
Another source of uncertainty is the inability of both econometric modelers and Wall Street technicians to call correctly more than 70% of the turning points in the time series generated by complex systems, such as a stock-market price series. As one 39-year veteran of Wall Street said a few years ago, “If you get 80%, you are a hero!” ###