Economic crises and uncertainty

from Lars Syll

The financial crisis of 2007-08 took most laymen and economists by surprise. What was it that went wrong with our macroeconomic models, since they obviously did not foresee the collapse or even make it conceivable?

There are many who have ventured to answer this question. And they have come up with a variety of answers, ranging from the exaggerated mathematization of economics to irrational and corrupt politicians.

But the root of our problem goes much deeper. It ultimately goes back to how we look upon the data we are handling. In ‘modern’ macroeconomics — Dynamic Stochastic General Equilibrium, New Synthesis, New Classical and New ‘Keynesian’ — variables are treated as if drawn from a known ‘data-generating process’ that unfolds over time and on which we, therefore, have access to heaps of historical time-series. If we do not assume that we know the ‘data-generating process’ – if we do not have the ‘true’ model – the whole edifice collapses. And of course, it has to. I mean, who really honestly believes that we should have access to this mythical Holy Grail, the data-generating process?

‘Modern’ macroeconomics obviously did not anticipate the enormity of the problems that unregulated ‘efficient’ financial markets created. Why? Because it builds on the myth that we know the ‘data-generating process’ and that we can describe the variables of our evolving economies as draws from an urn of known probability distributions with known means and variances.
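A minimal Python sketch (with invented numbers) of how much this ‘urn’ assumption matters: a Gaussian urn with known mean and variance puts essentially no weight on a crash-sized move, while a fat-tailed urn with exactly the same mean and variance puts orders of magnitude more weight on it.

```python
import numpy as np
from scipy import stats

# Hypothetical daily returns: both 'urns' share mean 0 and std 1%.
mu, sigma = 0.0, 0.01
crash = -0.05  # a five-sigma one-day loss

# Probability of a crash-sized loss if the DGP really were Gaussian...
p_gauss = stats.norm(mu, sigma).cdf(crash)

# ...versus under a fat-tailed Student-t (df=3) scaled to the same variance.
df = 3
scale = sigma / np.sqrt(df / (df - 2))  # Var of t(df) is df/(df-2) * scale^2
p_t = stats.t(df, loc=mu, scale=scale).cdf(crash)

print(f"P(loss <= -5%), Gaussian 'known DGP': {p_gauss:.2e}")  # ~3e-07
print(f"P(loss <= -5%), fat-tailed t(3):      {p_t:.2e}")      # ~2e-03
```

Same mean, same variance, roughly four orders of magnitude of difference in crash probability: ‘knowing’ the first two moments settles almost nothing about the extremes that matter in a crisis.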

This is like saying that you are going on a holiday trip and know that the chance of sunny weather is at least 30%, and that this is enough for you to decide whether or not to bring your sunglasses. You are supposed to be able to calculate the expected utility from the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.
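In code, the ‘risk’ version of the decision is trivially mechanical. A minimal sketch, with utility numbers made up purely for illustration:

```python
# Risk: the probability of sun is treated as KNOWN, so the decision
# reduces to comparing two expected utilities (utilities are invented).
p_sun = 0.30

# Payoffs: (utility if sunny, utility if not sunny)
actions = {
    "bring sunglasses": (10, -3),  # great if sunny, a nuisance to carry if not
    "leave them home":  (0, 0),
}

def expected_utility(payoff, p):
    u_sun, u_no_sun = payoff
    return p * u_sun + (1 - p) * u_no_sun

for action, payoff in actions.items():
    print(f"{action}: EU = {expected_utility(payoff, p_sun):+.2f}")
# With p known, one action maximizes expected utility and the
# 'rational' choice is uniquely determined.
```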

But as Keynes convincingly argued in his monumental Treatise on Probability (1921), this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10% and according to another – equally good – model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to.
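Extending the toy sketch above: with two equally good models putting the chance of sun at roughly 10% and roughly 40%, the expected-utility ranking need not even be stable across that interval, so no unique ‘rational’ choice falls out of the calculation (utilities again invented):

```python
# Uncertainty: two equally good models disagree about p, and there is
# no basis for averaging them into one number (utilities are invented).
payoff_bring = (10, -3)
payoff_leave = (0, 0)

def expected_utility(payoff, p):
    u_sun, u_no_sun = payoff
    return p * u_sun + (1 - p) * u_no_sun

for p in (0.10, 0.40):
    eu_bring = expected_utility(payoff_bring, p)
    eu_leave = expected_utility(payoff_leave, p)
    best = "bring" if eu_bring > eu_leave else "leave"
    print(f"p={p:.2f}: EU(bring)={eu_bring:+.2f}, EU(leave)={eu_leave:+.2f} -> {best}")
# The ranking flips across the interval: the data do not single out
# one decision as the uniquely 'rational' one.
```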

In the end, this is what it all boils down to. We all know that many activities, relations, processes and events are of the Keynesian uncertainty-type. The data do not unequivocally single out one decision as the only “rational” one. Neither the economist nor the deciding individual can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

Some macroeconomists, however, still want to be able to use their hammer. So they pretend that the world looks like a nail and that uncertainty can be reduced to risk, and they construct their mathematical models on that assumption. The result: financial crises and economic havoc.

How much better it would be – and how much smaller the chance of lulling ourselves into the comforting thought that we know everything, that everything is measurable and that we have everything under control – if we could instead just admit that we often simply do not know, and that we have to live with that uncertainty as best we can.

Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheel is a sure recipe for only one thing – economic crisis!

  1. James Beckman
    December 6, 2018 at 12:03 pm

    Since I have long dabbled in real estate in California, I was well aware of the awful quality of the loans being made before 2007-2008. Lenders were aware too, but it was in their self-interest not to sound an alarm. The real alarm must come, I believe, from the practitioners and not the model-builders. The latter lag the former by at least one economic crisis, it appears.

  2. Vladimir A. Masch
    December 6, 2018 at 12:59 pm

    Modeling is compatible with radical – absolutely complete – uncertainty, provided the model is not a dumb one but of the RCO (Risk-Constrained Optimization) type.
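    As a generic illustration of the idea – not Masch's actual RCO method, which is not spelled out here – a scenario-based, risk-constrained maximin choice can be sketched in a few lines of Python, with invented payoffs:

    ```python
    # A generic scenario-based sketch in the spirit of risk-constrained
    # optimization (NOT the actual RCO algorithm; all numbers invented).
    # Each plan is scored against several scenarios; plans whose worst-case
    # outcome breaches a survival threshold are excluded, and among the
    # survivors we pick the best worst case (maximin).
    plans = {
        "aggressive": [9.0, 4.0, -8.0],  # payoff in each scenario
        "balanced":   [5.0, 3.0, -1.0],
        "defensive":  [2.0, 2.0,  1.0],
    }
    threshold = -2.0  # the worst case must not fall below this

    admissible = {name: p for name, p in plans.items() if min(p) >= threshold}
    best = max(admissible, key=lambda name: min(admissible[name]))
    print("admissible plans:", sorted(admissible))  # balanced, defensive
    print("maximin choice:  ", best)                # defensive
    ```

    No probability distribution over scenarios is assumed anywhere: the model copes with radical uncertainty by constraining the worst case instead of averaging over a ‘known’ urn.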

  3. Frank Salter
    December 6, 2018 at 1:39 pm

    How hard is it to understand that if one wishes to deal with cyclic events, one has to solve differential equations in time? The answer would appear to be: very hard indeed. There are few examples of any analyst even making the attempt. It is therefore necessary for analysts to start their analyses from solutions that evolve over time.
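    As a generic illustration – not Salter's own analysis – the kind of endogenous cycles the comment points to fall out of solving a Lotka-Volterra-type system in time, in the spirit of Goodwin's growth-cycle model (parameter values below are invented):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Goodwin-style growth cycle (Lotka-Volterra form); the economic
    # interpretation and the parameter values are illustrative assumptions.
    a, b, c, d = 1.0, 1.5, 1.0, 1.2

    def goodwin(t, y):
        v, u = y                # v: employment rate, u: wage share
        dv = v * (a - b * u)    # employment grows when the wage share is low
        du = u * (d * v - c)    # wage share grows when employment is high
        return [dv, du]

    sol = solve_ivp(goodwin, t_span=(0.0, 40.0), y0=[0.9, 0.6],
                    t_eval=np.linspace(0.0, 40.0, 9))
    for t, v, u in zip(sol.t, sol.y[0], sol.y[1]):
        print(f"t={t:5.1f}  employment={v:.3f}  wage_share={u:.3f}")
    # The solution cycles endogenously: no stochastic 'shocks' are needed
    # to generate fluctuations once the dynamics are solved in time.
    ```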

  4. December 6, 2018 at 4:07 pm

    But then we see unemployment almost perfectly correlated with total private credit, and obviously there's something going on there. If we study irrelevant variables instead, predicting anything is going to look hopelessly complex – and of course it will be.

  5. December 6, 2018 at 4:07 pm

    The FBI knew about the fraud being perpetrated in the sub-prime mortgage business as early as 2004 – and where were the economists then? https://www.huffingtonpost.com/william-k-black/the-two-documents-everyon_b_169813.html Everyone who stood to make a big killing through sub-primes kept his mouth shut about the fraud being committed – auditors, accounting firms, banks, oversight agencies, mortgage servicers, etc., and perhaps even economists.

  6. Craig
    December 6, 2018 at 8:43 pm

    Nothing is clear or seemingly stable preceding paradigm change, and everything is transformed and clarified when the new paradigm is perceived and implemented.

  7. December 17, 2018 at 1:01 pm

    While uncertainty cannot be reduced to risk, there is risk associated with uncertainty. The question we face is how scientists should deal with these risks of uncertainty. Prof. Dr. Jeroen P. van der Sluijs of the University of Bergen's Centre for the Study of the Sciences and the Humanities (SVT) suggests an answer in his paper, “Beyond consensus: reflections from a democratic perspective on the interaction between climate politics and science.”

    Reflecting on the international debate about the Intergovernmental Panel on Climate Change (IPCC) and climate science, van der Sluijs concludes that too little attention has been paid to the political role of the IPCC. The paper examines that political role by distinguishing three strategies for dealing with scientific uncertainty at the science-policy interface: 1) quantifying uncertainty, 2) building scientific consensus, and 3) openness about ignorance. Each strategy has strengths and weaknesses. The way the international community has set up the IPCC and its procedures has basically been guided by the consensus approach, and the current emphasis on restoring faith in the IPCC by improving its procedures reinforces this strategy. Guaranteeing the scientific reliability of IPCC reports is indeed essential, but it does not address the main weakness of the consensus approach: the underexposure of both scientific and political dissent. As a result of this weakness, climate science has become politicized over the past decades. Moreover, as van der Sluijs illustrates for the Netherlands, the consensus approach has hindered a full-blown political climate debate. The third strategy, which aims for more openness about and attention to diversity and deep uncertainty in knowledge and views, may inspire more democratic ways to organize the interface between climate politics and science.

    I agree with van der Sluijs' conclusions, though I'm not so naïve as to believe that some parties won't take advantage of the expanded political debate to put forward false narratives that impede the work of the IPCC.
