## Uncertainty and the “data-generating process” in macroeconomics

from **Lars Syll**

The financial crisis of 2007–08 took most laymen and economists by surprise. What was it that went wrong with our macroeconomic models, since they obviously did not foresee the collapse or even make it conceivable?

There are many who have ventured to answer this question. And they have come up with a variety of answers, ranging from the exaggerated mathematization of economics to irrational and corrupt politicians.

**But the root of our problem goes much deeper. It ultimately goes back to how we look upon the data we are handling.** In “modern” macroeconomics – dynamic stochastic general equilibrium, new synthesis, new-classical and new-Keynesian – variables are treated as if drawn from a known “data-generating process” that unfolds over time and on which we therefore have access to heaps of historical time-series. If we do not assume that we know the “data-generating process” – if we do not have the “true” model – the whole edifice collapses. And of course it has to. I mean, who really honestly believes that we should have access to this mythical Holy Grail, the data-generating process?

“Modern” macroeconomics obviously did not anticipate the enormity of the problems that unregulated “efficient” financial markets created. Why? Because it builds on the myth of us knowing the “data-generating process” and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

This is like saying that you are going on a holiday trip and that you know that the chance of the weather being sunny is at least 30%, and that this is enough for you to decide whether or not to bring along your sunglasses. You are supposed to be able to calculate the expected utility from the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.

But as Keynes convincingly argued in his monumental *Treatise on Probability* (1921), this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10% and according to another – equally good – model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to.
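The sunglasses argument can be made concrete in a few lines of code. The following is a minimal sketch with entirely hypothetical payoff numbers (not Keynes's or the author's): under *risk*, a single known probability settles the decision by expected utility; under *uncertainty*, two equally good models can recommend opposite actions, so the data do not single out one choice as the only “rational” one.

```python
def expected_utility(p_sun, utility_if_sun, utility_if_rain):
    """Expected utility of an action, given a single known probability of sun."""
    return p_sun * utility_if_sun + (1 - p_sun) * utility_if_rain

# Hypothetical payoffs: sunglasses pay off if it is sunny, carrying them
# needlessly has a small cost, and leaving them behind is mildly regretted
# if the sun comes out.
BRING = {"sun": 1.0, "rain": -0.3}
LEAVE = {"sun": -0.2, "rain": 0.0}

def best_action(p_sun):
    eu_bring = expected_utility(p_sun, BRING["sun"], BRING["rain"])
    eu_leave = expected_utility(p_sun, LEAVE["sun"], LEAVE["rain"])
    return "bring" if eu_bring > eu_leave else "leave"

# Under risk, one known probability yields one answer:
print(best_action(0.30))   # -> bring

# Under uncertainty, two equally good models disagree,
# and expected-utility calculation cannot break the tie:
print(best_action(0.10))   # -> leave
print(best_action(0.40))   # -> bring
```

With a 10% model the calculation says leave the sunglasses at home; with a 40% model it says bring them. Nothing inside the expected-utility machinery tells you which model to trust.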

**In the end this is what it all boils down to.** We all know that many activities, relations, processes and events are of the Keynesian uncertainty-type. The data do not unequivocally single out one decision as the only “rational” one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

**Some macroeconomists, however, still want to be able to use their hammer.** So they pretend that the world looks like a nail – that uncertainty can be reduced to risk – and construct their mathematical models on that assumption. The result: financial crises and economic havoc.

How much better it would be – and how much smaller the risk of lulling ourselves into the comforting thought that we know everything, that everything is measurable, and that we have everything under control – if instead we could just admit that we often simply do not know, and that we have to live with that uncertainty as best we can.

**Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheel is a sure recipe for only one thing – economic catastrophe!**

Yes, and that is why I have argued that the mainstream models assume, as Paul Samuelson insisted, the ergodic axiom. If the situation is truly uncertain in Keynes’s and your ontological sense, then economics must be a nonergodic science.

Paul Davidson

Yes, it is nonergodic. But data also DOES have its place. The real problem is we aren’t actually using ALL of the needed “tools” to help us understand and then craft more workable and humane economic and financial systems. We need science and math AND wisdom. After all, wisdom is the higher-order level of human thinking, which has observed human action for some 8,000 years. It’s probably a “science” we can trust.

I assert that you can (actually scientifically) reduce/condense human wisdom to the four ideas, values and experiences of confidence, hope, love and grace. So let’s base our systems on these, derive policies that accurately reflect them, and then monitor the results with math and science so as to craft appropriately wise regulation as needed.

As we can see, the record of relying only on math and science in economics and finance shows that it not only hasn’t worked very well; it has resulted in an arrogant and rigid scientific and mathematical orthodoxy which is actually the polar opposite of real wisdom.

If we are borrowing from Peter to pay Paul and vice versa, which I claim we are, then any reduction in the supply of new money borrowed from banks, for any reason, will cause mathematically inevitable defaults.

Thus even a slight economic slowdown can become a deflationary death spiral.
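The arithmetic behind this claim can be sketched in a few lines. This toy model and its numbers are hypothetical illustrations of the commenter’s premise (all money created as interest-bearing bank loans), not the linked proofs themselves: when new lending falls below the interest bill, the money in circulation is arithmetically insufficient to repay what is owed.

```python
# Toy economy in which every unit of money enters circulation as the
# principal of an interest-bearing bank loan (the premise claimed above).
def repayment_shortfall(loans_outstanding, interest_rate, new_lending):
    """Money owed this period minus money available to pay it."""
    owed = loans_outstanding * (1 + interest_rate)   # principal + interest due
    available = loans_outstanding + new_lending      # money in circulation
    return owed - available

# While new lending covers the interest bill, everyone can (in aggregate) pay:
print(repayment_shortfall(1000.0, 0.05, 60.0))   # negative: no shortfall

# If new lending slows below the interest bill, some borrowers cannot repay,
# whatever their individual merits:
print(repayment_shortfall(1000.0, 0.05, 20.0))   # positive: defaults must occur
```

In the second case the economy owes 1,050 but only 1,020 exists to pay with, so 30 units of defaults are built into the arithmetic – which is the mechanism the comment asserts.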

In cartoon form:

More robust written proofs:

http://paulgrignon.netfirms.com/MoneyasDebt/twicelentanimated.html

http://paulgrignon.netfirms.com/MoneyasDebt/Analysis_of_Banking.html

Math has the potential to accurately model the temporal universe…if it actually looks everywhere it needs to look and also actually de-constructs all faulty orthodoxies. Wisdom condensed and de-mystified has already codified and done that, and so is not only “nice” but a deeper and better basis upon which to craft policy.

To put another slant on this, the probabilistic philosophy of science behind Samuelson’s ergodic axiom, neo-classical mathematical modelling and even the thinking of many physicists originated not in the realistic observation and insight of a Bacon or Bhaskar but in the spin put on Newton’s methods by a non-scientific doubting Thomas, the imaginative sceptic David Hume. Even Keynes studied probability some years before Shannon treated noise as a disturbance of an INTENDED data generating process, not as the cause of the process. (The intention is formed by pulses of electricity flowing predictably down neurons, but with considerable uncertainty as to which neurons and at what times.)

I’m a delighted fan of chaos theory, but the data generating process the mathematics describes is the attraction before it becomes a strange attractor formed by different attractions accidentally coming within range of each other. A river is a data generating process wherein the macro data is on the whole determined by its banks, not by the ripples and the whirlpools it may have in it. Traffic data is usually determined by the traffic sticking to roads, not accidentally or deliberately leaving them (only when the chaos on the roads gets so bad that people start driving cross-country will the road structure not largely determine the traffic data). The beginnings and the end of energy waves have to be brought together by turbulence, not accident, to be captured as sub-atomic particles.

Despite what Hume had to say – and what today even most physicists have been taught to think – probability didn’t generate anything real, only distorted images of reality. And as Erich Fromm put it, the First Commandment of the Decalogue is also the first commandment of logic: “Thou shalt not confuse the image with the reality”. The context of that was the Israelites valuing money they could see (the Golden Calf) more than the helpful God they had forgotten.