Hicks on the limited applicability of probability calculus
from Lars Syll
When we cannot accept that the observations, along the time-series available to us, are independent, or cannot by some device be divided into groups that can be treated as independent, we get into much deeper water. For we have then, in strict logic, no more than one observation, all of the separate items having to be taken together. For the analysis of that the probability calculus is useless; it does not apply. We are left to use our judgement, making sense of what has happened as best we can, in the manner of the historian. Applied economics does then come back to history, after all.
I am bold enough to conclude, from these considerations that the usefulness of ‘statistical’ or ‘stochastic’ methods in economics is a good deal less than is now conventionally supposed. We have no business to turn to them automatically; we should always ask ourselves, before we apply them, whether they are appropriate to the problem at hand. Very often they are not. Thus it is not at all sensible to take a small number of observations (sometimes no more than a dozen observations) and to use the rules of probability to deduce from them a ‘significant’ general law. For we are assuming, if we do so, that the variations from one to another of the observations are random, so that if we had a larger sample (as we do not) they would by some averaging tend to disappear. But what nonsense this is when the observations are derived, as not infrequently happens, from different countries, or localities, or industries — entities about which we may well have relevant information, but which we have deliberately decided, by our procedure, to ignore. By all means let us plot the points on a chart, and try to explain them; but it does not help in explaining them to suppress their names. The probability calculus is no excuse for forgetfulness.
John Hicks’ Causality in economics ought to be on the reading list of every course in economic methodology.
Let me add three suggestions that come from my reading of Hicks.
(1) Renounce, as a matter of scientific research, all macroeconometric efforts aimed at prediction.
As long as there is a demand for prediction, searches for predictive models will continue, but they are activities like astrology. Let us ignore them.
(2) Put more effort into historical and empirical research.
As Hicks put it,
(3) Change the agenda of our research. Try to find firmer causal relations, even if they are valid only for a very restricted range. Try to build a systemic theory on the basis of those firm hypotheses.
It is necessary to know the causal structure that underlies the apparent processes. We may say that we need more evidence-based economics.
Well dug out, Lars.
Yoshinori’s first comment threw me, as I initially read his “macroeconometric” as “macroeconomic”! In any case, understanding that economics is modelling flows (https://rwer.wordpress.com/2020/02/03/the-knowledge-of-childless-philosophers/#comment-164344) I disagree with his conclusion, “Let us ignore them”. One can see that the rivers go where their channel takes them, but so do floods, and where these can occur town planners need to predict not only how often but how seriously in order to decide what (if anything) to do about them.
Governments are perhaps at the other end of this: given shortages of water, deciding if and how to top up from other sources (as in Keynes’ employment on infrastructure). Even my schoolboy introduction to differential calculus via economics had manufacturers knowing what would sell but tolerancing production levels to allow for the risk of what might not.
So yes, history, but not just of sales but of opening up a region’s productive capacities with railroads to markets; taking people to bits to see empirically how nerves open up our productive capacities by linking sensed information with control of our actions and habitual formation of efficient paths.
And yes: change the research agenda to that being implied by what BinLi and I have been discussing: like Newton, studying the geometric form of paths rather than an algebraic representation of them which eliminates Boulding’s (“The Image”) iconic contact with reality.
Lars complains, in effect, that eyes are not very good for seeing; the light and our preconceptions can play tricks with them. All quite true. But try seeing without them. Statistical analysis is easily misused in social studies but without it there is no progress beyond very broad and vague generalisations. Establishing the limits of a generalisation and the necessary qualifications entails confronting it with data.
All events are embedded in history and the result of innumerable particularities. That does not preclude the existence of regularities. An uncontroversial example is the seasons. They did not generate human history but they are an ever-present influence – ask Napoleon. Economics is the search for such regularities and valid generalisations in a subset of human activities related to making a living, getting and spending. Any generalisations are unlikely to apply everywhere, in a hunter-gatherer tribe and in a complex commercial economy. We have to find and specify the limits for any one and detect its presence in historical events shaped by many other factors. Given the data sets we have that is impossible without resort to statistical analysis.
Once you find a generalisation that seems to hold on past data it is always tempting to wonder whether it will hold in future. It may not; an essential qualification may have been missed because not evident in the historical sample. Anyway the system is evolving and the generalisation may be obsolescent. We cannot prohibit empirical research, however, on the grounds that it might be used for forecasting.
Gerald’s poor-eyes analogy is quite false. It is a red herring; implicit in the false analogy is the dismissal of evidence we can gain by actually using our _real_ eyes: observation, historical studies, and evidence readily available and regularly used to understand the real world around us. Gerald falsely dismisses such evidence as merely “vague generalisations” simply because he has a one-tool toolbox: to his statistical hammer, everything looks like a nail.
At this point I find his response silly, ridiculously deaf, dumb and blind, and self-contradictory given statements he has made elsewhere. It seems he doth protest too much and is simply enjoying playing the devil’s advocate for a pastime, or somehow feels his livelihood is threatened if Lars is right.
If one wants to know the causes of the subprime mortgage crisis, statistical analysis is hardly the tool one would use to confirm the vast volumes of paper-trail evidence forthcoming from legal subpoenas, revealing the real underlying fraud, manipulation, and deception practised by banks, mortgage brokers, ratings agencies, lawyers, accountants, etc.
Statistical analysis is used in many contexts, and rightfully so, but only a fool dismisses other forms of evidence as “vague generalisations” simply because they are not statistically based. This is, simply put, absurd.
Statistical analysis can be a very useful tool – within limits – but push that tool beyond its limits and ignore other sources of evidence one can _see with one’s own eyes_, and one is behaving more like an idiot savant than a real scientist. For example, in uncovering financial shenanigans one can use statistical analysis based upon knowledge of the relationship between the balance sheet and income statement to determine whether there are potentially shenanigans going on that misrepresent the actual financial health of a company. But this only suggests where one should actually use one’s _real eyes_ to investigate further and determine if the potential is actually real. There is, after all, an art to fooling investors which statistics alone cannot uncover.
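As a toy illustration of the kind of balance-sheet-to-income-statement screen just described (my own sketch, not anything proposed in the discussion): a wide gap between reported profit and operating cash flow – high “accruals” – is a standard statistical red flag for earnings manipulation. The figures and the 10% threshold below are invented for demonstration; the point is that the screen only tells you where to look closer with your real eyes.

```python
# Hypothetical accruals screen: flags firms whose reported net income far
# exceeds the cash their operations actually generated. All numbers are
# made up for illustration; the threshold is arbitrary, not an auditing rule.

def accruals_ratio(net_income, operating_cash_flow, total_assets):
    """(Net income - operating cash flow) scaled by total assets."""
    return (net_income - operating_cash_flow) / total_assets

def red_flag(net_income, operating_cash_flow, total_assets, threshold=0.10):
    # High accruals suggest paper profits not backed by cash - worth a
    # closer, human look at the filings; it proves nothing by itself.
    return accruals_ratio(net_income, operating_cash_flow, total_assets) > threshold

# Hypothetical firm: strong paper profits, weak cash generation.
print(red_flag(net_income=120.0, operating_cash_flow=20.0, total_assets=500.0))  # True
```

A flagged firm is a candidate for investigation, not a verdict; the statistics point, the eyes confirm.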
Repeatedly Ken Zimmerman and Robert Locke have raised the point that if one wants to understand economics, then one should use one’s _real eyes_ to study what businesses and business men and women _actually do_ and what their reasons for doing so actually were. When such historical observations and historical analysis are falsely dismissed as merely “vague generalisations”, one is led to ask: whatever do you mean?
When statistical shadows are considered more real than actual evidence seen with our own eyes (such as pretending that the random walks of stock market prices tell us anything about the underlying rules created by lobbyists and politicians, or about actual case studies of market manipulation underlying bubbles-and-crashes), then we as a society have lost our real eyes and dehumanized economics.
“It’s the monetary and financial paradigm, stupid!”
Yes, Craig, but Meta Capitalism (Rob?) is right about Gerald being way off the point here: Lars was after all merely quoting Hicks. However, what astonishes me about Rob is his ability to unself-consciously quote other people’s comments at such length, even allowing that here they are very relevant. By the time one has read them one has almost forgotten what the argument was about!
All I will say is that Gerald seems to be looking only at the data (measurements or verbal interpretations of what other people are seeing) rather than focussing his senses on events in the real physical world.
What one sees and doesn’t see, the methods one uses and doesn’t use, the values one assumes and doesn’t assume, are inescapably part of the equation.
I have, over the last few years while studying economics, added to a database a list of various authors’ definitions of economics. It is rather illuminating that these definitions are all over the map: some ridiculously narrow, as Gerald’s above, others more expansive, including institutions, culture, power relations, politics, etc. It appears economics is really defined in the eyes of the beholder, and that in turn determines which methodologies, which critical assumptions, and which evidence is seen or not seen ;-)
The obviously correct thing to do is to dedicate ourselves to integrating only the truths in opposing perspectives along with their highest ethical considerations, i.e. wisdom/paradigm perception.
That and focus on the equally obvious fact that “It’s the monetary and financial paradigm, stupid!”
One reason for such compartmentalisation within the social sciences is the attempt by economists during the late nineteenth century to emulate contemporary physics by becoming ‘scientific’. To do so, ‘political’ was dropped from ‘political economy’, to focus almost exclusively on how rational individuals maximize their happiness by allocating scarce resources amongst unlimited wants. Thus, this ‘new’ economics, or ‘neoclassical economics’, limited its approach to one narrowly defined as ‘scientific’, and mostly focused on the question of rational choice rather than the investigation of the economy’s ability to provision. Needless to say, not all economists accepted this constricted scope and method, giving rise to the proliferation of many schools of thought within economics.
(….) Other disciplines, particularly sociology and anthropology, formed and developed in order to investigate areas and issues jettisoned by neoclassical economics, such as group behaviour, institutions, property rights, power, culture and the historical evolution of capitalism.
Does such compartmentalisation (within the social sciences) help or hinder? Although we believe in the benefits of specialisation, we also feel that specialisation without cooperation is limiting and self-defeating. Each discipline can and should learn from others. One of the goals of education should be to recognize that in the real world our problems are not demarcated by discrete disciplines. For example, climate change is neither a sociological, environmental nor economic phenomenon. We need the insights of all disciplines to solve our problems, and yet each of the social sciences is a work in progress, since there is a lot we do not yet know. Perhaps one of our goals as social scientists should be to reduce the barriers, blend the disciplines, and/or work across disciplines: that is, to be interdisciplinary.
Exclusive reliance on only one discipline gives a misleading and myopic understanding.
— Reardon, Jack, et al. Introducing a New Economics: Pluralist, Sustainable, and Progressive. London: Pluto Press, 2018, pp. 2–3.
On this blog people routinely identify economics with neoclassical economics. Kalecki, Simon and Minsky were all economists – and great ones. Everyone here agrees that the application of the neoclassical approach to macroeconomic phenomena is an error and that the new classical, or Chicago, school is ridiculous, so I don’t know what (psychological?) purpose is served by repeating it endlessly.
Understanding any historical episode requires attention to all the particularities of the situation – as historians do. The question of which of the factors at play represent more general tendencies and could recur is not trivial. Of course one has to frame causal hypotheses based on one’s knowledge of the world – only Lars thinks hypotheses emerge from data. But when it comes to testing the generality of a hypothesis, you need a lot of data, given the complexity of social systems. And you cannot analyse that data except statistically. Everything else is anecdote and literature.
Not that I’ve got anything against literature…..
“And you cannot analyse that data except statistically”? One can analyse it systematically, as one does an Arabic numeral: by checking the reasonableness of all the digits in turn, as far as is necessary, from the most significant to the least – it being the case that even the smallest error in the most significant digit (or indeed in the number of digits) is more significant than all the less significant digits put together, no matter whether they are right or wrong. How else can a few lines in a caricature sometimes capture more of a person’s character than a detailed portrait? As caricaturist G. K. Chesterton put it, because of “the significance of outline”. We are here discussing macroeconomics, not the micro variety.
Perhaps we’ll get a somewhat broader perspective on probability if we look outside economics. In recent decades, probabilistic risk assessment (PRA) has become an essential tool in risk analysis and management in many industries and government agencies. The origins of PRA date to the 1975 publication of the U.S. Nuclear Regulatory Commission’s (NRC) Reactor Safety Study led by MIT professor Norman Rasmussen. The “Rasmussen Report” inspired considerable political and scholarly disputes over the motives behind it and the value of its methods and numerical estimates of risk. The Report’s controversies have overshadowed the deeper technical origins of risk assessment. Nuclear experts had long sought to express risk in a “figure of merit” to verify the safety of weapons and, later, civilian reactors. By the 1970s, technical advances in PRA gave the methodology the potential to serve political ends, too. The Report, it was hoped, would prove nuclear power’s safety to a growing chorus of critics. Subsequent attacks on the Report’s methods and numerical estimates damaged the NRC’s credibility. PRA’s fortunes revived when the 1979 Three Mile Island (TMI) accident demonstrated PRA’s potential for improving the safety of nuclear power and other technical systems. Nevertheless, the Report’s controversies endure in mistrust of PRA and its experts. This has not stopped nuclear and other experts from applying PRA. The latest report was published in 2016.
PRA grew out of concerns that the nuclear industry remained wedded to a “deterministic analysis” and a redundant “defense-in-depth” approach that downplayed the role of risk assessment in safety evaluations. Regulators using a deterministic approach simply tried to imagine “credible” mishaps and their consequences at a nuclear facility and then required the defense-in-depth approach—layers of redundant safety features—to guard against them. Before TMI no severe accidents that melted the core of a plant had ever occurred, and no sure way existed to calculate the probability of a major accident. NRC experts used their collective judgment to determine what accidents were credible, and the agency often mandated multiple safety systems to compensate for the uncertainty of an accident’s probability and consequences. This approach had worked well in protecting public safety; defense-in-depth was critical in preventing sizable releases of the most dangerous forms of radiation at TMI. However, the defense-in-depth approach was not effective in prioritizing accidents or in judging when an extra, often expensive, safety system produced a commensurate increase in margins of safety. Proponents argued that a PRA, with its much more detailed use of probabilities and modeling of plant and human behavior, could better deal with such issues.
After it was concluded that “fool-proof” devices for the safety of nuclear reactors could not be created, research focused on how components and systems of nuclear reactors could fail. This involved fault trees applying Boolean logic, which eventually became the “go to” approach for assessing risk for nuclear reactors. With digitization and ever more powerful computers, fault trees and Boolean diagrams became ever bigger, more detailed, and more quickly and easily applied. Now these are buried within “apps” used at every reactor in the US, and at most around the world. With many reactors coming to the end of their operating life, the effectiveness of these apps is about to be tested, severely.
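The core arithmetic of the fault-tree approach can be sketched in a few lines. Assuming independent basic events (a simplification real PRAs go well beyond), an OR gate fails if any input fails and an AND gate fails only if all inputs do. The event names and probabilities below are invented for illustration, not figures from any actual reactor study.

```python
# Minimal sketch of fault-tree quantification, the calculation at the heart
# of probabilistic risk assessment. Basic-event probabilities are assumed
# independent; all numbers here are hypothetical.

from functools import reduce

def and_gate(probs):
    # Output event occurs only if ALL inputs occur: product of probabilities.
    return reduce(lambda a, b: a * b, probs, 1.0)

def or_gate(probs):
    # Output event occurs if ANY input occurs: 1 - P(none occur).
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

# Hypothetical tree: core damage requires loss of cooling AND failure of
# the backup system; loss of cooling is pump failure OR pipe rupture.
pump_failure = 1e-3
pipe_rupture = 5e-4
backup_failure = 1e-2

loss_of_cooling = or_gate([pump_failure, pipe_rupture])
core_damage = and_gate([loss_of_cooling, backup_failure])
print(f"P(core damage) ~ {core_damage:.2e}")  # ~1.50e-05
```

Real PRA software handles thousands of basic events, common-cause failures and correlated components, which is why the independence assumption above is only a starting point.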