Hicks on the limited applicability of probability calculus

from Lars Syll

When we cannot accept that the observations, along the time-series available to us, are independent, or cannot by some device be divided into groups that can be treated as independent, we get into much deeper water. For we have then, in strict logic, no more than one observation, all of the separate items having to be taken together. For the analysis of that the probability calculus is useless; it does not apply. We are left to use our judgement, making sense of what has happened as best we can, in the manner of the historian. Applied economics does then come back to history, after all.

I am bold enough to conclude, from these considerations that the usefulness of ‘statistical’ or ‘stochastic’ methods in economics is a good deal less than is now conventionally supposed. We have no business to turn to them automatically; we should always ask ourselves, before we apply them, whether they are appropriate to the problem at hand. Very often they are not. Thus it is not at all sensible to take a small number of observations (sometimes no more than a dozen observations) and to use the rules of probability to deduce from them a ‘significant’ general law. For we are assuming, if we do so, that the variations from one to another of the observations are random, so that if we had a larger sample (as we do not) they would by some averaging tend to disappear. But what nonsense this is when the observations are derived, as not infrequently happens, from different countries, or localities, or industries — entities about which we may well have relevant information, but which we have deliberately decided, by our procedure, to ignore. By all means let us plot the points on a chart, and try to explain them; but it does not help in explaining them to suppress their names. The probability calculus is no excuse for forgetfulness.

John Hicks’s Causality in Economics ought to be on the reading list of every course in economic methodology.
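
Hicks’s small-sample point is easy to make concrete. The sketch below is a hypothetical simulation (written in Python; nothing in it comes from Hicks or Lars): a dozen “country” observations in which x and y are both driven by an unobserved national characteristic rather than by each other, yet a pooled regression routinely certifies a “significant” law.

```python
# A minimal sketch, with simulated data, of the trap Hicks describes:
# twelve observations from different "countries", where x and y are both
# driven by a latent national characteristic c rather than by each other.
import numpy as np
from scipy import stats

significant = 0
for seed in range(1000):
    rng = np.random.default_rng(seed)
    c = rng.normal(size=12)            # unobserved country-level factor
    x = c + 0.5 * rng.normal(size=12)  # both variables inherit c ...
    y = c + 0.5 * rng.normal(size=12)  # ... but do not cause each other
    if stats.linregress(x, y).pvalue < 0.05:
        significant += 1

# The pooled regression finds a "significant" x-y relation in the great
# majority of samples, though the relation runs entirely through the
# entities whose names the procedure has suppressed.
print(f"'significant' at the 5% level in {significant / 10:.1f}% of samples")
```

The point is not that the correlation is unreal; it is that reading it as a general law means ignoring exactly the country-level information Hicks says we have no business ignoring.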

  1. Yoshinori Shiozawa
    February 6, 2020 at 6:09 am

    Let me add three suggestions that come from reading Hicks.

    (1) Renounce, as a matter of scientific research, all macroeconometric efforts aimed at prediction.

    As long as there is a demand for prediction, searches for predictive models will continue, but they are activities like astrology. Let us ignore them.

    (2) Put more effort into historical and empirical research.

    As Hicks put it,

    We are left to use our judgement, making sense of what has happened as best we can, in the manner of the historian. Applied economics does then come back to history, after all.

    (3) Change the agenda of our research. Try to find firmer causal relations, even if they are valid only for a very restricted range. Try to build a systemic theory on the basis of those firm hypotheses.

    It is necessary to know the causal structure that underlies the apparent processes. We may say that we need more evidence-based economics.

  2. February 6, 2020 at 11:48 am

    Well dug out, Lars.

    Yoshinori’s first comment threw me, as I initially read his “macroeconometric” as “macroeconomic”! In any case, understanding that economics is modelling flows (https://rwer.wordpress.com/2020/02/03/the-knowledge-of-childless-philosophers/#comment-164344), I disagree with his conclusion, “Let us ignore them”. One can see that the rivers go where their channel takes them, but so do floods, and where floods can occur town planners need to predict not only how often but how seriously, in order to decide what (if anything) to do about them.

    Governments are perhaps at the other end of this: given shortages of water, deciding if and how to top up from other sources (as in Keynes’ employment on infrastructure). Even my schoolboy introduction to differential calculus via economics had manufacturers knowing what would sell but tolerancing production levels to allow for the risk of what might not.

    So yes, history, but not just of sales but of opening up a region’s productive capacities with railroads to markets; taking people to bits to see empirically how nerves open up our productive capacities by linking sensed information with control of our actions and habitual formation of efficient paths.

    And yes: change the research agenda to that being implied by what BinLi and I have been discussing: like Newton, studying the geometric form of paths rather than an algebraic representation of them which eliminates Boulding’s (“The Image”) iconic contact with reality.

  3. ghholtham
    February 7, 2020 at 7:35 pm

    Lars complains, in effect, that eyes are not very good for seeing; the light and our preconceptions can play tricks with them. All quite true. But try seeing without them. Statistical analysis is easily misused in social studies but without it there is no progress beyond very broad and vague generalisations. Establishing the limits of a generalisation and the necessary qualifications entails confronting it with data.

    All events are embedded in history and are the result of innumerable particularities. That does not preclude the existence of regularities. An uncontroversial example is the seasons. They did not generate human history but they are an ever-present influence – ask Napoleon. Economics is the search for such regularities and valid generalisations in a subset of human activities related to making a living, getting and spending. Any generalisation is unlikely to apply everywhere, both in a hunter-gatherer tribe and in a complex commercial economy. We have to find and specify the limits of any one and detect its presence in historical events shaped by many other factors. Given the data sets we have, that is impossible without resort to statistical analysis.

    Once you find a generalisation that seems to hold on past data it is always tempting to wonder whether it will hold in future. It may not; an essential qualification may have been missed because not evident in the historical sample. Anyway the system is evolving and the generalisation may be obsolescent. We cannot prohibit empirical research, however, on the grounds that it might be used for forecasting.

    • Meta Capitalism
      February 8, 2020 at 12:22 am

      Lars complains, in effect, that eyes are not very good for seeing; the light and our preconceptions can play tricks with them. All quite true. But try seeing without them. ~ Gerald Holtham’s False Analogy

      .
      Gerald’s poor-eyes analogy is quite false. It is a red herring; implicit in the false analogy is the dismissal of evidence which we can gain through actually using our _real_ eyes: observation, historical studies, and evidence readily available and regularly used to understand the real world around us. Gerald falsely dismisses such evidence as merely “vague generalisations” simply because he has a one-tool toolbox, a statistical hammer, and treats every problem as a nail.
      .
      At this point I find his response silly, ridiculously blind, and self-contradictory given statements he has made elsewhere. It seems he does protest too much and is simply enjoying playing the devil’s advocate as a pastime, or somehow feels his livelihood is threatened if Lars is right.
      .
      If one wants to know the causes of the Subprime Mortgage crisis, statistical analysis is hardly the tool one would use to confirm the vast volumes of paper-trail evidence forthcoming from legal subpoenas, revealing the real underlying fraud, manipulation, and deception practiced by banks, mortgage brokers, ratings agencies, lawyers, accountants, etc.
      .
      Statistical analysis is used in many contexts, and rightfully so, but only a fool dismisses as “vague generalisations” other forms of evidence simply because they are not statistically based. This is simply put absurd.
      .

      I can’t afford the operation, but would you accept a small payment to touch up the x-rays?
      — WARREN BUFFETT, CEO OF BERKSHIRE HATHAWAY

      .
      Statistical analysis can be a very useful tool, within limits, but push that tool beyond those limits and ignore other sources of evidence one can _see with one’s own eyes_, and one is behaving more like an idiot savant than a real scientist. For example, in uncovering financial shenanigans one can use statistical analysis, based upon knowledge of the relationship between the balance sheet and the income statement, to determine whether there are potentially some shenanigans going on that misrepresent the actual financial health of a company. But this only suggests where one should actually use one’s _real eyes_ to investigate further and determine whether the potential is actually real. There is, after all, an art to fooling investors which statistics alone cannot uncover.
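      For concreteness, here is a minimal sketch of the kind of balance-sheet/income-statement screen alluded to above (the company names, figures, and the 5% threshold are all hypothetical and purely illustrative): a high accruals ratio only tells you where to point your _real eyes_.

      ```python
      # A minimal, hypothetical screen: earnings running persistently ahead
      # of operating cash flow (a high accruals ratio) can flag accounts
      # that deserve a closer look -- it proves nothing by itself.
      from dataclasses import dataclass

      @dataclass
      class Financials:
          name: str
          net_income: float           # from the income statement
          operating_cash_flow: float  # from the cash flow statement
          total_assets: float         # from the balance sheet

      def accruals_ratio(f: Financials) -> float:
          """(Net income - operating cash flow) / total assets."""
          return (f.net_income - f.operating_cash_flow) / f.total_assets

      companies = [
          Financials("SteadyCo", 80.0, 95.0, 1000.0),
          Financials("PaperProfitsInc", 120.0, 10.0, 1000.0),
      ]

      for f in companies:
          r = accruals_ratio(f)
          verdict = "investigate further" if r > 0.05 else "no red flag"
          print(f"{f.name}: accruals ratio {r:+.3f} -> {verdict}")
      ```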
      .
      Repeatedly Ken Zimmerman and Robert Locke have raised the point that if one wants to understand economics, then one should use one’s _real eyes_ to study what businesses and the men and women in them _actually do_ and what their reasons for doing so actually were. When such historical observation and historical analysis are falsely dismissed as merely “vague generalisations”, one is led to ask: whatever do you mean?
      .

      “Foroohar demystifies the decline in America’s economic prominence, showing that the competitive threats came not from the outside—migration or China—but from within our borders. She explains how finance has permeated every aspect of our economic and political life, and how those who caused the financial crisis wound up benefiting from it.”
      — Joseph E. Stiglitz, Nobel laureate in economics and former head of the Council of Economic Advisors
      .
      At one point, a reporter pressed the former official on whether he thought that the Dodd-Frank bank reform regulation, which was still only half finished at the time, had been unduly influenced by Wall Street’s lobbying efforts. The official insisted that this wasn’t the case. I was taken aback—I had recently done a column citing academic research showing that 93 percent of all the public consultation on the Volcker Rule, one of the most contentious parts of the Dodd-Frank regulation, had been taken with the financial industry. Wall Street, not Main Street, was clearly the primary voice in the room as the regulation was being crafted. I raised my hand and shared the statistic, and then asked why so many such meetings had been done with bankers themselves, rather than a broader group of stakeholders. The official looked at me in an honest befuddlement, and said, “Who else should we have taken them with?”
      .
      That moment captured for me how difficult it is to grapple with the role of finance in our economy and our society. Finance holds a disproportionate amount of power in sheer economic terms. (It represents about 7 percent of our economy but takes around 25 percent of all corporate profit, while creating only 4 percent of all jobs.) But its power to shape the thinking and the mind-set of government officials, regulators, CEOs, and even many consumers (who are, of course, brought into the status quo market system via their 401(k) plans) is even more important. This “cognitive capture,” as academics call it, was a crucial reason that the policy decisions taken post-2008 resulted in large gains for the financial industry but losses for homeowners, small businesses, workers, and consumers. It’s also the reason that the rules of our capitalist system haven’t yet been rewritten in a way that would force the financial markets to do what they were set up to do: support Main Street. As the aforementioned conversation shows, when all the people in charge of deciding how market capitalism should operate are themselves beholden to the financial industry, it’s impossible to craft a system that will be fair for everyone. (Foroohar, Rana. Makers and Takers: How Wall Street Destroyed Main Street . Kindle Location 103-109.)

      .
      When statistical shadows are considered more real than actual evidence seen with our own eyes (such as pretending that the random walks of stock market prices tell us anything about the underlying rules created by lobbyists and politicians, or about the actual case studies of market manipulation underlying bubbles-n-crashes), then we as a society have lost our real eyes and dehumanized economics.

      • Craig
        February 8, 2020 at 12:49 am

        “It’s the monetary and financial paradigm, stupid!”

      • February 8, 2020 at 9:54 am

        Yes, Craig, but Meta Capitalism (Rob?) is right about Gerald being way off the point here: Lars was after all merely quoting Hicks. However, what astonishes me about Rob is his ability to unself-consciously quote other people’s comments at such length, even allowing that here they are very relevant. By the time one has read them one has almost forgotten what the argument was about!

        All I will say is that Gerald seems to be looking only at the data (measurements or verbal interpretations of what other people are seeing) rather than focussing his senses on events in the real physical world.

    • Meta Capitalism
      February 8, 2020 at 1:28 am

      Offer, Avner and Söderberg, Gabriel. The Nobel Factor [The Prize in Economics, Social Democracy, and the Market Turn]. New Jersey: Princeton University Press; 2016; pp. 60-67.
      Notes: Arrow’s prize-partner John Hicks (another high theorist) dismissed the idea of economics as a science: ‘Our science colleagues find permanent truths; economists, who deal with the daily actions of men and the consequences of these, can rarely hope to find the same permanency.’ In his 1974 Nobel Lecture, Friedrich von Hayek denied that economics could meet the standards of science. (The Nobel Factor The Prize in Economics, Social Democracy, and the Market Turn by Avner Offer, Gabriel Söderberg, 2016, 60)
      .
      [Hayek’s] criterion of scientific validity was Popper’s falsification. The Nobel Prize for economic science did not live up to it, and risked a descent into ‘scientism’, the mere pretence of scientific certainty. Hayek did not advocate better science—economics could never be a science, because its core variables could not be observed. It was better to be vaguely right than precisely wrong, he stated (although in different words), a view often attributed to his rival Keynes. Economics was indeterminate, like biology or gardening. True knowledge was innate, and could not be confirmed scientifically by observation. (The Nobel Factor The Prize in Economics, Social Democracy, and the Market Turn by Avner Offer, Gabriel Söderberg, 2016, 61)
      .
      Friedman’s [Nobel Prize] award provoked Gunnar Myrdal (NPW, 1974) to suggest the abolition of the prize. Rather like Hayek (his ideological opponent) he stated that economics could not be a science, since its data were human attitudes and behaviour, whose causes are evolving and inaccessible. Economics could never identify constants or what in former times were called ‘laws of nature’. Economists were as unlike astronomers as was possible. The other reason to distrust economics (on which Myrdal had written with authority some decades before) was that it was shot through with values, and could not avoid taking a view about proper ends. ‘[Economists] keep silent about the role of values in research. They regularly assume that there is a solid body of theories and facts, established without implying value premises, from which policy conclusions can then also be drawn.’ (The Nobel Factor The Prize in Economics, Social Democracy, and the Market Turn, Avner Offer, Gabriel Söderberg, 2016, 62)
      .
      Does it matter if economics is science or not? If science is what scientists do, then perhaps we can leave it to economists to decide. But once economists begin to lay down policy, that autonomy is no longer tenable….

      In addition to theory and evidence, economics is also a normative discipline, and often claims that one policy is better than another. Implicit in the discipline’s policy norms are its norms of validity. Taking economics at face value, it privileges the value of efficiency, defined as satisfaction of individual preferences at the lowest cost. Now efficiency is worth having, but so are other values, such as truth, justice, beauty, freedom, loyalty, or obligation. To privilege efficiency is a value choice, which is not independent of the state of the world, and of the prevalence of other desirables…. The ultimate economic efficiency criterion, ‘Pareto efficiency’, is not even a pure efficiency criterion of ‘more for less’, since it does not question pre-existing endowments (inherited or otherwise unearned), which is itself a value choice. (The Nobel Factor The Prize in Economics, Social Democracy, and the Market Turn, Avner Offer, Gabriel Söderberg, 2016, 64-65)
      .
      Hayek had it right: ‘to entrust to science … more than scientific method can achieve may have deplorable effects’. Joseph Schumpeter (an NPW-level economist born too soon) wrote,
      .
      Unsatisfactory [empirical] performance has always been and still is accompanied [in economics] by unjustified claims, and especially by irresponsible applications to practical problems that were and are beyond the powers of the contemporaneous analytic apparatus. (The Nobel Factor The Prize in Economics, Social Democracy, and the Market Turn, Avner Offer, Gabriel Söderberg, 2016, 66)
      .
      Academic evidence might be inconclusive, but reality is not. As Gunnar Myrdal said, ‘facts kick’. Bad theory makes bad policy, and when reality does not comply, it often has to be coerced. Bad theory is itself a means of coercion. In the Soviet Union, it provided justification for the gulag; in the ‘free market’ United States, for its own massive gulag, a prison system larger in proportion than in any other country, fed by a labour market with frayed safety nets, and managed for profit. The Nobel Prize in economics (as we shall see) was an afterthought, a whimsy almost, of a modern central bank. Monetary doctrine between the wars, which gave rise to modern central banking, also gave rise to depression, unemployment, inequality, and ultimately to a second world war. Good economic theory (in the same years) went some way to fix the harm. (The Nobel Factor The Prize in Economics, Social Democracy, and the Market Turn, Avner Offer, Gabriel Söderberg, 2016, 67)

      .
      What one sees and doesn’t see, the methods one uses and doesn’t use, and the values one assumes and doesn’t assume are inescapably part of the equation.
      .

      Economics is the search for such regularities and valid generalisations in a subset of human activities related to making a living, getting and spending. ~ Gerald Holtham’s Self-Serving Physics Envy (aka econophysics) Definition of Economics

      .
      I have, over the last few years while studying economics, added to a database a list of various authors’ definitions of economics. It is rather illuminating that these definitions are all over the map. Some are ridiculously narrow, like Gerald’s above; others are more expansive, including institutions, culture, power-relations, politics, etc. It appears economics is really defined in the eyes of the beholder, and that in turn determines which methodologies, which critical assumptions, and which evidence is seen or not seen ;-)
      .

      Economics is about how people organise and manage the production of goods and services, as well as the resources that are used in the process of production. The subject matter of economics covers an enormous range of issues, problems, and questions, including questions about how production is organised; why particular activities are undertaken and whether they should be; the nature and functions of the institutions that are associated with organising and carrying out production activities — from banks and manufacturers to shipping and training; and the efficiency of the production process — what criteria should be used to evaluate them, what purposes they serve, and so on. It is the task of economic theory to elucidate these problems and issues which all have to do with people’s activities and, at the root of their activities, their decisions and plans. (Addelson 1995, 3)
      .
      Although it did not matter much at the time, from my earliest encounter with neoclassical economics I remember feeling uneasy about this portrayal of decision-making and choice. The ‘theory of consumer choice’, unfortunately, was the undergraduate’s introduction both to economics and to a neoclassical model. Explaining the purpose of this model, our lecturer spoke about selecting an optimal shopping basket. In spite of the penchant that undergraduates are supposed to have for swallowing whole whatever they are told, the analogy of compiling an optimal basket when faced with an income constraint, while detailing a huge range of possible things on which one might spend money, seemed a long way from the experience of going shopping or from buying the things that family members want. I imagine that students still feel this way about the models, and in teaching economics to graduate management students (who are a critical bunch at the best of times) I used to try to make these models more palatable, arguing — using Hayek’s terminology — that one could think of them as attempting to bring out the logic of what is involved in making effective (optimising) decisions, i.e., as a ‘pure logic of choice’. (Addelson 1995, 3)
      .
      I now think that this sort of rationalisation is specious. Part of the purpose of this book is to substantiate the assertion that neoclassical models of ‘decision-making’ and ‘choice’ will always be unpalatable, because they have nothing to contribute to our understanding of how people make decisions about managing resources. Orthodox or ‘mainstream’ economics is unable to explain choice and conduct because its methodology demands that the scholar look at problems in a way that makes it impossible to understand choice. What a person does and the choices that she makes — whether it is a spouse, a new car, or a career that she is choosing — depends on how she understands her social circumstances. This consideration is formally recognised in social theory in the tradition of Verstehen [see: https://en.wikipedia.org/wiki/Verstehen], or subjective understanding. A theory that purports to explain people’s conduct — what they do and why they do it, including the choices and decisions they make — which is certainly a central task of social science, must be based on a satisfactory explanation of how they themselves understand. Yet the ‘perspective’ of an agent that is embedded in neoclassical theory, as a determinate equilibrium theory, has no bearing on how an individual does ‘see’ things; nor could a person conceivably understand in the way that the rational agent is supposed to ‘know’ about the world. (Addelson 1995, 3-4) (Addelson, Mark. Equilibrium Versus Understanding [Towards the Restoration of Economics as Social Theory]. London: Routledge; 1995; pp. 3-4.)

  4. Craig
    February 7, 2020 at 9:45 pm

    The obviously correct thing to do is to dedicate ourselves to integrating only the truths in opposing perspectives along with their highest ethical considerations, i.e. wisdom/paradigm perception.

    That and focus on the equally obvious fact that “It’s the monetary and financial paradigm, stupid!”

  5. Meta Capitalism
    February 8, 2020 at 1:46 am

    One reason for such compartmentalisation within the social sciences is the attempt by economists during the late nineteenth century to emulate contemporary physics by becoming ‘scientific’. To do so, ‘political’ was dropped from ‘political economy’, to focus almost exclusively on how rational individuals maximize their happiness by allocating scarce resources amongst unlimited wants. Thus, this ‘new’ economics, or ‘neoclassical economics’, limited its approach to one narrowly defined as ‘scientific’, and mostly focused on the question of rational choice rather than the investigation of the economy’s ability to provision. Needless to say, not all economists accepted this constricted scope and method, giving rise to the proliferation of many schools of thought within economics.
    .
    (….) Other disciplines, particularly sociology and anthropology, formed and developed in order to investigate areas and issues jettisoned by neoclassical economics, such as group behaviour, institutions, property rights, power, culture and the historical evolution of capitalism.
    .
    Does such compartmentalisation (within the social sciences) help or hinder? Although we believe in the benefits of specialisation, we also feel that specialisation without cooperation is limiting and self-defeating. Each discipline can and should learn from others. One of the goals of education should be to recognize that in the real world our problems are not demarcated by discrete disciplines. For example, climate change is neither a sociological, environmental nor economic phenomenon. We need insights of all disciplines to solve our problems, and yet each of the social sciences is a work in progress, since there is a lot we do not yet know. Perhaps one of our goals as social scientists should be to reduce the barriers, blend the disciplines, and/or work across disciplines: that is, to be interdisciplinary.
    .
    Exclusive reliance on only one discipline gives a misleading and myopic understanding.
    .
    — Reardon, Jack, et al. Introducing a New Economics [Pluralist, Sustainable, and Progressive]. London: Pluto Press; 2018; pp. 2-3.


  6. ghholtham
    February 8, 2020 at 4:58 pm

    On this blog people routinely identify economics with neoclassical economics. Kalecki, Simon and Minsky were all economists – and great ones. Everyone here agrees that the application of the neo-classical approach to macroeconomic phenomena is an error and the new-classical or Chicago school is ridiculous so I don’t know what (psychological?) purpose is served by repeating it endlessly.
    Understanding any historical episode requires attention to all the particularities of the situation – as historians do. The question of which of the factors at play represent more general tendencies and could recur is not trivial. Of course one has to frame causal hypotheses based on one’s knowledge of the world – only Lars thinks hypotheses emerge from data. But when it comes to testing the generality of a hypothesis, you need a lot of data, given the complexity of social systems. And you cannot analyse that data except statistically. Everything else is anecdote and literature.
    Not that I’ve got anything against literature…..

    • February 8, 2020 at 7:02 pm

      “And you cannot analyse that data except statistically”? One can analyse it systematically, as one does an Arabic number by checking the reasonableness of all the digits in turn as far as is necessary, from the most significant to the least, it being the case that even the smallest error in the most significant digit (or indeed in the number of digits) matters more than all the less significant digits put together, no matter whether they are right or wrong. How else can a few lines in a caricature sometimes capture more of a person’s character than a detailed portrait? As caricaturist G K Chesterton put it, because of “the significance of outline”. We are here discussing macroeconomics, not the micro variety.

  7. Ken Zimmerman
    February 25, 2020 at 1:16 pm

    Perhaps we’ll get a somewhat broader perspective on probability if we look outside economics. In recent decades, probabilistic risk assessment (PRA) has become an essential tool in risk analysis and management in many industries and government agencies. The origins of PRA date to the 1975 publication of the U.S. Nuclear Regulatory Commission’s (NRC) Reactor Safety Study led by MIT professor Norman Rasmussen. The “Rasmussen Report” inspired considerable political and scholarly disputes over the motives behind it and the value of its methods and numerical estimates of risk. The Report’s controversies have overshadowed the deeper technical origins of risk assessment. Nuclear experts had long sought to express risk in a “figure of merit” to verify the safety of weapons and, later, civilian reactors. By the 1970s, technical advances in PRA gave the methodology the potential to serve political ends, too. The Report, it was hoped, would prove nuclear power’s safety to a growing chorus of critics. Subsequent attacks on the Report’s methods and numerical estimates damaged the NRC’s credibility. PRA’s fortunes revived when the 1979 Three Mile Island (TMI) accident demonstrated PRA’s potential for improving the safety of nuclear power and other technical systems. Nevertheless, the Report’s controversies endure in mistrust of PRA and its experts. This has not stopped nuclear and other experts from applying PRA. The latest report was published in 2016.

    PRA grew out of concerns that the nuclear industry remained wedded to a “deterministic analysis” and a redundant “defense-in-depth” approach that downplayed the role of risk assessment in safety evaluations. Regulators using a deterministic approach simply tried to imagine “credible” mishaps and their consequences at a nuclear facility and then required the defense-in-depth approach—layers of redundant safety features—to guard against them. Before TMI no severe accidents that melted the core of a plant had ever occurred, and no sure way existed to calculate the probability of a major accident. NRC experts used their collective judgment to determine what accidents were credible, and the agency often mandated multiple safety systems to compensate for the uncertainty of an accident’s probability and consequences. This approach had worked well in protecting public safety; defense-in-depth was critical in preventing sizable releases of the most dangerous forms of radiation at TMI. However, the defense-in-depth approach was not effective in prioritizing accidents or in judging when an extra, often expensive, safety system produced a commensurate increase in margins of safety. Proponents argued that a PRA, with its much more detailed use of probabilities and modeling of plant and human behavior, could better deal with such issues.

    After it was concluded that “fool-proof” devices for the safety of nuclear reactors could not be created, research on how components and systems of nuclear reactors could fail became the focus. This involved fault trees applying Boolean logic, which eventually became the “go to” approach for assessing risk for nuclear reactors. With digitization and ever more powerful computers, fault trees and Boolean diagrams became ever bigger, more detailed, and more quickly and easily applied. Now these are buried within “apps” used at every reactor in the US, and at most reactors around the world. With many reactors coming to the end of their operating life, the effectiveness of these apps is about to be tested, severely.
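
    To make the fault-tree logic concrete, here is a minimal sketch assuming independent basic events (real PRA tools model dependence, common-cause failures, and vastly larger trees; the component names and probabilities below are hypothetical):

    ```python
    # A minimal fault-tree sketch using Boolean gate logic, assuming
    # independent basic events. All probabilities are illustrative.
    from math import prod

    def and_gate(*probs: float) -> float:
        """All inputs must fail: multiply the failure probabilities."""
        return prod(probs)

    def or_gate(*probs: float) -> float:
        """Any one input failing suffices: complement of all surviving."""
        return 1 - prod(1 - p for p in probs)

    # Hypothetical top event: loss of cooling. Each redundant pump fails
    # if its motor OR its power supply fails; cooling is lost only if
    # BOTH pumps fail (defense-in-depth).
    pump_a_fails = or_gate(1e-2, 1e-3)   # motor, power supply
    pump_b_fails = or_gate(1e-2, 1e-3)
    top_event = and_gate(pump_a_fails, pump_b_fails)

    print(f"P(loss of cooling) = {top_event:.1e}")  # ~1.2e-04
    ```

    The same Boolean structure that encodes defense-in-depth thus also yields the single “figure of merit” that the nuclear experts described above were after.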
