
Econometric testing

from Lars Syll

Debating econometrics and its shortcomings, yours truly often gets the response from econometricians that “ok, maybe econometrics isn’t perfect, but you have to admit that it is a great technique for empirical testing of economic hypotheses.”

But is econometrics — really — such a great testing instrument?

Econometrics is supposed to be able to test economic theories. But to serve as a testing device you have to make many assumptions, many of which themselves cannot be tested or verified. To make things worse, there are also only rarely strong and reliable ways of telling us which set of assumptions is to be preferred. Trying to test and infer causality from (non-experimental) data you have to rely on assumptions such as disturbance terms being ‘independent and identically distributed’; functions being additive, linear, and with constant coefficients; parameters being ‘invariant under intervention’; variables being ‘exogenous’, ‘identifiable’, ‘structural’, and so on. Unfortunately, we are seldom or never informed of where that kind of ‘knowledge’ comes from, beyond referring to the economic theory that one is supposed to test. Performing technical tests is of course needed, but perhaps even more important is to know — as David Colander put it — “how to deal with situations where the assumptions of the tests do not fit the data.”

That leaves us in the awkward position of having to admit that if the assumptions made do not hold, the inferences, conclusions, and testing outcomes econometricians come up with simply do not follow from the data and statistics they use.

The central question is “how do we learn from empirical data?” Testing statistical/econometric models is one way, but we have to remember that the value of testing hinges on our ability to validate the — often unarticulated technical — basic assumptions on which the testing models build. If the model is wrong, the test apparatus simply gives us fictional values. There is always a strong risk that one turns a blind eye to some of those non-fulfilled technical assumptions that actually make the testing results — and the inferences we build on them — unwarranted.
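To make the point concrete, here is a minimal simulation sketch (the setup and numbers are illustrative assumptions, not anything taken from the literature discussed here): regress one pure random walk on another, completely unrelated one, and apply the standard t-test as if the disturbances really were independent and identically distributed.

```python
# A minimal sketch (illustrative assumptions only): two independent random
# walks have no relationship, yet an OLS t-test that presumes i.i.d.
# disturbances 'finds' one most of the time; the classic spurious regression.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 100, 2000
rejections = 0

for _ in range(reps):
    x = np.cumsum(rng.normal(size=n))    # random walk regressor
    y = np.cumsum(rng.normal(size=n))    # unrelated random walk "dependent" variable
    X = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - 2)
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    rejections += abs(beta[1] / se) > 1.96   # nominal 5% two-sided test

print(f"rejection rate: {rejections / reps:.2f}")   # many times the nominal 0.05
```

If the i.i.d.-and-stationarity assumption held, the printed rejection rate would be close to 0.05; because it does not, the test ‘finds’ significant relationships in pure noise, which is exactly the sense in which the testing outcome does not follow from the data.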

Haavelmo’s probabilistic revolution gave econometricians their basic framework for testing economic hypotheses. It still builds on the assumption that the hypotheses can be treated as hypotheses about (joint) probability distributions and that economic variables can be treated as if pulled out of an urn as a random sample. But as far as I can see economic variables are nothing of that kind.

I still do not find any hard evidence that econometric testing uniquely has been able to “exclude a theory”. As Renzo Orsi put it: “If one judges the success of the discipline on the basis of its capability of eliminating invalid theories, econometrics has not been very successful.”

Most econometricians today … believe that the main objective of applied econometrics is the confrontation of economic theories with observable phenomena. This involves theory testing, for example testing monetarism or rational consumer behaviour. The econometrician’s task would be to find out whether a particular economic theory is true or not, using economic data and statistical tools. Nobody would say that this is easy. But is it possible? This question is discussed in Keuzenkamp and Magnus (1995). At the end of our paper we invited the readers to name a published paper that contains a test which, in their opinion, significantly changed the way economists think about some economic proposition … What happened? One Dutch colleague called me up and asked whether he could participate without having to accept the prize. I replied that he could, but he did not participate. Nobody else responded. Such is the state of current econometrics.

Jan Magnus

  1. gerald holtham
    January 20, 2023 at 10:40 pm

    Rational expectations is disproved by the fact that models incorporating it do not fit the data as well as the same models without the RE restrictions. They fit and forecast worse than the unrestricted reduced form.
    Hendry disproved Friedman’s assertion that the velocity of circulation was stable and thereby showed that monetarism had no firm empirical basis.
    Friedman’s assertion that the stability of the savings rate over time proved that the rate of consumption is stable at different income levels is disproved by cross section analysis showing that savings rates vary strongly across income cohorts.
    The problem is not the absence of disproof but the fact that the profession ignores empirical evidence. I know Lars is well meaning and doesn’t mean to serve the dark side but by attacking such techniques as we have for testing theories he supports the theological tendency in economics that prefers doctrine to evidence.

    • rsm
      January 21, 2023 at 8:54 pm

      Sure, but is the margin of error wide enough that the consensus is ultimately swayed by rhetoric, not data?

      《Most generally, noise makes it very difficult to test either practical or academic theories about the way that financial or economic markets work. We are forced to act largely in the dark.》 – Fischer Black in “Noise”

    • rsm
      January 23, 2023 at 4:41 am

      Why doesn’t the story that includes disproven rational expectations naturally lead to arbitrary prices, inflation seen as a power play, and a policy of fully indexing the economy?

  2. Steven Klees
    January 21, 2023 at 10:12 pm

    Gerald Holtham’s comment accuses Lars of serving the “dark side” in his critique of econometric testing. To the contrary, Lars did not go quite far enough. His critique of the assumptions of models that can be statistically tested was spot on, but he left room by implying that under some circumstances “testing econometric models is one way” to proceed. The unfortunate truth is that econometric models NEVER fulfill the underlying statistical assumptions that Lars lays out. That is obvious from their translation into the three substantive assumptions necessary to trust regression coefficients: (1) that all relevant variables are in the model; (2) that each of them is measured “correctly”; and (3) that we know the proper functional form. These conditions never hold, all econometric models are grossly misspecified, all regression coefficients are biased to an unknown extent, and the empirical debates about whose specification is “better” are interminable. As Lars, Orsi, and Magnus point out, there is not one question that has been resolved by econometrics, not one theory that has been invalidated (Holtham’s beliefs notwithstanding – which are obviously belied by those who still hold on to rational expectations theory).

    Econometrics is a pseudoscience within the pseudoscience of neoclassical economics (or a parascience, as the late, great political economist Samir Amin called it, likening it to parapsychology). Ed Leamer foretold this decades ago in his “Let’s take the ‘con’ out of econometrics” AER article. Unfortunately, the emperor truly has no clothes. We can learn from empirical data, but we are stuck with arguing from crosstabs, which is messy, but it’s all the face-valid data we’ve got. Torturing those correlations to yield causal impact estimates is neither possible nor sensible.
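To illustrate Klees’s condition (1) above, here is a minimal simulation sketch (the variables and numbers are assumptions chosen purely for illustration): omitting a relevant variable that is correlated with the included regressor biases the estimated coefficient, and nothing in the regression output itself flags the problem.

```python
# A minimal sketch (illustrative assumptions only): omitted-variable bias.
# The true effect of x on y is 1.0, but omitting z, which is correlated
# with x, roughly doubles the estimated coefficient on x.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
z = rng.normal(size=n)                        # relevant variable, later omitted
x = 0.8 * z + rng.normal(size=n)              # regressor correlated with z
y = 1.0 * x + 2.0 * z + rng.normal(size=n)    # data-generating process

X_full = np.column_stack([np.ones(n), x, z])  # correctly specified
X_short = np.column_stack([np.ones(n), x])    # z omitted (misspecified)

b_full = np.linalg.lstsq(X_full, y, rcond=None)[0]
b_short = np.linalg.lstsq(X_short, y, rcond=None)[0]

print(f"coefficient on x, z included: {b_full[1]:.2f}")   # close to the true 1.0
print(f"coefficient on x, z omitted:  {b_short[1]:.2f}")  # close to 2.0, badly biased
```

Both regressions produce tidy standard errors; only knowledge from outside the regression tells you which coefficient to believe.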

    • yoshinorishiozawa
      January 24, 2023 at 5:51 pm

      Interesting comment! Yes, “Lars did not go quite far enough.”

      Readers are invited to read Steven Klees’s article “Neoclassical Economics is Dead. What Comes Next?” in Evonomics: The Next Evolution of Economics (Oct 31, 2020). Although I do not agree with all of his opinions and judgments, he has a very wide view on all aspects of economics, the economy, education, and policy. It is marvelous.

      Edward E. Leamer’s “Let’s take the ‘con’ out of econometrics” (it is Lars’s favorite, too) is not bad, but the problem with his economics is that he continued to believe in the usefulness of the Heckscher-Ohlin framework, which is nothing other than typical neoclassical economics in international trade.

    • Meta Capitalism
      January 27, 2023 at 6:18 am

      Just got your book Steven, look forward to reading it. Also, thank you YS for pointing his article out.

      • Steven Klees
        January 31, 2023 at 12:03 am

        I’ve been away but thanks Yoshinori for the kind words and Meta, I hope you like the book!

  3. gerald holtham
    January 22, 2023 at 5:11 pm

    Rational expectations? I repeat “The problem is not the absence of disproof but the fact that the profession ignores empirical evidence.”
    Leamer said let’s take the con out of econometrics. Fair enough; there’s been plenty of it. He did not say let’s take the metrics out of econ.
    Steve Klees points to resolvable problems with the machinery to defend a Luddite approach. Some of us are more ambitious.
    Rhetoric works better in persuasion? Yes, afraid so. Empirical victories are almost always won laboriously on points, seldom if ever by a clean knock-out.

    • Meta Capitalism
      January 22, 2023 at 10:20 pm

      By the second half of the twentieth century, these celebrated social scientists found themselves firmly entrenched in a growing and powerful governmental apparatus. Those who labored in the highest echelons of government were not reluctant to admit that they worked “in the field of values as well as that of fact and theory.” But the history that had privileged them also deceived. The world of politics and power ultimately took matters out of their hands, and they, determined to remain true to a professional and an intellectual code that had grown only more venerable with time, assiduously cultivated alternatives of an altogether different sort. These “humble, competent people” applied themselves to these new agendas with the same capability and determination that had distinguished the efforts of generations of their forebears. Yet this time there was a difference, for in their scholastic introversion, in their thoughtless alliance with new business elites determined to use public policy for private rather than communal ends, in their pursuit of a desiccated market paradigm [market fundamentalism] that ran the risk of rendering the very foundations of a democratic and free commerce as themselves objects of profit making and accumulation, this new generation of specialists refashioned a social science apparently disconnected from and seemingly unengaged with the social and political world in which they lived. (Bernstein 2004, 184) (Bernstein, Michael A. A Perilous Progress [Economists and Public Purpose in Twentieth-Century America]. Oxford: Princeton University Press; 2004; p. 184.)
      .
      As Amartya Sen [2000: 143] has it: “The far-reaching powers of the market mechanism have to be supplemented by the creation of basic social equity and justice.” (Lars Pålsson Syll 2016, 136, in On the Use and Misuse of Theories and Models in Mainstream Economics.)
      .
      A society where we allow the inequality of incomes and wealth to increase without bounds, sooner or later implodes. A society that promotes unfettered selfishness as the one and only virtue, erodes the cement that keeps us together, and in the end we are only left with people in the ice cold water of egoism and greed. (Lars Pålsson Syll 2016, 163, in On the Use and Misuse of Theories and Models in Mainstream Economics.)

      .
      Despite Gerald Holtham’s lip service to valid forms of evidence other than econometrics (statistical analysis), his rhetoric repeatedly and implicitly dismisses the fact that even in experimental studies the qualitative methods that provide insight into deeper causes are frequently backgrounded to highlight formal mathematical modeling, in what can only be described as an appeal to the authority of being “scientific.” We repeatedly see this haughty attitude in his choice of words, which is meant to create rhetorical caricatures of such qualitative methodologies as mere anecdote and literature:

      Understanding any historical episode requires attention to all the particularities of the situation – as historians do. The question of which of the factors at play represent more general tendencies and could recur is not trivial…. [Y]ou cannot analyse that data except statistically. Everything else is anecdote and literature. (Gerald Holtham Trivializing Science (aka espousing scientism), RWER, 2/8/2020)

      .
      The researcher proposes, and data disposes. ~ Gerald Holtham Fetishizing Statistics, RWER, 2/11/2020

      .
      Repeatedly on this blog Gerald has expressed the sentiment that if evidence is not obtained through controlled experiments or careful statistical analysis [i.e., econometrics], then we can only give up and just believe what we want to believe. Implicit in such statements—despite remonstrations to the contrary—is the presupposition “that the application of the methods of natural science is the yardstick for social science. This is scientism.”
      .
      Gerald seems to want to have his cake and eat it too. Why is that?
      .

      I am at one with Lars in being sure it cannot extract causation, merely (weakly) test theories of it. But it is also used in the mundane business of forecasting, without pretending to have a causal theory or have a true understanding of an underlying system. (….) This is not science; one is looking not for causes but for leading indicators. One’s confidence in projections is limited but the results are likely to be better than guesses not informed by a careful inspection of persistent relationships in the data. I don’t know why reality permits this sort of exercise but I know it often works because I have made money from doing it! (Gerald Holtham, RWER, Not Science, but Bread-n-Butter Living, 1/10/2023)
      .
      Upton Sinclair perhaps said it best when he observed that “it is difficult to get a man to believe something when his salary depends upon him not believing it.” (McIntyre 2018, 39-40)

      .
      Gerald is loose with his use of terms (e.g., theological) when what he really means to accuse Lars of is being ideological in the “narrow meaning of ideology”: a “false and illusory idea which promotes a specific political or ethical value. By ‘false and illusory’ it is meant that the idea is contradictory to, or not grounded in, observed facts. Divorced from reality, ideology can hardly then be proved true or false. It is just believed to be true or false without significant evidence to support that belief (Lynne Chester, Tae-Hee Jo 2022).” Chester & Jo call this form of ideology “Ideology I.”
      .
      But ideology has various meanings depending on its contextual use. It can be used to describe what in other terms is called a “worldview,” (Chester & Jo 2022). In this sense ideology is compatible with science. This is called “Ideology II” by Chester and Jo. And then there is a third form they designate “Ideology III” which is a form of “‘scientific ideology,’ or ‘pseudo-science,’ because the core mainstream economic doctrines, relying upon mathematical logic, either ignore facts or selectively (mis)use facts to fit them into an ordained ideological vision of the world (Chester & Jo 2022).”
      .
      Gerald fetishizes “the law of large numbers” and so-called statistical data as the only “scientific” means of obtaining knowledge (despite his claims not to do so) and dismisses and creates false caricatures of other qualitative methods regularly used in the social sciences.
      .
      I know Gerald is well meaning and doesn’t mean to serve the dark side but by dismissing such qualitative techniques as mere “anecdote and literature” he espouses a form of Ideology III, which truly can be characterized as a form of religion qua theological belief framework that prefers doctrine to evidence.

      • Meta Capitalism
        January 23, 2023 at 1:47 pm

        I sure messed the links up on this one. Sorry folks.

  4. Laurent Leduc
    January 23, 2023 at 3:30 pm

    Thank you for such a thoughtful response, Meta Capitalism. Especially for the Chester and Jo reference.

    I was struck throughout this discussion by the question, in what ways is this relevant for the past, present and future responses to the pandemic?

    • Meta Capitalism
      January 24, 2023 at 7:39 am

      Thank you, Laurent. Your question is an interesting one. I am somewhat unsure how to approach it though. On the one hand the pandemic made very obvious how neoliberal capitalism’s assumptions about the benefits of globalization and its consequent deindustrialization of the United States (this is mainly where I have lived and worked, so I will use it as an exemplar, although it is somewhat of an outlier in my view) left our society vulnerable to global value[less] chains. The pandemic also highlighted how those who were considered least in our capitalist society turned out to really be “essential workers” who heroically exposed themselves to the virus doing their jobs to keep essential necessities of life available to the better off. Much could be said about the kinds of ills in US society that the pandemic laid bare.

  5. Gerald Holtham
    January 23, 2023 at 8:01 pm

    Meta ascribes to me a series of beliefs that I do not hold, mainly on the basis of emphasising one rhetorical quote and dismissing all my remarks to the contrary as “lip service”. Remember that the context for this discussion is not me proposing that statistics answers all questions; I have never asserted that. Actually, much of the work I have done in economics has used qualitative methods. The context is Lars criticising the use of econometrics, and indeed of pilot studies, in testing propositions in economics. The criticisms are not, like Leamer’s, of bad practice, which would be fair enough. They are of the methods themselves, holding that they are intrinsically unable to answer important questions. There are obviously important questions that they can’t answer. But the argument that they can’t answer any important questions depends on Lars’ extreme criteria for what it means to “answer” a question. My “scientism” does not lead me to think I can establish anything with complete certainty. Yet that is what Lars demands as the basis for his criticism.
    Like Lars, Meta is vague about the “other qualitative methods that give insight into deeper causes”. Perhaps we agree about them but since no-one ever says what they are, how can we tell?

    • Meta Capitalism
      January 23, 2023 at 9:55 pm

      Gerald interprets Lars’s critiques of the misuse of econometrics and mathematical model building by the field of economics as “extreme criteria,” yet when one examines actual historical cases Lars’s critique fits like a glove. As Roi makes clear in his “little vignette,” there are real-world consequences when “mathematical claims do not quite work as descriptions of reality.”

      Clearly, trained mathematicians as well as physicists and economists played a role in being overly confident in their mathematical models during the Global Financial Crisis (GFC). To understand why is to understand Lars’ argument, which is similar to Roi’s argument of how mathematical sense can be broken. And this includes statistical analysis (e.g., econometrics). To highlight these examples is neither a condemnation of mathematics nor of mathematicians (nor of physicists or economists for that matter) but just a recognition of a similar pattern in the past and present.

      Mill was speaking to this very issue despite how “quaint” or old-fashioned his language may appear to us today. In the GFC the mathematical models were indeed divorced from the underlying reality of the actual practices of banks, ratings agencies, mortgage brokers, etc. The models and their many assumptions were wrong because they could not account for the human element in social behavior that was not mathematically tractable, and because many turned a blind eye to other sources of evidence that clearly showed that the models were divorced from reality.

      I honestly don’t understand what is confusing about Mill’s, Spiegler’s, Roi’s, or Lars’ arguments when they make the same points drawing on both recent and historical events. Spiegler goes on to offer additional corrective methodologies, such as a continual immersion in the real-world practices of the given domain under consideration (e.g., participant observation, case studies, and personal direct experience reported by participants, etc.) that many other fields within the social science domain use.

      I do, from where I am sitting, think Gerald is missing the point of Spiegler’s argument, which is in many ways similar to Roi’s and Lars’ argument. In the 1990s, with the collapse of the Soviet Union, many ex-Soviet citizens who were brilliant physicists immigrated to the US and took up jobs as “quants” in the financial industry and helped, along with many others, to create these financial monstrosities of mathematical complexity that Warren Buffett called “weapons of mass destruction.” They were paid to create them, proud of creating them, and invested in believing they were right, and this combined to create perverse incentives to turn a blind eye to other types of evidence.

      A few sounded the alarm regarding these weapons of mass destruction, and in more than a few cases they did so based upon an audit examination of the actual underlying quality of the mortgage loans that these mortgage-backed securities (MBS) were based upon. In forensic accounting (a field I am very familiar with) one uses statistical analysis to examine various financial performance metrics in an attempt to discover, as Gerald has noted elsewhere, “leading indicators” of where to look for “deeper causes” of potential financial shenanigans. Such financial metrics are really just surrogates for things like revenue, earnings, cash flow, accounts receivable, inventory management, and liquidity and solvency risks.

      But such statistical analysis is too shallow to be trusted as evidence of the actual quality of the underlying “deeper” financial health of a given company. Metrics can be manipulated and used to misrepresent the actual financial health of a company (or a given financial product, like MBSs). And the only way to get at the “deeper causal” realities is to put boots on the ground and do, for example, an actual physical inventory audit, or an examination of the actual source material (bank transactions, etc.), rather than merely depending on a statistical analysis of financial metrics.
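A toy sketch of the kind of screening described above (the firms and figures are invented for illustration, not drawn from any real audit): a surrogate metric such as receivables growing much faster than revenue only tells the examiner where to dig; the digging itself is the qualitative, boots-on-the-ground part.

```python
# Illustrative screening sketch with invented numbers: flag firms whose
# accounts receivable grow much faster than revenue, a classic leading
# indicator of possible revenue manipulation that still has to be confirmed
# against the underlying invoices, contracts, and cash receipts.
firms = {
    # name: (revenue growth %, accounts-receivable growth %)
    "Firm A": (8.0, 9.5),
    "Firm B": (5.0, 42.0),
    "Firm C": (12.0, 11.0),
}

THRESHOLD = 20.0  # receivables outpacing revenue by more than 20 points

for name, (rev_growth, ar_growth) in firms.items():
    gap = ar_growth - rev_growth
    verdict = "INVESTIGATE" if gap > THRESHOLD else "ok"
    print(f"{name}: receivables-revenue growth gap = {gap:+.1f} pts -> {verdict}")
```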

      The further away one gets from the original source material, the less trustworthy the so-called statistical evidence that relies on surrogate metrics becomes. In the case of the Subprime Crisis, it was those who, after being tipped off by some quick statistical analysis, went a step further and became experientially and intimately familiar with the underlying “deeper causal” realities by engaging in a little good old anthropological footwork:
      .

      In fact, Wittgenstein’s own ideas on this matter are influenced by what he called Sraffa’s “‘anthropological way’ of seeing philosophical problems” (Monk 1990: 261). Sraffa’s anthropological way consists in observing how rules are exteriorized or exhibited in observable human practices, which remain the basis for an analysis of social reality. This is in line with Sraffa’s method of focusing on observable entities as a basis for analyzing social reality. The letters between Sraffa and Wittgenstein highlight this issue well (McGuiness 2008). (Chester & Jo 2022)

      .
      So, to Gerald’s claim that Lars, or I, or the many authors published by the WEA have been vague about the “other qualitative methods,” I can only say: bullshit. You cannot make Gerald see what his bread-n-butter livelihood requires him to at best pay lip service to, or at worst turn a blind eye to, while pretending the evidence is “vague” when it most certainly is anything but vague.

      • rsm
        January 24, 2023 at 11:44 pm

        Does the story you tell of the Great Financial Crisis leave out the irrational, spreading panic that arbitrarily devalued perfectly fine mortgage-backed security assets well in excess of actual defaults, which have been higher since (during the pandemic) without causing the same crisis?

        In other words, are you ascribing blame to models that were actually good despite panicking humans fickly ignoring them due to ultra-pessimistic rumors of defaults rising to a level they never attained?

  6. Gerald Holtham
    January 25, 2023 at 7:30 pm

    I was wrong. Lars and Meta have not been vague about other qualitative methods for testing theories. That would imply they had said anything intelligible about them at all. My question is very simple: what are these methods? List them. Point to examples of their successful use.
    I will guarantee, before you start, that using Lars’s own arguments I can prove that every one of them is defective. How can we be sure their conclusions are truly general? How can we know there are not factors we have ignored, etc., etc.? Lars sets standards of certainty that are not achievable by any empirical method in social studies.
    Meta does not seem to realise that he and Lars are operating at different levels of abstraction. It is fine to criticize bad models, bad econometrics. All Meta’s points and examples are of that kind. I’m fine with that. But Lars elevates practical objections to things being badly done to the level of philosophy and claims these methods are intrinsically fundamentally flawed (limited would be a better word). Well, in his terms, yes they are. But so are all conceivable methods. That’s the point. When Meta calms down perhaps he’ll get it.

    RSM: No, the models in use pre-GFC weren’t any good. They were rubbish, as many of the heterodox knew. Minsky’s model fitted the situation pretty well. He expounded it verbally but it was a model, and Steve Keen and others have formalised it. It has never been applied by any official institution. What is truly tragic is that the failed models are still in use in central banks – economists again proving themselves impervious to evidence. That is what we should be attacking.

    • Meta Capitalism
      January 26, 2023 at 4:30 am

      It is a red herring to claim that Lars or I don’t understand that _all_ methodologies have limitations. Gerald puts words into Lars’s (and my) mouth that we never said. He is attempting to misdirect the issue to a claim that was never made. He is eliding the point; Lars’s critique is on point; Gerald reads into it what is not there simply because he doesn’t like it.

    • yoshinorishiozawa
      January 27, 2023 at 12:07 pm

      Gerald, it is good that you came to acknowledge that Lars Syll has no alternatives. It is not wise to object point by point to what Meta contends. His contentions have no real content.

      Lars Syll almost always repeats the same argument. One target is econometrics. Another is mathematics or deductive reasoning. His accusation is almost always mistargeted. It is true that we have a plethora of econometrics. It is also true that we have a plethora of mathematics. But it is wrong to reject econometrics and mathematics from economics. What we need is an alternative economics that can supersede and replace the actual mainstream economics. Lars Syll always confuses theory and models, probably because he does not know what a theory is and how theoretical research proceeds. As long as we do not have any good macroeconomic theory, it is inevitable that we cannot obtain any good econometric models of an economy as a whole.

      It is wrong to accuse econometrics on the grounds that we do not yet have good econometric models that fit the facts well. It is wrong to accuse theory itself on the grounds that we do not yet have a good economic theory.
      A theory is not given from heaven. We must find it, or we have to create it. Lars Syll lacks this attitude. His methodology works only to accuse or refute existing theories. It is easy to do that, but accusation alone is as reactive as defending the actual theories that have failed.

  7. rsm
    January 25, 2023 at 10:12 pm

    Gerald,

    Was Minsky’s balance sheet model effectively used by shadow banks to create derivatives and insurance that they knew could easily be bailed out by the Fed in a panic, or risk debit cards stopping working? Can we start there?

    Are those derivatives and insurance pieces still dominant today, having survived the pandemic?

    Is the central bank interest-tightening model really a head fake because banks profit more from high interest rates by making more loans, thus increasing the money supply despite the Fed’s ostensible intentions? (See https://fedguy.com/credit-boom/ )

    Why are self-styled heterodox economists not using balance sheet models (much like fedguy and Mehrling do)?

  8. Gerald Holtham
    January 27, 2023 at 12:54 pm

    Rsm

    I don’t think people speculate in the expectation they will get bailed out. They either underestimate the risk or assume they can exit before the herd. Derivatives that fell out of favour after the GFC are creeping back into use but most institutions are now required to have bigger capital reserves or buffers than they did. Unfortunately regulation tends to follow a Minsky-sequence cycle too. The New Deal regs were undone in the 80s and 90s and the post 2008 ones are under pressure from the political Right.
    Banks’ margins do rise as interest rates go up but credit growth tends to fall because demand for credit falls. Borrowers also become less credit-worthy so banks hesitate and spreads widen.
    Some are. Godley and Cripps at Cambridge UK started building stock-flow consistent models decades ago. Neo-classical models tend to avoid any institutional detail and do not focus on the financial sector. Money is a response to uncertainty and doesn’t easily fit in GE models.
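For readers unfamiliar with the stock-flow consistent tradition Holtham refers to, here is a minimal sketch of a textbook-style toy model in that spirit (the parameter values are assumptions chosen purely for illustration; this is not Godley and Cripps’s actual model): government spending injects money, taxes drain it, and household wealth accumulates period by period until flows and stocks are mutually consistent.

```python
# A minimal sketch of a toy stock-flow consistent model (illustrative
# parameters only). Every flow accumulates into a stock: the household
# sector's saving adds to its wealth, which feeds back into consumption.
alpha1, alpha2 = 0.6, 0.4    # propensities to consume out of income and out of wealth
theta, G = 0.2, 20.0         # tax rate and government spending
H = 0.0                      # household wealth (the counterpart of government debt)

for t in range(1, 61):
    # Output solves Y = C + G with C = alpha1*(1 - theta)*Y + alpha2*H_prev
    Y = (G + alpha2 * H) / (1 - alpha1 * (1 - theta))
    T = theta * Y                   # taxes
    YD = Y - T                      # disposable income
    C = alpha1 * YD + alpha2 * H    # consumption
    H += YD - C                     # wealth accumulates the household surplus
    if t in (1, 10, 60):
        print(f"t={t:2d}  Y={Y:7.2f}  H={H:6.2f}")

# Output converges to G/theta (100.0 here), the model's long-run "fiscal stance".
```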

  9. Gerald Holtham
    January 31, 2023 at 4:58 am

    Meta, why not answer my question? What are these other methods for testing theories?

    • Meta Capitalism
      March 8, 2023 at 6:47 pm

      [W]ithout [statistical analysis] there is no progress beyond very broad and vague generalisations. Establishing the limits of a generalisation and the necessary qualifications entails confronting it with data…. Given the data sets we have that is impossible without resort to statistical analysis. (Gerald Holtham, RWER, Given the Data Sets, 2/7/2020, See response.)
      .
      Understanding any historical episode requires attention to all the particularities of the situation – as historians do. The question of which of the factors at play represent more general tendencies and could recur is not trivial. Of course, one has to frame causal hypotheses based on one’s knowledge of the world…. But when it comes to test the generality of the hypothesis you need a lot of data given the complexity of social systems. And you cannot analyse that data except statistically. Everything else is anecdote and literature. (Gerald Holtham, Some People (aka Gerald Holtham) Are Saying …, RWER), 2/8/2020, emphasis added)
      .
      Some people don’t think that studying human societies admits of a “scientific” approach, i.e. the search for regularities or patterns in events that depend on general principles. Every event is unique with multiple causes, as any historian will tell you. (Gerald Holtham, RWER, Some people Are Saying, 3/6/2023)

      .
      Gerald claims his historical comments on RWER are being misinterpreted on the basis of “one rhetorical quote” and all his remarks to the contrary are being ignored. In reality nothing could be further from the truth. Gerald has repeatedly made statements on RWER that are at face value self-contradictory.
      .
      Which of Gerald’s comments are rhetoric and which represent his “true beliefs” is beside the point, but it is fair to put them in stark contrast and point out that on face value it seems Gerald wants to have his cake and eat it too. In reality there are many (far too many to list here) so-called “rhetorical” statements made by Gerald that stand in stark contradiction to his other more balanced and reasonable comments that admit the limits of statistical analysis and the need for qualitative evidence.
      .
      Gerald elides his comments below that assert that statistical analysis is the only way to make progress in understanding economics and/or social systems or that any other methods are at best mere “anecdote and literature” or at worst “conspiracy theories.”
      .
      Gerald rhetorically shifts the meanings of what I said, which was that he has made self-contradictory statements, to the false claim that I stated he was “proposing that statistics answers all questions.” That is simply false.
      .
      That Gerald has made numerous self-contradictory comments on RWER is a fact, and to point them out with his own words side-by-side is neither to misrepresent him nor to be unfair. Ironically, in this very post Gerald does it again, stating, on the one hand, that in much of his own work in economics he “has used qualitative methods,” but on the other hand, “My question is very simple: what are these methods? List them. Point to examples of their successful use.”
      .
      I note that all throughout RWER these methods have been extensively discussed in many of the books published by the WEA, and I have extensively quoted them and other sources as well that document these methodologies, most recently citing Spiegler and Leontief.
      .
      Yet, Gerald will “guarantee,” before we even look at the evidence, that he “can prove that every one of them is defective.” Of course, no method is perfect, they all come with limitations, and again, Gerald has admitted as much in other comments. And one can prove (or disprove) anything with statistical analysis if one tortures the data enough and one has the requisite “guarantee” to do so before the evidence is allowed to speak for itself.
      .
      Gerald rails against Leontief and Spiegler, whose evidence has been presented, and then in the same diatribe admits to using similar qualitative methods in “much of the work” he has done.
      .
      Assuming Gerald isn’t using vague “anecdotes and literature” or “conspiracy theories,” one can only assume he is using the similar qualitative methods advocated by Spiegler, Leontief, and many other professional economists. Yet he pretends he doesn’t know what these methods are. This is the thinnest of pretenses and revealing to say the least.

      • Meta Capitalism
        March 10, 2023 at 3:31 am

        Addendum: On January 22, 2021, Lars Syll posted Leontief’s devastating critique of econom(etr)ics on Real World Economic Review (RWER).
        .
        Gerald Holtham commented:
        .

        Leontieff was right and David Freedman was wrong…. One big problem with economics is inattention to rigorous empirical testing. (Gerald Holtham, RWER, Comment On Leontief’s Nonobserved Facts, 1/22/2021)

        .
        In the article (fully available via Lars’s link in the original post) Leontief, among other critical insights, wrote:
        .

        To penetrate below the skin-thin surface of conventional [econometric mathematical models], it will be necessary to develop a systematic study of the structural characteristics and functioning of [economic targets, e.g., households, etc.], an area in which description and analysis of social, anthropological and demographic factors must obviously occupy the center of the stage. (Leontief 1971, 4, emphasis added.)
        .
        (…) An exceptional example of a healthy balance between theoretical and empirical analysis and the readiness of professional economists to cooperate with experts in the neighboring disciplines is offered by Agricultural Economics as it developed in this country over the last fifty years…. Preoccupation with the standard of living of the rural population has led agriculture economists into collaboration with home economists and sociologists, that is, with social scientists of the “softer” kind. While centering their interest on only one part of the economic system, agricultural economists demonstrated the effectiveness of systematic combination of theoretical approach with detailed factual analysis. They were also the first among economists to make use of the advanced methods of mathematical statistics. However, in their hands, statistical inference became a complement to, not substitute for, empirical research. (Leontief 1971, 5, emphasis added.)

        .
        Clearly, Leontief distinguished empirical research from statistical inference. He then called for cross-disciplinary cooperation with sociologists, anthropologists, and demographers, which means drawing on the appropriate professional methodologies they use, some of which are qualitative; that is what he means by being engaged in empirical research on observable human behavior and phenomena. One has to be deaf, dumb, and blind not to recognize this simple reality. These are the same qualitative methods referred to by Spiegler et al. So for Gerald, a professional econometrician who claims to have used qualitative methods in his work, to pretend he doesn’t know what these methods are is pure pretentious hogwash and a most disingenuous form of rhetoric.
