Why econometrics still hasn’t delivered (wonkish)

from Lars Syll

In the article The Scientific Model of Causality, renowned econometrician and Nobel laureate James Heckman writes (emphasis added):

A model is a set of possible counterfactual worlds constructed under some rules. The rules may be laws of physics, the consequences of utility maximization, or the rules governing social interactions … A model is in the mind. As a consequence, causality is in the mind.

Even though this is a standard view among econometricians, it’s – at least from a realist point of view – rather untenable. The reason we as scientists are interested in causality is that it’s part of the way the world works. We represent the workings of causality in the real world by means of models, but that doesn’t mean that causality isn’t a fact pertaining to relations and structures that exist in the real world. If it were only “in the mind,” most of us couldn’t care less.

The reason behind Heckman’s and most other econometricians’ nominalist-positivist view of science and models is the belief that science can only deal with observable regularity patterns of a more or less lawlike kind. Only data matters, and trying to (ontologically) go beyond observed data in search of the underlying real factors and relations that generate the data is not admissible. Everything has to take place in the econometric mind’s model, since the real factors and relations, according to the econometric (epistemologically based) methodology, are beyond reach – allegedly both unobservable and unmeasurable. This also means that instead of treating the model-based findings as interesting clues for digging deeper into real structures and mechanisms, they are treated as the end points of the investigation. Or as Asad Zaman puts it in Methodological Mistakes and Econometric Consequences:

Instead of taking it as a first step, as a clue to explore, conventional econometric methodology terminates at the discovery of a good fit … Conventional econometric methodology is a failure because it is merely an attempt to find patterns in the data, without any tools to assess whether or not the given pattern reflects some real forces which shape the data.

The critique put forward here is in line with what mathematical statistician David Freedman writes in Statistical Models and Causal Inference (2010):

In my view, regression models are not a particularly good way of doing empirical work in the social sciences today, because the technique depends on knowledge that we do not have. Investigators who use the technique are not paying adequate attention to the connection – if any – between the models and the phenomena they are studying. Their conclusions may be valid for the computer code they have created, but the claims are hard to transfer from that microcosm to the larger world …

Given the limits to present knowledge, I doubt that models can be rescued by technical fixes. Arguments about the theoretical merit of regression or the asymptotic behavior of specification tests for picking one version of a model over another seem like the arguments about how to build desalination plants with cold fusion as the energy source. The concept may be admirable, the technical details may be fascinating, but thirsty people should look elsewhere …

Causal inference from observational data presents many difficulties, especially when underlying mechanisms are poorly understood. There is a natural desire to substitute intellectual capital for labor, and an equally natural preference for system and rigor over methods that seem more haphazard. These are possible explanations for the current popularity of statistical models.

Indeed, far-reaching claims have been made for the superiority of a quantitative template that depends on modeling – by those who manage to ignore the far-reaching assumptions behind the models. However, the assumptions often turn out to be unsupported by the data. If so, the rigor of advanced quantitative methods is a matter of appearance rather than substance.

Econometrics is basically a deductive method. Given the assumptions (such as manipulability, transitivity, Reichenbach probability principles, separability, additivity, linearity, etc.) it delivers deductive inferences. The problem, of course, is that we will never completely know when the assumptions are right. Real target systems are seldom epistemically isomorphic to axiomatic-deductive models/systems, and even if they were, we would still have to argue for the external validity of the conclusions reached from within these epistemically convenient models/systems. Causal evidence generated by statistical/econometric procedures like regression analysis may be valid in “closed” models, but what we are usually interested in is causal evidence in the real target system we happen to live in.

Most advocates of econometrics and regression analysis want to have deductively automated answers to fundamental causal questions. Econometricians think – as David Hendry expressed it in Econometrics – alchemy or science? (1980) – they “have found their Philosophers’ Stone; it is called regression analysis and is used for transforming data into ‘significant results!’” But as David Freedman poignantly notes in Statistical Models: “Taking assumptions for granted is what makes statistical techniques into philosophers’ stones.” To apply “thin” methods we have to have “thick” background knowledge of what’s going on in the real world, and not in idealized models. Conclusions can only be as certain as their premises – and that also applies to the quest for causality in econometrics and regression analysis.
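
Freedman’s warning is easy to demonstrate with a small simulation – a hypothetical sketch, not anything drawn from the texts quoted above. Regress two completely independent random walks on each other and count how often ordinary least squares declares the slope “significant.” Granger and Newbold showed in the 1970s that with non-stationary series like these, the nominal five per cent test rejects far more often than five per cent of the time:

```python
import numpy as np

def spurious_significance_rate(n_sims=200, n_obs=100, seed=0):
    """Regress one random walk on another, fully independent one,
    and count how often the OLS slope passes a nominal 5% t-test."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        x = np.cumsum(rng.standard_normal(n_obs))   # random walk
        y = np.cumsum(rng.standard_normal(n_obs))   # independent random walk
        xc = x - x.mean()
        beta = (xc @ y) / (xc @ xc)                 # OLS slope estimate
        resid = y - y.mean() - beta * xc            # OLS residuals
        se = np.sqrt((resid @ resid) / (n_obs - 2) / (xc @ xc))
        if abs(beta / se) > 1.96:                   # nominally "significant"
            hits += 1
    return hits / n_sims

print(spurious_significance_rate())  # typically well above 0.05
```

With no causal or statistical connection whatsoever between the two series, a large majority of the fitted regressions nonetheless look “significant.” The verdict is, as Freedman puts it, valid for the computer code – it says nothing about the world.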

Without requirements of depth, explanations most often do not have practical significance. Only if we search for and find fundamental structural causes can we hope to take effective measures to remedy problems such as unemployment, poverty, discrimination, and underdevelopment. A social science must try to establish what relations exist between different phenomena and the systematic forces that operate within the different realms of reality. If econometrics is to progress, it has to abandon its outdated nominalist-positivist view of science and the belief that science can only deal with observable regularity patterns of a more or less law-like kind. Scientific theories ought to do more than just describe event-regularities and patterns – they also have to analyze and describe the mechanisms, structures, and processes that give birth to these patterns and eventual regularities.

  1. December 27, 2012 at 7:02 pm

    “Scientific theories ought to do more than just describe event-regularities and patterns – they also have to analyze and describe the mechanisms, structures, and processes that give birth to these patterns and eventual regularities.”

    Descriptions aren’t theories. Theories must attempt to explain WHY things happen.

    Here is a very simple “causal” model that explains why every acceleration of bank credit creation must be followed by a damaging crash in which borrowers default with mathematical certainty, not through any fault of their own. The fact that M2 has diverged disproportionately from M1 for the past 30 years is “evidence” from the real world.



    Real science depends on theories being advanced and all attempts being made to refute them. A theory stands for as long as it can’t be refuted or improved upon. Lars, consider yourself, and anyone else you choose, invited to refute the logical arguments and evidence presented at the links above and at this one:


    I don’t claim to be confused or a “failure”.

    I claim to have a very plausible answer that explains real world events, one that no one to date has refuted. Not one economist on this list has taken up the challenge. I guess whining about failure is more fun?

    • December 28, 2012 at 10:49 am

      Paul Grignon, you might be able to develop this idea further by looking at what the banks lend money FOR. Mostly this appears to consist of mortgages, in other words, for land purchase, since the building value represents less than half of the total. In return, the banks become the owner of the land for the duration of the loan. In other words, the interest paid by the borrower is functionally equivalent to rent.

      Thus you could regard a mortgage as a land rental capture device, in which the banks create the credit at minimal cost for the purchase of land which they lease out to the borrower.

      As more and more banks pile into the market, the bubble is created, since it is the banks that have scrambled to acquire the “assets” of which the loans are comprised.

      On this analysis, the design of the banking system is A CONSEQUENCE OF the system of tenure in which land rent is privately appropriated instead of being collected as the principal source of public revenue. Thus the root cause is one step back.

      If this analysis is false, I have yet to see it refuted.

      • December 28, 2012 at 11:16 pm

        “If this analysis is false, I have yet to see it refuted.” Nor have I yet seen refuted: “QE 4 The People”.

        Wray, Black, Hudson, Mosler, where is your profound reply?
        Blogger justaluckyfool said…
        Why MMT frustrates the hell out of me(Justaluckyfool)

        This may be a good thing, but their absolute refusal to defend their position is at times inexcusable. Their silence in the face of challenges to refute or to improve is equally inexcusable.
        With a prayer I ask again, Hudson, Wray, Mosler, Black, Mitchell,
        all others :
        IS IT TRUE?
        (1) QE is a proven method that would allow the FED to purchase unlimited amounts of assets for an unlimited period of time?
        It is an Einstein-style “simple” question.
        (2) QE is presently being used to allow “private for profit banks” (PFPB) to make tremendous profits for their own selfish use?
        Simple question.
        (3) If 1 and 2 are true, would it be true to say, “QE can be used to raise tremendous profits (revenues) for the people for their own selfish use (…to form a more perfect union)”?
        Three simple questions that await a profound answer, not for one but for all.
        Could you, would you, please, either endorse or improve a battle cry:
        “Tax Money, Not Income”, “QE 4 Prosperity”
        Is it true?:
        (1) The Fed can QE unlimited amounts of currency for the purchase of assets for an unlimited period of time (proven by Bernanke with QE 3 (infinity)).
        (2) The Fed can turn that revenue over to the US Treasury to be placed in the General Account.
        (3) The only change would be that of taking the income (revenue) from these assets away from the “private for profit banks” and giving it back to the people.
        “Give me equality, or give me death”.
        Read more…http://bit.ly/MlQWNs -“Zero Income Taxes Solves Worldwide Crises”

        But no matter what: a sincere thanks to rwer.wordpress.com for allowing ‘social media’ to perhaps be a path for the future happiness of mankind.

      • December 29, 2012 at 12:59 am

        Economists treat land as just another asset with which to speculate. Many of them thought the bursting of the dotcom bubble was going to trigger the credit crunch. But it was mortgages and other financial instruments, mostly backed up by land, which brought on the collapse.

        They do not recognise that land is part of the real economy, unlike the stock-market or bullion. It affects everything we do as economic agents. It especially affects households, giving huge advantages to those who already own property whilst those who do not have to struggle with unbearable mortgages or rents.

        Where is there any proper definition of ‘land’ in any economic text book?

  2. December 27, 2012 at 7:55 pm

    Agreed. The aim and purpose of science is to establish cause and effect relationships. It does this by collecting data, analysing relationships within the body of data and postulating theories. These can be tested through attempts to demonstrate the falsehood of the theories, either by further analysis of the data, or the collection of additional data, or the construction of experiments, either deliberately or by the observation of “natural experiments”. If possible, experiments are constructed in which all conditions are kept the same apart from the variable under investigation. If this is not possible, the effect of particular variables can be studied using statistical methods.

    A key element in any scientific analysis is a definition of terms which is accepted by all within the discipline.

    Economics can and has been studied in this way, but seemingly the method has gone out of fashion.

  3. Bruce E. Woych
    December 27, 2012 at 8:09 pm



    “… A model is in the mind. As a consequence, causality is in the mind.”

    A bit of clever sophistry to assemble equilibrium theory mechanics into rational choice game theory. It permits a perversion of direct causal links as relativistic incidentalism; and licenses the equivocation of social reality to a random set of associated hierarchies.

    The “Prize” in 2012 essentially is econo-magic as well: non-market-markets…variable based decision making …etc.

    What do we expect from the U. of Chicago’s legacy in the Economics Dept. ?
    see critical summary :
    Nobel Memorial Prize in Economic Sciences (read it ALL)

  4. Bruce E. Woych
    December 28, 2012 at 12:40 am

    How Big Business Poisons Academic Research
    The energy industry and Big Agribusiness are distorting academic research by wielding corporate influence.
    by Wenonah Hauter

  5. Ken Zimmerman
    December 28, 2012 at 9:29 am

    “Cause and effect,” one of the common dualities of western life. To paraphrase William James, there are many possible causes and many possible effects in the multiverses. “Cause and effect” is another of the misunderstandings by and about science (econometrics if you will). In every situation one or more cause and effect relationships can be claimed. And as the situations change, so can the cause and effect relationships that can be claimed. It is how and by which actors these relationships are constructed that we should be examining, never expecting to find a final and fixed cause and effect, as it is unlikely there is such a thing, and even if there were, there would be no means for us to find and describe it. Econometrics is nothing special in its focus on what does not exist and could not be found if it did.

  6. December 28, 2012 at 3:29 pm

    Lars, this is on a par for significance with the recent “economic framework” discussion.

    Yes, causality is in the mind, but Adam Smith’s friend David Hume, trying to evade God in the form of Aristotle’s logical First Cause, relied on the “fallacy of the undistributed middle” as per Bruce’s comment, not allowing for the brain causing the appearances of causality evident in the mind. Kant of course had a go at this, though those influenced by Humean positivists interpret Kant as though he were JUST an idealist. Hume’s epistemic fallacy was picked up by critical realist Roy Bhaskar, whose dialectical philosophy of science suggests a kinder interpretation of econometrics than Zaman’s “failure”: that econometrics is not itself a science but just one of four phases of a science, of which the others might be described as causal insight, experimental testing and quality assurance.

    The purpose of modern science was not Hume’s “democratic” agreement on descriptions of patterns in data (which “magicians” can of course surreptitiously create), but 134 years older: Bacon’s “taking things to bits to see how they work” with the intention of creating new crafts in which the redundant could find employment. (See if you can find THAT in your story and the other responses to it). Hume’s version of science just accepted things as they were, without committing his paymasters to any Baconian inconvenience. I have myself studied philosophy of science since 1957, after rejecting Hume’s sophistry as a trainee experimental scientist, but found the way of refuting his slippery arguments not in philosophy but in C E Shannon’s discovery that electric switching circuits performed logic and that the same information is conveyed by the switch being ‘on’ as by power passing down the communication channel. Subsequent demonstration that neurons are networks of switching circuits and retroduction (reverse engineering) of the universe to the switching on of energy at the Big Bang leads to the postulate that an expanding universe evolved from one “structure of communication” form to another in four phases, starting with the [black] energy forming electromagnetic waves [light] at its boundary and these breaking up forward and sideways into high and low energy sprays. The evidence remains consistent with this as the sub-atomic particles form into atoms, then molecules, thru chemically sensitive cells into light sensitive vegetation, physically mobile animals and linguistically mobile men with four parts to their brain, whose logical use of symbols enables them to communicate and react independently of remembered situations.

    With the evolution of science from the successes and failures of practical construction came the evolution of mechanics, electrodynamics, electromagnetic and digital communication and the spawning of general purpose automation. The “mechanism” of electronic automation is essentially a structure of communication, in which energy is switched into and guided along communication pathways, much as use of rivers, railways and roads has largely automated our modern economies. Just as the four phases of nature have spawned the four phases of economic evolution of real economies, so by now this has spawned four phases of a shadow economy (money-making, speculation, insurance, derivatives) leaving us at the point where it is about to spawn something else.

    Electronic automation relies on error correction (e.g. the use of error-detecting programming logics; elimination of computer viruses before they can do too much damage). So long as social engineers continue to fail to understand the scientific error recognition and negative feedback correction techniques bequeathed to us by Shannon (though practiced pre-scientifically in the catholic tradition), what is likely to be spawned by their currently fashionable and continuous (as against occasionally appropriate) use of positive feedbacks (e.g. compound interest and enforcement of unjust law) is chaos, self-destruction, death and even the extinction of life on our planet. The problem is that serious.

    Econometrics can’t “deliver” when the problem is not metrical but logical, or more precisely illogic brought about by failure of educators to understand Christian traditions and scientific developments in the understanding of logic, language and communication.

    • Bruce E. Woych
      December 30, 2012 at 7:41 pm

      I am a great servant to the idea of reverse engineering and continuous learning systems, but somewhere beyond the interactive (and inverse) interrelationship between open and closed systems we get a notion of emergent theory (holistics) which speculates between two modalities in Western thought: the FIRST, organic (germination/collective assemblage/natural), and the SECOND, mechanical (atomistic/elements/conventional). The cross-fertilization between these two disparate traditions has been prolifically reassembling phase changes (modifications) or retrofitting into the mix. If indeed the universe is based on a “quartet” assemblage, this pattern would become 1) organic, 2) mechanical, 3) organic/mechanical, 4) mechanically/organic.

      I can’t help but notice that this resembles a Mendelian epigenetic model that may well link mirror cells to cognitive predispositions (with implications for perception and interpretation of the universe as subsequent programming modalities), but with endless possibilities, much like true emergent individualism in societies, yet non-reductive; in fact, conditional OR emergent qualities.
      Of course, staying on issue… there is no way that econometrics, as a hodgepodge of variables, would be anything more than a Rorschach test for opinions working out its order.

      At another level, however, I wonder if you have a thought on this matter from above?
      Thanks very much:

      • Bruce E. Woych
        December 30, 2012 at 8:40 pm

        The original American Cultural Anthropology School under Franz Boas…(before it was fragmented into professionalized specializations)…was divided (or “spawned” as you say…) into 4 parts: Cultural / Linguistics / Physical / & Archeological, and (permitting for subdivisions in each) the qualitative process demanded ultimate integration from each division.

        Your “four phases of nature” struck home with my “four phases of culture background!”

  7. December 28, 2012 at 7:00 pm

    Econometrics can’t deliver because it ignores that money is created as a debt with a schedule for repayment and, instead, treats money as a neutral medium of exchange. Based upon an erroneous foundation, which is the opposite of the truth, it will always be erroneous.

    • December 28, 2012 at 11:24 pm

      True, Paul, but why does it still do that? Why do the relatively few scammers still get away with doing so? I’m suggesting the issue which needs to be resolved is a systemic one in education, not a particular one in econometrics. When, following Keynes’s adversary Ramsey, people are no longer taught there is a truth to distinguish from falsehood, is it any wonder that economics and its metrics are built on dishonest or at best erroneous foundations?

      Incidentally, having looked up the meaning of “wonkish”: at 76, if that makes me a wonk, who cares? Some of the most interesting conversations I’ve ever had have been with old folk looking back over their lives, and if anyone thinks that daft, more fool them.

  8. December 29, 2012 at 10:22 am

    Back on wonking: “a person who studies a subject or issue in an excessively assiduous and thorough manner”? Excessively for what purpose? We won’t solve our economic problems without being prepared to get our heads round the detail of alternatives “outside the box” of positivist economics.

    There is an analogy worth reflecting on between the four phases of a substance like water (ionic, gaseous, liquid, gaseous and solid forms as heat decreases) and Bhaskar’s three levels/four phases of science: (real mechanisms, events both real and actual, experiences which are real, actual and empirical as abstraction decreases), where what is missing is the liquid: the “stringing together” of atoms phase which precedes their interlocking in a crystalline structure – which grows from a linear surface or crystal-form seed.

    In the scientific programming language Algol68 the empirical data is assigned to specific objects and both these and the programs are “typed” according to the mode of interpretation of the data encoding (so the type acts as a verbal cross-index between Bhaskar’s empirical phenomena, events and mechanisms); and by reference levels (corresponding to the four phases of Bhaskar’s science, such that econometric data stands for itself, the labels of objects become variables REFering to events, those of types similarly REF REFs with respect to the data and those of typed procedures REF REF REFs). Although this aspect wasn’t developed by the creators of Algol-68, a reverse ordering also applies. If the available procedures stand for themselves, the type refers to a particular subset of them, the object types are ref refs to the procedures and the data ref ref refs to interpretation processes. In much the same way, setting up a telephone call involves dialling the number, the line communicating it to the exchange, the exchange connecting you to the relevant line and only then can you begin the process of conversing. The naive “positivist” view of this, unthinkingly taking for granted that all the necessary mechanisms are in place, is that all you do is dial the number and then you can converse. That was how Algol 60 and BASIC for beginners were structured!

    As Algol68 was being introduced I was told it was not originally designed as a programming language but as a language for scientists with a grammar which required them to define their terms unambiguously. Any economists aspiring to be scientists should take a look at it. Unlike many of this old man’s references, this one is available online at

    • December 29, 2012 at 10:30 am

      Apologies for an unedited-out “gaseous” in my water phases analogy.

  9. Bruce E. Woych
    December 29, 2012 at 4:43 pm


    Warming and Dimming: An Econometric Assessment of Climate Change
    by Tom Holden, August 2007

    Global warming is a serious concern for policy in the present context. In this research, we explore the different effects of carbon dioxide and aerosols upon that process. Carbon dioxide is known to have a deleterious effect upon climate. Before now, political action taken to fight global warming has aimed mainly at reducing the emission of carbon dioxide. Nobel prizewinning chemist Paul Crutzen has proposed an alternative and highly controversial way of fighting global warming:

    He has suggested decreasing the amount of solar radiation reaching the surface of the earth by artificially increasing the level of aerosols in the stratosphere. By increasing pollution, more solar radiation is directly reflected back into space and the surface of the earth receives less energy and hence there is less warming.

    By Glenn Scherer

    Climate Risks Have Been Underestimated for the Last 20 Years


  10. Bruce E. Woych
    December 29, 2012 at 4:50 pm

    Econometrics vs Climate Science
    Submitted by Doug L. Hoffman Tue, 03/23/2010 – 14:41

    “Recently, a number of papers have surfaced that use advanced statistical methods to analyze climate data. The techniques involved have been developed not by climate scientists but by economists and social scientists. These new tools belong to the field of econometrics. The use of statistical break tests and polynomial cointegration to analyze the relationships between time series data for greenhouse gas concentrations, insolation, aerosol levels and temperature have shown that these data are non-stationary. The implication of these findings is that much of the statistical analysis applied by climate scientists is flawed and potentially misleading. So strong is the statistical evidence that a couple of economists are claiming to have refuted the theory of anthropogenic global warming. This, on top of everything else that has recently transpired, may indicate that a climate change paradigm shift is imminent.”


    “The most controversial aspect of this new work is that its methodology comes, not from climate science or a related field, but from econometrics.”


  11. A.J. Sutter
    January 8, 2013 at 2:54 pm

    “A model is in the mind. As a consequence, causality is in the mind” — a simple instance of the fallacy of division, de-bunked by Aristotle.

    You might also have mentioned Heckman’s remark @p2 of his paper: “This paper develops the scientific model of causality developed in economics and compares it to methods advocated in epidemiology, statistics, and in many of the social sciences outside of economics that have been influenced by statistics and epidemiology” — why not call it “the **economic** model of causality,” then?

    Perhaps it’s not necessary to be so wonkish to notice the flaws and rhetorical land-grabbing in Heckman’s paper.
