
The tractability hoax in modern economics

from Lars Syll

While the paternity of the theoretical apparatus underlying the new neoclassical synthesis in macro is contested, there is wide agreement that the methodological framework was largely architected by Robert Lucas … Bringing a representative agent meant foregoing the possibility to tackle inequality, redistribution and justice concerns. Was it deliberate? How much does this choice owe to tractability? What macroeconomists were chasing, in these years, was a renewed explanation of the business cycle. They were trying to write microfounded and dynamic models …

Rational expectations imposed cross-equation restrictions, yet estimating these new models substantially raised the computing burden. Assuming a representative agent mitigated computational demands, and allowed macroeconomists to get away with general equilibrium aggregate issues: it made new-classical models analytically and computationally tractable …

Was tractability the main reason why Lucas embraced the representative agent (and market clearing)? Or could he have improved tractability through alternative hypotheses, leading to opposed policy conclusions? … Some macroeconomists may have endorsed the new class of Lucas-critique-proof models because they liked its policy conclusions. Others may have retained some hypotheses, then some simplifications, “because it makes the model tractable.” And while the limits of simplifying assumptions are often emphasized by those who propose them, as they spread, caveats are forgotten. Tractability restricts the range of accepted models and prevents economists from discussing some social issues, and with time, from even “seeing” them. Tractability ‘filters’ economists’ reality … The aggregate effect of “looking for tractable models” is unknown, and yet it is crucial to understand the current state of economics.

Beatrice Cherrier

Cherrier’s highly readable article underlines that the essence of mainstream (neoclassical) economic theory is its almost exclusive use of a deductivist methodology, a methodology that is more or less used without a shred of argument to justify its relevance.

The theories and models that mainstream economists construct describe imaginary worlds using a combination of formal sign systems such as mathematics and ordinary language. The descriptions made are extremely thin and to a large degree disconnected from the specific contexts of the target system that one (usually) wants to (partially) represent. This is not by chance. These closed formalistic-mathematical theories and models are constructed for the purpose of being able to deliver purportedly rigorous deductions that may somehow be exportable to the target system. By analyzing a few causal factors in their “laboratories” they hope they can perform “thought experiments” and observe how these factors operate on their own and without impediments or confounders.

Unfortunately, this is not so. The reason is that economic causes never act in a socio-economic vacuum. Causes have to be set in a contextual structure to be able to operate. This structure has to take some form or other, but instead of incorporating structures that are true to the target system, the settings made in economic models are based on formalistic mathematical tractability. In the models they appear as unrealistic assumptions, usually playing a decisive role in making the deductive machinery deliver “precise” and “rigorous” results. This, of course, makes exporting to real-world target systems problematic, since these models – as part of a deductivist covering-law tradition in economics – are thought to deliver general and far-reaching conclusions that are externally valid. But how can we be sure the lessons learned in these theories and models have external validity when they are based on highly specific unrealistic assumptions? As a rule, the more specific and concrete the structures, the less generalizable the results. Admitting that we can in principle move from (partial) falsehoods in theories and models to truth in real-world target systems does not take us very far unless a thorough explication of the relation between theory, model and the real-world target system is made. If models assume representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, the warrants for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged are obviously non-justifiable. To have a deductive warrant for things happening in a closed model is no guarantee that they are preserved when applied to an open real-world target system.

Henry Louis Mencken once wrote that “there is always an easy solution to every human problem – neat, plausible and wrong.” And mainstream economics has indeed been wrong. Very wrong. Its main result, so far, has been to demonstrate the futility of trying to build a satisfactory bridge between formalistic-axiomatic deductivist models and real-world target systems. Assuming, for example, perfect knowledge, instant market clearing and approximating aggregate behaviour with unrealistically heroic assumptions of representative actors just will not do. The assumptions made surreptitiously eliminate the very phenomena we want to study: uncertainty, disequilibrium, structural instability and problems of aggregation and coordination between different individuals and groups.
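The aggregation problem mentioned above can be made concrete with a toy calculation (my own hedged illustration, not anything from the post, and the square-root “response function” and income figures are invented for the example): whenever individual behaviour is a concave function of, say, income, the behaviour of the average agent differs from the average behaviour of heterogeneous agents, so a representative agent systematically mis-states the aggregate.

```python
import math

# Toy illustration of the aggregation problem. Suppose each household's
# behaviour responds to its income y via a concave function c(y) = sqrt(y)
# (an arbitrary choice, purely for illustration).
def c(y):
    return math.sqrt(y)

incomes = [1.0, 4.0, 9.0, 16.0]            # heterogeneous households
mean_income = sum(incomes) / len(incomes)  # 7.5

# True aggregate: the average of the individual responses.
true_aggregate = sum(c(y) for y in incomes) / len(incomes)  # (1+2+3+4)/4 = 2.5

# Representative-agent shortcut: the response of the average household.
rep_agent = c(mean_income)  # sqrt(7.5) ~ 2.74

# By Jensen's inequality the representative agent overstates the aggregate
# whenever the response function is concave and incomes are dispersed.
print(true_aggregate, rep_agent)
```

The gap between the two numbers is exactly the information about heterogeneity (here, income dispersion) that the representative-agent shortcut throws away.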

The punch line is that most of the problems that mainstream economics is wrestling with issue from its attempts at formalistic modelling of social phenomena per se. Reducing microeconomics to refinements of hyper-rational Bayesian deductivist models is not a viable way forward. It will only sentence to irrelevance the most interesting real-world economic problems. And as someone has so wisely remarked, murder is – unfortunately – the only way to reduce biology to chemistry; reducing macroeconomics to Walrasian general equilibrium microeconomics basically means committing the same crime.

If scientific progress in economics – as Robert Lucas and other latter-day mainstream economists seem to think – lies in our ability to tell “better and better stories” without considering the realm of imagination and ideas a retreat from real-world target systems, one would, of course, expect our economics journals to be filled with articles supporting the stories with empirical evidence. However, I would argue that the journals still show a striking and embarrassing paucity of empirical studies that (try to) substantiate these theoretical claims. Equally amazing is how little one has to say about the relationship between the model and real-world target systems. It is as though explicit discussion, argumentation and justification on the subject were thought not to be required. Mainstream economic theory is obviously navigating in dire straits.

If the ultimate criterion for success of a deductivist system is the extent to which it predicts and coheres with (parts of) reality, modern mainstream economics seems to be a hopeless misallocation of scientific resources. To focus scientific endeavours on proving things in models is a gross misapprehension of what an economic theory ought to be about. Deductivist models and methods disconnected from reality are not relevant for predicting, explaining or understanding real-world economic target systems. These systems do not conform to the restricted closed-system structure the mainstream modelling strategy presupposes.

Mainstream economic theory still today consists mainly in investigating economic models. It has long since given up on the real world and contents itself with proving things about thought-up worlds. Empirical evidence still plays only a minor role in mainstream economic theory, where models largely function as substitutes for empirical evidence.

What is wrong with mainstream economics is not that it employs models per se, but that it employs poor models. They are poor because they do not bridge to the real-world target system in which we live. Hopefully humbled by the manifest failure of its theoretical pretences, the one-sided, almost religious, insistence on mathematical deductivist modelling as the only scientific activity worth pursuing in economics will give way to methodological pluralism based on ontological considerations rather than formalistic tractability.

  1. April 22, 2018 at 4:14 pm

    As Robert Locke has commented previously on this blog, the issue is profitably framed as the epistemological and thus political contest between the idiographic and nomothetic – more precisely the politics of suppressing the idiographic modes of analysis.

    Instead of presuming a humanist (idiographic) notion of the individual that comports with our sense of ourselves as individuals and thus our choices as particular, it has become permissible, even preferred, to deny our individuality and see us best defined as ‘representative rational agents’ whose individuality lies only in our failure to be fully rational, our ‘bias’.

    How did this unfruitful and morally indefensible way of thinking come to dominate our academic activity? Addressing that question might lead towards freeing ourselves from the methodological prison we now inhabit.

    Coase, among a few others, was aware of all this and, we might argue, doing his best to show us the way out of the cave – an economics of non-zero transaction costs.

    But how many others are doing so? Bleating on endlessly as we remind each other of our intellectually impoverished condition has not gotten us anywhere in the century-plus since some of our predecessors pushed back against Menger.

    Why, oh why?

    • Robert locke
      April 22, 2018 at 5:35 pm

      Unless Spender’s clearly posed question is rigorously pursued and answered, we on this blog will waste our time.

      • edward ross
        April 22, 2018 at 8:38 pm

        Just as a reminder, I am the old uneducated-educated individual who managed a humble mature-age B.A. Thus I completely agree with Lars Syll’s blog and the following posts, and with Robert Locke’s challenge to deal with the real issue of relating economic debate to the real issues facing people in the real world. Otherwise they are emulating rats on a treadmill, running flat out and getting nowhere.

    • Frank Salter
      April 22, 2018 at 7:04 pm

      Possibly it might be:

      It seemed a good idea at the time, we haven’t had a better one, and it’s maths we are able to do.

  2. April 22, 2018 at 7:33 pm

    I guess that’s the point. Why was it considered ‘a good idea at the time’?

    In those days (Menger’s) scholars were sufficiently well read (Aristotle is the text in question) to know what the suppression of the ideographic was about. Today, of course, we have no idea; we are oblivious. That’s the great thing about dogma: it shuts out critique, and that is why it is so fundamentally political.

    But whose political project did we become? Did ‘they’ (whoever they were/are) know better than us – so that we have no need to question their game-plan?

    • robert locke
      April 24, 2018 at 7:08 am

      As an historian, I was trained to start with people living in a specific time and place and, in order to understand them, to seek out the original sources produced by people in their lives (memoirs, minutes of meetings, committee work in legislatures), and only then turn to what others have said about them. The secondary sources were deemed especially difficult, since the observer is a contemporary judging people in a different epoch, with tools of inquiry fashioned in his/her particular time. When I ran across the historical judgments of people trained in neoclassical economics about events in places and times they did not try to learn about through the study of original sources, but through a supposed scientific method claimed to understand what was happening better than the people themselves, I sent up a flare. It is a totally contrary way to study people from what we learned in historiography in our graduate studies. I generally concluded that the generalizations about human behavior social scientists contrive do not survive ideographic comparisons. We can either throw the ideographic comparisons out, which was done after WWII, or take them into consideration in order to promote clarity of exposition. What we end up with is a lot of frustrated historians who say that these judgements do not conform to the facts of human experience, spoken to deaf ears.

  3. April 23, 2018 at 10:35 am

    “Fifteen years after General Competitive Analysis, Arrow (1986) stated that the hypothesis of rationality had few implications at the aggregate level”. [Via link in Syll’s quote of Cherrier].

    True, but the fact of specialisation does! Agents are representative not of the whole economy but of their specialism, and such rationality as they may have acts in different fields within which the ends are instrumental. What one sees is not just birds but specialists defending their patch, deferring to more aggressive rivals and in peaceful co-existence with non-rivals.

    Professors of course like to defend their patch with long words like ‘nomographic’ and ‘ideographic’! In a previous age they used Latin, but allowed artists to communicate the gist of their stories in ideographic forms, like illustrated church walls and stained glass windows. Come Machiavelli telling would-be princes – perhaps captains of industry as well as holders of territory – to tell the people what they wanted to hear, and we got the Latin translated (with useful variations) into local languages, privatisation of the church (i.e. community) educational system and puritans going the whole hog by deplastering church walls and smashing their ideographic windows. The would-be princes didn’t want a Catholic church – the equivalent of today’s United Nations – able to keep them in some semblance of order. They wanted to take what they wanted and have the people believe it was someone else – a thieving church or untrustworthy neighbours. Americans might like to remember it was puritans who in 1620 sailed from London to colonise their land, after having “had a skinful” at the Anchor Inn. [By way of explanation, my daughter moved nearby and took me to The Anchor a week ago].

    What I find interesting is how the ideographic came to be suppressed in maths. Before the introduction of the arabic numbering system and algebra, Euclid’s geometry was ideographic; since Descartes introduced coordinate geometry, the “higher mathematics” has gone all abstract and visually indistinguishable from algebra. In my childhood maths still meant arithmetic, algebra and Euclidean geometry; since the introduction of non-ideographic computer programming, geometry seems to have been reduced to the dimensionality rather than the directionality of imaginary spaces.

    One thing is for sure: most kids these days don’t begin to understand maths. Is this “dumbing down” deliberate? Regarding Professor Spender’s question, is it a question of Machiavellian politics having dumbed down political management over so many generations that it is in fact NOT now being done with rational deliberation? Is it being done automatically by zombies?

  4. April 24, 2018 at 2:49 pm

    “phenomena we want to study: uncertainty, disequilibrium, structural instability and problems of aggregation and coordination between different individuals and groups.”

    Quantum computing, when and if it is available to many, would probably assist in these studies.

