from Lars Syll
Submission to observed or experimental data is the golden rule which dominates any scientific discipline. Any theory whatever, if it is not verified by empirical evidence, has no scientific value and should be rejected.
Formalistic deductive “Glasperlenspiel” can be very impressive and seductive. But in the realm of science it ought to be considered of little or no value to simply make claims about the model and lose sight of reality.
Mainstream — neoclassical — economics has long since given up on the real world and contents itself with proving things about made-up worlds. Empirical evidence plays only a minor role in economic theory, where models largely function as a substitute for empirical evidence. Hopefully, humbled by the manifest failure of its theoretical pretences, the one-sided, almost religious, insistence on axiomatic-deductivist modeling as the only scientific activity worth pursuing in economics will give way to methodological pluralism based on ontological considerations rather than formalistic tractability.
To have valid evidence is not enough. What economics needs is sound evidence. Why? Simply because the premises of a valid argument need not be true, whereas a sound argument is not only valid but also builds on premises that are true. Aiming only for validity, without soundness, sets the aspiration level of economics too low for developing a realist and relevant science.
from Lars Syll
Today the trend to greater equality of incomes which characterised the postwar period has been reversed. Inequality is now rising rapidly. Contrary to the rising-tide hypothesis, the rising tide has only lifted the large yachts, and many of the smaller boats have been left dashed on the rocks. This is partly because the extraordinary growth in top incomes has coincided with an economic slowdown.
The trickle-down notion— along with its theoretical justification, marginal productivity theory— needs urgent rethinking. That theory attempts both to explain inequality— why it occurs— and to justify it— why it would be beneficial for the economy as a whole. This essay looks critically at both claims. It argues in favour of alternative explanations of inequality, with particular reference to the theory of rent-seeking and to the influence of institutional and political factors, which have shaped labour markets and patterns of remuneration. And it shows that, far from being either necessary or good for economic growth, excessive inequality tends to lead to weaker economic performance. In light of this, it argues for a range of policies that would increase both equity and economic well-being.
Mainstream economics textbooks usually refer to the interrelationship between technological development and education as the main causal force behind increased inequality. If the educational system (supply) develops at the same pace as technology (demand), there should be no increase, ceteris paribus, in the ratio between high-income (highly educated) groups and low-income (low education) groups. In the race between technology and education, the proliferation of skill-biased technological change has, however, allegedly increased the premium for the highly educated group. Read more…
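The textbook "race" logic can be made concrete with a toy version of the canonical two-skill-group story (in the spirit of Tinbergen and Katz–Murphy), in which the log skill premium moves with the gap between relative demand and relative supply. Every number below, including the elasticity value, is an illustrative assumption, not an estimate:

```python
# Toy "race between technology and education" sketch (illustrative only).
# Log skill premium ~ (relative demand shift - log relative supply) / sigma,
# where sigma is an assumed elasticity of substitution between skill groups.

def log_skill_premium(demand_shift, log_rel_supply, sigma=1.4):
    """Log wage ratio of highly educated to low-education workers."""
    return (demand_shift - log_rel_supply) / sigma

# Balanced race: education (supply) keeps pace with technology (demand)
balanced = [log_skill_premium(0.05 * t, 0.05 * t) for t in range(6)]

# Skill-biased technological change: demand outpaces supply
sbtc = [log_skill_premium(0.06 * t, 0.04 * t) for t in range(6)]

print(balanced)  # premium stays flat, ceteris paribus
print(sbtc)      # premium rises period after period
```

When supply and demand grow in step the premium is constant; when demand pulls ahead the premium rises, which is exactly the mechanism the textbooks invoke.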
from Maria Alejandra Madi
From the 1950s onwards, the macroeconomic models of the neoclassical synthesis, based on a system of simultaneous equations, focused on the interaction between the market for goods and services and the money market in the context of a general equilibrium analysis. According to John Hicks (1904-1989), in the general case, the capitalist economy is at the full-employment level of output. The underlying employment theory is based on the demand and supply of labour in a competitive market. In fact, this neoclassical approach supposes that price adjustment market mechanisms could guarantee full employment. In some specific cases, however, the general equilibrium implied by the IS-LM model would not necessarily correspond to a full-employment level of output. This situation, called unemployment equilibrium, would be the result of market imperfections, such as rigid money wages, interest-inelastic investment demand, income-inelastic money demand, among others.
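The simultaneous-equation logic can be sketched with a stripped-down linear IS-LM system solved for equilibrium output and the interest rate. All parameter values are invented for illustration and carry no empirical content:

```python
import numpy as np

# Linear IS-LM sketch (all numbers are illustrative assumptions):
#   IS:  Y = 200 + 0.8*(Y - 200) + (300 - 20*r) + 300   =>  0.2*Y + 20*r = 640
#   LM:  0.25*Y - 50*r = 500    (real money demand equals real money supply)
A = np.array([[0.20, 20.0],
              [0.25, -50.0]])
b = np.array([640.0, 500.0])

Y, r = np.linalg.solve(A, b)
print(f"equilibrium output Y = {Y:.0f}, interest rate r = {r:.1f}%")
# Nothing in the algebra guarantees that this Y is full-employment output:
# that depends on parameters such as wage flexibility and the interest
# elasticity of investment, i.e. the 'imperfections' mentioned above.
```

The point of the sketch is only that the goods-market and money-market equations are solved jointly, which is what "a system of simultaneous equations" means here.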
In the 1960s, mainstream macroeconomic models expanded the analysis of the negative correlation between inflation and unemployment. This correlation was based on the conclusions drawn from an empirical study (the Phillips curve) about the negative relationship between the rate of unemployment and the rate of variation of nominal wages in England at the turn of the 20th century. The attempt to incorporate the Phillips curve (trade-off between inflation and unemployment) into the analysis of labour market dynamics ended up putting the emphasis on the role of nominal wages in determining prices, and ultimately on the demands of workers that put pressure on inflation. read more
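The kind of negative correlation the Phillips curve captures can be illustrated with a least-squares fit on a small invented data set (these numbers are hypothetical, not Phillips's actual series):

```python
# Hypothetical (unemployment %, wage inflation %) pairs, loosely Phillips-shaped
data = [(2.0, 6.5), (3.0, 4.0), (4.0, 3.0), (5.5, 2.0), (7.0, 1.2), (9.0, 0.5)]

u = [p[0] for p in data]
w = [p[1] for p in data]
n = len(data)

u_bar = sum(u) / n
w_bar = sum(w) / n

# Ordinary least squares fit of  w = a + b*u
b = sum((ui - u_bar) * (wi - w_bar) for ui, wi in zip(u, w)) \
    / sum((ui - u_bar) ** 2 for ui in u)
a = w_bar - b * u_bar

print(f"fitted wage inflation = {a:.2f} + ({b:.2f}) * unemployment")
# The negative slope b is the 'trade-off' that 1960s models read into the data.
```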
from Lars Syll
As yours truly wrote last week, there has been much discussion in economics academia about Paul Romer’s recent critique of ‘modern’ macroeconomics.
Now Oxford professor Simon Wren-Lewis has a blog post up arguing that Romer’s critique is
unfair and wide of the mark in places … Paul’s discussion of real effects from monetary policy, and the insistence on productivity shocks as business cycle drivers, is pretty dated … Yet it took a long time for RBC models to be replaced by New Keynesian models, and you will still see RBC models around. Elements of the New Classical counter revolution of the 1980s still persist in some places … The impression Paul Romer’s article gives, might just have been true in a few years in the 1980s before New Keynesian theory arrived. Since the 1990s New Keynesian theory is now the orthodoxy, and is used by central banks around the world.
Now this rather unsuccessful attempt to disarm the real force of Romer’s critique should come as no surprise for anyone who has been following Wren-Lewis’ writings over the years.
In a recent paper — Unravelling the New Classical Counter Revolution — Wren-Lewis writes approvingly about all the ‘impressive’ theoretical insights New Classical economics has brought to macroeconomics: Read more…
Some of the economists who agree about the state of macro in private conversations will not say so in public. This is consistent with the explanation based on different prices. Yet some of them also discourage me from disagreeing openly, which calls for some other explanation.
They may feel that they will pay a price too if they have to witness the unpleasant reaction that criticism of a revered leader provokes. There is no question that the emotions are intense. After I criticized a paper by Lucas, I had a chance encounter with someone who was so angry that at first he could not speak. Eventually, he told me, “You are killing Bob.”
But my sense is that the problem goes even deeper than avoidance. Several economists I know seem to have assimilated a norm that the post-real macroeconomists actively promote – that it is an extremely serious violation of some honor code for anyone to criticize openly a revered authority figure – and that neither facts that are false, nor predictions that are wrong, nor models that make no sense matter enough to worry about …
Science, and all the other research fields spawned by the enlightenment, survive by “turning the dial to zero” on these innate moral senses. Members cultivate the conviction that nothing is sacred and that authority should always be challenged … By rejecting any reliance on central authority, the members of a research field can coordinate their independent efforts only by maintaining an unwavering commitment to the pursuit of truth, established imperfectly, via the rough consensus that emerges from many independent assessments of publicly disclosed facts and logic; assessments that are made by people who honor clearly stated disagreement, who accept their own fallibility, and relish the chance to subvert any claim of authority, not to mention any claim of infallibility.
This is part of why yours truly appreciates Romer’s article, and even finds it ‘brave.’ Everyone knows what he says is true, but few have the courage to openly speak and write about it. The ‘honour code’ in academia certainly needs revision. Read more…
from Lars Syll
In 2007 Thomas Sargent gave a graduation speech at the University of California at Berkeley, giving the grads “a short list of valuable lessons that our beautiful subject teaches”:
1. Many things that are desirable are not feasible.
2. Individuals and communities face trade-offs.
3. Other people have more information about their abilities, their efforts, and their preferences than you do.
4. Everyone responds to incentives, including people you want to help. That is why social safety nets don’t always end up working as intended.
5. There are trade-offs between equality and efficiency.
6. In an equilibrium of a game or an economy, people are satisfied with their choices. That is why it is difficult for well-meaning outsiders to change things for better or worse.
7. In the future, you too will respond to incentives. That is why there are some promises that you’d like to make but can’t. No one will believe those promises because they know that later it will not be in your interest to deliver. The lesson here is this: before you make a promise, think about whether you will want to keep it if and when your circumstances change. This is how you earn a reputation.
8. Governments and voters respond to incentives too. That is why governments sometimes default on loans and other promises that they have made.
9. It is feasible for one generation to shift costs to subsequent ones. That is what national government debts and the U.S. social security system do (but not the social security system of Singapore).
10. When a government spends, its citizens eventually pay, either today or tomorrow, either through explicit taxes or implicit ones like inflation.
11. Most people want other people to pay for public goods and government transfers (especially transfers to themselves).
12. Because market prices aggregate traders’ information, it is difficult to forecast stock prices and interest rates and exchange rates.
Reading through this list of “valuable lessons”, things suddenly fall into place.
This kind of self-righteous neoliberal drivel has again and again been praised and prized. And not only by econ bloggers and right-wing think-tanks.
Of the seventy-six laureates who have been awarded ‘The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel,’ twenty-eight have been affiliated with the University of Chicago. The world is really a small place when it comes to economics …
from Lars Syll
Peter Dorman is one of those rare economists whom it is always a pleasure to read. Here his critical eye is focussed on economists’ infatuation with homogeneity and averages:
You may feel a gnawing discomfort with the way economists use statistical techniques. Ostensibly they focus on the difference between people, countries or whatever the units of observation happen to be, but they nevertheless seem to treat the population of cases as interchangeable—as homogeneous on some fundamental level. As if people were replicants.
You are right, and this brief talk is about why and how you’re right, and what this implies for the questions people bring to statistical analysis and the methods they use.
Our point of departure will be a simple multiple regression model of the form
y = β0 + β1 x1 + β2 x2 + … + ε
where y is an outcome variable, x1 is an explanatory variable of interest, the other x’s are control variables, the β’s are coefficients on these variables (or a constant term, in the case of β0), and ε is a vector of residuals. We could apply the same analysis to more complex functional forms, and we would see the same things, so let’s stay simple.
What question does this model answer? It tells us the average effect that variations in x1 have on the outcome y, controlling for the effects of other explanatory variables. Repeat: it’s the average effect of x1 on y.
This model is applied to a sample of observations. What is assumed to be the same for these observations? (1) The outcome variable y is meaningful for all of them. (2) The list of potential explanatory factors, the x’s, is the same for all. (3) The effects these factors have on the outcome, the β’s, are the same for all. (4) The proper functional form that best explains the outcome is the same for all. In these four respects all units of observation are regarded as essentially the same.
Now what is permitted to differ across these observations? Simply the values of the x’s and therefore the values of y and ε. That’s it.
Thus measures of the difference between individual people or other objects of study are purchased at the cost of immense assumptions of sameness. It is these assumptions that both reflect and justify the search for average effects …
In the end, statistical analysis is about imposing a common structure on observations in order to understand differentiation. Any structure requires assuming some kinds of sameness, but some approaches make much more sweeping assumptions than others. An unfortunate symbiosis has arisen in economics between statistical methods that excessively rule out diversity and statistical questions that center on average (non-diverse) effects. This is damaging in many contexts, including hypothesis testing, program evaluation, forecasting—you name it …
The first step toward recovery is admitting you have a problem. Every statistical analyst should come clean about what assumptions of homogeneity are being made, in light of their plausibility and the opportunities that exist for relaxing them.
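Dorman's point, that the estimated coefficient is an average which can paper over radical heterogeneity, is easy to demonstrate. In the constructed data below, every unit has an individual slope of either 0 or 2, yet OLS dutifully reports the average effect of 1; the numbers are invented purely for illustration:

```python
# Two hidden sub-populations with individual slopes 0 and 2 (assumed for
# illustration). For each x we observe one unit of each type, with no noise.
xs, ys = [], []
for x in range(1, 11):
    xs += [x, x]
    ys += [0.0, 2.0 * x]   # type A: beta_i = 0;  type B: beta_i = 2

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# OLS slope: the 'average effect' the regression reports
beta_hat = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) \
           / sum((x - x_bar) ** 2 for x in xs)

print(beta_hat)  # 1.0: the 'average effect', true of nobody in the sample
```

The regression is not wrong on its own terms; it answers the average-effect question exactly, while assumptions (1)-(4) above hide the fact that no individual in the sample actually has that effect.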
Exhibit 10 from The other half of macroeconomics and the three stages of economic development, by Richard C. Koo
from Lars Syll
In Economics Rules (OUP 2015), Dani Rodrik maintains that ‘imaginative empirical methods’ — such as game theoretical applications, natural experiments, field experiments, lab experiments, RCTs — can help us to answer questions concerning the external validity of economic models. In Rodrik’s view they are more or less tests of ‘an underlying economic model’ and enable economists to make the right selection from the ever-expanding ‘collection of potentially applicable models.’ Writes Rodrik:
Another way we can observe the transformation of the discipline is by looking at the new areas of research that have flourished in recent decades. Three of these are particularly noteworthy: behavioral economics, randomized controlled trials (RCTs), and institutions … They suggest that the view of economics as an insular, inbred discipline closed to outside influences is more caricature than reality.
I beg to differ. When looked at carefully, there are in fact few real reasons to share Rodrik’s optimism on this ’empirical turn’ in economics.
Field studies and experiments face the same basic problem as theoretical models — they are built on rather artificial conditions and face a ‘trade-off’ between internal and external validity. The more artificial the conditions, the greater the internal validity, but also the less the external validity. The more we rig experiments/field studies/models to avoid ‘confounding factors’, the less the conditions resemble the real ‘target system.’ You could of course discuss field studies vs. experiments vs. theoretical models in terms of realism, but the nodal issue is not that; it is basically how economists, using different isolation strategies in different ‘nomological machines’, attempt to learn about causal relationships. I have strong doubts about the generalizability of all three research strategies, because the probability is high that causal mechanisms differ across contexts and that the lack of homogeneity/stability/invariance doesn’t give us warranted export licenses to the ‘real’ societies or economies.
from Thomas Palley
After more than 7 years of economic recovery, the Federal Reserve is positioning itself to tighten monetary policy by raising interest rates. In light of the wobbly reaction in financial markets, an important question that must be asked is whether raising interest rates is the right tool.
It could well be that the world’s leading central bank is going about the process of tightening in the wrong way. Owing to the dollar’s preeminent standing, that could have severe global repercussions.
Just as the Fed has had to rethink how it combats recessions, so too it must rethink how it transitions from an easy monetary policy stance to a tighter stance.
A quick review
In December 2015, the U.S. Federal Reserve increased interest rates for the first time in almost a decade. This move came with the expectation of gradually raising its interest rate to a new normal of 3%. Initially, normalization aimed to lighten pressure on the monetary policy pedal, and only later would it turn to hitting the monetary policy brake.
from Lars Syll
In his review of Mervyn King’s The End of Alchemy: Money, Banking, and the Future of the Global Economy Krugman writes:
Is this argument right, analytically? I’d like to see King lay out a specific model for his claims, because I suspect that this is exactly the kind of situation in which words alone can create an illusion of logical coherence that dissipates when you try to do the math. Also, it’s unclear what this has to do with radical uncertainty. But this is a topic that really should be hashed out in technical working papers.
This passage really says it all.
Despite all his radical rhetoric, Krugman is — where it really counts — nothing but a die-hard mainstream neoclassical economist. Just like Milton Friedman, Robert Lucas or Greg Mankiw.
The only economic analysis that Krugman and other mainstream economists accept is the one that takes place within the analytic-formalistic modeling strategy that makes up the core of mainstream economics. All models and theories that do not live up to the precepts of the mainstream methodological canon are pruned. You’re free to take your models — not using (mathematical) models at all is, as made clear by Krugman’s comment on King, totally unthinkable — and apply them to whatever you want – as long as you do it within the mainstream approach and its modeling strategy. If you do not follow this particular mathematical-deductive analytical formalism you’re not even considered doing economics. ‘If it isn’t modeled, it isn’t economics.’
That isn’t pluralism.
That’s a methodological reductionist straitjacket. Read more…
from Lars Syll
Having read my post on Krugman and Hicks’ IS-LM misrepresentation of Keynes’ theory, professor Jan Kregel kindly sent an unpublished article he wrote back in 1984 — The Importance of Choosing a Model: Hicks vs. Keynes on Money, Interest and Prices — in which it is argued that Hicks’ particular presentation of Keynes’ theory and choice of model was “crucial to its destruction”:
from Lars Syll
Given how sweeping the changes wrought by SMD (Sonnenschein-Mantel-Debreu) theory seem to be, it is understandable that some very broad statements about the character of general equilibrium theory were made. Fifteen years after General Competitive Analysis, Arrow (1986) stated that the hypothesis of rationality had few implications at the aggregate level. Kirman (1989) held that general equilibrium theory could not generate falsifiable propositions, given that almost any set of data seemed consistent with the theory. These views are widely shared … General equilibrium theory “poses some arduous challenges” as a “paradigm for organizing and synthesizing economic data” so that “a widely accepted empirical counterpart to general equilibrium theory remains to be developed” (Hansen and Heckman 1996). This seems to be the now-accepted view thirty years after the advent of SMD theory …
And so what? Why should we care about Sonnenschein-Mantel-Debreu?
Because Sonnenschein-Mantel-Debreu ultimately explains why New Classical, Real Business Cycles, Dynamic Stochastic General Equilibrium (DSGE) and “New Keynesian” microfounded macromodels are such bad substitutes for real macroeconomic analysis! Read more…
from Lars Syll
Raphaële Chappe has written a very interesting article about the value of general equilibrium theory, concluding in the following words:
For a student of real world markets, general equilibrium theory appears strangely distant. It is not surprising that a highly abstract framework consisting of hyper-rational agents might be ill equipped to provide a sufficiently credible account of markets in modern capitalism. What is more surprising is that despite these obvious limitations (some even claim that general equilibrium has been “dead” since the Sonnenschein-Mantel-Debreu results in the 1970s) the framework is still central to the Ph.D. curriculum, and continues to play a preeminent role in the high theory of economics. To the extent it shows the limits of the way of thinking, this is fair enough, but that is not how the subject is approached, by Mas-Colell/Whinston/Green or any other major text. Is this an example of the greater importance given in our ‘science’ to the rites of justification than to the task of explanation? When a way of thinking limits our thinking then it’s time, with due appreciation for those who built it, to ‘throw away the ladder’.
The economist who considers general equilibrium dead since the SMD results is Frank Ackerman, and this is what he has to say about general equilibrium: Read more…
from Asad Zaman
The IMF has been among the principal architects, executors and enforcers of the neoliberal agenda all over the globe for several decades. This is why an IMF publication with an article entitled “Neoliberalism: Oversold?” is as surprising as an ISIS article entitled “Violence: Oversold?” would be. Has neoliberalism become so unpopular that even the IMF does not want to be associated with it? In this essay, we study the lessons that the IMF claims to have learnt from experience with its single-minded drive to enforce the neoliberal agenda throughout the globe.
The article by Ostry, Loungani, and Furceri starts out by praising the neoliberal agenda for getting many things right. The authors write that they will confine their critique to two aspects. The first is capital account liberalisation, which means freely allowing capital flows across national borders. The second is “austerity”, which requires tight control of budget deficits, raising taxes, lowering expenses and making borrowing costly for the government. read more
from Lars Syll
Paul Krugman has often been criticized by people like yours truly for getting things pretty wrong on the economics of John Maynard Keynes.
Surely we don’t want to do economics via textual analysis of the masters. The questions one should ask about any economic approach are whether it helps us understand what’s going on, and whether it provides useful guidance for decisions. So I don’t care whether Hicksian IS-LM is Keynesian in the sense that Keynes himself would have approved of it, and neither should you.
The reason for this rather debonair attitude seems to be that history of economic thought may be OK, but what really counts is if reading Keynes gives birth to new and interesting insights and ideas.
No serious economist would question that explaining and understanding “what’s going on” in our economies is the most important task economists can set themselves — but it is not the only task. And to compare one’s favourite economic gadget model to what “austerians” and other madmen from Chicago have conjured up, well, that’s like playing tennis with the nets down, and we have to have higher aspirations as scientists.
from Lars Syll
Economic theory, like anthropology, ‘works’ by studying societies which are in some relevant sense simpler or more primitive than our own, in the hope either that relations that are important but hidden in our society will be laid bare in simpler ones, or that concrete evidence can be discovered for possibilities which are open to us which are without precedent in our own history.
Unlike anthropologists, however, economists simply invent the primitive societies we study, a practice which frees us from limiting ourselves to societies which can be physically visited, while sparing us the discomforts of long stays among savages. This method of society-invention is the source of the utopian character of economics; and of the mix of distrust and envy with which we are viewed by our fellow social scientists. The point of studying wholly fictional, rather than actual societies, is that it is relatively inexpensive to subject them to external forces of various types and observe the way they react. If, subjected to forces similar to those acting on actual societies, the artificial society reacts in a similar way, we gain confidence that there are useable connections between the invented society and the one we really care about.
Although neither yours truly, nor anthropologists (I guess), will recognise anything in this description even remotely reminiscent of practices actually used in real sciences, this quote still gives a very good picture of Lucas’ warped methodology.
from Asad Zaman
A driving spirit of the modern age is the desire to banish all speculation about things beyond the physical and observable realms of our existence. This spirit was well expressed by one of the leading Enlightenment philosophers, David Hume, who called for burning all books which did not deal with the observable and quantifiable phenomena: “If we take in our hand any volume; of divinity or school metaphysics, for instance; let us ask, Does it contain any abstract reasoning concerning quantity or number? No. Does it contain any experimental reasoning concerning matter of fact and existence? No. Commit it then to the flames: for it can contain nothing but sophistry and illusion.”
This is a breathtakingly bold assertion. The literate reader may examine his or her bookshelf to see what little, if anything, would survive after applying Hume’s prescriptions. Nonetheless, the spirit of the secular age was very much in tune with Hume, and relegated vast areas of human knowledge captured in literature, history, and the arts, to second-class citizenship. The modern world has been shaped by this downgrading of the spiritual, intuitive, and mystical, and the elevation of the rational as supreme judge and arbiter over all other faculties.
The leaders of the Enlightenment advocated rationality as the sole criterion for establishing an authoritative system of ethics, aesthetics, and knowledge. This has led to a dualism which has become firmly embedded in the foundations of Western thought, and has created a social science incapable of perceiving, let alone solving, the problems currently being faced by humanity as a whole. Western hegemony has led to the global and widespread acceptance of this dualism, clearly expressed by Hume, in embracing the quantitative and passionately and violently rejecting the qualitative. Exploring the full range of difficulties caused by this dualism would take several books. In this essay we consider just one of the salient problems. Harvard Professor Julie Reuben expressed it as follows: “Truth was (a united whole) embracing spiritual, moral, and cognitive knowledge. By the 1930’s, this unity was shattered; factual cognitive knowledge (was separated from) moral/spiritual knowledge.” read more
from Grazia Ietto-Gillies
The business media is awash with news about transnational companies (TNCs), be they in the services, manufacturing, or agriculture sectors. The news may refer to performance or strategies or plans for real investment (or the lack of them) or takeovers. There is currently also considerable interest in their tax minimization strategies.
Yet economics textbooks and courses are still shying away from this most relevant part of our contemporary economies. This is true of both orthodox/neoclassical approaches and – I regret to say – of alternative ones as a quick analysis of textbooks recommended in the WEA Pedagogy page shows.
It could be argued that the nationality of the investor, employer, or producer does not matter: a firm is a firm and the task of economics is to study it independently of where it invests or its nationality. I have argued (Ietto-Gillies, 2004 and 2012: introduction and Ch. 14) that the existence of nation-states with their different regulatory regimes makes a specific study of the TNC necessary. The regulatory regimes refer to taxation or labour and social security or currencies or environmental laws. The differences in regulatory regimes across different countries generate opportunities for alternative, profitable strategies for firms able to operate across national frontiers. Such operations allow the TNC to take advantage of different fiscal, currency or labour and social security or environmental regulations. Most relevant, transnationality increases the bargaining power of TNCs over labour as we see on an almost daily basis throughout the world. On the fiscal side the advantages that TNCs derive from their tax minimization strategies are partly linked to strategic location of their headquarters in tax-friendly countries and partly to the widely used manipulation of transfer prices (Eden, 2001; OECD, 2010). read more
from Lars Syll
Paul Krugman has in numerous posts on his blog tried to defend “the whole enterprise of Keynes/Hicks macroeconomic theory” and especially his own somewhat idiosyncratic version of IS-LM.
The main problem is simpliciter that there is no such thing as a Keynes-Hicks macroeconomic theory!
So, let us get some things straight.
There is nothing in the post-General Theory writings of Keynes that suggests him considering Hicks’s IS-LM anywhere near a faithful rendering of his thought. In Keynes’s canonical statement of the essence of his theory in the 1937 QJE-article there is nothing to even suggest that Keynes would have thought the existence of a Keynes-Hicks-IS-LM-theory anything but pure nonsense. So of course there can’t be any “vindication for the whole enterprise of Keynes/Hicks macroeconomic theory” – simply because “Keynes/Hicks” never existed.
And it gets even worse!
John Hicks, the man who invented IS-LM in his 1937 Econometrica review of Keynes’ General Theory – ‘Mr. Keynes and the ‘Classics’. A Suggested Interpretation’ – returned to it in an article in 1980 – ‘IS-LM: an explanation’ – in Journal of Post Keynesian Economics. Self-critically he wrote: Read more…