Author Archive

Busting the NAIRU myth

May 14, 2018 Leave a comment

from Lars Syll

Even as it became conventional wisdom, the supposed relationship between unemployment and increasing or decreasing rates of inflation was breaking down — notably in the 1990s. Unemployment got below 4 percent in 2000 without inflation taking off. Since the onset of the Great Recession, the gap between theory and reality has only grown …

Once we see how weak the foundations for the natural rate of unemployment are, other arguments for pursuing rates of unemployment economists once thought impossible become clearer. Wages can increase at the expense of corporate profits without causing inflation. Indeed, since 2014 we have been seeing an increase in the share of the economy that goes to labor.

Even better, lower unemployment doesn’t just help workers: It can spur overall growth. As the economist J.W. Mason argues, as we approach full employment, incentives emerge for greater investment in labor-saving productivity, since companies seek to keep labor costs in check while workers demand more. This productivity increase stimulates yet more growth.

The harder we push on improving output and employment, the more we learn how much we can achieve on those two fronts. That hopeful idea is the polar opposite of a natural, unalterable rate of unemployment. And it’s an idea and attitude that we need to embrace if we’re to have a shot at fully recovering from the wreckage of the Great Recession.

Mike Konczal/Vox 

Read more…

Textbooks — peddling lies about money and finance

May 11, 2018 3 comments

from Lars Syll

A couple of years ago — in a debate with James Galbraith and Willem Buiter — Paul Krugman made it perfectly clear that he was a strong believer in the ‘loanable funds’ theory.

Unfortunately, this is not an exception among ‘New Keynesian’ economists.

Neglecting anything resembling a real-world finance system, Greg Mankiw — in his intermediate textbook Macroeconomics — more or less equates finance to the neoclassical thought-construction of a ‘market for loanable funds.’

On the subject of financial crises, he admits that

perhaps we should view speculative excess and its ramifications as an inherent feature of market economies … but preventing them entirely may be too much to ask given our current knowledge.

This is, of course, self-evident for all of us who understand that genuine uncertainty makes any such hopes totally unfounded. But it’s rather odd to read this in a book that bases its models on assumptions of rational expectations, representative actors and dynamic stochastic general equilibrium – assumptions that convey the view that markets – give or take a few rigidities and menu costs – are efficient! For someone who, like so many neoclassical economists, takes pride in consistent models, Mankiw is here flagrantly inconsistent!  Read more…

Debunking the NAIRU hoax

May 10, 2018 7 comments

from Lars Syll

In our extended NAIRU model, labor productivity growth is included in the wage bargaining process … The logical consequence of this broadening of the theoretical canvas has been that the NAIRU becomes endogenous itself and ceases to be an attractor — Milton Friedman’s natural, stable and timeless equilibrium point from which the system cannot permanently deviate. In our model, a deviation from the initial equilibrium affects not only wages and prices (keeping the rest of the system unchanged) but also demand, technology, workers’ motivation, and work intensity; as a result, productivity growth and ultimately equilibrium unemployment will change. There is, in other words, nothing natural or inescapable about equilibrium unemployment, as is Friedman’s presumption, following Wicksell; rather, the NAIRU is a social construct, fluctuating in response to fiscal and monetary policies and labor market interventions. Its ephemeral (rather than structural) nature may explain why the best economists working on the NAIRU have persistently failed to agree on how high the NAIRU actually is and how to estimate it.

Servaas Storm & C. W. M. Naastepad

Many politicians and economists subscribe to the NAIRU story and its policy implication that attempts to promote full employment are doomed to fail, since governments and central banks cannot push unemployment below the critical NAIRU threshold without causing harmful runaway inflation.
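
To see what is at stake, here is a toy contrast between Friedman’s fixed natural rate and an endogenous NAIRU of the Storm–Naastepad kind. This is a minimal sketch, not their model: the accelerationist Phillips curve, the simple hysteresis rule, and every parameter value are my own illustrative assumptions.

```python
# Toy contrast: Friedman's fixed natural rate vs. a NAIRU with
# hysteresis. Policy holds unemployment at 4% while the initial
# NAIRU is 6%; all numbers are invented for illustration.
T = 40
alpha = 0.5       # Phillips-curve slope (made up)
u_held = 4.0      # unemployment rate maintained by policy

def inflation_path(hysteresis):
    """Accelerationist Phillips curve: pi_t = pi_{t-1} + alpha * (nairu - u).

    hysteresis = 0.0 gives Friedman's stable attractor;
    hysteresis > 0.0 lets the NAIRU drift toward actual unemployment,
    a crude stand-in for the endogenous channels in the quote above.
    """
    nairu, pi = 6.0, 2.0
    for _ in range(T):
        pi += alpha * (nairu - u_held)
        nairu += hysteresis * (u_held - nairu)
    return pi

print(f"fixed NAIRU:      inflation after {T} periods = {inflation_path(0.0):.1f}%")
print(f"endogenous NAIRU: inflation after {T} periods = {inflation_path(0.2):.1f}%")
# Roughly 42.0% (runaway) vs. 7.0% (levelling off): once the NAIRU
# responds to actual unemployment, there is no unique 'natural' rate
# that policy must defend.
```

Read more…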

DSGE models in the ‘New Keynesian’ repair shop

May 8, 2018 7 comments

from Lars Syll

The problem of the DSGE-models (and more generally of rational expectations macroeconomic models) is that they assume extraordinary cognitive capabilities of individual agents. Recent developments in other disciplines including psychology and brain science overwhelmingly document that individual agents struggle with limited cognitive abilities, restricting their capacity to understand the world. As a result, individual agents use small bits of information and simple rules to guide their behavior.

The fact that the assumption of rational expectations is implausible does not necessarily mean that models using such an assumption cannot be powerful tools in making empirical predictions. The problem, however, is that rational expectations macroeconomic models make systematically wrong predictions, in particular about the speed with which prices adjust. This empirical failure could have led the profession of macroeconomists to drop the model and to look for another one. Instead, macroeconomists decided to stick to the rational expectations model but to load it with a series of ad-hoc repairs that were motivated by a desire to improve its fit. These repair operations most often involved adding lags to the models so as to create sufficient inertia in variables. These operations were successful in the sense that the fit was significantly improved. In another sense, however, they were failures because the inertia-building tricks are really departures from rationality. As a result, the present DSGE-models create a dynamics the largest part of which is the result of the ad-hoc repair operations. These have nothing to do with optimizing behavior and rationality of expectations. In a way it can be said that these ad-hoc repairs introduced heuristics in the model through the back door.

The success of the DSGE-model has much to do with the story it tells about how the macroeconomy functions. This is a story in which rationality of superbly informed and identical agents reigns … We have questioned this story by presenting an alternative one. This is a story in which agents do not understand the model well, and use a trial and error learning strategy to discover its underlying logic …
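
De Grauwe’s point about adding lags can be illustrated with a toy simulation. The sketch below is mine, not his: under rational expectations with i.i.d. shocks, a purely forward-looking price process inherits the shocks’ lack of persistence, while the same process with an ad-hoc lag bolted on displays the inertia found in the data. Parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5_000
shocks = rng.normal(size=T)

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a series."""
    return np.corrcoef(x[:-1], x[1:])[0, 1]

# Purely forward-looking benchmark: with i.i.d. shocks the rational
# expectations solution is itself i.i.d. -- no inertia at all.
pi_forward = shocks

# 'Repaired' model: an ad-hoc lag added purely to improve the fit,
# pi_t = gamma * pi_{t-1} + shock_t.
gamma = 0.8
pi_lagged = np.zeros(T)
for t in range(1, T):
    pi_lagged[t] = gamma * pi_lagged[t - 1] + shocks[t]

print(f"forward-looking model: lag-1 autocorrelation = {lag1_autocorr(pi_forward):+.2f}")
print(f"with ad-hoc lag:       lag-1 autocorrelation = {lag1_autocorr(pi_lagged):+.2f}")
# All of the persistence (~0.8) comes from the bolted-on lag, not from
# optimizing behaviour: De Grauwe's 'heuristics through the back door'.
```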

Read more…

The pretense-of-knowledge syndrome

May 6, 2018 17 comments

from Lars Syll

What does concern me about my discipline … is that its current core — by which I mainly mean the so-called dynamic stochastic general equilibrium approach — has become so mesmerized with its own internal logic that it has begun to confuse the precision it has achieved about its own world with the precision that it has about the real one …

While it often makes sense to assume rational expectations for a limited application to isolate a particular mechanism that is distinct from the role of expectations formation, this assumption no longer makes sense once we assemble the whole model. Agents could be fully rational with respect to their local environments and everyday activities, but they are most probably nearly clueless with respect to the statistics about which current macroeconomic models expect them to have full information and rational information processing abilities.

This issue is not one that can be addressed by adding a parameter capturing a little bit more risk aversion about macro-economic, rather than local, phenomena. The reaction of human beings to the truly unknown is fundamentally different from the way they deal with the risks associated with a known situation and environment … In realistic, real-time settings, both economic agents and researchers have a very limited understanding of the mechanisms at work. This is an order-of-magnitude less knowledge than our core macroeconomic models currently assume, and hence it is highly likely that the optimal approximation paradigm is quite different from current workhorses, both for academic and policy work. In trying to add a degree of complexity to the current core models, by bringing in aspects of the periphery, we are simultaneously making the rationality assumptions behind that core approach less plausible …

Read more…

Mathematics and economics

May 5, 2018 45 comments

from Lars Syll

Many mainstream economists have the idea that because heterodox people — like yours truly — often criticize the application of mathematics in economics, we are critical of math per se.

This is totally unfounded and ridiculous. I do not know how many times I have been asked to answer this straw-man objection to heterodox economics.

No, there is nothing wrong with mathematics per se.

No, there is nothing wrong with applying mathematics to economics.

Mathematics is one valuable tool among other valuable tools for understanding and explaining things in economics.

What are, however, totally wrong are the utterly simplistic beliefs that

• “math is the only valid tool”

• “math is always and everywhere self-evidently applicable”

• “math is all that really counts”

• “if it’s not in math, it’s not really economics”

• “almost everything can be adequately understood and analyzed with math”

That said, let us never forget that without strong evidence all kinds of absurd claims and nonsense may pretend to be science. Using math can never be a substitute for thinking. Or as Paul Romer has it in his showdown with ‘post-real’ Chicago economics:

Math cannot establish the truth value of a fact. Never has. Never will.

Financial regulations

May 4, 2018 26 comments

from Lars Syll

A couple of years ago, former chairman of the Fed Alan Greenspan wrote an article in the Financial Times regarding the increased demands for stronger regulation of banks and finance:

[Photo caption: Alan Greenspan and Ayn Rand at the White House after Greenspan was sworn in as chairman of Gerald Ford’s Council of Economic Advisers, September 1974]

Since the devastating Japanese earthquake and, earlier, the global financial tsunami, governments have been pressed to guarantee their populations against virtually all the risks exposed by those extremely low probability events. But should they? Guarantees require the building up of a buffer of idle resources that are not otherwise engaged in the production of goods and services. They are employed only if, and when, the crisis emerges.

The buffer may encompass expensive building materials whose earthquake flexibility is needed for only a minute or two every century, or an extensive stock of vaccines for a feared epidemic that may never occur. Any excess bank equity capital also would constitute a buffer that is not otherwise available to finance productivity-enhancing capital investment.

That is — to say the least — astonishing. Not wanting to take genuine uncertainty or ‘fat tails’ seriously is ominous enough. If there is anything the year 2008 taught us, it is that ‘tail risks’ are genuinely real and must be included in all financial calculations. But even worse is how someone – who surely ought to have taken at least an introductory course in economics – can get the idea that demanding higher capital requirements for banks would be equivalent to building buffers of ‘idle resources.’ From an economist’s point of view, the claim is absolute nonsense.
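
The accounting point can be seen on a stylized balance sheet. Below is a minimal sketch with made-up numbers: equity capital is a way of funding a bank’s assets, not a pool of assets set aside, so a higher capital requirement changes how the same loan book is financed rather than idling resources.

```python
# Stylized bank balance sheet (made-up numbers, say billions).
# Equity is a funding source on the liability side of the balance
# sheet, not a stock of idle assets: raising the required equity
# ratio from 5% to 10% changes the funding mix of the SAME assets.
total_assets = 100  # loans and other earning assets

for required_ratio in (0.05, 0.10):
    equity = required_ratio * total_assets
    debt = total_assets - equity
    print(f"equity requirement {required_ratio:.0%}: assets = {total_assets}, "
          f"funded by debt {debt:.0f} + equity {equity:.0f}")
    # Every unit of the balance sheet is still financing loans;
    # nothing has been parked in an 'idle' buffer.
```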

Read more…

MMT — the Wicksell connection

May 3, 2018 21 comments

from Lars Syll

Most mainstream economists seem to think the idea behind Modern Monetary Theory is something new that some wild heterodox economic cranks have come up with.

New? Cranks? How about reading one of the great founders of neoclassical economics — Knut Wicksell. This is what Wicksell wrote in 1898 on ‘pure credit systems’ in Interest and Prices (Geldzins und Güterpreise):

It is possible to go even further. There is no real need for any money at all if a payment between two customers can be accomplished by simply transferring the appropriate sum of money in the books of the bank …

A pure credit system has not yet … been completely developed in this form. But here and there it is to be found in the somewhat different guise of the banknote system …

We intend therefore, as a basis for the following discussion, to imagine a state of affairs in which money does not actually circulate at all, neither in the form of coin … nor in the form of notes, but where all domestic payments are effected by means of the Giro system and bookkeeping transfers. A thorough analysis of this purely imaginary case seems to me to be worth while, for it provides a precise antithesis to the equally imaginary case of a pure cash system, in which credit plays no part whatever [the exact equivalent of the often used neoclassical model assumption of ‘cash in advance’ – LPS] …

For the sake of simplicity, let us then assume that the whole monetary system of a country is in the hands of a single credit institution, provided with an adequate number of branches, at which each independent economic individual keeps an account on which he can draw cheques.
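
Wicksell’s pure credit system is, in modern terms, nothing but a ledger. The following minimal sketch (account names and balances invented for illustration) shows a domestic payment effected purely by bookkeeping transfer, with no coin or note circulating:

```python
# Wicksell's 'pure credit system' as a single bank's books:
# payments are bookkeeping transfers; no money circulates.
# (Account names and balances are invented for illustration.)
ledger = {"farmer": 500, "miller": 200}

def pay(payer, payee, amount):
    """Settle a payment by transferring the sum in the books of the bank."""
    if ledger[payer] < amount:
        # In Wicksell's system the bank could instead extend credit,
        # creating a new deposit -- the heart of the MMT connection.
        raise ValueError("insufficient balance; bank credit needed")
    ledger[payer] -= amount
    ledger[payee] += amount

pay("farmer", "miller", 150)  # a cheque drawn on the farmer's account
print(ledger)                 # {'farmer': 350, 'miller': 350}
```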

Read more…

The Lucas critique comes back with a vengeance in DSGE models

May 1, 2018 3 comments

from Lars Syll

Both approaches to DSGE macroeconometrics (VAR and Bayesian) have evident vulnerabilities, which substantially derive from how parameters are handled in the technique. In brief, parameters from formally elegant models are calibrated in order to obtain simulated values that reproduce some stylized fact and/or some empirical data distribution, thus relating the underlying theoretical model and the observational data. But there are at least three main respects in which this practice fails.

First of all, DSGE models have substantial difficulties in taking account of many important mechanisms that actually govern real economies, for example, institutional constraints like the tax system, thereby reducing DSGE power in policy analysis … In the attempt to deal with this serious problem, various parameter constraints on the model policy block are provided. They derive from institutional analysis and reflect policymakers’ operational procedures. However, such model extensions, which are intended to reshape its predictions to reality and to deal with the underlying optimization problem, prove to be highly inflexible, turning DSGE into a “straitjacket tool” … In particular, the structure imposed on DSGE parameters entails various identification problems, such as observational equivalence, underidentification, and partial and weak identification.  Read more…

The loanable funds fallacy

April 27, 2018 12 comments

from Lars Syll

The loanable funds theory is in many regards nothing but an approach where the ruling rate of interest in society is — pure and simple — conceived as nothing else than the price of loans or credits set by banks and determined by supply and demand — as Bertil Ohlin put it — “in the same way as the price of eggs and strawberries on a village market.”

It is a beautiful fairy tale, but the problem is that banks are not barter institutions that transfer pre-existing loanable funds from depositors to borrowers. Why? Because, in the real world, there simply are no pre-existing loanable funds. Banks create new funds — credit — only if someone has previously got into debt! Banks are monetary institutions, not barter vehicles.

In the traditional loanable funds theory — as presented in mainstream macroeconomics textbooks — the amount of loans and credit available for financing investment is constrained by how much saving is available. Saving is the supply of loanable funds; investment is the demand for loanable funds and is assumed to be negatively related to the interest rate. Lowering households’ consumption means increasing saving, which, via a lower interest rate, results in higher investment.
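
For reference, the textbook mechanism under criticism can be written down in a few lines. A minimal sketch with invented linear schedules: saving supplies funds, investment demands them, and the interest rate is simply the price that clears the market.

```python
# The textbook 'loanable funds' story in miniature (invented linear
# schedules; the mechanism, not the numbers, is the point).
def saving(r):
    return 10 + 200 * r   # S(r): supply of loanable funds, rising in r

def investment(r):
    return 50 - 300 * r   # I(r): demand, negatively related to r

# Equilibrium: S(r*) = I(r*)  =>  10 + 200 r = 50 - 300 r  =>  r* = 0.08
r_star = (50 - 10) / (200 + 300)
print(f"r* = {r_star:.2%}, funds lent = {saving(r_star):.0f}")

# Textbook corollary: consume less, and the saving schedule shifts up,
# lowering r* and mechanically raising investment.
def saving_thriftier(r):
    return 20 + 200 * r

r_new = (50 - 20) / (200 + 300)
print(f"after higher saving: r* = {r_new:.2%}, investment = {investment(r_new):.0f}")
# This is precisely the causal chain the post disputes: real-world
# banks create deposits when they lend; they do not ration out a
# pre-existing pool of savings.
```

Read more…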

The case for a new economics

April 24, 2018 31 comments

from Lars Syll

When the great crash hit a decade ago, the public realised that the economics profession was clueless …

After 10 years in the shadow of the crisis, the profession’s more open minds have recognised there is serious re-thinking to be done …

But the truth is that most of the “reforms” have been about adding modules to the basic template, leaving the core of the old discipline essentially intact. My view is that this is insufficient, and treats the symptoms rather than the underlying malaise …

If we accept that we need fundamental reform, what should the new economics—“de-conomics” as I’m calling it—look like?

First, we need to accept that there is no such thing as “value-free” analysis of the economy. As I’ve explained, neoclassical economics pretends to be ethically neutral while smuggling in an individualistic, anti-social ethos …

Second, the analysis needs to be based around how human beings actually operate—rather than how neoclassicism asserts that “rational economic person (or firm)” should operate …

Third, we need to put the good life centre stage, rather than prioritising the areas that are most amenable to analysis via late-19th century linear mathematics. Technological progress and power relationships between firms, workers and governments need to be at the heart of economic discourse and research …

Finally, economics needs to be pluralistic. For the last half-century neoclassical economics has been gradually colonising other social science disciplines such as sociology and political science. It is high time this process reversed itself so that there was two-way traffic and a mutually beneficial learning exchange between disciplines. It is possible—and probably desirable—that the “deconomics” of the future looks more like psychology, sociology or anthropology than it does today’s arid economics …

The change I am seeking is no more fundamental than the transition from classical to neoclassical economics, and that was accomplished without the discipline imploding. And this time around we’ve got then-unimaginable data and other resources. So there can be no excuse for delay. Let economists free themselves of a misleading map, and then—with clear eyes—look at the world anew.

Howard Reed/Prospect Magazine

Read more…

The tractability hoax in modern economics

April 22, 2018 8 comments

from Lars Syll

While the paternity of the theoretical apparatus underlying the new neoclassical synthesis in macro is contested, there is wide agreement that the methodological framework was largely architected by Robert Lucas … Bringing in a representative agent meant forgoing the possibility of tackling inequality, redistribution and justice concerns. Was it deliberate? How much does this choice owe to tractability? What macroeconomists were chasing, in these years, was a renewed explanation of the business cycle. They were trying to write microfounded and dynamic models …

Rational expectations imposed cross-equation restrictions, yet estimating these new models substantially raised the computing burden. Assuming a representative agent mitigated computational demands, and allowed macroeconomists to get away with general equilibrium aggregate issues: it made new-classical models analytically and computationally tractable …

Was tractability the main reason why Lucas embraced the representative agent (and market clearing)? Or could he have improved tractability through alternative hypotheses, leading to opposed policy conclusions? … Some macroeconomists may have endorsed the new class of Lucas-critique-proof models because they liked its policy conclusions. Others may have retained some hypotheses, then some simplifications, “because it makes the model tractable.” And while the limits of simplifying assumptions are often emphasized by those who propose them, as they spread, caveats are forgotten. Tractability restricts the range of accepted models and prevents economists from discussing some social issues, and with time, from even “seeing” them. Tractability ‘filters’ economists’ reality … The aggregate effect of “looking for tractable models” is unknown, and yet it is crucial to understand the current state of economics.

Beatrice Cherrier

Read more…

Sometimes we do not know because we cannot know

April 21, 2018 42 comments

from Lars Syll

Some time ago, the Bank of England’s Andrew G. Haldane and Benjamin Nelson presented a paper with the title Tails of the unexpected. The main message of the paper was that we should not let ourselves be fooled by randomness:

The normal distribution provides a beguilingly simple description of the world. Outcomes lie symmetrically around the mean, with a probability that steadily decays. It is well-known that repeated games of chance deliver random outcomes in line with this distribution: tosses of a fair coin, sampling of coloured balls from a jam-jar, bets on a lottery number, games of paper/scissors/stone. Or have you been fooled by randomness?

Normality has been an accepted wisdom in economics and finance for a century or more. Yet in real-world systems, nothing could be less normal than normality. Tails should not be unexpected, for they are the rule. As the world becomes increasingly integrated – financially, economically, socially – interactions among the moving parts may make for potentially fatter tails. Catastrophe risk may be on the rise.

If public policy treats economic and financial systems as though they behave like a lottery – random, normal – then public policy risks itself becoming a lottery. Preventing public policy catastrophe requires that we better understand and plot the contours of systemic risk, fat tails and all. It also means putting in place robust fail-safes to stop chaos emerging, the sand pile collapsing, the forest fire spreading. Until then, normal service is unlikely to resume.

Since I think this is a great paper, it merits a couple of comments.
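
Before getting to the comments, the tail point can be made concrete with a small simulation. The sketch is mine, not the paper’s: it compares a normal distribution with a fat-tailed Student-t (3 degrees of freedom), both scaled to unit variance, and counts ‘4-sigma’ events.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Normal draws vs. a fat-tailed alternative: Student-t with 3 degrees
# of freedom, rescaled to unit variance (Var of t(3) is 3) so the
# comparison is like-for-like.
normal = rng.normal(size=n)
fat = rng.standard_t(df=3, size=n) / np.sqrt(3)

for name, x in [("normal", normal), ("student-t(3)", fat)]:
    p4 = np.mean(np.abs(x) > 4)   # frequency of '4-sigma' events
    print(f"{name:13s} share of |x| > 4 sd: {p4:.6f}   largest draw: {np.abs(x).max():.1f}")
# Under normality a 4-sigma event shows up about 6 times in 100,000
# draws; with fat tails it is roughly a hundred times more common --
# 'tails should not be unexpected, for they are the rule.'
```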

Read more…

DSGE models — overconfident macroeconomic story-telling

April 19, 2018 5 comments

from Lars Syll

A recent paper by Christiano, Eichenbaum and Trabandt (C.E.T.) on Dynamic Stochastic General Equilibrium Models (DSGEs) has generated quite a reaction in the blogosphere …

Bradford DeLong points out that new Keynesian models were constructed to show that old Keynesian and old Monetarist policy conclusions were relatively robust, and not blown out of the water by rational expectations … The DSGE framework was then constructed so that new Keynesians could talk to RBCites. None of this has, so far, materially advanced the project of understanding the macroeconomic policy-relevant emergent properties of really existing industrial and post-industrial economies …  Read more…

Shortcomings of regression analysis

April 17, 2018 74 comments

from Lars Syll

Distinguished social psychologist Richard E. Nisbett has a somewhat atypical aversion to multiple regression analysis. In his Intelligence and How to Get It (Norton 2011) he writes:

Researchers often determine the individual’s contemporary IQ or IQ earlier in life, socioeconomic status of the family of origin, living circumstances when the individual was a child, number of siblings, whether the family had a library card, educational attainment of the individual, and other variables, and put all of them into a multiple-regression equation predicting adult socioeconomic status or income or social pathology or whatever. Researchers then report the magnitude of the contribution of each of the variables in the regression equation, net of all the others (that is, holding constant all the others). It always turns out that IQ, net of all the other variables, is important to outcomes. But … the independent variables pose a tangle of causality – with some causing others in goodness-knows-what ways and some being caused by unknown variables that have not even been measured. Higher socioeconomic status of parents is related to educational attainment of the child, but higher-socioeconomic-status parents have higher IQs, and this affects both the genes that the child has and the emphasis that the parents are likely to place on education and the quality of the parenting with respect to encouragement of intellectual skills and so on. So statements such as “IQ accounts for X percent of the variation in occupational attainment” are built on the shakiest of statistical foundations. What nature hath joined together, multiple regressions cannot put asunder.
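
Nisbett’s ‘tangle of causality’ can be demonstrated with simulated data. In the minimal sketch below (all variable names and effect sizes are invented), an unmeasured factor drives both measured IQ and adult income, while IQ itself has no direct effect at all; the regression nevertheless reports a solid IQ coefficient ‘net of all the others’.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Invented structural story: an unmeasured factor (say, quality of
# parenting) raises BOTH measured IQ and adult income. IQ has no
# direct effect on income in this toy world.
parenting = rng.normal(size=n)           # unmeasured confounder
ses = rng.normal(size=n)                 # measured family SES
iq = 100 + 9 * parenting + 6 * ses + rng.normal(scale=10, size=n)
income = 2 * parenting + 1 * ses + rng.normal(size=n)   # no IQ term!

# OLS of income on IQ and SES ('holding SES constant').
X = np.column_stack([np.ones(n), iq, ses])
beta, *_ = np.linalg.lstsq(X, income, rcond=None)
print(f"coefficient on IQ: {beta[1]:.3f}   (true direct effect: 0)")
# IQ 'matters, net of all the other variables', yet by construction it
# causes nothing: the regression recycles the unmeasured parenting
# pathway, exactly as Nisbett warns.
```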

Read more…

Modeling economic risk

April 14, 2018 14 comments

from Lars Syll

Model builders face a constant temptation to make things more complicated than necessary because this makes them appear more capable than they really are.

Remember that a model is not the truth.

It is a lie to help you get your point across.

And in the case of modeling economic risk, your model is a lie about others, who are probably lying themselves.

And what’s worse than a simple lie?

A complicated lie.

The wisdom of crowds

April 14, 2018 7 comments

from Lars Syll

A classic demonstration of group intelligence is the jelly-beans-in-the-jar experiment, in which invariably the group’s estimate is superior to the vast majority of the individual guesses. When finance professor Jack Treynor ran the experiment in his class with a jar that held 850 beans, the group estimate was 871. Only one of the fifty-six people in the class made a better guess.

There are two lessons to draw from these experiments. First, in most of them the members of the group were not talking to each other or working on a problem together. They were making individual guesses, which were aggregated and then averaged … Second, the group’s guess will not be better than that of every single person in the group each time. In many (perhaps most) cases, there will be a few people who do better than the group. This is, in some sense, a good thing, since especially in situations where there is an incentive for doing well (like, say, the stock market) it gives people reason to keep participating. But there is no evidence in these studies that certain people consistently outperform the group. In other words, if you run ten different jelly-bean-counting experiments, it’s likely that each time one or two students will outperform the group. But they will not be the same students each time. Over the ten experiments, the group’s performance will almost certainly be the best possible. The simplest way to get reliably good answers is just to ask the group each time.
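
The jelly-bean result is straightforward to reproduce in simulation. A minimal sketch, loosely matching Treynor’s 850 beans and 56 students (the multiplicative noise model is my own assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
true_count = 850    # beans in Treynor's jar
n_students = 56
n_runs = 10

for run in range(1, n_runs + 1):
    # Invented noise model: each guess is the truth times a lognormal
    # error, since people misjudge volume multiplicatively.
    guesses = true_count * rng.lognormal(mean=0.0, sigma=0.3, size=n_students)
    group = guesses.mean()
    beat_group = np.sum(np.abs(guesses - true_count) < abs(group - true_count))
    print(f"run {run:2d}: group estimate {group:5.0f}, "
          f"individuals beating it: {beat_group:2d}/{n_students}")
# A few individuals beat the average in any single run, but not the
# same ones each time; across runs the group estimate stays reliably
# near the truth.
```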

Read more…

Keynesian uncertainty

April 12, 2018 45 comments

from Lars Syll

In “modern” macroeconomics — Dynamic Stochastic General Equilibrium, New Synthesis, New Classical and New ‘Keynesian’ — variables are treated as if drawn from a known “data-generating process” that unfolds over time and on which we therefore have access to heaps of historical time-series. If we do not assume that we know the “data-generating process” – if we do not have the “true” model – the whole edifice collapses. And of course, it has to. Who really honestly believes that we have access to this mythical Holy Grail, the data-generating process?

“Modern” macroeconomics obviously did not anticipate the enormity of the problems that unregulated “efficient” financial markets created. Why? Because it builds on the myth of us knowing the “data-generating process” and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances. Read more…

How to get published in top economics journals

April 9, 2018 2 comments

from Lars Syll

By the early 1980s it was already common knowledge among people I hung out with that the only way to get non-crazy macro-economics published was to wrap sensible assumptions about output and employment in something else, something that involved rational expectations and intertemporal stuff and made the paper respectable. And yes, that was conscious knowledge, which shaped the kinds of papers we wrote.

Paul Krugman

More or less says it all, doesn’t it?

And for those of us who do not want to play according to those sickening hypocritical rules — well, here’s one particularly good alternative.

Keynes vs. Keynesianism

April 8, 2018 17 comments

from Lars Syll

But these more recent writers like their predecessors were still dealing with a system in which the amount of the factors employed was given and the other relevant facts were known more or less for certain. This does not mean that they were dealing with a system in which change was ruled out, or even one in which the disappointment of expectation was ruled out. But at any given time facts and expectations were assumed to be given in a definite and calculable form; and risks, of which, tho admitted, not much notice was taken, were supposed to be capable of an exact actuarial computation. The calculus of probability, tho mention of it was kept in the background, was supposed to be capable of reducing uncertainty to the same calculable status as that of certainty itself …

Thus the fact that our knowledge of the future is fluctuating, vague and uncertain, renders Wealth a peculiarly unsuitable subject for the methods of the classical economic theory.

John Maynard Keynes, QJE 1937

And this emphasis on the importance of uncertainty is not even mentioned in IS-LM or ‘New’ Keynesianism …