
Why the p-value is a poor substitute for scientific reasoning

May 24, 2018 10 comments

from Lars Syll

A non-trivial part of teaching statistics consists of teaching students how to perform significance tests. A problem I have noticed repeatedly over the years, however, is that no matter how careful you try to be in explicating what the probabilities generated by these statistical tests really are, most students still misinterpret them.

This is not to be blamed on students’ ignorance, but rather on significance testing not being particularly transparent (conditional probability inference is difficult even for those of us who teach and practice it). A lot of researchers fall prey to the same mistakes. Read more…
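A small simulation makes the misreading concrete. The sketch below is my own illustration, not part of the original post; the 10% share of true hypotheses and the effect size are assumptions. It shows that a p-value below 0.05 is not the probability that the null hypothesis is false: with a low base rate of real effects, a sizeable share of ‘significant’ findings are false positives.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_studies, n_obs = 10_000, 30
true_share = 0.1   # assumed: only 10% of tested hypotheses are actually true
effect = 0.5       # assumed effect size (in SD units) when an effect is real

is_real = rng.random(n_studies) < true_share
means = np.where(is_real, effect, 0.0)
samples = rng.normal(means[:, None], 1.0, size=(n_studies, n_obs))

# One-sample t-test of each simulated study against zero
t_stat, p_val = stats.ttest_1samp(samples, 0.0, axis=1)
significant = p_val < 0.05

# Among the 'significant' results, how many are false positives?
false_discovery = np.mean(~is_real[significant])
print(f"share of significant results that are false positives: {false_discovery:.2f}")
```

With these assumptions roughly a third of all ‘significant’ results are false positives, even though every single one of them passed the 5% threshold.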

Krugman’s Gadget Keynesianism

May 23, 2018 4 comments

from Lars Syll

Paul Krugman has often been criticized by people like yours truly for getting things pretty wrong on the economics of John Maynard Keynes.

When Krugman has responded to the critique, which he himself rather gratuitously portrays as being about “What Keynes Really Meant,” the overall conclusion is — “Krugman Doesn’t Care.”

Responding to an earlier post of mine questioning his IS-LM-Keynesianism, Krugman writes:

Surely we don’t want to do economics via textual analysis of the masters. The questions one should ask about any economic approach are whether it helps us understand what’s going on and whether it provides useful guidance for decisions.

So I don’t care whether Hicksian IS-LM is Keynesian in the sense that Keynes himself would have approved of it, and neither should you.

The reason for this rather debonair attitude seems to be that history of economic thought may be OK, but what really counts is if reading Keynes gives birth to new and interesting insights and ideas.

No serious economist would question that explaining and understanding “what’s going on” in our economies is the most important task economists can set themselves — but it is not the only task. And to compare one’s favourite economic gadget model to what madmen from Chicago have conjured up, well, that’s like playing tennis with the net down, and we have to have higher aspirations as scientists. Read more…

Statisticism — confusing statistics and research

May 22, 2018 3 comments

from Lars Syll

Coupled with downright incompetence in statistics, we often find the syndrome that I have come to call statisticism: the notion that computing is synonymous with doing research, the naïve faith that statistics is a complete or sufficient basis for scientific methodology, the superstition that statistical formulas exist for evaluating such things as the relative merits of different substantive theories or the “importance” of the causes of a “dependent variable”; and the delusion that decomposing the covariations of some arbitrary and haphazardly assembled collection of variables can somehow justify not only a “causal model” but also, praise a mark, a “measurement model.” There would be no point in deploring such caricatures of the scientific enterprise if there were a clearly identifiable sector of social science research wherein such fallacies were clearly recognized and emphatically out of bounds.

Otis Dudley Duncan

Statistical reasoning certainly seems paradoxical to most people.

Take for example the well-known Simpson’s paradox.
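Here is a minimal sketch of it in Python. The numbers are the classic kidney-stone-treatment illustration, not data from any study discussed here: treatment A does better in both subgroups yet worse in the pooled data, because case severity is unevenly mixed across the two treatments.

```python
# (successes, patients) for two treatments, split by case severity
mild   = {"A": (81, 87),   "B": (234, 270)}
severe = {"A": (192, 263), "B": (55, 80)}

def rate(successes, n):
    return successes / n

for name, group in [("mild", mild), ("severe", severe)]:
    print(name, {t: round(rate(*group[t]), 2) for t in group})
# mild   {'A': 0.93, 'B': 0.87}   -> A wins
# severe {'A': 0.73, 'B': 0.69}   -> A wins again

pooled = {t: (mild[t][0] + severe[t][0], mild[t][1] + severe[t][1]) for t in ("A", "B")}
print("pooled", {t: round(rate(*pooled[t]), 2) for t in pooled})
# pooled {'A': 0.78, 'B': 0.83}   -> yet B wins overall
```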

Read more…

Schumpeter — an early champion of MMT

May 20, 2018 17 comments

from Lars Syll

Evidently this phenomenon is peculiar to money and has no analogue in the world of commodities. No claim to sheep increases the number of sheep. But a deposit, though legally only a claim to legal-tender money, serves within very wide limits the same purposes that this money itself would serve. Banks do not, of course, ‘create’ legal-tender money and still less do they ‘create’ machines. They do, however, something—it is perhaps easier to see this in the case of the issue of banknotes—which, in its economic effects, comes pretty near to creating legal-tender money and which may lead to the creation of ‘real capital’ that could not have been created without this practice. But this alters the analytic situation profoundly and makes it highly inadvisable to construe bank credit on the model of existing funds’ being withdrawn from previous uses by an entirely imaginary act of saving and then lent out by their owners. It is much more realistic to say that the banks ‘create credit,’ that is, that they create deposits in their act of lending, than to say that they lend the deposits that have been entrusted to them. And the reason for insisting on this is that depositors should not be invested with the insignia of a role which they do not play. The theory to which economists clung so tenaciously makes them out to be savers when they neither save nor intend to do so; it attributes to them an influence on the ‘supply of credit’ which they do not have. The theory of ‘credit creation’ not only recognizes patent facts without obscuring them by artificial constructions; it also brings out the peculiar mechanism of saving and investment that is characteristic of full-fledged capitalist society and the true role of banks in capitalist evolution. With less qualification than has to be added in most cases, this theory therefore constitutes definite advance in analysis.
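Schumpeter’s point about deposit creation can be put in toy double-entry form. A minimal sketch (my illustration, not Schumpeter’s): when the bank lends, both sides of its balance sheet expand at once, and the borrower’s deposit is created by the loan rather than collected from a prior saver.

```python
# A toy bank balance sheet: loans on the asset side, deposits as liabilities
bank = {"assets": {"loans": 0}, "liabilities": {"deposits": 0}}

def make_loan(bank, amount):
    # The loan (asset) and the borrower's deposit (liability) arise together
    bank["assets"]["loans"] += amount
    bank["liabilities"]["deposits"] += amount

make_loan(bank, 100)
print(bank)
# {'assets': {'loans': 100}, 'liabilities': {'deposits': 100}}
# No prior saver was needed: the deposit is a by-product of the loan.
```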

Oh dear, oh dear, Krugman gets it so wrong, so wrong

May 17, 2018 3 comments

from Lars Syll

Economics is a science of thinking in terms of models joined to the art of choosing models which are relevant to the contemporary world. It is compelled to be this, because, unlike the typical natural science, the material to which it is applied is, in too many respects, not homogeneous through time. The object of a model is to segregate the semi-permanent or relatively constant factors from those which are transitory or fluctuating so as to develop a logical way of thinking about the latter, and of understanding the time sequences to which they give rise in particular cases … Good economists are scarce because the gift for using “vigilant observation” to choose good models, although it does not require a highly specialised intellectual technique, appears to be a very rare one.

J. M. Keynes in letter to Roy Harrod (1938)

I came to think of this passage when I read “sort of New Keynesian” economist Paul Krugman’s blog in the ongoing discussion on the state of macro. Krugman argues that even though he and other “sort of New Keynesian” macroeconomists use the same “equipment” as RBC-New-Classical-freshwater macroeconomists, he resents the allegation that they are sharing the same endeavour. Krugman writes … Read more…

Busting the NAIRU myth

May 14, 2018 Leave a comment

from Lars Syll

Even as it became conventional wisdom, the supposed relationship between unemployment and increasing or decreasing rates of inflation was breaking down — notably in the 1990s. Unemployment got below 4 percent in 2000 without inflation taking off. Since the onset of the Great Recession, the gap between theory and reality has only grown …

Once we see how weak the foundations for the natural rate of unemployment are, other arguments for pursuing rates of unemployment economists once thought impossible become more clear. Wages can increase at the expense of corporate profits without causing inflation. Indeed, since 2014 we are seeing an increase in the share of the economy that goes to labor.

Even better, lower unemployment doesn’t just help workers: It can spur overall growth. As the economist J.W. Mason argues, as we approach full employment incentives emerge for greater investment in labor-saving productivity, as companies seek to keep labor costs in check as workers demand more. This productivity increase stimulates yet more growth.

The harder we push on improving output and employment, the more we learn how much we can achieve on those two fronts. That hopeful idea is the polar opposite of a natural, unalterable rate of unemployment. And it’s an idea and attitude that we need to embrace if we’re to have a shot at fully recovering from the wreckage of the Great Recession.

Mike Konczal/Vox 
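The breakdown Konczal describes is easy to check for yourself. A rough sketch follows (the FRED series codes UNRATE and CPIAUCSL are real; the sample choice and the simple bivariate regression are my assumptions, not anything from the Vox piece): under a stable accelerationist Phillips curve the slope should be clearly negative, while recent samples tend to show it close to flat.

```python
import numpy as np
import pandas as pd
from pandas_datareader import data as pdr

# Annual averages of US unemployment and CPI inflation since 1990
u = pdr.DataReader("UNRATE", "fred", "1990-01-01")["UNRATE"].resample("A").mean()
cpi = pdr.DataReader("CPIAUCSL", "fred", "1990-01-01")["CPIAUCSL"].resample("A").mean()
inflation = cpi.pct_change() * 100
d_inflation = inflation.diff()            # the *change* in inflation

df = pd.concat([u, d_inflation], axis=1, keys=["u", "d_infl"]).dropna()
slope = np.polyfit(df["u"], df["d_infl"], 1)[0]
print(f"slope of Δinflation on unemployment: {slope:.3f}")
```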

Read more…

Textbooks — peddling lies about money and finance

May 11, 2018 3 comments

from Lars Syll

A couple of years ago — in a debate with James Galbraith and Willem Buiter — Paul Krugman made it perfectly clear that he was a strong believer in the ‘loanable funds’ theory.

Unfortunately, this is not an exception among ‘New Keynesian’ economists.

Neglecting anything resembling a real-world finance system, Greg Mankiw — in his intermediate textbook Macroeconomics — more or less equates finance to the neoclassical thought-construction of a ‘market for loanable funds.’

On the subject of financial crises, he admits that

perhaps we should view speculative excess and its ramifications as an inherent feature of market economies … but preventing them entirely may be too much to ask given our current knowledge.

This is, of course, self-evident for all of us who understand that genuine uncertainty makes any such hopes totally unfounded. But it’s rather odd to read this in a book that bases its models on assumptions of rational expectations, representative actors and dynamic stochastic general equilibrium – assumptions that convey the view that markets – give or take a few rigidities and menu costs – are efficient! For someone who, like so many neoclassical economists, is so proud of consistent models, Mankiw is here flagrantly inconsistent! Read more…

Debunking the NAIRU hoax

May 10, 2018 7 comments

from Lars Syll

In our extended NAIRU model, labor productivity growth is included in the wage bargaining process … The logical consequence of this broadening of the theoretical canvas has been that the NAIRU becomes endogenous itself and ceases to be an attractor — Milton Friedman’s natural, stable and timeless equilibrium point from which the system cannot permanently deviate. In our model, a deviation from the initial equilibrium affects not only wages and prices (keeping the rest of the system unchanged) but also demand, technology, workers’ motivation, and work intensity; as a result, productivity growth and ultimately equilibrium unemployment will change. There is, in other words, nothing natural or inescapable about equilibrium unemployment, as is Friedman’s presumption, following Wicksell; rather, the NAIRU is a social construct, fluctuating in response to fiscal and monetary policies and labor market interventions. Its ephemeral (rather than structural) nature may explain why the best economists working on the NAIRU have persistently failed to agree on how high the NAIRU actually is and how to estimate it.

Servaas Storm & C. W. M. Naastepad

Many politicians and economists subscribe to the NAIRU story and its policy implication that attempts to promote full employment are doomed to fail, since governments and central banks cannot push unemployment below the critical NAIRU threshold without causing harmful runaway inflation. Read more…
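The endogeneity argument can be captured in a few lines. A stylized sketch (my own equations and parameter values, not Storm and Naastepad’s actual model): let the ‘natural’ rate drift toward actual unemployment while policy pushes actual unemployment down. The supposed attractor then simply follows the economy.

```python
import numpy as np

T = 40
gamma = 0.3      # assumed speed at which the NAIRU drifts toward actual u
phi = 0.5        # assumed strength of the expansionary policy push
target = 4.0     # assumed policy target for unemployment (%)

u = np.zeros(T); u_star = np.zeros(T)
u[0], u_star[0] = 9.0, 6.0          # start in a slump, NAIRU estimated at 6%

for t in range(1, T):
    u[t] = u[t-1] - phi * (u[t-1] - target)                  # policy lowers unemployment
    u_star[t] = u_star[t-1] + gamma * (u[t] - u_star[t-1])   # the 'NAIRU' follows

print(f"final u = {u[-1]:.2f}%, final 'NAIRU' = {u_star[-1]:.2f}%")
# Both end up near 4%: the 'equilibrium' rate moved with the policy stance.
```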

DSGE models in the ‘New Keynesian’ repair shop

May 8, 2018 7 comments

from Lars Syll

The problem of the DSGE-models (and more generally of rational expectations macroeconomic models) is that they assume extraordinary cognitive capabilities of individual agents. Recent developments in other disciplines including psychology and brain science overwhelmingly document that individual agents struggle with limited cognitive abilities, restricting their capacity to understand the world. As a result, individual agents use small bits of information and simple rules to guide their behavior.

The fact that the assumption of rational expectations is implausible does not necessarily mean that models using such an assumption cannot be powerful tools in making empirical predictions. The problem, however, is that rational expectations macroeconomic models make systematically wrong predictions, in particular about the speed with which prices adjust. This empirical failure could have led the profession of macroeconomists to drop the model and to look for another one. Instead, macroeconomists decided to stick to the rational expectations model but to load it with a series of ad-hoc repairs that were motivated by a desire to improve its fit. These repair operations most often involved adding lags to the models so as to create sufficient inertia in variables. These operations were successful in the sense that the fit was significantly improved. In another sense, however, they were failures because the inertia-building tricks are really departures from rationality. As a result, the present DSGE-models create a dynamics the largest part of which is the result of the ad-hoc repair operations. These have nothing to do with optimizing behavior and rationality of expectations. In a way it can be said that these ad-hoc repairs introduced heuristics in the model through the back door.

The success of the DSGE-model has much to do with the story it tells about how the macroeconomy functions. This is a story in which rationality of superbly informed and identical agents reigns … We have questioned this story by presenting an alternative one. This is a story in which agents do not understand the model well, and use a trial and error learning strategy to discover its underlying logic …
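What the ad hoc ‘repairs’ buy can be seen in a toy simulation (my sketch, not any specific published model): a purely forward-looking price equation inherits no persistence from white-noise shocks, so a backward-looking indexation lag is bolted on until simulated inflation is as inertial as the data.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5000
shocks = rng.normal(size=T)

def simulate(gamma_b):
    # pi_t = gamma_b * pi_{t-1} + shock; the lag term is the 'repair'
    pi = np.zeros(T)
    for t in range(1, T):
        pi[t] = gamma_b * pi[t-1] + shocks[t]
    return pi

for gamma_b in (0.0, 0.8):
    pi = simulate(gamma_b)
    autocorr = np.corrcoef(pi[1:], pi[:-1])[0, 1]
    print(f"indexation weight {gamma_b}: first-order autocorrelation {autocorr:.2f}")
# The persistence comes entirely from the added lag, not from optimizing behaviour.
```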

Read more…

The pretense-of-knowledge syndrome

May 6, 2018 17 comments

from Lars Syll

What does concern me about my discipline … is that its current core — by which I mainly mean the so-called dynamic stochastic general equilibrium approach — has become so mesmerized with its own internal logic that it has begun to confuse the precision it has achieved about its own world with the precision that it has about the real one …

While it often makes sense to assume rational expectations for a limited application to isolate a particular mechanism that is distinct from the role of expectations formation, this assumption no longer makes sense once we assemble the whole model. Agents could be fully rational with respect to their local environments and everyday activities, but they are most probably nearly clueless with respect to the statistics about which current macroeconomic models expect them to have full information and rational information processing capabilities.

This issue is not one that can be addressed by adding a parameter capturing a little bit more risk aversion about macro-economic, rather than local, phenomena. The reaction of human beings to the truly unknown is fundamentally different from the way they deal with the risks associated with a known situation and environment … In realistic, real-time settings, both economic agents and researchers have a very limited understanding of the mechanisms at work. This is an order-of-magnitude less knowledge than our core macroeconomic models currently assume, and hence it is highly likely that the optimal approximation paradigm is quite different from current workhorses, both for academic and policy work. In trying to add a degree of complexity to the current core models, by bringing in aspects of the periphery, we are simultaneously making the rationality assumptions behind that core approach less plausible …

Read more…

Mathematics and economics

May 5, 2018 45 comments

from Lars Syll

Many mainstream economists have the idea that because heterodox people — like yours truly — often criticize the application of mathematics in economics, we are critical of math per se.

This is totally unfounded and ridiculous. I do not know how many times I have been asked to answer this straw-man objection to heterodox economics.

No, there is nothing wrong with mathematics per se.

No, there is nothing wrong with applying mathematics to economics.

Mathematics is one valuable tool among other valuable tools for understanding and explaining things in economics.

What are totally wrong, however, are the utterly simplistic beliefs that

• “math is the only valid tool”

• “math is always and everywhere self-evidently applicable”

• “math is all that really counts”

• “if it’s not in math, it’s not really economics”

• “almost everything can be adequately understood and analyzed with math”

That said, let us never forget that without strong evidence all kinds of absurd claims and nonsense may pretend to be science. Using math can never be a substitute for thinking. Or as Paul Romer has it in his showdown with ‘post-real’ Chicago economics:

Math cannot establish the truth value of a fact. Never has. Never will.

Financial regulations

May 4, 2018 26 comments

from Lars Syll

A couple of years ago, the former chairman of the Fed, Alan Greenspan, wrote in an article in the Financial Times, regarding the increased demands for stronger regulation of banks and finance:

[Photo: Alan Greenspan and Ayn Rand at the White House after Greenspan was sworn in as chairman of Gerald Ford’s Council of Economic Advisers, September 1974]

Since the devastating Japanese earthquake and, earlier, the global financial tsunami, governments have been pressed to guarantee their populations against virtually all the risks exposed by those extremely low probability events. But should they? Guarantees require the building up of a buffer of idle resources that are not otherwise engaged in the production of goods and services. They are employed only if, and when, the crisis emerges.

The buffer may encompass expensive building materials whose earthquake flexibility is needed for only a minute or two every century, or an extensive stock of vaccines for a feared epidemic that may never occur. Any excess bank equity capital also would constitute a buffer that is not otherwise available to finance productivity-enhancing capital investment.

That is — to say the least — astonishing. Not wanting to take genuine uncertainty or ‘fat tails’ seriously is ominous enough. If there is anything the year 2008 taught us, it is that ‘tail risks’ are genuinely real and must be included in all financial calculations. But even worse is how someone – who surely ought to have taken at least an introductory course in economics – can get the idea that demands for higher capital requirements on banks would be equivalent to building buffers of ‘idle resources.’ The claim is, from an economist’s point of view, absolute nonsense.
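The accounting error is easy to exhibit. A stylized balance-sheet sketch (my illustration, with made-up numbers): equity is a funding item, a claim on the bank, not a stock of resources set aside. Raising the capital ratio changes who absorbs losses, not how many resources sit idle.

```python
def balance_sheet(loans, equity_ratio):
    # Equity and deposits/debt are alternative ways of *funding* the same assets
    equity = loans * equity_ratio
    deposits_and_debt = loans - equity
    return {"assets": {"loans": loans},
            "funding": {"equity": equity, "deposits/debt": deposits_and_debt}}

for ratio in (0.05, 0.20):
    print(f"capital ratio {ratio:.0%}:", balance_sheet(100, ratio))
# In both cases the same 100 of loans finance real activity; nothing sits 'idle'.
```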

Read more…

MMT — the Wicksell connection

May 3, 2018 21 comments

from Lars Syll

Most mainstream economists seem to think the idea behind Modern Monetary Theory is something new that some wild heterodox economic cranks have come up with.

New? Cranks? How about reading one of the great founders of neoclassical economics — Knut Wicksell. This is what Wicksell wrote in 1898 on ‘pure credit systems’ in Interest and Prices (Geldzins und Güterpreise):

It is possible to go even further. There is no real need for any money at all if a payment between two customers can be accomplished by simply transferring the appropriate sum of money in the books of the bank …

A pure credit system has not yet … been completely developed in this form. But here and there it is to be found in the somewhat different guise of the banknote system …

We intend therefore, as a basis for the following discussion, to imagine a state of affairs in which money does not actually circulate at all, neither in the form of coin … nor in the form of notes, but where all domestic payments are effected by means of the Giro system and bookkeeping transfers. A thorough analysis of this purely imaginary case seems to me to be worth while, for it provides a precise antithesis to the equally imaginary case of a pure cash system, in which credit plays no part whatever [the exact equivalent of the often used neoclassical model assumption of ‘cash in advance’ – LPS] …

For the sake of simplicity, let us then assume that the whole monetary system of a country is in the hands of a single credit institution, provided with an adequate number of branches, at which each independent economic individual keeps an account on which he can draw cheques.
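Wicksell’s imaginary pure credit economy is, in effect, a single ledger. A toy sketch (mine, and it ignores the overdraft credit that Wicksell’s bank would also extend): every payment is a bookkeeping transfer, and no coin or note ever circulates.

```python
# All customers hold accounts at one bank; a payment is just a book transfer
accounts = {"alice": 500, "bob": 200}

def giro_transfer(accounts, payer, payee, amount):
    # A fuller Wicksellian sketch would allow overdrafts (credit) here
    if accounts[payer] < amount:
        raise ValueError("insufficient balance")
    accounts[payer] -= amount
    accounts[payee] += amount

giro_transfer(accounts, "alice", "bob", 150)
print(accounts)   # {'alice': 350, 'bob': 350}: payment settled, no money moved
```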

Read more…

The Lucas critique comes back with a vengeance in DSGE models

May 1, 2018 3 comments

from Lars Syll

Both approaches to DSGE macroeconometrics (VAR and Bayesian) have evident vulnerabilities, which substantially derive from how parameters are handled in the technique. In brief, parameters from formally elegant models are calibrated in order to obtain simulated values that reproduce some stylized fact and/or some empirical data distribution, thus relating the underlying theoretical model and the observational data. But there are at least three main respects in which this practice fails.

First of all, DSGE models have substantial difficulties in taking account of many important mechanisms that actually govern real economies, for example, institutional constraints like the tax system, thereby reducing DSGE power in policy analysis … In the attempt to deal with this serious problem, various parameter constraints on the model policy block are provided. They derive from institutional analysis and reflect policymakers’ operational procedures. However such model extensions, which are intended to reshape its predictions to reality and to deal with the underlying optimization problem, prove to be highly inflexible, turning DSGE into a “straitjacket tool” … In particular, the structure imposed on DSGE parameters entails various identification problems, such as observational equivalence, underidentification, and partial and weak identification. Read more…
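Observational equivalence, the first of those identification problems, can be made concrete with a toy rational-expectations model (my example, not from the quoted paper). In y_t = a·E_t[y_{t+1}] + u_t with AR(1) shocks u_t = ρ·u_{t-1} + e_t, the solution is y_t = u_t/(1 - a·ρ), so the data’s autocorrelation pins down ρ while two very different values of the structural parameter a fit equally well.

```python
import numpy as np

def simulate(a, rho, T=100_000, seed=1):
    e = np.random.default_rng(seed).normal(size=T)
    u = np.zeros(T)
    for t in range(1, T):
        u[t] = rho * u[t-1] + e[t]
    return u / (1 - a * rho)      # rational-expectations solution for y

for a in (0.2, 0.9):
    y = simulate(a, rho=0.7)
    print(f"a = {a}: estimated AR(1) coefficient "
          f"{np.corrcoef(y[1:], y[:-1])[0, 1]:.3f}")
# Both print ~0.700: two different structural models, one observable dynamics.
```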

The loanable funds fallacy

April 27, 2018 12 comments

from Lars Syll

The loanable funds theory is in many regards nothing but an approach where the ruling rate of interest in society is — pure and simple — conceived as nothing else than the price of loans or credits set by banks and determined by supply and demand — as Bertil Ohlin put it — “in the same way as the price of eggs and strawberries on a village market.”

It is a beautiful fairy tale, but the problem is that banks are not barter institutions that transfer pre-existing loanable funds from depositors to borrowers. Why? Because, in the real world, there simply are no pre-existing loanable funds. Banks create new funds — credit — only when someone goes into debt! Banks are monetary institutions, not barter vehicles.

In the traditional loanable funds theory — as presented in mainstream macroeconomics textbooks — the amount of loans and credit available for financing investment is constrained by how much saving is available. Saving is the supply of loanable funds, investment is the demand for loanable funds and is assumed to be negatively related to the interest rate. Lowering households’ consumption means increasing savings via a lower interest rate. Read more…
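For concreteness, here is the textbook mechanism being criticized, as a sketch (the linear functional forms and numbers are assumptions of mine): saving supplies funds, investment demands them, and the interest rate clears the ‘market’. The heterodox objection is precisely that real-world bank lending is not constrained this way.

```python
from scipy.optimize import brentq

saving = lambda r: 10 + 80 * r          # S(r): upward-sloping supply of funds
investment = lambda r: 50 - 120 * r     # I(r): downward-sloping demand for funds

# The interest rate that equates saving and investment
r_star = brentq(lambda r: saving(r) - investment(r), 0.0, 1.0)
print(f"loanable-funds equilibrium: r* = {r_star:.3f}, S = I = {saving(r_star):.1f}")
# r* = 0.200, S = I = 26.0 in this toy parameterization
```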

The case for a new economics

April 24, 2018 31 comments

from Lars Syll

When the great crash hit a decade ago, the public realised that the economics profession was clueless …

After 10 years in the shadow of the crisis, the profession’s more open minds have recognised there is serious re-thinking to be done …

But the truth is that most of the “reforms” have been about adding modules to the basic template, leaving the core of the old discipline essentially intact. My view is that this is insufficient, and treats the symptoms rather than the underlying malaise …

If we accept that we need fundamental reform, what should the new economics—“de-conomics” as I’m calling it—look like?

First, we need to accept that there is no such thing as “value-free” analysis of the economy. As I’ve explained, neoclassical economics pretends to be ethically neutral while smuggling in an individualistic, anti-social ethos …

Second, the analysis needs to be based around how human beings actually operate—rather than how neoclassicism asserts that “rational economic person (or firm)” should operate …

Third, we need to put the good life centre stage, rather than prioritising the areas that are most amenable to analysis via late-19th century linear mathematics. Technological progress and power relationships between firms, workers and governments need to be at the heart of economic discourse and research …

Finally, economics needs to be pluralistic. For the last half-century neoclassical economics has been gradually colonising other social science disciplines such as sociology and political science. It is high time this process reversed itself so that there was two-way traffic and a mutually beneficial learning exchange between disciplines. It is possible—and probably desirable—that the “deconomics” of the future looks more like psychology, sociology or anthropology than it does today’s arid economics …

The change I am seeking is no more fundamental than the transition from classical to neoclassical economics, and that was accomplished without the discipline imploding. And this time around we’ve got then-unimaginable data and other resources. So there can be no excuse for delay. Let economists free themselves of a misleading map, and then—with clear eyes—look at the world anew.

Howard Reed/Prospect Magazine

Read more…

The tractability hoax in modern economics

April 22, 2018 8 comments

from Lars Syll

While the paternity of the theoretical apparatus underlying the new neoclassical synthesis in macro is contested, there is wide agreement that the methodological framework was largely architected by Robert Lucas … Bringing a representative agent meant foregoing the possibility to tackle inequality, redistribution and justice concerns. Was it deliberate? How much does this choice owe to tractability? What macroeconomists were chasing, in these years, was a renewed explanation of the business cycle. They were trying to write microfounded and dynamic models …

Rational expectations imposed cross-equation restrictions, yet estimating these new models substantially raised the computing burden. Assuming a representative agent mitigated computational demands, and allowed macroeconomists to get away with general equilibrium aggregate issues: it made new-classical models analytically and computationally tractable …

Was tractability the main reason why Lucas embraced the representative agent (and market clearing)? Or could he have improved tractability through alternative hypotheses, leading to opposed policy conclusions? … Some macroeconomists may have endorsed the new class of Lucas-critique-proof models because they liked its policy conclusions. Others may have retained some hypotheses, then some simplifications, “because it makes the model tractable.” And while the limits of simplifying assumptions are often emphasized by those who propose them, as they spread, caveats are forgotten. Tractability restricts the range of accepted models and prevents economists from discussing some social issues, and with time, from even “seeing” them. Tractability ‘filters’ economists’ reality … The aggregate effect of “looking for tractable models” is unknown, and yet it is crucial to understand the current state of economics.

Beatrice Cherrier

Read more…

Sometimes we do not know because we cannot know

April 21, 2018 42 comments

from Lars Syll

Some time ago, the Bank of England’s Andrew G Haldane and Benjamin Nelson presented a paper with the title Tails of the unexpected. The main message of the paper was that we should not let ourselves be fooled by randomness:

The normal distribution provides a beguilingly simple description of the world. Outcomes lie symmetrically around the mean, with a probability that steadily decays. It is well-known that repeated games of chance deliver random outcomes in line with this distribution: tosses of a fair coin, sampling of coloured balls from a jam-jar, bets on a lottery number, games of paper/scissors/stone. Or have you been fooled by randomness?

Normality has been an accepted wisdom in economics and finance for a century or more. Yet in real-world systems, nothing could be less normal than normality. Tails should not be unexpected, for they are the rule. As the world becomes increasingly integrated – financially, economically, socially – interactions among the moving parts may make for potentially fatter tails. Catastrophe risk may be on the rise.

If public policy treats economic and financial systems as though they behave like a lottery – random, normal – then public policy risks itself becoming a lottery. Preventing public policy catastrophe requires that we better understand and plot the contours of systemic risk, fat tails and all. It also means putting in place robust fail-safes to stop chaos emerging, the sand pile collapsing, the forest fire spreading. Until then, normal service is unlikely to resume.

I think this is a great paper, and it merits a couple of comments.
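One comment can be made in code right away. The sketch below is my illustration; the Student-t with three degrees of freedom is just a convenient stand-in for a fat-tailed world. It shows how dramatically the normal distribution understates tail risk:

```python
from scipy import stats

for k in (3, 5, 10):
    p_normal = stats.norm.sf(k)       # P(X > k sigma) under normality
    p_fat = stats.t.sf(k, df=3)       # same threshold under a fat-tailed t(3)
    print(f"{k} sigma: normal {p_normal:.2e}  vs  fat-tailed {p_fat:.2e}")
# At 10 sigma the normal probability is ~1e-24 ('never'), while the
# fat-tailed distribution still assigns roughly 1e-3.
```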

Read more…

DSGE models — overconfident macroeconomic story-telling

April 19, 2018 5 comments

from Lars Syll

A recent paper by Christiano, Eichenbaum and Trabandt (C.E.T.) on Dynamic Stochastic General Equilibrium Models (DSGEs) has generated quite a reaction in the blogosphere …

Bradford DeLong points out that new Keynesian models were constructed to show that old Keynesian and old Monetarist policy conclusions were relatively robust, and not blown out of the water by rational expectations … The DSGE framework was then constructed so that new Keynesians could talk to RBCites. None of this has, so far, materially advanced the project of understanding the macroeconomic policy-relevant emergent properties of really existing industrial and post-industrial economies … Read more…

Shortcomings of regression analysis

April 17, 2018 74 comments

from Lars Syll

Distinguished social psychologist Richard E. Nisbett has a somewhat atypical aversion to multiple regression analysis. In his Intelligence and How to Get It (Norton 2011) he writes:

Researchers often determine the individual’s contemporary IQ or IQ earlier in life, socioeconomic status of the family of origin, living circumstances when the individual was a child, number of siblings, whether the family had a library card, educational attainment of the individual, and other variables, and put all of them into a multiple-regression equation predicting adult socioeconomic status or income or social pathology or whatever. Researchers then report the magnitude of the contribution of each of the variables in the regression equation, net of all the others (that is, holding constant all the others). It always turns out that IQ, net of all the other variables, is important to outcomes. But … the independent variables pose a tangle of causality – with some causing others in goodness-knows-what ways and some being caused by unknown variables that have not even been measured. Higher socioeconomic status of parents is related to educational attainment of the child, but higher-socioeconomic-status parents have higher IQs, and this affects both the genes that the child has and the emphasis that the parents are likely to place on education and the quality of the parenting with respect to encouragement of intellectual skills and so on. So statements such as “IQ accounts for X percent of the variation in occupational attainment” are built on the shakiest of statistical foundations. What nature hath joined together, multiple regressions cannot put asunder.
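Nisbett’s ‘tangle of causality’ is simple to simulate. In the sketch below (variable names and coefficients are mine, purely illustrative), parental SES drives both measured IQ and the outcome directly, so a regression of the outcome on IQ alone credits IQ with much of the SES pathway:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
ses = rng.normal(size=n)                   # parental socioeconomic status
iq = 0.6 * ses + rng.normal(size=n)        # SES raises measured IQ
outcome = 0.2 * iq + 0.7 * ses + rng.normal(size=n)

# Regress outcome on IQ alone, as in 'IQ accounts for X percent' claims
b_iq = np.cov(iq, outcome)[0, 1] / np.var(iq)
print(f"coefficient on IQ: {b_iq:.2f} (true direct effect: 0.20)")
# Roughly 0.51: the measured 'effect of IQ' is mostly SES in disguise.
```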

Read more…