Archive

Author Archive

Mainstream monetary theory — neat, plausible, and utterly wrong

June 24, 2017 26 comments

from Lars Syll

In modern times, legal currencies are based entirely on fiat. Currencies no longer have intrinsic value (as gold and silver had). What gives them value is basically the legal status given to them by government and the simple fact that you have to pay your taxes with them. That also enables governments to run a kind of monopoly business that can never run out of money. Hence spending becomes the prime mover, and taxing and borrowing are degraded to following acts. If we have a depression, the solution, then, is not austerity. It is spending. Budget deficits are not the major problem, since with fiat money governments can always create more money to cover them.

Financing quantitative easing, fiscal expansion, and other similar operations is made possible by simply crediting a bank account and thereby – by a single keystroke – actually creating money. One of the most important reasons why so many countries are still stuck in depression-like economic quagmires is that people in general – including most mainstream economists – simply don’t understand the workings of modern monetary systems. The result is totally and utterly wrong-headed austerity policies, emanating from a groundless fear of creating inflation via central banks printing money, in a situation where we should rather fear deflation and inadequate effective demand.
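
The accounting behind this can be made explicit. Here is a minimal derivation from the standard national-income identity (textbook accounting, nothing peculiar to the argument above):

```latex
Y = C + I + G + (X - M), \qquad S \equiv Y - T - C
\;\;\Longrightarrow\;\; (S - I) = (G - T) + (X - M)
```

The government deficit (G − T) is, by identity, matched by private net saving plus the external balance: one sector’s red ink is another sector’s black ink. The identity says nothing about causation, but it makes precise the sense in which a deficit is not ‘missing’ money; it reappears, krona for krona, as financial assets accumulated in other sectors.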

Read more…

Do you want to get a Nobel prize? Eat chocolate and move to Chicago!

June 21, 2017 7 comments

from Lars Syll

As we’ve noticed, again and again, correlation is not the same as causation …

If you want to get the prize in economics — and want to be on the safe side — yours truly would suggest you complement your intake of chocolate with a move to Chicago.

Out of the 78 laureates who have been awarded “The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel,” 28 have been affiliated with The University of Chicago — that is 36%. The world is really a small place when it comes to economics …
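
For readers who want the correlation-causation point in runnable form, here is a minimal sketch (all numbers invented for illustration) of how a lurking common cause, say national wealth, manufactures a chocolate-Nobel correlation out of nothing:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50  # hypothetical countries

# A common cause drives both variables (coefficients are made up).
wealth = rng.normal(size=n)
chocolate = 2.0 * wealth + rng.normal(size=n)  # consumption rises with wealth
nobels = 1.5 * wealth + rng.normal(size=n)     # research funding rises with wealth

# Strong raw correlation between chocolate and Nobel counts ...
print(np.corrcoef(chocolate, nobels)[0, 1])    # typically around 0.7

# ... which vanishes once the confounder is subtracted out.
resid_c = chocolate - 2.0 * wealth
resid_n = nobels - 1.5 * wealth
print(np.corrcoef(resid_c, resid_n)[0, 1])     # close to 0
```

No amount of chocolate will move you up the Nobel queue; the association is doing no causal work at all.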

Leontief on the dismal state of economics

June 19, 2017 12 comments

from Lars Syll

Much of current academic teaching and research has been criticized for its lack of relevance, that is, of immediate practical impact … I submit that the consistently indifferent performance in practical applications is in fact a symptom of a fundamental imbalance in the present state of our discipline. The weak and all too slowly growing empirical foundation clearly cannot support the proliferating superstructure of pure, or should I say, speculative economic theory …

Uncritical enthusiasm for mathematical formulation tends often to conceal the ephemeral substantive content of the argument behind the formidable front of algebraic signs … In the presentation of a new model, attention nowadays is usually centered on a step-by-step derivation of its formal properties. But if the author — or at least the referee who recommended the manuscript for publication — is technically competent, such mathematical manipulations, however long and intricate, can even without further checking be accepted as correct. Nevertheless, they are usually spelled out at great length. By the time it comes to interpretation of the substantive conclusions, the assumptions on which the model has been based are easily forgotten. But it is precisely the empirical validity of these assumptions on which the usefulness of the entire exercise depends.

Read more…

Ed Leamer and the pitfalls of econometrics

June 17, 2017 1 comment

from Lars Syll

Ed Leamer’s Tantalus on the Road to Asymptopia is one of my favourite critiques of econometrics, and for the benefit of those who are not versed in econometric jargon, this handy summary gives the gist of it in plain English:

[image: a plain-English summary of Leamer’s argument]

Most work in econometrics and regression analysis is — still — made on the assumption that the researcher has a theoretical model that is ‘true.’ Based on this belief in having a correct specification for an econometric model or regression, one proceeds as if the only remaining problems have to do with measurement and observation. Read more…
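
Leamer’s fragility point is easy to reproduce. A minimal sketch (synthetic data, all coefficients invented): the estimated ‘effect’ of x on y swings from apparently strong to essentially zero depending on which controls happen to be included, exactly the specification sensitivity his extreme-bounds analysis was meant to expose:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

z = rng.normal(size=n)            # a candidate control variable
x = 0.8 * z + rng.normal(size=n)  # regressor, correlated with z
y = 1.0 * z + rng.normal(size=n)  # note: x has NO true effect on y

def slope_on_x(y, X):
    """OLS coefficient on x, where x sits in column 1 of the design matrix."""
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

ones = np.ones(n)
print(slope_on_x(y, np.column_stack([ones, x])))     # ~0.5, looks 'significant'
print(slope_on_x(y, np.column_stack([ones, x, z])))  # ~0.0, effect evaporates
```

Which of the two regressions is ‘true’? Theory rarely settles it, and that is precisely Leamer’s problem.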

Solow being uncomfortable with ‘modern’ macroeconomics

June 14, 2017 23 comments

from Lars Syll

So in what sense is this “dynamic stochastic general equilibrium” model firmly grounded in the principles of economic theory? I do not want to be misunderstood. Friends have reminded me that much of the effort of “modern macro” goes into the incorporation of important deviations from the Panglossian assumptions that underlie the simplistic application of the Ramsey model to positive macroeconomics. Research focuses on the implications of wage and price stickiness, gaps and asymmetries of information, long-term contracts, imperfect competition, search, bargaining and other forms of strategic behavior, and so on. That is indeed so, and it is how progress is made.

But this diversity only intensifies my uncomfortable feeling that something is being put over on us, by ourselves. Why do so many of those research papers begin with a bow to the Ramsey model and cling to the basic outline? Every one of the deviations that I just mentioned was being studied by macroeconomists before the “modern” approach took over. That research was dismissed as “lacking microfoundations.” My point is precisely that attaching a realistic or behavioral deviation to the Ramsey model does not confer microfoundational legitimacy on the combination. Quite the contrary: a story loses legitimacy and credibility when it is spliced to a simple, extreme, and on the face of it, irrelevant special case. This is the core of my objection: adding some realistic frictions does not make it any more plausible that an observed economy is acting out the desires of a single, consistent, forward-looking intelligence …

For completeness, I suppose it could also be true that the bow to the Ramsey model is like wearing the school colors or singing the Notre Dame fight song: a harmless way of providing some apparent intellectual unity, and maybe even a minimal commonality of approach. That seems hardly worthy of grown-ups, especially because there is always a danger that some of the in-group come to believe the slogans, and it distorts their work …

There has always been a purist streak in economics that wants everything to follow neatly from greed, rationality, and equilibrium, with no ifs, ands, or buts. Most of us have felt that tug. Here is a theory that gives you just that, and this time “everything” means everything: macro, not micro. The theory is neat, learnable, not terribly difficult, but just technical enough to feel like “science.”

Robert Solow

Read more…

Economics textbooks transmogrifying truth — wages and unemployment

June 10, 2017 7 comments

from Lars Syll

A couple of weeks ago yours truly was sent a copy of the new edition of Chad Jones’s intermediate textbook Macroeconomics (4th ed, W. W. Norton, 2018). There’s much in the book I like, e.g. Jones’s combining of more traditional short-run macroeconomic analysis with an accessible coverage of the Romer model — the foundation of modern growth theory — and DSGE business cycle models.

Unfortunately it also contains some utter nonsense!

In chapter 7 — on “The Labor Market, Wages, and Unemployment” — Jones writes (p. 184):

The point of this experiment is to show that wage rigidities can lead to large movements in employment. Indeed, they are the reason John Maynard Keynes gave, in The General Theory of Employment, Interest, and Money (1936), for the high unemployment of the Great Depression.

A serious editor — who really checked the facts — would immediately find that although Keynes in General Theory devoted substantial attention to the subject of wage rigidities, he certainly did not hold the view that wage rigidities were “the reason … for the high unemployment of the Great Depression.”  Read more…

‘Cauchy logic’ in economics

June 8, 2017 Leave a comment

from Lars Syll

What is 0.999 …, really? Is it 1? Or is it some number infinitesimally less than 1?

The right answer is to unmask the question. What is 0.999 …, really? It appears to refer to a kind of sum:

0.9 + 0.09 + 0.009 + 0.0009 + …

But what does that mean? That pesky ellipsis is the real problem. There can be no controversy about what it means to add up two, or three, or a hundred numbers. But infinitely many? That’s a different story. In the real world, you can never have infinitely many heaps. What’s the numerical value of an infinite sum? It doesn’t have one — until we give it one. That was the great innovation of Augustin-Louis Cauchy, who introduced the notion of limit into calculus in the 1820s.

The British number theorist G. H. Hardy … explains it best: “It is broadly true to say that mathematicians before Cauchy asked not, ‘How shall we define 1 – 1 + 1 – 1 + …?’ but ‘What is 1 – 1 + 1 – 1 + …?’”

No matter how tight a cordon we draw around the number 1, the sum will eventually, after some finite number of steps, penetrate it, and never leave. Under those circumstances, Cauchy said, we should simply define the value of the infinite sum to be 1.
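
In Cauchy’s terms, the whole argument is a one-line computation on the partial sums:

```latex
s_n = \sum_{k=1}^{n} \frac{9}{10^{k}} = 1 - 10^{-n},
\qquad \lim_{n \to \infty} s_n = 1
```

Every partial sum falls short of 1, but by an amount that shrinks past any cordon we care to draw, and that limit is what the notation 0.999… is defined to mean.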

I have no problem with solving problems in mathematics by ‘defining’ them away. But how about the real world? Maybe that ought to be a question to ponder even for economists all too fond of uncritically following the mathematical way when applying their mathematical models to the real world, where indeed “you can never have infinitely many heaps” … Read more…

Economics textbooks transmogrifying truth — growth theory

June 5, 2017 2 comments

from Lars Syll


The above video is one in a series of videos in which Alex Tabarrok and Tyler Cowen present their economics textbook Principles of Economics. In a later video, ‘ideas’ are introduced into the Solow growth model. But not with a single word do they acknowledge that this is in total contradiction to the Holy Grail of their mainstream economics — the “iron logic of diminishing returns.”

In Paul Romer’s Endogenous Technological Change (1990), knowledge is made the most important driving force of growth. Knowledge — or ideas — is, according to Romer, the locomotive of growth. But as Allyn Young, Piero Sraffa and others had already shown in the 1920s, knowledge also has to do with increasing returns to scale, and is therefore not really compatible with neoclassical economics, with its emphasis on decreasing returns to scale and the “iron logic of diminishing returns.” Read more…
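
The incompatibility is easy to state formally. Write output as F(A, X), where A is the stock of nonrival ideas and X collects the rival inputs (capital and labour). Replicating the rival inputs alone replicates output, so if ideas contribute at all, scaling everything up yields more than proportional output:

```latex
F(A, \lambda X) = \lambda F(A, X)
\quad\Longrightarrow\quad
F(\lambda A, \lambda X) > \lambda F(A, X) \qquad (\lambda > 1)
```

This is Romer’s own replication argument for increasing returns, and it is exactly what the ‘iron logic of diminishing returns’, and the marginal-productivity pricing built upon it, cannot accommodate.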

What is Post Keynesian Economics?

June 1, 2017 6 comments

from Lars Syll

John Maynard Keynes’s 1936 book The General Theory of Employment, Interest, and Money attempted to overthrow classical theory and revolutionize how economists think about the economy. Economists who build upon Keynes’s General Theory to analyze the economic problems of the twenty-first-century global economy are called Post Keynesians. Keynes’s “principle of effective demand” (1936, chap. 2) declared that the axioms underlying classical theory were not applicable to a money-using, entrepreneurial economic system. Consequently, the mainstream theory’s “teaching is misleading and disastrous if we attempt to apply it to the facts of experience” (Keynes 1936, p. 3). To develop an economic theory applicable to a monetary economy, Keynes suggested rejecting three basic axioms of classical economics (1936, p. 16).

Unfortunately, the axioms that Keynes suggested for rejection are still part of the foundation of twenty-first-century mainstream economic theory. Post Keynesians have thrown out the three axioms that Keynes suggested rejecting in The General Theory. The rejected axioms are the ergodic axiom, the gross-substitution axiom, and the neutral-money axiom … Only if these axioms are rejected can a model be developed that has the following characteristics:

• Money matters in the long and short run, that is, changes in the money supply can affect decisions that determine the level of employment and real economic output.

• As the economic system moves from an irrevocable past to an uncertain future, decision makers recognize that they make important, costly decisions in uncertain conditions where reliable, rational calculations regarding the future are impossible.

• People and organizations enter into monetary contracts. These money contracts are a human institution developed to efficiently organize time-consuming production and exchange processes. The money-wage contract is the most ubiquitous of these contracts.

• Unemployment, rather than full employment, is a common laissez-faire situation in a market-oriented, monetary production economy.

• The ergodic axiom postulates that all future events are actuarially certain, that is, that the future can be accurately forecast from an analysis of existing market data. Consequently, this axiom implies that income earned at any employment level is entirely spent either on produced goods for today’s consumption or on buying investment goods that will be used to produce goods for the (known) future consumption of today’s savers. In other words, orthodox theory assumes that all income is always immediately spent on producibles, so there is never a lack of effective demand for things that industry can produce at full employment … Post Keynesian theory rejects the ergodic axiom.

In Post Keynesian theory … people recognize that the future is uncertain (nonergodic) and cannot be reliably predicted.

Paul Davidson
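
A toy simulation makes the ergodicity point concrete (a standard multiplicative-gamble example, not drawn from Davidson’s text): the ensemble average across many parallel histories grows, while the time average any single decision maker actually lives through shrinks.

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps = 100_000, 50

# Each period wealth is multiplied by 1.5 or 0.6 with equal probability,
# so the one-step expected growth factor is 0.5*1.5 + 0.5*0.6 = 1.05 > 1.
factors = rng.choice([1.5, 0.6], size=(n_paths, n_steps))
wealth = factors.prod(axis=1)

print(wealth.mean())      # ensemble average: well above 1
print(np.median(wealth))  # typical single history: far below 1

# The time-average log growth rate is negative, so almost every
# individual path shrinks even though the ensemble mean grows.
print(0.5 * np.log(1.5) + 0.5 * np.log(0.6))  # ~ -0.05
```

When time averages and ensemble averages part company like this, ‘existing market data’ cannot deliver the actuarially certain knowledge of the future that the ergodic axiom presumes.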

The financial crisis of 2007-08 hit most laymen and economists with surprise. What was it that went wrong with our macroeconomic models, since they obviously did not foresee the collapse or even make it conceivable? Read more…

Chicago economics — a dangerous pseudo-scientific zombie

May 31, 2017 4 comments

from Lars Syll

Every dollar of increased government spending must correspond to one less dollar of private spending. Jobs created by stimulus spending are offset by jobs lost from the decline in private spending. We can build roads instead of factories, but fiscal stimulus can’t help us to build more of both. This form of “crowding out” is just accounting, and doesn’t rest on any perceptions or behavioral assumptions.

John Cochrane

What Cochrane is reiterating here is nothing but Say’s law, basically saying that savings equal investments, and that if the state increases investment, then private investment has to come down (‘crowding out’). As an accounting identity there is of course nothing wrong with the law, but as such it is also totally uninteresting from an economic point of view. As some of my Swedish forerunners — Gunnar Myrdal and Erik Lindahl — stressed more than 80 years ago, it is really a question of ex ante and ex post adjustments. And as further stressed by a famous English economist at about the same time, what happens when ex ante savings and investments differ is that we basically get output adjustments. GDP changes and thereby makes savings and investments equal ex post. And this, nota bene, says nothing at all about the success or failure of fiscal policies!
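
The adjustment mechanism can be written out in two lines. Let I^p denote planned (ex ante) investment and ΔV unplanned inventory change:

```latex
Y = C(Y) + I^{p} + \Delta V, \qquad S(Y) \equiv Y - C(Y)
\;\;\Longrightarrow\;\; S(Y) - I^{p} = \Delta V
```

If ex ante saving exceeds planned investment, goods go unsold, ΔV > 0, firms cut output, and Y falls until S(Y) = I^p. Ex post, measured investment (inclusive of ΔV) always equals saving: the identity is enforced by quantity adjustment, not by one kind of spending crowding out another.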

Read more…

Modern economics — pseudo-science based on FWUTV

May 29, 2017 2 comments

from Lars Syll

FWUTV — facts with unknown truth values — are, as Paul Romer noted in last year’s perhaps most interesting insider critique of mainstream economics, all too often used in macroeconomic modelling. But other parts of ‘modern’ economics than New Classical RBC economics have also succumbed to this questionable practice:

Statistical significance is not the same as real-world significance — all it offers is an indication of whether you’re seeing an effect where there is none. Even this narrow technical meaning, though, depends on where you set the threshold at which you are willing to discard the ‘null hypothesis’ — that is, in the above case, the possibility that there is no effect. I would argue that there’s no good reason to always set it at 5 percent. Rather, it should depend on what is being studied, and on the risks involved in acting — or failing to act — on the conclusions …

This example illustrates three lessons. First, researchers shouldn’t blindly follow convention in picking an appropriate p-value cutoff. Second, in order to choose the right p-value threshold, they need to know how the threshold affects the probability of a Type II error. Finally, they should consider, as best they can, the costs associated with the two kinds of errors.

Statistics is a powerful tool. But, like any powerful tool, it can’t be used the same way in all situations.

Narayana Kocherlakota
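
Kocherlakota’s second lesson is mechanical to check. A minimal sketch (a one-sided z-test, with effect size and sample size invented for illustration) of how the significance threshold trades off against the Type II error rate:

```python
from scipy.stats import norm

def type2_error(alpha, effect, sigma, n):
    """P(fail to reject H0) in a one-sided z-test when the true effect is 'effect'."""
    z_crit = norm.ppf(1 - alpha)         # critical value under the null
    shift = effect / (sigma / n ** 0.5)  # mean of the z-statistic under H1
    return norm.cdf(z_crit - shift)

for alpha in (0.05, 0.10, 0.20):
    beta = type2_error(alpha, effect=0.2, sigma=1.0, n=50)
    print(f"alpha = {alpha:.2f} -> Type II error = {beta:.2f}")
# alpha = 0.05 -> Type II error = 0.59
# alpha = 0.10 -> Type II error = 0.45
# alpha = 0.20 -> Type II error = 0.28
```

Loosening the threshold from 5 to 20 percent halves the Type II error here; whether that trade is worth making depends entirely on the relative costs of the two kinds of mistakes.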

Read more…

Neoliberalism — an oversold ideology

May 25, 2017 4 comments

from Lars Syll

So what’s wrong with the economy? …

A 2002 study of United States fiscal policy by the economists Olivier Blanchard and Roberto Perotti found that ‘both increases in taxes and increases in government spending have a strong negative effect on private investment spending.’ They noted that this finding is ‘difficult to reconcile with Keynesian theory.’

Consistent with this, a more recent study of international data by the economists Alberto Alesina and Silvia Ardagna found that ‘fiscal stimuli based on tax cuts are more likely to increase growth than those based on spending increases.’

Greg Mankiw

From Mankiw’s perspective ‘the Alesina work suggests a still plausible hypothesis.’

Hmm …   Read more…

The laws of mathematics and economics

May 22, 2017 20 comments

from Lars Syll

Some commentators on this blog — and elsewhere — seem to have problems with yours truly’s critique of the overly debonair attitude with which mathematics is applied to economics. In case you think the critique is some odd outcome of heterodox idiosyncrasy, well, maybe you should think twice …

[image: Einstein quotation]

Mainstream economics — an emperor turned out to be naked

May 20, 2017 4 comments

from Lars Syll

The main reason why the teaching of microeconomics (or of “microfoundations” of macroeconomics) has been called “autistic” is because it is increasingly impossible to discuss real-world economic questions with microeconomists – and with almost all neoclassical theorists. They are trapped in their system, and don’t in fact care about the outside world any more. If you consult any microeconomic textbook, it is full of maths (e.g. Kreps or Mas-Colell, Whinston and Green) or of “tales” (e.g. Varian or Schotter), without real data (occasionally you find “examples”, or “applications”, with numerical examples – but they are purely fictitious, invented by the authors).

At first, French students got quite a lot of support from teachers and professors: hundreds of teachers signed petitions backing their movement – especially pleading for “pluralism” in teaching the different ways of approaching economics. But when the students proposed a precise program of studies … almost all teachers refused, considering that it was “too much” because “students must learn all these things, even with some mathematical details”. When you ask them “why?”, the answer usually goes something like this: “Well, even if we, personally, never use the kind of ‘theory’ or ‘tools’ taught in microeconomics courses … surely there are people who do ‘use’ and ‘apply’ them, even if it is in an ‘unrealistic’, or ‘excessive’ way”.

But when you ask those scholars who do “use these tools”, especially those who do a lot of econometrics with “representative agent” models, they answer (if you insist quite a bit): “OK, I agree with you that it is nonsense to represent the whole economy by the (intertemporal) choice of one agent – consumer and producer – or by a unique household that owns a unique firm; but if you don’t do that, you don’t do anything!”

Bernard Guerrien

Yes indeed — “you don’t do anything!”   Read more…

Formal mathematical modeling in economics — a dead-end

May 17, 2017 3 comments

from Lars Syll

Using formal mathematical modeling, mainstream economists sure can guarantee that the conclusions hold given the assumptions. However, the validity we get in abstract model worlds does not automatically transfer to real-world economies. Validity may be good, but it isn’t enough. From a realist perspective, both relevance and soundness are sine qua non.

In their search for validity, rigour and precision, mainstream macro modellers of various ilks construct microfounded DSGE models that standardly assume rational expectations, Walrasian market clearing, unique equilibria, time invariance, linear separability and homogeneity of both inputs/outputs and technology, infinitely lived intertemporally optimizing representative household/consumer/producer agents with homothetic and identical preferences, etc., etc. At the same time the models standardly ignore complexity, diversity, uncertainty, coordination problems, non-market clearing prices, real aggregation problems, emergence, expectations formation, etc., etc.

Behavioural and experimental economics — not to speak of psychology — show beyond any doubt that “deep parameters” — people’s preferences, choices and forecasts — are regularly influenced by those of other participants in the economy. And how about the homogeneity assumption? If all actors are the same, why and with whom do they transact? And why does economics have to be exclusively teleological (concerned with intentional states of individuals)? Where are the arguments for that ontological reductionism? And what about collective intentionality and constitutive background rules? Read more…

Structural econometrics

May 16, 2017 2 comments

from Lars Syll

In a blog post the other day, Noah Smith returned to the discussion about the ‘empirical revolution’ in economics and how — if it really does exist — to evaluate it. Countering those who think quasi-experiments and RCTs are the true solutions to finding causal parameters, Noah argues that without structural models

empirical results are only locally valid. And you don’t really know how local “local” is. If you find that raising the minimum wage from $10 to $12 doesn’t reduce employment much in Seattle, what does that really tell you about what would happen if you raised it from $10 to $15 in Baltimore?

That’s a good reason to want a good structural model. With a good structural model, you can predict the effects of policies far away from the current state of the world.

If only that were true! But it’s not.

Structural econometrics — essentially going back to the Cowles programme — more or less takes for granted the possibility of a priori postulating relations that describe economic behaviours as invariant within a Walrasian general equilibrium system. In practice that means the structural model is built inside a straitjacket delivered by economic theory. Causal inferences in those models are — by assumption — made possible because the econometrician is supposed to know the true structure of the economy. And, of course, those exact assumptions are the crux of the matter. If the assumptions don’t hold, there is no reason whatsoever to have any faith in the conclusions drawn, since they do not follow from the statistical machinery used! Read more…

What happens when a small and dangerous sect captures the teaching of economics

May 12, 2017 4 comments

from Lars Syll

The fallacy of composition basically consists in the false belief that the whole is nothing but the sum of its parts. In society and in the economy this is arguably not the case. An adequate analysis of society and economy a fortiori can’t proceed by just adding up the acts and decisions of individuals. The whole is more than a sum of its parts.

This fact shows up when orthodox/mainstream/neoclassical economics tries to argue for the existence of The Law of Demand – when the price of a commodity falls, the demand for it will increase – on the aggregate level. Although it may be said that one succeeds in establishing The Law for single individuals, it turned out – with the Sonnenschein-Mantel-Debreu theorem, firmly established by 1976 – that it wasn’t possible to extend The Law of Demand to the market level, unless one made ridiculously unrealistic assumptions, such as all individuals having homothetic preferences – which actually implies that all individuals have identical preferences.

This could only be conceivable if all agents were identical (i.e. there is in essence only one actor) — the (in)famous representative actor. So, yes, it was possible to generalize The Law of Demand — as long as we assumed that on the aggregate level there was only one commodity and one actor. What a generalization! Does this sound reasonable? Of course not. This is pure nonsense! Read more…
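
For the record, the Sonnenschein-Mantel-Debreu result can be stated in one breath. On any compact set of strictly positive prices, the only restrictions that individual utility maximization places on aggregate excess demand z(p) are:

```latex
z \ \text{continuous}, \qquad
z(\lambda p) = z(p) \ \ \forall \lambda > 0, \qquad
p \cdot z(p) = 0 \ \ \text{(Walras' law)}
```

Any function with these three properties is the aggregate excess demand of some exchange economy populated by perfectly well-behaved utility maximizers. Downward-sloping market demand, uniqueness and stability of equilibrium simply do not follow from individual rationality alone.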

The tragedy of pseudoscientific and self-defeatingly arrogant economics

May 10, 2017 2 comments

from Lars Syll

The problem of any branch of knowledge is to systematize a set of particular observations in a more coherent form, called hypothesis or ‘theory.’ Two problems must be resolved by those attempting to develop theory: (1) finding agreement on what has been observed; (2) finding agreement on how to systematize those observations.

In economics, there would be more agreement on the second point than on the first. Many would agree that using the short-hand rules of mathematics is a convenient way of systematizing and communicating knowledge — provided we have agreement on the first problem, namely what observations are being systematized. Social sciences face this problem in the absence of controlled experiments in a changing, non-repetitive world. This problem may be more acute for economics than for other branches of social science, because economists like to believe that they are dealing with quantitative facts and can use standard statistical methods. However, what are quantitative facts in a changing world? If one is dealing with questions of general interest that arise in macroeconomics, one has to first agree on ‘robust’ so-called ‘stylized’ facts based on observation: for example, we can agree that business cycles occur; that total output grows as a long-term trend; that unemployment and financial crises are recurring problems; and so on.

Read more…

The spectacular failure of DSGE models

May 8, 2017 38 comments

from Lars Syll

In most aspects of their lives humans must plan forwards. They take decisions today that affect their future in complex interactions with the decisions of others. When taking such decisions, the available information is only ever a subset of the universe of past and present information, as no individual or group of individuals can be aware of all the relevant information. Hence, views or expectations about the future, relevant for their decisions, use a partial information set, formally expressed as a conditional expectation given the available information.

Moreover, all such views are predicated on there being no unanticipated future changes in the environment pertinent to the decision. This is formally captured in the concept of ‘stationarity’. Without stationarity, good outcomes based on conditional expectations could not be achieved consistently. Fortunately, there are periods of stability when insights into the way that past events unfolded can assist in planning for the future.

The world, however, is far from completely stationary. Unanticipated events occur, and they cannot be dealt with using standard data-transformation techniques such as differencing, or by taking linear combinations, or ratios. In particular, ‘extrinsic unpredictability’ – unpredicted shifts of the distributions of economic variables at unanticipated times – is common. As we shall illustrate, extrinsic unpredictability has dramatic consequences for the standard macroeconomic forecasting models used by governments around the world – models known as ‘dynamic stochastic general equilibrium’ models – or DSGE models …

Many of the theoretical equations in DSGE models take a form in which a variable today, say income (denoted y_t), depends inter alia on its ‘expected future value’ … For example, y_t may be the log-difference between a de-trended level and its steady-state value. Implicitly, such a formulation assumes some form of stationarity is achieved by de-trending.

Unfortunately, in most economies, the underlying distributions can shift unexpectedly. This vitiates any assumption of stationarity. The consequences for DSGEs are profound. As we explain below, the mathematical basis of a DSGE model fails when distributions shift … This would be like a fire station automatically burning down at every outbreak of a fire. Economic agents are affected by, and notice, such shifts. They consequently change their plans, and perhaps the way they form their expectations. When they do so, they violate the key assumptions on which DSGEs are built.

David Hendry & Grayham Mizon
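
The location-shift point can be reproduced in a few lines (a toy AR(1) example, not Hendry and Mizon’s own model): a forecaster whose conditional expectation was estimated in the old regime keeps predicting reversion to the old mean after the distribution shifts, and the forecast errors become systematic and one-sided.

```python
import numpy as np

rng = np.random.default_rng(3)
T, break_at = 200, 100
rho, mu_old, mu_new = 0.8, 0.0, 5.0

# An AR(1) whose unconditional mean shifts at t = break_at.
y = np.zeros(T)
for t in range(1, T):
    mu = mu_old if t < break_at else mu_new
    y[t] = mu + rho * (y[t - 1] - mu) + rng.normal()

# One-step forecasts from a model that still believes in the old regime.
forecast = mu_old + rho * (y[:-1] - mu_old)
errors = y[1:] - forecast
print(errors[:break_at - 1].mean())  # ~0: unbiased before the break
print(errors[break_at:].mean())      # ~(1 - rho) * mu_new = 1: biased after it
```

Before the break the one-step errors average out to zero; after it they stay biased by (1 − ρ) times the size of the shift for as long as the old conditional expectation is retained.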

Read more…

Mainstream textbooks — full of utter nonsense!

May 7, 2017 4 comments

from Lars Syll

The other day yours truly was sent a copy of the new edition of Chad Jones’s intermediate textbook Macroeconomics (4th ed, W. W. Norton, 2018). There’s much in the book I like, e.g. Jones’s combining of more traditional short-run macroeconomic analysis with an accessible coverage of the Romer model — the foundation of modern growth theory — and DSGE business cycle models.

Unfortunately it also contains some utter nonsense!

In chapter 7 — on “The Labor Market, Wages, and Unemployment” — Jones writes (p. 184):

The point of this experiment is to show that wage rigidities can lead to large movements in employment. Indeed, they are the reason John Maynard Keynes gave, in The General Theory of Employment, Interest, and Money (1936), for the high unemployment of the Great Depression.

But this is pure nonsense. A serious editor — who really checked the facts — would immediately find that although Keynes in General Theory devoted substantial attention to the subject of wage rigidities, he certainly did not hold the view that wage rigidities were “the reason … for the high unemployment of the Great Depression.”

Read more…