Author Archive

Paul Krugman — a methodological critique

January 18, 2019 4 comments

from Lars Syll

Alex Rosenberg — chair of the philosophy department at Duke University and renowned economic methodologist — has an interesting article on What’s Wrong with Paul Krugman’s Philosophy of Economics in 3:AM Magazine. Writes Rosenberg:

Krugman writes: “So how do you do useful economics? In general, what we really do is combine maximization-and-equilibrium as a first cut with a variety of ad hoc modifications reflecting what seem to be empirical regularities about how both individual behavior and markets depart from this idealized case.”

But if you ask the New Classical economists, they’ll say, this is exactly what we do—combine maximizing-and-equilibrium with empirical regularities …

Read more…

Bayesianism — a patently absurd approach to science

January 16, 2019 7 comments

from Lars Syll

Back in 1991, when yours truly earned his first PhD with a dissertation on decision making and rationality in social choice theory and game theory, I concluded that “repeatedly it seems as though mathematical tractability and elegance — rather than realism and relevance — have been the most applied guidelines for the behavioural assumptions being made. On a political and social level, it is doubtful if the methodological individualism, ahistoricity and formalism they are advocating are especially valid.”

This, of course, was like swearing in church. My mainstream colleagues were — to say the least — not exactly überjoyed.

The decision-theoretical approach I was most critical of was the one built on the then-reawakened Bayesian subjectivist (personalistic) interpretation of probability.

One of my inspirations when working on the dissertation was Henry E. Kyburg, and I still think his critique is the ultimate take-down of Bayesian hubris:  Read more…

Is ‘modern’ macroeconomics for real?

January 13, 2019 11 comments

from Lars Syll

Empirically, far from isolating a microeconomic core, real-business-cycle models, as with other representative-agent models, use macroeconomic aggregates for their testing and estimation. Thus, to the degree that such models are successful in explaining empirical phenomena, they point to the ontological centrality of macroeconomic and not to microeconomic entities … At the empirical level, even the new classical representative-agent models are fundamentally macroeconomic in content …

The nature of microeconomics and macroeconomics — as they are currently practised — undermines the prospects for a reduction of macroeconomics to microeconomics. Both microeconomics and macroeconomics must refer to irreducible macroeconomic entities.

Kevin Hoover

Kevin Hoover has been writing on microfoundations for more than 25 years, and is beyond any doubt the one economist/econometrician/methodologist who has thought most about the issue. It’s always interesting to compare his qualified and methodologically grounded assessment of the representative-agent-rational-expectations microfoundationalist program with the more or less apologetic views of freshwater economists like Robert Lucas:  Read more…

Insignificant ‘statistical significance’

January 11, 2019 3 comments

from Lars Syll

Read more…

Cutting wages — the wrong medicine

January 9, 2019 30 comments

from Lars Syll

‘Sure, your salaries are low but think of all the apples you’re getting.’

A couple of years ago yours truly had a discussion with the chairman of the Swedish Royal Academy of Sciences (yes, the one that yearly presents the winners of ‘The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel’). What started the discussion was the allegation that the level of employment in the long run is a result of people’s own rational intertemporal choices and that how much people work basically is a question of incentives.

Somehow the argument sounded familiar.

When awarded the ‘Nobel prize’ in 2011, Thomas Sargent declared that workers ought to be prepared to accept lower unemployment compensation in order to have the right incentives to search for jobs. The Swedish right-wing finance minister at the time welcomed Sargent’s statement and declared it a “healthy warning” for those who wanted to raise compensation levels.

The view is symptomatic. As in the 1930s, more and more right-wing politicians — and economists — now suggest that lowering wages is the right medicine for strengthening the competitiveness of their faltering economies, getting the economy going, increasing employment, and creating the growth needed to get rid of towering debts and balance state budgets.  Read more…

Why Bayesianism has not resolved a single fundamental scientific dispute

January 8, 2019 2 comments

from Lars Syll

Bayesian reasoning works, undeniably, where we know (or are ready to assume) that the process studied fits certain special though abstract causal structures, often called ‘statistical models’ … However, when we choose among hypotheses in important scientific controversies, we usually lack such prior knowledge of causal structures, or it is irrelevant to the choice. As a consequence, such Bayesian inference to the preferred alternative has not resolved, even temporarily, a single fundamental scientific dispute.

Mainstream economics nowadays usually assumes that agents who have to make choices under conditions of uncertainty behave according to Bayesian rules (preferably the ones axiomatized by Ramsey (1931), de Finetti (1937) or Savage (1954)) — that is, they maximize expected utility with respect to some subjective probability measure that is continually updated according to Bayes’ theorem. If they do not, they are supposed to be irrational and, ultimately — via some “Dutch book” or “money pump” argument — susceptible to being ruined by some clever “bookie”.  Read more…
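For reference, the updating rule at issue is just Bayes’ theorem: upon learning evidence e, the agent’s degree of belief in a hypothesis H is revised from the prior p(H) to the posterior p(H|e):

```latex
% Bayes' theorem: the conditionalization rule the subjectivist agent
% is assumed to follow upon learning evidence e.
\[
  p(H \mid e) \;=\; \frac{p(e \mid H)\, p(H)}{p(e)},
  \qquad
  p(e) \;=\; \textstyle\sum_i p(e \mid H_i)\, p(H_i) .
\]
```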

20th anniversary for the euro — no reason for celebration

January 5, 2019 11 comments

from Lars Syll

When the euro was created twenty years ago, it was celebrated with fireworks at the European Central Bank headquarters in Frankfurt. Today we know better. There is no reason to celebrate the 20th anniversary. On the contrary.

Ever since its start, the euro has been in crisis. And the crisis is far from over. The tough austerity measures imposed in the eurozone have made economy after economy contract. They have made things worse not only in the periphery countries but also in countries like France and Germany. These are alarming facts that should be taken seriously.

Europe may face a future of growing economic disparities in which we will have to confront increasing hostility between nations and peoples. What we’ve seen lately in France shows that protests against technocratic attempts to undermine democracy may turn extremely violent.

The problems — created to a large extent by the euro — may not only endanger our economies, but also our democracy itself. How much whipping can democracy take? How many more are going to get seriously hurt and ruined before we end this madness and scrap the euro?

The euro has taken away the possibility for national governments to manage their economies in a meaningful way — and in country after country, people have had to pay the true costs of the misguided austerity policies that have accompanied it.  Read more…

How to re-establish trust in economics as a science

January 3, 2019 184 comments

from Lars Syll

Students all over the world are increasingly questioning whether the kind of economics they are taught — mainstream economics — really is of any value. Some have even started to question whether economics is a science.

Two ‘Nobel laureates’ in economics — Robert Shiller and Paul Krugman — have lately tried to respond:

Critics of “economic sciences” sometimes refer to the development of a “pseudoscience” of economics, arguing that it uses the trappings of science, like dense mathematics, but only for show …

My belief is that economics is somewhat more vulnerable than the physical sciences to models whose validity will never be clear, because the necessity for approximation is much stronger than in the physical sciences, especially given that the models describe people rather than magnetic resonances or fundamental particles …

But all the mathematics in economics is not … charlatanism. Economics has an important quantitative side, which cannot be escaped …

While economics presents its own methodological problems, the basic challenges facing researchers are not fundamentally different from those faced by researchers in other fields.

Robert Shiller

Read more…

The essence of scientific reasoning

December 29, 2018 20 comments

from Lars Syll

In deductive reasoning all knowledge obtainable is already latent in the postulates. Rigour is needed to prevent the successive inferences growing less and less accurate as we proceed. The conclusions are never more accurate than the data. In inductive reasoning we are performing part of the process by which new knowledge is created. The conclusions normally grow more and more accurate as more data are included. It should never be true, though it is still often said, that the conclusions are no more accurate than the data on which they are based.

R. A. Fisher

In science we standardly use a logically non-valid inference — the fallacy of affirming the consequent — of the following form:  Read more…
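The schematic form in question is presumably the standard one, with H a hypothesis and E the observed evidence:

```latex
% Affirming the consequent: deductively invalid, yet the backbone of
% hypothetico-deductive reasoning in science.
\[
  \frac{H \rightarrow E \qquad\quad E}{\therefore\; H}
\]
```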

Teaching of economics — captured by a small and dangerous sect

December 25, 2018 38 comments

from Lars Syll

The fallacy of composition basically consists of the false belief that the whole is nothing but the sum of its parts. In society and in the economy, this is arguably not the case. An adequate analysis of society and economy a fortiori can’t proceed by just adding up the acts and decisions of individuals. The whole is more than the sum of its parts.  Read more…

Why statistical significance is worthless in science

December 20, 2018 2 comments

from Lars Syll

There are at least 20 or so common misunderstandings and abuses of p-values and NHST [Null Hypothesis Significance Testing]. Most of them are related to the definition of the p-value … Other misunderstandings are about the implications of statistical significance.

Statistical significance does not mean substantive significance: just because an observation (or a more extreme observation) was unlikely had there been no differences in the population does not mean that the observed differences are large enough to be of practical relevance. At high enough sample sizes, any difference will be statistically significant regardless of effect size.

Statistical non-significance does not entail equivalence: a failure to reject the null hypothesis is just that. It does not mean that the two groups are equivalent, since statistical non-significance can be due to low sample size.  Read more…
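A minimal simulation (mine, not the quoted author’s) makes the first point concrete: fix a negligible true effect and let the sample grow, and ‘statistical significance’ arrives all by itself. It assumes NumPy and SciPy are available:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

effect = 0.05  # true mean difference: negligible in practical terms

for n in (100, 10_000, 1_000_000):
    a = rng.normal(0.0, 1.0, size=n)
    b = rng.normal(effect, 1.0, size=n)
    t, p = stats.ttest_ind(a, b)
    print(f"n = {n:>9,}   p-value = {p:.3g}")

# Typical result: at n = 100 the p-value sits far above 0.05; by
# n = 1,000,000 it is essentially zero -- yet the true effect never
# changed. 'Significance' here tracks sample size, not importance.
```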

Why all models are wrong

December 18, 2018 20 comments

from Lars Syll

Models share three common characteristics: First, they simplify, stripping away unnecessary details, abstracting from reality, or creating anew from whole cloth. Second, they formalize, making precise definitions. Models use mathematics, not words … Models create structures within which we can think logically … But the logic comes at a cost, which leads to their third characteristic: all models are wrong … Models are wrong because they simplify. They omit details. By considering many models, we can overcome the narrowing of rigor by crisscrossing the landscape of the possible.

To rely on a single model is hubris. It invites disaster … We need many models to make sense of complex systems.

Yes indeed. To rely on a single mainstream economic theory and its models is hubris. It certainly does invite disaster. To make sense of complex economic phenomena we need many theories and models. We need pluralism. Pluralism both in theories and methods.  Read more…

Disconfirming rational expectations

December 16, 2018 15 comments

from Lars Syll

Empirical efforts at testing the correctness of the rational expectations hypothesis have resulted in a series of studies that have more or less concluded that it is not consistent with the facts. In one of the better-known and most highly respected evaluation reviews, Michael Lovell (1986) concluded:

it seems to me that the weight of empirical evidence is sufficiently strong to compel us to suspend belief in the hypothesis of rational expectations, pending the accumulation of additional empirical evidence.

And this is how Nikolay Gertchev summarizes studies on the empirical correctness of the hypothesis:  Read more…

Econometrics — analysis with incredible certitude

December 14, 2018 9 comments

from Lars Syll

There have been over four decades of econometric research on business cycles …

But the significance of the formalization becomes more difficult to identify when it is assessed from the applied perspective …

The wide conviction of the superiority of the methods of the science has converted the econometric community largely to a group of fundamentalist guards of mathematical rigour … So much so that the relevance of the research to business cycles is reduced to empirical illustrations. To that extent, probabilistic formalisation has trapped econometric business cycle research in the pursuit of means at the expense of ends.

The limits of econometric forecasting have, as noted by Qin, been critically pointed out many times before. Trygve Haavelmo assessed the role of econometrics — in an article from 1958 — and although mainly positive about the “repair work” and “clearing-up work” done, Haavelmo also found some grounds for despair:  Read more…

Your model is internally consistent? So what!

December 13, 2018 22 comments

from Lars Syll

‘New Keynesian’ macroeconomist Simon Wren-Lewis has a post on his blog discussing how evidence is treated in modern macroeconomics (emphasis added):

The unique property that DSGE models have is internal consistency. Take a DSGE model, and alter a few equations so that they fit the data much better, and you have what could be called a structural econometric model. It is internally inconsistent, but because it fits the data better it may be a better guide for policy.

Being able to model a credible world, a world that somehow could be considered real or similar to the real world, is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still possibly serve our pursuit of truth. But then they cannot be unrealistic or false in just any way. The falsehood or unrealisticness has to be qualified (in terms of resemblance, relevance, etc.).

At the very least, the minimalist demand on models in terms of credibility has to give way to a stronger epistemic demand of appropriate similarity and plausibility. One could of course also ask for a sensitivity or robustness analysis, but the credible world, even after having been tested for sensitivity and robustness, can still be far away from reality – and unfortunately often in ways we know are important. Robustness of claims in a model does not per se give a warrant for exporting the claims to real-world target systems.

Yours truly and people like Tony Lawson have for many years been urging economists to pay attention to the ontological foundations of their assumptions and models. Sad to say, Read more…

‘Controlling for’ — a methodological urban legend

December 10, 2018 3 comments

from Lars Syll

Trying to reduce the risk of having established only ‘spurious relations’ when dealing with observational data, statisticians and econometricians standardly add control variables. The hope is that one thereby will be able to make more reliable causal inferences. But — as Keynes argued back in the 1930s when criticizing statistical-econometric applications of regression analysis — if you do not manage to get hold of all potential confounding factors, the model risks producing estimates of the variable of interest that are even worse than models without any control variables at all. Conclusion: think twice before you simply include ‘control variables’ in your models!
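One concrete way a control variable can make things worse is conditioning on a ‘collider’, a variable caused by both the regressor and the outcome. A toy simulation (my illustration of the general point, not Keynes’s own example):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# True model: y is caused by x alone, with effect +1.0.
x = rng.normal(size=n)
y = x + rng.normal(size=n)

# z is a collider: an effect of both x and y, not a confounder.
z = x + y + rng.normal(size=n)

# Simple regression of y on x recovers the true effect (~1.0).
slope_simple = np.polyfit(x, y, 1)[0]

# 'Controlling for' z drags the estimated effect of x toward zero.
X = np.column_stack([np.ones(n), x, z])
slope_controlled = np.linalg.lstsq(X, y, rcond=None)[0][1]

print(f"true effect 1.00 | no control {slope_simple:.2f} | "
      f"'controlling for' z {slope_controlled:.2f}")
```

With these parameters the ‘controlled’ estimate collapses to roughly zero: the added regressor did not remove bias, it created it, wiping out a perfectly real effect.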

The gender pay gap is a fact that, sad to say, to a non-negligible extent is the result of discrimination. And even though many women are not deliberately discriminated against, but rather self-select into lower-wage jobs, this in no way magically explains away the discrimination gap. As decades of socialization research have shown, women may be ‘structural’ victims of impersonal social mechanisms that in different ways aggrieve them. Wage discrimination is unacceptable. Wage discrimination is a shame.  Read more…

DSGE — models built on shaky ground

December 8, 2018 9 comments

from Lars Syll

In most aspects of their lives humans must plan forwards. They take decisions today that affect their future in complex interactions with the decisions of others. When taking such decisions, the available information is only ever a subset of the universe of past and present information, as no individual or group of individuals can be aware of all the relevant information. Hence, views or expectations about the future, relevant for their decisions, use a partial information set, formally expressed as a conditional expectation given the available information.

Moreover, all such views are predicated on there being no unanticipated future changes in the environment pertinent to the decision. This is formally captured in the concept of ‘stationarity’. Without stationarity, good outcomes based on conditional expectations could not be achieved consistently. Fortunately, there are periods of stability when insights into the way that past events unfolded can assist in planning for the future.

The world, however, is far from completely stationary. Unanticipated events occur, and they cannot be dealt with using standard data-transformation techniques such as differencing, or by taking linear combinations, or ratios. In particular, ‘extrinsic unpredictability’ – unpredicted shifts of the distributions of economic variables at unanticipated times – is common. As we shall illustrate, extrinsic unpredictability has dramatic consequences for the standard macroeconomic forecasting models used by governments around the world – models known as ‘dynamic stochastic general equilibrium’ models – or DSGE models …  Read more…
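In standard notation (my rendering, not necessarily the quoted authors’), the conditional expectation and the stationarity condition the passage describes look like this:

```latex
% Expectations conditional on the information set I_t, and (strict)
% stationarity: joint distributions are invariant to a time shift h.
\[
  \mathbb{E}\!\left[\, y_{t+1} \mid \mathcal{I}_t \,\right],
  \qquad
  F_{y_{t_1},\ldots,y_{t_k}} = F_{y_{t_1+h},\ldots,y_{t_k+h}}
  \quad \text{for all } h .
\]
```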

Economic crises and uncertainty

December 6, 2018 8 comments

from Lars Syll

The financial crisis of 2007-08 took most laymen and economists by surprise. What was it that went wrong with our macroeconomic models, since they obviously did not foresee the collapse or even make it conceivable?

There are many who have ventured to answer this question. And they have come up with a variety of answers, ranging from the exaggerated mathematization of economics to irrational and corrupt politicians.

But the root of our problem goes much deeper. It ultimately goes back to how we look upon the data we are handling. In ‘modern’ macroeconomics — Dynamic Stochastic General Equilibrium, New Synthesis, New Classical and New ‘Keynesian’ — variables are treated as if drawn from a known ‘data-generating process’ that unfolds over time and from which we therefore have access to heaps of historical time-series data. If we do not assume that we know the ‘data-generating process’ – if we do not have the ‘true’ model – the whole edifice collapses. And of course it has to. I mean, who really honestly believes that we should have access to this mythical Holy Grail, the data-generating process?

‘Modern’ macroeconomics obviously did not anticipate the enormity of the problems that unregulated ‘efficient’ financial markets created. Why? Because it builds on the myth of us knowing the ‘data-generating process’ and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.  Read more…
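A minimal sketch (mine, purely for illustration) of why treating the data as draws from one known, stable process misleads as soon as the process shifts:

```python
import numpy as np

rng = np.random.default_rng(1)

# First half of the sample: a calm, stable 'data-generating process' ...
calm = rng.normal(loc=2.0, scale=1.0, size=500)
# ... second half: an unanticipated structural break.
crisis = rng.normal(loc=-1.0, scale=3.0, size=500)
sample = np.concatenate([calm, crisis])

# Treating the whole series as draws from ONE known distribution yields
# a mean and variance that describe neither regime.
print(f"estimated mean {sample.mean():+.2f}, std {sample.std():.2f}")
```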

Polanyi and Keynes on the idea of ‘self-adjusting’ markets

December 2, 2018 4 comments

from Lars Syll

Paul Krugman has repeatedly argued over the years that we should continue to use mainstream economics hobby horses like IS-LM and AS-AD models. Here’s one example:

So why do AS-AD? … We do want, somewhere along the way, to get across the notion of the self-correcting economy, the notion that in the long run, we may all be dead, but that we also have a tendency to return to full employment via price flexibility. Or to put it differently, you do want somehow to make clear the notion (which even fairly Keynesian guys like me share) that money is neutral in the long run.

I seriously doubt that Keynes would have been impressed by having his theory characterized by catchwords like “tendency to return to full employment” and “money is neutral in the long run.”

“Our thesis is that the idea of a self-adjusting market implied a stark utopia …” (Karl Polanyi)

One of Keynes’s central tenets is that there is no strong automatic tendency for economies to move towards full employment levels.  Read more…

Demystifying economics

November 29, 2018 3 comments

from Lars Syll

The first thing to understand about macroeconomic theory is that it is weirder than you think. The heart of it is the idea that the economy can be thought of as a single infinite-lived individual trading off leisure and consumption over all future time …

This approach is formalized in something called the Euler equation … Some version of this equation is the basis of most articles on macroeconomic theory published in a mainstream journal in the past 30 years …

The models may abstract away from features of the world that non-economists might think are rather fundamental to “the economy” — like the existence of businesses, money, and government … But in today’s profession, if you don’t at least start from there, you’re not doing economics.

J W Mason

Yes indeed, mainstream macroeconomics sure is weird. Very weird. And among the weirdest things are those Euler equations Mason mentions in his article.  Read more…
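For reference, the equation Mason has in mind is, in its standard textbook form (the notation here is mine, not necessarily his): the representative agent equates the marginal utility of consuming today with the discounted, expected marginal utility of saving and consuming tomorrow.

```latex
% The consumption Euler equation of the representative agent:
% u' is marginal utility, beta the discount factor, r the return on saving.
\[
  u'(c_t) \;=\; \beta\,(1 + r_{t+1})\;
  \mathbb{E}_t\!\left[\, u'(c_{t+1}) \,\right]
\]
```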