Regression analysis — a case of wishful thinking

July 19, 2018 1 comment

from Lars Syll

The impossibility of proper specification is true generally in regression analyses across the social sciences, whether we are looking at the factors affecting occupational status, voting behavior, etc. The problem is that, as implied by the conditions for regression analyses to yield accurate, unbiased estimates, you need to investigate a phenomenon that has underlying mathematical regularities – and, moreover, you need to know what they are. Neither seems true. I have no reason to believe that the ways in which multiple factors affect earnings, student achievement, and GNP have some underlying mathematical regularity across individuals or countries. More likely, each individual or country has a different function, and one that changes over time. Even if there were some constancy, the processes are so complex that we have no idea of what the function looks like.

Researchers recognize that they do not know the true function and seem to treat, usually implicitly, their results as a good-enough approximation. But there is no basis for the belief that the results of what is run in practice are anything close to the underlying phenomenon, even if there is an underlying phenomenon. This just seems to be wishful thinking. Most regression analysis research doesn’t even pay lip service to theoretical regularities. But you can’t just regress anything you want and expect the results to approximate reality. And even when researchers take somewhat seriously the need to have an underlying theoretical framework – as they have, at least to some extent, in the examples of studies of earnings, educational achievement, and GNP that I have used to illustrate my argument – they are so far from the conditions necessary for proper specification that one can have no confidence in the validity of the results.

Steven J. Klees 

Read more…

The main reason why almost all econometric models are wrong

July 17, 2018 21 comments

from Lars Syll

How come that econometrics and statistical regression analyses still have not taken us very far in discovering, understanding, or explaining causation in socio-economic contexts? That is the question yours truly has tried to answer in an article published in the latest issue of World Economic Association Commentaries:

The processes that generate socio-economic data in the real world cannot just be assumed to always be adequately captured by a probability measure. And, so, it cannot be maintained that it even should be mandatory to treat observations and data — whether cross-section, time series or panel data — as events generated by some probability model. The important activities of most economic agents do not usually include throwing dice or spinning roulette-wheels. Data generating processes — at least outside of nomological machines like dice and roulette-wheels — are not self-evidently best modelled with probability measures.

When economists and econometricians — often uncritically and without arguments — simply assume that one can apply probability distributions from statistical theory to their own area of research, they are really skating on thin ice. If you cannot show that data satisfies all the conditions of the probabilistic nomological machine, then the statistical inferences made in mainstream economics lack sound foundations.

Statistical — and econometric — patterns should never be seen as anything other than possible clues to follow. Behind observable data, there are real structures and mechanisms operating, things that are — if we really want to understand, explain and (possibly) predict things in the real world — more important to get hold of than to simply correlate and regress observable variables.

Statistics cannot establish the truth value of a fact. Never has. Never will.

Hard and soft science — a flawed dichotomy

July 16, 2018 14 comments

from Lars Syll

The distinctions between hard and soft sciences are part of our culture … But the important distinction is really not between the hard and the soft sciences. Rather, it is between the hard and the easy sciences. Easy-to-do science is what those in physics, chemistry, geology, and some other fields do. Hard-to-do science is what the social scientists do and, in particular, it is what we educational researchers do. In my estimation, we have the hardest-to-do science of them all! We do our science under conditions that physical scientists find intolerable. We face particular problems and must deal with local conditions that limit generalizations and theory building, problems that are different from those faced by the easier-to-do sciences …

Huge context effects cause scientists great trouble in trying to understand school life … A science that must always be sure the myriad particulars are well understood is harder to build than a science that can focus on the regularities of nature across contexts …

Doing science and implementing scientific findings are so difficult in education because humans in schools are embedded in complex and changing networks of social interaction. The participants in those networks have variable power to affect each other from day to day, and the ordinary events of life (a sick child, a messy divorce, a passionate love affair, migraine headaches, hot flashes, a birthday party, alcohol abuse, a new principal, a new child in the classroom, rain that keeps the children from a recess outside the school building) all affect doing science in school settings by limiting the generalizability of educational research findings. Compared to designing bridges and circuits or splitting either atoms or genes, the science to help change schools and classrooms is harder to do because context cannot be controlled.

David Berliner

Read more…

What are axiomatizations good for?

July 14, 2018 21 comments

from Lars Syll

Axiomatic decision theory was pioneered in the early 20th century by Ramsey (1926) and de Finetti (1931,1937), and achieved remarkable success in shaping economic theory … A remarkable amount of economic research is now centered around axiomatic models of decision …

What have these axiomatizations done for us lately? What have we gained from them? Are they leading to advances in economic analysis, or are they perhaps attracting some of the best minds in the field to deal with difficult problems that are of little import? Why is it the case that in other sciences, such as psychology, biology, and chemistry, such axiomatic work is so rarely found? Are we devoting too much time to axiomatic derivations at the expense of developing theories that fit the data?

This paper addresses these questions … Section 4 provides our response, namely that axiomatic derivations are powerful rhetorical devices …

I. Gilboa, A. Postlewaite, L. Samuelson, & D. Schmeidler

‘Powerful rhetorical devices’? What an impressive achievement indeed …

Some of us have for years been urging economists to pay attention to the ontological foundations of their assumptions and models. Sad to say, economists have not paid much attention — and so modern economics has become increasingly irrelevant to the understanding of the real world.  Read more…

The core problem with ‘New Keynesian’ macroeconomics

July 12, 2018 11 comments

from Lars Syll

Whereas the Great Depression of the 1930s produced Keynesian economics, and the stagflation of the 1970s produced Milton Friedman’s monetarism, the Great Recession has produced no similar intellectual shift.

This is deeply depressing to young students of economics, who hoped for a suitably challenging response from the profession. Why has there been none?

Krugman’s answer is typically ingenious: the old macroeconomics was, as the saying goes, “good enough for government work” … Krugman is a New Keynesian, and his essay was intended to show that the Great Recession vindicated standard New Keynesian models. But there are serious problems with Krugman’s narrative …

Because the New Keynesian models did not offer a sufficient basis for maintaining Keynesian policies once the economic emergency had been overcome, they were quickly abandoned …

The problem for New Keynesian macroeconomists is that they fail to acknowledge radical uncertainty in their models, leaving them without any theory of what to do in good times in order to avoid the bad times. Their focus on nominal wage and price rigidities implies that if these factors were absent, equilibrium would readily be achieved …

Without acknowledgement of uncertainty, saltwater economics is bound to collapse into its freshwater counterpart. New Keynesian “tweaking” will create limited political space for intervention, but not nearly enough to do a proper job.

Robert Skidelsky

Skidelsky’s article shows why we all ought to be sceptical of the pretences and aspirations of ‘New Keynesian’ macroeconomics. So far it is hard to see that it has yielded very much in terms of realist and relevant economic knowledge. And — as if that wasn’t enough — there’s nothing new or Keynesian about it!  Read more…

The randomistas revolution

July 10, 2018 6 comments

from Lars Syll

In his new history of experimental social science — Randomistas: How radical researchers are changing our world — Andrew Leigh gives an introduction to the RCT (randomized controlled trial) method for conducting experiments in medicine, psychology, development economics, and policy evaluation. Although he mentions that there are critiques that can be levelled against the method, the author does not let that overshadow his overwhelmingly enthusiastic view of RCTs.

Among mainstream economists, this uncritical attitude towards RCTs has become standard. Nowadays many mainstream economists maintain that ‘imaginative empirical methods’ — such as natural experiments, field experiments, lab experiments, RCTs — can help us to answer questions concerning the external validity of economic models. In their view, such methods are more or less tests of ‘an underlying economic model’ and enable economists to make the right selection from the ever-expanding ‘collection of potentially applicable models.’

When looked at carefully, however, there are in fact few real reasons to share this optimism on the alleged ‘empirical turn’ in economics.  Read more…

Econometrics cannot establish the truth value of a fact. Never has. Never will.

July 9, 2018 2 comments

from Lars Syll

There seems to be a pervasive human aversion to uncertainty, and one way to reduce feelings of uncertainty is to invest faith in deduction as a sufficient guide to truth. Unfortunately, such faith is as logically unjustified as any religious creed, since a deduction produces certainty about the real world only when its assumptions about the real world are certain …

Assumption uncertainty reduces the status of deductions and statistical computations to exercises in hypothetical reasoning – they provide best-case scenarios of what we could infer from specific data (which are assumed to have only specific, known problems). Even more unfortunate, however, is that this exercise is deceptive to the extent it ignores or misrepresents available information, and makes hidden assumptions that are unsupported by data …

Econometrics supplies dramatic cautionary examples in which complex modelling has failed miserably in important applications …

Sander Greenland

Yes, indeed, econometrics fails miserably over and over again. One reason why it does is that the error term in the regression models used is thought of as representing the effect of the variables that were omitted from the models. The error term is somehow thought to be a ‘cover-all’ term representing omitted content in the model and necessary to include to ‘save’ the assumed deterministic relation between the other random variables included in the model. Error terms are usually assumed to be orthogonal (uncorrelated) to the explanatory variables. But since they are unobservable, they are also impossible to test empirically. And without justification of the orthogonality assumption, there is, as a rule, nothing to ensure identifiability:  Read more…
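The untestability of the orthogonality assumption is easy to see in a small simulation. Everything below is invented for illustration: an omitted variable hides in the error term, OLS is biased, and yet the fitted residuals are orthogonal to the regressor by construction, so the data can never reveal the violation.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical setup: the error term contains an omitted variable u that
# is correlated with the regressor x, violating orthogonality.
u = rng.normal(size=n)
x = u + rng.normal(size=n)           # x is correlated with the omitted u
eps = u + 0.5 * rng.normal(size=n)   # true error, NOT orthogonal to x
y = 2.0 * x + eps                    # true coefficient on x is 2.0

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta[1])  # ~2.5: biased away from the true value 2.0

# The violation is invisible from inside the model: fitted residuals are
# orthogonal to x by construction, whatever the true errors do.
residuals = y - X @ beta
print(np.corrcoef(x, residuals)[0, 1])  # ~0, although corr(x, eps) is ~0.6
```

The sample residuals always come out uncorrelated with the regressors, so no amount of staring at them can justify the orthogonality assumption itself.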

So much for ‘statistical objectivity’

July 6, 2018 2 comments

from Lars Syll

Last year, we recruited 29 teams of researchers and asked them to answer the same research question with the same data set. Teams approached the data with a wide array of analytical techniques, and obtained highly varied results …

All teams were given the same large data set collected by a sports-statistics firm across four major football leagues. It included referee calls, counts of how often referees encountered each player, and player demographics including team position, height and weight. It also included a rating of players’ skin colour …

Of the 29 teams, 20 found a statistically significant correlation between skin colour and red cards … Findings varied enormously, from a slight (and non-significant) tendency for referees to give more red cards to light-skinned players to a strong trend of giving more red cards to dark-skinned players …

Had any one of these 29 analyses come out as a single peer-reviewed publication, the conclusion could have ranged from no race bias in referee decisions to a huge bias.

Raphael Silberzahn & Eric Uhlmann

Read more…
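The many-analysts result can be reproduced in miniature. The sketch below uses entirely invented data, not the actual football data set: every hypothetical 'team' regresses the same outcome on the same variable of interest but chooses its own set of controls, and the eight equally defensible specifications already produce a noticeable spread of estimated 'effects'.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)
n = 2_000

# Invented data: three candidate control variables, a regressor x, and an
# outcome y whose association with x runs entirely through the first control.
controls = rng.normal(size=(n, 3))
x = 0.5 * controls[:, 0] - 0.3 * controls[:, 1] + rng.normal(size=n)
y = 0.7 * controls[:, 0] + 0.4 * controls[:, 2] + rng.normal(size=n)

# Every 'team' regresses y on x but picks its own subset of controls.
estimates = []
for k in range(4):
    for subset in combinations(range(3), k):
        X = np.column_stack([np.ones(n), x] + [controls[:, j] for j in subset])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        estimates.append(beta[1])

# Eight defensible specifications, a whole range of 'effects' of x:
print(min(estimates), max(estimates))
```

With 29 real teams free to choose estimators, subsamples, and functional forms as well, the spread in the published study is the expected outcome, not an anomaly.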

The mess at the heart of the EU

July 4, 2018 21 comments

from Lars Syll

The EU establishment has been held to account for the euro mess, for austerity policies that turned recession into depression, for the galloping inequality, and for the millions and millions of unemployed.

The EU austerity policies bred understandable and righteous anger — but also ugly far-right xenophobic political movements taking advantage of the frustration that austerity policies inevitably produce. Ultimately, this underlines the threat that austerity policies and mass unemployment pose to society.

The neoliberal austerity policies pursued in the EU are deeply disturbing. When an economy is already hanging on the ropes, you can’t just cut government spending. Cutting government expenditure reduces aggregate demand. Lower aggregate demand means lower tax revenues. Lower tax revenues mean increased deficits — and calls for even more austerity. And so on, and so on.
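The spiral can be put in toy numbers. The multiplier and tax rate below are assumptions chosen purely for illustration, not estimates for any actual EU economy:

```python
# Toy arithmetic for the austerity feedback loop described above.
multiplier = 1.5   # assumed fiscal multiplier
tax_rate = 0.4     # assumed average tax share of income

spending_cut = 100.0                          # government cuts spending by 100
fall_in_demand = multiplier * spending_cut    # aggregate demand falls by 150
lost_tax_revenue = tax_rate * fall_in_demand  # tax revenues fall by 60

# The deficit improves by far less than the headline cut:
deficit_improvement = spending_cut - lost_tax_revenue
print(deficit_improvement)  # roughly 40: most of the cut evaporates
```

With a multiplier at or above 1/tax_rate (here 2.5), the cut would not reduce the deficit at all, hence the calls for even more austerity.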

Without a conscious effort to counteract the inevitable forces driving our societies towards extreme income and wealth inequality, our societies crack. It is crucial to have strong redistributive policies if we want to have stable economies and societies. Redistributive taxes and active fiscal policies are necessary ingredients for building a good society.  Read more…

Probability and rationality — trickier than you might think

July 3, 2018 11 comments

from Lars Syll

The Coin-tossing Problem

My friend Bengt says that on the first day he got the following sequence of Heads and Tails when tossing a coin:
H H H H H H H H H H

And on the second day he says that he got the following sequence:
H T T H H T T H T H

Which day-report makes you suspicious?

Most people I ask this question say the first day-report looks suspicious.

But actually, both days are equally probable! Every time you toss a (fair) coin there is the same probability (50%) of getting H or T. Both days Bengt makes equally many tosses, and every sequence is equally probable!
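A quick way to convince yourself, sketched in Python:

```python
from math import comb

p = 0.5  # fair coin

day1 = "HHHHHHHHHH"
day2 = "HTTHHTTHTH"

def sequence_probability(seq):
    """Probability of observing one exact sequence of independent fair tosses."""
    prob = 1.0
    for _ in seq:
        prob *= p
    return prob

print(sequence_probability(day1))  # 0.0009765625
print(sequence_probability(day2))  # 0.0009765625 -- exactly the same

# What differs is the probability of the *number* of heads, because 252
# different sequences give 5 heads but only one gives 10:
print(comb(10, 10) * p ** 10)  # about 0.001
print(comb(10, 5) * p ** 10)   # about 0.246
```

The intuition that day one is 'suspicious' tracks the count of heads, not the particular sequence, which is why the two questions get conflated.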

The Linda Problem

Linda is 40 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which of the following two alternatives is more probable?  Read more…

Paul Krugman — a math-is-the-message-modeler

July 1, 2018 55 comments

from Lars Syll

In a post on his blog, Paul Krugman argues that ‘Keynesian’ macroeconomics more than anything else “made economics the model-oriented field it has become.” In Krugman’s eyes, Keynes was a “pretty klutzy modeler,” and it was only thanks to Samuelson’s famous 45-degree diagram and Hicks’ IS-LM that things got into place. Although admitting that economists have a tendency to use “excessive math” and “equate hard math with quality”, he still vehemently defends — and always has — the mathematization of economics:

I’ve seen quite a lot of what economics without math and models looks like — and it’s not good.

Sure, ‘New Keynesian’ economists like Mankiw and Krugman — and their forerunners, ‘Keynesian’ economists like Paul Samuelson and (young) John Hicks — certainly have contributed to making economics more mathematical and “model-oriented.”

But if these math-is-the-message-modelers aren’t able to show that the mechanisms or causes that they isolate and handle in their mathematically formalized macromodels also are applicable to the real world, these mathematical models are of limited value to our understanding of real-world economies.

When it comes to modeling philosophy, Krugman defends his position in the following words (my italics): Read more…

Krugman’s formalization schizophrenia

June 30, 2018 4 comments

from Lars Syll

In an article published last week, Nicholas Gruen criticized the modern vogue of formalization in economics and took Paul Krugman as an example:

He’s saying first that economists can’t see what isn’t in their models – whereas Hicks and pretty much every economist until the late twentieth century would have understood the need for careful and ongoing reconciliation of formal modelling and other sources of knowledge. More shockingly, he’s saying that those who smell a rat at the dysfunctionality of all this should just get over themselves. To quote Krugman:

“You may not like this tendency; certainly economists tend to be too quick to dismiss what has not been formalized (although I believe that the focus on models is basically right).”

It’s ironic given how compellingly Krugman has documented the regression of macroeconomics in the same period that saw his own rise via new trade theory. I think both retrogressions were driven by formalisation at all costs, though, in the case of new classical macro, this mindset gave additional licence to the motivated reasoning of the libertarian right. In each case, economics regresses into scholastic abstractions, and obviously important parts of the story slide into pristine invisibility to the elect.

Responding to the article, Paul Krugman yesterday rode out to defend formalism in economics:  Read more…

The main reason why almost all econometric models are wrong

June 29, 2018 5 comments

from Lars Syll

Since econometrics doesn’t content itself with only making optimal predictions, but also aspires to explain things in terms of causes and effects, econometricians need loads of assumptions — the most important of which are additivity and linearity. Important, simply because if they are not true, your model is invalid and descriptively incorrect. And when the model is wrong — well, then it’s wrong.

The assumption of additivity and linearity means that the outcome variable is, in reality, linearly related to any predictors … and that if you have several predictors then their combined effect is best described by adding their effects together …

This assumption is the most important because if it is not true then even if all other assumptions are met, your model is invalid because you have described it incorrectly. It’s a bit like calling your pet cat a dog: you can try to get it to go in a kennel, or to fetch sticks, or to sit when you tell it to, but don’t be surprised when its behaviour isn’t what you expect because even though you’ve called it a dog, it is in fact a cat. Similarly, if you have described your statistical model inaccurately it won’t behave itself and there’s no point in interpreting its parameter estimates or worrying about significance tests or confidence intervals: the model is wrong.

Andy Field

Let me take the opportunity to elaborate a little on why I find these assumptions to be of such paramount importance, and why they ought to be argued for much more — on both epistemological and ontological grounds — if they are to be used at all.  Read more…
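Field's cat-called-dog point can be made concrete with a simulation on invented data: fit a straight line to a relationship that is in fact quadratic, and the model cheerfully reports that x has no effect at all.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Invented example: the true relation is quadratic, but we fit a line.
x = rng.uniform(-3, 3, size=n)
y = x ** 2 + rng.normal(scale=0.5, size=n)   # y is almost a function of x

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
slope = beta[1]
print(slope)  # near zero: the linear model reports 'no effect' of x

# Yet x nearly determines y. The model, not the world, is what is wrong:
fitted = X @ beta
r2 = 1 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
print(r2)     # close to zero despite the nearly deterministic relationship
```

Interpreting that slope's standard error or p-value would be exactly the pointless exercise Field describes: the cat has simply been called a dog.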

Marginal productivity theory

June 28, 2018 4 comments

from Lars Syll

The correlation between high executive pay and good performance is “negligible”, a new academic study has found, providing reformers with fresh evidence that a shake-up of Britain’s corporate remuneration systems is overdue.

Although big company bosses enjoyed pay rises of more than 80 per cent in a decade, performance as measured by economic returns on invested capital was less than 1 per cent over the period, the paper by Lancaster University Management School says.

“Our findings suggest a material disconnect between pay and fundamental value generation for, and returns to, capital providers,” the authors of the report said.

In a study of more than a decade of data on the pay and performance of Britain’s 350 biggest listed companies, Weijia Li and Steven Young found that remuneration had increased 82 per cent in real terms over the 11 years to 2014 … The research found that the median economic return on invested capital, a preferable measure, was less than 1 per cent over the same period.

Patrick Jenkins/Financial Times

Mainstream economics textbooks usually refer to the interrelationship between technological development and education as the main causal force behind increased inequality. If the educational system (supply) develops at the same pace as technology (demand), there should be no increase, ceteris paribus, in the ratio between high-income (highly educated) groups and low-income (low education) groups. In the race between technology and education, the proliferation of skill-biased technological change has, however, allegedly increased the premium for the highly educated group.  Read more…

Why the euro cannot be saved

June 27, 2018 18 comments

from Lars Syll

The euro may be approaching another crisis. Italy, the eurozone’s third largest economy, has chosen what can at best be described as a Euroskeptic government. This should surprise no one. The backlash in Italy is another predictable (and predicted) episode in the long saga of a poorly designed currency arrangement, in which the dominant power, Germany, impedes the necessary reforms and insists on policies that exacerbate the inherent problems, using rhetoric seemingly intended to inflame passions.

Italy has been performing poorly since the euro’s launch. Its real (inflation-adjusted) GDP in 2016 was the same as it was in 2001. But the eurozone as a whole has not been doing well, either … If one country does poorly, blame the country; if many countries are doing poorly, blame the system … The euro was a system almost designed to fail. It took away governments’ main adjustment mechanisms (interest and exchange rates); and, rather than creating new institutions to help countries cope with the diverse situations in which they find themselves, it imposed new strictures – often based on discredited economic and political theories – on deficits, debt, and even structural policies.

The euro was supposed to bring shared prosperity, which would enhance solidarity and advance the goal of European integration. In fact, it has done just the opposite, slowing growth and sowing discord …

The central problem in a currency area is how to correct exchange-rate misalignments like the one now affecting Italy. Germany’s answer is to put the burden on the weak countries already suffering from high unemployment and low growth rates. We know where this leads: more pain, more suffering, more unemployment, and even slower growth …

Across the eurozone, political leaders are moving into a state of paralysis: citizens want to remain in the EU, but also want an end to austerity and the return of prosperity. They are told they can’t have both. Ever hopeful of a change of heart in northern Europe, troubled governments stay the course, and the suffering of their people increases …

Germany and other countries in northern Europe can save the euro by showing more humanity and more flexibility. But, having watched the first acts of this play so many times, I am not counting on them to change the plot.

Joseph Stiglitz

The euro has taken away the possibility for national governments to manage their economies in a meaningful way — and in Italy, just as in Greece a couple of years ago, the people have had to pay the true costs of its concomitant misguided austerity policies.  Read more…

Noah Smith’s unicorn defence of economics

June 26, 2018 10 comments

from Lars Syll

Unlike the old neoclassical theories, game theory concerns strategic interaction between different people. It can encompass things like wage bargaining, fraud and lots of other things that neoclassical equilibrium glosses over or leaves out.  And in game theory, free markets full of rational actors can easily, even regularly, lead to inefficient outcomes that require government intervention.

Noah Smith/Bloomberg

“Free markets full of rational actors.” Sounds great, does it not? The problem? In the real world, there is no such thing! Defending mainstream economics against its critics with game-theoretical unicorns actually only confirms how justified the criticism is.

Half a century ago there were widespread hopes that game theory would provide a unified theory of social science. Today it has become obvious that those hopes did not materialize. This ought to come as no surprise. Reductionist and atomistic models of social interaction — such as those mainstream economics and game theory are founded on — will never deliver sustainable building blocks for a realist and relevant social science. That is also the reason why game theory will never be anything but a footnote in the history of social science.  Read more…

How to be a great economist

June 24, 2018 35 comments

from Lars Syll

The master-economist must possess a rare combination of gifts … He must be mathematician, historian, statesman, philosopher—in some degree. He must understand symbols and speak in words. He must contemplate the particular, in terms of the general, and touch abstract and concrete in the same flight of thought. He must study the present in the light of the past for the purposes of the future. No part of man’s nature or his institutions must be entirely outside his regard. He must be purposeful and disinterested in a simultaneous mood, as aloof and incorruptible as an artist, yet sometimes as near to earth as a politician.

John Maynard Keynes

Economics students today are complaining more and more about the way economics is taught. The lack of fundamental diversity — not just path-dependent elaborations of the mainstream canon — and the narrowing of the curriculum dissatisfy econ students all over the world. The frustrating lack of real-world relevance has led many of them to demand that the discipline start developing a more open and pluralistic theoretical and methodological attitude.

There are many things about the way economics is taught today that worry yours truly. Today’s students are force-fed with mainstream neoclassical theories and models. That lack of pluralism is cause for serious concern.

Read more…

Statistics and econometrics are not very helpful for understanding economies

June 23, 2018 27 comments

from Lars Syll

A statistician may have done the programming, but when you press a button on a computer keyboard and ask the computer to find some good patterns, you had better get clear on a sad fact: computers do not think. They do exactly what the programmer told them to do and nothing more. They look for the patterns that we tell them to look for, those and nothing more. When we turn to the computer for advice, we are only talking to ourselves …

Mathematical analysis works great to decide which horse wins, if we are completely confident which horses are in the race, but it breaks down when we are not sure. In experimental settings, the set of alternative models can often be well agreed on, but with nonexperimental economics data, the set of models is subject to enormous disagreements. You disagree with your model made yesterday, and I disagree with your model today. Mathematics does not help much resolve our internal intellectual disagreements.

Ed Leamer

Indeed. As social researchers, we should never equate science with mathematics and statistical calculation. All science entails human judgement, and using mathematical and statistical models doesn’t relieve us of that necessity. They are no substitutes for thinking and doing real science. Or as a great German philosopher once famously wrote:  Read more…

Econometric inconsistencies

June 22, 2018 6 comments

from Lars Syll

In plain terms, it is evident that if what is really the same factor is appearing in several places under various disguises, a free choice of regression coefficients can lead to strange results. It becomes like those puzzles for children where you write down your age, multiply, add this and that, subtract something else, and eventually end up with the number of the Beast in Revelation.

Prof. Tinbergen explains that, generally speaking, he assumes that the correlations under investigation are linear … One would have liked to be told emphatically what is involved in the assumption of linearity. It means that the quantitative effect of any causal factor on the phenomenon under investigation is directly proportional to the factor’s own magnitude … But it is a very drastic and usually improbable postulate to suppose that all economic forces are of this character, producing independent changes in the phenomenon under investigation which are directly proportional to the changes in themselves; indeed, it is ridiculous. Yet this is what Prof. Tinbergen is throughout assuming …

J M Keynes

Keynes’ comprehensive critique of econometrics and the assumptions it is built around — completeness, measurability, independence, homogeneity, and linearity — is still valid today.  Read more…
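Keynes's 'same factor appearing in several places under various disguises' is what we now call multicollinearity, and his 'strange results' are easy to reproduce with invented data: give a regression two near-copies of one variable and the individual coefficients become essentially arbitrary, while only their sum is pinned down.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200

# Invented illustration: z2 is 'the same factor in a different disguise',
# z1 plus a sliver of measurement noise. A free choice of regression
# coefficients can then split one effect into two arbitrary ones.
z1 = rng.normal(size=n)
z2 = z1 + 0.001 * rng.normal(size=n)     # nearly identical to z1
y = z1 + rng.normal(scale=0.5, size=n)   # the true effect works through z1

X = np.column_stack([np.ones(n), z1, z2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta[1], beta[2])   # individually erratic: any split summing to ~1
print(beta[1] + beta[2])  # ~1.0: only the sum is identified
print(np.linalg.cond(X))  # huge condition number: near-singular design
```

The data determine the combined effect of the disguised factor quite precisely, but how the regression divides it between the two disguises is as arbitrary as the children's number puzzle Keynes describes.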

The microfoundations crusade

June 21, 2018 6 comments

from Lars Syll

I think the two most important microfoundation-led innovations in macro have been intertemporal consumption and rational expectations. I have already talked about the former in an earlier post … [s]o let me focus on rational expectations … [T]he adoption of rational expectations was not the result of some previous empirical failure. Instead it represented, as Lucas said, a consistency axiom …

I think macroeconomics today is much better than it was 40 years ago as a result of the microfoundations approach. I also argued in my previous post that a microfoundations purist position – that this is the only valid way to do macro – is a mistake. The interesting questions are in between. Can the microfoundations approach embrace all kinds of heterogeneity, or will such models lose their attractiveness in their complexity? Does sticking with simple, representative agent macro impart some kind of bias? Does a microfoundations approach discourage investigation of the more ‘difficult’ but more important issues? Might both these questions suggest a link between too simple a micro-based view and a failure to understand what was going on before the financial crash? Are alternatives to microfoundations modelling methodologically coherent? Is empirical evidence ever going to be strong and clear enough to trump internal consistency? These are difficult and often quite subtle questions that any simplistic for-and-against microfoundations debate will just obscure.

Simon Wren-Lewis

On this argumentation I would like to add the following comments:  Read more…