
On the importance of validating assumptions in statistics and econometrics

August 17, 2014 Leave a comment

from Lars Syll

In Andrew Gelman’s and Jennifer Hill’s Data Analysis Using Regression and Multilevel/Hierarchical Models, the authors list the assumptions of the linear regression model. On top of the list are validity and additivity/linearity, followed by different assumptions pertaining to error characteristics.

Yours truly can’t but concur, especially on the “decreasing order of importance” of the assumptions. But then, of course, one really has to wonder why econometrics textbooks — almost invariably — turn this order of importance upside-down and don’t have more thorough discussions on the overriding importance of Gelman/Hill’s two first points …

Since econometrics doesn’t content itself with only making “optimal predictions,” but also aspires to explain things in terms of causes and effects, econometricians need loads of assumptions — and most important of these are validity and additivity.
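To make the point concrete: the linearity assumption is testable, and when it fails, the residuals from a straight-line fit show systematic structure. Here is a minimal sketch in plain Python — simulated data of my own invention, with ordinary least squares computed by hand:

```python
import random

random.seed(0)

# Data that violate linearity: the true relation is quadratic.
xs = [i / 10 for i in range(100)]
ys = [x ** 2 + random.gauss(0, 0.5) for x in xs]

# Ordinary least squares for a straight line y = alpha + beta * x.
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
beta = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
alpha = mean_y - beta * mean_x

residuals = [y - (alpha + beta * x) for x, y in zip(xs, ys)]

# Crude diagnostic: under a correct linear specification, residuals
# should hover around zero everywhere; here both ends of the sample
# are systematically positive, betraying the misspecified form.
low_end = sum(residuals[:33]) / 33
high_end = sum(residuals[-33:]) / 33
print(round(low_end, 2), round(high_end, 2))
```

If the true relation really were linear, both averages would sit near zero; the U-shaped pattern here is exactly the kind of failure that Gelman and Hill’s first two assumptions are meant to catch before any talk of error distributions begins.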

Let me take the opportunity to cite one of my favourite introductory statistics textbooks on one further reason these assumptions are made — and why they ought to be much more argued for on both epistemological and ontological grounds when used (emphasis added):  Read more…

General equilibrium theory — a gross misallocation of intellectual resources and time

August 11, 2014 3 comments

from Lars Syll

Almost a century and a half after Léon Walras founded neoclassical general equilibrium theory, economists still have not been able to show that markets move economies to equilibria.

We do know that — under very restrictive assumptions — equilibria do exist, are unique and are Pareto-efficient. After reading Franklin M. Fisher‘s masterly paper The stability of general equilibrium: results and problems one however has to ask oneself — what good does that do? Read more…

Categories: New vs. Old Paradigm

Bayesian probability theory banned by English court

August 8, 2014 6 comments

from Lars Syll

In a recent judgement the English Court of Appeal has denied that probability can be used as an expression of uncertainty for events that have either happened or not.

The case was a civil dispute about the cause of a fire, and concerned an appeal against a decision in the High Court by Judge Edwards-Stuart. Edwards-Stuart had essentially concluded that the fire had been started by a discarded cigarette, even though this seemed an unlikely event in itself, because the other two explanations were even more implausible. The Court of Appeal rejected this approach although still supported the overall judgement and disallowed the appeal …

But it’s the quotations from the judgement that are so interesting:

Sometimes the ‘balance of probability’ standard is expressed mathematically as ’50 + % probability’, but this can carry with it a danger of pseudo-mathematics, as the argument in this case demonstrated. When judging whether a case for believing that an event was caused in a particular way is stronger than the case for not so believing, the process is not scientific (although it may obviously include evaluation of scientific evidence) and to express the probability of some event having happened in percentage terms is illusory.

The idea that you can assign probabilities to events that have already occurred, but where we are ignorant of the result, forms the basis for the Bayesian view of probability. Put very broadly, the ‘classical’ view of probability is in terms of genuine unpredictability about future events, popularly known as ‘chance’ or ‘aleatory uncertainty’. The Bayesian interpretation allows probability also to be used to express our uncertainty due to our ignorance, known as ‘epistemic uncertainty’ …

The judges went on to say:

The chances of something happening in the future may be expressed in terms of percentage. Epidemiological evidence may enable doctors to say that on average smokers increase their risk of lung cancer by X%. But you cannot properly say that there is a 25 per cent chance that something has happened … Either it has or it has not

Anyway, I teach the Bayesian approach to post-graduate students attending my ‘Applied Bayesian Statistics’ course at Cambridge, and so I must now tell them that the entire philosophy behind their course has been declared illegal in the Court of Appeal. I hope they don’t mind.

David Spiegelhalter
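For what it’s worth, the Bayesian reasoning the court rejected is just arithmetic. With made-up numbers for the three candidate causes in the case (the priors and likelihoods below are purely illustrative, not taken from the judgement), Bayes’ theorem shows how a cause that is ‘unlikely in itself’ can still be by far the most probable one:

```python
# Purely illustrative numbers -- not taken from the actual case.
priors = {"cigarette": 0.10, "arcing": 0.05, "arson": 0.02}
# Probability of the observed evidence under each hypothesis.
likelihoods = {"cigarette": 0.6, "arcing": 0.3, "arson": 0.1}

# Bayes' theorem: posterior proportional to prior times likelihood.
joint = {h: priors[h] * likelihoods[h] for h in priors}
total = sum(joint.values())
posterior = {h: joint[h] / total for h in joint}

best = max(posterior, key=posterior.get)
print(best, round(posterior[best], 2))
```

The discarded cigarette is improbable in absolute terms, yet carries most of the posterior probability because the alternatives are even more implausible. That is Edwards-Stuart’s reasoning, recast as epistemic probability.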

Read more…

Categories: New vs. Old Paradigm

Foot-in-mouth disease — Ayn Rand and Alan Greenspan

August 6, 2014 4 comments

from Lars Syll

Now, I don’t care to discuss the alleged complaints American Indians have against this country. I believe, with good reason, the most unsympathetic Hollywood portrayal of Indians and what they did to the white man. They had no right to a country merely because they were born here and then acted like savages. The white man did not conquer this country …

Since the Indians did not have the concept of property or property rights—they didn’t have a settled society, they had predominantly nomadic tribal “cultures”—they didn’t have rights to the land, and there was no reason for anyone to grant them rights that they had not conceived of and were not using …

What were they fighting for, in opposing the white man on this continent? For their wish to continue a primitive existence; for their “right” to keep part of the earth untouched—to keep everybody out so they could live like animals or cavemen. Any European who brought with him an element of civilization had the right to take over this continent, and it’s great that some of them did. The racist Indians today—those who condemn America—do not respect individual rights.

Ayn Rand,  Address To The Graduating Class Of The United States Military Academy at West Point, 1974

Read more…

Why doesn’t Krugman listen to Krugman?

August 1, 2014 7 comments

from Lars Syll

Paul Krugman wonders why no one listens to academic economists …

One answer is that economists don’t listen to themselves. More precisely, liberal economists like Krugman, who want the state to take a more active role in managing the economy, continue to teach an economic theory that has no place for activist policy.

Let me give a concrete example.

One of Krugman’s bugaboos is the persistence of claims that expansionary monetary policy must lead to higher inflation. Even after 5-plus years of ultra-loose policy with no rising inflation in sight, we keep hearing that since so “much money has been created…, there should already be considerable inflation” … As an empirical matter, of course, Krugman is right. But where could someone have gotten this idea that an increase in the money supply must always lead to higher inflation? Perhaps from an undergraduate economics class? Very possibly — if that class used Krugman’s textbook.
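The textbook intuition comes from the quantity equation MV = PY, and it only delivers inflation if velocity V and real output Y stand still. A two-line illustration with made-up numbers:

```python
# Quantity equation: M * V = P * Y. Illustrative numbers only.
Y = 100.0            # real output, held fixed
M1, V1 = 50.0, 2.0   # before the monetary expansion
P1 = M1 * V1 / Y     # implied price level

M2, V2 = 100.0, 1.0  # money doubles, but velocity halves
P2 = M2 * V2 / Y     # price level is unchanged
print(P1, P2)
```

Roughly this is what happened after 2008: the expansion of base money was matched by a collapse in velocity as banks sat on reserves, which is why the mechanical money-to-inflation prediction failed.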

Here’s what Krugman’s International Economics says about money and inflation: Read more…

New Keynesianism as a Club

July 28, 2014 4 comments

from Thomas Palley

Club, noun. 1. An association or organization dedicated to a particular interest or activity. 2. A heavy stick with a thick end, especially one used as a weapon.

Paul Krugman’s economic analysis is always stimulating and insightful, but there is one issue on which I think he persistently falls short. That issue is his account of New Keynesianism’s theoretical originality and intellectual impact. This is illustrated in his recent reply to a note of mine on the theory of the Phillips curve in which he writes: “I do believe that Palley is on the right track here, because it’s pretty much the same track a number of us have been following for the past few years.”

While I very much welcome his approval, his comment also strikes me as a little misleading. The model of nominal wage rigidity and the Phillips curve that I described comes from my 1990 dissertation, was published in March 1994, and has been followed by substantial further published research. That research also introduces ideas which are not part of the New Keynesian model and are needed to explain the Phillips curve in a higher inflation environment.

Similar precedence issues hold for scholarship on debt-driven business cycles, financial instability, the problem of debt-deflation in recessions and depressions, and the endogenous credit-driven nature of the money supply. These are all topics my colleagues and I, working in the Post- and old Keynesian traditions, have been writing about for years – No, decades! Read more…

Categories: economics profession

Is a more inclusive and sustainable development possible in Brazil? – a WEA online conference

July 25, 2014 Leave a comment

This conference on sustainable development is now open and you are invited to leave comments on the papers on the conference site.
click here to leave comments

For an introduction to the background to the conference and the themes of the call for papers click here ›

THE PAPERS  Read more…

Read my lips — statistical significance is NOT a substitute for doing real science!

July 25, 2014 8 comments

from Lars Syll

Noah Smith has a post up today telling us that his Bayesian Superman wasn’t intended to be a knock on Bayesianism and that he thinks Frequentism is a bit underrated these days:

Frequentist hypothesis testing has come under sustained and vigorous attack in recent years … But there are a couple of good things about Frequentist hypothesis testing that I haven’t seen many people discuss. Both of these have to do not with the formal method itself, but with social conventions associated with the practice …

Why do I like these social conventions? Two reasons. First, I think they cut down a lot on scientific noise. “Statistical significance” is sort of a first-pass filter that tells you which results are interesting and which ones aren’t. Without that automated filter, the entire job of distinguishing interesting results from uninteresting ones falls to the reviewers of a paper, who have to read through the paper much more carefully than if they can just scan for those little asterisks of “significance”.

Hmm …

A non-trivial part of teaching statistics is made up of teaching students to perform significance testing. A problem I have noticed repeatedly over the years, however, is that no matter how careful you try to be in explicating what the probabilities generated by these statistical tests – p-values – really are, still most students misinterpret them. And a lot of researchers obviously also fall prey to the same mistakes: Read more…
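One way to see what the asterisks do and don’t tell you is to simulate experiments in which the null hypothesis is true by construction: roughly 5% of them will still come out “significant”. A rough sketch — a toy two-sample z-test of my own devising, normality assumed:

```python
import math
import random

random.seed(1)

def experiment(n=50):
    # Two groups drawn from the SAME distribution: the null is true.
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    ma, mb = sum(a) / n, sum(b) / n
    va = sum((x - ma) ** 2 for x in a) / (n - 1)
    vb = sum((x - mb) ** 2 for x in b) / (n - 1)
    z = (ma - mb) / math.sqrt(va / n + vb / n)
    return abs(z) > 1.96  # "significant at the 5% level"

hits = sum(experiment() for _ in range(2000))
rate = hits / 2000
print(rate)  # close to 0.05: asterisks fire even when nothing is there
```

So as a “first-pass filter” significance guarantees a steady stream of false alarms, and it says nothing at all about whether a detected effect is large, real, or worth caring about.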

Bayesianism — preposterous mumbo jumbo

July 23, 2014 6 comments

from Lars Syll

Neoclassical economics nowadays usually assumes that agents that have to make choices under conditions of uncertainty behave according to Bayesian rules (preferably the ones axiomatized by Ramsey (1931), de Finetti (1937) or Savage (1954)) – that is, they maximize expected utility with respect to some subjective probability measure that is continually updated according to Bayes theorem. If not, they are supposed to be irrational, and ultimately – via some “Dutch book” or “money pump” argument – susceptible to being ruined by some clever “bookie”.
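The “Dutch book” argument is simple enough to spell out. An agent whose betting prices on an event and its complement sum to more than one can be sold a pair of bets that loses money no matter what happens (all numbers below are hypothetical):

```python
# A minimal Dutch book sketch: incoherent degrees of belief can be
# money-pumped. Numbers are hypothetical.
p_e = 0.7        # agent's price for a $1 bet that event E happens
p_not_e = 0.5    # agent's price for a $1 bet that E does not happen
assert p_e + p_not_e > 1  # beliefs are incoherent

stake = 1.0
# The bookie sells the agent both bets at the agent's own prices.
cost_to_agent = stake * (p_e + p_not_e)
# Exactly one of the two bets pays off, returning the $1 stake.
payout = stake
loss = cost_to_agent - payout
print(round(loss, 2))  # guaranteed loss whatever happens
```

Note what the argument does and does not establish: it shows that coherent beliefs must obey the probability axioms, not that agents actually possess, or could possess, the well-defined subjective probabilities the theory requires.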

Bayesianism reduces questions of rationality to questions of internal consistency (coherence) of beliefs, but – even granted this questionable reductionism – do rational agents really have to be Bayesian? As I have been arguing elsewhere (e. g. here and here) there is no strong warrant for believing so.

In many of the situations that are relevant to economics one could argue that there is simply not enough adequate and relevant information to ground beliefs of a probabilistic kind, and that in those situations it is not really possible, in any relevant way, to represent an individual’s beliefs in a single probability measure.  Read more…

Cheap talk at the Fed

July 22, 2014 2 comments

from Dean Baker

Federal Reserve Board Chair Janet Yellen made waves in her Congressional testimony last week when she argued that social media and biotech stocks were over-valued. She also said that the price of junk bonds was out of line with historic experience. By making these assertions in a highly visible public forum, Yellen was using the power of the Fed’s megaphone to stem the growth of incipient bubbles. This is an approach that some of us have advocated for close to twenty years.

Before examining the merits of this approach, it is worth noting the remarkable transformation in the Fed’s view on its role in containing bubbles. Just a decade ago, then Fed Chair Alan Greenspan told an adoring audience at the American Economic Association that the best thing the Fed could do with bubbles was to let them run their course and then pick up the pieces after they burst. He argued that the Fed’s approach to the stock bubble vindicated this route. Apparently it did not bother him, or most of the people in the audience, that the economy was at the time experiencing its longest period without net job growth since the Great Depression.

The Fed’s view on bubbles has evolved enormously. Most top Fed officials now recognize the need to take steps to prevent financial bubbles from growing to the point that their collapse would jeopardize the health of the economy. However there are two very different routes proposed for containing bubbles. Read more…

The Sonnenschein-Mantel-Debreu results after forty years

July 21, 2014 4 comments

from Lars Syll

Along with the Arrow-Debreu existence theorem and some results on regular economies, SMD theory fills in many of the gaps we might have in our understanding of general equilibrium theory …

It is also a deeply negative result. SMD theory means that assumptions guaranteeing good behavior at the microeconomic level do not carry over to the aggregate level or to qualitative features of the equilibrium. It has been difficult to make progress on the elaborations of general equilibrium theory that were put forth in Arrow and Hahn 1971 …

Given how sweeping the changes wrought by SMD theory seem to be, it is understandable that some very broad statements about the character of general equilibrium theory were made. Fifteen years after General Competitive Analysis, Arrow (1986) stated that the hypothesis of rationality had few implications at the aggregate level. Kirman (1989) held that general equilibrium theory could not generate falsifiable propositions, given that almost any set of data seemed consistent with the theory. These views are widely shared. Bliss (1993, 227) wrote that the “near emptiness of general equilibrium theory is a theorem of the theory.” Mas-Colell, Michael Whinston, and Jerry Green (1995) titled a section of their graduate microeconomics textbook “Anything Goes: The Sonnenschein-Mantel-Debreu Theorem.” There was a realization of a similar gap in the foundations of empirical economics. General equilibrium theory “poses some arduous challenges” as a “paradigm for organizing and synthesizing economic data” so that “a widely accepted empirical counterpart to general equilibrium theory remains to be developed” (Hansen and Heckman 1996). This seems to be the now-accepted view thirty years after the advent of SMD theory …

S. Abu Turab Rizvi

And so what? Why should we care about Sonnenschein-Mantel-Debreu?  Read more…

Categories: New vs. Old Paradigm

The Phillips Curve: Missing the obvious and looking in all the wrong places

July 18, 2014 Leave a comment

from Thomas Palley

There is an old story about a policeman who sees a drunk looking for something under a streetlight and asks what he is looking for. The drunk replies he has lost his car keys and the policeman joins in the search. A few minutes later the policeman asks if he is sure he lost them here and the drunk replies “No, I lost them in the park.” The policeman then asks “So why are you looking here?” to which the drunk replies “Because this is where the light is.” That story has much relevance for the economics profession’s approach to the Phillips curve.

Recently, there has been another flare-up in discussion of the Phillips curve involving Paul Krugman [Here], Chris House [Here], and Simon Wren-Lewis [Here]. It is précised by Mark Thoma [Here].

The question triggering the discussion is whether Phillips curve (PC) theory can account for inflation and the non-emergence of sustained deflation in the Great Recession. Four approaches are considered: (1) the original PC without inflation expectations; (2) the adaptive inflation expectations augmented PC; (3) the rational inflation expectations new classical vertical PC; and (4) the new Keynesian “sluggish price adjustment” PC that embeds a mix of lagged inflation and forward looking rational inflation expectations. The conclusion seems to be the original PC does best with regard to recent inflation experience but, of course, it fails with regard to past experience.

There is another obvious explanation that has been overlooked by mainstream economists for nearly forty years because they have preferred to keep looking under the “lamppost” of their conventional constructions. That alternative explanation rests on a combination of downward nominal wage rigidity plus incomplete incorporation of inflation expectations in a multi-sector economy. Read more…
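The mechanics of downward nominal wage rigidity are easy to sketch. In the toy multi-sector economy below (my own illustration, not Palley’s actual model), wages respond to sectoral demand shocks upward but not downward, so average wage inflation stays positive even though the shocks average zero:

```python
# Toy illustration of downward nominal wage rigidity in a
# multi-sector economy. Not Palley's actual model.
import random

random.seed(2)
sectors = 10
periods = 200

total_inflation = 0.0
for _ in range(periods):
    # Zero-mean sectoral demand shocks.
    shocks = [random.gauss(0, 0.02) for _ in range(sectors)]
    # Wages rise with positive shocks but cannot fall.
    wage_changes = [max(s, 0.0) for s in shocks]
    total_inflation += sum(wage_changes) / sectors

avg = total_inflation / periods
print(round(avg, 4))  # positive average wage inflation despite zero-mean shocks
```

Relative-price adjustment across sectors then requires a positive aggregate inflation rate, which is one reason sustained deflation failed to emerge in the Great Recession even with substantial slack.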

Categories: economics profession

Krugman and Mankiw on the loanable funds theory — so wrong, so wrong

July 17, 2014 6 comments

from Lars Syll

Last year Dirk Ehnts had an interesting post up where he took Paul Krugman to task for still being married to the loanable funds theory.

Unfortunately this is not an exception among “New Keynesian” economists.

Neglecting anything resembling a real-world finance system, Greg Mankiw — in the 8th edition of his intermediate textbook Macroeconomics — has appended a new chapter to the other nineteen chapters where finance more or less is equated to the neoclassical thought-construction of a “market for loanable funds.”

On the subject of financial crises he admits that

perhaps we should view speculative excess and its ramifications as an inherent feature of market economies … but preventing them entirely may be too much to ask given our current knowledge.

This is of course self-evident for all of us who understand that both ontologically and epistemologically founded uncertainty makes any such hopes totally unfounded. But it’s rather odd to read this in a book that bases its models on assumptions of rational expectations, representative actors and dynamic stochastic general equilibrium – assumptions that convey the view that markets – give or take a few rigidities and menu costs – are efficient! For being one of many neoclassical economists so proud of their (unreal, yes, but) consistent models, Mankiw here certainly is flagrantly inconsistent! Read more…

Categories: economics profession

‘Rational expectations’ — nonsense on stilts

July 15, 2014 1 comment

from Lars Syll

Assumptions in scientific theories/models are often based on (mathematical) tractability (and so necessarily simplifying) and used for more or less self-evidently necessary theoretical consistency reasons. But one should also remember that assumptions are selected for a specific purpose, and so the arguments (in economics shamelessly often totally non-existent) put forward for having selected a specific set of assumptions, have to be judged against that background to check if they are warranted.

This, however, only shrinks the assumptions set minimally – it is still necessary to decide on which assumptions are innocuous and which are harmful, and what constitutes interesting/important assumptions from an ontological & epistemological point of view (explanation, understanding, prediction). Especially so if you intend to refer your theories/models to a specific target system — preferably the real world. To do this one should start by applying a Real World Filter in the form of a Smell Test: Is the theory/model reasonable given what we know about the real world? If not, why should we care about it? If not – we shouldn’t apply it (remember time is limited and economics is a science on scarcity & optimization …)

I came to think of the importance of applying the Smell Test when re-reading Mark Thoma’s article — in The Fiscal Times — on “Do people have rational expectations?”: Read more…

Polanyi’s methodology in the Great Transformation

July 14, 2014 7 comments

from Asad Zaman

Polanyi’s book is widely recognized as among the most deeply original and seminal analyses of the origins and effects of capitalism. In a previous post,  I provided a brief summary of the main arguments of Polanyi.  Polanyi does not explicitly discuss methodology, but his analysis is based on a methodology radically different from any currently in use in social sciences. This methodology could provide the basis for an entirely new approach to the subject matter. In my paper entitled The Methodology of Polanyi’s Great Transformation, I have articulated central elements of this methodology by showing how Polanyi uses them in his book. I provide a brief summary of the main ideas of the paper here.

First, note that Polanyi operates at a meta-theoretical level. The work analyzes the emergence of theories as attempts to understand historical experience. This immediately leads to a historically context-sensitive analysis, as opposed to the a-historical methods currently dominant in economics. In what is an extremely interesting twist, Polanyi argues that theories formulated by contemporaries to understand their experience are often wrong. Nonetheless, these theories are used to understand and shape responses to historical circumstances. This mechanism provides substantial room for human agency in influencing history. The key elements of Polanyi’s methodology, extracted from how he has utilized them in his book, are listed as follows: Read more…

Categories: methodology

Re-thinking the Definition of “Public Goods”

July 9, 2014 9 comments

from June Sekera

Introduction

A year ago last May, the Real World Economics Review blog published my post, “Why Aren’t We Talking About Public Goods?” In that article I argued that we need to revive and reframe the concept of public goods. A concept of public goods is immensely important because: 

  • The absence of a widely-held, constructive idea of public goods in public discourse denies citizens the ability to have an informed conversation, or to make informed decisions, about things that matter mightily to the quality of their lives and their communities.
  • Its absence robs public policy makers, leaders and managers of the concept that is most central to the reason for their being. 
  • The current economics definition of public goods feeds and supports the marketization and privatization of government, and the consequent undermining of governments’ ability to operate.         

Since last May I have met with economists and other social scientists across the US and in the UK and have been in discussion with people responding to my post from several other countries. I have also been conducting further research.

In this post I summarize the results of my discussions and findings to date and offer for consideration some criteria for a possible “instrumental” definition of public goods.  Ultimately, an instrumental definition of public goods must be accompanied by a concordant theory of non-market production in the public economy. Both are needed to ground an improved theory and practice of governance. 

1. The Existing Definition and Its Inadequacies Read more…

Categories: Uncategorized

What to do to make economics a relevant and realist science

July 8, 2014 38 comments

from Lars Syll

The other day yours truly wrote re Krugman‘s dangerous neglect of methodological reflection:

The financial crisis of 2007-08 and its aftermath definitely shows that something has gone terribly wrong with our macroeconomic models, since they obviously did not foresee the collapse or even make it conceivable … Modern mainstream macroeconomics obviously did not anticipate the enormity of the problems that unregulated ‘efficient’ financial markets created. Why? Because it builds on the myth of us knowing the ‘data-generating process’ … Mainstream macroeconomists … want to be able to use their hammer. They decide to pretend that the world looks like a nail and that uncertainty can be reduced to risk. So they construct their mathematical models on that assumption–and the ensuing results are financial crises and economic havoc.

Now Brad DeLong earlier today commented on my critique:

OK …

Suppose we decide that we are no longer going to:

Pretend that agents — or economists — know the data-generating process…

Recognize that people are not terribly committed to Bayesianism — that they do not model probabilities as if they have well-defined priors and all there is is risk…

What do we then do — what kind of economic arguments do we make — once we have made those decisions?

“What do we then do?” The despair heard in the question reminds me of the first time I met Phil Mirowski. It was twenty years ago, and he had been invited to give a speech on themes from his book More Heat than Light at my economics department in Lund, Sweden. All the neoclassical professors were there. Their theories were totally mangled and no one — absolutely no one — had anything to say even remotely reminiscent of a defense. Being at a nonplus, one of them, in total desperation, finally asked “But what shall we do then?” Read more…

Categories: New vs. Old Paradigm

Inspecting the unlikely success of “Capital in the 21st century”

July 7, 2014 5 comments

from John Weeks

Against all expectations an economics book became a best seller this year. I illustrate this unlikely occurrence with a true story.  One day in London I hailed a taxi near the Houses of Parliament (the workers of the underground system were on strike).  I mentioned to the driver that I taught economics at the University of London before retiring several years ago.  The driver asked me, have you read this book by a Frenchman named Piketty?

A London taxi driver discussing an economics book 578 pages long (text only) with countless graphics and even a bit of algebra qualifies the book as a “phenomenon” by the dictionary definition, “a fact or situation that is observed to exist or happen, especially one whose cause or explanation is in question”.  Very much in question the cause is.  I am in the process of writing a review of these 578 pages (plus the occasional excursion into a footnote).  At this point I limit myself to speculating over why it has swept all before it, especially since it is certain to be a book that many people buy and almost no one reads.

We find many reviews of Capital in the 21st Century (which I shorten to C21C), most from progressives, soft to hard left.  The inequality deniers have yet to launch a frontal assault, though a recent blog entry for the Financial Times by Chris Giles is a shot from that direction (see Piketty’s reply).  Prominent UK journalist Paul Mason succinctly dismisses the attempted hatchet job (here). Read more…

Keynes on the use of mathematics in economics

July 7, 2014 9 comments

from Lars Syll

But I am unfamiliar with the methods involved and it may be that my impression that nothing emerges at the end which has not been introduced expressly or tacitly at the beginning is quite wrong … It seems to me essential in an article of this sort to put in the fullest and most explicit manner at the beginning the assumptions which are made and the methods by which the price indexes are derived; and then to state at the end what substantially novel conclusions have been arrived at …

I cannot persuade myself that this sort of treatment of economic theory has anything significant to contribute. I suspect it of being nothing better than a contraption proceeding from premises which are not stated with precision to conclusions which have no clear application … [This creates] a mass of symbolism which covers up all kinds of unstated special assumptions.

Letter from Keynes to Frisch 28 November 1935

Investors — people knowing almost nothing whatever about what they are doing

July 5, 2014 3 comments

from Lars Syll

How far the motives which I have been attributing to the market are strictly rational, I leave it to others to judge. They are best regarded, I think, as an example of how sensitive – over-sensitive if you like – to the near future, about which we may think that we know a little, even the best-informed must be, because, in truth, we know almost nothing about the more remote future …

The ignorance of even the best-informed investor about the more remote future is much greater than his knowledge … But if this is true of the best-informed, the vast majority of those who are concerned with the buying and selling of securities know almost nothing whatever about what they are doing … This is one of the odd characteristics of the Capitalist System under which we live …

It may often profit the wisest to anticipate mob psychology rather than the real trend of events, and to ape unreason proleptically … (The object of speculators) is to re-sell to the mob after a few weeks or at most a few months. It is natural, therefore, that they should be influenced by the cost of borrowing, and still more by their expectations on the basis of past experience of the trend of mob psychology. Thus, so long as the crowd can be relied on to act in a certain way, even if it be misguided, it will be to the advantage of the better informed professional to act in the same way – a short period ahead.

Categories: financial markets, Keynes