New Keynesianism as a Club

July 28, 2014 3 comments

from Thomas Palley

Club, noun. 1. An association or organization dedicated to a particular interest or activity. 2. A heavy stick with a thick end, especially one used as a weapon.

Paul Krugman’s economic analysis is always stimulating and insightful, but there is one issue on which I think he persistently falls short. That issue is his account of New Keynesianism’s theoretical originality and intellectual impact. This is illustrated in his recent reply to a note of mine on the theory of the Phillips curve in which he writes: “I do believe that Palley is on the right track here, because it’s pretty much the same track a number of us have been following for the past few years.”

While I very much welcome his approval, his comment also strikes me as a little misleading. The model of nominal wage rigidity and the Phillips curve that I described comes from my 1990 dissertation, was published in March 1994, and has been followed by substantial further published research. That research also introduces ideas which are not part of the New Keynesian model and are needed to explain the Phillips curve in a higher inflation environment.

Similar precedence issues hold for scholarship on debt-driven business cycles, financial instability, the problem of debt-deflation in recessions and depressions, and the endogenous credit-driven nature of the money supply. These are all topics my colleagues and I, working in the Post- and old Keynesian traditions, have been writing about for years – No, decades! Read more…

Categories: economics profession

Is a more inclusive and sustainable development possible in Brazil? – a WEA online conference

July 25, 2014 Leave a comment

This conference on sustainable development is now open and you are invited to leave comments on the papers on the conference site.

For an introduction to the background to the conference and the themes of the call for papers, click here.

THE PAPERS  Read more…

Read my lips — statistical significance is NOT a substitute for doing real science!

July 25, 2014 5 comments

from Lars Syll

Noah Smith has a post up today telling us that his Bayesian Superman wasn’t intended to be a knock on Bayesianism and that he thinks Frequentism is a bit underrated these days:

Frequentist hypothesis testing has come under sustained and vigorous attack in recent years … But there are a couple of good things about Frequentist hypothesis testing that I haven’t seen many people discuss. Both of these have to do not with the formal method itself, but with social conventions associated with the practice …

Why do I like these social conventions? Two reasons. First, I think they cut down a lot on scientific noise. “Statistical significance” is sort of a first-pass filter that tells you which results are interesting and which ones aren’t. Without that automated filter, the entire job of distinguishing interesting results from uninteresting ones falls to the reviewers of a paper, who have to read through the paper much more carefully than if they can just scan for those little asterisks of “significance”.

Hmm …

A non-trivial part of teaching statistics is made up of teaching students to perform significance testing. A problem I have noticed repeatedly over the years, however, is that no matter how careful you try to be in explicating what the probabilities generated by these statistical tests – p-values – really are, most students still misinterpret them. And a lot of researchers obviously also fall prey to the same mistakes: Read more…
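Syll’s point about what p-values “really are” can be made concrete in a few lines of simulation. The coin example below is a hypothetical illustration of mine (the numbers and function name are not from the post): a p-value is the long-run frequency, computed under the null hypothesis, of data at least as extreme as what was observed. It is not the probability that the null is true.

```python
import random

# Hypothetical illustration (not from Syll's post) of what a p-value is:
# the probability, ASSUMING the null hypothesis is true, of observing data
# at least as extreme as the data actually seen. It is NOT the probability
# that the null hypothesis itself is true.
# Setup: a coin flipped n=100 times comes up heads 60 times. Fair coin?

random.seed(1)

def simulated_p_value(observed_heads, n=100, trials=20_000):
    """Two-sided p-value for the fair-coin null, by direct simulation."""
    observed_dev = abs(observed_heads - n / 2)
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(n))
        if abs(heads - n / 2) >= observed_dev:
            extreme += 1
    return extreme / trials

p = simulated_p_value(60)
print(f"p = {p:.3f}")
```

Note the direction of the conditioning: the simulation fixes the null and asks how unusual the data are. That is exactly the step that gets inverted in the common misreading Syll describes.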

Bayesianism — preposterous mumbo jumbo

July 23, 2014 6 comments

from Lars Syll

Neoclassical economics nowadays usually assumes that agents who have to make choices under conditions of uncertainty behave according to Bayesian rules (preferably the ones axiomatized by Ramsey (1931), de Finetti (1937) or Savage (1954)) – that is, they maximize expected utility with respect to some subjective probability measure that is continually updated according to Bayes’ theorem. If not, they are supposed to be irrational, and ultimately – via some “Dutch book” or “money pump” argument – susceptible to being ruined by some clever “bookie”.

Bayesianism reduces questions of rationality to questions of internal consistency (coherence) of beliefs, but – even granted this questionable reductionism – do rational agents really have to be Bayesian? As I have been arguing elsewhere (e.g. here and here) there is no strong warrant for believing so.

In many of the situations that are relevant to economics one could argue that there is simply not enough adequate and relevant information to ground beliefs of a probabilistic kind, and that in those situations it is not really possible, in any relevant way, to represent an individual’s beliefs in a single probability measure.  Read more…
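For readers unfamiliar with the updating rule being criticized, here is a minimal sketch of what “continually updated according to Bayes’ theorem” means in practice. The two-hypothesis coin setup and all numbers below are hypothetical illustrations of mine, not anything from the post:

```python
# Minimal sketch of Bayesian updating. The rival hypotheses, the prior,
# and the data are hypothetical: a coin is either fair (P(heads)=0.5) or
# biased (P(heads)=0.8), and the agent starts agnostic between the two.

def bayes_update(prior, likelihoods):
    """One Bayesian update: posterior(h) is proportional to prior(h) * P(obs | h)."""
    unnormalized = {h: prior[h] * likelihoods[h] for h in prior}
    total = sum(unnormalized.values())
    return {h: v / total for h, v in unnormalized.items()}

belief = {"fair": 0.5, "biased": 0.5}           # the subjective prior
heads_likelihood = {"fair": 0.5, "biased": 0.8}

for _ in range(5):                              # observe five heads in a row
    belief = bayes_update(belief, heads_likelihood)

print(belief)  # belief in "biased" has grown to about 0.91
```

Syll’s objection bites at the very first line: the procedure presupposes a well-defined prior over a well-defined hypothesis space, which is precisely what he argues agents in genuinely uncertain situations often cannot have.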

Cheap talk at the Fed

July 22, 2014 2 comments

from Dean Baker

Federal Reserve Board Chair Janet Yellen made waves in her Congressional testimony last week when she argued that social media and biotech stocks were over-valued. She also said that the price of junk bonds was out of line with historic experience. By making these assertions in a highly visible public forum, Yellen was using the power of the Fed’s megaphone to stem the growth of incipient bubbles. This is an approach that some of us have advocated for close to twenty years.

Before examining the merits of this approach, it is worth noting the remarkable transformation in the Fed’s view on its role in containing bubbles. Just a decade ago, then Fed Chair Alan Greenspan told an adoring audience at the American Economic Association that the best thing the Fed could do with bubbles was to let them run their course and then pick up the pieces after they burst. He argued that the Fed’s approach to the stock bubble vindicated this route. Apparently it did not bother him, or most of the people in the audience, that the economy was at the time experiencing its longest period without net job growth since the Great Depression.

The Fed’s view on bubbles has evolved enormously. Most top Fed officials now recognize the need to take steps to prevent financial bubbles from growing to the point that their collapse would jeopardize the health of the economy. However, there are two very different routes proposed for containing bubbles. Read more…

The Sonnenschein-Mantel-Debreu results after forty years

July 21, 2014 4 comments

from Lars Syll

Along with the Arrow-Debreu existence theorem and some results on regular economies, SMD theory fills in many of the gaps we might have in our understanding of general equilibrium theory …

It is also a deeply negative result. SMD theory means that assumptions guaranteeing good behavior at the microeconomic level do not carry over to the aggregate level or to qualitative features of the equilibrium. It has been difficult to make progress on the elaborations of general equilibrium theory that were put forth in Arrow and Hahn 1971 …

Given how sweeping the changes wrought by SMD theory seem to be, it is understandable that some very broad statements about the character of general equilibrium theory were made. Fifteen years after General Competitive Analysis, Arrow (1986) stated that the hypothesis of rationality had few implications at the aggregate level. Kirman (1989) held that general equilibrium theory could not generate falsifiable propositions, given that almost any set of data seemed consistent with the theory. These views are widely shared. Bliss (1993, 227) wrote that the “near emptiness of general equilibrium theory is a theorem of the theory.” Mas-Colell, Michael Whinston, and Jerry Green (1995) titled a section of their graduate microeconomics textbook “Anything Goes: The Sonnenschein-Mantel-Debreu Theorem.” There was a realization of a similar gap in the foundations of empirical economics. General equilibrium theory “poses some arduous challenges” as a “paradigm for organizing and synthesizing economic data” so that “a widely accepted empirical counterpart to general equilibrium theory remains to be developed” (Hansen and Heckman 1996). This seems to be the now-accepted view thirty years after the advent of SMD theory …

S. Abu Turab Rizvi

And so what? Why should we care about Sonnenschein-Mantel-Debreu?  Read more…

The Phillips Curve: Missing the obvious and looking in all the wrong places

July 18, 2014 Leave a comment

from Thomas Palley

There is an old story about a policeman who sees a drunk looking for something under a streetlight and asks what he is looking for. The drunk replies he has lost his car keys and the policeman joins in the search. A few minutes later the policeman asks if he is sure he lost them here and the drunk replies “No, I lost them in the park.” The policeman then asks “So why are you looking here?” to which the drunk replies “Because this is where the light is.” That story has much relevance for the economics profession’s approach to the Phillips curve.

Recently, there has been another flare-up in discussion of the Phillips curve involving Paul Krugman [Here], Chris House [Here], and Simon Wren-Lewis [Here]. It is summarized by Mark Thoma [Here].

The question triggering the discussion is whether Phillips curve (PC) theory can account for inflation and the non-emergence of sustained deflation in the Great Recession. Four approaches are considered: (1) the original PC without inflation expectations; (2) the adaptive inflation expectations augmented PC; (3) the rational inflation expectations new classical vertical PC; and (4) the new Keynesian “sluggish price adjustment” PC that embeds a mix of lagged inflation and forward-looking rational inflation expectations. The conclusion seems to be that the original PC does best with regard to recent inflation experience but, of course, it fails with regard to past experience.

There is another obvious explanation that has been overlooked by mainstream economists for nearly forty years because they have preferred to keep looking under the “lamppost” of their conventional constructions. That alternative explanation rests on a combination of downward nominal wage rigidity plus incomplete incorporation of inflation expectations in a multi-sector economy. Read more…
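The mechanism Palley points to can be illustrated with a deliberately crude toy model (this is my sketch, not Palley’s published model; the functional form and all parameter values are arbitrary): wage inflation follows an expectations-augmented Phillips curve, but nominal wage growth cannot fall below zero, so a deep slump drives inflation to zero rather than into ever-deepening deflation.

```python
# Toy sketch (mine, not Palley's model; all parameters arbitrary) of why
# downward nominal wage rigidity can block sustained deflation: wage
# inflation follows an expectations-augmented Phillips curve, but nominal
# wage growth is floored at zero.

def phillips(u, expected_inflation, u_star=0.05, a=0.5, rigid=True):
    """Wage inflation given unemployment u and (adaptive) expected inflation."""
    pi = expected_inflation + a * (u_star - u)
    return max(pi, 0.0) if rigid else pi

# Deep slump: unemployment stuck at 10%, expectations equal to last
# period's inflation, starting from 2% inflation.
pi_rigid = pi_flex = 0.02
for _ in range(20):
    pi_rigid = phillips(0.10, pi_rigid, rigid=True)
    pi_flex = phillips(0.10, pi_flex, rigid=False)

print(pi_rigid, pi_flex)  # rigidity pins inflation at zero; without it the
                          # same slump spirals into ever-deeper deflation
```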

Categories: economics profession

Krugman and Mankiw on the loanable funds theory — so wrong, so wrong

July 17, 2014 6 comments

from Lars Syll

Last year Dirk Ehnts had an interesting post up where he took Paul Krugman to task for still being married to the loanable funds theory.

Unfortunately this is not an exception among “New Keynesian” economists.

Neglecting anything resembling a real-world finance system, Greg Mankiw — in the 8th edition of his intermediate textbook Macroeconomics — has appended a new chapter to the other nineteen chapters where finance more or less is equated to the neoclassical thought-construction of a “market for loanable funds.”

On the subject of financial crises he admits that

perhaps we should view speculative excess and its ramifications as an inherent feature of market economies … but preventing them entirely may be too much to ask given our current knowledge.

This is of course self-evident for all of us who understand that both ontologically and epistemologically founded uncertainty makes any such hopes totally unfounded. But it’s rather odd to read this in a book that bases its models on assumptions of rational expectations, representative actors and dynamic stochastic general equilibrium – assumptions that convey the view that markets – give or take a few rigidities and menu costs – are efficient! For being one of many neoclassical economists so proud of their (unreal, yes, but) consistent models, Mankiw here certainly is flagrantly inconsistent! Read more…

Categories: economics profession

‘Rational expectations’ — nonsense on stilts

July 15, 2014 1 comment

from Lars Syll

Assumptions in scientific theories/models are often based on (mathematical) tractability (and so necessarily simplifying) and used for more or less self-evidently necessary theoretical consistency reasons. But one should also remember that assumptions are selected for a specific purpose, and so the arguments (in economics shamelessly often totally non-existent) put forward for having selected a specific set of assumptions, have to be judged against that background to check if they are warranted.

This, however, only shrinks the assumptions set minimally – it is still necessary to decide on which assumptions are innocuous and which are harmful, and what constitutes interesting/important assumptions from an ontological & epistemological point of view (explanation, understanding, prediction). Especially so if you intend to refer your theories/models to a specific target system — preferably the real world. To do this one should start by applying a Real World Filter in the form of a Smell Test: Is the theory/model reasonable given what we know about the real world? If not, why should we care about it? If not – we shouldn’t apply it (remember time is limited and economics is a science on scarcity & optimization …)

I came to think of the importance of applying the Smell Test when re-reading Mark Thoma’s article — in The Fiscal Times — on “Do people have rational expectations?”: Read more…

Polanyi’s methodology in the Great Transformation

July 14, 2014 7 comments

from Asad Zaman

Polanyi’s book is widely recognized as among the most deeply original and seminal analyses of the origins and effects of capitalism. In a previous post,  I provided a brief summary of the main arguments of Polanyi.  Polanyi does not explicitly discuss methodology, but his analysis is based on a methodology radically different from any currently in use in social sciences. This methodology could provide the basis for an entirely new approach to the subject matter. In my paper entitled The Methodology of Polanyi’s Great Transformation, I have articulated central elements of this methodology by showing how Polanyi uses them in his book. I provide a brief summary of the main ideas of the paper here.

First, note that Polanyi operates at a meta-theoretical level. The work analyzes the emergence of theories as attempts to understand historical experience. This immediately leads to a historically context-sensitive analysis, as opposed to the ahistorical methods currently dominant in economics. In what is an extremely interesting twist, Polanyi argues that theories formulated by contemporaries to understand their experience are often wrong. Nonetheless, these theories are used to understand and shape responses to historical circumstances. This mechanism provides substantial room for human agency in influencing history. The key elements of Polanyi’s methodology, extracted from how he has utilized them in his book, are listed as follows: Read more…

Categories: methodology

Re-thinking the Definition of “Public Goods”

July 9, 2014 9 comments

from June Sekera

Introduction

A year ago last May, the Real World Economics Review blog published my post, “Why Aren’t We Talking About Public Goods?” In that article I argued that we need to revive and reframe the concept of public goods. A concept of public goods is immensely important because: 

  • The absence of a widely-held, constructive idea of public goods in public discourse denies citizens the ability to have an informed conversation, or to make informed decisions, about things that matter mightily to the quality of their lives and their communities.
  • Its absence robs public policy makers, leaders and managers of the concept that is most central to the reason for their being. 
  • The current economics definition of public goods feeds and supports the marketization and privatization of government, and the consequent undermining of governments’ ability to operate.         

Since last May I have met with economists and other social scientists across the US and in the UK and have been in discussion with people responding to my post from several other countries. I have also been conducting further research.

In this post I summarize the results of my discussions and findings to date and offer for consideration some criteria for a possible “instrumental” definition of public goods.  Ultimately, an instrumental definition of public goods must be accompanied by a concordant theory of non-market production in the public economy. Both are needed to ground an improved theory and practice of governance. 

1. The Existing Definition and Its Inadequacies Read more…

Categories: Uncategorized

What to do to make economics a relevant and realist science

July 8, 2014 38 comments

from Lars Syll

The other day yours truly wrote re Krugman’s dangerous neglect of methodological reflection:

The financial crisis of 2007-08 and its aftermath definitely shows that something has gone terribly wrong with our macroeconomic models, since they obviously did not foresee the collapse or even make it conceivable … Modern mainstream macroeconomics obviously did not anticipate the enormity of the problems that unregulated ‘efficient’ financial markets created. Why? Because it builds on the myth of us knowing the ‘data-generating process’ … Mainstream macroeconomists … want to be able to use their hammer. They decide to pretend that the world looks like a nail and that uncertainty can be reduced to risk. So they construct their mathematical models on that assumption–and the ensuing results are financial crises and economic havoc.

Now Brad DeLong earlier today commented on my critique:

OK …

Suppose we decide that we are no longer going to:

Pretend that agents — or economists — know the data-generating process…

Recognize that people are not terribly committed to Bayesianism – that they do not model probabilities as if they have well-defined priors and all there is is risk…

What do we then do – what kind of economic arguments do we make – once we have made those decisions?

“What do we then do?” The despair heard in the question reminds me of the first time I met Phil Mirowski. It was twenty years ago, and he had been invited to give a speech on themes from his book More Heat than Light at my economics department in Lund, Sweden. All the neoclassical professors were there. Their theories were totally mangled and no one — absolutely no one — had anything to say even remotely reminiscent of a defense. Being at a nonplus, one of them, in total desperation, finally asked “But what shall we do then?” Read more…

Inspecting the unlikely success of “Capital in the 21st century”

July 7, 2014 5 comments

from John Weeks

Against all expectations an economics book became a best seller this year. I illustrate this unlikely occurrence with a true story.  One day in London I hailed a taxi near the Houses of Parliament (the workers of the underground system were on strike).  I mentioned to the driver that I taught economics at the University of London before retiring several years ago.  The driver asked me, have you read this book by a Frenchman named Piketty?

A London taxi driver discussing an economics book 578 pages long (text only) with countless graphics and even a bit of algebra qualifies the book as a “phenomenon” by the dictionary definition, “a fact or situation that is observed to exist or happen, especially one whose cause or explanation is in question”.  Very much in question the cause is.  I am in the process of writing a review of these 578 pages (plus the occasional excursion into a footnote).  At this point I limit myself to speculating over why it has swept all before it, especially since it is certain to be a book that many people buy and almost no one reads.

We find many reviews of Capital in the 21st Century (which I shorten to C21C), most from progressives, soft to hard left.  The inequality deniers have yet to launch a frontal assault, though a recent blog entry for the Financial Times by Chris Giles is a shot from that direction (see Piketty’s reply).  Prominent UK journalist Paul Mason succinctly dismisses the attempted hatchet job (here). Read more…

Keynes on the use of mathematics in economics

July 7, 2014 9 comments

from Lars Syll

But I am unfamiliar with the methods involved and it may be that my impression that nothing emerges at the end which has not been introduced expressly or tacitly at the beginning is quite wrong … It seems to me essential in an article of this sort to put in the fullest and most explicit manner at the beginning the assumptions which are made and the methods by which the price indexes are derived; and then to state at the end what substantially novel conclusions has been arrived at …

I cannot persuade myself that this sort of treatment of economic theory has anything significant to contribute. I suspect it of being nothing better than a contraption proceeding from premises which are not stated with precision to conclusions which have no clear application … [This creates] a mass of symbolism which covers up all kinds of unstated special assumptions.

Letter from Keynes to Frisch 28 November 1935

Investors — people knowing almost nothing whatever about what they are doing

July 5, 2014 3 comments

from Lars Syll

How far the motives which I have been attributing to the market are strictly rational, I leave it to others to judge. They are best regarded, I think, as an example of how sensitive – over-sensitive if you like – to the near future, about which we may think that we know a little, even the best-informed must be, because, in truth, we know almost nothing about the more remote future …

The ignorance of even the best-informed investor about the more remote future is much greater than his knowledge … But if this is true of the best-informed, the vast majority of those who are concerned with the buying and selling of securities know almost nothing whatever about what they are doing … This is one of the odd characteristics of the Capitalist System under which we live …

It may often profit the wisest to anticipate mob psychology rather than the real trend of events, and to ape unreason proleptically … (The object of speculators) is to re-sell to the mob after a few weeks or at most a few months. It is natural, therefore, that they should be influenced by the cost of borrowing, and still more by their expectations on the basis of past experience of the trend of mob psychology. Thus, so long as the crowd can be relied on to act in a certain way, even if it be misguided, it will be to the advantage of the better informed professional to act in the same way – a short period ahead.

Categories: financial markets, Keynes

Uncertainty & reflexivity — implications for economics

July 3, 2014 2 comments

from Lars Syll

Almost a hundred years after John Maynard Keynes wrote his seminal A Treatise on Probability (1921), it is still very difficult to find mainstream economists who seriously try to incorporate his far-reaching and incisive analysis of induction and evidential weight into their theories and models.

The standard view in economics – and the axiomatic probability theory underlying it – is to a large extent based on the rather simplistic idea that “more is better.” But as Keynes argues – “more of the same” is not what is important when making inductive inferences. It’s rather a question of “more but different.”

Variation, not replication, is at the core of induction. Finding that p(x|y) = p(x|y & w) doesn’t make w “irrelevant.” Knowing that the probability is unchanged when w is present gives p(x|y & w) another evidential weight (“weight of argument”). Running 10 replicative experiments does not make you as “sure” of your inductions as running 10,000 varied experiments – even if the probability values happen to be the same. Read more…
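Keynes’s distinction between probability and “weight of argument” can be seen in a small numerical sketch (a hypothetical illustration of mine, using sample size as a crude stand-in for weight): two bodies of evidence can yield the identical probability estimate while differing enormously in how much they warrant.

```python
import math

# Hypothetical illustration of Keynes's "weight of argument": two samples
# give the SAME estimated probability, but one rests on far more evidence.
# Compare 6 successes in 10 trials with 6,000 successes in 10,000 trials.

def estimate_with_interval(successes, n, z=1.96):
    """Point estimate plus normal-approximation 95% interval half-width."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, half_width

p_small, hw_small = estimate_with_interval(6, 10)
p_large, hw_large = estimate_with_interval(6_000, 10_000)

assert p_small == p_large == 0.6   # identical probability estimates ...
print(hw_small, hw_large)          # ... but very different precision
```

Sample size is only the bluntest proxy for weight; Keynes’s own point, as the post stresses, is about variation versus mere replication, which no single probability number captures.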

Paul Krugman — a case of dangerous neglect of methodological reflection

June 30, 2014 4 comments

from Lars Syll

Alex Rosenberg — chair of the philosophy department at Duke University, renowned economic methodologist and author of Economics — Mathematical Politics or Science of Diminishing Returns? – had an interesting article on What’s Wrong with Paul Krugman’s Philosophy of Economics in 3:AM Magazine the other day. Writes Rosenberg: Read more…

‘Just desert’ and neoclassical income distribution theory

June 29, 2014 5 comments

from Lars Syll

Walked-out Harvard economist Greg Mankiw has more than once tried to defend the 0.1 % by invoking Adam Smith’s invisible hand:

[B]y delivering extraordinary performances in hit films, top stars may do more than entertain millions of moviegoers and make themselves rich in the process. They may also contribute many millions in federal taxes, and other millions in state taxes. And those millions help fund schools, police departments and national defense for the rest of us …

[T]he richest 1 percent aren’t motivated by an altruistic desire to advance the public good. But, in most cases, that is precisely their effect.

When reading Mankiw’s articles on the “just desert” of the 0.1 % one gets a strong feeling that Mankiw is really trying to argue that a market economy is some kind of moral free zone where, if left undisturbed, people get what they “deserve.”

Where does this view come from? Most neoclassical economists actually have a more or less Panglossian view on unfettered markets, but maybe Mankiw has also read neoliberal philosophers like Robert Nozick or David Gauthier. The latter writes in his Morals by Agreement:

The rich man may feast on caviar and champagne, while the poor woman starves at his gate. And she may not even take the crumbs from his table, if that would deprive him of his pleasure in feeding them to his birds.

Now, compare that unashamed neoliberal apologetics with what three truly great economists and liberals — John Maynard Keynes, Amartya Sen and Robert Solow — have to say on the issue: Read more…

11th out of 11

June 27, 2014 1 comment

from David Ruccio

The U.S. healthcare system ranks dead last out of 11 countries studied by the Commonwealth Fund [ht: ja]. Read more…

Categories: Decline of the USA, health

If you only have time to read one statistics book — this is the one!

June 25, 2014 5 comments

from Lars Syll

Mathematical statistician David A. Freedman‘s Statistical Models and Causal Inference (Cambridge University Press, 2010) is a marvellous book. It ought to be mandatory reading for every serious social scientist – including economists and econometricians – who doesn’t want to succumb to ad hoc assumptions and unsupported statistical conclusions! Read more…
