from Thomas Palley
Club, noun. 1. An association or organization dedicated to a particular interest or activity. 2. A heavy stick with a thick end, especially one used as a weapon.
Paul Krugman’s economic analysis is always stimulating and insightful, but there is one issue on which I think he persistently falls short. That issue is his account of New Keynesianism’s theoretical originality and intellectual impact. This is illustrated in his recent reply to a note of mine on the theory of the Phillips curve in which he writes: “I do believe that Palley is on the right track here, because it’s pretty much the same track a number of us have been following for the past few years.”
While I very much welcome his approval, his comment also strikes me as a little misleading. The model of nominal wage rigidity and the Phillips curve that I described comes from my 1990 dissertation, was published in March 1994, and has been followed by substantial further published research. That research also introduces ideas which are not part of the New Keynesian model and are needed to explain the Phillips curve in a higher inflation environment.
Similar precedence issues hold for scholarship on debt-driven business cycles, financial instability, the problem of debt-deflation in recessions and depressions, and the endogenous credit-driven nature of the money supply. These are all topics my colleagues and I, working in the Post- and old Keynesian traditions, have been writing about for years – No, decades! Read more…
In economics, there is an unfortunate rift between academics and the economists who actually measure the economy. This means that academic economists pay little attention to the extremely important question of how economic concepts relate to actual measurements – one reason why so much of their work is naïve (the ‘Ricardian’ household, which cuts consumption when government spending increases, and the like). Fortunately, economic historians, who often have to do the measurements themselves, often bridge part of the gap. Robert Gallman has some highly relevant remarks about different ways to measure (nineteenth-century USA) capital – and how these relate to the future, the past, uncertainty, savings, consumption foregone and replacement costs. This still leaves out important parts of the concept of capital, like liquidity, ownership and the ‘overlapping generations’ problem – which, however, does not make these remarks less valuable. Read more…
This conference on sustainable development is now open and you are invited to leave comments on the papers on the conference site.
click here to leave comments
For an introduction to the background to the conference and the themes of the call for papers click here ›
THE PAPERS Read more…
from Lars Syll
Frequentist hypothesis testing has come under sustained and vigorous attack in recent years … But there are a couple of good things about Frequentist hypothesis testing that I haven’t seen many people discuss. Both of these have to do not with the formal method itself, but with social conventions associated with the practice …
Why do I like these social conventions? Two reasons. First, I think they cut down a lot on scientific noise. “Statistical significance” is sort of a first-pass filter that tells you which results are interesting and which ones aren’t. Without that automated filter, the entire job of distinguishing interesting results from uninteresting ones falls to the reviewers of a paper, who have to read through the paper much more carefully than if they can just scan for those little asterisks of “significance”.
A non-trivial part of teaching statistics is made up of teaching students to perform significance testing. A problem I have noticed repeatedly over the years, however, is that no matter how careful you try to be in explicating what the probabilities generated by these statistical tests – p-values – really are, still most students misinterpret them. And a lot of researchers obviously also fall prey to the same mistakes: Read more…
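One way to see what a p-value is not: if the null hypothesis is true, “significant” results at the 5% level still turn up about 5% of the time, by construction – the p-value is not the probability that the null is true. A minimal simulation (our own illustration, not from the post):

```python
import math
import random

def z_test_p_value(sample, mu0=0.0, sigma=1.0):
    """Two-sided p-value of a z-test for the mean of `sample`,
    assuming the population standard deviation `sigma` is known."""
    n = len(sample)
    z = (sum(sample) / n - mu0) / (sigma / math.sqrt(n))
    # two-sided tail probability under the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def false_positive_rate(alpha=0.05, experiments=5000, n=30, seed=1):
    """Run many experiments in which the null hypothesis is TRUE
    (data drawn from N(0, 1), testing mu0 = 0) and count how often
    p < alpha. By construction the answer hovers around alpha."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(experiments):
        sample = [rng.gauss(0.0, 1.0) for _ in range(n)]
        if z_test_p_value(sample) < alpha:
            hits += 1
    return hits / experiments

print(round(false_positive_rate(), 3))  # close to 0.05
```

The asterisk-scanning convention discussed above filters on exactly this quantity, which is why a 5% stream of false positives is guaranteed even when every null is true.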
from Mark Weisbrot
Most people think that the Venezuelan economy is a basket case on the verge of collapse, and that has been a widespread belief for most of the last decade. But the South American nation has only run into serious trouble in the past two years. Starting in 2004, after the government wrested control of the all-important oil industry from its political opposition, the economy performed quite well through 2012. It grew at an annual rate of 4.8%, and the poverty rate fell by half.
During the past two years, however, a number of problems worsened. Inflation has hit annual rates of more than 60%, and the country has faced an increasing shortage of essential consumer goods like milk and toilet paper. The black market price of the dollar also soared.
Earlier this month, the government announced that it would try to resolve these imbalances by creating a single, unified exchange rate system. Venezuela currently has four different exchange rates. Most dollars are bought from the government at the official rate of 6.3 bolivares fuertes per U.S. dollar. These are intended for essential goods, such as food and medicine. Some other importers can buy dollars from the government at a rate of about 11 bolivares per dollar on a limited exchange called SICAD 1. There is also SICAD 2, which was introduced in March and involves private sellers, at a rate of 50 Bf. per dollar. Finally, there is the black market, which is unregulated, where dollars currently sell for about 79 Bf. per dollar. Read more…
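The gap between these rates is easier to grasp as a premium over the official rate. A quick sketch using the four rates quoted above (the rates are from the text; the calculation is just arithmetic):

```python
# Exchange rates quoted in the post (bolivares fuertes per U.S. dollar)
rates = {
    "official": 6.3,
    "SICAD 1": 11.0,
    "SICAD 2": 50.0,
    "black market": 79.0,
}

def premium_vs_official(rates, reference="official"):
    """Percent premium of each quoted rate over the reference rate."""
    base = rates[reference]
    return {name: round(100 * (r / base - 1), 1) for name, r in rates.items()}

print(premium_vs_official(rates))
# the black-market dollar costs roughly 12.5 times the official one
```

That elevenfold-plus premium is the arbitrage incentive any unification scheme has to eliminate.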
from Lars Syll
Neoclassical economics nowadays usually assumes that agents that have to make choices under conditions of uncertainty behave according to Bayesian rules (preferably the ones axiomatized by Ramsey (1931), de Finetti (1937) or Savage (1954)) – that is, they maximize expected utility with respect to some subjective probability measure that is continually updated according to Bayes theorem. If not, they are supposed to be irrational, and ultimately – via some “Dutch book” or “money pump” argument – susceptible to being ruined by some clever “bookie”.
Bayesianism reduces questions of rationality to questions of internal consistency (coherence) of beliefs, but – even granted this questionable reductionism – do rational agents really have to be Bayesian? As I have been arguing elsewhere (e.g. here and here) there is no strong warrant for believing so.
In many of the situations that are relevant to economics one could argue that there is simply not enough of adequate and relevant information to ground beliefs of a probabilistic kind, and that in those situations it is not really possible, in any relevant way, to represent an individual’s beliefs in a single probability measure. Read more…
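For concreteness, the updating rule the Bayesian story presupposes looks like this. A minimal sketch, where the hypotheses, likelihoods and coin flips are our own illustration, not from the post:

```python
from fractions import Fraction

def bayes_update(prior, likelihoods, observation):
    """One step of Bayes' theorem: posterior(h) is proportional to
    prior(h) * P(observation | h), renormalized to sum to one.

    `prior` maps hypotheses to probabilities; `likelihoods[h][obs]`
    gives the probability of `obs` under hypothesis h."""
    unnormalized = {h: prior[h] * likelihoods[h][observation] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# An agent unsure whether a coin is fair or biased toward heads
# updates a well-defined prior on each flip -- exactly the kind of
# single probability measure the critique says may not exist.
prior = {"fair": Fraction(1, 2), "biased": Fraction(1, 2)}
likelihoods = {
    "fair":   {"H": Fraction(1, 2), "T": Fraction(1, 2)},
    "biased": {"H": Fraction(3, 4), "T": Fraction(1, 4)},
}
belief = prior
for flip in "HHH":
    belief = bayes_update(belief, likelihoods, flip)
print(belief["biased"])  # 27/35 after three heads
```

The machinery only turns over once the prior and the likelihoods are written down; the argument above is precisely that in genuinely uncertain economic situations there is no non-arbitrary way to write them down.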
from Dean Baker
Federal Reserve Board Chair Janet Yellen made waves in her Congressional testimony last week when she argued that social media and biotech stocks were over-valued. She also said that the price of junk bonds was out of line with historic experience. By making these assertions in a highly visible public forum, Yellen was using the power of the Fed’s megaphone to stem the growth of incipient bubbles. This is an approach that some of us have advocated for close to twenty years.
Before examining the merits of this approach, it is worth noting the remarkable transformation in the Fed’s view on its role in containing bubbles. Just a decade ago, then Fed Chair Alan Greenspan told an adoring audience at the American Economic Association that the best thing the Fed could do with bubbles was to let them run their course and then pick up the pieces after they burst. He argued that the Fed’s approach to the stock bubble vindicated this route. Apparently it did not bother him, or most of the people in the audience, that the economy was at the time experiencing its longest period without net job growth since the Great Depression.
The Fed’s view on bubbles has evolved enormously. Most top Fed officials now recognize the need to take steps to prevent financial bubbles from growing to the point that their collapse would jeopardize the health of the economy. However, there are two very different routes proposed for containing bubbles. Read more…
from Lars Syll
Along with the Arrow-Debreu existence theorem and some results on regular economies, SMD theory fills in many of the gaps we might have in our understanding of general equilibrium theory …
It is also a deeply negative result. SMD theory means that assumptions guaranteeing good behavior at the microeconomic level do not carry over to the aggregate level or to qualitative features of the equilibrium. It has been difficult to make progress on the elaborations of general equilibrium theory that were put forth in Arrow and Hahn 1971 …
Given how sweeping the changes wrought by SMD theory seem to be, it is understandable that some very broad statements about the character of general equilibrium theory were made. Fifteen years after General Competitive Analysis, Arrow (1986) stated that the hypothesis of rationality had few implications at the aggregate level. Kirman (1989) held that general equilibrium theory could not generate falsifiable propositions, given that almost any set of data seemed consistent with the theory. These views are widely shared. Bliss (1993, 227) wrote that the “near emptiness of general equilibrium theory is a theorem of the theory.” Mas-Colell, Michael Whinston, and Jerry Green (1995) titled a section of their graduate microeconomics textbook “Anything Goes: The Sonnenschein-Mantel-Debreu Theorem.” There was a realization of a similar gap in the foundations of empirical economics. General equilibrium theory “poses some arduous challenges” as a “paradigm for organizing and synthesizing economic data” so that “a widely accepted empirical counterpart to general equilibrium theory remains to be developed” (Hansen and Heckman 1996). This seems to be the now-accepted view thirty years after the advent of SMD theory …
And so what? Why should we care about Sonnenschein-Mantel-Debreu? Read more…
from Dean Baker
Floyd Norris has an interesting piece discussing Citigroup’s $7 billion settlement for misrepresenting the quality of the mortgages in the mortgage-backed securities it marketed in the housing bubble. Norris notes that the bank had consultants who warned that many of the mortgages did not meet its standards and therefore should not have been included in the securities.
Towards the end of the piece Norris comments:
“And it may well be true that actions like Citigroup’s were necessary for any bank that wanted to stay in what then appeared to be a highly profitable business. Imagine for a minute what would have happened in 2006 if Citigroup had listened to its consultants and canceled the offerings. To the mortgage companies making the loans, that might have simply marked Citigroup as uncooperative. The business would have gone to less scrupulous competitors.”
This raises the question of what purpose is served by this sort of settlement. Undoubtedly Norris’ statement is true. However, the market dynamic might be different if this settlement were different. Read more…
The ‘Statistisches Bundesamt’ has a kind of app which helps companies to use data on historical inflation to insert ‘real price’ clauses in contracts – which can lead to a ‘price-price’ spiral. Interesting, as many economic models suppose that historical inflation is caused by expectations about future inflation (really!).
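A price-escalation clause of the kind such a tool supports is simple arithmetic – the contract price is rebased whenever the index moves. A minimal sketch with hypothetical numbers (not Destatis figures):

```python
def escalated_price(base_price, index_at_signing, index_now):
    """Price under a simple full-indexation clause: the contract
    price moves one-for-one with the chosen price index."""
    return base_price * index_now / index_at_signing

# Hypothetical illustration: a contract signed at CPI = 100 is
# revised once the index reaches 106, i.e. 6% cumulative inflation.
print(escalated_price(2500.0, 100.0, 106.0))  # 2650.0
```

When many contracts carry such clauses, each realized price rise feeds the index that triggers the next round of rises – the ‘price-price’ spiral the item mentions, driven by *past* inflation rather than expected future inflation.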
The German Handelsblatt has a very interesting article (sorry, no link) about business credit in the north (+1.4%) and the south (−5.9%) of the Euro Area, which shows the difficulties of monetary policy in the Euro Area: one size fits none.
Eurostat published, coincidentally on July 17, data on EU energy imports from Russia. 39% of EU imports of natural gas and 34% of oil imports come from Russia. Also (including re-exports):
For seven Member States (Bulgaria, the Czech Republic, Finland, Hungary, Lithuania, Poland, Slovakia), more than 75% of their petroleum oil imports came from Russia. Twelve countries (Austria, Bulgaria, the Czech Republic, Estonia, Finland, Hungary, Latvia, Lithuania, Poland, Romania, Slovakia, Slovenia) imported more than 75% of total national imports of natural gas from Russia.
Price increases caused a surge in the value of imports from Russia, from about 80 billion euro in 2005 to 140 billion in 2013.
from Mark Weisbrot
Back in 1998, when middle-income Asian countries were hard hit by big capital outflows, there was an effort – joined by China, Japan, Taiwan and other countries – to put together an Asian Monetary Fund to offer balance of payments support. Washington vetoed the idea, insisting that all assistance had to go through the IMF. The result was a mess, including an unnecessarily deep regional recession, as the IMF failed to act as a lender of last resort, and then attached all kinds of harmful and unnecessary conditions to its lending.
But the world has changed a lot in the past 15 years. Last week the BRICS countries (Brazil, Russia, China, India, and South Africa) decided to form the Contingent Reserve Arrangement (CRA) and the New Development Bank (NDB), and the United States will not have a veto this time. These new institutions have the potential to become a game changer for the world economy.
The western media coverage of these events has been mostly dismissive, but that primarily reflects the concerns of Washington and its allies. They have had unchallenged sway over the decision-making institutions of global financial governance for 70 years, and the last thing they want to see is competition. But competition is exactly what the world needs here. Read more…
from Thomas Palley
There is an old story about a policeman who sees a drunk looking for something under a streetlight and asks what he is looking for. The drunk replies he has lost his car keys and the policeman joins in the search. A few minutes later the policeman asks if he is sure he lost them here and the drunk replies “No, I lost them in the park.” The policeman then asks “So why are you looking here?” to which the drunk replies “Because this is where the light is.” That story has much relevance for the economics profession’s approach to the Phillips curve.
The question triggering the discussion is whether Phillips curve (PC) theory can account for inflation and the non-emergence of sustained deflation in the Great Recession. Four approaches are considered: (1) the original PC without inflation expectations; (2) the adaptive inflation expectations augmented PC; (3) the rational inflation expectations new classical vertical PC; and (4) the new Keynesian “sluggish price adjustment” PC that embeds a mix of lagged inflation and forward-looking rational inflation expectations. The conclusion seems to be that the original PC does best with regard to recent inflation experience but, of course, it fails with regard to past experience.
There is another obvious explanation that has been overlooked by mainstream economists for nearly forty years because they have preferred to keep looking under the “lamppost” of their conventional constructions. That alternative explanation rests on a combination of downward nominal wage rigidity plus incomplete incorporation of inflation expectations in a multi-sector economy. Read more…
from Lars Syll
Last year Dirk Ehnts had an interesting post up where he took Paul Krugman to task for still being married to the loanable funds theory.
Unfortunately this is not an exception among “New Keynesian” economists.
Neglecting anything resembling a real-world finance system, Greg Mankiw — in the 8th edition of his intermediate textbook Macroeconomics — has appended a new chapter to the other nineteen chapters where finance more or less is equated to the neoclassical thought-construction of a “market for loanable funds.”
On the subject of financial crises he admits that
perhaps we should view speculative excess and its ramifications as an inherent feature of market economies … but preventing them entirely may be too much to ask given our current knowledge.
This is of course self-evident for all of us who understand that both ontologically and epistemologically founded uncertainty makes any such hopes totally unfounded. But it’s rather odd to read this in a book that bases its models on assumptions of rational expectations, representative actors and dynamic stochastic general equilibrium – assumptions that convey the view that markets – give or take a few rigidities and menu costs – are efficient! For one of the many neoclassical economists so proud of their (unreal, yes, but) consistent models, Mankiw is certainly being flagrantly inconsistent here! Read more…
from Peter Radford
There is no point in bashing away at old economics or old economists. They are what they are. And it isn’t as if there is a compelling alternative to orthodoxy; if there were, we wouldn’t be in this never-ending and unproductive cycle of throwing stones at the establishment.
I think we all ought to take comfort in the fact that a few decades ago things were very different. The generation that trashed economics was itself once on the rise and on the outside. There are great reputations to be made fixing and updating the entire enterprise. In a business where incentives are so lauded, I imagine the incentive of fame should bring a savior soon enough.
Meanwhile it was sobering to read:
“The modern industrial system is no longer essentially a market system. It is planned in part by large firms and in part by the modern state. It must be planned, because modern technology and organization can flourish only in a stable environment, a condition the market cannot satisfy.” – J.K. Galbraith, “The New Industrial State”
Looking back at the state-of-the-art analysis of business organization in the first post-war decades, we find a picture so discordant with modern business theory that it is hard to connect the two. There was a distinct feeling back then that the complexity of a modern economy would overwhelm the simple structures of a market, and that long and complicated production processes therefore needed to be set within a controlled environment – a bureaucratic and centrally planned “meso-economy” called a business firm. Read more…
Recently, Eurostat published data on European house prices. Prices are, on average, falling by 0.3% year on year, which is a good thing, especially for the young. More good news: deflated Euro area and European Union house price indexes are about 12% lower than at the height of the bubble, in 2007. I’m of the opinion that young people establishing a family should not be burdened with paying high tributes to the middle-aged and elderly, or to banks reaping unearned rent incomes! Houses have to be, and can be, affordable (and available). Lower prices, in combination with lower interest rates and, in a number of countries, a limited increase in nominal wage rates, did lead to increased affordability (the idea that low interest rates necessarily have to lead to high house prices is bad economics!). Good.
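The affordability arithmetic can be made concrete with the standard fixed-rate annuity formula; all numbers below are hypothetical illustrations, not Eurostat figures:

```python
def monthly_payment(principal, annual_rate, years=30):
    """Standard annuity payment for a fixed-rate mortgage."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    if r == 0:
        return principal / n
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical example: a 12% price fall combined with a rate cut
# from 5% to 3.5% cuts the monthly burden by roughly a quarter.
before = monthly_payment(200_000, 0.05)          # ~1074 per month
after = monthly_payment(200_000 * 0.88, 0.035)   # ~790 per month
print(round(before, 2), round(after, 2))
```

The same formula also shows why low rates need not imply high prices: the payment a buyer can afford constrains the product of price and rate, not the price alone.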
However – signs of exuberance are returning. These increases are, as such, maybe not yet alarming, but whenever they are associated with rising private debt, we’re in trouble.
And part of these increases are alarming. Prices in London (which is about ten times as large as Estonia) recently increased by 20% year on year, a record. In Dublin (where prices declined during the crisis, but not to anything like a low level), prices are increasing at a double-digit rate too. The 2013 German rate of change seems to have been about 10% (Eurostat does not have recent German data), though the most recent data seem to indicate that this rise has abated. Read more…
from Lars Syll
Assumptions in scientific theories/models are often based on (mathematical) tractability (and are therefore necessarily simplifying) and used for more or less self-evidently necessary theoretical-consistency reasons. But one should also remember that assumptions are selected for a specific purpose, and so the arguments (in economics shamelessly often totally non-existent) put forward for having selected a specific set of assumptions have to be judged against that background, to check whether they are warranted.
This, however, only shrinks the assumption set minimally – it is still necessary to decide which assumptions are innocuous and which are harmful, and what constitutes interesting/important assumptions from an ontological & epistemological point of view (explanation, understanding, prediction). Especially so if you intend to refer your theories/models to a specific target system – preferably the real world. To do this one should start by applying a Real World Filter in the form of a Smell Test: Is the theory/model reasonable given what we know about the real world? If not, why should we care about it? We shouldn’t apply it (remember, time is limited and economics is a science of scarcity & optimization …)
from Asad Zaman
Polanyi’s book is widely recognized as among the most deeply original and seminal analyses of the origins and effects of capitalism. In a previous post, I provided a brief summary of the main arguments of Polanyi. Polanyi does not explicitly discuss methodology, but his analysis is based on a methodology radically different from any currently in use in social sciences. This methodology could provide the basis for an entirely new approach to the subject matter. In my paper entitled The Methodology of Polanyi’s Great Transformation, I have articulated central elements of this methodology by showing how Polanyi uses them in his book. I provide a brief summary of the main ideas of the paper here.
First, note that Polanyi operates at a meta-theoretical level. The work analyzes the emergence of theories as attempts to understand historical experience. This immediately leads to a historically context-sensitive analysis, as opposed to the ahistorical methods currently dominant in economics. In an extremely interesting twist, Polanyi argues that the theories formulated by contemporaries to understand their experience are often wrong. Nonetheless, these theories are used to understand and shape responses to historical circumstances. This mechanism provides substantial room for human agency in influencing history. The key elements of Polanyi’s methodology, extracted from how he has utilized them in his book, are listed as follows: Read more…
Dear Mrs. Sekera,
thank you for your very clear and insightful blog post on this blog about economists and their distorted concept of public goods. But I’m afraid you’re way too positive about at least mainstream macro models and ‘public goods’. Very often, these models do not have a concept of public goods.
Too often, neoclassical ‘macro’ models do not have any logical space for public goods or ‘government consumption’ (i.e. consumption by households produced or financed by the government, like education) at all. Which is daft. My country – the Netherlands – would not even exist without coastal defences. And believe me, we learned about the necessity of well-maintained, large-scale public coastal defences the New Orleans hard way (Knibbe, forthcoming). But in the default DSGE (Dynamic Stochastic General Equilibrium) models, public expenditure on coastal defences is, by definition, ‘wasteful’ – as it is government expenditure. You are attacking Samuelson – but Samuelson in fact crafted his ideas – which, as you state, do have basic flaws – precisely to combat this kind of thinking! But he failed. This kind of thinking is alive and kicking. And kicking hard – as the over five million unemployed in Spain can testify. Even incorporating the ideas of Samuelson into these models would be a huge step forward – they at least acknowledge the existence of public goods and services.
The Oosterscheldekering coastal defence. Wasteful?
There are endeavours to change this. But the very fact that these articles have to state, in an explicit way, their difference with ‘standard’ neoclassical ‘macro’ by introducing the notion that public expenditure can serve a purpose shows the ‘state of the mainstream art’: Read more…
from June Sekera
A year ago last May, the Real World Economics Review blog published my post, “Why Aren’t We Talking About Public Goods?” In that article I argued that we need to revive and reframe the concept of public goods. A concept of public goods is immensely important because:
- The absence of a widely-held, constructive idea of public goods in public discourse denies citizens the ability to have an informed conversation, or to make informed decisions, about things that matter mightily to the quality of their lives and their communities.
- Its absence robs public policy makers, leaders and managers of the concept that is most central to the reason for their being.
- The current economics definition of public goods feeds and supports the marketization and privatization of government, and the consequent undermining of governments’ ability to operate.
Since last May I have met with economists and other social scientists across the US and in the UK and have been in discussion with people responding to my post from several other countries. I have also been conducting further research.
In this post I summarize the results of my discussions and findings to date and offer for consideration some criteria for a possible “instrumental” definition of public goods. Ultimately, an instrumental definition of public goods must be accompanied by a concordant theory of non-market production in the public economy. Both are needed to ground an improved theory and practice of governance.
1. The Existing Definition and Its Inadequacies Read more…
from Lars Syll
The other day yours truly wrote re Krugman‘s dangerous neglect of methodological reflection:
The financial crisis of 2007-08 and its aftermath definitely shows that something has gone terribly wrong with our macroeconomic models, since they obviously did not foresee the collapse or even make it conceivable … Modern mainstream macroeconomics obviously did not anticipate the enormity of the problems that unregulated ‘efficient’ financial markets created. Why? Because it builds on the myth of us knowing the ‘data-generating process’ … Mainstream macroeconomists … want to be able to use their hammer. They decide to pretend that the world looks like a nail and that uncertainty can be reduced to risk. So they construct their mathematical models on that assumption – and the ensuing results are financial crises and economic havoc.
Now Brad DeLong earlier today commented on my critique:
Suppose we decide that we are no longer going to:
Pretend that agents — or economists — know the data-generating process…
Recognize that people are not terribly committed to Bayesianism – that they do not model probabilities as if they have well-defined priors and all there is is risk…
What do we then do – what kind of economic arguments do we make – once we have made those decisions?
“What do we then do?” The despair heard in the question reminds me of the first time I met Phil Mirowski. It was twenty years ago, and he had been invited to give a speech on themes from his book More Heat than Light at my economics department in Lund, Sweden. All the neoclassical professors were there. Their theories were totally mangled and no one — absolutely no one — had anything to say even remotely reminiscent of a defense. Being at a nonplus, one of them, in total desperation, finally asked “But what shall we do then?” Read more…