Where does inflation hide?

February 22, 2018 19 comments

from Herman Daly

The talking heads on the media explain the recent fall in the stock market as follows:

A fall in unemployment leads to a tight labor market and the prospect of wage increases; wage increases lead to the threat of inflation, which makes the Fed likely to raise interest rates; higher rates mean less borrowing, less investment in stocks, and consequently an expected fall in stock prices. Investors (speculators) therefore rush to sell before the expected fall happens, bringing about the very fall they expected. So the implicit conclusion is that rising wages for the bottom 90% are bad for “the economy”, while an increase in the unearned incomes (lightly taxed capital gains) of the top 10% is good for “the economy”. The financial news readers of the corporate media avoid making that grotesque conclusion explicit, but it is implicit in their explanation.

A wage increase, in addition to cutting into profits, is considered inflationary, and that leads the Fed to raise interest rates and choke off the new money feeding the stock market boom and related growth euphoria. But higher interest rates serve other functions, most notably to keep capital from being wasted on uneconomic projects that are financially lucrative only at zero or negative interest rates. Furthermore, positive interest rates reward savers, provide for retirement and emergencies, and even reduce the inflationary effect of consumer spending.  Read more…

What, us worry?

February 21, 2018 8 comments

David Ruccio


Ed Wolff is right:

For the vast majority of Americans, fluctuations in the stock market have relatively little effect on their wealth, or well-being, for that matter.

Read more…

The debate continues in the same absurd, polarized and simplified form.  

February 20, 2018 5 comments

from Neva Goodwin 

One of the outstanding features of the time in which we live is the terrifying prospect of global climate change, regarding which it has been said that contemporary humankind is suffering from “Pre-Traumatic Stress Disorder”. Whether we squarely face what this will likely mean for the coming years, or whether we simply can’t bear to look at the facts, it is getting ever harder to avoid the gut-knowledge that the world is rapidly becoming markedly less beautiful, rich and generous to its human inhabitants. Tens of thousands of species disappear forever every year. Large coastal land areas will be submerged; diseases will multiply and spread; food from the oceans and the climate-stressed fields will be scarce; fresh water will be expensive or unobtainable for ever more millions of people; environmental refugees will swell the ranks of unwelcome migrants; and armed conflicts will reach many people who had assumed they were safe.

Armed fortress living will be increasingly common among the rich, and will doubtless create some areas of relative security, but the people inside will be their own prisoners. They will find it difficult to visit the beautiful natural areas in the United States, or the cultural jewels of other continents. Many of these cultural jewels are already being sacked in the raging conflicts of the Middle East and elsewhere; many of the world’s natural beauties are already eroding under pressure from climate change – as well as from actors in the market economy. The rich are not immune to pre-traumatic stress, as this century heads for various forms of catastrophe; their awareness and response will be important for any hope we may have of a constructive response to the threats we face. An indicator of awareness is a comment by the investor Seth Klarman, warning that the Trump administration could lead to a major stock market correction and “global angst” among the investor class. But some of that angst is already translating into escapist survivalism among those who can afford to buy land in New Zealand, or build bunkers out of former missile sites in the U.S. The work of Dr Richard Rockefeller, to whom this piece is dedicated, is an example of a more responsible kind of reaction among the one percent.  Read more…

The future — something we know very little about

February 20, 2018 18 comments

from Lars Syll

All these pretty, polite techniques, made for a well-panelled Board Room and a nicely regulated market, are liable to collapse. At all times the vague panic fears and equally vague and unreasoned hopes are not really lulled, and lie but a little way below the surface.

Perhaps the reader feels that this general, philosophical disquisition on the behavior of mankind is somewhat remote from the economic theory under discussion. But I think not. Tho this is how we behave in the market place, the theory we devise in the study of how we behave in the market place should not itself submit to market-place idols. I accuse the classical economic theory of being itself one of these pretty, polite techniques which tries to deal with the present by abstracting from the fact that we know very little about the future.

I dare say that a classical economist would readily admit this. But, even so, I think he has overlooked the precise nature of the difference which his abstraction makes between theory and practice, and the character of the fallacies into which he is likely to be led.

John Maynard Keynes

Who’s Afraid of John Maynard Keynes?

February 19, 2018 7 comments

from James Galbraith and the Journal of Behavioral and Experimental Economics

Paul Davidson, Who’s Afraid of John Maynard Keynes? Challenging Economic Governance in an Age of Growing Inequality, 2017, Palgrave Macmillan

Paul Davidson, in his ninth decade, has produced a crisp and clear exegesis of essential Keynesian ideas and the critical failures of so-called mainstream economic thought. The most fundamental flaw lies in the treatment of time. Rooted in ancient ideas of equilibrium, harmony and social balance, mainstream economics treats the future as an extrapolation of the past, predictable except for random errors, which are called “risk.” This, as Davidson insists, is incurably incorrect; there is uncertainty, and at any time financial markets are prone to collapse in a failed flight to safety, which drains liquidity and deprives both financial and physical assets of their market value.

From this it follows that in the social sphere any model that projects the future from the past will fail from time to time. The models work so long as things do not change! Change and turning points nevertheless occur, and those who believe most in the model will prepare the least and be hurt the worst. And yet, for the economy to function, “belief” in the model – at the least, conditional belief sufficient to motivate consumption and investment – appears essential. Without it, the private economy cannot prosper. Living in a house of cards is better than having no house at all.  Read more…
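To make the point about turning points concrete, here is a minimal simulation of my own (not from Davidson's book; all numbers are invented). A trend model fitted to a placid past looks superb in-sample and then fails badly at a structural break:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 'economy': steady 2% growth for 80 periods, then a crisis regime.
t = np.arange(100)
series = 100 * 1.02 ** t
series[80:] *= np.exp(-0.05 * np.arange(20))   # turning point at t = 80
series *= rng.lognormal(0.0, 0.01, size=100)   # small noise, the 'risk' part

# Fit a log-linear trend on the pre-break sample only, as an
# extrapolating forecaster would.
coeffs = np.polyfit(t[:80], np.log(series[:80]), 1)
forecast = np.exp(np.polyval(coeffs, t))

# In-sample the model is excellent; past the turning point it is useless.
in_err = np.abs(forecast[:80] / series[:80] - 1).mean()
out_err = np.abs(forecast[80:] / series[80:] - 1).mean()
print(f"mean error before the break: {in_err:.1%}")
print(f"mean error after the break:  {out_err:.1%}")
```

The error before the break is a fraction of a percent; after it, the forecast misses reality by an amount that grows every period, which is exactly what happens to those who believe most in the model.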

Economics education — teaching cohorts after cohorts of students useless theories

February 18, 2018 6 comments

from Lars Syll

Nowadays there is almost no place whatsoever in economics education for courses in the history of economic thought and economic methodology.

This is deeply worrying.

A science that doesn’t self-reflect and ask important methodological and science-theoretical questions about its own activity is a science in dire straits.

How did we end up in this sad state?

Philip Mirowski gives the following answer:

After a brief flirtation in the 1960s and 1970s, the grandees of the economics profession took it upon themselves to express openly their disdain and revulsion for the types of self-reflection practiced by ‘methodologists’ and historians of economics, and to go out of their way to prevent those so inclined from occupying any tenured foothold in reputable economics departments. It was perhaps no coincidence that history and philosophy were the areas where one found the greatest concentrations of skeptics concerning the shape and substance of the post-war American economic orthodoxy. High-ranking economics journals, such as the American Economic Review, the Quarterly Journal of Economics and the Journal of Political Economy, declared that they would cease publication of any articles whatsoever in the area, after a prior history of acceptance.

Once this policy was put in place, algorithmic journal rankings were used to deny hiring and promotion at the commanding heights of economics to those with methodological leanings. Consequently, the grey-beards summarily expelled both philosophy and history from the graduate economics curriculum; and then, they chased it out of the undergraduate curriculum as well. This latter exile was the bitterest, if only because many undergraduates often want to ask why the profession believes what it does, and hear others debate the answers, since their own allegiances are still in the process of being formed. The rationale tendered to repress this demand was that the students needed still more mathematics preparation, more statistics and more tutelage in ‘theory’, which meant in practice a boot camp regimen consisting of endless working of problem sets, problem sets and more problem sets, until the poor tyros were so dizzy they did not have the spunk left to interrogate the masses of journal articles they had struggled to absorb.

Read more…

The third axiom of neoclassical economics: methodological equilibration

February 18, 2018 4 comments

from Christian Arnsperger and Yanis Varoufakis

The third feature of neoclassical economics is, on our account, the axiomatic imposition of equilibrium. The point here is that, even after methodological individualism turned into methodological instrumentalism, prediction at the macro (or social) level was seldom forthcoming. Determinacy required something more: it required that agents’ instrumental behaviour be coordinated in such a manner that aggregate behaviour becomes sufficiently regular to give rise to solid predictions. Thus, neoclassical theoretical exercises begin by postulating the agents’ utility functions, specifying their constraints, and stating their ‘information’ or ‘belief’. Then, and here is the crux, they pose the standard question: “What behaviour should we expect in equilibrium?” The question of whether an equilibrium is possible, let alone probable, or how it might materialise, is treated as an optional extra, one that is never central to the neoclassical project.

The reason for the axiomatic imposition of equilibrium is simple: it could not be otherwise! By this we mean that neoclassicism cannot demonstrate that equilibrium would emerge as a natural consequence of agents’ instrumentally rational choices. Thus, the second-best methodological alternative for the neoclassical theorist is to presume that behaviour hovers around some analytically discovered equilibrium and then ask whether, once at that equilibrium, the ‘system’ has a propensity to stay there or to drift away (what is known as ‘stability analysis’).  Read more…
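For readers who want the mechanics spelled out, here is a minimal sketch of what such ‘stability analysis’ amounts to in the simplest case: one hypothetical market with linear demand and supply (my own toy example, not taken from the authors). Convergence below comes entirely from a price-adjustment rule bolted on from outside; nothing in the agents’ optimising choices themselves delivers it, which is the authors’ point.

```python
# A sketch of 'stability analysis' for a single market with hypothetical
# linear demand and supply curves (all coefficients invented).

def excess_demand(p: float) -> float:
    demand = 10.0 - 1.0 * p
    supply = 2.0 + 1.0 * p
    return demand - supply

# The analytically discovered equilibrium: 10 - p = 2 + p  =>  p* = 4.
p_star = 4.0
assert abs(excess_demand(p_star)) < 1e-12

# Stability analysis: perturb the price and let it follow excess demand
# (a 'tatonnement' adjustment rule assumed by the theorist, not derived
# from any agent's optimisation).
p, speed = 6.0, 0.3
for _ in range(20):
    p += speed * excess_demand(p)
print(f"price after adjustment: {p:.4f} (equilibrium is {p_star})")
```

With these coefficients the price crawls back to 4; with other, equally admissible coefficients or adjustment speeds it oscillates or diverges, and the theory itself has nothing to say about which case the world is in.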

The problem of extrapolation

February 17, 2018 43 comments

from Lars Syll

There are two basic challenges that confront any account of extrapolation that seeks to resolve the shortcomings of simple induction. One challenge, which I call the extrapolator’s circle, arises from the fact that extrapolation is worthwhile only when there are important limitations on what one can learn about the target by studying it directly. The challenge, then, is to explain how the suitability of the model as a basis for extrapolation can be established given only limited, partial information about the target … The second challenge is a direct consequence of the heterogeneity of populations studied in biology and social sciences. Because of this heterogeneity, it is inevitable that there will be causally relevant differences between the model and the target population.

In economics — as a rule — we can’t experiment on the real-world target directly. To experiment, economists therefore standardly construct ‘surrogate’ models and perform ‘experiments’ on them. To be of interest to us, these surrogate models have to be shown to be relevantly ‘similar’ to the real-world target, so that knowledge from the model can be exported to the real-world target. The fundamental problem highlighted by Steel is that this ‘bridging’ is deeply problematic — to show that what is true of the model is also true of the real-world target, we have to know what is true of the target; but to know what is true of the target, we have to know that we have a good model …  Read more…
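A stylised numerical rendering of the circle may help (my own construction; the numbers are invented). We estimate a causal effect on the ‘surrogate’ population we can study, but whether it carries over depends on target facts we were assumed not to have:

```python
import numpy as np

rng = np.random.default_rng(1)

# 'Model' (surrogate) population: the effect of x on y is 2.0.
x_m = rng.normal(size=1000)
y_m = 2.0 * x_m + rng.normal(size=1000)

# Heterogeneous 'target' population: the true effect is 0.5 instead.
x_t = rng.normal(size=1000)
y_t = 0.5 * x_t + rng.normal(size=1000)

# We can only study the surrogate directly ...
effect_hat = np.polyfit(x_m, y_m, 1)[0]
print(f"effect estimated on the surrogate: {effect_hat:.2f}")

# ... but judging whether extrapolation is valid requires exactly the
# target knowledge we lack. That is the extrapolator's circle.
true_effect = np.polyfit(x_t, y_t, 1)[0]
print(f"effect in the target population:   {true_effect:.2f}")
```

Nothing in the surrogate data warns us that the estimate will miss by a factor of four; only target data could, and if we had those we would not need the surrogate.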

Polanyi’s six points

February 17, 2018 3 comments

from Asad Zaman

The analysis of Polanyi’s Great Transformation can be summarized in the six points listed below.

1: All societies face the economic task of producing and providing for all members of society. Modern market societies are unique in assigning this responsibility to the marketplace, thereby creating entitlements to production for those with wealth, and depriving the poor of entitlement to food. All traditional societies have used non-market mechanisms based on cooperation and social responsibility to provide for members who cannot take care of their own needs. It is only in a market society that education, health, housing, and social welfare services are available only to those who can pay for them.

2: Market mechanisms for providing goods to members conflict with other social mechanisms and are harmful to society. They emerged to central prominence in Europe after a protracted battle, which markets won over society owing to historical circumstances peculiar to Europe. The rise of markets caused tremendous damage to society, which continues to this day. The replacement of key mechanisms governing social relations with ones compatible with market mechanisms was traumatic to human values. Land, labour and money are crucial to the efficient functioning of a market economy. Market societies convert these into commodities, causing tremendous damage. This involves (A) changing a nurturing and symbiotic relationship with Mother Earth into a commercial one of exploiting nature, (B) changing relationships based on trust, intimacy and lifetime commitments into short-term impersonal commercial transactions, and (C) turning human lives into saleable commodities in order to create a labor market.  Read more…

The second axiom of neoclassical economics: methodological instrumentalism

February 17, 2018 2 comments

from Christian Arnsperger and Yanis Varoufakis

We label the second feature of neoclassical economics methodological instrumentalism: all behaviour is preference-driven or, more precisely, it is to be understood as a means for maximising preference-satisfaction.[1] Preference is given, current, fully determining, and strictly separate from both belief (which simply helps the agent predict uncertain future outcomes) and from the means employed. Everything we do and say is instrumental to preference-satisfaction, so much so that there is no longer any philosophical room for questioning whether the agent will act on her preferences. In effect, neoclassical theory is a narrow version of consequentialism in which the only consequence that matters is the extent to which a homogeneous index of preference-satisfaction is maximised.[2]

Methodological instrumentalism’s roots are traceable to David Hume’s Treatise of Human Nature (1739/40), in which the Scottish philosopher famously divided the human decision-making process into three distinct modules: Passions, Belief and Reason. Passions provide the destination; Reason slavishly steers a course that attempts to get us there, drawing upon a given set of Beliefs regarding the external constraints and the likely consequences of alternative actions. It is not difficult to see the lineage to standard microeconomics: the person is defined as a bundle of preferences, her beliefs reduce to a set of subjective probability density functions, which help convert her preferences into expected utilities, and, lastly, her Reason is the cold-hearted optimiser whose authority does not extend beyond maximising these utilities. However, it is a mistake to think that Hume would have approved. For his Passions are too unruly to fit neatly in some ordinal or expected utility function. It took the combined efforts of Jeremy Bentham and the late 19th-century neoclassicists to tame the Passions sufficiently, reducing them first to a unidimensional index of pleasure and then to smooth, twice-differentiable utility functions.  Read more…
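The neoclassical reading of Hume that the authors describe can be written down in a few lines. The sketch below is my own toy rendering, with invented states, payoffs and a hypothetical exponential utility index; it is meant only to show how Beliefs, Passions and Reason become three slots in one optimisation:

```python
import math

# Beliefs: subjective probabilities over future states (invented numbers).
beliefs = {"boom": 0.3, "slump": 0.7}

# The outcome of each available action in each state (invented numbers).
payoffs = {
    "invest": {"boom": 100.0, "slump": -40.0},
    "hold":   {"boom": 10.0,  "slump": 10.0},
}

def utility(x: float) -> float:
    # Passions, tamed into a smooth, concave (risk-averse) utility index.
    return 1.0 - math.exp(-x / 50.0)

def expected_utility(action: str) -> float:
    return sum(p * utility(payoffs[action][s]) for s, p in beliefs.items())

# Reason: the cold-hearted optimiser, with no authority beyond maximising.
best = max(payoffs, key=expected_utility)
print({a: round(expected_utility(a), 3) for a in payoffs}, "->", best)
```

Everything Hume packed into the unruly Passions has here been flattened into one differentiable function, which is precisely the taming that Bentham and the neoclassicists performed.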

Modern macro-economists: money is not ‘neutral’. Bordo, Meissner, Sufi and Mian do a good job.

February 16, 2018 1 comment

Hardcore neoclassical economist John Taylor has edited a new handbook of macro-economics. The good news: the sands are shifting. After 2008, more attention has been paid to the obvious fact that we’re living in a monetary world. Guess what: it turns out that money is non-neutral after all. Two examples (summaries below):

(A) Bordo and Meissner claim that whenever a country has a large banking sector, it faces a choice during a financial crisis: it can bail out the banks, or it can try to mitigate the crisis and prevent unemployment from rising to extreme levels.

And (B): Mian and Sufi’s work implies that the ‘representative consumer’ is bogus: differences between renters and homeowners, combined with data on indebtedness and house price booms and busts, explain much of the severity of the 2008 crisis.

Bordo and Meissner:

(A) Interconnections between banking crises and fiscal crises have a long history. We document the long-run evolution from classic banking panics toward modern banking crises where financial guarantees are associated with crisis resolution. Recent crises feature a feedback loop between bank guarantees and bank holdings of local sovereign debt thereby linking financial to fiscal crises. Earlier examples include the crises in Chile (early 1980s), Japan (1990), Sweden and Finland (1991), and the Asian crisis (1997). Read more…

The first axiom of neoclassical economics: methodological individualism

February 16, 2018 1 comment

from Christian Arnsperger and Yanis Varoufakis

Unsophisticated critics often identify economic neoclassicism with models in which all agents are perfectly informed. Or fully instrumentally rational. Or excruciatingly selfish. Defining neoclassicism in this manner would perhaps have been apt in the 1950s but, nowadays, it leaves almost all of modern neoclassical theory out of the definition, thereby strengthening the mainstream’s rejoinders. Indeed, the last thirty years of neoclassical economics have been marked by an explosion of models in which economic actors are imperfectly informed, sometimes other-regarding, frequently irrational (or boundedly rational, as the current jargon would have it), etc. In short, Homo Economicus has evolved to resemble us more.

None of these brilliant theoretical advances have, however, dislodged the neoclassical vessel from its methodological anchorage. Neoclassical theory retains its roots firmly within liberal individualist social science. The method is still unbendingly of the analytic-synthetic type: the socio-economic phenomenon under scrutiny is to be analysed by focusing on the individuals whose actions brought it about; understanding fully their ‘workings’ at the individual level; and, finally, synthesising the knowledge derived at the individual level in order to understand the complex social phenomenon at hand. In short, neoclassical theory follows the method of the watchmaker who, faced with a strange watch, studies its function by focusing first on understanding the function of each of its cogs and wheels. To the neoclassical economist, the individual agents are those cogs and wheels, to be studied independently of the social whole their actions help bring about.  Read more…

Bitcoin, efficient markets, and efficient financial sectors

February 16, 2018 5 comments

from Dean Baker

John Quiggin had a good piece in the NYT, pointing out how the sky-high valuations of Bitcoin undermine the efficient market hypothesis that plays a central role in much economic theory. In its strong form, the hypothesis implies that we can count on markets to direct capital to its best possible uses. This means that government interventions of various types will lead to a less efficient allocation of capital and therefore slower economic growth.

Quiggin points out that this view is hard to reconcile with the dot-com bubble of the late 1990s and the housing bubble of the last decade. Massive amounts of capital were clearly directed towards poor uses in the form of companies that would never make a profit in the 1990s and houses that never should have been built in the last decade.

But Bitcoin takes this a step further. Bitcoin has no use. It makes no sense as currency and it is almost impossible to envision a scenario in which it would in the future. It has no aesthetic value, like a great painting or even a colorful stock certificate. It is literally nothing and worth nothing. Nonetheless, at its peak, the capitalization of Bitcoin was more than $300 billion. This suggests some heavy-duty inefficiency in the market.  Read more…

The arrow of time in a non-ergodic world

February 16, 2018 8 comments

from Lars Syll

For the vast majority of scientists, thermodynamics had to be limited strictly to equilibrium. That was the opinion of J. Willard Gibbs, as well as of Gilbert N. Lewis. For them, irreversibility associated with unidirectional time was anathema …

I myself experienced this type of hostility in 1946 … After I had presented my own lecture on irreversible thermodynamics, the greatest expert in the field of thermodynamics made the following comment: ‘I am astonished that this young man is so interested in nonequilibrium physics. Irreversible processes are transient. Why not wait and study equilibrium as everyone else does?’ I was so amazed at this response that I did not have the presence of mind to answer: ‘But we are all transient. Is it not natural to be interested in our common human condition?’

Time is what prevents everything from happening at once. To simply assume that economic processes are ergodic and concentrate on ensemble averages — and hence in any relevant sense timeless — is not a sensible way of dealing with the kind of genuine uncertainty that permeates real-world economies.

Ergodicity and the all-important difference between time averages and ensemble averages are difficult concepts — so let me try to explain the meaning of these concepts by means of a couple of simple examples.
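As a taste of those examples, here is a compact simulation of my own (in the spirit of the standard multiplicative coin-toss illustration; all numbers are invented) in which the two averages point in opposite directions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Multiplicative gamble: each round, wealth is multiplied by 1.5 (heads)
# or 0.6 (tails) with equal probability.
rounds, agents = 1000, 2000
factors = rng.choice([1.5, 0.6], size=(agents, rounds))

# Ensemble average: the expected one-round factor is
# 0.5 * 1.5 + 0.5 * 0.6 = 1.05, an apparent 5% gain per round.
print("mean one-round factor across agents:", round(float(factors.mean()), 3))

# Time average: a typical single trajectory grows at
# sqrt(1.5 * 0.6) ~= 0.949 per round, i.e. it shrinks towards zero.
final_log_wealth = np.log(factors).sum(axis=1)
typical = np.exp(np.median(final_log_wealth) / rounds)
print("per-round growth of the median agent:", round(float(typical), 3))
```

Averaged across agents, the gamble looks like a 5% winner every round; followed through time, almost every individual trajectory decays to nothing. Treating the first number as if it described the second is exactly the ergodicity mistake.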

Read more…

Jeff Bezos’ quest to find America’s stupidest mayor

February 14, 2018 2 comments

from Dean Baker

With the Super Bowl now behind us, America eagerly awaits the next big event: the announcement of the winner in Jeff Bezos’ contest to determine which combination of state and local governments is prepared to give him the most money to be home to Amazon’s new headquarters.

Narrowed from a field of more than 200 applications, 20 finalists now wait with bated breath for the news, expected sometime later this year. But while the politicians who join Bezos for the photo op are going to be treated as big winners, it is likely that the taxpayers they represent will be big losers, dishing out more to Amazon than they will ever get back in benefits.

Bezos’ “HQ2” contest is simply an extension of a game that corporations have been playing with state and local governments for the last four decades. Rather than making location decisions based on standard economic factors, like the availability of a skilled labor force, quality infrastructure, land prices and tax rates, they have persuaded governments to bid against each other with company-specific benefit packages ― usually a basket of tax concessions and sometimes even including commitments to build company-specific infrastructure like port facilities or roads.

However, most research indicates that the cost to state and local governments of these subsidies typically outweighs the benefits in terms of employment and tax revenue, including in the case of Amazon’s growing network of fulfillment centers. Read more…

We’re #2! – Financial Secrecy Index 2018

February 14, 2018 1 comment

from David Ruccio


According to the Tax Justice Network, the United States ranks second in the 2018 Financial Secrecy Index. This is based on a secrecy score of 59.8, which is practically unchanged from 2015. The only country ahead of the United States is Switzerland, with a secrecy score of 76. The rise of the United States continues a long-term trend, as the country was one of the few to increase its secrecy score in the 2015 index.  Read more…

The limits of probabilistic reasoning

February 13, 2018 9 comments

from Lars Syll

Probabilistic reasoning in science — especially Bayesianism — reduces questions of rationality to questions of internal consistency (coherence) of beliefs, but, even granting this questionable reductionism, it is not self-evident that rational agents really have to be probabilistically consistent. There is no strong warrant for believing so. Rather, there is strong evidence that we run into huge problems if we let probabilistic reasoning become the dominant method for doing research in the social sciences on problems that involve risk and uncertainty.

In many of the situations that are relevant to economics, one could argue that there is simply not enough adequate and relevant information to ground beliefs of a probabilistic kind, and that in those situations it is not possible, in any relevant way, to represent an individual’s beliefs in a single probability measure.

Say you have come to learn (based on your own experience and tons of data) that the probability of your becoming unemployed in Sweden is 10%. Having moved to another country (where you have no experience of your own and no data), you have no information on unemployment and a fortiori nothing on which to base any probability estimate. A Bayesian would, however, argue that you would have to assign probabilities to the mutually exclusive alternative outcomes and that these have to add up to 1 if you are rational. That is, in this case – and based on symmetry – a rational individual would have to assign probability 50% to becoming unemployed and 50% to becoming employed.  Read more…
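To see the contrast in code rather than prose, here is a minimal sketch of my own (the numbers are invented). Both assignments below are perfectly coherent; only one of them is grounded in anything:

```python
# At home: a relative frequency grounds the probability estimate.
unemployed_spells, observed_years = 2, 20
p_home = unemployed_spells / observed_years          # 0.10, data-based

# Abroad: no data at all. The principle of indifference nevertheless
# demands a coherent assignment over the mutually exclusive outcomes.
outcomes = ["unemployed", "employed"]
p_abroad = {o: 1 / len(outcomes) for o in outcomes}  # 0.5 each, by symmetry

# Coherence is satisfied in both cases (probabilities sum to one), yet
# coherence alone cannot distinguish evidence from sheer symmetry.
assert abs(sum(p_abroad.values()) - 1.0) < 1e-12
print(p_home, p_abroad)
```

The Bayesian machinery runs equally smoothly on both assignments, which is precisely why probabilistic consistency is too weak a notion of rationality for situations of genuine uncertainty.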

One way to protect democracy is to stop pushing policies that redistribute income upward.

February 13, 2018 15 comments

from Dean Baker

That one is apparently not on the agenda, at least according to Amanda Taub’s NYT “The Interpreter” piece. The piece notes the declining support for center-right and center-left parties in most western democracies. While it notes that people feel unrepresented by these parties, it never states the obvious: these parties have consistently supported monetary, fiscal, trade, and intellectual property policies that redistribute an ever-larger share of income to people like Bill Gates and Robert Rubin.

It should not be surprising that most of the public is not enthralled with this outcome and the parties that promote it. And yes, there are alternatives, as I point out in my (free) book, Rigged: How Globalization and the Rules of the Modern Economy Were Structured to Make the Rich Richer.

The semantics of mathematical equilibrium theory

February 12, 2018 41 comments

from Michael Hudson

If mathematics is deemed to be the new language of economics, it is a language with a thought structure whose semantics, syntax and vocabulary shape its users’ perceptions. There are many ways in which to think, and many forms in which mathematical ideas may be expressed. Equilibrium theory, for example, may specify the conditions in which an economy’s public and private-sector debts may be paid. But what happens when not all these debts can be paid? Formulating economic problems in the language of linear programming has the advantage of enabling one to reason in terms of linear inequality, e.g., to think of the economy’s debt overhead as being greater than, equal to, or less than its capacity to pay.
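As a toy illustration of the linear-inequality point (my own sketch; the two-sector setup and all numbers are invented, and SciPy's linprog stands in for whatever solver one prefers), one can compute an economy's maximum capacity to pay and confront it with a debt overhead contracted in advance:

```python
from scipy.optimize import linprog

# Maximise the surplus of two sectors subject to resource limits.
# linprog minimises, so the output coefficients are negated.
c = [-3.0, -2.0]                  # surplus per unit of activity in sectors 1, 2
A_ub = [[1.0, 1.0],               # labour constraint
        [2.0, 1.0]]               # credit constraint
b_ub = [10.0, 16.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
capacity_to_pay = -res.fun        # maximum attainable surplus (here 26)

debt_service_due = 30.0           # overhead fixed by past contracts
print(f"capacity to pay: {capacity_to_pay:.1f}, debt due: {debt_service_due:.1f}")
if debt_service_due > capacity_to_pay:
    # The inequality cannot be satisfied inside the model; the resolution
    # (default, write-down, bailout) has to come from outside it.
    print("debt overhead exceeds capacity to pay")
```

When the inequality fails, no adjustment inside the model restores it; something has to give, and the model can only state the dilemma, not resolve it.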

An array of mathematical modes of expression thus is available to the economist. Equilibrium-based entropy theory views the economy as a thermodynamic system characterized by what systems analysts call negative feedback. Chaos theories are able to cope with the phenomena of increasing returns and compound interest, which are best analyzed in terms of positive feedback and intersecting trends. Points of intersection imply that something has to give and the solution must come politically from outside the economic system as such.

What determines which kind of mathematical language will be used? At first glance it may seem that if much of today’s mathematical economics has become irrelevant, it is because of a fairly innocent reason: it has become a kind of art for art’s sake, prone to self-indulgent game theory. But almost every economic game serves to support an economic policy.  Read more…

New Classical macroeconomists — people having their heads fuddled with nonsense

February 11, 2018 13 comments

from Lars Syll

McNees documented the radical break between the 1960s and 1970s. The question is: what are the possible responses that economists and economics can make to those events?

One possible response is that of Professors Lucas and Sargent. They describe what happened in the 1970s in a very strong way with a polemical vocabulary reminiscent of Spiro Agnew. Let me quote some phrases that I culled from the paper: “wildly incorrect,” “fundamentally flawed,” “wreckage,” “failure,” “fatal,” “of no value,” “dire implications,” “failure on a grand scale,” “spectacular recent failure,” “no hope” … I think that Professors Lucas and Sargent really seem to be serious in what they say, and in turn they have a proposal for constructive research that I find hard to talk about sympathetically. They call it equilibrium business cycle theory, and they say very firmly that it is based on two terribly important postulates — optimizing behavior and perpetual market clearing. When you read closely, they seem to regard the postulate of optimizing behavior as self-evident and the postulate of market-clearing behavior as essentially meaningless. I think they are too optimistic, since the one that they think is self-evident I regard as meaningless and the one that they think is meaningless, I regard as false. The assumption that everyone optimizes implies only weak and uninteresting consistency conditions on their behavior. Anything useful has to come from knowing what they optimize, and what constraints they perceive. Lucas and Sargent’s casual assumptions have no special claim to attention …   Read more…