Archive

Archive for the ‘Uncategorized’ Category

What about the white working-class?

September 27, 2016 Leave a comment

from David Ruccio

We can thank Donald Trump for one thing: he’s put the white working-class on the political map.*

In recent months, we’ve seen a veritable flood of articles, polls, and surveys about the characteristics, conditions, and concerns of white working-class voters—all with the premise that the white working-class is fundamentally different from the rest of Americans (those who are not working-class or not white).

But why are the members of the white working-class attracting so much attention? My sense is that they both represent a threat—because many plan to vote for Trump and, more generally, reject much elite opinion (including, but not limited to, Trump)—and, at the same time, are assumed to be a dying breed—as the U.S. working-class becomes more female, more racially and ethnically diverse, and increasingly employed in non-manufacturing jobs. So, the argument goes, the white working-class, supposedly radically different from the rest of Americans, is motivated by fear and resentment occasioned by a loss of identity and standing.**

Hence the curiosity—best exemplified by a new CNN/Kaiser Family Foundation [ht: ja] poll—about what white working-class Americans think. The results of the poll are interesting, if only because on many issues (aside from support for or opposition to Trump and immigration) the white working-class holds views that are not all that different from those of other whites, blacks, and Hispanics.  Read more…

Inequality: the very long run

September 26, 2016 Leave a comment

In Nature, Branko Milanovic published an interesting article about (very) long-run cycles in inequality, using among other metrics the wage-rent quotient as an indicator of inequality (wage income relative to the income of landowners). See the first graph. I can actually add a little to this. I’ve extended the Dutch (Frisian) wage/rent series published earlier on this blog backward to 1697 and forward to 1862 (below). Read more…

Liberal trickledown economics

September 26, 2016 5 comments

from David Ruccio

Has the policy consensus on economics fundamentally changed in recent years?

To read Mike Konczal it has. I can’t say I’m convinced. While some of the details may have changed, I still think we’re talking about different—liberal and conservative—versions of the same old trickledown economics.

But first Konczal’s argument. He begins with a pretty good summary of the policy consensus before the crash of 2007-08:

Before the crash, complacent Democrats, whatever their disagreements with their Republican peers, tended to agree with them that the economy was largely self-correcting. The Federal Reserve possessed the tools to nudge the economy to full employment, they thought. What’s more, government programs, while sometimes a necessary evil, were likely to be an inefficient drag compared with the private market. Inequality was something to worry about, sure, but hardly a crisis, and policies were correspondingly timid and market-focused.

And it’s true: the debate about the conditions and consequences of the crash—after Occupy Wall Street, in the midst of the Second Great Depression—challenged that consensus by focusing much more attention on inequality, by disrupting the idea that the growing gap between rich and poor is somehow natural and necessary, and by calling into question the idea that capitalist markets are self-stabilizing and that full employment can be guaranteed by relying on markets.  Read more…

Stiglitz and the demise of marginal productivity theory

September 25, 2016 5 comments

from Lars Syll

Today the trend to greater equality of incomes which characterised the postwar period has been reversed. Inequality is now rising rapidly. Contrary to the rising-tide hypothesis, the rising tide has only lifted the large yachts, and many of the smaller boats have been left dashed on the rocks. This is partly because the extraordinary growth in top incomes has coincided with an economic slowdown.

The trickle-down notion—along with its theoretical justification, marginal productivity theory—needs urgent rethinking. That theory attempts both to explain inequality—why it occurs—and to justify it—why it would be beneficial for the economy as a whole. This essay looks critically at both claims. It argues in favour of alternative explanations of inequality, with particular reference to the theory of rent-seeking and to the influence of institutional and political factors, which have shaped labour markets and patterns of remuneration. And it shows that, far from being either necessary or good for economic growth, excessive inequality tends to lead to weaker economic performance. In light of this, it argues for a range of policies that would increase both equity and economic well-being.

Joseph Stiglitz

Mainstream economics textbooks usually refer to the interrelationship between technological development and education as the main causal force behind increased inequality. If the educational system (supply) develops at the same pace as technology (demand), there should be no increase, ceteris paribus, in the wage ratio between high-income (highly educated) groups and low-income (low-education) groups. In the race between technology and education, the proliferation of skill-biased technological change has, however, allegedly increased the premium for the highly educated group.  Read more…
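To see the arithmetic behind this ‘race’, here is a minimal sketch in Python (the elasticity of substitution and growth rates are assumptions chosen purely for illustration, not estimates): if education (relative supply) keeps pace with skill-biased technology (relative demand), the wage premium stays flat; if it lags, the premium rises.

```python
import math

# A sketch of the 'race between technology and education' logic.
# All parameter values are assumptions chosen for illustration only.
sigma = 1.5            # elasticity of substitution between skill groups (assumed)
demand_growth = 0.04   # annual growth of relative demand for educated labour (technology)
years = 40

for label, supply_growth in [("education keeps pace", 0.04),
                             ("education lags behind", 0.02)]:
    log_premium = 0.4  # initial log wage ratio, highly educated vs. low-educated (assumed)
    for _ in range(years):
        # canonical relation: change in log(w_H/w_L) = (demand growth - supply growth) / sigma
        log_premium += (demand_growth - supply_growth) / sigma
    print(f"{label}: wage premium after {years} years = {math.exp(log_premium):.2f}")
```

With equal growth rates the premium stays at roughly 1.5; with lagging supply it climbs towards 2.5, which is the whole ‘skill-biased technological change’ story in two lines of arithmetic.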

How has unemployment been considered in mainstream macroeconomic models?

September 24, 2016 1 comment

from Maria Alejandra Madi

From the 1950s onwards, the macroeconomic models of the neoclassical synthesis, based on a system of simultaneous equations, focused on the interaction between the market for goods and services and the money market in the context of a general equilibrium analysis. According to John Hicks (1904-1989), in the general case the capitalist economy is at the full-employment level of output. The underlying employment theory is based on the demand and supply of labour in a competitive market. In fact, this neoclassical approach supposes that price-adjustment mechanisms in markets could guarantee full employment. In some specific cases, however, the general equilibrium implied by the IS-LM model would not necessarily correspond to a full-employment level of output. This situation, called unemployment equilibrium, would be the result of market imperfections such as rigid money wages, interest-inelastic investment demand, and income-inelastic money demand, among others.
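For readers who have not seen it spelled out, here is a minimal sketch of the kind of simultaneous-equation system involved (Python, with invented parameter values): a linear IS curve for the goods market and an LM curve for the money market, solved jointly for output and the interest rate at a fixed price level.

```python
import numpy as np

# Minimal linear IS-LM sketch; all parameter values are invented for illustration.
# IS (goods market):  Y = c0 + c1*Y + I0 - b*r + G   ->  (1 - c1)*Y + b*r = c0 + I0 + G
# LM (money market):  M/P = k*Y - h*r                ->  k*Y - h*r = M/P

c0, c1 = 200.0, 0.6    # autonomous consumption, marginal propensity to consume
I0, b  = 300.0, 40.0   # autonomous investment, interest sensitivity of investment
G      = 250.0         # government spending
k, h   = 0.5, 60.0     # income and interest sensitivity of money demand
M, P   = 500.0, 1.0    # money supply, (fixed) price level

A = np.array([[1 - c1,  b],
              [k,      -h]])
d = np.array([c0 + I0 + G, M / P])

Y, r = np.linalg.solve(A, d)   # joint equilibrium of the two markets
print(f"equilibrium output Y = {Y:.1f}, interest rate r = {r:.2f}")
```

Nothing in the solution forces Y to coincide with full-employment output; with rigid money wages or a nearly interest-inelastic investment demand (a very small b), the system can settle at exactly the kind of ‘unemployment equilibrium’ described above.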

In the 1960s, mainstream macroeconomic models expanded the analysis of the negative correlation between inflation and unemployment. This correlation was based on the conclusions drawn from an empirical study, the Phillips curve, about the negative relationship between the rate of unemployment and the rate of variation of nominal wages in England at the turn of the 20th century. The attempt to incorporate the Phillips curve (the trade-off between inflation and unemployment) into the analysis of labour-market dynamics turned out to put the emphasis on the role of nominal wages in determining prices and, ultimately, on the demands of workers that put pressure on inflation.  Read more…
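A minimal sketch of the empirical relation being imported into these models (the functional form and coefficients below are invented for illustration; they are not Phillips’s own estimates):

```python
# Illustrative wage Phillips curve: nominal wage growth falls as unemployment rises.
# The coefficients are assumptions for illustration, not estimated values.
def wage_growth(u, a=-0.9, b=9.6):
    """Annual % change of nominal wages as a function of the unemployment rate u (in %)."""
    return a + b / u

for u in (2, 4, 6, 8):
    print(f"unemployment {u}% -> nominal wage growth {wage_growth(u):.1f}%")
```

The negative slope is the whole point: the lower unemployment is, the faster nominal wages (and, through costs, prices) rise, which is the trade-off policymakers of the 1960s thought they could exploit.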

Hold the champagne

September 23, 2016 1 comment

from David Ruccio


Last week, to judge by the commentary on the latest Census Bureau report, Income and Poverty in the United States: 2015 (pdf), you’d think the fountain of broadly shared economic prosperity had just been discovered.

Binyamin Appelbaum is a good example:   Read more…

Wren-Lewis trivializing Romer’s critique

September 23, 2016 3 comments

from Lars Syll

As yours truly wrote last week, there has been much discussion going on in academic economics about Paul Romer’s recent critique of ‘modern’ macroeconomics.

Now Oxford professor Simon Wren-Lewis has a blog post up arguing that Romer’s critique is

unfair and wide of the mark in places … Paul’s discussion of real effects from monetary policy, and the insistence on productivity shocks as business cycle drivers, is pretty dated … Yet it took a long time for RBC models to be replaced by New Keynesian models, and you will still see RBC models around. Elements of the New Classical counter revolution of the 1980s still persist in some places … The impression Paul Romer’s article gives, might just have been true in a few years in the 1980s before New Keynesian theory arrived. Since the 1990s New Keynesian theory is now the orthodoxy, and is used by central banks around the world.

Now, this rather unsuccessful attempt to disarm the real force of Romer’s critique should come as no surprise to anyone who has been following Wren-Lewis’ writings over the years.

In a recent paper — Unravelling the New Classical Counter Revolution — Wren-Lewis writes approvingly about all the ‘impressive’ theoretical insights New Classical economics has brought to macroeconomics:  Read more…

Phlogiston, the identification problem, and the state of macroeconomics

September 22, 2016 5 comments

from David Ruccio

The other day, I argued (as I have many times over the years) that contemporary mainstream macroeconomics is in a sorry state.

Mainstream macroeconomists didn’t predict the crash. They didn’t even include the possibility of such a crash within their theory or models. And they certainly didn’t know what to do once the crash occurred.

I’m certainly not the only one who is critical of the basic theory and models of contemporary mainstream macroeconomics. And, at least recently (and, one might say, finally), many of the other critics are themselves mainstream economists—such as MIT emeritus professor and former IMF chief economist Olivier Blanchard (pdf), who has noted that the models that are central to mainstream economic research—so-called dynamic stochastic general equilibrium models—are “seriously flawed.”

Now, one of the most mainstream of the mainstream, Paul Romer (pdf), soon to be chief economist at the World Bank, has taken aim at mainstream macroeconomics.* You can get a taste of the severity of his criticisms from the abstract:  Read more…

Why critique in economics is so important

September 21, 2016 14 comments

from Lars Syll

Some of the economists who agree about the state of macro in private conversations will not say so in public. This is consistent with the explanation based on different prices. Yet some of them also discourage me from disagreeing openly, which calls for some other explanation.

They may feel that they will pay a price too if they have to witness the unpleasant reaction that criticism of a revered leader provokes. There is no question that the emotions are intense. After I criticized a paper by Lucas, I had a chance encounter with someone who was so angry that at first he could not speak. Eventually, he told me, “You are killing Bob.”

But my sense is that the problem goes even deeper than avoidance. Several economists I know seem to have assimilated a norm that the post-real macroeconomists actively promote – that it is an extremely serious violation of some honor code for anyone to criticize openly a revered authority figure – and that neither facts that are false, nor predictions that are wrong, nor models that make no sense matter enough to worry about …

Science, and all the other research fields spawned by the enlightenment, survive by “turning the dial to zero” on these innate moral senses. Members cultivate the conviction that nothing is sacred and that authority should always be challenged … By rejecting any reliance on central authority, the members of a research field can coordinate their independent efforts only by maintaining an unwavering commitment to the pursuit of truth, established imperfectly, via the rough consensus that emerges from many independent assessments of publicly disclosed facts and logic; assessments that are made by people who honor clearly stated disagreement, who accept their own fallibility, and relish the chance to subvert any claim of authority, not to mention any claim of infallibility.

Paul Romer

This is part of why yours truly appreciates Romer’s article, and even finds it ‘brave.’ Everyone knows what he says is true, but few have the courage to openly speak and write about it. The ‘honour code’ in academia certainly needs revision.  Read more…

Mind the gaps: compensation and productivity (3 graphs)

September 21, 2016 Leave a comment

from David Ruccio


According to the norms of both neoclassical economic theory and capitalism itself, workers’ wages should increase at roughly the same rate as their productivity.* Clearly, in recent years they have not.  Read more…

The anniversary of Lehman and men who don’t work

September 21, 2016 7 comments

from Dean Baker

Last week marked the eighth anniversary of the collapse of Lehman Brothers, the huge Wall Street investment bank. This bankruptcy sent financial markets into a panic with the remaining investment banks, like Goldman Sachs and Morgan Stanley, set to soon topple. The largest commercial banks, like Citigroup and Bank of America, were not far behind on the death watch.

The cascade of collapses was halted when the Fed and Treasury went into full-scale bailout mode. They lent trillions of dollars to failing banks at below-market interest rates. They also promised the markets that there would be “no more Lehmans,” to use former Treasury Secretary Timothy Geithner’s term.

This promise was incredibly valuable in a time of crisis. It meant that investors could lend freely to Goldman and Citigroup without fear that their loans would not be repaid — they had the Treasury and the Fed standing behind them.

The public has every right to be furious about this set of events eight years ago, as well as about what has happened subsequently. First, everything about the crisis caught the country’s leading economists by surprise. Somehow, they could neither see an $8 trillion housing bubble nor understand how its collapse would seriously damage the economy. Since this bubble was clearly driving the economy prior to the crash, it is difficult to envision what these economists thought would replace the demand lost when the bubble burst.  Read more…

Are young men only watching porn nowadays? Not in Iceland (where they have jobs)

September 20, 2016 Leave a comment

In the USA there is an amusing discussion going on about the decline of the participation rate of (young) men. Some people state that this might be caused by digital amusement. Dean Baker rightly points out that we should not restrict this discussion to American not-yet-dads. I want to make the case that we should not even restrict this discussion to the USA. Below are three graphs (source: Eurostat) which show that:

A) The average participation rate in Europe increased, even after 2008 (the participation rate is not the employment rate; the difference between them is unemployment, as the short arithmetic sketch after this list spells out. The second graph shows the non-participation rate, which declines, meaning that the participation rate increases).

B) But there is a problem with the participation rate of young people (the third graph; more remarkable, however, is the increase in the 54+ participation rate…)

C) Differences between countries with regard to the level of young adult participation overwhelm this decline (the first graph), which means that digital amusement cannot be held responsible for low participation rates. Remarkably, Iceland (‘Is’ in graph 3), which has very low overall unemployment, does best. Or is this in fact not remarkable at all…? Why, for instance, did all the hobos in the USA suddenly disappear after 1939? And where did they come from in the first place?  Read more…
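Here is the short arithmetic sketch promised above (the numbers are made up; only the definitions matter): the participation rate counts the employed plus the unemployed-but-searching, the employment rate counts only the employed, and the gap between the two is the unemployed as a share of the working-age population.

```python
# Made-up numbers; only the definitions matter.
working_age_population = 1000
employed = 700
unemployed = 50   # without a job but actively looking, so still in the labour force

labour_force = employed + unemployed                          # 750
participation_rate = labour_force / working_age_population    # 0.75
employment_rate = employed / working_age_population           # 0.70
unemployment_rate = unemployed / labour_force                 # ~0.067

# The participation-employment gap equals the unemployed share of the population:
assert abs((participation_rate - employment_rate)
           - unemployed / working_age_population) < 1e-12

print(participation_rate, employment_rate, round(unemployment_rate, 3))
```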

Chicago drivel — a sure way to get a ‘Nobel prize’ in economics

September 20, 2016 20 comments

from Lars Syll

In 2007 Thomas Sargent gave a graduation speech at the University of California at Berkeley, giving the grads “a short list of valuable lessons that our beautiful subject teaches”:

1. Many things that are desirable are not feasible.
2. Individuals and communities face trade-offs.
3. Other people have more information about their abilities, their efforts, and their preferences than you do.
4. Everyone responds to incentives, including people you want to help. That is why social safety nets don’t always end up working as intended.
5. There are trade offs between equality and efficiency.
6. In an equilibrium of a game or an economy, people are satisfied with their choices. That is why it is difficult for well meaning outsiders to change things for better or worse.
7. In the future, you too will respond to incentives. That is why there are some promises that you’d like to make but can’t. No one will believe those promises because they know that later it will not be in your interest to deliver. The lesson here is this: before you make a promise, think about whether you will want to keep it if and when your circumstances change. This is how you earn a reputation.
8. Governments and voters respond to incentives too. That is why governments sometimes default on loans and other promises that they have made.
9. It is feasible for one generation to shift costs to subsequent ones. That is what national government debts and the U.S. social security system do (but not the social security system of Singapore).
10. When a government spends, its citizens eventually pay, either today or tomorrow, either through explicit taxes or implicit ones like inflation.
11. Most people want other people to pay for public goods and government transfers (especially transfers to themselves).
12. Because market prices aggregate traders’ information, it is difficult to forecast stock prices and interest rates and exchange rates.

Reading through this list of “valuable lessons,” things suddenly fall into place.

This kind of self-righteous neoliberal drivel has again and again been praised and prized. And not only by econ bloggers and right-wing think-tanks.

Out of the seventy-six laureates who have been awarded ‘The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel,’ twenty-eight have been affiliated with the University of Chicago. The world is really a small place when it comes to economics …

Rising tides and marginal productivity theory

September 19, 2016 4 comments

from David Ruccio

A constant refrain among mainstream economists and pundits since the crash of 2007-08 has been that, while the state of mainstream macroeconomics is poor, all is well within microeconomics.

The problems within macroeconomics are, of course, well known: Mainstream macroeconomists didn’t predict the crash. They didn’t even include the possibility of such a crash within their theory or models. And they certainly didn’t know what to do once the crash occurred.

What about microeconomics, the area of mainstream economics that was supposedly untouched by all the failures in the other half of the official discipline? Well, as it turns out, there are major problems there, too—especially given the obscene levels of inequality that preceded the crash and have resumed since it erupted, not to mention the slow economic growth that rising inequality was supposed to solve.

In particular, as I have written many times over the years, the idea that a rising tide lifts all boats—along with its theoretical justification, marginal productivity theory—needs to be questioned and ultimately abandoned.

But you don’t have to take my word for it. Just read the latest essay by Nobel Prize-winning economist Joseph Stiglitz.

Stiglitz first explains that neoclassical economists developed marginal productivity theory as a direct response to Marxist claims that the returns to capital are based on the exploitation of workers.  Read more…

Economists’ infatuation with immense assumptions

September 18, 2016 Leave a comment

from Lars Syll

Peter Dorman is one of those rare economists whom it is always a pleasure to read. Here his critical eye is focussed on economists’ infatuation with homogeneity and averages:

You may feel a gnawing discomfort with the way economists use statistical techniques. Ostensibly they focus on the difference between people, countries or whatever the units of observation happen to be, but they nevertheless seem to treat the population of cases as interchangeable—as homogenous on some fundamental level. As if people were replicants.

You are right, and this brief talk is about why and how you’re right, and what this implies for the questions people bring to statistical analysis and the methods they use.

Our point of departure will be a simple multiple regression model of the form

y = β₀ + β₁x₁ + β₂x₂ + … + ε

where y is an outcome variable, x₁ is an explanatory variable of interest, the other x’s are control variables, the β’s are coefficients on these variables (or a constant term, in the case of β₀), and ε is a vector of residuals. We could apply the same analysis to more complex functional forms, and we would see the same things, so let’s stay simple.

What question does this model answer? It tells us the average effect that variations in x₁ have on the outcome y, controlling for the effects of other explanatory variables. Repeat: it’s the average effect of x₁ on y.

This model is applied to a sample of observations. What is assumed to be the same for these observations? (1) The outcome variable y is meaningful for all of them. (2) The list of potential explanatory factors, the x’s, is the same for all. (3) The effects these factors have on the outcome, the β’s, are the same for all. (4) The proper functional form that best explains the outcome is the same for all. In these four respects all units of observation are regarded as essentially the same.

Now what is permitted to differ across these observations? Simply the values of the x’s and therefore the values of y and ε. That’s it.

Thus measures of the difference between individual people or other objects of study are purchased at the cost of immense assumptions of sameness. It is these assumptions that both reflect and justify the search for average effects …

In the end, statistical analysis is about imposing a common structure on observations in order to understand differentiation. Any structure requires assuming some kinds of sameness, but some approaches make much more sweeping assumptions than others. An unfortunate symbiosis has arisen in economics between statistical methods that excessively rule out diversity and statistical questions that center on average (non-diverse) effects. This is damaging in many contexts, including hypothesis testing, program evaluation, forecasting—you name it …

The first step toward recovery is admitting you have a problem. Every statistical analyst should come clean about what assumptions of homogeneity are being made, in light of their plausibility and the opportunities that exist for relaxing them.
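To see Dorman’s point in miniature, here is a sketch (Python/NumPy, with made-up data) of a population in which the effect of x₁ genuinely differs across two groups; a pooled regression of the form above dutifully reports a single ‘average effect’ and says nothing about the heterogeneity behind it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Two groups with genuinely different effects of x1 on y (invented numbers).
group = rng.integers(0, 2, size=n)           # 0 or 1
true_beta1 = np.where(group == 0, 0.5, 2.0)  # heterogeneous 'true' coefficients
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                      # a control variable
y = 1.0 + true_beta1 * x1 + 0.3 * x2 + rng.normal(scale=0.5, size=n)

# Pooled OLS:  y = b0 + b1*x1 + b2*x2 + e   (estimated by least squares)
X = np.column_stack([np.ones(n), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

print("pooled estimate of b1:", round(coef[1], 2))  # close to 1.25, the average of 0.5 and 2.0
print("true group effects:    0.5 and 2.0")
```

The pooled coefficient lands roughly halfway between the two true effects, which is exactly the ‘average effect’ Dorman describes; nothing in the standard output flags that no individual in this sample actually has that coefficient.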

Read more…

Insider critiques of neoclassical macro models

September 17, 2016 17 comments

Paul Romer has just published a devastating critique of DSGE (or, in his parlance, ‘Post Real’) macro models. He’s not the first important insider to write an article like this. Look here for Paul Krugman, ‘How did economists get it so wrong‘. Look here for Willem Buiter, ‘The unfortunate uselessness of most ‘state of the art’ academic monetary economics’. Look here for Charles Goodhart, ‘Whatever became of the monetary aggregates‘. And look here for the insider of insiders, Olivier Blanchard, ‘Do DSGE models have a future‘ (His analysis: NO! His conclusion: yes). Krugman, Buiter and Goodhart especially are extremely eloquent, and their pieces are a joy to read.

Two questions: is there a common denominator to these fierce critiques? And does ‘your humble narrator’ have something to add?

The answer to the first question: yes, there is. All authors mention a contempt for reality. All mention obfuscating math. And all mention the impossibility of asking questions about monetary instability when even ‘monetary’ models do not model money (and debt). My summary: the models in question disable economists from analysing economic reality instead of enabling them to do so.

Do I have something to add? Yes. Of the authors above, Romer is clearest (not the same as: clear) about the fact that a scientific paradigm does not only need theory but also a matching system of measurement, though Goodhart too clearly mentions (in 2007!) that not paying attention to the monetary aggregates (money but, in his view, also credit and debt) was quite a mistake. For quite some time, economic measurement and theory developed more or less in tandem. Veblen’s best students became the heads of organizations such as the National Bureau of Economic Research and the Bureau of Labor Statistics and developed the Flow of Funds statistics. Keynes himself established the British Office for National Statistics (yes, the present-day ONS), as he needed national accounts data which were not available. In the Netherlands, Tinbergen established the Centraal Planbureau, a ‘fiscal watchdog’ with a strong emphasis on empirics. Look here and here for more information about this. Instead of at least trying to match these efforts and directly measure the variables they model, such as ‘the natural rate of interest’, ‘natural unemployment’ and ‘utility’, modern macroeconomists chose to assume that these variables are emergent properties not of the economy but of the models (read Romer). My point: earlier generations of economists did a much better job and did estimate variables consistent with their ideas and models, which made their efforts scientific. Let’s stand upon their shoulders!

US income inequality began to worsen after 1970

September 16, 2016 4 comments

Inheritance taxes, equity, and the gift

September 16, 2016 Leave a comment

from David Ruccio

You’d think a Harvard economics professor would be able to do better than invoke horizontal equity as the sole argument for reducing the U.S. inheritance tax.

But not Gregory Mankiw, who uses the silly parable of the Frugals and the Profligates to make his case for a low tax rate on the estates of the wealthiest 0.2 percent of Americans who actually owe any estate tax.*

I’ll leave it to readers to judge whether or not it’s worth spending the time to compose a column on a tax that affects such a tiny percentage of rich—very rich—American households. And then to argue not for raising the tax, but for lowering it.

Me, I want to raise a few more general issues about how mainstream economists like Mankiw think about inheritance taxes.

First, Mankiw presents one principle—horizontal equity, the “equal treatment of equals”—and never even mentions the other major tax principle—vertical equity, the “unequal treatment of unequals,” the idea that people with higher incomes should pay more taxes. Certainly, on the vertical criterion, those who receive large inheritances (for doing nothing more than being born into and raised within the right family) should pay taxes at a much higher rate than those who do not.  Read more…

Has economics — really — become more empirical?

September 15, 2016 1 comment

from Lars Syll

In Economics Rules (OUP 2015), Dani Rodrik maintains that ‘imaginative empirical methods’ — such as game-theoretical applications, natural experiments, field experiments, lab experiments, and RCTs — can help us answer questions concerning the external validity of economic models. In Rodrik’s view they are more or less tests of ‘an underlying economic model’ and enable economists to make the right selection from the ever-expanding ‘collection of potentially applicable models.’ Writes Rodrik:

Another way we can observe the transformation of the discipline is by looking at the new areas of research that have flourished in recent decades. Three of these are particularly noteworthy: behavioral economics, randomized controlled trials (RCTs), and institutions … They suggest that the view of economics as an insular, inbred discipline closed to the outside influences is more caricature than reality.

I beg to differ. When looked at carefully, there are in fact few real reasons to share Rodrik’s optimism about this ‘empirical turn’ in economics.

Field studies and experiments face the same basic problem as theoretical models — they are built on rather artificial conditions and have difficulties with the ‘trade-off’ between internal and external validity. The more artificial the conditions, the greater the internal validity, but also the less the external validity. The more we rig experiments, field studies, and models to avoid ‘confounding factors’, the less the conditions are reminiscent of the real ‘target system.’ You could of course discuss field studies vs. experiments vs. theoretical models in terms of realism — but the nodal issue is not that; it is basically about how economists using different isolation strategies in different ‘nomological machines’ attempt to learn about causal relationships. I have strong doubts about the generalizability of all three research strategies, because the probability is high that causal mechanisms are different in different contexts, and lack of homogeneity/stability/invariance doesn’t give us warranted export licenses to ‘real’ societies or economies.

Read more…

Men who don’t work: when did economists stop being wrong about the economy?

September 15, 2016 9 comments

from Cherrie Bucknor and Dean Baker

The 4.9 percent unemployment rate is getting close to most economists’ estimates of full employment. In fact, it is below many estimates from recent years and some current ones. Many policy types, including some at the Federal Reserve Board, take this as evidence that it’s necessary to raise interest rates in order to keep the unemployment rate from falling too low and triggering a round of spiraling inflation.

The argument on the other side is, first and foremost, that there is zero evidence that inflation is about to start spiraling upward. The Fed’s key measure, the core personal consumption expenditure deflator, remains well below the Fed’s target and shows no evidence of acceleration. The same is true of most wage growth measures.

But there is also good reason for skepticism about the current unemployment rate as a useful measure of labor market tightness. Other measures of labor market tightness, such as the percentage of workers employed part-time for economic reasons and the share of unemployment due to voluntary quits, remain close to recession levels.

Most importantly, there has been a sharp drop in labor force participation rates. As a result, in spite of the relatively low unemployment rate, the employment rate is still close to 3.0 percentage points below its pre-recession level. This story holds up even if we restrict ourselves to prime-age workers (between the ages of 25 and 54), whose employment-to-population ratio (EPOP) is close to 2.0 percentage points below pre-recession levels and almost 4.0 percentage points below 2000 peaks.[1]  Read more…