Archive for the ‘The Economy’ Category

World employers report

April 24, 2018 1 comment

from David Ruccio

The history of capitalism is actually a combination of two histories: it’s a history of employers attempting to hire workers and develop new technologies to make profits and expand the reach of capitalism; it’s also a history of workers banding together to improve wages and working conditions and imagine ways of moving beyond capitalism.

The World Bank’s World Development Report, currently in draft form, comes down firmly on the side of employers and their historical role.

The theme of the 2019 report is the “changing nature of work.” As envisioned by the report’s authors,

Work is constantly being reshaped by economic progress. Society evolves as technology advances, new ways of production are adopted, markets integrate. While this process is continuous, certain technological changes have the potential for greater impact, and provoke more attention than others. The changes reshaping work today are fundamental and long-term, driven by technological progress, globalization, shifting demographics, urbanization and climate change.

Beneath the typically lofty but vague rhetoric, the two trends that haunt the report are the increasing gap between the top 1 percent and everyone else and the jobs that will be eliminated with the use of automation and other labor-saving technologies—leading to “rising concerns with unemployment, inequality and unfairness that are accompanying these changes.”  Read more…

Global income distribution 1800, 1975 and 2010

April 22, 2018 3 comments

The tractability hoax in modern economics

April 22, 2018 7 comments

from Lars Syll

While the paternity of the theoretical apparatus underlying the new neoclassical synthesis in macro is contested, there is wide agreement that the methodological framework was largely architected by Robert Lucas … Bringing a representative agent meant foregoing the possibility to tackle inequality, redistribution and justice concerns. Was it deliberate? How much does this choice owe to tractability? What macroeconomists were chasing, in these years, was a renewed explanation of the business cycle. They were trying to write microfounded and dynamic models …

Rational expectations imposed cross-equation restrictions, yet estimating these new models substantially raised the computing burden. Assuming a representative agent mitigated computational demands, and allowed macroeconomists to get away with general equilibrium aggregate issues: it made new-classical models analytically and computationally tractable …

Was tractability the main reason why Lucas embraced the representative agent (and market clearing)? Or could he have improved tractability through alternative hypotheses, leading to opposed policy conclusions? … Some macroeconomists may have endorsed the new class of Lucas-critique-proof models because they liked its policy conclusions. Others may have retained some hypotheses, then some simplifications, “because it makes the model tractable.” And while the limits of simplifying assumptions are often emphasized by those who propose them, as they spread, caveats are forgotten. Tractability restricts the range of accepted models and prevents economists from discussing some social issues, and with time, from even “seeing” them. Tractability ‘filters’ economists’ reality … The aggregate effect of “looking for tractable models” is unknown, and yet it is crucial to understand the current state of economics.

Beatrice Cherrier

Read more…

Harvard teaches us that hedge fund managers get rich even when they mess up

April 21, 2018 3 comments

from Dean Baker

While we all know that it is important for people to get a good education if they want to do well in today’s economy, it remains the case that who you know matters much more than what you know. Harvard has taught us this lesson well with the management of its endowment in recent years.

Businessweek reported that the returns on Harvard’s endowment over the last decade averaged just 4.4 percent annually. This performance trailed both stock index returns and the returns received by other major university endowments. This means that Harvard would have had considerably more money to pay its faculty and staff if it had simply bought a Vanguard index fund.
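The cost of that underperformance compounds quickly over a decade. A minimal sketch of the arithmetic (the 7 percent index return and the starting endowment value are illustrative assumptions, not figures from the article):

```python
# Compare ten years of compounding at Harvard's reported 4.4% annual
# return with a hypothetical broad index return (assumed 7%/yr here,
# purely for illustration).
def compound(principal, annual_rate, years):
    """Value of `principal` after `years` of annual compounding."""
    return principal * (1 + annual_rate) ** years

endowment_start = 37.0  # $ billions, an illustrative starting value
harvard = compound(endowment_start, 0.044, 10)
index = compound(endowment_start, 0.07, 10)
print(f"At 4.4%: ${harvard:.1f}bn; at 7.0%: ${index:.1f}bn; "
      f"gap: ${index - harvard:.1f}bn")
```

Even a modest annual shortfall, compounded over ten years, opens a gap of billions of dollars.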

If this were just bad luck, one could be sympathetic, but according to Businessweek, the school paid $242 million to the people who managed its money over the period from 2010 to 2014, an average of $48.4 million annually. While Harvard’s endowment fared poorly, these money managers did very well, with the top-paid managers undoubtedly pocketing paychecks well in excess of $1 million a year (approximately 8,000 food stamp months). In other words, Harvard’s money managers were paid huge sums to lose the school money. Nice work if you can get it.

It is difficult to understand how Harvard, or any university, could pay so much money to lose the school money. Harvard’s money managers surely have good credentials, and probably even good track records with their past performance. How could the university write contracts that allow these people to get huge paychecks that end up costing Harvard money due to their poor investment decisions?  Read more…

Family wealth redistribution in US from 1989 to 2013

April 21, 2018 2 comments

Source: https://ftalphaville.ft.com/2018/01/04/2197227/eight-charts-on-inequality-in-the-us/

Sometimes we do not know because we cannot know

April 21, 2018 36 comments

from Lars Syll

Some time ago, Bank of England’s Andrew G Haldane and Benjamin Nelson presented a paper with the title Tails of the unexpected. The main message of the paper was that we should not let ourselves be fooled by randomness:

The normal distribution provides a beguilingly simple description of the world. Outcomes lie symmetrically around the mean, with a probability that steadily decays. It is well-known that repeated games of chance deliver random outcomes in line with this distribution: tosses of a fair coin, sampling of coloured balls from a jam-jar, bets on a lottery number, games of paper/scissors/stone. Or have you been fooled by randomness?

Normality has been an accepted wisdom in economics and finance for a century or more. Yet in real-world systems, nothing could be less normal than normality. Tails should not be unexpected, for they are the rule. As the world becomes increasingly integrated – financially, economically, socially – interactions among the moving parts may make for potentially fatter tails. Catastrophe risk may be on the rise.

If public policy treats economic and financial systems as though they behave like a lottery – random, normal – then public policy risks itself becoming a lottery. Preventing public policy catastrophe requires that we better understand and plot the contours of systemic risk, fat tails and all. It also means putting in place robust fail-safes to stop chaos emerging, the sand pile collapsing, the forest fire spreading. Until then, normal service is unlikely to resume.
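Haldane and Nelson’s point about fat tails is easy to see in a quick Monte Carlo sketch, contrasting the normal distribution with a heavy-tailed Student-t (the choice of a t with 3 degrees of freedom is my assumption, purely for illustration):

```python
import math
import random

random.seed(0)
N = 200_000
df = 3  # heavy-tailed Student-t with 3 degrees of freedom

def t_draw(df):
    """One Student-t variate: N(0,1) / sqrt(chi2_df / df)."""
    z = random.gauss(0.0, 1.0)
    chi2 = random.gammavariate(df / 2.0, 2.0)  # chi-square with df dof
    return z / math.sqrt(chi2 / df)

thresh = 4.0  # a "4-sigma" event
norm_tail = sum(abs(random.gauss(0, 1)) > thresh for _ in range(N)) / N
t_tail = sum(abs(t_draw(df)) > thresh for _ in range(N)) / N
print(f"P(|X| > 4): normal ~ {norm_tail:.5f}, t(3) ~ {t_tail:.5f}")
```

Under the normal, 4-sigma events are vanishingly rare; under the fat-tailed distribution they turn up orders of magnitude more often — which is the sense in which treating a fat-tailed system as “normal” courts catastrophe.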

Since I think this is a great paper, it merits a couple of comments.

Read more…

Changing global income distribution

April 20, 2018 2 comments

DSGE models — overconfident macroeconomic story-telling

April 19, 2018 5 comments

from Lars Syll

A recent paper by Christiano, Eichenbaum and Trabandt (C.E.T.) on Dynamic Stochastic General Equilibrium Models (DSGEs) has generated quite a reaction in the blogosphere …

Bradford DeLong points out that new Keynesian models were constructed to show that old Keynesian and old Monetarist policy conclusions were relatively robust, and not blown out of the water by rational expectations … The DSGE framework was then constructed so that new Keynesians could talk to RBCites. None of this has, so far, materially advanced the project of understanding the macroeconomic policy-relevant emergent properties of really existing industrial and post-industrial economies …  Read more…

Inter-generational wealth redistribution in the USA 1989 to 2016

April 19, 2018 Leave a comment

Top 10% national income share across the world 1980 to 2016

April 18, 2018 Leave a comment

China’s “Currency Devaluation Game”

April 17, 2018 2 comments

from Dean Baker

Donald Trump was apparently angry about the value of the Russian ruble and the Chinese yuan against the dollar. He complained in a tweet yesterday that both are playing the “Currency Devaluation game.”

Neil Irwin rightly points out that the complaint against Russia is bizarre, both because we don’t have much trade with Russia and because the most obvious reason its currency is falling is sanctions pushed by the United States and other western countries. The story with China is a bit more complicated.

China’s currency has actually been rising against the dollar over the last year, with the yuan going from 14.5 cents to 15.9 cents. So the claim that China is devaluing its currency is pretty obviously wrong.

There is, however, an issue of whether China is still deliberately depressing its currency against the dollar. As Irwin notes, China is no longer buying large amounts of dollars and other reserves, as it did in the last decade. This buying raised the value of the dollar and kept down the value of the yuan.

However, China still holds a massive stock of foreign reserves, with its central bank holding more than $3 trillion in reserves and its sovereign wealth fund holding another $1.5 trillion in foreign assets. These huge stocks of assets have the effect of holding down the value of the yuan in the same way that the Fed’s holdings of assets keep down interest rates.  Read more…

Shortcomings of regression analysis

April 17, 2018 8 comments

from Lars Syll

Distinguished social psychologist Richard E. Nisbett has a somewhat atypical aversion to multiple regression analysis. In his Intelligence and How to Get It (Norton 2011) he writes:

Researchers often determine the individual’s contemporary IQ or IQ earlier in life, socioeconomic status of the family of origin, living circumstances when the individual was a child, number of siblings, whether the family had a library card, educational attainment of the individual, and other variables, and put all of them into a multiple-regression equation predicting adult socioeconomic status or income or social pathology or whatever. Researchers then report the magnitude of the contribution of each of the variables in the regression equation, net of all the others (that is, holding constant all the others). It always turns out that IQ, net of all the other variables, is important to outcomes. But … the independent variables pose a tangle of causality – with some causing others in goodness-knows-what ways and some being caused by unknown variables that have not even been measured. Higher socioeconomic status of parents is related to educational attainment of the child, but higher-socioeconomic-status parents have higher IQs, and this affects both the genes that the child has and the emphasis that the parents are likely to place on education and the quality of the parenting with respect to encouragement of intellectual skills and so on. So statements such as “IQ accounts for X percent of the variation in occupational attainment” are built on the shakiest of statistical foundations. What nature hath joined together, multiple regressions cannot put asunder.

Read more…

Debt and taxes

April 16, 2018 2 comments

from David Ruccio

As federal deficits and debt grow, they end up receiving, not paying for, a larger and larger share of federal expenditures.

Tax cuts and spending increases enacted by Republicans over the past four months will lead to wider than previously expected budget deficits, according to the Congressional Budget Office. The federal budget deficit would total $804 billion this year, 43 percent higher than it had projected last summer, and exceed $1 trillion a year starting in 2020.

Larger deficits will, of course, add to the national debt: debt held by the public will hit $28.7 trillion at the end of fiscal 2028, or 96.2 percent of gross domestic product, up from 78 percent of GDP in 2018.

Those estimates assume current law will remain in effect, meaning Congress would allow some tax cuts to expire and spending caps to take effect again in the coming years. If Congress extends the tax cuts, as many Republicans want to do, the CBO predicted higher deficits and publicly held debt of about 105 percent of GDP by the end of 2028—a level exceeded only once in U.S. history, in the immediate aftermath of World War II.

So, what do these escalating deficit and debt numbers mean?

Clearly, in the first instance, the Republican deficit hawks have gone the way of moderate Republicans and all other extinct species of politicians and other mammals. They existed for decades, always in an attempt to cut entitlement programs and other public expenditures for poor and working-class Americans. But once it was possible to pass massive tax cuts for corporations and wealthy individuals and boost military spending, the deficit hawks on the Republican side of the aisle simply disappeared into the walls of Congress.*  Read more…

A typology of uncertainties

April 16, 2018 18 comments

from J.-C. Spender

There is some heavy stuff in this section – but we cannot get beyond today’s literature on managing as rational decision-making and connect with managers’ practice without engaging uncertainty. All attempts to define uncertainty must fail – by definition, for to define is to take as certain, axiomatic. Those who see uncertainty in terms of probability stand on the certainty of population statistics. Knight saw such modified certainty as ‘risk’. Yes, risk management is important, just as is distinguishing knowing definitively from knowing statistically. But the difference here is methodological and neither mode grasps Knightian uncertainty. Probability is logical/nomothetic, computable. In contrast, Knight’s notion was implicitly idiographic, the sense of an absence of certainty arising from an idiographic experience of not-knowing. Something failed, what was expected did not occur – why not? Was the causal sequence (nomothetic) adopted wrong, or did the fault lie with the situation’s idiographic characterization – its initial conditions etc.? Such questions must still be expressed in language, thus standing on what is known. Like us all, Knight struggled with Aristotle’s nomothetic/idiographic distinction, the failure to relate knowing and experiencing, the inevitable separation between the totality and immediacy of living versus explaining it with abstract concepts.  Read more…

Dysfunctionalism in US economic departments and business schools

April 15, 2018 5 comments

from John Locke

The problem, however, is not the failure of economic departments and business schools to create a prescriptive science, but the refusal of the nomothetic neoclassical economists and mathematical modelers in them to admit the failure, and the actions they took after gaining a monopoly of the sinews of institutional power, which produced dysfunctionality in Anglo-American higher education. That dysfunctionalism is expressed in their constant battle with people in academia who recognize the prescriptive failure of the nomothetic science project, with which readers of the Real-World Economics Review blog are painfully familiar, and in their narrow-minded refusal to accept the importance of the idiographic tradition in economic and business studies during the current crisis in U.S. management capitalism.  Read more…

The relationship of the real investment led economy perspective to existing views

April 14, 2018 1 comment

from Michael Joffe

This perspective contrasts with standard neoclassical theory in several respects. That theory puts forward models relating to the decision making of (potential) workers, and of firms – respectively the supply of and the demand for labor. Workers choose whether or not to accept employment, based on a comparison of the offered wage with their reservation wage. Firms’ decision making is seen as a comparison between employing one more or one fewer worker with the difference this would make to production – respectively marginal cost and marginal benefit – given that the firm already exists, and has an established production system with premises, equipment, etc. Neoclassical theory implies that the forces of demand and supply rapidly bring about an equilibrium in which there is neither excess demand for labor, nor excess supply.  Read more…
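The textbook hiring rule Joffe describes can be sketched in a few lines: with an assumed production function (the square-root form, the output price, and the wage below are all illustrative choices, not from the text), the firm keeps adding workers until the marginal product of one more worker no longer covers the wage:

```python
# Toy illustration of the neoclassical hiring rule: with output
# f(L) = A * L**0.5 (an assumed production function), the firm hires
# while the value of one more worker's marginal product exceeds the wage.
def optimal_employment(price, wage, A=10.0, max_workers=1000):
    output = lambda L: A * L ** 0.5
    L = 0
    while L < max_workers:
        marginal_benefit = price * (output(L + 1) - output(L))
        if marginal_benefit < wage:  # one more worker no longer pays
            break
        L += 1
    return L

print(optimal_employment(price=2.0, wage=1.0))
```

In the continuous version this is the familiar first-order condition price × f′(L) = wage; the loop above just finds the same point by discrete comparison, which is closer to the “one more or one fewer worker” framing in the text.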

Game theory, Larry Samuelson and one of the most widespread myths in economics

April 14, 2018 1 comment

from Bernard Guerrien

One of the most widespread myths in economics, but also in sociology and political science, is that game theory provides “tools” that can help solve concrete problems in these branches – especially in economics. Introductory and advanced textbooks thus often speak of the “applications” of game theory that are being made, giving the impression that they are revolutionizing the social sciences. But, looking more closely, we see that the few examples given concern mostly the usual “stories” (prisoners’ dilemma, “chicken”, battle of the sexes, entry deterrence, chain store paradox, centipede game, etc.) of “old” game theory. Take the four-volume set Handbook of Game Theory with Economic Applications – a Handbook that provides an extensive account of what has been done in the field of game theory from its beginning, especially in economics, but not exclusively. Despite its title, there is not the slightest trace of a concrete example of an application, nor do we find any numerical data in its thousands of pages. This is not surprising. Mathematical reasoning requires clear and explicit enunciation of the assumptions used in its demonstrations. In particular, the assumptions concerning the information available to each player – his payoffs and those of the other players for each outcome of the game, the rules of the game, etc. – are so restrictive that there is no concrete situation in the world where they could possibly be verified, not even roughly (Guerrien, 2004, pp. 2-3). As Ariel Rubinstein, another renowned game theorist, puts it:  Read more…
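For readers who have not met the “stories” Guerrien lists, the prisoners’ dilemma is the canonical one. A minimal sketch (the payoff numbers follow the usual textbook convention, chosen here purely for illustration):

```python
# The prisoners' dilemma, the first of the stock "stories" listed above.
# Payoffs are (row player, column player); higher is better.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def best_reply(opponent_action):
    """Row player's payoff-maximizing action against a fixed opponent action."""
    return max(("cooperate", "defect"),
               key=lambda a: PAYOFFS[(a, opponent_action)][0])

# Defect is a best reply to BOTH opponent actions -- a dominant strategy --
# even though mutual cooperation would leave both players better off.
print(best_reply("cooperate"), best_reply("defect"))
```

The elegance of the result is not in dispute; Guerrien’s point is that the informational assumptions behind even this simple table (common knowledge of all payoffs and rules) are rarely, if ever, satisfied in any concrete situation.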

Modeling economic risk

April 14, 2018 14 comments

from Lars Syll

Model builders face a constant temptation to make things more complicated than necessary because this makes them appear more capable than they really are.

Remember that a model is not the truth.

It is a lie to help you get your point across.

And in the case of modeling economic risk, your model is a lie about others, who are probably lying themselves.

And what’s worse than a simple lie?

A complicated lie.

The wisdom of crowds

April 14, 2018 7 comments

from Lars Syll

A classic demonstration of group intelligence is the jelly-beans-in-the-jar experiment, in which invariably the group’s estimate is superior to the vast majority of the individual guesses. When finance professor Jack Treynor ran the experiment in his class with a jar that held 850 beans, the group estimate was 871. Only one of the fifty-six people in the class made a better guess.

There are two lessons to draw from these experiments. First, in most of them the members of the group were not talking to each other or working on a problem together. They were making individual guesses, which were aggregated and then averaged … Second, the group’s guess will not be better than that of every single person in the group each time. In many (perhaps most) cases, there will be a few people who do better than the group. This is, in some sense, a good thing, since especially in situations where there is an incentive for doing well (like, say, the stock market) it gives people reason to keep participating. But there is no evidence in these studies that certain people consistently outperform the group. In other words, if you run ten different jelly-bean-counting experiments, it’s likely that each time one or two students will outperform the group. But they will not be the same students each time. Over the ten experiments, the group’s performance will almost certainly be the best possible. The simplest way to get reliably good answers is just to ask the group each time.
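The jelly-bean result is easy to reproduce in simulation. In the sketch below the jar size and class size follow Treynor’s experiment, but the error model (noisy multiplicative guesses) is my assumption, chosen only to illustrate how averaging cancels idiosyncratic error:

```python
import random
import statistics

random.seed(7)
TRUE_COUNT = 850   # beans in the jar, as in Treynor's classroom experiment
n_guessers = 56    # class size from the text

# Each guesser's estimate: roughly unbiased but individually noisy.
guesses = [TRUE_COUNT * random.lognormvariate(0, 0.3)
           for _ in range(n_guessers)]

group_estimate = statistics.mean(guesses)
group_error = abs(group_estimate - TRUE_COUNT)
# How many individuals came closer to the truth than the group average?
beaten_by = sum(abs(g - TRUE_COUNT) < group_error for g in guesses)
print(f"Group estimate: {group_estimate:.0f} "
      f"(individuals beating the group: {beaten_by}/{n_guessers})")
```

Run after run, only a handful of individuals beat the average, and (matching the passage above) they are different individuals each time, because their edge is luck, not skill.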

Read more…

Ryan, deficits and hypocrisy

April 13, 2018 4 comments

from Peter Radford

Paul Ryan is leaving Congress. Before he had finished announcing his upcoming retirement the airwaves were awash with commentary about his legacy.

Count me as one of those who have a particularly strong perspective on this. Paul Ryan was, and presumably still is, a supremely hypocritical human being.

Recall how he sprang into public consciousness. He quickly established himself as a severe right winger, but one with the smarts to back it all up. He promoted himself as a thoughtful conservative. He quoted all the thinkers one has to refer to if one is to be such a person. Ayn Rand was his go-to intellectual foundational source.

He spoke eloquently about the damage that Federal deficits would do. He berated Democrats, and Obama especially, for their wanton reliance on debt to pay for rescuing the economy in the aftermath of the Great Recession. His mantra was that if we would only set taxpayers free, if we would only slash social spending, and if we would only see the sense in balancing our budget then America would enter a new golden age.

Well, a golden age for the wealthy at any rate.

He was, and presumably still is, one of those far-right conservatives who honestly believe that cutting away the social safety net from underneath our fellow citizens will somehow lead to their discovery of endless vitality, determination, and virtue sufficient to lift them free of poverty, sickness, or age, whichever of these is the cause of their reliance on that safety net.  Read more…