Is inequality within countries getting better or worse?

August 6, 2019 3 comments

from Jason Hickel

In a recent Twitter post, Max Roser of Our World In Data claimed that the narrative about rising inequality within countries is incorrect. Inequality has been falling in as many countries as it has been rising, he said, “which should be really embarrassing for many news stories that suggest the opposite with great certainty.”

Roser’s tweet referred to an interesting blog post by Joe Hasell, with a graph illustrating the change in the Gini index within countries from 1990 to 2015. Countries above the 45-degree line have seen rising inequality, while countries below the line have seen falling inequality. It’s a pretty even split (although the majority of the world’s population live in countries that have seen rising, not falling, inequality).

This seems like good news indeed. So which is it? Is inequality getting better or worse?

Read more…

Inequality and unsustainable consumption

August 5, 2019 3 comments

Source: Extreme Carbon Inequality (Oxfam, December 2015) Read more…

MMT — the key insights

August 5, 2019 5 comments

from Lars Syll

As has become abundantly clear during the last couple of years, most mainstream economists seem to think that Modern Monetary Theory is something new that some wild heterodox economic cranks have come up with. That says a lot about the total lack of knowledge of their own discipline’s history that modern mainstream guys like Summers, Rogoff and Krugman display.

New? Cranks? Reading one of the founders of neoclassical economics, Knut Wicksell, and what he writes in 1898 on ‘pure credit systems’ in Interest and Prices (Geldzins und Güterpreise) soon makes the delusion go away:

It is possible to go even further. There is no real need for any money at all if a payment between two customers can be accomplished by simply transferring the appropriate sum of money in the books of the bank …

A pure credit system has not yet … been completely developed in this form. But here and there it is to be found in the somewhat different guise of the banknote system …

We intend therefore, as a basis for the following discussion, to imagine a state of affairs in which money does not actually circulate at all, neither in the form of coin … nor in the form of notes, but where all domestic payments are effected by means of the Giro system and bookkeeping transfers. A thorough analysis of this purely imaginary case seems to me to be worthwhile, for it provides a precise antithesis to the equally imaginary case of a pure cash system, in which credit plays no part whatever [the exact equivalent of the often used neoclassical model assumption of ‘cash in advance’ – LPS] …

For the sake of simplicity, let us then assume that the whole monetary system of a country is in the hands of a single credit institution, provided with an adequate number of branches, at which each independent economic individual keeps an account on which he can draw cheques.

What Modern Monetary Theory (MMT) basically does is exactly what Wicksell tried to do more than a hundred years ago. Read more…

MMT macro final (3/3) entanglement

August 5, 2019 Leave a comment

from Asad Zaman

Previous posts (MMT Macro Final 1/3 and MMT Macro Final 2/3) have covered questions 1-4 and 5-8. This post covers the last four questions of the MMT-based Advanced Macro course I taught last semester at PIDE. The central methodological difference at the heart of my course was the principle of Entanglement: theories cannot be understood outside their historical context, and history cannot be understood without understanding the theories used by human agents to understand and respond to that history. This is one of the three methodological principles that I have extracted from a study of Methodology of Polanyi’s Great Transformation. This issue is discussed in the answer to question 11 below. Because of its central importance, I have also tried to explain it in greater detail in a separate 18-minute video lecture. read more

Solow kicking Lucas and Sargent in the pants

August 4, 2019 4 comments

from Lars Syll

Professors Lucas and Sargent … have a proposal for constructive research that I find hard to talk about sympathetically. They call it equilibrium business cycle theory, and they say very firmly that it is based on two terribly important postulates — optimizing behavior and perpetual market clearing. When you read closely, they seem to regard the postulate of optimizing behavior as self-evident and the postulate of market-clearing behavior as essentially meaningless. I think they are too optimistic, since the one that they think is self-evident I regard as meaningless and the one that they think is meaningless, I regard as false. The assumption that everyone optimizes implies only weak and uninteresting consistency conditions on their behavior …  Read more…

Can capitalism feed the world sustainably and fairly?

August 3, 2019 2 comments

from Ken Zimmerman

In 1798, just before the beginning of the industrial revolution in the UK, Robert Malthus published “An Essay on the Principle of Population as it Affects the Future Improvement of Society, with Remarks on the Speculations of Mr. Godwin, M. Condorcet, and Other Writers.” The thesis of the book was simple. The natural human urge to reproduce increases human population geometrically (1, 2, 4, 8, 16, 32, 64, 128, etc.). However, food supply, at most, can only increase arithmetically (1, 2, 3, 4, 5, 6, 7, 8, etc.). Thus, since food is a necessity for human life, population growth in any area or on the planet, if unchecked, would lead to starvation. Malthus argued there are preventative checks and positive checks on the population that slow its growth and keep the population from rising exponentially for too long, but still, poverty and some starvation are inescapable and will continue. Preventative checks alter the birth rate. They include marrying at a later age (moral restraint), abstaining from procreation, birth control, and homosexuality. Malthus considered birth control and homosexuality vices, but recognized they are practiced. Positive checks increase the death rate. These include disease, war, disaster, and finally, when other checks don’t reduce the population, famine. The fear of famine, or the development of famine itself, was, Malthus thought, a major impetus to reduce the birth rate. Potential parents are less likely to have children when they believe their children are likely to starve.
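Malthus’s contrast between the two series can be tabulated directly. The sketch below is a minimal illustration in hypothetical units (one “period” per doubling of population, one added unit of food per period), not a model of any real demography:

```python
# Malthus's contrast: population doubling each period (geometric)
# versus food supply gaining one unit each period (arithmetic).
def population(periods):
    return [2 ** t for t in range(periods)]   # 1, 2, 4, 8, 16, ...

def food(periods):
    return [t + 1 for t in range(periods)]    # 1, 2, 3, 4, 5, ...

periods = 8
for t, (p, f) in enumerate(zip(population(periods), food(periods))):
    status = "enough" if f >= p else "shortfall"
    print(f"period {t}: population={p}, food={f} ({status})")
```

However the units are chosen, any geometric series eventually overtakes any arithmetic one, which is the whole of Malthus’s formal argument.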

Malthus considered these “laws of nature.” Turns out, it’s not quite so simple. Read more…

Chain-weighting the base-year problem

August 2, 2019 Leave a comment

from Blair Fix, Jonathan Nitzan and Shimshon Bichler   

To reiterate, the base-year problem leads to uncertainty in the calculation of real GDP. But instead of openly reporting this uncertainty, government economists have devised a “fix”. Rather than using a single base year, they “chain” together many adjacent base years. This is a bit like a moving average. They calculate the growth of real GDP between consecutive years, using the first year in each pair as the base, and then “chain” together the resulting growth measures to calculate real GDP levels. This method claims to “fix” or at least lessen the base-year problem. It doesn’t.
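The chaining procedure described above can be sketched in a few lines: value adjacent years’ output at the earlier year’s prices, take the growth ratio, then compound (“chain”) the ratios into a real GDP index. The prices and quantities below are hypothetical illustrations of my own, not the authors’ data:

```python
# Chain-weighting sketch: for each pair of adjacent years, value both
# years' quantities at the earlier year's prices, take the growth
# ratio, then "chain" (compound) the ratios into a real GDP index.
# All prices and quantities here are hypothetical.
prices =     [[2.0, 2000.0], [2.2, 1500.0], [2.5, 1200.0]]  # per-year prices
quantities = [[1000, 2],     [1000, 3],     [1100, 3]]      # per-year quantities

def base_weighted_growth(p_base, q0, q1):
    """Growth of output from q0 to q1, valued at base-year prices p_base."""
    v0 = sum(p * q for p, q in zip(p_base, q0))
    v1 = sum(p * q for p, q in zip(p_base, q1))
    return v1 / v0

index = [1.0]
for t in range(len(prices) - 1):
    # use year t's own prices as the base for the t -> t+1 step
    step = base_weighted_growth(prices[t], quantities[t], quantities[t + 1])
    index.append(index[-1] * step)

print(index)
```

Note that each link uses a different base year, which is exactly why the authors argue the procedure hides, rather than fixes, the base-year problem.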

The appeal of chain-weighting, according to economists, is that it gets closer to their theoretical ideal. According to this ideal, the weight of each commodity in real GDP is provided by its “true” or “natural” price. When using a single base year, the implicit assumption is that relative prices in that base year are “true” and therefore constitute the “correct” weights (Equation 2). However, if the “correct” weights change over time, and if these changes are mirrored in the movement of relative market prices, we can do better by changing the base year more often (every year) and chain-weight the results.

This argument is superficially convincing, but it falls apart on further inspection. Read more…

Game Theory and Operations Research lacked substantiated applications in social, political and economic fields.

August 2, 2019 11 comments

from Richard Vahrenkamp

Since 1945, the United States had experienced a unique innovation push with the computer, the nuclear weapon, new air combat weapons and the transistor within just a few years. These innovations were accompanied by Game Theory and Operations Research in the academic field. Widely held is the view that computers supplemented the mathematical concepts of Game Theory and Operations Research and gave these fields a fresh impetus. Together, they established the view of the world as a space of numbers and introduced quantitative methods in economics, political science and in sociology. A series of conferences on these subjects consolidated this new view. They imparted Cold War science and technology policy with a unique flavour of progress, superiority and modernity.

Whereas the history of quantitative methods has been mainly written as a history of digital computers, the history of Game Theory and Operations Research has had only a small number of contributions. Read more…

Non-normal normality

August 1, 2019 28 comments

from Lars Syll

Asset price distributions are of great practical significance for portfolio managers. Standard finance theory assumes that asset price changes follow a normal distribution—the well-known bell curve. That this assumption is roughly accurate most of the time allows analysts to use very robust probability statistics. For example, for a sample that follows a normal distribution, you can identify the population average and characterize the likelihood of variance from that average.

However, much of nature—including the man-made stock market—is not normal. Many natural systems have two defining characteristics: an ever-larger number of smaller pieces and similar-looking pieces across the different size scales. For example, a tree has a large trunk and a number of ever-smaller branches, and the small branches resemble the big branches. These systems are fractal. Unlike a normal distribution, no average value adequately characterizes a fractal system. Fractal systems follow a power law.

Using the statistics of normal distributions to characterize a fractal system like financial markets is potentially very hazardous. Yet theoreticians and practitioners do it daily. The distinction between the two systems boils down to probabilities and payoffs. Fractal systems have a few very large observations that fall outside the normal distribution. The classic example is the crash of 1987. The probability (assuming a normal distribution) of the market’s 20%-plus plunge in one day was so infinitesimally low it was practically zero. And still the losses were a staggering $2 trillion-plus.
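Just how “infinitesimally low” that normal-distribution probability is can be checked with the standard normal tail. The daily volatility figure below is my own rough assumption (about 1% per day, a common ballpark for that era), not a number from the excerpt:

```python
import math

def normal_tail(z):
    """P(Z <= -z) for a standard normal, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2))

daily_sigma = 0.01   # assumed ~1% daily volatility (hypothetical ballpark)
drop = 0.20          # the one-day plunge of October 1987
z = drop / daily_sigma
print(f"a {drop:.0%} drop is a {z:.0f}-sigma event; P = {normal_tail(z):.2e}")
```

Under these assumptions the probability comes out below 10^-80: smaller than picking one particular atom out of the observable universe, yet the event happened.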

Number Facts & Number Fictions

July 31, 2019 Leave a comment

from Asad Zaman

Excerpt from:  Real Statistics (3/4) Statistics as Rhetoric 

{Preliminary material explains that the conventional approach to statistics separates theory and application: the job of the statistician is to analyze numbers without knowing where they come from, while the job of the Field Expert is to use objective statistical analysis of those numbers to better understand the realities that generate them. In “Real Statistics”, we assert that these two tasks cannot be separated. Theory must always be studied within the context of real-world application, and real-world phenomena cannot be understood without the application of theory} read more

The biggest problem in science

July 30, 2019 62 comments

from Lars Syll

There’s a huge debate going on in social science right now. The question is simple, and strikes near the heart of all research: What counts as solid evidence? …

Prominent statisticians, psychologists, economists, sociologists, political scientists, biomedical researchers, and others … argue that results should only be deemed “statistically significant” if they pass a higher threshold.

“We propose a change to P < 0.005,” the authors write. “This simple step would immediately improve the reproducibility of scientific research in many fields” …

There’s one critique of the proposal with which the authors I spoke to completely agree: changing the definition of statistical significance doesn’t address the real problem. And the real problem is the culture of science.

In 2016, Vox sent out a survey to more than 200 scientists, asking, “If you could change one thing about how science works today, what would it be and why?” One of the clear themes in the responses: The institutions of science need to get better at rewarding failure.

One young scientist told us, “I feel torn between asking questions that I know will lead to statistical significance and asking questions that matter.”

The biggest problem in science isn’t statistical significance. It’s the culture. She felt torn because young scientists need publications to get jobs. Under the status quo, in order to get publications, you need statistically significant results. Statistical significance alone didn’t lead to the replication crisis. The institutions of science incentivized the behaviors that allowed it to fester.

Brian Resnick 

Read more…

Why is Facebook, the world’s largest publisher, immune to publishing laws?

July 30, 2019 5 comments

from Dean Baker

Mark Zuckerberg may not think he needs a new job, but he does. It’s long past time Facebook was classified as a publisher, so that it can be held responsible for the content that appears in posts on its system.

The issue here is the special exemption to liability that Facebook and other internet platforms get from Section 230 of the Communications Decency Act of 1996. This law was passed in the early days of the internet and was intended to set up rules for governing communications that paralleled the ones for print and broadcast media. At the time, Congress decided to include Section 230, which protects Facebook and other internet platforms from the same sort of responsibility for content that print or broadcast media face.

To see what is at issue, suppose that a Facebook post becomes widely circulated saying that Donald Trump has stolen $20 million from charity. Imagine that, in this particular case, it happens not to be true, and Trump can prove this fact.

Because of Section 230, Facebook bears no responsibility for spreading this false accusation. In fact, it is not even obligated to remove the false accusation from its platform, although it would likely choose to do so under the circumstances. If Trump could determine who had initiated the post, he could pursue legal action against them, but Section 230 would protect Facebook from any liability.

By contrast, suppose that a newspaper had printed the same accusation. Read more…

Bringing science into economics must necessarily entail measurements in the scientific units.

July 29, 2019 29 comments

from Ikonoclast

The only real science is hard science; namely physics, chemistry and biology. The rest is not science. This is not to insist on mere scientism, nor is it to insist that other subjects are worthless. It is simply to insist on the precision of definition that those (mistakenly) arguing for precise science and mathematics in economics are in effect calling for. Those calling for scientific and mathematical precision in economics are hoist with their own petard if they use, at any point in their calculations, dollars or “utils” or “snalts” (socially necessary abstract labor time).

If you are calling for scientific and mathematical precision in economics, then you must stick to the scientific units laid out in the International System of Units (SI). These are base units; Read more…

Arrow-Debreu and the Bourbaki illusion of rigour

July 29, 2019 14 comments

from Lars Syll

By the time that we have arrived at the peak first climbed by Arrow and Debreu, the central question boils down to something rather simple. We can phrase the question in the context of an exchange economy, but producers can be, and are, incorporated in the model. There is a rather arid economic environment referred to as a purely competitive market in which individuals receive signals as to the prices of all goods. All the individuals have preferences over all bundles of goods. They also have endowments or incomes defined by the prices of the goods, and this determines what is feasible for them, and the set of feasible bundles constitutes their budget set. Choosing the best commodity bundle within their budget set determines their demand at each price vector. Under what assumptions on the preferences will there be at least one price vector that clears all markets, that is, an equilibrium? Put alternatively, can we find a price vector for which the excess demand for each good is zero? The question as to whether a mechanism exists to drive prices to the equilibrium has become secondary, and Herb Scarf’s famous example (1960) had already dealt that discussion a blow.
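The equilibrium question Kirman states — is there a price vector at which excess demand for every good is zero — can be made concrete with a deliberately tiny example: two agents, two goods, Cobb-Douglas preferences. This toy economy (the expenditure shares and endowments are my own hypothetical choices, not anything from Arrow-Debreu) is a sketch of the question, not of the general theorem:

```python
# Toy exchange economy: two agents, two goods, Cobb-Douglas preferences.
# Agent i spends the share alphas[i][g] of their wealth on good g, so
# their demand for good g is alphas[i][g] * wealth_i / p[g]. An
# equilibrium is a price vector at which total excess demand
# (demand minus endowment) is zero for every good.
alphas = [[0.5, 0.5], [0.25, 0.75]]   # hypothetical expenditure shares
endow  = [[1.0, 0.0], [0.0, 1.0]]     # agent 0 owns good 0, agent 1 owns good 1

def excess_demand(p):
    z = [0.0, 0.0]
    for a, w in zip(alphas, endow):
        wealth = sum(pg * wg for pg, wg in zip(p, w))
        for g in range(2):
            z[g] += a[g] * wealth / p[g] - w[g]
    return z

# Only relative prices matter, so fix p[1] = 1 and bisect on p[0] using
# the excess demand for good 0 (Walras' law takes care of good 1).
lo, hi = 1e-6, 1e6
for _ in range(200):
    mid = (lo + hi) / 2
    if excess_demand([mid, 1.0])[0] > 0:
        lo = mid          # good 0 too cheap: raise its price
    else:
        hi = mid
p_star = [(lo + hi) / 2, 1.0]
print("equilibrium prices:", p_star, "excess demand:", excess_demand(p_star))
```

In this two-good case a simple bisection finds the market-clearing relative price; the Sonnenschein-Mantel-Debreu results cited below say, in effect, that nothing this well-behaved survives aggregation in general.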

The warning bell was sounded by such authors as Donald Saari and Carl Simon (1978), whose work gave an indication, but one that has been somewhat overlooked, as to why the stability problem was basically unsolvable in the context of the general equilibrium model. The most destructive results were, of course, already there, those of Hugo Sonnenschein (1974), Rolf Mantel (1974), and Debreu (1974) himself. But those results show the model’s weakness, not where that weakness comes from. Nevertheless, the damage was done. What is particularly interesting about that episode is that it was scholars of the highest reputation in mathematical economics who brought the edifice down. This was not a revolt of the lower classes of economists complaining about the irrelevance of formalism in economics; this was a palace revolution.

Alan Kirman

Read more…

There is a sound reason for the growth of statistical theory.

July 29, 2019 36 comments

from Gerald Holtham

Econometrics like more casual empiricism can be done well or badly, intelligently or stupidly, dogmatically or with an open mind. But are these gentlemen saying that statistical analysis can never reveal anything in economics that is not obvious to simple observation? Evidently that is untrue. What is revealed is never a “law” and will obviously be contingent in space and time. That follows from the nature of society and economic data. Statistical analysis nevertheless has an indispensable place in social studies whether economics, psychology, medicine or sociology.

Read more…

Which base year?

July 28, 2019 7 comments

from Blair Fix, Jonathan Nitzan and Shimshon Bichler   

But there is a slight conceptual problem. It turns out that the growth of real GDP – ostensibly a single, objective quantity – is highly sensitive to our choice of base year.

To illustrate, consider a hypothetical economy that produces only two commodities: 1,000 lb of tomatoes and two laptops. Next, let’s choose 1990 as our base year and assume that tomatoes in that year cost $2/lb while a laptop costs $2,000. In this case, real GDP, denominated in 1990 dollars, would be $6,000 (=1,000 x $2 + 2 x $2,000). Now, skip to 1991 and imagine that, in that year, the economy grows by producing one additional laptop. This increase means that real GDP in 1991, denominated in 1990 prices, is $8,000 (=1,000 x $2 + 3 x $2,000). Compared to 1990, real GDP grew by 33.3 per cent.
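The arithmetic above can be reproduced directly, and the base-year sensitivity previewed by re-running it with a different set of base prices. The 1991 prices below are my own hypothetical assumption (laptops halving in price) chosen to show the effect; the article’s own continuation uses its own figures:

```python
# Real GDP growth under different base years. Quantities and 1990
# prices are from the article; the 1991 prices are hypothetical.
q_1990 = {"tomatoes_lb": 1000, "laptops": 2}
q_1991 = {"tomatoes_lb": 1000, "laptops": 3}
p_1990 = {"tomatoes_lb": 2.0, "laptops": 2000.0}
p_1991 = {"tomatoes_lb": 2.0, "laptops": 1000.0}  # assumed: laptops halve in price

def real_gdp(quantities, base_prices):
    return sum(base_prices[k] * q for k, q in quantities.items())

def growth(base_prices):
    g0 = real_gdp(q_1990, base_prices)  # $6,000 with 1990 prices
    g1 = real_gdp(q_1991, base_prices)  # $8,000 with 1990 prices
    return 100 * (g1 / g0 - 1)

print(f"1990 base: {growth(p_1990):.1f}% growth")  # reproduces the 33.3%
print(f"1991 base: {growth(p_1991):.1f}% growth")
```

Under the assumed 1991 prices the same physical change in output registers as only 25% growth instead of 33.3%: the “single, objective quantity” moves with the choice of weights.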

So far so good. Now, instead of using 1990 as our base year, let’s use 1991. Read more…

Economics — an axiomatically based science doomed to fail

July 27, 2019 15 comments

from Lars Syll

A modern economy is a very complicated system. Since we cannot conduct controlled experiments on its smaller parts, or even observe them in isolation, the classical hard-science devices for discriminating between competing hypotheses are closed to us. The main alternative device is the statistical analysis of historical time-series. But then another difficulty arises. The competing hypotheses are themselves complex and subtle. We know before we start that all of them, or at least many of them, are capable of fitting the data in a gross sort of way. Then, in order to make more refined distinctions, we need long time-series observed under stationary conditions.

Unfortunately, however, economics is a social science … Much of what we observe cannot be treated as the realization of a stationary stochastic process without straining credulity. Moreover, all narrowly economic activity is embedded in a web of social institutions, customs, beliefs, and attitudes. Concrete outcomes are indubitably affected by these background factors, some of which change slowly and gradually, others erratically. As soon as time-series get long enough to offer hope of discriminating among complex hypotheses, the likelihood that they remain stationary dwindles away, and the noise level gets correspondingly high. Under these circumstances, a little cleverness and persistence can get you almost any result you want. I think that is why so few econometricians have ever been forced by the facts to abandon a firmly held belief. Indeed, some of Fortune’s favorites have been known to write scores of empirical articles without once feeling obliged to report a result that contradicts their prior prejudices.

Robert Solow

Yours truly has for many years been urging economists to pay attention to the ontological foundations of their assumptions and models. Sad to say, economists have not paid much attention — and so modern economics has become increasingly irrelevant to the understanding of the real world. Read more…

Attempted endless growth

July 26, 2019 5 comments

from Ikonoclast

I guess history shows that moral wrongs can continue almost indefinitely. However, being empirically wrong (provably so, scientifically, to a high degree of probability) is another beast altogether. Trends that can’t continue, because they approach limits imposed by fundamental laws of nature, won’t continue. It’s as simple as that. We have to change our ways (patterns of production, consumption and attempted endless growth) or civilization will crash and burn, pretty much literally. The opposition of entrenched dominant capital to necessary changes and transitions has seriously delayed necessary change. The situation is now ultra-critical. We have ten years, or maybe only five (as per the old Bowie song).

Understanding global inequality in the 21st century

July 25, 2019 5 comments

from Jayati Ghosh

Inequality has increased since it first caught the attention of the international community. The claims that global inequality has decreased because of the faster rise in per capita incomes in populous countries like China and India must be tempered by several considerations. National policies are crucial in this worsening state of affairs, and the international economic architecture and associated patterns of trade and capital flows encourage such policies. More national policy space is required for governments, especially in developing countries, to pursue policies that would move towards more sustainable and equitable development, which in turn requires significant changes in the global architecture. None of this can be done without some international coordination, and there is a need to revive a progressive and acceptable form of multilateralism that supports working people across the world, rather than the interests of large capital. Read more…

Economics — science succumbed to universalist temptations

July 25, 2019 8 comments

from Lars Syll

All social sciences, to a greater or lesser degree, start with a yearning for a universal language, into which they can fit such particulars as suit their view of things. Their model of knowledge thus aspires to the precision and generality of the natural sciences. Once we understand human behavior in terms of some universal and – crucially – ahistorical principle, we can aspire to control (and of course improve) it.

No social science has succumbed to this temptation more than economics. Its favored universal language is mathematics. Its models of human behavior are built not on close observation, but on hypotheses that, if not quite plucked from the air, are unconsciously plucked from economists’ intellectual and political environments. These then form the premises of logical reasoning of the type, “All sheep are white, therefore the next sheep I meet will be white.” In economics: “All humans are rational utility maximizers. Therefore, in any situation, they will act in such a way as to maximize their utility.” This method gives economics a unique predictive power, especially as the utilities can all be expressed and manipulated quantitatively. It makes economics, in Paul Samuelson’s words, the “queen of the social sciences.” Read more…