from David Ruccio
from Lars Syll
Paul Krugman has often been criticized by people like yours truly and other Minskyites for getting things pretty wrong on the economics of Hyman Minsky.
When Krugman has responded to the critique, a critique he rather gratuitously portrays as being about “What Minsky Really Meant” or “What Keynes Really Meant,” the overall conclusion is — “Krugman Doesn’t Care.”
The reason given for this rather debonair attitude seems to be that the history of economic thought may be OK, but what really counts is whether reading Minsky — or Keynes — gives birth to new and interesting insights and ideas. Economics is not religion, and simply referring to authority is not an accepted way of arguing in science.
Although I have a lot of sympathy for Krugman’s view on authority, there is a somewhat disturbing and unbecoming coquetry in his attitude towards the great forerunners he is discussing — as his rather controversial speech at Cambridge, commemorating the 75th anniversary of Keynes’ General Theory, bears witness to.
Sometimes — and this goes not only for children — it is easier to see things if you can stand on the shoulders of elders and giants. If Krugman took the time and really studied Keynes and Minsky, I’m sure even he would learn a lot. Read more…
from today’s Guardian
Surging carbon dioxide levels have pushed greenhouse gases to record highs in the atmosphere, the World Meteorological Organisation (WMO) has said.
Concentrations of carbon dioxide, the major cause of global warming, increased at their fastest rate for 30 years in 2013, despite warnings from the world’s scientists of the need to cut emissions to halt temperature rises.
Experts warned that the world was “running out of time” to reverse rising levels of carbon dioxide (CO2) to tackle climate change.
Data show levels of the gas increased more between 2012 and 2013 than during any other year since 1984, possibly due to less uptake of carbon dioxide by ecosystems such as forests, as well as rising CO2 emissions.
The annual greenhouse gas bulletin from the WMO showed that in 2013 concentrations of CO2 in the atmosphere were 142% of what they were before the Industrial Revolution.
Other potent greenhouse gases have also risen significantly, with concentrations of methane now 253% and nitrous oxide 121% of pre-industrial levels.
Between 1990 and 2013 the warming effect on the planet known as “radiative forcing” due to greenhouse gases such as CO2 rose by more than a third (34%).
from David Ruccio
Inflation appears at first sight an extremely obvious, trivial thing. But its analysis brings out that it is a very strange thing…
On one level, inflation is extremely obvious: it’s an increase in the prices of the commodities people buy. Bread, gasoline, housing, and so on. When their prices go up, we are witnessing (and, for many, suffering) inflation. (The opposite, when prices fall, is deflation.)
Why is inflation important? Well, for most of us, our money (or nominal) incomes are eaten away by increases in prices. Over time, therefore, our real incomes fall short of our nominal incomes, leaving us able to purchase less.
Here’s a US illustration of the difference: Read more…
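The nominal/real distinction above is simple division by a price index. A minimal sketch, with purely made-up illustrative numbers (not the US data the post links to):

```python
# Hypothetical figures, for illustration only: a 3% nominal raise
# during 5% inflation still leaves real income lower than before.
def real_income(nominal, price_index, base_index=100.0):
    """Deflate a nominal income by a price index (base year = 100)."""
    return nominal * base_index / price_index

year1 = real_income(50_000, 100.0)   # base year: nominal = real
year2 = real_income(51_500, 105.0)   # 3% raise, but 5% inflation
print(year1)                 # 50000.0
print(round(year2, 2))       # 49047.62 -> purchasing power fell
```

Even though the nominal income rose, the real income, what it can actually buy, declined.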
from Lars Syll
Modern probabilistic econometrics relies on the notion of probability. To be at all amenable to econometric analysis, economic observations allegedly have to be conceived as random events.
But is it really necessary to model the economic system as a system where randomness can only be analyzed and understood when based on an a priori notion of probability?
In probabilistic econometrics, events and observations are as a rule interpreted as random variables as if generated by an underlying probability density function, and a fortiori – since probability density functions are only definable in a probability context – consistent with a probability. As Haavelmo (1944:iii) has it:
For no tool developed in the theory of statistics has any meaning – except, perhaps, for descriptive purposes – without being referred to some stochastic scheme.
When attempting to convince us of the necessity of founding empirical economic analysis on probability models, Haavelmo – building largely on the earlier Fisherian paradigm – actually forces econometrics to (implicitly) interpret events as random variables generated by an underlying probability density function.
This is at odds with reality. Randomness obviously is a fact of the real world. Probability, on the other hand, attaches to the world via intellectually constructed models, and a fortiori is only a fact of a probability generating machine or a well constructed experimental arrangement or “chance set-up”. Read more…
Economics textbooks are not only written for students. At two critical points in the history of economic thought textbooks have played significant roles in defining the field, not only for what is taught, but more importantly (in terms of real world outcomes) for the understanding of the economy that is used by politicians, policy makers, and the public, when it votes its approval or disapproval of how the government is affecting the economy.
This started in the 1890s, when Alfred Marshall wrote the first edition of his text, called Principles of Economics. It went through 8 editions, the last being published in 1920. For a large part of the English-speaking world Marshall’s textbook continued to define the field (especially the microeconomics basics) until the middle of the 20th century, when it was replaced by Paul Samuelson’s Economics (first published in 1948). That set the standard for about the next 60 years.
Virtually all economies are currently growing both physically and financially, within a global envelope that is finite, non-growing and materially closed. A prevailing view, such as within the OECD and UNEP, is that the physical growth of throughput can be decoupled from the non-physical (financial) growth of GDP through innovation, which is commonly branded as “green growth” or “sustainable growth”. This view is also reflected, for example, in policy proposals for the next United Nations Climate Change Conference that emphasize decoupling emissions from growth (European Commission 2014). Two forms of decoupling are discussed in the literature: With relative decoupling, the growth of environmental impacts slows down relative to GDP due to efficiency improvements. With absolute decoupling, the environmental impact decreases as GDP grows.
To perpetuate a growing GDP under conditions of absolute biophysical limits will require—it is argued—compensation in terms of absolute decoupling of both the inflows from and the outflows into the environment. Relative decoupling will not suffice; it will merely delay the point in time when one or more limits are reached. Moreover, absolute decoupling will have to be achieved on a global scale, because improvements in one part of the world might be achieved when production and associated ecological impacts are moved offshore.
If we give up the assumption (2) that agents have no power over the states of the world, and consider that there is a time interval between the action taken and the outcome, other options become available to them. An individual (or a firm) can know what the impact on the world will be of other actions, taken after what we call here the “decision” in the strict sense has been adopted. Let’s suppose a lapse of time divided into two periods, t0 – t1 and t1 – t2. In t0 a subject takes the decision a1, assuming the prevailing state of the world along t0 – t2 will be S1 (the one in which c1 is expected). But then, he does not stay idle: he undertakes some additional actions b1, …, bn, designed to produce (or help to create) the needed state S1. These actions, additional to (and subsequent to) the initial decision, are aimed at transforming reality in a precise way in order to get the desired result. We may call them validating actions.
A good example of validating action is propaganda, which tries to install at the top of the agents’ preferences a product whose production has already been decided (or has already been finished).
There are clear signs about what needs to be done to diminish the effect that financialization has on income distribution. It is obvious that the solution to the problem of excessive and wildly mal-distributed incomes is not to set up ethics courses for MBA students at Harvard, London, the Chicago Business School, and elsewhere (Locke, 2011b). Solutions require the adoption of new public policies and legal-institutional change. They involve politics and are about grasping power. Nor should political control be sought primarily in underdeveloped and/or developing countries, where financialization wreaks havoc. The West is not driven by some financialization monolith; there are strong advanced economies, as the German example shows, and a political base, even within the business community, that is ready to oppose this juggernaut. The choice is simple: if people want to keep undesirables out of their community, why just pass anti-immigrant or vagrancy laws? They also need to stop rich financial interlopers in private equity firms from buying local firms and using bankruptcy statutes to deprive employees of their pension and benefit plans. They also need, as the Germans do, to give employee representatives on supervisory boards a voice in setting the salaries of top management and in firm governance, so that they can resist acquisitions and takeovers. It won’t be easy; witness American workers’ (under intense pressure from Republicans and the business community) recent rejection of the union at Volkswagen’s plant in Tennessee, which spoiled the company’s attempt to introduce a works council in the plant (Volkswagen is fully unionized with works councils included in its governance everywhere in its worldwide operations, except Tennessee).
The German ‘Statistisches Bundesamt’ has published new, revised data on German economic growth. These data clearly show a slightly less rosy picture (in fact: a more gloomy picture) than previous data. Germany unequivocally experienced the feared ‘double dip’ in 2012/2013 – and we can’t even exclude a triple dip (a dip being defined as two subsequent quarters showing a decline of production). This despite record low interest rates! Which of course means that Schäuble and Merkel are wrong: contractionary policies are not expansionary, they are, as they are intended to be, contractionary (why is this so difficult for many people to understand?). Even when domestic interest rates are low.
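The dip definition used here (two subsequent quarters of declining production) is mechanical enough to sketch in code. The quarterly growth rates below are made-up, purely for illustration; the real figures come from the Statistisches Bundesamt:

```python
# A 'dip' per the definition above: two or more consecutive quarters
# of declining output. A 'double dip' is two such episodes.
def count_dips(growth):
    """Count non-overlapping runs of >= 2 consecutive negative quarters."""
    dips, run = 0, 0
    for g in growth:
        if g < 0:
            run += 1
            if run == 2:   # the run becomes a dip at its 2nd negative quarter
                dips += 1
        else:
            run = 0
    return dips

# Hypothetical quarter-on-quarter growth rates (%), illustration only:
growth = [0.5, -0.3, -0.1, 0.4, 0.2, -0.2, -0.5, -0.1, 0.3]
print(count_dips(growth))  # 2 -> a 'double dip'
```

A single negative quarter, by this definition, is not a dip; only a sustained two-quarter decline counts.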
I do not really expect a triple dip: extra-Eurozone orders for manufacturing showed an extraordinary increase of +10% (mainly investment goods which, considering the size of the German economy, is HUGE). This increase even compensated for the recent dismal development of domestic orders. But a little historical perspective is in order: total orders are still way lower than in 2007… Fortunately, the labour market is still doing reasonably well, the number of jobs increases by about 0.8% a year (which is to an extent caused by people working ever fewer hours, on average – if that’s voluntary, that’s a good thing). We can’t however be sure that foreign, extra-EU orders will keep compensating for rapidly declining German retail sales. German domestic demand still seems to be subpar. Domestic demand has to revive and the number of jobs will have to grow by at least 2% a year. Raise those wages (by an additional 1% for a number of years), cut those taxes.
By the way. Last year, German government wealth declined more than a little – because, well, mainly something with banks.
For industrial systems, a low throughput of matter and energy implies a smaller ecological footprint and greater life expectancy and durability of goods and infrastructure; a high throughput implies more depletion of resources that will need to be renewed and more waste that will need to be disposed of (Meadows and Wright 2008). System dynamics and thermodynamics tell us that a tolerable rate of throughput and entropic transformation is ultimately dictated by the natural system, not by economics or engineering.
A possible task for engineering, within limits, would be to maximise the durability of stocks by minimising inflows of low entropy natural resources and by minimising outflows of high entropy waste and emissions. The role that industrial societies have assigned to technology is, however, much more Herculean. We have asked it to simultaneously and boundlessly minimise environmental impacts and maximise economic growth. In 1966, Kenneth Boulding suggested: “We are very far from having made the moral, political, and psychological adjustments which are implied in this transition from the illimitable plane to the closed sphere” (Boulding 1966: 2-3). How far are we now, almost half a century later?
We have indeed come round in a circle. The whole vision of the working of the macrosystem presented, in terms of the AD/AS model, by far too many contemporary textbooks, is essentially pre-Keynesian. Monetary spending may fluctuate, but whether or not such fluctuations affect employment and output is said to depend on reactions affecting real wages. Slow adjustment of money wages to price changes is held to account for cyclical variations in employment and output. With respect to the longer term, it is presumed that real wages return to their proper full-employment level. There are then no obstacles on the side of demand to prevent re-establishment of the ‘natural’ (full employment) level of activity. The pale shadow of Keynesian theory in the AD/AS model – the AD curve – has nothing to do with the values of output and employment at equilibrium, only with the price level.
from David Ruccio
The core of my argument is that many sequences of events that are presented as mechanisms (i.e., as sequences of events organized in a stable way and leading to results known beforehand) in theoretical models are actually socially constructed by the presence (often tacit) of regulations and institutions that eliminate otherwise alternative options. My argument is against the alleged naturalness of social sequences modeled within theoretical models. These sequences do not reflect social laws (like physical laws), or mechanisms in the usual sense of the term (used in current mechanismic literature). When they are represented within theoretical models, they are not much more than modeled representations of truncated processes, which are open-ended in reality. Theoretical mechanisms are obtained assuming as “natural” and given (i.e., unchangeable as a matter of principle) institutional features that are actually historically determined and perfectly modifiable.
The value of capitals. Does Airbnb drive house price increases in London, Berlin, Dublin and Amsterdam?
Since 2008, economic statistics have twisted my mind, again and again. One of the weird patterns shown by the data is the comparatively very large and fast house price increases in capitals.
Look here for London.
Look here for Berlin.
Look here for Dublin.
Look here for Amsterdam.
Look here for Paris.
Is this caused by a large and fast increase in the potential gross rental yields of existing houses, enabled by the site Airbnb, which makes it a lot easier for international tourists to rent houses from individual owners? According to a recent article in De Volkskrant, a Dutch newspaper, this is at least one of the reasons why house prices are increasing in Amsterdam: not just because individuals are able to let their houses, but also because new companies are very rapidly using Airbnb to enter this market (especially in city centres), which leads to house price increases – increases consistent with the fact that tourism is one of the few growth sectors in Europe at the moment. Look also here, for Venice (Italy).
If this hypothesis is right, these price increases are not a sign of a property bubble.
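The mechanism hypothesized above is just yield arithmetic: gross rental yield is annual rent divided by price, so if short-term letting raises the achievable rent, the price consistent with a given required yield rises too, with no bubble involved. A sketch with hypothetical rents and a hypothetical required yield:

```python
# Illustrative only: gross rental yield = annual rent / purchase price.
def price_at_yield(annual_rent, target_yield):
    """Highest price a buyer can pay while still earning target_yield gross."""
    return annual_rent / target_yield

# Hypothetical numbers: a long-term let earning 15,000/yr versus a
# short-term (Airbnb-style) let earning 25,000/yr, both priced so the
# buyer earns a constant required gross yield of 5%.
print(price_at_yield(15_000, 0.05))  # 300000.0
print(price_at_yield(25_000, 0.05))  # 500000.0
```

Same required yield, higher achievable rent, higher fundamentals-driven price: which is why, under this hypothesis, the increases need not signal a bubble.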
from Lars Syll
I’ve never yet been able to understand why the economics profession was/is so impressed by the Arrow-Debreu results. They establish that in an extremely abstract model of an economy, there exists a unique equilibrium with certain properties. The assumptions required to obtain the result make this economy utterly unlike anything in the real world. In effect, it tells us nothing at all. So why pay any attention to it? The attention, I suspect, must come from some prior fascination with the idea of competitive equilibrium, and a desire to see the world through that lens, a desire that is more powerful than the desire to understand the real world itself. This fascination really does hold a kind of deranging power over economic theorists, so powerful that they lose the ability to think in even minimally logical terms; they fail to distinguish necessary from sufficient conditions, and manage to overlook the issue of the stability of equilibria.
Almost a century and a half after Léon Walras founded neoclassical general equilibrium theory, economists still have not been able to show that markets move economies to equilibria. Read more…
After re-unification, the German unemployment rate has been way too high for way too long. Look here for excellent data from ‘Der Spiegel’, a German weekly. Comparing the 1991-2013 unemployment data with the 1950-1960 period of the ‘Wirtschaftswunder’, when the initially very high unemployment rate declined rapidly for 10 years in a stretch, the post re-unification era was a time of massive policy failings. Unemployment rates either increased or were stagnant for 14 years in a stretch. Neither the monetary union between West- and East-Germany nor the subsequent ‘internal devaluation’ policies (needed because the implied exchange rate of the East German mark was way too high to begin with, while subsequent developments led to an inflation shock in 1992-1994 which, alas, was not offset by external devaluation) led to any kind of rapid decline of unemployment. To the contrary. More successful economic policies would have led to lower unemployment and higher investment rates – is it too much to expect at least a temporary increase in the German investment rate after re-unification? It did not happen.
Smith’s commitment to “equity” for the working class was behind the vehemence of his opposition to mercantilist (“business economics”) arguments for policies that would protect or promote the profits of producers and intermediaries. Smith saw such pro-business arguments—which arguably persist as the core of neoliberalism (Harvey 2007)—whether for direct subsidies or competition-restricting regulations, as an intellectually bankrupt and often morally corrupt rhetorical veil for what were actually “taxes” upon the poor (what we now call “rents”). Such taxes are unjust and outrageous because they violate fair play both in the deceptive rhetoric by which they are advanced and by harming the interests of one group in society (generally, the poor and voiceless) to further the interests of another (unsurprisingly, the rich and politically connected). Smith explicitly moralised the point,
To hurt in any degree the interest of any one order of citizens, for no other purpose but to promote that of some other, is evidently contrary to that justice and equality of treatment which the sovereign owes to all the different orders of his subjects (WN IV.viii.30).