Author Archive

Oil, gas and coal 2040 (4 graphs)

February 24, 2018 5 comments

Where does inflation hide?

February 22, 2018 18 comments

from Herman Daly

The talking heads in the media explain the recent fall in the stock market as follows:

A fall in unemployment leads to a tight labor market and the prospect of wage increases; wage increases lead to the threat of inflation, which makes the Fed likely to raise interest rates; higher rates would mean less borrowing, less investment in stocks, and consequently an expected fall in stock prices. Therefore investors (speculators) rush to sell before the expected fall in stock prices happens, bringing about the very fall they expected. So the implicit conclusion is that rising wages for the bottom 90% are bad for “the economy”, while an increase in the unearned incomes (lightly taxed capital gains) of the top 10% is good for “the economy”. The financial news readers of the corporate media avoid making that grotesque conclusion explicit, but it is implicit in their explanation.

A wage increase, in addition to cutting into profits, is considered inflationary, and that leads the Fed to raise interest rates and choke off the new money feeding the stock market boom and related growth euphoria. But higher interest rates serve other functions, most notably to keep capital from being wasted on uneconomic projects that are financially lucrative only at zero or negative interest rates. Furthermore, positive interest rates reward savers, provide for retirement and emergencies, and even reduce the inflationary effect of consumer spending.  Read more…

The debate continues in the same absurd, polarized and simplified form.  

February 20, 2018 5 comments

from Neva Goodwin 

One of the outstanding features of the time in which we live is the terrifying prospect of global climate change, regarding which it has been said that contemporary humankind is suffering from “Pre-Traumatic Stress Disorder”. Whether we squarely face what this will likely mean for the coming years, or whether we simply can’t bear to look at the facts, it is getting ever harder to avoid the gut-knowledge that the world is rapidly becoming markedly less beautiful, rich and generous to its human inhabitants. Tens of thousands of species disappear forever every year. Large coastal land areas will be submerged; diseases will multiply and spread; food from the oceans and the climate-stressed fields will be scarce; fresh water will be expensive or unobtainable for ever more millions of people; environmental refugees will swell the ranks of unwelcome migrants; and armed conflicts will reach many people who had assumed they were safe.

Armed fortress living will be increasingly common among the rich, and will doubtless create some areas of relative security, but the people inside will be their own prisoners. They will find it difficult to visit the beautiful natural areas in the United States, or the cultural jewels of other continents. Many of these cultural jewels are already being sacked in the raging conflicts of the Middle East and elsewhere; many of the world’s natural beauties are already eroding under pressure from climate change – as well as from actors in the market economy. The rich are not immune to pre-traumatic stress, as this century heads for various forms of catastrophe; their awareness and response will be important for any hope we may have for a constructive response to the threats we face. An indicator of awareness is a comment by the investor Seth Klarman, warning that the Trump administration could lead to a major stock market correction and “global angst” among the investor class. But some of that angst is already translating into escapist survivalism among those who can afford to buy land in New Zealand, or build bunkers out of former missile sites in the U.S. The work of Dr Richard Rockefeller, to whom this piece is dedicated, is an example of a more responsible kind of reaction among the one percent.  Read more…

Who’s Afraid of John Maynard Keynes?

February 19, 2018 7 comments

from James Galbraith and the Journal of Behavioral and Experimental Economics

Paul Davidson, Who’s Afraid of John Maynard Keynes? Challenging Economic Governance in an Age of Growing Inequality, 2017, Palgrave Macmillan

Paul Davidson, in his ninth decade, has produced a crisp and clear exegesis of essential Keynesian ideas and the critical failures of so-called mainstream economic thought. The most critical flaw lies in the treatment of time. Rooted in ancient ideas of equilibrium, harmony and social balance, mainstream economics treats the future as an extrapolation of the past, predictable except for random errors, which are called “risk.” This, as Davidson insists, is incurably incorrect: there is uncertainty, and at any time financial markets are prone to collapse in a failed flight to safety, which drains liquidity and deprives both financial and physical assets of their market value.

From this it follows that in the social sphere any model that projects the future from the past will fail from time to time. The models work so long as things do not change! As for change, for turning points, they nevertheless occur; and when they do, those who believe most in the model will prepare the least and be hurt the worst. And yet, for the economy to function, “belief” in the model – at the least, conditional belief sufficient to motivate consumption and investment – appears essential. Without it, the private economy cannot prosper. Living in a house of cards is better than having no house at all.  Read more…

The third axiom of neoclassical economics: methodological equilibration

February 18, 2018 4 comments

from Christian Arnsperger and Yanis Varoufakis

The third feature of neoclassical economics is, on our account, the axiomatic imposition of equilibrium. The point here is that, even after methodological individualism turned into methodological instrumentalism, prediction at the macro (or social) level was seldom forthcoming. Determinacy required something more: it required that agents’ instrumental behaviour be coordinated in such a manner that aggregate behaviour becomes sufficiently regular to give rise to solid predictions. Thus, neoclassical theoretical exercises begin by postulating the agents’ utility functions, specifying their constraints, and stating their ‘information’ or ‘belief’. Then, and here is the crux, they pose the standard question: “What behaviour should we expect in equilibrium?” The question of whether an equilibrium is possible, let alone probable, or how it might materialise, is treated as an optional extra; one that is never central to the neoclassical project.

The reason for the axiomatic imposition of equilibrium is simple: it could not be otherwise! By this we mean that neoclassicism cannot demonstrate that equilibrium would emerge as a natural consequence of agents’ instrumentally rational choices. Thus, the second-best methodological alternative for the neoclassical theorist is to presume that behaviour hovers around some analytically discovered equilibrium and then to ask whether, once at that equilibrium, the ‘system’ has a propensity to stay there or to drift away (what is known as ‘stability analysis’).  Read more…
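A minimal toy sketch of what such ‘stability analysis’ looks like in practice (the model, parameter values and function names below are illustrative assumptions of mine, not taken from the authors): a single market with linear demand and supply, its analytically derived equilibrium price, and a tâtonnement rule that adjusts the price in proportion to excess demand. The exercise then asks whether a price perturbed away from equilibrium returns to it or drifts off.

```python
# Toy "stability analysis" for one market (illustrative assumptions throughout).
# Demand: D(p) = a - b*p   Supply: S(p) = c + d*p   Equilibrium: D(p*) = S(p*)
# Tatonnement: the price adjusts in proportion to excess demand D(p) - S(p).

def excess_demand(p, a=10.0, b=1.0, c=2.0, d=1.0):
    return (a - b * p) - (c + d * p)

def equilibrium_price(a=10.0, b=1.0, c=2.0, d=1.0):
    return (a - c) / (b + d)  # the analytically "discovered" equilibrium

def tatonnement(p0, speed=0.3, steps=50):
    p = p0
    for _ in range(steps):
        p += speed * excess_demand(p)
    return p

if __name__ == "__main__":
    p_star = equilibrium_price()                 # 4.0 with these parameters
    for start in (0.5 * p_star, 1.5 * p_star):   # perturb below and above
        print(f"start {start:.2f} -> after adjustment {tatonnement(start):.2f} "
              f"(equilibrium {p_star:.2f})")
```

With these parameters the process converges back to the equilibrium (the adjustment multiplier 1 − speed·(b + d) has absolute value below one); raise speed above 1 and the same system overshoots and diverges. Whether any real economy is well described by such a mechanism is, of course, precisely what is in dispute.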

Polanyi’s six points

February 17, 2018 3 comments

from Asad Zaman

The analysis of Polanyi’s Great Transformation can be summarized in the six points listed below.

1: All societies face the economic task of producing and providing for all members of society. Modern market societies are unique in assigning this responsibility to the marketplace, thereby creating entitlements to production for those with wealth, and depriving the poor of entitlement to food. All traditional societies have used non-market mechanisms based on cooperation and social responsibility to provide for members who cannot take care of their own needs. It is only in a market society that education, health, housing, and social welfare services are available only to those who can pay for them.

2: Market mechanisms for providing goods to members conflict with other social mechanisms and are harmful to society. They rose to central prominence in Europe after a protracted battle, which markets won over society owing to certain historical circumstances peculiar to Europe. The rise of markets caused tremendous damage to society, which continues to this day. The replacement of key mechanisms governing social relations with ones compatible with market mechanisms was traumatic to human values. Land, labour and money are crucial to the efficient functioning of a market economy. Market societies convert these into commodities, causing tremendous damage. This involves (A) changing a nurturing and symbiotic relationship with Mother Earth into a commercial one of exploiting nature, (B) changing relationships based on trust, intimacy and lifetime commitments into short-term impersonal commercial transactions, and (C) turning human lives into saleable commodities in order to create a labor market.  Read more…

The second axiom of neoclassical economics: methodological instrumentalism

February 17, 2018 2 comments

from Christian Arnsperger and Yanis Varoufakis

We label the second feature of neoclassical economics methodological instrumentalism: all behaviour is preference-driven or, more precisely, it is to be understood as a means for maximising preference-satisfaction.[1] Preference is given, current, fully determining, and strictly separate from both belief (which simply helps the agent predict uncertain future outcomes) and from the means employed. Everything we do and say is instrumental to preference-satisfaction, so much so that there is no longer any philosophical room for questioning whether the agent will act on her preferences. In effect, neoclassical theory is a narrow version of consequentialism in which the only consequence that matters is the extent to which a homogeneous index of preference-satisfaction is maximised.[2]

Methodological instrumentalism’s roots are traceable to David Hume’s Treatise of Human Nature (1739/40), in which the Scottish philosopher famously divided the human decision-making process into three distinct modules: Passions, Belief and Reason. Passions provide the destination; Reason slavishly steers a course that attempts to get us there, drawing upon a given set of Beliefs regarding the external constraints and the likely consequences of alternative actions. It is not difficult to see the lineage with standard microeconomics: the person is defined as a bundle of preferences, her beliefs reduce to a set of subjective probability density functions, which help convert her preferences into expected utilities, and, lastly, her Reason is the cold-hearted optimiser whose authority does not extend beyond maximising those utilities. However, it is a mistake to think that Hume would have approved. For his Passions are too unruly to fit neatly into some ordinal or expected utility function. It took the combined efforts of Jeremy Bentham and the late 19th-century neoclassicists to tame the Passions sufficiently that they could first be reduced to a unidimensional index of pleasure and then turned into smooth, twice-differentiable utility functions.  Read more…
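The formal structure being described can be stated in one line (a standard textbook rendering, in notation of my own rather than the authors’): the agent’s ‘Reason’ selects an action a from a feasible set A so as to maximise expected utility, given a subjective density f over states of the world s (her ‘Beliefs’) and a utility index u encoding her ‘Passions’:

\[
\max_{a \in A}\; \mathbb{E}\!\left[u \mid a\right]
\;=\;
\max_{a \in A} \int_{S} u\big(x(a,s)\big)\, f(s)\,\mathrm{d}s ,
\]

where x(a, s) is the outcome of action a in state s. Nothing in this formulation leaves any room for asking whether the agent will act on her preferences – which is exactly the point being made above.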

The first axiom of neoclassical economics: methodological individualism

February 16, 2018 1 comment

from Christian Arnsperger and Yanis Varoufakis

Unsophisticated critics often identify economic neoclassicism with models in which all agents are perfectly informed. Or fully instrumentally rational. Or excruciatingly selfish. Defining neoclassicism in this manner would perhaps have been apt in the 1950s but, nowadays, it leaves almost all of modern neoclassical theory out of the definition, thereby strengthening the mainstream’s rejoinders. Indeed, the last thirty years of neoclassical economics have been marked by an explosion of models in which economic actors are imperfectly informed, sometimes other-regarding, frequently irrational (or boundedly rational, as the current jargon would have it), etc. In short, Homo Economicus has evolved to resemble us more.

None of these brilliant theoretical advances has, however, dislodged the neoclassical vessel from its methodological anchorage. Neoclassical theory retains its roots firmly within liberal individualist social science. The method is still unbendingly of the analytic-synthetic type: the socio-economic phenomenon under scrutiny is to be analysed by focusing on the individuals whose actions brought it about; understanding fully their ‘workings’ at the individual level; and, finally, synthesising the knowledge derived at the individual level in order to understand the complex social phenomenon at hand. In short, neoclassical theory follows the method of the watchmaker who, faced with a strange watch, studies its function by focusing on understanding, initially, the function of each of its cogs and wheels. To the neoclassical economist, the latter are the individual agents who are to be studied, like the watchmaker’s cogs and wheels, independently of the social whole their actions help bring about.  Read more…

The semantics of mathematical equilibrium theory

February 12, 2018 41 comments

from Michael Hudson

If mathematics is deemed to be the new language of economics, it is a language with a thought structure whose semantics, syntax and vocabulary shape its user’s perceptions. There are many ways in which to think, and many forms in which mathematical ideas may be expressed. Equilibrium theory, for example, may specify the conditions in which an economy’s public and private-sector debts may be paid. But what happens when not all these debts can be paid? Formulating economic problems in the language of linear programming has the advantage of enabling one to reason in terms of linear inequality, e.g., to think of the economy’s debt overhead as being greater than, equal to, or less than its capacity to pay.
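The kind of linear inequality Hudson has in mind can be written out explicitly (an illustrative formulation in my own notation, not taken from the post): compare the economy’s scheduled debt service with the income realistically available to meet it,

\[
\underbrace{(i + a)\,D}_{\text{interest plus amortisation on the debt overhead}}
\;\;\gtreqless\;\;
\underbrace{s\,Y}_{\text{share of income available to pay creditors}} ,
\]

where D is the outstanding debt stock, i the average interest rate, a the amortisation rate, Y current income and s the fraction of it that can be diverted to debt service. Equilibrium formulations implicitly assume the left-hand side never exceeds the right; a linear-programming statement allows the “greater than” case, and its consequences, to be posed directly.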

An array of mathematical modes of expression thus is available to the economist. Equilibrium-based entropy theory views the economy as a thermodynamic system characterized by what systems analysts call negative feedback. Chaos theories are able to cope with the phenomena of increasing returns and compound interest, which are best analyzed in terms of positive feedback and intersecting trends. Points of intersection imply that something has to give and the solution must come politically from outside the economic system as such.

What determines which kind of mathematical language will be used? At first glance it may seem that if much of today’s mathematical economics has become irrelevant, it is because of a fairly innocent reason: it has become a kind of art for art’s sake, prone to self-indulgent game theory. But almost every economic game serves to support an economic policy.  Read more…

Let us reconsider the idea of demographic transition.

February 11, 2018 4 comments

from Herman Daly

By definition this is the transition from a human population maintained by high birth rates equal to high death rates, to one maintained by low birth rates equal to low death rates, and consequently from a population with low average lifetimes to one with high average lifetimes. Statistically such transitions have often been observed as standard of living increases. Many studies have attempted to explain this correlation, and much hope has been invested in it as an automatic cure for overpopulation. “Development is the best contraceptive” is a related slogan, partly based in fact, and partly in wishful thinking.

There are a couple of thoughts I’d like to add to the discussion of demographic transition. The first and most obvious one is that populations of artifacts can undergo an analogous transition from high rates of production and depreciation to low ones. The lower rates will maintain a constant population of longer-lived, more durable artifacts. Our economy has a GDP-oriented focus on maximizing production flows (birth rates of artifacts) that keeps us in the pre-transition mode, giving rise to low product lifetimes, planned obsolescence, and high resource throughput, with consequent environmental destruction. The transition from a high maintenance throughput to a low one applies to both human and artifact populations independently. From an environmental perspective, lower throughput per unit of stock (longer human and product lifetimes) is desirable in both cases, at least up to some distant limit.
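The arithmetic behind this point is a simple steady-state stock-flow identity (a standard relation, stated here in my own notation rather than Daly’s): for any population, of people or of artifacts, maintained at a constant size,

\[
\text{stock} \;=\; \text{throughput} \times \text{average lifetime},
\qquad\text{so}\qquad
\text{average lifetime} \;=\; \frac{\text{stock}}{\text{throughput}} .
\]

Holding the stock constant, halving the production-and-depreciation flow doubles average lifetimes; maximising the flow, which is what a GDP-oriented economy rewards, does the opposite.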

Read more…

The absence of a theory of public economy in today’s economics

February 10, 2018 13 comments

from June Sekera

More than a century ago, the effective operation of the public economy was a significant, active concern of economists. With the insurgence of market-centrism and rational choice economics, however, government was devalued, its role circumscribed and seen from a perspective of “market failure.” As Backhouse (2005) has shown, the transformation in economic thinking in the latter half of the 20th century led to a “radical shift” in worldview regarding the role of the state. The very idea of a valid, valuable public non-market has almost disappeared from sight.

Read more…

Cashlessness poses a challenge to democracy itself.

February 2, 2018 20 comments

The Indian experience suggests that the obsession with digital transactions as a marker of social and material progress may be misplaced and even counterproductive. Indeed, policy attempts to push a rapid transition to cashlessness may be both infeasible and regressive. Cashlessness relies on very substantial development of infrastructure, universal access to banking, and strong and reliable internet connectivity – and while it provides convenience, it also can lead to greater monitoring and cyber-insecurity. Even in favourable conditions, it should essentially be a choice rather than an imposition for most small transactions, and the shift must be based on people preferring digital payments for their convenience rather than being driven by the physical absence of cash. There is a deeper issue here: forcing people to go cashless by reducing the currency in circulation amounts to an infringement of their civil liberties even as it transfers incomes to financial intermediaries.

The specific form of financialization exhibited in the enforced push to e-transactions is therefore an extreme example of a coercive strategy that purports to provide convenience and formalization, but actually increases inequality. It assists the generation of profits for financial companies by adding another layer of costs to systems of payment. In doing so, it not only makes those involved in such transactions poorer to that extent; it also renders them more vulnerable to all-encompassing monitoring and surveillance, as well as data and identity theft. This particular form of surplus extraction by finance therefore also poses a challenge to democracy itself.

C.P. Chandrasekhar and Jayati Ghosh

Time in economic theory

January 31, 2018 22 comments

from Frank Salter and RWER issue 81

Despite economic cycles being the norm from the beginnings of the industrial revolution, major areas of economic thought present equilibrium as an appropriate basis for analysis. Blaug (1998, p.23) comments “indeed real business cycle theory is, like new classical macroeconomics, a species of the genus of equilibrium explanations of the business cycle (which would yesteryear have been considered an oxymoron).” The formal treatment of time is eschewed.

In their articles, The Production Function and the Theory of Capital, both Robinson (1953) and Solow (1955) express their concerns about the use of time in economic analysis. Robinson points out that time is unidirectional in the real world, and that some mathematical descriptions fail to reflect the fact. Solow (1955, p.102) expresses his concerns, “But the real difficulty of the subject comes… from the intertwining of past, present and future.”

Robinson (1980) continues to voice her critical assessment of the treatment of time in economic analysis. Later, no longer maintaining the concerns of his earlier insight, Solow (1994, p.47) states “Substitution along isoquants is routine stuff.”   Read more…

The crisis in economics education

January 27, 2018 4 comments

from Michel Zouboulakis and RWER issue 82

An outstanding neoclassical microeconomist, Hal Varian, asked the emphatic question “What use is economic theory?” To answer the question, he started by recognizing the obvious: “Economics is a policy science and, as such, the contribution of economic theory to economics should be measured on how well economic theory contributes to the understanding and conduct of economic policy” (1997, 109). But this acknowledgement should have led Varian in the opposite direction to the one he took. Instead, he claimed that although “it offers a useful insight in explaining an economic phenomenon” (ib., 115), “no theory in Economics is ever exactly true” (sic), since – as Friedman said 44 years ago – it focuses unilaterally on one dimension of economic phenomena. Varian feels comfortable in admitting that “any method is better than none” (ib., 116), even if it leads to error.[1] What a rigorous theorist should do instead is to promote only theories based on assumptions that sufficiently correspond to the operating frame of the real economy.

A commonly held view is that the Great Depression established Keynesian macroeconomics. However, specialists know that it also greatly facilitated the process of mathematical formalization. A plausible explanation refers to the labor market’s demand for economists: business and research institutions wanted more technically skilled economists instead of broadly educated ones. The same demand for technical expertise was explicit in organizations such as the IMF, the OECD and, even more, the Rand Corporation. Thus,  Read more…

The Volatility Index and Heisenberg’s uncertainty principle

January 18, 2018 51 comments

from Donald MacKenzie and the London Review of Books

The VIX, or Volatility Index, is Wall Street’s fear gauge. I first started paying attention to it in the late 1990s. Back then, a level of around 20 seemed normal. If the index got to 30, that was an indication of serious market unease; over 40 signalled a crisis. The highest the VIX ever got was during the 1987 stockmarket crash, when it reached 150. In the 2008 global banking crisis, it peaked at just below 90.

The US economy has gradually recovered from the banking crisis, and the newly legislated tax cuts will further boost corporate profitability. These effects, though, are now ‘priced in’: share prices have already risen to reflect them. Tax cuts aside, the political system remains largely paralysed. The Federal Reserve seems likely to continue raising interest rates, which usually isn’t good news for the price of shares, and is beginning the process of weaning markets off the flood of cheap money that has helped inflate share prices. The tax cuts will most likely increase the Federal deficit. Add in a president who is the very opposite of calm (and who is under FBI investigation), and you might expect the VIX to be approaching the sweaty-palmed 30s. It isn’t. As this issue of the LRB went to press, the VIX was 9.8. It has been low for many months, and shows no clear sign of increasing.

Donald Trump would no doubt attribute the low readings to investors’ confidence in his leadership. But I have my doubts. There is an alternative explanation. Heisenberg’s uncertainty principle is often taken to mean that whenever you measure something, you alter it. In the everyday world, you can usually set this aside: I don’t worry about the effect of the speedometer on how fast my car’s wheels turn or on how its engine runs. You can’t ignore it, though, in economic life. As Charles Goodhart argues, if a measurement device is widely used, it stops being a simple economic speedometer. In the financial markets, it becomes part of how traders think, and can then begin to affect how they act.

Read more here in the LRB

Trump trashes the United States while Xi tries to build a “sustainable China”

January 14, 2018 3 comments

from Richard Smith and issue 82 of the RWER

As Trump walked away from the Paris climate accord, Xi announced his intention to “take the driver’s seat in international cooperation to respond to climate change”. Not only that, but Xi’s government has also pledged to wipe out the last vestiges of poverty in China by 2030 and turn China into a “moderately prosperous society” where the basic needs of all, including jobs, housing, and healthcare, are met. Trump, as we know, has different priorities: tax cuts for the rich.

In short, the contrasts between Donald Trump and Xi Jinping could hardly be more striking. Little wonder, then, that more and more people around the world look to China to take the lead in saving us from climate collapse.

Systemic drivers of destruction

Alas, that is not going to happen. I don’t doubt Xi’s earnest intentions. But for all of that, I’m going to argue here that Xi Jinping cannot lead the fight against global warming because he runs a political-economic system characterised by systemic growth drivers – the need to maximise growth beyond any market rationality, the need to maximise employment, and the need to maximise consumerism – which are, if anything, even more powerful and even more eco-suicidal than those of “normal” capitalism in the West, but which Xi is powerless to alter.

Read more…

Some suggestions for reorienting economics and philosophy of economics

January 9, 2018 30 comments

from Gustavo Marqués

If it is assumed that (a) both agents and theorists are aware they are facing an uncertain context, and (b) they hold epistemic and ontological beliefs consistent with this state of affairs, then the proper way to approach economic phenomena should be very different from the approaches that guide current modeling practice. In particular, instead of mechanisms or economic regularities that keep running independently of agents’ expectations, the decisive role of lobbyists within open-ended and uncertain processes based on expectations should be incorporated into the analysis. The following assumptions could be the philosophical core of a new conceptual framework for economics:

1) There are economic processes based on expectations and characterized by radical uncertainty. Agents involved in such processes act in two different ways (as decision-makers or as lobbyists).

2) Ex-ante knowledge of invariant sequences of events is generally not possible (because there are few if any sequences of this kind); more importantly, such knowledge is unnecessary as support and justification for the implementation of economic policies.

3) The role of theoretical practice is to identify the many feasible “branches” of a “tree of plausible outcomes” as well as the restrictions that each sequence of events faces.

4) It is not known (and it is not possible to know) ex-ante what “branches” of the tree (what sequences of feasible alternative events) will prevail. Science cannot help us with this.

5) Other types of knowledge (common and practical knowledge as well as practical skills) are crucial for shaping those processes. It is a sort of know-how knowledge, closer to management and administration than to scientific economics.

6) Although – as was shown in point (3) – theoretical practice has an important role to play in shaping processes, what is crucial in this endeavor is another practice, which we denote as lobbying (interventional) practice, and which is performed by a wide range of economic players (mostly different kinds of interest groups who are able to operate on the relevant context and on agents’ expectations).  Read more…

WEA Commentaries – new issue

January 5, 2018 Leave a comment

Volume 7, Issue No. 6  Download the issue as a PDF

In this issue
A Philosophical Framework for Rethinking Theoretical Economics and Philosophy of Economics
Gustavo Marqués

Interview on The Fascist Nature of Neoliberalism
Flavia Di Mario

Doomed to Repeat
Peter Radford

The Invisible Hand in Context

Global Rentier Capitalism
David F Ruccio

Perfect Competition and Counterfactuals
Stuart Birks

Keynes was right about Quantitative Easing
Merijn Knibbe

How UBER Money Dominates and Distorts Economic Research on Ride-Hailing Platforms
Norbert Häring

Announcements and WEA contact details

Please support the WEA by paying a membership fee
or making a small donation.

Making merry on Bitcoin

January 4, 2018 11 comments

from C.P. Chandrasekhar

Bitcoin has left the world of finance gasping. Though the total market value of all of that cryptocurrency in circulation is only a fraction of the value of the world’s financial assets, the rapid rise in the value of the currency has made it the most wanted of those assets. On January 1, 2017, the currency was trading at between $972 and $990 a unit. By December 7 that range had risen to $14,063 to $17,363. According to a calculation by Reuters, an investment of $1000 in bitcoins at the beginning of 2013 would be worth around $1.2 million now. Sensing the opportunity this offers by serving as a platform for speculation, the Chicago Board Options Exchange and the Chicago Mercantile Exchange launched bitcoin futures on December 10, with the contracts opening at $15,460 a unit and rising more than 20 per cent to $18,700, before shedding some of those initial gains. With these developments, a shadowy currency with a still-elusive originator named Satoshi Nakamoto moved to centre stage in financial markets.
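A back-of-the-envelope check on the Reuters figure (the early-2013 price below is an illustrative assumption of roughly $13 per bitcoin, not a number taken from the article; the late-2017 price is the futures opening level quoted above):

```python
# Rough reproduction of the "$1,000 invested at the beginning of 2013" calculation.
# ASSUMPTION: bitcoin traded at roughly $13 in early January 2013 (illustrative);
# the late-2017 price of ~$15,500 is the futures opening level quoted above.

initial_investment = 1_000.0
price_early_2013 = 13.0       # assumed, for illustration only
price_late_2017 = 15_500.0

coins_bought = initial_investment / price_early_2013
value_late_2017 = coins_bought * price_late_2017

print(f"coins bought: {coins_bought:.1f}")
print(f"value in late 2017: ${value_late_2017:,.0f}")  # on the order of $1.2 million
```

The result, roughly $1.2 million, matches the order of magnitude of the Reuters calculation.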

Bitcoin’s role has been in question ever since its launch in 2009. What its advocates regard as its strength – decentralised management by a community that can ensure integrity through verification of transactions over a ‘public’, peer-to-peer network – many of its critics see as its weakness, because of the absence of an issuing authority in the form of a central bank and the backing of a state. Moreover, while there are a few establishments that accept bitcoin payments, the currency is still nowhere near a ubiquitous means of payment in day-to-day exchange and commodity circulation. At the end of 2013 researchers from the University of California, San Diego and George Mason University estimated that 64 per cent of the 12 million bitcoins that had been mined till then had never been used in an exchange transaction.   Read more…

Perfect competition and counterfactuals

January 4, 2018 3 comments

from Stuart Birks and the new issue of WEA Commentaries 7(6)

From p. 17 of Birks, S. (2015). Rethinking economics: from analogies to the real world. Singapore: Springer.

Market failure is defined in comparison to the ideal of perfect competition. An alternative is needed for comparison, and value judgments must be applied to justify one situation being considered superior to another. This raises two questions:

(i) Is perfect competition the right ‘ideal’?

(ii) If it is, then given that the counterfactual is an important aspect of any policy analysis, should economic analyses compare a real situation with an unattainable ideal such as perfect competition?  Read more…