Macroeconomic aspirations

October 26, 2022 2 comments

from Lars Syll

Some economists seem to be overjoyed by the fact that they are using the same ‘language’ as real business cycle macroeconomists and that they can therefore somehow learn something from them.

James Tobin obviously did not find any need to speak the RBC ‘language’:

They try to explain business cycles solely as problems of information, such as asymmetries and imperfections in the information agents have. Those assumptions are just as arbitrary as the institutional rigidities and inertia they find objectionable in other theories of business fluctuations … I try to point out how incapable the new equilibrium business cycles models are of explaining the most obvious observed facts of cyclical fluctuations … I don’t think that models so far from realistic description should be taken seriously as a guide to policy … I don’t think that there is a way to write down any model which at one hand respects the possible diversity of agents in taste, circumstances, and so on, and at the other hand also grounds behavior rigorously in utility maximization and which has any substantive content to it.

Arjo Klamer, The New Classical Macroeconomics: Conversations with the New Classical Economists and their Opponents, Wheatsheaf Books, 1984

Using the same microfoundational ‘language’ as mainstream macroeconomists doesn’t take us very far. Far better than having a common ‘language’ is having a well-founded, realist, and relevant theory: Read more…

Neoclassical induced financial fragility. Central bank pension fund regulation edition.

October 23, 2022 3 comments

Financial wizardry recently caused massive problems for UK pension funds and the Bank of England. The Bank of England forces pension funds to take part in ‘LDI’ contracts which aim to insure against possible future liquidity problems. These contracts, however, led to real liquidity problems, which forced the Bank of England to intervene to prevent a market meltdown. The solution became the problem.

Deputy Governor Jon Cunliffe of the Bank of England stated:

“The Bank was informed by a number of LDI fund managers that, at the prevailing yields, multiple LDI funds were likely to fall into negative net asset value. As a result, it was likely that these funds would have to begin the process of winding up the following morning… In that eventuality, a large quantity of gilts, held as collateral by banks that had lent to these LDI funds, was likely to be sold on the market, driving a potentially self-reinforcing spiral and threatening severe disruption of core funding markets and consequent widespread financial instability.”

Notice the ultra-short periods which, presumably, are specified in the LDI contracts: ‘Cash, now!’. I haven’t read any of these contracts; if somebody can provide me with one, please do! I do not see any reason for such ultra-short periods.

This did not just happen in the UK. Related problems in the Netherlands in 2020 forced the ECB to intervene to prevent a market meltdown. This led Anil Kashyap, in a November 2020 speech at the Bank of England about the March 2020 crisis, to issue the following warning (emphasis added): Read more…
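The ‘self-reinforcing spiral’ in Cunliffe’s account has a simple mechanical core: falling gilt prices trigger collateral calls, collateral calls force sales, and forced sales push prices down further. The toy loop below sketches this; every number in it is hypothetical and invented purely for illustration, nothing comes from actual LDI contracts:

```python
# Toy model of the self-reinforcing spiral described above. All numbers
# are hypothetical: the point is only that when (price impact per unit
# sold) x (forced sales per unit of price drop) exceeds 1, each round
# of collateral calls amplifies the previous one.
price_impact = 0.5       # price drop per unit of forced gilt sales (assumed)
call_sensitivity = 2.5   # forced sales per unit of price drop (assumed)

shock = 5.0              # initial fall in gilt prices
price = 100.0 - shock
for round_no in range(5):
    sales = call_sensitivity * shock   # falling collateral values force sales
    shock = price_impact * sales       # forced sales depress prices further
    price -= shock
    print(f"round {round_no}: price fell to {price:.2f}")
```

With the product of the two sensitivities above 1, the rounds grow instead of dying out, which is exactly why an outside buyer of last resort was needed to break the loop.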

Weekend read – Can university education in economics contribute to strengthened democracy and peace?

October 21, 2022 4 comments

from Peter Söderbaum and WEA Commentaries current issue


In all societies there is a tension between democracy and dictatorship. In some countries democracy is well institutionalized and the threat of dictatorship is successfully kept at a distance. In other societies, a system close to dictatorship is well established and democracy is regarded as a threat. Today, we witness a confrontation between Russia, a nation close to dictatorship, and Ukraine, which appears to be moving toward strengthened democracy.

In this essay it is assumed that a strengthened democracy is preferable to dictatorship if one aims at a sustainable and peaceful relationship between single nations and regional assemblies of nations. Does the economics taught in universities play a positive role in strengthening democracy or is the idea rather for economists to be neutral and leave democracy to other disciplines, such as political science?

Two observations concerning the state of economic science

Until about 1870, economics was understood and referred to as “political economics”. From that time onwards, a group of economists preferred to see themselves as objective and neutral, comparable to how they understood the role of scholars in disciplines such as physics and chemistry. So-called “neoclassical economics” was born as a theoretical perspective that claims to be more objective and neutral in terms of values than its forerunners. Neoclassical economics then developed into a mainstream, and “political economics” became just “economics”. Read more…

The Nobel prize in economics — awarding popular misconceptions

October 19, 2022 1 comment

from Lars Syll

This year’s Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel honours Ben Bernanke, Douglas Diamond and Philip Dybvig. In the view of the Royal Swedish Academy of Sciences, the laureates ‘have significantly improved our understanding of the role of banks in the economy’.

But what is the role of banks in the economy? The academy describes it this way: ‘To understand why a banking crisis can have such enormous consequences for society, we need to know what banks actually do: they receive money from people making deposits and channel it to borrowers.’ According to this view, banks are thus pure intermediaries or dealers of savings between saving households and investing companies. This view is widespread in economics today, but there has long been a completely different theory of the function of banks.

This was formulated, among others, by Joseph Schumpeter. In his Theory of Economic Development Schumpeter wrote: ‘The banker, therefore, is not so much primarily a middleman in the commodity “purchasing power” as a producer of this commodity.’

In this vein, in 2014 the Bank of England affirmed: ‘Money creation in practice differs from some popular misconceptions—banks do not act simply as intermediaries, lending out deposits that savers place with them.’ Three years later, the Deutsche Bundesbank similarly spoke of the ‘popular misconception that banks act simply as intermediaries at the time of lending—ie that banks can only grant loans using funds placed with them previously as deposits by other customers’ …

It is hard to understand how the Swedish academy could decide to honour a theory which—due to its ‘real analysis’—is unsuitable to represent monetary processes in reality. In terms of economic policy, it makes a fundamental difference whether banks are merely intermediaries of savings or insurance companies or whether they are producers of purchasing power. The ‘real analysis’ was an important factor in the inability of the economics profession to anticipate the Great Financial Crisis in time.

After that painful experience, to extol with the Nobel prize for economics the intermediation approach to banking is akin to posthumously offering Ptolemy the prize for physics—because he discovered that the sun revolved around the earth.

Peter Bofinger

Yes indeed — money doesn’t matter in mainstream macroeconomic models. That’s true. According to the ‘classical dichotomy,’ real variables — output and employment — are independent of monetary variables, and so enable mainstream economics to depict the economy as basically a barter system.

But in the real world in which we happen to live, money certainly does matter. Money is not neutral and money matters in both the short run and the long run: Read more…
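The Bank of England’s point, that loans create deposits rather than pass along pre-existing savings, can be made concrete with a toy balance sheet. The structure below is invented purely for illustration:

```python
# Minimal sketch of endogenous money creation: granting a loan expands
# both sides of the bank's balance sheet at once. No pre-existing
# saver's deposit is needed at the moment of lending.
bank = {
    "assets": {"reserves": 100.0, "loans": 0.0},
    "liabilities": {"deposits": 0.0},
}

def grant_loan(bank, amount):
    """Create a loan asset and a matching deposit liability."""
    bank["assets"]["loans"] += amount          # new claim on the borrower
    bank["liabilities"]["deposits"] += amount  # new purchasing power
    # note: reserves are untouched -- nothing was 'lent out' of them

grant_loan(bank, 50.0)
print(bank)  # loans and deposits both rose by 50; reserves still 100
```

The contrast with the intermediation story is visible in the last comment: nothing was transferred from a saver to a borrower; purchasing power was produced, which is Schumpeter’s point.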

What is money?

October 17, 2022 2 comments

from Tony Lawson and real-world economics review issue no. 101

What is money?  Two sorts of answer to this question can be found in the modern literature.  One locates money’s nature in the organising structure of human communities, the other in intrinsic properties of particular (money) items (like commodities, debts, precious metals and so on). If accounts of money that draw on social positioning theory are instances of the former, a currently very popular and seemingly increasingly influential version of the latter takes the form of modern money theory (MMT). Notably, however, whilst contributors to social positioning theory have regularly concerned themselves with elaborating explicitly a conception of money’s nature, proponents of MMT rarely address the matter explicitly; mostly the notions entertained must be inferred from monetary assessments and policy proposals and the like.

An exception to the latter, though, is provided by the writings of Randall Wray, a central contributor to MMT; and Wray (2022) has produced a new book on money in which the topic is again addressed head on. As Wray observes it is difficult to have confidence that claims about monetary policy are coherent if we do not first understand the nature of what is being talked about. I agree with Wray on this, albeit defending a different conception drawing on social positioning theory.

Here I take the publication of Wray’s new book as affording an opportunity to contrast (I intend constructively) the two accounts of money in question. Read more…

Neoliberalism as an enabler of the spread of coronavirus

October 14, 2022 2 comments

from Imad A. Moosa and real-world economics review issue no. 101

The spread of the Coronavirus was aided by unpreparedness, the fact that the private sector cannot deal with a pandemic, neoliberal policy makers who could not care less about ordinary people, and years of dismantling public health systems through privatisation. Since the 1980s, belief in the power of the market has led to a status quo where governments take a back seat, allowing the private sector to steer the economy for the benefit of the oligarchy. As a result, governments have been put in a position where they are not always properly prepared and equipped to deal with crises such as Covid-19. Free markets cannot deal with a crisis of this magnitude. The economy is like the human body: a person who cuts himself shaving does not need the intervention of a surgeon, but the intervention of a surgeon is required when a person is involved in a major car accident.

Covid-19 is not a “black swan”, but rather a case of neglected risk, where neglect can be attributed to neoliberal thinking: to the belief that the market can solve any problem and that it does a better job than the public sector. Read more…

On the validity of econometric inferences

October 11, 2022 3 comments

from Lars Syll

The impossibility of proper specification is true generally in regression analyses across the social sciences, whether we are looking at the factors affecting occupational status, voting behavior, etc. The problem is that as implied by the three conditions for regression analyses to yield accurate, unbiased estimates, you need to investigate a phenomenon that has underlying mathematical regularities – and, moreover, you need to know what they are. Neither seems true. I have no reason to believe that the way in which multiple factors affect earnings, student achievement, and GNP have some underlying mathematical regularity across individuals or countries. More likely, each individual or country has a different function, and one that changes over time. Even if there was some constancy, the processes are so complex that we have no idea of what the function looks like.

Researchers recognize that they do not know the true function and seem to treat, usually implicitly, their results as a good-enough approximation. But there is no basis for the belief that the results of what is run in practice are anything close to the underlying phenomenon, even if there is an underlying phenomenon. This just seems to be wishful thinking. Most regression analysis research doesn’t even pay lip service to theoretical regularities. But you can’t just regress anything you want and expect the results to approximate reality. And even when researchers take somewhat seriously the need to have an underlying theoretical framework – as they have, at least to some extent, in the examples of studies of earnings, educational achievement, and GNP that I have used to illustrate my argument – they are so far from the conditions necessary for proper specification that one can have no confidence in the validity of the results.

Steven J. Klees

Most work in econometrics and regression analysis is done on the assumption that the researcher has a theoretical model that is ‘true.’ Read more…
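Klees’s point about unknown functional form is easy to demonstrate by simulation: if the true relation is nonlinear, a linear regression still returns a single confident-looking slope, and that slope is an artifact of where the data happen to lie. A minimal sketch, with a data-generating process invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented data-generating process: the marginal effect of x on y is not
# constant (dy/dx = x), so no single slope describes it.
x = rng.uniform(0, 10, 1000)
y = 0.5 * x**2 + rng.normal(0, 1, 1000)

# A linear regression nonetheless reports one confident-looking slope...
slope_full, _ = np.polyfit(x, y, 1)

# ...and that slope depends entirely on which range of x was sampled.
mask = x < 5
slope_low, _ = np.polyfit(x[mask], y[mask], 1)

print(f"slope, full sample: {slope_full:.2f}")   # around 5
print(f"slope, x < 5 only:  {slope_low:.2f}")    # around 2.5
```

Both regressions ‘work’ in the mechanical sense, yet neither number describes the true process; which one a researcher reports is decided by the sample, not the phenomenon.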

Weekend read – Four flaws in foundations of statistics

October 9, 2022 2 comments


from Asad Zaman and the WEA Pedagogy Blog

Rejecting Arbitrary Distributional Assumptions

In the early 20th Century, Sir Ronald Fisher initiated an approach to statistics that he characterized as follows: “… the object of statistical methods is the reduction of data. A quantity of data, which usually by its mere bulk is incapable of entering the mind, is to be replaced by relatively few quantities which shall adequately represent the whole …” As he clearly indicates, we want to reduce the data because our minds cannot comprehend large amounts of data. Therefore, we want to summarize the data in a few numbers which adequately represent the whole data set.

It should be obvious from the start that this is an impossible task. One cannot reduce the information contained in 1000 points of data to two or three numbers; there must be a loss of information in this process. Fisher developed a distinctive methodology, which is still at the heart of conventional statistics. The central element of this methodology was an ASSUMPTION – the data is a random sample from a larger population, where the larger population is characterized by a few key parameters. Under these assumptions, the key parameters which characterize the larger population would be sufficient to characterize the data set at hand. Given this structure, Fisher showed that there were “sufficient statistics” – a small set of numbers that captured all of the information available in the data. Thus, once in possession of the sufficient statistics, the data analyst could actually throw away the original data, as all relevant information from the data set had been captured in the sufficient statistics. Our goal in this section is to explain how this methodology works, why it was a brilliant contribution of Fisher in his time, and why this methodology is now obsolete and a handicap to progress in statistics. Read more…
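Fisher’s construction can be made concrete. For an assumed i.i.d. normal sample, the triple (n, Σx, Σx²) is sufficient: once those three numbers are stored, the maximum-likelihood estimates of the population parameters are recoverable and the raw data can be discarded. A minimal sketch, with sample size and parameters chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(loc=3.0, scale=2.0, size=1000)  # assumed i.i.d. normal sample

# Under the normality ASSUMPTION, three numbers summarise everything:
n = data.size
s1 = data.sum()         # sum of observations
s2 = (data**2).sum()    # sum of squared observations

# The raw 1000 points can now be thrown away: the maximum-likelihood
# estimates of the population mean and variance are recoverable from
# (n, s1, s2) alone.
mean_hat = s1 / n
var_hat = s2 / n - mean_hat**2

print(f"mean estimate: {mean_hat:.3f}")  # close to the true mu = 3
print(f"var  estimate: {var_hat:.3f}")   # close to the true sigma^2 = 4
```

Notice where the work is done: the ‘reduction of data’ is loss-free only because the normal family was assumed in advance. If the data came from some other process, (n, s1, s2) silently discards exactly the information that would reveal it.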

The economist’s oath

October 8, 2022 1 comment

from Lars Syll

— I will remember that I didn’t make the world, and it doesn’t satisfy my equations.

— Though I will use models boldly to estimate value, I will not be overly impressed by mathematics.

— I will never sacrifice reality for elegance without explaining why I have done so.

— Nor will I give the people who use my model false comfort about its accuracy. Instead, I will make explicit its assumptions and oversights.

— I understand that my work may have enormous effects on society and the economy, many of them beyond my comprehension.

Emanuel Derman & Paul Wilmott

Read more…

Mainstream economics — a form of brain damage

October 7, 2022 1 comment

from Lars Syll

It is difficult to understand why mainstream economists keep on using their unreal and irrelevant models! Sure, you get academic accolades and give the impression of having something deep and ‘scientific’ to say, but that should count for nothing if you’re in the truth business. As long as that kind of modelling output doesn’t come with the accompanying warning text “NB! These are model-based results resting on tons of more or less unsubstantiated assumptions,” we should keep on scrutinising and criticising it.

Yours truly appreciates scientists like David Suzuki. With razor-sharp intellects, they immediately go for the essentials. They have no time for bullshit. And neither should we.

We need our Hutton

October 7, 2022 4 comments

from Peter Radford

The question is: how does economics get its much-needed revamp?

This caught my eye:

“Debreu noted in his Nobel Prize lecture that the success of the mathematization of economic theory depended “on the fact that the commodity space has the structure of a real vector space”. We have shown that this is incorrect. The “price vector” is not a vector, and GET [General Equilibrium Theory] is therefore false. But we may go further and assert that not only was the proof incorrect, what was set out to be proved was not true in the first place. The real economy cannot be brought into equilibrium by adjusting prices. And indeed, the real economy is never in equilibrium.”

That’s the concluding paragraph in Philip George’s paper in the recently published Real World Economics Review #101.

The emperor, apparently, has no clothes.

But, then, we all knew that, didn’t we?

I wrote earlier this week about the difficulty we have in determining the efficacy of a supposed body of knowledge.  The arbiters of knowledge have a vested interest in maintaining the outward appearance of whatever it is they study.  They act like a priesthood intoning in ancient languages and using secret signs to distinguish themselves from the ordinary folk whom they intend to control or influence.  The problem is that we, those of us on the outside, can only rely on those arbiters for assurance that the efficacy they proclaim for themselves is actually, well, efficacious.  Worse, within a wide discipline such as economics, or applied mathematics as it has now become, the various sub-specialities are so specialized and the knowledge so arcane that anyone not within close proximity to it is unable to offer an opinion as to its validity.

This has become a fundamental and defining issue within economics.  The discipline needs a good jolt of reality.  It needs a new direction.  It needs to shake off the errors of its past and begin anew.

Read more…

Statistical models and the assumptions on which they build

October 6, 2022 2 comments

from Lars Syll

Every method of statistical inference depends on a complex web of assumptions about how data were collected and analyzed, and how the analysis results were selected for presentation. The full set of assumptions is embodied in a statistical model that underpins the method … Many problems arise however because this statistical model often incorporates unrealistic or at best unjustified assumptions …

The difficulty of understanding and assessing underlying assumptions is exacerbated by the fact that the statistical model is usually presented in a highly compressed and abstract form—if presented at all. As a result, many assumptions go unremarked and are often unrecognized by users as well as consumers of statistics. Nonetheless, all statistical methods and interpretations are premised on the model assumptions; that is, on an assumption that the model provides a valid representation of the variation we would expect to see across data sets, faithfully reflecting the circumstances surrounding the study and phenomena occurring within it.

Sander Greenland et al.

If anything, the common abuse of statistical tests underlines how important it is not to equate science with statistical calculation. Read more…

The market did not cause inequality, no matter how much the New York Times insists

October 5, 2022 5 comments

from Dean Baker

It is a complete article of faith in intellectual circles that the market is responsible for the rise in inequality that we have seen in the United States and elsewhere over the last half-century. Intellectual types literally cannot even consider the alternative that inequality was the result of government policies, not the natural workings of the market.

The standard line is that technology and globalization were responsible for the increasing gap in income between people with college, especially advanced, degrees and non-college-educated workers. This belief that market forces drove inequality and not policy is apparently central to the identity of its beneficiaries, who determine what appears in major news outlets.

In this way, the belief in the market causes of inequality can be similar to the belief among Trumpers that the 2020 election was stolen from Trump. They simply do not even want to see the issue debated.

Spencer Bokat-Lindell: The Latest Perp

My current prompt to make my standard complaint is a column by New York Times columnist Spencer Bokat-Lindell which raises the question, “Is liberal democracy dying?” While the causes of growing inequality are not directly the piece’s topic, the issue comes up at several points.

For example, in discussing the rise of authoritarian sentiments among the masses, he tells readers: Read more…

The price of economics

October 4, 2022 4 comments

from Peter Radford

Thank you Mariano Torras.

You said the following in a letter to the Financial Times:

“I would venture that there is a professional motive for perpetuating — through the use of elegant and abstract models — the fantasy that economics is a science.  The prestige, the stature and influence that such a myth permits is undeniable.  Yet, far more perniciously, the ostensible neutrality of “economic science” provides seemingly unshakeable ideological cover against critics who (more realistically) accentuate power, inequality, and politics.”

That about sums it up.

Economists do not study economies.  They study economics.  They study their own models and other stylized facts in ever more detail and abstraction.  They have substituted technical wizardry for contemplation.  They privilege mathematics over other kinds of analysis.  And they scrupulously avoid entanglement with history which might drag them into a conversation about just how they arrived atop Mount Economics far above the plains of reality below.

Professor Torras inspired me to dig out the following:

“More generally, how does a scholarly community determine that a proof is valid, especially when the proof is highly complex and when there are few people in the community with the technical skill to understand the proof? And what might “understanding a proof” entail?”

That’s Roy Weintraub speaking in his excellent book “How Economics Became a Mathematical Science”.  Go read the book.  It’s highly instructive.

But it isn’t my intention to poke too much at the mathematicians we now call economists.  It isn’t for me to say whether economics has progressed as a consequence of restricting itself to the confines of applied mathematics.  The profession seems comfortable to be so restricted.  It wallows in its arcane nook and appears content to portray itself as a repository of analytical capability rather than of economic insight.  That’s all we need to know.

A broader question is worth delving into though.

How do people outside of the profession know that what economists state as knowledge is actually worth knowing?  How does anyone not within the hallowed halls and not grounded in the accepted or iconic modeling know that the knowledge professed to be possessed by economists is actually worth anything?  Anything at all? Read more…

Liz Truss. Or: how not to pay for the war

October 1, 2022 3 comments

The Dutch September HICP inflation rate was 17.1%. One year ago it was 3.0%. Below, I will argue that this is a sign of a kind of war economy, not of a cyclically overheated economy. Ways to mitigate inflation were pioneered by the English economist John Maynard Keynes in his ‘How to Pay for the War‘. It’s useful to go back to his ideas.

The first version was published in three parts in The Times in November 1939. It was based partly on his experiences in World War I and partly on the new system of national accounting (extended and improved by Keynes). The ideas rested upon the notion of a monetary economy in which consumer spending and consumer prices, production and producer prices, and the use of factors of production and factor prices (wages, profits, interest, rents) are intertwined. During a war, this system could lead to consumer and producer price inflation, resulting in wartime profits on one side and poverty on the other. Keynes proposed changes to the system which would mitigate producer and consumer price inflation as well as wartime profits. Fun fact: it’s about the opposite of the Liz Truss UK budget. Keynes’s central idea: we have to understand inflation not just as an increase in consumer prices but as interconnected changes in consumer, producer, and factor prices, financed by income as well as by borrowing/(forced) saving. Especially during a war, these connections will change in unwanted ways; policies to mitigate this can be enacted as long as we understand inflation as a system of interconnected expenditure, output, and factor prices. Fun fact: Keynes acknowledges ‘Prof. von Hayek’ for the idea of a post-war levy on capital.

Read more…


September 28, 2022 6 comments

from Lars Syll

It may be argued … that the betting quotient and credibility are substitutable in the same sense in which two commodities are: less bread but more meat may leave the consumer as well off as before. If this were so, then clearly expectation could be reduced to a unidimensional concept … However, the substitutability of consumers’ goods rests upon the tacit assumption that all commodities contain something — called utility — in a greater or less degree; substitutability is therefore another name for compensation of utility. The crucial question in expectation then is whether credibility and betting quotient have a common essence so that compensation of this common essence would make sense.

Just like Keynes underlined with his concept of ‘weight of argument,’ Georgescu-Roegen, with his similar concept of ‘credibility,’ underscores the impossibility of reducing uncertainty to risk and thereby being able to describe choice under uncertainty with a unidimensional probability concept.

In ‘modern’ macroeconomics — Dynamic Stochastic General Equilibrium, New Synthesis, New Classical, and New ‘Keynesian’ — variables are treated as if drawn from a known ‘data-generating process’ that unfolds over time and on which we, therefore, have access to heaps of historical time-series. If we do not assume that we know the ‘data-generating process’ – if we do not have the ‘true’ model – the whole edifice collapses. And of course, it has to. Who honestly believes that we should have access to this mythical Holy Grail, the data-generating process? Read more…

new issue of real-world economic review

September 22, 2022 Comments off

real-world economic review

issue no. 101   –   September 2022

download whole issue


Two conceptions of the nature of money 
Tony Lawson          2

How power shapes our thoughts
Asad Zaman         20

SARS-CoV-2: The Neoliberal Virus
Imad A. Moosa          27

The giant blunder at the heart of General Equilibrium Theory
Philip George          38

A life in development economics and political economy: An interview with Jayati Ghosh
Jayati Ghosh and Jamie Morgan          44

John Komlos and the Seven Dwarfs
Junaid B. Jahangir          65

A three-dimensional production possibility frontier with stress
John Komlos           76

Free trade theory and reality:
How economists have ignored their own evidence for 100 years

Jeff Ferry          83

The choice of currency and policies for an independent Scotland:
A debate through the lenses of different economic paradigms

Alberto Paloni          90

End Matter 107

please support this journal

China is the world’s largest economy: Get over it

September 16, 2022 4 comments

from Dean Baker

It is common for politicians and pundit types to speculate on when or whether China’s economy will pass the US economy as the world’s largest. The latest episode to cross my path was a column by David Wallace in the New York Times.

There is little reason for this sort of speculation: China is already the world’s largest economy. Its economy is more than 20 percent larger than the US economy, according to the IMF. Furthermore, it is growing considerably more rapidly (assuming China doesn’t continue its zero-COVID policy forever), so it is projected to be more than a third larger than the US economy by the end of the decade.

Here’s the picture.

Source: International Monetary Fund.

Read more…
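Baker’s projection is just compound growth. With illustrative growth rates (assumed here, not the IMF’s exact figures) of roughly 5 percent a year for China and 2 percent for the US, a 20 percent gap in 2022 compounds well past a third by 2030:

```python
# Compound-growth sketch of the projection above. The growth rates are
# assumed for illustration (~5%/yr for China, ~2%/yr for the US);
# only the 20 percent starting gap comes from the text.
ratio_2022 = 1.20
years = 8  # 2022 -> 2030
ratio_2030 = ratio_2022 * (1.05 / 1.02) ** years
print(f"China/US GDP ratio in 2030: {ratio_2030:.2f}")  # about 1.51
```

Even noticeably smaller growth differentials leave the ratio above 1.33, which is why the “when will China pass the US” framing misses the point.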

The danger of teaching the wrong thing all too well

September 12, 2022 Leave a comment

from Lars Syll

It is well known that even experienced scientists routinely misinterpret p-values in all sorts of ways, including confusion of statistical and practical significance, treating non-rejection as acceptance of the null hypothesis, and interpreting the p-value as some sort of replication probability or as the posterior probability that the null hypothesis is true …

It is shocking that these errors seem so hard-wired into statisticians’ thinking, and this suggests that our profession really needs to look at how it teaches the interpretation of statistical inferences. The problem does not seem just to be technical misunderstandings; rather, statistical analysis is being asked to do something that it simply can’t do, to bring out a signal from any data, no matter how noisy. We suspect that, to make progress in pedagogy, statisticians will have to give up some of the claims we have implicitly been making about the effectiveness of our methods …

It would be nice if the statistics profession was offering a good solution to the significance testing problem and we just needed to convey it more clearly. But, no, … many statisticians misunderstand the core ideas too. It might be a good idea for other reasons to recommend that students take more statistics classes—but this won’t solve the problems if textbooks point in the wrong direction and instructors don’t understand what they are teaching. To put it another way, it’s not that we’re teaching the right thing poorly; unfortunately, we’ve been teaching the wrong thing all too well.

Andrew Gelman & John Carlin

Teaching both statistics and economics, yours truly can’t but notice that Read more…
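Gelman and Carlin’s warning that significance testing cannot ‘bring out a signal from any data’ can be illustrated by simulation: when a true effect is small relative to the noise, the estimates that happen to clear p < 0.05 systematically exaggerate it (what they call a ‘type M’ error). A minimal sketch with invented numbers:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented setup: a small true effect measured with a noisy design.
# Filtering on p < 0.05 then guarantees that the surviving estimates
# overstate the truth ("type M" / magnitude error).
true_effect = 0.1
se = 0.15                 # standard error of each study's estimate (assumed)
n_studies = 100_000

estimates = rng.normal(true_effect, se, n_studies)
significant = np.abs(estimates) > 1.96 * se   # the conventional cutoff

exaggeration = np.abs(estimates[significant]).mean() / true_effect
print(f"share reaching significance: {significant.mean():.1%}")
print(f"mean exaggeration among significant results: {exaggeration:.1f}x")
```

Roughly one study in ten ‘finds’ the effect, and those that do overstate it several-fold on average: the filter, not the data, manufactures the signal.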

Weekend read – The big myth on inequality: it just happened

September 9, 2022 5 comments

from Dean Baker

The standard line in policy circles about the soaring inequality of the last four decades is that it is just an unfortunate outcome of technological change. As a result of technological developments, education is much more highly valued and physical labor has much less value. The drop in relative income for workers without college degrees is unfortunate and provides grounds for lots of hand wringing and bloviating in elite media outlets, but hey, what can you do?

Manufacturing plays a central role in this story since it has historically been the major source of high-paying jobs for workers without college degrees. Manufacturing jobs offered a pay premium of almost 17.0 percent in the 1980s. This had fallen sharply by the start of the last decade and had largely disappeared in more recent years.

This decline in the wage premium has coincided with a plunge in unionization rates in manufacturing. Approximately 20 percent of manufacturing workers were unionized at the start of the 1980s. In 2021 just 7.7 percent of manufacturing workers were in unions, only slightly higher than the average of 6.1 percent in the private sector. Read more…
