In our PhD Economics program at Stanford, we learnt nothing about the history of major economic events of the twentieth century. Instead, we were taught the rather arcane and difficult skill of building models. In order to analyse what would happen in an economy, we learnt that you have to construct an artificial economy, populated by rational robots called homo economicus, who behave according to strict mathematical laws. At no point in our studies were we asked to match what happens in our models with any events in the real world; it was assumed that the two always matched. This process of economic modelling permits us to provide exact mathematical answers to a vast range of questions one might ask about the economy. This is undoubtedly a powerful technique, which has earned economics the name “Queen of the Social Sciences”. Our poor cousins in political science, psychology, sociology, geography, and so on, have to study the more complex real world, and cannot offer anything comparable. Nonetheless, the power of mathematical modelling derives from the extremely unrealistic assumption that real world events and human behaviour can be predicted by mathematical formulae. Thus, the precise predictions of economists are often dramatically contradicted by real world outcomes. As Nobel Laureate Paul Krugman remarked after the global financial crisis took economists by surprise: “the economics profession went astray because economists, as a group, mistook beauty, clad in impressive-looking mathematics, for truth.” read more
from Lars Syll
In the standard mainstream economic analysis — take a quick look in e.g. Mankiw’s or Krugman’s textbooks — a demand expansion may very well raise measured productivity in the short run. But in the long run, expansionary demand policy measures cannot lead to sustained higher productivity and output levels.
In some non-standard heterodox analyses, however, labour productivity growth is often described as a function of output growth. According to the Verdoorn law, the rate of technical progress varies directly with the rate of growth. Growth and productivity are in this view highly demand-determined, not only in the short run but also in the long run.
Given that the Verdoorn law is operative, expansionary economic policies may actually lead to increases in productivity and growth. Living in a world permeated by genuine Keynes-type uncertainty, we cannot, of course, forecast with any great precision how large those effects would be.
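The demand-led mechanism described above can be sketched in a few lines. The Verdoorn law is usually written as a linear relation between labour-productivity growth and output growth, p = a + b·q; the coefficients below are hypothetical placeholders for illustration only (Verdoorn's original estimates put b at roughly 0.5), not estimates from any data set.

```python
# Illustrative sketch of the Verdoorn law: labour-productivity growth (p)
# as a linear function of output growth (q), p = a + b*q.
# The coefficients a and b are assumed, hypothetical values.

def verdoorn_productivity_growth(output_growth, a=0.01, b=0.5):
    """Return the labour-productivity growth implied by output growth."""
    return a + b * output_growth

# Under these assumed coefficients, a demand expansion that lifts output
# growth from 2% to 4% raises productivity growth from 2% to 3%.
for q in (0.02, 0.04):
    p = verdoorn_productivity_growth(q)
    print(f"output growth {q:.0%} -> productivity growth {p:.1%}")
```

The point of the sketch is only to show why, if some such relation holds, demand policy has long-run supply-side effects; whether it does hold is precisely the empirical question raised below.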
So, the nodal point is — has the Verdoorn Law been validated or not in empirical studies? Read more…
from Asad Zaman
Many leading economists have come to agree with Nobel Laureate Stiglitz that modern economic theory represents the triumph of ideology over science. One of the core victories of ideology is the famous Quantity Theory of Money (QTM). The QTM teaches us that money is a veil – it affects only prices, and has no real effect on the economy. One must look through this veil to understand the working of the real economy. Nothing could be further from the truth.
In fact, the QTM itself is a veil which hides the real and important functions of money in an economy. The Great Depression of 1929 opened everyone’s eyes to the crucial role money plays in the real economy. For a brief period afterwards, Keynesian theories emerged to illuminate the real role of money and to counteract the errors of orthodox economics. Orthodox economists believed in the QTM – that money doesn’t matter – and also that the free market automatically eliminates unemployment. Keynes started his celebrated book “The General Theory of Employment, Interest and Money” by asserting that both of these orthodox ideas were wrong. He explained why free markets cannot remove unemployment, and also how money plays a crucial role in creating full employment. He argued that in response to the Depression, the government should expand the money supply, create programs for employment, undertake expansionary fiscal policy, and run large budget deficits if necessary. read more
from Lars Syll
Thus your standard New Keynesian model will use Calvo pricing and model the current inflation rate as tightly coupled to the present value of expected future output gaps. Is this a requirement anyone really wants to put on the model intended to help us understand the world that actually exists out there? Thus your standard New Keynesian model will calculate the expected path of consumption as the solution to some Euler equation plus an intertemporal budget constraint, with current wealth and the projected real interest rate path as the only factors that matter. This is fine if you want to demonstrate that the model can produce macroeconomic pathologies. But is it a not-stupid thing to do if you want your model to fit reality?
I remember attending the first lecture in Tom Sargent’s evening macroeconomics class back when I was an undergraduate: a very smart man from whom I have learned an enormous amount, and well deserving of his Nobel Prize. But…
He said … we were going to build a rigorous, micro founded model of the demand for money: We would assume that everyone lived for two periods, worked in the first period when they were young and sold what they produced to the old, held money as they aged, and then when they were old used their money to buy the goods newly produced by the new generation of young. Tom called this “microfoundations” and thought it gave powerful insights into the demand for money that you could not get from money-in-the-utility-function models.
I thought that it was a just-so story, and that whatever insights it purchased for you were probably not things you really wanted to buy. I thought it was dangerous to presume that you understood something because you had “microfoundations” when those microfoundations were wrong. After all, Ptolemaic astronomy had microfoundations: Mercury moved more rapidly than Saturn because the Angel of Mercury beat his wings more rapidly than the Angel of Saturn and because Mercury was lighter than Saturn…
Brad DeLong is of course absolutely right here, and one could only wish that other mainstream economists would listen to him … Read more…
from Lars Syll
Listen to the program here.
Mainstream economists like Paul Krugman and Simon Wren-Lewis think that yours truly and other heterodox economists are wrong in blaming mainstream economics for not being real-world relevant and pluralist. To Krugman there is nothing wrong with ‘standard theory’ and ‘economics textbooks.’ If only policy makers and economists stuck to ‘standard economic analysis,’ everything would be just fine.
I’ll be dipped! If there’s anything the last decade has shown us, it is that economists have gone astray in their tool shed. Krugman’s ‘standard theory’ — mainstream neoclassical economics — has contributed to causing today’s economic crisis rather than to solving it.
One of the major problems of economics, even today, is to establish an empirical discipline that connects our theories and models to the actual world we live in. In that perspective I think it is necessary to replace both the theory and methodology of the predominant neoclassical paradigm. Giving up the neoclassical creed does not mean that we will have complete theoretical chaos.
The essence of neoclassical economic theory is its exclusive use of a deductivist Euclidean methodology. A methodology – which Arnsperger & Varoufakis [2006:12] call the neoclassical meta-axioms of “methodological individualism, methodological instrumentalism and methodological equilibration” – that is more or less imposed as constituting economics, and, usually, without a smack of argument. Hopefully this book will manage to convey the need for an articulate feasible alternative – an alternative grounded on a relevant and realist open-systems ontology and a non-axiomatic methodology where social atomism and closures are treated as far from ubiquitous.
from Lars Syll
Ultimately, the problem isn’t with worshipping models of the stars, but rather with uncritical worship of the language used to model them, and nowhere is this more prevalent than in economics. The economist Paul Romer at New York University has recently begun calling attention to an issue he dubs ‘mathiness’ – first in the paper ‘Mathiness in the Theory of Economic Growth’ (2015) and then in a series of blog posts. Romer believes that macroeconomics, plagued by mathiness, is failing to progress as a true science should, and compares debates among economists to those between 16th-century advocates of heliocentrism and geocentrism. Mathematics, he acknowledges, can help economists to clarify their thinking and reasoning. But the ubiquity of mathematical theory in economics also has serious downsides: it creates a high barrier to entry for those who want to participate in the professional dialogue, and makes checking someone’s work excessively laborious. Worst of all, it imbues economic theory with unearned empirical authority.
From the times of Galileo and Newton, physicists have learned not to confuse what is happening in the model with what is instead happening in reality. Physical models are compared with observations to test whether they are able to provide precise explanations … Can one argue that the use of mathematics in neoclassical economics serves similar purposes? … Gillies‘s conclusion is that, while in physics mathematics was used to obtain precise explanations and successful predictions, one cannot draw the same conclusion about the use of mathematics in neoclassical economics in the last half century. This analysis reinforces the conclusion about the pseudo-scientific nature of neoclassical economics … given the systematic failure of predictions of neoclassical economics.
Francesco Sylos Labini is a researcher in physics. His book is highly recommended reading for anyone with an interest in understanding the pseudo-scientific character of modern mainstream economics. Turning economics into a ‘pseudo-natural-science’ is — as Keynes made very clear in a letter to Roy Harrod as far back as 1938 — something that has to be firmly ‘repelled.’
from Lars Syll
It may be argued … that the betting quotient and credibility are substitutable in the same sense in which two commodities are: less bread but more meat may leave the consumer as well off as before. If this were so, then clearly expectation could be reduced to a unidimensional concept … However, the substitutability of consumers’ goods rests upon the tacit assumption that all commodities contain something — called utility — in a greater or less degree; substitutability is therefore another name for compensation of utility. The crucial question in expectation then is whether credibility and betting quotient have a common essence so that compensation of this common essence would make sense.
Just like Keynes underlined with his concept of ‘weight of argument,’ Georgescu-Roegen, with his similar concept of ‘credibility,’ underlines the impossibility of reducing uncertainty to risk and thereby being able to describe choice under uncertainty with a unidimensional probability concept. Read more…
from Asad Zaman
In the wake of the Global Financial Crisis (GFC 2007), the Queen of England asked academics at the London School of Economics why no one saw it coming. The US Congress constituted a committee to investigate the failure of economic theory to predict the crisis. Unfortunately, economists remain unable to answer this critical question. Some say that crises are like earthquakes, impossible to forecast. Others take refuge behind technical aspects of complex mathematical models. With monotonous regularity, more than 200 monetary crises have occurred globally ever since financial liberalization started in the 1980s. The methodology currently in use in economics systematically blinds economists to the root causes of these crises. Many leading economists have called for radical changes to bring economic theory into closer contact with reality.
Many who had hoped that the GFC would serve as a wake-up call for the profession have been extremely disappointed by subsequent developments. Although there has been a flurry of papers on various aspects of the crisis, there has been no fundamental re-thinking. Theories which assume that free markets will create full employment and maximal growth continue to be taught at universities. The efficient markets hypothesis of Eugene Fama says that stock market prices always correctly reflect the information available to the market, and that there is no possibility of a bubble – a systematic over-valuation of all stock market prices. Under the influence of this theory, Robert Shiller’s demonstration that stock market prices were over-inflated went unheeded. Similarly, warnings by Cassandras like Steve Keen, Raghuram Rajan, Dean Baker, and Nouriel Roubini were ridiculed and ignored by senior policy makers infatuated with free market dogma. read more
from Lars Syll
Roman Frydman is Professor of Economics at New York University and a long time critic of the rational expectations hypothesis. In his seminal 1982 American Economic Review article Towards an Understanding of Market Processes: Individual Expectations, Learning, and Convergence to Rational Expectations Equilibrium — an absolute must-read for anyone with a serious interest in understanding what is at issue in the present discussion of rational expectations as a modeling assumption — he showed that macroeconomic models founded on the rational expectations hypothesis are inadequate as representations of economic agents’ decision making. Read more…
from Gary Flomenhoft and the RWER’s current issue.
We first look at items produced for sale on the market, and in particular a competitive market. The conditions for maximizing consumer surplus are approached in some industries. The most obvious one is the microelectronics industry, where Moore’s law has prevailed for many decades since it was first stated in 1965, doubling computing power at the same price every 18-24 months. Competition between Intel, Samsung, Qualcomm, Micron, etc. is fierce, driving prices down while improving performance. The top 10 manufacturers in 2013, with market shares (from Wikipedia), are:
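The compounding behind the Moore's-law claim above is easy to make concrete. A minimal sketch, assuming only the 18- and 24-month doubling periods stated in the excerpt:

```python
# Back-of-the-envelope sketch of Moore's-law compounding: computing power
# at a given price doubling every 18-24 months. The doubling periods are
# the only inputs; the fold increase follows from compounding.

def fold_increase(years, doubling_months):
    """How many times over capability multiplies in `years`."""
    return 2 ** (years * 12 / doubling_months)

# Over a decade, an 18-month doubling compounds to roughly a 100-fold
# increase; a 24-month doubling to exactly 32-fold.
for months in (18, 24):
    print(f"{months}-month doubling over 10 years: {fold_increase(10, months):.0f}x")
```

The two-orders-of-magnitude gain per decade is what makes the fierce price-performance competition described above possible.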
from Frank Stilwell
Even if the economics profession continues to deflect the challenges posed by heterodox economists, substantial progress can be made in relation to cognate social sciences. This is a necessary element in a strategy for progress because mainstream economists working in universities usually resist attempts to reconstitute their discipline on genuinely pluralist principles. Marxist political economy, for example, can usually only get a hearing as an historically discredited view, while “old” institutionalism, if mentioned at all, is treated merely as a precursor to “new institutional economics”, which is more compatible with the neoclassical approach. Heterodox economists may get jobs in economics departments: some do, especially if their “deviance” develops after secure employment has been achieved, but they are often not replaced by people of similar inclination when they retire or move on.
from Lars Syll
The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be much less simple than the bare principle of uniformity. They appear to assume something much more like what mathematicians call the principle of the superposition of small effects, or, as I prefer to call it, in this connection, the atomic character of natural law. The system of the material universe must consist, if this kind of assumption is warranted, of bodies which we may term (without any implication as to their size being conveyed thereby) legal atoms, such that each of them exercises its own separate, independent, and invariable effect, a change of the total state being compounded of a number of separate changes each of which is solely due to a separate portion of the preceding state. We do not have an invariable relation between particular bodies, but nevertheless each has on the others its own separate and invariable effect, which does not change with changing circumstances, although, of course, the total effect may be changed to almost any extent if all the other accompanying causes are different. Each atom can, according to this theory, be treated as a separate cause and does not enter into different organic combinations in each of which it is regulated by different laws …
The scientist wishes, in fact, to assume that the occurrence of a phenomenon which has appeared as part of a more complex phenomenon, may be some reason for expecting it to be associated on another occasion with part of the same complex. Yet if different wholes were subject to laws qua wholes and not simply on account of and in proportion to the differences of their parts, knowledge of a part could not lead, it would seem, even to presumptive or probable knowledge as to its association with other parts. Given, on the other hand, a number of legally atomic units and the laws connecting them, it would be possible to deduce their effects pro tanto without an exhaustive knowledge of all the coexisting circumstances.
Keynes’ incisive critique is of course of interest in general for all sciences, but I think it is also of special interest in economics as a background to much of Keynes’ doubts about inferential statistics and econometrics.
from Asad Zaman
Modern economic theory is founded on the principle that human beings “maximize utility”; that is, they choose the best action from among a collection of choices. This axiom is considered self-evident: why would anyone make an inferior choice, when a better option is available? However, the mathematical formulation of this axiom is far from realistic. After all, it is self-evident that human behavior cannot be described by mathematical laws. Critics have invented the term “homo economicus” to describe behavior governed by economic laws, which differs drastically from normal human behavior. We can describe homo economicus as cold, calculating, and callous. We explain each of these terms separately.
The theory of consumer behavior which is taught in business school differs drastically from the same theory taught in the economics school. The homo economicus of economists is cold – not subject to any emotional influences in his consumption decisions. In complete contrast, a fundamental axiom of consumer theory in the business school is that effective marketing appeals to emotions instead of reason. The proven effectiveness of business school methods in getting consumers to purchase a wide range of completely useless goods shows the superiority of their models of human behavior. read more
from Lars Syll
In 1938 Paul Samuelson offered a replacement for the then accepted theory of utility. The cardinal utility theory had been discarded a couple of years earlier, but according to Samuelson, the ordinalist revision of utility theory was not drastic enough. One ought to analyze the consumer’s behaviour without having recourse to the concept of utility at all, since this did not correspond to directly observable phenomena.
The new theory’s main feature was a consistency postulate which said ‘if an individual selects batch one over batch two, he does not at the same time select two over one.’ From this ‘perfectly clear’ postulate and the assumptions of given demand functions and that all income is spent, Samuelson could derive all the main results of ordinal utility theory (single-valuedness and homogeneity of degree zero of demand functions, and negative semi-definiteness of the substitution matrix).
In 1950 Hendrik Houthakker made an amendment to the theory assuring integrability, and by that the theory had according to Samuelson been ‘brought to a close.’ According to Houthakker, the aim of the revealed preference approach was ‘to formulate equivalent systems of axioms on preferences and on demand functions.’
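Samuelson's consistency postulate (the weak axiom of revealed preference) is concrete enough to check mechanically against choice data. A minimal sketch: each observation pairs a price vector with the bundle chosen at those prices, and a violation occurs when each of two distinct bundles was affordable on the occasion the other was chosen. The data below are made up purely for illustration.

```python
# Sketch of Samuelson's consistency postulate: if batch one is selected
# when batch two was also affordable, the consumer must not at the same
# time select two over one. Observations are (prices, chosen bundle) pairs.

def affordable(bundle, prices, chosen):
    """Is `bundle` affordable at `prices`, given the budget spent on `chosen`?"""
    cost = sum(p * x for p, x in zip(prices, bundle))
    budget = sum(p * x for p, x in zip(prices, chosen))
    return cost <= budget

def satisfies_warp(observations):
    """Check all pairs of observations for a revealed-preference violation."""
    for prices_i, x_i in observations:
        for prices_j, x_j in observations:
            if x_i == x_j:
                continue
            # x_i revealed preferred to x_j AND x_j revealed preferred to x_i
            if affordable(x_j, prices_i, x_i) and affordable(x_i, prices_j, x_j):
                return False
    return True

# Consistent choices: at each price vector the other bundle was unaffordable.
consistent = [((1, 2), (4, 1)), ((2, 1), (1, 4))]
# Inconsistent choices: each bundle was affordable when the other was chosen.
violating = [((1, 1), (2, 0)), ((1, 1), (0, 2))]
print(satisfies_warp(consistent))   # True
print(satisfies_warp(violating))    # False
```

The function names and data are hypothetical; the point is only that the postulate, unlike the utility concept it replaced, bears directly on observable demand behaviour.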
from Lars Syll
Despite the rise of behavioral economics, many economists still believe that utility maximization is a good explanation of human behavior. Although evidence from experimental economics and elsewhere has rolled back the assumption that human agents are entirely self-interested, and shown that altruism and cooperation are important, a prominent response has been to modify individual preference functions so that they are “other-regarding”. But even with these modified preference functions, individuals are still maximizing their own utility.
Defenders of utility maximization rightly reject critical claims that specific behavioral outcomes undermine this assumption. They do not. But this is a sign of weakness rather than strength. The problem is that utility maximization is unfalsifiable as an explanation of behavior. As I show more fully in my 2013 book entitled From Pleasure Machines to Moral Communities, utility maximization can fit any real-world evidence, including behavior that appears to suggest preference inconsistency.
But note that utility maximization is not a tautology. Tautologies are true by assumption or definition. Utility maximization is not a tautology because it is potentially false. But empirically it is unfalsifiable.
Where does that leave us? Utility maximization can be useful as a heuristic modelling device. But strictly it does not explain any behavior. It does not identify specific causes. It cannot explain any particular behavior because it is consistent with any observable behavior. Its apparent universal power signals weakness, not strength.
Interesting post from one of yours truly’s favourite economists. Read more…
The graph below is from an open-access paper just published in Bulletin of the Atomic Scientists.
from Lars Syll
The desire in the profession to make universalistic claims following certain standard procedures of statistical inference is simply too strong to embrace procedures which explicitly rely on the use of vernacular knowledge for model closure in a contingent manner. More broadly, such a desire has played a vital role in the decisive victory of mathematical formalization over conventional verbally-based economic discourse as the principal medium of rhetoric, owing to its internal consistency, reducibility, generality, and apparent objectivity. It does not matter that [as Einstein wrote] ‘as far as the laws of mathematics refer to reality, they are not certain.’ What matters is that these laws are ‘certain’ when ‘they do not refer to reality.’ Most of what is evaluated as core research in the academic domain has little direct bearing on concrete social events in the real world anyway.