from Lars Syll
The increasing use of natural and quasi-natural experiments in economics over the last couple of decades has led some economists to declare triumphantly that it marks a major step on a recent path toward empirics: instead of being a deductive philosophy, economics is now increasingly becoming an inductive science.
In their plea for this view, the work of Joshua Angrist and Jörn-Steffen Pischke is often invoked, so let’s start with one of their later books and see if there is any real reason to share the optimism about this ‘empirical turn’ in economics.
In their new book, Mastering ‘Metrics: The Path from Cause to Effect, Angrist and Pischke write: Read more…
from Lars Syll
Lynn Parramore: It seems obvious that both fundamentals and psychology matter. Why haven’t economists developed an approach to modeling stock-price movements that incorporates both?
Roman Frydman: It took a while to realize that the reason is relatively straightforward. Economists have relied on models that assume away unforeseeable change. As different as they are, rational expectations and behavioral-finance models represent the market with what mathematicians call a probability distribution – a rule that specifies in advance the chances of absolutely everything that will ever happen.
In a world in which nothing unforeseen ever happened, rational individuals could compute precisely whatever they had to know about the future to make profit-maximizing decisions. Presuming that they do not fully rely on such computations and resort to psychology would mean that they forego profit opportunities.
from Lars Syll
By the early 1980s it was already common knowledge among people I hung out with that the only way to get non-crazy macroeconomics published was to wrap sensible assumptions about output and employment in something else, something that involved rational expectations and intertemporal stuff and made the paper respectable. And yes, that was conscious knowledge, which shaped the kinds of papers we wrote.
More or less says it all, doesn’t it?
And for those of you who do not want to play according to these sickening hypocritical rules — well, here’s one good alternative.
This book describes the many wrong turns that the social sciences have taken to arrive at their current dismal state. Most of the space will be devoted to my own field of economics; however, political science, particularly the theory of collective choice, will also be treated at some length. Though I agree with much of the criticism of contemporary economics, particularly macroeconomics, that comes from the Left of the political spectrum, this and the following two books differ from the conventional critique in fundamental ways.
The first is that I define each wrong turn by contrasting it with what in my view would have been the correct turn, in the sense that it would have advanced the field in a positive direction. Thus, unlike much contemporary criticism, mine is not purely negative – the negative is always contrasted with a possible positive.
Secondly, I do not concentrate on the past several decades that saw the rise of neoclassical economics along with neoliberal economic policies. Instead, I begin with the rise of classical economics and the works of William Petty and Adam Smith. The rejection of all economic theory that is, or has been in some sense mainstream, is in my view an act of non-constructive nihilism.
The topic of this book is how economics came to its present state. What were the valid ideas discovered along the way and why were they lost? What motivated the wrong turns along the way? What role did ideologies of various kinds play in this process? The serious student will find many ideas to challenge him.
from Lars Syll
Stylized facts are close kin of ceteris paribus laws. They are ‘broad generalizations true in essence, though perhaps not in detail’. They play a major role in economics, constituting explananda that economic models are required to explain. Models of economic growth, for example, are supposed to explain the (stylized) fact that the profit rate is constant. The unvarnished fact of course is that profit rates are not constant. All sorts of non-economic factors — e.g., war, pestilence, drought, political chicanery — interfere. Manifestly, stylized facts are not (what philosophers would call) facts, for the simple reason that they do not actually obtain.
It might seem then that economics takes itself to be required to explain why known falsehoods are true. (Voodoo economics, indeed!) This can’t be correct. Rather, economics is committed to the view that the claims it recognizes as stylized facts are in the right neighborhood, and that their being in the right neighborhood is something economic models should account for. The models may show them to be good approximations in all cases, or where deviations from the economically ideal are small, or where economic factors dominate non-economic ones. Or they might afford some other account of their often being nearly right. The models may diverge as to what is actually true, or as to where, to what degree, and why the stylized facts are as good as they are.
But to fail to acknowledge the stylized facts would be to lose valuable economic information (for example, the fact that if we control for the effects of such non-economic interference as war, disease, and the president for life absconding with the national treasury, the profit rate is constant.) Stylized facts figure in other social sciences as well. I suspect that under a less alarming description, they occur in the natural sciences too. The standard characterization of the pendulum, for example, strikes me as a stylized fact of physics.
The motion of the pendulum which physics is supposed to explain is a motion that no actual pendulum exhibits. What such cases point to is this: The fact that a strictly false description is in the right neighborhood sometimes advances understanding of a domain.
Catherine Elgin thinks we should accept model claims when we consider them to be ‘true enough,’ and Uskali Mäki has argued in a similar vein, maintaining that it could be warranted — based on diverse pragmatic considerations — to accept model claims that are negligibly false.
from Shimshon Bichler and Jonathan Nitzan
from Lars Syll
In econometrics one often gets the feeling that many of its practitioners think of it as a kind of automatic inferential machine: input data and out comes causal knowledge. This is like pulling a rabbit from a hat. Great — but first you have to put the rabbit in the hat. And this is where assumptions come into the picture.
As social scientists — and economists — we have to confront the all-important question of how to handle uncertainty and randomness. Should we equate randomness with probability? If we do, we have to accept that to speak of randomness we also have to presuppose the existence of nomological probability machines, since probabilities cannot be spoken of – and actually, to be strict, do not at all exist – without specifying such system-contexts.
Accepting a domain of probability theory and a sample space of “infinite populations” — which is legion in modern econometrics — also implies that judgments are made on the basis of observations that are actually never made! Infinitely repeated trials or samplings never take place in the real world. So that cannot be a sound inductive basis for a science with aspirations of explaining real-world socio-economic processes, structures or events. It’s not tenable.
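To make concrete what judgments ‘based on observations that are actually never made’ amounts to, here is a small illustrative sketch (the normal data-generating process, its parameters, and the sample size are assumptions for the demo, not anything claimed in the post): the textbook standard error of a sample mean is defined over endless hypothetical re-samplings from a stipulated ‘probability machine’ — repetitions a real-world analyst can only simulate, never observe.

```python
import numpy as np

rng = np.random.default_rng(42)

# In the real world we get ONE sample of, say, 50 observations...
sample = rng.normal(loc=100.0, scale=15.0, size=50)
textbook_se = sample.std(ddof=1) / np.sqrt(len(sample))

# ...but the frequentist standard error is *defined* by imagining endless
# repetitions of the sampling from a known data-generating process:
repeated_means = [rng.normal(100.0, 15.0, 50).mean() for _ in range(100_000)]
frequentist_se = np.std(repeated_means)

# The two agree here only because we STIPULATED the generating mechanism;
# with real-world data no such machine is given to us.
print(f"one-sample SE estimate: {textbook_se:.3f}")
print(f"SE over 100,000 imagined re-samplings: {frequentist_se:.3f}")
```

The point of the toy example is not that the arithmetic fails — it is that the second number is only computable because the ‘infinite population’ was written into the simulation by hand.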
from Lars Syll
To achieve explanatory success, a theory should, minimally, satisfy two criteria: it should have determinate implications for behavior, and the implied behavior should be what we actually observe. These are necessary conditions, not sufficient ones. Rational-choice theory often fails on both counts. The theory may be indeterminate, and people may be irrational.
In what was perhaps the first sustained criticism of the theory, Keynes emphasized indeterminacy, notably because of the pervasive presence of uncertainty. His criticism applied especially to cases where agents have to form expectations about the behavior of other agents or about the development of the economy in the long run. In the wake of the current economic crisis, this objection has returned to the forefront. Before the crisis, going back to the 1970s, the main objections to the theory were based on pervasive irrational behavior. Experimental psychology and behavioral economics have uncovered many mechanisms that cause people to deviate from the behavior that rational-choice theory prescribes.
Disregarding some more technical sources of indeterminacy, the most basic one is embarrassingly simple: how can one impute to the social agents the capacity to make the calculations that occupy many pages of mathematical appendixes in the leading journals of economics and political science and that can be acquired only through years of professional training? …
I believe that much work in economics and political science that is inspired by rational-choice theory is devoid of any explanatory, aesthetic or mathematical interest, which means that it has no value at all. I cannot make a quantitative assessment of the proportion of work in leading journals that fall in this category, but I am confident that it represents waste on a staggering scale.
Elster’s article is essential reading for all those who want to understand why mainstream – neoclassical – economists have actively contributed to causing today’s economic crisis rather than to solving it.
from Lars Syll
A common idea among mainstream — neoclassical — economists is the idea of science advancing through the use of ‘as if’ modeling assumptions and ‘successive approximations’. But is this really a feasible methodology? I think not.
Most models in science are representations of something else. Models “stand for” or “depict” specific parts of a “target system” (usually the real world). All theories and models have to use sign vehicles to convey some kind of content that may be used for saying something of the target system. But purpose-built assumptions — like “rational expectations” or “representative actors” — made solely to secure a way of reaching deductively validated results in mathematical models, are of little value if they cannot be validated outside of the model.
All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.
from Maria Alejandra Madi and WEA Pedagogy Blog
The WEA On-line Conferences format, designed by Edward Fullbrook and Grazia Ietto-Gillies, makes full use of digital technologies in pursuit of the commitments included in the World Economics Association Manifesto: plurality, competence, reality and relevance, diversity, openness, outreach, ethical conduct, and global democracy. The WEA On-line Conferences also seek to engage graduate and undergraduate students, addressing: (a) the variety of theoretical perspectives; (b) the range of human activities and issues which fall within the broad domain of economics; and (c) the study of the world’s diverse economies.
The current conference is The European crisis. It is being led by distinguished professors Victor Becker, Beniamino Moro and James Galbraith. The purpose of the online Conference is to analyze the current crisis in the countries of the Eurozone. After the 2008 financial meltdown, the American crisis soon infected the European financial system, becoming both a sovereign debt crisis and a banking debacle in many peripheral Euro area countries. The European crisis has shown that crises can spread quickly among closely integrated economies. The implementation of austerity policies, prompted by the Troika (European Commission, European Central Bank and the IMF), has reinforced a spiral of economic contractions and provoked a rising political rebellion against austerity, inspired in part (especially in Spain, but also to a degree in Greece) by the successful exit from crisis of the South American countries in the past decade. The conference would especially like to address the questions of social stabilization, strategies for structural reform and economic growth, and monetary, financial and debt management that may be used to frame a new economic model for Europe.
The Discussion Forum is now open. The interactive format of the Conferences provides an on-line forum for visitors and commentators. All participants will be able to send comments on specific papers, or to contribute to a general discussion on the conference theme. The Leaders of the conference moderate comments prior to posting, to ensure no libellous or hateful language appears.
Within the Discussion Forum, students share thoughts, review the ideas of others and explore new perspectives. The WEA leaders encourage students to submit comments on the papers at http://europeancrisis2015.weaconferences.net/papers/. read more
from Lars Syll
Macroeconomic forecasts produced with macroeconomic models tend to be little better than intelligent guesswork. That is not an opinion – it is a fact. It is a fact because for decades many reputable and long-standing model-based forecasters have looked at their past errors, and that is what they find. It is also a fact because we can use models to generate standard errors for forecasts, as well as the most likely outcome that gets all the attention. Doing so indicates errors of a similar magnitude to those observed from past forecasts. In other words, model-based forecasts are predictably bad …
I think it is safe to say that this inability to accurately forecast is unlikely to change anytime soon. Which raises an obvious question: why do people still use often elaborate models to forecast? …
It makes sense for both monetary and fiscal authorities to forecast. So why use the combination of a macroeconomic model and judgement to do so, rather than intelligent guesswork? (Intelligent guesswork here means some atheoretical time series forecasting technique.) The first point is that it is not obviously harmful to do so …
from Lars Syll
Walked-out Harvard economist Greg Mankiw has more than once tried to defend the 1 % by invoking Adam Smith’s invisible hand:
[B]y delivering extraordinary performances in hit films, top stars may do more than entertain millions of moviegoers and make themselves rich in the process. They may also contribute many millions in federal taxes, and other millions in state taxes. And those millions help fund schools, police departments and national defense for the rest of us …
[T]he richest 1 percent aren’t motivated by an altruistic desire to advance the public good. But, in most cases, that is precisely their effect.
When reading Mankiw’s articles on the “just desert” of the 1 % one gets a strong feeling that Mankiw is really trying to argue that a market economy is some kind of moral free zone where, if left undisturbed, people get what they “deserve.” Read more…
from Maria Alejandra Madi and the WEA Pedagogy Blog
More recently, the internet has enabled the transformation of traditional work under Fordism into knowledge work, characteristic of post-Fordism. In knowledge work, multi-tasking workers are integrated into flat hierarchical structures, in contrast to the centralized large corporation, e.g., General Motors. As a result, communication channels have been re-defined, with greater involvement of lower-level employees in decision-making. Knowledge work includes new employment practices, such as time flexibility, teleworking and alternative payment schemes, along with employee empowerment and autonomy, task rotation and multi-skilling, and team work and team autonomy. Potential consequences include the fragmentation of work, crowdsourcing and the virtualization of work.
Indeed, technological change has significantly transformed the labour market as the result of the diffusion of innovative practices at the micro-level. Crowdsourcing, for example, is the outsourcing of tasks to a large, undefined group of people in an open call. Considering this background, current challenges in working conditions are also related to the emergence of a crowd of freelancers available and able to quickly do the necessary tasks. The cloud-based work environment is characterized by five essential characteristics: on-demand service; broad access; resource pooling; rapid elasticity; and measured service (Ipeirotis, 2012). read more
from Lars Syll
Using formal mathematical modeling, mainstream economists certainly can guarantee that the conclusions hold given the assumptions. However, the validity we get in abstract model worlds does not automatically transfer to real-world economies. Validity may be good, but it isn’t enough. From a realist perspective, both relevance and soundness are sine qua non.
In their search for validity, rigour and precision, mainstream macro modellers of various ilks construct microfounded DSGE models that standardly assume rational expectations, Walrasian market clearing, unique equilibria, time invariance, linear separability and homogeneity of both inputs/outputs and technology, infinitely lived intertemporally optimizing representative household/ consumer/producer agents with homothetic and identical preferences, etc., etc. At the same time the models standardly ignore complexity, diversity, uncertainty, coordination problems, non-market clearing prices, real aggregation problems, emergence, expectations formation, etc., etc.
from Asad Zaman and the WEA Pedagogy Blog
The vision of a government of the people, by the people and for the people is enchanting, and powerfully attractive to the masses yearning to be free. However, the title of Nobel laureate Joseph Stiglitz’s article, Of the 1%, by the 1%, and for the 1% is a far more accurate description of the reality of US democracy. Prophetically, Eisenhower had warned against the threat to democracy posed by the powerful military-industrial complex. Today the power of a tiny minority to control the US, and thence the world, exceeds his worst nightmares. read more
from Lars Syll
Standard new Keynesian macroeconomics essentially abstracts away from most of what is important in macroeconomics. To an even greater extent, this is true of the dynamic stochastic general equilibrium (DSGE) models that are the workhorse of central bank staffs and much practically oriented academic work.
Why? New Keynesian models imply that stabilization policies cannot affect the average level of output over time and that the only effect policy can have is on the amplitude of economic fluctuations, not on the level of output. This assumption is problematic at a number of levels …
The problem has always been that it is difficult to beat something with nothing. This may be changing as topics like hysteresis, secular stagnation, and multiple equilibrium are getting more and more attention …
As macroeconomics was transformed in response to the Depression of the 1930s and the inflation of the 1970s, another 40 years later it should again be transformed in response to stagnation in the industrial world.
Maybe we can call it the Keynesian New Economics.
Mainstream macroeconomics is stuck with crazy models — and ‘New Keynesian’ macroeconomics and DSGE models certainly, as Summers puts it, “essentially abstract away from most of what is important in macroeconomics.”
Let me just give one example. Read more…
from Lars Syll
The financial crisis of 2007-08 took most laymen and economists by surprise. What was it that went wrong with our macroeconomic models, since they obviously did not foresee the collapse or even make it conceivable?
There are many who have ventured to answer this question. And they have come up with a variety of answers, ranging from the exaggerated mathematization of economics to irrational and corrupt politicians.
But the root of our problem goes much deeper. It ultimately goes back to how we look upon the data we are handling. In “modern” macroeconomics — Dynamic Stochastic General Equilibrium, New Synthesis, New Classical and New ‘Keynesian’ — variables are treated as if drawn from a known “data-generating process” that unfolds over time and on which we therefore have access to heaps of historical time-series. If we do not assume that we know the “data-generating process” – if we do not have the “true” model – the whole edifice collapses. And of course it has to. I mean, who really honestly believes that we should have access to this mythical Holy Grail, the data-generating process?
“Modern” macroeconomics obviously did not anticipate the enormity of the problems that unregulated “efficient” financial markets created. Why? Because it builds on the myth of us knowing the “data-generating process” and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances. Read more…
from Asad Zaman and the WEA Pedagogy Blog
More than a billion people live in extreme poverty, in conditions which would be unimaginable for readers of this column. Economists say that this is due to ‘scarcity’ — there are not enough resources to feed them. The solution lies in economic growth, increased production to enable us to provide for all. This diagnosis deliberately distracts attention from the real problems. One of them is rapidly rising inequality. In 2010, the richest 388 people owned as much wealth as the poorer half of the world’s population, an astonishingly skewed wealth distribution. Although there has been substantial growth, the benefits of that growth accrue only to those who are already extremely wealthy. According to recent Oxfam reports for 2014, the richest 80 people now have more than $1.3 trillion, more than the poorer half of humanity owns in total. A tax of only 33 per cent on just these 80 would suffice to feed, clothe, house, educate and provide for the health needs of all of the extremely poor. Coincidentally, global defence budgets are of a similar magnitude. We don’t have to become peaceniks; just scaling back our bloodthirstiness by 33 per cent would suffice to remove extreme poverty from the planet. Just avoiding the Iraq war would have saved sufficient money to feed the planet for 30 years. read more
from Asad Zaman
In a story not reported on at all by any Western mainstream media source, Iceland just sentenced another five high level bankers to prison for directly contributing to the collapse of the country’s economy in 2008.
This brings the total to 26 bankers now behind bars in Iceland, with most being CEOs of large financial institutions, rather than low level traders.
Most of those jailed will serve terms of two to five years, according to a report by Iceland Magazine, which notes that three executives at Landsbankinn and two at Kaupþing, along with one prominent investor, have been prosecuted.
Their crimes include market manipulation, embezzlement, and breach of fiduciary duties. Their market manipulation destroyed the country’s economy and to this day Iceland is still having to repay the global loan sharks at the IMF, as well as governments of other countries, which kept the nation operating.
The article explains that the prosecutions have been possible because rather than protect and reward the very institutions responsible for the collapse, and the gangsters that run them, the Icelandic government let them fail, and then created a financial supervisory authority to strictly oversee the banks.
Iceland’s President, Ólafur Ragnar Grímsson, noted: Read more…