The man who completely missed the housing bubble and was convinced financial disruption would be restricted to the subprime market deserves two seven-figure sinecures?
from Dean Baker
I hate to be picking on Matt O’Brien again, but come on, this is setting the bar pretty goddamn low. He began a piece reporting on a consulting gig that Bernanke will have with the bond fund Pimco by telling readers:
“If anyone deserves two seven-figure sinecures, it’s Ben Bernanke.”
I won’t go over the full indictment of Ben Bernanke, and will give him credit for a reasonably good job trying to boost the economy post-crash in the face of outraged opposition from the right wing, but let’s get real. The housing bubble and ensuing crash were not natural disasters like Hurricane Katrina. Read more…
from Peter Radford
OK. Let’s have some fun.
Transaction costs were invented by Ronald Coase to help explain why we see business firms littering the economic landscape when orthodox economic theory argues that the marketplace is the superior and unequalled coordinator of economic activity. The Coasian idea, later extended and expanded upon by the likes of Oliver Williamson, is that there are costs of accessing the market which, under some circumstances, render market coordination more expensive than having production contained within the boundaries of what we call a business firm. These costs are what are now called transaction costs.
The problem is that they are also fairly vague. Indeed, one of the main counterattacks by leading orthodox economists has always been that transaction costs are hard to pin down and thus ‘formalize’. And, as we all know, things that are not formal are considered to be dicey and not rigorous by orthodox ideologues.
Anyway, that’s for them to argue over, let’s get back to our fun. Read more…
from Lars Syll
Roman Frydman is Professor of Economics at New York University and a long-time critic of the rational expectations hypothesis. In his seminal 1982 American Economic Review article Towards an Understanding of Market Processes: Individual Expectations, Learning, and Convergence to Rational Expectations Equilibrium — an absolute must-read for anyone with a serious interest in understanding what the issues are in the present discussion on rational expectations as a modeling assumption — he showed that models founded on the rational expectations hypothesis are inadequate as representations of economic agents’ decision making.
Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies and institutions are those based on rational expectations and representative actors. As yours truly has tried to show in On the use and misuse of theories and models in economics there is really no support for this conviction at all. On the contrary. If we want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, it is high time to place macroeconomic models building on representative actors and rational expectations-microfoundations where they belong – in the dustbin of history. Read more…
from Peter Radford
In his introduction to a collection of his own work, Ronald Coase tells us:
‘Becker points out that: “what most distinguishes economics as a discipline from other disciplines in the social sciences is not its subject matter but its approach”’.
He then goes on:
‘One result of this divorce of the theory from its subject matter has been that the entities whose decisions economists are engaged in analyzing lack any substance. The consumer is not a human being but a consistent set of preferences. The firm, to an economist, as Slater has said, “is effectively defined as a cost curve and a demand curve, and the theory is simply the logic of optimal pricing and input combination”. Exchange takes place without any specification of its institutional setting. We have consumers without humanity, firms without organization, and even exchange without markets.’
All true, too true. Read more…
from Lars Syll
Piketty uses the terms “capital” and “wealth” interchangeably to denote the total monetary value of shares, housing and other assets. “Income” is measured in money terms. We shall reserve the term “capital” for the totality of productive assets evaluated at constant prices. The term “output” is used to denote the totality of net output (value-added) measured at constant prices. Piketty uses the symbol β to denote the ratio of “wealth” to “income” and he denotes the share of wealth-owners in total income by α. In his theoretical analysis this share is equated to the share of profits in total output. Piketty documents how α and β have both risen by a considerable amount in recent decades. He argues that this is not mere correlation, but reflects a causal link: it is the rise in β which is responsible for the rise in α. To reach this conclusion, he first assumes that β is equal to the capital-output ratio K/Y, as conventionally understood. From his empirical finding that β has risen, he concludes that K/Y has also risen by a similar amount. According to the neoclassical theory of factor shares, an increase in K/Y will only lead to an increase in α when the elasticity of substitution between capital and labour σ is greater than unity. Piketty asserts that this is the case. Indeed, based on movements in α and β, he estimates that σ is between 1.3 and 1.6 (page 221). Read more…
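The role σ plays in this argument can be made concrete with a small numerical sketch (my own illustration, not Piketty’s computation), using a CES production function Y = (aK^ρ + (1−a)L^ρ)^(1/ρ) with ρ = (σ−1)/σ; the parameter values are arbitrary:

```python
# Toy illustration (not Piketty's own computation): under a CES production
# function Y = (a*K**rho + (1-a)*L**rho)**(1/rho), with rho = (sigma-1)/sigma,
# capital's competitive share is a*K**rho / (a*K**rho + (1-a)*L**rho).
# The parameter choices (a = 0.3, the sigma values) are arbitrary.

def capital_share(K, L, a=0.3, sigma=1.5):
    rho = (sigma - 1) / sigma
    return a * K**rho / (a * K**rho + (1 - a) * L**rho)

# Capital deepening (K doubles, L fixed) raises capital's share only when
# the elasticity of substitution exceeds one:
rising = capital_share(2, 1, sigma=1.5) > capital_share(1, 1, sigma=1.5)
falling = capital_share(2, 1, sigma=0.7) < capital_share(1, 1, sigma=0.7)
# sigma = 1 is the Cobb-Douglas case, where the share stays fixed at a.
```

This is why the whole causal story hinges on σ > 1: with σ at or below unity, a rising K/Y would leave α flat or falling.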
from Grazia Ietto-Gillies
En route from London to Rome I read The Superiority of Economists by Marion Fourcade, Etienne Ollion and Yann Algan (Journal of Economic Perspectives, 29, 1: 89-114). https://www.aeaweb.org/articles.php?doi=10.1257/jep.29.1.89
Some travels within Italy – to Pisa and Siena – gave me time for musings and reflections about the content of this paper. Move forward a few weeks and back in London I have decided to turn those reflections into clicks and share them . . . read more
from Lars Syll
Robert Lucas is well-known for condemning everything that isn’t microfounded rational expectations macroeconomics as “ad hoc” theorizing.
But instead of rather unsubstantiated recapitulations, it would be refreshing and helpful if the Chicago übereconomist — for a change — endeavoured to clarify just what he means by “ad hoc.”
The standard meaning — OED — of the term is “for this particular purpose.” But in the hands of New Classical–RBC–New Keynesians it seems to be used more to convey the view that modeling with realist and relevant assumptions is somehow equivalent to basing models on “specifics” rather than the “fundamentals” of individual intertemporal optimization and rational expectations.
This is of course pure nonsense, simply because there is no — as yours truly has argued at length e.g. here — macro behaviour that consistently follows from the RBC–New Keynesian microfoundations. The only ones that succumb to ad hoc assumptions here are macroeconomists like Lucas et consortes, who believe that macroeconomic behaviour can be adequately analyzed with a fictitious rational-expectations-optimizing-robot-imitation-representative-agent.
from Lars Syll
The bias toward the superficial and the response to extraneous influences on research are both examples of real harm done in contemporary social science by a roughly Bayesian paradigm of statistical inference as the epitome of empirical argument. For instance the dominant attitude toward the sources of black-white differential in United States unemployment rates (routinely the rates are in a two to one ratio) is “phenomenological.” The employment differences are traced to correlates in education, locale, occupational structure, and family background. The attitude toward further, underlying causes of those correlations is agnostic … Yet on reflection, common sense dictates that racist attitudes and institutional racism must play an important causal role. People do have beliefs that blacks are inferior in intelligence and morality, and they are surely influenced by these beliefs in hiring decisions … Thus, an overemphasis on Bayesian success in statistical inference discourages the elaboration of a type of account of racial disadvantages that almost certainly provides a large part of their explanation.
from Asad Zaman
Modern history is largely driven by the battle of the rich (top 0.01%) against the masses (bottom 90%). Over the past few decades, the rich have been tremendously successful in having it all their way. A previous blog post on “Deception and Democracy” illustrates by examples their successful conversion of democracy into plutocracy in the USA. As pointed out by Polanyi, unregulated markets create disastrous outcomes for the majority. Therefore, in a democratic environment, theories which misrepresent facts and justify massive inequalities are essential pillars of support for the plutocrats. Spreading these theories via media and educational channels helps create an environment where people support policies which go against their common interests. Read more…
from Lars Syll
Paul Krugman has often tried to explain why we should continue to use neoclassical hobby horses like IS-LM and Aggregate Supply-Aggregate Demand models. Here’s one example:
So why do AS-AD? … We do want, somewhere along the way, to get across the notion of the self-correcting economy, the notion that in the long run, we may all be dead, but that we also have a tendency to return to full employment via price flexibility. Or to put it differently, you do want somehow to make clear the notion (which even fairly Keynesian guys like me share) that money is neutral in the long run.
Well, this “fairly Keynesian” guy is not impressed. And I doubt that Keynes himself would have been impressed by having his theory characterized with catchwords like “tendency to return to full employment” and “money is neutral in the long run.” Read more…
from Asad Zaman
My article on the limits of reason was published in Express Tribune recently (Monday April 13, 2015). This essay shows that logic is limited in its ability to arrive at a definite conclusion even in the heartland of mathematics. Pluralism is required to cater for the possibility that both Euclidean and non-Euclidean geometries represent valid ways of looking at the world. The world of human affairs is far more complex. In order to study and understand societies, one must learn to deal with a multiplicity of truths. This argument, which is related to the first, has been made in my article “Tolerance and Multiple Narratives” which was published in Express Tribune earlier (March 29, 2015). These ideas form part of the background for supporting the drive for pluralism in our approaches to economic problems.
Ben Bernanke: The revolving door between Wall Street and U.S. government agencies continues to revolve.
from David Ruccio
Apparently, the door between Wall Street and the U.S. government agencies in charge of regulating Wall Street continues to revolve. Former Federal Reserve chair Ben Bernanke is the latest to walk through the door. Read more…
from Lars Syll
In its standard form, a significance test is not the kind of “severe test” that we are looking for in our search for being able to confirm or disconfirm empirical scientific hypotheses. This is problematic for many reasons, one being a strong tendency to accept null hypotheses simply because they cannot be rejected at the standard 5% significance level. In their standard form, significance tests bias against new hypotheses by making it hard to disconfirm the null hypothesis. Read more…
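A toy simulation (my own illustration, not from the post) makes the point concrete: with a modest true effect and a small sample, a 5%-level test fails to reject the false null most of the time, so treating “not rejected” as “accepted” mistakes low power for evidence.

```python
import math
import random

# Toy simulation (my own, not from the post): a true effect of 0.4 standard
# deviations, n = 20 observations, two-sided z-test of H0: mu = 0 at the 5%
# level with known sigma = 1. The test rejects the (false) null well under
# half the time, so most trials "fail to reject" a false hypothesis.
random.seed(0)

def rejects_null(n=20, effect=0.4):
    xs = [random.gauss(effect, 1.0) for _ in range(n)]
    z = (sum(xs) / n) / (1.0 / math.sqrt(n))
    return abs(z) > 1.96

power = sum(rejects_null() for _ in range(2000)) / 2000
# power comes out around 0.4 -- far from the near-certain rejection a
# genuinely "severe test" of a false hypothesis would require.
```

The theoretical power here is roughly 0.43, which is exactly the trap: a null that is false sails through the test more often than not.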
from Lars Syll
Abstraction is the most valuable ladder of any science. In the social sciences, as Marx forcefully argued, it is all the more indispensable since there ‘the force of abstraction’ must compensate for the impossibility of using microscopes or chemical reactions. However, the task of science is not to climb up the easiest ladder and remain there forever distilling and redistilling the same pure stuff. Standard economics, by opposing any suggestions that the economic process may consist of something more than a jigsaw puzzle with all its elements given, has identified itself with dogmatism. And this is a privilegium odiosum that has dwarfed the understanding of the economic process wherever it has been exercised.
Bringing economics back into liberal academic life. April 16, 2015. http://www.ft.com/cms/s/0/99799262-e293-11e4-ba33-00144feab7de.html#ixzz3XTazf200
Sir, The moribund orthodoxy that currently exercises such an inflexible grip on university economics departments will, as Wolfgang Münchau comments, inevitably face a challenge, and this “will come from outside the discipline and will be brutal” (“Macroeconomists need new tools to challenge consensus”, April 13). The orthodoxy has brought this dismal prospect on itself through the brutality with which it has purged those departments of any other school of thought than its own.
Indeed, in its extreme version, the orthodoxy’s doctrine holds quite simply that there are “no schools of thought in economics”, a totalitarian assertion all too true in most economics departments today, so ruthless has been the purge of alternatives. As a result, the different approaches to economic issues of Adam Smith, Bentham, Ricardo, Marshall, Keynes, Friedman and so on are all relegated to the fringe subject of the “history of economic thought”.
from Lars Syll
In their new book, Mastering ‘Metrics: The Path from Cause to Effect, Joshua D. Angrist and Jörn-Steffen Pischke write:
Our first line of attack on the causality problem is a randomized experiment, often called a randomized trial. In a randomized trial, researchers change the causal variables of interest … for a group selected using something like a coin toss. By changing circumstances randomly, we make it highly likely that the variable of interest is unrelated to the many other factors determining the outcomes we want to study. Random assignment isn’t the same as holding everything else fixed, but it has the same effect. Random manipulation makes other things equal hold on average across the groups that did and did not experience manipulation. As we explain … ‘on average’ is usually good enough.
Angrist and Pischke may “dream of the trials we’d like to do” and consider “the notion of an ideal experiment” something that “disciplines our approach to econometric research,” but to maintain that ‘on average’ is “usually good enough” is an allegation that in my view is rather unwarranted, and for many reasons.
First of all, it amounts to nothing but hand waving to simpliciter assume, without argumentation, that it is tenable to treat social agents and relations as homogeneous and interchangeable entities. Read more…
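What “other things equal on average” means in the quoted passage can be shown in a toy sketch (my own, not Angrist and Pischke’s): an unobserved confounder ends up balanced in mean across randomly assigned groups, even though nothing is held fixed in any single assignment.

```python
import random

# Toy sketch (mine, not from the book): random assignment does not hold a
# confounder fixed, but it balances its mean across treatment and control.
random.seed(1)

n = 10_000
confounder = [random.gauss(0, 1) for _ in range(n)]   # unobserved trait
treated = [random.random() < 0.5 for _ in range(n)]   # coin-toss assignment

def group_mean(flag):
    vals = [c for c, t in zip(confounder, treated) if t == flag]
    return sum(vals) / len(vals)

gap = abs(group_mean(True) - group_mean(False))
# gap is small (order 1/sqrt(n)) and shrinks as n grows: "other things
# equal" holds on average across groups, not in any individual draw.
```

Of course, this balancing argument is precisely what presupposes that the units being randomized are comparable in the first place, which is the assumption the critique above takes aim at.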
from Edward Fullbrook
For me three economists stand out historically as having been the most effective at building resistance to the dominance of scientism in economics. Keynes of course is one, and the other two are Bernard Guerrien and Tony Lawson, Guerrien because he was the intellectual and moral force behind Autisme Economie which, among other things, gave rise to the RWER; and Lawson because his papers, books and seminars have inspired, joined and intellectually fortified thousands.
It is notable that all three of these economists were or were on their way to becoming professional mathematicians before switching to economics. When still in his twenties, Keynes’ mathematical genius was already publicly celebrated, most notably by Whitehead and Russell, and he had already published what was to become a classic work in his first discipline. Guerrien’s first PhD was in mathematics, and Lawson was doing a PhD in mathematics at Cambridge when its economics department lured him over in an attempt to boost its mathematical competence.
The significance for me of Keynes, Guerrien and Lawson being mathematicians first and economists second is that it meant that they were not even for an hour taken in or intimidated by the aggressive scientism of neoclassical economists, and this has enabled them to write analytically about the dominant scientism with a quiet straightforwardness that is beyond the reach of most of us.
An example of this kind of writing that I am talking about is the short essay below that in 2002 Guerrien published in what is now the Real-World Economics Review. Read more…
from Lars Syll
The general equilibrium approach starts with individual decisions. It assumes that trades are voluntary and that there exist mutually advantageous opportunities of exchange. Up to here, everyone can agree. The problem lies in the next step. At this point, let us follow David Kreps’s (1990) reasoning in his A Course in Microeconomic Theory. Kreps asks the reader to “imagine consumers wandering around a large market square” with different kinds of food in their bags. When two of them meet, “they examine what each has to offer, to see if they can arrange a mutually agreeable trade. To be precise, we might imagine that at every chance meeting of this sort, the two flip a coin and depending on the outcome, one is allowed to propose an exchange, which the other may either accept or reject. The rule is that you can’t eat until you leave the market square, so consumers wait until they are satisfied with what they possess” (196). Read more…
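Kreps’s story is easy enough to render as a toy procedure (my own sketch, not Kreps’s formal model; the goods, utility weights and acceptance rule are all invented for illustration):

```python
import random

# Toy rendering of the market-square story (my sketch, not Kreps's formal
# model): agents meet by chance, a coin flip picks the proposer, and a
# one-for-one swap stands only if neither side is made worse off.
random.seed(42)

agents = [
    {"bag": {"bread": 4, "cheese": 0}, "weights": {"bread": 1, "cheese": 3}},
    {"bag": {"bread": 0, "cheese": 4}, "weights": {"bread": 3, "cheese": 1}},
]

def utility(agent):
    return sum(agent["weights"][g] * q for g, q in agent["bag"].items())

def swap(agent, out_good, in_good):
    agent["bag"][out_good] -= 1
    agent["bag"][in_good] += 1

def try_trade(a, b):
    # Coin flip decides who proposes; the proposer offers one unit of a
    # good it holds for one unit of a good the responder holds.
    proposer, responder = (a, b) if random.random() < 0.5 else (b, a)
    give = [g for g, q in proposer["bag"].items() if q > 0]
    take = [g for g, q in responder["bag"].items() if q > 0]
    if not give or not take:
        return False
    g_out, g_in = random.choice(give), random.choice(take)
    u_p, u_r = utility(proposer), utility(responder)
    swap(proposer, g_out, g_in)          # tentatively carry out the swap
    swap(responder, g_in, g_out)
    if utility(proposer) >= u_p and utility(responder) >= u_r:
        return True                      # mutually agreeable: trade stands
    swap(proposer, g_in, g_out)          # otherwise the responder rejects
    swap(responder, g_out, g_in)         # and the tentative swap is undone
    return False

for _ in range(100):                     # chance meetings in the square
    try_trade(*agents)
```

Running it, the bread-lover and the cheese-lover trade until no further mutually agreeable swap exists, which is of course the point of quoting the passage: this is how stylized a picture of “the market” the textbook starts from.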
from Lars Syll
As no one interested in macroeconomics has failed to notice, Ben Bernanke is having a debate with Larry Summers on what’s behind the slow recovery of growth rates since the financial crisis of 2007.
To Bernanke it’s basically a question of a savings glut.
To Summers it’s basically a question of a secular decline in the level of investment.
To me the debate is actually a non-starter, since they both rely on a loanable funds theory and a Wicksellian notion of a “natural” rate of interest — ideas that have been known to be dead wrong for at least 80 years …