from David Ruccio
from Asad Zaman
Chapter 1 of the General Theory is just one paragraph, displayed in full HERE.
The discussion below borrows extensively, without explicit point-by-point acknowledgement, from Brian S. Ferguson, “Lectures on John Maynard Keynes’ General Theory of Employment, Interest and Money (1): Chapter One, Background and Historical Setting” University of Guelph Department of Economics and Finance Discussion Paper No. 2013-06:
1. RHETORIC: Keynes wishes to persuade fellow economists. Instead of saying that they are all wrong, blinkered idiots, he says that they are studying a special case, which he wishes to generalize. He also acknowledges that he was misled by the same errors, and creates common ground to enable dialog. He is also making a subliminal appeal to the hugely influential General Theory of Relativity published earlier by Einstein.
2. INVENTION OF MACRO: The revolutionary contribution of Keynes is to study aggregates, instead of micro-level behavior. He is correctly labelled the inventor of macro-economics; prior to him, economists thought that aggregate behavior would be obtained simply as the sum of individual behaviors, so that there was no need to study macroeconomics separately. Parenthetically, it is this same position to which macro-economists retreated in the 70’s and 80’s with the development of DSGE models. Ferguson writes that: read more
from Dean Baker
Harvard professor, textbook author, and occasional New York Times columnist Greg Mankiw told readers today that Donald Trump’s economic team is wrong to worry about the trade deficit.
“The most important lesson about trade deficits is that they have a flip side. When the United States buys goods and services from other nations, the money Americans send abroad generally comes back in one way or another. One possibility is that foreigners use it to buy things we produce, and we have balanced trade. The other possibility, which is relevant when we have trade deficits, is that foreigners spend on capital assets in the United States, such as stocks, bonds and direct investments in plants, equipment and real estate.” …..
“in reality, trade deficits are not a threat to robust growth and full employment. The United States had a large trade deficit in 2009, when the unemployment rate reached 10 percent, but it had an even larger trade deficit in 2006, when the unemployment rate fell to 4.4 percent.
“Rather than reflecting the failure of American economic policy, the trade deficit may be better viewed as a sign of success. The relative vibrancy and safety of the American economy is why so many investors around the world want to move their assets here.”
There are three points worth making here. Read more…
from Lars Syll
In practice Prof. Tinbergen seems to be entirely indifferent whether or not his basic factors are independent of one another … But my mind goes back to the days when Mr. Yule sprang a mine under the contraptions of optimistic statisticians by his discovery of spurious correlation. In plain terms, it is evident that if what is really the same factor is appearing in several places under various disguises, a free choice of regression coefficients can lead to strange results. It becomes like those puzzles for children where you write down your age, multiply, add this and that, subtract something else, and eventually end up with the number of the Beast in Revelation.
Prof. Tinbergen explains that, generally speaking, he assumes that the correlations under investigation are linear … I have not discovered any example of curvilinear correlation in this book, and he does not tell us what kind of evidence would lead him to introduce it. If, as he suggests above, he were in such cases to use the method of changing his linear coefficients from time to time, it would certainly seem that quite easy manipulation on these lines would make it possible to fit any explanation to any facts. Am I right in thinking that the uniqueness of his results depends on his knowing beforehand that the correlation curve must be a particular kind of function, whether linear or some other kind?
Apart from this, one would have liked to be told emphatically what is involved in the assumption of linearity. It means that the quantitative effect of any causal factor on the phenomenon under investigation is directly proportional to the factor’s own magnitude … But it is a very drastic and usually improbable postulate to suppose that all economic forces are of this character, producing independent changes in the phenomenon under investigation which are directly proportional to the changes in themselves; indeed, it is ridiculous. Yet this is what Prof. Tinbergen is throughout assuming …
Keynes’ comprehensive critique of econometrics and the assumptions it is built around — completeness, measurability, independence, homogeneity, and linearity — is still valid today. Read more…
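Keynes’ point about “the same factor … appearing in several places under various disguises” is easy to demonstrate numerically. A minimal sketch (my own illustration, with invented numbers, not an example from Keynes or Tinbergen): regress an outcome on two near-duplicate measurements of one underlying factor, and the overall fit looks splendid while the individual regression coefficients become nearly arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# One underlying factor appearing "under two disguises":
# x2 is just x1 plus a whisper of measurement noise.
factor = rng.normal(size=n)
x1 = factor
x2 = factor + 0.01 * rng.normal(size=n)

# The outcome truly depends on the factor once, with coefficient 1.
y = factor + 0.1 * rng.normal(size=n)

# Regress y on both disguises at once.
X = np.column_stack([x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# The fit is excellent ...
r2 = 1 - np.sum((y - X @ beta) ** 2) / np.sum((y - y.mean()) ** 2)

# ... but the split between the two coefficients is nearly arbitrary:
# any pair summing to about 1 fits almost equally well, which the
# variance inflation factor makes explicit.
r12 = np.corrcoef(x1, x2)[0, 1]
vif = 1 / (1 - r12 ** 2)

print(f"coefficients: {beta}, sum: {beta.sum():.3f}")
print(f"R^2: {r2:.3f}, variance inflation factor: {vif:.0f}")
```

Only the sum of the two coefficients is pinned down by the data; a “free choice of regression coefficients” between the two disguises indeed “can lead to strange results.”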
Brexit, TrumpKKK, the EU – broadly defined labour markets are key. About this:
A. Noah Smith is changing his opinion
- A job is more than a paycheck. It is a social institution, too
- Debunking labour economics 101 (very clever but also logical and empirical)
B. Frances Coppola is not changing her opinion: ‘Reinventing work for the future’ (about basic income)
C. The Daily Mail wants to change your opinion about this (graph; source: ONS). The non-far-right should not leave it to the Daily Mail to write about such events, and has to ask the question: is the UK mopping up the fallout of austerity in the EU, which leads to people (men…) becoming angry about, and afraid of, losing the dignity and status their jobs once provided? See the first Noah Smith link. And his second. And the Coppola link.
from David Ruccio
In the second installment of this series on “class before Trumponomics,” I argued that, in recent decades, while American workers have created enormous wealth, most of the increase in that wealth has been captured by their employers and a tiny group at the top—as workers have been forced to compete with one another for new kinds of jobs, with fewer protections, at lower wages, and with less security than they once expected. And the period of recovery from the Second Great Depression has done nothing to change that fundamental dynamic.
In this post, I want to focus on a more detailed analysis of the other side of the class relationship—capital.
It should come as no surprise that one of the major changes in U.S. capital over the past few decades is the growing importance of financial activities. Since 1980, FIRE (finance, insurance, and real estate) has almost doubled, expanding from roughly 12 percent of the gross output of private industries to over 20 percent. Read more…
from The Guardian: “the world’s leaders need to acknowledge that they have failed and are failing the many”
. . . the recent apparent rejection of the elites in both America and Britain is surely aimed at me, as much as anyone. Whatever we might think about the decision by the British electorate to reject membership of the European Union and by the American public to embrace Donald Trump as their next president, there is no doubt in the minds of commentators that this was a cry of anger by people who felt they had been abandoned by their leaders. . . .
. . . The concerns underlying these votes about the economic consequences of globalisation and accelerating technological change are absolutely understandable. The automation of factories has already decimated jobs in traditional manufacturing, and the rise of artificial intelligence is likely to extend this job destruction deep into the middle classes, with only the most caring, creative or supervisory roles remaining.
This in turn will accelerate the already widening economic inequality around the world. The internet and the platforms that it makes possible allow very small groups of individuals to make enormous profits while employing very few people. This is inevitable, it is progress, but it is also socially destructive.
We need to put this alongside the financial crash, which brought home to people that a very few individuals working in the financial sector can accrue huge rewards and that the rest of us underwrite that success and pick up the bill when their greed leads us astray. So taken together we are living in a world of widening, not diminishing, financial inequality, in which many people can see not just their standard of living, but their ability to earn a living at all, disappearing. It is no wonder then that they are searching for a new deal, which Trump and Brexit might have appeared to represent.
It is also the case that another unintended consequence of the global spread of the internet and social media is that the stark nature of these inequalities is far more apparent than it has been in the past.
from Lars Syll
Reading an applied econometrics paper could leave you with the impression that the economist (or any social science researcher) first formulated a theory, then built an empirical test based on the theory, then tested the theory. But in my experience what generally happens is more like the opposite: with some loose ideas in mind, the econometrician runs a lot of different regressions until they get something that looks plausible, then tries to fit it into a theory (existing or new) … Statistical theory itself tells us that if you do this for long enough, you will eventually find something plausible by pure chance!
This is bad news because as tempting as that final, pristine looking causal effect is, readers have no way of knowing how it was arrived at. There are several ways I’ve seen to guard against this:
(1) Use a multitude of empirical specifications to test the robustness of the causal links, and pick the one with the best predictive power …
(2) Have researchers submit their paper for peer review before they carry out the empirical work, detailing the theory they want to test, why it matters and how they’re going to do it. Reasons for inevitable deviations from the research plan should be explained clearly in an appendix by the authors and (re-)approved by referees.
(3) Insist that the paper be replicated. Firstly, by having the authors submit their data and code and seeing if referees can replicate it (think this is a low bar? Most empirical research in ‘top’ economics journals can’t even manage it). Secondly — in the truer sense of replication — wait until someone else, with another dataset or method, gets the same findings in at least a qualitative sense. The latter might be too much to ask of researchers for each paper, but it is a good thing to have in mind as a reader before you are convinced by a finding.
All three of these should, in my opinion, be a prerequisite for research that uses econometrics …
Naturally, this would result in a lot more null findings and probably a lot less research. Perhaps it would also result in fewer papers that attempt to tell the entire story: that is, which go all the way from building a new model to finding (surprise!) that even the most rigorous empirical methods support it.
Good suggestions, but unfortunately there are many more deep problems with econometrics that have to be ‘solved.’ Read more…
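The claim quoted above — that running enough regressions will “eventually find something plausible by pure chance” — can be checked with a toy simulation (mine, not the quoted author’s): search forty unrelated candidate regressors for a “significant” predictor of pure noise, and the great majority of such searches succeed.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, trials = 100, 40, 200  # sample size, regressors tried, repeated searches

hits = 0
for _ in range(trials):
    y = rng.normal(size=n)            # the outcome is pure noise
    found = False
    for _ in range(k):                # try up to 40 unrelated regressors
        x = rng.normal(size=n)
        # slope and t statistic for a simple (no-intercept) regression
        b = (x @ y) / (x @ x)
        resid = y - b * x
        se = np.sqrt((resid @ resid) / (n - 1) / (x @ x))
        if abs(b / se) > 1.96:        # "significant at 5%"
            found = True
            break
    hits += found

share = hits / trials
print(f"searches that 'found' a significant effect: {share:.0%}")
```

With a 5% false-positive rate per regression, the chance that at least one of forty tries clears the bar is roughly 1 − 0.95⁴⁰ ≈ 87%, and the simulation lands near that: statistical theory itself guarantees the fishing expedition a catch.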
from Dean Baker
In spite of the hopes of many elite types for a last-minute resurrection, it appears that the Trans-Pacific Partnership (TPP) is finally dead. This is good news, but it took a long time to kill the deal, and the country is likely to pay a huge price for the execution.
The basic point that everyone should know by now is that the TPP had little to do with trade. The United States already had trade deals with six of the 11 other countries in the pact. The trade barriers with the other five countries were already very low in most cases, so there was little room left for further trade liberalization in the TPP.
Instead, the main purpose of the TPP was to lock in place a business-friendly structure of regulation. The deal was negotiated by a series of working groups that were dominated by representatives of major corporations. The regulatory structure was to be enforced by investor-state dispute settlement tribunals. This is an extrajudicial system that would be able to override US laws with secret rulings that were not bound by precedent or subject to appeal.
In addition, the TPP would strengthen and lengthen patents, copyrights and related protections. This is protectionism: It is 180 degrees at odds with free trade. These protections can raise the price of protected items, like prescription drugs, by a factor of 10 or even 100. This is equivalent to tariffs of several thousand percent, with the same waste and incentives for corruption. Free-traders oppose such protections, if they are honest. Read more…
from David Ruccio
Back in 2010, Charles Ferguson, the director of Inside Job, exposed the failure of prominent mainstream economists who wrote about and spoke on matters of economic policy to disclose their conflicts of interest in the lead-up to the crash of 2007-08. Reuters followed up by publishing a special report on the lack of a clear standard of disclosure for economists and other academics who testified before the Senate Banking Committee and the House Financial Services Committee between late 2008 and early 2010, as lawmakers debated the biggest overhaul of financial regulation since the 1930s.
Well, economists are still at it, leveraging their academic prestige with secret reports justifying corporate concentration.
from Lars Syll
When a hot new tool arrives on the scene, it should extend the frontiers of economics and pull previously unanswerable questions within reach. What might seem faddish could in fact be economists piling in to help shed light on the discipline’s darkest corners. Some economists, however, argue that new methods also bring new dangers; rather than pushing economics forward, crazes can lead it astray, especially in their infancy …
A paper by Angus Deaton, a Nobel laureate and expert data digger, and Nancy Cartwright, an economist (sic!) at Durham University, argues that randomised control trials, a current darling of the discipline, enjoy misplaced enthusiasm. RCTs involve randomly assigning a policy to some people and not to others, so that researchers can be sure that differences are caused by the policy. Analysis is a simple comparison of averages between the two. Mr Deaton and Ms Cartwright have a statistical gripe; they complain that researchers are not careful enough when calculating whether two results are significantly different from one another. As a consequence, they suspect that a sizeable portion of published results in development and health economics using RCTs are “unreliable”.
With time, economists should learn when to use their shiny new tools. But there is a deeper concern: that fashions and fads are distorting economics, by nudging the profession towards asking particular questions, and hiding bigger ones from view. Mr Deaton’s and Ms Cartwright’s fear is that RCTs yield results while appearing to sidestep theory, and that “without knowing why things happen and why people do things, we run the risk of worthless causal (‘fairy story’) theorising, and we have given up on one of the central tasks of economics.” Another fundamental worry is that by offering alluringly simple ways of evaluating certain policies, economists lose sight of policy questions that are not easily testable using RCTs, such as the effects of institutions, monetary policy or social norms.
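The statistical gripe — researchers “not careful enough when calculating whether two results are significantly different from one another” — has a classic form: study A is significant, study B is not, and the difference between significant and not significant is wrongly read as itself significant. A hedged illustration with invented numbers (not Deaton and Cartwright’s own example):

```python
import math

# Two hypothetical RCTs estimating the same treatment effect
# (all numbers invented for illustration).
effect_a, se_a = 0.25, 0.10   # study A: z = 2.5, "significant"
effect_b, se_b = 0.10, 0.10   # study B: z = 1.0, "not significant"

z_a = effect_a / se_a
z_b = effect_b / se_b

# The correct test of whether the two RESULTS differ combines
# both standard errors; it is not a comparison of the two verdicts.
se_diff = math.sqrt(se_a**2 + se_b**2)
z_diff = (effect_a - effect_b) / se_diff

print(f"z_a = {z_a:.2f}, z_b = {z_b:.2f}, z_diff = {z_diff:.2f}")
# z_a = 2.50, z_b = 1.00, z_diff = 1.06
```

Here one trial clears the 1.96 threshold and the other does not, yet the estimated effects are statistically indistinguishable from each other; declaring that the two RCTs “disagree” would be exactly the carelessness being complained of.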
From George Monbiot in The Guardian:
Yes, Donald Trump’s politics are incoherent. But those who surround him know just what they want, and his lack of clarity enhances their power. To understand what is coming, we need to understand who they are. I know all too well, because I have spent the past 15 years fighting them.
Over this time, I have watched as tobacco, coal, oil, chemicals and biotech companies have poured billions of dollars into an international misinformation machine composed of thinktanks, bloggers and fake citizens’ groups. Read more…
On his insightful ‘conversable economist’ blog the excellent Timothy Taylor has a good piece about European unemployment which reads as follows:
American readers: can you imagine the social turmoil in the US if the unemployment rate had been above 10% for the last seven years, instead of peaking at 10% back in October 2009 and falling down to about 5% by a year ago in fall 2015? Can you imagine if half of these unemployed had been looking for work for more than a year? Consider the difference, and you’ll have a better sense of why the EU is struggling to have much appeal to the European public.
And this is without regard for Balkan countries like Albania which are not (yet) members of the EU and which sometimes have rates as high as 30%. Taylor is totally right. Read more…
from David Ruccio
In the first installment of this series on “class before Trumponomics,” I argued that the recovery from the crash of 2007-08 created conditions that were favorable to capital at the expense of labor—and that trend represented a continuation of the class dynamic that had characterized the U.S. economy for decades, going back at least to the early 1980s.
There are, of course, many details that were left out of that story, and I want to present a more fine-grained class analysis of the U.S. economy prior to Donald Trump’s election in this post.
Let me start with labor. In the first post, my analysis actually understated the capital share and overstated the labor share. That’s because a large share of the surplus was actually included in wages, and thus attributed to labor, when in fact it properly belongs in the share captured by capital. The idea is that high-level executives and others (e.g., CEOs and those working in finance), although much of their income is reported as “wages,” are actually receiving a cut of the surplus from their employers. Therefore, their wages are actually part of the capital share, while the incomes of the rest of workers form the basis of the labor share properly understood.
This is clear in the chart below (modified, from a paper by Michael W. L. Elsby, Bart Hobijn, and Aysegül Sahin [pdf]), where the labor share is split up by income fractiles. Based on a rough class analysis of the U.S. labor force, the labor share actually includes the first two components (making up the bottom 95 percent of the labor force), while the other fractiles (those making up the top 5 percent) represent a distribution of the surplus from capital. As is evident from a quick glance at the chart, the share of total wages going to the working class has been declining since the early 1970s, while the share representing distributions of the surplus has grown. Read more…
from Asad Zaman
PRELIMINARY REMARKS: Philosopher Hilary Putnam writes in “The Collapse of the Fact/Value Dichotomy” that there are cases where we can easily and clearly distinguish between facts and values — the objective and the subjective. However, it is wrong to think that we can ALWAYS do so. There are many sentences, especially in economic theories, where the two are “inextricably entangled”.
This is the fourth post in a sequence about Re-Reading Keynes. This post is focused on a single point which has been mentioned, but perhaps not sufficiently emphasized earlier: the entanglement of the economic system with the economic theories about the system. Our purpose in reading Keynes is not directly to understand Keynesian theory — that is, to assess it as an economic theory in isolation, and whether or not it is valid and useful for contemporary affairs. Rather, we want to co-understand Keynesian theory and the historical context in which it was born. This is an exercise in the application of Polanyi’s methodology, which I described in excruciating detail in my paper published in WEA journal Economic Thought recently:
I must confess that I am not very happy with the paper; I was struggling to formulate the ideas, and could not achieve the clarity that I always try for. It is a difficult read, though it expresses very important ideas — laying out the foundations for a radical new methodology which incorporates political, social and historical elements that have been discarded in conventional methodology for economics. One of the key elements of Polanyi’s methodology is the interaction between theories and history — our history generates our experiences of the world, and this experience is understood in the light of the theories we generate to try to understand it. This obvious fact was ignored and lost due to the positivist fallacy that facts can be understood directly, by themselves. The truth is that they can only be understood within the context of a (theoretical) framework. Once we use theories to understand experience, these theories shape our responses to that experience, and so they directly impact history — history is shaped by theories, and theories are shaped by history. The two are “inextricably entangled.” read more
from Lars Syll
Macroeconomists got comfortable with the idea that fluctuations in macroeconomic aggregates are caused by imaginary shocks, instead of actions that people take, after Kydland and Prescott (1982) launched the real business cycle (RBC) model …
In response to the observation that the shocks are imaginary, a standard defence invokes Milton Friedman’s (1953) methodological assertion from unnamed authority that “the more significant the theory, the more unrealistic the assumptions.” More recently, “all models are false” seems to have become the universal hand-wave for dismissing any fact that does not conform to the model that is the current favourite.
The noncommittal relationship with the truth revealed by these methodological evasions and the “less than totally convinced …” dismissal of fact goes so far beyond post-modern irony that it deserves its own label. I suggest “post-real.”
There are many kinds of useless economics held in high regard within the mainstream economics establishment today. Few deserve that regard less than the post-real macroeconomic theory — mostly connected with Finn Kydland, Robert Lucas, Edward Prescott and Thomas Sargent — called RBC. Read more…
from Lars Syll
The move from a structuralist account in which capital is understood to structure social relations in relatively homologous ways to a view of hegemony in which power relations are subject to repetition, convergence, and rearticulation brought the question of temporality into the thinking of structure, and marked a shift from a form of Althusserian theory that takes structural totalities as theoretical objects to one in which the insights into the contingent possibility of structure inaugurate a renewed conception of hegemony as bound up with the contingent sites and strategies of the rearticulation of power.
from Asad Zaman
This 1000 word article is the third in a series of posts on Re-Reading Keynes. It traces the impact of Keynesian theories on the 20th century, as necessary background knowledge for a contextual and historically situated study of Keynes. It was published in Express Tribune on 4 Nov 2016.
The Global Financial Crisis (GFC) has created awareness of the great gap between academic models and reality. IMF Chief Economist Olivier Blanchard said that modern DSGE macroeconomic models currently used for policy decisions are based on assumptions which are profoundly at odds with what we know about consumers and firms. More than seven different schools of macroeconomic thought contend with each other, without coming to agreement on any fundamental issue. This bears a striking resemblance to the post-Depression era when Keynes set out to resolve the “deep divergences of opinion between fellow economists which have for the time being almost destroyed the practical influence of economic theory.”
Likewise, today, the inability of mainstream economists to predict, understand, explain, or find remedies for the Global Financial Crisis, has deeply damaged the reputation of economists and economic theories. Recently, World Bank Chief Economist Paul Romer stated that for more than three decades, macroeconomics has gone backwards. Since modern macroeconomics bears a strong resemblance to pre-Keynesian theories, Keynesian theories have fresh relevance, as described below.
In the aftermath of the Great Depression, economic misery was a major factor which led to the Russian Revolution and the rise of Hitler in Germany. Conventional economic theory held that market forces would automatically and quickly correct the temporary disequilibrium of high unemployment and low production in Europe and USA. Keynes argued that high unemployment could persist, and government interventions in the form of active monetary and fiscal policy were required to correct the economic problems. Many have suggested that Keynes rescued Capitalism by providing governments with rationale to intervene on behalf of the workers, thereby preventing socialist or communist revolutions. There is no doubt that strong and powerful labor movements in Europe and USA derived strength from the economic misery of the masses, and also took inspiration from the pro-labor and anti-capitalist theories of Marx. While it is hard to be sure whether Keynes saved capitalism, we can be very sure that Keynes and Keynesian theories were extremely influential in shaping the economic landscapes of the 20th Century. read more
from Lars Syll
Nation states borrow to provide public capital: For example, rail networks, road systems, airports and bridges. These are examples of large expenditure items that are more efficiently provided by government than by private companies.
The benefits of public capital expenditures are enjoyed not only by the current generation of people, who must sacrifice consumption to pay for them, but also by future generations who will travel on the rail networks, drive on the roads, fly to and from the airports and drive over the bridges that were built by previous generations. Interest on the government debt is a payment from current taxpayers, who enjoy the fruits of public capital, to past generations, who sacrificed consumption to provide that capital.
To maintain the roads, railways, airports and bridges, the government must continue to invest in public infrastructure. And public investment should be financed by borrowing, not from current tax revenues.
Investment in public infrastructure was, on average, equal to 4.3% of GDP in the period from 1948 through 1983. It has since fallen to 1.6% of GDP. There is a strong case to be made for increasing investment in public infrastructure. First, the public capital that was constructed in the post WWII period must be maintained in order to allow the private sector to function effectively. Second, there is a strong case for the construction of new public infrastructure to promote and facilitate future private sector growth.
The debt raised by a private sector company should be strictly less than the value of assets, broadly defined. That principle does not apply to a nation state. Even if government provided no capital services, the value of its assets or liabilities should not be zero except by chance.
National treasuries have the power to transfer resources from one generation to another. By buying and selling assets in the private markets, government creates opportunities for those of us alive today to transfer resources to or from those who are yet to be born. If government issues less debt than the value of public capital, there will be an implicit transfer from current to future generations. If it issues more debt than that, the implicit transfer is in the other direction.
The optimal value of debt, relative to public capital, is a political decision … Whatever principle the government does choose to fund its expenditure, the optimal value of public sector borrowing will not be zero, except by chance.
Today there seems to be a rather widespread consensus that public debt is acceptable as long as it doesn’t increase too much and too fast. If the public debt-to-GDP ratio becomes higher than X %, the likelihood of a debt crisis and/or lower growth increases. Read more…