Statistical significance is not real-world significance

December 9, 2016 Leave a comment

from Lars Syll

As shown over and over again when significance tests are applied, people have a tendency to read ‘not disconfirmed’ as ‘probably confirmed.’ Standard scientific methodology tells us that when there is only, say, a 10% probability that pure sampling error could account for the observed difference between the data and the null hypothesis, it would be more ‘reasonable’ to conclude that we have a case of disconfirmation. Especially if we perform many independent tests of our hypothesis and they all give about the same 10% result as our reported one, I guess most researchers would count the hypothesis as even more disconfirmed.

We should never forget that the underlying parameters we use when performing significance tests are model constructions. Our p-values mean next to nothing if the model is wrong. And most importantly — statistical significance tests DO NOT validate models!   Read more…
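The point that p-values mean next to nothing when the model is wrong is easy to demonstrate. Below is a minimal, hypothetical simulation (all parameters are illustrative): a one-sample t-test assumes independent observations, but the data are generated as a strongly autocorrelated AR(1) process whose true mean is exactly zero, so the nominal 5% test rejects far more often than 5% of the time.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# The t-test's model assumes independent observations. Generate data whose
# true mean IS zero but whose observations follow an AR(1) process with
# strong autocorrelation, violating that assumption.
def autocorrelated_sample(n=100, rho=0.9):
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = rho * x[i - 1] + rng.normal()
    return x

trials, rejections = 1000, 0
for _ in range(trials):
    _, p = stats.ttest_1samp(autocorrelated_sample(), 0.0)
    if p < 0.05:
        rejections += 1

# Nominally this rate should be about 0.05; the misspecified model
# inflates it severalfold.
print(rejections / trials)
```

The null hypothesis is true in every trial; only the model assumption fails, and that alone is enough to make the reported p-values worthless.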

A crisis-prone & fragile financial system

December 7, 2016 9 comments

from Asad Zaman

Prior to the Global Financial Crisis (GFC 2007), many senior economists and policy makers expressed confidence that they had finally solved the problem of business cycles, the booms and busts that plague capitalism. Because of this over-confidence, early warnings of a looming crisis by Nouriel Roubini, Ann Pettifor, Peter Schiff, Steve Keen, Dean Baker, and Raghuram Rajan were ridiculed and dismissed. Even after the crisis, many economists thought this was a minor glitch, which would soon be remedied. Now, however, while conventional economists continue to search for reasons for the mysterious stagnation besetting capitalist economies, the weak and jobless recovery from the GFC has been labeled an illusion and a false dawn by Schiff. Like him, deeper analysts are converging on the idea that the problems run deep, and that radical changes in the global financial architecture are required to solve current problems and prevent future crises.

For instance, consider Lord Mervyn King, the Governor of the Bank of England from 2003 to 2013. His experience at the heart of the global financial system led him to the conclusion that “Of all the many ways of organising banking, the worst is the one we have today. … (can we) think our way through to a better outcome before the next generation is damaged by a future and bigger crisis?” Similarly, Minneapolis Federal Reserve President Narayana Kocherlakota, after viewing the stark conflicts between the empirical evidence and the macroeconomic theories over the past ten years, writes that economists use “Toy Models” which do not work in the face of the complexities of real life.

There are two central elements which lie at the core of the fragility of the financial system.  read more

Uncertain media future

December 6, 2016 1 comment

from C. P. Chandrasekhar

In today’s troubled times, mega-mergers are the norm. Announcements of marriages like that between AB Inbev and SABMiller in the beer industry or Bayer and Monsanto in the agribusiness sector attract attention that soon fades, even as the difficult task of getting clearances from the regulators continues. But the just announced agreement to merge by AT&T and Time Warner, in an $85 billion deal, is likely to remain in public discussion for some time. If it does not, it should.

This is a mega merger not just in any industry like the production of cigarettes or beer, but in the media business. There, it brings together a dominant content producer, Time Warner, with interests stretching from news to the movies, and a communications major, AT&T, which has the ability to deliver, through fixed line and wireless services, this content to subscribers in offices, homes and on the move. In normal circumstances such power is viewed with suspicion because it inevitably leads to profiteering at the expense of the customer. The Financial Times (October 29, 2016) refers to a study by the European equities team at BNP Paribas Investment Partners, whose findings suggest that even today, as in the late 19th century, across industries, concentration “allows scope for a cabal of powerful oligopolists to gouge prices and bank excess profit.” That should get those implementing anti-trust legislation worried. But in the AT&T-TW case, the power that the proposed merger delivers would be more closely scrutinised because the media business is one that can shape public opinion and popular culture, with far-reaching political and social implications. Giving excessive power in that area to one or a few players is an altogether different ball game.   Read more…

P5: Intellectual & Theoretical Context

December 5, 2016 1 comment

from Asad Zaman

Fifth Post in a sequence on Re-Reading Keynes.

Chapter 1 of General Theory is just one paragraph, displayed in full HERE

Briefly: Keynes writes that Classical Economics is a special case of his General Theory. Furthermore, the assumptions required for the special case do not hold for contemporary economic societies, “with the result that its teaching is misleading and disastrous if we attempt to apply it to the facts of experience.”

The discussion below borrows extensively, without explicit point-by-point acknowledgement, from Brian S. Ferguson, “Lectures on John Maynard Keynes’ General Theory of Employment, Interest and Money (1): Chapter One, Background and Historical Setting” University of Guelph Department of Economics and Finance Discussion Paper No. 2013-06:

1.       RHETORIC: Keynes wishes to persuade fellow economists. Instead of saying that they are all wrong, blinkered idiots, he says that they are studying a special case, which he wishes to generalize. He also acknowledges that he was misled by the same errors, and creates common ground to enable dialog. He is also making a subliminal appeal to the hugely influential General Theory of Relativity published earlier by Einstein.

2.       INVENTION OF MACRO: The revolutionary contribution of Keynes is to study aggregates, instead of micro-level behavior. He is correctly labelled the inventor of macro-economics; prior to him, economists thought that aggregate behavior would be obtained simply as the sum of individual behaviors, so there was no need to study macroeconomics separately. Parenthetically, it is this same position to which macro-economists retreated in the 1970s and 1980s with the development of DSGE models. Ferguson writes that:  read more

UK public trusts economists more than politicians but less than hairdressers

December 4, 2016 4 comments

Keynes on the ‘devastating inconsistencies’ of econometrics

December 3, 2016 2 comments

from Lars Syll

In practice Prof. Tinbergen seems to be entirely indifferent whether or not his basic factors are independent of one another … But my mind goes back to the days when Mr. Yule sprang a mine under the contraptions of optimistic statisticians by his discovery of spurious correlation. In plain terms, it is evident that if what is really the same factor is appearing in several places under various disguises, a free choice of regression coefficients can lead to strange results. It becomes like those puzzles for children where you write down your age, multiply, add this and that, subtract something else, and eventually end up with the number of the Beast in Revelation.

Prof. Tinbergen explains that, generally speaking, he assumes that the correlations under investigation are linear … I have not discovered any example of curvilinear correlation in this book, and he does not tell us what kind of evidence would lead him to introduce it. If, as he suggests above, he were in such cases to use the method of changing his linear coefficients from time to time, it would certainly seem that quite easy manipulation on these lines would make it possible to fit any explanation to any facts. Am I right in thinking that the uniqueness of his results depends on his knowing beforehand that the correlation curve must be a particular kind of function, whether linear or some other kind?

Apart from this, one would have liked to be told emphatically what is involved in the assumption of linearity. It means that the quantitative effect of any causal factor on the phenomenon under investigation is directly proportional to the factor’s own magnitude … But it is a very drastic and usually improbable postulate to suppose that all economic forces are of this character, producing independent changes in the phenomenon under investigation which are directly proportional to the changes in themselves; indeed, it is ridiculous. Yet this is what Prof. Tinbergen is throughout assuming …

J M Keynes

Keynes’ comprehensive critique of econometrics and the assumptions it is built around — completeness, measurability, independence, homogeneity, and linearity — is still valid today.  Read more…
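Yule’s ‘spurious correlation’, which Keynes invokes above, can be reproduced in a few lines. This is a minimal sketch (sample sizes and seeds are arbitrary): two independent random walks share no causal connection whatsoever, yet a naive correlation test, which treats each observation as an independent draw, declares them related most of the time.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Yule's "nonsense correlation": two independent random walks have no
# causal connection, yet the naive Pearson test (which assumes each
# observation is an independent draw) flags them as related far more
# often than its nominal 5% false-positive rate.
def spurious_rate(trials=500, n=200):
    hits = 0
    for _ in range(trials):
        x = np.cumsum(rng.normal(size=n))
        y = np.cumsum(rng.normal(size=n))
        _, p = stats.pearsonr(x, y)
        if p < 0.05:
            hits += 1
    return hits / trials

rate = spurious_rate()
# A correctly specified test would reject about 5% of the time;
# here the rate is far higher.
print(rate)
```

The same factor (the accumulated past) appears in both series "under various disguises", which is exactly the trap Keynes describes.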

Stephen Hawking: “we are at the most dangerous moment in the development of humanity”

December 2, 2016 28 comments

from The Guardian   “the world’s leaders need to acknowledge that they have failed and are failing the many”

. . . the recent apparent rejection of the elites in both America and Britain is surely aimed at me, as much as anyone. Whatever we might think about the decision by the British electorate to reject membership of the European Union and by the American public to embrace Donald Trump as their next president, there is no doubt in the minds of commentators that this was a cry of anger by people who felt they had been abandoned by their leaders. . . .

. . . The concerns underlying these votes about the economic consequences of globalisation and accelerating technological change are absolutely understandable. The automation of factories has already decimated jobs in traditional manufacturing, and the rise of artificial intelligence is likely to extend this job destruction deep into the middle classes, with only the most caring, creative or supervisory roles remaining.

This in turn will accelerate the already widening economic inequality around the world. The internet and the platforms that it makes possible allow very small groups of individuals to make enormous profits while employing very few people. This is inevitable, it is progress, but it is also socially destructive.

We need to put this alongside the financial crash, which brought home to people that a very few individuals working in the financial sector can accrue huge rewards and that the rest of us underwrite that success and pick up the bill when their greed leads us astray. So taken together we are living in a world of widening, not diminishing, financial inequality, in which many people can see not just their standard of living, but their ability to earn a living at all, disappearing. It is no wonder then that they are searching for a new deal, which Trump and Brexit might have appeared to represent.

It is also the case that another unintended consequence of the global spread of the internet and social media is that the stark nature of these inequalities is far more apparent than it has been in the past.

Read more…

Three suggestions to ‘save’ econometrics

December 1, 2016 2 comments

from Lars Syll

Reading an applied econometrics paper could leave you with the impression that the economist (or any social science researcher) first formulated a theory, then built an empirical test based on the theory, then tested the theory. But in my experience what generally happens is more like the opposite: with some loose ideas in mind, the econometrician runs a lot of different regressions until they get something that looks plausible, then tries to fit it into a theory (existing or new) … Statistical theory itself tells us that if you do this for long enough, you will eventually find something plausible by pure chance!

This is bad news because, as tempting as that final, pristine-looking causal effect is, readers have no way of knowing how it was arrived at. There are several ways I’ve seen to guard against this:

(1) Use a multitude of empirical specifications to test the robustness of the causal links, and pick the one with the best predictive power …

(2) Have researchers submit their paper for peer review before they carry out the empirical work, detailing the theory they want to test, why it matters and how they’re going to do it. Reasons for inevitable deviations from the research plan should be explained clearly in an appendix by the authors and (re-)approved by referees.

(3) Insist that the paper be replicated. Firstly, by having the authors submit their data and code and seeing if referees can replicate it (think this is a low bar? Most empirical research in ‘top’ economics journals can’t even manage it). Secondly — in the truer sense of replication — wait until someone else, with another dataset or method, gets the same findings in at least a qualitative sense. The latter might be too much to ask of researchers for each paper, but it is a good thing to have in mind as a reader before you are convinced by a finding.

All three of these should, in my opinion, be a prerequisite for research that uses econometrics …

Naturally, this would result in a lot more null findings and probably a lot less research. Perhaps it would also result in fewer attempts at papers which attempt to tell the entire story: that is, which go all the way from building a new model to finding (surprise!) that even the most rigorous empirical methods support it.

Unlearning Economics

Good suggestions, but unfortunately there are many more deep problems with econometrics that have to be ‘solved.’   Read more…
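The specification-search problem quoted above, running regressions until something looks plausible, follows directly from the arithmetic of multiple testing. Here is a hypothetical simulation (all parameters are illustrative): regress a pure-noise outcome on each of 50 pure-noise candidate regressors and keep the smallest p-value. By chance alone, roughly 1 - 0.95**50, about 92%, of such searches turn up a ‘significant’ regressor.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Pure-noise "specification search": for each experiment, regress a random
# outcome on each of 50 random candidate regressors and keep the smallest
# p-value, mimicking the practice the quoted passage warns about.
n, n_candidates, experiments = 100, 50, 200
hits = 0
for _ in range(experiments):
    y = rng.normal(size=n)
    best_p = min(
        stats.linregress(rng.normal(size=n), y).pvalue
        for _ in range(n_candidates)
    )
    if best_p < 0.05:
        hits += 1

# The expected hit rate is 1 - 0.95**50, about 0.92: "significance"
# manufactured from noise alone.
print(hits / experiments)
```

This is the "statistical theory itself tells us" point made in the quote: search long enough and something plausible appears by pure chance.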

The Economist — Economics prone to fads and methodological crazes

November 30, 2016 1 comment

from Lars Syll

When a hot new tool arrives on the scene, it should extend the frontiers of economics and pull previously unanswerable questions within reach. What might seem faddish could in fact be economists piling in to help shed light on the discipline’s darkest corners. Some economists, however, argue that new methods also bring new dangers; rather than pushing economics forward, crazes can lead it astray, especially in their infancy …

A paper by Angus Deaton, a Nobel laureate and expert data digger, and Nancy Cartwright, an economist (sic!) at Durham University, argues that randomised control trials, a current darling of the discipline, enjoy misplaced enthusiasm. RCTs involve randomly assigning a policy to some people and not to others, so that researchers can be sure that differences are caused by the policy. Analysis is a simple comparison of averages between the two. Mr Deaton and Ms Cartwright have a statistical gripe; they complain that researchers are not careful enough when calculating whether two results are significantly different from one another. As a consequence, they suspect that a sizeable portion of published results in development and health economics using RCTs are “unreliable”.

With time, economists should learn when to use their shiny new tools. But there is a deeper concern: that fashions and fads are distorting economics, by nudging the profession towards asking particular questions, and hiding bigger ones from view. Mr Deaton’s and Ms Cartwright’s fear is that RCTs yield results while appearing to sidestep theory, and that “without knowing why things happen and why people do things, we run the risk of worthless causal (‘fairy story’) theorising, and we have given up on one of the central tasks of economics.” Another fundamental worry is that by offering alluringly simple ways of evaluating certain policies, economists lose sight of policy questions that are not easily testable using RCTs, such as the effects of institutions, monetary policy or social norms.

The Economist
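The statistical gripe described above, carelessness in calculating whether two results are significantly different from one another, can be illustrated with invented numbers (the effect estimates and standard errors below are hypothetical, chosen only to make the point): one estimate clears the 5% threshold and the other does not, yet a direct test of the difference between the two effects is nowhere near significant.

```python
import numpy as np
from scipy import stats

# Hypothetical effect estimates and standard errors from two trials
# (invented numbers, chosen only to illustrate the fallacy):
effect_a, se_a = 2.0, 1.0   # z = 2.0
effect_b, se_b = 1.0, 1.0   # z = 1.0

p_a = 2 * stats.norm.sf(abs(effect_a / se_a))    # ~0.046, "significant"
p_b = 2 * stats.norm.sf(abs(effect_b / se_b))    # ~0.317, "not significant"

# The correct comparison: test the difference between the two effects
# directly, using the standard error of the difference.
diff = effect_a - effect_b
se_diff = np.hypot(se_a, se_b)                   # sqrt(se_a**2 + se_b**2)
p_diff = 2 * stats.norm.sf(abs(diff / se_diff))  # ~0.48, not significant

print(round(p_a, 3), round(p_b, 3), round(p_diff, 3))
```

Labelling one trial a success and the other a failure on the basis of their separate p-values is exactly the careless comparison being criticised.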

P4: The Entanglement of the Objective & The Subjective

November 29, 2016 Leave a comment

from Asad Zaman

PRELIMINARY REMARKS: Philosopher Hilary Putnam writes in “The Collapse of the Fact/Value Dichotomy” that there are cases where we can easily and clearly distinguish between facts and values — the objective and the subjective. However, it is wrong to think that we can ALWAYS do so. There are many sentences, especially in economic theories, where the two are “inextricably entangled”.

This is the fourth post in a sequence about Re-Reading Keynes. This post is focused on a single point which has been mentioned,  but perhaps not sufficiently emphasized earlier: the entanglement of the economic system with the economic theories about the system. Our purpose in reading Keynes is not directly to understand Keynesian theory — that is, to assess it as an economic theory in isolation, and whether or not it is valid and useful for contemporary affairs. Rather, we want to co-understand Keynesian theory and the historical context in which it was born. This is an exercise in the application of Polanyi’s methodology, which I described in excruciating detail in my paper published in WEA journal Economic Thought recently:

Asad Zaman (2016) ‘The Methodology of Polanyi’s Great Transformation.’ Economic Thought, 5.1, pp. 44-63.

I must confess that I am not very happy with the paper; I was struggling to formulate the ideas, and could not achieve the clarity that I always try for. It is a difficult read, though it expresses very important ideas — laying out the foundations for a radical new methodology which incorporates political, social and historical elements that have been discarded in conventional methodology for economics. One of the key elements of Polanyi’s methodology is the interaction between theories and history — our history generates our experiences of the world, and this experience is understood in the light of theories we generate to try to understand this experience. This obvious fact was ignored & lost due to the positivist fallacy that facts can be understood directly by themselves. The truth is that they can only be understood within the context of a (theoretical) framework. Once we use theories to understand experience, these theories are used to shape our responses to this experience, and so they directly impact history — history is shaped by theories, and theories are shaped by history. The two are “inextricably entangled.”  read more

‘Post-real’ macroeconomics — three decades of intellectual regress

November 28, 2016 6 comments

from Lars Syll

Macroeconomists got comfortable with the idea that fluctuations in macroeconomic aggregates are caused by imaginary shocks, instead of actions that people take, after Kydland and Prescott (1982) launched the real business cycle (RBC) model …

In response to the observation that the shocks are imaginary, a standard defence invokes Milton Friedman’s (1953) methodological assertion from unnamed authority that “the more significant the theory, the more unrealistic the assumptions.” More recently, “all models are false” seems to have become the universal hand-wave for dismissing any fact that does not conform to the model that is the current favourite.

The noncommittal relationship with the truth revealed by these methodological evasions and the “less than totally convinced …” dismissal of fact goes so far beyond post-modern irony that it deserves its own label. I suggest “post-real.”

Paul Romer

There are many kinds of useless economics held in high regard within the mainstream economics establishment today. Few are less deserving of that regard than the post-real macroeconomic theory — mostly connected with Finn Kydland, Robert Lucas, Edward Prescott and Thomas Sargent — called RBC.   Read more…

Postmodern mumbo jumbo soup

November 28, 2016 3 comments

from Lars Syll

The move from a structuralist account in which capital is understood to structure social relations in relatively homologous ways to a view of hegemony in which power relations are subject to repetition, convergence, and rearticulation brought the question of temporality into the thinking of structure, and marked a shift from a form of Althusserian theory that takes structural totalities as theoretical objects to one in which the insights into the contingent possibility of structure inaugurate a renewed conception of hegemony as bound up with the contingent sites and strategies of the rearticulation of power.

Judith Butler

P3: Impact of Keynes

November 27, 2016 2 comments

from Asad Zaman

This 1000 word article is the third in a series of posts on Re-Reading Keynes. It traces the impact of Keynesian theories on the 20th century, as necessary background knowledge for a contextual and historically situated study of Keynes. It was published in Express Tribune on 4 Nov 2016.

The Global Financial Crisis (GFC) has created awareness of the great gap between academic models and reality. IMF Chief Economist Olivier Blanchard said that modern DSGE macroeconomic models currently used for policy decisions are based on assumptions which are profoundly at odds with what we know about consumers and firms. More than seven different schools of macroeconomic thought contend with each other, without coming to agreement on any fundamental issue. This bears a striking resemblance to the post-Depression era when Keynes set out to resolve the “deep divergences of opinion between fellow economists which have for the time being almost destroyed the practical influence of economic theory.”

Likewise, today, the inability of mainstream economists to predict, understand, explain, or find remedies for the Global Financial Crisis, has deeply damaged the reputation of economists and economic theories. Recently, World Bank Chief Economist Paul Romer stated that for more than three decades, macroeconomics has gone backwards. Since modern macroeconomics bears a strong resemblance to pre-Keynesian theories, Keynesian theories have fresh relevance, as described below.

In the aftermath of the Great Depression, economic misery was a major factor which led to the Russian Revolution and the rise of Hitler in Germany. Conventional economic theory held that market forces would automatically and quickly correct the temporary disequilibrium of high unemployment and low production in Europe and USA. Keynes argued that high unemployment could persist, and government interventions in the form of active monetary and fiscal policy were required to correct the economic problems. Many have suggested that Keynes rescued Capitalism by providing governments with rationale to intervene on behalf of the workers, thereby preventing socialist or communist revolutions. There is no doubt that strong and powerful labor movements in Europe and USA derived strength from the economic misery of the masses, and also took inspiration from the pro-labor and anti-capitalist theories of Marx. While it is hard to be sure whether Keynes saved capitalism, we can be very sure that Keynes and Keynesian theories were extremely influential in shaping the economic landscapes of the 20th Century.     read more

Public debt should not be zero. Ever!

November 26, 2016 17 comments

from Lars Syll

Nation states borrow to provide public capital: For example, rail networks, road systems, airports and bridges. These are examples of large expenditure items that are more efficiently provided by government than by private companies.

The benefits of public capital expenditures are enjoyed not only by the current generation of people, who must sacrifice consumption to pay for them, but also by future generations who will travel on the rail networks, drive on the roads, fly to and from the airports and drive over the bridges that were built by previous generations. Interest on the government debt is a payment from current taxpayers, who enjoy the fruits of public capital, to past generations, who sacrificed consumption to provide that capital.

To maintain the roads, railways, airports and bridges, the government must continue to invest in public infrastructure. And public investment should be financed by borrowing, not from current tax revenues.

Investment in public infrastructure was, on average, equal to 4.3% of GDP in the period from 1948 through 1983. It has since fallen to 1.6% of GDP. There is a strong case to be made for increasing investment in public infrastructure. First, the public capital that was constructed in the post WWII period must be maintained in order to allow the private sector to function effectively. Second, there is a strong case for the construction of new public infrastructure to promote and facilitate future private sector growth.

The debt raised by a private sector company should be strictly less than the value of assets, broadly defined. That principle does not apply to a nation state. Even if government provided no capital services, the value of its assets or liabilities should not be zero except by chance.

National treasuries have the power to transfer resources from one generation to another. By buying and selling assets in the private markets, government creates opportunities for those of us alive today to transfer resources to or from those who are yet to be born. If government issues less debt than the value of public capital, there will be an implicit transfer from current to future generations. If it owns more debt, the implicit transfer is in the other direction.

The optimal value of debt, relative to public capital, is a political decision … Whatever principle the government does choose to fund its expenditure, the optimal value of public sector borrowing will not be zero, except by chance.

Roger Farmer

Today there seems to be a rather widespread consensus of public debt being acceptable as long as it doesn’t increase too much and too fast. If the public debt-GDP ratio becomes higher than X % the likelihood of debt crisis and/or lower growth increases.  Read more…

The disconnect in the US between productivity and wages

November 25, 2016 3 comments

Econometrics — science built on beliefs and untestable assumptions

November 24, 2016 4 comments

from Lars Syll

What is distinctive about structural models, in contrast to forecasting models, is that they are supposed to be – when successfully supported by observation – informative about the impact of interventions in the economy. As such, they carry causal content about the structure of the economy. Therefore, structural models do not model mere functional relations supported by correlations, their functional relations have causal content which support counterfactuals about what would happen under certain changes or interventions.

This suggests an important question: just what is the causal content attributed to structural models in econometrics? And, from the more restricted perspective of this paper, what does this imply with respect to the interpretation of the error term? What does the error term represent causally in structural equation models in econometrics? And finally, what constraints are imposed on the error term for successful causal inference? …

I now consider briefly a key constraint that may be necessary for the error term to meet for using the model for causal inference. To keep the discussion simple, I look only at the simplest model

y = αx + u

Read more…
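The quoted question about what the error term must satisfy for causal inference can be illustrated with a minimal, hypothetical simulation. In the model y = αx + u, if an unobserved common cause enters both x and u, the constraint that u be uncorrelated with x fails and OLS no longer recovers the causal α (all numbers below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)

# Structural model y = alpha*x + u. OLS recovers the causal alpha only if
# u is uncorrelated with x. Here a hypothetical unobserved confounder
# enters both x and u, so that constraint fails.
alpha_true = 2.0
n = 10_000
confounder = rng.normal(size=n)
x = confounder + rng.normal(size=n)
u = confounder + rng.normal(size=n)   # cov(x, u) = 1, not 0
y = alpha_true * x + u

alpha_hat = (x @ y) / (x @ x)         # OLS slope, no intercept
# Probability limit: alpha_true + cov(x, u)/var(x) = 2.0 + 1/2 = 2.5,
# so the estimate is biased upward despite the large sample.
print(alpha_hat)
```

No amount of data fixes this; the bias is a property of what the error term represents causally, not of sampling error.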

Follies and fallacies of Chicago economics

November 23, 2016 3 comments

from Lars Syll

Every dollar of increased government spending must correspond to one less dollar of private spending. Jobs created by stimulus spending are offset by jobs lost from the decline in private spending. We can build roads instead of factories, but fiscal stimulus can’t help us to build more of both. This form of “crowding out” is just accounting, and doesn’t rest on any perceptions or behavioral assumptions.

John Cochrane

And the tiny little problem? It’s utterly and completely wrong!

What Cochrane is reiterating here is nothing but Say’s law, basically saying that savings are equal to investments, and that if the state increases investments, then private investments have to come down (‘crowding out’). As an accounting identity there is of course nothing to say about the law, but as such it is also totally uninteresting from an economic point of view. As some of my Swedish forerunners — Gunnar Myrdal and Erik Lindahl — stressed more than 80 years ago, it’s really a question of ex ante and ex post adjustments. And as further stressed by a famous English economist about the same time, what happens when ex ante savings and investments differ is that we basically get output adjustments. GDP changes, and that makes savings and investments equal ex post. And this, nota bene, says nothing at all about the success or failure of fiscal policies!

Read more…

P2: Methodology for (Re)-Reading Keynes

November 22, 2016 9 comments

from Asad Zaman

The first post on Reading Keynes provided an outline of the reasons why this is a good idea. It is clear that economics is broken. We need a new macroeconomics for the 21st century, one which can solve the massive problems which humanity as a whole is facing on political, social, economic, and environmental dimensions. Keynes faced similar problems, and found solutions which guided economic policy in the mid twentieth century. It is always useful to absorb the insights of our predecessors, before trying to build upon them. Such a methodology is essential for the advancement, progress and accumulation of knowledge. Our current stock of human knowledge is based on the collected insights and labors of hundreds of thousands of scholars, accumulated over the centuries. We would return to the stone ages if we were to reject it as being full of contradictions and errors (which it is). Instead, progress occurs by absorbing the past accumulated wisdom, and trying to remove the errors, or add missing insights, building on our heritage, rather than discarding it and starting over from scratch.

Several of the central Keynesian insights into the causes of the Great Depression never made it into the economics textbooks. However, our goal in studying Keynes goes far beyond just the re-discovery of these lost Keynesian insights.   A central goal is to apply and illustrate a radically different methodology for studying economics in particular, and social science in general. This is derived from a study of The Methodology of Polanyi’s Great Transformation. This is an extremely important point, which we proceed to amplify and explain further.  Read More

P1: Reading Keynes

November 21, 2016 24 comments

from Asad Zaman

I am planning a sequence of posts on re-reading Keynes, where I will try to go through the General Theory. This first post explains my motivations for re-reading Keynes. As always, my primary motive is self-education; this will force me to go through the book again — I first read it in my first-year graduate course on Macroeconomics at Stanford in 1975, when our teacher Duncan Foley was having doubts about modern macro theories, and decided to go back to the original sources. At the time, I could not understand it at all, and resorted to secondary sources, mainly Leijonhufvud, to make sense of it. Secondarily, I hope to be able to summarize Keynes’ insights to make them relevant and useful to a contemporary audience. Thirdly, there are many experts, especially Paul Davidson, on this blog, who will be able to prevent me from making serious mistakes in interpretation.

Reasons for Studying Keynes

“The heart has its reasons of which reason knows nothing.” Blaise Pascal

In line with the objectives of the WEA Pedagogy Blog, I am initiating a study group with the aim of [re-]reading Keynes’ classic The General Theory of Employment, Interest and Money. There are many reasons why I think this is a worthwhile enterprise. I hope to make weekly posts summarizing various aspects of the book, as we slog through the work, which can be difficult going in some parts. At the very least, this will force me to re-read Keynes, something I have been meaning to do for a long time. In this first post, I would like to explain my motivation in doing this exercise. Read More