
Economics as science and ideology

January 21, 2017 2 comments

from Peter Söderbaum

Book review of Offer, Avner and Gabriel Söderberg, The Nobel Factor: The Prize in Economics, Social Democracy and the Market Turn, Princeton University Press, Princeton 2016.

Since 1969 there has been a so-called Nobel Economics Prize. It is not a normal Nobel Prize established by Alfred Nobel, but rather a prize commemorating the 300-year existence of the Swedish central bank. The correct name is “The Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel”. The history of this prize, how it came about and how it has developed over the years, together with the Laureates and their achievements, is now presented by two scholars in economic history, Avner Offer and Gabriel Söderberg.

Offer and Söderberg appear to be well acquainted with developments in mainstream economics and the different achievements by the winners of the prize. However, their study is of interest mainly because they depart from neoclassical economists in their approach. Mainstream economists believe in value-neutrality (or at least behave as if they believed in value-neutrality). For Offer and Söderberg, value issues are instead at the heart of analysis. They are interested in the ideological and political role of “the Nobel Factor” over the years.

Beliefs in value-neutrality suggest that the values or ideological orientations of economists are of little interest. The scholar is just looking for the truth about economic agents (households and firms), the functioning of markets and the economy as a whole. In fact, this value-neutrality idea functions as a “limited responsibility” doctrine for the neoclassical economist as scholar. read more

 

Uncovering where the econometric skeletons are buried

January 20, 2017 Leave a comment

from Lars Syll

A rigorous application of econometric methods in economics presupposes that the phenomena of our real-world economies are ruled by stable causal relations between variables. Parameter values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption, however, one has to convincingly establish that the targeted acting causes are stable and invariant, so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.

Invariance assumptions need to be made in order to draw causal conclusions from non-experimental data: parameters are invariant to interventions, and so are errors or their distributions. Exogeneity is another concern. In a real example, as opposed to a hypothetical, real questions would have to be asked about these assumptions. Why are the equations ‘structural,’ in the sense that the required invariance assumptions hold true? Applied papers seldom address such assumptions, or the narrower statistical assumptions: for instance, why are errors IID?

The tension here is worth considering. We want to use regression to draw causal inferences from non-experimental data. To do that, we need to know that certain parameters and certain distributions would remain invariant if we were to intervene. Invariance can seldom be demonstrated experimentally. If it could, we probably wouldn’t be discussing invariance assumptions. What then is the source of the knowledge?

‘Economic theory’ seems like a natural answer, but an incomplete one. Theory has to be anchored in reality. Sooner or later, invariance needs empirical demonstration, which is easier said than done.

David Freedman: Statistical Models – Theory and Practice (CUP 2009:187)

Since econometrics aspires to explain things in terms of causes and effects, it needs loads of assumptions. Invariance is not the only limiting assumption that has to be made. Equally important are the ‘atomistic’ assumptions of additivity and linearity. read more
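To make the invariance worry concrete, here is a minimal simulation sketch (invented numbers, purely illustrative and not drawn from any cited study): a slope estimated by OLS in one ‘regime’ is exported to another regime in which the underlying relation has shifted, and the forecasts fall apart.

```python
import numpy as np

rng = np.random.default_rng(0)

# Regime 1: the 'structural' relation has slope 0.8 (hypothetical numbers)
x1 = rng.normal(10.0, 2.0, 500)
y1 = 0.8 * x1 + rng.normal(0.0, 1.0, 500)

# Regime 2: after an intervention the slope has shifted to 0.3
x2 = rng.normal(10.0, 2.0, 500)
y2 = 0.3 * x2 + rng.normal(0.0, 1.0, 500)

# OLS fit on regime-1 data only
slope, intercept = np.polyfit(x1, y1, 1)

# Export the regime-1 parameters to regime 2 and compare forecast errors
mse_within = np.mean((y1 - (slope * x1 + intercept)) ** 2)
mse_exported = np.mean((y2 - (slope * x2 + intercept)) ** 2)

print(f"estimated slope on regime-1 data:  {slope:.2f}")
print(f"forecast MSE within regime 1:      {mse_within:.2f}")
print(f"forecast MSE exported to regime 2: {mse_exported:.2f}")
```

Nothing in the regime-1 data alone warns that the parameter is not invariant; that knowledge has to come from outside the statistics.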

Theory of Employment

January 19, 2017 Leave a comment

from Asad Zaman

This is the 9th post in a sequence on re-reading Keynes. In Chapter 2 of the General Theory, Keynes wishes to develop a theory of employment. He claims that classical economics does not have a theory of employment, because it assumes that all resources will be fully employed. But a theory on which unemployment is always zero, except for frictional unemployment, cannot explain observations of high and persistent unemployment. Taking this post-Depression observation for granted, the question arises of how we can create a theory in which labor resources can be utilized at different levels. In order to show that classical theory cannot explain the observed fluctuations in the level of employment, Keynes lists the four possibilities under classical theory which could create a change in the quantity of labor being employed:

  1. A more efficiently organized labor market, which finds faster matches between the unemployed and job opportunities, would lower frictional unemployment and increase employment.
  2. A decrease in the disutility of labor would mean that laborers would be willing to accept lower wage offers, which would lead to an expansion of employment.
  3. An increase in the productivity of labor would bring greater rewards to employers and induce them to hire more labor at a given wage.
  4. An exogenous decline in the price of consumer goods purchased by laborers would increase the real wage and thereby employment. Exogenous means that demand for these goods by non-laborers decreases, causing the price decline.  read more

Catalyst’s Malick, unhappy about my reporting on US influence, hits back with false claim

January 18, 2017 1 comment

from Norbert Haering

In a news piece on rediff, one of India’s most popular news sites, Badal Malick, CEO of the US-Indian organization Catalyst, explains, via a friendly journalist, what Catalyst is doing and why my writing on Catalyst and on Washington’s meddling in the fight against cash in India was supposedly bogus. He did not convince me. Maybe he will convince you.

To very briefly summarize my piece “A Well-Kept Open Secret: Washington Is Behind India’s Brutal Demonetisation Project” (augmented here, or both in a consolidated version on Zero Hedge): I had written that the longstanding US influence, notably that of the Better Than Cash Alliance, on the fight against cash in India has been conspicuously absent from the discussion about the sudden demonetization that Prime Minister Modi decreed on 8 November 2016. I then provided the evidence of this US involvement, including the launch of Catalyst less than four weeks before the demonetization. The rediff article even mentions that Catalyst was launched at a conference in Delhi hosted by the … drumroll … Better Than Cash Alliance.

This is the part of the rediff article that deals with my writing: Read more…

Keynes’ critique of econometrics — as valid today as it was in 1939

January 18, 2017 2 comments

from Lars Syll

Renowned ‘error-statistician’ Aris Spanos maintains — in a comment on this blog a couple of weeks ago — that Keynes’ critique of econometrics, and his doubts about the reliability of inferences made when it is applied, “have been addressed or answered.”

One could, of course, say that, but the assessment of that statement hinges completely on what we mean by a question or critique being ‘addressed’ or ‘answered’. As I will argue below, Keynes’ critique is still valid and unanswered in the sense that the problems he pointed at are still with us today and ‘unsolved.’ Ignoring them — the most common practice among applied econometricians — is not to solve them.

To apply statistical and mathematical methods to the real-world economy, the econometrician has to make some quite strong assumptions. In a review of Tinbergen’s econometric work — published in The Economic Journal in 1939 — Keynes gave a comprehensive critique of it, focusing on the limiting and unreal character of the assumptions that econometric analyses build on:

Completeness: Where Tinbergen attempts to specify and quantify which different factors influence the business cycle, Keynes maintains there has to be a complete list of all the relevant factors to avoid misspecification and spurious causal claims. Usually this problem is ‘solved’ by econometricians assuming that they somehow have a ‘correct’ model specification. Keynes is, to put it mildly, unconvinced:

Read more…

New Keynesianism — neither new nor Keynesian

January 16, 2017 1 comment

from Lars Syll

Maintaining that economics is a science in the ‘true knowledge’ business, I remain a skeptic of the pretences and aspirations of ‘New Keynesian’ macroeconomics. So far, I cannot really see that it has yielded very much in terms of realist and relevant economic knowledge. And there’s nothing new or Keynesian about it.

‘New Keynesianism’ doesn’t have its roots in Keynes. It has its intellectual roots in Paul Samuelson’s ill-founded ‘neoclassical synthesis’ project, whereby he thought he could save the ‘classical’ view of the market economy as a (long-run) self-regulating, market-clearing equilibrium mechanism by adding some (short-run) frictions and rigidities in the form of sticky wages and prices.

But putting sticky-price lipstick on the ‘classical’ pig sure won’t do. The ‘New Keynesian’ pig is still neither Keynesian nor new.

The rather one-sided emphasis on usefulness and its concomitant instrumentalist justification cannot hide that ‘New Keynesians’ are unable to give supportive evidence for their belief that it is fruitful to analyze macroeconomic structures and events as the aggregated result of optimizing representative actors. After having analyzed some of its ontological and epistemological foundations, yours truly cannot but conclude that ‘New Keynesian’ macroeconomics on the whole has not delivered anything other than ‘as if’ unreal and irrelevant models.  Read more…

Ideological Macroeconomics & Increasing Inequality

January 15, 2017 Leave a comment

from Asad Zaman

Even though very few people have more than a vague idea about them, macroeconomic theories deeply affect the lives of everybody on the planet. Writings of Piketty, Stiglitz and many others, as well as personal experience of the 1%–99% divide, have created increasing awareness of the deep and growing inequalities which characterize modern capitalist economies. However, the link between inequality and macroeconomic theory has not been pointed out clearly. The fact that since the 1970s top corporate salaries have increased by 1000% while the average worker earns only 11% more is closely linked to the revolution in economic theory that occurred over the 1970s and 1980s. We will try to sketch some parts of the complex and coordinated efforts which led to the emergence of theories that provide the invisible foundations and the enabling environment for this inequality.

The oil crisis of the early 1970s destroyed the consensus on Keynesian macroeconomics and created the opportunity for ideologies disguised as economic theories to emerge. Chicago school economist Robert Lucas attacked the dominant Keynesian theories, which argued that governments must play an important role in eliminating unemployment. Guided by free-market ideology, Lucas created macroeconomic theories which suggested that government interventions are always harmful. Some elements of the Lucasian methodology provided genuinely superior alternatives to defects in existing Keynesian models. However, other elements were bizarre. Even though unemployment is a painful reality to vast numbers of people, defender-of-free-markets Lucas argued that it was a free choice. According to Lucas, the Great Depression was really the Great Vacation, in which vast numbers of people suddenly decided to stop working in order to enjoy leisure. This, and many other strange assumptions of the Lucasian alternative, led famous economists like Robert Solow to say that to engage in a serious discussion with the Chicago school would be analogous to discussing technicalities of the Battle of Austerlitz with a madman who claimed to be Napoleon Bonaparte. For example, Solow wrote that “Bob Lucas and Tom Sargent like nothing better than to get drawn into technical discussions, because then attention is attracted away from the basic weakness of the whole story. Since I find that fundamental framework ludicrous, I respond by treating it as ludicrous – that is, by laughing at it – so as not to fall into the trap of taking it seriously and passing on to matters of technique.”  read more

Relevance is not irrelevant

January 12, 2017 Leave a comment

from Lars Syll

There is something about the way macroeconomists construct their models nowadays that obviously doesn’t sit right.

Empirical evidence still only plays a minor role in mainstream economic theory, where models largely function as a substitute for empirical evidence.

One might have hoped that, humbled by the manifest failure of its theoretical pretences during the latest economic and financial crisis, the one-sided, almost religious, insistence on axiomatic-deductivist modeling as the only scientific activity worth pursuing in economics would give way to methodological pluralism based on ontological considerations rather than formalistic tractability. That has, so far, not happened.

If macroeconomic models – no matter what ilk – build on microfoundational assumptions of representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, then the warrant for supposing that the models’ conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged to the real world is obviously lacking. Incompatibility between actual behaviour and the behaviour in macroeconomic models built on representative actors and rational expectations microfoundations is not a symptom of ‘irrationality.’ It rather shows the futility of trying to represent real-world target systems with models flagrantly at odds with reality.  Read more…

On the limits of ‘statistical causality’

January 9, 2017 Leave a comment

from Lars Syll

If contributions made by statisticians to the understanding of causation are to be taken over with advantage in any specific field of inquiry, then what is crucial is that the right relationship should exist between statistical and subject-matter concerns …

Where the ultimate aim of research is not prediction per se but rather causal explanation, an idea of causation that is expressed in terms of predictive power — as, for example, ‘Granger’ causation — is likely to be found wanting. Causal explanations cannot be arrived at through statistical methodology alone: a subject-matter input is also required in the form of background knowledge and, crucially, theory …

Likewise, the idea of causation as consequential manipulation is apt to research that can be undertaken primarily through experimental methods and, especially to ‘practical science’ where the central concern is indeed with ‘the consequences of performing particular acts’. The development of this idea in the context of medical and agricultural research is as understandable as the development of that of causation as robust dependence within applied econometrics. However, the extension of the manipulative approach into sociology would not appear promising, other than in rather special circumstances … The more fundamental difficulty is that, under the — highly anthropocentric — principle of ‘no causation without manipulation’, the recognition that can be given to the action of individuals as having causal force is in fact peculiarly limited.

John H. Goldthorpe

Causality in social sciences — and economics — can never solely be a question of statistical inference. Causality entails more than predictability, and really explaining social phenomena in depth requires theory. Analysis of variation — the foundation of all econometrics — can never in itself reveal how these variations are brought about. Only when we are able to tie actions, processes or structures to the statistical relations detected can we say that we are getting at relevant explanations of causation.  Read more…
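As a concrete illustration of why predictive power is not causal force, here is a small simulation sketch (invented data, purely illustrative): lagged x improves one-step-ahead forecasts of y, and so would pass a Granger-type test, even though x has no causal effect on y at all; an unobserved common driver does all the work.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 2000

# Unobserved common driver: a persistent AR(1) process
z = np.zeros(T)
for t in range(1, T):
    z[t] = 0.9 * z[t - 1] + rng.normal()

x = z + 0.5 * rng.normal(size=T)               # x reflects z immediately
y = np.roll(z, 1) + 0.5 * rng.normal(size=T)   # y reflects z with a one-period delay
y[0] = 0.0                                     # discard the wrap-around artifact of roll()

def forecast_mse(target, predictors):
    """Mean squared error of an OLS one-step-ahead forecast of `target`."""
    X = np.column_stack([np.ones(len(target))] + predictors)
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.mean((target - X @ beta) ** 2)

# Forecast y_t from its own past, then add lagged x as an extra predictor
y_t, y_lag, x_lag = y[1:], y[:-1], x[:-1]
print(f"forecast MSE, y's own past only: {forecast_mse(y_t, [y_lag]):.3f}")
print(f"forecast MSE, adding lagged x:   {forecast_mse(y_t, [y_lag, x_lag]):.3f}")

# Lagged x sharply improves the forecast, yet intervening on x would change
# nothing about y: the common driver z is doing all the causal work.
```

The statistics alone cannot distinguish this case from a genuinely causal one; only subject-matter knowledge about the common driver can.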

More evidence of early US involvement in Indian demonetisation

January 9, 2017 2 comments

from Norbert Haering

When Prime Minister Narendra Modi took the bulk of Indian cash out of circulation, he caused great hardship for many Indians, while a disruption-loving tech elite and political establishment asked for optimism and patience. In an earlier piece I provided some indications of US involvement in that scheme. In this piece, I add more, including earlier, evidence, summarize the evidence, and ask whether it is reasonably compatible with the interpretation that the initiative was really Modi’s.

There is no firm proof or admission, as yet, that the decision was taken at the behest of foreign institutions. In a December report by the news agency Reuters titled “Who knew?”, unnamed Indian official sources would have us believe that only the prime minister himself and a handful of people knew of the plans. The Reuters report names only one of the supposed five who knew, a high-ranking official of the finance ministry. Tellingly, there is not a single mention of any foreign involvement, despite the finance ministry’s formal cooperation with USAID aimed at pushing back cash in favor of digital payments.

There is plenty of evidence that US government entities, foundations and other institutions were intensely involved. We briefly summarize the evidence already presented in an earlier piece, bring it together with some more evidence, and then ask whether this evidence is reasonably compatible with the interpretation that the Indian government made its own demonetization plan, and either did it alone or – somewhat more plausibly – enlisted all the help and advice it could get, including from abroad. This is an interpretation that some readers of my earlier piece have brought forward. In the following concise list of evidence, items 3, 6 and 7 have been discussed more extensively in my earlier piece; items 1, 2, 4 and 5 are new.

Read more…

Keynesian Complexity

January 7, 2017 5 comments

from Asad Zaman

This continues the sequence of posts on re-reading Keynes. The fundamental point about the labor market made in Chapter 2 is that micro-level negotiations on wages between firms and laborers do not determine the real wage in the macroeconomy. Before explaining this point in detail, we want to show how it is just a special case of the general idea that the economy is a complex system which cannot be understood by looking at simple subsystems.

The idea of complex systems is beautifully illustrated by the parable of the blind men and the elephant. Each one understood correctly and accurately one small part of the big picture. When we don’t understand the system as a whole, the descriptions of subsystems appear conflicting and contradictory. Once we have an understanding of the complex system, we can assemble the partial insights into a coherent whole. The main contribution of Keynes can be understood as an attempt to describe the economy as a complex system. Unfortunately, most of his followers were blind to the main insights of Keynes. Accordingly, there have been many different interpretations of Keynes; some followers saw the trunk, others the legs, and yet others the tail of the system that Keynes was describing. But no one appears to have understood the fundamental insight of Keynesian complexity: the system as a whole does not act as a simple aggregate of the actions of the individual agents within the system. Pre-Keynesian macroeconomics was based centrally on the misunderstanding that the macroeconomy can be understood by scaling up the microeconomic behaviors of individual agents. While Keynes forcefully rejected this thesis and created a complex-system view of the macroeconomy, simple-minded followers failed to understand complexity and went back to pre-Keynesian views.  Read more…

The best advice you will get this year

January 5, 2017 8 comments

from Lars Syll


Getting it right about the causal structure of a real system in front of us is often a matter of great importance. It is not appropriate to offer the authority of formalism over serious consideration of what are the best assumptions to make about the structure at hand …

Where we don’t know, we don’t know. When we have to proceed with little information we should make the best evaluation we can for the case at hand — and hedge our bets heavily; we should not proceed with false confidence having plumped either for or against some specific hypothesis … for how the given system works when we really have no idea.

Trying to get around this lack of knowledge, mainstream economists, in their quest for deductive certainty in their models, standardly assume things like ‘independence,’ ‘linearity,’ ‘additivity,’ ‘stability,’ ‘manipulability,’ ‘variation-free variables,’ ‘faithfulness,’ ‘invariance,’ ‘implementation neutrality,’ ‘superexogeneity,’ etc., etc.  Read more…

A well-kept open secret: Washington is behind India’s brutal experiment of abolishing most cash

January 3, 2017 11 comments

from Norbert Haering

In early November, without warning, the Indian government declared the two largest denomination bills invalid, abolishing over 80 percent of circulating cash by value. Amidst all the commotion and outrage this caused, nobody seems to have taken note of the decisive role that Washington played in this. That is surprising, as Washington’s role has been disguised only very superficially.

US President Barack Obama has declared the strategic partnership with India a priority of his foreign policy. China needs to be reined in. In the context of this partnership, the US government’s development agency USAID has negotiated cooperation agreements with the Indian Ministry of Finance. One of these has the declared goal of pushing back the use of cash in favor of digital payments, in India and globally.

On November 8, Indian Prime Minister Narendra Modi announced that the two largest denominations of banknotes could not be used for payments any more, with almost immediate effect. Owners could only recoup their value by putting them into a bank account before the short grace period expired. The amount of cash that banks were allowed to pay out to individual customers was severely restricted. Almost half of all Indians have no bank account and many do not even have a bank nearby. The economy is largely cash based. Thus, a severe shortage of cash ensued. Those who suffered the most were the poorest and most vulnerable. They had additional difficulty earning their meager living in the informal sector or paying for essential goods and services like food, medicine or hospitals. Chaos and fraud reigned well into December.

Not even four weeks before this assault on Indians, USAID had announced the establishment of “Catalyst: Inclusive Cashless Payment Partnership”, with the goal of effecting a quantum leap in cashless payment in India. The press statement of October 14 says that Catalyst “marks the next phase of partnership between USAID and Ministry of Finance to facilitate universal financial inclusion”. The statement does not show up in the list of press statements on the USAID website (anymore?). Not even filtering statements with the word “India” would bring it up. To find it, you seem to have to know that it exists, or stumble upon it in a web search. Indeed, this and other statements, which seemed rather boring before, have become a lot more interesting and revealing after November 8.

Read more…

The best economics article of 2016

January 2, 2017 2 comments

from Lars Syll

The best economics article of 2016 was, in my opinion, Paul Romer’s extremely well-written and brave frontal attack on the theories that have put macroeconomics on a path of ‘intellectual regress’ for three decades now:

Macroeconomists got comfortable with the idea that fluctuations in macroeconomic aggregates are caused by imaginary shocks, instead of actions that people take, after Kydland and Prescott (1982) launched the real business cycle (RBC) model …

In response to the observation that the shocks are imaginary, a standard defence invokes Milton Friedman’s (1953) methodological assertion from unnamed authority that “the more significant the theory, the more unrealistic the assumptions.” More recently, “all models are false” seems to have become the universal hand-wave for dismissing any fact that does not conform to the model that is the current favourite.

The noncommittal relationship with the truth revealed by these methodological evasions and the “less than totally convinced …” dismissal of fact goes so far beyond post-modern irony that it deserves its own label. I suggest “post-real.”

Paul Romer

There are many kinds of useless ‘post-real’ economics held in high regard within the mainstream economics establishment today. Few — if any — deserve that regard less than the macroeconomic theory/method — mostly connected with Nobel laureates Finn Kydland, Robert Lucas, Edward Prescott and Thomas Sargent — called calibration.  Read more…

New study shows marginal productivity theory has only a ‘negligible’ link to reality

December 30, 2016 9 comments

from Lars Syll

The correlation between high executive pay and good performance is “negligible”, a new academic study has found, providing reformers with fresh evidence that a shake-up of Britain’s corporate remuneration systems is overdue.

Although big company bosses enjoyed pay rises of more than 80 per cent in a decade, performance as measured by economic returns on invested capital was less than 1 per cent over the period, the paper by Lancaster University Management School says.

“Our findings suggest a material disconnect between pay and fundamental value generation for, and returns to, capital providers,” the authors of the report said.

In a study of more than a decade of data on the pay and performance of Britain’s 350 biggest listed companies, Weijia Li and Steven Young found that remuneration had increased 82 per cent in real terms over the 11 years to 2014 … The research found that the median economic return on invested capital, a preferable measure, was less than 1 per cent over the same period.

Patrick Jenkins/Financial Times

Mainstream economics textbooks usually refer to the interrelationship between technological development and education as the main causal force behind increased inequality. If the educational system (supply) develops at the same pace as technology (demand), there should be no increase, ceteris paribus, in the ratio between high-income (highly educated) groups and low-income (low-education) groups. In the race between technology and education, the proliferation of skill-biased technological change has, however, allegedly increased the premium for the highly educated group.  Read more…

Keynes betrayed

December 27, 2016 8 comments

from Lars Syll

To complete the reconciliation of Keynesian economics with general equilibrium theory, Paul Samuelson introduced the neoclassical synthesis in 1955 …

In this view of the world, high unemployment is a temporary phenomenon caused by the slow adjustment of money wages and money prices. In Samuelson’s vision, the economy is Keynesian in the short run, when some wages and prices are sticky. It is classical in the long run when all wages and prices have had time to adjust….

Although Samuelson’s neoclassical synthesis was tidy, it did not have much to do with the vision of the General Theory …

In Keynes’ vision, there is no tendency for the economy to self-correct. Left to itself, a market economy may never recover from a depression and the unemployment rate may remain too high forever. In contrast, in Samuelson’s neoclassical synthesis, unemployment causes money wages and prices to fall. As the money wage and the money price fall, aggregate demand rises and full employment is restored, even if government takes no corrective action. By slipping wage and price adjustment into his theory, Samuelson reintroduced classical ideas by the back door—a sleight of hand that did not go unnoticed by Keynes’ contemporaries in Cambridge, England. Famously, Joan Robinson referred to Samuelson’s approach as ‘bastard Keynesianism.’

The New Keynesian agenda is the child of the neoclassical synthesis and, like the IS-LM model before it, New Keynesian economics inherits the mistakes of the bastard Keynesians. It misses two key Keynesian concepts: (1) there are multiple equilibrium unemployment rates and (2) beliefs are fundamental.

Not that long ago Paul Krugman had a post up on his blog telling us that what he and many others do is “sorta-kinda neoclassical because it takes the maximization-and-equilibrium world as a starting point” and that “New Keynesian models are intertemporal maximization modified with sticky prices and a few other deviations.”  Read more…

Military Keynesianism and the Military-Industrial Complex

December 27, 2016 15 comments

from Jonathan Nitzan

Read more…

P7: GT02 Keynesian Unemployment

December 24, 2016 11 comments

from Asad Zaman

This 7th post in a series about re-reading Keynes starts the discussion of Chapter 2 of the General Theory, which deals with the classical (and neoclassical) postulates characterizing the labor market. The astonishing fact is that Keynes’ central arguments regarding how the labor market can fail to be at equilibrium, despite flexible wages, were never understood. As a consequence, the theory of the labor market is taught today exactly as it was prior to Keynes, completely disregarding the Keynesian objections and the Keynesian alternative. This post makes a start on Chapter 2, and the analysis will be continued in later posts.

In this chapter, Keynes formulates and rebuts the (neo)classical theory of the labor market and presents an alternative theory of employment. This chapter was apparently never understood by economists, who misinterpreted it as stating that unemployment arises due to price rigidities. In fact, Keynes had held this position earlier, but renounces it explicitly in this chapter. His theory of employment states that the real wage is an “emergent” phenomenon. That is, micro-level decisions and actions of laborers and firms are based on nominal wages, but the complex economic system itself determines the general level of prices, which is not under the control of individual agents. So the real wage is out of reach of individual actors, and even though all parties may try to reduce real wages, they may fail to do so, because prices may respond in unanticipated ways.

Keynes starts out by stating the classical postulates for the labor market, which continue to be the basis of modern labor economics.  read more

Keynes’ critique of econometrics — the nodal point

December 23, 2016 Leave a comment

from Lars Syll

In my judgment, the practical usefulness of those modes of inference, here termed Universal and Statistical Induction, on the validity of which the boasted knowledge of modern science depends, can only exist—and I do not now pause to inquire again whether such an argument must be circular—if the universe of phenomena does in fact present those peculiar characteristics of atomism and limited variety which appear more and more clearly as the ultimate result to which material science is tending …

The physicists of the nineteenth century have reduced matter to the collisions and arrangements of particles, between which the ultimate qualitative differences are very few …

The validity of some current modes of inference may depend on the assumption that it is to material of this kind that we are applying them … Professors of probability have been often and justly derided for arguing as if nature were an urn containing black and white balls in fixed proportions. Quetelet once declared in so many words—“l’urne que nous interrogeons, c’est la nature.” But again in the history of science the methods of astrology may prove useful to the astronomer; and it may turn out to be true—reversing Quetelet’s expression—that “La nature que nous interrogeons, c’est une urne”.

Professors of probability and statistics, yes. And more or less every mainstream economist!

The standard view in statistics – and the axiomatic probability theory underlying it – is to a large extent based on the rather simplistic idea that ‘more is better.’ But as Keynes argues, ‘more of the same’ is not what is important when making inductive inferences. It’s rather a question of ‘more but different.’  Read more…

The non-existence of Paul Krugman’s ‘Keynes/Hicks macroeconomic theory’

December 21, 2016 2 comments

from Lars Syll

Paul Krugman has in numerous posts on his blog tried to defend “the whole enterprise of Keynes/Hicks macroeconomic theory” and especially his own somewhat idiosyncratic version of IS-LM.

The main problem is simpliciter that there is no such thing as a Keynes-Hicks macroeconomic theory!

So, let us get some things straight.

There is nothing in the post-General Theory writings of Keynes that suggests he considered Hicks’s IS-LM anything like a faithful rendering of his thought. In Keynes’s canonical statement of the essence of his theory, in the 1937 QJE article, there is nothing to even suggest that Keynes would have thought the existence of a Keynes-Hicks IS-LM theory anything but pure nonsense. So of course there can’t be any “vindication for the whole enterprise of Keynes/Hicks macroeconomic theory” – simply because “Keynes/Hicks” never existed.

And it gets even worse!

John Hicks, the man who invented IS-LM in his 1937 Econometrica review of Keynes’ General Theory – ‘Mr. Keynes and the “Classics”: A Suggested Interpretation’ – returned to it in a 1980 article in the Journal of Post Keynesian Economics – ‘IS-LM: An Explanation’. Self-critically he wrote:  Read more…