
Modern macroeconomics — theory based on misleading illusions

from Lars Syll

Standard new Keynesian macroeconomics essentially abstracts away from most of what is important in macroeconomics. To an even greater extent, this is true of the dynamic stochastic general equilibrium (DSGE) models that are the workhorse of central bank staffs and much practically oriented academic work.

Why? New Keynesian models imply that stabilization policies cannot affect the average level of output over time and that the only effect policy can have is on the amplitude of economic fluctuations, not on the level of output. This assumption is problematic at a number of levels …

As macroeconomics was transformed in response to the Depression of the 1930s and the inflation of the 1970s, another 40 years later it should again be transformed in response to stagnation in the industrial world.

Lawrence Summers

Mainstream macroeconomics is stuck with crazy models — and ‘New Keynesian’ macroeconomics and DSGE models certainly, as Summers puts it, “essentially abstract away from most of what is important in macroeconomics.”

Let me just give one example.

A lot of mainstream economists out there still think that price and wage rigidities are the prime movers behind unemployment. What is even worse — I’m totally gobsmacked every time I come across this utterly ridiculous misapprehension — is that some of them even think that these rigidities are the reason John Maynard Keynes gave for the high unemployment of the Great Depression. This is of course pure nonsense. For although Keynes in General Theory devoted substantial attention to the subject of wage and price rigidities, he certainly did not hold this view.

Since unions/workers, contrary to classical assumptions, make wage bargains in nominal terms, they will – according to Keynes – accept lower real wages caused by higher prices, but resist lower real wages caused by lower nominal wages. However, Keynes held it incorrect to attribute “cyclical” unemployment to this asymmetric behaviour. During the depression money wages fell significantly and – as Keynes noted – unemployment still grew. Lowering nominal wages, in other words, does not generally lower unemployment.

In any specific labour market, lower wages could, of course, raise the demand for labour. But a general reduction in money wages would leave real wages more or less unchanged. The reasoning of the classical economists was, according to Keynes, a flagrant example of the “fallacy of composition”: by assuming that because unions/workers in a specific labour market could negotiate real-wage reductions by lowering nominal wages, unions/workers in general could do the same, the classics confused micro with macro.

Lowering nominal wages could not – according to Keynes – clear the labour market. Lowering wages – and possibly prices – could, perhaps, lower interest rates and increase investment. But to Keynes it would be much easier to achieve that effect by increasing the money supply. In any case, wage reductions were not seen by Keynes as a general substitute for an expansionary monetary or fiscal policy.

Even if lowering wages may have some positive impacts, there are also negative impacts that weigh more heavily – deteriorating management-union relations, expectations of ongoing wage cuts causing investment to be delayed, debt deflation, et cetera.

So what Keynes actually argued in General Theory was that the classical proposition – that lowering wages would lower unemployment and ultimately take economies out of depressions – was ill-founded and basically wrong.

To Keynes, flexible wages would only make things worse by leading to erratic price-fluctuations. The basic explanation for unemployment is insufficient aggregate demand, and that is mostly determined outside the labor market.

The classical school [maintains that] while the demand for labour at the existing money-wage may be satisfied before everyone willing to work at this wage is employed, this situation is due to an open or tacit agreement amongst workers not to work for less, and that if labour as a whole would agree to a reduction of money-wages more employment would be forthcoming. If this is the case, such unemployment, though apparently involuntary, is not strictly so, and ought to be included under the above category of ‘voluntary’ unemployment due to the effects of collective bargaining, etc …
The classical theory … is best regarded as a theory of distribution in conditions of full employment. So long as the classical postulates hold good, unemployment, which is in the above sense involuntary, cannot occur. Apparent unemployment must, therefore, be the result either of temporary loss of work of the ‘between jobs’ type or of intermittent demand for highly specialised resources or of the effect of a trade union ‘closed shop’ on the employment of free labour. Thus writers in the classical tradition, overlooking the special assumption underlying their theory, have been driven inevitably to the conclusion, perfectly logical on their assumption, that apparent unemployment (apart from the admitted exceptions) must be due at bottom to a refusal by the unemployed factors to accept a reward which corresponds to their marginal productivity …

Obviously, however, if the classical theory is only applicable to the case of full employment, it is fallacious to apply it to the problems of involuntary unemployment – if there be such a thing (and who will deny it?). The classical theorists resemble Euclidean geometers in a non-Euclidean world who, discovering that in experience straight lines apparently parallel often meet, rebuke the lines for not keeping straight – as the only remedy for the unfortunate collisions which are occurring. Yet, in truth, there is no remedy except to throw over the axiom of parallels and to work out a non-Euclidean geometry. Something similar is required to-day in economics. We need to throw over the second postulate of the classical doctrine and to work out the behaviour of a system in which involuntary unemployment in the strict sense is possible.

J. M. Keynes, General Theory

People calling themselves ‘New Keynesians’ ought to be rather embarrassed by the fact that the kind of microfounded dynamic stochastic general equilibrium models they use cannot incorporate such a basic fact of reality as involuntary unemployment!

Of course, given that they work with microfounded representative-agent models, this should come as no surprise. If one representative agent is employed, all representative agents are. The kind of unemployment that occurs is voluntary, since it consists only of the adjustments of hours of work that these optimizing agents make to maximize their utility. Maybe that’s also the reason prominent ‘New Keynesian’ macroeconomist Simon Wren-Lewis can write

I think the labour market is not central, which was what I was trying to say in my post. It matters in a [New Keynesian] model only in so far as it adds to any change to inflation, which matters only in so far as it influences central bank’s decisions on interest rates.

In the basic DSGE models used by most ‘New Keynesians’, the labour market always clears – responding to a changing interest rate, expected lifetime income, or real wages, the representative agent maximizes her utility function by varying her labour supply, money holdings and consumption over time. Most importantly, if the real wage somehow deviates from its “equilibrium value,” the representative agent adjusts her labour supply, so that when the real wage is higher than its “equilibrium value,” labour supply is increased, and when the real wage is below its “equilibrium value,” labour supply is decreased.

In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence, unemployment is totally voluntary. To be unemployed is something one optimally chooses to be.
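To see where this comes from, recall the stripped-down household problem at the heart of a baseline New Keynesian model (standard Galí-type textbook notation; a generic sketch, not a quotation from any particular model criticized above):

\[
\max_{\{C_t,\,N_t\}} \; E_0 \sum_{t=0}^{\infty} \beta^t \left( \frac{C_t^{1-\sigma}}{1-\sigma} - \frac{N_t^{1+\varphi}}{1+\varphi} \right)
\quad \text{s.t.} \quad P_t C_t + Q_t B_t \le B_{t-1} + W_t N_t + T_t ,
\]

with the intratemporal first-order condition

\[
\frac{W_t}{P_t} = C_t^{\sigma} N_t^{\varphi} .
\]

Hours worked always sit on this labour-supply schedule: whatever employment turns out to be, it is by construction exactly the amount of work the household wishes to supply at the going real wage. Involuntary unemployment cannot even be written down in such a framework.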

In a blogpost discussing ‘New Keynesian’ macroeconomics and the definition of neoclassical economics, Paul Krugman writes:

So, what is neoclassical economics? … I think we mean in practice economics based on maximization-with-equilibrium. We imagine an economy consisting of rational, self-interested players, and suppose that economic outcomes reflect a situation in which each player is doing the best he, she, or it can given the actions of all the other players …

Some economists really really believe that life is like this — and they have a significant impact on our discourse. But the rest of us are well aware that this is nothing but a metaphor; nonetheless, most of what I and many others do is sorta-kinda neoclassical because it takes the maximization-and-equilibrium world as a starting point or baseline, which is then modified — but not too much — in the direction of realism.

This is, not to put too fine a point on it, very much true of Keynesian economics as practiced … New Keynesian models are intertemporal maximization modified with sticky prices and a few other deviations …

Why do things this way? Simplicity and clarity. In the real world, people are fairly rational and more or less self-interested; the qualifiers are complicated to model, so it makes sense to see what you can learn by dropping them. And dynamics are hard, whereas looking at the presumed end state of a dynamic process — an equilibrium — may tell you much of what you want to know.

Being myself sorta-kinda Keynesian I find this analysis utterly unconvincing.

Maintaining that economics is a science in the “true knowledge” business, yours truly remains a skeptic of the pretences and aspirations of ‘New Keynesian’ macroeconomics. So far, I cannot really see that it has yielded very much in terms of realistic and relevant economic knowledge.

The marginal return on its ever higher technical sophistication in no way makes up for the lack of serious underlabouring of its deeper philosophical and methodological foundations. The rather one-sided emphasis on usefulness, and its concomitant instrumentalist justification, cannot hide the fact that ‘New Keynesians’ give no supportive evidence for considering it fruitful to analyze macroeconomic structures and events as the aggregated result of optimizing representative actors. After having analyzed some of its ontological and epistemological foundations, I cannot but conclude that ‘New Keynesian’ macroeconomics on the whole has delivered nothing but unreal and irrelevant “as if” models.

Science should help us penetrate to “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts” [Keynes 1971-89 vol XVII:427]. We should look out for causal relations. But models can never be more than a starting point in that endeavour. There is always the possibility that there are other variables – of vital importance and, although perhaps unobservable and non-additive, not necessarily epistemologically inaccessible – that were not considered for the model.

The kinds of laws and relations that economics has established are laws and relations about entities in models that presuppose causal mechanisms being atomistic and additive. When causal mechanisms operate in the real world they only do so in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so (as a rule) only because we engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately, that also makes most of the achievements of econometrics – as most of the contemporary endeavours of economic theoretical modeling – rather useless.

Economic policies cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models can get hold of correlations between different “variables” is not enough. If they cannot get at the causal structure that generated the data, they are not really “identified.” Dynamic stochastic general equilibrium (DSGE) macroeconomists – including ‘New Keynesians’ – have drawn the conclusion that the way to deal with unstable relations is to construct models with clear microfoundations, in which forward-looking optimizing individuals and robust, deep behavioural parameters are presumed to remain stable even under changes in economic policy.
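A toy simulation – my own stylized example, with made-up numbers and no economic content beyond the point at issue – shows how an estimated “relation” can look perfectly solid while the mechanism that generated the data has shifted underneath it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Data-generating process whose coefficient shifts mid-sample:
# y = 2.0 * x + noise in the first half, y = 0.5 * x + noise in the second.
n = 200
x = rng.normal(size=n)
beta = np.where(np.arange(n) < n // 2, 2.0, 0.5)
y = beta * x + rng.normal(scale=0.5, size=n)

# A regression that assumes one stable, invariant coefficient
# happily reports a single estimate (roughly the average of 2.0 and 0.5).
X = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("full-sample estimate:", round(coef[1], 2))

# Estimating each half separately reveals the instability.
for name, sl in [("first half", slice(0, n // 2)), ("second half", slice(n // 2, n))]:
    c, *_ = np.linalg.lstsq(X[sl], y[sl], rcond=None)
    print(name, "estimate:", round(c[1], 2))
```

A forecast or policy conclusion built on the full-sample estimate would be wrong in both regimes, and nothing in the standard regression output flags the break unless one explicitly tests for it.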

Here we are getting close to the heart of darkness in ‘New Keynesian’ macroeconomics. When ‘New Keynesian’ economists think that they can rigorously deduce the aggregate effects of (representative) actors with their reductionist microfoundational methodology, they have to turn a blind eye to the emergent properties that characterize all open social systems – including the economic system. The interaction between animal spirits, trust, confidence, institutions, etc., cannot be deduced or reduced to a question answerable at the individual level. Macroeconomic structures and phenomena have to be analyzed also on their own terms. And although one may easily agree with Krugman’s emphasis on simple models, the simplifications used may have to be simplifications adequate for macroeconomics, not those adequate for microeconomics.

‘New Keynesian’ macromodels describe imaginary worlds using a combination of formal sign systems such as mathematics and ordinary language. The descriptions made are extremely thin and to a large degree disconnected from the specific contexts of the target system that one (usually) wants to (partially) represent. This is not by chance. These closed formalistic-mathematical theories and models are constructed for the purpose of being able to deliver purportedly rigorous deductions that may somehow be exportable to the target system. By analyzing a few causal factors in their “macroeconomic laboratories,” they hope they can perform “thought experiments” and observe how these factors operate on their own and without impediments or confounders.

Unfortunately, this is not so. The reason is that economic causes never act in a socio-economic vacuum. Causes have to be set in a contextual structure to be able to operate. This structure has to take some form or other, but instead of incorporating structures that are true to the target system, the settings made in these macroeconomic models are based on formalistic mathematical tractability. In the models they appear as unrealistic assumptions, usually playing a decisive role in getting the deductive machinery to deliver “precise” and “rigorous” results. This, of course, makes exporting to real-world target systems problematic, since these models – as part of a deductivist covering-law tradition in economics – are thought to deliver general and far-reaching conclusions that are externally valid.

But how can we be sure the lessons learned in these theories and models have external validity, when they are based on highly specific unrealistic assumptions? As a rule, the more specific and concrete the structures, the less generalizable the results. Admitting that we can in principle move from (partial) falsehoods in theories and models to truth in real-world target systems does not take us very far, unless a thorough explication of the relation between theory, model and the real-world target system is made. If models assume representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, the warrant for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged over to the real world is obviously lacking. Having a deductive warrant for things happening in a closed model is no guarantee that they are preserved when applied to the real world.

In microeconomics we know that aggregation really presupposes homothetic and identical preferences, something that almost never exists in real economies. The results given by these assumptions are therefore not robust and do not capture the underlying mechanisms at work in any real economy. And models that are critically based on particular and odd assumptions – and are neither robust nor congruent with real-world economies – are of questionable value.
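The textbook statement of this aggregation condition is the Gorman polar form (recalled here from standard microeconomic theory, not from any of the authors quoted above): exact aggregation into a representative consumer requires every individual’s indirect utility function to be affine in income with a common slope,

\[
v_i(p, m_i) = a_i(p) + b(p)\, m_i , \qquad i = 1, \dots, N ,
\]

so that aggregate demand depends only on prices and total income \(\sum_i m_i\), not on how that income is distributed. Identical homothetic preferences are one special case that satisfies it. Outside this knife-edge class of preferences, the “representative consumer” behind the representative agent simply does not exist.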

Even if economies naturally presuppose individuals, it does not follow that we can infer or explain macroeconomic phenomena solely from knowledge of these individuals. Macroeconomics is to a large extent emergent and cannot be reduced to a simple summation of micro-phenomena. Moreover, as we have already argued, even these microfoundations aren’t immutable. The “deep parameters” of ‘New Keynesian’ DSGE models – “tastes” and “technology” – are not really the bedrock of constancy that their users believe (or pretend) them to be.

So I cannot concur with Krugman — and other sorta-kinda ‘New Keynesians’ — when they try to reduce Keynesian economics to “intertemporal maximization modified with sticky prices and a few other deviations.”

The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging is de facto made, macroeconomic model-building is little more than “hand waving” that gives us little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

To Keynes this was self-evident. But obviously not so to ‘New Keynesians.’

Larry Summers is right — we certainly need “a new Keynesian economics that is more Keynesian and less new.”

  1. ghholtham
    October 9, 2020 at 10:21 pm

    Lars Syll’s exegesis of Keynes’s views is correct to the best of my knowledge and his criticisms of the New Keynesian school and of DSGE modelling are entirely correct. Although most macroeconomists would fiercely resist the conclusion, the fact is the subject has made no progress in half a century. For everything that has been learned something else has been wilfully forgotten. It might be instructive to consider how this state of affairs came about.
    After the Keynesian insights were accepted into mainstream thinking, efforts were made to build empirical models of the economy. This was the period in which Lawrence Klein received a Nobel Prize for efforts to build empirical models with parameters derived from analysis of time series of data collected by national statistical agencies. The effort was bedevilled by the fact that economies evolve and are subject to unforeseen events. Things happen that bring to the fore mechanisms that had previously been insignificant and therefore neglected. The oil shocks and inflation of the 1970s were a key example. They led to stagflation which was widely asserted to “disprove” Keynesian economics. But neither Keynes nor the Kleinian models had ever considered the effect of a violent shock to the terms of trade. Events required an extension of the theory, not its wholesale rejection. Nonetheless the failure of models to deal with new events was seen as owing to their lack of “generality”. A true, structural model, it was thought, would incorporate changes of behaviour when something new happened and would not merely calibrate and replicate past behaviour as Kleinian models did.
    This reflection led to a search for “microfoundations” and the notion of “rational expectations”. Instead of equations that simply reflected past movements of economic aggregates, the new approach would be based on individual behaviour that would respond flexibly to new developments. It was a fine aspiration but implementation was not simply defective, it was ridiculous. There is no simple way to aggregate the behaviour of disparate individuals or to model what they know or don’t know, so the search for microfoundations degenerated into modelling the behaviour of an all-knowing (and non-existent) representative agent. This led to all the vices that Lars Syll describes. The new models in almost all circumstances perform much worse in forecasting than the models they replaced. Assuming a “representative agent” abolishes the co-ordination issues that are at the heart of real macroeconomics, and ignoring the consciousness of uncertainty and doubt that afflicts all economic agents in the real world leads to models that seriously mislead both in forecasting and policy design.
    The question is: where do we go from here? The criticisms of old-fashioned “hydraulic” macro-modelling were correct. But the “cure” was worse than the disease and I am at a loss to know why intelligent and well-meaning people like Krugman and Wren-Lewis do not recognise as much. When they talk policy, they talk sense and it does not seem to bother them that that sense is at variance with the modelling methods they follow.
    If we want to go past the simple calibration of macro-economic relations without benefit of “microfoundations”, the only promising approach is so-called agent-based models that use empirical research to model what groups or individuals actually do and aggregate those behaviours by simulation. That leads to “black-box” modelling with unpredictable outcomes that take a lot of effort to understand. As an approach it does not lead to “theorems” or analytic solutions of equations so beloved of the neo-classicists. The bad news is that’s how the world is. Unless and until some future economic Einstein discovers that simple conclusions can be derived from evolving complexity, simulation of data-based, empirically derived models with no generally true characteristics is the best we can do.
    The one false note in Lars’ fair and cogent critique is his ritual side-swipe at econometrics. If assumptions made in modelling are to be empirically based and not the result of introspection and thumb-sucking they will require the statistical evaluation of data, with all the risks and traps that entails. Lars understands the mess we are in but wants to close one of the paths that could lead out of the morass.

    • October 10, 2020 at 2:02 pm

      Thanks for your interesting comment, Gerry.
      Since the one thing we obviously disagree on is the value of statistical/econometric modelling, let me take the opportunity to further situate/explain my “side-swipe” at it.
      When it comes to real-world social systems, my view is that they are usually not governed by stable causal mechanisms or capacities. So the kinds of ‘laws’ and relations that econometrics has established are, ipso facto, laws and relations only about entities in models that presuppose causal mechanisms and variables — and the relationship between them — being linear, additive, homogenous, stable, invariant and atomistic. But when causal mechanisms operate in the real world they only do so in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. Since statisticians and econometricians, as far as I can see, haven’t been able to convincingly warrant their assumptions of homogeneity, stability, invariance, independence, additivity, etc., as being ontologically isomorphic to real-world economic systems, I remain a sceptic of the scientific aspirations of econometrics. Econometric patterns should never be seen as anything other than possible clues to follow. For me, as a critical realist, it is obvious that behind observable data there are real structures and mechanisms operating, things that are, at least if we want to understand/explain things in the real world, more important to get hold of than simply correlating and regressing observable variables. In the end, this is what it all boils down to. We all know that many activities, relations, processes and events are of the Keynesian uncertainty type — and then the statistical-econometric approach simply is not applicable, since it builds on the fundamental assumption that uncertainty can be reduced to risk. It cannot.

  2. October 10, 2020 at 11:06 am

    Lars says:
    “Where ‘New Keynesian’ economists think that they can rigorously deduce the aggregate effects of (representative) actors with their reductionist microfoundational methodology, they have to put a blind eye on the emergent properties that characterize all open social systems – including the economic system”.

    So economic systems evolve. Does that necessarily improve them?

    “The interaction between animal spirits, trust, confidence, institutions etc., cannot be deduced or reduced to a question answerable on the individual level”.

    But that does not mean people can be left out of the model. These variables exist, and the macro model must permit discussion of e.g. whether their interactions differ in different parts of the system.

    “If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around. To Keynes this was self-evident. But obviously not so to ‘New Keynesians.’”

    Yes. But the economic system has evolved. Has the Keynesian mathematics?

    Gerald says:
    “Unless and until some future economic Einstein discovers that simple conclusions can be derived from evolving complexity, simulation of data-based, empirically derived models with no generally true characteristics is the best we can do”.

    Gerald, the “Einstein” needs to be an up-to-date mathematician, not an economist. And read Waldrop on “Complexity”. The simple conclusion (the logistic equation) exists, and was indeed not derived, but an insight justified by simulation. It corresponds to the PID cybernetic relationship I’ve personally dwelt on (the choice of name serendipitous in light of Ashby’s “requisite variety”), which I’ve tried to show the mathematician Keynes was moving towards.

    “The one false note in Lars’ fair and cogent critique is his ritual side-swipe at econometrics”

    Again, you miss the point that Keynes knew more about statistics than his critics, and I myself have constantly pointed out that dice-throwing simulates a frequency, indicating bias that can be sufficiently eliminated not by adding more data (induction) but by Newton’s method of successive approximation.

    So the moral of this story is that the economists should beware their own arrogance, whether or not that is a cover for uncertainty. It stops them learning.

  3. ghholtham
    October 10, 2020 at 5:04 pm

    It might be useful to distinguish two kinds of uncertainty. Think of them as outside and inside uncertainty. Inside uncertainty concerns the operation of processes that have been identified and are under examination. Outside uncertainty concerns what processes may become important in future that have not been evident in the past and therefore not been studied and perhaps not even identified. The latter kind of uncertainty is what makes the future so opaque and is what Keynes referred to as the dark forces of time and ignorance. We can all agree that we can’t do much about outside uncertainty. Our theorising is based on what has happened in the past and, if we wish to forecast or generalise, we make an assumption that the future will resemble that past in key respects. All theorising and forecasting is conditional in that sense. We may still be wrong; our conclusions are still uncertain even when so qualified. But about this sort of inside uncertainty it is possible to say something. And the real world confirms that. You can insure your house and your car, you can bet on a horse-race or an election. Insurance companies, bookmakers and careful econometricians all know how to identify and play the odds. To suggest that everything is so uncertain as to be irreducible to risk is a counsel of despair for which everyday life provides no support. Of course people know the future is uncertain and act accordingly but that does not necessarily make their behaviour unfathomable.

    I seem to disagree with Lars and I would like to narrow that disagreement down because there should be common ground. If we use statistics to test a proposition or theory, evidently we start with the theory. We bring that to the data; it does not emerge from the data. We are not assuming causality but we are positing it. If we define the domain of the proposition we are saying in what conditions we expect it to hold and, yes, we are making a ceteris paribus condition that the dark forces won’t deliver some wholly new factor to upset everything. We know that condition will not always be met but life (and social studies) are possible because a lot of the time it is met. None of the properties that Lars asserts are required for econometric analysis are in fact necessary. Modern econometrics can deal with non-linearity, different sorts of instability, time-varying elements etc. The only limitation – a serious one – is the more complex the relationship you posit the more data you need to identify and test it. If data is sparse you can test propositions only in a simple form. Moreover I think Lars believes the properties of models he wants to criticise are necessary properties of econometric models. “Atomistic” for example. Not so; don’t confuse the common with the inevitable. If you want to specify a model with interaction effects based on class warfare or on sociological impulses like keeping up with the Joneses – that’s fine. As long as it has testable implications for observable data you can test it econometrically. Econometrics is also used to dealing with non-observables. Virtual variables can be handled by techniques like Kalman filter so long as they have observable effects.
    I remain puzzled about one element in Lars’ position. He says: “it is obvious that behind observable data there are real structures and mechanisms operating, things that are, at least if we want to understand/explain things in the real world, more important to get hold of than to simply correlate and regress observable variables.”
    Lots of theories in different disciplines posit unobservable elements like quarks (or utility!). We can only “get hold of them” by a two-step process of imagination to begin with and then working out what their consequences are for observables. If they have no consequences for observables we can happily leave them to the metaphysician. To reassure ourselves we really have “got hold of them” we then have to test whether the observable consequences appear in reality. If you can do fully controlled experiments that’s what you do. If you can’t you accept the expected consequences could be obscured by the noise of other factors and you have to test statistically. Tests may have low power and may be inconclusive. If the consequences are not observed, however, you have not got hold of anything and you should try another tack. Economics’ reluctance to accept the results of this kind of testing is one of its abiding weaknesses. I am genuinely puzzled to know how Lars would subject theories to empirical test if he rejects statistical analysis.

    Dave, I don’t assume that the grand Einsteinian resolution in economics is possible at all and I certainly don’t make any assumption as to who will make it. That it might well require advances in new branches of mathematics seems very likely. People hoped chaos theory might help but so far no cigar. And let’s not beatify Keynes. He knew a lot of statistics for his time but the practical, if not the philosophical, end of the subject has moved on in eighty years – more than economics if truth be told. So have computational methods that now make things possible that weren’t back then. Don’t forget Keynes died before the computer became commonplace and he was unacquainted with contemporary numerical methods. I am confident he would be exacting but open-minded about their use.

  4. October 11, 2020 at 8:03 pm

    I am afraid that a tiny little bit of justification of New Keynesian or any other DSGE (equilibrium-utility maximisation) approach goes unnoticed at large by our community here on RWER: Lars quotes Simon Wren-Lewis as saying “It matters in a [New Keynesian] model only in so far as it adds to any… .” The key word that slips attention too often is the “model”. New Keynesians and friends have abandoned the investigation of real life matters. They only care about their models. To them reality is what the model describes and not the other way round. Therefore, they do not even notice criticism based on missing links between model and reality because they simply cannot understand it.

    • October 12, 2020 at 9:58 am

      Spot on, Christian!

    • October 12, 2020 at 10:56 pm

      Good to see Lars is still alive enough to read at least some of us! Christian, I presume what you objected to in Wren-Lewis came via Lars’ link. This is what I objected to, ironically from a self-proclaimed neo-Keynesian:

      “I think the labour market is not central, which was what I was trying to say in my post”.

      What the hell else was real Keynesianism about other than problems in the labour market?

      I am much more taken in by the quote from Paul Krugman which Lars so vehemently disagrees with.

      “In the real world, people are fairly rational and more or less self-interested; the qualifiers are complicated to model, so it makes sense to see what you can learn by dropping them. And dynamics are hard, whereas looking at the presumed end state of a dynamic process — an equilibrium — may tell you much of what you want to know”

      Says Lars: “Being myself sorta-kinda Keynesian I find this analysis utterly unconvincing”.

      Being a trained engineer like the one cited elsewhere Gerald couldn’t bring himself to name, I can read Krugman two ways: one at face value, where I agree with Lars, and the other where I see where he is wrong: assuming that “dynamics are hard” and “presuming” the aim of economics rather than looking at the reality. On the “hardness”, it is said “an engineer can do for a penny what any fool can do for a pound”, and here circuit diagrams and PID error control theory constitute the engineer’s “penny”. Heaviside and Shannon were a couple of odd-balls, so contrary to the facts, elitist Gerald still wants us to believe the new Einstein hasn’t (so can’t) come, and that there is any economically significant connection between chaos theory and PID logic: “People hoped chaos theory might help but so far no cigar”. He hasn’t learned that Shannon’s computation with error detection built in is not the same as calculation, and that calculation aims narrowly at precision in numerical answers whereas computers with their parallel processing can aim at satisfying many different types of aim concurrently.

      I can admire Gerald’s often instructive comments, but not his “educated” arrogance.

  5. ghholtham
    October 12, 2020 at 6:57 pm

    I am going to defend Wren Lewis in this instance. One can explain or exposit how a model behaves without reference to reality. That does not mean you don’t care about reality or that you are indifferent to failures of the model to match reality. It depends on context. Some Chicago ideologues may be indifferent to the real world but I don’t believe WL is guilty as charged.

    • October 12, 2020 at 10:17 pm

      Yes, of course, you CAN spend hours and hours to “explain or exposit how a model behaves without reference to reality.” But what’s the point? I CAN spend years of my life building models of just anything. But what’s the point? Why would I? If I really want to understand and explain what goes on in real economies, why WOULD I spend time on things “without reference to reality”? I know that’s what most mainstream modellers (and not only those in Chicago) do, but to me, that is little less than a monumental waste of time and effort. I think we as economists spend far too much time on models without first posing the fundamental optimization question: Is it really worth the time to invest hundreds of hours in constructing and/or studying models without any real significance or reference to reality? After forty years spent with models in economics, I have to admit I myself have wasted far too much time on often fancy-looking and pretentious garbage because I did not start by asking that first decisive question!

  6. ghholtham
    October 12, 2020 at 7:01 pm

    PS Lars, will you tell us how to test economic theories or the usefulness of models without statistics? How do we reconnect economics to reality? How do we save the sinners?

    • October 13, 2020 at 10:14 am

      I’m not Lars, but I have answers to these questions.

      To the first, the science of Bacon: look inside the black box; look for explanations of behaviour in the structures you find there. To the second, by understanding why Hume’s black box method is not science but scientism. (The proof of that opened the black box of cosmic history to reveal structure in evolution). To the third, we cannot save sinners, merely help them become worthy of forgiveness. The Catholic catechism offers this advice: “Three conditions for forgiveness are required on the part of the penitent – Contrition, Confession and Satisfaction”. [This last: insofar as possible, really righting wrongs in ourselves and what we have done; symbolically reminding ourselves, e.g. by “wearing sackcloth and ashes”].

      • Robert Locke
        October 13, 2020 at 4:13 pm

        “To the first, the science of Bacon: look inside the black box; look for explanations of behaviour in the structures you find there”

        This sounds like an economics that is not market focused, but production-firm empowerment structure focused

  7. ghholtham
    October 13, 2020 at 9:17 am

    You can have a correct argument but cite something in evidence that is not evidence at all. I am not disagreeing with your general point, Lars. We agree and have both said there is a massive misallocation of resources within economics. I am saying the Wren Lewis quotation was ripped from context and the inference made about him specifically is unfair.

  8. ghholtham
    October 13, 2020 at 9:28 am

    I don’t recall saying PID logic and chaos theory were related. And if someone has produced the general theory of economics, I must admit I missed it. PID logic was used by Hendry and Mizon in econometrics. They distinguish differential, proportional and integral relationships and devise tests for them. Some variables are related only in differences but their levels can drift apart (they are not cointegrated in the jargon). Levels can be related but cumulated levels can drift apart unless there is integral control. One of the ways to test a theory is to ask what level of relationship between variables it implies and then see if that holds. Because economic theory seldom strays into real dynamics it does not confront those issues. In practice adjustment is not instantaneous so when econometricians examine time series they have to take them into account. An open-minded approach to data is the opposite of arrogance.
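    For readers unfamiliar with the jargon, one hedged way to write down the kind of relationship being distinguished here – a generic single-equation error-correction form with difference, level and cumulated-level terms, not the exact specification used by Hendry and Mizon – is

    \[
    \Delta y_t = \alpha\,\Delta x_t + \beta\,(y_{t-1} - \gamma x_{t-1}) + \delta \sum_{s \le t-1} (y_s - \gamma x_s) + \varepsilon_t ,
    \]

    where the difference term plays the “differential” role, the lagged disequilibrium the “proportional” role, and the cumulated disequilibrium the “integral” role. Testing which of \(\alpha\), \(\beta\) and \(\delta\) are non-zero is one way of asking at what level – changes, levels or cumulated levels – the variables are actually tied together.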

    • Yoshinori Shiozawa
      October 13, 2020 at 4:52 pm

      If what Dave Taylor talks about is the PID controller (see “PID controller” in Wikipedia), the question is not difficult. What is PID logic? Please explain it as clearly as the PID controller is explained in the Wikipedia article.

    • October 14, 2020 at 10:24 pm

      “I don’t recall saying PID logic and chaos theory were related”.

      You didn’t, and I apologise for an editorial typo suggesting that, leaving “any” where adding “contrary to the facts” required “no”.

      “And if someone has produced the general theory of economics, I must admit I missed it”.

      What I have said is that there are several general theories, differing not in their structure but in their interpretation of money; and that a general theory points out where to look, it cannot predict the specifics you will find if you look. Let me say at this point I agree with what Lars has written about economists wasting time, here taking their specific theories as models of a general one and going round in circles by not admitting arguments to the contrary.

      Hendry and Mizon are new to me, but from what you say they are relating PID to parallel levels and not (like Newton’s equations of motion) to orthogonal directions of motion and hence distinct time scales (present, past and future). Bell’s “The Development of Mathematics” (1945, McGraw Hill) may understandably be new to you, but most of what I am claiming may be found in that, e.g. the definition of structure on p.216, indexed under Whitehead and Russell; the modularity of mathematics and its applications (structure as “algebraic forms and their covariants and invariants”) on p.227; hence Hamilton’s quaternions applying to space but Grassman’s hypercomplexity being merely mathematical (pp.201-4).

      Thanks for your attempt to be reasonable here, but I’m going to have to cut this short. Too much has been going on here for me to concentrate on the argument about Data (and the interpretation of what one observes) not always being true.

      Yoshinori, if you want to understand how PID logic works, imagine yourself navigating a ship, not trying to understand the words of a wiki writer who sees only three different feedbacks, not their operating on three different (mathematically orthogonal) timescales.

      • Yoshinori Shiozawa
        October 15, 2020 at 4:00 am

        Navigating a ship might be a question of controlling. I know the basics of cybernetics and can easily understand it. A PID controller can help us. It is a tool of cybernetics. But you always add some strange words like the three different time scales. Are they present, past and future? They are not different time scales. They may be a way of partitioning time. You say they are mathematically orthogonal. Then please explain how they are orthogonal. Normally, in mathematics, two vectors x and y are orthogonal when their inner product (x, y) is zero. This definition can be generalized in various ways, but I cannot imagine that past and present, past and future, or present and future cross diagonally. If you want to be understood, you must explain in terms that readers can understand.

  9. Yoshinori Shiozawa
    October 13, 2020 at 2:10 pm

    I have posted two long comments on this page since last Saturday. I am posting this very short one to see if it is accepted by the machine.

  10. Yoshinori Shiozawa
    October 13, 2020 at 2:26 pm

    Quite strange! A comment of less than 50 lines was not accepted. What is happening?

  11. Craig
    October 13, 2020 at 6:22 pm

    Virtually all of us here are heterodox and think the neo-classical synthesis and general equilibrium are bunk. Why continue to theoretically analyze macro 3000 ways from the middle???

    We need policies that will effect the changes we all agree are needed, and we need a mass movement to get those policies politically implemented.

    All the theory and palliative reforms in the cosmos can be flicked aside like one would flick away a fly with one’s finger by the power of vested wealth, present academic ideology and the unconscious inertia of the current paradigm. We all know this to be true because everyone bemoans it here continually.

    But well communicated policies that resolve the long lingering problems of that current paradigm paired with a mass movement not unlike MLK’s Civil Rights movement can slay the dragons of money and finance and effect a mega-paradigm change that might literally save the species.

    So again I ask: What are your policies? Are they mere reforms that can easily be turned on their head as Keynesianism was, or are they developed around the deep simplicities that have always characterized historical paradigm/pattern changes? And when are we going to unite in developing a mass movement to get them implemented?

    • Yoshinori Shiozawa
      October 15, 2020 at 3:32 am

      Craig, to whom is your question “What are your policies?” directed? To Lars Syll? Probably he will not respond to this kind of question, for reasons I do not know. If you are asking it of the people who read your “comments” (or expressions of your policy), please know that the absence of responses is their reply. Many readers do not believe that your simple solution can solve everything. It is not a logical response, but they know by experience that the real world is not so simple that its problems can be solved by any simple “policies”.

      • Craig
        October 15, 2020 at 6:37 pm

        Yoshi,

        “to whom is your question “What are your policies?” directed?”

        Anyone and everyone here.

        “If you are asking it of the people who read your “comments” (or expressions of your policy), please know that the absence of responses is their reply.”

        Yes, I’m aware of that. I’m also aware that new paradigms are never generally perceived let alone accepted by even the most intelligent and iconoclastic for many reasons the two most important probably being that 1) they are unconsciously inured to the present one and 2) paradigms being the penultimate integrative phenomenon where a SINGLE concept is so deep that applied it resolves the major problematic complexities of the current paradigm and so transforms the PLURALITY of an entire pattern….most are not willing to chance agreeing/affirming a new paradigm for a variety of reasons like ego, fear of loss of reputation and respect, scientism and not recognizing that science is a subset of Wisdom and on and on, and 3) a consequent atrophy of and unwillingness to practice the integrative process AKA Wisdom…which also describes the phenomenon of a paradigm change as in #2.

        “Many readers do not believe that your simple solution can solve everything. It is not a logical response, but they know by experience that the real world is not so simple that its problems can be solved by any simple “policies”.”

        1) I’ve never said the new monetary and financial paradigm will resolve every economic problem….only its deepest and most urgently important to resolve ones. That’s why I have recommended regulations to stabilize and protect it from the likely gamers of it

        2) I HAVE cited the psychological fact that when a neurotic resolves their major conflict numerous seemingly irrelevant behaviors tend to dissipate and resolve as well, and economics being a humanly devised system this same phenomenon will likely manifest

        3) Simple is not the same thing as simplistic. Simple can be deep…as in the phenomenon of genuine paradigm changes and

        4) one of the signatures of new paradigms is the discovery of a new tool and/or insight like the telescope and discovery of the ellipse…and the integratively problem resolving power (for instance of the problems of individual monetary scarcity, systemic austerity and the inflationary tendencies of advanced economic systems) of a price and monetary policy specifically at the point of retail sale

        5) As I have said many times here the leading reform movements all circle about money, debt and banks which is a giant hint that this is where the problem actually lies. All that is needed is an honest and courageous recognition of the insight in #4

        So let’s have some (other and equally resolving) policy recommendations from … someone.

      • Craig
        October 15, 2020 at 9:02 pm

        4) above should read: one of the signatures of new paradigms is the discovery of a new tool and/or insight like the telescope and discovery of the ellipse…and the integratively problem resolving power (for instance of the problems of individual monetary scarcity, systemic austerity and the inflationary tendencies of advanced economic systems) of a price and monetary policy specifically at the point of retail sale…is an example of such signature.

  12. Yoshinori Shiozawa
    October 14, 2020 at 3:00 pm

    Eight years ago, there was a debate between Lars Syll and Simon Wren-Lewis on their respective blog sites. Following the links back, I came to know of a post by the latter on 8 July 2012 with the title Heterodox and mainstream macroeconomics: a great divide.

    In this post, Wren-Lewis called Lars Syll’s attitude a “rejectionist strategy”. Certainly, Syll’s way of arguing has no effect in attracting mainstream economists to reflect on the troubles of their theories. If we compare Syll and Wren-Lewis, the latter seems the more open-minded person. However, the question does not lie there. The questions we should ask are (1) whether this “rejectionist strategy” has a good effect in attracting young students and researchers into the camp of heterodox economics, and (2) whether it helps us to construct a new economics. I am doubtful on the first point, but the second must be the crucial strategy for all heterodox economists to pursue. Let me illustrate with an example.

    It seems Wren-Lewis was once (a bit) fascinated by the call for a paradigm change:

    What interests me is why the need for such wholesale rejection of the mainstream? I learnt one possible answer when young, which is the appeal of revolution rather than evolution. In Cambridge (UK) in the early 1970s, a significant group of the faculty called themselves Neo-Ricardians, and they too rejected neo-classical theory. Joan Robinson was an inspirational figure for this group, although the key influence was Piero Sraffa. They were strongly attracted to the ideas of the philosopher Thomas Kuhn, who talked about paradigm shifts in science. The mainstream was not going to evolve into something better: it was fundamentally flawed, and therefore had to be overthrown. Attractive stuff for undergraduates – too attractive in my case – but that particular paradigm shift never came.

    (To be continued)

    • Yoshinori Shiozawa
      October 14, 2020 at 3:04 pm

      Sorry! Now it appeared.

  13. Yoshinori Shiozawa
    October 14, 2020 at 3:03 pm

    I have posted a comment concerning Wren-Lewis three times. But they were rejected? Why?

  14. Yoshinori Shiozawa
    October 14, 2020 at 3:09 pm

    #2 (Continued from my post on October 14, 2020 at 3:00 pm)

    As a Sraffian, or a neo-Ricardian in Wren-Lewis’s naming, I feel this criticism is convincing and needs to be refuted by facts. We have failed to attract able economists like Wren-Lewis (and probably many others) into the camp of heterodox economics. But I still believe in the Sraffian strategy and am still trying to produce a new paradigm. Our book with Morioka and Taniguchi, Microfoundations of Evolutionary Economics (2019), is, I believe, a core of such a new paradigm – the paradigm shift that, in the above citation, “never came.” We should reflect on why we have failed in our efforts.

    Although I was a Sraffian, I adopted a somewhat different strategy from the others. Most Sraffians did not recognize that the equilibrium framework was the very obstacle to a paradigm shift (in Louis Althusser’s words, an epistemological obstacle). When I started to study economics in Paris in the early 1970s, I was attracted by two people: Louis Althusser and Piero Sraffa. Althusser pointed out that one of the greatest innovations in the social sciences was Marx’s introduction of the category of process, or processus. He also proposed the notion of the epistemological obstacle. For me, equilibrium was the very framework that we should reject as an epistemological obstacle.

    For about half a century, I searched and tried almost everything. But I forbade myself any easy compromises. Rejecting any equilibrium framework was as important as adopting process as a leading idea. In the world of economics, equilibrium was and still is the ubiquitous framework. Without a clear policy of rejecting all equilibrium frameworks, it would have been difficult to arrive at our book, because even if we know that process analysis (the sequence or step-by-step method) must be the right way, its difficulties are much greater than those of equilibrium analysis.

    In the case of our book, the major difficulty was to explain why myopic people’s reactions to the immediate changes in their environment can generate a rather stable economic process as a whole (a network as big as a national or world economy connected by input-output relations). No Sraffians, Kaleckians, or fundamentalist Post Keynesians asked such a question. They all assumed that there must be a mechanism that assures that almost everything in the short run goes well. They were satisfied with assuming that demand and supply of products are nearly equal, and did not inquire how this happens. With Taniguchi and Morioka’s results, we came to explain how the modern industrial economy works without relying on the price mechanism. Post Keynesians (including Sraffians and Kaleckians) recognized that prices are not the only mechanism of the market economy, but, neglecting the details of quantity adjustment, they failed to grasp the real function that the price system plays.
    (To be continued)

  15. Yoshinori Shiozawa
    October 14, 2020 at 3:12 pm

    #3 (Continued from my post #2 on October 14, 2020 at 3:09 pm)

    The recognition of the importance of process analysis, and of the difficulty it implies, is not confined to Marxists. As Meir Kohn argued in his profound 1986 paper “Monetary Analysis, the Equilibrium Method, and Keynes’s ‘General Theory’” (Journal of Political Economy 94(6): 1191-1224), process analysis was the common mode of analysis in the 1920s, even if in a very crude and primitive form. In Kohn’s understanding, the “revolution lay in Keynes’s abandonment of sequence analysis in favor of the method of equilibrium.”

    To help readers’ understanding, I cite the whole starting paragraph of Kohn’s article, which contains the above citation.

    To understand the nature of the Keynesian revolution (and the significance of the new classical counterrevolution) one must realize that the General Theory (1936) was more a revolution of method than one of substance. Although Keynes represented his book as a new and dramatic break with what he called “classical economics” — Say’s law, the separation of real and monetary phenomena, the stability of general economic equilibrium — that break was far from new. It went back at least to Marshall, Fisher, and Wicksell and had been widening steadily for over half a century in the work of Ohlin, Myrdal, Hayek, Hawtrey, Robertson, and Keynes himself, among others. While the General Theory did make significant new contributions of substance to this neoclassical tradition, it represented a truly startling revolution in method. This revolution lay in Keynes’s abandonment of sequence analysis in favor of the method of equilibrium. Indeed, contrary to what one might have expected from reading the first few chapters of the General Theory, no “classical” economist stepped forward to debate with Keynes the validity of Say’s law. Instead, the debate raged over the instantaneous multiplier, the role of expectations, liquidity preference versus loanable funds, and, in general, over the nature and the validity of Keynes’s concept of equilibrium. (pp.1191-1192)

    As Kohn points out, “The adoption of the equilibrium method was both the strength of the General Theory and its weakness.” The Keynesian revolution in method was a counter-revolution against the process analyses of the 1920s. It paved the way for the anti-Keynesian counter-revolution called the rational expectations revolution. Rational expectations and DSGE modelling were almost inevitable consequences of the Keynesian revolution of the General Theory.

    Lars Syll still treats Keynes’s General Theory as a kind of holy book and cites each phrase of Keynes without any re-examination. We should admit that Keynes committed many mistakes and that simply following him does not lead to a real revolution in economics.

    To return to the rejectionist strategy: it must not be directed at a school of economics. In particular, when the target is a broad church such as mainstream or neoclassical economics, the rejection should instead be aimed at epistemological obstacles (i.e. the concepts and categories we use). There is no other way to achieve a paradigm change. A paradigm shift can occur only by using old theoretical materials, as Otto Neurath said with his boat parable (see “Neurath’s boat” in Wikipedia). It is not wise to take this strategy against mainstream economics, because we can learn and adopt many ideas that can contribute to enriching a renewed economics. A rejection strategy is necessary for a paradigm change. But the rejection must be directed at what we do, not at what others are doing.

  16. Craig
    October 14, 2020 at 9:09 pm

    Money as Debt is most fundamentally accounting. A virtual monopoly on its creation is bad enough, but a monopoly on its paradigm (Debt Only) as the sole form and vehicle for its distribution is one that is I’m sure “making the Martians laugh”. Try exteriorizing yourselves from that (generally unperceived) fact. It’s enlightening.

    Monetary and financial policies in the economy from any of you? Or are you all so stuck in your heads and afraid to step out of line and be critiqued that you simply won’t offer any up?

  17. Yoshinori Shiozawa
    October 16, 2020 at 6:59 am

    I agree with you, Craig, that the monetary system is one of the most important institutions of our economy. It is so important that many people are conservative about it and reluctant to reform it.

    Many economists talk about the necessity of making economics a truly monetary economics. But this necessity is seldom satisfied by a concrete theory. This situation has continued for more than a century. Around 1933, Keynes named his lecture course “the monetary theory of production,” but no such theory appeared after all. It is hard to say that The General Theory of Employment, Interest and Money is a monetary theory of production. It produced a new way of thinking about employment. It was a great innovation in political thought. After that, all governments began to consider (near) full employment one of the most important conditions for keeping their authority. However, The General Theory was a failed theory of interest and contains little real theory of money. Lars Syll is not aware of this fact.

    After Keynes, only a few theoretical developments were achieved. The theory of endogenous money (or endogenous money supply), inaugurated by Nicholas Kaldor, was one of the few true theoretical breakthroughs. Variants of the “money as debt” theory existed even in the golden age of the gold standard (e.g. Knapp’s The State Theory of Money, which is a kind of debt theory of money). Debt was a serious problem even before money started to circulate as a medium of exchange. David Graeber wrote a history of five thousand years of money and debt. But he explained little about how money works in the modern economy. The cover of the Japanese edition of Debt: The First 5,000 Years carried the catch-copy “From Capital to Debt”, but as a monetary theory of modern capitalism Graeber elucidated far less than Marx.

    In the 20th century, many economists and politicians believed, after Marx, that once capitalism was overthrown, a good society would come. They thought there was no need for economics, because everything would be planned by a governmental organization. Marx never talked about the planned economy, but many people believed, without good reason, that a planned economy would work better than a market economy. As we now know very well, it was one of the greatest errors in human history. We had no sufficient preparation. Economics was extremely defective in designing a new economy. When we think of the economy and economics, we should not forget this. Simple romanticism can often lead to a disaster much greater than world wars.

    If you want to spread your ideas, you must prove that your plan works. But, I believe, that is extremely difficult work. Even if actual mainstream economics is rough, poor economics that is not realistic at all, it causes less disaster than a utopian prescription, because it only approves of what is going on now. If you propose a totally new system, you are in a situation similar to that before the Russian socialist revolution. You have to prove your plan works, not only in the respects you have planned, but also in all the aspects that may arise as side effects. Even very advanced medical science cannot produce a vaccine without clinical trials. The same is true for drugs that can be designed by computers. Economics, including heterodox economics, is not as advanced as today’s medical sciences.
