Is economics — really — predictable?

from Lars Syll

As Oskar Morgenstern noted back in his 1928 classic Wirtschaftsprognose: Eine Untersuchung ihrer Voraussetzungen und Möglichkeiten, economic predictions and forecasts amount to little more than intelligent guessing.

Making forecasts and predictions obviously isn’t a trivial or costless activity, so why then go on with it?

The problems that economists encounter when trying to predict the future really underline how important it is for the social sciences to incorporate Keynes’s far-reaching and incisive analysis of induction and evidential weight in his seminal A Treatise on Probability (1921).

According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but ‘rational expectations.’ Keynes rather thought that we base our expectations on the confidence or ‘weight’ we put on different events and alternatives. To Keynes, expectations are a question of weighing probabilities by ‘degrees of belief,’ beliefs that often have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents modelled by ‘modern’ social sciences. And often we “simply do not know.”

How strange that social scientists and mainstream economists, as a rule, do not even touch upon these aspects of scientific methodology, which seem so fundamental and important for anyone trying to understand how we learn and orient ourselves in an uncertain world. An educated guess as to why this is so would be that Keynes’s concepts cannot be squeezed into a single calculable numerical ‘probability.’ In the quest for measurable quantities, one turns a blind eye to qualities and looks the other way.

So why do companies, governments, and central banks continue with this more or less expensive, but obviously worthless, activity?

A part of the answer concerns ideology and apologetics. Forecasting is a non-negligible part of the labour market for (mainstream) economists, and so, of course, those in the business do not want to admit that they are occupied with worthless things (not to mention how hard it would be to sell the product with that kind of frank truthfulness). Governments, the finance sector and (central) banks also want to give the impression to customers and voters that they, so to say, have the situation under control (telling people that next year’s x will be 3.048 % works wonders in that respect). Why else would anyone want to pay them or vote for them? These are surely not glamorous aspects of economics as a science, but as a scientist, it would be unforgivably dishonest to pretend that economics doesn’t also perform an ideological function in society.

  1. Garrett Connelly
    January 13, 2020 at 12:44 am

    Yes. 100%. Yet, within a given pool of intelligences, a properly constructed forecast includes input from every functional individual or constituency in the acting responsible population. Thus, through dynamic input of actors reacting to each other, a forecast becomes a living document and much more than prognostications of an isolated economic guru. The proof is in the pudding. What are the results?

  2. January 13, 2020 at 4:06 am

    Yes, how do we deal with uncertainties that are not probabilistic? The Algorithmic answer is: to induce, to imitate, to inherit, to learn, to imagine, to associate, to draw lots, to assume, to experiment, to force, to negotiate, to persuade, etc., which are collectively named “Heterodox Algorithms,” as against “deduce,” the “Mainstream Algorithm.” Mainstream economics can be regarded as a depiction of only the most primary, simple, or clear part of an economy; its critics can now remedy it with the other parts, and then go on to build a new unified economics that explains and predicts the world completely (omnilaterally). Most real ideas, means, actions, and phenomena can be deemed part of that remedy. The word “expectation” is too simple to capture all of the above Heterodox Algorithms.

  3. January 13, 2020 at 4:17 am

    In addition, “Heterodox Algorithm” means any method that is subjective, non-deductive, or “irrational” – as such methods have long been wrongly called.

  4. January 13, 2020 at 6:02 am

    In addition again, various agents prefer quantitative measures because, in Algorithmic terms, quantitative data are generally easier to handle – more noticeable, comparable, and computable – and hence more economical than other types of data (e.g. qualitative data). This is why the price system, as a quantitative analysis, has been prioritized over other economic analyses. To reach these views, the concept of datum type, the framework of computing economics, and hence Algorithmic economics are needed.

  5. Yoshinori Shiozawa
    January 13, 2020 at 6:52 am

    Have you read my comment on January 11, 2020 at 2:48 pm on your post on January 10, 2020 at 8:20 in the comment page below?

  6. Frank Salter
    January 13, 2020 at 10:30 am

    Predictability can only be of the form: if these actions are undertaken, then this will be the result. If different actions are undertaken, there will be different results. It would appear that some expect more than this, but that is not possible.

  7. Gerald Holtham
    January 14, 2020 at 3:17 pm

    The way unquantifiable risk is tackled in practice is by scenario analysis. Frank is right that all forecasts are conditional. When you have no idea which conditions will pertain, you forecast for more than one set. Neither politicians nor businessmen subscribe to charities for out-of-work economists. In the financial markets they will employ you even if they think you are a Trotskyist, as long as they also think you can help them make money.
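    The conditional, multi-scenario forecasting described above can be sketched in a few lines of code. Everything here is hypothetical – the toy model, the scenario names, and all the numbers are illustrative assumptions, not anyone’s actual forecasting model:

    ```python
    # Minimal sketch of scenario analysis: instead of a single point forecast,
    # compute the conditional outcome under each of several assumed scenarios.
    # The model and all parameter values below are hypothetical.

    def gdp_growth(productivity_growth, labor_force_growth, demand_shock):
        """Toy conditional model: growth follows from the assumed conditions."""
        return productivity_growth + labor_force_growth + demand_shock

    scenarios = {
        "baseline":  dict(productivity_growth=1.2, labor_force_growth=0.5, demand_shock=0.0),
        "recession": dict(productivity_growth=0.8, labor_force_growth=0.3, demand_shock=-1.5),
        "tech boom": dict(productivity_growth=2.0, labor_force_growth=0.5, demand_shock=0.5),
    }

    # No probabilities are attached: each result is an "if these conditions
    # pertain, then this follows" statement, not a claim about which scenario
    # will actually occur.
    forecasts = {name: gdp_growth(**cond) for name, cond in scenarios.items()}

    for name, growth in forecasts.items():
        print(f"{name:10s} -> {growth:+.1f}% growth")
    ```

    The point of the exercise is the spread across scenarios, not any single number: a planner prepares for the range of conditional outcomes rather than betting on one.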

    • Garrett Connelly
      January 14, 2020 at 4:55 pm

      Yes. And the economist can “help them make money.” Interestingly, establishing full dynamic input from a plan-responsible workforce is a brief description of what post-capitalist democracy might look like, if we survive.

  8. Ken Zimmerman
    January 15, 2020 at 12:56 pm

    If our intent is to address uncertainty (which is always with us), predicting is not the best approach. In recent years, organizations (corporations, governments, NGOs, etc.) have been caught off guard by economic volatility, unexpected political events, natural disasters, and disruptive innovations. In response, there is increased interest in scenario planning. Rather than tying their company’s, government’s, etc. future to a strategy geared to a single set of events, many have come to the view that smart organizations benefit from a richer and broader understanding of the present possibilities afforded from multiple views about possible futures. Scenario planning came to prominence following its use in WWII and gained recognition and credentials as an effective planning tool in the corporate world in the late 1960s and 1970s, around the time when Royal Dutch/Shell used it to help address the turbulence caused by the 1973 oil crisis. Governments began using scenario planning in the 1970s. Although sometimes hampered by ideological litmus tests, by the 1990s scenario planning was an established tool of governments around the world, including the US. At the national level in the US it is currently used by such agencies as DOE, EPA, the FED, etc.

    While several approaches to scenario planning have emerged since then, all of which I’ve used, I want to focus here for a moment on what we call the Oxford scenario planning approach (R. Ramírez and A. Wilkinson, “Strategic Reframing: The Oxford Scenario Planning Approach” (Oxford, U.K.: Oxford University Press, 2016)). This approach is intended to be collaborative in order to get individuals and groups at all levels and functional backgrounds within an organization to examine an array of factors that contribute to the future and, in the process, to reframe their collective understanding of the present. Unlike approaches to scenario planning that take a probabilistic stance (that is, making predictions in percentage terms or as best-case/worst-case scenarios) or a normative stance (that is, envisioning what a future should look like), the Oxford scenario planning approach is based on plausibility. By recognizing the part of uncertainty that is unpredictable and by actively exploring the sources of the turbulence and uncertainty, the goal is to iteratively and interactively generate new knowledge and insights to help organizations reperceive their circumstances.

    The Oxford approach is a bit different for governments, and requires somewhat more effort. Government application must include not only members of the government but also constituents, voters, interest groups, political parties, etc. This obviously makes the process longer and often more difficult.

