
Revisiting the foundations of randomness and probability

from Lars Syll

Regarding models as metaphors leads to a radically different view regarding the interpretation of probability. This view has substantial advantages over conventional interpretations …

Probability does not exist in the real world. We must search for her in the Platonic world of ideals. We have shown that the interpretation of probability as a metaphor leads to several substantial changes in interpretations and justifications for conventional frequentist procedures. These changes remove several standard objections which have been made to these procedures. Thus our model seems to offer a good foundation for re-building our understanding of how probability should be interpreted in real world applications. More generally, we have also shown that regarding scientific models as metaphors resolves several puzzles in the philosophy of science.

Asad Zaman

Although yours truly has to confess to not being totally convinced that redefining probability as a metaphor is the right way forward on these foundational issues, Zaman’s article certainly raises some very interesting questions about the way the concepts of randomness and probability are used in economics.

Modern mainstream economics relies to a large degree on the notion of probability. To be amenable at all to applied economic analysis, economic observations have to be conceived of as random events analyzable within a probabilistic framework. But is it really necessary to model the economic system as one in which randomness can only be analyzed and understood on the basis of an a priori notion of probability?

When attempting to convince us of the necessity of founding empirical economic analysis on probability models, mainstream economics actually forces us to (implicitly) interpret events as random variables generated by an underlying probability density function.

This is at odds with reality. Randomness obviously is a fact of the real world (although I’m not sure Zaman agrees, since he seems to place randomness, too, in ‘the Platonic world of ideals’). Probability, on the other hand, attaches (if at all) to the world via intellectually constructed models, and a fortiori is only a fact of a probability-generating (nomological) machine, a well-constructed experimental arrangement, or ‘chance set-up.’

Just as there is no such thing as a ‘free lunch,’ there is no such thing as a ‘free probability.’

To be able to talk about probabilities at all, you have to specify a model. In statistics, any process you observe or measure is referred to as an experiment (rolling a die), and the results obtained are the outcomes or events of the experiment (the number of points rolled with the die, e.g. 3 or 5). If there is no chance set-up or model that generates the probabilistic outcomes or events, then strictly speaking there is no event at all.
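
To make the point concrete, here is a minimal sketch (in Python, my own illustration rather than anything from Zaman’s article) of what an explicitly specified chance set-up looks like: the model (a fair six-sided die) is written down first, and only relative to that model do the probabilities of the outcomes mean anything.

    import random

    # The chance set-up made explicit: the model is a fair six-sided die,
    # so each outcome in {1, ..., 6} is assigned probability 1/6 by construction.
    OUTCOMES = list(range(1, 7))
    MODEL_PROB = {k: 1 / 6 for k in OUTCOMES}

    def run_experiment(n_rolls, seed=42):
        """Simulate n_rolls of the die and return the relative frequency of each outcome."""
        rng = random.Random(seed)
        counts = {k: 0 for k in OUTCOMES}
        for _ in range(n_rolls):
            counts[rng.randint(1, 6)] += 1
        return {k: counts[k] / n_rolls for k in OUTCOMES}

    freqs = run_experiment(100_000)
    for k in OUTCOMES:
        print(f"outcome {k}: model p = {MODEL_PROB[k]:.3f}, observed frequency = {freqs[k]:.3f}")

The point of the sketch is only that the probabilities live in the specified model; whether any real-world data-generating process behaves like this die is a separate empirical question.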

Probability is a relational element. It must always come with a specification of the model from which it is calculated. And to be of any empirical scientific value, it has to be shown to coincide with (or at least converge to) real data-generating processes or structures — something seldom or never done.

And this is the basic problem with economic data. If you have a fair roulette wheel, you can arguably specify probabilities and probability density distributions. But how do you conceive of the analogous nomological machines for prices, gross domestic product, income distribution, etc.? Only by a leap of faith. And that does not suffice. You have to come up with some really good arguments if you want to persuade people to believe in the existence of socio-economic structures that generate data with characteristics conceivable as stochastic events portrayed by probabilistic density distributions.

We simply have to admit that the socio-economic states of nature that we talk of in most social sciences — and certainly in economics — are not amenable to analysis in terms of probabilities, simply because in real-world open systems there are no probabilities to be had!

The processes that generate socio-economic data in the real world cannot just be assumed always to be adequately captured by a probability measure. And so it cannot be maintained that it should even be mandatory to treat observations and data — whether cross-section, time series or panel data — as events generated by some probability model. The important activities of most economic agents do not usually include throwing dice or spinning roulette wheels. Data-generating processes — at least outside of nomological machines like dice and roulette wheels — are not self-evidently best modelled with probability measures.

If we agree on this, we also have to admit that much of modern neoclassical economics lacks sound foundations.

When economists and econometricians — often uncritically and without argument — simply assume that probability distributions from statistical theory can be applied to their own area of research, they are really skating on thin ice.

This also means, importantly, that if you cannot show that the data satisfy all the conditions of the probabilistic nomological machine, then the statistical inferences made in mainstream economics lack sound foundations!

  1. Helge Nome
    May 7, 2019 at 12:35 am

    I have more faith in developing a pattern recognition theory and applying it to socioeconomic phenomena. By mapping past human behaviour and external factors of the day, it should be possible to develop some predictive models.

    The more socioeconomic data and metadata that can be fed into the algorithm, the better.
    Somewhat akin to a weather prediction model.

  2. May 7, 2019 at 1:41 am

    Helge,

    meteorology is a branch of physical science from which economics has much to learn. Its predictions are calculated by models based on (classical) physics. Meteorology had the good luck to have well-established physics to draw on before it started. In the case of economics, we are still searching for a basic theory that is not fundamentally flawed in the way neoclassical economics is.

    As I have repeated several times in my posts, neoclassical economics is just like the geocentric system before Copernicus. It was very refined and complicated but fundamentally flawed.

  3. May 7, 2019 at 3:35 am

    Indeed, I think science is a process of perceiving patterns in observations and dreaming up metaphors (models) that capture some or much of that behaviour. If probability requires models, then the same would be true of probability too.

  4. Ikonoclast
    May 7, 2019 at 4:25 am

    I offer this, though I am not sure if it is useful to the discussion. An unusual thinker, James J. Wayne, has offered “Five New Physics Laws of Social Science”. I think the proposed laws are speculatively interesting at least. Their practical application to social science might be difficult to derive (in my view).

    “These laws are applicable to any system that is made of elementary particles, including any physical and biological system, human being and human society.

    First Law – Law of Indeterminacy

    For a closed system, the outcome of any future event in the system is indeterministic. The quantum uncertainty of the future is the fundamental property of nature and cannot be overcome by any means.

    Second Law – Law of Predicting the Future

    For a closed system, any future event in the system can be and can only be predicted precisely to the extent of a joint probability distribution among all possible outcomes. The joint probability distribution function exists and is uniquely given by quantum mechanics.

    Third Law – Law of Choice

    Actions, which are constrained by fundamental laws of physics, can be taken between time 0 and time T to modify the joint probability distribution function of time T of a closed system.

    Fourth Law – Law of Information

    The complete historic information of any closed system cannot be recreated based on today’s complete information. At any time-step, new information is created and some historic information is lost permanently.

    Fifth Law – Law of Equilibrium

    For a system under certain constraints, quantum uncertainties in the system will eventually push the system toward equilibrium states.” – James J. Wayne.

    Wayne expands on his Law of Indeterminacy as follows;

    “The Law of Indeterminacy rejects the (more classical) mainstream idea in the scientific community that indeterministic behavior is limited to the microscopic world of atoms and elementary particles, and the macroscopic world can be completely described by deterministic Newtonian physics. Common sense tells us that indeterministic radioactive decay could cause indeterministic events such as cancers due to radiation damage. Radiation from a single atom is sufficient to break DNA molecules to cause cancers later in people. Indeed, a report from the National Research Council says that even low doses of radiation like X-rays are likely to pose some risk of adverse health effects. There is no threshold of exposure below which low levels of ionizing radiation can be demonstrated to be harmless or beneficial. Putting it simply, the true safety threshold is zero.” – James J. Wayne

    All of this raises the issue of whether we should model the macroscopic world as entirely deterministic, perhaps via chaos theory as the most complex derivation of that view, or whether we should model the broad macroscopic world, especially the socioeconomic world with its myriad, non-identical human agents as indeterministic (nondeterministic?).

    With respect to human “will” or agency as the variable impacting on the Third Law above, should we regard this will as “free choice” (at the moment of choice) or simply as a “determination”, meaning that something which was indeterminate before the “choice” becomes determined in that instant (i.e. already in the past, even if just by an instant)? Thus a human making a “choice” to do something is essentially like an atom making a “choice” to decay. It’s not a choice at all. It’s simply an inexplicable movement from an indeterminate future to a determined present (and then a determined past, losing the event information like a fading comet tail).

    My bias is to hold that determinism holds proximally and approximately for simple macro events like billiard-ball collisions. However, once we reach the level of large and complexly interacting macro events that also involve complex, diverse human agents, like socioeconomic events and processes, we reach a level where micro indeterminacy can continually reset the initial conditions of the macro events, which might then be modeled by chaos theory. As we know, a small change in initial conditions can make a huge difference in outcomes in chaotic (so-called) systems.
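
    A standard illustration of that sensitivity (my own example, not something from Wayne) is the logistic map: two trajectories whose starting points differ by one part in a million stay close for a while and then diverge completely.

        def logistic_trajectory(x0, r=4.0, steps=30):
            """Iterate the chaotic logistic map x -> r * x * (1 - x) starting from x0."""
            xs = [x0]
            for _ in range(steps):
                xs.append(r * xs[-1] * (1 - xs[-1]))
            return xs

        a = logistic_trajectory(0.200000)
        b = logistic_trajectory(0.200001)  # initial condition shifted by one part in a million
        for t in (0, 10, 20, 30):
            print(t, round(a[t], 6), round(b[t], 6), round(abs(a[t] - b[t]), 6))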

    None of this is to say that we cannot make macro predictions. If we drive a car at high speed into a brick wall, we can predict a horrible smash, but we cannot predict the size, shape and eventual rest points of all the pieces. Likewise, if we drive a globalized socioeconomic system of huge size and momentum rapidly into the brick wall of biospheric limits, we can predict a civilizational wreck without being able to predict where all the pieces will end up.

    • Rob
      May 9, 2019 at 12:12 pm

      Thus a human making a “choice” to do something is essentially like an atom making a “choice” to decay. It’s not a choice at all. It’s simply an inexplicable movement from an indeterminate future to a determined present (and then a determined past, losing the event information like a fading comet tail). ~ Ikonoclast

      In my view you are confusing quantum-level behavior and the role of the observer because of a lack of appreciation for the role of mind in human behavior. You seem (and I could be wrong, or simply misunderstanding) to be equating the two in some way. A human being making a choice and the random decay of, say, a radioactive element are not the same thing, but it seems to me you are treating the two as essentially the same. I think there are plenty of cogent arguments that would call such an assumption into question, if that is the assumption you are making (and I could be wrong). But I won’t go into posting the relevant quotes if that line of argument doesn’t interest you.

  5. Ikonoclast
    May 7, 2019 at 5:21 am

    Oops, sorry. I should have proofread my above post better.

  6. Rob
    May 9, 2019 at 3:53 am

    Small wonder that students have trouble [learning significance testing]. They may be trying to think. (W. Edward Deming 1975, 152.)

  7. Rob
    May 9, 2019 at 10:43 am

    Because the great controversies of the past often reach into modern science, many current arguments cannot be fully understood unless one understands their history. (Ernst Mayr 1982, 1)

  8. Ken Zimmerman
    May 12, 2019 at 11:57 am

    After the cognitive revolution, Sapiens invented questioning and then questions. And Sapiens questioned everything. What happens next? Why did this happen rather than something else? Will each human die, when, and why? Who invented humans? Why were humans invented? What are the answers? One of the answers humans invented to deal with this and similar questions is randomness. Other inventions to deal with Sapiens’ questioning are chance, mathematics, and probability. Let’s consider the history of randomness.

    In the ancient world randomness was intertwined with fate, as it frequently still is in Asia and Africa today. Often in these cultures, devices such as dice, animal entrails, smoke, etc. were used to determine what fate would bring, a practice summed up in the term divination. The Chinese were perhaps the earliest people to formalize odds and chance, 3,000 years ago. Greek philosophers discussed randomness at length, but only in non-quantitative forms. It was only in the 16th century that Italian mathematicians began to formalize the odds associated with various games of chance, and later the odds associated with human life generally. The invention of calculus changed the formal study of randomness, allowing for consideration of the complexity of the random. In the 19th century the concept of entropy was introduced in physics. In the 20th century Sapiens invented the mathematics of probability, axiomatizing it in the 1930s. In the mid-20th century Sapiens invented quantum mechanics, changing the notion of randomness in many ways. A little later Sapiens added even more dimensions to randomness with algorithmic randomness. In the last 50 years computer scientists began to incorporate randomness into the design of computer algorithms, believing this produced better-performing algorithms. In some cases, such randomized algorithms outperform the best deterministic methods.
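
    One concrete example of the kind of randomized algorithm meant here (Freivalds’ matrix-product check, chosen by me purely for illustration): a handful of random trials verifies a claimed matrix product far more cheaply than recomputing the product deterministically.

        import random

        def freivalds_check(A, B, C, trials=20):
            """Randomized check that A @ B == C for n x n matrices given as lists of lists.

            Each trial multiplies by a random 0/1 vector, which costs O(n^2), whereas
            recomputing A @ B deterministically costs O(n^3). A wrong C survives one
            trial with probability at most 1/2, so 20 trials make errors vanishingly rare.
            """
            n = len(A)

            def matvec(M, v):
                return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

            for _ in range(trials):
                r = [random.randint(0, 1) for _ in range(n)]
                if matvec(A, matvec(B, r)) != matvec(C, r):
                    return False  # definitely not equal
            return True  # equal with high probability

        A = [[1, 2], [3, 4]]
        B = [[5, 6], [7, 8]]
        print(freivalds_check(A, B, [[19, 22], [43, 50]]))  # True: correct product
        print(freivalds_check(A, B, [[19, 22], [43, 51]]))  # False, with overwhelming probability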

    But before Sapiens invented randomness, it invented the divine. The gods, whoever and wherever they are, controlled humans like puppets, deciding humanity’s future based on no plan other than the whims of the gods. Ancient Greece is one example. The ancient Greeks also believed that events were part of an unbreakable sequence. In this setting randomness is simply impossible. Plus, as the Greeks saw it, randomness is unprovable and uninteresting.

    Dice are found throughout the history of humanity. The first use of dice was as a divination tool, used in religious ceremonies, even if the first dice were often bones (or ossicles). Their natural asymmetry poses problems of credibility: even the pious believer will wonder about the true will of the gods if the dice constantly fall on the same side. The neutrality of symmetrical dice quickly became apparent. This symmetry also allowed the development of games of chance, making the games fair for the different players. Generating random numbers has become a popular technique today for researchers dealing with a range of problems, from behavior at the molecular level to sampling a population or solving certain systems of equations. Actuaries use them daily to quantify uncertainty. Until a century ago, people who needed random numbers could throw coins, draw balls from shaken urns, or roll dice, as we have seen. But more recently other techniques have been invented, including using the final few digits of large government-produced numbers. One interesting alternative is British statistician Leonard Tippett’s use of the mid-points of parish areas in England.
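
    A toy sketch of the “final digits of large published figures” idea (the figures below are invented purely for illustration, not real data):

        # Hypothetical census-style counts; only their final digits are used,
        # in the spirit of the early hand-made random-number tables.
        figures = [1408233, 5062981, 972406, 8314529, 2790117]
        digits = [n % 10 for n in figures]  # keep the last digit of each figure
        print(digits)  # [3, 1, 6, 9, 7]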

    In 1927, the Soviet statistician Evgueni (Eugene) Slutsky showed that random series could be used to generate all kinds of economic series. At the beginning of the 20th century, many researchers believed that unpredictable events such as wars, crop failures or technological innovations should play a role in economic cycles. But no one really understood how crucial random processes (nowadays we call them “stochastic”) are to understanding how the economy works, until Slutsky published his work on “cyclical phenomena”, showing that very simple manipulations of random series (in this case numbers obtained from government lottery draws) could generate undulating patterns that could not be distinguished from economic cycles. Or, as Slutsky put it, any economic series could be seen as a stochastic process obtained as “the sum of random causes.”
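
    The Slutsky effect is easy to reproduce: take a purely random series and form moving sums, and smooth, cycle-like waves appear. A minimal sketch (my own, using uniform noise rather than lottery digits):

        import random

        def slutsky_series(n=300, window=10, seed=0):
            """Moving sums of an i.i.d. noise series: pure randomness yields wave-like 'cycles'."""
            rng = random.Random(seed)
            noise = [rng.uniform(-1, 1) for _ in range(n + window)]
            return [sum(noise[t:t + window]) for t in range(n)]

        series = slutsky_series()
        # Crude text plot: the asterisk drifts left and right in slow, cycle-like swings.
        for t in range(0, 80, 2):
            pos = max(0, int(20 + 4 * series[t]))
            print(" " * pos + "*")

    Adjacent points in the series share most of their underlying noise terms, which is all it takes for pure chance to look like a business cycle.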

    The main issue facing us today, “is indeterminacy a problem of probability?” Many projected scientific “advances” are based on the answer being yes.
