Weekend Read – Energizing exchange: Learning from econophysics’ mistakes

from Blair Fix

Let’s talk econophysics. If you’re not familiar, ‘econophysics’ is an attempt to understand economic phenomena (like the distribution of income) using the tools of statistical mechanics. The field has been around for a few decades, but has received little attention from mainstream economists. I think this neglect is a shame.

As someone trained in both the natural and social sciences, I welcome physicists’ foray into economics. That’s not because I think their approach is correct. In fact, I think it is fundamentally flawed. But it is only by engaging with flawed theories that we can learn to do better.

What is important about econophysics is that it demonstrates a flaw that runs throughout economics: the idea that we can explain macro-level phenomena from micro-level principles. The problem is that in all but the simplest cases, this approach does not work. Yes, complex systems can be reduced to simpler pieces. But rarely can we start with these simple pieces and rebuild the system. That’s because, as physicist Philip Anderson puts it, more is different.

What follows is a wide-ranging discussion of the triumphs and pitfalls of reduction and resynthesis. The topic is econophysics. But the lesson is far broader: breaking a system into atoms is far easier than taking atoms and rebuilding the system.

Let there be atoms!

To most people, the idea that ‘matter is made of atoms’ is rather banal. It ranks with statements like ‘the Earth is round’ in terms of near total acceptance.1 Still, we should remember that atomic theory is an astonishing piece of knowledge. Here is physicist Richard Feynman reflecting on this fact:

If, in some cataclysm, all of scientific knowledge were to be destroyed, and only one sentence passed on to the next generations of creatures, what statement would contain the most information in the fewest words? I believe it is the atomic hypothesis that … all things are made of atoms.

(Richard Feynman, Lectures on Physics)

Feynman spoke these words in the 1960s, confident that atoms existed. Yet if he’d said the same words a century earlier, they would have been mired in controversy. That’s because in the 19th century, many scientists remained skeptical of the atomic hypothesis. And for good reason. If atoms existed, critics observed, they were so small that they were unobservable (directly). But why should we believe in something we cannot see?

Today, of course, we can see atoms (using scanning transmission electron microscopes). But long before we achieved this feat, most scientists accepted the atomic hypothesis. Why? Because multiple lines of evidence pointed to atoms’ existence. Here’s how Richard Feynman put it:

How do we know that there are atoms? By one of the tricks mentioned earlier: we make the hypothesis that there are atoms, and one after the other results come out the way we predict, as they ought to if things are made of atoms.

(Richard Feynman, Lectures on Physics)

One of the first hints that atoms existed came from chemistry. When combining different elements, chemist John Dalton discovered that the masses of the reactants and product always came in whole-number ratios. This ‘law of multiple proportions’ suggested that matter came in discrete bits.

Another line of evidence for atomic theory came from what is today called ‘statistical mechanics’. This was the idea that macro properties of matter (such as temperature and pressure) arose from the interaction of many tiny particles. The key inductive leap was to assume that these particles obeyed Newton’s laws of motion. At the time, this assumption was a wild speculation. And yet it turned out to be wildly successful.

A gas of billiard balls

By the mid-19th century, the properties of gases had been formulated into the ‘ideal gas law’. This was an equation that described how properties like pressure, volume and temperature were related. But what explained this law?

Enter the kinetic theory of gases. Suppose that a gas is composed of tiny particles that collide like billiard balls. Next, assume that these collisions obey Newton’s laws. Last, assume that collisions are elastic (meaning kinetic energy is conserved). What does this imply?

It turns out that these assumptions imply the ideal gas law. So here we have the reasoning that Richard Feynman described: we hypothesize that there are atoms, and then the “results come out the way we predict, as they ought to if things are made of atoms.” What was exciting about this particle model was that it gave a micro-level definition of temperature and pressure. Temperature was the average kinetic energy of gas particles. And pressure was the force per area imparted by particles as they collided with the container.
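In outline, the derivation is short. Here is a standard textbook sketch (N particles of mass m in a container of volume V; angle brackets denote an average over particles):

```latex
% Pressure = momentum delivered to the container walls, per unit area and time:
P = \frac{N m \langle v^2 \rangle}{3V}
% Rearranged, this is the ideal gas law, once temperature is *defined*
% through the average kinetic energy per particle:
PV = \frac{2}{3}\, N \left\langle \tfrac{1}{2} m v^2 \right\rangle = N k_B T,
\qquad \left\langle \tfrac{1}{2} m v^2 \right\rangle \equiv \tfrac{3}{2} k_B T .
```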

As the mathematics of this kinetic model were developed further, there came another surprise. From equality came inequality. The ‘equality’ here refers to the underlying process. In the kinetic model, each gas particle has the same chance of gaining or losing energy during a collision. One might think, then, that the result should be uniformity. Each particle should have roughly the same kinetic energy. The mathematics, however, show that this is not what happens.

Instead, when James Clerk Maxwell and Ludwig Boltzmann fleshed out the math of the kinetic model, they found that from equality arose inequality. Due only to random collisions, gas particles should have a wide distribution of speed.

Figure 1 illustrates this counter-intuitive result. The top frame shows a simple model of a gas — an ensemble of billiard balls colliding in a two-dimensional container. The collisions are uniform, in the sense that each particle has the same chance of gaining/losing speed. And yet the result, shown in the bottom frame, is an unequal distribution of speed.

Figure 1: The random collision of gas particles gives rise to a wide distribution of speeds. [Source: Wikipedia]

The blue histogram (in Figure 1) shows the distribution of speeds found in the modeled gas particles. Because there are relatively few particles (a few hundred), their speed distribution jumps around with time. But if we were to add (many) more particles, we expect that the speed distribution would converge to the yellow curve — the Maxwell–Boltzmann distribution.

This particle model demonstrates how a seemingly equal process (the random exchange of energy) can give rise to wide inequalities. If econophysicists are correct, this model tells us why human societies are mired in inequality. It’s just basic thermodynamics.
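This result is easy to reproduce numerically. Below is a minimal toy simulation (my own sketch, not code from the econophysics literature): every particle starts with identical energy, and each ‘collision’ re-splits the pooled energy of a random pair at a uniformly random point. Even though the rule treats every particle identically, a right-skewed, Boltzmann-like distribution emerges.

```python
import random

def random_energy_exchange(n_particles=1000, n_collisions=200_000, seed=42):
    """Toy kinetic model: each collision picks two particles at random and
    re-divides their pooled energy at a uniformly random split point."""
    rng = random.Random(seed)
    energy = [1.0] * n_particles  # perfect equality at the start
    for _ in range(n_collisions):
        i, j = rng.randrange(n_particles), rng.randrange(n_particles)
        if i == j:
            continue
        pool = energy[i] + energy[j]
        cut = rng.random()
        energy[i], energy[j] = cut * pool, (1 - cut) * pool
    return energy

energies = sorted(random_energy_exchange())
mean = sum(energies) / len(energies)
median = energies[len(energies) // 2]
# Energy is conserved, so the mean stays at 1.0; the median falls well
# below it -- the signature of a right-skewed distribution.
print(f"mean = {mean:.3f}, median = {median:.3f}")
```

For this exchange rule, the stationary distribution is the exponential (Boltzmann–Gibbs) form, whose median sits at roughly 69% of the mean.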

Energized particles, monetized humans

After Maxwell and Boltzmann, statistical mechanics gave rise to a long string of successes that bolstered the atomic hypothesis. The icing on the cake came in 1905, when Albert Einstein used the kinetic model to explain Brownian motion (the random walk taken by small objects suspended in a liquid). It’s hard not to be impressed by this success story.

Now let’s fast forward to the late 20th century. By then, statistical mechanics was a mature field and the low-hanging fruit had long since been picked. And so physicists widened their attention, looking for other areas where their theories might be applied.2

Enter econophysics. In the 1990s, a small group of physicists set their sights on the human economy. Perhaps, they thought, the principles of statistical mechanics might explain how humans distribute money?3

Their idea required a leap of faith: treat humans like gas particles. Now, humans are obviously more complex than gas molecules. Still, econophysicists highlighted an interesting parallel. When humans exchange money, it is similar to when gas particles exchange energy. One party leaves with more money/energy, the other party leaves with less. Actually, the parallel between energy and money is so strong that physicist Eric Chaisson describes energy as “the most universal currency known in the natural sciences” (emphasis added).

Armed with this parallel between energy and money, econophysicists arrived at a startling conclusion. Their models showed that when humans exchange money, inequality is inevitable.


When econophysicists began publishing their ‘kinetic-exchange models’ of income and wealth, social scientists greeted them with skepticism. People, social scientists observed, are not particles. So it is inappropriate to use the physics of inanimate particles to describe the behavior of (animate) humans.

In the end, I think this criticism is warranted … but for reasons that are probably different than you might expect. Before I get to my own critique, let’s run through some common objections to econophysics models of income.


If I told you that your last monetary purchase was ‘random’, you would protest. That’s because you had a reason for purchasing what you did. When you gave money to Bob, it wasn’t because you ‘randomly’ bumped into him (as envisioned in econophysics world). It was because Bob was selling a car that you wanted to buy. So the effect (the exchange of money) had a definite cause (you wanted the car).

When econophysicists use ‘random exchange’ to explain income, many people are horrified by the lack of causality. It’s an understandable feeling — one that was actually shared (a century earlier) by Ludwig Boltzmann. A founder of statistical mechanics, Boltzmann was nonetheless tormented by his own theory. Like most physicists of the time, Boltzmann wanted his theory to be deterministic. (He thought effects should have definite causes.) Yet to understand the behavior of large groups of particles, Boltzmann was forced to use the mathematics of probability. The resulting uncertainty in cause and effect made him uneasy.4

Ultimately, Boltzmann’s unease was warranted. Quantum mechanics would later show that at the deepest level, nature is uncertain. But this quantum surprise does not mean that probability and determinism are always incompatible. In many cases, the use of probability is just a ‘hack’. It is a way to simplify a deterministic system that is otherwise too difficult to model.

A good example is a coin toss. When you toss a coin, most physicists believe it is a deterministic process. That’s because the equations that describe the toss (Newton’s laws) have strict cause and effect. Force causes acceleration. So if we had enough information about the coin and its environment, we could (in principle) predict the coin’s outcome.

The problem, though, is that in practice we don’t have enough information to make this prediction. So we resort to a ‘hack’. We realize that there are only two possible outcomes for the coin: heads or tails. If the coin is balanced, both are equally probable. And so we model the coin toss as a random (non-deterministic) process, even though we believe that the underlying physics are deterministic.
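To make the ‘hack’ concrete, here is a cartoon coin toss (my own illustrative model; the spin rates and airtimes are made-up numbers). The outcome is fully deterministic — it depends only on how many half-turns the coin completes. But tiny, humanly uncontrollable variations in the initial conditions flip the parity of that count, so the outcomes are statistically indistinguishable from a fair Bernoulli coin:

```python
import random

def toss_outcome(spin_hz, airtime_s):
    """Deterministic 'physics': the coin completes spin_hz * airtime_s
    half-turns; it lands heads if that count is even."""
    half_turns = int(spin_hz * airtime_s)
    return "heads" if half_turns % 2 == 0 else "tails"

# The same toss, repeated with tiny (humanly uncontrollable) variations
# in the initial spin rate and airtime:
rng = random.Random(0)
flips = [toss_outcome(rng.uniform(38.0, 42.0), rng.uniform(0.45, 0.55))
         for _ in range(10_000)]
frac_heads = flips.count("heads") / len(flips)
print(f"fraction heads: {frac_heads:.3f}")  # close to 0.5
```

Nothing in the model is random at the level of a single toss; the Bernoulli description is a statistical summary of deterministic physics we cannot track.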

Back to econophysics. When social scientists complain that econophysics models invoke ‘randomness’ to explain income and wealth, they are making a philosophical mistake. Econophysicists (correctly) respond that their models are consistent with determinism. When you spend money, you know exactly why you did it. But that doesn’t stop us from modeling your exchange with a statistical ‘hack’. Regardless of why you did it, the end result is that you spent money. So like a coin toss, econophysicists think we can treat monetary exchange in probabilistic terms. That’s a reasonable hypothesis — one that social scientists too readily dismiss.


In his article ‘Follow the money’, journalist Brian Hayes argues that econophysicists don’t model ‘exchange’ so much as they model theft. He has a point.

When you spend money, you usually get something back in return — namely property. That property could be a physical thing, as when you buy a car. Or it could be something less tangible, as when you buy shares in a corporation. But either way, the exchange is two sided. You lose money but gain property.5

In econophysics, however, there is no property. There is only money. So when you ‘bump’ into Bob (in econophysics world) and give him money, you get nothing in return. Of course, this type of one-sided transaction does happen in real life. But we don’t call it ‘exchange’. We use other words. If you gave the money to Bob, we call it a gift. Or if Bob just took the money, we call it theft. Either way, the gift/theft economy is a tiny part of all real-world monetary transactions. So it would seem that econophysics has a problem.

Or does it?

Econophysicists do not (to my knowledge) deny the importance of property. They just think we can model the exchange of money without understanding property transactions. Here’s why.

Regardless of what you bought from Bob, you leave the transaction with less money and he leaves with more. We can lump the corresponding property transaction into the ‘causes that are too complex to understand’ column. Despite this (unexplained) complexity, the end result is still simple: money changes hands. Why can’t we model that transfer of money alone? Again, this is a reasonable hypothesis.
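For concreteness, here is the kind of exchange rule econophysicists have in mind, in toy form (my own sketch, loosely in the spirit of kinetic-exchange models such as Drăgulescu and Yakovenko’s; all parameters are illustrative). Agents start with equal money, each trade re-divides a random pair’s pooled money, and the Gini coefficient of the result lands far from zero:

```python
import random

def kinetic_money_model(n_agents=1000, n_trades=300_000, start=100.0, seed=7):
    """Toy exchange economy: each trade picks two agents at random and
    re-divides their pooled money at a uniformly random split point."""
    rng = random.Random(seed)
    money = [start] * n_agents
    for _ in range(n_trades):
        a, b = rng.randrange(n_agents), rng.randrange(n_agents)
        if a == b:
            continue
        pool = money[a] + money[b]
        cut = rng.random()
        money[a], money[b] = cut * pool, (1 - cut) * pool
    return money

def gini(values):
    """Gini coefficient: 0 is perfect equality, 1 is maximal inequality."""
    xs = sorted(values)
    n = len(xs)
    return 2 * sum((i + 1) * x for i, x in enumerate(xs)) / (n * sum(xs)) - (n + 1) / n

balances = kinetic_money_model()
print(f"Gini after random exchange: {gini(balances):.2f}")
```

An exponential distribution has a Gini coefficient of 0.5, and the simulation settles near that value: substantial inequality generated by perfectly symmetric exchange.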

Bottoms up

Having defended econophysics models, it’s time for my critique. By appealing to statistical mechanics, econophysicists hypothesize that we can explain the workings of the economy from simple first principles. I think that is a mistake.

To see the mistake, I’ll return to Richard Feynman’s famous lecture on atomic theory. Towards the end of the talk, he observes that atomic theory is important because it is the basis for all other branches of science, including biology:

The most important hypothesis in all of biology, for example, is that everything that animals do, atoms do. In other words, there is nothing that living things do that cannot be understood from the point of view that they are made of atoms acting according to the laws of physics.

(Richard Feynman, Lectures on Physics)

I like this quote because it is profoundly correct. There is no fundamental difference (we believe) between animate and inanimate matter. It is all just atoms. That is an astonishing piece of knowledge.

It is also, in an important sense, astonishingly useless. Imagine that a behavioral biologist complains to you that baboon behavior is difficult to predict. You console her by saying, “Don’t worry, everything that animals do, atoms do.” You are perfectly correct … and completely unhelpful.

Your acerbic quip illustrates an important asymmetry in science. Reduction does not imply resynthesis. As a particle physicist, Richard Feynman was concerned with reduction — taking animals and reducing them to atoms. But to be useful to our behavioral biologist, this reduction must be reversed. We must take atoms and resynthesize animals.

The problem is that this resynthesis is over our heads … vastly so. We can take atoms and resynthesize large molecules. But the rest (DNA, cells, organs, animals) is out of reach. When large clumps of matter interact for billions of years, weird and unpredictable things happen. That is what physicist Philip Anderson meant when he said ‘more is different’.

Back to statistical mechanics. Here we have an example where resynthesis was possible. To develop statistical mechanics, physicists first reduced matter to particles. Then they used particle theory to resynthesize macro properties of matter, such as temperature and pressure. That this resynthesis worked is a triumph of science. But the success came with a cost.

In all fairness to Maxwell and Boltzmann, their equations describe systems that are utterly boring. The math applies to an ideal gas in thermal equilibrium. At the micro scale, there’s lots going on. But at the macro scale, literally nothing happens. The gas just sits there like a placid lake on a windless day. Boring.6

Water, reduced and resynthesized

When we move beyond ideal gases in thermal equilibrium, the mathematics of resynthesis quickly become intractable. Let’s use water as an example, and try to resynthesize it from the bottom up.

At the most fundamental level, water is described by quantum mechanics. According to this theory, chemistry reduces to the bonding of electrons between atoms. We can use this ‘quantum chemistry’ to predict properties of the water molecule (H₂O), such as the bond angle between the two hydrogen atoms. We can also predict that, because of this bond angle, the water molecule will have electric poles. These poles cause water molecules to attract one another, a fact that explains properties of water like its surface tension and viscosity.

When we move to larger-scale properties of water, however, we have to leave quantum mechanics behind. Suppose we want to explain a simple vortex, as shown in Figure 2. To create the vortex, all you need is a tube filled with water and a hole in the bottom. Pull the plug and a vortex will form. (We add the pump to sustain the system.) Now the question is, how do we explain the whirlpool?


Figure 2: A setup for making a vortex.

Quantum physics is no help. It’s a computational chore to simulate the properties of a single water molecule. But our vortex contains about 10²⁴ molecules. So quantum physics is out.

What about statistical mechanics? It’s not helpful either. Physicists like Maxwell and Boltzmann made headway explaining the properties of gases, because when matter is in this state it’s valid to treat molecules like billiard balls. But in a liquid, this assumption breaks down. Rather than a diffuse cloud of billiard balls, a liquid is analogous to a ball-pit filled with magnetized spheres. The molecules don’t zoom around freely. They roil around in a haze of mutual attraction. It’s a mess.

So to make headway modeling our vortex, we abandon particle theory and turn to a higher-level approach. We treat water as a continuous liquid, defined by macro-level properties such as density and viscosity. We determine these properties from experiment, and then plug them into the equations of fluid dynamics. If we have enough computational power, we can simulate our vortex.

To model larger systems, we need still more computational power. Climate models, for instance, are basically large fluid-dynamics simulations. Presently, such models cannot resolve weather below distance scales of 100 km. Doing so would take too long on even the fastest supercomputers.

Back to water. The point of this story is to show the difficulty of resynthesis. We cannot take water molecules and resynthesize the ocean. Doing so is computationally intractable. So we introduce hacks. We do a partial resynthesis — water molecules to properties of water. We then put these properties in simpler equations to simulate the macro-level behavior of water.

This partial resynthesis is an important achievement. But it pales beside what we ultimately wish to do. In a famous segment from Cosmos, Carl Sagan depicts the evolution of life from primordial ooze to humans. He concludes by noting: “those are some of the things that molecules do given four billion years of evolution.”

Sagan’s evolution sequence illustrates the scale of the resynthesis problem. We can easily reduce life to its constituents — primarily water and carbon. And yet we cannot take these components and resynthesize life. Unlike with water alone, with life there are no simplifying hacks that allow us to get from molecules to mammals. That’s because in mammals, the fine-scale structure defies simplification. Baboons are mostly sacks of water. But modeling them as such won’t explain their behavior. If it did, we would not have behavioral biology. We would have behavioral fluid dynamics.

A foolhardy resynthesis

Social scientists often rail against ‘reductionism’. I think this is a mistake. To understand a complex system, we have no choice but to reduce it to simpler components. But that’s just the first step. Next, we must understand the connections between components, and use these connections to resynthesize the system. It is this second step that is filled with pitfalls. So when social scientists criticize ‘reductionism’, I think they are really criticizing ‘foolhardy resynthesis’.

And that brings me back to econophysics. When social scientists hear that econophysicists reduce humans to ‘particles exchanging money’, they are horrified. But when you have a close look at this reduction (as I have above), it’s not so terrible. Certainly humans are more complicated than molecules. But we do exchange money much like molecules exchange energy. The physicists who developed statistical mechanics were able to take energy-exchanging particles and resynthesize the properties of ideal gases. Why, say econophysicists, can’t we do the same with humans? Why can’t we take simple principles of individual exchange and resynthesize the economy?

The problem is that by invoking the mathematics of ideal gases, econophysicists vastly underestimate the task at hand. Explaining the human economy from the bottom up is not like using particle theory to explain the temperature of a gas. It is like using particle theory to explain animal metabolism.

Let’s dig deeper into this metaphor. We can certainly reduce metabolism to energy transactions among molecules. But if you try using these transactions to deduce (from first principles) the detailed properties of metabolism, you won’t get far. The reason is that animal metabolism involves an ordered exchange of energy, defined by complex interconnections between molecules. Sure, the chemical formula for respiration is simple. (It’s just oxidizing sugar.) But when you look at the actual mechanisms used in cells, they’re marvellously complex. (Check out the electron transport chain, the machine that drives cellular respiration.)

What econophysicists are trying to do, in essence, is predict the properties of animal metabolism using the physics of ideal gases. The problem is not the reduction to energy exchange between molecules. The problem is a foolhardy resynthesis. In gases, the exchange of energy is unordered. So we can resynthesize properties of the gas using simple statistics. But we cannot apply this thinking to animal metabolism. When you ignore the ordered connections that define metabolism, you resynthesize not an animal, but an amorphous gas.

Money metabolism

The metaphor of metabolism is helpful for understanding where econophysicists go wrong in their model of the economy. Take, as an example, the act of eating.

Suppose a baboon puts a piece of fruit in its mouth. What happens next is a series of energy transactions. We might think, naively, that we can understand these transactions using the physics of diffusion. So when the fruit hits the baboon’s tongue, the energy in the fruit starts to ‘diffuse’ into the tongue. Unfortunately, that is not how metabolism works. The fruit touches the baboon’s tongue, yes. But the tongue cells don’t get any energy (yet). Instead, the tongue passes the fruit down the throat, where a long and complicated series of energy transactions ensue (digestion, circulation, respiration). Eventually, some of the fruit’s energy makes it back to the cells of the tongue. But the route is anything but simple.

Something similar holds in the human economy. Suppose you want to buy a banana. One option would be to purchase the fruit from your neighbor. If you do, things are much as they appear in econophysics models. You ‘bump’ into your neighbor Bob, and give him $1. He gives you a banana. It’s a particle-like transaction.

The problem is that (almost) nobody buys fruit from their neighbors. If you want a fruit, what you actually do is go to a grocery store … say Walmart. And there the transaction is different.

True, at Walmart things start out the same. You find some bananas and head to the checkout. There you meet Alice the cashier. You give her money, and she lets you leave with the bananas. It’s still a particle-like transaction, right?

Actually no. The problem is that when you hand your money to Alice the cashier, it’s not hers to keep. The hand-to-hand transfer of cash is purely symbolic. As soon as Alice receives the money, she puts it in the cash register.7 Later, a low-level manager collects the cash and deposits it in a Walmart bank account. From there, mid-level managers distribute the money, working on orders from upper management. Eventually, some of your money may end up back in Alice’s hands (as a pay check). But the route is anything but simple.

Think of this route as ‘money metabolism’. Alice ‘touches’ your money much like tongue cells ‘touch’ the energy in food. But like the tongue cells, which simply pass the food down the digestive tract, Alice passes your money into Walmart’s ‘accounting tract’. As with the food energy, the result is a complex web of transactions governed by many layers of organization.

Yes, you can cut out this organization and observe that the end result is that money changes hands. But that’s like taking animal metabolism and reducing it to diffusion. When you cut out the details, you effectively convert the organism to a placid pool of matter. Similarly, when you cut out the ordered web of transactions that occur within organizations, you convert the economy to an inert gas. It is a foolhardy resynthesis.

Resynthesizing mud

Many econophysicists will grant that their models oversimplify the economy. But they will defend this simplification on the grounds that it ‘works’. Their models reproduce, with reasonable accuracy, the observed distribution of income.

The appropriate response should be “Good, your model passes the first test. Now open the hood and keep testing.” The problem is that econophysicists often do not keep testing. They are content with their ‘good results’.8

To see why this is a bad idea, let’s use an absurd example. Suppose that a biology lab is trying to synthesize life. The scientists combine elements in a test tube, add some catalysts, and then look at the result. They first test if the atomic composition is correct. They look at the abundance of hydrogen, carbon and oxygen to see if it is consistent with life. Lo and behold, they find that the test tube has the same atomic composition as humans!

“We’ve synthesized a human!” a lab tech shouts euphorically. The other scientists are less optimistic. “We’d better do more tests,” they caution. Their skepticism is well founded. Looking further, they see that the test tube contains no tiny human. It contains no organs, no cells, no DNA. It’s the right composition of elements, yes. But the test tube contains nothing but mud.

Kinetic exchange models of income/wealth, it pains me to say, are the equivalent of mud. Yes, these models replicate the macro-level distribution of income and wealth. But they ignore all other social structure. That’s okay, so long as the model is just a stepping stone. But the tendency in econophysics is to say: “Look! My kinetic-exchange model reproduces the distribution of income. Therefore, inequality stems inevitably from the laws of thermodynamics.”

Sure … just like humans stem inevitably from mud.

How far down?

One of the central tenets of the scientific worldview is that the universe has no maker. It is a system that has self-assembled. The consequence of this worldview (if it is correct) is that complexity must have arisen from the bottom up. After the Big Bang, energy condensed to atoms, which formed molecules, which formed proteins, which formed cells, which formed humans, who formed global economies. (For an epic telling of this story, see Eric Chaisson’s book Cosmic Evolution.)

The ultimate goal of science is to understand all of this structure from the bottom up. It is a monumental task. The easy part (which is still difficult) is to reduce the complex to the simple. The harder part is to take the simple parts and resynthesize the system. Often when we resynthesize, we fail spectacularly.

Economics is a good example of this failure. To be sure, the human economy is a difficult thing to understand. So there is no shame when our models fail. Still, there is a philosophical problem that hampers economics. Economists want to reduce the economy to ‘micro-foundations’ — simple principles that describe how individuals behave. Then economists want to use these principles to resynthesize the economy. It is a fool’s errand. The system is far too complex, the interconnections too poorly understood.

I have picked on econophysics because its models have the advantage of being exceptionally clear. Whereas mainstream economists obscure their assumptions in obtuse language, econophysicists are admirably explicit: “we assume humans behave like gas particles”. I admire this boldness, because it makes the pitfalls easier to see.

By throwing away ordered connections between individuals, econophysicists make the mathematics tractable. The problem is that it is these ordered connections — the complex relations between people — that define the economy. Throw them away and what you gain in mathematical traction, you lose in relevance. That’s because you are no longer describing the economy. You are describing an inert gas.

So what should we do instead? The solution to the resynthesis problem, in my view, is to lower our expectations. It is impossible, at present, to resynthesize the economy from individuals up. So we should try something else.9

Here we can take a hint from biology. To my knowledge, no biologist has ever proposed that organisms be understood from atomic ‘first principles’. That’s because thinking this way gets you nowhere. Instead, biologists have engaged in a series of ‘hacks’ to try to understand life. Each hack partially reconstructs the whole.

Biologists started with physiology, trying to understand the function of the organs of the body. Then they discovered cells, and tried to reconstruct how they worked. That led to the study of organelles, and eventually the discovery of molecular biology. Along the way, biologists tried to resynthesize larger systems from parts — bodies from organs, organs from cells, cells from organelles, etc. They met with partial success.

When economists try to reconstruct the economy from ‘micro principles’, they are doing the biological equivalent of resynthesizing mammals from molecules. (Yes, the human economy is that complex.) The way to make progress is to do what biologists did: take baby steps. First look at firms and governments to see how they behave. Look at the connections between these institutions. Then look inside these institutions and observe the connections among people. Resynthesize in small steps.10

As we try to ‘hack’ our way to a resynthesis, we will probably fail. But we will fail less badly than if we tried to resynthesize society from individuals up. And hopefully, from our failure we will learn something. That’s science.

  1. March 13, 2021 at 3:54 am

    Thanks for a lucid explanation. I appreciate it.

  2. Marcelo B. Ribeiro
    March 13, 2021 at 7:28 am

    There are some useful criticisms here, but I would not agree that econophysicists are happy with their results and stopped there. And econophysics income inequality modelling is not only about kinetic exchange, as I discussed in my recent book entitled “Income Distribution Dynamics of Economic Systems: An Econophysical Approach”, Cambridge University Press (2020), ISBN 9781107092532.

  3. pfeffertag
    March 13, 2021 at 8:02 am

    Well expressed and interesting. I’d just comment on this:

    “To understand a complex system, we have no choice but to reduce it to simpler components. But that’s just the first step. Next, we must understand the connections between components…”

    You italicise the word “connections”, which is good, for the essence of science theory is relationships between components. But perhaps the order is back to front and the first step is the relationship. Science only introduces components (gravity, temperature, aether, phlogiston, dark matter…) in order to fulfil some relational requirement. That is the purpose of components. Science concepts only exist in relationships. Lone concepts do not exist.

    It is the relationship which is tested (hopefully). A test withstood confirms not just the connective relationship but the correctness, or usefulness, or existence, of the concepts.

    It is the usual practice of social science to nominate concepts and then go looking for relationships – and the exigencies of careers and publishing mean that something or other will be found. The result is millions of papers which disagree with each other, none of which are wrong.

  4. Ikonoclast
    March 13, 2021 at 10:21 am

    Very interesting article. We need, I believe, to prepare ourselves philosophically for the insuperable incompleteness of science. I will start with a fascinating quote.

    “Understanding emergence along the lines of self-organization has become so ubiquitous the two terms have just about become synonymous. However, the usual connotations of self-organization result in a misleading account of emergence by downplaying the radical novelty characterizing emergent phenomena. It is this radical novelty which generates the necessary explanatory gap between the antecedent, lower level properties of emergent substrates and the consequent, higher level properties of emergent phenomena. Without this explanatory gap, emergent phenomena are not unpredictable, are not non-deducible, are not irreducible, and thus are not truly emergent. For emergent phenomena to be genuinely emergent, processes of emergence must accomplish the seemingly paradoxical feat of producing an explanatory gap while simultaneously maintaining some degree of continuity with the substrate level.” – Professor Jeffrey A. Goldstein.

    We have to concede that, despite great progress in physics, a mathematical unification of general relativity (GR) and quantum field theory (QFT) has not been achieved to date. Such a unification might yet be developed or it might be precluded due to some ultimate limitation of human intelligence, or of mathematics or of computational power or of technically feasible instruments.

    All modelling, including mathematical modelling, implies simplification. It involves the process of abstracting essential elements from reality and making simplifying assumptions. A working model, including a mathematical system as a working model (which links the quantities involved in static or dynamic events into abstractly modeled relationships via equations), can function as a set of general and linking explanations or as a set of specific pathways and algorithms for the calculation or prediction of real processes occurred, occurring or yet to occur.

    The incompleteness of science is very likely to prove intractable. Indeed, a proper consideration of relational system priority monism demonstrates that mathematical and scientific incompleteness will be intrinsic and unavoidable, if the initial thesis of priority monism is correct. Under an assumption of thorough-going priority system monism, an explanation or model of reality, as a set of language statements or mathematical equations, can never be complete. In addition to being abstracted and simplified, an explanation or model of reality (of “all existence”) or of a sub-system thereof, must itself (the model that is) perforce be a smaller sub-set system (an emergent subset generated and mediated by a conscious agent) of monistic reality itself (of “all-existence”). In emergence, the new formal statement adds a “radical novelty” to all-existence (the cosmos). All-existence now has a new theory about itself which theory is a new, emerged part of all-existence. The explanation or model cannot have any existence separate and unconnected from the posited monistic system otherwise the claimed monistic system would not be monistic.

    The above must be true under the a priori assumption of priority monism IFF (if and only if) the a priori assumption itself is true. A theory model is (or becomes in the emergent sense) a novel subset of the monistic system. A sub-system model of a system never fully replicates or models the entire system. The model must always be incomplete.

    To sum up, a clear characteristic of a monistic cosmos system, which manifests characteristics of emergence and evolution, including the emergence of minds and then mind-made models as subsets, will be that it (the cosmos) will demonstrate only partial reducibility to modelling and/or explanation. There will be an unavoidable incompleteness in all mathematical, scientific and philosophical theories. This consideration appears as an extension, in the reverse direction, of Hume’s observation of the infinite regress problem for the explanation of causes. As well as the infinite regress problem we may also detect, in the other direction, an “infinite emergence” problem. There would appear to be no theoretical upper or final limit to emergent novelties in the cosmos except possibly upon the heat death or other ending of the cosmos itself.

    All of the above does not mean that we should stop trying to make better models. Just that we must stop expecting to solve everything, tomorrow… or ever, as humans. The next small step of understanding ought to be our goal. Blair Fix’s thoughts should certainly prove useful in developing a “where to next?” guide for making new testable models of the economy.

    It is important however to remember that money and property are social fictive constructions, albeit the human work of a few millennia to get them to their current expression. This social fictive nature means that we could, in theory, supersede money and property in future human socioeconomic development. Something new could emerge or develop from further social development and even from human evolution itself. This is to say that money and property (in their current forms) are not immutable verities of this time and place of the cosmos as are hydrogen and oxygen atoms (for example).

    We don’t expect the fundamental laws of physics and chemistry to change any time soon in our corner of the cosmos. However, we can realistically expect that money and property will not continue immutable in their current forms. They are social fictive constructions changeable by new theories of economics and new ideological postulates. Blair Fix has referred to the problem of the “Changing Meter Stick” in his paper “The Aggregation Problem: Implications for Ecological and Biophysical Economics.” Equally, we need to be concerned about the “Changing Ontological Basics”. We cannot validly posit money and property, in current forms, or ultimately in any forms, as unchanging ontological fundamentals. They cannot and should not be taken in ontological terms as basic, consistent or eternal economic objects. Therefore, however one might attempt to make a science of economics, money and property cannot be taken as basic immutable objects of that science.

  5. Meta Capitalism
    March 13, 2021 at 2:39 pm

    One of the central tenets of the scientific worldview is that the universe has no maker. It is a system that has self assembled. The consequence of this worldview (if it is correct) is that complexity must have arisen from the bottom up. After the Big Bang, energy condensed to atoms, which formed molecules, which formed proteins, which formed cells, which formed humans, who formed global economies. (Blair Fix, RWER, 3/12/2021)

    Enjoyed the post, and overall agree with it, but find you assume too much when you confuse your personal philosophical views with science per se. You conflate your personal philosophy (philosophical materialism) with science (methodological materialism). Philosophy of science is important for scientists too, not just philosophers.
    “In May 1998 Dr. Eugenie C Scott, NCSE’S Executive Director, was awarded the American Humanist Association’s 1998 “Isaac Asimov Science Award”. What follows is excerpted from her acceptance speech. Ed.” Her comments follow:

    Properly understood, the principle of methodological materialism requires neutrality towards God; we cannot say, wearing our scientist hats, whether God does or does not act. I could say, speaking from the perspective of my personal philosophy, that matter and energy and their interactions (materialism) are not only sufficient to understand the natural world (methodological materialism) but in fact, I believe there is nothing beyond matter and energy. This is the philosophy of materialism, which I, and probably most humanists, hold to. I intentionally added “I believe” when I spoke of my personal philosophy, which is entirely proper. “I believe,” however, is not a phrase that belongs in science.
    We philosophical materialists may all be methodological materialists, but the converse isn’t true. Gregor Mendel was a methodological materialist who didn’t accept the philosophy of materialism. I think we make a grave error when we confuse philosophical views derived from science, even those we support, with science itself.
    Let me give you an example. There exists a group of critics of science about whom Barbara Ehrenreich has written eloquently; they call themselves deconstructionists, or postmodernists, and they can be found in unfortunately large numbers in the humanities and social sciences departments of most universities and colleges. They claim that science is largely responsible for the current destruction of the environment, for social policies based on racism and sexism, for genocide, the Holocaust, for iatrogenic illness. They argue that the very Enlightenment principles that Humanists embrace should be knocked off their pedestals and replaced with more subjective, personal, and allegedly “more humane” ways of making decisions.
    Most of us Humanists (being rational, Enlightenment types) would argue vigorously against this position. With Barbara Ehrenreich, we would point out that, yes, indeed, science has been used to promote ideas like genocide that we would consider evil, but that postmodernists are confusing ideologies and ideas drawn from science with science itself. Science has, for example, been used both to promote and to rebut sexism and racism, but the philosophical view one draws from science should not be used to raise up or cast down science itself.
    The same principle applies to philosophical materialism, the view at the foundation of our Humanism; we may derive this view from science, but an ideology drawn from science is not the same as science itself. Science is an equal opportunity methodology.
    Therefore, I agreed with the two theologians who asked NABT to take the words “impersonal” and “unsupervised” from its statement on evolution. NABT was making a philosophical statement outside of what science can tell us. Plantinga and Smith wrote:
    [I]t is extremely hard to see how an empirical science, such as biology, could address such a theological question as whether a process like evolution is or isn’t directed by God…. How could an empirical inquiry possibly show that God was not guiding and directing evolution?
    And they were right. If we are to say to postmodernist attackers of science that they should not confuse science with positions or philosophies derived from science, then we must be consistent and not equate science with materialist philosophy.
    I argue for the separation of methodological from philosophical materialism for logical reasons, and for reasons based on the philosophy of science. It is also possible to argue from a strategic standpoint. Living as we do in a society in which only a small percentage of our fellow citizens are non-theists, we who support the teaching of evolution in the public schools should avoid the creationist’s position of forcing a choice between God and Darwin. Creationists are perfectly happy if only 10% of the population (the percentage of non-theists) accepts evolution. I am not. I want people to understand and accept the science of evolution; whether or not someone builds from this science a philosophical system that parallels mine is logically and strategically independent. An ideology drawn from science is not the same as science itself.
    Ironically, I find myself being praised and encouraged in my position by conservative Christians and taking flak from some fellow non-theists, including some scientists. I must say, though, that over the last several months I have presented lectures at several universities and two meetings of professional scientists in which I have argued that a clear distinction must be drawn between science as a way of knowing about the natural world and science as a foundation for philosophical views. One should be taught to our children in school, and the other can optionally be taught to our children at home. Once this view is explained, I have found far more support than disagreement among my university colleagues. Even someone who may disagree with my logic or understanding of philosophy of science often understands the strategic reasons for separating methodological from philosophical materialism if we want more Americans to understand evolution.
    Science and Humanism are too important for us not to think very clearly about what they have in common and where they are distinct. The most difficult questions for us to think about critically are the ones where one answer better suits our ends, even if another one is truer. As Humanists, we might want to claim the power of science as our own, but we cannot honestly do so. Humanists should be modeling clear thinking, not muddling it, and I think we are up to the task. (Reports of the National Center for Science Education | Volume 18 | No. 2 | March-April 1998)

  6. Craig
    March 13, 2021 at 6:46 pm

    Wisdom is the integrative process itself. When one realizes that the integration of their own awareness with the momentary flow of nature/cosmos is the experience known as God, they are much more ready to embrace a naturalistic pan-entheism, which is also an integration of science/knowing-about and wisdom/knowingness.

  7. Ikonoclast
    March 13, 2021 at 9:18 pm

    My key point for Blair Fix to consider was as follows (slightly reworded for more clarity).

    We don’t expect the fundamental laws of physics and chemistry to change any time soon in our corner of the cosmos. However, we can realistically expect that money and property will not continue immutable in their current forms. They are social fictive constructions changeable by new theories of economics and new ideological postulates. Blair Fix has referred to the problem of the “Changing Meter Stick” in his paper “The Aggregation Problem: Implications for Ecological and Biophysical Economics.” Equally, he and we need to be concerned about “Changing (Mutable) Ontological Objects”. We cannot validly posit money and property, in their current forms, or ultimately in any form(s), as unchanging ontological fundamentals. They cannot and should not be taken in ontological terms as basic, consistent or eternal social or economic objects. Therefore, however one might attempt to make a science of economics, money and property cannot be adopted as immutable fundamental objects for that science.

    To expand on this, we cannot change the nature of the fundamental standard and isolated oxygen atom by changing our minds about what oxygen atoms should “look like” or be like. We can change the nature of money and property by changing our minds (and thence our cultural behaviors, customs, regulations, and legal laws) about what money and property should be. Money and property show a continuous mutability and emergent “nature” in socio-cultural and socioeconomic historical emergence terms. We, historically and contemporaneously, keep on changing our minds about what money and property are and what they should be. Or at least, our elites keep making these decisions for us so long as we remain politically supine and compliant. Yes, I am implying that we continually allow our elites to violate us. We let them tell us what our social realities will be instead of deciding and defining these things for ourselves.

    I continue to be a little disappointed that the moment I speak about ontology, people overlook that I mean the discipline of finding/developing base ontologies for empirical disciplines. Yes, there is also the philosophical branch we may term speculative metaphysics. I am not talking about this branch of philosophy when I mention what I term in full “empirical ontology”. We might define empirical ontology as follows. Empirical ontology deals with the ontology of empirically detectable base existents (objects, processes, fields and systems). Empirical ontology does not deal with objects of speculation or belief and makes no assertions about them, supportive or dismissive. What a base existent IS will be discipline-relevant. The base existents as defined must be “basic enough” but no more basic than “basic enough”. [1] Of course, to have a dependable base ontology for a given empirical discipline (and each discipline will have a pragmatically and functionally different base ontology), these base existents must be immutable (unchangeable) or at least unchangeable within the boundary conditions of the discipline and the questions it asks and seeks to resolve. They also must be empirically detectable.

    Note 1. – As an example of the concept of a “base existent”, atoms (and electrons, electron shells and valences) are some of the defined base existents of the discipline of chemistry. As another example, in the arts and crafts field, the tile is one base existent of the craft of mosaics. There are further elements of the tile (color, glaze, “stickability” of the back with an adhesive or grout) which might be considered base elements for the discipline of creating mosaics. However, the traditional, practical mosaic craftsman is not directly concerned with what atoms are in each tile. Such existents are “too basic” in practical and pragmatic terms for his discipline. Someone using modern science to push the boundaries of the colors, glazes, strength, durability etc. of mosaic tiles will likely be concerned with compounds, molecules and atoms as base existents, to some degree.

    • Meta Capitalism
      March 14, 2021 at 1:38 am

      Ikon, my apologies. I did not mean for my comment to fall under your comment and misleadingly appear to be replying to it. My mistake. I have not read your comment carefully enough to feel it appropriate to reply. My comment was only meant as a general comment on certain specific statements in Blair Fix’s main post. Cheers.

      • Ikonoclast
        March 14, 2021 at 9:24 am

        No worries. Yes, we may well decide that much of political economy belongs back in moral philosophy. It’s my suspicion that it does. But if we try to scientize aspects of political economy we must strictly adhere to what I term empirical ontology as part of our method for that side of the enterprise.

  8. March 14, 2021 at 7:57 am

    “Initial Conditions as Exogenous Factors in Spatial Explanation” – possibly of interest https://www.researchgate.net/publication/209404426_Initial_Conditions_as_Exogenous_Factors_in_Spatial_Explanation

  9. Gerald Holtham
    March 19, 2021 at 11:37 pm

    “In many cases, the use of probability is just a ‘hack’. It is a way to simplify a deterministic system that is otherwise too difficult to model.”
    Yes indeed. The failure to grasp that seems to underlie misconceptions about econometrics sometimes found on this blog.
    While economic life cannot be entirely explained by interactions of a random character, it is nonetheless probably true that many aggregate phenomena are the unintended result of large numbers of interactions independent of people’s purposes. Mainstream economic theory treats phenomena as the predictable outcome of purposeful behaviour without any “random” effects arising as the unintended and unexpected result of numerous interactions. The truth is surely that both sets of causes and effects are present in social life and it is very hard to disentangle them. The naive thermodynamicist and the neo-classical economist are sitting on opposite ends of a girder. The task is to inch nearer to the middle of the girder where reality is best represented.
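The quoted point — that probability is often a modelling ‘hack’ for a deterministic system too difficult to track in detail — can be made concrete with a classic illustration (a hypothetical example, not from the post): the logistic map is fully deterministic, yet its long-run behaviour is best summarized statistically.

```python
def logistic_orbit(x0=0.2, n=100_000, burn_in=1_000):
    """Iterate the fully deterministic chaotic logistic map x -> 4x(1-x)."""
    x = x0
    for _ in range(burn_in):  # discard the initial transient
        x = 4.0 * x * (1.0 - x)
    samples = []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        samples.append(x)
    return samples

orbit = logistic_orbit()
mean = sum(orbit) / len(orbit)
# No randomness anywhere, yet the orbit is well described by a probability
# density (1 / (pi * sqrt(x(1-x)))), symmetric about 0.5 — so the
# 'probabilistic hack' correctly predicts a long-run mean near 0.5.
print(f"long-run mean = {mean:.3f}")
```

Every step is an exact deterministic rule, but predicting any individual iterate far ahead is hopeless, while the statistical description of the aggregate is simple and accurate — precisely the trade the ‘hack’ makes.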
