The technosphere as a computer: What are the thermodynamic implications?

In modern economics, information is arguably the core notion. Indeed, many economists would agree that the market is a giant distributed computer that processes information about the scarcity of resources and generates prices that guide the actions of economic agents accordingly. However, economists rarely consider the physical side of information, although in practice it matters a great deal, for example in choosing the locations of servers or managing their energy costs.

What always comes as a surprise to me is that physicists and engineers also tend to sideline the topic of information when considering the physical side of the economy. I recently attended the conference ‘Thermodynamics 2.0’, where this omission was salient once again. I know many researchers who explicitly exclude the topic from their research (for example, Axel Kleidon, a contributor to this blog, always shrugs his shoulders: too messy, no data…). Of course, one reason is simply that it is complex and difficult. In comparison, investigating the energetic aspects of the economy is a clear-cut question, and data are readily at hand. But for me, this is ‘Hamlet’ without the Prince.

In his seminal 1994 book, Robert Ayres, one of the leaders in research on the physical aspects of the economy, already elaborated on the role of information in considerable detail. However, even he retreated from the issue in his subsequent prolific work. In that book, the central problem was already salient. Physics refers to the Shannon notion of information as an established concept that allows for its quantification, such as in computation. In economics, as in Ayres’ book, this is mostly used as a measure of complexity. That fits well with the concept of entropy, and since there is a direct formal correspondence with Boltzmann’s definition of entropy, that seems to be the end of the story. But as Ayres already realized, what is entirely lost is the meaning of the information (which Shannon deliberately kept out of his original work on the transmission of information). In his book, therefore, he distinguished between two types of information, one relating to complexity and the other to functions in the context of evolutionary processes: the physical meaning of information would be its contribution to the maintenance and survival of a system. This strand was rarely pursued again. Take, for example, Hidalgo’s celebrated 2015 book, which also starts by opening up new vistas on thermodynamics and economics, but gets stuck in the complexity cul-de-sac again.

Therefore, in my 2013 book ‘Foundations of Economic Evolution’ I suggested and elaborated an entirely new approach ‘from scratch’ based on Peirce’s semiotics. Yet this move seems extremely difficult to communicate and understand, since it is unfamiliar to most other scholars (and after all, what could a 19th-century philosopher contribute?). In my contribution to the ‘Thermodynamics 2.0’ conference, I presented an introduction based on a reinterpretation of Maxwell’s demon. Recently, one of the participants, Carey King, drew my attention to an important paper by Artemy Kolchinsky and David Wolpert on ‘semantic information’ that readily provides a formal foundation for my view, yet without any direct connection to it. This paper starts out from the same basic insight that motivated my work: Landauer’s principle, which established the energetic equivalent of a bit of information. Landauer’s equivalence can almost be seen as a physical constant and implies that there is no computation without energetic costs. Kolchinsky and Wolpert connect this with Stuart Kauffman’s concept of the autonomous agent (as I do in my book), a self-sustaining system that feeds on energetic throughputs. One can then show that semantic information is that part of information which contributes to the thermodynamic sustainability of the system (‘survival’ in Ayres’ approach). The crucial corollary is that information processing exports entropy to the environment, i.e. is subject to the Second Law. Indeed, similar ideas have long circulated in biology, especially in the context of origin-of-life theories. Life is evolving information, and if we want to explain its origins physically, this close relationship between energy and information must be essential.
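To make Landauer’s principle concrete, here is a minimal back-of-the-envelope sketch in Python. The Boltzmann constant is the exact SI value; the room-temperature setting and the gigabyte example are illustrative assumptions, not figures from the paper:

```python
import math

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2)
# joules of heat into the environment.
K_B = 1.380649e-23          # Boltzmann constant in J/K (exact SI value)
T_ROOM = 300.0              # room temperature in kelvin (illustrative assumption)

def landauer_limit(temperature_k: float) -> float:
    """Minimum heat in joules dissipated when erasing one bit."""
    return K_B * temperature_k * math.log(2)

e_bit = landauer_limit(T_ROOM)
e_gigabyte = e_bit * 8e9    # erasing 10^9 bytes = 8e9 bits (illustrative)

print(f"Landauer limit at 300 K: {e_bit:.3e} J per bit")
print(f"Erasing one gigabyte:    {e_gigabyte:.3e} J")
```

At roughly 3 × 10⁻²¹ joules per bit, this theoretical floor is tiny, and real hardware dissipates many orders of magnitude more per bit operation. But the qualitative point is what matters here: there is no erasure of information without exporting heat, and hence entropy, to the environment.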

I refer the reader to the sources cited above for more detail. For understanding the technosphere, the theory suggested by Kolchinsky and Wolpert is crucial for specifying its thermodynamic role in the Earth System. The technosphere is the hardware of the economy, conceived as a computer. Thus, information processing in the economy always comes with physical costs, i.e. with exporting entropy to the environment. The big difference from the standard thermodynamic approaches to the economy, as seminally launched by Georgescu-Roegen in his 1971 book, is that this entropy production is not merely seen as reflecting the consumption of energy, which could then be contained just by adapting the energetic needs of the economy. In my view, these energetic needs result from the more fundamental process of information processing.

The catch is obvious: most economists think that more information is the solution to our current climate quandary. But they still treat information as ‘manna from heaven’, without any costs. There is growing awareness of the issue in the context of the energetic requirements of ‘big data’ and AI. But again, the solution is envisaged as ‘more information’.

The ‘technosphere computer’ is devouring energy to expand its embodied information, and the meaning of that information is to perpetuate and expand this pattern. We must develop a research program that explores this aspect of the technosphere. We happen to live in a grand delusion: that more information is always good for us. But what is the function of the flood of information that is overwhelming us and that we can no longer possibly comprehend? Is there perhaps a ‘Second Law of Information’ implying that in generating more information, there is always an irreversible loss of information?

One Reply to “The technosphere as a computer: What are the thermodynamic implications?”

  1. Jarvis: The Holy Trinity

    Yes, the flow of information is important, and energy is used and entropy produced, but we are also interested in how this information accumulates and defines the structures making up the ‘technosphere’. After all, it is these structures that differentiate the technosphere from its background and, more importantly, define its ability to do productive useful work for any given input of primary energy. We might even go so far as to say that it is the final creation of productive structure that is the definition of what is ‘useful’, a view supported by the growing appreciation that GDP comes close to measuring this. If this is so, then the energy-supported flow of information articulated by Carsten is into the stock of information encoded in the material fabric of the technosphere. This view underscores the importance of seeing and accounting for all three states of the system simultaneously – Matter, Energy and Information – the Holy Trinity. Our failure to understand both human and natural systems derives largely from over-emphasising the role of one or another element of this trinity. We started with a uniquely material view, and slowly broadened it to include energy. Not surprisingly, we left information ’til last, given its ethereal spirit. Given that we are struggling to wrestle information into the picture, perhaps we should focus on how we observe it, and then use these ‘measurements’ to paint an inductive landscape?

