Information Viability and Systemic Exchange Value
A Thermodynamic Reframing of Information and Entropy
Preface: In the Systemic Exchange Value framework (or thermoeconomics, for short), I have been working to develop a theoretical frame for understanding the dynamics of capital accumulation economic systems, one that draws together three ‘circuits’ - the circuit of material or thermodynamic transformation (the real economy, or the system of use value), the circuit of exchange value (endogenous money and fictitious capital), and the circuit of information. Across various essays, I have focused mainly on the first two circuits, with the last featuring as a ‘support act’. This essay turns its attention specifically to the question of information. In short, I propose to treat information, too, as a thermodynamic problem, with implications for systemic entropy and negentropy. The concepts of viable and unviable information are introduced.
Introduction
In modern discourse, information is often celebrated as the quintessential anti-entropic force. Since Shannon’s foundational work, the notion that information reduces uncertainty has underpinned theories in communication, cybernetics and economics. Information is treated as an unqualified good: more of it supposedly enhances coordination, expands choice and enables control. Yet this celebratory framing obscures the material conditions of information itself. Information is not immaterial. It takes energy to create, process, store, circulate and use it. In this sense, information itself can be conceptualised in supply chain terms, with a thermodynamic profile.
If energy return on energy invested (EROEI) has proven indispensable for assessing the viability of energy systems, a parallel logic applies to information. We can distinguish between the energy required to produce information (EROEIp) and the utility potential it unleashes in negentropic terms (EROEIu). The viability of information, then, depends not on its abstract Shannonian entropy but on whether the energetic cost of producing it (including its storage) is outweighed by the energetic surplus or systemic coordination gains it enables.
This reframing suggests that not all information is anti-entropic. Some information is actually unviable: its production, storage and circulation consume more energy than it returns to the system in usable surplus or negentropy. In such cases, information becomes noise in the most literal thermodynamic sense - a net entropic drain. This short essay develops this argument within the broader framework of Systemic Exchange Value (or thermoeconomics), which grounds value in energetic flows across systemic boundaries.
Thermodynamics, Information and Systemic Exchange Value
The concept of Systemic Exchange Value (SEV) emerges from the recognition that value in any system must ultimately be linked to energy. Exchange value - whether expressed in monetary form or in the circulation of use-values - depends on flows of energy through and across systems. Energetic flows sustain structures against entropy, enabling reproduction and transformation.
Information is part of this energetic economy. Its materiality is evident: computation requires electricity, storage requires physical media and transmission requires infrastructures of copper, fibre optics, satellites and data centres. Human cognition itself is an energetically costly process; the brain consumes roughly 20% of metabolic energy while processing a sliver of the informational environment. Information, then, is not free.
In the SEV frame, information is one modality of exchange value that must be subjected to thermodynamic accounting. Information enters systemic flows as a potential negentropic resource, but its viability depends on whether the system’s energetic reproduction is enhanced or diminished by its presence. This shifts the emphasis away from information as an abstract reduction of uncertainty to information as an energetic process embedded in material systems.
Defining Information Viability
To clarify this thermodynamic framing, we introduce the distinction between viable and unviable information.
Viable information arises when the utility potential (EROEIu) of information - measured as the systemic negentropy or surplus energy it helps unlock - meets or exceeds the production cost (EROEIp) required to generate, process, store, circulate and use it. Examples include sensor data that optimise energy use, scientific discoveries that enable more efficient energy capture, or logistical information that reduces systemic waste.
Unviable information occurs when EROEIp > EROEIu. In such cases, the energy invested in producing and circulating information is greater than the surplus or negentropy it enables. This is information as systemic burden, producing complexity, redundancy or noise without sufficient utility return.
This formulation turns Shannon on his head. Shannon defined information as a measure of surprise or improbability in a signal: the less likely a signal, the more information it contains. But Shannon’s framework was explicitly indifferent to meaning or systemic consequence. In thermodynamic terms, however, not all signals are equally valuable. Some rare signals consume immense energy yet generate negligible surplus. Others, though seemingly banal, may sustain systemic resilience.
The viability framing thus introduces a thermodynamic threshold for information: information is anti-entropic only if its systemic utility outweighs its energetic cost.
The Costs of Information Overproduction
Modern societies provide abundant examples of unviable information. Here are some:
Data Centres and ICT Infrastructure. Global information and communication technology (ICT) systems consume some 8–10% of global electricity, a figure projected to rise sharply with AI, cloud computing and blockchain. Much of this energy is spent storing and transmitting data of marginal or no systemic utility: duplicated files, redundant streams and ephemeral social media interactions. When the energetic cost of maintaining this information exceeds the surplus it enables, information becomes an entropic sink. I have discussed these issues in detail in the context of AI with American characteristics, as well as in the implications of AI for electricity systems.
Financial Markets. High-frequency trading epitomises unviable information. Vast computational resources are mobilised to exploit microsecond price differentials. The energy cost is staggering, yet the systemic utility is negligible, often amplifying volatility rather than stabilising capital allocation. In SEV terms, this is information that circulates fictitious value: its EROEIp is immense, yet its EROEIu minimal. My earlier discussion on US government spending, bonds and circuits of fictitious capital examines the entropic implications of these dynamics.
Social Media and Attention Economies. The exponential proliferation of social media content produces torrents of signals, but the utility of much of this information is questionable. The energy required to produce, store, transmit and consume this content is vast. Worse, much of it destabilises coordination, amplifies misinformation and fragments collective attention. The entropic effects are not merely energetic but also social: unviable information degrades the capacity for systemic coherence. (I will return to this issue in a separate essay.)
These cases illustrate that the mere expansion of informational throughput does not equate to negentropy. Beyond a threshold, information proliferation becomes a liability.
Negentropy and Utility Potential of Information
If some information is unviable, other forms are paradigmatically viable. These are instances where modest energetic investments in information yield large systemic benefits. Here are some examples.
Logistical optimisation: the introduction of barcodes, RFID tags, and now IoT sensors in supply chains enables fine-grained tracking of goods, their whereabouts and their conditions. The energetic cost of generating and transmitting this data is modest relative to the waste avoided through improved coordination. Here, information has high EROEIu: it unlocks systemic efficiencies that conserve energy and material. For those interested in a concrete case study of these issues, my conference paper ‘When Time is Money’ concerning data, supply chains and payments in the beef export space may be of interest.
Medicine and health applications: diagnostic information - ranging from imaging scans to genetic sequencing - consumes energy in production but often yields disproportionately large negentropic benefits. A timely diagnosis can prolong life, reduce systemic treatment costs and sustain productive capacities.
Science and Engineering research and development: scientific discoveries that unlock new energy harnessing technologies epitomise viable information. The energy cost of scientific research is dwarfed by the energetic surplus unleashed by innovations such as photovoltaics, hydrogen, nuclear fission, sodium-ion technologies or advances in overall efficiency.
These examples are not exhaustive, but they highlight that information viability is context-dependent. Information has utility potential when it enables systemic negentropy across scales.
Toward an Information-Thermodynamic Metric of Systemic Exchange Value
How might information viability be operationalised? One possibility is to extend the logic of EROEI accounting to informational systems. This would involve:
Calculating EROEIp: the energetic cost of producing, processing, circulating, storing and using information. This would include infrastructure (servers, networks, cognitive effort) as well as operational energy.
Estimating EROEIu: the systemic energetic surplus or negentropy potential enabled by the information. This is more complex, requiring systemic modelling of coordination gains, waste reduction and innovation effects.
Assessing the ratio: where EROEIu ≥ EROEIp, information is viable; where not, information is fictitious or entropic.
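The three accounting steps above can be sketched schematically. The following is a minimal illustration in Python, under the simplifying assumption that the EROEIp and EROEIu sides can each be aggregated into a single energy total per informational flow; all names and figures are hypothetical, and real accounting would require the systemic modelling noted above.

```python
from dataclasses import dataclass

@dataclass
class InformationFlow:
    """Hypothetical energetic accounts for one informational flow (in joules)."""
    name: str
    production_energy: float  # EROEIp side: produce, process, store, circulate, use
    utility_energy: float     # EROEIu side: systemic surplus/negentropy enabled

    def viability_ratio(self) -> float:
        """EROEIu relative to EROEIp; >= 1.0 is viable, < 1.0 is fictitious/entropic."""
        return self.utility_energy / self.production_energy

    def is_viable(self) -> bool:
        return self.viability_ratio() >= 1.0

# Illustrative flows with invented figures, echoing the essay's examples.
flows = [
    InformationFlow("supply-chain sensor data", production_energy=1e6, utility_energy=5e7),
    InformationFlow("high-frequency trade signals", production_energy=1e9, utility_energy=1e6),
]

for f in flows:
    label = "viable" if f.is_viable() else "unviable (entropic)"
    print(f"{f.name}: ratio {f.viability_ratio():.3f} -> {label}")
```

The design choice worth noting is that the threshold is a ratio, not an absolute quantity: a flow with an enormous energetic footprint can still be viable if the surplus it unlocks is proportionally larger, which is exactly the asymmetry the essay draws between logistical sensor data and high-frequency trading.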
This metric reframes value assessment within systemic exchange value theory. Just as fictitious capital represents claims on future value (use value or other exchange values) without sufficient productive backing, fictitious information represents informational flows whose energetic cost exceeds their systemic utility.
Implications and Conclusion
The dominant narrative that “information wants to be free” and that its proliferation is inherently progressive must be rethought. Information is not inherently anti-entropic; it is conditionally so. Its viability is determined not by Shannonian probability distributions but by thermodynamic accounting: does the information unlock more negentropy than it consumes?
The implications are threefold.
At a conceptual or theoretical level, this framework repositions information within thermodynamic political economy. It suggests that systemic survival depends not on maximising informational throughput but on optimising for viable information. We need to understand the thermodynamic determinants of viable versus unviable information.
In concrete policy terms, societies must account for the energetic costs of information infrastructures and prioritise informational systems that sustain negentropy. This implies regulating unviable information flows (e.g., speculative trading, redundant data storage, unproductive data processing) and investing in high-EROEIu information systems (e.g., scientific research, logistics, healthcare).
And in operational terms, organisations can apply information viability metrics to assess whether their data strategies are energetically and systemically justified.
In turning Shannon on his head, this reframing recognises that information can be both negentropic resource and entropic burden. The challenge is to distinguish viable from unviable information and to organise systemic exchange value flows accordingly. Only then can information serve as a genuine resource for sustaining complex systems against entropy.