The Nature of Disorder

Conceptualizing the machinic process as one that seeks to reduce randomness in pursuit of an objective, rather than one directly targeting an objective, allows us to contextualize the machine within the fundamental laws of physics governing our universe. This intuition has persisted since antiquity, yet the cornerstone principle—the second law of thermodynamics—remained elusive. Consequently, great minds such as Archimedes, Da Vinci, and Newton were unable to formulate a comprehensive theory of the Machine.

Thermodynamics

The machine itself ultimately revealed its secrets. In the early 19th century, against the backdrop of industrialization, the study of physical laws governing engines led to the emergence of fundamental thermodynamic principles. The second law of thermodynamics, also known as Carnot's principle (1824), established the irreversibility of physical phenomena. This concept was further developed by Clausius (1865), who introduced the notion of entropy—a measure of a system's disorganization. In essence, the second law of thermodynamics posits that in an isolated system, entropy never decreases over time; rather, it increases until maximum chaos is achieved.
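
For readers who want the compact version, the standard modern statement of the law (a generic textbook formulation, not a quotation from Carnot or Clausius) can be written as:

```latex
% Second law of thermodynamics, generic textbook form:
% in an isolated system, the entropy S never decreases over time.
\Delta S \geq 0
```

Maximum chaos then corresponds to the state in which S can grow no further, namely thermodynamic equilibrium.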

This law holds numerous implications relevant to a theory of the Machine. Let us first consider this: if we observe that disorder is not increasing, then either chaos has reached its maximum, or the system is not closed—meaning external energy is continuously introduced to counteract inevitable disorder. The common thread linking all machines, whether general or specific, is their ability to channel energy to resist the increase in entropy. This principle applies universally, from mechanical processes to animal labor to the procedures that organize tasks performed by humans.

Specific machines are designed to maximize the probability of a particular outcome, while general machines aim to maintain overall system stability.

Let us park specific machines for a moment and consider solely the resistance provided by general machines whose primary purpose is to prevent the system's elements from dispersing. The system's equilibrium hinges on a tension between two opposite forces. Drawing an analogy with planetary systems, our general machine functions akin to gravity—without which celestial bodies, if they had managed to form at all, would drift erratically through our expanding universe [1].

This analogy has inspired architects of general machines since Newton's formalization of gravitation in 1687. For instance, Beccaria explained in 1764 [2] that the legislator "endeavours to counteract the force of gravity by combining the circumstances which may contribute to the strength of his edifice". Since the 18th century, scientific progress has prompted us to invert Beccaria's perspective. We are now inclined to view the rules contributing to a system's equilibrium as analogous to gravitation, rather than the inverse. Nevertheless, the idea of a resistance—in its mechanical sense—to a natural force that inevitably leads to disorder remains valid. This insight is particularly intriguing coming from Beccaria, who was likely more influenced by Rousseau than by Hobbes.

Reversing Beccaria's cosmic analogy has a major consequence: while gravity can maintain the system in a state of equilibrium that doesn't correspond to maximum chaos, there is a threshold beyond which the system collapses upon itself. Controlling a system's entropy is one challenge; attempting to reduce it is quite another. This serves as our first caution regarding the energy we channel through our general machines.

We must not delude ourselves into believing we have total control over the Machine. The machinic process is not the exclusive domain of humanity; rather, humans appear to be the only species not merely subject to its power, but capable of harnessing it. In fact, all forms of life consume energy in an attempt to resist randomness, uncertainty, and ultimately, chaos. Life itself could be distilled to this very resistance. Norbert Wiener eloquently articulated this idea in Cybernetics and Society: The Human Use of Human Beings.

Organism is opposed to chaos, to disintegration, to death, as message is to noise. To describe an organism, we do not try to specify each molecule in it, and catalogue it bit by bit, but rather to answer certain questions about it which reveal its pattern: a pattern which is more significant and less probable as the organism becomes, so to speak, more fully an organism

Norbert Wiener, 1950

The above quote encapsulates two pieces of information essential to our reasoning.

Firstly, Wiener's concept of "pattern" aligns closely with the organization of a system, standing in opposition to chaos. He describes this pattern as "less probable" because the combined probability of the characteristics present in the system (those that define a species) diminishes as their number increases. This implies that more complex and precise patterns—indicating stronger resistance to disorder—correlate with more evolved life forms.
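
A toy calculation shows why a richer pattern is also a less probable one (the numbers are purely illustrative, not Wiener's): if a pattern is defined by ten independent characteristics, each as likely as a coin flip, the whole pattern occurs with probability

```latex
% Ten independent traits, each with illustrative probability 1/2:
P(\text{pattern}) = \prod_{i=1}^{10} p_i = \left(\tfrac{1}{2}\right)^{10} \approx 0.001
```

Adding characteristics can only shrink this product, which is Wiener's point: the more fully specified the organism, the less probable its pattern.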

His perspective invites two possible interpretations: either as evidence of human superiority in the biological hierarchy, or as a justification for his work on cybernetics. While a comprehensive explanation and illustration of these interpretations will be necessary, we should refrain from drawing hasty conclusions. We can merely observe at this stage that determinism holds a universal appeal for humans, regardless of intellectual sophistication.

A more commonplace example of our inclination towards determinism is the seemingly universal wonder experienced at the sight of a child resembling one of its parents. Whether this reaction is innate or culturally induced remains unclear, but it's worth noting that we have no record of cultures where such resemblance is considered strange or repulsive.

Information theory

Another crucial aspect of Wiener's quote is the juxtaposition of "message" against "noise," mirroring the opposition of pattern to chaos. In doing so, Wiener establishes a connection between physics and information theory. This bridge forms the cornerstone of cybernetics, the theory of control and communication in the Animal and the Machine, as elaborated in his seminal 1948 work. This connection calls for a foray into information theory now, before we directly address the role of communication in the organization of communities.

Information theory is typically associated with Claude Shannon [3], who created this new branch of mathematics ex nihilo in the late 1940s. The theory's foundation lies in quantifying a message's information content not by the physical amount of data transmitted (i.e. the number of bits), but in terms of probabilities. This revolutionary approach has found wide-ranging applications, including data compression, cryptography, detection and correction of signal interference… This science has been instrumental in developing all modern communication technologies, even enabling communication with objects millions of kilometers away in space.

Importantly, one need not be a mathematician to utilize information theory. A key principle, accessible to the layperson, is that the higher the probability of a piece of information, the less data is required to transmit it. Almost everyone applies this principle on a daily basis without conscious awareness, for example when playing Wordle or using abbreviated texting. When we omit letters (frequently vowels), messages remain comprehensible. This process of mental gap-filling is facilitated by the limited number of potential letters and the even more restricted options to form actual words. Essentially, some bits of information contribute minimal value and can be omitted without significant loss of meaning. It follows that any messages or meetings with zero informative content—where nothing new is learned—are, in theory, superfluous. Paradoxically, the act of sending an ostensibly unnecessary message can itself carry information, so it will be worthwhile to examine organizational communication patterns in bureaucratic settings.
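
The quantitative version of that intuition is Shannon's self-information: the number of bits a symbol deserves given its probability. Below is a minimal sketch in Python; the letter frequencies are rough, illustrative values rather than figures from the text above.

```python
import math

def self_information_bits(p: float) -> float:
    """Shannon self-information: rarer symbols carry more bits."""
    return -math.log2(p)

# Approximate English letter frequencies (illustrative values only).
for letter, p in (("e", 0.127), ("z", 0.0007)):
    print(f"'{letter}' (p = {p}): {self_information_bits(p):.1f} bits")

# 'e' (p = 0.127): 3.0 bits
# 'z' (p = 0.0007): 10.5 bits
```

A frequent letter like "e" carries few bits, which is why dropping it rarely destroys a message; a rare "z" carries far more.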

To round out this primer on information theory, it's crucial to introduce the concept of entropy in this context. Information entropy, also known as Shannon entropy, parallels its counterpart in physics in its close relationship to probability.

In information theory, entropy represents the expected value of the informative content in a message. When entropy is very low, the probability of being surprised by a received message is correspondingly low. In other words, there's minimal likelihood that the message will prompt you to update your prior understanding or representation of the subject matter.
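
As a rough sketch of that definition, Shannon entropy is the probability-weighted average of the self-information of every possible message. The two distributions below are invented for illustration:

```python
import math

def shannon_entropy(probabilities: list[float]) -> float:
    """Expected self-information of a source, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A highly predictable source versus a maximally uncertain one (made-up values).
predictable = [0.97, 0.01, 0.01, 0.01]
uniform = [0.25, 0.25, 0.25, 0.25]

print(f"predictable source: {shannon_entropy(predictable):.2f} bits")  # ~0.24
print(f"uniform source: {shannon_entropy(uniform):.2f} bits")          # 2.00
```

The predictable source almost never surprises us, so its entropy is a fraction of a bit; the uniform source keeps us maximally uncertain at two full bits per message.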

Learning

Our introduction of physical and information entropy leads us to a fascinating paradox, particularly when we consider entities capable of learning. This paradox emerges when we, like Wiener, juxtapose physical entropy with information entropy [4] in the context of adaptive systems.

Consider an event that is completely unexpected—one with a perceived probability of zero. Once we experience such an event, it ceases to be entirely unexpected. Even if we assign it only a minuscule probability of 0.000001% (up from 0%), the information content of the next occurrence of this event will be lower than that of the first, entirely unanticipated one. Consequently, the entropy—in the sense of information theory—also decreases. In essence, learning creates order from chaos, running counter to the tendency of physical systems to increase in entropy over time.
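
A small numerical sketch of the same reasoning (the first two probabilities mirror the figures above; the third is arbitrary):

```python
import math

def surprise_bits(p: float) -> float:
    """Self-information of an event with probability p (infinite if p == 0)."""
    return math.inf if p == 0 else -math.log2(p)

print(surprise_bits(0.0))                  # inf: the event is literally unthinkable
print(f"{surprise_bits(1e-8):.1f} bits")   # ~26.6 bits once we grant it 0.000001%
print(f"{surprise_bits(1e-3):.1f} bits")   # ~10.0 bits after further encounters
```

Each update to our expectations lowers the surprise a future occurrence can deliver, and with it the expected surprise, that is, the entropy.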

It's common to equate intelligence directly with the ability to learn. However, this association oversimplifies a complex relationship: the capacity for learning is a necessary condition for intelligence, but it is not sufficient on its own. Learning and life are intimately intertwined, even in organisms we might consider unintelligent.

Bacteria represent one of the earliest and most fundamental forms of life: single-celled organisms without a nucleus, often possessing just a single chromosome. These microorganisms reproduce through binary fission, essentially creating clones of themselves. Yet bacteria exhibit remarkable adaptive capabilities, and their apparent simplicity hides a sophisticated capacity for "learning" at the genomic level.

The bacterial genome serves as a historical record of past encounters, documenting the "surprises" these organisms have faced throughout their evolution. A prime example of this genomic learning is the CRISPR-Cas9 system: the genome-editing technology now deployed in the pharmaceutical industry was borrowed from a natural mechanism that bacteria use to integrate DNA fragments from viruses that have previously infected them, thereby developing an immune defense against future encounters with similar viruses.

Recent research suggests that the last universal common ancestor (LUCA), which lived some 4.2 billion years ago, already carried CRISPR-Cas genes. This speaks to the intricate link between life and the capacity for learning through the dynamic encoding of new information. While the precise mechanisms may vary considerably between species today, this ability to learn from and adapt to environmental challenges is a common thread running through most forms of life.

Learning, in all its forms, carries an energetic cost. This energy expenditure occurs in two primary stages: first during the encoding process and second when retrieving and interpreting encoded information. This principle holds true for both the complex mechanisms at work in the brain and the simpler immune processes described earlier. If life is learning and learning reduces entropy, then life aligns with our definition of a Machine even in its most basic manifestations. And it's a remarkably efficient Machine.

Here, we find ourselves in agreement with Guattari's conception that the "machine is the prerequisite for technology rather than its expression". Our line of reasoning has already led us to explore two key components of Guattari's concept of a machine [5]:

  • The "material and energetic" component
  • The "semiotic, diagrammatic and algorithmic components" component

For those versed in information technology, the intricate workings of the DNA/RNA machinery must inspire awe.

Life is not the ultimate Machine

Despite its efficiency, this biological machinery is not infallible. It remains subject to errors that continuously modify genetic code, keeping life on Nature's inexorable path toward chaos. These modifications occur through several mechanisms:

  • Random mutations in existing genes (nucleotide sequences that encode protein production)
  • The emergence of de novo genes from random mutations in non-coding DNA (which constitutes the majority of the genome)
  • Horizontal gene transfers, where genetic information passes between species via intermediaries such as bacteria or retroviruses. Remarkably, it's estimated that up to 8% of the human genome consists of retrovirus remnants, even though humans don't naturally employ CRISPR-Cas9 mechanisms.

This constant tension between order and chaos contributes to the fragile equilibrium of biological systems, perpetually at risk of collapse under the relentless assaults of unpredictability and randomness. In nature's response to this challenge, we observe the creation of increasingly specific subsystems. These allow the 'machine' to concentrate its energy on defending an ever more restricted front against entropy.

Humans occupy a singular position as the only known life form to have harnessed and directed the Machine. This mastery has evolved through several stages: first by framing the machinic process, then by optimizing it (through augmentation or simplification) and finally by generalizing and multiplying it. In this endeavor, humanity has borrowed from life itself a proven method: subdividing systems into specialized subsystems. These increasingly specialized 'machines' combat chaos with growing efficiency.

Our Domesticated Machines series will examine many of these human-created machines. In the next few posts, we will keep Life as the common thread to gain deeper insights into how Nature's technology inspired and guided the development of human machines.


[1] Does expansion challenge the closed system? Neither expansion nor gravity is formally explained in physical terms.
[2] On Crimes and Punishments, 1764
[3] Father of the bit, Shannon theorized the use of switches to encode information in a binary system in what is considered one of the most important master's theses of the 20th century: A Symbolic Analysis of Relay and Switching Circuits, 1937.
[4] "in control and communication we are always fighting nature's tendency […] for entropy to increase" - Cybernetics and Society: The Human Use of Human Beings
[5] Chaosmosis, 1992