So far in this series, we have discussed how Maxwell’s Demon emerged in 1867 as a thought experiment designed to reveal the statistical nature of the second law of thermodynamics. If you’re new to this thread, be sure to check out the historical background and politics of Maxwell’s Demon in Part I. Part II takes us to the Information Age, where the question of entropy reframes Maxwell’s Demon for a different set of cultural concerns. With that in mind, I want to begin Part II of “A Tale of Two Entropies” by hopping forward roughly a century to 1965, when Thomas Pynchon published his short novel, The Crying of Lot 49. By that time, both entropies had circulated in the public imagination such that Pynchon could insert a tongue-in-cheek Demon into his novella, yoking together the two interpretations of entropy.
A Literary Prelude
As a scholar of Victorian literature and science, I claim no mastery at all over postmodernism or the oeuvre of Thomas Pynchon. With that disclaimer out of the way, I’m now going to muddle through a gloss of entropy in a passage from The Crying of Lot 49. This novella captured my interest when I first read it as an undergrad, and it has remained a personally generative text as I’ve learned more about the Victorians over the years. Despite the novel’s definitively postmodern exploration of representation and semiotics, there is, to my mind at least, something surprisingly Victorian about how its characters grasp at representations to render a physical reality. And equally, yet ironically, as Bruce Clarke puts it, there is also something “un-Maxwellian, if not un-Victorian” about that same point.
After all, Maxwell’s particular strategies were successful precisely because his models remained metaphorical or analogical [1]. Victorian physics relied on various strategies of representation, from mathematics to metaphor, to coax energy out of its capacious and imponderable invisibility. Yet Maxwell, unlike some of his contemporaries, never let those “factual fictions” [1] stand in for reality. He instead used them as scaffolding or mediating devices. As such, the Demon is an aid for Maxwell, rather than something to be hardened into anthropomorphic form or objective agency. Pynchon’s novel plays with this Victorian question in a postmodern context.
By and large, The Crying of Lot 49 is not “about” entropy, but it does place Maxwell’s Demon at the center of a larger allegory on communication and representation. What concerns us here, specifically, is the “Nefastis Machine,” Pynchon’s satirical version of a perpetual motion engine. Spoofing the Victorian spiritual and telepathic valences of electromagnetic field theory and the luminiferous ether, Pynchon positions “communication” at the nexus of thermodynamic entropy and information entropy.
I’ll mention here that the Nefastis Machine is something I puzzled over as an undergrad in a postmodern lit class, but whose imagery is now hilarious to me because I understand how the Victorians collapsed energy science, spirituality, mediumship, and communication technologies. There is so much to unpack from the Nefastis Machine, but I’ll explain the entropy parts and how they map to communication theory at the time of Pynchon’s writing.
The Nefastis Machine is a box containing an “actual” Maxwell’s Demon with which a “sensitive” individual communicates by staring at a picture of Maxwell in profile [2]. Inside the box, the Demon sorts gas molecules, and then telepathically communicates the information about those molecules to the sensitive. The sensitive feeds that quantity of information back to the Demon (again, telepathically), and a piston moves.
Nefastis, the machine’s inventor, explains his fascination with entropy to the novel’s protagonist, Oedipa Maas, but she finds the entire affair confusing and overwhelming.
“He began then, bewilderingly, to talk about something called entropy… She did gather that there were two distinct kinds of entropy. One having to do with heat-engines, the other to do with communication. The equation for one, back in the ‘30’s, had looked very like the equation for the other. It was a coincidence. The two fields were entirely unconnected, except at one point: Maxwell’s Demon. As the Demon sat and sorted his molecules into hot and cold, the system was said to lose entropy. But somehow the loss was offset by the information the Demon gained about what molecules were where” [3].
Pynchon stuffs decades of Information Theory and its mutagenic consequences for Maxwell’s Demon into this tidy paragraph. No wonder Oedipa’s head is swimming. In short, we now have two entropies: one thermodynamic, and one informational. Their equations look the same, but they are not the same. Somehow, Maxwell’s Demon connects them.
Nefastis concludes, “Entropy is a figure of speech, then…a metaphor. It connects the world of thermodynamics to the world of information flow. The Machine uses both. The Demon makes the metaphor not only verbally graceful, but also objectively true.” Feeling like a “heretic,” Oedipa asks, “But what… if the Demon exists only because the two equations look alike? Because of the metaphor?” [3]
And this is the crux of the issue. Just as a metaphor operates on both the similarities and the differences between two objects, the differences between information entropy and thermodynamic entropy matter. In fact, N. Katherine Hayles made this very point in her book, Chaos Bound [4]. We cannot collapse these two entropies into sameness, even if their differences are culturally and mathematically suppressed.
So, I wanted to begin with The Crying of Lot 49 because it provides us with a vivid image of the dilemma: the allegory as a shifting, changing, and (not quite) material unit. Maxwell’s Demon in the Nefastis Machine is a blackboxed variety of what was originally a neat thought experiment about the second law of thermodynamics. Now we have a Maxwell’s Demon, purportedly a concrete agent that we can never see but that definitively communicates with some, but not all, individuals about the locations of gas molecules. The Nefastis Machine belongs to Pynchon’s commentary on the information model of entropy. He intertwines thermodynamics and information theory not as natural bedfellows but as paradox: perpetual motion through communication where it is thermodynamically impossible.
We are now in a position to understand how entropy acquired another definition, as information, and how Maxwell’s Demon figures in that scientific and cultural shift.
Reconfiguring Entropy: From Boltzmann to Shannon
Let’s backpedal now, returning to the Victorians. Recall from Part I that Maxwell engaged in a lengthy correspondence with Rudolf Clausius about the kinetic theory of gases. It was Maxwell who extended Clausius’s work on molecular physics and thermodynamics.
Another important physicist, Ludwig Boltzmann, adopted Maxwell’s premise that the second law of thermodynamics had only statistical certainty, and derived a formula called the “H theorem” to describe the statistical distribution of thermodynamic molecular motion [5]. Boltzmann’s stance on the thermodynamics of microstates shifted over the years, but by the 1870s he favored a statistical over a mechanical model, thus moving entropy into the domain of probability.
The H theorem supplied a proof of the second law of thermodynamics using probability calculus. In Jos Uffink’s analysis of this initial iteration, Boltzmann had not yet maneuvered his way into probability theory; rather, he marshalled probability calculus in the service of a mechanical theory of entropy [6]. Nevertheless, he was inching towards his eventual description of entropy as a statistical measure of randomness in a closed system. The more random or dispersed the state of the molecules, the higher the entropy.
At this point, we need to make an important distinction. For Clausius, the scientist who initially coined the term “entropy,” a hot gas is more entropic because its molecules move faster and it therefore undergoes a swifter thermal exchange. For Boltzmann, however, the hot gas is more entropic because faster-moving molecules intermix more thoroughly, producing a more random configuration.
The more arrangements, the more randomness, the more entropy.
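To make that counting intuition concrete, here is a minimal sketch (my own toy model, not Boltzmann’s notation): distribute N molecules between the two halves of a box, count the arrangements W that realize each split, and take Boltzmann’s entropy S = k_B ln W.

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(n_left, n_total):
    """Entropy of the macrostate with n_left of n_total molecules in the left half.

    W, the number of microstates (arrangements) realizing that split, is the
    binomial coefficient C(n_total, n_left); Boltzmann's S = k_B * ln(W).
    """
    w = comb(n_total, n_left)
    return K_B * log(w)

N = 100
# A fully sorted gas (every molecule on one side) has exactly one arrangement,
# so its entropy is zero...
sorted_S = boltzmann_entropy(0, N)
# ...while an evenly mixed gas has the most arrangements, hence the most entropy.
mixed_S = boltzmann_entropy(N // 2, N)
```

The fully sorted state has one arrangement and therefore zero entropy; the even split maximizes the number of arrangements, and with it the entropy, which is exactly the ordering described above.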
In 1929, Maxwell’s Demon returned to the conversation when Leo Szilard argued that, in order to sort molecules, the Demon needed a “kind of memory.” Sitting in his chamber, he needs to remember where fast and slow molecules are located [7]. Leon Brillouin famously took up the “memory” question in 1951. His paper, “Maxwell’s Demon Cannot Operate” [8], argues that the Demon can’t do his sorting job at all because the vessel he lives in is too dark. In technical terms, the vessel has the radiation properties of a “blackbody,” or something that radiates as much energy as it absorbs. If we equip the Demon with a headlamp, on the other hand, he can see his molecules and sort them. But by doing this, we also introduce a new source of illumination into the system. The system must now absorb this new radiation, and so the information the Demon acquires is offset by an increase in entropy.
Where we once had a “neat-fingered being,” or even a valve (as Maxwell preferred it), we now have a Demon with a headlamp. Most importantly, Brillouin’s paper concludes that information and entropy are connected. As Hayles summarizes, “the potent new force of information had entered the arena to combat entropy” [4].
In a 1987 article in Scientific American [9], Charles H. Bennett claimed that the Demon doesn’t necessarily need a headlamp. This is a question of memory storage, not of measurement, he argued. Because the Demon needs to remember the measurements he makes, at some point he will also need to clear out that space to make room for more data. The destruction of that information results in entropy increase.
Bennett’s point is significant because it signals a shift in the imagination of entropy. No longer is entropy attached to Victorian anxieties of the universe running down and growing cold; no longer are we awaiting an apocalyptic “heat death” as prophesied by our scientific authorities in one sweeping cosmological gesture. Now we are dealing with the fear of information pile-up, until, as Hayles puts it, “[information] overwhelms our ability to understand it” [4].
But Bennett did not bring us to the thermodynamic/information isomorphism that Pynchon grapples with and satirizes in The Crying of Lot 49. That move we owe to Claude Shannon.
In 1948, an engineer at Bell Laboratories named Claude Shannon published a paper titled “A Mathematical Theory of Communication” [10]. This two-part paper advanced an argument that became the foundation of what we call Information Theory.
Simply, Shannon argued that information and entropy were the same thing.
Shannon based this claim on the fact that his equation for entropy took the same form as Boltzmann’s equation for entropy. Hayles calls this “Shannon’s Choice” [4], i.e., the choice to equate these two entropies based on the similarities of their equations, despite a crucial gap in meaning.
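To see the resemblance Hayles has in mind, it helps to set the two formulas side by side. (Strictly speaking, the statistical-mechanical expression below is the Gibbs formulation of Boltzmann’s entropy, written over the probabilities p_i of the microstates.)

```latex
% Shannon's information entropy (1948), measured in bits:
H = -\sum_{i} p_i \log_2 p_i

% Gibbs–Boltzmann statistical entropy, with Boltzmann's constant k_B:
S = -k_B \sum_{i} p_i \ln p_i
```

Apart from the constant k_B and the base of the logarithm, the two expressions are formally identical; that formal identity, and nothing more, is what licenses “Shannon’s Choice.”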
Let’s think about what she means here. For Shannon, less information means less entropy: low entropy describes the state of a system that is easy to predict, one that does not surprise us much. This is like saying you have a drawer of 10 pairs of socks, but 8 pairs are black. If you reach into the drawer with your eyes closed, there is a high probability that you will grab a black pair. That’s low entropy.
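The sock drawer can be run through Shannon’s formula directly; here is a minimal sketch (my example, using the drawer’s 8-to-2 split):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits; zero-probability outcomes drop out."""
    return -sum(p * log2(p) for p in probs if p > 0)

# The drawer above: 8 of the 10 pairs are black, 2 are not.
skewed = shannon_entropy([0.8, 0.2])   # easy to predict -> low entropy (about 0.72 bits)
uniform = shannon_entropy([0.5, 0.5])  # a 50/50 drawer -> the full 1 bit of surprise
```

The lopsided drawer carries less than a bit of surprise per grab; an evenly split drawer carries the maximum.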
But for Boltzmann, choice has nothing to do with entropy. Entropy is a probabilistic measure of how little we know about the microstates of a system. Here, think about that slow-moving, low-entropy gas: its molecules are less intermixed, so we know more about them, and it is easier to make predictions.
Obviously, there is a difference.
Circling Back
Remember that in Pynchon’s novel, Nefastis told Oedipa, “Entropy is a figure of speech, then… a metaphor. It connects the world of thermodynamics to the world of information flow. The Demon makes the metaphor not only verbally graceful, but also objectively true.”
Having glossed a history of information entropy, I want to return to this moment one last time to consider Nefastis’s comment.
Let’s start here: “Entropy is a figure of speech.”
Entropy is a term coined by Rudolf Clausius in 1865, in part chosen for its similarities to “energy” (an ancient word, then new to science: see my post, “The Trouble with Defining Energy”). Clausius chose and designed this word as part of a deliberate agenda to codify the two laws of thermodynamics.
Therefore, we would say that entropy “actually” is a material-discursive entity. That is, while we cannot reduce entropy to simply a figure of speech, simply a metaphor, entropy is also not an objective thing, out there in the universe, that Clausius and his colleagues “found” and “discovered.”
Entropy, like energy, is both physical entity and discursive construction. It is what Donna Haraway (and other feminist science and technology studies scholars) would call a “natureculture,” or the entanglement of natural phenomena and the historical/cultural/semiotic practices through which we make sense of natural phenomena [11].
So, Nefastis is not quite right here; but he’s also not completely wrong. Particularly, what makes entropy so perplexing is its transformation along cultural and historical fault lines. In the Victorian era, Thomson and his colleagues weaponized entropy against secular materialists like Darwin and Tyndall in order to shore up support for a North British theological agenda in physics. In the Information Age of the twentieth century, however, the cultures shifted; the anxieties shifted. Shannon could argue that information and entropy are the same thing based on mathematical isomorphism, but also because, as Hayles suggests in her reading of Bennett, information pile-up had become a novel cultural threat. The Victorians did not worry about dealing with too much information, but people living in the mid-twentieth century did, and I imagine those twentieth-century folks would be horrified by the information pile-up in our lives today.
And where is the Demon in all of this?
“The Demon makes the metaphor not only verbally graceful, but also objectively true.”
What does it mean to make a metaphor “objectively true”? If you think about it, a metaphor can never be objectively true. It’s anathema to figurative language. If I say, “my love is a fire,” let’s truly hope my love is not literally fire. In fact, it can’t be; that makes no sense. This metaphor operates because love and fire are different entities. We see a fire in our mind’s eye. We know how hot it is, how it burns, how we can kindle a fire, stamp it out, or let it rage out of control. All that sensory richness we pack into the word “fire” and then attach it to the word “love.” Fire is the concrete anchor, and love is the floating abstraction that we pin down with that anchor. And, in doing so, a sort of magic happens where “love” acquires new dimensions. But love is never literally fire.
Returning to the Demon, then, what does it mean when Nefastis says that Maxwell’s Demon makes the metaphor of entropy objectively true? I would argue that Pynchon is alerting us to the reality that has emerged from a heuristic of representation. In fact, most of The Crying of Lot 49 questions what is “real” and what is representation. In the case of Maxwell’s Demon, though, the original metaphor that Maxwell used in his letter to Tait is stacked beneath other metaphors. In other words, the “anchor” of the metaphor is no longer a concrete entity, but rather another metaphor. The Demon becomes its own anchor: the sorting Demon is now a headlamp-wearing Demon. From here, new science and cultural context emerge intertwined. To my mind, that’s what Pynchon is getting at.
So, the Tale of Two Entropies is still unfolding, particularly because contemporary data science puts the concept of information entropy to work in ways that will inevitably shift as time progresses. Again, as a Victorianist, I don’t claim familiarity with that domain of knowledge, and when my husband, a data scientist, discusses his work with me I sometimes nod politely because I just don’t know what on earth he’s talking about. Luckily for me, you don’t need data science expertise to work through the history of entropy. I think that tracing out the historical trajectories of scientific metaphors is a useful exercise because it reveals how figures like Maxwell’s Demon mediate between our desires and our measurements in and of the world around us.
Citations
[1] Clarke, Bruce. Energy Forms: Allegory and Science in the Era of Classical Thermodynamics. University of Michigan Press, 2001.
[2] Pynchon, Thomas. The Crying of Lot 49. 1965. HarperCollins Publishers, 1966.
[3] Pynchon, Thomas. The Crying of Lot 49. 1965. HarperCollins Publishers, 1966.
[4] Hayles, N. Katherine. Chaos Bound: Orderly Disorder in Contemporary Literature and Science. Cornell University Press, 1990.
[5] Harman, P.M. Energy, Force, and Matter: The Conceptual Development of Nineteenth-Century Physics. Cambridge University Press, 1982.
[6] Uffink, Jos, "Boltzmann's Work in Statistical Physics", The Stanford Encyclopedia of Philosophy (Spring 2017 Edition), Edward N. Zalta (ed.), https://plato.stanford.edu/archives/spr2017/entries/statphys-Boltzmann/.
[7] Szilard, Leo. 1929. “On the Decrease of Entropy in a Thermodynamic System by the Intervention of Intelligent Beings.” Zeitschrift für Physik 53: 840–856.
[8] Brillouin, Leon. 1951. “Maxwell’s Demon Cannot Operate: Information and Entropy. I.” Journal of Applied Physics 22 (March): 334-357. https://doi.org/10.1063/1.1699951
[9] Bennett, Charles H. 1987. “Demons, Engines, and the Second Law.” Scientific American 257 (November): 108–116. https://www.scientificamerican.com/article/demons-engines-and-the-second-law/.
[10] Shannon, Claude E. 1948. “A Mathematical Theory of Communication.” Bell System Technical Journal 27 (July and October): 379–423, 623–656.
[11] Haraway, Donna, and Thyrza Goodeve, How Like a Leaf: An Interview with Donna Haraway. Routledge, 1999.
See also: Latour, Bruno. We Have Never Been Modern. Translated by Catherine Porter. Harvard University Press, 1993.