Epochal Entropy:
Constructive Chaos for the New Millennium

by David Pelovitz

Traditionally, after the Giants beat the Redskins, I get to call my old friend Steve and harass him. This year was no exception. During the course of our subsequent conversation he asked me to read through a press release he was writing for Greenpeace, but he didn't tell me what it was about. It turned out to be a lengthy report on CFCs and ozone depletion and all kinds of god-awful depressing stuff about how we humans were blowing it on the planetary level. And I'm tired of seeing that happen.

It's 1994, and I find I'm hearing more people talking about the millennium -- the big change that must come in the year 2000. I'm generally not impressed by superstitions, but even a coldly logical look at the world shows that our old way of thinking is failing on a global level. I can still remember Bob Geldof telling my generation that we do have the resources to feed the world, then last month I watched the nations of the world conclude that those resources were irrelevant because there were too many people. I have no doubt the earth will get along fine without the human race, but we may be the only species ever to have a choice in whether we go extinct. I can't help but wonder if we'll blow it.

Back in April Terence McKenna came to New York and posed an interesting question: Do we have any reason to hope for the future of humanity? He answered yes, based largely on an evolutionary selection toward ingenuity in which we would find our mutual survival. But during his lecture, I kept thinking about mutations.

The backbone of evolution is that the species best able to adapt to their environment will survive. But many adaptations occur through mutation. Certainly, the word mutant has a well-deserved nasty reputation -- in the vast majority of cases mutations are negative. But when you allow for the millions of pieces of genetic information and multiply that by the number of ways each can be mistranscribed, you find yourself well into big numbers. And those one-in-a-million positive mutations are occurring -- maybe only once every million times -- but they do occur.

From the largest perspective, we have the second law of thermodynamics telling us that the universe's usable energy will eventually dissipate, and the universe will not so much die as come to a halt. In every naturally occurring process, entropy -- the measure of energy that cannot be converted to thermodynamic work -- increases, so the universe comes closer to being a single uniform temperature. Once simple heat exchange can no longer occur, no naturally occurring process can take place. But entropy is also the measure of disorder in any system. As this universe is dissipating, it is becoming less orderly. And this is where I look for the millennial change.
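
To put that mutation arithmetic in rough figures, here is a minimal sketch in Python; the per-copy rate and the count of copying events are invented purely for illustration, not drawn from any biology text.

    # "One in a million" multiplied across millions of chances.
    # Both numbers below are made up for illustration only.
    p_positive = 1e-6        # chance a single copying slip is actually helpful
    chances = 10_000_000     # copying slips across a whole population

    # Probability that at least one helpful mutation turns up somewhere:
    p_at_least_one = 1 - (1 - p_positive) ** chances
    print(p_at_least_one)    # about 0.99995 -- the rare becomes nearly certain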

In 1967, Norbert Wiener looked at the problem of entropy and said "it is highly probable that the whole universe around us will die the heat death." But he also pointed out that this is well into the universe's future. In the meantime, it is possible for entropy to decrease on the local level. Planting a tree or educating a person are ways to decrease entropy locally. Another way to decrease entropy is through increased information (Wiener, 44-5).

James Clerk Maxwell once theorized that a demon could sit within a two-chamber box and sort the molecules by their relative heat. If the demon could control the opening and closing of a door in the box, then all the hot molecules could be put within one chamber. If each chamber had a piston attached, then the piston in the hot chamber would rise, and useful work would appear without any thermodynamic cost. If the demon did exist, the act of decreasing thermodynamic entropy would also result in a gain in information about the position of the molecules. Maxwell's Demon is thus a rare case where thermodynamic entropy and informational entropy decrease together.
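
A minimal toy simulation, assuming nothing more than random molecule "speeds" and a demon that only opens the door for traffic in the right direction, shows how sorting alone produces a hot side and a cold side (every number here is invented for illustration):

    import random

    random.seed(1)
    left  = [random.uniform(0, 1) for _ in range(500)]   # molecule speeds, left chamber
    right = [random.uniform(0, 1) for _ in range(500)]   # molecule speeds, right chamber

    # The demon opens the door only to let fast molecules into the left
    # chamber and slow molecules into the right one.
    for _ in range(20000):
        side, pool = random.choice([("left", left), ("right", right)])
        mol = random.choice(pool)
        if side == "right" and mol > 0.5:
            pool.remove(mol)
            left.append(mol)
        elif side == "left" and mol < 0.5:
            pool.remove(mol)
            right.append(mol)

    print(sum(left) / len(left))     # climbs toward ~0.75: the hot chamber
    print(sum(right) / len(right))   # falls toward ~0.25: the cold chamber

The demon adds no energy; all it spends is attention to where each molecule is, which is the information gain described above.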

Information is subject to entropy. Every time some of a thought gets lost between one intelligence translating it into language and another intelligence receiving and decoding the message, informational entropy increases. Each time a message is slightly misunderstood or egregiously massacred in translation, chaos ensues.
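
A minimal sketch of that degradation, assuming a crude channel that garbles a small fraction of characters on each retelling (the sample message, error rate, and alphabet are all invented for illustration):

    import random

    def retell(message, error_rate=0.05):
        """Pass a message along, randomly garbling a few characters."""
        alphabet = "abcdefghijklmnopqrstuvwxyz "
        return "".join(random.choice(alphabet) if random.random() < error_rate else c
                       for c in message)

    message = "we do have the resources to feed the world"
    for _ in range(5):
        message = retell(message)
        print(message)   # each retelling drifts a little further from the original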

Claude Shannon, who wrote "The Mathematical Theory of Communication," observed that ordinary language is not random. To him, language is patterned in such a way that each word actually contains less than one word's worth of information, because each word's meaning is determined (to an extent) by its context. Therefore, in an orderly system of language, information is severely limited. But in a chaotic system, where words may not relate in what we usually consider a coherent fashion, information is actually developing. The interpreter will need to make suppositions to fill in the missing details. Information is actually created as a result of entropy.
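
Shannon's measure of information is the entropy H = -sum of p(x) log2 p(x), in bits. A minimal sketch, using a toy sentence rather than a real corpus, shows how the patterning of ordinary text pulls the information per character below the theoretical maximum:

    from collections import Counter
    from math import log2

    def bits_per_character(text):
        """Shannon entropy of a string, in bits per character."""
        counts = Counter(text)
        n = len(text)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    sample = "the quick brown fox jumps over the lazy dog and the dog naps"
    print(bits_per_character(sample))   # roughly 4 bits per character
    print(log2(27))                     # about 4.75 bits if 27 symbols were equally likely

Counting single letters already shows the shortfall; Shannon's own guessing experiments, which let context do the predicting, put ordinary English closer to one bit per letter.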

Starting from the normal level of misunderstanding any time two people try to communicate, multiply. Let's consider this article. Assuming my examples refer to things you recognize, you come up with an association similar to mine, though not identical. Now multiply by the number of times someone doesn't get the reference. Multiply again by the times someone stops paying attention and picks up again lower on the page without backtracking. Multiply by the number of times line noise in some connection changes enough alphanumeric symbols that someone is faced with a missing word or two and invents new ones as replacements. That's just the start of the number of potential variant messages that could arise from this single message. When information falls victim to entropy, more messages are generated -- which is a two-edged sword, of course, because untrue information may be valueless, but who is to say what's untrue?

When chaos theory comes into play, the world gets very odd very fast. Probability mathematicians looked at megalotteries and said that the chances of any particular person ever winning the lottery were ridiculously low, but given the number of megalotteries that take place in the USA, the chance that someone would eventually win twice was extremely good. Soon after, a woman in Pennsylvania actually did win twice. It is true that all snowflakes are structured on six points. But despite that limit, we are always told that no two are the same. If we consider snowfall, any flake could land anywhere. But this disorderly rule leads to an even cover of snow every time.
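
The double-winner arithmetic can be sketched with a few made-up but plausible figures -- none of them are any real lottery's actual odds:

    from math import comb, exp

    # Every figure here is invented for illustration.
    p_jackpot = 1 / 10_000_000    # chance one ticket wins one drawing
    tickets   = 10                # tickets a habitual player buys per drawing
    drawings  = 7 * 52            # weekly drawings over seven years
    players   = 50_000_000        # habitual players across all the lotteries

    p_per_draw = tickets * p_jackpot
    # Chance that one particular, named player wins at least twice
    # (small-odds approximation):
    p_named_player = comb(drawings, 2) * p_per_draw ** 2
    # Chance that somebody, somewhere, wins at least twice:
    p_somebody = 1 - exp(-players * p_named_player)

    print(p_named_player)   # about 7e-08 -- roughly one in fifteen million
    print(p_somebody)       # about 0.96 -- closer to a sure thing than a miracle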

Look back now at Maxwell's two-chamber box and remove the demon. Conquering entropy isn't helped much by an imaginary creature anyway. If we build the box with two pistons and an open door between the two chambers, there is a possible set of conditions under which all the hot molecules simply end up on the same side of the box, with no intelligence involved, and raise that piston as a result. It is unlikely but possible. And since we are talking about big numbers -- of molecules, of moments, of chances -- the odds that the unlikely actually occurs somewhere, sometime, get much better. Which isn't to say we may not have to sift through a lot of chaos and dross before the epochal change that should be coming.
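
To put a number on "unlikely but possible," here is a minimal sketch assuming an idealized box where each molecule is equally likely to be in either chamber at any instant; the molecule count and the rate of re-arrangement are invented for illustration.

    # All the molecules on one side, with no demon minding the door.
    # Both figures below are invented for illustration.
    n_molecules = 50
    looks_per_year = 10**9 * 60 * 60 * 24 * 365   # one fresh arrangement per nanosecond

    # Chance that any one look finds every molecule in the same chamber (either one):
    p_one_look = 2 * 0.5 ** n_molecules

    print(p_one_look)                   # about 1.8e-15 -- absurd odds on any single look
    print(looks_per_year * p_one_look)  # about 56 -- yet it happens weekly, given enough looks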

The truth is, in any given set of circumstances, decay (some would say death) is the most likely outcome. It may well be that we have to watch our society fail on a global level. But we may not. If death is the most likely outcome in any set of circumstances, then every situation that does not result in death must result in more possibilities. A new set of possibilities forestalls entropy, and local reductions in entropy are always possible results. We may never defeat it, but any of those local decreases might just be more than a little localized. Any one of them might change the world in a way that forestalls mass decay and heat death a little longer.

And right now, amidst the loss of Bill Parcells, Lawrence Taylor, and Phil Simms, the Giants have fashioned a 3-0 record. I find hope in that :-)


Copyright © 1994 David Pelovitz
PELOVTZD@acfcluster.nyu.edu
Enterzone Copyright © 1995