Entropy for Information Systems
Entropy is a fairly easy concept to define: the measure of disorder in a closed system. It is a rather difficult concept to grasp, but one that furnishes us with wonderful insights into the way the world around us operates. The amount of entropy in the Universe is ever increasing; the energy concentrated in our sun is constantly radiating away as light and heat, dissipating into an unusable state, absolute undifferentiation.
Living things form “pockets of resistance” to the force of entropy. They do this through syntropy, or negentropy: the entropy we export to reduce our internal entropy. In other words, it’s the waste energy we generate to keep our soma in an organized, working state. We collect the sun’s waste energy and use it to organize ourselves through syntropy.

[Figure: How Much Information Entropy? Credit: Moi]
In information systems, entropy, known as Shannon entropy after Claude Shannon, is the measure of uncertainty in a random variable. A fair coin toss has one bit of entropy for the 50/50 chance of it turning up heads or tails, 0 or 1. A six-sided die carries about 2.58 bits of entropy (log2 of 6) for the six equally likely outcomes of each roll; three bits are enough to encode them (1 (000), 2 (001), 3 (010), 4 (011), 5 (100), 6 (101)). The weather has an amount of entropy that is difficult to quantify, and it varies from location to location: the weather in New York has more entropy than the weather in Southern California because Southern California has a more consistent climate. Similarly, in our first example, if we were dealing with a rigged coin, one that turned up heads more often than tails, then each toss would carry less than one bit of entropy, because we would expect heads more frequently than tails.
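If you want to check those figures yourself, here is a short Python sketch of Shannon’s formula (the function name is mine, not anything standard):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the outcome probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1 bit
print(shannon_entropy([1/6] * 6))    # fair die: about 2.585 bits
print(shannon_entropy([0.9, 0.1]))   # rigged coin: less than 1 bit
```

Note that the rigged coin comes in under one bit, just as described: the more lopsided the odds, the less uncertainty each toss resolves.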
On the face of it, the only thing the thermodynamic and information-theory definitions have in common is that entropy is a measure of disorder, but the two are analogous in other ways. In our thermodynamic Universe, things move toward a state of increasing entropy, and a similar tendency toward a state of total uniformity occurs in an information system, only in reverse.
A living organism in an information system starts out in a world of absolute entropy: nothing is known. As that life interacts with its Universe, the amount of entropy in the Universe decreases for that being, while its internal entropy increases as what it knows becomes more of a variable to its peers. As the beings living in an information system decrease the entropy of their universe, it tends toward a state of absolute syntropy, absolute predictability.
We exist in a thermodynamic system, and it powers the information systems in our brains, the information systems we construct, and the information system these combine to form in our civilization. The increase of syntropy in an information system comes at the cost of an increase of entropy in the thermodynamic system powering it. Our thermodynamic system is winding down, a bad thing for us, but our information system is becoming increasingly sophisticated, more syntropic. This brings a deeper insight to H.G. Wells’ prescient observation: “Human history becomes more and more a race between education and catastrophe.”
)) You forgot a closing paren :)
Good stuff. Entropy with cards… Sounds like my Combinatorics exam.
Comment by ClintJCL aka Rev. Xanatos Satanicos Bombasticos — August 30, 2010 @ 9:25 am
I actually took a course in chaos theory, and trust me, it has nothing to do with information entropy (or maybe I just don’t see it). And chaotic systems really don’t have to be highly entropic.
In its simplest form, an inverted pendulum is a chaotic system. It has two points of stability, with one being much more likely than the other. The only thing affecting the outcome is the initial conditions. So in comparison to the coin toss example, the inverted pendulum is actually less entropic. It’s more like the rigged coin.
Comment by Chriggy — August 30, 2010 @ 2:39 pm
Related comic:
http://www.smbc-comics.com/index.php?db=comics&id=1986
Comment by Dave — August 31, 2010 @ 7:35 am
The double pendulum is an excellent example of how I think Shannon entropy relates to chaos theory. The double pendulum is a deterministic system, but minor variations in its initial state result in behaviors that are seemingly impossible to predict. It seems impossible to predict because we haven’t yet mastered the variables necessary to forecast its behavior, but if we master that data, overcome the Shannon entropy in the system, then we will predict, with increasing accuracy, how the system will behave.
As our capability to simulate such systems grows exponentially, we will increasingly master other chaotic systems like the weather and the orbits of Saturn’s moons. Of course, once a system’s information is rendered syntropic, it’s no longer chaotic. The domain of chaos theory is inversely proportional to the scope of civilization’s knowledge.
Of course, there’s also a really good probability that I’m completely wrong on this too. : )
Comment by ideonexus — August 31, 2010 @ 7:05 pm
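The sensitive dependence on initial conditions discussed in the comments above can be seen in an even simpler deterministic system than the double pendulum: the logistic map, a textbook example from chaos theory. In the sketch below (my own illustration, not from the discussion), two starting values differing by one part in ten billion stay microscopically close for a few iterations and then diverge to completely different trajectories:

```python
def logistic(x, r=4.0):
    """One step of the logistic map x -> r*x*(1-x); chaotic at r = 4."""
    return r * x * (1 - x)

x, y = 0.2, 0.2 + 1e-10          # two nearly identical initial conditions
early_gap, max_gap = None, 0.0
for step in range(100):
    x, y = logistic(x), logistic(y)
    if step == 4:
        early_gap = abs(x - y)    # after 5 steps: still microscopically close
    max_gap = max(max_gap, abs(x - y))

print(early_gap)   # tiny: the trajectories are still indistinguishable
print(max_gap)     # macroscopic: the trajectories have fully diverged
```

The system is perfectly deterministic, yet any finite-precision measurement of the starting state eventually fails to predict it, which is exactly the tension between determinism and Shannon entropy the comment describes.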