Entropy for Information Systems

Entropy is a fairly easy concept to define (the measure of disorder in a closed system) and a rather difficult one to grasp, but it furnishes us with wonderful insights into how the world around us operates. The amount of entropy in the Universe is ever-increasing: the energy concentrated in our sun is constantly radiating away as light and heat, dissipating toward an unusable state of absolute undifferentiation.

[Image: Sunflower. Credit: riandreu]

Living things form “pockets of resistance” to the force of entropy. They do this through syntropy, or negentropy: the entropy we export in order to reduce our internal entropy. In other words, it is the waste energy we generate to keep our soma in an organized, working state. We collect the sun’s waste energy and use it to organize ourselves through syntropy.

[Image: How Much Information Entropy? Credit: Moi]

In Information Systems, entropy, known as Shannon entropy after Claude Shannon, is the measure of uncertainty in a random variable. A fair coin toss carries one bit of entropy for the 50/50 chance of it turning up heads or tails, 0 or 1. A fair six-sided die carries about 2.58 bits of entropy (log₂ 6) for the equally likely outcomes it may produce with each roll, even though three binary digits are needed to label them all (1 (000), 2 (001), 3 (010), 4 (011), 5 (100), 6 (101)). The weather has an amount of entropy that is difficult to quantify, and it varies from location to location: the weather in New York has more entropy than the weather in Southern California, because Southern California has a more consistent climate. Similarly, in our first example, if we were dealing with a rigged coin, one that turned up heads more often than tails, each toss would carry less than one bit of entropy, because we would expect heads more frequently than tails.
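
As a quick illustration of those numbers, here is a minimal Python sketch of the standard formula H = -Σ p·log₂(p); the function name shannon_entropy is just a label for this example:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Fair coin: two equally likely outcomes -> exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))      # 1.0

# Fair six-sided die: six equally likely outcomes -> log2(6), ~2.585 bits.
print(shannon_entropy([1/6] * 6))       # ~2.585

# Rigged coin that lands heads 90% of the time -> less than 1 bit.
print(shannon_entropy([0.9, 0.1]))      # ~0.469
```

The rigged coin is the whole idea in miniature: the more predictable the outcome, the less uncertainty each toss resolves, and the lower the entropy.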


On the face of it, the only thing the thermodynamic and information-theoretic definitions have in common is that entropy is a measure of disorder, but the two are analogous in other ways. In our thermodynamic Universe, things move toward a state of increasing entropy, and a similar tendency toward a state of total uniformity occurs in an information system, only in reverse.

A living organism in an information system starts out in a world of absolute entropy: nothing is known. As that life interacts with its Universe, the amount of entropy in the Universe decreases for that being, while its own internal entropy increases, as what it knows becomes more of a variable to its peers. As the beings living in an information system decrease the entropy of their universe, it tends toward a state of absolute syntropy, absolute predictability.

We exist in a thermodynamic system, and it powers the information systems in our brains, the information systems we construct, and the information system these combine to form in our civilization. The increase of syntropy in an information system comes at the cost of an increase of entropy in the thermodynamic system powering it. Our thermodynamic system is winding down, a bad thing for us, but our information system is becoming ever more sophisticated, more syntropic. This brings a deeper insight to H.G. Wells’ prescient observation: “Human history becomes more and more a race between education and catastrophe.”


Notes:

  • With this understanding of Information Entropy, apply it to the Monty Hall Problem (and an interactive demo of it) to see information entropy in effect; the sketch after this list works through the numbers.
  • Playing with a deck of cards is also a fun way to think about information entropy. What’s the measure of entropy in a shuffled 52-card deck? What’s the entropy of just the suits? (Also worked out in the sketch after this list.)
  • Simulations and bioinformatics are giving us increasing syntropic power over previously chaotic (read “highly entropic”) systems. Chaos theory could just as well be “Information Entropy Theory.”
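
As a rough sketch of the first two exercises, assuming uniformly random draws, shuffles, and car placement, the numbers work out as follows (Python's standard math module is all that is needed):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# One card drawn from a well-shuffled 52-card deck: 52 equally likely outcomes.
print(math.log2(52))                     # ~5.70 bits

# Just the suit of that card: 4 equally likely outcomes.
print(math.log2(4))                      # 2.0 bits

# The entire shuffle (the ordering of all 52 cards): 52! equally likely arrangements.
print(math.log2(math.factorial(52)))     # ~225.58 bits

# Monty Hall: before any door opens, the car is equally likely behind 3 doors.
print(shannon_entropy([1/3, 1/3, 1/3]))  # ~1.585 bits

# After you pick a door and the host reveals a goat behind another one, the car
# is behind your door with probability 1/3 and the other closed door with 2/3,
# so the host's reveal handed you roughly 0.667 bits of information.
print(shannon_entropy([1/3, 2/3]))       # ~0.918 bits
```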
