The Digital Big-Bang

One gigabyte 20 years ago (left), one gigabyte today (right)

Bill Gates is often misquoted as having said in the 1980s that “no one will ever need more than 640K of memory.” Twenty-four years ago, my Commodore 64 personal computer ran games like “Mail-Order Monsters” and “Archon” on a mere 64 kilobytes of memory. That was a huge advance over my 1977 Atari 2600 game console, which ran “Pong” and “Space Invaders” on a scant 128 bytes of memory. Today my dual-core Pentium uses a gigabyte of RAM, about 7.8 million times as much memory as the Atari, and, after upgrading to Windows Vista, even that doesn’t cut it anymore.
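That “7.8 million” figure checks out with a one-liner; here’s a quick sketch in Python, assuming the decimal gigabyte of one billion bytes:

    # RAM growth from a 1977 Atari 2600 to a 2007 desktop PC
    atari_ram = 128                # bytes
    pc_ram = 1_000_000_000         # one decimal gigabyte, in bytes
    print(pc_ram / atari_ram)      # -> 7812500.0, about 7.8 million times as much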

From bits to bytes, kilobytes, megabytes, gigabytes, and, with impending DVD technological advances, terabytes, our computing power grows exponentially. This empirically observed trend is known as Moore’s Law, named after Intel co-founder Gordon E. Moore, whose 1965 observation about the number of transistors on an integrated circuit is popularly stated as a doubling every 18 months. In other words, computers double in power every year and a half. This law of computing has held true for over 40 years in an explosion of processing power that allows for what history will record as the Information Age, the times in which we are currently living.
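Compounded over four decades, that doubling is staggering. Here is a back-of-the-envelope sketch in Python, assuming an idealized 18-month doubling period that real hardware only approximates:

    # Compound growth under Moore's Law: one doubling every 1.5 years
    years = 40
    doublings = years / 1.5          # about 26.7 doublings
    print(f"{2 ** doublings:,.0f}")  # -> roughly a 100-million-fold increase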

Now it’s time to familiarize ourselves with a new measurement, the exabyte. We can thank research firm IDC’s white paper The Expanding Digital Universe, which estimates that the human race collectively produced 161 exabytes of data in 2006, for introducing us to this latest milestone.

So what’s an exabyte? To visualize this number, it’s helpful to begin at the smallest measurement of data, the bit. A bit is a 1 or 0, “on” or “off,” “true” or “false.” Up one level from this binary state we have the byte, which is 8 bits. If you open Notepad on your computer, type any one letter and save the file, you have generated one byte of data, which you can verify by right-clicking on the file and selecting “Properties.”
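If you’d rather not click around, the same experiment fits in a few lines of Python (a sketch assuming plain ASCII text, where each character occupies exactly one byte):

    # Recreate the Notepad experiment: save one ASCII character, measure the file
    import os

    with open("one_letter.txt", "w", encoding="ascii") as f:
        f.write("A")                             # a single character, no newline
    print(os.path.getsize("one_letter.txt"))     # -> 1 (byte)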

Every additional character typed and saved will add another byte to the file’s size. Every 1,000 characters is a kilobyte, and every 1,000 kilobytes a megabyte. A 90,000-word novel translates into about 0.5 megabytes1. An exabyte is 1,000,000,000,000,000,000 bytes of data, or 2 trillion novels. That’s about 308 novels written for every person on Earth2, and we are producing 161 times that much data, 230 billion CDs worth3, or nearly 50,000 novels for every person on Earth every year.
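The footnote math compresses into a few lines of Python (assumptions: a 0.5-megabyte novel, a 700-megabyte CD, 6.5 billion people, and decimal units throughout):

    # How big is an exabyte, in novels and CDs?
    exabyte = 10 ** 18                  # bytes
    novel = 500_000                     # bytes (~90,000 words)
    cd = 700_000_000                    # bytes (700 MB)
    people = 6_500_000_000

    novels_per_exabyte = exabyte / novel            # 2 trillion
    print(novels_per_exabyte / people)              # -> ~308 novels per person
    print(161 * exabyte / cd)                       # -> ~230 billion CDs
    print(161 * novels_per_exabyte / people)        # -> ~49,500 novels per person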

We produced more data last year than in the preceding 5,000 years of human history. That’s just 2006, and that’s only the beginning. IDC predicts that “in 2010, the amount of digital information created and copied worldwide will rise sixfold to a staggering 988 exabytes.” That’s just 12 exabytes short of having to adopt yet another term of measurement, the zettabyte.
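Both figures in that prediction are easy to sanity-check, assuming decimal prefixes where a zettabyte is 1,000 exabytes:

    # Sanity-check IDC's 2010 prediction against the next unit up
    zettabyte_in_exabytes = 1_000
    print(zettabyte_in_exabytes - 988)   # -> 12 exabytes short of a zettabyte
    print(988 / 161)                     # -> ~6.1, the predicted sixfold rise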

The search engine Google is named after the googol, a term coined by the young nephew of mathematician Edward Kasner for the number one followed by 100 zeros. By one recent estimate, it takes 450,000 computers networked across server farms to run the Google search engine, which indexes some 8 billion Web pages. I wonder when we’ll be talking about our hard drives (or maybe they’ll be flash drives by then) in terms of googolbytes?

And then we still have the googolplex waiting for us in the distant future, the number one followed by a googol of zeroes.
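For a sense of scale, Python’s arbitrary-precision integers handle a googol effortlessly, but a googolplex is hopeless: its decimal expansion has a googol digits, more than the number of atoms in the observable universe.

    # A googol is easy to write out; a googolplex is not
    googol = 10 ** 100
    print(len(str(googol)) - 1)     # -> 100 zeros after the leading 1
    # googolplex = 10 ** googol    # don't uncomment: a googol digits fit nowhere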


1. 500,000 characters per novel is based on a Microsoft Word word count and character count of one of my novels, which came out to 450,000 characters for an 82,000-word novel. So this is a very conservative estimate.

2. 1,000,000,000,000,000,000 bytes translates to 1,000,000,000,000 megabytes; at 0.5 megabytes per novel, that is 2,000,000,000,000 novels, which divided by 6.5 billion human beings comes to about 308 novels each.

3. CDs hold 700 MB (700,000,000 bytes) of data: 161,000,000,000,000,000,000 bytes ÷ 700,000,000 bytes per CD ≈ 230,000,000,000 CDs.

