Memories of Claude Shannon, father of information theory

“My greatest concern was what to call it. I thought of calling it ‘information,’ but the word was overly used, so I decided to call it ‘uncertainty.’ When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.’” – Claude Shannon, Scientific American

Lest we take information for granted, we should remember the groundbreaking work of mathematician Claude Shannon. The data and functions of modern computers would not have been possible had Shannon not made great strides in the 1930s and 1940s toward understanding the nature of information. With pivotal discoveries made at a young age, Shannon’s work at Bell Labs and MIT changed the way computers operate forever. His discoveries paved the way for how information is quantified, stored, managed, and used on an entirely different level.

Before scientists could approach information with techniques of their own, they needed to establish what matter, energy, and measurement were. And before Shannon took the stage, the researchers who allowed information theory to take root borrowed ideas from chemistry and statistical mechanics. Mathematical physicist James Clerk Maxwell demonstrated methods of describing the disorder of gas molecules in the 1870s: he showed how to analyze the distribution of molecular speeds, relate them to temperature, and even put forward thought experiments such as “Maxwell’s demon.” This research was followed by Ludwig Boltzmann’s work establishing the statistical foundations of thermodynamics in the 1890s. Later still, in 1929, physicist Leo Szilard proposed a resolution of Maxwell’s demon and derived the mathematical form for the amount of entropy produced by a single-bit measurement of a gas. When these scientists spoke of “entropy” and “memory,” the relevance of their work to machines that store and analyze large amounts of information was less a convenient metaphor and more a direct application of their discoveries.
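For reference, since the paragraph above only alludes to it: the quantity Szilard arrived at, and that later work on the thermodynamics of computation built upon, is that acquiring or erasing a single bit of information produces at least k ln 2 of entropy, where k is Boltzmann’s constant, or roughly 9.6 × 10⁻²⁴ joules per kelvin per bit.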

In the 1930s, the 21-year-old Shannon demonstrated binary circuits that perform actions based on logic. By constructing such logic circuits, computers could carry out operations, be they simple or complex. His monumental 1948 paper “A Mathematical Theory of Communication” would later be described as the “Magna Carta of the Information Age.” It changed the fundamental beliefs scientists held about information itself and allowed information to be communicated far more effectively: accuracy, precision, cost, and function all improved through what a scientist could send and receive. The paper was the first to describe the “bit,” a unit of measurement of information carried through different media. A bit is a choice. On or off. Yes or no. One or zero. Shannon saw that these pairs are all the same.
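To make the idea concrete, here is a minimal sketch (my illustration, not anything taken from Shannon’s paper) of the quantity his framework defines, the entropy H = −Σ pᵢ log₂ pᵢ, computed over a string of symbols; the function name is my own.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average information content of a symbol sequence, in bits per symbol."""
    counts = Counter(message)
    total = len(message)
    # H = -sum(p * log2(p)) over the observed symbol frequencies
    return sum(-(n / total) * log2(n / total) for n in counts.values())

print(shannon_entropy("00000000"))  # 0.0 bits: no choice, no uncertainty, no information
print(shannon_entropy("01101001"))  # 1.0 bit per symbol: each symbol is a fresh yes/no choice
```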

On top of this, Shannon’s work embodied the values prized in mathematics and physics. His solutions were elegant, simple, and, by those measures, beautiful. By removing everything unnecessary and superfluous and delivering only the essential features of information theory, Shannon’s paper became a giant upon whose shoulders future researchers would stand. Soon, students interested in mechanical engineering and mathematics watched scientists start multi-million-dollar companies and put forward incredibly practical applications of their disciplines. Titans like Steve Jobs and Bill Gates would arise in the Information Age, and the general public’s access to computers, televisions, and phones would change the landscape of science and engineering forever. Yet Shannon continued to emphasize that the scientific notion of information is devoid of meaning itself. Chaotic systems and strings of random numbers, altogether meaningless, are dense with information.
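The toy entropy function sketched above makes that last point concrete: a thousand repetitions of the letter “a” score 0 bits per symbol, while a thousand uniformly random lowercase letters score close to the maximum of log₂ 26 ≈ 4.7 bits per symbol, whether or not they spell anything at all.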

For the rest of his life, Shannon kept a lighthearted demeanor. Taking up juggling, poetry, and unicycling, he turned down fame and chose instead to remain an earth-shattering researcher in the world of the information sciences.

Finally, near the end of his life, the poet-mathematician wrote a letter to Scientific American:

Dear Dennis:

You probably think I have been fritterin’, I say fritterin’, away my time while my juggling paper is languishing on the shelf. This is only half true. I have come to two conclusions recently:


1) I am a better poet than scientist.

2) Scientific American should have a poetry column.


You may disagree with both of these, but I enclose “A Rubric on Rubik Cubics” for you.


Sincerely,


Claude E. Shannon

P.S. I am still working on the juggling paper.
