The Information, popular science author James Gleick’s new book, begins with a history of information technologies.
Starting with African drumming, it takes in Babbage’s Difference Engine, the telegraph, the telephone, the transistor and the Internet. Some of this might be familiar territory for Information Age readers, but it is as engrossing an account of that story as any and establishes the point that information theory, even at its most abstract, has its roots in machines built by man, usually for very practical reasons.
But as the machinery of information evolved, so too did the concept. Gradually, Gleick lifts the discussion from the technical to the mathematical, and on into the philosophical. Networks and switches give way to bits and bytes, then randomness and uncertainty.
He describes how information theory underpinned many of the critical scientific breakthroughs of the 20th century, including the discovery of the structure of DNA and numerous esoteric tenets of quantum physics.
As the book wends on, something more elusive begins to emerge – a definition of information. Everyone knows information when they see it, and they know how important it is in modern society. But nailing down exactly what it is, at root, is a remarkably tricky problem.
One important idea, put simply, is that information is a measure of uncertainty. If a message – a string of numbers, say – is entirely predictable from one number to the next, it contains little information. If a simple code can be used to describe the contents (not the meaning) of the message, then all that needs to be transmitted is that simple code.
Quite counter-intuitively, an entirely random string of numbers contains a great deal of information: because no rule predicts the next number from the last, every number must be transmitted in full.
Gleick presents this as a profound realisation, one that throws light on some of the biggest questions in science. What was there before matter? There was uncertainty, and therefore there was information. It is also an idea with very practical ramifications – the degree to which data can be compressed is a function of how random it is.
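The link between randomness and compressibility is easy to see for oneself. The sketch below (not from the book) uses Python's standard zlib module to compress two messages of equal length – one entirely predictable, one random – and compares the results.

```python
import os
import zlib

# A highly predictable message: one byte repeated 10,000 times.
predictable = b"A" * 10_000

# A random message of the same length.
random_bytes = os.urandom(10_000)

# DEFLATE exploits predictability; randomness leaves it nothing to exploit.
compressed_predictable = zlib.compress(predictable)
compressed_random = zlib.compress(random_bytes)

print(len(compressed_predictable))  # a few dozen bytes
print(len(compressed_random))       # scarcely smaller than 10,000 - often slightly larger
```

The predictable message collapses to almost nothing, because "10,000 copies of A" is itself a short description; the random message cannot be summarised more briefly than by quoting it in full.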
Information Age readers will probably find all of this interesting, and that alone is enough to recommend the book. A trickier question, though, is what implications information theory, as described here, might have for people whose job it is to collect, manage, protect and analyse information.
This reviewer believes that there are many. For example, the definition of information as a measure of uncertainty articulates why acquiring it helps organisations to make decisions. The conception of information as a fundamental unit of existence shows why information about people is precious, and demands to be treated with respect.
Indeed, the insights in Gleick’s book should reveal to any information professional that the stuff of their work, though often dry in its corporeal form, is in fact fundamental, essential and even quite magical.
The Information: A History, a Theory, a Flood, by James Gleick.
Published by Fourth Estate