If you rely only on the primary definition of “entropy” in the Encyclopedia Britannica, you might think it’s not such a good thing. It sounds like a description of the employee in your shop who doesn’t do any work, as in “the measure of a system’s thermal energy per unit temperature that is unavailable for doing useful work.”
Unavailable for useful work? Who wants that? Well, it turns out that if you’re interested in the security of your communications, or of anything else encrypted that travels over the web (and obviously you should be), entropy is more than just useful. You need it in an existential way. Because, as Britannica also puts it, entropy is “a measure of the molecular disorder, or randomness, of a system.”
And in the digital world, randomness, correctly applied, is your friend. It protects you by making it hard for hackers to break your encryption. In a way, it can be the best “employee” you’ve got.
What entropy is, where it comes from, how it’s applied to encryption, and why it’s important are the focus of this post, the first in a series of three on encryption. Subsequent posts will look at federal government initiatives to set higher encryption standards, and at the tools and services available to help organizations meet those standards before impending deadlines.
Entropy’s definitions and uses have evolved over the 175 years since the term was first applied in thermodynamics to describe the molecular disorder, or randomness, of a system.
It entered the world of computing nearly a century later, through American mathematician and electrical engineer Claude Shannon’s 1948 paper “A Mathematical Theory of Communication.” Shannon introduced the concept of “information entropy,” which quantifies the uncertainty, or randomness, in the outcomes of a random variable, and which proved crucial for understanding and optimizing how information is transmitted and stored.
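To make that concrete, here is a minimal sketch in Python (my own illustration, not something from Shannon’s paper) that estimates the Shannon entropy of a byte string. Entropy is highest when every symbol is equally likely:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Estimate Shannon entropy in bits per byte: H = -sum(p * log2(p))."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

print(shannon_entropy(b"abababab"))        # 1.0 -- two equally likely symbols
print(shannon_entropy(bytes(range(256))))  # 8.0 -- the maximum for byte data
```

A long run of the same byte scores near zero, while output from a good random source scores close to eight bits per byte.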
Today, in the digital world, uncertainty is what helps to protect an organization’s crown jewels: the secret strings of data called cryptographic keys that are used with algorithms to encrypt and decrypt information.
The goal of encryption has always been to turn text or data streams into gibberish that nobody else can read. Entropy improves the opacity of that gibberish, making it even less predictable and more random, and therefore harder to crack.
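To see that in action, here is a quick sketch using the third-party `cryptography` package (my choice for illustration; the post doesn’t prescribe any particular library). The key comes from the operating system’s secure random source, and the resulting ciphertext is unreadable without it:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # 32 random bytes from the OS CSPRNG, base64-encoded
f = Fernet(key)

ciphertext = f.encrypt(b"meet me at noon")
print(ciphertext)             # opaque gibberish to anyone without the key
print(f.decrypt(ciphertext))  # b'meet me at noon'
```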
Entropy, it turns out, is lurking everywhere. In many cases, hardware random number generators harvest what sounds like useless, irrelevant data from human activity: mouse clicks and movements, keystrokes, and disk activity. Those sources are even called “noise,” suggesting the static on a TV or radio channel that drives most of us crazy. And the noise doesn’t have to be human-generated; other sources of entropy include atmospheric noise and quantum processes. Timing-based sources like these are also known as “jitter” sources.
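In practice, applications rarely tap those noise sources directly. The operating system gathers and conditions them into an entropy pool that programs draw from; in Python, for example, `os.urandom` reads from that pool:

```python
import os

# 32 bytes (256 bits) of key material drawn from the OS entropy pool,
# which the kernel keeps fed with hardware noise (interrupt timing,
# disk activity, and similar jitter sources).
key_material = os.urandom(32)
print(key_material.hex())
```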
But all that noise and jitter are anything but useless. Keys seeded with that kind of randomness are what make encryption difficult for malicious hackers to crack. High entropy yields secure communication and data protection; low entropy is a bit like leaving your doors and windows unlocked.
A caveat: There is no such thing as perfect randomness, just as there is no such thing as perfect cybersecurity or perfect software. But just as a long, random collection of letters and symbols yields much more protection than a predictable “123456” or “password1,” encryption juiced by strong entropy makes a much more difficult target. And after all, most of the time, cyber criminals are looking for easy targets.
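Rough arithmetic shows why (an illustrative back-of-the-envelope calculation, not a formal strength metric): a password chosen uniformly at random from an alphabet of N characters carries about length × log2(N) bits of entropy.

```python
import math

def random_password_entropy_bits(length: int, alphabet_size: int) -> float:
    """Bits of entropy for a password chosen uniformly at random."""
    return length * math.log2(alphabet_size)

# Six digits: about a million possibilities -- and "123456" is far worse
# in practice, because attackers try common passwords first.
print(random_password_entropy_bits(6, 10))   # ~19.9 bits

# Sixteen characters drawn from ~94 printable ASCII characters.
print(random_password_entropy_bits(16, 94))  # ~104.9 bits
```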
Finally, entropy’s importance is universal. It is crucial to the online security of everything and everybody, including organizations, governments, and individuals, because quantum computing, while not yet mainstream, is looming. It will be a more powerful tool in the hands of both good guys and bad guys, and the brute force of brute-force attacks will be more brutal.
So high randomness is crucial, and low entropy means weak security. As the legendary Hungarian-American mathematician and computer scientist John von Neumann put it back in 1951, “Anyone who considers arithmetical methods of producing random digits is, of course, in a state of sin.”
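Von Neumann’s warning maps directly onto a classic programming mistake: generating keys with a general-purpose, deterministic pseudorandom generator instead of a cryptographically secure one. A hypothetical before-and-after sketch in Python:

```python
import random
import secrets

# Sinful: random is a deterministic Mersenne Twister. Anyone who learns
# or guesses its internal state can reproduce every key it ever produces.
weak_key = random.getrandbits(256).to_bytes(32, "big")

# Better: secrets draws from the OS's cryptographically secure generator,
# which is continuously reseeded from hardware noise.
strong_key = secrets.token_bytes(32)
```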
That warning is still relevant today, and it is one of the reasons that NIST, the National Institute of Standards and Technology, is requiring organizations that want to sell digital products to the federal government to up their entropy game. More on that in the next post.