On Encryption, Part 1: The Morality of Mathematics

I have been privileged throughout my life to be surrounded by many intelligent and articulate people. Between friends, colleagues, and family members, I regularly have a chance to engage in real, honest, and meaningful discussion on a wide range of subjects. I value and cherish this fact for numerous reasons, not least because it forces me to be better: to be more rigorous, more intellectually honest, and more proactive in examining the world. A few days ago, when Apple released their letter to customers regarding the San Bernardino case, I had such an engagement with my friend James. Our conversation reaffirmed my desire to write this series of posts, discussing this subject and trying both to inform and to advocate, in a way that hopefully distinguishes between the two.

This first post will be mostly a discussion of facts and terminology. A lot of people I know, especially my less technically-minded friends and colleagues, are not really aware of the state of modern cryptography, and the subject is woefully misconstrued in the popular press and often in our political system. People cannot have a meaningful discussion about this important topic when so many of us operate from a position of relative ignorance on encryption. Hopefully, I can do a little to change that.

My second post will finalize some technical distinctions, and will then offer some arguments about how we respond to encryption in our modern lives.

I’m going to explicitly avoid discussing the particulars of the Apple case, as it is a little different from some of the broader items that I’ll discuss below. I’ll see about possibly doing a follow-up on it at some point; in the meantime, there are several pieces on the case that I’ve found educational.

A caveat: I am distinctly not an expert in the field of cryptography. I have a fair working understanding of some of the basic principles, and I’ve explained them several times to doctors and scientists who needed to protect patient data. I use several different pieces of cryptographic software in my day-to-day life, and I enjoy messing around with these things. But I am not the be-all and end-all of this field. I can certainly point you in the direction of actual experts, if anyone is interested.

What is a computer?

Let’s get the very basics out of the way first: at its core, a computer is a machine that does relatively simple math.

In the modern era, this can be hard to remember. Most of us interact with computers on a daily basis – be it a smartphone, a laptop, or a tablet. Using a computer doesn’t really feel like math. You log on and you type, you “Like” photos on Facebook, you send texts to your friends, and you might even play a video game. None of this feels like doing math (unless maybe you are playing Number Munchers). There’s actually a term for this: abstraction. Abstraction is why you can do things on the computer without thinking of all the math that is supporting it. (Note that I’m using abstraction in a very broad sense here; in the practice of computer science, it is a much more narrowly defined term.)

Abstraction is a wonderful thing, and it’s a foundational part of the modern computing landscape. Abstraction is why everyone can learn to write simple programs, for instance, and why the modern information economy is feasible. Abstraction is why I can use a computer to perform complex statistical analysis without needing a lifetime of education in electrical engineering. Because of abstraction, you don’t need to be a mathematician to use a computer. (Fun fact: “computer” used to be a job title for a person who computed things, which does make a certain sort of sense.)

So, why does this matter? Well, it matters because everything we do on the computer is really a mathematical operation happening somewhere on a physical machine. This includes encrypting and securing information, which is the subject studied in cryptography.
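
To make that concrete, here is a toy sketch in Python (my illustration, not anything from a real system): a Caesar shift, one of the oldest ciphers, which is nothing more than modular arithmetic on letters. It is trivially breakable, and shown only to demonstrate that encryption is math.

```python
def caesar(text, shift):
    """Shift each letter by `shift` positions, wrapping around the alphabet."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            # Each letter becomes a number, is shifted mod 26, and
            # becomes a letter again: pure modular arithmetic.
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)
    return ''.join(result)

secret = caesar("Attack at dawn", 3)
print(secret)               # prints "Dwwdfn dw gdzq"
print(caesar(secret, -3))   # prints "Attack at dawn"
```

Breaking this particular cipher takes at most 26 guesses, which is exactly why modern systems rest their security on enormous key spaces rather than on clever-looking transformations.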

What is cryptography?

Cryptography is about keeping secrets. Specifically, cryptography is about communicating with someone else, while also avoiding having your message revealed to a third party. It’s a very old field – people have been trying to keep secrets for a long time, after all – but the field has advanced tremendously since around World War II. In fact, cryptanalysis played a distinctive role in World War II, and the cracking of the Enigma machine was a key part of the Allied victory in Europe.

I titled this post “The Morality of Mathematics” to draw attention to how we, as humans, tend to project our concepts of morality and ethics onto mathematical constructs, often without recognizing that such a projection may be suspect. Nowhere is this more true than in cryptography (though a close second is probably statistics).

Cryptography has a single, elegant division: the adversary and you. You are trying to share your message privately with someone else. The adversary is trying to gain access to your message. There is no greater framework in the mathematical sense. There is merely the adversary and you.

A cryptographic algorithm does not know if the adversary is the “good guys” or the “bad guys”. It doesn’t know if the secrets you are keeping are “evil” or “just”. It has no concept of privacy, no moral imperative, and no ethical constraints. It is a mathematical equation built off the concept of the adversary and the secret. Just as a second derivative or a standard deviation has no inherent moral quality, neither does any encryption system. The morality of mathematics is not the morality of humans.

This is not instinctive to us. We can all see that there are places where someone keeps a secret to do harm to others, and we can all see places where we think the government is justified in trying to find out those secrets. As such, it is easy to be seduced by the logic that states that “We want encryption to protect the good guys, but not the bad guys.” This is an understandable point of view, but unfortunately, encryption just does not work that way. There is no mathematical construct for “the good guys”, or for “honesty”. There’s only the adversary. Building a system that keeps out the adversary means that you keep out the adversary, whether that adversary is the justified government conducting an investigation, or a foreign state trying to steal military secrets, or a hacker trying to steal your identity. The adversary is merely the person who is trying to learn your secrets. It has no greater meaning than that.

Open secrets

Among those who are not knowledgeable in the field of cryptography, it is not well-understood that the field is heavily reliant on open source code and algorithms. Again, our native instincts make sense, but lead us in the wrong direction. When trying to keep secrets, it seems natural that you would want to disguise your methods, in order to prevent your adversary from devising a way to break your encryption. This concept, known as security through obscurity, was once commonplace, but is not considered a good practice any more.

The competing concept is best stated through Kerckhoffs’ principle:

The cipher method must not be required to be secret, and it must be able to fall into the hands of the enemy without inconvenience.

All modern cryptography is built on Kerckhoffs’ principle, and essentially all leading modern cryptographers endorse it. In fact, there exist high-quality, completely free (as in both beer and speech) cryptography systems that anyone with a modicum of technical ability can use. I make use of two of them regularly: OpenSSH and GnuPG. (Some might claim that calling OpenSSH an encryption or cryptography system is a little misleading – I concede that it is not exactly what people would consider an “encryption program”, but counter that it runs on the same fundamental mathematical principles that are discussed above.)
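
To illustrate Kerckhoffs’ principle in miniature, here is a hypothetical toy stream cipher in Python – entirely my own sketch, not how GnuPG or OpenSSH actually work, and absolutely not safe for real use. The algorithm is published in full; whatever security it has rests entirely in the secret key.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a stream of pseudo-random bytes from the key (toy construction)."""
    out = b''
    counter = 0
    while len(out) < length:
        # Hash the key together with a counter to stretch it into a stream.
        out += hashlib.sha256(key + counter.to_bytes(8, 'big')).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, message: bytes) -> bytes:
    """XOR the message with a key-derived stream; the same call decrypts."""
    return bytes(m ^ k for m, k in zip(message, keystream(key, len(message))))

ciphertext = encrypt(b'my secret key', b'meet me at noon')
print(ciphertext.hex())                       # unreadable without the key
print(encrypt(b'my secret key', ciphertext))  # recovers b'meet me at noon'
```

Publishing this algorithm costs nothing: an adversary who reads every line of the code still cannot decrypt the message without the key. That is the heart of Kerckhoffs’ principle.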

Why discuss this idea of “open secrets” at all? Well, it is relevant to understanding why requests to add “backdoors” – that is, deliberate flaws – to cryptography systems are, in fact, dangerous. Eric S. Raymond wrote a seminal piece on open source software engineering, called The Cathedral and the Bazaar. In it, he coined what he called Linus’s law, after Linux kernel creator Linus Torvalds:

Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone.

The shorter, more pithy version is “Given enough eyeballs, all bugs are shallow.” So, how do we apply Linus’s law to cryptography? We’ll get into that more next time.

High quality encryption

Before we part today, I want to touch on the fact that home users have access to extremely high quality encryption. The notion that you can “just hack the encryption” has been popularized by media and television portrayals of government agents and wizard hackers breaking into systems with great ease. The reality is far different. In practice, a properly implemented modern encryption system is functionally unbreakable with modern technology. Now, that isn’t to say that there are no attack vectors, or no ways in which encryption can ever be broken. That is, after all, the point of the field of cryptanalysis.

For many purposes, however, you can be very secure while using open source tools, like the aforementioned GnuPG. A properly encrypted file, using a secure private key and password, is effectively impossible to unlock using a brute-force attack. (I’ll leave aside larger discussions of quantum computing, information-theoretic secure encryption, and one-time pads, as that gets us far afield of the main question.)
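
Some quick back-of-the-envelope arithmetic shows why brute force fails. Assume – my numbers, purely illustrative – a 128-bit key, typical for modern symmetric ciphers, and an adversary testing one trillion keys per second:

```python
# Illustrative brute-force arithmetic: a 128-bit key space against an
# attacker making 10^12 guesses per second (both assumed figures).
keys = 2 ** 128                        # number of possible keys
guesses_per_second = 10 ** 12          # a very generous attacker
seconds = keys / guesses_per_second    # time to try every key
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:.2e} years")            # prints "1.08e+19 years"
```

For comparison, the universe is on the order of 1.4 × 10^10 years old, so even this absurdly fast attacker cannot come anywhere close to enumerating the key space.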

The takeaway here is that very high quality encryption is available to the everyday user at essentially zero cost. Once you realize how easy it is for the average user to obtain such encryption – generally considered on par with what the government uses – the discussion about encryption suddenly becomes very different. We’ll touch more on that in the next post!
