Apple versus the FBI: Lessons from my MBA class
The terrorist attacks and deaths in San Bernardino last year were tragic and deserve justice. In its effort to investigate the case, the FBI is pursuing every method it can to gather information about the culprits. Toward that end, the FBI has been working with Apple to unlock an iPhone owned by one of the terrorists. What started as a joint effort has degenerated into a stand-off between Apple and the FBI. The FBI has obtained a court order demanding that Apple create an alternative operating system for the iPhone, installable as an update, that opens a backdoor for the FBI to break into the phone. On the one hand, the FBI wants the tools necessary to complete its investigation. On the other, Apple sees such a tool as a fundamental security breach and refuses to comply. Why? Apple’s reasons may be difficult to see, but I believe two key lessons I teach my MBA students can help distill some of the complexity in this case and explain why I think Apple is not only correct but should be actively supported. First, software is an information good, which differs fundamentally from physical goods, especially when it comes to security. Second, software is a written opinion, with implications for copyright and free speech. Let me explain.
Why won’t Apple write the new operating system voluntarily? As CEO Tim Cook says, once the change is made, it fundamentally weakens the security of Apple’s phones. To understand why he might say this, consider what it means when I say software is an information good. As I mentioned in my last post on Apple’s fight for your privacy, information goods have characteristics fundamentally different from those of physical goods, such as the ability to make infinite copies at virtually no cost and to distribute those copies at virtually no cost. Furthermore, once the code is created, it isn’t used up by use the way food might be. This is why, in software development, security weaknesses are never purposefully created, not even for a one-time project. Seventy years of experience in software development have led to this principle of security. Inevitably, these one-off projects, and, just as dangerous, the mere knowledge that such a possibility exists, are enough for hackers to find the weakness, build it, duplicate it, and share it with all of their friends. This is why Tim Cook calls this request the software equivalent of cancer.
Furthermore, despite the FBI director’s claims to the contrary, this is not just a one-off case. That’s not how our legal system works. American law is founded on the common law principle of precedent, meaning that once a precedent is set, it is binding on future cases. Once the precedent has been set that a government agency can force a technology company to rewrite its code, agencies can do the same in other cases as well. As more agencies make similar requests (and they already seem to be champing at the bit to do so), more copies of the code will be distributed. Even if each and every one of those agencies promises to destroy its copy after use, it takes only one failure (whether by accident or intent) for the code to be released into the wild. And then the security of each and every iPhone in the world will be compromised. Even if we assume those government agencies are honest, hackers are not. Hackers who obtain the code could gain access to your private messages, your health information, and perhaps even your banking information.
We know data breaches are not uncommon; we hear about them every day. Just because a government agency promises to destroy the code doesn’t mean hackers can’t break into its systems, find the code, and use it first.
And even if you individually are careful never to use your iPhone (or any other smartphone) to store personal information, that is no reason to deny others the right to use their iPhones to store such information. Apple promised end-to-end encryption and strong security to help protect that data if your phone is ever stolen. Like software, your personal information is also an information good. These security features of the iPhone are a good thing.
It is for these reasons that I believe Apple refused to voluntarily write this code. The concept of information goods helps us understand why security in an operating system is so incredibly important for our personal information, and why creating security holes in an operating system is so fundamentally dangerous. Apple is right to refuse.
But can the government force Apple to do so? No, it shouldn’t. To understand why not, consider a second principle I teach my MBA students: software is an opinion. For users of software, this may not be immediately obvious. Often they see just the end product and assume that’s the way it has to be. It isn’t, and it doesn’t. During the development of a piece of software, programmers, designers, and systems analysts must make decisions about what to include, what to exclude, how to position things, what to call things, and a whole host of value judgments about what is important and what isn’t. These are opinions. That’s why companies offer many competing products. Do a search for “web development tools” or “project management software” and you’ll find hundreds of different software tools that offer slightly different ways to accomplish the same thing. Furthermore, these opinions are written in a computer language. While these languages may not look like common spoken languages, they exhibit the same rules of syntax and semantics, structuring the thoughts and instructions for how a piece of software should function. Because software is a written opinion, it is considered a creative work and is protected by copyright law. The U.S. has codified those principles in a number of laws, most recently the Digital Millennium Copyright Act.
So how does that apply to the Apple-FBI dispute? Let’s use another copyrighted object as an analogy: a book. Suppose an author decides to kill off a main character in a story (authors such as Shakespeare or George R. R. Martin certainly come to mind). Does the government, for whatever reason, have the right to force the author to rewrite the story so the main character stays alive? No! We as a nation protect free speech (whether we agree with the speech or not), so we can say, and write, what we want. We agree that forcing others to say something other than what they believe is wrong. In this case, Apple believes, for the reasons discussed above, that deep security should be integrated into its operating system. That is its opinion, and it is protected by copyright law. And copyright law rests on an even more fundamental principle: our right to free speech as defined in the Constitution. Apple and its software developers have a right to free speech and a right to write in the manner they see fit, including writing software. Forcing them to do otherwise would be forcing them to do something they fundamentally believe is wrong. It amounts to forcing them to write (and hence think) exactly as the government wants them to write (and think).
Certainly software is an opinion, but that gives neither the government nor the popular opinion of the nation the right to force Apple to change that opinion, and especially not to create something that violates it. This is not the same as restricting some speech in cases where lies might cause severe damage, like the classic case of yelling “Fire!” in a crowded theater when in fact there is no fire. Those limitations are negative in nature, referring to what you cannot do or say, not positive, forcing you to say something you don’t want to. Such a precedent would be extremely dangerous; dystopian novels such as George Orwell’s 1984 come to mind. The government should NOT be allowed to negate the principle of free speech, and hence should not force Apple to write an alternative operating system, even if only for one use.