Apple’s Fight for Your Privacy
Recently, Apple announced that it is publicly fighting a court order to help the FBI decrypt a phone used in the San Bernardino terrorist attacks. From my understanding, the court is ordering Apple to rewrite its operating system (iOS) to remove certain safeguards on the phone so that the FBI can brute-force its way in and gain access to whatever information might still be on it. While Apple acknowledges that this is technically possible, they argue, and rightly so, that doing so would irrevocably damage not only Apple’s reputation but key principles of security and privacy. It is on this last topic – privacy – that I want to comment.
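To see why those safeguards matter, consider some back-of-the-envelope arithmetic. This is a simplified sketch, not Apple’s actual implementation: the per-attempt cost below is an assumption for illustration, and the real device adds escalating delays and an optional wipe after ten failed attempts.

```python
# Illustrative arithmetic: why removing the attempt safeguards matters.
# The 80 ms per-attempt figure is an assumed hardware key-derivation cost,
# used here only for illustration.

def brute_force_hours(digits, seconds_per_attempt):
    """Worst-case time to try every numeric passcode of a given length."""
    attempts = 10 ** digits
    return attempts * seconds_per_attempt / 3600

# With the safeguards removed, attempts are limited only by hardware speed.
fast = brute_force_hours(4, 0.08)
print(f"4-digit passcode, no delays: {fast:.2f} hours")  # ~0.22 hours (about 13 minutes)

slow = brute_force_hours(6, 0.08)
print(f"6-digit passcode, no delays: {slow:.1f} hours")  # ~22 hours, still under a day
```

With the safeguards in place, only a handful of guesses are practical before the device locks or erases itself; without them, even a six-digit passcode falls in about a day.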
As I stated in my article on privacy: “Privacy exists to the extent that someone desires a state where specific personal information is not disclosed to whomever should not access it. These preferences do not guarantee that the information will be kept out of view, but the protection of those preferences can come from any number of sources such as social norms, laws, or security.” While this was stated in negative terms, privacy protection enables a number of values, including the freedom to act without observation (such as when I share an intimate moment with my wife or practice the piano), the ability to work without interruption (so I close my office door to work on my research), and protection against theft or deception (that’s why I won’t publish what or where I keep my valuables). To protect these values, laws around property and liberty have developed over the years that help guarantee our ability to maintain our privacy when we wish.
It is important to note that privacy desires are a very personal thing, varying greatly between individuals and even between contexts for the same individual. For example, just because a lady poses naked for an art class does not mean she is willing to pose naked at any other time. She may rightly be offended by a peeping tom spying into her bedroom. The same is true of information on a smartphone. Just because one person has nothing to hide on their iPhone at this point in time doesn’t mean they won’t in the future. And it certainly doesn’t mean that others want their information accessed. As a principle, privacy should be respected in all such cases until the person chooses otherwise. That’s how privacy works.
Privacy, however, is not an unconditional right, as law professor Amy Peikoff also argues, though for different reasons. This becomes evident in legal cases where law enforcement desires access to a suspect’s property against his or her wishes. We generally agree that someone who breaks the law loses some of their rights, including some of those that protect their privacy. In such cases, our society recognizes that law enforcement may seek a search warrant to access that property. The search warrant helps prevent law enforcement from abusing its power by requiring a judge to review the evidence suggesting illegal activity. This limitation is important because the potential for police abuse has enabled a number of tyrannical governments to perpetuate their power. Indeed, police abuse and tyranny largely go hand in hand.
Search warrants can also require that businesses assist police in accessing property. If a suspect rents a locker from a business, the police can require the landlord to open the locker with a search warrant. It is this principle that draws Apple into the problem described above.
There is a temptation to look at just half the context and note that Apple wrote the OS, so they own the OS and can do what they want with it. On this view, their choice not to help the government gain access is akin to refusing access to a locked building when handed a search warrant. According to the government’s proponents, Apple is obstructing justice because it won’t provide access to the locked device. But this is not what the government is asking. It is not asking Apple to unlock the door. It is asking Apple to modify the design of the lock so that the FBI can break it open themselves. This may seem like a trivial difference, but it is not, because of the nature of software.
The physical lock analogy is limited because the nature of software is fundamentally different. Software is an information good, meaning it can be replicated and distributed endlessly at essentially no cost. Apple’s contention is that once a backdoor (a modification to the design that allows access in generally unknown ways) exists, it would eventually be found by hackers, shared globally, and used to bypass the encryption safeguards in place. This is a basic principle of information security. Software companies have learned, over some sixty years of being burned by it, that knowledge of any such backdoor eventually falls into the wrong hands. That’s why best practice in security says you should NEVER share your password, even with someone you trust.
When Apple designed the most recent version of iOS, it built encryption deep within the system to protect it from any type of hack. To voluntarily bypass those security safeguards would fundamentally contradict the purpose of security. Asking Apple to do so is asking it to make its software less secure, and its customers’ information less secure and less private. Once a backdoor is created, it will inevitably become known and exploited, destroying our ability to protect our privacy should we want to. This is what Apple is fighting for. They are fighting for our privacy. And this is why I support their efforts.