Apple and the FBI face off in court
March 2, 2016
In the last few weeks, the nation has been watching closely as the Federal Bureau of Investigation and tech giant Apple go head-to-head in a battle between national security and personal privacy. The standoff stems from the tragic San Bernardino shooting in December of 2015, which left 14 people dead and 22 injured. Since then, law enforcement has been attempting to piece together the details of the massacre, one that President Obama called an act of terrorism, and the investigation has sparked heated debates over government overreach and privacy rights.
As part of its investigation, the FBI obtained one of the shooter’s iPhones, which turned out to be locked with a PIN code, making it impossible to retrieve any data from the device. The FBI therefore requested that Apple help unlock the phone so that its contents could be used in the investigation.
In response, Apple CEO Tim Cook posted an open letter on the company’s website warning iPhone owners that the government is trying to force Apple to override its own encryption, or build a so-called “backdoor” that could easily fall into the wrong hands, and that Apple would not comply, in defense of encryption and digital privacy. Unfortunately, the immense media coverage has blurred the facts. What is the FBI really asking for? And can it legally compel Apple to comply? The first question is more easily answered than the second.
Put simply, older iPhones, those running software released before iOS 8, were designed in a way that lets Apple bypass the PIN code lock and access the phone’s data. In fact, Apple has handed over data extracted this way several times in response to court orders.
However, the shooter’s phone runs the latest software, iOS 9, which Apple designed so that even the company itself cannot bypass the passcode to access encrypted data. This was done by implementing two security features: first, PINs cannot be entered in rapid succession, because a wait time is imposed after each wrong entry; and second, if the wrong PIN is entered ten times, all the data on the phone is destroyed. The FBI is asking Apple to create a modified version of the software that removes these two features, which would allow law enforcement to “brute force” the device: having a computer rapidly try all the possible combinations until it guesses the right one.
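To see why those two features matter, here is a toy sketch of the brute-force loop the FBI has in mind. Everything in it is hypothetical for illustration: `check_pin` stands in for the device’s lock screen, which on a real iPhone enforces the escalating delays and optional ten-attempt wipe described above; exactly the protections this loop cannot survive.

```python
import itertools

def brute_force(check_pin, digits=4):
    """Try every possible PIN in order until one succeeds.

    check_pin is a hypothetical callback standing in for the device's
    lock screen. With no delay between attempts and no wipe after ten
    failures, all 10,000 four-digit codes can be tried almost instantly;
    with escalating wait times, the same loop would take hours or days,
    and a ten-attempt wipe stops it cold.
    """
    for attempt in itertools.product("0123456789", repeat=digits):
        guess = "".join(attempt)
        if check_pin(guess):
            return guess
    return None  # exhausted every combination

# Demo against a stand-in "device" with no rate limiting:
SECRET = "7294"
found = brute_force(lambda pin: pin == SECRET)
print(found)  # prints 7294
```

The asymmetry is the whole point: a four-digit PIN has only 10,000 possibilities, so the secret is not the code’s complexity but the software’s refusal to accept rapid guesses. Removing that refusal is precisely what the court order asks Apple to do.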
Many see this as a violation of the Fourth Amendment and as a severe threat to digital privacy. However, the constitutional argument is moot here: the iPhone was the suspect’s work phone, which means it technically belongs to his employer, who consented to the search, according to NPR. The privacy issue is more complicated.
The claim that Apple is primarily concerned with keeping private user data from law enforcement is beside the point, because the company already complies with court orders to hand over iCloud data, as does practically every major file-hosting service. Moreover, Apple could install this modified software on the iPhone fairly easily. The FBI isn’t even asking for the software itself; it says Apple can keep it in its own possession, and can even make it compatible with only this one specific phone, keeping it away from hackers and cybercriminals. Even so, there is reason to be cautious: complying would set a very dangerous precedent.
If Apple complies with this request to actively reprogram its software in a way that undermines its own encryption, it is almost certain that law enforcement will keep asking for more modifications and backdoors whenever it pleases. The more of these security vulnerabilities exist, the more likely it is that one will leak and be abused. This is bad for business, bad for security, and bad for privacy.