Remember the first commercial for the Mac? Called “1984,” it was directed by Ridley Scott and starred the English athlete Anya Major. The concept, ‘saving humanity from conformity’, alluded to George Orwell’s novel of the same name.
This week, Apple CEO Tim Cook stoked the embers of the company's erstwhile image by refusing FBI demands to unlock an iPhone linked to the recent San Bernardino terrorist attack; the Bureau wants to find out who Syed Rizwan Farook might have talked to. Apple's refusal has all the earmarks of a tech company playing the hero: safeguarding the privacy of zillions of customers in the face of a federal law enforcement agency acting under the guise of national security.
“We have great respect for the professionals at the FBI,” Cook said, “and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the US government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”
“Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”
Okay. I get it. Except… Apple has cooperated with law enforcement in the past. The Daily Beast reports that the tech giant has unlocked phones at least 70 times since 2008. And while the technical assistance needed this time is more complex, it's fair to question whether Cook's position is genuine or just a PR ploy designed to please and/or appease the public.
Perhaps Apple has no real intention of refusing the FBI and is striking a rebellious public posture to win stronger restrictions on how the Feds may use the technology it will ultimately hand over. Or maybe the company is making its stand so that when the government forces its hand, as it inevitably will, Apple will look like the white hats against the government's sinister black hats. Or maybe I'm wrong and Apple really is that woman running through the crowd of gray uniforms.
The confrontation deepens
Whatever Apple’s motivations, within twenty-four hours the embers Cook stoked have become a roaring fire. Privacy advocates and tech companies, including Google, Microsoft, and, in a roundabout way, Amazon (via an editorial in The Washington Post), are showing their support for Apple. On the other side, a host of current and former CIA and FBI officers and other intelligence officials have been quick to denounce the company.
The FBI wants Apple to disable the iPhone feature designed to wipe data after ten unsuccessful password attempts, so that the FBI can test billions of passwords without any data being deleted. But Apple says it can’t unlock its newer iPhones because they don’t store decryption keys. Only the user, or someone else who knows the password, can unlock the device. Tech experts concur, but US Magistrate Judge Sheri Pym countered by saying Apple can ‘write software’ to bypass the feature.
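To see why that auto-wipe feature matters so much, a little arithmetic helps. The sketch below is my own back-of-the-envelope illustration, not a description of the FBI's actual tooling; the roughly 80 millisecond cost per passcode guess is an assumption drawn from Apple's published description of how slowly its hardware-entangled key derivation allows guesses to be tried on the device.

```python
# Rough worst-case time to brute-force an iPhone numeric passcode IF the
# ten-attempt wipe and escalating lockout delays were removed.
# The ~80 ms per guess is an assumption based on Apple's published security
# documentation, not a measurement of the phone in this case.
SECONDS_PER_GUESS = 0.08

def worst_case(keyspace: int) -> str:
    seconds = keyspace * SECONDS_PER_GUESS
    if seconds < 3600:
        return f"about {seconds / 60:.0f} minutes"
    if seconds < 86_400:
        return f"about {seconds / 3600:.0f} hours"
    return f"about {seconds / 86_400:.0f} days"

print("4-digit passcode:", worst_case(10 ** 4))  # roughly 13 minutes
print("6-digit passcode:", worst_case(10 ** 6))  # roughly 22 hours
```

In other words, once the guardrails are gone, a four-digit passcode falls in minutes and a six-digit one within a day. That is precisely why the Bureau wants the wipe disabled, and precisely why Apple likens the requested software to a master key.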
FBI Supervisory Special Agent Christopher Pluhar already has much of Farook’s data that was backed up to the cloud, and the agency has confirmed that Farook was communicating with suspect individuals. But Pluhar also thinks Farook may have intentionally disabled the backup feature, and that there might still be “relevant, critical communications and data” on the phone from the days and hours just before the attack.
But former National Counterterrorism Center director Matt Olsen says the government has alternative ways to get the data it wants; in his opinion there’s no need for a dangerous ‘backdoor’ into the iPhone. Still, he thinks this is a special case. Then again, he’s also a former general counsel at the NSA…
Kevin Bankston, the director of New America’s Open Technology Institute, thinks it’s crazy for any business to do what the Feds are asking of Apple: “custom-build malware to undermine its own product’s security features.” He’s also unsure whether it’s even technically possible. Nate Cardozo, a staff attorney with the Electronic Frontier Foundation, believes the government already has what it needs to unlock the phone. “They chose this case because they want precedent that they can order a company to design a particular feature at their whim,” he said, noting that no court has ever approved an order this broad.
The FBI responds that this is about one phone belonging to one man, not a movement, and not a technology that will ever be publicly revealed.
At the moment, it’s a Mexican standoff, but the confrontation is bound to deepen in court over time. I suppose we shouldn’t be surprised. The debate about privacy and security has never been resolved, and quite possibly never will be. And every event that reignites the debate seems to raise the stakes, like a lit match thrown on a bale of straw.
Repercussions
If Apple gives in to the FBI, it could affect much more than the iPhone. Governments, hackers, and foreign spies might end up demanding or exploiting the same kind of backdoor in every piece of software and every gadget we use. (Although it’s unclear how so many hackers would get their hands on it.)
Even if Apple caves, moreover, cracking a password that contains more than just a few numbers and letters could be a huge challenge. It could easily take a decade to crack a strong password, with results unavailable until 2026 (although other tech experts disagree). Still, is it realistic, useful, worth the investment, or, most of all, worth sacrificing our personal privacy?
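For a sense of scale behind that decade estimate, extend the same assumed per-guess cost from the earlier sketch to longer, mixed-character passwords. Again, this is illustrative arithmetic under an assumed guess rate, not a claim about any particular device or tool.

```python
# Same assumed ~80 ms per guess as the earlier sketch; now vary the
# password alphabet and length. Worst case = trying every possibility once.
SECONDS_PER_GUESS = 0.08

def years_to_exhaust(alphabet_size: int, length: int) -> float:
    return (alphabet_size ** length) * SECONDS_PER_GUESS / (3600 * 24 * 365)

print(f"6 chars, lowercase + digits:  {years_to_exhaust(36, 6):,.1f} years")   # ~5.5
print(f"8 chars, lowercase + digits:  {years_to_exhaust(36, 8):,.0f} years")   # ~7,000
print(f"8 chars, mixed case + digits: {years_to_exhaust(62, 8):,.0f} years")   # ~550,000
```

Each extra character, and each richer alphabet, multiplies the search space, which is why a genuinely strong password can stay out of reach for years even if the guardrails come off.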
In Kevin Bankston’s words, “If this precedent gets set it will spell digital disaster for the trustworthiness of everyone’s computers and mobile phones.” Ahmed Ghappour, professor at the University of California’s Hastings College of the Law, rounds the argument off with a warning. “If the writ can compel Apple to write customized software to unlock a phone, where does it end? Can the government use it to compel Facebook to customize an algorithm that predicts crime? It’s not clear where the line will be drawn, if at all.”
Yes, Jump Cut
Conveniently or maybe not, the debate between privacy and security is at the heart of my upcoming thriller, Jump Cut, which comes out March 1, and it remains unresolved at the end of the book. But we may get some insight into the Bureau’s position soon. As many of you know, I host a monthly radio show, and I have invited an FBI official to talk about some of the issues raised in Jump Cut, including corporate espionage and encryption. Deputy Assistant Director Robert Jones of FBI Counterintelligence will join me on Second Sunday Crime at 6 PM CST on Sunday, March 13. It’s bound to be a fascinating discussion, and with such a timely subject on our hands, I might move the air date up a week.
I’ll let you know here and elsewhere on the internet exactly when and where it will air. Given the events of the past week, I can’t wait. I hope you can’t either.