
Apple CEO Tim Cook attends the inauguration of the academic year at the Bocconi University, in Milan, Italy, Tuesday, Nov. 10, 2015. AP / LUCA BRUNO

You, Apple, Terrorism and Law Enforcement

Yesterday’s court order compelling Apple to backdoor iOS could have effects long after the San Bernardino case.

The next battle in the privacy wars erupted with a bang Tuesday as a judge in California ruled that Apple must help the FBI bypass security features on an iPhone used by attackers in the San Bernardino shooting in December. Apple’s CEO Tim Cook responded with a letter saying that the government’s order presented a threat to data security and was tantamount to demanding that the company create a “backdoor” in its operating system.

The order does not oblige Apple to “break” its encryption, but the encrypted data on the phone is what the FBI is after. Specifically, the order asks Apple to help the government get around a feature that deletes the phone’s data after ten unsuccessful passcode attempts. Disabling that feature would allow law enforcement to try millions of possible passcodes and eventually reach the data.
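To see why that single feature matters, consider the arithmetic of a brute-force search. The sketch below is purely illustrative: the guess rate is an assumed figure, not Apple’s actual hardware-enforced delay.

```python
# Back-of-the-envelope sketch: why removing the ten-attempt wipe matters.
# The attempt rate (~12.5 guesses/sec, i.e. an assumed 80 ms delay per try)
# is an illustrative assumption, not a measured figure for any iPhone.

ATTEMPT_RATE = 12.5  # guesses per second (assumed)

def worst_case_hours(digits: int) -> float:
    """Hours needed to exhaust every numeric passcode of the given length."""
    keyspace = 10 ** digits
    return keyspace / ATTEMPT_RATE / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: {10**digits:,} combinations, "
          f"~{worst_case_hours(digits):.1f} hours worst case")
```

With the auto-wipe in place, an attacker gets only ten tries out of those thousands or millions of combinations; without it, even a modest guess rate exhausts a four-digit keyspace in well under a day.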

Update: On Friday, the Justice Department filed a motion to compel Apple to comply with the order.

How could the fight affect you, the technology industry, law enforcement, and terrorists and criminals? Here’s a look:

Consumers, Citizens, Privacy

In September 2014, Apple announced that its most recent operating system update, iOS 8, would encrypt user phone data. FBI Director James Comey appeared before the Brookings Institution at the time to spell out his concerns about what that meant.

“With Apple’s new operating system, the information stored on many iPhones and other Apple devices will be encrypted by default … What this means is that an order from a judge to monitor a suspect’s communication may amount to nothing more than a piece of paper.”

Thus the current debate about “backdoors” began. This case isn’t technically about encryption so much as a single security feature, but it has become a flashpoint in the broader debate over encryption and backdoors.

Can you build a “safe” backdoor? One that the FBI can use to look for terrorists but that criminals can’t use to look for your naked pictures?  

“I just believe that this is achievable,” NSA director Michael Rogers said at a New America event last year, to immediate guffaws.

In a July 2015 paper titled “Keys Under Doormats,” encryption expert and privacy advocate Bruce Schneier and his co-authors took a rather different position, arguing that what are sometimes called “exceptional access technologies,” developed for limited law enforcement use, routinely find their way into the wrong hands, as has reportedly happened in Italy, Greece and elsewhere.

One of the most remarkable examples was a 2010 incident in which Chinese hackers seized upon a backdoor feature that Google had built into Gmail in order to comply with U.S. search warrants.

“China’s hackers subverted the access system Google put in place to comply with U.S. intercept orders. Why does anyone think criminals won’t be able to use the same system to steal bank account and credit card information, use it to launch other attacks or turn it into a massive spam-sending network? Why does anyone think that only authorized law enforcement can mine collected Internet data or eavesdrop on phone and IM conversations?” Schneier wrote in 2010.

Will today’s efforts at exceptional access be any more successful? Cook’s letter is deeply skeptical.

Moreover, he said, it sets a dangerous precedent, specifically in its use of the All Writs Act of 1789. “If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.”


Cook’s letter yesterday states that “Building a version of iOS that bypasses security in this way would undeniably create a backdoor. … it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”

When Tim Cook says that Apple’s safety features put “data out of our own reach, because we believe the contents of your iPhone are none of our business,” he’s not just being moralistic.

Complying with the order could also hurt Apple’s sales. Consumers are increasingly concerned about the privacy of their devices; otherwise, why build privacy features at all?

Submitting to U.S. government efforts to access supposedly secure data could be particularly bad for Apple’s push into the emerging markets the company is looking toward. If you feel uneasy about the U.S. government being able to look at your data, imagine how you would feel in China. And Apple is looking abroad, particularly to China, for an enormous portion of its future revenue growth, especially for newer devices like the iPhone 6S, as Piper Jaffray analyst Gene Munster has pointed out.

In December, the Chinese legislature passed a controversial anti-terrorism law that has much in common with recent anti-privacy efforts in the United States. Chinese lawmakers floated a draft about a year ago that would have required technology companies and Internet service providers to give their encryption systems to the government. The final version stepped back from that but still requires technology companies to provide “technical means of support” to law enforcement and to decrypt their data themselves when faced with government orders to do so.

“This rule accords with the actual work need of fighting terrorism and is basically the same as what other major countries in the world do," Chinese parliament law division head Li Shouwei told reporters.

If Apple were to lose its (likely) appeal, change course, and break its safety features for the FBI, China would almost certainly expect it to do the same or be shut out of a vital market. But the decision to destroy a potentially valuable safety feature would also be bad for business, and that fear is probably greater than any fear of the new OS software falling into the wrong hands: the software could remain firmly in Apple’s control and still present the company with a huge problem. The story is the same throughout much of Silicon Valley, for a variety of products and services.

Terrorists and Criminals

Will compelling Apple to build a new operating system to defeat its own security features make it impossible for terrorists to communicate covertly by electronic means? Hardly, according to most privacy advocates and many encryption experts.

Earlier this month, Harvard’s Berkman Center released a worldwide survey of encryption products, a report authored in part by Schneier. It found some 587 entities, mostly companies, that provide encryption products to the public, some of them for free. About two-thirds are outside the United States, in places as diverse as Estonia, Brazil and Tanzania.

The survey counted some 546 foreign-produced encryption products, of which 104 provided message encryption, 35 provided voice encryption, and 61 provided virtual private networking. There were 47 file encryption products and 68 e-mail encryption products.

What that suggests is that no single court action would be sufficient to stop secret electronic messaging.

“Short of a form of government intervention in technology that appears contemplated by no one outside of the most despotic regimes, communication channels resistant to surveillance will always exist. This is especially true given the generative nature of the modern Internet, in which new services and software can be made available without centralized vetting,” the Berkman Center wrote this month.

However, even the privacy advocates behind the report acknowledge that end-to-end encryption and safety features like the one at issue here are making life a bit harder for the FBI. “We take the warnings of the FBI and others at face value: conducting certain types of surveillance has, to some extent, become more difficult in light of technological changes,” they wrote.

Law Enforcement

Safety features like the one in the court order are a big problem as the FBI sees it. The 2015 FBI budget allocated $31 million for research into backdoor creation. The 2016 budget calls for more than double that: $69.3 million.

In October 2015, Comey said that “dozens” of terror suspects had used end-to-end encryption to hide from law enforcement. But make no mistake: Comey doesn’t want new tools just to chase jihadists. On Feb. 9, he told the Senate Intelligence Committee that “going dark” was “a problem I hear about all over the country from my partners in state and local law enforcement.”

If the order stands and Apple, compelled by a court, builds an operating system that it can install on the iPhone 5c used by Syed Rizwan Farook, enabling the FBI to mount a brute-force attack and access the data, then similar orders could follow from courts around the country.