On March 22, 2017, a tragic terrorist attack took place on Westminster Bridge in London. Khalid Masood drove a car into pedestrians on the bridge, killing three people and injuring some 50 more. Masood then fatally stabbed a police officer before being shot dead by police. It all happened in 82 seconds. Authorities believe that Masood used his smartphone to connect to the end-to-end encrypted messaging app WhatsApp two minutes before the attack. In the wake of this tragedy, the British government is calling on Apple, Facebook (owner of WhatsApp), and other technology firms to provide intelligence services with hacker tools to decrypt such messages.
Denying Terrorists a Place to Hide by Providing Hacker Tools
Amber Rudd, Home Secretary for the United Kingdom, spoke recently to BBC One’s Andrew Marr Show. Rudd argued that the unbreakable encryption utilized by messaging apps like WhatsApp is “completely unacceptable, there should be no place for terrorists to hide.”
Rudd reminisced about earlier times, before encrypted messaging technology, when there was no need for hacker tools. “It used to be that people would steam open envelopes or just listen in on phones when they wanted to find out what people were doing,” Rudd said. That’s no longer the case, and Rudd argued that intelligence services need “the ability to get into situations like encrypted WhatsApp.”
What Does Apple Have to Do With All of This?
At the heart of Rudd’s argument is encrypted messaging, so why drag Apple into it? Quite simply, because Cupertino has previously argued that it would be a mistake for governments to force Apple to build any sort of “back door” into its products. In 2016, in the wake of the shooting in San Bernardino, California, the United States government attempted to compel Apple to develop a means of bypassing the security on the terrorist’s iPhone 5c. Apple refused for a variety of reasons, pointing out that “in the wrong hands, this software – which does not exist today – would have the potential to unlock any iPhone in someone’s physical possession.”
Oh, But This Is Something ‘Completely Different’
Home Secretary Rudd isn’t buying Cupertino’s argument. “I would ask Tim Cook to think again about other ways of helping us work out how we can get into the situations like WhatsApp on the Apple phone,” she said.
Rudd claims that, if Apple were to provide intelligence services with the means to bypass security and encryption, they would use it only through “the carefully thought through, legally covered arrangement.”
Sorry, I’m Not Buying It
As previously pointed out, Cupertino’s stance on this is pretty clear. Apple’s users have a right to their privacy, and CEO Tim Cook has expressed his doubts about the government’s ability to limit the use of any back doors or other hacker tools. In his open letter outlining Apple’s response to the San Bernardino iPhone case, Cook wrote that “while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”
Truer words were never spoken. After all, we’ve recently heard about the Central Intelligence Agency losing control of a large cache of hacking tools, which ended up released through WikiLeaks. These days, intelligence services are about as leak-proof as a sieve, so I sure wouldn’t want them holding a tool that could bypass my own security. Let’s take things a step further, though …
Let’s assume for a moment that Apple and Facebook were persuaded that they had an obligation to provide such hacking tools to intelligence services. Would such cooperation necessarily end with the United States and the United Kingdom? What if Iran insisted it was entitled to the same tools? Or Russia, China, or North Korea? I wonder whether Home Secretary Rudd would feel so good about our adversaries having the means to bypass the security of mobile devices and encrypted messaging apps.
No, the technology sector should not be obliged to help governments bypass security measures. If companies such as Apple or Facebook were to build hacker tools that helped “the good guys” break encryption, those tools would inevitably end up in the hands of “the bad guys,” too. That’s not a risk I’m willing to take, and I hope Apple and Facebook maintain their stance and refuse to take it, either.