As London reels in the aftermath of yet another horrific terrorist attack, the U.K. government has reignited the debate on the use of encryption in mobile messaging services.
The attack in Westminster last Wednesday left four people dead and many more injured, and reports have since surfaced that the perpetrator, British man Khalid Masood, was using WhatsApp minutes before he mowed down pedestrians on Westminster Bridge and fatally stabbed a policeman. However, police have so far indicated that Masood was a so-called “lone wolf” killer, and nothing yet suggests that WhatsApp played any direct part in the attack: all we know is that Masood had checked his WhatsApp account shortly beforehand, according to a screenshot taken by the Daily Mail.
During an interview with the BBC earlier today, Home Secretary Amber Rudd, who’s responsible for internal affairs within England and Wales, said: “There should be no place for terrorists to hide, we need to make sure that organizations like WhatsApp, and there are plenty of others like that, don’t provide a secret place for terrorists to communicate with each other.”
Though there are countless messaging services out there, including Telegram, which has also previously found itself at the center of the terrorism debate, Facebook-owned WhatsApp is one of the world’s most popular, with well over one billion monthly active users. WhatsApp activated end-to-end encryption by default last April, and with governments around the world looking for ways to combat the perceived growing threat of terrorism, technology companies are facing mounting pressure to create some kind of backdoor access into their private communication services.
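End-to-end encryption of this kind means the keys that unlock messages exist only on users’ devices, which is why a service operator cannot simply hand plaintext to investigators. As a toy illustration only (this is not WhatsApp’s actual protocol, which is based on the Signal protocol, and the parameters here are chosen for brevity, not security), a Diffie-Hellman key agreement lets two people derive the same secret key while the server relaying their public values never learns it:

```python
import hashlib
import secrets

# Public parameters, known to everyone including the server.
# (A Mersenne prime is used here purely to keep the example short.)
P = 2**127 - 1
G = 3

# Each party generates a private value that never leaves their device.
alice_secret = secrets.randbelow(P - 2) + 2
bob_secret = secrets.randbelow(P - 2) + 2

# Only these public values travel through the messaging server.
alice_public = pow(G, alice_secret, P)
bob_public = pow(G, bob_secret, P)

# Each side combines the other's public value with its own private
# value; both arrive at the same shared secret, which the server,
# seeing only the public values, cannot reconstruct.
alice_key = hashlib.sha256(str(pow(bob_public, alice_secret, P)).encode()).digest()
bob_key = hashlib.sha256(str(pow(alice_public, bob_secret, P)).encode()).digest()

assert alice_key == bob_key  # both devices now share a message key
```

This is the crux of the policy dilemma: because the shared key is derived on the endpoints, any mechanism that gave the provider (or a government) access would have to weaken or bypass this property for all users.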
“It used to be that people would steam open envelopes or listen in on phones if they wanted to find out what people were doing — legally, through warrants,” continued Rudd. “But in this situation, we need to make sure that our intelligence services have the ability to get into situations like the encrypted WhatsApp.”
It is true that governments have always sought inroads into private communications, and the age-old argument that “if you have nothing to hide, you have nothing to fear” still holds some weight. But any security compromise created for the so-called “good guys” can equally be abused by the bad guys, and that is one part of the debate that will never go away. Last year, Apple refused to comply with a court order stipulating that it must help the FBI break into an iPhone belonging to one of the San Bernardino killers, arguing that doing so would set a dangerous precedent.
“The U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create,” said Apple CEO Tim Cook, at the time. “They have asked us to build a backdoor to the iPhone. Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.”
During her interview with the BBC’s Andrew Marr, Rudd was quizzed on the parallels between Apple’s predicament and that of other technology companies, including WhatsApp.
“So do you think U.S. and U.K. governments have to take on the big internet companies and force them to open their devices?” asked Marr.
“If I was talking to Tim Cook, I would say to him that this is something completely different,” replied Rudd. “We’re not saying ‘open up,’ we don’t want to ‘go into the cloud,’ we don’t want to do all sorts of things like that. But we do want them to recognize that they have a responsibility to engage with governments, and engage with law enforcement agencies when there is a terrorist situation. We would do it all through the carefully thought-through, legally covered arrangements. But they cannot get away with saying we are a different situation.”
There are contradictions in such statements. Rudd suggests that the U.K. government isn’t asking technology companies to “open up,” but the only way to hand over the content of end-to-end encrypted messages is precisely that: to weaken or bypass the encryption itself. There is no getting away from it.
This latest debate comes as YouTube faces mounting pressure from advertisers over the placement of ads alongside controversial videos, with Google recently promising stricter policy enforcement and more control for brands. Facebook, Twitter, and Google have also faced lawsuits lately over their alleged roles in facilitating communications between and within terrorist organizations.