An international coalition of civil society organizations, security and policy experts and tech companies — including Apple, Google, Microsoft and WhatsApp — has penned a critical slap-down to a surveillance proposal made last year by the UK’s intelligence agency, warning it would undermine trust and security and threaten fundamental rights.
“The GCHQ’s ghost protocol creates serious threats to digital security: if implemented, it will undermine the authentication process that enables users to verify that they are communicating with the right people, introduce potential unintentional vulnerabilities, and increase risks that communications systems could be abused or misused,” they write.
“These cybersecurity risks mean that users cannot trust that their communications are secure, as users would no longer be able to trust that they know who is on the other end of their communications, thereby posing threats to fundamental human rights, including privacy and free expression. Further, systems would be subject to new potential vulnerabilities and risks of abuse.”
GCHQ’s idea for a so-called ‘ghost protocol’ would be for state intelligence or law enforcement agencies to be invisibly CC’d by service providers into encrypted communications — on what’s billed as a targeted, government-authorized basis.
The agency set out the idea in an article published last fall on the Lawfare blog, written by the National Cyber Security Centre’s (NCSC) Ian Levy and GCHQ’s Crispin Robinson (NB: the NCSC is a public-facing branch of GCHQ) — which they said was intended to open a discussion about the ‘going dark’ problem that robust encryption poses for security agencies.
The pair argued that such an “exceptional access mechanism” could be baked into encrypted platforms to enable end-to-end encryption to be bypassed by state agencies, which could instruct the platform provider to add them as a silent listener to eavesdrop on a conversation — but without the encryption protocol itself being compromised.
“It’s relatively easy for a service provider to silently add a law enforcement participant to a group chat or call. The service provider usually controls the identity system and so really decides who’s who and which devices are involved — they’re usually involved in introducing the parties to a chat or call,” Levy and Robinson argued. “You end up with everything still being end-to-end encrypted, but there’s an extra ‘end’ on this particular communication. This sort of solution seems to be no more intrusive than the virtual crocodile clips that our democratically elected representatives and judiciary authorise today in traditional voice intercept solutions and certainly doesn’t give any government power they shouldn’t have.”
“We’re not talking about weakening encryption or defeating the end-to-end nature of the service. In a solution like this, we’re normally talking about suppressing a notification on a target’s device, and only on the device of the target and possibly those they communicate with. That’s a very different proposition to discuss and you don’t even have to touch the encryption.”
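The dynamic Levy and Robinson describe can be sketched in a few lines of toy code: because the provider controls the identity and membership system, clients encrypt to whoever the provider’s member list says is in the chat, and the provider can also decide which membership changes the UI surfaces. This is a minimal, purely illustrative sketch — all class and user names here are hypothetical, and real protocols (such as Signal’s) are far more involved:

```python
class Provider:
    """Toy server-side identity/membership service that clients trust."""

    def __init__(self):
        self.members = {}            # chat_id -> set of user ids
        self.suppressed = set()      # (chat_id, user) pairs hidden from UIs

    def add_member(self, chat_id, user, silent=False):
        self.members.setdefault(chat_id, set()).add(user)
        if silent:
            # The 'ghost': present in the real member list that clients
            # encrypt to, but the membership-change notice is never shown.
            self.suppressed.add((chat_id, user))

    def visible_members(self, chat_id):
        """What client UIs display — may differ from the real list."""
        return {u for u in self.members.get(chat_id, set())
                if (chat_id, u) not in self.suppressed}


def send_message(provider, chat_id, plaintext):
    # Clients encrypt one envelope per recipient on the provider's list;
    # 'encryption' is faked here by labeling each envelope.
    return {user: f"enc[{user}]:{plaintext}"
            for user in provider.members[chat_id]}


p = Provider()
p.add_member("chat1", "alice")
p.add_member("chat1", "bob")
p.add_member("chat1", "ghost-agent", silent=True)  # the invisible CC

envelopes = send_message(p, "chat1", "hi")
print(sorted(envelopes))                  # ['alice', 'bob', 'ghost-agent']
print(sorted(p.visible_members("chat1")))  # ['alice', 'bob']
```

Note that each envelope is still individually “end-to-end encrypted” — which is exactly the authors’ point about an extra ‘end’ — but the users’ view of who those ends are no longer matches reality, which is the trust problem the coalition objects to.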
“[M]ass-scale, commodity, end-to-end encrypted services… today pose one of the toughest challenges for targeted lawful access to data and an apparent dichotomy around security,” they added.
However, while encryption might technically remain intact in the scenario they sketch, their argument glosses over both the fact and the risks of bypassing encryption by tampering with authentication systems in order to enable deceptive third-party snooping.
As the coalition’s letter points out, doing that would both undermine user trust and inject extra complexity — with the risk of fresh vulnerabilities that could be exploited by hackers.
Compromising authentication would also give platforms themselves a mechanism they could use to snoop on users’ comms — thereby circumventing the wider privacy benefits provided by end-to-end encryption in the first place, perhaps especially when deployed on commercial messaging platforms.
So, in other words, just because what’s being asked for is not literally a backdoor in encryption doesn’t mean it isn’t similarly risky for security and privacy, and just as horrible for user trust and rights.
“Currently the overwhelming majority of users rely on their confidence in reputable providers to perform authentication functions and verify that the participants in a conversation are the people that they think they are, and only those people. The GCHQ’s ghost protocol completely undermines this trust relationship and the authentication process,” the coalition writes, also pointing out that authentication remains an active research area — and that work would likely dry up if the systems in question were suddenly made fundamentally untrustworthy on order of the state.
They further assert there’s no way for the security risk to be targeted to the individuals that state agencies want to specifically snoop on. Ergo, the added security risk is universal.
“The ghost protocol would introduce a security threat to all users of a targeted encrypted messaging application since the proposed changes could not be exposed only to a single target,” they warn. “In order for providers to be able to suppress notifications when a ghost user is added, messaging applications would need to rewrite the software that every user relies on. This means that any mistake made in the development of this new function could create an unintentional vulnerability that affects every single user of that application.”
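The universality the letter warns about can be illustrated with a hypothetical client-side sketch: to hide a ghost user, the notification-handling code that every user runs would need a server-honored suppression path, so any flaw in that one code path ships to the entire user base. The function and field names below are invented for illustration only:

```python
# Hypothetical client-side handler for group-membership updates.
# To support a 'ghost', the shared client code must gain a
# suppression branch — meaning clients no longer decide for
# themselves what to display, and a bug here affects every user.

def handle_membership_update(update, ui_events):
    """update: dict from the server, e.g. {"user": "carol", "suppress": False}."""
    if not update.get("suppress", False):
        ui_events.append(f"{update['user']} joined the chat")


events = []
handle_membership_update({"user": "carol", "suppress": False}, events)
handle_membership_update({"user": "ghost-agent", "suppress": True}, events)
print(events)  # ['carol joined the chat']
```

The point of the sketch is structural: the suppression logic cannot be deployed only to a single target’s app — it lives in the one codebase all users install, which is why the letter argues the risk cannot be contained to individual targets.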
There are more than 50 signatories to the letter in all, including civil society and privacy rights groups such as Human Rights Watch, Reporters Without Borders, Liberty, Privacy International and the EFF, as well as veteran security professionals such as Bruce Schneier, Philip Zimmermann and Jon Callas, and policy experts such as former FTC CTO and White House security advisor Ashkan Soltani.
While the letter welcomes other elements of the article penned by Levy and Robinson — which also set out a series of principles for defining a “minimum standard” governments should meet to have their requests accepted by companies in other countries (with the pair writing, for example, that “privacy and security protections are critical to public confidence” and “transparency is essential”) — it ends by urging GCHQ to abandon the ghost protocol idea altogether, and “avoid any alternative approaches that would similarly threaten digital security and human rights”.
Reached for a response to the coalition’s concerns, the NCSC sent us the following statement, attributed to Levy:
We welcome this response to our request for thoughts on exceptional access to data — for example to stop terrorists. The hypothetical proposal was always intended as a starting point for discussion.
It is pleasing to see support for the six principles and we welcome feedback on their practical application. We will continue to engage with interested parties and look forward to having an open discussion to reach the best solutions possible.
Back in 2016 the UK passed updated surveillance legislation that affords state agencies expansive powers to snoop on and hack into digital comms. And with such an intrusive regime in place it may seem odd that GCHQ is pushing for even greater powers to snoop on people’s digital chatter.
Even robust end-to-end encryption can include exploitable vulnerabilities. One bug was disclosed affecting WhatsApp just a couple of weeks ago, for example (since fixed via an update).
However, in the Lawfare article the GCHQ staffers argue that “lawful hacking” of target devices is not a panacea to governments’ “lawful access requirements” because it would require governments to keep vulnerabilities on the shelf to use to hack devices — which “is completely at odds with the demands for governments to disclose all vulnerabilities they find to protect the population”.
“That seems daft,” they conclude.
Yet it also seems daft — and predictably so — to suggest a ‘sidedoor’ in authentication systems as an alternative to a backdoor in encrypted messaging apps.