Remember that secret spy summit Apple and Google attended? Here's what happened

The "Chatham House Rule" meant everyone could speak freely, as the discussion was off the record. Here's an overview of what was said.
Written by Zack Whittaker, Contributor
Ditchley House, where the summit was held last month
(Image: Ditchley Foundation/Flickr)

In late May, high-ranking intelligence officials and representatives from some of the world's best-known and most powerful technology companies met in an 18th-century manor house.

The summit had one major purpose: to talk off the record so the conversation could be free, honest, and open, with the aim of figuring out "now what?" in the wake of the surveillance leaks.

In a note of personal impressions published Thursday, Sir John Holmes, director of the Ditchley Foundation, detailed in an overview the vast scope of discussions on current and ongoing issues in the wake of the Edward Snowden disclosures.

Two years ago, technology giants were thrown under the bus by claims that they were knowingly participating in government surveillance. In anger and frustration, they began doubling down on security and encryption, locking the government out in response to what they saw as overreaching surveillance and mass intrusion into their systems.

Now, said to be for the first time, companies including Apple, Google, and telecom giant Vodafone; representatives from US and UK intelligence agencies (including MI6, GCHQ, the CIA, and the NSA); and a bevy of data protection officials and academics met to hammer out some of the problems faced by both industry and government today.

It's not clear who said what, what was unanimously or widely agreed upon, or whose objections prevailed over others' -- if any did. But the discussions shed new light on what some of the current problems are, and how both government and industry aim to fix them.

Here's what you need to know:

Were the technology companies "complicit" in surveillance? Sort of, but not quite. There's still considerable debate over whether tech companies knowingly handed data to the government, based on secret warrants or wide-ranging orders.

Telecom companies and internet providers have different relationships with government. Telecom firms need licenses and operate under a different legal regime than internet providers, which do not need licenses. (The note also said that the majority of information asked for by intelligence agencies comes from the phone companies.)

Tech companies, however, are now providing encryption to their customers, who hold the keys themselves. Firms like Apple and Google have been criticized heavily for this, but say they "could not afford to alienate customers," who were more aware of the surveillance risks than ever.

Tech companies don't want a "backdoor." They want a "front door." The attendees were "confident" that an agreement could be reached that allowed governments to demand data from tech giants without overstepping the mark. They said this would not be easy, considering Apple and Google's already firm position on the Obama administration's backdoor proposals.

"The key to this was acceptance by all concerned that access to information held by companies had to be through front doors, via legitimate, authorized warrants for specific purposes, not through back doors, or on the basis of bulk requests with scant justification," said the note. "The companies were quite willing to share data in this way -- they wanted to cooperate, and did not wish to be seen as unpatriotic or unwilling to help in the fight against terrorism or crime."

"Nothing else would be acceptable to their customers," said the note.

There shouldn't be absolute secrecy in surveillance, but the burden of proof should fall on those wanting secrecy. Not everything should be secret. The intelligence agencies need to be able to keep operations secret when necessary, but also able to talk about the success stories. That said, secrecy should not be the default.

"The burden of proof should be on those arguing for secrecy, not the other way round," the attendees agreed.

The attendees couldn't decide whether privacy is an absolute, universal right. The note said it was generally accepted that privacy was not, and never had been, an absolute right, because requirements varied from country to country, generally for reasons of national security -- something measured through different national reactions. "Privacy was still a fundamental value of democracy and the rule of law, and had to be maintained," said the note.

Oversight is tough, but politicians and oversight committees need the trust of the public. It was agreed that elected officials who know the full scope of the surveillance programs are "better-placed" to issue warrants for surveillance than judges. But they "could and should" be able to defend their decisions in parliament or a court.

There was also an agreed need for "translucency," which would allow more to be revealed without compromising operational security, and those who oversee the agencies and programs should be given full access in order to perform their duties. But there was a catch: "At the same time it had to be accepted that no oversight mechanism was ever likely to be able to reveal all it did or knew."

International data sharing was stalled, and needs improving. Sharing a citizen or resident's data with an international partner is "often highly restricted," not least because US law prevents most Silicon Valley companies from sharing data with partners without lengthy legal proceedings.

It's a particularly contentious topic at the moment, as the US is trying to circumvent existing mutual legal assistance treaties by forcing a US-based company to serve up data stored in Ireland. The outcome of that case could undermine fundamental trust in the US technology industry.

While there was an agreement for "better arrangements" and "greater reciprocity," it wasn't entirely clear how the problem could be fixed.

Bulk data collection is not the same as mass surveillance, but there's a very fine line. Some argued that there was a distinct difference between computers analyzing who is talking to whom and humans performing the same task. But it's a fine line, and without proper regulation and oversight, bulk data collection could "be misused for that purpose."

Others in the room argued that the distinction between metadata, such as whom you call or email and when, and the content of those calls or emails, is "less valid" and "should not be relied on as the prime argument for the ethical acceptance of these activities." The counter-argument was that metadata -- though there was no agreed definition of the term -- can be hugely revealing about your activities, regardless of what the contents of the calls or emails were.

We reached out to Google and Apple but did not hear back. If we do, we'll update the piece.
