How Strava's "anonymized" fitness tracking data spilled government secrets

Analysis: Strava may "anonymize" the user, but that isn't helpful when that user inadvertently reveals the location of sensitive government facilities.
Written by Zack Whittaker, Contributor

Remember when you said you had "nothing to hide"?

It turns out you do. If it's not an affair you're hiding from your spouse, it's your highly classified place of work that's now painted in precise detail on a map for anyone to see.

That's exactly what happened when Strava, a widely used app for tracking activity and exercise, released an "anonymized" heatmap of its global data in November. The map only came to light this weekend, after Australian student Nathan Ruser started digging into the data. Drawing on more than 3 trillion street-level GPS coordinates from over 27 million users of fitness devices such as Fitbit and Jawbone, the company aggregated the past two years of activity to reveal its most-visited areas. Predictably, high-population areas -- like most of the US and Europe -- are brightly lit up.
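In principle, this kind of heatmap is simple: raw GPS points are binned into a coarse grid, and each cell's count becomes its brightness. A minimal sketch of that idea -- purely illustrative, and not Strava's actual pipeline -- shows how aggregation drops user identifiers while still lighting up frequented places:

```python
# Illustrative sketch only -- not Strava's actual pipeline. It shows how
# binning raw GPS points into a coarse grid yields a "heatmap" of
# most-visited cells while discarding any per-user identifiers.
from collections import Counter

def heatmap(points, cell_deg=0.001):
    """Bin (lat, lon) points into roughly 100 m grid cells and count visits."""
    counts = Counter()
    for lat, lon in points:
        cell = (round(lat / cell_deg), round(lon / cell_deg))
        counts[cell] += 1
    return counts

# Hypothetical sample: many runs around one block light up its cell.
points = [(51.8992 + i * 0.000005, -2.1246) for i in range(50)]
points += [(40.0001, -3.0001)]  # one stray point elsewhere
hot_cell, visits = heatmap(points).most_common(1)[0]
```

The privacy problem the article describes is visible even here: the counts carry no names, but a bright cell in an otherwise dark region is itself the secret.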

That same data also illuminated a scattering of little-known locations in war zones, where US secret facilities and military bases have operations and personnel -- presumably because soldiers and staff are unknowingly uploading their fitness tracking data to Strava.

The news has prompted US-led coalition forces to reevaluate their use of fitness trackers, amid fears that enemy forces could use the data to locate troops on the ground, according to a statement obtained by the Washington Post.

Ned Price, a former special assistant to President Obama, said in a tweet that "capable adversaries have almost certainly harvested this data for years."

That's not even the half of it.

You can see in close detail staff walking around UK eavesdropping station GCHQ in Cheltenham. There are also the walking routes of border patrols along the US-Mexico border. Pyongyang is one of the quietest capital cities in the world, but even the isolated state lights up thanks to fitness tracking activity (though there's reportedly no tracking data at the country's primary nuclear test site). The data also shows jogging activity on the beach around a CIA annex at Mogadishu airport.

Even Area 51 is on the map.


A border patrol officer walking along the US-Mexico border. (Image: Strava)


Fitness tracker users at GCHQ, the UK's electronic eavesdropping spy agency. (Image: Strava)


Australia's secret Pine Gap facility in Alice Springs. (Image: Strava)

We have a serious problem when millions of users' tracking data goes up to the cloud, for the most part without the user's direct knowledge or explicit consent. Location data, whether real-time or historical, is among the most private information about our lives. It can show where we work and where we live -- and even if you have "nothing to hide," most people wouldn't readily give their home address to a random person on the street.

But Strava's app, which ranks as one of the most popular health and fitness apps available, takes the approach of requiring users to opt-out rather than opt-in, security researchers say.

In a blanket statement given to reporters, Strava said the data "excludes activities that have been marked as private," and linked to a blog post.

Strava does have an option to turn off uploading tracking data, but the privacy settings -- largely public by default, according to a Quartz story from August -- are confusing at best.

"It's true that as a user, it's my responsibility to know what I'm opting into when I use a service, especially when it's free," said reporter Rosie Spinks. "But the fact that it took me three rounds of emails with a support rep, a call with [the company's communications chief], and a follow-up email round with him to fully understand how to prevent strangers from seeing my running routes is troubling."

Spinks isn't wrong. It is a user's responsibility to maintain the app's privacy settings -- part of what's commonly known as operational security, or "opsec" ("persec" for personal security). Given that the app's primary function is to track a user's location while they're running, swimming, or exercising, a little common sense about sharing location data goes a long way.

In hindsight, maybe giving fitness trackers to your staff at a highly classified military base isn't the best idea.

But it is also the responsibility of the companies in question not to trick or deceive users into sharing their data with the world. Judging by social media responses alone, many Strava users don't seem to realize their data is being used this way. Although the heatmap doesn't include specific users' information, it's been demonstrated that individual activity, when made public, can be accessed and scraped. It's a stark reminder that companies like Strava -- and other tech giants such as Facebook, Google, and Microsoft -- hold troves of data that can paint an accurate (and incriminating) picture of a person's life.

Strava can hide behind the claim that its app data anonymizes the user, but that doesn't help when that user is inadvertently revealing the location of sensitive facilities.

Granted, in today's world of readily available aerial and satellite imagery, it's easy to argue that keeping the very existence of military and other intelligence bases classified is impossible.

But when you're working for the government, your opsec is their opsec.
