After fitness data service Strava revealed bases and patrol routes with an online “heat map,”
the US military is reexamining its security policies for the social media age.
A modern equivalent of the World War II era warning that “loose lips sink ships” may be “FFS don’t share your Fitbit data on duty.” Over the weekend, researchers and journalists raised the alarm about how anyone can identify secretive military bases and patrol routes based on public data shared by a “social network for athletes” called Strava.
This past November, the San Francisco-based Strava announced a huge update to its global heat map of user activity, which displays 1 billion activities—including running and cycling routes—undertaken by exercise enthusiasts wearing Fitbits or other wearable fitness trackers. Some Strava users appear to work for militaries or intelligence agencies: security experts quickly connected the dots between user activity and the known bases or locations of US military or intelligence operations. Some analysts have suggested the data could reveal individual Strava users by name.
But the biggest danger may come from potential adversaries figuring out “patterns of life,” by tracking and even identifying military or intelligence agency personnel as they go about their duties or head home after deployment. These digital footprints that echo the real-life steps of individuals underscore a greater challenge to governments and ordinary citizens alike: each person’s connection to online services and personal devices makes it increasingly difficult to keep secrets.
All Your Base Are Belong to Us
The revelations began unspooling at a rapid pace after Nathan Ruser, a student studying international security at the Australian National University, began posting his findings via Twitter on Saturday afternoon. In a series of images, Ruser pointed out Strava user activities potentially related to US military forward operating bases in Afghanistan, Turkish military patrols in Syria, and a possible guard patrol in the Russian operating area of Syria.
Not just US bases. Here is a Turkish patrol N of Manbij
You can see the Russian operating area in Khmeimim,
but also the guard patrol to the NE.
If soldiers use the app like normal people do, by turning it on tracking when they go to do exercise, it could be especially dangerous. This particular track looks like it logs a regular jogging route. I shouldn’t be able to establish any Pattern of life info from this far away.
Other researchers soon followed up with a dizzying array of international examples, based on cross-referencing Strava user activity with Google Maps and prior news reporting: a French military base in Niger, an Italian military base in Djibouti, and even CIA “black” sites. Several experts observed that the Strava heatmap seemed best at revealing the presence of mostly Western military and civilian operations in developing countries.
Many of the military and intelligence agency bases pointed out by researchers and journalists had already been revealed through other public sources. But the bigger worry from an operations security standpoint was how Strava’s activity data could be used to identify interesting individuals and track them to other sensitive or secretive locations. Paul Dietrich, a researcher and activist, claimed to have used public data scraped from Strava’s website to track a French soldier from overseas deployment all the way back home.
“This is the part that is perhaps most worrisome, that an individual’s identity might be pullable from the data, either by combining with other information online or by hacking Strava—which just put a major bullseye on itself,” says Peter Singer, strategist and senior fellow at New America, a think tank based in Washington, DC. “Knowing the person, their patterns of life, etc., again would compromise not just privacy but maybe security for individuals in US military, especially if in the Special Operations community.”
Strava’s data could even be used to follow individuals of interest as they rotated among military bases or intelligence community locations, according to Jeffrey Lewis, director of the East Asia Nonproliferation Program at the Middlebury Institute of International Studies at Monterey, California. In a sobering Daily Beast article, Lewis laid out a scenario by which Chinese analysts could track a Taiwanese soldier based on his activities at a known missile base and thereby discover other previously unknown missile bases as the soldier’s duties required him to rotate through them.
Taking Steps to Fix the Problem
The United States is clearly far from alone in dealing with such security challenges. Back in 2015, the People’s Liberation Army Daily issued a stern warning to members of the Chinese military about the security risks posed by smart watches, fitness bands, and smart glasses, according to Quartz. But the Strava example shows that the United States may be at greater risk, with its relatively large footprint involving troops, intelligence personnel, diplomats, and contractors deployed overseas in sensitive areas or conflict zones.
The US military’s Central Command has already begun reassessing its privacy policies for the troops after the Strava revelations, according to reporting by The Washington Post and others. Current US military service policies seem to allow for use of fitness trackers and other wearables with the caveat that local commanders have the discretion to tighten security. In fact, the US Army has previously promoted use of Fitbit trackers as part of a pilot fitness program.
Some of the security tightening may involve certain “no-go areas” or “leave-at-home policies” for personal smartphones and wearables, similar to what already exists in sensitive offices of the Pentagon and other installations, Singer says.
‘People on their third or fourth deployment are going to lose their minds or their marriages if they can’t use tech to simulate normalcy.’
Lynette Nusbacher, Military Historian
Certain military or intelligence facilities may also need upgrades to their security as a result of the Strava data reveal, says Lynette Nusbacher, a strategist and military historian based in the UK. She adds that militaries and other organizations will require constant, up-to-date training for both their leadership and the rank-and-file, to ensure they’re aware of the threat from modern geolocation technology.
The idea of banning wearable technologies outright may potentially make sense in certain cases: “A small minority of tier one special forces operators can go without toilet paper or soap or mobile phones for weeks,” Nusbacher says. But she warns that imposing extreme restrictions more broadly could reduce the number of people willing to sign up for military or intelligence stints overseas.
“When I was deployed on operations in 1999 we expected one phone call a week and dial-up internet,” Nusbacher says. “People on their third or fourth deployment are going to lose their minds or their marriages if they can’t use tech to simulate normalcy.”
Many analysts place the burden of responsibility on the US military and other organizations for the lapse, rather than on Strava. The latter does, after all, allow users to choose whether they share their data. “Strava offered a service,” Nusbacher says. “It’s not their fault that soldiers who needed better training and briefing turned that service into a vulnerability.”
But Paul Scharre, senior fellow and director of the Technology and National Security Program at the Center for a New American Security, argues that technology companies do have certain responsibilities, especially after a problem of this magnitude has been identified.
“Military service members, particularly in the special operations community, take operational security seriously: They would not have shared this data if they understood the consequences,” Scharre says. “If Strava was serious about the negative consequences of this data being public, they would temporarily take the maps offline and work with the government to scrub sensitive data. I do not think it is acceptable for a company to release data that might imperil the lives of US service members.”
In a statement, James Quarles, CEO of Strava, acknowledged that “members in the military, humanitarian workers and others living abroad may have shared their location in areas without other activity density and, in doing so, inadvertently increased awareness of sensitive locations. Many team members at Strava and in our community, including me, have family members in the armed forces. Please know that we are taking this matter seriously and understand our responsibility related to the data you share with us.”
Quarles said that Strava was “committed to working with military and government officials to address potentially sensitive data.” He added that the company was “reviewing features that were originally designed for athlete motivation and inspiration to ensure they cannot be compromised by people with bad intent,” and was also working to simplify “privacy and safety features” for customers to more easily understand and control their data.
The Not-So-Bad and the Ugly
The heat map may contain a few bright spots, though. There is no evidence yet that hostile countries or militant groups have exploited the Strava heat map along with other open-source intelligence to inflict real harm. “It’s a good thing this was reported now versus being exploited by an enemy later in a major war,” says Singer.
The Strava heat map also represents the cumulative activity of users over several years, up through September 2017. That means nobody can use it to track military patrols or analysts walking through CIA bases in real time.
‘I do not think it is acceptable for a company to release data that might imperil the lives of US service members.’
Paul Scharre, Center for a New American Security
Still, the Strava incident is just the latest and perhaps most spectacular example of how social media can compromise the operations security of even the most sensitive military and intelligence agencies.
- Analysts and journalists have previously tracked the locations of soldiers, such as Russian troops in Ukraine, based on selfies and other public data shared on social media.
- Back in 2007, Iraqi insurgents used geo-tagged photos shared on social media of US Army attack helicopters landing at an airbase to pinpoint and destroy four of the expensive war machines in a mortar attack.
Much of the public data needed to compromise certain aspects of military or intelligence operations was already out there and hiding in plain sight years ago, according to Gavin Sheridan, CEO of Vizlegal and a former journalist. In a lengthy Twitter thread, he explained how geotagging has made it relatively easy to detect Westerners—usually soldiers—in remote areas of the world, or even to compile lists of family members for individuals working at the CIA or the Pentagon.
But addressing the security risks highlighted by Strava will require much more than simply updating a few policies. A world dominated by the rise of social media, the growing availability of commercial satellite and drone imagery, and increasing usage of smartphones necessitates an entirely new cultural mentality.
“Too often we think secrets lie hidden, when now they are mostly out in the open,” says Singer. “Both militaries and the public need to come to grips with the fact that the era of secrets is arguably over.”
How a Popular Running App Reveals the Discreet Routines of Life on Base
You can see individual Strava users by zooming in on houses with a short line leading to them.
Strava offers the ability to set up privacy zones, but they’re not on by default.
Where do troops at Incirlik Air Base in Turkey like to jog? Around the nuclear weapons storage sites, according to a heat map of fitness paths from the data-tracking app Strava. Sourced from the native GPS data on users’ smartphones and watches, Strava Labs produced a global heat map, and with a click of a button and a minute on Google, anyone can find where on a military base people likely jog. So what?
When it comes to Incirlik, the Strava jogging data just adds to existing knowledge. Satellite views show B-52s stationed outside special bunkers, a quick search of “nuclear weapons Turkey” pulls up a ton of stories about the storage at Incirlik, and the knowledge that people on base like to run laps around available paths probably isn’t news to anyone in the area. As analysts and uniformed personnel debated on Twitter, what Strava’s heat map adds isn’t much compared to what a half-interested observer probably already knows. But that’s just the heat map itself.
The bigger, specific question is what else Strava knows that isn’t on the heat map. And more broadly, the bigger danger is what happens when the technologies vital to everyday life record that information and share it widely. Strava accounts are linked to Facebook, Google, or email, and depending on the sign-up method, simply creating an account gives Strava the same data about a user’s connections already siloed away in a social network. A click or two later, the user can choose how to share their location with the app. Buried under settings is a “Privacy” section. By default, anyone can view a Strava user’s profile, and people logged into the app can follow a user and download their activity data. An Enhanced Privacy setting masks some activity, but users have to individually toggle each of several categories of activity to hide it from the app. If a user wants to keep a certain location hidden on the app, like their home or office, they have to go through the desktop client to set a hidden zone with a radius of three-quarters of a mile around a set location.
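The hidden-zone idea described above is conceptually simple: drop every recorded point that falls within a fixed radius of a protected location before anything is published. Here is a minimal sketch of that kind of geofence filter; the function names are my own and this is an illustration of the general technique, not Strava's actual implementation.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two lat/lon points.
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def mask_privacy_zone(track, center, radius_m):
    """Drop GPS points within radius_m meters of a protected location,
    so the published track never shows where it started or ended."""
    return [(lat, lon) for lat, lon in track
            if haversine_m(lat, lon, center[0], center[1]) > radius_m]
```

Note the obvious limitation, which applies to any radius-based mask: the surviving points still converge on the edge of the hidden circle, so a determined observer can often infer roughly where the protected location sits.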
For someone who wants to just sign on, run, and share their run with their buddies, that’s a lot of work to make sure that, say, the daily jog around a Patriot missile installation stays private. And someone who wants to make sure that the steps they take while on duty count toward their fitness tracker might not take the extra time to play with the settings and hide their location data from other users. And even if they do, the information is still fed into the app and to the company itself, where it is collected and where it is up to the company how that data gets used. The heat map doesn’t identify any specific users. It instead gives viewers something else: patterns of behavior around places.
A quick note on this: while the high traffic areas show brightest (this is the nature of a heat map, after all), it’s the thin lines around sensitive locations, detailing paths walked by maybe one person doing security, that are probably most interesting to anyone plotting something nefarious. That’s true universally: the paths of any group of armed people running security with Strava on can be found here. While it might be of special interest to those tracking the activities of the Pentagon, it’s relevant to anyone tracking any military. And a caveat: this is just the data of people who use the Strava app, so while it documents some activity, the absence of anything on the Strava map only means no Strava users recorded data there, not that nothing happened there.
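A heat map like the one described here is, at bottom, just point counting: every GPS fix from every uploaded activity is binned into a small grid cell, and cells with more points render brighter. The sketch below shows that aggregation under assumed names and an assumed grid size of roughly 50 meters; it is not Strava Labs' pipeline, only an illustration of why a lone guard's patrol route still shows up as a faint one-count line.

```python
from collections import Counter

def heatmap_counts(tracks, cell_deg=0.0005):
    """Bin GPS points from many activity tracks into a lat/lon grid.

    Returns a Counter mapping grid cells to the number of points
    recorded there -- the per-cell aggregate a heat map renders,
    with no per-user identifiers attached.
    """
    counts = Counter()
    for track in tracks:
        for lat, lon in track:
            cell = (round(lat / cell_deg), round(lon / cell_deg))
            counts[cell] += 1
    return counts
```

Because the output drops user IDs but keeps locations, a popular jogging loop and a single sentry's nightly walk differ only in brightness, and both are visible.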
Strava is hardly the first app to capture ephemeral behavior and turn it into a public, geolocated document. When the messaging and video tool Snapchat introduced a world map option, allowing any user to view posts set to public, commanders had to remind those serving to change their privacy settings to keep locations private from acquaintances. When Pokémon Go took a map built for the augmented-reality game Ingress and turned it into a massive phenomenon, quiet but mildly notable sites (like, say, a commanding general’s house) became the target of casual interlopers.
We are living in a future where the default is to be connected, and the ways people connect increasingly ask users to offer seemingly insignificant information in exchange for daily utility. To some extent, this is vital: what good is a jogging app if it can’t tell where the user went jogging? What is unclear, though, is what else the app knows, how securely that data is stored, and whether someone with access to the full dataset could de-anonymize it.
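The de-anonymization worry raised above is easy to illustrate. Most recorded runs start or end at home, so matching a track's endpoints against any public list of names and home addresses often re-identifies its owner. The following is a deliberately toy sketch under that assumption; the function, the 200-meter threshold, and the names are all invented for illustration.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two lat/lon points.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def likely_owner(track, homes, radius_m=200):
    """Guess a nominally anonymous track's owner by checking whether
    it starts or ends near a known home location."""
    for name, (hlat, hlon) in homes.items():
        for lat, lon in (track[0], track[-1]):
            if haversine_m(lat, lon, hlat, hlon) <= radius_m:
                return name
    return None
```

The point is not that this exact attack works against Strava, but that stripping usernames from location data does little when the locations themselves are identifying.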
And then there is the simple fact of the data itself: while the location of nuclear storage at Incirlik may be public, and the bunkers may be visible from space, those loops are still restricted areas, and cell phones, especially the kind that can record the path someone took on a jog, are restricted in those areas. If there is a risk here, it is the modern liability of information shared online, combined with the same danger that has plagued every endeavor from the dawn of time: human error.
In an effort to address concerns raised over the weekend, Strava plans to work with both military and government officials on the potentially sensitive data their app harvested from users’ devices.
“We learned over the weekend that Strava members in the military, humanitarian workers and others living abroad may have shared their location in areas without other activity density and, in doing so, inadvertently increased awareness of sensitive locations,” James Quarles, Strava CEO, said in a news release.
The GPS tracking app Strava sources data from users’ smartphones and smartwatches to produce an overlay of popular running paths. Users quickly became aware of the app’s potential to outline secure facilities downrange.
“Many team members at Strava and in our community, including me, have family members in the armed forces,” Quarles said. “Please know that we are taking this matter seriously and understand our responsibility related to the data you share with us.”
The company also stressed that its existing privacy features should be used in the meantime, adding that engineers are working on “simplifying our privacy and safety features to ensure you know how to control your own data.”
Defense Secretary Jim Mattis has directed a department-wide review of fitness app use policies following the Strava revelations, Pentagon spokesman Army Col. Rob Manning said Monday.
“The secretary is aware [of the breach], and we are taking a look at our department-wide policies to determine if [they] need to be updated,” Manning said.
The additional policies could include new guidelines on any kind of wearable device that tracks user locations, including smartphones.