SPICE Researchers: Strava Heat Map Noncompliant With General Data Protection Regulation


By Professor L. Jean Camp and Jonathan Schubauer

A jogging club in your neighborhood, the perfect trail for a bike ride, a work friend's home address, maybe a top secret military base: whatever location you're looking for, chances are Strava is storing it.

You've probably already heard about the global heat map Strava published a few weeks ago. The fitness tracking company released the jogging, running, and walking patterns of more than 27 million users. Over 12 billion GPS data points from exercise enthusiasts are clearly mappable. Harsher still, the data isn't completely anonymized.

According to computer scientist Steve Loughran, by uploading an altered GPS file it's possible to identify company data and verify the identities of top secret military personnel. For users requesting data on specific geographic areas, sensitive information such as names, fitness analytics, and running routes is also publicly available if a user has shared their data in a segmented location.
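Loughran has not published his exact tooling, but the general shape of the trick is easy to sketch. A minimal illustration in Python, assuming coordinates traced by hand from the public heat map (the route below is invented, and the GPX structure follows the standard GPX 1.1 format):

```python
# Hypothetical sketch of the altered-GPS-file approach: wrap coordinates
# traced from the public heat map in a minimal GPX file. The file format
# follows the GPX 1.1 schema; all coordinates here are invented.
import xml.etree.ElementTree as ET

def build_gpx(points):
    """Build a minimal GPX 1.1 track from (lat, lon) pairs."""
    gpx = ET.Element("gpx", version="1.1", creator="demo")
    seg = ET.SubElement(ET.SubElement(gpx, "trk"), "trkseg")
    for lat, lon in points:
        ET.SubElement(seg, "trkpt", lat=f"{lat:.6f}", lon=f"{lon:.6f}")
    return ET.tostring(gpx, encoding="unicode")

# Coordinates traced (hypothetically) from a heat-map route:
route = [(48.8584, 2.2945), (48.8590, 2.2950), (48.8597, 2.2957)]
print(build_gpx(route))
```

Uploading a crafted file like this as a fake "activity" can cause a fitness service to match it against real users who covered the same segment, surfacing their names and times.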

Even if you're not as tech-savvy as Loughran, the information emitted by fitness trackers can be used to identify a user's daily routines. Simply combine the location data from the fitness app with publicly available information to uniquely identify a person's routines on the Strava heat map. These routines may include work schedules, school addresses, home addresses, and other patterns of life a user may consider sensitive.
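To illustrate how little analysis this takes, consider a hedged sketch with invented data: if most of a user's public activities start from the same small grid cell, that cell is very likely their home or workplace.

```python
# Illustrative sketch (all coordinates invented): the start points of a
# user's public runs cluster tightly around one location. Rounding each
# coordinate to ~100 m grid cells and taking the most common cell is
# enough to guess a home address.
from collections import Counter

def likely_home(start_points, precision=3):
    """Return the most frequent ~100 m grid cell among activity starts."""
    cells = Counter((round(lat, precision), round(lon, precision))
                    for lat, lon in start_points)
    return cells.most_common(1)[0][0]

# Hypothetical start points from a week of public runs:
starts = [(39.1700, -86.5230), (39.1702, -86.5229),
          (39.1699, -86.5231), (39.2100, -86.5000)]  # one run from work
print(likely_home(starts))
```

Cross-referencing the resulting cell with public property or directory records is all it takes to attach a name to the routine.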

The app does not distinguish between someone working at a secret base and a woman dealing with a stalker. From the proud enthusiast eager to broadcast admirable exercise habits to the most reluctant user, anyone who has not updated their privacy settings is at risk of having their personal information exposed publicly online.

The fitness app uses movement sensors, sometimes linked to GPS-integrated devices, to collect information. Metrics such as distance traveled can be used to track users' behavior patterns; the published heat map covers roughly a two-year period (2015 through September 2017).
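The distance metric itself is trivial to reconstruct from raw GPS samples. A minimal sketch, assuming invented track points and the standard haversine great-circle formula:

```python
# Minimal sketch of how "distance traveled" is derived from raw GPS
# samples: sum haversine distances between consecutive track points
# (Earth radius ~6371 km). All coordinates below are invented.
from math import radians, sin, cos, asin, sqrt

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def track_distance_km(points):
    """Total path length of an ordered list of GPS samples."""
    return sum(haversine_km(a, b) for a, b in zip(points, points[1:]))

track = [(39.170, -86.523), (39.171, -86.523), (39.172, -86.524)]
print(f"{track_distance_km(track):.2f} km")
```

Any app holding these samples can compute not just totals but routes, speeds, and timing habits, which is exactly what makes the raw data so sensitive.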

The purpose of sharing fitness-based data with Strava is to track and share progress, set and meet goals, and build a community. The mobile application uses stored information to link people to location-shared exercise activities in the same area. The problem? Users aren't aware that their privacy settings are turned off each session unless those settings are explicitly saved. Strava profile information is publicly "BROWSABLE" by default.

While it is true that Strava offers enhanced privacy options, the settings are not easy to use, requiring multiple preference screens that users find confusing to configure. These frustrations are voiced by users on Strava's own support page:

“As the title suggests, my enhanced privacy mode keeps reverting to off. I have to go in every day or so, turn it back on and save. I’ve tried this on the phone, web, etc. It is maddening!”

“I am having the same issue. I have turned my enhanced privacy on every day for the last week. I have logged out of my other devices to see if that would solve the problem but has not. Today I turned it on twice.”

The application requires users to set seven options across various screens of the app, starting with "Choose Enhanced Privacy". But "Choose Enhanced Privacy" is just the start. The user must also configure the "Hide Activities From Leaderboards" and "Change your FlyBy Options" features to enhance their privacy.

Without those features configured, even with enhanced privacy on, a user's photo, name, and run will still sometimes be posted publicly, based on the activities of others. This happens particularly when only a few people run a route (such as a small number of personnel on a classified military base or operation), creating an isolated local leaderboard.

There is yet another set of screens the user must control. After turning public-sharing settings "off", the user must also turn the "Enable Group Activity Enhanced Privacy" option "on", or the profile will share Personally Identifiable Information (PII) while part of a shared Strava group. That is, individual data will be shared based on the privacy preferences of the members in a shared group. Even with all these privacy settings turned on, if the user has not chosen to anonymize their data from the Strava heat map, their daily exercise routes can still be viewed publicly.

As if the privacy settings process wasn't tedious enough, there is a third set of privacy settings. Even if a user has blocked other Strava users, a blocked person will still be able to see activity entries in public areas like segment leaderboards, club feeds, and segment explore. "Blocking" only means the blocked user cannot access a specific activity or the user's profile page; the blocked person can still track the user's location data. These settings must also be saved, or the blocked Strava user will revert to being unblocked in the next session.

It would seem that Strava offers a very complex privacy configuration with confusing uses of common words such as "privacy" and "blocking".

Can this design and business practice comply with the General Data Protection Regulation (GDPR)? A user chooses enhanced privacy, blocks someone, and other users can still view her "leaderboards, club feeds, and segment explore". This means her personal travel patterns, where she hangs out with colleagues, who those colleagues are, and when new behavior patterns emerge are still being tracked. And that's only if she has already chosen to turn "On" Enhanced Privacy. The user must also remember to save these privacy features; otherwise, the settings revert to public sharing by default.

Furthermore, many researchers have already identified unique individuals using personally identifiable information that is publicly available. Since digital information is now treated as a commodity, the disclosure agreements users consent to are drafted ambiguously and at a length that far exceeds most users' willingness to inform themselves.

Privacy notices are drafted in this manner to nudge the user toward the convenient, immediate choice of agreement, while "consenting" to multiple parties collecting personal information. Companies employ these notice deficiencies to discreetly serve their own purposes, typically as part of a blanket opt-in/opt-out approach. Strava's privacy notice is no different. The privacy features are turned off by default if they are not saved, and the user has already "consented" to Strava collecting their information simply by downloading the app.

Since users have consented to Strava's privacy agreement, various permissions are deemed accepted and grant access to various functions of users' phones, as depicted in Strava's permission list. On Android, that permissions list is directly accessible. The permissions on Apple devices likely include these and exfiltrate even more data, based on empirical comparisons showing that equivalent apps share more data on iOS than on Android. In addition, studies have shown that iOS users are less privacy-aware than Android users.

Even if Strava users are willing to read and perfectly understand the language of the privacy notices, and opt-in/opt-out rights are exercised, users still lack control over the personal information Strava stores. This lack of control creates distinct subjective and objective privacy harms, both of which we are now seeing in the private and public sectors. Because many users were unaware of what information Strava was collecting and how it would be used, there was virtually no control offered to mitigate potential privacy harm. With the release of the heat map, we are witnessing these harms on an intercontinental scale.

The General Data Protection Regulation is scheduled to be enforced on May 25, 2018. GDPR fines can reach as much as 4% of worldwide company revenue for noncompliance. When the regulation was announced, no one considered it a critical component of NATO's national defense strategy; it was never a regulation tied to national defense. Yet Strava shows that information control is a personal safety issue, a possible trade conflict, and a national security concern. Strava has announced it will work with government officials to review its privacy policies and address the location activity exposed by the heat map.

Are commonly used lifestyle applications now affecting national security? The answer is complicated. While not every mobile app shares users' locations, applications do require a degree of clarity for privacy preferences to be properly implemented. With Strava, we see how poorly integrated usability has affected the privacy of millions of users, including military personnel. Security culture is an issue both private industry and governments continue to struggle with. Strava is just one stark example of how failure to implement adequate privacy settings, mixed with negligent security practice, can cause unintended consequences at the national security level.