Episode 2 – This is why privacy matters in digital health

By Karsten Stampa, COO / CFO of healthbank

In the first part of this blog post, I explained how people’s data is used by giants like Google and Facebook to sell targeted advertising based on users’ demographic data and interests. My primary example was how Facebook shared private user data with Cambridge Analytica to enable “targeted advertising” for a US election campaign. And, as we’re talking about health data, I showed how, shortly after that, Facebook paused a project of its branch Building 8 that even reached for anonymized patient data from hospitals, “to blend hospital data with social data” – it remains unclear whether the patients ever gave their consent.

But let’s get back to the data that private users voluntarily share on social networks – and to the runner mentioned in the previous episode.

2. The Strava heat map case

Lots of people have done this: together with their new fitness trackers, they use an app called “Strava” to track not only their steps and running time but also, thanks to geolocation, the exact route they took during their run (or walk, or bike ride).

So far, not a huge problem – who isn’t proud to show off the distance and route they ran to stay fit and healthy?! In Fall 2017, however, Strava did something stunning: they put together all the routes of all Strava users around the world and produced a so-called heat map (https://www.strava.com/heatmap#7.00/-120.90000/38.36000/hot/all) that shows, in different colors, which routes are used very frequently and which are not – the ‘heat’ generated by aggregated, public activities.
According to Strava, they provide this service for “Athletes from around the world {who} come here to discover new places to be active” and point out that athletes can opt out by updating their privacy settings or marking activities as ‘private’, and that no ‘heat’ is shown in areas with very little activity.
(Source: https://www.nytimes.com/2018/01/29/world/middleeast/strava-heat-map.html)
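
To make the aggregation step concrete, here is a minimal sketch – not Strava’s actual pipeline; the routes and grid size are invented for illustration – of how many individual GPS traces can be binned into a ‘heat’ grid:

```python
# Illustrative sketch of heat-map aggregation (hypothetical data, not Strava's pipeline):
# count how many activity points from all users fall into each lat/lon grid cell.
from collections import Counter

def heat_grid(routes, cell_size=0.001):
    """Bin route points into a lat/lon grid.

    routes: list of routes, each a list of (lat, lon) points.
    cell_size: grid resolution in degrees (~100 m near the equator).
    """
    counts = Counter()
    for route in routes:
        for lat, lon in route:
            cell = (round(lat / cell_size), round(lon / cell_size))
            counts[cell] += 1
    return counts

# Two hypothetical runs along almost the same street produce "hot" cells.
routes = [
    [(47.3769, 8.5417), (47.3771, 8.5420), (47.3773, 8.5423)],
    [(47.3769, 8.5418), (47.3771, 8.5421), (47.3774, 8.5424)],
]
for cell, count in heat_grid(routes).most_common(3):
    print(cell, count)
```

The more points land in a cell, the ‘hotter’ it is rendered on the map – which is exactly why frequently used routes stand out, even in otherwise empty areas.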


But this is where it gets really interesting:
as the New York Times reported in January 2018, the Strava heat map revealed military secrets by showing the physical activities of soldiers who used the Strava fitness app while serving on military sites all over the world.
This is a disaster for the military – but, as Strava of course states, all activities and the data shown are anonymous.

But that’s only half the truth, as the NYT article makes clear – and this is exactly where my data privacy concerns lie: “although the map does not name the people who traced its squiggles and lines, individual users can easily be tracked, by cross-referencing their Strava data with other social media use.” For example, if those users compete in a “best list”, their usernames are visible for a particular area – and let’s be honest here: most usernames are not that creative (see, for instance, my Twitter name @kstampa).
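
To illustrate the cross-referencing idea from that quote, here is a purely hypothetical sketch – every username and profile below is invented – showing how an ‘anonymous’ area leaderboard can be joined with public social media profiles simply via the username:

```python
# Hypothetical illustration of re-identification by cross-referencing:
# joining an "anonymous" area leaderboard with public profiles on the username.
# All data below is invented.
strava_leaderboard = [
    {"username": "desert_runner_81", "segment": "Perimeter loop", "rank": 1},
    {"username": "jsmith_runs", "segment": "Perimeter loop", "rank": 2},
]
public_profiles = {
    "jsmith_runs": {"name": "J. Smith", "listed_employer": "Armed forces"},
}

for entry in strava_leaderboard:
    profile = public_profiles.get(entry["username"])
    if profile:
        # The "anonymous" squiggle on the heat map now has a name attached.
        print(entry["username"], "->", profile["name"], "/", profile["listed_employer"])
```

A reused username is all it takes to connect an “anonymous” activity trace to a real person.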

To sum up this case: of course, most Strava users are not military personnel revealing secrets – but that’s not the important issue here. What matters more is that people voluntarily share sensitive health information with a public network, which in turn uses the data for purposes most people aren’t even aware of. And this information can in turn be traced back to individuals and have unintended consequences.

Stay tuned for tomorrow’s episode 3 of this blog post! You will learn more about an example of a company sharing sensitive health information without explicit user consent, and why digital health is so much more than just an electronic version of your patient record.
