
When apps track your workouts, who really wins?

It might be dressed up like a game, but apps that track your movements are serious business.

 


Posted on Nov 12, 2013. Updated on Jun 1, 2021, 2:07 am CDT

BY STEFFAN HEUER AND PERNILLE TRANBERG

Tracking your steps and your sleep with a device like the Fitbit Flex or the Nike FuelBand can be immensely motivating. If the day ends and you haven’t reached your goal or been rewarded with a vibration on the wrist, you get off the couch and drag the dog out for another walk. Competing with friends fuels your activity even further until, ideally, your physical and mental state improves day by day with electronic nudges. Just ask Rupert Murdoch.

The Quantified Self movement, which proposes to cover our bodies with sensors and mine the accompanying data, has intriguing advantages on the face of it. But as with everything else in the tech world, there is a flip side: Turning your body and your health into a live quarry for data mining raises serious issues about privacy and discrimination.

Why so paranoid?

Data derived from tracking devices like the Fitbit Flex, Nike FuelBand, and Jawbone Up, as well as from smartphone apps like Endomondo and Runkeeper, is highly sensitive health data. In fact, it’s probably the most valuable data about you right after credit card information.

Now read the homepage and terms of service of the companies offering these tracking devices and services. In the corporate section of its site, Fitbit promises insurers they can “measure progress over time with validated data.” Maybe it does not sell your data directly, but as Wolfie Christl from Data Dealer laid out in this TEDx Vienna talk, insurers have jumped on the trend: Meet your target and lower your premium.

That sounds good, but it’s not in the interest of the average consumer since most of us are not the shiny, happy, and energetic people you meet in advertisements for trackers and fitness apps. Obsessive self-tracking is just another way to turn life into an endless succession of self-service offerings disguised as a convenient game. But while the automators win, the cogs lose.

The California-based Privacy Rights Clearinghouse recently released troubling findings in a study funded by the California Consumer Protection Foundation. It found gaping holes in the way 43 popular fitness apps handle personal data, ranging from transmitting vital data unencrypted to passing your health nuggets to third parties without telling consumers. The PRC’s verdict: “Consumers should not assume any of their data is private in the mobile app environment—even health data that they consider sensitive.”

So what’s a fitness enthusiast or a person at risk of contracting a particular disease to do?

If you prefer not to have your personal health data captured, tied to your person, and sold, you should be very careful about which apps you download. Free apps are usually less trustworthy than paid-for ones. What’s more, you should work with multiple identities and disposable email addresses. At Fakenamegenerator.com, for instance, you can create new identities and use them for the Fitbits of the world. Steer clear of connecting them to social networks, even if doing so might save you a few seconds when logging in.
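If you want a repeatable way to mint such aliases, a few lines of code will do it. The following is a minimal sketch using only Python’s standard library; the name pools, the example.com mail domain, and the plus-addressed mailbox are placeholder assumptions (plus-addressing only works if your email provider supports it), not features of any tracking service.

```python
import secrets
import string

# Placeholder name pools -- swap in whatever you like; nothing here refers to a real person.
FIRST_NAMES = ["Alex", "Robin", "Kim", "Sam", "Noa"]
LAST_NAMES = ["Larsen", "Meyer", "Okoye", "Tanaka", "Silva"]


def make_alias(mail_domain: str = "example.com") -> dict:
    """Build a throwaway identity for signing up to a tracking service."""
    first = secrets.choice(FIRST_NAMES)
    last = secrets.choice(LAST_NAMES)
    # A random suffix keeps each alias unique and hard to guess.
    suffix = "".join(secrets.choice(string.ascii_lowercase + string.digits) for _ in range(6))
    return {
        "name": f"{first} {last}",
        # Plus-addressing turns one mailbox into many disposable addresses.
        "email": f"{first.lower()}.{last.lower()}+{suffix}@{mail_domain}",
        "password": secrets.token_urlsafe(16),  # keep it in a password manager
    }


if __name__ == "__main__":
    print(make_alias())
```

Generate a fresh alias for each service, so a leak from one tracker can’t be cross-referenced with another.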

When you sync a tracker to the company’s website to stay up to date on your performance, it’s a good idea to block cookies and other tracking scripts, as well as to use a virtual private network (VPN). A VPN lets you pretend you’re located in various places around the world, making it harder to pinpoint your real identity through your devices’ locations.
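For the technically inclined, the same idea can be sketched in a short script. This is a hypothetical example, not any vendor’s real API: the sync URL, payload, and proxy port are made up, and it assumes the third-party requests library with its optional SOCKS support (pip install requests[socks]) plus a VPN or Tor client exposing a local SOCKS proxy.

```python
import requests
from http.cookiejar import DefaultCookiePolicy

# Hypothetical endpoint -- a real tracker's sync API will look different.
SYNC_URL = "https://tracker.example.com/api/v1/sync"

session = requests.Session()

# Refuse all cookies so the service can't link one sync session to the next.
session.cookies.set_policy(DefaultCookiePolicy(allowed_domains=[]))

# Route traffic through a local SOCKS proxy exposed by a VPN or Tor client.
session.proxies = {"https": "socks5h://127.0.0.1:1080"}

payload = {"steps": 8432, "sleep_minutes": 412}  # made-up sample readings
response = session.post(SYNC_URL, json=payload, timeout=10)
print(response.status_code)
```

The same cookie and proxy settings apply whether you script the sync or do it in a hardened browser; the point is that the service sees neither a persistent cookie nor your home IP address.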

If we don’t guard these seemingly mundane details about our life, physical activities, and weaknesses, we enable new ways for tech companies to force us to constantly self-optimize. And we open the door to new kinds of discrimination. Imagine what an insurer or consumer goods company can do with a simple connected toothbrush. It’s gamified hygiene, for sure. But it can also mean no dental coverage since the data shows you haven’t brushed your teeth as long and as often as some fine print in your policy requires.

In the long run, we expect to see the emergence of self-tracking services that don’t abuse your personal data, such as the French startup Cozy Cloud. It lets you “host, hack and delete” your own cloud, including a connector for various tracking services and gadgets.

Don’t spit out your data

Even more caution is advised when it comes to genomic data. One of the leaders in the field, 23andMe, is trying to convince self-trackers and the general population to volunteer the contents of their biological hard drives, er, genomes. It promises the world from a few drops of spit. “Find out things like if your body metabolizes caffeine quickly, or if you’re at risk for diabetes. The more you know about your DNA, the more you know about yourself,” the company says on its website.

What’s not to like about a little biohacking and sleuthing game for just $99? You can find unknown ancestors and risk factors. But you’re also feeding highly sensitive data to a for-profit company that states in its user agreement how that data will be shared and monetized: “These activities may include, among other things, … conducting data analysis that may lead to and/or include commercialization with a third party.”

Not to mention the open questions of what happens to your data if a government entity or a lawyer comes calling, or if the database were to be sold to another company. Google, after all, is an investor in 23andMe. It’s a Pandora’s box that most media stories about genetic testing don’t really talk about. Just take a look at the recent fluff piece by the NPR show On the Media.

Our advice: Don’t send your spit to 23andMe or other genetic testing companies. Why do you need to know that you have a 13 percent risk of developing some disease? Scientists wring their hands over the supposed insights consumers may glean from these tests and their lists of vague probabilities, and for good reason. Entrepreneurs, however, have realized that bio data from millions of people is worth gold if they can bundle and resell it. Why let them exploit your genetic data commercially without first having a thorough debate and making sure you have a say in how it’s used?

Tying the hardwired identity of your DNA profile to the rest of your offline and online identity is the killer app for snuffing out privacy as we know it. And it will take us one more step down the slippery slope of designing humans in the image of some arrogant technocratic elite that thinks everything is just a matter of the right algorithm. It’s no surprise that 23andMe was recently awarded a patent that would let parents pick the traits of a future child before undergoing fertility treatment. The book on what such a world will look like was written a good decade ago, but it’s still worth reading: Our Posthuman Future by Francis Fukuyama.

We’ll admit it’s hard to use a fake identity when paying a company to sequence part, or soon all, of your genome. Better to curb your curiosity than to pay for a vague and probably unreliable idea of what ailments you might develop. See your doctor and have him or her run tests on a regular basis. Take a run, go biking. And by all means, track yourself while doing it, but please let a team of aliases feed the machines while you try to live a healthy life.

Steffan Heuer and Pernille Tranberg are authors of the book Fake It: A Guide to Digital Self-Defense. They cover technology and privacy issues in San Francisco and Copenhagen. In this series, Digital Self-Defense, Heuer and Tranberg report with updates from the digital identity wars and teach us how to defend our privacy in the great data grab going on all around us. Follow them at @FakeIt_Book.

Photo by Noise ‘n Kisses/Flickr
