Today's smartphones increasingly resemble the handheld medical scanners of a science-fiction future. But as our always-on devices transform medicine, we need to look to the past as well, ensuring that technology companies abide by the 2,000-year-old dictum that binds doctors: first, do no harm.
More than 110 million wearable sensors were sold worldwide in 2015. Fitbits, heart rate monitors and smartphone apps not only count our steps and track our workouts, but also have the potential to produce "digital biomarkers" - indicators of medical conditions or symptoms. These digital traces of our daily activities could one day become warning signals of nascent health issues. Our web browser history could alert psychologists to a pending manic episode. Activity monitor location data may one day help diagnose mobility disorders such as Parkinson's disease.
What we do (or don't do) on our smartphones might facilitate early detection of dementia or cognitive decline. The emerging research shows real ways in which smartphones and other devices may one day improve our health. Yet unlike health information collected by and provided to healthcare professionals, the consumer data on fitness or health gathered by tech companies enjoys practically no protection: virtually no policies, laws or procedures safeguard user privacy or guarantee users access to this information.
This presents two parallel challenges: we need to protect our data from those who would use it to harm us, and we need to be able to access that data ourselves when we need it. Both challenges rest on the same principle: people whose bodies generate health data should have power over how it is used. The risks of discrimination are self-evident, but the right of consumers to control their own information is perhaps even more important. Currently, most of us haven't the foggiest idea what health information could be detected from our data. Will we be able to access such data when we need it?
Doctors are now required to share medical records with their patients. They may be the medical experts, but patients are often experts in their own everyday experience, and together they can make informed choices. The same does not hold true for the information-technology companies that claim to be designing solutions for our health.
As more new technologies and data analysis techniques become available, we should ensure that we have access to our own digital health information, whether it is collected from a consumer device such as a smartphone or from a medical device such as a defibrillator. Today, we participate in our own diagnoses. We decide what symptoms to discuss with our doctors and which medical tests to get. We should also be allowed to participate in decisions about how digital health data about us is generated, collected or analysed. When we have control over our own health-related digital data, it changes how we can ask questions, involve others in our care and understand what information about us is being used - and who else is using it.
Patients and consumers should not have to make trade-offs between using new technology to improve their health and protecting themselves from active discrimination based on their data. Stopping data collection and research on the potential health benefits of mobile device data is not the solution. The public health benefits are worth fighting for. The ability to diagnose and treat diseases using digital biomarkers will rapidly improve.
Already, key examples show the phenomenal things that can happen when people have access to their own data: new kinds of potentially life-saving displays for blood glucose data, such as those the Nightscout project provides; new kinds of questions, hypotheses and self-experiments, such as those shared at Quantified Self meetups; and even new ways of expressing artistic or aesthetic values through our data, as in the work of artists such as Laurie Frick and Stephen Cartwright. Some technology companies argue that mandating consumer control of data would have a chilling effect on their business models or expose proprietary information. But the potential public health benefits are too great to set aside. We have always held health devices to a high standard for privacy and efficacy, and that should not change now. We need to ensure that the tech companies meet the same standard.
Gina Neff is an associate professor of communication at the University of Washington and the author of Self-Tracking (MIT Press)
This article was originally published by WIRED UK