Winter 2019

Carpus Dicit

Wrist-worn digital devices promise to help people manage their health. It's an offer that may have strings attached

Artificial Intelligence Issue

by Monique Brouillette

Amy Newell says she had a breakthrough because of a mobile phone app.

Several years ago, the 43-year-old software engineer was diagnosed with bipolar disorder and has since been trying everything she can to manage the crushing symptoms.

“Unfortunately, they were resistant to medication,” she says.

Seated in a coffee shop, Newell pulls out her phone and scrolls to find the app that tracks her daily symptoms. A calendar appears with about half the days filled with little notes about mood, medications, diet, and exercise. These are all useful data for Newell when she faces difficult treatment decisions. She recalls how this past summer she used the app to track her mood in response to drinking alcohol. Her doctor had been urging her to stop drinking because of potential interactions with a medication, but she resisted. After curbing her drinking for a month, and tracking the results on her app, she found that alcohol was indeed a major precursor of depressive symptoms.

Reflecting on this breakthrough in a recent paper in the Journal of Medical Internet Research, Newell wrote, “My mood data during what is usually my worst month, July, finally convinced me that significantly limiting my alcohol use results in more stability. This is a lesson I was not remotely interested to hear from my psychopharmacologist, but the data, annoying though it was, did not lie and had no agenda.”

Newell is one of a growing number of patients who manage their chronic health conditions with the help of smartphone apps and wearable digital devices. An estimated 250,000 mobile health apps exist on the market today, doing everything from tracking steps and calculating insulin doses to promising to support meditation practice and boost cognitive function. Given the size of this market, it is not hard to imagine a future in which your phone could warn you about potentially deadly conditions such as atrial fibrillation or help you manage medications.

Even health insurers are integrating these devices into their programs. In 2017, UnitedHealthcare, a leading U.S. health insurer, announced partnerships with Fitbit, Garmin, and Samsung, makers of popular wrist-worn activity trackers, allowing members to earn financial incentives for logging more steps each day.

A whole suite of tools is now available to help people manage their health. But this fast-moving, commercially driven trend remains largely outside the oversight of the health care regulatory establishment. Instead, companies often validate their own products, emphasizing usability and “stickiness” over data privacy and clinical efficacy.

Useful? Perhaps. Accurate? Perhaps not.

Earlier this year, Fitbit Inc., the maker of the Fitbit wearable activity trackers, agreed to settle a class-action lawsuit filed in California in 2016 over technical problems with the heart-rate sensor in two versions of its device. Researchers at California State Polytechnic University found that, during moderate- and high-intensity exercise, the devices underestimated heart rate by 15 and 22 beats per minute, on average, compared with electrocardiogram readings. An error of that size is potentially dangerous, especially if the wearer uses the device to gauge maximum heart rate during exercise; a runner trying to stay below 160 beats per minute, for example, could in fact be approaching 180. A 2017 study by scientists at the University of Wisconsin also found inaccuracies in the two devices studied by the California researchers and in wrist-worn activity trackers from two other manufacturers.

“Apps are free or low cost because you’re paying with your personal health data. The business model right now is your data.”

All such activity trackers measure heart rate by shining an LED onto the skin of the wrist. Some of the emitted light is absorbed; some is reflected back. The device senses the small differences in reflection that occur as the heart beats and sends blood pulsing through the arteries, briefly changing their volume. But this pulsatile variation is tiny and can be overpowered by changes in reflection caused by movement. If a person is jogging and jostling around, for example, the heart-rate monitor can miscalculate or even shut down altogether.
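As a rough illustration of the signal processing involved, here is a minimal sketch in Python, assuming a reflected-light signal sampled at 50 Hz; the sampling rate, filter settings, and function names are assumptions for the example, not any manufacturer’s actual algorithm.

```python
# A minimal sketch (not any vendor's algorithm) of turning an optical
# reflectance signal into a heart-rate estimate. Assumes a PPG-like
# waveform sampled at 50 Hz; all names and settings are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 50  # assumed sampling rate in Hz

def estimate_bpm(reflectance: np.ndarray) -> float:
    """Estimate beats per minute from raw reflected-light samples."""
    # Band-pass around plausible heart rates (0.7-3.5 Hz, roughly 42-210 bpm)
    # to suppress slow drift and high-frequency noise.
    b, a = butter(2, [0.7, 3.5], btype="band", fs=FS)
    pulse = filtfilt(b, a, reflectance)

    # Each heartbeat appears as a small peak in reflected light. Motion
    # (jogging, jostling) adds large artifacts that can bury these peaks,
    # which is one reason wrist readings drift during hard exercise.
    peaks, _ = find_peaks(pulse, distance=FS * 0.3,
                          prominence=np.std(pulse) * 0.5)
    if len(peaks) < 2:
        return float("nan")  # too few beats detected to estimate a rate

    # Convert the average interval between detected beats into bpm.
    intervals_s = np.diff(peaks) / FS
    return 60.0 / float(np.mean(intervals_s))
```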

“It is important to realize that, although many apps appear useful, the actual evidence for clinical efficacy is nascent,” says John Torous, MBI ’18, an HMS instructor in psychiatry and director of the Division of Digital Psychiatry at Beth Israel Deaconess Medical Center. “I think we have to demand high-quality evidence.” Torous is involved in an American Psychiatric Association effort to evaluate mental health apps and aid consumers in making informed decisions.

Portrait of John Torous

When it comes to questionable accuracy, activity trackers are not alone. A few years ago, insulin-dose calculator apps came under scrutiny. Patients with diabetes must calculate insulin doses daily, and a number of apps have been developed to make this task easier. Researchers at Imperial College London, however, found that, of nearly fifty apps designed to calculate insulin dosage, about 70 percent risked recommending an inappropriate dose. Roughly 90 percent lacked input validation to ensure that the data entered were correct, allowing mistakes that could lead to dangerous medication errors.
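To show the kind of safeguard the researchers found missing, here is a minimal, hypothetical sketch of a bolus calculator that refuses implausible entries. The formula is the standard carbohydrate-plus-correction textbook calculation; the bounds are illustrative sanity checks, not clinical guidance or any particular app’s code.

```python
# A hypothetical sketch of an insulin-bolus calculator with the kind of
# input validation many of the reviewed apps lacked. The carb-plus-correction
# formula is the standard textbook calculation; the numeric bounds are
# illustrative, not clinical guidance.

def bolus_units(carbs_g: float, glucose_mgdl: float, target_mgdl: float,
                carb_ratio: float, correction_factor: float) -> float:
    """Return a suggested insulin dose in units, refusing implausible inputs."""
    # Reject values that are almost certainly data-entry errors rather than
    # silently computing a dangerous dose from them.
    if not (0 <= carbs_g <= 300):
        raise ValueError("Carbohydrate entry out of plausible range")
    if not (20 <= glucose_mgdl <= 600):
        raise ValueError("Blood glucose entry out of plausible range")
    if carb_ratio <= 0 or correction_factor <= 0:
        raise ValueError("Ratios must be positive")

    meal_bolus = carbs_g / carb_ratio
    correction = max(0.0, (glucose_mgdl - target_mgdl) / correction_factor)
    return round(meal_bolus + correction, 1)
```

Without the range checks, a mistyped carbohydrate entry would simply flow through the arithmetic and come out the other end as an oversized dose.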

The high price of free

A few years ago, the United Kingdom’s National Health Service opened a library of mobile health apps that were required to meet high privacy standards. Inclusion in the library was meant to assure consumers that sensitive health data would not be mishandled. Software developers had to answer a series of questions about their security protocols, questions meant to establish whether the apps would meet the NHS’s medical privacy standards.

A review in 2015 by researchers at Imperial College, however, found that many of the apps stored medical data in ways that left it vulnerable to interception. Nearly 20 percent of the apps had no privacy policy at all. Two-thirds of the apps that sent identifying information over the internet did so without encryption, and nearly 80 percent of those had data-sharing policies that said nothing about encryption practices. Four apps transmitted both health and identifying information without encryption.

The NHS example underscores the difficulty of keeping health data secure in a market where personal data drive profits.

“Apps are free or low cost because you’re paying with your personal health data,” says Torous. “The business model right now is your data.”

Joseph Zurba, the information security and IT compliance officer at HMS, reviews the security of apps and wrist-worn digital devices used in HMS research studies. He thinks securing digital health technology is more difficult than securing traditional medical record data because these products face constantly evolving threats from hackers and malware. He points to the data breach earlier this year at MyFitnessPal, a popular fitness-tracking app, in which roughly 150 million accounts were compromised and the attackers made off with user names, hashed passwords, and email addresses. Balancing data privacy with the data-collection potential of these devices, Zurba says, is a complicated dance.

“Companies are really interested in aggregate information. But when you talk about genetic information, I think people should be going into it with their eyes open.”

“We certainly can’t say ‘Never use these devices,’ ” says Zurba. “They can be very useful from a research perspective.” Zurba points to Apple’s ResearchKit as a tool that can gather data on such things as how many steps you take, the length and cadence of your stride, and the number of flights of stairs climbed. When those data are combined with, for example, heart-rate data gathered from a wrist-worn digital device, he says, you have a wealth of data for use in research.
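As a rough sketch of the kind of combination Zurba describes, the snippet below aligns two hypothetical exports, step counts and heart-rate readings, by timestamp. The file names and column names are invented for the example; this is generic Python rather than ResearchKit code.

```python
# A hypothetical sketch of combining two streams a study might collect:
# step counts and heart-rate readings, each exported as a timestamped CSV.
# File and column names are invented; this is not ResearchKit or HealthKit
# code, just a generic way to line up the two streams for analysis.
import pandas as pd

# Assumed exports: steps.csv has columns (timestamp, steps);
# heart_rate.csv has columns (timestamp, bpm).
steps = pd.read_csv("steps.csv", parse_dates=["timestamp"]).sort_values("timestamp")
heart = pd.read_csv("heart_rate.csv", parse_dates=["timestamp"]).sort_values("timestamp")

# Pair each step-count sample with the nearest heart-rate reading taken
# within five minutes, leaving gaps where no reading is close enough.
combined = pd.merge_asof(
    steps, heart,
    on="timestamp",
    direction="nearest",
    tolerance=pd.Timedelta("5min"),
)

# Summarize by hour: total steps alongside average heart rate,
# the kind of joined view a researcher might analyze.
hourly = combined.set_index("timestamp").resample("1h").agg(
    {"steps": "sum", "bpm": "mean"}
)
print(hourly.head())
```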

Portrait of Joseph Zurba

To help secure participant data on such devices, Zurba has come up with creative workarounds: during one study, he asked research participants to create email accounts using false names and birthdays so that their true identities would be protected when data were stored with third-party services.

For I. Glenn Cohen, the James A. Attwood and Leslie Williams Professor of Law at Harvard Law School, the degree of worry over our data should vary depending on which data are being considered. When consumers worry about losing data collected by their tracking devices, he urges them to maintain perspective. If, however, they are using apps that handle more sensitive information, such as the genetic data that life insurance companies request to determine health risks, he believes ample caution is warranted. While there are laws that prevent employers and health insurers from discriminating against people based on their genetic profiles, most states still allow life insurers to request genetic data if they are available.

“The kind of data you’re generating with Fitbit and the like are unlikely to be usable at the individual level,” says Cohen. “Companies are really interested in aggregate information. But when you talk about genetic information, I think people should be going into it with their eyes open.”

Health apps and wearable digital devices generally fall outside the Health Insurance Portability and Accountability Act, which governs patient privacy for clinicians and health insurers. The companies that make these products can therefore supply them to workers and sell the data the products gather.

But some still worry that health insurance companies could deny coverage based on unhealthy habits or underlying conditions discovered through apps and wearable digital devices. If the Affordable Care Act’s protections for people with pre-existing conditions were removed, it is conceivable that collected data revealing an underlying condition could be used to deny coverage.

A control issue

Technology now evolves faster than the regulatory bodies that govern it. One agency confronting the outcomes of this evolution is the U.S. Food and Drug Administration. Unlike traditional hardware-based medical devices, products that occupy the digital health technology space are often software-based. Keeping up with the pace of software development—and the innovation in digital health technologies that the FDA seeks to foster—has led the agency to an innovation of its own: the Software Precertification (Pre-Cert) Pilot Program, launched in 2017. According to the FDA, the program will “help inform the development of a regulatory model to assess the safety and effectiveness of software technologies without inhibiting patient access to these technologies.” The program addresses a product category called SaMD, software as a medical device.

At its simplest, the new program would certify companies rather than individual products, and, much like the Transportation Security Administration’s program allows preapproved passengers expedited security checks, would allow precertified companies a faster route to FDA clearance. Apple, Fitbit, and Samsung are among the nine companies currently participating in the pilot program.

Some consider the FDA’s move to be controversial, especially if it means products will not need to show research that indicates actual health benefits for consumers. Three senators, including Elizabeth Warren from Massachusetts, have sent a letter to the FDA questioning whether the program will allow companies to essentially self-regulate.

Torous says the program is a novel approach, but “there are still many questions about how it will actually work and even if it will actually work in practice.”

Dan Webster is one of those reassured by the FDA’s involvement in software. Webster, a principal scientist for digital health at Sage Bionetworks, a nonprofit that builds and uses collaborative tools to support the integration of data science into biomedical research, is the scientific lead on the digital health effort at the All of Us Research Program. The program, launched in 2016, is an element of the Precision Medicine Initiative in the National Institutes of Health. Its goal: to enroll one million people living in the United States; collect data on their health, demographics, genomics, and behavior; and, ultimately, make that data available to medical professionals, researchers, and patients who seek to work collaboratively on making health care decisions.

“The All of Us program was designed with patients as active partners from its inception, full transparency in its approach and data collection, and clear privacy standards that place participants in control of their data and how it is shared,” says Torous, who is also an adviser for the arm of the project focused on designing a smartphone mood-tracking app.

The program includes the use of a variety of apps and wearable digital devices to collect far more, and far more granular, data than clinical visits alone would.

Webster says the privacy policies at All of Us are still being formed, and developers will carefully consider many precautions to ensure participant privacy. One thing they are considering, he says, is masking data like latitude and longitude. If the All of Us researchers need location information in order to study the effects of, say, airborne particulate matter, they might collect only the first three digits of a zip code to make the geographic data less identifiable. He notes that the privacy policies and protocols for All of Us are posted online.
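A minimal sketch of that kind of geographic coarsening might look like the following; the record format and function names are assumptions for illustration, not the All of Us program’s actual implementation.

```python
# A minimal sketch of the geographic coarsening described above: strip
# precise coordinates and keep only the first three digits of a ZIP code,
# so location data remain useful for environmental studies but are harder
# to tie to an individual. Field and function names are illustrative.

def coarsen_zip(zip_code: str) -> str:
    """Reduce a five-digit ZIP code to its three-digit prefix."""
    digits = "".join(ch for ch in zip_code if ch.isdigit())[:5]
    if len(digits) < 5:
        return ""  # drop malformed entries rather than guess
    return digits[:3] + "XX"

def mask_record(record: dict) -> dict:
    """Remove latitude/longitude and coarsen the ZIP code in one record."""
    masked = {k: v for k, v in record.items()
              if k not in ("latitude", "longitude")}
    masked["zip3"] = coarsen_zip(masked.pop("zip", ""))
    return masked

# Example: a hypothetical participant record before and after masking.
print(mask_record({"id": "p001", "zip": "02115",
                   "latitude": 42.34, "longitude": -71.10}))
# -> {'id': 'p001', 'zip3': '021XX'}
```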

Medical applications like the one that helped Newell are just the beginning. Mobile health devices are earning a place in our health care system, offering patients and researchers the opportunity to gather vast amounts of data on a scale and granularity never before attainable. While these apps will likely be used in the clinic, the toll they might exact on patient safety and privacy remains to be determined.

Monique Brouillette is a Massachusetts-based science writer.

Images: Dung Hoang (illustration, top); John Soares