The Dark Side of Wearables: A Potential Fertility Surveillance Network

Paloma Secunda Laretto, MS, Anne Zimmerman, JD, MS, and Olivia Bowers, MS

Topics: Policy, Politics, Public Health, Reproductive Ethics

While there is an abundance of literature noting the risks of period tracking apps in jurisdictions with strict abortion laws, wearables that collect seemingly innocuous biological data pose similar risks. Many users may be unaware that these devices can infer deeply personal reproductive information. With fewer eyes on this type of collection and prediction, such information could be used to coerce or incriminate women in states with restrictive abortion laws.

Protecting only data directly related to fertility is not enough to protect women. Non-fertility-related information from wearables, such as heart rate variability, sleep patterns, age, activity levels, and respiratory rates, can predict pregnancy with a high degree of confidence. This means that even women who never tracked their cycles or connected a cycle tracking app to a wearable are at risk of privacy invasions, whether by a partner or a government entity. While there are some potential benefits, such as early detection, use by a provider, or increased research about pregnancy, the risks of syncing cycle data with wearables are too high in the current legal landscape.

Predictive modeling works by assembling a dataset from opt-in wearable data, training a model on part of it, and then testing the model’s effectiveness on held-out data it has not yet seen. With large enough datasets, predictive models can become extremely accurate.
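For readers unfamiliar with this workflow, the sketch below illustrates the train-and-hold-out-test pattern described above using scikit-learn and synthetic stand-in data; the feature names, labels, and model choice are illustrative assumptions, not any vendor’s actual pipeline.

```python
# A minimal sketch of the train/test workflow described above.
# All data here is synthetic; the features, labels, and model choice
# are assumptions for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic daily summaries: HRV (ms), resting heart rate (bpm),
# sleep duration (hours), respiratory rate (breaths/min), age (years).
X = rng.normal(loc=[45, 62, 7.2, 15, 30],
               scale=[10, 5, 1.0, 1.5, 6],
               size=(5000, 5))
y = rng.integers(0, 2, size=5000)  # placeholder binary label

# Hold out data the model never sees during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Evaluate on the held-out set; with real, large datasets this is where
# a model's predictive accuracy would be measured.
print("held-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```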

Wearables that collect extended data from a single user do not require as extensive or diverse a dataset. Models can compare individual data points to the user’s personal baseline rather than relying on population averages, allowing for more accurate predictions with less data. Moreover, a wealth of data is becoming available due to the widespread use of wearables. In 2016, 15% of US consumers used a wearable device, and over 50% are estimated to own one today. Young women are more likely than men or older individuals to own these devices, making the resulting datasets even more useful for fertility-related predictive modeling.
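The sketch below illustrates what baseline-relative analysis can look like in practice: one user’s resting heart rate is compared against her own trailing 28-day baseline, and sustained deviations are flagged. The window length, threshold, and simulated shift are illustrative assumptions, not a published wearable algorithm.

```python
# A minimal sketch of comparing one user's data to her own baseline.
# The readings, 28-day window, and z-score threshold are assumptions
# chosen for illustration, not any vendor's method.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
days = pd.date_range("2024-01-01", periods=120, freq="D")
resting_hr = rng.normal(60, 1.5, size=120)
resting_hr[90:] += 6.0  # simulate a sustained physiological shift

series = pd.Series(resting_hr, index=days)

# Personal baseline: trailing 28-day mean and spread, excluding today.
baseline_mean = series.rolling(28).mean().shift(1)
baseline_std = series.rolling(28).std().shift(1)
z = (series - baseline_mean) / baseline_std

# Flag days that deviate from the personal baseline for five days running.
persistent = z.abs().rolling(5).min() > 2
print(series[persistent].head())
```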

Wearables also allow users to connect menstruation tracking apps to their wearable app (e.g., Natural Cycles, Cycle Tracking) or have built-in menstrual tracking capabilities. This allows these companies to build massive datasets that can expand the reproductive insights they infer. Some wearables, such as Whoop, are already publishing insights related to pregnancy from user data.

The current legal and ethical framework for data privacy does not recognize a fundamental right to ownership of one’s data, and privacy is often and easily waived by users through initial blanket permissions. Even when users modify permissions or set data collection to private after every software update, data can still be accessed, sold, or stolen. In some cases, users have reported consent violations because data collection continued even after they changed their settings and permissions.

Understanding who has access is essential. Wearable device companies such as Amazon, Garmin, Fitbit, and Apple store personal information in the cloud; Oura uses AWS, and Whoop uses its own servers. Third parties, including insurance companies, clinicians, advertising companies, and even researchers, often have permission to access this data.

Because wearables are not standalone devices, shared accounts may pose risks within intimate relationships. Users may inadvertently give someone in their household or inner circle access to personal information. Policies promoting separate IDs in plans like Apple’s Family Sharing aim to give users more control over who can access their information. This is especially relevant in states with restrictive abortion laws, as parents or romantic partners can engage in reproductive coercion and interfere with the medical decisions of a woman seeking abortion care.

Wearables are also sensitive targets for hackers. Hacking, whether through initial access, credential access, or exfiltration tactics, can occur when wearables are connected to broader networks or cloud systems, to mobile devices such as phones, or to third-party systems. Their limited computing power makes it difficult to process data securely, and many allow weak authentication or easy Bluetooth pairing, leaving the data extremely vulnerable to outside parties. Hacked fertility data could be weaponized to blackmail or publicly expose women, turning pregnancy itself into a tool for coercion and doxing.

Law enforcement has already used biological data from wearables in investigations and in court. For example, Fitbit data was used to show that an accusation of rape was false. Wearable data can replace testimony, conflict with or corroborate witness testimony (speaking to credibility), and support a legal argument.

The Katz test, from a 1967 case, determines whether one has a reasonable expectation of privacy in the context of warrantless searches and seizures under the Constitution. The idea was that if someone reasonably expected privacy, a warrant should be required. Carpenter v. US held that law enforcement violated the Fourth Amendment by accessing historical cellphone location data without a warrant. Like the data from wearables, the location data was shared with a third party. The Court found the defendant had a reasonable expectation of privacy despite the third party’s access to it (limiting the application of the Third-Party Doctrine). Wearable technology companies are generally expected to keep the data private.

The Carpenter Court considered five factors, including the “intimacy and comprehensiveness of the data, the expense of obtaining it, the retrospective window that it offers to law enforcement, and whether it was truly shared voluntarily with a third party.” In Riley v. California, the Supreme Court held that a warrant is necessary to search a cell phone during an arrest, noting that if it were not, law enforcement would have access to intimate details about a person’s life. And in US v. Jones, the Court held that installing a GPS tracker on a defendant’s car without consent violated the Fourth Amendment.

Wearables generate intimate data, and access to this data should require a warrant. The spirit of the Fourth Amendment is to prevent a surveillance state in a democratic society. Even with a warrant, access to biometric data from wearable devices poses a risk, as users may not be aware that their data was tracked or what piecing the data together with other information could reveal.

In a Nebraska case, Facebook faced a subpoena and provided messages demonstrating that a 17-year-old had terminated a pregnancy and buried the fetus. Some period tracking apps, including Proov, have responded by moving their storage to Google Cloud so that the data itself is stored in states with laws protecting reproductive freedom. Similarly, Apple encrypts data, and Bellabeat offers private key encryption to try to protect data from subpoena. Fitness-focused wearables may not scream ‘pregnancy data,’ but they should still match period-tracking apps by offering robust encryption, complete data deletion, and strict privacy safeguards.

The legal risks are compounded by abortion bans and fetal endangerment laws. Even though these laws endanger pregnant women, prosecutors generally go after medical professionals or people attempting abortions outside of clinics. Evidence of pregnancy itself, potentially drawn from wearable data, could be admissible. Because miscarriage and abortion are often indistinguishable early on, this creates serious risks of wrongful inference and invasive surveillance.

Women of childbearing age should exercise extra caution and consider avoiding the use of wearables that collect health data. In states with restrictive abortion laws, data suggesting pregnancy could invite surveillance or even serve as evidence in a prosecution. Beyond the legal system, biological and biometric data in the wrong hands creates new vulnerabilities. As wearable technologies become more pervasive, both users and the public must recognize these risks alongside benefits.

Paloma Secunda Laretto, MS, is a software engineer at User Interviews and an ethics consultant at Oena Consulting.

Anne Zimmerman, JD, MS, CIPP-US, CEET, is the Editor-in-Chief of Voices in Bioethics (Columbia University).

Olivia Bowers, MS, is the Managing Editor of Voices in Bioethics (Columbia University).
