Fertility apps house the sensitive data of millions of users globally. Read our blog from Dr Maryam Mehrnezhad on the risks surrounding fertility app users’ privacy.
Examining the cybersecurity, privacy, and bias in period-orientated technologies is vital. Such technologies are booming, but they have the potential to cause complex harm.
How do fertility apps work?
Fertility apps collate user-entered data and/or use small sensors to take body measurements, such as basal body temperature, in order to detect ovulation.
By collecting a huge amount of data and processing it through advanced algorithms, such as AI, these technologies assist in managing reproductive and sexual health, and give scientists more insight into people's bodies.
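The kind of prediction these apps make can be illustrated with a toy sketch. This is a simple calendar-method estimate, not any real app's algorithm; commercial apps combine far more data (sensor readings, cycle history, machine learning) in proprietary ways.

```python
from datetime import date, timedelta

def fertile_window(last_period_start: date, cycle_length_days: int = 28):
    """Toy calendar-method estimate of the fertile window.

    Assumes ovulation occurs roughly 14 days before the next period,
    with a roughly 6-day fertile window ending on ovulation day.
    Real apps refine this with sensor data such as basal body
    temperature.
    """
    ovulation = last_period_start + timedelta(days=cycle_length_days - 14)
    return ovulation - timedelta(days=5), ovulation

start, end = fertile_window(date(2023, 3, 1), cycle_length_days=28)
print(start, end)  # 2023-03-10 2023-03-15 for a textbook 28-day cycle
```

Even this trivial calculation shows why the underlying data is so sensitive: a single date plus a cycle length is enough to infer intimate health information.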
However, there's a sinister lack of clarity in the laws surrounding this extremely sensitive data, for example around user consent, third-party sharing, and algorithmic bias.
What have our studies shown?
We have shown that the majority of fertility apps start tracking the user right after the app is opened, before there has been any user interaction with the privacy notice.
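The ordering problem can be made concrete with a minimal sketch. The class and function names here are hypothetical (no real SDK is shown); the point is that event logging should be gated on recorded consent rather than enabled unconditionally at launch.

```python
# Minimal sketch of consent-gated analytics (hypothetical names,
# not a real SDK). Many apps effectively call start_tracking() at
# launch; the safe pattern defers it until consent is recorded.

class Analytics:
    def __init__(self):
        self.tracking = False
        self.events = []

    def start_tracking(self):
        self.tracking = True

    def log(self, event: str):
        # Drop any event that arrives before the user has consented.
        if self.tracking:
            self.events.append(event)

def on_app_open(analytics: Analytics, user_consented: bool):
    # Correct order: check consent *before* enabling tracking.
    if user_consented:
        analytics.start_tracking()
    analytics.log("app_open")

a = Analytics()
on_app_open(a, user_consented=False)
print(a.events)  # [] -- nothing is logged without consent
```

In the apps we studied, the equivalent of `start_tracking()` ran before the privacy notice was ever shown, so the consent check came too late to matter.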
In fact, research has shown that users of such technologies might experience more severe consequences for the same risk. We have also shown that user privacy is not consistently practiced by service providers across platforms, from PC to mobile to IoT, causing cybersecurity fatigue.
Users are not empowered by the existing privacy-enhancing technologies (PETs) available in fertility apps, since reproductive data is often not covered by privacy policies and, in most cases, is completely disregarded.
The risks surrounding sensor-enabled fertility IoT devices
Sensor-enabled fertility devices introduce new risks.
Our research shows that although sensors, such as ambient and motion sensors, can put users at serious risk, users perceive these risks as far less severe than they are.
Previous research in digital women’s health and woman-centered design suggests that privacy is not considered an important design component. And this is despite the risk of women’s human rights violations, such as abortion tracking, manipulation, and pregnancy-based redundancy.
But that's not all. We anticipate that due to their evolving nature, more insidious opportunities will unfold in the future.
Who has access to this data?
Fertility tracking apps and abortion have a controversial history.
Reportedly, these apps share sensitive data (e.g. sexual activity) with third parties such as Facebook the moment the user opens the app, even if the user has no Facebook account.
Other incidents involve targeting women to mine and exploit their data: for example, the case of an ideologically aligned fertility app that was funded and led by anti-abortion, anti-gay Catholic campaigners.
Despite this, fertility technologies have yet to attract the interest of security researchers, and only limited analysis of similar technologies is available. User aspects are also considerably understudied.
Research topics such as bias and trust in these technologies have not been explored either. Any bias in these technologies could affect users’ lives significantly. For instance, predictions based on data from certain demographics may lead to unwanted pregnancy and/or a missed fertile window for users from other demographics.
Fortunately, in a recent UKRI-funded PETRAS project (CyFer), we will explore such issues in close collaboration with our industrial partner SPD (Swiss Precision Diagnostics, makers of the Clearblue pregnancy test).
We aim to explore this research space with a cross-disciplinary approach. We will identify and evaluate the current security and privacy-enhancing technologies (PETs).
In addition, while data and algorithm discrimination has been studied in other areas, certain contexts such as FemTech are significantly understudied.
We aim to explore the bias in data, algorithms, and AI systems in fertility technologies. We also aim to study users of such systems across societies, to gain a general understanding of how these technologies may inflict differential harms across cultures.
Find out more
Dr Maryam Mehrnezhad is a Research Fellow in the School of Computing. Maryam conducts research on all aspects of the security and privacy of sensing technologies on a wide range of sensors (biometric, communicational, motion, ambient, etc.) and across different platforms. She is specifically interested in cybersecurity and privacy research for marginalised and disadvantaged user groups. To read more from our academics, explore our blog.