Fertility apps and cybersecurity: who can access your data?

By Dr Maryam Mehrnezhad

Fertility apps have millions of users globally, and Internet of Things (IoT) devices are projected to be valued at $50 billion by 2025. Read our blog from Dr Maryam Mehrnezhad on the risks surrounding fertility app users’ privacy.

Examining the cybersecurity, privacy, and bias of female-oriented technologies (FemTech), such as apps and Internet of Things (IoT) devices, is vital. Such technologies are booming, but they have the potential to cause complex harm.

 

How do fertility apps work? 

Fertility apps collect user-entered data and/or use small sensors to take body measurements (e.g. for ovulation detection or basal body temperature).

By collecting a vast amount of data and processing it with advanced algorithms, including AI, these technologies assist in managing reproductive and sexual health, and give scientists more insight into people's bodies.
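To make the prediction step concrete, here is a minimal sketch of the simple calendar method that basic cycle-tracking apps build on. This is an illustration only, not any specific app's algorithm; the function name and the 28-day default are assumptions, and real apps use far richer data and models.

```python
from datetime import date, timedelta

def fertile_window(last_period_start: date, cycle_length: int = 28):
    """Estimate the fertile window with the simple calendar method.

    Assumes ovulation occurs roughly 14 days before the next period,
    and that the fertile window spans about 5 days before ovulation
    through 1 day after it.
    """
    ovulation = last_period_start + timedelta(days=cycle_length - 14)
    return ovulation - timedelta(days=5), ovulation + timedelta(days=1)

# Example: a 28-day cycle starting 1 March 2024
start, end = fertile_window(date(2024, 3, 1))
```

Even a toy model like this makes the privacy stakes obvious: the inputs (period dates, cycle length) are enough to infer highly sensitive facts about a person's body and behaviour.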

However, the law surrounding this extremely sensitive data is unclear in several respects, for example on user consent, third-party sharing, and algorithmic bias.

 

What have our studies shown?

We have shown that the majority of fertility apps start tracking the user as soon as the app is opened, before any user interaction with the privacy notice.

In fact, research has shown that users of such technologies might experience more severe consequences for the same risk. We have also shown that user privacy is not practiced consistently by service providers across platforms, from PC to mobile to IoT, causing cybersecurity fatigue.

The existing privacy-enhancing technologies (PETs) available in fertility apps do not empower users: reproductive data is often not covered by privacy policies and, in most cases, is disregarded entirely.

 

The risks surrounding sensor-enabled fertility IoT devices 

Sensor-enabled fertility IoT devices introduce new risks. Our research shows that although sensors, such as ambient and motion sensors, can put users at serious risk, users' perception of that risk is far lower.

Previous research in digital women’s health and woman-centred design suggests that privacy is not treated as a pillar in the design of such systems. Yet this sensitive data can put the user at risk.

The risks include violations of women’s human rights, such as abortion tracking, manipulation, and pregnancy-based redundancy. We anticipate that, given the evolving nature of these technologies, more insidious opportunities for abuse will unfold in the future.

 

Who has access to this data?

Fertility tracking apps and abortion have a controversial history. Reportedly, these apps share sensitive data (e.g. sexual activity) with third parties such as Facebook the moment the user opens the app, even if the user has no Facebook account. Other incidents have targeted women in order to mine and exploit their data, for example the case of an ideologically aligned fertility app that was funded and led by anti-abortion, anti-gay Catholic campaigners.

Nonetheless, IoT fertility technologies have yet to attract the interest of security researchers, and only limited analysis of similar technologies is available. User aspects are also considerably understudied.

Research topics such as bias and trust in these technologies have not been explored either. Any bias in these technologies could affect a user’s life significantly. For instance, predictions based on data from users in certain demographics may lead to unwanted pregnancy, or a missed fertile window, for users from other demographics.

 

What's next?

Fortunately, in a recently funded UKRI PETRAS project (CyFer), we will explore these issues in close collaboration with our industrial partner SPD (Swiss Precision Diagnostics, makers of the Clearblue pregnancy test).

We aim to explore this research space with a cross-disciplinary approach. We will identify and evaluate the current security and privacy-enhancing technologies (PETs).

In addition, while data and algorithmic discrimination have been studied in other areas, certain contexts such as FemTech remain significantly understudied. We aim to study bias in the data, algorithms, and AI systems of fertility technologies. We also aim to study users of such systems across societies, to gain a general understanding of how these technologies may inflict differential harms across cultures.

 

Find out more 

Dr Maryam Mehrnezhad is a Research Fellow in the School of Computing. Maryam conducts research on all aspects of the security and privacy of sensing technologies, covering a wide range of sensors (biometric, communication, motion, ambient, etc.) and different platforms (e.g. mobile, IoT, buildings, animals, AgriTech, FemTech). She is specifically interested in cybersecurity and privacy research for marginalised and disadvantaged user groups. To read more from our academics, explore our blog.

