Women’s Health: FemTech is set for growth, but how do we balance trust, privacy and innovation?


Guest blog post by Natasha Singh

Health and lifestyle products and services have so far catered mainly for the needs of men. In recent years, a sector dedicated to women has emerged and is set to grow: FemTech (Female Technology).  

According to Frost & Sullivan market research, the global FemTech market has the potential to reach $9.4 billion by 2024.

FemTech comprises digital applications, devices, products and services that cater to the needs of women’s health. Common solutions in the market range from beauty products, fertility solutions, reproductive health, menstrual tracking apps, pregnancy and post-pregnancy care to menopause management, but the sector also covers general health areas such as bone care, cancer and other chronic diseases. These solutions take various forms: wearables or devices paired with mobile apps, standalone wearables, devices or apps, internet-connected medical devices, and others.

The level of sophistication can vary, with some offering simple tracking services while others are paired with sensor-integrated devices that generate large amounts of sensitive information in real time.

Because FemTech addresses women’s health, it relies on the processing of personal information. The personal information collected depends on the purpose of the product: it can range from user profile details such as name, age and contact information to very sensitive personal information such as menstruation data, sex life, pregnancy status and health history (also known as ‘special category data’). This processing falls within the purview of the General Data Protection Regulation (GDPR) and the ePrivacy Directive, exposing FemTech to data protection and security issues.

Does the law strike the right balance between protecting privacy and promoting trust, innovation and competitiveness?

Myths vs. Facts

Sometimes, privacy and data protection are perceived as a threat to innovation and competitiveness, or the GDPR as just another problematic compliance requirement that stifles innovation and growth.

Our view is that the GDPR is sufficiently flexible and technology-agnostic to strike the correct balance between privacy, trust and innovation. The law essentially embraces a ‘risk-based approach’: throughout the GDPR, organisations that control the processing of personal data (‘controllers’) are encouraged to implement protective measures which correspond to the level of risk of their data processing activities. The concept of risk analysis most notably appears in the “technical and organisational” measures controllers must implement to ensure adequate data security.

These measures can vary from data protection impact assessments, privacy by design, staff training, pseudonymisation or encryption to the ability to restore access to data after a security incident.
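To make one of these measures concrete, the sketch below shows a possible approach to pseudonymisation: replacing a direct identifier with a keyed hash so that health records are never stored against the raw user ID. The key, function and field names are illustrative assumptions, not a prescribed implementation.

```typescript
import { createHmac } from "node:crypto";

// Illustrative only: an HMAC (keyed hash) turns a direct identifier into a
// pseudonym. The secret key must be stored apart from the pseudonymised
// records (e.g. in a key management service), or re-identification is trivial.
const PSEUDONYM_KEY = process.env.PSEUDONYM_KEY ?? "dev-only-key"; // hypothetical

function pseudonymise(userId: string): string {
  return createHmac("sha256", PSEUDONYM_KEY).update(userId).digest("hex");
}

// A cycle-tracking record stored against the pseudonym, not the raw user ID.
interface CycleRecord {
  subject: string;    // pseudonym
  recordedAt: string; // ISO 8601 timestamp
  cycleDay: number;
}

const record: CycleRecord = {
  subject: pseudonymise("user-42"),
  recordedAt: new Date().toISOString(),
  cycleDay: 14,
};
```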

The concept of risk analysis is purely contextual. It is defined by reference to the “likelihood and severity” of a negative impact on people’s rights. Controllers should account for “the nature, scope, context and purposes of the processing” as every organisation is different. 

Even in enforcement, the GDPR requires data protection authorities to consider the risk level of the activity when deciding whether to impose fines, which must be “effective, proportionate and dissuasive.”

So, the law is not a challenge to innovation, nor was it designed to be.

The next question is: how to embed privacy and data protection to drive trust and innovation in FemTech? 

The sensitivity of the data processed by FemTech can be very high, and a single data breach could be enough to lose customers’ trust, not just in the products or services but in the core purpose of the FemTech business itself, to say nothing of the reputational damage. Trust, once lost, is not easily regained.

There are three pillars that a FemTech business must place at the core of its business model to enable trust and innovation: transparency, control, and privacy and security by design.

Transparency and access to information 

There must be transparency about digital business practices, the business model and data processing operations, especially data sharing. Disclosures in privacy notices and business terms must be concise, easy to understand and conveyed in plain language. Creative presentation techniques should be adopted for disclosures, such as tabular formats, layered designs, privacy and security “nutrition” labels, and notice highlights followed by a link to the full privacy notice, to assist consumer education.
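By way of illustration, a layered notice might be modelled as a set of plain-language highlights, each linking through to the relevant section of the full privacy notice. The structure and URLs below are hypothetical.

```typescript
// Hypothetical structure for a layered privacy notice: short, plain-language
// highlights shown first, each linking to the relevant part of the full notice.
interface NoticeLayer {
  highlight: string;  // one-sentence disclosure in plain language
  detailUrl: string;  // link into the full privacy notice
}

const layeredNotice: NoticeLayer[] = [
  {
    highlight: "Your cycle data stays on your device unless you say otherwise.",
    detailUrl: "/privacy#storage", // hypothetical anchor
  },
  {
    highlight: "We never share health data without your explicit opt-in.",
    detailUrl: "/privacy#sharing", // hypothetical anchor
  },
];
```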

It is also beneficial to offer users conversational interfaces, such as helpdesks and chat platforms, for when they have questions, so that they can weigh their options and make decisions with confidence, empowered by information.

Where FemTech is powered by artificial intelligence (AI) and machine learning, organisations must explain decisions made by automated systems to the users affected by them, taking into account the requirements of the GDPR (at least in cases of solely automated decisions), which cover the following (a sketch of how these requirements might be surfaced to users follows the list):

    • Providing meaningful information about the logic, significance and envisaged consequences of the AI decision;

    • The right to object; and

    • The right to obtain human intervention. 
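
As a sketch, here is one hypothetical way an app could package these requirements into the notice it shows an affected user; all type names, field names and routes are assumptions for illustration, not a prescribed design.

```typescript
// Hypothetical shape for communicating a solely automated decision
// (e.g. an AI-generated risk flag) to the affected user.
interface AutomatedDecisionNotice {
  decision: string;              // what was decided
  logicSummary: string;          // meaningful information about the logic
  significance: string;          // why it matters to the user
  envisagedConsequences: string; // what happens as a result
  objectUrl: string;             // endpoint to exercise the right to object
  humanReviewUrl: string;        // endpoint to request human intervention
}

const notice: AutomatedDecisionNotice = {
  decision: "Elevated-risk flag applied to your profile",
  logicSummary:
    "Based on cycle-length variance and self-reported symptoms over 6 months.",
  significance: "You may be shown additional screening reminders.",
  envisagedConsequences: "No data is shared externally as a result.",
  objectUrl: "/account/decisions/123/object",      // hypothetical route
  humanReviewUrl: "/account/decisions/123/review", // hypothetical route
};
```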

Putting the reins in the hands of the users when it comes to their data

We know that the data gathered by FemTech is a goldmine. In 2019, Privacy International analysed menstruation apps and found that they raised serious concerns about GDPR obligations such as consent and transparency. Of the 36 apps tested, 61% automatically transferred data to Facebook the moment a user opened the app, and some routinely sent Facebook incredibly detailed and sometimes sensitive personal data. It didn’t matter if users were logged out of Facebook or didn’t have an account.

In such situations, users are no longer in control of their data: they can neither express their choices nor have those choices respected.

One practical way of implementing choice is to design a customer privacy centre where users can see at a glance what data is being collected and the purposes for which it is used. They should be able to decide on third-party sharing of their data in exchange for personalised benefits (explicit consent must be sought where health data is involved); change their marketing and communications preferences at any time; and meaningfully exercise their rights, including easy downloads of their data.
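As a minimal sketch, the consent model behind such a privacy centre might look like the following, with every purpose defaulting to off and a single gate consulted before any consent-based processing. All names are illustrative assumptions.

```typescript
// Hypothetical consent model backing a customer privacy centre.
// Every purpose defaults to false: nothing happens until the user opts in.
interface PrivacyPreferences {
  thirdPartySharing: boolean;  // explicit consent required for health data
  marketingEmails: boolean;
  pushNotifications: boolean;
  updatedAt: string;           // when the user last changed their choices
}

const defaults: PrivacyPreferences = {
  thirdPartySharing: false,
  marketingEmails: false,
  pushNotifications: false,
  updatedAt: new Date().toISOString(),
};

// Central gate consulted before any processing tied to a consentable purpose.
function hasConsent(
  prefs: PrivacyPreferences,
  purpose: keyof Omit<PrivacyPreferences, "updatedAt">
): boolean {
  return prefs[purpose] === true;
}

if (hasConsent(defaults, "thirdPartySharing")) {
  // Only here may data leave the app in exchange for personalised benefits.
}
```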

Privacy and Security by Design 

In-depth data protection impact assessments must be undertaken when designing applications, with consideration for users and the potential harms they could experience in the event of a breach, based on the nature of the processing.

The data collected must be limited (data minimisation), and unnecessary data must not be collected merely to enrich a user’s profile. Only data necessary for the app’s stated purpose should be collected (purpose limitation). Data sharing must be limited to what is strictly necessary to provide the services. Default data-sharing settings in tools provided by third parties (for instance, Facebook’s Software Development Kit (SDK) in the scenario above) must be avoided: data should flow to a third party only after the user opts in, as the sketch below illustrates.
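The sketch below shows the opt-in principle in practice: a wrapper around a third-party SDK that is never initialised by default, so no data can leave the app until the user has actively consented. The interface and names are hypothetical, not any vendor’s real API.

```typescript
// Hypothetical third-party analytics SDK surface.
interface AnalyticsSdk {
  init(appId: string): void;
  logEvent(name: string, params?: Record<string, unknown>): void;
}

let sdk: AnalyticsSdk | null = null;

// The SDK is initialised only after an explicit opt-in,
// never automatically on app launch.
function enableAnalytics(vendorSdk: AnalyticsSdk, optedIn: boolean): void {
  if (!optedIn) return; // no opt-in: no SDK, no data leaves the device
  sdk = vendorSdk;
  sdk.init("app-id-placeholder"); // placeholder, not a real app ID
}

// Events are dropped silently unless consent was given.
function track(name: string): void {
  sdk?.logEvent(name);
}
```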

These data protection impact assessments must inform the appropriate technical controls to be implemented, based on the level of risk presented by the processing activities.

It would be reasonable to require organisations which collect, store and use highly sensitive personal information to implement high-tech safeguards, and those whose processing presents low privacy risks to users to implement low-tech safeguards. This is essential to find the right balance between protecting users’ personal health information and promoting innovation.

Ethics 

Apart from the above three pillars, there is also a wider ethical angle to technology. Potter Stewart, Associate Justice of the U.S. Supreme Court, once said, “Ethics is knowing the difference between what you have a right to do and what is right to do.”

Because technology can enable surveillance, an organisation’s willingness to work towards its users’ welfare by demonstrating data responsibility beyond mere compliance can generate higher levels of trust and credibility.

As Emily Leach and Kevin Donahue outlined in a recent IAPP article, ethics “can be a tough sell internally because it may involve trading in short-term goals for long-term benefits. But implementing ethical standards around data handling will help breed customer trust and loyalty, establish a platform for future regulatory success and give you a story to showcase your program to a range of stakeholders. After all, who wants to tout their success in doing the minimum, right?”

Natasha Singh is a Principal Consultant in the Cybersecurity and Privacy Practice at Gemserv. She has 11 years of experience in regulatory risk and compliance and specialises in privacy, new technologies and global data protection law. She supports organisations with their global privacy programmes, governance and privacy by design, and acts as an outsourced Data Protection Officer for clients operating in several sectors. Read more about Natasha in our interview, connect with her via LinkedIn or contact her via email.