
A Feeling Of Foreboding: The Privacy Risks Of Emotion-Reading Wearables

Even as we haemorrhage our personal data, we have always been able to count on one last secret: with enough self-control, nobody can read our minds. Now, be prepared for even that to change.

For many technologists, the final frontier in the quest for personal information is what we’re feeling. For a marketer or a politician, the ability to read our emotions could be the most powerful tool of all.

Vendors have been rushing to develop products that can do just that. Some propose using technology that taps into brain data (a field known as neurotechnology). Apple, for example, has patented brain-reading AirPods. Earlier this year, researchers at South Korea’s Ulsan National Institute of Science and Technology (UNIST) created a mask they claim can analyse imperceptible changes in the wearer’s face to read emotions in real time.

Some initiatives don’t require new gadgets to read emotions; they use data from the ones we already own. As far back as 2018, the U.S. National Institutes of Health found that smartwatch data enabled researchers to determine wearers’ emotions with a decent degree of accuracy. Today, the Holly system from children’s hospital Holland Bloorview uses heart data gathered from smartwatches and fitness trackers to ascertain children’s emotions, which it then displays in an app for caregivers.
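To make the idea concrete, here is a minimal, purely illustrative sketch of how a window of heart readings from a smartwatch might be mapped to a coarse emotional-arousal label. Every feature, threshold, and label below is an assumption chosen for demonstration; it is not how Holly or the NIH researchers actually model emotion.

```python
# Illustrative sketch only: a toy mapping from smartwatch heart data to a coarse
# arousal label. Thresholds, features, and labels are placeholder assumptions,
# not the method used by Holly or the NIH study.

from dataclasses import dataclass
from statistics import mean, pstdev


@dataclass
class HeartSample:
    """One wearable reading: beats per minute and the gap (ms) between beats."""
    bpm: float
    rr_interval_ms: float


def classify_arousal(samples: list[HeartSample], resting_bpm: float = 65.0) -> str:
    """Map a window of heart readings to a coarse arousal label.

    Elevated heart rate with low beat-to-beat variability is treated here as a
    rough proxy for stress or excitement; the cut-offs are arbitrary.
    """
    avg_bpm = mean(s.bpm for s in samples)
    rr_variability = pstdev(s.rr_interval_ms for s in samples)

    if avg_bpm > resting_bpm * 1.25 and rr_variability < 30:
        return "high arousal (possible stress or excitement)"
    if avg_bpm < resting_bpm * 0.95 and rr_variability > 50:
        return "low arousal (calm, relaxed)"
    return "neutral"


if __name__ == "__main__":
    # Ten seconds of slightly elevated readings.
    window = [HeartSample(bpm=88 + i, rr_interval_ms=680 - i * 2) for i in range(10)]
    print(classify_arousal(window))  # "high arousal (possible stress or excitement)"
```

In practice, systems like Holly pair richer signals and trained models with clinical input rather than fixed cut-offs; the point is simply that ordinary wearable data provides enough raw material to start inferring how someone feels.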

Emotion-scanning wearables have a bright side. A research project at the University of the West of Scotland uses a mixture of facial analysis and wearable-device data to help understand emotions in people on the autism spectrum. Feel Therapeutics offers a wearable and a back-end data-analysis system that analyses emotions for medical treatments, including mental health therapy.

Some emotion-reading initiatives focus on employee well-being, especially in high-stress professions. The UK’s Oscar Kilo police wellbeing programme has trialled wearables to monitor the emotional state of officers in especially stressful roles, including firearms duties and child exploitation investigations.

Privacy

Alongside these positive uses, Nita Farahany, Robinson O. Everett Distinguished Professor of Law & Philosophy at Duke Law School, sees a potential dark side.

“The brain data that these devices will collect won’t be collected in traditional laboratory environments or in clinical research studies run by physicians and scientists,” she said, speaking at a TED event.

“Instead, it will be the sellers of these new devices. The very companies who have been commodifying our data for years,” she added. She argues that this data will be more personal than the data we’ve shared in the past because it represents the part of us that we usually hold back.

The Neurorights Foundation, a non-profit group with its origins at Columbia University, focuses on protecting human rights from the potential abuse or misuse of neurotechnology. In April 2024, it published a report called Safeguarding Brain Data: Assessing the Privacy Practices of Consumer Neurotechnology Companies. It assessed 30 companies offering non-medically certified consumer devices that gathered neural data and found that 29 “appear to have access to the consumer’s neural data and provide no meaningful limitations to this access.” It added that those companies’ policies enabled them to share data with third parties.

From Data Analytics To Remote Control

What are the possibilities as companies become more adept at reading and registering emotions? At the mundane end of the spectrum, understanding what excites or bores people could supercharge tasks such as split testing different website designs. More interesting is the ability to build a profoundly personal picture of someone’s interests, one that goes far beyond tracking what they browse online.

Companies might also use emotional data in making commercial decisions. For example, a health or life insurance firm might consider your emotional state when assessing your suitability for a policy.

Perhaps one of the most chilling scenarios is the idea of emotional feedback loops. If you thought a company feeding you social media posts based on what you’re clicking on was creepy, imagine it drip-feeding you content based on how it knows you’re feeling. Facebook experimented with deliberately manipulating people’s emotions via social media more than ten years ago.

These emotional loops could be devastating. As technology evolves to seem more personable, its ability to influence emotions is becoming as powerful as its ability to detect them. We’ve already seen accusations of generative AI products contributing to suicides.

Legal Protections

Regulators could invoke some existing laws to prevent oversteps. While few of the companies analysed in the Neurorights Foundation report explicitly acknowledged brain data as personal information, lawyers suggest that it would fall squarely into that category under some data protection and privacy regulations, such as the EU’s GDPR. The EU’s AI Act also explicitly regulates AI-powered systems that use biometric data for emotion recognition, banning them outright in some environments, such as workplaces and education, and categorising those that are allowed as high-risk systems with strict regulatory requirements.

Woefully poor federal privacy protection in the U.S. won’t offer privacy advocates the same solace, but states are stepping up. Some may be able to tackle emotion-reading wearables under existing consumer privacy laws, while others are moving to single out brain-based data. In February, Colorado passed a bill explicitly protecting neural data, and California passed its own bill in October. Another approach might be to classify emotional data as biometric information.

Work on emotion-detecting wearables continues apace, and the market is growing quickly. Emotion recognition technology was worth an estimated $23.5bn in 2022 and is projected to reach $42.9bn by 2027, a compound annual growth rate of 12.8% over five years. There is plenty of money at stake. By contrast, the legal and ethical work in this area is still embryonic. Treating neural and emotional data as personal data, if not special category data with extra protections, is a smart move in the short term, whether or not regulators have tested the issue. With something this powerful, the more caution, the better.
