
Neural data privacy is an emerging issue as California enacts privacy laws

Neurobiologist Rafael Yuste had what he calls his “Oppenheimer moment” a decade ago after learning that he could take over the minds of mice by turning on specific neurons in their brains with a laser.

While Yuste was initially excited about how the discovery could help schizophrenics suffering from hallucinations, his euphoria faded when he realized the serious impact the breakthrough had on humans – whose neural data could one day be manipulated in the same way.

Neural data is already being collected from people – many of them gamers and meditation practitioners – and sold to third parties, Yuste said in an interview.

Without strict regulation, data brokers may soon be able to widely sell neural data they have collected and stored in databases that catalog individuals and their “brain fingerprints,” said Yuste, a neuroscience professor and director of the NeuroTechnology Center at Columbia University.

“There could be a total elimination of privacy because the brain is… the organ that generates all your mental activity,” Yuste said. “If you can decode your mental activity, you can decode everything that you are – your thoughts, your memories, your imagination, your personality, your emotions, your conscious mind, even your subconscious mind.”

Inspired to create an organization dedicated to protecting people's neural data, Yuste co-founded the NeuroRights Foundation with a prominent human rights lawyer in 2021 and has since been discussing the need for regulation with lawmakers nationwide.


Image: Rafael Yuste

The group has made some progress: On Saturday, California Gov. Gavin Newsom signed a law extending existing privacy protections to neural data; the bill had passed both chambers of the state Legislature unanimously.

Under the law, consumers can now request access to, delete, correct and limit the neural data that companies collect from them.

In April, Colorado Gov. Jared Polis signed the country's first such law, which, like California's, expanded the definition of “sensitive data” covered by the state's privacy law to include data generated by the brain, spinal cord or the network of nerves that runs throughout the body.

Yuste played a key role in passing the laws in California and Colorado and said he is currently talking to lawmakers in four other states about passing similar measures. At his urging, Sen. Maria Cantwell (D-WA) included a neural privacy provision in the latest draft of a comprehensive privacy bill she introduced in April, Yuste said.

Data protection advocates believe that regulation is urgently needed.

Tech giants are now exploring how to mine neural data. Last September, Apple filed a patent for a future version of its AirPods that could monitor brain activity through sensors in users' ears, while Meta was reportedly researching a new smartwatch with a “neural interface.”

According to a report published in April by Yuste's foundation, companies that already pull data from consumers' brains are sharing that data with third parties. The companies most often collect the data through electroencephalogram headsets, devices traditionally used to diagnose epilepsy, brain tumors or strokes, Yuste said.

One of the 30 companies studied by the foundation collected millions of hours of brain signals from consumers, Yuste said. All but one of the companies “took possession” of the brain data they collected, and more than half sold the brain data to unknown third parties.

“It could be the Russian military, it could be any third party, and of course once you do that, the third party is no longer bound by the consumer user agreement,” Yuste said. “Brain data couldn’t be less protected.”

Currently, many of the contexts in which brain data is used are “low-level,” Yuste said, but the field is advancing quickly.

In December, a team of scientists managed to decipher a volunteer's “mental language,” or silent thoughts, he said.

The scientists' discovery could lead to useful consumer products that allow people to dictate or perform Google searches just by thinking, Yuste said. But the experiment also shows that humans are “halfway toward decoding a person’s mental processes,” he said.

The practice of companies collecting, storing and selling neural data is still emerging, but it is growing quickly, and concerns about privacy and discrimination abound, said Calli Schroeder, global privacy adviser at the Electronic Privacy Information Center. She is working with the UK's data protection regulator, the Information Commissioner's Office, to help it develop guidance for businesses on the issue.

As part of this work, Schroeder has spoken with some neuroscientists who believe that neural data is as individually identifiable as a fingerprint, she said.

The lack of federal neural privacy laws for non-medical uses of the data – medical applications are regulated by the Food and Drug Administration and covered by the Health Insurance Portability and Accountability Act (HIPAA) – means there is nothing stopping companies from creating databases filled with the brain scans of millions of consumers.

This information could be used to discriminate against neurodivergent or mentally ill people, Schroeder said.

She envisions a future in which brain scans could also be used to determine who to target ads to or to dictate employment and lending decisions.

“There are uses of this that concern us, and the way this is being shared concerns us,” Schroeder said. “There is a high risk.”