How neurotech could boost your performance at work – or endanger your human rights

Jan 17, 2024

17 min read

Author
Rozena Crossman

Journalist and translator based in Paris, France.

From sweat sensors to mind-reading, neurotech is already being used to monitor workers’ concentration levels and assess their emotional skills. Here’s how it could transform your workplace and what you need to do about it.

Pilots, truck drivers, and construction workers are using a transformative form of technology that could have an Orwellian impact on society if left unchecked. Neurotechnology (neurotech) – which monitors, manipulates, and mimics our nervous system – is growing rapidly. By tracking the heart rate, sweat rate, and cerebral activity of those operating heavy machinery, these devices can sound the alarm when fatigue is high and concentration is low, helping to prevent plane crashes and save pedestrians’ lives. Yet some experts warn that its proliferation could lead to widespread transgressions of human rights in the not-so-distant future.

The market for neurotech devices is estimated to hit a weighty $24.2 billion by 2027 as these tools – initially confined to medical applications such as pacemakers, hearing implants, or deep brain stimulation for Parkinson’s disease – spread to industries like marketing, video gaming and recruitment, largely thanks to AI. While non-invasive wearables such as heart rate, perspiration, and electroencephalogram (EEG) sensors have been around for some time, AI has drop-kicked neurotech into the realm of sci-fi, with companies like Neuralink and Precision Neuroscience carrying out human trials of brain implants that would allow paralyzed patients to control keyboards through mere thoughts.

As yet, the corporate world isn’t planting chips in employees’ brains, but companies are already tracking attention spans, stress levels and social skills. A technology that could mainstream workplace brain scanning would bring enormous change to the labor market. The question is whether that change would increase workplace safety and power groundbreaking solutions to burnout, or land us in an ethical quagmire of employee surveillance, privacy violations and job inequality.

Jared Genser, founder of the human rights law firm Perseus Strategies and co-founder of The Neurorights Foundation, says that if we want to avoid a dystopian scenario, “this really has to engage employees from the very, very beginning.” So here’s a guide to understanding neurotech in the workplace: the pros, the cons, and the steps that employees, companies and governments should take today to avoid an ethical catastrophe.

The wonderful ways neurotech helps workers

1 Tracking fatigue

Tiredness is estimated to cost employers in America $136 billion a year in health-related lost productivity. In high-stakes professions, exhaustion can lead to incorrect medical decisions, derailed trains and crashed planes. “Most people don’t have a good and accurate understanding of their own fatigue levels,” explains Nita Farahany, a tech ethicist, law professor at Duke University and author of The Battle For Your Brain: Defending The Right To Think Freely in the Age of Neurotechnology. This is why innovations like SmartCap’s Lifeband, an EEG-laden headband that detects microsleeps, were heartily welcomed by transportation and manufacturing companies.

2 Monitoring concentration

Today’s deluge of digital content – a hallmark of the attention economy – has diminished our ability to concentrate and increased the demand for mindfulness, a medically backed technique that trains the brain to bring its attention back to the task at hand. Now, products like Emotiv’s MN8 earbuds can monitor brainwaves and alert workers when they start to lose focus. The earbuds notify users – and their managers – of dips in concentration via an accompanying workplace wellness app, which then suggests ways to re-center, such as meditating or going for a walk.

Similarly, MIT researchers created AttentivU, a two-part wearable system consisting of glasses that track brain and eye activity and a scarf that vibrates to let users know when their engagement drops. When testing AttentivU’s performance, the researchers found it helped participants score better on comprehension tests. Both Emotiv and MIT researchers believe their products can help surmount daily obstacles to our concentration, from learning to driving to time management.
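For readers curious about the mechanics, here is a minimal sketch of how such a focus-alert loop could work, assuming a device SDK exposes a normalized attention score. The `read_attention_score` callable, the 0-to-1 scale and the thresholds below are illustrative assumptions, not Emotiv’s or MIT’s actual interfaces.

```python
import time

ATTENTION_THRESHOLD = 0.4  # hypothetical 0-1 scale; real devices define their own metrics
WINDOW_SECONDS = 60        # average over a minute so brief dips don't trigger alerts


def monitor_focus(read_attention_score, notify) -> None:
    """Poll a focus score once per second; nudge the wearer on sustained loss.

    read_attention_score stands in for whatever call a vendor SDK provides to
    turn raw EEG into a 0-1 focus score; notify delivers the suggestion.
    """
    samples = []
    while True:
        samples.append(read_attention_score())
        samples = samples[-WINDOW_SECONDS:]  # keep a rolling one-minute window
        if len(samples) == WINDOW_SECONDS:
            if sum(samples) / len(samples) < ATTENTION_THRESHOLD:
                notify("Focus is dipping - try a short walk or a breathing exercise.")
                samples.clear()  # back off until a fresh window accumulates
        time.sleep(1)
```

The rolling average is the key design choice: alerting on a sustained dip rather than on every momentary fluctuation is what separates a helpful nudge from the constant monitoring that, as discussed later in this piece, can itself become a source of stress.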

3 Harmonizing humans with machines

It’s hard to find a field that doesn’t automate tasks, with tools ranging from ChatGPT to collaborative robots, or “cobots.” While these inventions certainly speed up production, they also contribute to burnout. For example, the increased speed of online communication is creating exhaustion and digital debt in office employees, while assembly workers are struggling to keep up with AI-powered manufacturing processes.

Enter cognitive ergonomics, a field that uses neurotech to align automated work environments with human well-being. Take the system developed by Penn State’s College of Engineering, which enables robots on construction sites to monitor the cognitive load of employees. When their human colleague shows signs of overwork, these robots slow down to a more manageable pace. “To a robot, a human operator is an unfailing partner,” states the college’s website. “To a human, the robot’s level of awareness, intelligence and motorized precision can substantially deviate from reality.” In other words, Penn State’s project translates the needs of the human nervous system into signals a robot can act on, facilitating teamwork between the Luke Skywalkers and R2-D2s of the construction world.
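As a rough illustration of that feedback loop – not Penn State’s actual implementation – the mapping from a worker’s cognitive load to a robot’s pace can be as simple as a clamped linear function. The names and the speed scale here are assumptions for the sketch.

```python
MAX_SPEED = 1.0  # robot's nominal pace (fraction of full speed)
MIN_SPEED = 0.3  # ease off under overload rather than stopping outright


def pace_for_load(cognitive_load: float) -> float:
    """Map an operator's cognitive load (0 = relaxed, 1 = overloaded) to robot speed."""
    load = min(max(cognitive_load, 0.0), 1.0)  # clamp noisy sensor readings
    return MAX_SPEED - load * (MAX_SPEED - MIN_SPEED)


# A relaxed operator gets full pace; an overloaded one gets the floor:
# pace_for_load(0.0) -> 1.0, pace_for_load(0.5) -> 0.65, pace_for_load(1.0) -> 0.3
```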

White-collar jobs are seeing similar advances in cognitive ergonomics. Microsoft used EEG sensors to test whether collaboration between employees is more mentally taxing when done remotely – and concluded that it is: minimizing background distractions in videos and making it easier to pick up on non-verbal cues reduced workers’ mental fatigue. Microsoft then added a Teams option called “Together mode,” which displays video call participants as if they were sharing a virtual space, a design that reduces the strain on the human brain and eye.

4 Improving emotional, cognitive and social skills

That coworker who turns nasty under stress, or takes up too much space in meetings – how can you make them aware of their behavior? “A lot of people don’t think they need to work on their soft skills,” says Clarisse Pamies, chief executive of Omind Neurotechnologies. “I was the chief digital officer of a big medical corporation [Johnson & Johnson] so I did a lot of reskilling programs, notably on agility,” she says. “It was funny because people never talked about themselves as not being agile – it was always people on other teams.”

Omind aims to help employees improve their learning, focus, stress management and interpersonal skills through a coaching program enhanced by gaming. While some of the games take place online, others involve a virtual reality headset paired with heart rate and sweat sensors. They strategically assess your cognitive, emotional and social abilities by noting your response time to challenges like gunning down alien invaders, tracking physiological reactions as the levels become harder, or evaluating how you navigate mock negotiation scenarios.
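To make the measurement idea concrete, here is a purely illustrative scoring sketch – an assumption for clarity, not Omind’s actual model – combining reaction times with heart-rate elevation into a single stress-management score.

```python
def stress_management_score(
    reaction_times_ms: list[float],  # response times to in-game challenges
    heart_rates_bpm: list[float],    # readings taken as difficulty ramps up
    resting_hr_bpm: float,           # the player's baseline heart rate
) -> float:
    """Toy 0-100 score: fast, physiologically calm responses score higher."""
    avg_reaction = sum(reaction_times_ms) / len(reaction_times_ms)
    avg_hr = sum(heart_rates_bpm) / len(heart_rates_bpm)
    reaction_penalty = min(avg_reaction / 1000, 1.0)  # slower than 1 s maxes out
    arousal_penalty = min(max(avg_hr - resting_hr_bpm, 0) / 60, 1.0)  # +60 bpm maxes out
    return round(100 * (1 - 0.5 * reaction_penalty - 0.5 * arousal_penalty), 1)


# Example: stress_management_score([450, 520, 610], [82, 88, 95], 65) -> 54.2
```

The caveat, as Pamies stresses later in this piece, is that any such score measures learned potential: it is a baseline for coaching, not a fixed ranking of people.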

Once employees get their test results – along with an explanation of how their skills are measured – they receive personalized, human guidance from coaches accredited by the International Coaching Federation and trained to work with Omind’s products. “Our coaches say it accelerates the process by saving three or four coaching sessions,” says Pamies. She argues that the scientific, hands-on nature of these programs brings a deeper self-awareness to employees who may not have realized the need to improve their empathy or coping mechanisms.

Just as you can learn to master software development or a new language, Omind believes you can train to improve your analytical thinking or self-awareness. And the goal isn’t just to be a better employee. “We can also identify people in highly irritating situations that are in a pre-burnout phase,” explains Nicholas Bassan, Omind’s head of science and innovation. “So it’s a way of saying to them, ‘You can’t really mobilize your energy. [Your nervous system] is stimulated by everything.’” He argues this biological information can help people recognize when they need to rest and recuperate before their minds and bodies crash.

5 Mitigating bias

Some scientists and ethicists believe that using tech to evaluate soft skills could reduce discrimination in recruitment. “Even if I’m hiring someone that doesn’t have an engineering degree but has the experience of being through a boot camp . . . and can show the requisite soft skill profile, that can be a fantastic way to really include people in the workforce that historically may not have been considered,” says Frida Polli, founder of the gamified recruitment software Pymetrics, in an interview with the CFA Institute. Polli believes soft skills are “more equitably distributed in the population than hard skills” and that “focusing on just an engineering degree might introduce a candidate pool that’s more male and Caucasian.”

This automated evaluation of soft skills could also help neurodivergent candidates, relieving the stress associated with in-person interviews or showcasing the cognitive abilities of candidates whose CVs reflect a history of struggling to land a job. What’s more, there’s considerable controversy over the efficacy of unconscious bias training for human recruiters, who may inadvertently assess candidates based on their clothes, body language or speech patterns. Recruiters may increasingly rely on transferable qualities like soft skills anyway, as many hard skills – like SEO writing or knowledge of certain programming languages – are considered “perishable,” becoming obsolete as technology evolves.

Pymetrics relies on behavioral data rather than neurotech to measure soft skills, but heart rate monitors and sweat sensors like those used by Omind could be incorporated into these kinds of recruiting platforms. Soon, they may even be reading your brain.


The pitfalls of giving neurotech to employers

1 Scary surveillance

Companies don’t need access to your nervous system to act like Big Brother: six out of 10 companies monitor their remote workers, and 88% have fired employees after using monitoring software to check productivity habits, according to a survey by the website review platform Digital.com. As the boom in telework stoked managers’ fears of employee slacking, the response was ramped-up surveillance. “This is where the neuroscience comes in,” a consultant for the global real estate firm JLL told IEEE Spectrum, a magazine published by the Institute of Electrical and Electronics Engineers. The chief executives he works with – worried about both the productivity and the wellbeing of their workers – are testing Emotiv’s MN8 earbuds to see how the brains they employ handle different work situations.

Farahany, however, cautions that “if [employers] are not willing to work in the best interests of their employees, then [neurotech] could be used just as easily to penalize employees for their fatigue levels, rather than to address the working conditions leading to them being overworked or overtired or in an unsafe environment.” This applies to in-person workplaces as well, as the Brookings Institution uncovered instances of employers using webcams to measure employees’ attention by eye and body movements, with reprimands for wandering focus. “The dystopian potential of this technology is not lost on us,” Emotiv’s CEO admitted to IEEE Spectrum.

Depriving workers of mental downtime lessens their ability to concentrate, increases stress and makes them more likely to break company rules. If neurotech becomes a tool for micromanaging employees’ brains, it could bring about a double disaster of exacerbated mass burnout and unprecedented privacy problems, as these gadgets may soon be capable of…

2 Mind-reading

On May 1, 2023, researchers from The University of Texas at Austin published findings on a new technique that decodes brain activity into words – offering a glimpse of a future where thoughts may be deciphered without any invasive brain implant procedures.

These strides propel us towards a future where the intricacies of the mind become as readable as an open book, but the tech is still rudimentary: for now, we can decode only small parts of the brain. Neuroscientists like Rafael Yuste, however, envision a single mathematical formula capable of unlocking the secrets of the cerebral cortex, the brain’s largest component, believed to hold the key to our thoughts. “If we can figure out how a little piece of the cortex works, we could figure out how the entire cortex works,” says Yuste, who is director of Columbia University’s NeuroTechnology Center and chairs the Neurorights Foundation. He believes scientists may be closer to finding this puzzle piece than we realize. “I think we’re seeing the light at the end of the tunnel precisely because of all this neurotechnology that is being developed right now,” he says.

This development holds particular promise for the severely paralyzed, schizophrenic and non-verbal – the demographic that initially inspired these neurotech innovations. Although UT Austin’s new mind-reading technique uses fMRI and isn’t poised for workplace implementation, neurotech’s snowballing breakthroughs and swift commercialization mean that day is looming. Already, employee brain data is being collected via a different type of scan, the EEG – not capable of translating thoughts, but proficient at discerning moods and emotions. This means employers may have…

3 Non-consensual access to very private data

“Neurotechnology can collect information that people are not aware of,” warns the UK’s Information Commissioner’s Office (ICO). “Neurodata can include estimations of emotional states, workplace or educational effectiveness and engagement, and medical information about mental health, among many other types of data.”

To understand how this works, consider a worker with ADHD, a neurodevelopmental condition primarily diagnosed by levels of attentiveness and impulsivity – attributes frequently gauged by neurotech devices in workplace settings. While neuroscientists aren’t yet sure whether EEGs can be used to identify ADHD subtypes, they’re working on it. So it’s conceivable that an employer could accidentally discover a worker is neurodivergent while monitoring concentration – even if the employee did not know, or did not consent to sharing, this information. In fact, as a byproduct of workplace neurotech, your employer may actually find out you have dementia or Parkinson’s before you do.

Of course, not all employers know how to interpret data gleaned from EEG scans, and they may not even have access to such private information in the first place, particularly if robust data protection policies are in place. But the problem is…

4 Someone else owns your neurodata

If an employer checks your heart rate or brainwaves, who owns that data? Does it belong to you, your employer, or the company behind the neurotech device?

In August 2023, Emotiv – the neurotech company working with JLL to monitor workers – was penalized by the Chilean Supreme Court for refusing to allow customers to import or export any of their brain data from Emotiv’s cloud without a paid account. The court also found Emotiv guilty of using customers’ brain data for research purposes without consent, which violates Chilean data protection laws. This seems to be common practice for most commercial neurotech companies: Genser and Yuste are working on a report which has found that 18 of the world’s biggest neurotech firms own all the rights to their customers’ neurodata. Seventeen of these companies have the right to share this neurodata with third parties who can use it as they please.

Even if neurotech companies gave their customers full ownership of the data collected on their devices, some lawyers worry those rights might end up with the corporate clients, rather than the workers the gadgets are used to monitor. It’s a confusing data privacy situation that’s not particularly well-regulated anywhere, including in the European Union, the global leader in data protection laws.

To make matters worse, whoever is in charge of the neurodata is also responsible for protecting it properly. And right now…

5 Your neurodata might not be safe

Neurotech isn’t the only field that’s been boosted by artificial intelligence: generative AI has also ramped up cybercrime. Between March and May 2023, 11.5 cyberattacks were deployed every minute – a 13% increase on the same period the previous year. A report by Mastercard found a 600% increase in cyberattacks since the pandemic began, and many companies – big and small – aren’t up to date with the latest data protection procedures.

While it’s spooky enough that cybercriminals are upping their tricks to steal credit card information, a world where our neurodata is stored in hackable corporate clouds is downright terrifying. Even well-meaning companies that follow the EU’s General Data Protection Regulation (GDPR), the most stringent data protection rules around, may struggle to shield neurodata from cybercriminals, as GDPR-approved techniques for safeguarding sensitive data aren’t foolproof.

Worse, in the event brain scans make a commercial debut, hackers could leverage them to identify their victims. “About half of the brain activity is specific to you and half is shared with other people,” explains Yuste. “So that half of brain activity that’s specific to you enables [someone] to identify [you] – it’s like your fingerprint.” Certain anonymized MRI scans, which scan the whole head in order to see the brain, have been matched back to patients using facial recognition software.

Our list of neurotech worries should also include state-level abuse, according to Genser. In the US, the justice system has already drawn on data from implanted devices to test defendants’ accounts. In 2017, data from a man’s pacemaker was used against him in court: he claimed he had fled his burning house, but the pacemaker’s record of his medical condition showed this was impossible, and he was charged with arson. MIT Technology Review recounts how one neurotech company was contacted by law enforcement officials seeking data from the brain implant of an epilepsy patient accused of assaulting a police officer; the police wanted to know whether the attack had been caused by a seizure. These cases aren’t glaringly unethical, but Genser, who negotiates on behalf of political prisoners, is particularly concerned about how rogue governments could access and weaponize their citizens’ neurodata.

6 Increased bias

While companies like Pymetrics advocate for soft skill assessments in recruitment as a way to bypass prejudice and unfair advantages, the practice may have the opposite effect. Although Pymetrics is endorsed by the World Economic Forum, some research indicates these types of cognitive ability tests can create racial disparities three to five times greater than other assessment processes. “Many stakeholders and scholars cite significant concerns about the ability of algorithms to accurately detect emotional cues,” states the ICO, “in particular, for ethnic minorities, those from non-European cultures or neuro-divergent individuals.” Bias problems in tech-based recruitment programs are well documented and have long concerned the US Equal Employment Opportunity Commission. The accuracy of these psychometric tests – and their ability to predict how a candidate may fit into a company – is also hotly contested.

For example, if the datasets of such recruitment software are based on neurotypical people, “somebody who is outside of those norms is going to score poorly or differently,” Farahany points out. “And the people that they’re selecting to build the model use case of the ideal employee are generally already a biased set. So if you say, ‘Well, I don’t know exactly what I’m measuring, but I know these 10 employees are successful,’ and the hiring process that went into hiring those 10 people already has a number of biases built into that process, then you’re just replicating those same cognitive and personality types within the workplace, which isn’t necessarily the best cognitive and personality type that’s fit for the job that you’re trying to hire for.”

There’s an additional problem when neurotech is used in these psychometric tests: if the device isn’t properly placed, it could have trouble reading the person’s nervous system, thereby increasing problems of bias or engendering decisions based on faulty data.

There’s a reason Omind Neurotechnologies refuses to use its products for recruitment purposes. “Recruitment is a selection,” says Pamies. “We’ve always viewed this technology as a way to support humans, not to screen them or bring them down.”

Omind ensures the neurodata collected on its devices is shared with the employee and their coach only, never with the employer. “We’re trying to map potential, but it’s learned potential… after seven hours of training, we can increase an empathy score by 20 points,” explains Pamies. “So to say someone with a score of 70 is better than someone with a score of 50 doesn’t mean anything because if I train the person with 50 points, they’ll be at 70.”

7 Exacerbated inequality

Economists and researchers have long linked unequal access to tech with rising inequality, arguing that workers and companies with the latest tools have an unfair advantage. Just take the millions of Americans without broadband access or smartphones, a gap that severely limits their banking, education and job opportunities. Introducing expensive neurotech devices that boost alertness, concentration and productivity could seriously skew an already imbalanced job market.

“We’re talking about the possibility of creating a hybrid human, which could divide humanity into two species: augmented and non-augmented,” Yuste told Columbia Magazine. It may sound like an extreme case for the far future, but Israeli company InnerEye already hooks TSA agents up to EEG sensors to speed up their ability to identify prohibited items in X-rays.


What can we do now?

Despite its reputation as a vector for doom, neurotech is seen as a desirable upgrade by many workers. Even back in 2017, when job-related neurotech was more scarce, willing employees at one Wisconsin company implanted microchips into their hands that enabled contactless payments and access to office buildings. With all of the above considerations to weigh in our present day, how can we maximize the pros and minimize the cons?

First and foremost, laws protecting sensitive data need to be tightened. “Given the lack of societal norms and laws regarding tracking brain activity in general, for now companies are simply creating their own rules about fatigue monitoring,” Farahany wrote in Harvard Business Review. Genser gives the appalling example of the Chinese schoolchildren who, in 2019, were made to wear a neurotech headband developed by the US company BrainCo and a local Chinese partner. Some students had their concentration levels displayed in front of the whole class; others were reprimanded for their lack of attention. “Now, I don’t see these things being remotely possible in most contexts in Western democracies, because I think that even where there are no unions, people would have serious objections,” says Genser. Yet he feels this anecdote is a clear, real-life illustration of why regulating the use of neurotech devices and their data is an urgent matter.

Through the Neurorights Foundation, Genser and Yuste compiled five “neurorights” they’d like to integrate into international law:

  • The right to mental privacy would ensure privacy laws have special protections not just for neurodata but also for “moral data,” as Genser calls it. “There are some things that you don’t express through your actions or through your choices that you only keep inside, that [people] may have access to from brain data,” explains Farahany. “And that’s even more frightening than all of the rest of it, right? That’s ultimately the most sensitive information that we have to safeguard.”
  • The right to personal identity, or a person’s ability to control their own personality and sense of self. The most obvious case where this rule would apply is the personality changes that sometimes occur after deep brain stimulation for Parkinson’s, which have upended some patients’ lives.
  • The right to free will advocates for a user’s right to control their own decision-making without the manipulation of neurotech devices.
  • Fair access to mental augmentation, “which is probably the right that’s furthest off in the future,” says Genser. “But once these technologies start to become more advanced and are not just decoding the human brain, but also able to improve memory or address brain diseases and so forth, there needs to be equal access to these kinds of technologies based on need.”
  • Protection from bias would regulate neurotech algorithms and datasets to mitigate any built-in discrimination.

It should be noted that while they are referred to as “rights,” these legal recommendations would likely be updates to existing human rights already enshrined in international law. Farahany says that our current right to privacy could be modified to “explicitly include mental privacy,” and laws around freedom of thought should “cover contexts that go beyond religion and belief, which is how it’s traditionally been interpreted.”

Farahany, who focuses on updating our legal system to address ethical problems posed by several new technologies, argues that neurotech is just one part of the problem. Even behavioral data, which isn’t collected from your nervous system, can reveal a creepy amount of personal information, like the TikTok users claiming the app’s algorithms knew they were bisexual or had ADHD before they did.

Protecting yourself today

If you’re using neurotech at work, check if your company is:

  • Staying on top of new regulations and data security measures. From evolving laws on biometric data to the latest practices in cybersecurity, an employer who values your privacy should be keeping a close eye on data management developments.

  • Respecting employees’ cognitive liberty. “Not having every second of their day focused on paying attention, but having periods of mind wandering,” says Farahany. “Mind wandering actually benefits [employees] and benefits the company as well, because that’s where real insights are. That’s how you decrease stress levels. That’s how you enable an employee to actually flourish in the workplace rather than treating them like automatons who have to stay on task at all times.”

  • Making sure everyone knows how the neurotech device works, and how its data will be used. This doesn’t mean the company and its employees need to cite the exact algorithms used in the device, but, as the ICO points out, employers can perform due diligence on their end and provide a clear statement about how they intend to protect and use the data the device collects. Companies should “empower employees with the technology very clearly in employee handbooks,” says Farahany. “Explain exactly what brain data is being collected, how it’s being analyzed, for what purpose and what use.”

  • Choosing optional consent over forced surveillance. “This should be a self-help device,” says Genser, who believes employees should be able to decide whether or not they’d like to use a neurotech tool. He also suggests setting up devices so that data isn’t recorded or uploaded to an employer – for example, a fatigue monitor that simply beeps when the user is getting tired, without collecting any data (see the sketch after this list). Farahany advocates for employees’ right to obtain copies of their neural data, as well as any conclusions drawn from it. She also argues that neurodata should be collected only while the employee is working.
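Here is a minimal sketch of the “self-help device” pattern Genser describes, assuming a hypothetical `read_fatigue_level` sensor call and a local `beep` alert: the reading is processed on the device and immediately discarded, with no logging, storage or upload.

```python
FATIGUE_THRESHOLD = 0.7  # illustrative 0-1 scale


def check_and_alert(read_fatigue_level, beep) -> None:
    """Alert the wearer directly; nothing is recorded or sent to an employer."""
    level = read_fatigue_level()  # the raw reading exists only in this local variable
    if level > FATIGUE_THRESHOLD:
        beep()  # a local buzz that only the wearer perceives
    # Deliberately no logging, persistence or network calls:
    # the reading is discarded as soon as this function returns.
```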

Wearing a neurotech device that “projects real-time data to a central employer sounds very, very invasive and intrusive, and would make me uncomfortable,” says Genser. “I don’t think it would make me a better truck driver. It’s going to make me resent the company that I work for.”


Follow Welcome to the Jungle on Facebook, LinkedIn, and Instagram, and subscribe to our newsletter to get our latest articles every week!
