As tech employees become increasingly aware of the potential harms of products they are hired to work on, some are taking it upon themselves to lead the change and improve their company’s ethical blueprint.
Leafing through the pages of the Ethical Explorer Pack, a new ethics guide for tech workers, you come across a slightly unusual map. The areas it identifies aren’t regions or towns but “risk zones” labelled addiction, disinformation, algorithmic bias and exclusion, to name a few. They prompt the reader to ask: what might be the harms or unintended consequences of the tech products I’m working on? And what is my company doing to make sure we don’t stray?
Questions such as these are becoming increasingly common among techies. Over the past few years, a series of ethical dilemmas and epiphanies have shaken the industry. There have been resounding cris de coeur—from former Googler Tristan Harris, for example, who sounded the alarm over the way technology can manipulate our minds—and highly publicised petitions and protests inside big tech companies, with hundreds of employees contesting issues such as the use of AI for military purposes, facial recognition services for law enforcement and even their own working conditions.
Emanuel Moss conducts ethnographic research on issues of ethics and accountability at Data & Society, a nonprofit research institute focused on the social and cultural issues arising from data-centric technological development. “There is a pattern of growing employee interest, even employee activism in the tech industry, around the ethical stakes of products,” he said.
Recently, an explosive memo written by a former Facebook data scientist detailed how the social network knew leaders around the world were abusing its platform to manipulate elections, yet failed to act. “I know that I have blood on my hands by now,” wrote Sophie Zhang in the 6,600-word memo.
Dissent inside tech companies has revealed concerns regarding ethics while also pointing to a shift in how ethical implications are being addressed and tackled. “The individualisation [of ethical responsibility] is certainly happening in the entire industry, rather than locating it at the level where accountability for a company lies, which is the level of C-suite [executive managers],” said Moss.
In fact, nearly two-thirds (63%) of tech workers would like more opportunity to think about the impact of their products, according to a recent survey. And more than three-quarters (78%) said they would like practical resources to help them do so.
The US social change venture Omidyar Network is aware of how much the landscape has changed. It recently published a downloadable guide, the Ethical Explorer Pack, a practical resource to “help create space for nuanced conversation around hard topics like exclusion and bias”, said Sarah Drinkwater, director on the responsible technology team. It is a follow-up to Omidyar’s guide for CEOs released two years earlier.
Drinkwater said the new version, designed as a deck of cards, was conceived specifically for “people who work at companies too small to have any kind of in-house team focused on responsibility”, whether they are designers, engineers, founders or their collaborators.
“A huge portion of the downloads have come from this intended audience but, also, thousands from big tech companies, academia and design. In such a year of multiple crises and realisations, we’re excited to see how it’s resonated,” she said.
Alicia Roberts works for a start-up that sells AI personalisation to help businesses boost sales. “There’s always that question: is it a good thing that the tech we sell to online retailers will keep pushing the products a potential client previously viewed on a website to the forefront of their attention?” she said. “In theory, those are the things that you were very interested in, but maybe this algorithm works profoundly against the consumer. It sends subliminal messages and pushes you towards buying certain products, and might even make you addicted to them in the long run.”
Roberts said that ethical discussions weren’t really happening in her workplace; most of her colleagues were too focused on their day-to-day tasks to see the bigger picture. She added that raising ethical concerns inside the company remained “complicated” as it was “very hard to defend as an employee depending on the company’s interests”.
Many tech workers feel too intimidated to discuss ethical concerns at work, but their market power might give them leverage. According to a survey based on 2019 data from 2,300 tech workers, tech salaries averaged $147,000 (£114,000) in the US and £67,000 in the UK.
“Elite tech workers are often highly paid, gain seniority quickly and can sell their skills to any company,” said Moss. “Having economic privilege to speak out without fear of being fired or blacklisted from the tech world is hugely important and drives a lot of employee activism.”
Tools such as ethics guides can be useful, but they rely on an open working environment and solid working relationships that allow ethical concerns to be discussed with both colleagues and leadership. In the Netflix documentary The Social Dilemma, former Instagram employee Bailey Richardson said that one of the biggest problems in tech was a “real failure of leadership around having open conversations about not just what went well, but what isn’t perfect”.
Anne Williams remembers her nine months at Facebook, a company known for its weekly open Q&A sessions with its founder, as a closed environment. “The working environment was very much that of self-censorship,” she said. “Everybody was constantly monitoring one another because of the rating system and the strong feedback culture. I didn’t feel at all like it was possible to ask questions and talk openly about ethics.”
The company’s performance review system—which includes a company-wide limit on the number of employees who can receive one of the seven grades—has created a culture where critical feedback is discouraged, according to several former Facebook employees who decided to speak out about the issue in 2019. This “stack ranking” system is not unique to Facebook—until 2013, it was also used by Microsoft.
Williams’s job was to support small companies through their digital transition and help them advertise on Facebook; a big part of her role was organising and leading workshops. However, she became disillusioned. “The main KPI [Key Performance Indicator] of my work wasn’t whether the companies actually learnt something—it was for people to say that what Facebook was doing was good,” she said.
Much of what drives work at the employee level in the tech industry is objectives and key results, but engaging with ethical dimensions is still largely missing from individual performance metrics. Some tech workers are educating themselves about ethical issues in their own time, but this work goes uncompensated. “In my opinion, this should be rewarded, or at least figure in the incentives that already exist,” said Moss. He believes an ethics-based metric could be a useful lever for disseminating best practices across the entire organisation.
Part of what makes ethical questions difficult to follow through on is that they often concern improving quality beyond what is legally required. That means investing more resources into making a product more “good”, which can be a hard sell for tech companies constrained by their business model, economic incentives and shareholder pressure.
For the past few years, Nicolas Esneault, a front-end web developer at the French ride-sharing start-up BlaBlaCar, has been pushing for several product fixes that would make their platform accessible to blind people and those unable to use a computer mouse. “Some colleagues are interested in [thinking about ethical issues such as accessibility], others less so. As a developer, I don’t have the legitimacy to tell everybody: listen, now we should do it like this,” he said.
Esneault quickly realised that the first step towards change was educating his colleagues through workshops and tutorials. He believes that when an employee is aware of a product fix that could make tech more responsible, they should speak up. “It’s up to the employee to share what could be improved, to convince others and bring the right arguments,” he said.
Esneault has managed to win support from his managers and colleagues—the team is currently completing a detailed quality analysis of gaps in the service’s accessibility.
As media outcry over the lack of ethics in tech companies intensifies, some Silicon Valley companies, such as Salesforce, have started hiring staff for a new kind of role—that of “ethics owner”, although the exact job title varies—to deal with the unprecedented ethical questions raised by tech. Their task is to disseminate and level up ethical skills and practices across the entire organisation, including among rank-and-file workers.
But even as workers become increasingly involved in ethics, there are cases such as Google’s Project Maven or Dragonfly, where employees unknowingly worked on applications that conflicted with their moral frameworks.
“If you’re working on a computer vision program, you might not necessarily know that it is being built into a military application. Societally, we can’t ask people like that—people working at the level of code—to be responsible for the sales deal that is happening in the C-suite,” said Moss.
While new ethical guidelines might empower employees, they can only prove effective if they are actually followed and not just blindly ticked off the list. They also need to be part of a broader, external structure of responsibility and accountability in the tech industry. “Tech companies themselves cannot be their own police persons. There needs to be some sort of external accountability mechanism for true accountability to happen around ethics,” said Moss.
Almost half of those in tech (45%) believe their sector is not regulated enough, according to a recent in-depth survey of tech workers in the UK. The report also found that tech workers would welcome government regulation to ensure that the consequences of technology for people and society are taken into account.
“Given how high the stakes are, it should be an all-hands-on-deck response,” said Moss. “Everybody should be interested and concerned about [ethical] questions, and should be trying to answer them as best as they can.”
*Names have been changed.