Cori Crider began her career as a lawyer in the aftermath of the 9/11 attacks. “…[my] life path at that time was set by a sense that we had gone down the wrong track…that we were engaging in the collective punishment of the world’s 1.6 billion Muslims,” she said. After defending Guantanamo detainees for 10 years, Crider co-founded Foxglove, a London-based not-for-profit, in 2019. The group fights for a world where tech is fair for all, challenging Big Tech and governments over their use of technology and abuse of power.
We spoke to her about her professional life, setting up her own organisation, challenging Big Tech and why we need to talk about content moderators.
For the past year, Crider, 38, has been focusing on the poor working conditions of content moderators—the workers who make sure that violent imagery stays away from social media platforms. But this wasn’t always the case.
Before Foxglove, you spent ten years helping victims of extreme human rights abuse. When did you first start to focus on human rights as a lawyer?
I never had a job at a law firm. My first job, in 2006, after law school was representing Guantanamo detainees. I knew that that was what I wanted to do: work for a not-for-profit, find the detainees, represent the detainees. There was some guy in London who had started a not-for-profit [Reprieve], and I just phoned him up and said: “I hear you’re representing the detainees. If I can get a fellowship, can I come and do it?” And that was it!
Growing up, did you know from an early age that you wanted to become a lawyer?
I don’t know if I knew… I always had a deep and instinctive dislike of things that seemed unjust and unfair. I also came from an environment—not in my household but in my wider community, I guess you would call it Trumpville now—in which I saw a reluctance to ask questions about the structures of power in our society, and a suspicion of anybody who asks questions about authority. I was different, I was lucky, because I was raised by two scientists and was always taught to ask questions.
What was a foundational moment for you as a lawyer? What convinced you to go down the path of human rights?
Most of my political and working life was basically formed in university by what I saw as the overbearing, racist, and disastrous response to the September 11 attacks. The foundational moment for me as a lawyer came during uni, when I first saw a detainee in an orange suit on a gurney after September 11. I thought, how is this possible compared with what we say our values are as Americans?
When did you decide to end your work with detainees and switch to tech?
Leaving is about your personal responsibility and the people who you represent. By 2016 I knew that I wanted to move on [from Reprieve], but I also had one last case, like a person in a bad detective movie. This had to do with a Libyan family who had been kidnapped and sent to be tortured. We won that case in May 2018, and [then prime minister] Theresa May apologised for the UK’s role in kidnapping the family. Then I finally cut the cord and left.
But why tech?
I started to feel that there was something else going on when I was doing work on drone attacks in 2014. I was interviewing people in Yemen [which wasn’t a war zone at the time] who had lost loved ones in drone attacks, and who then turned out to be completely innocent. Their lives were completely upended.
We came to know from leaked documents that in the vast majority of drone attacks outside declared war zones, the Obama administration didn’t actually know the identities of the people it had killed. The vast majority of cases were like this: here is a SIM card, and the data that we have about the way the phone is behaving suggests to us that the owner of this phone is in some kind of a militant group. […] You strip that out of its context and use this mass data to make the most consequential decision that there is to make: do they live or do they die? I just felt like, Jesus, this is how power is now actually exercised.
You launched Foxglove in 2019. Can you talk about the experience of starting your own not-for-profit?
I would never have started it without Martha [Dark], my co-founder. [Crider met Dark at Reprieve, where she was the head of operations. They began discussing Foxglove in January 2019 and five months later they set it up.] All I ever do is see problems [she laughs]. She is a kind of machine of energy and efficiency. It’s all very well to say that there is a problem, but actually we can damn well do something about it, and here are the following 20 things that we need to do to make that happen… This isn’t me, that’s Martha.
You have been focusing on empowering social media content moderators. Why did you decide to work with this particular group of tech workers?
There had been a number of “shop-floor exposés” in the media [regarding content moderators’ working conditions], but we didn’t have a sense that any of those pieces of investigative journalism had ever resulted in a sustained effort by either civil society—the human rights movement—or labour unions. We thought: there is a gap here and we are a new organisation.
There’s a ton of debate about the front end of social media, what content goes up and what content goes down, but we felt there was comparatively very little discussion of, or attention to, the conditions of work that make that system possible. That seemed to feed a myth—and it’s not just Facebook, but it’s certainly one that Facebook’s leadership perpetuates—of this magical technical system.
What do you mean exactly by that ‘myth of a magical technical system’?
If you look at the system of work, the truth is that without these 35,000 workers, the platform doesn’t exist. It’s not built simply on the backs of programmers, but on the backs of mountains of human labour. To get ordinary people to see that these folks [tech giants] exercise unbelievable amounts of power, I think you have to make them see the reality of the human systems that we are all participating in.
What have you learned about the working conditions of social media content moderators around the world—and was it what you had expected?
I had noticed the headline-grabbing stuff—like, oh my god, can you imagine what it would be like to sit in front of your computer for eight hours a day and just watch beheading after beheading, and just be absolutely soaked in the worst of what humans do?
But that is part of a wider, rotten system that we want to address. One of the problems is precarity—outsourcing. Just as Uber has disrupted its way out of workers’ rights by creating the legal fiction that an Uber driver is an independent contractor rather than an employee, Facebook maintains a similar fiction that content moderators are not core to Facebook’s business, and that it’s therefore appropriate to hold them at arm’s length and to contract out through third-party corporations.
Apart from precarity and low pay, what are some of the other issues?
Secrecy is a huge problem. Facebook content moderators are not allowed to tell their family members that they’re working in content moderation, and they’re not really supposed to talk about it at all publicly.
What passes for mental healthcare is a bit of coaching and deep breathing and yoga. This is not psychiatric or psychological support. Companies are constantly trying to violate standard confidentiality relationships. We have spoken with a couple of wellness coaches and the corporate bosses told them, “We want to know who’s coming in to see you, what they’re talking about.”
And the final one is this idea of turning a worker into a robot, before the robot takes their job. Content moderators are algorithmically micro-managed within an inch of their very lives. They have to process hundreds of pieces of content every day, no matter how toxic or how difficult it is to tackle, to a fanciful accuracy score or quality score of 98%.
What about the working conditions of content moderators during Covid?
The thing that’s coming in really forcefully from the workforce is: how can you be asking us to go in with Covid cases? The current rule is that you have to go to work even if you live with someone who is at risk. […] The question around Covid, and the way they’ve handled it, is a kind of lens for what’s so crazy about the whole work in general. Because if they are so essential that they’ve got to run the risk to do this work, then why aren’t they employees?
Can you give us a sense of what it feels like to listen to the content moderators share their experiences?
Sometimes you’re encountering people who have repeated flashbacks in an absolutely classic PTSD-way. It is hard to sit in the presence of somebody who has, for the sake of all of us just having social media, genuinely cut scars into their psyche. I was surprised, I suppose, to see a depth of pain and a depth of suffering, because of the kind of psychotoxic nature of some of this content, that is real and is serious. So yeah, it’s hard sometimes.
With Foxglove, you’re holding power to account and tackling issues people are afraid to talk about. Are you at all concerned about your safety?
No way, absolutely not! We’re so much more privileged than the people that we represent. The least that we can do, and that we owe to them, is to say something, right? The people to worry about are the Facebook content moderators who risk their jobs and potentially getting deported to give me information because they think it will help other workers.
Throughout your whole career, you’ve been fighting for fairness and standing up for those with little to no rights. How do you explain where this inner fuel and determination come from?
I guess all we have is our time, isn’t it? I’ve never been able to do anything that I don’t care about, or find boring. It’s got to seem to me to matter, and to be a worthwhile way of spending your one life.
Photos by Betty Laura Zapata for Welcome to the Jungle