I want to thank Erin, Gabriel, and all of the class representatives who supported this case against Facebook, along with Steven Williams, the Joseph Saveri Law Firm, Burns Charest LLP (co-counsel), and Most & Associates, for representing the plaintiffs and for their support. Because of the global impact of our work, former content moderators Chris Gray in Ireland and Daniel Motaung in South Africa are currently working to hold Facebook, a Bay Area and multinational corporation, accountable in courts of law for unsafe working conditions and human rights violations. Unfortunately, we have yet to see content moderators step forward from the Philippines.
Content moderation highlights complex global problems, from terrorism, insurgency, and child endangerment to misinformation, online scams, self-inflicted premature death, and declining mental health among younger populations. Moderators work with diverse stakeholders, including governments, civil society groups, and individual users.
Until computer vision, artificial intelligence, and machine learning technologies are able to classify, identify, and remove content deemed harmful by Facebook's public policies, there will be humans in the loop training Facebook's algorithms at the long-term cost of their overall health and well-being. We greatly appreciate the recognition of our work in Scola v. Facebook, as such recognition continues to assist those still seeking justice.
Thank you.
Location: San Francisco, CA
Impact Fund Class Action Hall of Fame Induction Ceremony acceptance speech by Gabriel, Erin, and Selena.
Three Brave Content Moderators Take On Facebook On Behalf Of Thousands In Historic Class Action.
Berkeley, 02.24.23 – Seventeen heroes were today inducted into the Impact Fund Class Action Hall of Fame. The Hall of Fame recognizes named plaintiffs whose commitment and determination have led to significant advances in economic, environmental, racial, and social justice.
Impact Fund Nomination Information
Selena Scola:
I want to thank Erin and Gabriel. Without them on the case, I wouldn’t have had any support—and it would’ve just been me, alone, saying this job causes harm to everyone who does it.
I’d also like to thank Steven N. Williams, the Joseph Saveri Law Firm, Burns Charest LLP as co-counsel, and Most & Associates for representing the plaintiffs.
Without their trust in me and their belief that what I was saying was true, Facebook would not have been held accountable.
This case has had global impact. Former content moderator Chris Gray in Ireland has launched a case that’s still in the courts. Daniel Motaung in South Africa is currently working to hold Facebook accountable for human rights violations there.
Facebook is a Bay Area company—and a multinational corporation that must be held accountable globally for the position it’s put us in.
Unfortunately, we’re still unable to reach content moderators in the Philippines. They suffer the worst.
Content moderation highlights complex global problems: terrorism, insurgency, child endangerment, misinformation, online scams, premature death, and the erosion of youth mental health.
Until computer vision, artificial intelligence, and machine learning technologies are capable of classifying, identifying, and removing harmful content from the web, humans will still be doing these jobs—and still be harmed by them.
The long-term cost is to your nervous system, your mental health, and it can actually change your DNA.
I greatly appreciate the recognition from the Impact Fund. This is the first time I’ve spoken publicly, or to the press—or anyone outside of my friends or legal team—about this.
So thank you. This award will continue to help those who haven’t spoken up yet.
Thank you.
Let’s hear it for our Class of 2023!
Impact Fund Executive Director Jocelyn Larkin said: “At the heart of every civil rights class action are everyday, ordinary people who put their lives and livelihoods on hold to champion the interests of those who have been discriminated against, denied their rights, and made to feel second-class. Today, we’re grateful to recognize these extraordinary individuals for their bravery and endurance in the face of overwhelming odds.”
Selena Scola, Gabriel Ramos, and Erin Elder were the named plaintiffs in the case Scola v. Facebook, Inc. Selena, Gabriel, and Erin represented a class of over 14,000 content moderators alleging they were denied protection against severe psychological and other injuries resulting from viewing objectionable postings while working on behalf of Facebook through third-party agencies. Moderators are our front-line soldiers in a modern war against online depravity that we all have a stake in winning, and these three individuals fought to protect the rights of thousands by advocating for a remedy to the psychological trauma resulting from constant and unrelenting exposure to screening toxic postings. The Class sought damages and workplace improvements, including mental health screening, treatment, and compensation, and a requirement that Facebook improve working conditions to live up to its own safety standards. In May 2020, the Class reached a ground-breaking settlement with Facebook for $52 million and workplace improvements, which received final Court approval in July 2021. This novel case and settlement have provided essential relief to the Class and paved the way for similar suits against YouTube, Inc. and TikTok, Inc.
Class Action Hall of Fame, Class Actions
Steven N. Williams, Joseph Saveri Law Firm
Social media platforms have completely transformed our lives. We rely on them for information, advice, and opinions, and to connect with family, friends, and colleagues in many positive ways. What we do not see are postings of humanity’s worst impulses: murders, sexual crimes, bestiality, child abuse, and other vile behavior. For this, we can thank “content moderators”: a profession that did not exist a short time ago, and one we now rely on for safe navigation, for ourselves and our children.
To maintain sanitized platforms, maximize profits, and cultivate their public images, social media companies rely on content moderators to review objectionable postings and remove any that violate their terms of use. These moderators are our “front line soldiers” in a modern war against online depravity that we all have a stake in winning. They have one of the most difficult jobs in the world. Constant and unrelenting exposure to screening toxic postings causes many to develop and suffer from psychological trauma and/or PTSD. Tech companies have adopted counseling, training, and safety standards meant to protect moderators, but the companies have ignored those very standards, instead requiring moderators to work under conditions known to cause and intensify psychological trauma. In addition, many moderators work for third-party agencies across several states and are prohibited by non-disclosure agreements from talking publicly about their work concerns. They frequently receive low wages under short-term contracts and minimal health benefits.

When I first learned of this nightmare world, I was equally shocked and angered. I vowed I would do everything I could to help them.
Erin, Selena, and Gabriel, 2023 Hall of Fame Inductees
In 2018, my firm and co-counsel filed Scola v. Facebook in San Mateo County Superior Court. Our plaintiff class of content moderators working on behalf of Facebook through third-party agencies alleged they were denied protection against severe psychological and other injuries resulting from viewing objectionable postings. The lawsuit sought mental health screening, treatment, and compensation, and a requirement that Facebook improve working conditions to live up to its own safety standards.
When we initiated the case against Facebook, we were a lone voice in the wilderness raising this type of legal claim. It was a complicated, “first of its kind” lawsuit, so there was no easy path to follow from previous litigation. To avoid having our afflicted clients’ claims shunted into the workers’ compensation system, we instead aimed to achieve a medical monitoring remedy. This remedy is available under California law, but successful cases are as rare as Bigfoot sightings.
Surprisingly, Facebook CEO Mark Zuckerberg had acknowledged the content moderation problem in a Facebook post. By doing so, he put himself in the middle of the primary issue of the case. I was taken aback by the tone coming from the lawyers on the other side. They took an aggressive approach when we sought access to information and to Zuckerberg and COO Sheryl Sandberg. They insisted we had no legal claim. They were in a hard spot because of Zuckerberg’s post; he did not want to testify under oath.
As the case progressed, we filed motions for summary judgment and for judgment on the pleadings. Facebook asked for settlement talks. A JAMS neutral mediated a deal that took a year to hammer out. While confronting numerous obstacles, I was always comforted by my firm’s support and my unwavering recognition that my legal struggles were minuscule compared to the harm my clients were suffering every day, harm which unquestionably had to be relieved.
Settlement Reached
In 2020, our class reached a preliminary settlement with Facebook. In 2021, the Court granted final approval of a $52 million settlement and workplace improvements for over 14,000 class members who work for Facebook vendors in California, Arizona, Texas, and Florida. The workplace improvements apply to any U.S.-based content moderation operations for Facebook. It was exhilarating to achieve this outcome and to share the welcome news with the class, especially since success was never guaranteed but was so desperately needed.
Gabriel, Erin, and Selena receive their certificates of induction into the Hall of Fame
Scola v. Facebook, Inc.’s Legacy
This case’s settlement and resulting media attention have opened the door to similar litigation. In 2020, my firm brought a proposed class action against YouTube, Inc., alleging it failed to protect a former content moderator and her co-workers from mental harm caused by reviewing disturbing footage. In 2022, the Court granted preliminary approval to an approximately $4.3 million settlement and about $3.9 million in injunctive relief. In 2022, my firm filed a similar suit against TikTok, which is ongoing.
The plight of content moderators is endemic across the social media landscape. All of us need to monitor and support them. The fact that moderators are treated as disposable scares me, as it should everyone. It has been my honor and privilege to represent them at every step in this case. There is no limit to what you can do if you are prepared to lose sometimes; sometimes, in the face of formidable odds, you want and need to make a statement about something that is just plain wrong. This case’s successful resolution has improved our clients’ lives. That is the ultimate gift, indeed the ultimate impact, that any lawyer can receive: one I have aspired to and continue to aspire to, and one I would encourage everyone in the legal profession to strive for.
Selena Scola, Gabriel Ramos, and Erin Elder were inducted into the Impact Fund Class Action Hall of Fame on February 24th, 2023 in recognition of their courage, sacrifice, commitment, and determination that led to a significant advance in social justice.
Gabriel, Selena, and Erin (on the left) together with the Class of 23
Tagged: Facebook, Scola v. Facebook, Selena Scola, Gabriel Ramos, Erin Elder, Content Moderation, Mental Health, Social Media, Social Justice, Steven N. Williams, Joseph Saveri Law Firm, Mark Zuckerberg, YouTube, TikTok
Copyright © 2004-2025 Selena Scola All Rights Reserved