I want to thank Erin, Gabriel, and all of the class representatives who supported this case against Facebook, along with Steven Williams, the Joseph Saveri Law Firm, Burns Charest LLP (co-counsel), and Most & Associates, representing the plaintiffs, for their support. Due to the global impact of our work, former content moderators Chris Gray in Ireland and Daniel Montague in South Africa are currently working to hold Facebook, a local Bay Area and multinational corporation, accountable in courts of law for unsafe working conditions and human rights violations. Unfortunately, we have yet to see content moderators step forward from the Philippines.
Content moderation highlights complex global problems, from terrorism, insurgencies, and child endangerment to misinformation, online scams, self-inflicted premature death, and the decline of mental health in younger populations, while moderators work with diverse stakeholders, including governments, civil society groups, and individual users.
Until computer vision, artificial intelligence, and machine learning technologies are able to classify, identify, and remove content deemed harmful by Facebook's public policies, there will be humans in the loop training Facebook's algorithms at the long-term cost of their overall health and well-being. We greatly appreciate the recognition of our work in Scola v. Facebook, as such recognition continues to assist those still seeking justice.
Thank you.
Location: San Francisco, CA
Impact Fund Class Action Hall of Fame Induction Ceremony acceptance speech by Gabriel, Erin, and Selena.
Three Brave Content Moderators Take On Facebook On Behalf Of Thousands In Historic Class Action. More coming soon.
Berkeley, 02.24.23 – Seventeen heroes were today inducted into the Impact Fund Class Action Hall of Fame. The Hall of Fame recognizes named plaintiffs whose commitment and determination have led to significant advances in economic, environmental, racial, and social justice.
Impact Fund Nomination Information
Selena Scola, Gabriel Ramos, and Erin Elder, in the case Scola versus Facebook, challenged the lack of protection offered to content moderators, who are responsible for viewing and removing offensive and disturbing content from Facebook. If you see a beheading, it affects you. Content moderators were traumatized, suffering from PTSD. A groundbreaking settlement was reached that resulted in substantial workplace improvements and paved the way for similar suits against YouTube and TikTok. Here with us today are Selena, Gabriel, and Erin.
Hi, good afternoon. This is such an honor, and I feel very humbled to have even been nominated for this. It was such a surprise when I signed on to this case. I just wanted better work conditions for the friends I worked with, and this job of content moderation was this new thing that no one really knew about, and we were just viewing some of the most awful things on the internet. So I'm just really grateful that I got to walk this path with both Gabriel and Selena, and I'd like to thank the Joseph Saveri law team for their skillful guidance, because I didn't know what I was doing. So, thank you.
Thank you very much for this honor. Like Erin, I came out of nowhere, and I'm happy to be here. It's been a long journey to get to this point and to accountability for Facebook, and I prepared a few words. A content moderator is first and foremost a person. Facebook's lack of accountability made it imperative for us to speak up, to demand changes that took into account the humanity of the individuals risking so much to benefit the social media platform. I'd really like to thank Selena for starting this, Erin for supporting it, and all of us together. Thank you so much.
I want to thank Erin and Gabriel; without them on the case, I wouldn't have had any support, and it would just have been me alone saying that this job did harm to everyone who works it. I'd also like to thank Steven Williams, the Saveri Law Firm, Burns Charest LLP as co-counsel, and Most and Associates, representing the plaintiffs, for their support. Without their trust in me, and their belief that what I was saying was true, Facebook would not have been held accountable. This case has had global impact. Former content moderator Chris Gray in Ireland has launched a case, which is still in the courts. Daniel Montag in South Africa is currently working to hold Facebook accountable for human rights violations in South Africa. Facebook is a local Bay Area company and also a multinational corporation that needs to be held accountable globally for their crimes and for the position that they have put us in. Let's see. Unfortunately, we are still unable to reach the content moderators in the Philippines; they suffer the worst. Content moderation highlights complex global problems, from terrorism to insurgency, to child endangerment, misinformation, online scams... premature death, and the erosion of the mental health of our youth, and... sorry... until computer vision, artificial intelligence, and machine learning technologies are able to classify, identify, and remove content from the web, humans are still going to be in the loop doing these jobs and still being damaged. The long-term cost is to your nervous system and your mental health, and it actually changes your DNA. I greatly appreciate the recognition of the Impact Fund. This is the first time I've spoken publicly, or to the press, or to anyone about this outside of my friends or my legal team.
So thank you, and this award will continue to help the people who have not spoken up yet. Thank you.
Let's hear it for our class of 2023!
Impact Fund Executive Director Jocelyn Larkin said: “At the heart of every civil rights class action are everyday, ordinary people who put their lives and livelihoods on hold to champion the interests of those who have been discriminated against, denied their rights, and made to feel second-class. Today, we’re grateful to recognize these extraordinary individuals for their bravery and endurance in the face of overwhelming odds.”
Selena Scola, Gabriel Ramos, and Erin Elder were the named plaintiffs in the case Scola v. Facebook, Inc. Selena, Gabriel, and Erin represented a class of over 14,000 content moderators alleging they were denied protection against severe psychological and other injuries resulting from viewing objectionable postings while working on behalf of Facebook through third-party agencies. Moderators are our front-line soldiers in a modern war against online depravity that we all have a stake in winning, and these three individuals fought to protect the rights of thousands by advocating for a remedy to the psychological trauma resulting from constant and unrelenting exposure to toxic postings. The Class sought damages and workplace improvements, including mental health screening, treatment, and compensation, and a requirement that Facebook improve working conditions to live up to its own safety standards. In May 2020, the Class reached a ground-breaking settlement with Facebook for $52 million and workplace improvements, which received final Court approval in July 2021. This novel case and settlement have provided essential relief to the Class and paved the way for similar suits against YouTube, Inc. and TikTok, Inc.
Class Action Hall of Fame, Class Actions
Steven N. Williams, Joseph Saveri Law Firm
Social media platforms have completely transformed our lives. We rely on them for information, advice, and opinions, and to connect with family, friends, and colleagues in many positive ways. What we do not see are postings of humanity’s worst impulses: murders, sexual crimes, bestiality, child abuse, and other vile behavior. For this, we can thank “content moderators”: a profession that did not exist a short time ago, and one we now rely on to navigate safely, for ourselves and our children.
To maintain sanitized platforms, maximize profits, and cultivate their public images, social media companies rely on content moderators to review objectionable postings and remove any that violate their terms of use. These moderators are our “front line soldiers” in a modern war against online depravity that we all have a stake in winning. They have one of the most difficult jobs in the world. Constant and unrelenting exposure to toxic postings causes many to develop and suffer from psychological trauma and/or PTSD. Tech companies have implemented counseling, training, and safety standards to protect them, but those same companies have ignored these standards, instead requiring moderators to work under conditions known to cause and intensify psychological trauma. Many moderators also work for third-party agencies across several states and are prohibited by non-disclosure agreements from talking publicly about their work concerns. They frequently receive low wages under short-term contracts and minimal health benefits. When I first learned of this nightmare world, I was equally shocked and angered. I vowed I would do everything I could to help them.
Erin, Selena, and Gabriel, 2023 Hall of Fame Inductees
In 2018, my firm and co-counsel filed Scola v. Facebook in San Mateo County Superior Court. Our plaintiff class of content moderators working on behalf of Facebook through third-party agencies alleged they were denied protection against severe psychological and other injuries resulting from viewing objectionable postings. The lawsuit sought mental health screening, treatment, and compensation, and a requirement that Facebook improve working conditions to live up to its own safety standards.
When we initiated the case against Facebook, we were a lone voice in the wilderness raising this type of legal claim. It was a complicated, “first of its kind” lawsuit, so there was no easy path to follow from previous litigation. To avoid having our afflicted clients’ claims shunted into the workers’ compensation system, we instead aimed to achieve a medical monitoring solution. This remedy is available under California law, but successful cases are as rare as Bigfoot sightings.
Surprisingly, Facebook CEO Mark Zuckerberg had acknowledged the content moderation problem in a Facebook post. By doing so, he put himself in the middle of the primary issue of the case. I was taken aback by the tone coming from the lawyers on the other side. They took an aggressive approach when we sought access to information and to Zuckerberg and COO Sheryl Sandberg. They insisted we had no legal claim. They were in a hard spot because of Zuckerberg’s post; he did not want to testify under oath.
As the case progressed, we filed motions for summary judgment and for judgment on the pleadings. Facebook asked for settlement talks. A JAMS neutral mediated a deal that took a year to hammer out. While confronting numerous obstacles, I was always comforted by my firm’s support and my unwavering recognition that my legal struggles were minuscule compared to the harm my clients were suffering every day, and which unquestionably had to be relieved.
Settlement Reached
In 2020, our class reached a preliminary settlement with Facebook. In 2021, the Court granted final approval of a $52 million settlement and workplace improvements for over 14,000 class members who work for Facebook vendors in California, Arizona, Texas, and Florida. The workplace improvements apply to any U.S.-based content moderation operations for Facebook. It was exhilarating to achieve this outcome and to share the welcome news with the class, especially since success was never guaranteed but was so desperately needed.
Gabriel, Erin, and Selena receive their certificates of induction into the Hall of Fame
Scola v. Facebook, Inc.’s Legacy
This case’s settlement and resulting media attention have opened the door to similar litigation. In 2020, my firm brought a proposed class action against YouTube, Inc., alleging it failed to protect a former content moderator and her co-workers from mental harm caused by reviewing disturbing footage. In 2022, the Court granted preliminary approval to an approximately $4.3 million settlement and about $3.9 million in injunctive relief. In 2022, my firm filed a similar suit against TikTok, which is ongoing.
The role and plight of content moderators are endemic across the social media landscape. All of us need to monitor and support them. The fact that moderators are treated as disposable scares me, as it should everyone. It has been my honor and privilege to represent them at every step in this case. There is no limit to what you can do if you are prepared to lose sometimes; sometimes, in the face of formidable odds, you want and need to make a statement about something that is just plain wrong. This case’s successful resolution has improved our clients’ lives. That is the ultimate gift, and indeed the ultimate impact, that any lawyer can receive: one that I have received and continue to aspire to, and which I would encourage everyone in the legal profession to strive for.
Selena Scola, Gabriel Ramos, and Erin Elder were inducted into the Impact Fund Class Action Hall of Fame on February 24th, 2023 in recognition of their courage, sacrifice, commitment, and determination that led to a significant advance in social justice.
Gabriel, Selena, and Erin (on the left) together with the Class of ’23
Tagged: Facebook, Scola v. Facebook, Selena Scola, Gabriel Ramos, Erin Elder, Content Moderation, Mental Health, Social Media, Social Justice, Steven N. Williams, Joseph Saveri Law Firm, Mark Zuckerberg, YouTube, TikTok