Two significant jury verdicts against major social media companies have opened the floodgates for thousands of pending lawsuits alleging that popular platforms endanger the mental health of children.

Fortune reports that juries in two landmark cases have ruled against the social media Masters of the Universe, marking a potential turning point in how courts view Silicon Valley companies’ responsibilities toward child safety. The verdicts represent early victories in a wave of litigation that could reshape the technology industry’s approach to protecting young users.

In New Mexico, a jury on Tuesday imposed $375 million in civil penalties on Meta after finding the company harmed children’s mental health and concealed information about child sexual exploitation on its social media platforms. The jury approved the maximum penalty of $5,000 per violation of state consumer protection law, multiplied across thousands of social media accounts belonging to children under 18.

In a statement to Breitbart News about the New Mexico verdict, a Meta spokesperson wrote: “We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.”

Meanwhile, in California, a jury ruled that Meta and Google’s YouTube must pay at least $3 million in damages to a 20-year-old woman who claimed she became addicted to social media as a child, worsening her mental health struggles. California jurors recommended an additional $3 million in punitive damages pending a judge’s final review. TikTok and Snap settled with the plaintiff before the trial began.

A Meta spokesperson told Breitbart News, “We respectfully disagree with the verdict and will appeal. Teen mental health is profoundly complex and cannot be linked to a single app. We will continue to defend ourselves vigorously as every case is different, and we remain confident in our record of protecting teens online.”

The verdicts do not currently mandate specific changes to social media platform designs or the algorithms that deliver content to billions of users worldwide. However, a second phase of the New Mexico trial scheduled for May could result in court-ordered changes to Meta’s platforms for local users. A state district court judge will determine whether Meta created a public nuisance and could impose restrictions or order the company to fund programs addressing potential harms to children.

New Mexico Attorney General Raúl Torrez (D), who filed the lawsuit against Meta in 2023, said his office seeks improvements to Meta’s enforcement of minimum age limits and removal of sexual predators. The attorney general’s proposals include lifting encryption on communications in cases where it can interfere with police investigations.

Meta maintains it continuously works to improve safety and has already implemented changes including phasing out encryption on Instagram, limiting teenagers’ access to explicit content, blocking unsolicited messages to children from adults, and helping young users manage time spent on platforms and avoid sleep disruptions. Both the California and New Mexico trials highlighted the addictive properties of platform algorithms and their negative impacts on child mental health.

Google defended YouTube as a responsibly built streaming platform, distinguishing it from social media sites in its response to the California verdict.

The California case carries particularly broad legal and financial implications as it was designated a bellwether test that might guide the resolution of other lawsuits. Thousands of such lawsuits remain pending, including hundreds in California alone. The New Mexico verdict may serve as an early indicator for lawsuits brought by other publicly elected prosecutors.

Attorneys general in more than 40 states have filed suit against Meta, claiming the company contributes to a mental health crisis among young people. Most of these cases are being pursued in U.S. federal court.

Tech companies continue to benefit from legal protections under Section 230 of the 1996 Communications Decency Act, which shields them from legal responsibility for content posted by users. This protection has historically made it difficult to hold platforms accountable for harm allegedly caused by their services.

Read more at Fortune here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.
