Mark Zuckerberg’s Meta released its integrity reports on Thursday for the first quarter of 2025, claiming it reduced enforcement mistakes by 50 percent since the beginning of the Trump presidency. Enforcement mistakes include removing content from Facebook and Instagram that doesn’t actually violate platform rules.

In January, Meta announced policy changes intended to reduce the number of enforcement mistakes, as part of Zuckerberg’s commitment to enable free speech on Meta’s platforms.

The company’s newly released Q1 report shows Meta taking steps to put its weight behind Zuckerberg’s words. The report notes that in December 2024 the company removed millions of pieces of content every day, estimating that one to two out of every ten posts removed may have been a mistake.

The social media giant said it has achieved this reduction in enforcement errors through:

  • Auditing automated systems to see where the platform may be making too many mistakes on its apps, such as Facebook and Instagram
  • Requiring more confidence that content violates policies before removing posts, and raising the threshold to sometimes require multiple layers of review
  • Focusing on enforcement of illegal and high-severity violations

Meta continued:

During this same time period and even with these changes, prevalence overall has remained largely consistent across most violation types, with two exceptions. There was a small increase in the prevalence of bullying and harassment content from 0.06-0.07% to 0.07-0.08% on Facebook due to a spike in sharing of violating content in March. There was also a small increase in the prevalence of violent and graphic content on Facebook from 0.06%-0.07% to about 0.09% due to an increase in sharing of violating content as well as a result of ongoing work to reduce enforcement mistakes.

Meta plans to expand its Community Standards Enforcement Report to include metrics on its mistakes so that the public can track the company’s alleged progress on decreasing enforcement errors.

The tech platform said the first quarter of 2025 is when it began to see changes resulting from its more open content moderation policies.

This report comes after Meta ended its fact-checking program in April in favor of Community Notes on Facebook, Instagram, and Threads.

Breitbart News wrote at the time:

On Monday, Meta will deactivate fact-checkers’ ability to rate new content, meaning that no new fact-checks will appear on content on Meta platforms in America.

Older fact-checks will no longer be matched with new content in the United States either. Anyone that has received a fact-check since January will not receive a penalty or demotion.

By Monday, no users should have strikes on their accounts or demotions related to fact-checking.

Notes that have reached consensus will gradually appear on Meta platforms.

“By Monday afternoon, our fact-checking program in the US will be officially over. That means no new fact checks and no fact checkers. We announced in January we’d be winding down the program & removing penalties. In place of fact checks, the first Community Notes will start appearing gradually across Facebook, Threads & Instagram, with no penalties attached,” Joel Kaplan, the chief global officer for Meta, announced on X at the time.


