Mark Zuckerberg has announced major changes to Meta’s content moderation policies and practices on Facebook and Instagram, citing a desire to embrace free speech and avoid censorship. Zuckerberg’s changes begin with scrapping Facebook’s third-party “fact-check” system, which is notorious for its leftist bias.
Meta, the parent company of Facebook, Instagram, and Threads, is undergoing a major overhaul of its content moderation practices. CEO Mark Zuckerberg announced on Tuesday that the company will end its fact-checking program, plagued by severe leftist bias, and replace it with a community-driven system similar to X’s Community Notes. The changes come as a response to what Zuckerberg perceives as a “cultural tipping point” towards prioritizing speech, influenced by the recent elections.
In his video, Zuckerberg states: “We’re going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms. More specifically, here’s what we’re going to do. First, we’re going to get rid of fact checkers and replace them with community notes similar to X, starting in the U.S.”
In a press release, Joel Kaplan, Meta’s Chief Global Affairs Officer, stated, “Meta’s platforms are built to be places where people can express themselves freely. That can be messy. On platforms where billions of people can have a voice, all the good, bad and ugly is on display. But that’s free expression.”
Kaplan highlighted Zuckerberg’s 2019 speech at Georgetown University, where he argued that free expression has been the driving force behind progress in American society and around the world. “Some people believe giving more people a voice is driving division rather than bringing us together. More people across the spectrum believe that achieving the political outcomes they think matter is more important than every person having a voice. I think that’s dangerous,” Zuckerberg said at the time.
Kaplan acknowledged that Meta had developed complex systems to manage content in recent years, partly in response to societal and political pressure. However, he admitted that this approach had gone too far. “As well-intentioned as many of these efforts have been, they have expanded over time to the point where we are making too many mistakes, frustrating our users and too often getting in the way of the free expression we set out to enable. Too much harmless content gets censored, too many people find themselves wrongly locked up in ‘Facebook jail,’ and we are often too slow to respond when they do,” he said.
Under the new approach, Meta will eliminate some content policies around hot-button issues such as immigration and gender, and re-focus its automated moderation systems on “high severity violations.” The company will rely more on users to report other violations. Additionally, Facebook’s trust and safety and content moderation team will be moving from California to Texas.
Kaplan explained the decision to end the third-party fact-checking program and move to a Community Notes model, similar to the one used by X. “We’ve seen this approach work on X – where they empower their community to decide when posts are potentially misleading and need more context, and people across a diverse range of perspectives decide what sort of context is helpful for other users to see. We think this could be a better way of achieving our original intention of providing people with information about what they’re seeing – and one that’s less prone to bias,” he said.
Meta plans to phase in Community Notes in the US first over the next couple of months and will continue to improve it over the course of the year. The company will also be expanding its transparency reporting to share numbers on its mistakes on a regular basis.
Kaplan also addressed the issue of over-enforcement of policies, stating, “We want to undo the mission creep that has made our rules too restrictive and too prone to over-enforcement. We’re getting rid of a number of restrictions on topics like immigration, gender identity and gender that are the subject of frequent political discourse and debate. It’s not right that things can be said on TV or the floor of Congress, but not on our platforms.”
Meta will also be adopting a more personalized approach to political content, allowing people who want to see more of it in their feeds to do so. “We’re going to start treating civic content from people and Pages you follow on Facebook more like any other content in your feed, and we will start ranking and showing you that content based on explicit signals (for example, liking a piece of content) and implicit signals (like viewing posts) that help us predict what’s meaningful to people,” Kaplan explained.
Conservatives will be tasked with monitoring Meta’s rollout of a community notes system closely. When Twitter initially rolled out its system, the users selected to share their input were overwhelmingly leftist, resulting in community notes that reflected ultra-woke political ideology. The program was vastly expanded after Elon Musk bought the platform, introducing the X Community Notes we know today.
X Community Notes is also vulnerable to mass flagging. Before the 2024 election, it was revealed that the Harris campaign attempted to manipulate the program.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.