Meta is ending its fact-checking program in favor of a ‘community notes’ system similar to X

Meta CEO Mark Zuckerberg announced a series of major changes to the company's moderation policies and practices, saying that the election felt like a "cultural tipping point."

Meta CEO Mark Zuckerberg announced a series of major changes to the company's moderation policies and practices Tuesday, citing a shifting political and social landscape and a desire to embrace free speech.

Zuckerberg said that Meta will end its fact-checking program with trusted partners and replace it with a community-driven system similar to X’s Community Notes.

The company is also making changes to its content moderation policies around political topics and undoing changes that reduced the amount of political content in user feeds, Zuckerberg said.

The changes will affect Facebook and Instagram, two of the largest social media platforms in the world, each boasting billions of users, as well as Threads.

"We're going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms," Zuckerberg said in a video. "More specifically, here's what we're going to do. First, we're going to get rid of fact checkers and replace them with community notes similar to X, starting in the U.S."

Zuckerberg pointed to the election as a major influence on the company's decision and criticized "governments and legacy media" for allegedly pushing "to censor more and more."

"The recent elections also feel like a cultural tipping point towards, once again, prioritizing speech," he said.

He also said the systems the company had created to moderate its platforms were making too many mistakes, adding that the company would continue to aggressively moderate content related to drugs, terrorism and child exploitation.

"We built a lot of complex systems to moderate content, but the problem with complex systems is they make mistakes," Zuckerberg said. "Even if they accidentally censor just 1% of posts, that's millions of people, and we've reached a point where it's just too many mistakes and too much censorship."

Beyond the end of the fact-checking program, Zuckerberg said the company will eliminate some content policies around hot-button issues including immigration and gender, refocus its automated moderation systems on what he called "high severity violations," and rely on users to report other violations.

Facebook will also be moving its trust and safety and content moderation team from California to Texas.

"We're also going to tune our content filters to require much higher confidence before taking down content," he said. "The reality is that this is a trade off. It means we're going to catch less bad stuff, but we'll also reduce the number of innocent people's posts and accounts that we accidentally take down."

The changes come as Meta and social media companies broadly have in recent years reversed course on content moderation, due in part to the politicization of moderation decisions and programs. Republicans have long criticized Meta’s fact-checking system, and fact-checking in general, as unfair and favoring Democrats — a disputed claim.

X’s Community Notes system, which CEO Elon Musk has used to replace the company's previous efforts around misinformation, has been celebrated by conservatives, and it has allowed for a mixture of fact-checking, trolling and other community-driven behavior. 

Zuckerberg's announcement comes as CEOs and business leaders across sectors are currying favor with the incoming administration of President-elect Donald Trump. Meta, along with other tech companies, donated $1 million to Trump's inaugural fund, and ahead of the election, Zuckerberg praised Trump in an interview with Bloomberg Television without offering an outright endorsement. Ahead of Trump's inauguration, Meta has reportedly appointed Republican Joel Kaplan to lead its policy team, and on Monday, the company announced that UFC's Dana White, a longtime supporter of Trump, would join its board.

Kaplan appeared on Fox News on Tuesday morning as part of the rollout of the company's announcement. No other Meta executive appeared on any news channel.

Meta’s initial fact-checking system, which was launched on Facebook in 2016, worked by running information on its platforms through third-party fact-checkers certified by the International Fact-Checking Network and the European Fact-Checking Standards Network. The program included more than 90 organizations that would fact-check posts in more than 60 languages. In the United States, participants have included groups such as PolitiFact and Factcheck.org.

In a news release, Meta wrote that it was able to identify posts that might be promoting misinformation based on how people were responding to certain pieces of content and how fast posts would spread. Independent fact-checkers would also work to identify posts with possible misinformation on their own. Posts flagged as possibly containing misinformation would then be shown lower in feeds while they awaited review.

The independent fact-checkers would then work to verify the accuracy of the content that had been flagged and give it a “content rating,” labeling content as “False,” “Altered,” “Partly False,” “Missing Context,” “Satire” or “True” and adding notices to the posts. 

Those fact-checking measures applied to any posts on Facebook, and they expanded to include Instagram in 2019 and Threads last year. Fact-checkers were able to review content including “ads, articles, photos, videos, Reels, audio and text-only posts.”

Under the system, Meta noted, fact-checkers did not have the ability to remove content; content would be removed only if it violated the company’s community standards, a determination made by Meta itself.

In addition to the changes in content moderation, Zuckerberg also said Tuesday that the company will be changing its powerful recommendation system that decides what to show users. Meta has, for years, limited political content, citing user complaints and following discussions about how social media can affect users' beliefs.

"We're bringing back civic content," he said. "For a while, the community asked to see less politics because it was making people stressed, so we stopped recommending these posts. But it feels like we're in a new era now, and we're starting to get feedback that people want to see this content again. So we're going to start phasing this back into Facebook, Instagram and Threads, while working to keep the communities friendly and positive."

Lastly, Zuckerberg said Meta would work with the incoming Trump administration to promote free speech around the world, though he did not detail any steps toward that goal.

"We're going to work with President Trump to push back on governments around the world that are going after American companies and pushing to censor more," he said, adding that various countries have cracked down on certain speech online.

"The only way that we can push back on this global trend is with the support of the U.S. government, and that's why it's been so difficult over the past four years, when even the U.S. government has pushed for censorship," Zuckerberg added. "By going after us and other American companies, it has emboldened other governments to go even further."

In the last several years, the government's interactions with social media companies have come under intense scrutiny from Republican politicians who have made claims of censorship. After the 2016 election, Meta and other social media companies ramped up moderation efforts and routinely met with representatives of the FBI and other governmental organizations to prevent foreign interference. As Republicans began to inquire about government interactions with social media platforms, those meetings became the subject of congressional inquiries led by conservative leaders such as Rep. Jim Jordan, R-Ohio.

In 2022, Zuckerberg defended the company's interactions with agencies such as the FBI, calling it a "legitimate institution" in an interview with podcaster Joe Rogan.

Scrutiny was also heaped on communications between the White House and social media companies, most notably around posts related to the Covid pandemic and vaccines. In August, Zuckerberg said that the Biden White House pressured Meta to take action against some Covid-related posts and that it was wrong to do so.

In June, the Supreme Court rejected arguments that the government had unlawfully coerced social media companies to take down content.

Still, the issue has remained politically potent, amplified by Musk's takeover of X (then Twitter) and his moves to drastically change the platform's moderation practices. Zuckerberg has previously praised Musk's handling of X, despite also at times engaging in a feud that has included discussions of a possible cage fight.

This story first appeared on NBCNews.com.

Copyright NBC News