Meta CEO Mark Zuckerberg announced major changes to the company's content moderation policies, blaming its fact-checking partners and drawing swift rebuttals from those same organizations.
In a video announcement, Zuckerberg claimed "fact-checkers have been too politically biased" and "destroyed more trust than they created." He revealed plans to end Meta's fact-checking program with trusted partners in favor of a community-driven system similar to X's Community Notes.
However, fact-checking organizations that worked with Meta strongly disputed Zuckerberg's characterization. Neil Brown, president of the Poynter Institute which runs PolitiFact, defended their work: "I don't believe we were doing anything, in any form, with bias. There's a mountain of what could be checked, and we were grabbing what we could."
The fact-checkers emphasized that they never had authority over content removal; that power remained solely with Meta. According to Meta's own policies, fact-checkers could only review and rate content accuracy, applying labels like "False," "Altered," or "Missing Context."
The announcement comes as Meta shifts its approach under the incoming Trump administration. The company plans to reduce content filtering, bring back more political content in feeds, and relocate its trust and safety team from California to Texas.
Meta's fact-checking program, launched in 2016, involved more than 90 organizations reviewing content in over 60 languages. While the U.S. program is ending, similar arrangements with fact-checkers in 119 other countries remain unchanged for now.
The move represents a major reversal in Meta's content moderation strategy, which had expanded in recent years to combat misinformation across Facebook, Instagram, and Threads. Zuckerberg framed the change as getting "back to our roots" with a focus on "reducing mistakes" and "restoring free expression."