Meta announced sweeping changes to its content moderation approach across Facebook, Instagram, and other platforms, marking a significant shift in how the tech giant handles online speech and fact-checking.
The company will discontinue its third-party fact-checking program in the United States, replacing it with a Community Notes system similar to the one used by X (formerly Twitter). Under the new model, context and additional information on potentially misleading posts will come from users themselves rather than from external fact-checkers.
The policy changes reflect Meta's renewed focus on free expression, with the company acknowledging that its previous moderation systems may have become too restrictive. Meta plans to lift restrictions on topics such as immigration and gender identity that are the subject of frequent mainstream debate.
In a notable operational change, Meta will relocate its trust and safety teams from California to Texas and other U.S. locations. The company is also overhauling its enforcement systems, focusing automated detection on illegal content and serious violations while relying on user reports to flag less severe infractions.
The announcement includes changes to how political content appears in users' feeds. Rather than broadly reducing civic content visibility, Meta will adopt a more personalized approach, allowing users greater control over the amount of political content they see across Facebook, Instagram, and Threads.
These changes come during a period of leadership transition at Meta, with Joel Kaplan taking over as chief global affairs officer. The company frames these updates as a return to its fundamental commitment to free expression, though critics may view them as a relaxation of safeguards put in place following past controversies around misinformation.
Meta plans to implement these changes gradually over the coming months, starting with the United States, while promising increased transparency about enforcement mistakes and policy outcomes.