In a significant shift in its content moderation strategy, Meta, the parent company of Facebook, Instagram, and Threads, is officially terminating its fact-checking program in the United States.
Joel Kaplan, Meta’s chief global affairs officer, announced the changes on Friday, marking the end of a controversial era and the beginning of a new approach to handling misinformation on its platforms.
“By Monday afternoon, our fact-checking program in the US will be officially over. That means no new fact checks and no fact checkers,” Kaplan stated in a social media post.
“We announced in January we’d be winding down the program & we haven’t applied penalties to fact-checked posts in the US since then. In place of fact checks, the first Community Notes will start appearing gradually across Facebook, Threads & Instagram, with no penalties attached.”
The decision to phase out the fact-checking initiative, which relied on third-party organizations to verify the accuracy of content, follows months of scrutiny and debate over its effectiveness and perceived biases.
Launched in 2016 to combat the spread of misinformation, the program partnered with independent fact-checkers to label false or misleading posts, often reducing their visibility or adding warning notices.
However, critics argued it disproportionately targeted certain viewpoints and stifled free expression, while supporters credited it with curbing the viral spread of hoaxes and conspiracy theories.
Meta’s pivot to Community Notes—a system inspired by X’s crowd-sourced fact-checking feature—signals a move toward a more decentralized model. Unlike the previous program, Community Notes will allow users to collaboratively submit and vote on contextual notes to accompany posts, providing additional information without imposing penalties like reduced reach or account restrictions.
Kaplan emphasized that this approach aims to foster transparency and user engagement rather than top-down moderation.
The transition begins Monday, with Community Notes expected to roll out gradually across Meta’s platforms.
While the company has not detailed the exact mechanics of the system, it is likely to rely on a combination of user submissions and algorithmic oversight to ensure relevance and accuracy. The absence of penalties has already sparked mixed reactions: some praise the hands-off approach as a win for free speech, while others worry it may leave misinformation unchecked.
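For readers curious how crowd-sourced note systems of this kind typically decide what to display, the sketch below is a minimal, hypothetical illustration loosely modeled on the publicly documented idea behind X's Community Notes, where a note is surfaced only when raters who usually disagree both find it helpful. Meta has not published the mechanics of its own system, so every name, threshold, and rule here is an assumption for illustration only.

```python
from dataclasses import dataclass, field
from statistics import mean

# Hypothetical sketch of a crowd-sourced note-rating scheme.
# Meta has not disclosed how its Community Notes will work; this loosely
# mirrors the "bridging" idea described for X's system: a note is shown
# only when raters from otherwise-disagreeing groups both rate it helpful.

@dataclass
class Rating:
    rater_group: str   # e.g. "A" or "B", inferred from past rating behavior
    helpful: bool      # this rater's verdict on the note

@dataclass
class Note:
    note_id: str
    text: str
    ratings: list[Rating] = field(default_factory=list)

def should_display(note: Note, min_ratings: int = 5, threshold: float = 0.7) -> bool:
    """Surface a note only if every rater group independently finds it helpful."""
    groups: dict[str, list[bool]] = {}
    for r in note.ratings:
        groups.setdefault(r.rater_group, []).append(r.helpful)

    # Require enough total ratings and at least two distinct groups,
    # so a single like-minded cluster cannot surface a note on its own.
    if len(note.ratings) < min_ratings or len(groups) < 2:
        return False

    # The helpful rate must clear the threshold in *each* group.
    return all(mean(votes) >= threshold for votes in groups.values())

# Example: a note rated helpful across both groups gets displayed.
note = Note("n1", "The photo predates the event described in this post.")
note.ratings += [Rating("A", True), Rating("A", True), Rating("A", True),
                 Rating("B", True), Rating("B", True), Rating("B", True), Rating("B", False)]
print(should_display(note))  # True
```

The key design choice in systems like this is that raw vote counts are not enough; agreement has to span groups that normally rate differently, which is what replaces the top-down penalties Meta is retiring.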
This article was originally published by Shore News Network (www.shorenewsnetwork.com).