Meta plans to axe third-party fact checking

In a move meant to increase free speech, the company is moving to the community notes system used by X.

Meta is planning to end its third-party fact-checking program and instead move to the Community Notes model used by X.

In a company statement on Tuesday, Meta’s chief global affairs officer Joel Kaplan stated that this change will first take place in the U.S. and that the company will lift restrictions on some topics that are part of the mainstream discourse, while continuing to focus its enforcement on illegal and high-severity violations.

Meta also plans to take a more personalized approach to political content, according to Kaplan. “Since 2021, we’ve made changes to reduce the amount of civic content people see – posts about elections, politics or social issues – based on the feedback our users gave us that they wanted to see less of this content. But this was a pretty blunt approach. We are going to start phasing this back into Facebook, Instagram and Threads with a more personalized approach so that people who want to see more political content in their feeds can.”

The platform’s original fact-checking program was launched in 2016 “as a way for independent experts to give people more information about the things they see online, particularly viral hoaxes, so they were able to judge for themselves what they saw and read,” said Kaplan. The program also gave advertisers more assurance of brand safety. However, Meta claims it was not working, particularly in the U.S.

Devon MacDonald, president of Cairns Oneil, says the agency has seen significant brand safety issues with other social platforms reducing content controls.

“The announcements today give increased concerns about Meta. Civic, political or moderate-risk content all carry brand safety concerns for clients,” says MacDonald. “With the negative impact of social media on young people already being well established, we’ve continually reviewed our investments on platforms like Instagram. This announcement will further that scrutiny and we’ll be proactively seeking out ways to protect our clients while looking for new brand safety protections.”

Meta does not see its role or responsibility to be the arbiter of truth, said Kaplan, pointing to the Community Notes approach on X, which has helped to “empower the community to decide when posts are potentially misleading and need more context, and people across a diverse range of perspectives decide what sort of context is helpful for other users to see. We think this is a better way of achieving its original intention of providing people with information about what they’re seeing – and one that’s less prone to bias.”

“Social Platforms, if left unmoderated, will trend towards the loudest voices. Reddit, BlueSky, X, TikTok, all those platforms have a distinct voice. Specifically, a distinct political voice,” says Mohsen Dezyanian, president of Empathy Inc. “That means, media planners must now think like journalists more than ever before. It’s not just about reach, and CPM, and CPC. It’s about the voice of the platform. More and more media will become fragmented. And your brand’s very presence on a platform will say something about your brand. Whether you intended it or not.”

Lakshmi Radhakrishnan, director of Performance Marketing at Involved Media, says, “Meta owns three of the top four large social media networks in the world and, as a result, Meta bears the responsibility for setting regulations that are in line with each country’s policies, but also for setting the standards for moderating discourse – such as removing hate speech and fact checking. This move is a huge step back for regulations in the social space.”

Kate Dorofeeva, Involved Media’s director of Digital Strategy, says that with regulations getting pulled off of gender diversity and immigration conversations, and with the rise of AI-generated content, brand safety is a growing risk. “Given the size of the audience housed within this umbrella, brands may not completely boycott this avenue but should seriously consider diversifying their portfolio into other platforms to make smarter and safer bets on where they can reach their audience. As an agency, we need to continuously refine our internal brand safety standards to ensure a client’s ad is placed in a safe environment and steer them in the direction that aligns with their business goals.”

To begin, Community Notes will be phased in in the U.S. over the next couple of months. During this time, Meta says it will “remove the fact-checking control, stop demoting fact-checked content and, instead of overlaying full screen interstitial warnings users have to click through before they can even see a post, we will use a much less obtrusive label indicating that there is additional information for those who want to see it.”