Which social media platforms are the most socially responsible?

YouTube might have a head start, but IPG Mediabrands' audit shows that all platforms have something to work on.

IPG Mediabrands has released its new Media Responsibility Audit, which is based on the 10 Media Responsibility Principles Mediabrands recently released to the public. The principles are geared toward protecting brands and the communities that a brand serves, weighing the impact of harmful content, and evaluating the policies of different platforms and their enforcement.

Led by Mediabrands’ performance agency, Reprise, a cross-functional team conducted a comprehensive assessment of the main social media platforms – Facebook, LinkedIn, Pinterest, Reddit, Snapchat, TikTok, Twitch, Twitter, and YouTube – against each of the 10 principles. The audit comprised 250 questions in total and focused on establishing a benchmark for what a responsible platform looks like.

YouTube topped the list. Mediabrands viewed the platform as having a “head start” on brand safety, since it underwent a high-profile initiative to increase social responsibility – and thus brand safety – on the platform. Measures it took included changing the standards that allow creators and videos to be monetized, and staffing up so that videos are vetted for appropriateness not simply by AI mechanisms, but by humans. But while YouTube has evolved over the past three years, that evolution was spurred by brand boycotts from the likes of GM Canada and PepsiCo.

Mediabrands’ audit comes amidst renewed concern about brand safety. This year, it’s mainly been Facebook under the microscope; the platform has been accused of not doing enough to moderate and police hateful content. Those accusations resulted in a fairly significant boycott by Canadian and global brands including Lululemon, MEC and numerous financial institutions, although early results indicate that most have since gone back to advertising on Facebook.

But Matt Ramella, managing director, Reprise Canada, says these issues are part of a bigger picture.

“We took the view that the issues at hand are larger than just Facebook,” he tells MiC. Reprise involved nine different partners in the audit questionnaire.

“We examined not only policies and procedures, but also looked for evidence of consistent policy enforcement. To ensure platforms are walking the talk and standing by their policies, the enforcement criterion was given higher weighting across the audit questionnaire. We identified both the average and best-in-class across each principle to establish a benchmark on what a responsible platform looks like.”

There are still some areas that all platforms, universally, need to work on. Among Mediabrands’ findings is an urgent need for third-party verification – something only a few platforms offer – to protect advertisers from adjacency to objectionable content and harmful categories.

There is also a lack of consistency across platforms when it comes to anti-discrimination and data privacy. Most platforms are challenged by combatting misinformation and eradicating hate speech, since definitions of what qualifies as hate speech are inconsistent.

YouTube might have come out on top for now, but the current ranking is not the final word – the audit will be conducted quarterly. The assessment provides an objective, factual scoring that can be tracked for improvement over time. The analysis will better inform clients on the state of social media responsibility across the entire ecosystem and will help brands understand how each platform aligns with their own CSR principles and values. Each advertiser will weigh risk differently, and it’s important to align a brand’s “non-negotiables” with the decision to advertise on any one platform.

Ramella says the audit findings clearly show that platforms fall short by not backing up their promises with consistent policy enforcement. This is an area where marketers and advertisers can apply pressure to platforms by looking at prevalence. “Prevalence is the scale and frequency at which content that violates user policies and community guidelines is being shared and flagged on a platform. There’s an urgent need for platforms to adopt independent third-party auditing and tracking within this area versus having each of them grade their own policy enforcement.”