Whistleblowers Reveal How Meta and TikTok Algorithms Prioritised Outrage Over Safety

Internal whistleblowers have told the BBC that social media giants Meta and TikTok made decisions that allowed harmful content to spread widely, prioritising engagement over user safety.

More than a dozen insiders exposed how the companies took risks on issues including violence, sexual blackmail, and terrorism while competing for users’ attention. Meta employees said management encouraged the inclusion of “borderline” harmful content — such as misogyny, conspiracy theories, and racially charged posts — to compete with TikTok and maintain stock value.

A TikTok insider revealed how internal dashboards showed political content being prioritised over reports of harmful posts affecting children, highlighting the tension between public safety and corporate interests.

Meta’s Instagram Reels and the Race for Engagement

According to Matt Motyl, a former senior Meta researcher, Instagram Reels was launched in 2020 without sufficient safeguards. Internal research documents shared with the BBC showed Reels’ comment sections had significantly higher levels of bullying, harassment, hate speech, and incitement compared with the main Instagram feed.

Despite assigning around 700 staff to grow Reels, the company declined additional hires for its child safety and election integrity teams. Motyl and other whistleblowers said Meta’s algorithms rewarded outrage because it drove engagement, admitting that the “financial incentives… do not appear to be aligned with our mission.”

Former employees described a trade-off between boosting engagement and protecting users. One engineer said the decision to allow borderline harmful content came directly from senior leadership as part of a “do whatever we can to catch up” approach to TikTok’s rapid growth.

TikTok’s Algorithm and Safety Concerns

Ruofan Ding, a former machine-learning engineer at TikTok, explained that engineers had limited control over the deep-learning algorithms. Content was treated as data points, while trust and safety teams were expected to prevent harmful posts from being promoted.

Despite these safeguards, whistleblowers said borderline content, including misogynistic, racist, and sexualised posts, increased over time. One teenager, Calum, said he was “radicalised by the algorithm” from age 14, exposed to outrage-inducing content that shaped racist and misogynistic views.

A member of TikTok’s trust and safety team, whom the BBC calls Nick, said the volume of cases was overwhelming. Some serious reports involving children were deprioritised in favour of political content, leaving users exposed to risks of cyberbullying, sexual exploitation, and radicalisation. Nick advised parents to keep their children away from the platform.

Internal Documents Expose the Impact of Algorithms

Internal Meta studies revealed that content that triggers outrage produces higher engagement, which in turn prompts the algorithm to show more of it. Documents warned that sensitive content, including posts touching on moral beliefs or inciting violence, disproportionately drove interactions.

Brandon Silverman, who helped develop Facebook’s monitoring tool CrowdTangle, said Meta leadership was “paranoid” about competition with TikTok, which intensified the prioritisation of engagement over safety. Decisions about staffing and product launches often favoured growth metrics over harm prevention.

Company Responses

Meta rejected the allegations, stating: “Any suggestion that we deliberately amplify harmful content for financial gain is wrong” and emphasising investments in user safety and teen protections. TikTok also denied the claims, calling them “fabricated” and pointing to automated safety features, parental controls, and moderation systems designed to prevent harmful content from reaching users.

Despite these denials, whistleblowers argue that internal incentives and management priorities created conditions where harmful content could flourish, raising questions about the true cost of the algorithm-driven attention economy.
