Over 95% of flagged self-harm posts detected by just two major platforms
The UK-based Molly Rose Foundation (MRF) has released new research that reveals children are being put at risk by the failure of major social networks to detect and remove dangerous suicide and self-harm content.
Analysis of over 12 million content moderation decisions by six major tech platforms shows that over 95% of all suicide and self-harm posts detected across those services were flagged by just two platforms, Pinterest and TikTok.
The report found that Instagram and Facebook are each responsible for only 1% of all suicide and self-harm content detected by major platforms. X, formerly known as Twitter, is responsible for just 1 in 700 content decisions.
Chair Ian Russell has urged the UK government to commit to a new Online Safety Act that can strengthen regulation and ‘finish the job.’
MRF analysed publicly available records of over 12 million content moderation decisions taken by six sites: Facebook, Instagram, Pinterest, Snapchat, TikTok and X. Under the EU’s Digital Services Act, these platforms must publish records every time they detect and take action on an item of suicide and self-harm content.
Social media platforms are routinely failing to detect harmful content in the highest-risk parts of their services. For example, only 1 in 50 suicide and self-harm posts detected by Instagram were videos, despite its short-form video product, Reels, now accounting for half of all time spent on the app.
Own rules
Most major services fail to do enough to enforce their own rules, according to the report: for example, while TikTok detected almost 3 million items of suicide and self-harm content (2,892,658 decisions), it suspended only two accounts.
The foundation also found no evidence that Meta is implementing high-profile commitments it made to restrict harmful suicide and self-harm content from children’s feeds. Despite promising to restrict harmful content in January 2024, shortly before Mark Zuckerberg gave evidence to a US Senate hearing, Meta has so far failed to restrict a single item of content for teens.
Ian Russell, chair of the Molly Rose Foundation, said: "It's shocking to see most major tech companies continue to sit on their hands and choose inaction over saving young lives."