A graphic suicide video that went viral on TikTok in early September was “the result of a coordinated raid from the dark web”, the company has told MPs.
Giving evidence to the Commons committee for digital, culture, media and sport (DCMS), Theo Bertram, TikTok’s European director of public policy, said the video, which was originally broadcast live on Facebook, was used in a “coordinated attack” on the social video app a week after it was originally recorded.
“We learned that groups operating on the dark web made plans to raid social media platforms, including TikTok, in order to spread the video across the internet,” Bertram said.
“What we saw was a group of users who were repeatedly attempting to upload the video to our platform, and splicing it, editing it, cutting it in different ways,” he added. “I don’t want to say too much publicly in this forum about how we detect and manage that, but our emergency machine-learning services kicked in, and they detected the videos.”
Footage of the death was widely shared on the site, with prominent users eventually sharing advice on how to spot and avoid the video before it autoplayed.
Almost a week separated the death from the “raid” on TikTok. The attack has led the company to propose a “global coalition” to help protect users against such harmful content.
“Last night, we wrote to the CEOs of Facebook, Instagram, Google’s YouTube, Twitter, Twitch, Snapchat, Pinterest and Reddit,” Bertram said, “and what we are proposing is that, in the same way these companies already work together around [child sexual abuse imagery] and terrorist-related content, we should now establish a partnership around dealing with this type of content.”
Such a partnership would have allowed Facebook, for instance, to share technical details of the graphic video, such as a digital fingerprint of the footage, so that TikTok could have prevented it from being uploaded in the first place.
That would also ease the moderation burden on smaller companies. In the UK, TikTok has 363 moderators out of a staff of 800, Bertram revealed.
As well as working to keep the platform clear of harmful content, those moderators are tasked with enforcing TikTok’s age guidelines: “Every time you moderate a video with a human reviewer, that human reviewer, whatever else they’re reviewing that video for, is also looking to see whether that account belongs to someone under the age of 13,” Bertram said. If an account appears to belong to someone underage, it is removed.
Bertram also asked MPs to help simplify that process by requiring Apple and Google to build stronger age restrictions into their app stores.
“There are only two app stores,” he argued. “So if you said to the app stores: ‘This is the one place where we’re going to ask parents to give proof of age,’ then you’re not increasing the risk of data loss by making parents give out data for every app, but you can ensure that every app then needs to verify the age with that app store.
“It seems to me that that would be a solution to tougher age verification.”
Responding to a complaint from the Tory MP Steve Brine about “trashy” content on the app, such as “content where there’s a mother and daughter pushing their tushy, if you know what I mean”, Bertram recommended MPs follow Andrew Lloyd Webber’s videos.
“Does he push his tushy?” Brine asked. Bertram confirmed that no, Andrew Lloyd Webber does not, “but he is quite entertaining”.
•
In the UK and Ireland, Samaritans can be contacted on 116 123 or email jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at www.befrienders.org.