TikTok, a video-sharing platform, removed 7.3 million accounts belonging to children under the age of 13 in the first quarter of this year.
According to the BBC, TikTok published the figures in its Community Guidelines Enforcement Report for the first time, in the hope that the under-age user data will “help the industry push forward when it comes to transparency and accountability around user safety”.
Community Guidelines Enforcement Report
Here are the key details from the report:
– 61,951,327 videos were removed for violating the app’s rules, representing fewer than 1% of all videos uploaded
– 82% of these were removed before being viewed, 91% before any user reported them, and 93% within 24 hours of being posted
– 1,921,900 ads were rejected for violating advertising policies and guidelines
– 11,149,514 accounts were removed in total for violating guidelines and terms of service.
“To bring more visibility to the actions we take to protect minors, in this report we added the number of accounts removed for potentially belonging to an under-age person,” Cormac Keenan, head of trust and safety at TikTok, said.
Underage TikTok users
There has been widespread concern that many of the app’s users are minors. The video-sharing app has also been sued for collecting and using data from children.
The claim was made on behalf of millions of children in the UK and EU who use the platform. The tech firm said the case was without merit and that it would fight it, the BBC reports.
TikTok has already introduced measures to protect teenagers online, such as restricting features like private messaging and live streaming to users aged 16 and older.
In January this year, TikTok introduced a change that automatically sets the accounts of users under the age of 16 to private.