TikTok removed 60,465 Kenyan accounts from its platform between April and June 2024 for violating community guidelines, according to the company’s second-quarter Community Guidelines Enforcement Report for Kenya.
In addition to these removals, TikTok says a further 57,262 Kenyan accounts were taken down on suspicion of belonging to users under the age of 13.
Globally, the report highlights that TikTok removed a total of 178.8 million accounts, 144.4 million of them through automated processes. Of the videos removed worldwide, 5.4 million were later restored after review.
“With over a billion people and millions of pieces of content posted to our platform every day, we continue to prioritize and enhance TikTok’s automated moderation technology. Such technology enables faster and consistent removal of content that violates our rules,” said the company.
In Kenya, the report noted that approximately 360,000 videos were removed during this period, representing 0.3% of the total videos uploaded in the country. TikTok further reported that 99.1% of these videos were removed proactively, before any user reports, and that 95% were taken down within 24 hours of being posted.
“We invest in technologies that improve content understanding and predict potential risks so that we can take action on violative content before it’s viewed. These technical investments also reduce the volume of content that moderators review, helping minimize human exposure to violative content. As a result, automated technology now removes 80% of violative videos, up from 62% a year ago,” the report reads.
Of the total removed content, 31% involved sensitive and mature themes; 27.9% related to regulated goods and commercial activities; 19.1% addressed mental and behavioral health issues; 15.1% pertained to safety and civility; 4.7% concerned privacy and security; and 2.1% related to integrity and authenticity.