TikTok has suspended 60,465 Kenyan accounts for breaching its community guidelines, a move the company says is part of its ongoing efforts to maintain a safe online environment.
According to TikTok’s second-quarter Community Guidelines Enforcement Report for Kenya, the platform also removed 57,262 accounts suspected to be operated by users under the age of 13. This action reflects the company’s commitment to ensuring that its platform is safe and appropriate for all users, particularly minors.
Globally, TikTok reported removing 178.8 million accounts, with 144.4 million of those flagged by automated systems. The platform also restored 5.4 million videos after review.
In Kenya, TikTok removed at least 360,000 videos, representing 0.3% of all videos uploaded on the platform during the latest reporting period. Notably, 99.1% of these videos were proactively taken down before any user reports, and 95% were removed within 24 hours.
The platform’s investments in content moderation technology have paid off. These systems help predict potential risks, enabling faster action on harmful content. TikTok’s automated moderation now handles 80% of violative videos, up from 62% a year earlier, reducing the need for manual review.
The removed content fell into several categories: 31% involved sensitive and mature themes, 27.9% related to regulated goods and commercial activities, 19.1% dealt with mental and behavioral health, and 15.1% concerned safety and civility. Smaller shares were removed for privacy and security violations (4.7%) and integrity concerns (2.1%).