TikTok Accused of Pushing Explicit Search Terms to 13-Year-Olds

A new investigative report from Global Witness reveals that TikTok’s algorithm actively suggested sexually explicit search terms to accounts registered as belonging to 13-year-olds, even when “Restricted Mode” was enabled. Test accounts were set up on fresh phones with no prior search history, yet the platform’s “You May Like” search suggestions still included phrases like “very rude babes” and escalated to more explicit terms such as “hardcore porn clips.” (Paraphrased from Global Witness findings)

In some instances, pornographic content appeared within just two clicks: one on the search bar, and a second on the suggested phrase. The investigators say the explicit content was sometimes hidden behind innocuous video thumbnails to evade moderation filters. The findings suggest TikTok’s content moderation and recommendation systems are failing to shield young users.

Based on these results, Global Witness argues TikTok may be in breach of the UK’s Online Safety Act, which took effect in July 2025 and mandates that platforms protect minors from harmful content. The group reported its evidence to authorities and called on regulators such as Ofcom to investigate.

TikTok responded by saying it had removed the offending videos and is updating its search algorithm to reduce harmful suggestions. Critics warn, however, that the platform’s recommendation logic could still steer minors toward dangerous content, and that more systemic reform and independent oversight are needed.
