China’s TikTok actively pushes pornography and other sexual content on 13-year-old users, according to researchers who created fake accounts of young teens to see what the Chinese platform’s algorithm would suggest to them.

In addition to TikTok allowing children to access pornography and sexual material, the app’s algorithm actually pushes minors toward the content, according to a report by the nonprofit investigative research organization Global Witness.

The discovery was made after researchers used “clean phones” with no search history to create TikTok accounts posing as 13-year-olds. They even activated the Chinese app’s “Restricted Mode,” which TikTok advertises as a safety feature that can be used by parents to limit the exposure of certain content for minors.

“You shouldn’t see mature or complex themes, such as sexually suggestive content,” TikTok informs a user once the protective feature is turned on.

The first tests were conducted during March and April 2025, with additional tests being carried out over the summer.

“As any 13-year-old user might, we clicked on the search bar, and then on one of the search suggestions, and looked to see the kind of content that was being shown,” Global Witness explained.

“We then repeated this process a few times, going back to the search bar, looking again at what search terms were being suggested and what kind of content was behind one of them,” the organization added.

Three of the test accounts displayed sexualized search suggestions auto-filling in TikTok’s search bar, which lists around ten searches “you may like.”

Some of the search bar suggestions on the children’s accounts included terms like “very very rude skimpy outfits” and “woman kissing her man while washing his…,” which then led to content showing women touching themselves or exposing their breasts.

At its worst, TikTok’s search suggestions led to pornographic footage of penetrative sex, with researchers finding that the porn had been edited into seemingly innocent photos or videos — which may offer some explanation as to how it evaded TikTok’s content moderation efforts.

“For one of the users, there was pornographic content just two clicks away after logging into the app — one click in the search bar and then one click on the suggested search,” Global Witness said.


The organization added that other TikTok users have been complaining about the app recommending sexualized search suggestions to them, noting that this does not appear to be an isolated issue.

“We encountered several examples of TikTok users posting screenshots of sexualized search suggestions,” Global Witness said, citing other users’ captions, such as “can someone explain to me what is up w my search recs pls.”

Other TikTok users commented on those posts, writing, “I THOUGHT I WAS THE ONLY ONE,” “how tf do you get rid of it like I haven’t even searched for it,” and “same, what’s wrong with this app.”

Global Witness said it conducted its most recent investigation to see if any changes were made after the organization carried out a similar experiment in January, which yielded similar findings.

After its first investigation earlier this year, Global Witness reported its findings to TikTok, which reportedly responded, “We have reviewed the content you shared and taken action to remove several search recommendations globally.”

“And yet we found the issue hadn’t been resolved in a later investigation,” the organization said. “We gave TikTok the opportunity to comment on our findings and they said they took action on more than 90 pieces of content and removed some of the search suggestions that had been recommended to us.”

TikTok told Global Witness it is still reviewing its youth safety strategies.

Alana Mastrangelo is a reporter for Breitbart News. You can follow her on Facebook and X at @ARmastrangelo, and on Instagram.


