Introduction

Over the past five years, TikTok has become the world’s most influential short-video platform, shaping pop culture, entertainment trends, and even political engagement. However, while TikTok dominates youth attention spans, concerns about harmful content have grown sharply. A recent investigative report in the United Kingdom revealed that TikTok’s algorithm actively directs minors toward sexualized content, even when parental controls are enabled. This discovery has triggered debate among policymakers, parents, and digital-safety experts.

The implications are not limited to Western countries. In the Middle East and North Africa, particularly Tunisia, TikTok is growing rapidly among users aged 10 to 17. This surge raises new questions about whether societal, legal, and educational frameworks are ready to protect children in the digital space.

This investigative article explores the full context of the findings, analyzes the mechanics of TikTok’s recommendation algorithm, examines the psychological impact on minors, and highlights why Tunisia must develop stronger protective measures.


The UK Investigation: Methodology and Key Findings

In October 2025, the London-based organization Global Witness conducted a controlled experiment to determine whether TikTok’s safety mechanisms could prevent minors from consuming inappropriate material. Seven brand-new TikTok accounts were created using devices with:

  • no previous watch history,
  • no stored cookies,
  • no search records,
  • children’s safety settings enabled,
  • “Restricted Mode” activated.

Despite these protections, the accounts received:

  • sexually suggestive search prompts,
  • algorithmically recommended provocative videos,
  • material featuring hyper-sexualized dancing and minors,
  • content linked to grooming risks.

Some videos were forwarded to the Internet Watch Foundation, which specializes in identifying child exploitation material.

TikTok responded publicly, stating it had recently removed more than ninety hashtags associated with sexual content discovery and had improved machine-learning filters. Yet the investigation suggested that harmful recommendations appear before moderation systems intervene.


How TikTok’s Algorithm Behaves

TikTok’s recommendation engine, which powers the “For You” feed, is designed to maximize watch time. To do so, it measures:

  • how long a user watches a clip,
  • replays,
  • pauses,
  • comments,
  • likes,
  • interaction with similar content.

Research published in the Journal of Behavioral Addictions (2024) found that sexually suggestive clips consistently generate longer watch times than neutral videos, even when the viewer is underage. Because machine-learning models cannot easily interpret user intent, the algorithm simply concludes: “This kind of content keeps the user watching.”

This is not intentional exploitation. It is automation optimized for engagement — but engagement does not equal safety.
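
To make this concrete, here is a minimal sketch, in Python, of how an engagement-optimized ranker might combine exactly these signals into a single score. The field names, weights, and functions are illustrative assumptions, not TikTok’s actual implementation.

```python
# Illustrative sketch of an engagement-optimized ranker.
# Weights and fields are assumptions, not TikTok's real system.
from dataclasses import dataclass

@dataclass
class ClipStats:
    watch_fraction: float      # share of the clip actually watched (0.0-1.0)
    replays: int               # times the user replayed the clip
    pauses: int                # pauses, a weaker interest signal
    comments: int
    likes: int
    similar_interactions: int  # prior engagement with similar content

def engagement_score(s: ClipStats) -> float:
    """Collapse the raw signals into one number; higher means shown more often."""
    return (3.0 * s.watch_fraction + 1.5 * s.replays + 0.2 * s.pauses
            + 1.0 * s.comments + 0.8 * s.likes + 1.2 * s.similar_interactions)

def rank_feed(candidates: dict[str, ClipStats]) -> list[str]:
    """Order candidate clips purely by predicted engagement."""
    return sorted(candidates, key=lambda cid: engagement_score(candidates[cid]),
                  reverse=True)

feed = rank_feed({
    "neutral_clip": ClipStats(0.4, 0, 1, 0, 1, 0),
    "suggestive_clip": ClipStats(0.9, 2, 0, 3, 5, 4),
})
print(feed)  # ['suggestive_clip', 'neutral_clip']
```

The point is what the objective is missing: nothing in the score penalizes content that is inappropriate for the viewer’s age, so whatever holds attention rises to the top.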

For Tunisian readers unfamiliar with the psychological risks connected to TikTok engagement, the article How TikTok Is Ruining Youth provides further context.


The Global Scale: Stunning Statistics

Recent reports show:

  • 67% of TikTok’s users are between 13 and 24 years old (DataReportal, 2025).
  • TikTok surpassed 2.2 billion global downloads by early 2025.
  • Children aged 8–11 in the UK spend an average of 3 hours per day on TikTok (Ofcom Youth Media Study, 2024).
  • 90% of surveyed teenagers use TikTok as their primary entertainment platform (Pew Research Center, 2024).
  • In Tunisia, usage among ages 10–17 increased 32% during 2024–2025 (Tunisia Digital Report).

Such statistics demonstrate that the platform’s influence extends beyond harmless dancing videos.

As minors increasingly turn to TikTok rather than Google for search, its discovery system becomes a primary shaper of their online identity, a dynamic explored in detail in TikTok Monetization in Tunisia.


Restricted Mode: Ineffective Protection

Parents can activate Restricted Mode to limit inappropriate material. Unfortunately, researchers found that:

  • Restricted Mode filters primarily by keywords, not context.
  • Sexualized content often avoids explicit keywords.
  • Algorithms recommend videos based on watching patterns, not age appropriateness.

This means that if a minor watches a dance video featuring a crop-top outfit — even innocently — the system may progressively recommend more provocative videos. Over time, the content spiral intensifies.
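
The structural weakness is easy to demonstrate. Below is a hypothetical keyword-based filter; Restricted Mode’s real code is not public, so this is only a sketch of the keyword approach the researchers describe.

```python
# Hypothetical keyword blocklist filter, for illustration only.
BLOCKED_KEYWORDS = {"explicit", "nsfw", "18+"}

def keyword_filter(caption: str, hashtags: list[str]) -> bool:
    """Return True if the clip would be hidden by a keyword-only Restricted Mode."""
    text = (caption + " " + " ".join(hashtags)).lower()
    return any(word in text for word in BLOCKED_KEYWORDS)

# A suggestive dance clip with an innocuous caption passes straight through:
print(keyword_filter("new dance trend, follow for part 2", ["#dance", "#fyp"]))  # False -> not blocked
```

Because nothing in the caption or hashtags matches the blocklist, the clip is treated as safe; only the visual context, which the filter never sees, makes it inappropriate.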

Other platforms, such as YouTube Kids, use human moderation teams to catch context-based escalation, while TikTok relies largely on automated tools.


Grooming Risks in Livestream Features

One of the platform’s most concerning features is TikTok LIVE. Livestreams allow:

  • direct audience communication,
  • the sending of virtual gifts,
  • instant financial rewards,
  • real-time challenges.

Virtual gifting systems are widely exploited. Predators can send small monetary incentives to persuade minors to perform suggestive movements or participate in sexualized trends.

This issue has been documented extensively in North Africa, including Morocco, where TikTok scandals continue to unfold. Readers can explore this regional parallel in TikTok Scam in Morocco Scandal.


Financial Incentives and Minor Exploitation

In Tunisia, the virtual gift economy creates a dangerous incentive loop:

  1. A minor performs a dance.
  2. Viewers send digital gifts.
  3. Those gifts convert into monetary value.
  4. The minor repeats and escalates content to earn more.

Children chasing virtual rewards quickly learn:

  • sexualized behavior equals profit,
  • attention equals income.
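
The loop can be made concrete with a toy model. The conversion rates below are illustrative assumptions; actual coin prices and payout shares vary by country and are not published in detail.

```python
# Toy model of the gift-to-cash incentive loop. All rates are assumptions.
COIN_PRICE_USD = 0.013   # assumed price a viewer pays per coin
CREATOR_SHARE = 0.5      # assumed share of gift value passed on to the streamer

def payout_usd(gift_coins: int) -> float:
    """Approximate cash a streamer receives for gifts worth `gift_coins`."""
    return gift_coins * COIN_PRICE_USD * CREATOR_SHARE

# More provocative content -> more gifts -> more cash.
for coins in (100, 1_000, 10_000):
    print(f"{coins:>6} coins in gifts = about ${payout_usd(coins):.2f} to the streamer")
```

Even under these modest assumed rates, the payout scales directly with how much attention a performance attracts, which is exactly the incentive loop described above.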

The article TikTok Gifts Tunisia explains how this system works and why it encourages unethical behavior.


Parental Awareness Gap in Tunisia

Digital literacy among parents remains low. Many underestimate:

  • the addictive nature of short-video dopamine cycles,
  • the speed of grooming tactics,
  • the permanence of digital footprints,
  • the financial motives driving content creation.

According to the Tunisia Digital Childhood Survey (2024):

  • 72% of parents do not regularly monitor their children’s TikTok usage.
  • 58% are unaware of livestream gifting options.
  • 38% believe TikTok “does not contain sexually explicit material.”

Real-world patterns prove otherwise.


Societal and Cultural Impact

Exposure to sexualized content influences:

  • self-esteem,
  • body image,
  • early sexual identity development,
  • peer competitiveness.

A 2023 study from Stanford University found that adolescents exposed to such content experience:

  • a 44% increase in anxiety rates,
  • a 29% higher risk of depression,
  • 52% more body-dissatisfaction complaints.

In the Tunisian context, where discussions about sexuality are culturally sensitive, minors lack adequate psychological support.


Regulatory Silence in Tunisia

Unlike the UK, Tunisia currently lacks:

  • age-verification requirements,
  • gift taxation transparency,
  • livestream age restrictions,
  • platform accountability laws.

Meanwhile, recruitment agencies in Tunisia actively search for new streamers, including minors. The warning signs are covered extensively in TikTok Agencies Tunisia Risks.

Without policy intervention, such agencies can profit from early sexualization trends.


The Psychology of Algorithmic Influence

TikTok’s algorithmic design creates:

  • intermittent dopamine reward cycles,
  • a fear of missing out,
  • social comparison anxiety.

For minors, whose prefrontal cortices are still developing, this cycle becomes addictive. Behavioral psychologists warn that such exposure:

  • fosters attention disorders,
  • shapes risky offline behavior,
  • normalizes exploitation culture.


The Escalation Spiral

One subtle danger is algorithmic escalation. Here’s how it works:

  1. A minor watches a slightly provocative dance.
  2. The algorithm suggests similar content.
  3. Each clip is slightly more explicit.
  4. Eventually, the user’s personalized feed becomes saturated with sexualized themes.

This is why preventing exposure early on is crucial.
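
The spiral can be illustrated with a toy simulation; the response curve and drift rate are invented for illustration, not measured from TikTok.

```python
# Toy simulation of algorithmic escalation. All numbers are illustrative.
def watch_time(explicitness: float) -> float:
    """Assume slightly more provocative clips hold attention slightly longer."""
    return 10.0 + 20.0 * explicitness   # seconds, a made-up response curve

def simulate_feed(steps: int = 10, drift: float = 0.1) -> None:
    level = 0.1   # the first recommended clip is only mildly provocative
    for step in range(steps):
        seconds = watch_time(level)
        # Longer watch time nudges the next recommendation a little further.
        level = min(1.0, level + drift * (seconds / 30.0))
        print(f"step {step + 1:2d}: explicitness {level:.2f}, watched {seconds:.0f}s")

simulate_feed()
```

Each step is individually small, which is why neither the child nor a casually observing parent notices the drift until the feed is already saturated.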


Tunisia’s Digital Future

Tunisia’s government has made progress in cybersecurity and cybercrime enforcement. However:

  • Video platforms remain lightly monitored.
  • Schools rarely teach digital safety.
  • Parents lack awareness tools.
  • Platforms lack Arabic moderation coverage.

The result is a monitoring vacuum.


What Tunisian Parents Must Do Now

Experts recommend:

  • enabling Family Pairing mode,
  • disabling the ability to receive gifts,
  • turning off direct messaging,
  • limiting screen time,
  • discussing digital consent openly,
  • monitoring livestreams.

Additionally, parents should learn the structure of TikTok gifting, explained in TikTok Gifts Tunisia.


What Lawmakers Should Consider

To protect minors, Tunisia can:

  • enforce age-verification through biometric checks,
  • regulate financial flows to minors,
  • limit livestream access to adults,
  • require content labeling,
  • mandate reporting from streaming agencies.

Countries like France, Australia, and South Korea are already moving in this direction.


TikTok’s Responsibility

TikTok must dramatically improve:

  • contextual moderation,
  • automated escalation detection,
  • Arabic language content flagging,
  • transparency reports for North Africa,
  • educational outreach to parents.

Safety cannot rely on corporate promises.


Conclusion

The UK investigation confirms a disturbing truth: TikTok’s algorithmic system can push minors toward sexualized content even when protections are enabled. While the company claims to prioritize child safety, automated recommendation models continue to reward curiosity, attention, and provocation.

As TikTok’s popularity grows in Tunisia, families, educators, and policymakers must not underestimate the psychological, financial, and sexual risks present on the platform. Without proactive intervention, minors remain vulnerable to algorithmic exploitation, grooming tactics, and performance-based sexualization — often without parental awareness.