Covert Targeting: How TikTok Ads Quietly Profile Minors Despite EU Protections

TikTok’s irresistible mix of short-form videos and hyper-personalised recommendations has turned it into the social platform of choice for millions of European teenagers. Yet beneath the dance challenges and meme trends, marketing practices are unfolding that quietly bypass European laws designed to keep minors safe from commercial profiling.

The Legal Backdrop: What the EU Actually Bans

The EU’s Digital Services Act (DSA) and the General Data Protection Regulation (GDPR) both restrict how minors may be profiled for advertising. Article 28 of the DSA prohibits platforms from presenting ads based on profiling that uses the personal data of users they know, with reasonable certainty, to be minors, while the GDPR singles out children’s data as meriting specific protection, particularly in the context of marketing and profiling. In theory, this means:

  • No targeting based on behavioural signals collected from a minor’s activity.
  • No personalised ads driven by inferred interests, location, or browsing history.
  • Full transparency whenever paid content is shown.

Violations can lead to fines of up to 6% of a company’s global annual turnover under the DSA, service restrictions, or, in cases of repeated non-compliance, temporary bans.

Inside TikTok’s Ad Engine

TikTok offers multiple pathways for brands to reach users:

1. In-Feed Ads

These appear natively among regular videos, using the same full-screen, swipe-up interface. TikTok’s automated auction system relies on user data points—watch time, likes, shares—to predict which ad a user is most likely to engage with.
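The ranking logic described above can be illustrated with a deliberately simplified sketch. This is not TikTok’s actual algorithm; all names, weights, and numbers below are hypothetical. It shows only the general principle common to engagement-based ad auctions: each candidate ad is scored by its bid multiplied by a predicted engagement probability derived from the user’s behavioural signals, so a user’s inferred interests can outweigh a rival advertiser’s higher bid.

```python
def expected_value(ad, user_affinity):
    """Toy scoring rule: bid x predicted engagement.

    Predicted engagement blends the user's inferred affinity for the
    ad's category (built from watch time, likes, shares) with the ad's
    own historical click-through rate. Weights are arbitrary.
    """
    p_engage = 0.5 * user_affinity.get(ad["category"], 0.0) + 0.5 * ad["historical_ctr"]
    return ad["bid"] * p_engage


def run_auction(ads, user_affinity):
    """The ad with the highest expected value wins the impression."""
    return max(ads, key=lambda ad: expected_value(ad, user_affinity))


# Hypothetical teen profile: heavy engagement with fashion content.
user = {"fashion": 0.9, "gaming": 0.2}

ads = [
    {"name": "sneaker_drop", "category": "fashion", "bid": 2.0, "historical_ctr": 0.04},
    {"name": "mobile_game", "category": "gaming", "bid": 5.0, "historical_ctr": 0.10},
]

winner = run_auction(ads, user)
# The fashion ad wins despite a much lower bid, because the user's
# behavioural profile dominates the score - which is precisely why
# profiling minors is commercially attractive and legally restricted.
```

The point of the sketch is that no sensitive data needs to leave the platform for targeting to occur: the behavioural affinity score alone steers which ad a teenager sees.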

2. Creator Partnerships

Brands send products or scripts to popular creators, who then post “organic” videos. Unless the creator manually toggles TikTok’s sponsorship-disclosure setting, these clips are indistinguishable from personal content.

3. Spark Ads

An advertiser can take an existing organic video and turn it into a paid placement without altering the original post. If a creator’s audience is largely teenage, the ad effectively targets minors—even when the advertiser itself never sees under-age data.

The Loophole: Undisclosed or Poorly Disclosed Ads

Because disclosure labels are small, easy to miss, and frequently omitted altogether, many teens cannot tell when they are watching sponsored material. A recent NGO study found that:

  • Only 18% of influencer ads aimed at minors used TikTok’s standard “paid partnership” disclosure.
  • Brand hashtags like #gifted or #collab were buried in caption folds, invisible unless users tapped “see more.”
  • Even when “ad” text was present, the clip’s personalised placement meant it still violated the profiling ban.

Why It Matters

The stakes go beyond pocket money spent on fast-fashion hauls. Covert commercial pressure can:

  • Distort self-image—beauty filters plus product plugs normalise unattainable standards.
  • Undermine informed choice—teens may not realise they are targets of persuasive design.
  • Collect shadow profiles—behavioural data harvested today can resurface in data-broker dossiers tomorrow.

Regulatory Pushback and Platform Responses

So far, enforcement has lagged behind the technological reality:

EU Commission Investigations

In late 2023, the Commission demanded detailed risk assessments from TikTok on child safety. The platform submitted thousands of pages, but investigations are ongoing and final rulings could take years.

TikTok’s Countermeasures

TikTok now advertises a “Zero-Data Personalisation” mode for minors in the EU. However, civil society audits show targeted ads still appear, suggesting the on-device algorithm draws on session activity to rebuild profiles in real time.

What Parents and Teens Can Do Now

  • Enable restricted mode: It reduces categories of mature content but also blunts some ad targeting vectors.
  • Scrutinise hashtags and captions: Phrases like #ad, #spon, #gifted, or #partner are red flags.
  • Reset ad interests: Under “Privacy & Ads” in settings, users can clear inferred interests, though the effect is temporary.
  • Report undisclosed ads: Use TikTok’s built-in reporting tool; documented patterns help regulators build cases.

Looking Ahead

With the DSA now fully enforceable and hefty fines looming, TikTok faces a pivotal choice: redesign its ad stack for age-appropriate transparency, or risk becoming the first major test case of Europe’s tougher stance on Big Tech. Either way, the platform’s treatment of its youngest users will serve as a bellwether for how seriously social media companies take children’s digital rights in the algorithmic age.

