Pilot phase
Confirmed · Contested · Severity: Important · Version 1

The most significant privacy enforcement action against an AI system in Canada. Four federal and provincial commissioners jointly concluded that TikTok's machine-learning profiling of children had no legitimate purpose, making consent legally irrelevant. The finding that TikTok possessed sophisticated AI age-detection capabilities but chose not to use them to protect children sets a precedent for regulatory expectations.

Occurred: January 1, 2020 to September 23, 2025 · Reported: September 23, 2025

Narrative

A joint investigation by four federal and provincial privacy commissioners found that TikTok collected personal information of Canadian children for ML-based algorithmic profiling, facial analytics, and targeted advertising without any legitimate purpose — making consent legally irrelevant.

The investigation, announced in February 2023 and published as PIPEDA Findings #2025-003 on September 23, 2025, was the largest coordinated privacy enforcement action against an AI system in Canada. The Office of the Privacy Commissioner of Canada led the investigation jointly with Quebec’s Commission d’accès à l’information, BC’s Office of the Information and Privacy Commissioner, and Alberta’s Office of the Information and Privacy Commissioner.

TikTok operates multiple interconnected ML systems that profiled its 14 million Canadian monthly active users, including children. Its content recommendation algorithm powers the “For You” feed using behavioural and inferred data. Convolutional neural networks analyze facial features for age and gender estimation. Audio analytics extract additional signals. Multiple age-estimation models — video-level, account-level, advertising, and TikTok LIVE — classify users into demographic segments. These systems collectively inferred users’ interests, location, age range, gender, spending power, and — critically — sensitive attributes including health data, political opinions, gender identity, and sexual orientation from content and behaviour patterns.

A central finding was that TikTok’s age assurance mechanisms were inadequate. TikTok relied on three weak measures: a voluntary age gate (easily circumvented by entering a false birthdate), minimal automated keyword scanning that only caught users who posted text, and human moderation triggered by user reports. Since 73.5% of TikTok users never post videos and 59.2% never comment, the vast majority of underage users — passive consumers of algorithmically recommended content — escaped detection entirely. TikTok removed approximately 500,000 underage Canadian accounts per year, but commissioners concluded the actual number of underage users was “likely much higher.” In Quebec, 40% of children aged 6–17 and 17% of children aged 6–12 had TikTok accounts.
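The scale of this blind spot can be sketched with simple bounds. As an illustration (not a calculation from the findings themselves), the reported shares of users who never post videos (73.5%) and never comment (59.2%) let us bound the fraction of users who produce no text at all, and who are therefore invisible to keyword scanning, via the Fréchet inequalities:

```python
# Back-of-envelope bounds on the share of users invisible to text-based
# keyword scanning, using the figures reported in the findings:
# 73.5% never post videos, 59.2% never comment.
p_never_posts = 0.735
p_never_comments = 0.592

# A user escapes keyword scanning only if they neither post nor comment.
# Without joint data, P(never posts AND never comments) can only be bounded:
lower = max(0.0, p_never_posts + p_never_comments - 1.0)  # Fréchet lower bound
upper = min(p_never_posts, p_never_comments)              # Fréchet upper bound
independent = p_never_posts * p_never_comments            # if behaviours were independent

print(f"silent-user share: {lower:.1%} to {upper:.1%} "
      f"(~{independent:.1%} under independence)")
```

Since posting and commenting are likely positively correlated, the true share of silent users sits toward the upper end of that range; either way, a large fraction of passive consumers cannot be reached by text-only detection.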

The commissioners found that TikTok possessed sophisticated AI-based age-detection capabilities but did not deploy them to prevent underage access. BC Commissioner Michael Harvey pointed to the "elaborate profiling" that combined facial and voice data with location data to "create inferences about spending power", capabilities showing that TikTok could identify children yet chose not to use those tools for their protection.

TikTok’s advertising targeting system exposed sensitive attributes. Hashtags like “#transgendergirl” and “#transgendersoftiktok” were available as ad targeting options, enabling advertisers to target users based on transgender status. TikTok was unable to explain why these hashtags had been available and later confirmed they “should not have been available.”

The commissioners also found that TikTok disclosed that affiliate companies and employees in China could access personal information collected from Canadian users — a finding that commentators noted had national security dimensions given the contemporaneous Investment Canada Act order (November 2024) to wind up TikTok Technology Canada Inc.

TikTok was directed to implement three new “demonstrably effective” age assurance mechanisms within six months, cease allowing advertisers to target users under 18, provide a youth-specific plain-language privacy summary, publish a privacy video for teen users, implement a “Privacy Settings Check-up” for all Canadian users, and submit monthly compliance updates. TikTok disagreed with the findings but committed to implementing all recommendations. The matter is conditionally resolved pending fulfilment.

A proposed privacy class action was commenced in the Supreme Court of British Columbia in October 2025 by Siskinds LLP against ByteDance and TikTok entities.

Harms

TikTok collected personal information of Canadian children — approximately 500,000 underage accounts removed per year — for ML-based content recommendation, facial analytics, biometric profiling, and targeted advertising, with no legitimate purpose under PIPEDA. The OPC found that because the purpose itself was inappropriate, consent could not render the collection lawful.

Important · Population

TikTok's age assurance mechanisms were inadequate: a voluntary age gate easily circumvented by entering a false birthdate, minimal keyword scanning that only caught text-posting users, and human moderation via user reports. Since 73.5% of users never post videos and 59.2% never comment, the vast majority of underage users escaped detection entirely.

Important · Population

TikTok's advertising targeting options included hashtags like '#transgendergirl' and '#transgendersoftiktok,' enabling advertisers to target users based on transgender status. TikTok was unable to explain why these hashtags had been available and confirmed they should not have been. Health data, political opinions, gender identity, and sexual orientation were inferred from user content through ML profiling.

Important · Group

TikTok failed to obtain express consent for sensitive information processing, provide meaningful privacy disclosures, or make key privacy communications available in French — violating PIPEDA, Quebec's Private Sector Act, and provincial privacy statutes in BC and Alberta.

Moderate · Population

Affected populations

  • Canadian children and youth using TikTok
  • Parents and guardians of underage TikTok users
  • Transgender and LGBTQ+ TikTok users targeted by advertising profiling
  • All 14 million Canadian TikTok users whose consent was found inadequate

Entities involved

TikTok Pte. Ltd. / ByteDance
Deployer · Developer

Operated the TikTok platform in Canada with 14 million monthly active users; collected children's personal data for machine-learning profiling and targeted advertising without a legitimate purpose; relied on inadequate age assurance mechanisms; contested the findings but committed to implementing all recommendations

Office of the Privacy Commissioner of Canada

Led the joint investigation announced in February 2023; published the findings on September 23, 2025 as PIPEDA Findings #2025-003; concluded the complaints about TikTok's collection of children's data were well-founded; conditionally resolved the matter subject to compliance commitments

Commission d'accès à l'information du Québec

Participated in the joint investigation as Quebec's privacy and access-to-information regulator

Office of the Information and Privacy Commissioner for British Columbia

Participated in the investigation; Commissioner Michael Harvey highlighted TikTok's "elaborate profiling" combining facial, voice, and location data to infer users' spending power

Office of the Information and Privacy Commissioner of Alberta

Participated in the joint investigation as Alberta's information and privacy regulator

AI systems involved

TikTok Recommendation Algorithm & Profiling Systems

TikTok's ML-based recommendation algorithm, facial analytics (CNNs for age/gender estimation), audio analytics, and multiple age-estimation models used to profile users — including children — for content personalization and ad targeting. Despite having sophisticated age-detection capabilities, TikTok did not deploy them to prevent underage access.

Responses and outcomes

Office of the Privacy Commissioner of Canada

Published PIPEDA Findings #2025-003 jointly with the Quebec, BC, and Alberta commissioners; found the complaints concerning TikTok's collection of children's data well-founded; conditionally resolved the matter with compliance commitments including three new age assurance mechanisms, cessation of ad targeting of users under 18, and monthly compliance reporting

AI system context

TikTok deploys multiple interconnected ML systems: a content recommendation algorithm powering the "For You" feed using behavioural and inferred data; convolutional neural networks (CNNs) for facial feature extraction and age/gender estimation from video content; audio analytics systems; and multiple age-estimation models (video-level, account-level, advertising, and TikTok LIVE). These systems collectively profiled users — including children — by inferring interests, location, age range, gender, spending power, health data, political opinions, gender identity, and sexual orientation from content and behaviour. The investigation found TikTok possessed but did not deploy its age-detection capabilities to prevent underage platform access.
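To make concrete what "possessed but did not deploy" could mean in practice, here is a purely hypothetical sketch of fusing per-model age estimates (e.g. video-level facial analytics, account-level behaviour) into a single account-level underage flag. Every name, weight, and threshold below is an assumption for illustration and does not describe TikTok's actual pipeline:

```python
from dataclasses import dataclass

# Illustrative multi-signal age assurance: combine hypothetical age
# estimates from several models into one flag. Names, weights, and the
# age threshold are assumptions, not TikTok's real system.

@dataclass
class AgeSignal:
    estimate: float  # estimated age in years from one model
    weight: float    # relative confidence assigned to that model

def fuse_age_signals(signals: list[AgeSignal], minimum_age: float = 13.0) -> bool:
    """Return True if the confidence-weighted age estimate is below minimum_age."""
    total_weight = sum(s.weight for s in signals)
    fused_age = sum(s.estimate * s.weight for s in signals) / total_weight
    return fused_age < minimum_age

# Example: facial analytics suggests age 12, account behaviour suggests 14;
# the weighted estimate (12.8) falls below 13, so the account is flagged.
signals = [AgeSignal(estimate=12.0, weight=0.6), AgeSignal(estimate=14.0, weight=0.4)]
print(fuse_age_signals(signals))
```

The point of the sketch is that once such per-model estimates exist for ad targeting, routing them into an access-control decision is an engineering choice rather than a new capability.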

Preventive measures

  • Require platforms deploying ML-based profiling to implement effective age assurance mechanisms before collecting children's data, rather than relying on voluntary age gates
  • Mandate that platforms deploy existing age-detection AI capabilities for child protection when such capabilities have been developed for other purposes (e.g., ad targeting)
  • Prohibit algorithmic profiling and targeted advertising directed at children, consistent with the commissioners' finding that such purposes are inherently inappropriate
  • Require express, granular consent for ML-based inference of sensitive attributes (gender identity, sexual orientation, health status, political opinions) from user behaviour and content
  • Mandate privacy communications in both official languages for platforms operating in Canada

Related entries

Taxonomy

Domain
Media · Telecommunications
Harm type
Privacy and data · Surveillance overreach · Autonomy and manipulation · Discrimination and rights
AI involvement
Deployment failure · Oversight failure · Training data
Lifecycle phase
Data collection · Deployment · Monitoring

Sources

  1. PIPEDA Findings #2025-003: Joint investigation of TikTok · Regulatory · Office of the Privacy Commissioner of Canada (Sept. 23, 2025)
  2. News release: Privacy commissioners find TikTok collected children's personal information inappropriately · Regulatory · Office of the Privacy Commissioner of Canada (Sept. 23, 2025)
  3. Joint Investigation Report PIPA2025-IR-02 · Regulatory · Office of the Information and Privacy Commissioner of Alberta (Sept. 23, 2025)
  4. Privacy commissioners find TikTok collected sensitive data from Canadian children · Media · CBC News (Sept. 23, 2025)
  5. TikTok failed to keep kids off platform, Canadian privacy watchdogs find · Media · Global News (Sept. 23, 2025)
  6. TikTok Privacy Decision: A Major Compliance Warning · Other · Barry Sookman (Sept. 30, 2025)

Revision history

Version  Date           Modification
v1       March 9, 2026  Initial publication