Status: Confirmed · Contested · Severity: Significant · Version 1

The most significant privacy enforcement action against an AI system in Canada. Four federal and provincial commissioners jointly found that TikTok's ML-based profiling of children had no legitimate purpose — meaning consent was legally irrelevant. The finding that TikTok possessed sophisticated age-detection AI but chose not to use it to protect children establishes a precedent for regulatory expectations around deploying safety capabilities that already exist. TikTok disagreed with the findings but committed to all remedies.

Occurred: 2020 (year precision) to September 23, 2025 · Reported: September 23, 2025

Narrative

A joint investigation by four federal and provincial privacy commissioners found that TikTok collected personal information of Canadian children for ML-based algorithmic profiling, facial analytics, and targeted advertising without any legitimate purpose — making consent legally irrelevant.

The investigation, announced in February 2023 and published as PIPEDA Findings #2025-003 on September 23, 2025, was the largest coordinated privacy enforcement action against an AI system in Canada. The Office of the Privacy Commissioner of Canada led the investigation jointly with Quebec’s Commission d’accès à l’information, BC’s Office of the Information and Privacy Commissioner, and Alberta’s Office of the Information and Privacy Commissioner.

TikTok operates multiple interconnected ML systems that profiled its 14 million Canadian monthly active users, including children. Its content recommendation algorithm powers the “For You” feed using behavioural and inferred data. Convolutional neural networks analyze facial features for age and gender estimation. Audio analytics extract additional signals. Multiple age-estimation models — video-level, account-level, advertising, and TikTok LIVE — classify users into demographic segments. These systems collectively inferred users’ interests, location, age range, gender, spending power, and — critically — sensitive attributes including health data, political opinions, gender identity, and sexual orientation from content and behaviour patterns.

A central finding was that TikTok’s age assurance mechanisms were critically inadequate. TikTok relied on three weak measures: a voluntary age gate (easily circumvented by entering a false birthdate), minimal automated keyword scanning that only caught users who posted text, and human moderation triggered by user reports. Since 73.5% of TikTok users never post videos and 59.2% never comment, the vast majority of underage users — passive consumers of algorithmically recommended content — escaped detection entirely. TikTok removed approximately 500,000 underage Canadian accounts per year, but commissioners concluded the actual number of underage users was “likely much higher.” In Quebec, 40% of children aged 6–17 and 17% of children aged 6–12 had TikTok accounts.
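The detection-gap arithmetic implied by those figures can be sketched in a few lines. This is a back-of-envelope illustration only: the report gives the two shares separately, and the independence assumption below is ours, not the commissioners'.

```python
# Rough estimate of how many users are invisible to text-based moderation,
# using the shares reported in the findings.
# Assumption (not in the report): posting and commenting are independent.

p_never_post = 0.735     # share of users who never post videos
p_never_comment = 0.592  # share of users who never comment

# Under independence, share of purely passive users (no text signals at all):
p_passive_indep = p_never_post * p_never_comment

# Inclusion-exclusion lower bound, which holds regardless of correlation:
p_passive_lower = max(0.0, p_never_post + p_never_comment - 1.0)

print(f"independence estimate: {p_passive_indep:.1%}")  # 43.5%
print(f"lower bound:           {p_passive_lower:.1%}")  # 32.7%
```

Either way, at least a third of users generate no text for keyword scanning to inspect, which is why the commissioners treated the age gate as the only barrier for passive consumers.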

The commissioners found that TikTok possessed sophisticated AI-based age-detection capabilities but did not deploy them to prevent underage access. BC Commissioner Michael Harvey noted the “elaborate profiling” involving facial and voice data combined with location data to “create inferences about spending power” — capabilities that demonstrated TikTok could identify children but chose not to use those tools for protection.

TikTok’s advertising targeting system exposed sensitive attributes. Hashtags like “#transgendergirl” and “#transgendersoftiktok” were available as ad targeting options, enabling advertisers to target users based on transgender status. TikTok was unable to explain why these hashtags had been available and later confirmed they “should not have been available.”

The commissioners also noted TikTok's disclosure that affiliate companies and employees in China could access personal information collected from Canadian users — a finding with national security implications given the contemporaneous Investment Canada Act order (November 2024) requiring the wind-up of TikTok Technology Canada Inc.

TikTok was directed to implement three new “demonstrably effective” age assurance mechanisms within six months, cease allowing advertisers to target users under 18, provide a youth-specific plain-language privacy summary, publish a privacy video for teen users, implement a “Privacy Settings Check-up” for all Canadian users, and submit monthly compliance updates. TikTok disagreed with the findings but committed to implementing all recommendations. The matter is conditionally resolved pending fulfilment.

A proposed privacy class action was commenced in the Supreme Court of British Columbia in October 2025 by Siskinds LLP against ByteDance and TikTok entities.

Harms

TikTok collected personal information of Canadian children — approximately 500,000 underage accounts removed per year — for ML-based content recommendation, facial analytics, biometric profiling, and targeted advertising, with no legitimate purpose under PIPEDA. The OPC found that because the purpose itself was inappropriate, consent could not render the collection lawful.

Significant · Population

TikTok's age assurance mechanisms were critically inadequate: a voluntary age gate easily circumvented by entering a false birthdate, minimal keyword scanning that only caught text-posting users, and human moderation via user reports. Since 73.5% of users never post videos and 59.2% never comment, the vast majority of underage users escaped detection entirely.

Significant · Population

TikTok's advertising targeting options included hashtags like '#transgendergirl' and '#transgendersoftiktok,' enabling advertisers to target users based on transgender status. TikTok was unable to explain why these hashtags had been available and confirmed they should not have been. Health data, political opinions, gender identity, and sexual orientation were inferred from user content through ML profiling.

Significant · Group

TikTok failed to obtain express consent for sensitive information processing, provide meaningful privacy disclosures, or make key privacy communications available in French — violating PIPEDA, Quebec's Private Sector Act, and provincial privacy statutes in BC and Alberta.

Moderate · Population

Affected Populations

  • Canadian children and youth using TikTok
  • parents and guardians of underage TikTok users
  • transgender and LGBTQ+ TikTok users targeted by advertising profiling
  • all 14 million Canadian TikTok users whose consent was found inadequate

Entities Involved

TikTok Pte. Ltd. / ByteDance
Deployer · Developer

Operated the TikTok platform in Canada with 14 million monthly active users; collected children's personal information for ML-based profiling and targeted advertising without legitimate purpose; deployed inadequate age assurance mechanisms; disagreed with findings but committed to implementing all recommendations

Office of the Privacy Commissioner of Canada

Led the joint investigation announced February 2023; findings published September 23, 2025 as PIPEDA Findings #2025-003; found TikTok's collection of children's data well-founded; conditionally resolved with compliance commitments

Commission d'accès à l'information du Québec

Co-investigated as Quebec's access to information and privacy commission

Office of the Information and Privacy Commissioner for British Columbia

Co-investigated; Commissioner Michael Harvey highlighted TikTok's 'elaborate profiling' involving facial and voice data combined with location to infer spending power

Office of the Information and Privacy Commissioner of Alberta

Co-investigated as Alberta's information and privacy commissioner; published the findings provincially as Joint Investigation Report PIPA2025-IR-02

AI Systems Involved

TikTok Recommendation Algorithm & Profiling Systems

TikTok's ML-based recommendation algorithm, facial analytics (CNNs for age/gender estimation), audio analytics, and multiple age-estimation models used to profile users — including children — for content personalization and ad targeting. Despite having sophisticated age-detection capabilities, TikTok did not deploy them to prevent underage access.

Responses & Outcomes

Office of the Privacy Commissioner of Canada

Published PIPEDA Findings #2025-003 jointly with Quebec, BC, and Alberta commissioners; found TikTok's collection of children's data well-founded; conditionally resolved with compliance commitments including three new age assurance mechanisms, cessation of youth ad targeting, and monthly compliance reporting

AI System Context

TikTok deploys multiple interconnected ML systems: a content recommendation algorithm powering the "For You" feed using behavioural and inferred data; convolutional neural networks (CNNs) for facial feature extraction and age/gender estimation from video content; audio analytics systems; and multiple age-estimation models (video-level, account-level, advertising, and TikTok LIVE). These systems collectively profiled users — including children — by inferring interests, location, age range, gender, spending power, health data, political opinions, gender identity, and sexual orientation from content and behaviour. The investigation found TikTok possessed but did not deploy its age-detection capabilities to prevent underage platform access.
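The commissioners' core criticism is that these estimation signals fed advertising but not enforcement. A minimal sketch of what "deploying existing capabilities for protection" could look like is below. Every name, threshold, and signal source here is hypothetical and illustrative — this is not TikTok's actual architecture or API.

```python
# Hypothetical sketch: routing an existing age-estimation score into an
# enforcement decision instead of using it only for ad segmentation.
# All names and thresholds are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class AgeSignal:
    source: str          # e.g. "video-level", "account-level", "advertising"
    estimated_age: float  # model's age estimate in years
    confidence: float     # model confidence in [0, 1]


def assurance_decision(signals: list[AgeSignal],
                       min_age: float = 13.0,
                       review_confidence: float = 0.6) -> str:
    """Combine model signals into allow / review / block.

    A confident under-age estimate blocks the account; a low-confidence
    one escalates to human review rather than silently allowing access.
    """
    for s in signals:
        if s.estimated_age < min_age:
            return "block" if s.confidence >= review_confidence else "review"
    return "allow"


decision = assurance_decision([
    AgeSignal("video-level", estimated_age=11.0, confidence=0.8),
])
print(decision)  # "block"
```

The design point mirrors the finding: once a platform runs age-estimation models for any purpose, wiring their outputs into a block/review path is an integration choice, not a new research problem.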

Preventive Measures

  • Require platforms deploying ML-based profiling to implement effective age assurance mechanisms before collecting children's data, rather than relying on voluntary age gates
  • Mandate that platforms deploy existing age-detection AI capabilities for child protection when such capabilities have been developed for other purposes (e.g., ad targeting)
  • Prohibit algorithmic profiling and targeted advertising directed at children, consistent with the commissioners' finding that such purposes are inherently inappropriate
  • Require express, granular consent for ML-based inference of sensitive attributes (gender identity, sexual orientation, health status, political opinions) from user behaviour and content
  • Mandate privacy communications in both official languages for platforms operating in Canada

Related Records

Taxonomy

Domain
Media & Entertainment · Telecommunications
Harm type
Privacy & Data Exposure · Surveillance Overreach · Autonomy & Manipulation · Discrimination & Rights
AI involvement
Deployment Failure · Oversight Breakdown · Training Data Issue
Lifecycle phase
Data Collection · Deployment · Monitoring

Sources

  1. PIPEDA Findings #2025-003: Joint investigation of TikTok Regulatory — Office of the Privacy Commissioner of Canada (Sep 23, 2025)
  2. News release: Privacy commissioners find TikTok collected children's personal information inappropriately Regulatory — Office of the Privacy Commissioner of Canada (Sep 23, 2025)
  3. Joint Investigation Report PIPA2025-IR-02 Regulatory — Office of the Information and Privacy Commissioner of Alberta (Sep 23, 2025)
  4. Privacy commissioners find TikTok collected sensitive data from Canadian children Media — CBC News (Sep 23, 2025)
  5. TikTok failed to keep kids off platform, Canadian privacy watchdogs find Media — Global News (Sep 23, 2025)
  6. TikTok Privacy Decision: A Major Compliance Warning Other — Barry Sookman (Sep 30, 2025)

Changelog

Version  Date         Change
v1       Mar 9, 2026  Initial publication