Pilot phase: CAIM is under construction. Records are provisional, based on public sources, and have not yet been peer-reviewed. Feedback welcome.
Confirmed · Contested · Significant

Four Canadian privacy commissioners found TikTok collected children's data for algorithmic profiling and targeted advertising.

Occurred: 2020 (year precision) to September 23, 2025 · Reported: September 23, 2025

A joint investigation by four federal and provincial privacy commissioners found that TikTok collected personal information of Canadian children for ML-based algorithmic profiling, facial analytics, and targeted advertising without any legitimate purpose — making consent legally irrelevant (Office of the Privacy Commissioner of Canada, 2025; Office of the Information and Privacy Commissioner of Alberta, 2025).

The investigation, announced in February 2023 and published as PIPEDA Findings #2025-003 on September 23, 2025, was a major coordinated privacy enforcement action against an AI system in Canada (Barry Sookman, 2025). The Office of the Privacy Commissioner of Canada led the investigation jointly with Quebec's Commission d'accès à l'information, BC's Office of the Information and Privacy Commissioner, and Alberta's Office of the Information and Privacy Commissioner (Office of the Privacy Commissioner of Canada, 2025).

TikTok operates multiple interconnected ML systems that profiled its 14 million Canadian monthly active users, including children (Office of the Privacy Commissioner of Canada, 2025). Its content recommendation algorithm powers the "For You" feed using behavioural and inferred data. Convolutional neural networks analyze facial features for age and gender estimation. Audio analytics extract additional signals. Multiple age-estimation models — video-level, account-level, advertising, and TikTok LIVE — classify users into demographic segments. These systems collectively inferred users' interests, location, age range, gender, spending power, and — critically — sensitive attributes including health data, political opinions, gender identity, and sexual orientation from content and behaviour patterns (Office of the Privacy Commissioner of Canada, 2025; Office of the Information and Privacy Commissioner of Alberta, 2025).

A central finding was that TikTok's age assurance mechanisms were inadequate (CBC News, 2025; Global News, 2025). TikTok relied on three weak measures: a voluntary age gate (easily circumvented by entering a false birthdate), minimal automated keyword scanning that only caught users who posted text, and human moderation triggered by user reports (Office of the Privacy Commissioner of Canada, 2025). Since 73.5% of TikTok users never post videos and 59.2% never comment, the vast majority of underage users — passive consumers of algorithmically recommended content — escaped detection entirely (Office of the Privacy Commissioner of Canada, 2025). TikTok removed approximately 500,000 underage Canadian accounts per year, but commissioners concluded the actual number of underage users was "likely much higher" (Office of the Privacy Commissioner of Canada, 2025). In Quebec, 40% of children aged 6–17 and 17% of children aged 6–12 had TikTok accounts (Office of the Privacy Commissioner of Canada, 2025).
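The structural weakness of a voluntary age gate can be sketched in a few lines. This is purely illustrative and not TikTok's actual implementation; the function name, minimum age, and logic are assumptions used to show why a gate that trusts a self-declared birthdate provides no real assurance:

```python
from datetime import date

MIN_AGE = 13  # illustrative platform minimum; actual thresholds vary by jurisdiction


def age_gate(claimed_birthdate: date, today: date) -> bool:
    """Voluntary age gate: accepts whatever birthdate the user types in."""
    # Compute age, subtracting one if this year's birthday hasn't occurred yet.
    age = today.year - claimed_birthdate.year - (
        (today.month, today.day) < (claimed_birthdate.month, claimed_birthdate.day)
    )
    return age >= MIN_AGE


today = date(2025, 9, 23)
# A 10-year-old entering their real birthdate is blocked...
print(age_gate(date(2015, 6, 1), today))   # False
# ...but the same child passes by typing any sufficiently early year.
print(age_gate(date(2000, 6, 1), today))   # True
```

Because the check has no independent signal about the user, circumvention is a single keystroke; and for the majority of users who never post or comment, neither keyword scanning nor report-driven moderation ever receives input to act on.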

The commissioners found that TikTok possessed sophisticated AI-based age-detection capabilities but did not deploy them to prevent underage access (Office of the Privacy Commissioner of Canada, 2025; Office of the Information and Privacy Commissioner of Alberta, 2025). BC Commissioner Michael Harvey noted the "elaborate profiling" that combined facial and voice data with location data to "create inferences about spending power" — capabilities demonstrating that TikTok could identify children, yet it did not deploy those tools for their protection (CBC News, 2025).

TikTok's advertising targeting system exposed sensitive attributes. Hashtags like "#transgendergirl" and "#transgendersoftiktok" were available as ad targeting options, enabling advertisers to target users based on transgender status (Office of the Privacy Commissioner of Canada, 2025). TikTok was unable to explain why these hashtags had been available and later confirmed they "should not have been available" (Office of the Privacy Commissioner of Canada, 2025).

The commissioners also found that TikTok disclosed that affiliate companies and employees in China could access personal information collected from Canadian users — a finding that commentators noted had national security dimensions, given the contemporaneous Investment Canada Act order (November 2024) requiring the wind-up of TikTok Technology Canada Inc. (Office of the Privacy Commissioner of Canada, 2025).

TikTok was directed to implement three new "demonstrably effective" age assurance mechanisms within six months, cease allowing advertisers to target users under 18, provide a youth-specific plain-language privacy summary, publish a privacy video for teen users, implement a "Privacy Settings Check-up" for all Canadian users, and submit monthly compliance updates (Office of the Privacy Commissioner of Canada, 2025; Office of the Information and Privacy Commissioner of Alberta, 2025). TikTok disagreed with the findings but committed to implementing all recommendations. The matter is conditionally resolved pending fulfilment.

A proposed privacy class action was commenced in the Supreme Court of British Columbia in October 2025 by Siskinds LLP against ByteDance and TikTok entities.

Harms

TikTok collected personal information of Canadian children — approximately 500,000 underage accounts removed per year — for ML-based content recommendation, facial analytics, biometric profiling, and targeted advertising, with no legitimate purpose under PIPEDA. The OPC found that because the purpose itself was inappropriate, consent could not render the collection lawful.

Privacy & Data Exposure · Disproportionate Surveillance · Autonomy Undermined · Discrimination & Rights · Significant · Population

TikTok's age assurance mechanisms were inadequate: a voluntary age gate easily circumvented by entering a false birthdate, minimal keyword scanning that only caught text-posting users, and human moderation via user reports. Since 73.5% of users never post videos and 59.2% never comment, the vast majority of underage users escaped detection entirely.

Privacy & Data Exposure · Disproportionate Surveillance · Autonomy Undermined · Discrimination & Rights · Significant · Population

TikTok's advertising targeting options included hashtags like '#transgendergirl' and '#transgendersoftiktok,' enabling advertisers to target users based on transgender status. TikTok was unable to explain why these hashtags had been available and confirmed they should not have been. Health data, political opinions, gender identity, and sexual orientation were inferred from user content through ML profiling.

Privacy & Data Exposure · Disproportionate Surveillance · Autonomy Undermined · Discrimination & Rights · Significant · Group

TikTok failed to obtain express consent for sensitive information processing, provide meaningful privacy disclosures, or make key privacy communications available in French — violating PIPEDA, Quebec's Private Sector Act, and provincial privacy statutes in BC and Alberta.

Privacy & Data Exposure · Disproportionate Surveillance · Autonomy Undermined · Discrimination & Rights · Moderate · Population

Evidence

6 reports

  1. Regulatory — Office of the Privacy Commissioner of Canada (Sep 23, 2025)

    Full findings of joint investigation — children's data collection without legitimate purpose, inadequate age assurance, consent failures, sensitive attribute profiling

  2. Regulatory — Office of the Privacy Commissioner of Canada (Sep 23, 2025)

    OPC news release: privacy commissioners find TikTok collected children's data for ML-based algorithmic profiling without legitimate purpose

  3. Regulatory — Office of the Information and Privacy Commissioner of Alberta (Sep 23, 2025)

    Alberta OIPC joint investigation report: detailed findings on TikTok's collection and use of children's personal information

  4. Media — CBC News (Sep 23, 2025)

    CBC reporting: privacy commissioners find TikTok collected sensitive data from Canadian children; media coverage of investigation findings

  5. Media — Global News (Sep 23, 2025)

    Global News reporting: TikTok failed to keep kids off platform; Canadian privacy watchdogs' findings on age verification failures

  6. Other — Barry Sookman (Sep 30, 2025)

    Characterized as 'the most significant privacy enforcement action in Canada in years'

Record details

Responses & Outcomes

Office of the Privacy Commissioner of Canada · Institutional action · Active

Published PIPEDA Findings #2025-003 jointly with Quebec, BC, and Alberta commissioners; found TikTok's collection of children's data well-founded; conditionally resolved with compliance commitments including three new age assurance mechanisms, cessation of youth ad targeting, and monthly compliance reporting

Editorial Assessment (assessed)

Privacy law commentator Barry Sookman described this as the most significant privacy enforcement action in Canada in years (Barry Sookman, 2025). Four federal and provincial commissioners jointly found that TikTok's ML-based profiling of children had no legitimate purpose — meaning consent was legally irrelevant (Office of the Privacy Commissioner of Canada, 2025; Office of the Information and Privacy Commissioner of Alberta, 2025). The finding that TikTok possessed sophisticated age-detection AI but chose not to use it to protect children establishes a precedent for regulatory expectations around deploying safety capabilities that already exist (Office of the Privacy Commissioner of Canada, 2025; Office of the Information and Privacy Commissioner of Alberta, 2025). TikTok disagreed with the findings but committed to all remedies (CBC News, 2025; Global News, 2025).

Entities Involved

AI Systems Involved

TikTok Recommendation Algorithm & Profiling Systems

TikTok's ML-based recommendation algorithm, facial analytics (CNNs for age/gender estimation), audio analytics, and multiple age-estimation models used to profile users — including children — for content personalization and ad targeting. Despite having sophisticated age-detection capabilities, TikTok did not deploy them to prevent underage access.

Related Records

Taxonomy (assessed)

Domain
Media & Entertainment · Telecommunications
Harm type
Privacy & Data Exposure · Disproportionate Surveillance · Autonomy Undermined · Discrimination & Rights
AI pathway
Deployment Context · Oversight Absent · Training Data Origin
Lifecycle phase
Data Collection · Deployment · Monitoring

Changelog

v1 (Mar 9, 2026): Initial publication
v2 (Mar 12, 2026): Neutrality review: softened "largest" editorial characterization to "major"; removed 5 policy recommendations that generalized TikTok-specific OPC compliance directions into general policy prescriptions, per CAIM neutrality policy.

Version 2