Pilot phase: CAIM is under construction. Records are provisional, based on public sources, and have not yet been peer-reviewed. Feedback welcome.
Confirmed · Severe

AI-generated deepfake nudes of classmates led to the first Canadian criminal charges against a minor for AI CSAM.

Occurred: October 2025 · Reported: December 3, 2025

In October 2025, the Alberta Law Enforcement Response Teams (ALERT) Internet Child Exploitation (ICE) unit received a tip about child sexual abuse material being uploaded to a social media platform (Alberta Law Enforcement Response Teams, 2025; CBC News, 2025). The investigation revealed that a 17-year-old had used AI tools to transform authentic photos of girls from multiple Calgary-area high schools into sexualized images and had distributed the material online (Alberta Law Enforcement Response Teams, 2025; CBC News, 2025).

On November 13, 2025, ICE officers, assisted by Calgary Police Service, executed a search warrant and seized two cellphones, a tablet, and a laptop (Alberta Law Enforcement Response Teams, 2025). On December 3, 2025, ALERT announced charges against the teen — who cannot be identified under the Youth Criminal Justice Act — for making, possessing, and distributing child sexual abuse and exploitation material (Criminal Code s. 163.1) and criminal harassment (s. 264) (Alberta Law Enforcement Response Teams, 2025; CBC News, 2025; Global News, 2025). Staff Sergeant Mark Auger of ALERT ICE stated: "Our biggest takeaway from today is we need people to understand that this is not a joke. It's not a prank. This is the most extreme form of bullying and a criminal offence" (Alberta Law Enforcement Response Teams, 2025).

The case is the first Canadian criminal prosecution of a minor for AI-generated child sexual abuse material, and the first school-targeting deepfake incident in Canada to result in criminal charges (Global News, 2025; Calgary Journal, 2025). Two prior incidents — at a Winnipeg school in December 2023 (CBC News, 2023) and a London, Ontario school in April 2024 (CBC News, 2024) — involved students creating AI-generated deepfake nudes of classmates, but neither resulted in charges. In the Winnipeg case, police ultimately laid no charges, citing multiple factors including evidence issues, victims' wishes, and gaps in Manitoba's intimate image laws, which at the time did not cover altered images (CBC News, 2024). Manitoba subsequently introduced Bill 24 in March 2024 to expand its intimate image protections to AI-altered images. The London case likewise resulted in no charges and no disclosed disciplinary consequences (CBC News, 2024).

The legal basis for prosecution rests on Criminal Code section 163.1, which defines child sexual abuse material broadly as "a photographic, film, video or other visual representation, whether or not it was made by electronic or mechanical means" depicting a person "who is or is depicted as being under the age of eighteen years." This language — particularly "other visual representation" and "whether or not made by electronic or mechanical means" — captures AI-generated content. In a Quebec precedent, R v Larouche (2023) — reported by the Canadian Centre for Child Protection as the first Canadian conviction for creating deepfake child sexual abuse material using face-swapping AI — the accused received a sentence that included over three years for the deepfake production charges.

The specific AI tools used and the social media platform where the material was distributed have not been publicly identified. ALERT confirmed that all known victims were provided support services, and the accused was released on conditions including no contact with persons under 16 and restricted internet access (Alberta Law Enforcement Response Teams, 2025; CP24, 2025).

Materialized From

Harms

A 17-year-old used AI tools to generate sexualized images from real photos of girls at multiple Calgary-area high schools, then distributed the AI-generated child sexual abuse material through a social media platform.

Discrimination & Rights · Psychological Harm · Severe · Group

Multiple underage girls were victimized by having their likeness non-consensually sexualized through AI image generation and the resulting material distributed online.

Discrimination & Rights · Psychological Harm · Severe · Group

Evidence

8 reports

  1. Official — Alberta Law Enforcement Response Teams (Dec 3, 2025)

    ALERT press release: teen facing charges for AI-related child sexual abuse material; AI tools used to create sexualized images of high school classmates

  2. Media — CBC News (Dec 3, 2025)

    CBC reporting: Calgary teen accused of using AI to sexualize photos of high school classmates; details of ALERT investigation

  3. Media — Global News (Dec 3, 2025)

    Global News reporting on Calgary teen charged with using AI to create child sexual abuse material; legal context

  4. Media — CBC News (Dec 15, 2023)

    Prior Canadian school deepfake incident in Winnipeg resulted in no criminal charges

  5. Media — CBC News (Feb 15, 2024)

    Winnipeg police laid no charges, citing evidence issues, victims' wishes, and gaps in Manitoba intimate image laws

  6. Media — CBC News (Apr 25, 2024)

    Prior Canadian school deepfake incident in London, Ontario resulted in no criminal charges

  7. Media — Calgary Journal (Dec 3, 2025)

    Calgary Journal reporting on teen charges for AI-generated CSAM; local community impact

  8. Media — CP24 (Dec 3, 2025)

    CP24 reporting on Calgary teen charges for AI-generated sexual images of classmates

Record details

Responses & Outcomes

Alberta Law Enforcement Response Teams · institutional action · Active

Announced charges against a 17-year-old for making, possessing, and distributing child sexual abuse and exploitation material and criminal harassment; stated that all known victims were provided support services

Policy Recommendations · assessed

Existing Criminal Code provisions (s. 163.1) are broad enough to cover AI-generated child sexual abuse material, as demonstrated by the charges in this case

Alberta Law Enforcement Response Teams (demonstrated through prosecution) (Dec 3, 2025)

Provincial intimate image laws should be updated to cover AI-altered images, addressing the gap identified when Manitoba's laws did not cover altered images at the time of the Winnipeg school incident

Suzie Dunn (legal expert, via CBC News reporting on Manitoba law gap) (Feb 15, 2024)

Editorial Assessment · assessed

The first Canadian criminal prosecution of a minor for creating AI-generated child sexual abuse material, and the first school-targeting deepfake case in Canada to result in criminal charges (Alberta Law Enforcement Response Teams, 2025; CBC News, 2025; Global News, 2025). Prior incidents at schools in Winnipeg (2023) (CBC News, 2023) and London, Ontario (2024) (CBC News, 2024) — where AI was used to create deepfake nudes of students — resulted in no criminal charges. The Calgary case demonstrates that existing Criminal Code provisions (s. 163.1) are broad enough to cover AI-generated CSAM, setting a significant precedent for future prosecutions.

Entities Involved

Related Records

Taxonomy · assessed

Domain: Education · Law Enforcement
Harm type: Discrimination & Rights · Psychological Harm
AI pathway: Use Beyond Intended Scope
Lifecycle phase: Deployment

Changelog

v1 (Mar 8, 2026): Initial publication
v2 (Mar 10, 2026): Factual corrections: softened Winnipeg enforcement characterization (police cited multiple factors, not just law gap); corrected R v Larouche sentence (total 8 years, not just 3+ for deepfake charges) and framing (first conviction under existing law, not new legal establishment); fixed Winnipeg source date (Dec 15 not Dec 4, 2023); added Feb 2024 CBC follow-up source on no-charges outcome
v3 (Mar 11, 2026): Verification upgraded from corroborated to confirmed: Alberta Law Enforcement Response Teams issued official press release about charges.
v4 (Mar 11, 2026): Replaced fabricated policy recommendation attributions (ALERT made no policy recommendations; CBC is journalism); reattributed legislative gap observation to legal expert

Version 4