Calgary Teen Charged with Creating AI-Generated Child Sexual Abuse Material from Classmates' Photos
This is the first Canadian criminal prosecution of a minor for creating AI-generated child sexual abuse material, and the first school-targeting deepfake case in Canada to result in criminal charges. Prior incidents at schools in Winnipeg (2023) and London, Ontario (2024), in which AI was used to create deepfake nudes of students, resulted in no criminal charges, highlighting enforcement gaps. The Calgary case demonstrates that existing Criminal Code provisions (s. 163.1) are broad enough to cover AI-generated CSAM, setting a significant precedent for future prosecutions.
Narrative
In October 2025, Alberta Law Enforcement Response Teams’ Internet Child Exploitation (ICE) unit received a tip about child sexual abuse materials being uploaded to a social media platform. The investigation revealed that a 17-year-old had used AI tools to transform authentic photos of girls from multiple Calgary-area high schools into sexualized images and distributed the material online.
On November 13, 2025, ICE officers, assisted by Calgary Police Service, executed a search warrant and seized two cellphones, a tablet, and a laptop. On December 3, 2025, ALERT announced charges against the teen — who cannot be identified under the Youth Criminal Justice Act — for making, possessing, and distributing child sexual abuse and exploitation material (Criminal Code s. 163.1) and criminal harassment (s. 264). Staff Sergeant Mark Auger of ALERT ICE stated: “Our biggest takeaway from today is we need people to understand that this is not a joke. It’s not a prank. This is the most extreme form of bullying and a criminal offence.”
The case is the first Canadian criminal prosecution of a minor for AI-generated child sexual abuse material, and the first school-targeting deepfake incident in Canada to result in criminal charges. Two prior incidents, at a Winnipeg school in December 2023 and a London, Ontario school in April 2024, involved students creating AI-generated deepfake nudes of classmates, but neither resulted in charges. In the Winnipeg case, police concluded they could not proceed, highlighting a gap in Manitoba's intimate-image laws, which at the time did not cover altered images. The London case likewise produced no charges and no disclosed disciplinary consequences.
The legal basis for prosecution rests on Criminal Code section 163.1, which defines child sexual abuse material broadly as “a photographic, film, video or other visual representation, whether or not it was made by electronic or mechanical means” depicting a person “who is or is depicted as being under the age of eighteen years.” This language — particularly “other visual representation” and “whether or not made by electronic or mechanical means” — captures AI-generated content. A Quebec precedent, R v Larouche (2023), established that creating deepfake child sexual abuse material using face-swapping AI constitutes a criminal offence, resulting in a sentence of over three years’ imprisonment.
The specific AI tools used and the social media platform where the material was distributed have not been publicly identified. ALERT confirmed that all known victims were provided support services, and the accused was released on conditions including no contact with persons under 16 and restricted internet access.
Harms
A 17-year-old used AI tools to generate sexualized images from real photos of girls at multiple Calgary-area high schools, then distributed the AI-generated child sexual abuse material through a social media platform.
Multiple underage girls were victimized by having their likeness non-consensually sexualized through AI image generation and the resulting material distributed online.
Affected Populations
- female high school students at multiple Calgary-area schools
- families of victims
Entities Involved
ALERT's Internet Child Exploitation (ICE) unit received the initial tip, led the investigation, executed the search warrant with assistance from the Calgary Police Service, and laid charges
Responses & Outcomes
ALERT announced charges against a 17-year-old for making, possessing, and distributing child sexual abuse and exploitation material and for criminal harassment, and stated that all known victims were provided support services
AI System Context
AI image generation tools (specific tools not publicly disclosed) used to transform authentic photos of underage girls into sexualized images. The source photos were taken from the girls' social media accounts. The generated images were then uploaded and distributed via a social media platform (not publicly identified).
Preventive Measures
- Strengthen enforcement and prosecution of AI-generated child sexual abuse material under existing Criminal Code provisions, building on the precedent that s. 163.1 covers AI-generated material
- Enact legislation explicitly addressing AI-generated non-consensual intimate images of both minors and adults, closing the gap between child exploitation law and adult intimate image protections
- Require AI image generation tools to implement technical safeguards that prevent generation of sexualized content from photos of real people, particularly minors
- Mandate school digital literacy curricula covering the legal consequences and harms of creating and distributing AI-generated intimate images
- Establish reporting obligations for social media platforms when AI-generated child sexual abuse material is detected on their systems
Sources
- Teen facing charges relating to AI-related child sexual abuse material
- Calgary teen accused of using AI to sexualize photos of high school girls
- Calgary-area teen accused of using AI to create child sex abuse material
- Calgary teen facing charges after allegedly creating AI-generated sex photos of girls
- Calgary teen charged after allegedly creating AI-generated sexual content
- AI-generated fake nude photos of girls from Winnipeg school posted online
- No charges against Catholic high school students who made and shared deep-fake nudes
Changelog
| Version | Date | Change |
|---|---|---|
| v1 | Mar 8, 2026 | Initial publication |