Corroborated · Severity: Severe · Version 1

The first Canadian criminal prosecution of a minor for creating AI-generated child sexual abuse material, and the first school-targeting deepfake case in Canada to result in criminal charges. Prior incidents at schools in Winnipeg (2023) and London, Ontario (2024) — where AI was used to create deepfake nudes of students — resulted in no criminal charges, highlighting enforcement gaps. The Calgary case demonstrates that existing Criminal Code provisions (s. 163.1) are broad enough to cover AI-generated CSAM, setting a significant precedent for future prosecutions.

Occurred: October 2025 (month precision) · Reported: December 3, 2025

Narrative

In October 2025, Alberta Law Enforcement Response Teams’ Internet Child Exploitation (ICE) unit received a tip about child sexual abuse materials being uploaded to a social media platform. The investigation revealed that a 17-year-old had used AI tools to transform authentic photos of girls from multiple Calgary-area high schools into sexualized images and distributed the material online.

On November 13, 2025, ICE officers, assisted by Calgary Police Service, executed a search warrant and seized two cellphones, a tablet, and a laptop. On December 3, 2025, ALERT announced charges against the teen — who cannot be identified under the Youth Criminal Justice Act — for making, possessing, and distributing child sexual abuse and exploitation material (Criminal Code s. 163.1) and criminal harassment (s. 264). Staff Sergeant Mark Auger of ALERT ICE stated: “Our biggest takeaway from today is we need people to understand that this is not a joke. It’s not a prank. This is the most extreme form of bullying and a criminal offence.”

The case is the first Canadian criminal prosecution of a minor for AI-generated child sexual abuse material, and the first school-targeting deepfake incident in Canada to result in criminal charges. Two prior incidents — at a Winnipeg school in December 2023 and a London, Ontario school in April 2024 — involved students creating AI-generated deepfake nudes of classmates, but neither resulted in charges. In the Winnipeg case, police concluded they could not proceed, highlighting a gap in Manitoba’s intimate image laws, which did not cover altered images. The London case similarly produced no charges, and no disciplinary consequences were disclosed.

The legal basis for prosecution rests on Criminal Code section 163.1, which defines child sexual abuse material broadly as “a photographic, film, video or other visual representation, whether or not it was made by electronic or mechanical means” depicting a person “who is or is depicted as being under the age of eighteen years.” This language — particularly “other visual representation” and “whether or not made by electronic or mechanical means” — captures AI-generated content. A Quebec precedent, R v Larouche (2023), established that creating deepfake child sexual abuse material using face-swapping AI constitutes a criminal offence, resulting in a sentence of over three years’ imprisonment.

The specific AI tools used and the social media platform where the material was distributed have not been publicly identified. ALERT confirmed that all known victims were provided support services, and the accused was released on conditions including no contact with persons under 16 and restricted internet access.

Harms

A 17-year-old used AI tools to generate sexualized images from real photos of girls at multiple Calgary-area high schools, then distributed the AI-generated child sexual abuse material through a social media platform.

Severe · Group

Multiple underage girls were victimized by having their likeness non-consensually sexualized through AI image generation and the resulting material distributed online.

Severe · Group

Affected populations

  • Female high school students at multiple Calgary-area schools
  • Families of victims

Entities involved

ALERT's Internet Child Exploitation (ICE) unit received the initial tip, led the investigation, executed the search warrant, and laid charges

Responses and outcomes

Alberta Law Enforcement Response Teams

Announced charges against a 17-year-old for making, possessing, and distributing child sexual abuse and exploitation material and criminal harassment; stated that all known victims were provided support services

AI system context

AI image generation tools (specific tools not publicly disclosed) used to transform authentic photos of underage girls into sexualized images. The source photos were taken from the girls' social media accounts. The generated images were then uploaded and distributed via a social media platform (not publicly identified).

Preventive measures

  • Strengthen enforcement and prosecution of AI-generated child sexual abuse material under existing Criminal Code provisions, building on the precedent that s. 163.1 covers AI-generated material
  • Enact legislation explicitly addressing AI-generated non-consensual intimate images of both minors and adults, closing the gap between child exploitation law and adult intimate image protections
  • Require AI image generation tools to implement technical safeguards that prevent generation of sexualized content from photos of real people, particularly minors
  • Mandate school digital literacy curricula covering the legal consequences and harms of creating and distributing AI-generated intimate images
  • Establish reporting obligations for social media platforms when AI-generated child sexual abuse material is detected on their systems

Materialized from

Related entries

Taxonomy

Domain
Education · Law enforcement
Harm type
Discrimination and rights · Psychological harm
AI involvement
Misuse
Lifecycle phase
Deployment

Sources

  1. Teen facing charges relating to AI-related child sexual abuse material — Official — Alberta Law Enforcement Response Teams (Dec. 3, 2025)
  2. Calgary teen accused of using AI to sexualize photos of high school girls — Media — CBC News (Dec. 3, 2025)
  3. Calgary-area teen accused of using AI to create child sex abuse material — Media — Global News (Dec. 3, 2025)
  4. Calgary teen facing charges after allegedly creating AI-generated sex photos of girls — Media — Calgary Journal (Dec. 3, 2025)
  5. Calgary teen charged after allegedly creating AI-generated sexual content — Media — CP24 (Dec. 3, 2025)
  6. AI-generated fake nude photos of girls from Winnipeg school posted online — Media — CBC News (Dec. 4, 2023)
  7. No charges against Catholic high school students who made and shared deep-fake nudes — Media — CBC News (Apr. 25, 2024)

Change history

Version · Date · Modification
v1 · March 8, 2026 · Initial publication