Calgary teen accused of creating AI-generated child sexual exploitation material from photos of classmates
AI-generated deepfake nudes of classmates led to the first criminal charges in Canada against a minor for AI-produced child sexual exploitation material.
In October 2025, Alberta Law Enforcement Response Teams' Internet Child Exploitation (ICE) unit received a tip about child sexual abuse materials being uploaded to a social media platform (Alberta Law Enforcement Response Teams, 2025; CBC News, 2025). The investigation revealed that a 17-year-old had used AI tools to transform authentic photos of girls from multiple Calgary-area high schools into sexualized images and distributed the material online (Alberta Law Enforcement Response Teams, 2025; CBC News, 2025).
On November 13, 2025, ICE officers, assisted by Calgary Police Service, executed a search warrant and seized two cellphones, a tablet, and a laptop (Alberta Law Enforcement Response Teams, 2025). On December 3, 2025, ALERT announced charges against the teen — who cannot be identified under the Youth Criminal Justice Act — for making, possessing, and distributing child sexual abuse and exploitation material (Criminal Code s. 163.1) and criminal harassment (s. 264) (Alberta Law Enforcement Response Teams, 2025; CBC News, 2025; Global News, 2025). Staff Sergeant Mark Auger of ALERT ICE stated: "Our biggest takeaway from today is we need people to understand that this is not a joke. It's not a prank. This is the most extreme form of bullying and a criminal offence" (Alberta Law Enforcement Response Teams, 2025).
The case is the first Canadian criminal prosecution of a minor for AI-generated child sexual abuse material, and the first school-targeting deepfake incident in Canada to result in criminal charges (Global News, 2025; Calgary Journal, 2025). Two prior incidents — at a Winnipeg school in December 2023 (CBC News, 2023) and a London, Ontario school in April 2024 (CBC News, 2024) — involved students creating AI-generated deepfake nudes of classmates, but neither resulted in charges. In the Winnipeg case, police ultimately laid no charges, citing multiple factors including evidence issues, victims' wishes, and gaps in Manitoba's intimate image laws, which did not cover altered images at the time (CBC News, 2024). Manitoba subsequently introduced Bill 24 in March 2024 to expand its intimate image protections to cover AI-altered images. The London case similarly produced no charges or disclosed disciplinary consequences (CBC News, 2024).
The legal basis for prosecution rests on Criminal Code section 163.1, which defines child sexual abuse material broadly as "a photographic, film, video or other visual representation, whether or not it was made by electronic or mechanical means" depicting a person "who is or is depicted as being under the age of eighteen years." This language — particularly "other visual representation" and "whether or not it was made by electronic or mechanical means" — captures AI-generated content. In a Quebec precedent, R v Larouche (2023) — reported by the Canadian Centre for Child Protection as the first Canadian conviction for creating deepfake child sexual abuse material using face-swapping AI — the accused received a total sentence of eight years, including over three years for the deepfake production charges.
The specific AI tools used and the social media platform where the material was distributed have not been publicly identified. ALERT confirmed that all known victims were provided support services, and the accused was released on conditions including no contact with persons under 16 and restricted internet access (Alberta Law Enforcement Response Teams, 2025; CP24, 2025).
Materialized from
Harms
A 17-year-old used AI tools to generate sexualized images from authentic photos of girls attending several Calgary-area high schools, then distributed the resulting AI-generated child sexual exploitation material on a social media platform.
Several underage girls were victimized through the non-consensual use of their likenesses for sexualization via AI image generation, and the resulting material was distributed online.
Evidence
8 reports
- Teen facing charges relating to AI-related child sexual abuse material (primary source): ALERT press release; AI tools used to create sexualized images of high school classmates
- CBC reporting: Calgary teen accused of using AI to sexualize photos of high school classmates; details of ALERT investigation
- Calgary-area teen accused of using AI to create child sex abuse material (primary source): Global News reporting on Calgary teen charged with using AI to create child sexual abuse material; legal context
- Prior Canadian school deepfake incident in Winnipeg resulted in no criminal charges
- Winnipeg police laid no charges, citing evidence issues, victims' wishes, and gaps in Manitoba intimate image laws
- Prior Canadian school deepfake incident in London, Ontario resulted in no criminal charges
- Calgary Journal reporting on teen charges for AI-generated CSAM; local community impact
- CP24 reporting on Calgary teen charges for AI-generated sexual images of classmates
Entry details
Responses and outcomes
Announced charges against a 17-year-old for making, possessing, and distributing child sexual abuse and exploitation material and criminal harassment; stated that all known victims were provided support services
Policy recommendations (assessed)
- Existing Criminal Code provisions (s. 163.1) are broad enough to cover AI-generated child sexual abuse material, as demonstrated by the charges in this case (Alberta Law Enforcement Response Teams, demonstrated through prosecution, December 3, 2025)
- Provincial intimate image laws should be updated to cover AI-altered images, addressing the gap identified when Manitoba's laws did not cover altered images at the time of the Winnipeg school incident (Suzie Dunn, legal expert, via CBC News reporting on Manitoba law gap, February 15, 2024)
Editorial assessment (assessed)
This is the first criminal prosecution in Canada of a minor for creating AI-generated child sexual abuse material, and the first school-targeting deepfake incident in Canada to result in criminal charges (Alberta Law Enforcement Response Teams, 2025; CBC News, 2025; Global News, 2025; Calgary Journal, 2025; CP24, 2025). Earlier incidents at schools in Winnipeg (2023) and London, Ontario (2024), in which AI was used to create deepfake nudes of students, resulted in no criminal charges (CBC News, 2023; CBC News, 2024), highlighting gaps in enforcement (CBC News, 2024). The Calgary case demonstrates that existing Criminal Code provisions (s. 163.1) are broad enough to cover AI-generated child sexual abuse material, setting an important precedent for future prosecutions (Global News, 2025).
Entities involved
Related entries
- AI-Generated Child Sexual Abuse Material in Canada (related)
Taxonomy (assessed)
Change history
| Version | Date | Change |
|---|---|---|
| v1 | March 8, 2026 | Initial publication |
| v2 | March 10, 2026 | Factual corrections: softened Winnipeg enforcement characterization (police cited multiple factors, not just law gap); corrected R v Larouche sentence (total 8 years, not just 3+ for deepfake charges) and framing (first conviction under existing law, not new legal establishment); fixed Winnipeg source date (Dec 15 not Dec 4, 2023); added Feb 2024 CBC follow-up source on no-charges outcome |
| v3 | March 11, 2026 | Verification upgraded from corroborated to confirmed: Alberta Law Enforcement Response Teams issued official press release about charges. |
| v4 | March 11, 2026 | Replaced fabricated policy recommendation attributions (ALERT made no policy recommendations; CBC is journalism); reattributed legislative gap observation to legal expert |