Pilot phase: CAIM is under construction. Records are provisional, based on public sources, and have not yet been peer reviewed. Feedback welcome.
Corroborated · Severe

Fraudsters allegedly used AI-cloned voices of grandchildren to steal $200,000 from eight seniors in St. John's over three days.

Occurred: February 28, 2023

Over a three-day period from February 28 to March 2, 2023, at least eight elderly residents of the St. John's, Newfoundland area were defrauded of a combined $200,000 in a grandparent scam operation (CBC News, 2023). The callers claimed to be the victims' grandchildren in legal trouble, urgently needing bail money, and were sufficiently convincing that victims believed they were speaking to their real family members. Media coverage and a Memorial University computer security researcher speculated that AI voice cloning tools may have been used to replicate the grandchildren's voices, though the Royal Newfoundland Constabulary's official statements described a "sophisticated" operation without specifically alleging AI technology (CBC News, 2023).

Police arrested 23-year-old Charles Gillen at St. John's airport as he attempted to flee with the collected money. He ultimately faced 27 criminal charges and pleaded guilty to 14 counts of fraud (CBC News, 2023). The case drew widespread attention as a possible early instance of AI voice cloning in Canadian fraud, though the use of AI was never forensically confirmed.

The Newfoundland incident is part of a broader pattern. CBC Marketplace's March 2025 investigation documented cases of suspected AI-enabled grandparent scams across Canada (CBC Marketplace, 2025). A separate operation run out of Montreal was indicted in the United States in 2025 for defrauding elderly Americans across 46 states, with losses totalling $21 million — though the indictment itself does not specifically allege AI voice cloning. In Saskatchewan, police reported multiple grandparent scam cases in late 2025 where victims reported hearing voices that sounded identical to their grandchildren. Police explicitly stated they could not determine whether AI voice cloning had been used or whether the callers had simply researched their targets through social media. Commercially available AI voice cloning tools can work with as little as a few seconds of audio — obtainable from social media posts, voicemail greetings, or video content — to produce a synthetic replica (CBC Marketplace, 2025).

The Canadian Anti-Fraud Centre tracks emergency and grandparent scams as a distinct fraud category. While these scams cause significant individual losses, CAFC data shows they rank below investment fraud, spear phishing, and romance scams in total dollar losses nationally. Law enforcement and fraud experts have warned that AI voice cloning technology has the potential to make these schemes significantly more effective by eliminating the weakest link in the traditional approach: the unconvincing impersonation.

Materialized from

Harms

Scammers allegedly used AI-generated voice clones imitating the victims' grandchildren to pose as family members in distress, convincing at least eight seniors to hand over money under false pretenses. Media coverage and researchers speculated that AI voice cloning was used, but police never confirmed it and the claim has not been independently verified.

Fraud and impersonation · Economic harm · Psychological harm · Significant · Group

At least eight St. John's seniors lost a combined $200,000 in three days, and a broader Montreal-linked operation caused $21 million in losses in the United States. The indictment against the Montreal operation does not specifically allege AI voice cloning.

Fraud and impersonation · Economic harm · Psychological harm · Severe · Group

Elderly victims believed they were hearing their real grandchildren in legal trouble. The scheme exploited family bonds and caused emotional trauma that persisted even after the fraud was discovered.

Fraud and impersonation · Economic harm · Psychological harm · Moderate · Group

Evidence

2 reports

  1. Media — CBC News (March 6, 2023)

    Eight seniors in St. John's lost $200,000 in three days to grandparent scam ring; callers impersonated grandchildren using suspected voice cloning technology

  2. Media — CBC Marketplace (March 5, 2025)

    CBC Marketplace investigation confirmed current voice cloning tools can produce convincing replicas from short audio samples; demonstrated AI voice cloning capability with minimal source material

Record details

Responses and outcomes

Royal Newfoundland Constabulary · institutional action · Active

Arrested Charles Gillen at St. John's airport with collected fraud proceeds; he subsequently faced 27 criminal charges and pleaded guilty to 14 counts of fraud

Editorial assessment · assessed

AI voice cloning can transform the grandparent scam, one of the most common fraud types targeting seniors in Canada, from a scheme that depends on impersonation skill into one where the caller sounds exactly like a real family member (CBC Marketplace, 2025), potentially increasing its effectiveness.

Entities involved

AI systems involved

Fish Audio

AI voice cloning tools capable of producing convincing voice replicas from short audio samples, reportedly used in grandparent scam operations

Related records

Taxonomy · assessed

Domain
Finance and banking · Justice
Harm type
Fraud and impersonation · Economic harm · Psychological harm
AI contribution pathway
Use beyond intended scope
Lifecycle phase
Deployment

AIID: Incident #973

Change history

v1 (March 8, 2026): Initial publication
v2 (March 11, 2026): Reframed AI voice cloning claims to reflect that police never confirmed AI was used; corrected charge count (27, not 30; pleaded guilty to 14); fixed CAFC ranking claim; removed fabricated policy recommendations; corrected Montreal indictment to 46 states; fixed CBC Marketplace date

Version 2