Suspected AI Voice Cloning in Grandparent Scam Ring Targeting Canadian Seniors
Fraudsters allegedly used AI-cloned voices of grandchildren, stealing $200,000 from eight St. John's seniors in three days.
Over a three-day period from February 28 to March 2, 2023, at least eight elderly residents of the St. John's, Newfoundland area were defrauded of a combined $200,000 in a grandparent scam operation (CBC News, 2023). The callers claimed to be the victims' grandchildren in legal trouble, urgently needing bail money, and were sufficiently convincing that victims believed they were speaking to their real family members. Media coverage and a Memorial University computer security researcher speculated that AI voice cloning tools may have been used to replicate the grandchildren's voices, though the Royal Newfoundland Constabulary's official statements described a "sophisticated" operation without specifically alleging AI technology (CBC News, 2023).
Police arrested 23-year-old Charles Gillen at St. John's airport as he attempted to flee with the collected money. He ultimately faced 27 criminal charges and pleaded guilty to 14 counts of fraud (CBC News, 2023). The case drew widespread attention as a possible early instance of AI voice cloning in Canadian fraud, though the use of AI was never forensically confirmed.
The Newfoundland incident is part of a broader pattern. CBC Marketplace's March 2025 investigation documented cases of suspected AI-enabled grandparent scams across Canada (CBC Marketplace, 2025). A separate operation run out of Montreal was indicted in the United States in 2025 for defrauding elderly Americans across 46 states, with losses totalling $21 million — though the indictment itself does not specifically allege AI voice cloning. In Saskatchewan, police reported multiple grandparent scam cases in late 2025 where victims reported hearing voices that sounded identical to their grandchildren. Police explicitly stated they could not determine whether AI voice cloning had been used or whether the callers had simply researched their targets through social media. Commercially available AI voice cloning tools can work with as little as a few seconds of audio — obtainable from social media posts, voicemail greetings, or video content — to produce a synthetic replica (CBC Marketplace, 2025).
The Canadian Anti-Fraud Centre tracks emergency and grandparent scams as a distinct fraud category. While these scams cause significant individual losses, CAFC data shows they rank below investment fraud, spear phishing, and romance scams in total dollar losses nationally. Law enforcement and fraud experts have warned that AI voice cloning technology has the potential to make these schemes significantly more effective by eliminating the weakest link in the traditional approach: the unconvincing impersonation.
Materialized from
Harms
Scammers allegedly used AI-generated voice clones mimicking the victims' grandchildren to pose as family members in distress, convincing at least eight seniors to hand over money under false pretenses. AI voice cloning was suspected by media and researchers, but police never confirmed its use and it has not been independently verified.
At least eight St. John's seniors lost a combined $200,000 in three days, and a larger Montreal-linked operation caused $21 million in losses in the United States. The indictment against the Montreal operation does not specifically allege AI voice cloning.
Elderly victims believed they were hearing their real grandchildren in legal trouble; the scheme exploited family bonds and caused emotional trauma even after the fraud was discovered.
Evidence
2 reports
- Grandparent scam: 8 people in St. John's lose $200K in three days to AI voice cloning (primary source)
Eight seniors in St. John's lost $200,000 in three days to grandparent scam ring; callers impersonated grandchildren using suspected voice cloning technology
- CBC Marketplace investigation confirmed current voice cloning tools can produce convincing replicas from short audio samples; demonstrated AI voice cloning capability with minimal source material
Entry details
Responses and outcomes
Arrested Charles Gillen at St. John's airport with collected fraud proceeds; he subsequently faced 27 criminal charges and pleaded guilty to 14 counts of fraud
Editorial assessment
AI voice cloning has transformed the grandparent scam, one of the most common fraud types targeting seniors in Canada, from a scheme that depended on an impersonator's skill into one where the caller sounds exactly like a real family member (CBC Marketplace, 2025), potentially increasing its effectiveness.
Entities involved
AI systems involved
AI voice cloning tools capable of producing convincing voice replicas from short audio samples, reportedly used in grandparent scam operations
Related entries
- Toronto Police and Competition Bureau Warn AI-Powered Scams 'Took Off Like a Rocket' Across Canada in Early 2026
- AI-Enabled Fraud and Impersonation
Taxonomy
AIID: Incident #973
Change history
| Version | Date | Modification |
|---|---|---|
| v1 | March 8, 2026 | Initial publication |
| v2 | March 11, 2026 | Reframed AI voice cloning claims to reflect that police never confirmed AI was used; corrected charge count (27, not 30; pled guilty to 14); fixed CAFC ranking claim; removed fabricated policy recommendations; corrected Montreal indictment to 46 states; fixed CBC Marketplace date |