Pilot phase: CAIM is under construction. Records are provisional, based on public sources, and have not yet been peer-reviewed. Feedback welcome.
Corroborated Severe

Scammers reportedly used AI-cloned voices of grandchildren, stealing $200K from eight St. John's seniors in three days.

Occurred: February 28 – March 2, 2023

Over a three-day period from February 28 to March 2, 2023, at least eight elderly residents of the St. John's, Newfoundland area were defrauded of a combined $200,000 in a grandparent scam operation (CBC News, 2023). The callers claimed to be the victims' grandchildren in legal trouble, urgently needing bail money, and were sufficiently convincing that victims believed they were speaking to their real family members. Media coverage and a Memorial University computer security researcher speculated that AI voice cloning tools may have been used to replicate the grandchildren's voices, though the Royal Newfoundland Constabulary's official statements described a "sophisticated" operation without specifically alleging AI technology (CBC News, 2023).

Police arrested 23-year-old Charles Gillen at St. John's airport as he attempted to flee with the collected money. He ultimately faced 27 criminal charges and pleaded guilty to 14 counts of fraud (CBC News, 2023). The case drew widespread attention as a possible early instance of AI voice cloning in Canadian fraud, though the use of AI was never forensically confirmed.

The Newfoundland incident is part of a broader pattern. CBC Marketplace's March 2025 investigation documented cases of suspected AI-enabled grandparent scams across Canada (CBC Marketplace, 2025). In 2025, members of a separate operation run out of Montreal were indicted in the United States for defrauding elderly Americans across 46 states, with losses totalling $21 million, though the indictment itself does not specifically allege AI voice cloning. In Saskatchewan, police reported multiple grandparent scam cases in late 2025 in which victims described hearing voices that sounded identical to their grandchildren's. Police explicitly stated they could not determine whether AI voice cloning had been used or whether the callers had simply researched their targets through social media. Commercially available AI voice cloning tools can work with as little as a few seconds of audio (obtainable from social media posts, voicemail greetings, or video content) to produce a synthetic replica (CBC Marketplace, 2025).

The Canadian Anti-Fraud Centre tracks emergency and grandparent scams as a distinct fraud category. While these scams cause significant individual losses, CAFC data shows they rank below investment fraud, spear phishing, and romance scams in total dollar losses nationally. Law enforcement and fraud experts have warned that AI voice cloning technology has the potential to make these schemes significantly more effective by eliminating the weakest link in the traditional approach: the unconvincing impersonation.

Materialized From

Harms

Scammers reportedly used AI-generated voice clones of victims' grandchildren to impersonate family members in distress, convincing at least eight elderly residents to hand over money under false pretences. Media coverage and a security researcher speculated that AI voice cloning was used; police described the operation as "sophisticated" but never confirmed AI involvement.

Fraud & Impersonation · Economic Harm · Psychological Harm · Significant · Group

At least eight seniors in St. John's lost a combined $200,000 over three days to the suspected voice cloning scam. A separate Montreal-based operation caused $21 million in losses to elderly victims across 46 U.S. states, though that operation's indictment does not specifically allege AI voice cloning.

Fraud & Impersonation · Economic Harm · Psychological Harm · Severe · Group

The scam exploited familial bonds: elderly victims believed they were hearing their actual grandchildren in legal distress, causing emotional trauma that persisted even after the fraud was discovered.

Fraud & Impersonation · Economic Harm · Psychological Harm · Moderate · Group

Evidence

2 reports

  1. Media — CBC News (Mar 6, 2023)

    Eight seniors in St. John's lost $200,000 in three days to grandparent scam ring; callers impersonated grandchildren using suspected voice cloning technology

  2. Media — CBC Marketplace (Mar 5, 2025)

    CBC Marketplace investigation confirmed current voice cloning tools can produce convincing replicas from short audio samples; demonstrated AI voice cloning capability with minimal source material

Record details

Responses & Outcomes

Royal Newfoundland Constabulary · institutional action · Active

Arrested Charles Gillen at St. John's airport with collected fraud proceeds; he subsequently faced 27 criminal charges and pleaded guilty to 14 counts of fraud

Editorial Assessment · assessed

AI voice cloning has the potential to transform the grandparent scam, one of Canada's most common fraud types targeting seniors, from a scheme relying on impersonation skill to one where the caller sounds exactly like the victim's actual family member (CBC Marketplace, 2025), potentially increasing effectiveness.

Entities Involved

AI Systems Involved

Fish Audio

AI voice cloning tools capable of producing convincing voice replicas from short audio samples, reportedly used in grandparent scam operations

Related Records

Taxonomy · assessed

Domain
Finance & Banking · Justice
Harm type
Fraud & Impersonation · Economic Harm · Psychological Harm
AI pathway
Use Beyond Intended Scope
Lifecycle phase
Deployment

AIID: Incident #973

Changelog

Version | Date | Change
v1 | Mar 8, 2026 | Initial publication
v2 | Mar 11, 2026 | Reframed AI voice cloning claims to reflect that police never confirmed AI was used; corrected charge count (27, not 30; pled guilty to 14); fixed CAFC ranking claim; removed fabricated policy recommendations; corrected Montreal indictment to 46 states; fixed CBC Marketplace date

Version 2