Status: Corroborated | Severity: Severe | Version 1

AI voice cloning has transformed the grandparent scam — one of Canada's most common fraud types targeting seniors — from a scheme that depended on the caller's impersonation skill to one where the caller can sound exactly like the victim's actual family member, potentially making the scheme far more effective.

Occurred: February 28, 2023

Narrative

Over a three-day period from February 28 to March 2, 2023, at least eight elderly residents of the St. John’s, Newfoundland area were defrauded of a combined $200,000 in a grandparent scam operation that police believe used AI voice cloning technology. The scammers are suspected of using AI tools to generate convincing replicas of the victims’ actual grandchildren’s voices, then called the seniors claiming to be in legal trouble and urgently needing bail money. The voices were sufficiently convincing that victims believed they were speaking to their real family members.

Police arrested 23-year-old Charles Gillen at St. John’s airport as he attempted to flee with the collected money. He faced 30 criminal charges. The case was one of the earliest documented instances in Canada of AI voice cloning used systematically in fraud.

The Newfoundland incident is part of a broader pattern. CBC Marketplace’s subsequent investigation revealed the scale of AI-enabled grandparent scams across Canada. A separate operation run out of Montreal was indicted in the United States in 2025 for defrauding elderly Americans across more than 40 states, with losses totalling $21 million — though the indictment itself does not specifically allege AI voice cloning. In Saskatchewan, police reported multiple grandparent scam cases in late 2025 where victims reported hearing voices identical to their grandchildren, though police stated they could not confirm whether AI voice cloning was used. AI voice cloning technology requires as little as a few seconds of audio — easily obtained from social media posts, voicemail greetings, or video content — to produce a convincing synthetic replica.

The Canadian Anti-Fraud Centre has reported that emergency and grandparent scams consistently rank among the most financially damaging fraud categories in Canada. Law enforcement and fraud experts have warned that AI voice cloning technology has the potential to make these schemes significantly more effective by eliminating the weakest link in the traditional approach: the unconvincing impersonation.

Harms

Scammers reportedly used AI-generated voice clones of victims' grandchildren to impersonate family members in distress, convincing at least eight elderly residents to hand over money under false pretences. Police believe AI voice cloning was used but this has not been independently confirmed.

Significant Group

At least eight seniors in St. John's lost a combined $200,000 over three days to the suspected voice cloning scam. A separate Montreal-linked operation — not confirmed to have involved AI voice cloning — caused $21 million in losses across the United States.

Severe Group

The scam leveraged familial bonds: elderly victims believed they were hearing their actual grandchildren in legal distress, causing emotional trauma that persisted even after the fraud was discovered.

Affected Populations

  • elderly Canadians
  • families of victims

Responses & Outcomes

Royal Newfoundland Constabulary

Arrested Charles Gillen at St. John's airport with collected fraud proceeds; he subsequently faced 30 criminal charges

AI System Context

Commercially available AI voice cloning tools capable of generating convincing replicas of a person's voice from as little as a few seconds of audio, typically sourced from social media posts, voicemail greetings, or video content.

Preventive Measures

  • Fund public awareness campaigns specifically targeting seniors and their families about AI voice cloning capabilities and grandparent scam tactics
  • Require financial institutions to implement additional verification procedures for large cash withdrawals or wire transfers initiated by elderly customers, particularly when urgency is claimed
  • Mandate that AI voice cloning tool providers implement safeguards against non-consensual voice replication and cooperate with law enforcement investigations
  • Strengthen Criminal Code provisions to explicitly address the use of AI-generated synthetic media in fraud, with penalties reflecting the technological sophistication of the offence

Taxonomy

Domain
Finance & Banking · Justice
Harm type
Fraud & Impersonation · Economic Harm · Psychological Harm
AI involvement
Misuse
Lifecycle phase
Deployment

Sources

  1. "Grandparent scam: 8 people in St. John's lose $200K in three days to AI voice cloning" (Media) — CBC News (Mar 6, 2023)
  2. "CBC Marketplace: AI voice cloning scam investigation" (Media) — CBC Marketplace (Mar 5, 2025)

AIID: Incident #973

Changelog

Version | Date | Change
v1 | Mar 8, 2026 | Initial publication