Pilot phase: CAIM is under construction. Records are provisional, based on public sources, and have not yet been peer-reviewed. Feedback welcome.
Status: Escalating · Severity: Severe · Confidence: High

AI voice cloning and deepfake video have defrauded Canadians of millions. Convincing impersonation now requires only consumer-grade tools, and existing protections do not address these capabilities.

Identified: March 1, 2023 · Last assessed: March 8, 2026

Generative AI has made convincing impersonation — by voice, video, and text — accessible to fraudsters with no specialized technical expertise. This structural shift has produced significant financial harm to Canadians.

In March 2023, eight seniors in St. John's, Newfoundland lost $200,000 in three days to a grandparent scam ring that used suspected AI voice cloning to impersonate family members in distress. CBC Marketplace's 2025 investigation confirmed that current voice cloning tools can produce convincing replicas from short audio samples — a phone greeting or social media video is sufficient.

At a larger scale, AI-generated deepfake videos of celebrities and public figures — including Elon Musk, Dragon's Den personalities, and Prime Minister Mark Carney — were used in cryptocurrency investment scams, particularly during and after the 2025 federal election. Over 40 Facebook pages ran fraudulent Carney deepfake schemes. Individual losses from deepfake-driven crypto scams were substantial: an Ontario woman lost $1.7 million in retirement savings to a scheme using a deepfake of Elon Musk; a Prince Edward Island man lost $600,000 in life savings to a similar scam. The Canadian Anti-Fraud Centre reported $103 million lost to crypto scams in 2025, with AI deepfakes as a major vector.

The structural condition is the asymmetry between fraud capability and detection capability. AI tools have made convincing impersonation cheap and scalable. Law enforcement forensic capacity, financial institution identity verification, and consumer awareness have not adapted. The fraud techniques that previously required criminal organizations with resources and expertise are now accessible to anyone with a laptop and free AI tools.

Materialized Incidents

Harms

Eight seniors in St. John's lost $200,000 in three days to a grandparent scam ring using suspected AI voice cloning. The Canadian Anti-Fraud Centre reported $638 million in fraud losses in 2024, with AI-enabled impersonation as a growing category.

Fraud & Impersonation · Economic Harm · Severe · Population

Generative AI voice cloning tools can produce convincing replicas from short audio samples, making phone-based impersonation fraud accessible to actors with no specialized expertise. Canadian law enforcement lacks tools calibrated for AI-enabled fraud detection.

Fraud & Impersonation · Significant · Population

Evidence

4 reports

  1. Media — CBC News (Mar 6, 2023)

    Voice cloning used in grandparent scams targeting Canadian seniors

  2. Media — CBC Marketplace (Mar 5, 2025)

    Investigation confirming AI voice cloning in fraud targeting Canadians

  3. Media — Mitrade (Jul 17, 2025)

    CAFC reports $103 million in crypto scam losses involving AI deepfakes

  4. Academic — Canadian Digital Media Research Network (Jun 1, 2025)

    AI deepfakes used for fraudulent investment scams during 2025 Canadian election

Record details

Responses & Outcomes

Saskatchewan Financial and Consumer Affairs Authority (institutional action, Active)

Issued investor alerts about impersonation scams using AI-generated deepfakes of Prime Minister Carney

Canadian Anti-Fraud Centre (institutional action, Active)

Reported $103 million in crypto scam losses in 2025 involving AI deepfakes

Policy Recommendations (assessed)

  1. Public awareness campaigns targeting vulnerable populations about AI-enabled fraud (Canadian Anti-Fraud Centre, Jul 17, 2025)

  2. Investor alerts about AI-generated deepfake impersonation scams (Saskatchewan Financial and Consumer Affairs Authority, Jun 4, 2025)

  3. Require financial institutions to implement enhanced identity verification protocols for high-value transactions initiated through voice or video channels, including multi-factor authentication beyond voice recognition (CBC Marketplace investigation, Jan 1, 2025)

Editorial Assessment (assessed)

AI voice cloning and deepfake video have been used to defraud Canadians — $200,000 from eight Newfoundland seniors in three days through voice cloning, $103 million in AI-enabled crypto fraud in 2025. Convincing impersonation no longer requires expertise, only access to consumer-grade AI tools. Current law enforcement and financial protection systems were designed before these capabilities became widely accessible.

Entities Involved

Related Records

Taxonomy (assessed)

Domain
Finance & Banking · Retail & Commerce
Harm type
Fraud & Impersonation · Economic Harm
AI pathway
Use Beyond Intended Scope · Monitoring Absent
Lifecycle phase
Deployment

Changelog

Version | Date | Change
v1 | Mar 8, 2026 | Initial publication
