A Google AI Overview falsely accused Canadian musician Ashley MacIsaac of sex offenses, leading to the cancellation of a concert
An AI system deployed by the world's dominant search engine fabricated criminal accusations against a Canadian public figure, causing real-world harm — a cancelled concert and reputational damage — before the error was discovered. The incident illustrates how AI confabulation in search results can produce false accusations with consequences that precede correction. MacIsaac's only publicly known legal issue was a cannabis possession charge over two decades ago, for which he received a discharge.
Narrative
In December 2025, Juno Award-winning Cape Breton fiddler Ashley MacIsaac learned that Google’s AI Overview feature — an AI-generated summary displayed at the top of search results — had falsely identified him as a convicted sex offender. The AI summary asserted that MacIsaac had been convicted of sexual assault, internet luring, assaulting a woman, and attempting to assault a minor, and that he was listed on the national sex offender registry. None of this was true.
The fabrication was the result of entity conflation: Google’s AI system blended MacIsaac’s biography with criminal records belonging to a different person with the surname MacIsaac from Atlantic Canada — likely drawn from a CBC article about a St. John’s, Newfoundland sex offender convicted of internet luring and sexual assault. The only publicly available record of the musician having a legal issue involves cannabis possession in Saskatchewan in 2001, for which he received an absolute discharge.
Sipekne’katik First Nation, a Mi’kmaw community in central Nova Scotia, had booked MacIsaac for a concert around December 19. When community leadership researched MacIsaac ahead of the performance, they discovered the AI-generated summary and confronted him with the false information. The concert was cancelled. MacIsaac learned about the AI-generated defamation only through this cancellation. “Google screwed up, and it put me in a dangerous situation,” MacIsaac said. “I could have been at a border and put in jail.”
Google spokesperson Wendy Manton responded that AI Overviews are “dynamic and frequently changing” and that when issues arise, the company uses “those examples to improve our systems.” Google corrected the search results within one to two days of the story breaking on December 23. Sipekne’katik First Nation Executive Director Stuart Knockwood issued a public apology: “We deeply regret the harm this error caused to your reputation, your livelihood, and your sense of personal safety.” The First Nation extended an open invitation for a future performance.
As of January 2026, multiple lawyers had offered pro bono or contingency representation. MacIsaac expressed willingness to pursue legal action: “If a lawyer wants to take this on… I would stand up, because I’m not the first and I’m sure I won’t be the last.”
The incident is part of a broader pattern of AI confabulation in consequential Canadian contexts — alongside the CRA chatbot’s incorrect tax advice, Air Canada’s chatbot fabricating a bereavement fare policy, and AI-generated fake jurisprudence submitted in a Quebec court. What distinguishes the MacIsaac case is that the confabulation produced a false criminal accusation, generated without human involvement and presented as authoritative fact by a major search engine, causing harm before it could be corrected. As McMaster University professor Clifton van der Linden observed, “We’re seeing a transition in search engines from information navigators to narrators” — making AI-generated summaries appear authoritative rather than aggregative.
Harms
Google's AI Overview feature falsely identified Juno Award-winning Cape Breton fiddler Ashley MacIsaac as a convicted sex offender, asserting that he had been convicted of sexual assault, internet luring, assaulting a woman, and attempting to assault a minor, and that he was listed on the national sex offender registry — all fabricated by conflating his biography with that of a different person sharing his surname.
Sipekne'katik First Nation cancelled a planned concert by MacIsaac after confronting him with the AI-generated summary, causing reputational harm and economic loss. The false accusations circulated publicly before Google removed the summary.
Affected populations
- Canadian musicians and public figures vulnerable to AI-generated defamation
- Indigenous communities relying on AI-generated information for due diligence
Entities involved
Google: developed and deployed AI Overviews, the AI-generated search summary feature that falsely accused MacIsaac of sex offenses by conflating his identity with that of another person sharing his surname; subsequently apologized and removed the false summary
AI systems involved
Google AI Overviews: generated a false summary that blended Ashley MacIsaac's biography with criminal records belonging to another person sharing his surname, presenting fabricated criminal convictions as factual information at the top of Google Search results
Responses and outcomes
Spokesperson Wendy Manton acknowledged the error; Google corrected the AI Overview within one to two days of the story breaking publicly
AI system context
Google AI Overviews, an AI-generated search summary feature that uses large language models to synthesize information from multiple web sources and display it prominently at the top of Google Search results. The feature presents its output as authoritative factual information rather than as ranked search results, and users have no straightforward way to distinguish AI-generated confabulations from accurate summaries.
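The failure mode behind this incident — merging source snippets about different people who happen to share a surname — can be illustrated with a minimal, entirely hypothetical sketch. The data and function below are illustrative only, not Google's actual pipeline:

```python
from collections import defaultdict

# Hypothetical retrieved snippets about two different people
# who share a surname.
snippets = [
    {"surname": "MacIsaac", "given": "Ashley", "fact": "Juno Award-winning fiddler"},
    {"surname": "MacIsaac", "given": "J.", "fact": "convicted of internet luring"},
]

def group_by_surname(snippets):
    """Naive entity resolution: keying on surname alone merges distinct people."""
    entities = defaultdict(list)
    for s in snippets:
        entities[s["surname"]].append(s["fact"])
    return dict(entities)

# Both facts now attach to a single "MacIsaac" entity, so any summary
# synthesized from this grouping will mix the two biographies.
print(group_by_surname(snippets)["MacIsaac"])
```

Any summarizer downstream of such a grouping step inherits the conflation, regardless of how accurate its generation is.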
Preventive measures
- Require AI-generated search summaries to implement entity disambiguation safeguards that prevent conflation of different individuals who share a name, particularly for criminal record information
- Establish legal liability frameworks for AI-generated defamatory content in search results, extending existing defamation law to cover automated systems
- Mandate that AI-generated summaries involving criminal accusations include confidence indicators and verification status, or exclude such claims entirely
- Require platforms deploying AI-generated summaries to maintain rapid correction mechanisms with proactive notification to affected individuals
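The first measure above — refusing to attribute a record to a biography on the strength of a shared surname alone — could be sketched as follows. The field names, records, and threshold are illustrative assumptions, not a description of any deployed system:

```python
def same_person(bio: dict, record: dict, required_matches: int = 2) -> bool:
    """Attribute a record to a biography only if, beyond a matching surname,
    enough independent identifiers (given name, birth year, city) also agree."""
    keys = ("given_name", "birth_year", "city")
    matches = sum(
        1 for k in keys
        if bio.get(k) is not None and bio.get(k) == record.get(k)
    )
    return bio.get("surname") == record.get("surname") and matches >= required_matches

# Hypothetical records: a musician's biography and an unrelated criminal record.
musician = {"surname": "MacIsaac", "given_name": "Ashley",
            "birth_year": 1975, "city": "Creignish"}
offender_record = {"surname": "MacIsaac", "given_name": "J.", "city": "St. John's"}

print(same_person(musician, offender_record))  # surname alone is not enough
```

A real safeguard would need fuzzier matching and provenance tracking, but even this simple gate would have blocked the surname-only merge at issue here.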
Materialized from
Related records
- specter-aviation-ai-fake-jurisprudence
- air-canada-chatbot-misrepresentation
- AI Confabulation in Consequential Canadian Contexts
Taxonomy
Sources
- Ashley MacIsaac concert cancelled after AI wrongly accuses him of being sex offender
- Fiddler Ashley MacIsaac has show cancelled over Google AI-generated misinformation
- Prominent Canadian Musician Says Gig Was Cancelled After Google AI Overview Wrongly Branded Him Sex Pest
- Google Apologizes for AI Falsely Identifying Ashley MacIsaac as Sex Offender
- AI Incident Database: Incident 1316
Revision history
| Version | Date | Modification |
|---|---|---|
| v1 | March 8, 2026 | Initial publication |