Pilot phase: CAIM is under construction. Entries are provisional, based on public sources, and have not yet been peer-reviewed. Feedback welcome.
Confirmed · Important

Courts in B.C., Ontario, Quebec, and the Federal Court have all sanctioned AI-fabricated legal citations.

Occurred: February 1, 2024 to December 31, 2025 · Reported: February 1, 2024

AI-generated fabricated legal citations have been submitted to courts in all four major Canadian jurisdictions — British Columbia, Ontario, Quebec, and Federal Court — by both lawyers and self-represented litigants, establishing a cross-jurisdictional pattern.

Zhang v. Chen (2024 BCSC 285) — the first reported Canadian case. Lawyer Ke cited two non-existent cases generated by ChatGPT in a family law matter. The BC Supreme Court ordered the lawyer to personally bear costs and review all files for AI-generated citations (Supreme Court of British Columbia (via CanLII), 2024).

Specter Aviation v. Laprade (2025 QCCS 3521) — A 74-year-old self-represented litigant in Quebec submitted legal arguments containing eight instances of fabricated case law. The Quebec Superior Court imposed the first financial sanction in Quebec for AI-hallucinated legal content — a $5,000 fine under article 342 of the Code of Civil Procedure for "significant breach" of procedural obligations (Quebec Superior Court (via CanLII), 2025; Global News, 2025; Gowling WLG, 2025; McMillan LLP, 2025). The litigant apologized to the court.

Ko v. Li (2025 ONSC 2766/2965/6785) — A lawyer in an Ontario estates/family law proceeding submitted a factum citing non-existent or irrelevant cases likely generated by AI. Justice Myers ordered counsel to show cause for contempt. Counsel admitted error and apologized; contempt proceedings were discontinued. Justice Myers noted counsel had failed to include the certification required by Ontario Rule 4.06.1(2.1) — enacted in 2024 specifically to address AI-hallucinated citations — which requires lawyers to certify the authenticity of authorities cited in submissions (Ontario Superior Court of Justice (via CanLII), 2025).

Hussein v. Canada (2025 FC 1060) — Immigration counsel used Visto.ai, an AI legal research tool, and submitted two non-existent cases to the Federal Court. The court found that the failure to disclose AI use in preparing submissions "amounts to an attempt to mislead the Court" (Federal Court (via CanLII), 2025) and in a subsequent costs decision (2025 FC 1138), awarded special costs against counsel personally.

The cases highlight a tension between access to justice and judicial integrity. Self-represented litigants increasingly turn to AI tools for legal assistance because they cannot afford lawyers. These tools generate confident, plausible-sounding legal analysis that non-experts cannot easily verify. At the same time, purpose-built legal AI tools like Visto.ai — which users might reasonably expect to be more reliable than general-purpose chatbots — also produce fabricated citations (Federal Court (via CanLII), 2025), indicating that the confabulation problem is structural to current generative AI, not limited to consumer chatbots.

Ontario's Rule 4.06.1(2.1), enacted in 2024, is the first Canadian procedural rule specifically addressing AI-hallucinated citations (Ontario Superior Court of Justice (via CanLII), 2025). Other jurisdictions have addressed the issue through case-by-case sanctions but have not yet implemented comparable systematic safeguards.

Materialized from

Harms

AI-fabricated case-law citations have been submitted to courts in four Canadian jurisdictions — B.C., Ontario, Quebec, and the Federal Court — by both lawyers and self-represented litigants. Each case forced the opposing party and the court to research and rebut fictitious decisions, wasting judicial resources and compromising the integrity of proceedings.

Misinformation · Service disruption · Important · Sector

In Quebec (Specter Aviation v. Laprade, 2025 QCCS 3521), the court imposed a $5,000 fine for the submission of eight fabricated citations. In Ontario (Ko v. Li, 2025 ONSC 2766/2965), a lawyer faced contempt-of-court proceedings. In Federal Court (Hussein v. Canada, 2025 FC 1060), the court found that the failure to disclose AI use "amounts to an attempt to mislead the Court" and awarded costs against counsel.

Misinformation · Service disruption · Important · Sector

This trend threatens the reliability of AI-assisted legal research at a systemic level: courts cannot easily distinguish AI-hallucinated citations from legitimate ones without verification, and the volume of AI-generated legal content is growing.

Misinformation · Service disruption · Moderate · Sector

Evidence

7 reports

  1. Zhang v. Chen, 2024 BCSC 285 (Primary source)
    Judicial — Supreme Court of British Columbia (via CanLII) (Feb. 1, 2024)

    First Canadian case — lawyer cited two non-existent ChatGPT-generated cases; ordered to bear costs personally

  2. Ko v. Li, 2025 ONSC 2965 (Primary source)
    Judicial — Ontario Superior Court of Justice (via CanLII) (May 1, 2025)

    Ontario lawyer submitted AI-generated fictitious citations; contempt proceedings initiated; triggered Ontario Rule 4.06.1(2.1)

  3. Hussein v. Canada, 2025 FC 1060
    Judicial — Federal Court (via CanLII) (June 1, 2025)

    Immigration counsel used Visto.ai; two non-existent cases cited; court found failure to disclose AI use 'amounts to an attempt to mislead the Court'

  4. Specter Aviation v. Laprade, 2025 QCCS 3521
    Judicial — Quebec Superior Court (via CanLII) (Oct. 1, 2025)

    First Quebec financial sanction ($5,000) for AI-hallucinated legal citations

  5. Media — Global News (Oct. 1, 2025)

    Global News reporting: Quebec judge fines man $5,000 for submitting AI-generated legal citations; media coverage of the Specter Aviation decision

  6. Other — Gowling WLG (Oct. 15, 2025)

    Gowling WLG legal analysis: Quebec's first judicial sanction for AI-hallucinated citations; professional commentary on implications

  7. Other — McMillan LLP (Oct. 15, 2025)

    McMillan LLP analysis: Quebec Superior Court sanctions for generative AI use in court; legal practice implications

Entry details

Responses and outcomes

Quebec Superior Court · Institutional action · Active

Imposed a $5,000 fine under article 342 of the Code of Civil Procedure for "significant breach" of procedural obligations (Specter Aviation v. Laprade, 2025 QCCS 3521)

Policy recommendations · Assessed

Require lawyers to certify the authenticity of authorities cited in submissions, as implemented by Ontario Rule 4.06.1(2.1)

Ontario Superior Court of Justice (May 1, 2025)

Require courts to establish practice directions addressing the use of generative AI in legal proceedings, including disclosure obligations when AI tools are used to prepare submissions

Federal Court of Canada (Hussein v. Canada, 2025 FC 1060) (June 1, 2025)

Consider proportional sanctions that distinguish between deliberate fabrication and good-faith reliance on AI tools by unrepresented parties who may not understand the technology's limitations

Quebec Superior Court (Specter Aviation v. Laprade, 2025 QCCS 3521) (Oct. 1, 2025)

Editorial assessment · Assessed

AI-hallucinated legal citations have been sanctioned or addressed by courts in all four major Canadian jurisdictions — B.C. (Supreme Court of British Columbia (via CanLII), 2024), Ontario (Ontario Superior Court of Justice (via CanLII), 2025), Quebec (Quebec Superior Court (via CanLII), 2025), and the Federal Court (Federal Court (via CanLII), 2025) — establishing a systemic phenomenon rather than an isolated incident. Ontario introduced Rule 4.06.1(2.1), requiring certification of the authenticity of cited authorities (Ontario Superior Court of Justice (via CanLII), 2025). The phenomenon involves both consumer AI tools (ChatGPT) (Supreme Court of British Columbia (via CanLII), 2024) and specialized legal tools (Visto.ai) (Federal Court (via CanLII), 2025), and affects lawyers and self-represented litigants alike.

Entities involved

AI systems involved

ChatGPT · Visto.ai

Generative AI tools used to produce fabricated legal citations submitted to courts in BC, Ontario, Quebec, and Federal Court

Related entries

Taxonomy · Assessed

Domain
Justice
Harm type
Misinformation · Service disruption
AI contribution pathway
Use beyond intended scope · Confabulation
Lifecycle phase
Deployment

Modification history

Version · Date · Modification
v1 · March 8, 2026 · Initial publication (Quebec Specter Aviation case only)
v2 · March 9, 2026 · Broadened to cross-jurisdictional pattern — added Zhang v. Chen (BC), Ko v. Li (ON), Hussein v. Canada (FC); upgraded severity and reach; added formal CanLII citations

Version 2