Pilot phase: CAIM is under construction. Records are provisional, based on public sources, and have not yet been peer-reviewed. Feedback welcome.
Confirmed · Significant

Courts in BC, Ontario, Quebec, and Federal Court have all sanctioned AI-fabricated legal citations.

Occurred: February 2024 to December 2025 (month precision)
Reported: February 2024

AI-generated fabricated legal citations have been submitted to courts in all four major Canadian jurisdictions — British Columbia, Ontario, Quebec, and Federal Court — by both lawyers and self-represented litigants, establishing a cross-jurisdictional pattern.

Zhang v. Chen (2024 BCSC 285) — the first reported Canadian case. Lawyer Ke cited two non-existent cases generated by ChatGPT in a family law matter. The BC Supreme Court ordered the lawyer to personally bear costs and review all files for AI-generated citations (Supreme Court of British Columbia (via CanLII), 2024).

Specter Aviation v. Laprade (2025 QCCS 3521) — A 74-year-old self-represented litigant in Quebec submitted legal arguments containing eight instances of fabricated case law. The Quebec Superior Court imposed the first financial sanction in Quebec for AI-hallucinated legal content — a $5,000 fine under article 342 of the Code of Civil Procedure for "significant breach" of procedural obligations (Quebec Superior Court (via CanLII), 2025; Global News, 2025; Gowling WLG, 2025; McMillan LLP, 2025). The litigant apologized to the court.

Ko v. Li (2025 ONSC 2766/2965/6785) — A lawyer in an Ontario estates/family law proceeding submitted a factum citing non-existent or irrelevant cases likely generated by AI. Justice Myers ordered counsel to show cause for contempt. Counsel admitted error and apologized; contempt proceedings were discontinued. Justice Myers noted counsel had failed to include the certification required by Ontario Rule 4.06.1(2.1) — enacted in 2024 specifically to address AI-hallucinated citations — which requires lawyers to certify the authenticity of authorities cited in submissions (Ontario Superior Court of Justice (via CanLII), 2025).

Hussein v. Canada (2025 FC 1060) — Immigration counsel used Visto.ai, an AI legal research tool, and submitted two non-existent cases to the Federal Court. The court found that the failure to disclose AI use in preparing submissions "amounts to an attempt to mislead the Court" (Federal Court (via CanLII), 2025) and in a subsequent costs decision (2025 FC 1138), awarded special costs against counsel personally.

The cases highlight a tension between access to justice and judicial integrity. Self-represented litigants increasingly turn to AI tools for legal assistance because they cannot afford lawyers. These tools generate confident, plausible-sounding legal analysis that non-experts cannot easily verify. At the same time, purpose-built legal AI tools like Visto.ai — which users might reasonably expect to be more reliable than general-purpose chatbots — also produce fabricated citations (Federal Court (via CanLII), 2025), indicating that the confabulation problem is structural to current generative AI, not limited to consumer chatbots.

Ontario's Rule 4.06.1(2.1), enacted in 2024, is the first Canadian procedural rule specifically addressing AI-hallucinated citations (Ontario Superior Court of Justice (via CanLII), 2025). Other jurisdictions have addressed the issue through case-by-case sanctions but have not yet implemented comparable systematic safeguards.

Materialized From

Harms

AI-generated fabricated case law citations have been submitted in courts across four Canadian jurisdictions — BC, Ontario, Quebec, and Federal Court — by both lawyers and self-represented litigants. Each case required the opposing party and court to research and refute fictitious cases, wasting judicial resources and undermining the integrity of proceedings.

Misinformation · Service Disruption · Significant · Sector

In Quebec (Specter Aviation v. Laprade, 2025 QCCS 3521), the court imposed a $5,000 fine for submitting eight fabricated citations. In Ontario (Ko v. Li, 2025 ONSC 2766/2965), a lawyer faced contempt proceedings. In Federal Court (Hussein v. Canada, 2025 FC 1060), the court found the failure to disclose AI use "amounts to an attempt to mislead the Court" and awarded costs against counsel.

Misinformation · Service Disruption · Significant · Sector

The pattern threatens the reliability of AI-assisted legal research at a systemic level, as courts cannot easily distinguish AI-hallucinated citations from legitimate ones without verification, and the volume of AI-generated legal content is increasing.

Misinformation · Service Disruption · Moderate · Sector

Evidence

7 reports

  1. Zhang v. Chen, 2024 BCSC 285
    Court — Supreme Court of British Columbia (via CanLII) (Feb 1, 2024)

    First reported Canadian case — lawyer cited two non-existent ChatGPT-generated cases; ordered to bear costs personally

  2. Ko v. Li, 2025 ONSC 2965 (primary source)
    Court — Ontario Superior Court of Justice (via CanLII) (May 1, 2025)

    Ontario lawyer submitted AI-generated fictitious citations; contempt proceedings initiated; triggered Ontario Rule 4.06.1(2.1)

  3. Hussein v. Canada, 2025 FC 1060
    Court — Federal Court (via CanLII) (Jun 1, 2025)

    Immigration counsel used Visto.ai; two non-existent cases cited; court found failure to disclose AI use "amounts to an attempt to mislead the Court"

  4. Specter Aviation v. Laprade, 2025 QCCS 3521
    Court — Quebec Superior Court (via CanLII) (Oct 1, 2025)

    First Quebec financial sanction ($5,000) for AI-hallucinated legal citations

  5. Media — Global News (Oct 1, 2025)

    Global News reporting: Quebec judge fines man $5,000 for submitting AI-generated legal citations; media coverage of the Specter Aviation decision

  6. Other — Gowling WLG (Oct 15, 2025)

    Gowling WLG legal analysis: Quebec's first judicial sanction for AI-hallucinated citations; professional commentary on implications

  7. Other — McMillan LLP (Oct 15, 2025)

    McMillan LLP analysis: Quebec Superior Court sanctions for generative AI use in court; legal practice implications

Record details

Responses & Outcomes

Quebec Superior Court · institutional action · Active

Imposed a $5,000 fine under article 342 of the Code of Civil Procedure for "significant breach" of procedural obligations (Specter Aviation v. Laprade, 2025 QCCS 3521)

Policy Recommendations (assessed)

Require lawyers to certify the authenticity of authorities cited in submissions, as implemented by Ontario Rule 4.06.1(2.1)

Ontario Superior Court of Justice (May 1, 2025)

Require courts to establish practice directions addressing the use of generative AI in legal proceedings, including disclosure obligations when AI tools are used to prepare submissions

Federal Court of Canada (Hussein v. Canada, 2025 FC 1060) (Jun 1, 2025)

Consider proportional sanctions that distinguish between deliberate fabrication and good-faith reliance on AI tools by unrepresented parties who may not understand the technology's limitations

Quebec Superior Court (Specter Aviation v. Laprade, 2025 QCCS 3521) (Oct 1, 2025)

Editorial Assessment (assessed)

AI-hallucinated legal citations have now been sanctioned or addressed by courts in all four major Canadian jurisdictions — BC, Ontario, Quebec, and Federal Court — establishing this as a systemic pattern rather than an isolated incident (Supreme Court of British Columbia (via CanLII), 2024; Ontario Superior Court of Justice (via CanLII), 2025; Quebec Superior Court (via CanLII), 2025; Federal Court (via CanLII), 2025). Ontario introduced Rule 4.06.1(2.1) requiring certification of authority authenticity in response (Ontario Superior Court of Justice (via CanLII), 2025). The pattern implicates both general-purpose AI (ChatGPT) and purpose-built legal AI tools (Visto.ai) (Supreme Court of British Columbia (via CanLII), 2024; Federal Court (via CanLII), 2025), and affects both lawyers and self-represented litigants (Quebec Superior Court (via CanLII), 2025; Global News, 2025).

Entities Involved

AI Systems Involved

ChatGPT; Visto.ai

Generative AI tools used to produce fabricated legal citations submitted to courts in BC, Ontario, Quebec, and Federal Court

Related Records

Taxonomy (assessed)

Domain
Justice
Harm type
Misinformation, Service Disruption
AI pathway
Use Beyond Intended Scope, Confabulation
Lifecycle phase
Deployment

Changelog

Version | Date | Change
v1 | Mar 8, 2026 | Initial publication (Quebec Specter Aviation case only)
v2 | Mar 9, 2026 | Broadened to cross-jurisdictional pattern — added Zhang v. Chen (BC), Ko v. Li (ON), Hussein v. Canada (FC); upgraded severity and reach; added formal CanLII citations

Version 2