Status: Confirmed · Severity: Significant · Version 2

Courts in all four major Canadian jurisdictions — BC, Ontario, Quebec, and the Federal Court — have now sanctioned or otherwise addressed AI-hallucinated legal citations, establishing a systemic pattern rather than an isolated incident. In response, Ontario introduced Rule 4.06.1(2.1), which requires lawyers to certify the authenticity of cited authorities. The pattern implicates both general-purpose AI (ChatGPT) and purpose-built legal AI tools (Visto.ai), and affects both lawyers and self-represented litigants.

Occurred: February 2024 to December 2025
Reported: February 2024

Narrative

AI-generated fabricated legal citations have been submitted to courts in all four major Canadian jurisdictions — British Columbia, Ontario, Quebec, and Federal Court — by both lawyers and self-represented litigants, establishing a cross-jurisdictional pattern.

Zhang v. Chen (2024 BCSC 285) — the first reported Canadian case. Lawyer Chong Ke cited two non-existent cases generated by ChatGPT in a family law matter. The BC Supreme Court ordered the lawyer to personally bear costs and review all files for AI-generated citations.

Specter Aviation v. Laprade (2025 QCCS 3521) — Jean Laprade, a 74-year-old self-represented litigant in Quebec, submitted legal arguments containing eight instances of fabricated case law. The Quebec Superior Court imposed the first financial sanction in Canada for AI-hallucinated legal content — a $5,000 fine under article 342 of the Code of Civil Procedure for “substantial breach” of procedural obligations. Laprade apologized but told the court he would not have been able to defend himself without AI assistance.

Ko v. Li (2025 ONSC 2766/2965/6785) — A lawyer in an Ontario estates/family law proceeding submitted a factum citing non-existent or irrelevant cases likely generated by AI. Justice Myers ordered counsel to show cause for contempt. Counsel admitted error and apologized; contempt proceedings were discontinued. The case prompted Ontario Rule 4.06.1(2.1), requiring lawyers to certify the authenticity of authorities cited in submissions.

Hussein v. Canada (2025 FC 1060) — Immigration counsel used Visto.ai, an AI legal research tool, and submitted two non-existent cases to the Federal Court. The court found that the failure to disclose AI use in preparing submissions “amounts to an attempt to mislead the Court” and awarded costs against counsel.

The cases highlight a tension between access to justice and judicial integrity. Self-represented litigants increasingly turn to AI tools for legal assistance because they cannot afford lawyers. These tools generate confident, plausible-sounding legal analysis that non-experts cannot easily verify. At the same time, purpose-built legal AI tools like Visto.ai — which users might reasonably expect to be more reliable than general-purpose chatbots — also produce fabricated citations, indicating that the confabulation problem is structural to current generative AI, not limited to consumer chatbots.

Ontario’s introduction of Rule 4.06.1(2.1) represents the first Canadian procedural rule specifically responding to AI-hallucinated citations. Other jurisdictions have addressed the issue through case-by-case sanctions but have not yet implemented systematic safeguards.

Harms

AI-generated fabricated case law citations have been submitted in courts across four Canadian jurisdictions — BC, Ontario, Quebec, and Federal Court — by both lawyers and self-represented litigants. Each case required the opposing party and court to research and refute fictitious cases, wasting judicial resources and undermining the integrity of proceedings.

Severity: Significant · Scope: Sector

In Quebec (Specter Aviation v. Laprade, 2025 QCCS 3521), the court imposed a $5,000 fine for submitting eight fabricated citations. In Ontario (Ko v. Li, 2025 ONSC 2766/2965), a lawyer faced contempt proceedings. In Federal Court (Hussein v. Canada, 2025 FC 1060), the court found that the failure to disclose AI use “amounts to an attempt to mislead the Court” and awarded costs against counsel.

Severity: Significant · Scope: Sector

The pattern threatens the reliability of AI-assisted legal research at a systemic level, as courts cannot easily distinguish AI-hallucinated citations from legitimate ones without verification, and the volume of AI-generated legal content is increasing.

Severity: Moderate · Scope: Sector

Affected Populations

  • parties to litigation
  • self-represented litigants
  • legal profession
  • judiciary

Responses & Outcomes

Quebec Superior Court

Imposed a $5,000 fine under article 342 of the Code of Civil Procedure for substantial breach of procedural obligations (Specter Aviation v. Laprade, 2025 QCCS 3521)

AI System Context

Generative AI tools including ChatGPT and Visto.ai (an AI legal research tool for immigration law) used by lawyers and self-represented litigants to draft legal submissions. These tools produce fabricated case law citations — non-existent judicial decisions with plausible-sounding reasoning — that are difficult to distinguish from legitimate citations without manual verification against case law databases.

Preventive Measures

  • Require courts to establish practice directions addressing the use of generative AI in legal proceedings, including disclosure obligations when AI tools are used to prepare submissions
  • Develop and promote AI literacy resources specifically for self-represented litigants, explaining the risk of hallucinated citations and the importance of verifying AI-generated legal research
  • Consider proportional sanctions that distinguish between deliberate fabrication and good-faith reliance on AI tools by unrepresented parties who may not understand the technology's limitations
  • Require legal AI tool providers to implement safeguards against citation hallucination, such as verification against actual case law databases
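The last measure above — checking cited authorities against an actual case-law database — can be sketched mechanically. The following is a minimal, hypothetical illustration: it extracts Canadian neutral citations (the "YEAR COURT NUMBER" form, e.g. "2025 QCCS 3521") from a submission and flags any that do not appear in a trusted set of known citations. The `known_citations` set stands in for a real lookup against CanLII or a commercial database; no such integration is described in the record itself.

```python
import re

# Canadian neutral citations follow "YEAR COURT NUMBER", e.g. "2025 QCCS 3521".
NEUTRAL_CITATION = re.compile(r"\b(?:19|20)\d{2}\s+[A-Z]{2,6}\s+\d{1,6}\b")

def extract_citations(text: str) -> list[str]:
    """Pull candidate neutral citations out of a submission's text."""
    return [m.group(0) for m in NEUTRAL_CITATION.finditer(text)]

def flag_unverified(text: str, known_citations: set[str]) -> list[str]:
    """Return citations not found in a trusted database snapshot.

    A flagged citation is merely unverified, not necessarily fabricated;
    a human must still check it against the primary source.
    """
    return [c for c in extract_citations(text) if c not in known_citations]

if __name__ == "__main__":
    factum = "Relying on 2024 BCSC 285 and 2099 FAKE 1234, counsel submits..."
    known = {"2024 BCSC 285", "2025 QCCS 3521", "2025 FC 1060"}
    print(flag_unverified(factum, known))  # ['2099 FAKE 1234']
```

Pattern-matching of this kind only catches citations to decisions that do not exist; it cannot detect the subtler failure mode noted in Ko v. Li, where a cited case exists but is irrelevant to the proposition it supposedly supports.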

Related Records

Taxonomy

Domain
Justice
Harm type
Misinformation · Operational Failure
AI involvement
Misuse · Model Confabulation
Lifecycle phase
Deployment

Sources

  1. Specter Aviation inc. c. Laprade, 2025 QCCS 3521 — Court: Quebec Superior Court (via CanLII), Oct 1, 2025
  2. Quebec judge fines man $5,000 for improper use of artificial intelligence in court — Media: Global News, Oct 1, 2025
  3. Specter Aviation v. Laprade — Quebec's first judicial sanction for AI — Other: Gowling WLG, Oct 15, 2025
  4. Use of Generative AI in Court: Quebec Superior Court Sanctions — Other: McMillan LLP, Oct 15, 2025
  5. Zhang v. Chen, 2024 BCSC 285 — Court: Supreme Court of British Columbia (via CanLII), Feb 1, 2024
  6. Ko v. Li, 2025 ONSC 2965 — Court: Ontario Superior Court of Justice (via CanLII), May 1, 2025
  7. Hussein v. Canada (Immigration, Refugees and Citizenship), 2025 FC 1060 — Court: Federal Court (via CanLII), Jun 1, 2025

Changelog

v1 (Mar 8, 2026): Initial publication (Quebec Specter Aviation case only)
v2 (Mar 9, 2026): Broadened to cross-jurisdictional pattern: added Zhang v. Chen (BC), Ko v. Li (ON), Hussein v. Canada (FC); upgraded severity and reach; added formal CanLII citations