Pilot phase: CAIM is under construction. Records are provisional, based on public sources, and have not yet been peer-reviewed. Feedback welcome.
Confirmed Minor

A British Columbia tribunal found Air Canada liable for negligent misrepresentation after its chatbot provided inaccurate information about bereavement fare policy, in an early and widely discussed ruling on corporate responsibility for chatbot-generated information.

Occurred: November 11, 2022
Reported: February 14, 2024

In November 2022, the complainant used Air Canada's website chatbot to ask about bereavement fare policies following the death of their grandmother (British Columbia Civil Resolution Tribunal, 2024). The chatbot indicated they could book a regular-priced flight and request a retroactive bereavement discount within 90 days of the ticket issue date (British Columbia Civil Resolution Tribunal, 2024). Relying on this information, the complainant booked a flight and later submitted a bereavement fare claim. Air Canada denied the claim, stating that its actual policy did not allow retroactive bereavement fare applications — the discount could not be applied after travel had already occurred (British Columbia Civil Resolution Tribunal, 2024).

When the complainant challenged the denial, Air Canada argued that it could not be held liable for information provided by its agents or representatives, including a chatbot (CBC News, 2024). The tribunal characterized this position as, in effect, suggesting the chatbot was "a separate legal entity that is responsible for its own actions" — an argument the tribunal member called "remarkable" (British Columbia Civil Resolution Tribunal, 2024; CBC News, 2024). The British Columbia Civil Resolution Tribunal rejected this argument in its February 14, 2024 decision (Moffatt v. Air Canada), ruling that Air Canada is responsible for all information on its website, whether from a static page or a chatbot (British Columbia Civil Resolution Tribunal, 2024). The tribunal found Air Canada liable for negligent misrepresentation and awarded the complainant $650.88 in damages, plus $36.14 in interest and $125 in tribunal fees, for a total of $812.02 (British Columbia Civil Resolution Tribunal, 2024; CBC News, 2024).

The ruling held that Air Canada could not deploy a chatbot for customer service and then disclaim responsibility when the chatbot provided false information (McCarthy Tétrault, 2024). Air Canada presented no evidence to the tribunal about the nature of its chatbot technology, and legal commentators noted the decision did not establish whether the system was AI-powered or rules-based (McCarthy Tétrault, 2024). The tribunal found Air Canada had a duty to take reasonable care to ensure the accuracy of its chatbot's responses (British Columbia Civil Resolution Tribunal, 2024). Although the decision comes from a small-claims-level tribunal whose rulings are not binding on other courts, it received extensive commentary from legal scholars and practitioners — including analyses by firms such as McCarthy Tétrault and Dentons in 2024, and a UBC Law Review case comment in 2025 — as a notable early ruling on corporate liability for AI-generated customer communications (McCarthy Tétrault, 2024).

Materialized From

Harms

Air Canada's chatbot provided inaccurate information about bereavement fare policy, telling a passenger he could request a retroactive discount within 90 days when the actual policy required pre-travel approval. The passenger booked a full-price flight on this basis and was denied the bereavement fare. The tribunal awarded $650.88 in damages, plus $36.14 in interest and $125 in tribunal fees, totalling $812.02.

Misinformation · Economic Harm · Minor · Individual

Evidence

3 reports

  1. Court — British Columbia Civil Resolution Tribunal (Feb 14, 2024)

    The tribunal decision itself: establishes the facts, Air Canada's arguments, the negligent misrepresentation finding, and the damages award.

  2. Media — CBC News (Feb 15, 2024)

    Documents the CRT ruling, chatbot misrepresentation, the separate legal entity argument, and the ~$650 damages award.

  3. Other — McCarthy Tétrault (Feb 16, 2024)

    Legal analysis of corporate liability for AI chatbot misrepresentation, negligence framework, and implications for businesses deploying AI tools.

Record details

Responses & Outcomes

Air Canada · court decision · Active

Found liable by the BC Civil Resolution Tribunal for negligent misrepresentation; ordered to pay $650.88 in damages plus $36.14 in interest and $125 in tribunal fees, for a total of $812.02 (Moffatt v. Air Canada)

Policy Recommendations · assessed

Organizations deploying AI chatbots for customer-facing communications should treat chatbot outputs as legally attributable corporate representations and implement content accuracy governance accordingly.

British Columbia Civil Resolution Tribunal (Moffatt v. Air Canada, 2024 BCCRT 149) (Feb 14, 2024)

Businesses deploying AI chatbots should audit chatbot responses against current corporate policies, particularly for financially consequential topics such as fares, refund eligibility, and warranty terms.

McCarthy Tétrault (legal commentary by Barry Sookman) (Feb 16, 2024)

Editorial Assessment · assessed

This is among the first Canadian adjudicative decisions to reject a corporate attempt to disclaim liability for AI-generated customer communications (British Columbia Civil Resolution Tribunal, 2024; CBC News, 2024). The CRT held that deploying a chatbot does not create a liability shield: the corporation remains responsible for the accuracy of information its AI provides (British Columbia Civil Resolution Tribunal, 2024; McCarthy Tétrault, 2024). While the CRT is a small-claims-level tribunal whose rulings do not bind other courts, the decision exposed a gap in how Canadian consumer protection frameworks address AI intermediaries and attracted extensive legal commentary on the negligent misrepresentation standard applied to automated systems (McCarthy Tétrault, 2024).

Entities Involved

Air Canada
deployer

AI Systems Involved

Air Canada Customer Service Chatbot

Provided inaccurate information to a passenger about bereavement fare policy, stating fares could be applied retroactively within 90 days when the actual policy required pre-travel approval

Related Records

Taxonomy · assessed

Domain
Transportation
Harm type
Misinformation · Economic Harm
AI pathway
Deployment Context
Lifecycle phase
Deployment · Monitoring

AIID: Incident #639

Changelog

Version · Date · Change
v1 · Mar 8, 2026 · Initial publication
v2 · Mar 11, 2026 · Corrected 'separate legal entity' as tribunal's characterization not AC's words; reframed precedential weight (CRT is non-binding small claims tribunal); fixed pronouns to match decision; removed fabricated policy recommendation; corrected McCarthy Tétrault date; refined date precision
v3 · Mar 11, 2026 · Corrected source titles to match actual headlines; added claim_supported and relevance to sources; rewrote policy_recommendations as forward-looking prescriptions; strengthened why_this_matters analysis; recalibrated harm severity to low
v4 · Mar 11, 2026 · Added CRT decision as primary court source; merged overlapping harms; noted chatbot nature was not established at tribunal; changed ai_pathways to deployment_context only; fixed McCarthy source_type
v5 · Mar 11, 2026 · Fixed exact damages figures ($650.88 + $36.14 interest + $125 fees = $812.02); corrected UBC Law Review dating (2025, not 2024); completed truncated harm description

Version 5