Air Canada Held Liable for Chatbot's Inaccurate Bereavement Fare Information
A tribunal ruled Air Canada liable after its chatbot provided inaccurate information about its bereavement fare policy, in an early British Columbia decision on corporate responsibility for chatbot-generated customer communications.
In November 2022, the complainant used Air Canada's website chatbot to ask about bereavement fare policies following the death of their grandmother (British Columbia Civil Resolution Tribunal, 2024). The chatbot indicated they could book a regular-priced flight and request a retroactive bereavement discount within 90 days of the ticket issue date (British Columbia Civil Resolution Tribunal, 2024). Relying on this information, the complainant booked a flight and later submitted a bereavement fare claim. Air Canada denied the claim, stating that its actual policy did not allow retroactive bereavement fare applications — the discount could not be applied after travel had already occurred (British Columbia Civil Resolution Tribunal, 2024).
When the complainant challenged the denial, Air Canada argued that it could not be held liable for information provided by its agents or representatives, including a chatbot (CBC News, 2024). The tribunal characterized this position as, in effect, suggesting the chatbot was "a separate legal entity that is responsible for its own actions" — an argument the tribunal member called "remarkable" (British Columbia Civil Resolution Tribunal, 2024; CBC News, 2024). The British Columbia Civil Resolution Tribunal rejected this argument in its February 14, 2024 decision (Moffatt v. Air Canada), ruling that Air Canada is responsible for all information on its website, whether from a static page or a chatbot (British Columbia Civil Resolution Tribunal, 2024). The tribunal found Air Canada liable for negligent misrepresentation and awarded the complainant $650.88 in damages, plus $36.14 in interest and $125 in tribunal fees, for a total of $812.02 (British Columbia Civil Resolution Tribunal, 2024; CBC News, 2024).
The ruling held that Air Canada could not deploy a chatbot for customer service and then disclaim responsibility when the chatbot provided false information (McCarthy Tétrault, 2024). Air Canada did not provide evidence about the nature of its chatbot technology to the tribunal, and legal commentators noted the decision did not establish whether the system was AI-powered or rules-based (McCarthy Tétrault, 2024). The tribunal found Air Canada had a duty to ensure the accuracy of its chatbot's responses (British Columbia Civil Resolution Tribunal, 2024). The decision, while from a small claims-level tribunal whose rulings are not binding on other courts, received extensive commentary from legal scholars and practitioners — including analyses by firms such as McCarthy Tétrault and Dentons in 2024, and a UBC Law Review case comment in 2025 — as a notable early ruling on corporate liability for AI-generated customer communications (McCarthy Tétrault, 2024).
Materialized From
Harms
Air Canada's chatbot provided inaccurate information about bereavement fare policy, telling a passenger they could request a retroactive discount within 90 days when the actual policy required pre-travel approval. The passenger booked a full-price flight on this basis and was denied the bereavement fare. The tribunal awarded $650.88 in damages, plus $36.14 in interest and $125 in tribunal fees, totalling $812.02.
Evidence
3 reports
- Moffatt v. Air Canada, 2024 BCCRT 149 Primary source
The tribunal decision itself: establishes the facts, Air Canada's arguments, the negligent misrepresentation finding, and the damages award.
- How can I mislead you? Air Canada found liable for chatbot's bad advice on bereavement rates Primary source
Documents the CRT ruling, chatbot misrepresentation, the separate legal entity argument, and the ~$650 damages award.
- Legal analysis of corporate liability for AI chatbot misrepresentation, negligence framework, and implications for businesses deploying AI tools.
Record details
Responses & Outcomes
Found liable by the BC Civil Resolution Tribunal for negligent misrepresentation; ordered to pay approximately $650 in damages plus interest and tribunal fees (Moffatt v. Air Canada)
Policy Recommendations
Organizations deploying AI chatbots for customer-facing communications should treat chatbot outputs as legally attributable corporate representations and implement content accuracy governance accordingly.
British Columbia Civil Resolution Tribunal (Moffatt v. Air Canada, 2024 BCCRT 149) (Feb 14, 2024)
Businesses deploying AI chatbots should audit chatbot responses against current corporate policies, particularly for financially consequential topics such as fares, refund eligibility, and warranty terms.
McCarthy Tétrault (legal commentary by Barry Sookman) (Feb 16, 2024)
Editorial Assessment
This is among the first Canadian adjudicative decisions to reject a corporate attempt to disclaim liability for AI-generated customer communications (British Columbia Civil Resolution Tribunal, 2024; CBC News, 2024). The CRT held that deploying a chatbot does not create a liability shield — the corporation remains responsible for the accuracy of information its AI provides (British Columbia Civil Resolution Tribunal, 2024; McCarthy Tétrault, 2024). While the CRT is a small claims-level tribunal whose rulings do not bind other courts, the decision exposed a gap in how Canadian consumer protection frameworks address AI intermediaries and attracted extensive legal commentary on the negligent misrepresentation standard applied to automated systems (McCarthy Tétrault, 2024).
Entities Involved
AI Systems Involved
Provided inaccurate information to a passenger about bereavement fare policy, stating fares could be applied retroactively within 90 days when the actual policy required pre-travel approval
Related Records
Taxonomy
AIID: Incident #639
Changelog
| Version | Date | Change |
|---|---|---|
| v1 | Mar 8, 2026 | Initial publication |
| v2 | Mar 11, 2026 | Corrected 'separate legal entity' as tribunal's characterization not AC's words; reframed precedential weight (CRT is non-binding small claims tribunal); fixed pronouns to match decision; removed fabricated policy recommendation; corrected McCarthy Tétrault date; refined date precision |
| v3 | Mar 11, 2026 | Corrected source titles to match actual headlines; added claim_supported and relevance to sources; rewrote policy_recommendations as forward-looking prescriptions; strengthened why_this_matters analysis; recalibrated harm severity to low |
| v4 | Mar 11, 2026 | Added CRT decision as primary court source; merged overlapping harms; noted chatbot nature was not established at tribunal; changed ai_pathways to deployment_context only; fixed McCarthy source_type |
| v5 | Mar 11, 2026 | Fixed exact damages figures ($650.88 + $36.14 interest + $125 fees = $812.02); corrected UBC Law Review dating (2025, not 2024); completed truncated harm description |