Status: Confirmed · Severity: Moderate · Version: 1

A Canadian tribunal established that a company is liable for information provided by its AI chatbot, a ruling widely treated as a reference point for businesses deploying AI customer service tools in Canada.

Occurred: November 2022 · Reported: February 14, 2024

Narrative

In November 2022, Jake Moffatt used Air Canada’s website chatbot to ask about bereavement fare policies following the death of a family member. The chatbot told Moffatt he could book a regular-priced flight and request a retroactive bereavement discount within 90 days of the ticket issue date. Relying on this information, Moffatt booked a flight and later submitted a bereavement fare claim. Air Canada denied the claim, stating that its actual policy requires bereavement fares to be approved before travel, not retroactively.

When Moffatt challenged the denial, Air Canada argued that the chatbot was a “separate legal entity” responsible for its own statements, and that the airline should not be held liable for the chatbot’s inaccurate information. The British Columbia Civil Resolution Tribunal rejected this argument in its February 14, 2024 decision (Moffatt v. Air Canada), ruling that Air Canada is responsible for all information on its website, whether from a static page or a chatbot. The tribunal found Air Canada liable for negligent misrepresentation and awarded Moffatt approximately $650 in damages plus interest and tribunal fees.

The ruling established a significant legal precedent: a company cannot deploy an AI chatbot for customer service and then disclaim responsibility when the chatbot provides false information. Air Canada had a duty to ensure the accuracy of its chatbot’s responses, and its failure to do so constituted negligent misrepresentation. The decision has been widely cited by legal scholars and practitioners as a reference point for corporate liability for AI systems in Canadian law.

Harms

Air Canada's chatbot provided false information about bereavement fare policy, telling a passenger he could request a retroactive discount within 90 days when the actual policy required pre-travel approval.

Severity: Moderate · Scope: Individual

The passenger booked a full-price flight based on the chatbot's inaccurate information and was denied the bereavement fare discount, resulting in approximately $650 in excess costs.

Severity: Moderate · Scope: Individual

Affected Populations

  • airline passengers
  • consumers interacting with corporate AI chatbots

Entities Involved

Air Canada (deployer)

Deployed a customer service chatbot on its website that provided inaccurate bereavement fare policy information, then argued the chatbot was a “separate legal entity” to disclaim liability.

AI Systems Involved

Air Canada Customer Service Chatbot

Provided inaccurate information to a passenger about bereavement fare policy, stating fares could be applied retroactively within 90 days when the actual policy required pre-travel approval

Responses & Outcomes

Air Canada

Found liable by the BC Civil Resolution Tribunal for negligent misrepresentation; ordered to pay approximately $650 in damages plus interest and tribunal fees (Moffatt v. Air Canada)

AI System Context

Air Canada's customer service chatbot, deployed on the airline's website to answer passenger queries about policies, bookings, and services.

Preventive Measures

  • Require businesses deploying AI chatbots to implement accuracy verification systems for information the chatbot provides, particularly for policies with financial consequences
  • Establish regulatory guidance clarifying that companies bear full legal responsibility for statements made by their AI customer service tools
  • Mandate that AI chatbots clearly disclose their automated nature and provide easy escalation paths to human agents for consequential decisions

Related Records

Taxonomy

Domain
Transportation
Harm type
Misinformation, Economic Harm
AI involvement
Model Confabulation, Deployment Failure
Lifecycle phase
Deployment, Monitoring

Sources

  1. “Air Canada found liable for chatbot's errors in landmark ruling,” CBC News, Feb 15, 2024 (media)
  2. “Moffatt v Air Canada: Misrepresentation by AI Chatbot,” McCarthy Tétrault, Feb 16, 2024 (other)

AIID: Incident #639

Changelog

Version | Date | Change
v1 | Mar 8, 2026 | Initial publication