Pilot phase: CAIM is under construction. Cards are provisional, based on public sources, and have not yet been peer reviewed. Feedback welcome.
Confirmed · Minor

A tribunal found Air Canada liable after its chatbot provided inaccurate information about its bereavement fare policy, a widely cited early ruling in British Columbia.

Occurred: November 11, 2022 · Reported: February 14, 2024

In November 2022, the complainant used Air Canada's website chatbot to ask about bereavement fare policies following the death of their grandmother (British Columbia Civil Resolution Tribunal, 2024). The chatbot indicated they could book a regular-priced flight and request a retroactive bereavement discount within 90 days of the ticket issue date (British Columbia Civil Resolution Tribunal, 2024). Relying on this information, the complainant booked a flight and later submitted a bereavement fare claim. Air Canada denied the claim, stating that its actual policy did not allow retroactive bereavement fare applications — the discount could not be applied after travel had already occurred (British Columbia Civil Resolution Tribunal, 2024).

When the complainant challenged the denial, Air Canada argued that it could not be held liable for information provided by its agents or representatives, including a chatbot (CBC News, 2024). The tribunal characterized this position as, in effect, suggesting the chatbot was "a separate legal entity that is responsible for its own actions" — an argument the tribunal member called "remarkable" (British Columbia Civil Resolution Tribunal, 2024; CBC News, 2024). The British Columbia Civil Resolution Tribunal rejected this argument in its February 14, 2024 decision (Moffatt v. Air Canada), ruling that Air Canada is responsible for all information on its website, whether from a static page or a chatbot (British Columbia Civil Resolution Tribunal, 2024). The tribunal found Air Canada liable for negligent misrepresentation and awarded the complainant $650.88 in damages, plus $36.14 in interest and $125 in tribunal fees, for a total of $812.02 (British Columbia Civil Resolution Tribunal, 2024; CBC News, 2024).

The ruling held that Air Canada could not deploy a chatbot for customer service and then disclaim responsibility when the chatbot provided false information (McCarthy Tétrault, 2024). Air Canada did not provide evidence about the nature of its chatbot technology to the tribunal, and legal commentators noted the decision did not establish whether the system was AI-powered or rules-based (McCarthy Tétrault, 2024). The tribunal found Air Canada had a duty to ensure the accuracy of its chatbot's responses (British Columbia Civil Resolution Tribunal, 2024). The decision, while from a small claims-level tribunal whose rulings are not binding on other courts, received extensive commentary from legal scholars and practitioners — including analyses by firms such as McCarthy Tétrault and Dentons in 2024, and a UBC Law Review case comment in 2025 — as a notable early ruling on corporate liability for AI-generated customer communications (McCarthy Tétrault, 2024).

Materialized from

Harms

Air Canada's chatbot provided inaccurate information about the bereavement fare policy, telling a passenger they could request a retroactive discount within 90 days, when the actual policy required pre-travel approval. Relying on this information, the passenger booked a full-fare flight and was denied the discount. The tribunal awarded $650.88 in damages, plus $36.14 in interest and $125 in tribunal fees, for a total of $812.02.

Misinformation · Economic harm · Minor · Individual

Evidence

3 reports

  1. Judicial — British Columbia Civil Resolution Tribunal (Feb. 14, 2024)

    The tribunal decision itself: establishes the facts, Air Canada's arguments, the negligent misrepresentation finding, and the damages award.

  2. Media — CBC News (Feb. 15, 2024)

    Documents the CRT ruling, chatbot misrepresentation, the separate legal entity argument, and the ~$650 damages award.

  3. Other — McCarthy Tétrault (Feb. 16, 2024)

    Legal analysis of corporate liability for AI chatbot misrepresentation, negligence framework, and implications for businesses deploying AI tools.

Card details

Responses and outcomes

Air Canada · court decision · Active

Found liable by the BC Civil Resolution Tribunal for negligent misrepresentation; ordered to pay approximately $650 in damages plus interest and tribunal fees (Moffatt v. Air Canada)

Policy recommendations · assessed

Organizations deploying AI chatbots for customer-facing communications should treat chatbot outputs as legally attributable corporate representations and implement content accuracy governance accordingly.

British Columbia Civil Resolution Tribunal (Moffatt v. Air Canada, 2024 BCCRT 149) (Feb. 14, 2024)

Businesses deploying AI chatbots should audit chatbot responses against current corporate policies, particularly for financially consequential topics such as fares, refund eligibility, and warranty terms.

McCarthy Tétrault (legal commentary by Barry Sookman) (Feb. 16, 2024)

Editorial assessment · assessed

This is one of the first Canadian tribunal decisions to reject a company's attempt to disclaim liability for AI-generated customer-facing communications (British Columbia Civil Resolution Tribunal, 2024; CBC News, 2024). The CRT ruled that deploying a chatbot does not create a liability shield: the company remains responsible for the accuracy of the information its AI provides (British Columbia Civil Resolution Tribunal, 2024; McCarthy Tétrault, 2024). Although the CRT is a small claims tribunal whose decisions do not bind other courts, the ruling highlighted a gap in how Canadian consumer protection frameworks treat AI intermediaries (CBC News, 2024) and prompted extensive legal commentary on the negligent misrepresentation standard as applied to automated systems (McCarthy Tétrault, 2024).

Entities involved

Air Canada
deployer

AI systems involved

Air Canada Customer Service Chatbot

Provided inaccurate information to a passenger about bereavement fare policy, stating fares could be applied retroactively within 90 days when the actual policy required pre-travel approval

Related cards

Taxonomy · assessed

Domain
Transportation
Harm type
Misinformation · Economic harm
AI contribution pathway
Deployment context
Lifecycle phase
Deployment · Monitoring

AIID: Incident #639

Change history

Version | Date | Change
v1 | March 8, 2026 | Initial publication
v2 | March 11, 2026 | Corrected 'separate legal entity' as the tribunal's characterization, not Air Canada's words; reframed precedential weight (the CRT is a non-binding small claims tribunal); fixed pronouns to match the decision; removed a fabricated policy recommendation; corrected the McCarthy Tétrault date; refined date precision
v3 | March 11, 2026 | Corrected source titles to match actual headlines; added claim_supported and relevance to sources; rewrote policy_recommendations as forward-looking prescriptions; strengthened why_this_matters analysis; recalibrated harm severity to low
v4 | March 11, 2026 | Added CRT decision as primary court source; merged overlapping harms; noted chatbot nature was not established at tribunal; changed ai_pathways to deployment_context only; fixed McCarthy source_type
v5 | March 11, 2026 | Fixed exact damages figures ($650.88 + $36.14 interest + $125 fees = $812.02); corrected UBC Law Review dating (2025, not 2024); completed truncated harm description

Version 5