Status: Confirmed · Severity: Significant · Version: 1

The federal tax authority spent $18 million on an AI chatbot that the Auditor General found gave incorrect answers to basic tax questions. The chatbot processed over 18 million queries, raising concerns about the accuracy of tax information provided to Canadians through the system.

Occurred: February 2020 · Reported: December 12, 2025

Narrative

The Canada Revenue Agency launched Charlie, an AI-powered chatbot, in February 2020 to answer taxpayer questions about tax filing, benefits, and CRA services. By the time of the Auditor General’s review in December 2025, the system had processed over 18 million questions at a cost of approximately $18 million.

The Auditor General’s December 2025 report found significant accuracy problems. When tested with common taxpayer questions, the chatbot answered only two out of six questions correctly. The incorrect responses included wrong information about tax obligations, filing requirements, and CRA procedures — precisely the types of questions taxpayers rely on the chatbot to answer accurately.

With 18 million queries processed, the error rate identified by the Auditor General raises concerns about the accuracy of tax information provided to Canadians through the system. Taxpayers who relied on Charlie’s responses may have received incorrect information about filing requirements, deadlines, or eligibility for benefits. The CRA chatbot carries the implicit authority of the federal tax agency, and users have no straightforward way to know when the chatbot’s answer is wrong.

In November 2025, following the Auditor General’s findings, the CRA upgraded Charlie to a generative AI version and reported pre-release testing showed approximately 90% accuracy, though the agency acknowledged it could not confirm real-world accuracy. The Secretary of State for CRA pushed back against the Auditor General’s characterization of the chatbot’s performance.

Harms

CRA's chatbot Charlie answered only two out of six common taxpayer questions correctly when tested by the Auditor General, providing wrong information about tax obligations, filing requirements, and CRA procedures through a system that had processed an estimated 18 million queries.

Severity: Significant · Scope: Population

CRA spent $18 million on an AI chatbot without adequate accuracy testing or ongoing quality monitoring, deploying it as an official government information source that carried the implicit authority of the federal tax agency.

Severity: Moderate · Scope: Population

Affected Populations

  • Canadian taxpayers
  • Tax professionals

Entities Involved

Canada Revenue Agency

Deployed the Charlie AI chatbot in February 2020 to answer taxpayer questions; spent $18 million on the system, which the Auditor General found gave incorrect answers to basic tax questions.

AI Systems Involved

CRA AI Chatbot

AI-powered chatbot 'Charlie' deployed to answer taxpayer questions about tax filing, benefits, and CRA services; processed over 18 million questions and was found by the Auditor General to answer only two out of six test questions correctly

Responses & Outcomes

Canada Revenue Agency

Upgraded Charlie to a generative AI version following the Auditor General's findings; reported pre-release testing showed approximately 90% accuracy but acknowledged it could not confirm real-world accuracy

AI System Context

Charlie is an AI-powered chatbot deployed by the Canada Revenue Agency to answer taxpayer questions about tax filing, benefits, and CRA services. The system processed over 18 million questions between its 2020 launch and the Auditor General's 2025 review.

Preventive Measures

  • Require rigorous accuracy testing of government AI chatbots before deployment, including domain-expert validation of responses to common queries
  • Implement ongoing quality monitoring that tests AI chatbot accuracy against verified answers on a regular schedule, with public reporting of accuracy rates
  • Mandate that government AI chatbots clearly disclose limitations and direct users to authoritative sources for consequential information such as tax obligations
  • Establish accountability mechanisms for AI system procurement, including performance benchmarks that must be met before continued investment
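The ongoing-monitoring measure above can be sketched in a few lines: run a set of expert-verified question/answer pairs through the chatbot on a schedule and report the resulting accuracy rate. This is a minimal illustration, not any real CRA tooling; every name here (`audit_chatbot`, `ask`, the toy canned answers) is a hypothetical assumption, and the 2-of-6 toy result simply mirrors the ratio the Auditor General reported.

```python
# Hypothetical sketch of scheduled accuracy monitoring for a chatbot:
# score its answers against expert-verified answers and report the rate.
from dataclasses import dataclass


@dataclass
class AuditResult:
    total: int
    correct: int

    @property
    def accuracy(self) -> float:
        # Fraction of verified questions answered correctly.
        return self.correct / self.total if self.total else 0.0


def audit_chatbot(ask, verified_qa):
    """Run each verified question through the chatbot and score exact matches.

    ask:         callable(question) -> answer (the chatbot under test)
    verified_qa: dict mapping question -> expert-verified answer
    """
    correct = sum(
        1
        for question, expected in verified_qa.items()
        if ask(question).strip().lower() == expected.strip().lower()
    )
    return AuditResult(total=len(verified_qa), correct=correct)


# Toy stand-in for the chatbot: answers two of six questions correctly,
# mirroring the ratio reported by the Auditor General.
canned = {"q1": "a1", "q2": "a2", "q3": "wrong", "q4": "wrong",
          "q5": "wrong", "q6": "wrong"}
gold = {f"q{i}": f"a{i}" for i in range(1, 7)}

result = audit_chatbot(canned.get, gold)
print(f"accuracy: {result.correct}/{result.total} = {result.accuracy:.0%}")
# → accuracy: 2/6 = 33%
```

In practice the verified set would be maintained by domain experts, scoring would need semantic comparison rather than exact string matching, and the resulting rate would be published on a regular schedule, as the measure above suggests.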

Related Records

Taxonomy

Domain
Public Services · Finance & Banking
Harm type
Misinformation · Operational Failure
AI involvement
Deployment Failure · Monitoring Gap
Lifecycle phase
Deployment · Monitoring

Sources

  1. "In scathing report, AG finds CRA call centres are slow to answer and often inaccurate" — CBC News (Media), Dec 12, 2025
  2. "Auditor General Slams Ottawa's $18 Million CRA Chatbot 'Charlie'" — iPhone in Canada (Media), Dec 12, 2025
  3. "The CRA spent $18M on 'Charlie,' a new tax information chatbot that is wrong most of the time" — Unpublished (Media), Dec 12, 2025
  4. "CRA must fix human responses before pursuing AI, experts say" — 980 CJME (Media), Oct 25, 2025

AIID: Incident #1310

Changelog

Version | Date | Change
v1 | Mar 8, 2026 | Initial publication