Confirmed · Severity: Important · Version 1

A major consulting firm used AI to generate research citations in a $1.6 million government health policy document, some of which were found to be fabricated. The incident illustrates how LLM confabulation can reach consequential policy decisions through established institutional channels.

Occurred: May 29, 2025 · Reported: November 22, 2025

Narrative

In May 2025, the Government of Newfoundland and Labrador released a 526-page Health Human Resources Plan commissioned from Deloitte at a cost of nearly $1.6 million. The plan was intended to guide a decade of workforce planning across 21 healthcare occupations.

In November 2025, The Independent (NL) reported that the document contained fabricated academic citations. Professor Emerita Martha MacLeod of the University of Northern British Columbia confirmed that a cited paper — “The cost-effectiveness of a rural retention program for registered nurses in Canada” — was “false” and “potentially AI-generated,” noting that while her team had done rural nursing research, they had never conducted a cost-effectiveness analysis. Professor Gail Tomblin Murphy of Dalhousie University confirmed another cited paper “does not exist,” adding that only three of the six listed co-authors had ever worked together. A third citation, purportedly from the Canadian Journal of Respiratory Therapy, could not be found in academic databases.

Deloitte responded that “AI was not used to write the report” but was “selectively used to support a small number of research citations,” and stated it would issue corrections that “do not impact the report findings.” The Premier and Health Minister did not respond to media inquiries. In June 2025 — one month after the report’s release — Deloitte had been selected for an additional contract: a core staffing review of nursing resources.

The incident follows a parallel case in July 2025 where a Deloitte Australia welfare report was found to contain a fabricated court quote and nonexistent research, for which Deloitte refunded AUD $290,000. That report’s appendix disclosed the use of Azure OpenAI.

Harms

A 526-page government-commissioned health workforce plan, intended to guide a decade of staffing decisions across 21 healthcare occupations in Newfoundland and Labrador, contained fabricated academic citations — including papers that real researchers confirmed do not exist, undermining the evidentiary basis for provincial health policy.

Important · Sector

Real researchers were falsely attributed authorship of nonexistent papers. Professor Emerita Martha MacLeod (UNBC) and Professor Gail Tomblin Murphy (Dalhousie) were named as authors of fabricated studies, damaging their professional reputations and lending false credibility to policy recommendations.

Moderate · Individual

Affected populations

  • healthcare workers in Newfoundland and Labrador affected by workforce planning decisions
  • researchers falsely attributed as authors of fabricated citations
  • residents of Newfoundland and Labrador relying on health system planning

Entities involved

Deloitte Canada
deployer, developer

Contracted for nearly $1.6 million to produce a 526-page health human resources plan for Newfoundland and Labrador; admitted that AI was “selectively used to support a small number of research citations” and announced corrections

Government of Newfoundland and Labrador

Commissioned and released the Deloitte report as official provincial health policy; did not respond to media inquiries about the fabricated citations

AI system context

Deloitte used an unidentified AI system to generate research citations for a government health workforce plan. The AI fabricated citations to nonexistent academic papers, attributing them to real researchers. A parallel Deloitte Australia report disclosed Azure OpenAI usage, but the specific tool used in the NL report has not been confirmed.

Preventive measures

  • Require consulting contracts for government policy work to disclose any AI use in research, analysis, or writing, with specific documentation of which sections involved AI assistance
  • Mandate independent verification of all cited research in government-commissioned reports before publication
  • Establish contractual penalties for submission of fabricated evidence in government consulting deliverables
  • Require AI-assisted research workflows to include human verification of every citation against primary sources
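The citation-verification step above can be partly automated before the mandated human review. The sketch below queries the Crossref REST API (a real, public bibliographic index at api.crossref.org) for each cited title and flags titles with no close match for manual checking. The function names, the use of `difflib` for fuzzy matching, and the 0.85 similarity threshold are illustrative assumptions, not part of any standard workflow; a low score signals “needs human verification against the primary source,” not proven fabrication.

```python
# Sketch: flag citations whose titles cannot be closely matched in Crossref.
# Assumptions: api.crossref.org is reachable; the 0.85 threshold is illustrative.
import json
import urllib.parse
import urllib.request
from difflib import SequenceMatcher


def crossref_query_url(title: str, rows: int = 5) -> str:
    """Build a Crossref bibliographic-search URL for a cited title."""
    params = urllib.parse.urlencode(
        {"query.bibliographic": title, "rows": rows}
    )
    return f"https://api.crossref.org/works?{params}"


def title_similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio in [0, 1] between two titles."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def looks_verifiable(title: str, threshold: float = 0.85) -> bool:
    """Return True if any Crossref hit closely matches the cited title.

    A False result does not prove fabrication; it only means a human
    must check the primary source before the citation is accepted.
    """
    with urllib.request.urlopen(crossref_query_url(title), timeout=10) as resp:
        items = json.load(resp)["message"]["items"]
    return any(
        title_similarity(title, hit["title"][0]) >= threshold
        for hit in items
        if hit.get("title")
    )
```

A tool like this would have surfaced all three fabricated citations in the NL report, since none of the titles resolves in academic databases; the reverse check (a real-looking hit) still requires a human to confirm the paper actually supports the claim it is cited for.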


Taxonomy

Domain
Health · Public services
Harm type
Misinformation · Operational failure
AI involvement
Model confabulation · Deployment failure
Lifecycle phase
Deployment · Monitoring

Sources

  1. Major N.L. healthcare report contains errors likely generated by A.I. Media — The Independent (Nov. 22, 2025)
  2. Deloitte caught with fabricated, AI-generated research in million-dollar report for Canada government Media — Fortune (Nov. 25, 2025)
  3. Government Releases Health Human Resources Plan Official — Government of Newfoundland and Labrador (May 29, 2025)

AIID: Incident #1286

Revision history

Version | Date | Change
v1 | March 8, 2026 | Initial publication based on AIID cross-reference scan