Status: Confirmed · Severity: Significant · Version 1

A major consulting firm used AI to generate research citations in a $1.6 million government health policy document, some of which were found to be fabricated. The incident illustrates how LLM confabulation can reach consequential policy decisions through established institutional channels.

Occurred: May 29, 2025 · Reported: November 22, 2025

Narrative

In May 2025, the Government of Newfoundland and Labrador released a 526-page Health Human Resources Plan commissioned from Deloitte at a cost of nearly $1.6 million. The plan was intended to guide a decade of workforce planning across 21 healthcare occupations.

In November 2025, The Independent (NL) reported that the document contained fabricated academic citations. Professor Emerita Martha MacLeod of the University of Northern British Columbia confirmed that a cited paper — “The cost-effectiveness of a rural retention program for registered nurses in Canada” — was “false” and “potentially AI-generated,” noting that while her team had done rural nursing research, they had never conducted a cost-effectiveness analysis. Professor Gail Tomblin Murphy of Dalhousie University confirmed another cited paper “does not exist,” adding that only three of the six listed co-authors had ever worked together. A third citation, purportedly from the Canadian Journal of Respiratory Therapy, could not be found in academic databases.

Deloitte responded that “AI was not used to write the report” but was “selectively used to support a small number of research citations,” and stated it would issue corrections that “do not impact the report findings.” The Premier and Health Minister did not respond to media inquiries. In June 2025 — one month after the report’s release — Deloitte had been selected for an additional contract: a core staffing review of nursing resources.

The incident follows a parallel case in July 2025 where a Deloitte Australia welfare report was found to contain a fabricated court quote and nonexistent research, for which Deloitte refunded AUD $290,000. That report’s appendix disclosed the use of Azure OpenAI.

Harms

A 526-page government-commissioned health workforce plan, intended to guide a decade of staffing decisions across 21 healthcare occupations in Newfoundland and Labrador, contained fabricated academic citations — including papers that real researchers confirmed do not exist, undermining the evidentiary basis for provincial health policy.

Severity: Significant · Scope: Sector

Real researchers were falsely attributed authorship of nonexistent papers. Professor Emerita Martha MacLeod (UNBC) and Professor Gail Tomblin Murphy (Dalhousie) were named as authors of fabricated studies, damaging their professional reputations and lending false credibility to policy recommendations.

Severity: Moderate · Scope: Individual

Affected Populations

  • Healthcare workers in Newfoundland and Labrador affected by workforce planning decisions
  • Researchers falsely attributed as authors of fabricated citations
  • Residents of Newfoundland and Labrador relying on health system planning

Entities Involved

Deloitte Canada (developer, deployer)

Contracted for nearly $1.6 million to produce a 526-page Health Human Resources Plan for Newfoundland and Labrador; admitted AI was "selectively used to support a small number of research citations" and stated it would issue corrections.

Government of Newfoundland and Labrador

Commissioned and published the Deloitte report as official provincial health policy; did not respond to media inquiries about the fabricated citations.

AI System Context

Deloitte used an unidentified AI system to generate research citations for a government health workforce plan. The AI fabricated citations to nonexistent academic papers, attributing them to real researchers. A parallel Deloitte Australia report disclosed Azure OpenAI usage, but the specific tool used in the NL report has not been confirmed.

Preventive Measures

  • Require consulting contracts for government policy work to disclose any AI use in research, analysis, or writing, with specific documentation of which sections involved AI assistance
  • Mandate independent verification of all cited research in government-commissioned reports before publication
  • Establish contractual penalties for submission of fabricated evidence in government consulting deliverables
  • Require AI-assisted research workflows to include human verification of every citation against primary sources
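The last measure above can be partially automated. The sketch below is a minimal, hypothetical illustration (not a tool used in this incident): each cited work is screened against a registry of verified records, such as an export from Crossref or PubMed, and any citation whose title is absent from the registry is queued for human review. All names and the registry format are assumptions for illustration.

```python
# Hypothetical citation-screening sketch: flag cited works whose titles are
# absent from a locally held registry of verified publications. A match is
# necessary but not sufficient -- a human must still confirm authorship and
# venue against the primary source.
from dataclasses import dataclass


@dataclass(frozen=True)
class Citation:
    title: str
    authors: tuple[str, ...]


def normalize(title: str) -> str:
    """Lowercase and collapse whitespace so trivial formatting
    differences do not cause false mismatches."""
    return " ".join(title.lower().split())


def flag_for_review(cited: list[Citation], verified_titles: set[str]) -> list[Citation]:
    """Return the citations whose titles do not appear in the verified registry."""
    verified = {normalize(t) for t in verified_titles}
    return [c for c in cited if normalize(c.title) not in verified]


# Illustrative usage: one real-looking record, one unmatched record.
registry = {"Rural nursing workforce trends in Canada"}
cited = [
    Citation("Rural Nursing Workforce Trends in Canada", ("A. Author",)),
    Citation("The cost-effectiveness of a rural retention program "
             "for registered nurses in Canada", ("M. MacLeod",)),
]
flagged = flag_for_review(cited, registry)  # only the second citation is flagged
```

A title-only check like this would have caught all three fabricated citations described above, since none of them exist in any academic database; in practice the registry lookup would be replaced by a query to a bibliographic API, with the unmatched set routed to a human verifier rather than auto-rejected.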


Taxonomy

Domain
Healthcare, Public Services
Harm type
Misinformation, Operational Failure
AI involvement
Model Confabulation, Deployment Failure
Lifecycle phase
Deployment, Monitoring

Sources

  1. "Major N.L. healthcare report contains errors likely generated by A.I." — The Independent (media), Nov 22, 2025
  2. "Deloitte caught with fabricated, AI-generated research in million-dollar report for Canada government" — Fortune (media), Nov 25, 2025
  3. "Government Releases Health Human Resources Plan" — Government of Newfoundland and Labrador (official), May 29, 2025

AIID: Incident #1286

Changelog

Version | Date | Change
v1 | Mar 8, 2026 | Initial publication based on AIID cross-reference scan