Joint Privacy Commissioners' Investigation Examining Whether OpenAI Violated Canadian Law
Four Canadian privacy commissioners are jointly investigating whether the training of ChatGPT complies with privacy laws.
In April 2023, Canada's Privacy Commissioner launched an investigation into OpenAI after receiving a complaint about ChatGPT's handling of personal information (Office of the Privacy Commissioner of Canada, 2023; CBC News, 2023). The investigation was subsequently joined by privacy commissioners in Quebec, British Columbia, and Alberta in May 2023, making it one of the first joint federal-provincial privacy investigations into a large language model (Office of the Privacy Commissioner of Canada, 2023).
The investigation is examining whether OpenAI violated the Personal Information Protection and Electronic Documents Act (PIPEDA) on multiple grounds: collecting personal information of Canadians without consent through web scraping to build training datasets, failing to ensure the accuracy of personal information generated by ChatGPT, and lacking transparency about how personal data was collected, used, and processed. The scope includes ChatGPT's generation of false biographical statements about identifiable Canadians and whether this constitutes a failure to meet accuracy obligations under Canadian privacy law.
As of early 2026, the investigation remains ongoing. Privacy Commissioner Philippe Dufresne described it as his "ongoing investigation into OpenAI" in a February 2026 statement to Parliament. The investigation is expected to address whether companies deploying large language models in Canada bear privacy obligations for the outputs those systems generate — not just the data they consume.
The investigation addresses a tension in generative AI: systems trained on vast internet data typically absorb personal information about real people, and their probabilistic text generation can produce confidently stated falsehoods about identifiable individuals. The outcome of this investigation will help determine whether current Canadian privacy frameworks apply to these novel AI harms.
Materialized from
Harms
OpenAI allegedly collected Canadians' personal information without consent through web scraping to build ChatGPT's training datasets, and allegedly failed to be transparent about how personal data was collected, used, and processed.
ChatGPT has reportedly generated false biographical statements about identifiable Canadians, presenting fabricated personal details with apparent confidence, a possible failure to meet accuracy obligations under Canadian privacy law.
Evidence
3 reports
- Privacy Commissioner launches investigation into ChatGPT (primary source)
OPC launched investigation into OpenAI/ChatGPT in April 2023 after receiving a complaint about handling of personal information
- Media reporting on the OPC investigation launch; context on privacy concerns with ChatGPT in Canada
- Quebec, BC, and Alberta privacy commissioners joined the investigation in May 2023; joint provincial-federal investigation into ChatGPT
Record details
Responses and outcomes
Launched formal investigation into OpenAI's ChatGPT after receiving a complaint about its handling of personal information
Expanded investigation into a joint federal-provincial effort with privacy commissioners of Quebec, British Columbia, and Alberta
Editorial assessment (assessed)
A joint investigation by the federal and provincial privacy commissioners, the first in Canada concerning a large language model, is examining whether OpenAI's collection and generation of personal information about Canadians violates Canadian privacy law (Office of the Privacy Commissioner of Canada, 2023).
Entities involved
AI systems involved
The AI system under investigation for its training data collection practices and its generation of false personal information about identifiable Canadians
Related records
- Ontario Man Alleges ChatGPT's Persistent Affirmation Triggered Delusional Episode (related)
- Joint Privacy Investigation Finds TikTok Collected Children's Data for Algorithmic Profiling and Targeted Advertising (related)
Taxonomy (assessed)
Edit history
| Version | Date | Change |
|---|---|---|
| v1 | March 8, 2026 | Initial publication |
| v2 | 11 mars 2026 | Neutrality and factuality review: removed three fabricated policy recommendation attributions (the cited dates correspond to investigation launch announcements, not OPC recommendations; the investigation remains ongoing with no final report or recommendations published). Narrative facts verified against OPC primary sources — no changes needed. |