Three Ontario Regional Police Services Built a Shared Facial Recognition Database of 1.6 Million Images
York Regional Police and Peel Regional Police jointly deployed IDEMIA facial recognition in May 2024, followed by Halton Regional Police in December 2025. The three services share a database of 1.6 million mugshot and tattoo images; they state matches are treated as investigative leads reviewed by trained analysts. Civil liberties organizations have called for a moratorium on police use of facial recognition in Canada.
On May 27, 2024, York Regional Police (YRP) and Peel Regional Police (PRP) jointly deployed IDEMIA facial recognition technology through a shared procurement partnership (CBC News, 2024; York Regional Police, 2024). The system allows officers to compare images of suspects or persons of interest against a shared mugshot database containing booking photos held by both services (CBC News, 2024; York Regional Police, 2024). IDEMIA, a French multinational biometrics company, was selected as the vendor.
In February 2025, Halton Regional Police Service awarded IDEMIA a $1.18 million, five-year contract ($362,764 for installation and first-year maintenance; $180,643 per year thereafter) (Biometric Update, 2025). The system went live for Halton in December 2025, expanding the shared database to approximately 1.6 million mugshots and tattoo images across all three services (Biometric Update, 2025).
The system automates what was previously a manual image comparison process. Officers submit images of suspects, which the IDEMIA software compares against the shared mugshot database using neural network-based facial recognition. All potential matches are treated as investigative leads — not confirmations of identity — and must be reviewed by trained facial recognition analysts (York Regional Police, 2024). The services state the system is not used for real-time surveillance, live video analysis, crowd monitoring, or scraping internet or social media images (York Regional Police, 2024).
The deployment was informed by guidance published by the Information and Privacy Commissioner of Ontario (IPC) in January 2024, titled "Facial Recognition and Mugshot Databases: Guidance for Police in Ontario." The guidance states that police must ensure lawful authority before deploying facial recognition, conduct Privacy Impact Assessments (PIAs), limit use to serious crimes, regularly purge non-conviction records from mugshot databases, and maintain public transparency through regular audits and public reporting (Information and Privacy Commissioner of Ontario, 2024). Both York and Peel stated they consulted with the IPC during implementation (CBC News, 2024). Peel Regional Police published a PIA summary document.
The Canadian Civil Liberties Association raised strong objections (Canadian Civil Liberties Association, 2024). Director of fundamental freedoms Anaïs Bussières McNicoll stated: "Until there are clear and transparent policies and laws regulating the use of facial recognition technology in Canada, it should not be used by law enforcement agencies." Brenda McPhail, former director of the CCLA's Privacy, Technology and Surveillance Program, has stated that facial recognition technology "facilitates mass surveillance, is harmful to privacy, is also racially biased and feeds systemic racism in policing" (Canadian Civil Liberties Association, 2024).
CBC News reported in June 2024 that Nijeer Parks, a Black man in New Jersey, spent 10 days wrongfully jailed in 2019 after an alleged misidentification by IDEMIA facial recognition technology — the same vendor now deployed by York and Peel police (CBC News, 2024). According to the lawsuit filed by Parks, he was arrested for shoplifting and assault based on a false facial recognition match (CBC News, 2024). The charges were eventually dropped. This case received significant media attention in Ontario and raised questions about racial bias in the system now being used by Canadian police (CBC News, 2024).
Separately, Toronto Police Service published a request for proposals in December 2024 to upgrade its own existing facial recognition system (CBC News, 2024). Eleven vendors expressed interest, including IDEMIA, NEC Corporation of America, and Facia AI Ltd. The solicitation closed February 14, 2025.
Materialized From
Harms
Three Ontario regional police services built a shared facial recognition database of 1.6 million mugshots and tattoo images, deploying neural network-based biometric matching without federal AI legislation governing such systems. The same IDEMIA technology was linked to the wrongful arrest and 10-day imprisonment of a Black man in New Jersey.
Facial recognition technology exhibits documented racial, gender, and age bias in accuracy rates, as established by NIST evaluations and independent research. Civil liberties organizations have raised concerns that the deployment of a shared database across three police services could result in discriminatory misidentification affecting racialized communities in Ontario.
Evidence
8 reports
- York and Peel Regional Police jointly deployed IDEMIA facial recognition through shared procurement; system compares suspect images against shared mugshot database
- IPC Ontario guidance on police use of facial recognition and mugshot databases; governance framework and privacy requirements
- CCLA position on facial recognition technology; civil liberties concerns with police biometric surveillance
- York Regional Police public information on facial recognition technology deployment; stated policies and limitations
- IDEMIA linked to wrongful arrests in the United States; concerns about accuracy and racial bias in the technology deployed by Toronto-area police
- Broader reporting on facial recognition adoption by Canadian police forces; context on national trends and privacy concerns
- Canadian police expanding use of IDEMIA facial recognition; technical details of the shared database system
- IDEMIA system going live for Canadian regional police; 1.6 million image database; Durham Regional Police joining shared system
Record details
Responses & Outcomes
Published guidance on facial recognition and mugshot databases for Ontario police, setting out requirements for lawful authority, PIAs, serious crime limitation, mugshot purging, and public transparency
Guidance-level only — no binding legal framework. York and Peel stated they consulted with IPC during implementation.
Editorial Assessment
This deployment represents the quiet normalization of police facial recognition in Canada through incremental expansion. Three Ontario regional police services now share a 1.6 million-image database (Biometric Update, 2025), and Toronto is procuring its own system (CBC News, 2024). Each expansion occurs within guidance-level governance — IPC recommendations rather than binding legislation (Information and Privacy Commissioner of Ontario, 2024) — in a jurisdiction where no federal AI law exists. The IDEMIA system's documented link to a wrongful arrest in New Jersey illustrates the technology's potential for discriminatory harm (CBC News, 2024), and the shared database model means misidentifications could propagate across multiple police jurisdictions (Biometric Update, 2025).
Entities Involved
AI Systems Involved
Neural network-based facial recognition system that compares suspect images against a shared database of 1.6 million mugshots and tattoo images across York, Peel, and Halton regional police services
Related Records
- RCMP Use of Clearview AI Facial Recognition Without Privacy Assessment
- Edmonton Police First to Deploy Facial Recognition Body Cameras; Privacy Commissioner Says Approval Not Obtained
- AI Governance Gap in Canada
Taxonomy
Changelog
| Version | Date | Change |
|---|---|---|
| v1 | Mar 10, 2026 | Record created from public sources. Agent-draft — requires editorial review before publication. |
| v2 | Mar 11, 2026 | Neutrality and factuality review: corrected Brenda McPhail's title in FR (was 'chercheuse en vie privée', should be 'ancienne directrice du Programme de vie privée, technologie et surveillance de l'ACLC'); specified IPC guidance date as January 2024 in EN; qualified Nijeer Parks/IDEMIA link as coming from lawsuit allegations; reframed harm #2 to attribute bias concerns to NIST research and civil liberties organizations rather than presenting as editorial assertion. |