Pilot phase: CAIM is under construction. Records are provisional, based on public sources, and have not yet been peer-reviewed. Feedback welcome.
Active · Significant · Confidence: medium

Montreal police acquired AI video surveillance software with built-in ethnicity and emotion detection, capabilities that can be activated through configuration, without public disclosure or a published impact assessment.

Identified: December 1, 2025 · Last assessed: March 8, 2026

In December 2025, reporting by Pivot, an independent Quebec media outlet, revealed that the Service de police de la Ville de Montréal (SPVM) had acquired a $1.8 million, five-year AI video analysis platform from iMotion Security. The SPVM initially refused to disclose which software the platform used or to release the privacy impact assessment that authorized the procurement.

Subsequent investigative reporting in February 2026 identified the software as Rank One Computing's (ROC) video analytics platform, an American-made system deployed across 46 cameras. The ROC software's documented capabilities include not only search by clothing and vehicle attributes but also built-in biometric features: facial recognition, age estimation, ethnicity detection, gender classification, facial hair detection, and emotion analysis. These capabilities are part of the software's standard feature set and can be toggled on or off through configuration, without new procurement or hardware changes.

The SPVM stated that biometric identification features are "not part of the current context of use." However, civil liberties organizations have raised concerns that the capabilities exist within the deployed software and could be activated at any time through a configuration change — without additional procurement, public consultation, or legislative authorization. The absence of a publicly available privacy impact assessment, the initial refusal to name the software vendor, and the gap between the platform's full capabilities and the SPVM's stated use case create a significant transparency deficit.

The deployment raises specific questions in light of documented racial bias in facial recognition systems. Research has shown that facial recognition algorithms, including those comparable to ROC's, exhibit significantly higher error rates for Black individuals and women. In a city where policing of racialized communities is an active public concern, the acquisition of AI surveillance technology with ethnicity detection capabilities — even if claimed to be currently disabled — represents a meaningful hazard.

Harms

SPVM acquired a $1.8 million AI video analysis platform from iMotion Security whose underlying software (Rank One Computing) includes built-in biometric capabilities — facial recognition, ethnicity detection, emotion analysis — that can be activated through configuration changes without additional procurement or public consultation.

Disproportionate Surveillance · Privacy & Data Exposure · Significant · Population

The SPVM initially refused to disclose the software used or release the privacy impact assessment. No publicly available PIA authorizing the deployment has been produced, and no public consultation was conducted before deploying AI surveillance in public spaces.

Disproportionate Surveillance · Autonomy Undermined · Significant · Population

Evidence

3 reports

  1. Other — Pivot (Dec 8, 2025)

    Pivot investigation: SPVM acquired intrusive AI surveillance technology from iMotion Security; initial disclosure of the $1.8 million procurement

  2. Media — The Concordian (Feb 1, 2026)

    Concordian reporting: SPVM's AI video surveillance platform uses American software (Rank One Computing) with built-in facial recognition and ethnicity detection capabilities

  3. Media — Biometric Update (Feb 1, 2026)

    Biometric Update reporting: iMotion deploying ROC video analytics for Montreal police; technical details of the surveillance system

Record details

Responses & Outcomes

Service de police de la Ville de Montréal · legislation · Active

Stated that biometric identification features are "not part of the current context of use," but initially refused to disclose the software vendor or to release the privacy impact assessment

Policy Recommendations · assessed

Require public disclosure and independent review of all AI surveillance technology procured by police services, including the specific software, its full capabilities, and any privacy impact assessments

Pivot (Dec 8, 2025)

Establish municipal bylaws or provincial legislation prohibiting activation of biometric identification features in police surveillance systems without explicit legislative authorization

Pivot (Dec 8, 2025)

Require community consultation before police deploy AI-powered surveillance systems in public spaces, with particular attention to the impact on racialized communities

Pivot (Dec 8, 2025)

Editorial Assessment · assessed

Montreal's police force acquired an AI surveillance platform whose software includes built-in biometric capabilities — ethnicity and emotion detection — that can be activated through configuration. The specific software and privacy impact assessment were not initially disclosed to the public. Civil liberties organizations and the Quebec AI ethics commission have raised concerns about the procurement process and the potential for capability expansion.

Entities Involved

AI Systems Involved

ROC Video Analytics Platform

American-made video analytics platform deployed across 46 SPVM cameras with built-in capabilities for facial recognition, age estimation, ethnicity detection, gender classification, and emotion analysis — features the SPVM claims are not currently enabled

Related Records

Taxonomy · assessed

Domain
Law Enforcement
Harm type
Disproportionate Surveillance · Privacy & Data Exposure
AI pathway
Deployment Context · Oversight Absent
Lifecycle phase
Procurement · Deployment

Changelog

v1 · Mar 8, 2026 · Initial publication
