Status: Active · Confidence: Medium · Potential severity: Significant · Version: 1

Montreal's police force acquired AI surveillance technology with built-in biometric capabilities including ethnicity and emotion detection, without public disclosure of the specific software or a publicly available privacy impact assessment — capabilities that can be activated through software configuration rather than requiring new procurement.

Identified: December 1, 2025 · Last assessed: March 8, 2026

Description

In December 2025, reporting by Pivot, a Quebec independent media outlet, revealed that the Service de police de la Ville de Montréal (SPVM) had acquired a $1.8 million, five-year AI video analysis platform from iMotion Security. The SPVM initially refused to disclose which software the platform used or to release the privacy impact assessment that authorized the procurement.

Subsequent investigative reporting in February 2026 identified the software as Rank One Computing’s (ROC) video analytics platform, an American-made system deployed across 46 cameras. The ROC software’s documented capabilities include search by clothing and vehicle attributes, but also built-in biometric features: facial recognition, age estimation, ethnicity detection, gender classification, facial hair detection, and emotion analysis. These capabilities are part of the software’s standard feature set and can be toggled on or off through configuration rather than requiring new procurement or hardware changes.
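The significance of the toggle mechanism can be illustrated with a deliberately simplified feature-flag sketch. This is purely hypothetical: the flag names and structure below are not drawn from ROC or iMotion documentation, and serve only to show why activating a dormant capability requires no new procurement, hardware, or code change.

```python
# Hypothetical illustration of capability gating via configuration.
# Flag names are invented for this sketch, not taken from any vendor.

DEPLOYED_CONFIG = {
    "clothing_search": True,       # the stated current use case
    "vehicle_search": True,
    "face_recognition": False,     # present in the software, switched off
    "ethnicity_detection": False,
    "emotion_analysis": False,
}

def capability_enabled(config: dict, feature: str) -> bool:
    """Return whether a capability is switched on in the deployed config."""
    return config.get(feature, False)

# Activating a dormant biometric capability is a one-line config edit:
DEPLOYED_CONFIG["ethnicity_detection"] = True
```

Because such a change alters no contract, invoice, or installed hardware, it produces none of the records that procurement-based oversight relies on, which is why the civil liberties concerns below focus on the configuration pathway rather than on new acquisitions.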

The SPVM stated that biometric identification features are “not part of the current context of use.” However, civil liberties organizations have raised concerns that the capabilities exist within the deployed software and could be activated at any time through a configuration change — without additional procurement, public consultation, or legislative authorization. The absence of a publicly available privacy impact assessment, the initial refusal to name the software vendor, and the gap between the platform’s full capabilities and the SPVM’s stated use case create a significant transparency deficit.

The deployment is particularly concerning in light of documented racial bias in facial recognition systems. Research, including NIST's demographic evaluations of commercial face recognition algorithms, has shown significantly higher error rates for Black individuals and for women. In a city where policing of racialized communities is an active public concern, the acquisition of AI surveillance technology with ethnicity detection capabilities — even if claimed to be currently disabled — represents a meaningful hazard.

Risk Pathway

The SPVM deployed an AI video surveillance platform whose software includes built-in biometric capabilities — facial recognition, ethnicity detection, emotion analysis — that can be activated through configuration changes without additional procurement, public consultation, or legislative authorization. The absence of a publicly available privacy impact assessment and the initial refusal to identify the vendor create a transparency deficit that prevents meaningful oversight of whether and when these capabilities might be enabled.

Assessment History

Status: Active · Confidence: Medium · Severity: Significant

Pivot's December 2025 reporting revealed the SPVM's acquisition of the AI surveillance platform and the refusal to disclose the vendor. February 2026 investigative reporting by The Concordian and Biometric Update identified the software as ROC video analytics and documented its full biometric capabilities. The ROC software's feature set — including facial recognition, ethnicity detection, and emotion analysis — is confirmed by vendor documentation. The SPVM's claim that biometric features are not enabled cannot be independently verified due to the absence of a public privacy impact assessment.

Migrated from v2 flat assessment

Triggers

  • Configuration change enabling biometric features without new procurement or public process
  • Expansion of camera coverage beyond the current 46 cameras
  • High-profile security event creating pressure to activate facial recognition capabilities
  • Absence of municipal or provincial legislation explicitly prohibiting biometric surveillance activation

Mitigating Factors

  • SPVM's public statement that biometric features are not part of the current context of use
  • Civil society scrutiny from Pivot and investigative journalists maintaining public awareness
  • Documented racial bias in facial recognition systems creating political and legal liability for activation

Risk Controls

  • Require public disclosure and independent review of all AI surveillance technology procured by police services, including the specific software, its full capabilities, and any privacy impact assessments
  • Establish municipal bylaws or provincial legislation prohibiting activation of biometric identification features — including facial recognition, ethnicity detection, and emotion analysis — in police surveillance systems without explicit legislative authorization
  • Mandate that police procurement of AI surveillance technology include binding contractual limits on which software capabilities may be enabled, with independent compliance verification
  • Require community consultation before police deploy AI-powered surveillance systems in public spaces, with particular attention to the impact on racialized communities

Affected Populations

  • Montreal residents
  • Racialized communities
  • Civil liberties organizations

Entities Involved

Service de police de la Ville de Montréal (SPVM)
deployer

Acquired and deployed the $1.8 million AI video analysis platform across 46 cameras, initially refusing to disclose the software vendor or release the privacy impact assessment

iMotion Security
developer

Supplied the AI video analysis platform to the SPVM, integrating Rank One Computing's video analytics software

AI Systems Involved

ROC Video Analytics Platform

American-made video analytics platform deployed across 46 SPVM cameras with built-in capabilities for facial recognition, age estimation, ethnicity detection, gender classification, and emotion analysis — features the SPVM claims are not currently enabled

Responses

Service de police de la Ville de Montréal

Stated that biometric identification features are not part of the current context of use, but initially refused to disclose the software vendor or release the privacy impact assessment

Related Records

Taxonomy

Domain
Law Enforcement
Harm type
Surveillance Overreach · Privacy & Data Exposure
AI involvement
Deployment Failure · Oversight Breakdown
Lifecycle phase
Procurement · Deployment

Sources

  1. IA au SPVM — technologie intrusive (AI at the SPVM: an intrusive technology) Other — Pivot (Dec 8, 2025)
  2. SPVM's new AI video surveillance platform uses American software with facial recognition Media — The Concordian (Feb 1, 2026)
  3. iMotion deploying ROC video analytics for Montreal police Media — Biometric Update (Feb 1, 2026)

Changelog

Version · Date · Change
v1 · Mar 8, 2026 · Initial publication