The increasing use of algorithmic surveillance technologies by Canadian police threatens privacy and fundamental freedoms, says a University of Toronto technology and rights research group.
“The advanced capabilities and heightened data requirements of algorithmic policing technologies introduce new threats to privacy,” says a report issued Tuesday by Citizen Lab.
It complains that what it calls algorithmic policing — using data to predict where offences will likely be committed and who will likely commit them — involves the repurposing of historic police data, constitutionally questionable data-sharing arrangements, and the algorithmic surveillance of public gatherings and online expression.
“The Canadian legal system currently lacks sufficiently clear and robust safeguards to ensure that the use of algorithmic surveillance methods—if any—occurs within constitutional boundaries and is subject to necessary regulatory, judicial, and legislative oversight mechanisms,” says the report. “Given the potential damage that the unrestricted use of algorithmic surveillance by law enforcement may cause to fundamental freedoms and a free society, the use of such technology in the absence of oversight and compliance with limits defined by necessity and proportionality is unjustified.”
The report’s authors also say discussion of algorithmic policing systems has to take into account systemic racism in the Canadian legal system, particularly as it could impact marginalized communities. “The seemingly ‘neutral’ application of algorithmic policing tools masks the reality that they can disproportionately impact marginalized communities in a protected category under equality law,” they wrote.
“Numerous inaccuracies, biases, and other sources of unreliability are present in most of the common sources of police data in Canada. As a result, drawing unbiased and reliable inferences based on historic police data is, in all likelihood, impossible. Extreme caution must be exercised before law enforcement authorities are permitted, if at all, to use algorithmic policing technologies that process mass police data sets. Otherwise, these technologies may exacerbate the already unconstitutional and devastating impact of systemic targeting of marginalized communities.”
It distinguishes between three types of algorithmic policing technologies:
- Location-focused, which uses historical police data to try to predict high crime areas;
- Person-focused, which uses historical police data to try to predict persons who may be likely to commit crimes;
- And algorithmic surveillance policing technologies, such as facial recognition, which usually have no predictive element but may process police data in a different way.
The 179-page, five-part report compiles information from a number of sources, including court documents, but not all police forces were helpful. Many of the freedom of information requests were ignored or met with claims of document privilege or demands for “exorbitant” fees. Still, the authors note that at least two agencies, the Vancouver Police Department and the Saskatoon Police Service, confirmed they are using or developing ‘predictive’ algorithmic technologies for the purposes of guiding police action and intervention.
Other police services, such as in Calgary and Toronto, have acquired technologies that include algorithmic policing capabilities, although it isn’t clear if these systems are being used. The report quotes an unnamed law enforcement official confirming that the Toronto Police Service doesn’t currently engage in algorithmically driven predictive policing. It adds that Toronto Police have said certain technologies are only being used for reporting and statistical insights.
The report says the Calgary Police Service uses algorithmic social network analysis, which may also be used at some point for person-focused algorithmic policing. Numerous law enforcement agencies across the country also now rely on a range of other algorithmic surveillance technologies (for example, automated licence plate readers, facial recognition, and social media surveillance algorithms), or they are developing or considering adopting such technologies, the report adds.
Some cities are sensitive to possible abuses. Vancouver Police use a machine-learning, location-focused algorithmic system called GeoDASH. It uses historical police data to try to predict where and when break-and-enter crimes are likely to occur. The report notes the system allows the designation of “exclusionary zones,” which take account of sensitive areas of the city that give rise to over-policing concerns. One area, the Downtown Eastside, has been designated as an exclusionary zone.
The report calls on Ottawa to create a judicial inquiry to conduct a comprehensive review regarding law enforcement agencies’ potential repurposing of historic police data sets for use in algorithmic policing technologies. In the meantime, it adds, law enforcement agencies must immediately be fully transparent with the public and with privacy commissioners about what algorithmic policing technologies are currently being used, developed, or procured, to enable public dialogue and meaningful accountability and oversight.