Police across Canada are increasingly using controversial algorithms to predict where crimes could occur and who might go missing, and to help determine where officers should patrol, despite fundamental human rights concerns, a new report has found.
To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada is the result of a joint investigation by the University of Toronto’s International Human Rights Program (IHRP) and Citizen Lab. It details how “law enforcement agencies across Canada have started to use, procure, develop, or test a variety of algorithmic policing methods,” with what the authors warn are potentially dire consequences for civil liberties, privacy, and other Charter rights.
The report breaks down how police are using or considering algorithms for several purposes, including predictive policing, which uses historical police data to forecast where crime will occur. Police in Canada are already using algorithms to analyze data about individuals to predict who might go missing, with the goal of one day extending the technology to other areas of the criminal justice system. Some police services are also using algorithms to automate the mass collection and analysis of public data, including social media posts, and to run facial recognition against existing mugshot databases for investigative purposes.
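None of these systems’ internals are public, but the place-based technique the report describes can be sketched in a few lines. The sketch below is purely illustrative, not drawn from any actual police system: the grid cells, decay rate, and incident data are all invented. It scores map cells by a recency-weighted count of past recorded incidents and ranks them for patrol attention.

```python
# Illustrative sketch only -- not the code of any vendor's system.
# Place-based predictive policing broadly reduces to this idea: score
# geographic grid cells from historical incident records, then flag the
# highest-scoring cells as "hotspots" for patrol.
from collections import defaultdict

def hotspot_scores(incidents, decay=0.95):
    """incidents: list of (grid_cell, days_ago) pairs from past records.
    Returns cells ranked by recency-weighted incident count."""
    scores = defaultdict(float)
    for cell, days_ago in incidents:
        # Older incidents count for less; recent ones dominate the score.
        scores[cell] += decay ** days_ago
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical data: cell IDs and how many days ago incidents were recorded.
history = [("cell_12", 2), ("cell_12", 30), ("cell_07", 1), ("cell_07", 3)]
print(hotspot_scores(history))  # "riskiest" cells according to past data
```

Note what the sketch makes plain: the model can only see where police previously recorded incidents, not where crime actually happened.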
“Algorithmic policing technologies are present or under consideration throughout Canada in the forms of both predictive policing and algorithmic surveillance tools,” the report reads.
Police in Vancouver, for example, use a machine-learning tool called GeoDASH to predict where break-and-enter crimes might occur. Calgary Police Service (CPS) uses Palantir’s Gotham software to identify and visualize links between people who interact with the police, including victims and witnesses, and the places, police reports, properties, and vehicles associated with them. (A draft Privacy Impact Assessment conducted by CPS in 2014 and cited in the report noted that Gotham could “present false associations between innocent individuals and criminal organizations and suspects” and recommended measures to mitigate that risk, but not all of the recommendations have been implemented.)
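Gotham’s implementation is proprietary, but link-analysis tools of this general kind can be thought of as building a graph from police records. In the hypothetical sketch below (the entity names and records are invented), a witness ends up connected to a suspect simply because both appear in the same report, which is exactly the “false association” risk the CPS assessment flagged.

```python
# Illustrative sketch only -- not Palantir Gotham's actual implementation.
# Link-analysis tools build a graph whose nodes are people, places,
# vehicles, and reports, with an edge wherever two entities co-occur
# in a record. Victims and witnesses land in the same graph as suspects.
from collections import defaultdict
from itertools import combinations

def build_link_graph(records):
    """records: list of entity-ID lists, one per police report."""
    graph = defaultdict(set)
    for entities in records:
        for a, b in combinations(entities, 2):
            graph[a].add(b)
            graph[b].add(a)
    return graph

# Hypothetical reports: the witness is now linked to the suspect merely
# because both appear in the same record.
reports = [["suspect_1", "witness_9", "vehicle_ABC123"],
           ["suspect_1", "address_44_Elm"]]
print(build_link_graph(reports)["witness_9"])  # {'suspect_1', 'vehicle_ABC123'}
```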
The Toronto Police Service does not currently use algorithms in policing, but police there have been collaborating with a data analytics firm since 2016 in an effort to “develop algorithmic models that identify high crime areas,” the report notes.
The Saskatchewan Police Predictive Analytics Lab (SPPAL), founded in 2015, is using data provided by the Saskatoon Police Service to develop algorithms to predict which young people might go missing in the province. The SPPAL project is an extension of the “Hub model” of policing, in which social services agencies and police share information about people believed to be “at risk” of criminal behavior or victimization. The SPPAL hopes to use algorithms to address “repeat and violent offenders, domestic violence, the opioid crisis, and individuals with mental illness who have come into conflict with the criminal justice system,” the report reads.
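The SPPAL has not published its models, but person-based risk tools of the kind the report describes typically reduce to scoring individuals against a list of recorded “risk factors.” The deliberately simplified sketch below is hypothetical; the factors, weights, and threshold are invented for illustration and are not drawn from the SPPAL’s work.

```python
# Illustrative sketch only -- the SPPAL's actual model is not public.
# Person-based risk tools assign a score from "risk factors" recorded
# and shared between agencies, then rank individuals for intervention.
RISK_WEIGHTS = {                      # hypothetical factors and weights
    "prior_missing_report": 3.0,
    "school_absenteeism":   1.5,
    "contact_with_police":  1.0,
}

def risk_score(person):
    """person: dict mapping factor name -> True/False from agency records."""
    return sum(w for factor, w in RISK_WEIGHTS.items() if person.get(factor))

# A person is flagged based purely on what agencies chose to record.
youth = {"prior_missing_report": True, "contact_with_police": True}
print(risk_score(youth))  # 4.0 -- above some intervention threshold?
```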
“We’ve learned that people in Canada are now facing surveillance in many aspects of their personal lives, in ways that we never would have associated with traditional policing practices,” said Kate Robertson, a criminal defense lawyer and one of the authors of the report, in a phone call with Motherboard.
“Individuals now face the prospect that when they’re walking or driving down the street, posting to social media, or chatting online, police surveillance in the form of systematic data monitoring and collection may be at work,” Robertson added.
The authors note that “historically disadvantaged communities” are at particular risk of being targeted for surveillance and analysis because the historical police data these technologies rely on reflects systemic bias.
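The concern has a simple mechanical form: if a model is trained on where police previously recorded incidents, and patrols then follow the model, the new data the patrols generate confirms the old. A toy illustration of that feedback loop (the neighborhood names and numbers are invented):

```python
# Toy feedback-loop illustration, not a model of any real deployment.
# Start from biased data: neighborhood_A was historically over-policed.
counts = {"neighborhood_A": 10, "neighborhood_B": 5}
for _ in range(5):
    patrolled = max(counts, key=counts.get)  # model sends patrols to the "hot" area
    counts[patrolled] += 1                   # more patrols -> more recorded incidents
print(counts)  # {'neighborhood_A': 15, 'neighborhood_B': 5} -- the gap only widens
```

Even if the true underlying rates were identical, the initial disparity compounds, and the model appears to validate its own data.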
In the report, Robertson and her co-authors lay out a series of policy recommendations that they say would mitigate some of the potential dangers of algorithmic policing practices. These include placing a moratorium on police agencies’ use of predictive policing algorithms that rely on historical crime data and updating oversight mechanisms related to the use of surveillance.
“Right now, we know far too little about what algorithmic technologies are being deployed and how they’re being used,” Robertson said. “It’s hard for accountability mechanisms to operate if the technology is being used in secret.”