Brazil’s Risky Bet on Tech to Fight Crime


Rio de Janeiro’s ultra-conservative governor, Wilson Witzel, was elected in 2018 on a tough-on-crime ticket. Like his erstwhile ally, President Jair Bolsonaro, who vowed to let criminals “die in the street like cockroaches,” Witzel promised to “slaughter” and “dig graves” for anyone getting in the way of his crime-fighting agenda.

Since Witzel took office urging the police to use more forceful tactics, overall crime is down, but civilian deaths have soared. Rio’s police reportedly killed 1,810 people in 2019, an 18 percent increase over 2018 — and the highest toll since records began more than 20 years ago.

One of Witzel’s campaign pledges was to ramp up the use of technology. Before his inauguration, the governor visited Israel to look at drones equipped with sniper rifles and promised to install 30,000 new cameras, though this surveillance program has yet to get off the ground. He did, however, dispatch 120 snipers in helicopters over the city’s favelas with orders to “aim for the head” of gun-carrying suspects, and recently revived his predecessors’ “pacifying police units” program, equipping officers with body cameras connected to facial recognition technology.

*This article was first published by the Igarapé Institute and has been translated, edited for clarity and reprinted with permission, but does not necessarily reflect the views of InSight Crime. See the original article here.

Witzel is not the only official experimenting with tech to lower crime. Back in the early 2000s, São Paulo led the way by digitizing crime stats and deploying mapping software, including InfoCrime and Detecta. These and other reforms are credited with a major drop in crime. In cities across Brazil, local authorities are erecting digital walls: cameras lining access roads in order to track license plates. At least five state police forces rolled out facial recognition pilots in 2019, linking them to the arrests of at least 151 people.

SEE ALSO: Death From Above: The Use of Police Helicopters in Rio de Janeiro

Brazil’s federal police is among the world’s first to deploy Israeli-manufactured military drones. Some state police forces are also developing crime prediction capabilities that combine machine learning with crime datasets to estimate the probability of future incidents. While controversial and vulnerable to bias, there is evidence that some predictive tools can improve the effectiveness and efficiency of policing. Pilots involving crime forecasting are already underway in Rio de Janeiro, Fortaleza and Santa Catarina.

Brazil is far from the only Latin American country jumping on the security-tech bandwagon. Across the region, governments, police and private companies are testing out different tools. Take the case of Buenos Aires, which is integrating facial recognition into its existing surveillance camera infrastructure. Other Argentine cities like Córdoba and Mendoza are following suit. In Lima, municipalities are quietly ramping up facial recognition. And in Mexico, Coahuila state recently announced its intention to add facial recognition capabilities to its newly acquired 1,100 surveillance cameras.

How clear is the picture?

Wherever adopted, facial recognition and crime prediction platforms are only as good as the underlying data and how the tools are managed. MIT researcher Joy Buolamwini has shown that facial recognition systems register error rates as high as 34.7 percent for darker-skinned women, compared to just 0.8 percent for lighter-skinned men. In a country like Brazil, where 64 percent of the prison population is black, there are real risks that recognition tools could go spectacularly wrong.

When it comes to crime prediction, one needs to account for the unreliability of existing data and for systematic reporting biases. In the US, for example, heavily policed low-income communities virtually always register “more” crime than the average. And in Brazil, there are often vast “no-go areas” suffering information blackouts, where police are simply not present at all.

All of these information asymmetries can fundamentally distort the predictions generated by crime forecasting tools. This can reinforce existing biases to the detriment of public safety, especially of the most vulnerable.
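This feedback loop can be made concrete with a toy simulation (an illustrative sketch only: the districts, numbers and detection rate are invented, and no real forecasting product works exactly this way). Two districts experience identical real crime, but one starts with more recorded crime; each round, patrols follow the “hottest” district on the map, and patrolled areas record incidents at a higher rate:

```python
def simulate_feedback(recorded, rounds=5, detect=0.5):
    """Each round, 100 real incidents occur in *both* districts.
    Patrols go to whichever district has the most recorded crime,
    and only the patrolled district records incidents at full rate;
    the other records them at the lower `detect` rate."""
    true_crime = 100.0
    for _ in range(rounds):
        hot = recorded.index(max(recorded))  # "prediction": patrol here
        recorded = [
            r + true_crime * (1.0 if i == hot else detect)
            for i, r in enumerate(recorded)
        ]
    return recorded

# District 0 starts with more *recorded* (not more actual) crime.
final = simulate_feedback([60.0, 40.0])
print(final)  # [560.0, 290.0] -- the initial gap of 20 has grown to 270
```

Although both districts suffer the same real crime throughout, the one that started with more recorded incidents ends up looking far more dangerous, which is precisely the distortion described above.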

Crime prediction and facial recognition tools can also be highly invasive and undermine civil liberties if poorly managed. Some tools are more problematic than others. Place-based prediction estimates the probable “risk” of events occurring at specific times and in particular geographic locations. People-based predictions hoover up public and personal data of individuals to infer the risk that they might perpetrate a crime or be a victim (or both).

SEE ALSO: Police Exercise License to Kill in Brazil’s Rio de Janeiro

Some of these systems rely on information of a highly personal nature. Few companies, however, are inclined to release information about how their tech works. Moreover, the high level of complexity of some algorithms means that even their developers often struggle to explain how they work.

Nevertheless, there are ways to limit biases and improve the accountability and transparency of security technologies. At a minimum, developers must test their algorithms for fairness and discrimination. This means statistically modeling a system’s direct and indirect effects to determine how it treats subgroups defined by race, income, gender, age or other attributes.
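As an illustration of what such a subgroup test might look like in practice, the sketch below computes false positive rates per demographic group on synthetic data (the groups, numbers and the `false_positive_rates` helper are invented for the example, not drawn from any real system):

```python
from collections import defaultdict

def false_positive_rates(records):
    """Compute the false positive rate per demographic group.

    records: list of (group, actual, predicted) tuples, where actual
    and predicted are booleans flagging a person as a match/suspect.
    """
    negatives = defaultdict(int)   # actual == False, per group
    false_pos = defaultdict(int)   # actual == False but predicted == True
    for group, actual, predicted in records:
        if not actual:
            negatives[group] += 1
            if predicted:
                false_pos[group] += 1
    return {g: false_pos[g] / negatives[g] for g in negatives}

# Synthetic data: 100 non-matching faces per group, with the
# hypothetical system mislabeling group B far more often.
data = [("A", False, i < 1) for i in range(100)] + \
       [("B", False, i < 20) for i in range(100)]

rates = false_positive_rates(data)
print(rates)  # {'A': 0.01, 'B': 0.2} -- a 20x disparity between groups
```

A disparity of this kind, measured before deployment, is exactly the sort of evidence a fairness audit is meant to surface.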

Companies can also introduce social impact strategies to clarify biases, explain how data is used, and provide more information to the public. This means ensuring algorithms are explainable; clarifying police responsibilities when it comes to the data collected; describing the accuracy of predictions, as well as their limitations; providing options to audit the algorithm on the ground; and explaining how the system ensures privacy, in line with existing laws and practices.

If designed and deployed with fidelity and care, new technologies can generate positive improvements in the effectiveness and efficiency of public security. A recent study by McKinsey determined that certain data-driven tools can help reduce fatalities by up to 10 percent, lower crime incidents by as much as 40 percent and dramatically reduce emergency response times.

SEE ALSO: Brazil News and Profile

This is not guaranteed simply by procuring and implementing facial recognition or crime prediction solutions. Clear direction on how these tools will be used, regular training, legal and ethical safeguards to limit misuse, and regular testing and evaluation are essential. Their rollout should also be accompanied by a public campaign explaining what these tools are, how they affect privacy, and what rights of recourse citizens have.

The costs and benefits of security tech need to be publicly debated. If communities in democratic societies reject a technology, as they have in places like San Francisco, it stands very little chance of succeeding. On the other hand, new technologies for public security can generate important savings if deployed intelligently. Crime-mapping platforms, CCTV cameras, smart lights and body cameras can reduce unproductive expenditure by law enforcement agencies, prosecutors, judges and penal authorities.

The point is that while public security technologies are being widely deployed in Brazil and across Latin America, they are far from a panacea and need to be carefully administered and evaluated. Deployed without oversight, they can produce counterproductive results. A key lesson is that if facial recognition and crime prediction are to be used, they should be accompanied by public debate. Safeguards to protect data and strategies to measure outcomes, developed with independent specialists, are not optional extras but fundamental to their success.

