6 February, 2019
In the 2002 sci-fi film Minority Report, a police pre-crimes unit uses technology to predict and stop crimes before they happen. Take Tom Cruise out of the equation and you’ve got the present-day Los Angeles Police Department.
Since 2011, the LAPD has used data software developed by Palantir to compile Chronic Offender Bulletins, which help police identify likely offenders to track. Persons of interest are given a score and ranked according to certain risk factors: five points for gang membership, for being on parole or probation, or for each violent crime arrest within a two-year span, and one point for each “quality” police interaction in the same time frame. The individuals with the highest scores are singled out for police surveillance, regardless of whether they are suspects in any particular crime. This project is known as Operation L.A.S.E.R.
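The arithmetic behind these bulletins is simple enough to sketch. The Python below is a hypothetical reconstruction based only on the point values reported above; the record fields, function names, and data structures are illustrative assumptions, not Palantir’s actual system.

from dataclasses import dataclass

@dataclass
class PersonRecord:
    # Illustrative fields only; not Palantir's actual schema.
    gang_member: bool
    on_parole_or_probation: bool
    violent_arrests_2yr: int       # violent crime arrests in the past two years
    quality_interactions_2yr: int  # "quality" police contacts in the same window

def chronic_offender_score(p: PersonRecord) -> int:
    # As reported: five points each for gang membership, parole/probation
    # status, and every violent crime arrest; one point per "quality"
    # police interaction.
    score = 0
    if p.gang_member:
        score += 5
    if p.on_parole_or_probation:
        score += 5
    score += 5 * p.violent_arrests_2yr
    score += 1 * p.quality_interactions_2yr
    return score

# The highest scorers are flagged for surveillance:
people = [PersonRecord(True, False, 1, 3), PersonRecord(False, True, 0, 7)]
watch_list = sorted(people, key=chronic_offender_score, reverse=True)

Even in this toy form, a feedback loop is visible: every “quality” police contact adds a point, so the more someone is watched, the higher their score climbs.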
People on the chronic offenders list have no way of knowing whether they are under police investigation. A February 2018 lawsuit filed by the Stop LAPD Spying Coalition seeking the release of this information is still pending.
L.A.S.E.R., or Los Angeles Strategic Extraction and Restoration, also targets “anchor points” – locations such as homes, businesses, or parks that police believe may serve as meeting points for criminal activity. People in these locations may be hit with evictions or nuisance abatement notices. Based on reports from local residents, the LAPD may also be tracking social networks within neighborhoods.
The project is part of a new trend in “SMART policing” – partnerships between private tech companies and local police departments that use databases and innovations like license plate readers to track crime. L.A.S.E.R. was first tested by the Newton division, whose patrol area encompasses the Fashion District, Pueblo Del Rio, and South Park. The station, in the area formerly known as South Central, sits on the outskirts of rapidly developing Downtown, and the area’s residents are mostly low-income people of color.
To many observers, this is no coincidence. L.A.S.E.R.’s reign in this neighborhood overlaps with a period of gentrification that has seen an increase in evictions and rental prices. And it may play a direct role, as police encourage landlords to remove residents of apartments labeled as anchor points. Such evictions may rise as Police Chief Charlie Beck extends L.A.S.E.R. to other communities.
Activists have charged that L.A.S.E.R. and similar technologies employed by the LAPD in the past decade criminalize people of color and make it harder to escape a punitive justice system.
When we talk about predictive policing, we talk about three of our country’s most sensitive trigger points: race, violence, and the future of technology. For many of us, the need to feel secure in our neighborhoods outweighs our desire for privacy. The encroachment on privacy feels less like something inevitable and more like something that has already happened. But there is still danger in permitting it in the name of the law.
It may be tempting to brush off tech-driven policing with the easy rebuttal: “There’s nothing to worry about if you’re not doing anything wrong.” Yet surveillance is a kind of accusation in and of itself. It implies that the one being watched is a threat. Even for the law-abiding, this causes lasting harm. Internally, we begin to police our own actions. We make certain choices, like forgoing the late-night grocery store run – after all, the license plate readers are on, and we don’t want to be in the wrong place at the wrong time. We stop calling our friend with the troubled past. Such things seem insignificant, but gradually our world becomes smaller and our sense of freedom more constricted.
It’s anti-democratic in the most basic sense. When it comes to technology, we seem to have lost the sense that we have any say over the direction of our communities. That we can say no.
The slogan “to protect and serve” is inverted. The police are protecting the neighborhood from its residents rather than serving the residents themselves. We know that busting people within a given neighborhood does nothing to address the root of the problem. As always, access to education and affordable housing is a good place to start.
None of this is to say that there is no role for companies like Palantir in modern policing. Certainly, we want perpetrators of violent crime to be apprehended before they can cause more harm. But in an email to The Appeal, Palantir spokeswoman Lisa Gordon showed her hand: she described the gathering of data used by the software as a “human-driven process.” Like all technologies, Palantir’s is made by human programmers and operated by people. Often those people carry race and class biases, and sometimes, perhaps, even malignant tendencies toward violence and grudge-holding. Technology isn’t neutral, and it will not counteract these toxic aspects of our law enforcement. We must reckon with them for what they are.