The Journal of Things We Like (Lots)

For those of us law professors who write about policing, sociologists have been a real boon.  From the pathbreaking work of scholars like Jerome Skolnick and Michael Brown to the more recent research of Monica Bell and Issa Kohler-Hausmann, these trained observers and interlocutors of human and institutional behavior, who usually obtain much of their knowledge by embedding themselves in the criminal system, have provided law professors with extremely valuable insights about how things actually work.  One of the newer sociologists to join this group is Sarah Brayne, an Associate Professor of Sociology at the University of Texas.

In her book Predict and Surveil, Professor Brayne paints a detailed portrait of how the Los Angeles Police Department (LAPD) uses big data and, in doing so, gives us a glimpse of what policing might look like in the future. The third-largest police department in the country, the LAPD has been at the forefront of the move toward predictive policing, the use of fusion centers, programs that compile detailed data about police-citizen interactions, and reliance on private companies to help make sense of all the information collected by police and other agencies. Over the course of five years, Professor Brayne’s research into these practices involved ride-alongs in patrol cars and helicopters, dialogues with all tiers of the LAPD’s hierarchy, interviews with people in federal agencies and technology firms, deep dives into LAPD archives, and trawling the data the department uses for its investigations. (Pp. 7-8.)

After summarizing the history of government surveillance efforts, the book zeroes in on the LAPD’s data-driven policing programs. During the period Professor Brayne conducted her research (2013 to 2018), that inquiry required figuring out the relationship between the LAPD and Palantir, a company founded in 2004 with partial support from the CIA that began its work in the intelligence business but soon expanded to providing platforms for compiling and analyzing law enforcement data. As Professor Brayne points out, until her work, “there was virtually no public research available on Palantir, and media portrayals [were] frustratingly vague.” (P. 37.)

Professor Brayne’s description of the interactions between Palantir and the LAPD focuses on what she calls “dragnet surveillance” and “directed surveillance.” On the dragnet side, Professor Brayne observed analysts using Palantir’s platform to carry out a wide range of functions: narrowing a search from 140 million records to just 13 using identifying information about a car; running an address to learn about all the criminal “events” that had occurred there; tracking down a person suspected of trafficking by obtaining his siblings’ addresses and then discovering police reports about strange goings-on at one of those residences; finding another suspect through nicknames and body art described in computerized police reports; and nabbing still another individual through an alert provided by an automated license plate reader (ALPR). (Chapter 3.) With respect to directed surveillance, she documents how the LAPD uses Palantir and other services to create programs like LASER (designed to identify “the violent” with “laser-like precision”), compile “Chronic Offender Bulletins,” and pinpoint “high crime areas.” (Chapter 4.)

Professor Brayne recounts that during her fieldwork, she saw many examples of how technological advances like ALPRs were “plainly useful.” (P. 51.) However, she also makes clear that measuring the efficacy of modern policing is a “complicated” process that requires evaluating all of its benefits and costs. (P. 23.) She points to several potential downsides of data-driven policing. Most importantly, she notes that once the LAPD decided to go the big data route, it became obsessed with obtaining data. Even the police themselves complained about “data greed” and the extent to which the new regime ignored their intuitive street knowledge or merely replicated it. (P. 89.) To get the data, officers on the streets were pressured to fill out Field Interview cards (FIs) for virtually every encounter, no matter how trivial. Thus, the names and addresses not only of those stopped, but of those who accompanied them, ended up in the LAPD’s databanks, accessible through Palantir. People received “points” for being stopped, which then formed the basis for subsequent stops, which then led to more points and more stops. Given the areas that police frequent in Los Angeles, the net-widening effect of these efforts hit communities of color particularly hard; Professor Brayne reports that half the people listed as “chronic” offenders were Hispanic/Latino, and another 30% were black. (P. 108.) She also documents that FIs were just one of dozens of criminal data sources that could suffer from such feedback loops.

To Professor Brayne, the dominance of Palantir was also problematic. Because of trade secret protection, the company’s inner workings were opaque even to the police department, much less to outsiders who wanted to monitor them; she notes that “not one person I spoke with could identify a single instance in which a Palantir use audit had been conducted.” (P. 103.) Partly as a result of this lack of transparency, many officers did not understand how to use the platform, nor did the LAPD have ultimate control of the types of information Palantir accessed or how it combined that information. In a real sense, employees at Palantir were making decisions that the police themselves used to make, and in ways that were even more mysterious to those affected.

The key warning Professor Brayne provides is that, while big data policing may appear more objective, it is “fundamentally social.” (P. 6.) As used by the LAPD, this mode of policing is just as dependent on human decision-making as traditional policing, at both the data collection and the data analysis stages. As a result, Professor Brayne suggests, “the unintended consequences of algorithmic systems may be a Trojan Horse: the algorithms posited as a gift to society actually smuggle in all sorts of biases, assumptions, and drivers of inequality.” (P. 6.) Of most relevance to legal academics, she argues that “the technological tools of police surveillance are far outpacing the laws that regulate them, resulting in a growing mismatch between law on the books and the law in action.” (Id.) To those of us who want to provide meaningful regulation of modern policing, Brayne’s book is a necessary starting place.

Cite as: Christopher Slobogin, The Sociology of Big Data Policing, JOTWELL (July 26, 2022) (reviewing Sarah Brayne, Predict and Surveil: Data, Discretion, and the Future of Policing (2021)).