Predictive Policing
2013
Table of Contents
Foreword
Understanding Predictive Policing
    Introduction
    Origins of Crime Analysis
    The Role of Predictive Analytics in Crime Prevention
    Non-Predictive Uses of Crime Data
    Operational Challenges of Predictive Policing
Predicting Crime
    Theoretical Foundations of Data Mining and Predictive Analytics
    Data Used in Predictive Policing
    Predictive Methodologies
Places on the Frontier of Predictive Policing
    Santa Cruz, California
    Baltimore County, Maryland
    Richmond, Virginia
Implementation Recommendations
Acknowledgments
References
About the Author
Key Contact Information
Predictive Policing: Preventing Crime with Data and Analytics
IBM Center for The Business of Government
Foreword
On behalf of the IBM Center for The Business of Government, we are pleased to present this report, Predictive Policing: Preventing Crime with Data and Analytics, by Jennifer Bachner, Center for Advanced Governmental Studies, Johns Hopkins University. There is much discussion now in the worlds of technology and government about social network analysis, business analytics, dashboards, GIS visualization graphics, and the use of big data. But for non-technical observers, this discussion only begins to make sense once tools and technologies show visible results for the public. This report seeks to bridge that gap. Dr. Bachner tells compelling stories of how new policing approaches in communities are turning traditional police officers into data detectives. Police departments across the country now adapt techniques initially developed by retailers, such as Netflix and Walmart, to predict consumer behavior; these same techniques can help to predict criminal behavior. The report presents case studies of the experience of Santa Cruz, California; Baltimore County, Maryland; and Richmond, Virginia, in using predictive policing as a new and effective tool to combat crime. While this report focuses on the use of predictive techniques and tools for preventing crime in local communities, these methods can apply to other policy arenas as well; Dr. Bachner's work is consistent with a number of reports on this topic that the Center has recently released. Efforts to use predictive analytics can be seen in the Department of Housing and Urban Development's initiative to predict and prevent homelessness, and in the Federal Emergency Management Agency's initiative to identify communities vulnerable to natural disasters and mitigate those vulnerabilities.
These techniques are also being applied to reduce tax fraud and improve services in national parks, as described in other IBM Center reports which include the 2012 report prepared by the Partnership for Public Service, From Data to Decisions II: Building an Analytics Culture.
We hope that this report will be highly useful to the law enforcement community, as well as to government leaders generally, in understanding the potential of predictive analytics in improving performance in a wide range of arenas.
Daniel J. Chenok Executive Director IBM Center for The Business of Government chenokd@us.ibm.com
Gregory J. Greben Vice President Business Analytics & Optimization Practice Leader, IBM U.S. Public Sector greg.greben@us.ibm.com
The Baltimore County, Maryland, police department employs time and space analysis to interdict suspects in serial robberies. The Richmond, Virginia, police department has used social network analysis to cut off a suspect's resources and drive the suspect to turn himself in to the police. An assessment of these and other tools, techniques, and philosophies adopted by police departments provides valuable insight into the qualities of an effective predictive policing program. In the last section, the report offers seven recommendations for municipalities and law enforcement agencies that are considering investing time and resources in the establishment of a predictive policing program:

Recommendation One: Do it.
Recommendation Two: Treat predictive policing as an addition to, not a substitute for, traditional policing methods.
Recommendation Three: Avoid top-down implementation.
Recommendation Four: Keep the software accessible to officers on the beat.
Recommendation Five: Consider the geographic and demographic nature of the jurisdiction.
Recommendation Six: Collect accurate and timely data.
Recommendation Seven: Designate leaders committed to the use of analytics.

All of the practitioners interviewed for this report emphasize that predictive policing, while exceptionally useful, is only one aspect of crime analysis, which itself is a piece of a larger crime-fighting methodology. Today's police agencies have many avenues through which to prevent and solve crime, including both analytics and community interaction. Nevertheless, policing, like many other fields, is undoubtedly moving in a data-driven direction. And, as the amount of data increases and software becomes more accessible, this trend is likely to accelerate.
1. David Weisburd and Tom McEwan, Introduction: Crime Mapping and Crime Prevention, in David Weisburd and Tom McEwan, eds., Crime Mapping and Crime Prevention Studies (Monsey, N.Y.: Crime Prevention Studies, 1997). 2. Ibid. 3. Ibid.
calculate homicide rates.4 Additional measures, such as prison rates and arrest data, were collected by cities and states during the 1920s. In 1930, the Federal Bureau of Investigation (FBI) was given the authority to collect and disseminate crime data. The FBI continues to publish Crime in the United States annually, and this comprehensive publication served as the chief data input for crime analysis models in the latter half of the 20th century. With the advent of affordable computers, both police organizations and scholars began to explore automated crime mapping. Academic researchers investigated the relationship between environmental characteristics and the incidence of crime. Sociologists, for example, used mapping to uncover a quantifiable, causal relationship between the presence of taverns and the incidence of violent and property crimes.5 Police forces initially hoped crime mapping would serve as a means of improving the efficiency of resource allocation. The technical and personnel demands of mapping, however, prevented police departments from integrating this tool into everyday police work until recently. Today, the availability of massive data sets, data storage, sophisticated software, and personnel that can both perform analyses and communicate actionable recommendations to officers in the field has rendered crime analysis a central component of modern policing. Further, collaborative efforts between police officers, scholars, and businesses have led to the development of analytical techniques that have strong theoretical foundations; accompanying tools, such as software programs, enable their widespread use.
4. Paul Brantingham and Patricia Brantingham, Patterns in Crime (New York, N.Y.: Macmillan, 1984). 5. D. Roncek and M. Pravatiner, Additional Evidence That Taverns Enhance Nearby Crime, Sociology and Social Research 79:4 (1989): 185–188. 6. Christy Visher and David Weisburd, Identifying What Works: Recent Trends in Crime Prevention Strategies, Crime, Law and Social Change 28 (1998): 223–242. 7. Brandon Welsh and David Farrington, The Future of Crime Prevention: Developmental and Situational Strategies, National Institute of Justice (2010). 8. Bureau of Justice Assistance, Understanding Community Policing: A Framework for Action, U.S. Department of Justice (2004).
Based on research conducted in the 1970s that evaluated the effectiveness of policing methods, it was concluded that a paradigm shift in policing was needed. There was general agreement that police departments had become insular, arrogant, resistant to outside criticism, and feckless in responding to social ferment.9 In response, police began moving to community policing, which emphasized engagement with members of the local community for the purpose of creative problem-solving. Officers were encouraged to work with civilians to assess local needs and expectations, obtain information, and develop solutions. In addition, officers strove for visibility by patrolling on foot, bicycle, and horseback. With the emergence of big data and accessible analytical methods, several new approaches to police practice have appeared in recent years. One such approach, sometimes called intelligence-led policing, emphasizes expertise, efficiency, and scientifically proven tactics. Modern policing demands decision-making that is guided by evidence, in particular, large volumes of quantitative data. The reliance on statistics and automated mapping, termed CompStat, has been widespread since 1995, when it was first implemented by the New York City Police Department. This philosophy has since been adopted by nearly every law enforcement agency in the country. Under the original framework of CompStat, crime data are collected and analyzed, primarily using geographic information systems (GIS), to improve accountability and resource allocation. By mapping the distribution of criminal activity across low-level geographic units (e.g., city blocks and individual buildings), police can deploy officers to high-crime areas and track changes over time. Whereas traditional uses of CompStat are fundamentally reactive, the goal of predictive policing is proactive: to prevent crime from occurring in the first place.
Predictive policing is therefore a component of intelligence-led policing that is focused on what is likely to occur rather than what has already happened. It is the frontier of crime prevention, and the data and methods required for this approach have only recently been developed and employed.
9. David Alan Sklansky, The Persistent Pull of Police Professionalism, New Perspectives in Policing (March 2011): 5. 10. Ned Levine & Associates and the National Institute of Justice, CrimeStat III: A Spatial Statistics Program for the Analysis of Crime Incident Locations (2010).
incidents align with observable determinants of crime.11 The identification of the causes of criminal activity has implications for predictive policing; when police departments and city governments understand why crime occurs, they can take law enforcement and urban planning measures to prevent it from happening.
11. Ned Levine, Crime Mapping and the CrimeStat Program, Geographical Analysis 38 (2006): 41–56.
Figure 1: The Crime Analysis Process: Question Formulation → Data Preprocessing → Data Analysis → Actionable Recommendation → Officer Action → Evaluation of Action
The first step requires a police agency to formulate a question. The question can be tactical, such as predicting the likely locations of auto thefts during a shift; or strategic, such as forecasting personnel needs over the next 10 years. After the question has been formulated, the crime analyst must determine whether the organization has the data needed to answer it. In some cases, the analyst may need to acquire additional data to answer the question. In other instances, the question may need to be modified to accommodate the existing data or the data that can be acquired.
After the necessary data have been identified, the data must be processed to be ready for analysis. This may require cleaning, recoding variables, imputing missing data, validating data, and matching observations. Once the data have been processed, it is time for analysis. At this stage, the analyst employs one or more techniques, such as clustering or social network analysis.
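The preprocessing stage can be sketched as follows. All field names, codes, and recoding rules here are invented for illustration, not drawn from any particular agency's records:

```python
from datetime import datetime

# Hypothetical raw incident records, as they might arrive from a records
# management system: inconsistent type codes and a missing coordinate.
raw_incidents = [
    {"type": "BURG", "date": "2013-01-15", "lat": 36.97, "lon": -122.03},
    {"type": "burglary", "date": "2013-01-16", "lat": None, "lon": -122.02},
    {"type": "AUTO THEFT", "date": "2013-01-16", "lat": 36.98, "lon": -122.01},
]

# Recoding table: collapse inconsistent labels into canonical categories.
RECODE = {"BURG": "burglary", "burglary": "burglary", "AUTO THEFT": "auto_theft"}

def preprocess(records):
    clean = []
    for r in records:
        row = dict(r)
        row["type"] = RECODE[row["type"]]                          # recode variables
        row["date"] = datetime.strptime(row["date"], "%Y-%m-%d")   # validate dates
        if row["lat"] is None or row["lon"] is None:
            continue  # drop (a real pipeline might instead impute) incomplete rows
        clean.append(row)
    return clean

clean = preprocess(raw_incidents)
print(len(clean))        # 2 rows survive validation
print(clean[0]["type"])  # burglary
```

A real pipeline would also match observations across sources (e.g., joining incident records to geocoded address files), which is omitted here for brevity.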
The purpose of the analysis is to answer the research question and generate operationally relevant recommendations. The recommendations, when appropriate, are then communicated to officers and integrated into real-time decision-making. After action has been taken, evaluations are conducted to determine if the process resulted in a favorable outcome. The crime analysis process can flow in either a linear or iterative fashion. The recommendations that result from a data analysis, for example, might surprise the analyst and cause them to revise the data or methods used. In discussing the conclusions from an analysis with officers, the analyst may come to realize that the recommendations cannot be implemented and further analysis is needed to determine a feasible way to proceed. Regular, open, and informed communication between analysts and officers fuels the healthy, and often circular, functioning of the crime analysis process. The process described above is subject to a number of challenges that warrant special attention by agencies considering implementing a predictive policing program. Challenge One: Collecting and managing large volumes of accurate data. Data used by police agencies come from a host of sources, including state governments, the federal government, and private organizations. Assembling, storing, and protecting the security of these data require an investment of time and resources. Moreover, much of the raw information available to police agencies must be translated into useful data. Records relating to financial transactions, telephone calls, criminal incidents, and Internet use, for example, are often not collected and stored for the purpose of statistical analysis and must therefore be appropriately formatted.13 Challenge Two: Ensuring analysts possess sufficient domain knowledge. Crime analysis works best when analysts have training in both methodology and substance. 
One of the most frequent disruptions in the crime analysis process occurs between data analysis and the development of actionable recommendations. If the information provided by analysts fails to assist with actual police operations, the crime analysis process breaks down. To mitigate this problem, analysts should develop an understanding of crime, criminals, and police response.14 For civilian analysts in particular, this will likely require participation with officers in field police work. Challenge Three: Maintaining adequate analytical resources. To maintain an effective crime analysis unit, an agency must provide adequate training and software. Because of the evolving nature of analytical tools and methods, analysts need regular training opportunities. In addition, agencies should ensure that analysts have access to necessary software, including databases, statistical programs, and geographical information systems (GIS). Challenge Four: Fostering productive communication between analysts and officers. As Figure 1 depicts, the links between analysis, recommendations, and action are interactive. Analysts should modify analyses and recommendations following conversations with officers about the practicality and effectiveness of various courses of action. Further, regular conversations between officers and analysts promote mutual understanding of each other's work and enable all involved to ask better questions and better identify areas of concern. Challenge Five: Ensuring officer follow-up on recommendations. The best recommendations are useless without follow-up. Agencies should establish guidelines for how recommendations from crime analysts should be incorporated into officer decision-making. Officers should also be empowered to receive real-time information from analysts and act on this information.
13. Colleen McCue, Data Mining and Predictive Analytics in Public Safety and Security, IT Professional 8 (2006): 12–18. 14. Christopher Bruce, Closing the Gap between Analysis and Response, Police Chief Magazine (2008).
Predicting Crime
The transformative value of big data and analytics resides in their predictive capacities. This section focuses on the theoretical foundations of predictive policing and specific types of predictive methodologies.
15. Elizabeth R. Groff and Nancy G. La Vigne, Forecasting the Future of Predictive Crime Mapping, Crime Prevention Studies 13 (2002): 32.
The systematic, or predictable, component of crime can be discovered through either a top-down or bottom-up analytical approach. The top-down approach refers to methods in which the user specifies predictors of criminal activity. The analyst theorizes that certain factors, such as environmental characteristics, time of day, weather, and past incidence of crime influence the likelihood of future criminal activity. These factors are incorporated into a statistical model that generates predictions for particular geographical units. The bottom-up approach does not require a predetermined theory about the determinants of crime. Instead, an algorithm is used to search across a large amount of data to identify patterns. These patterns frequently take the form of geographical clusters of criminal incidents (hot spots) and diagrams of social networks. The chief benefit of a bottom-up approach is that analysts are able to uncover patterns they did not even realize existed.
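The bottom-up approach can be sketched as a simple grid count: no theory of crime is supplied, and dense cells simply emerge from the data. The coordinates and cell size below are invented for illustration:

```python
import math
from collections import Counter

# Hypothetical geocoded incidents (latitude, longitude).
incidents = [
    (36.9741, -122.0302), (36.9752, -122.0318), (36.9748, -122.0295),
    (36.9951, -122.0104), (36.9741, -122.0302),
]

CELL = 0.01  # grid-cell size in degrees: an arbitrary analyst choice

def grid_counts(points, cell=CELL):
    # Snap each incident to a grid cell and count incidents per cell.
    # No predictors are specified; concentrations are discovered, not theorized.
    return Counter(
        (math.floor(lat / cell), math.floor(lon / cell)) for lat, lon in points
    )

counts = grid_counts(incidents)
cell, n = counts.most_common(1)[0]
print(n)  # 3 incidents fall in the densest cell
```

A top-down analysis would instead start from a statistical model whose predictors (land use, time of day, weather, prior incidents) the analyst chooses in advance.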
16. Colleen McCue and Andre Parker, Connecting the Dots: Data Mining and Predictive Analytics in Law Enforcement and Intelligence Analysis, Police Chief Magazine (July 2012): 1.
Temporal Variables:
- Payday schedules
- Time of day
- Weekend vs. weekday
- Seasonal weather (e.g., hot versus cold weather)
- Weather disasters
- Moon phases
- Traffic patterns
- Sporting and entertainment events
to features such as interstate highways, bridges, and tunnels are more desirable crime locations than areas away from these key geographical connectors. Lastly, the third subcategory includes variables frequently correlated with criminals' residences, such as bars and liquor stores. Once again, analysts must be creative when considering which data may be appropriate for this category. Public health data, for example, often provide geocoded statistics on drug overdoses, which may serve as useful predictors. Temporal variables include variables that vary across time (as opposed to geographical space). Seasonal weather patterns, for example, have substantial predictive power for a variety of crimes, such as suicide and auto thefts.17 Two theories dominate the literature to explain why crime rates shift with seasons. The aggression theory states that high temperatures induce feelings of anger, frustration, and irritation, which lead to aggressive behavior. This theory has been linked to violent crime, as it is often an expression of human aggression. The social behavior theory focuses instead on the change in human activities associated with a change in seasons. According to this theory, warmer temperatures cause people to spend more time outside their homes. When people are away from their homes, they are more likely to be the victims of crime and their homes are more likely to be burglarized.
17. Sociologist Emile Durkheim, in Suicide (1951), observes that suicide is more likely to occur in warmer months than in colder months. Auto thefts have been shown to increase when weather leads people to leave their keys in their car, such as when people preheat their cars in the mornings during cold spells.
The social behavior theory, therefore, predicts an increase in both violent and property crime during warmer weather. Through rigorous tests of these two theories using crime and weather data from all 50 states, researchers have found strong support for the social behavior theory; seasonal variation in our routine activities is positively related to higher rates of violent and property crime.18 In addition, the aggression-violent crime relationship is evident in states that experience exceptionally high temperatures, such as Texas.19 Social network analysis variables describe the relationship between two individuals. When conducting a social network analysis, the researcher must specify which types of social interactions will define which individuals are included. Individuals can be linked through kinship, friendship, enmity, affiliation with an organization, or participation in a financial transaction or crime (offender/victim relationship). These interactions can be positive or negative. The decision about which interactions to include is driven by the research question. If analysts seek to predict the hideout of a suspected offender, they could diagram the individual's social resources, which might include positive ties to family members, friends, and business associates. Potentially, a separate table could be designed for each type of crime (burglaries, homicides, sexual assaults), as the occurrence of each is best predicted by a different set of variables. Whereas some crimes are most likely to occur in densely populated areas (pickpocketing), other crimes (sexual assault) are most likely to occur in secluded areas. The presence of a shopping mall might therefore be a strong predictor of pickpocketing but not of sexual assault. On the other hand, analysts may be hesitant to exclude variables when developing predictive models for a particular type of crime, as it is easy to overlook the relevance of a variable.
Returning to the shopping mall example, secluded areas such as stairwells and parking lots abound in shopping malls; therefore, excluding this variable from a predictive model of sexual assault crimes would be unwise. In short, one of the key benefits of predictive policing is that previously unknown or overlooked patterns emerge, and police departments can facilitate this process by marshaling as much potentially relevant data as can be handled by analysts and software.
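The social network variables described earlier can be represented as a simple graph and queried for an individual's social resources. All names, ties, and the tie taxonomy below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical ties: (person, person, tie type). Positive ties represent
# social resources (kinship, friendship, affiliation); enmity is negative.
ties = [
    ("suspect", "brother", "kinship"),
    ("suspect", "friend_a", "friendship"),
    ("suspect", "rival", "enmity"),
    ("friend_a", "friend_b", "friendship"),
]

POSITIVE = {"kinship", "friendship", "affiliation"}

def positive_degree(ties):
    # Degree centrality restricted to positive ties: a rough count of the
    # social resources each individual could draw on (e.g., for a hideout).
    degree = defaultdict(int)
    for a, b, kind in ties:
        if kind in POSITIVE:
            degree[a] += 1
            degree[b] += 1
    return dict(degree)

deg = positive_degree(ties)
print(deg["suspect"])  # 2 positive ties
```

Richer analyses (betweenness, community detection) follow the same pattern: define which interactions count as edges, then compute over the resulting graph.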
Predictive Methodologies
There are three categories of analysis techniques that police departments use to predict crime:

- Analysis of space
- Analysis of time and space
- Analysis of social networks

These categories are not intended to be all-inclusive, as the number of methodologies available to analysts is large and increasing. Instead, the following provides an overview of the different types of analysis commonly undertaken and the advantages and disadvantages of each.
18. John Hipp, Patrick Curran, and Kenneth Bollen, Crimes of Opportunity or Crimes of Emotion? Testing Two Explanations of Seasonal Change in Crime, Social Forces 82:4 (2004): 1333–1372. 19. Ibid.
associated with high-crime areas, such as transportation routes, entertainment establishments, and a high population density. In terms of predictive policing, hot spot detection can inform short-term decision-making about resource allocation and long-term policies related to crime reduction. It is important to keep in mind that a hot spot is a perceptual construct.20 Because geographical space is inherently continuous, the placement of a boundary to delineate a hot spot is somewhat arbitrary. The final location, size, and shape of a hot spot are influenced by judgments made by the analysts, such as:

- Which criminal incidents are included in the analysis
- Whether the hot spots are determined by the concentration of past criminal incidents, environmental characteristics associated with crime, or both
- The amount of time captured by the analysis (e.g., one year of crime data vs. five years of crime data)
- The weighting scheme applied to past criminal incidents

There are six different types of hot spot detection methods. Each is briefly described below. Point (or offense) locations. Point locations are specific addresses on a map that have experienced elevated levels of crime in the past and are therefore expected to experience crime in the future. To detect point locations, sometimes referred to as hot points, criminal incidents are temporally aggregated and plotted on a map. Heat maps or dot distribution maps (where dots are graduated in size according to the amount of criminal activity) are used to display the distribution of point locations. The use of point locations is based on the theory of repeat victimization, which states that individuals or places that have been victimized once have a much higher likelihood of being victimized again.21 The concern with using point locations in a predictive framework is twofold. First, the probability that an individual or location will be re-victimized decays at an exponential rate following the last incident.
This means that point locations quickly lose their predictive value over time. Second, we know that crime is more likely to occur near locations that have experienced crime in the past, a phenomenon not captured (directly) by mapping the density of crimes at specific locations. Hierarchical clusters. The purpose of hierarchical clustering is to group crime incidents into hot spots. To generate the clusters using a nearest-neighbor technique, incidents are first compared to one another and the distance between each pair of incidents is calculated.22 Two or more incidents are then grouped together if the distance between them is less than a threshold distance (defined a priori). This process creates first-order clusters. The process is repeated to create higher-order clusters until a criterion has been met, such as:

- all clusters contain a minimum number of incidents
- all lower-order clusters have converged into a single cluster
- the distance between all clusters exceeds the threshold distance23
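The nearest-neighbor grouping can be sketched as a single-linkage pass: points are merged whenever any pair falls under the threshold distance. The threshold and incident coordinates below are invented; production tools such as CrimeStat implement far richer versions:

```python
import math

# Hypothetical incident coordinates on a planar grid (e.g., meters).
incidents = [(0, 0), (1, 1), (2, 1), (50, 50), (51, 50)]
THRESHOLD = 5.0  # threshold distance, defined a priori

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def first_order_clusters(points, threshold=THRESHOLD):
    # Single-linkage grouping: merge clusters whenever any cross-cluster
    # pair of incidents is closer than the threshold, repeating until
    # no two clusters can be merged.
    clusters = [{p} for p in points]
    merged = True
    while merged:
        merged = False
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                if any(distance(p, q) < threshold
                       for p in clusters[i] for q in clusters[j]):
                    clusters[i] |= clusters[j]
                    del clusters[j]
                    merged = True
                    break
            if merged:
                break
    return clusters

clusters = first_order_clusters(incidents)
print(len(clusters))  # 2 first-order clusters
```

Changing the threshold illustrates the sensitivity discussed in the text: a threshold of 100 would merge everything into one cluster, while a threshold of 0.5 would leave every incident on its own.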
20. Ned Levine & Associates and the National Institute of Justice, CrimeStat III: A Spatial Statistics Program. 21. Graham Farrell and William Sousa, Repeat Victimization and Hot Spots: The Overlap and its Implications for Crime Control and Problem-Oriented Policing, Crime Prevention Studies 12 (2001): 221–240. 22. The researcher generates a dissimilarities matrix that contains the distance between each pair of incidents. There are multiple ways to calculate distance; a Euclidean or Manhattan metric is commonly used. 23. Tony H. Grubesic, On the Application of Fuzzy Clustering for Crime Hot Spot Detection, Journal of Quantitative Criminology 22 (2006): 77–105.
After clustering, the researcher determines how to display the clusters spatially. This involves selecting the number of clusters (e.g., first-order vs. second-order) to display and the method of delineating the clusters (e.g., ellipses, convex hulls). An ellipse is generated based on the distribution of the incidents in a cluster and, by definition, includes geographical space around each incident, which may comport best with the theory that future crime incidents will occur in the vicinity of past incidents. A convex hull, in contrast, is the smallest amount of geographical space that includes all incidents in a cluster. Imagine that incidents in a cluster were pins on a map and you placed a rubber band around the pins; the geographical space enclosed by the rubber band is the convex hull. A convex hull is denser (in terms of incidents) than an ellipse, but several of the incidents in the cluster will reside on the boundary line of the hot spot. One of the biggest downsides with hierarchical clustering is that the resulting map is highly sensitive to arbitrary choices made by the researcher, such as the threshold distance, number of clusters to map, and the shape to draw around clusters of incidents. Altering one of these choices can substantially alter the resulting map and thereby the decisions made using the hot spot map. Partitioned clusters. In contrast with hierarchical clustering, partitioning involves dividing incidents into clusters by optimizing a criterion over all possible partitions. The researcher selects the number of partitions (k) a priori, and each incident is assigned to one (and only one) cluster. One of the key challenges with partitioning is identifying the most appropriate number of clusters. Although there are several techniques for determining k, they often yield different numbers and no theoretical justification for choosing one over the other.
A second concern with partitioning is the constraint that all incidents must be assigned to a cluster and that no incident can be assigned to multiple clusters. In some instances, an incident might not reasonably belong in any cluster. In other instances, an incident might equally belong in two (or more) clusters.
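Partitioning can be sketched with a bare-bones k-means, one common partitioning technique. The value of k is fixed a priori and the coordinates are invented; note how every incident is forced into exactly one cluster, the constraint just discussed:

```python
import math

# Hypothetical incident coordinates on a planar grid.
incidents = [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0), (10.0, 10.0), (11.0, 10.0)]
K = 2  # number of partitions, chosen a priori by the analyst

def kmeans(points, k, iterations=20):
    centroids = list(points[:k])  # naive seeding; real tools use k-means++
    groups = [[] for _ in range(k)]
    for _ in range(iterations):
        # Assignment step: every incident joins exactly one cluster,
        # the one with the nearest centroid.
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            groups[nearest].append(p)
        # Update step: move each centroid to the mean of its group.
        centroids = [
            (sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
            if g else centroids[i]
            for i, g in enumerate(groups)
        ]
    return groups

groups = kmeans(incidents, K)
print(sorted(len(g) for g in groups))  # [2, 3]
```

An outlier incident far from both groups would still be swallowed by one of the two clusters, which is precisely the concern that motivates the fuzzy methods described next.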
24. Seattle Police Using Software to Predict Crime Locations, Seattle Times, February 27, 2013. 25. Office of the Mayor, Mayor McGinn Introduces New Predictive Policing Software, February 27, 2013.
Fuzzy clusters. This category of methods relaxes several of the assumptions required by traditional partitioning techniques. Although the researcher must define the number of clusters (k) a priori, incidents can be members of multiple clusters or no clusters. This reduces the bias associated with outliers and multi-cluster incidents. For each incident, a fuzzy clustering algorithm generates a probability that the incident is a member of each cluster. Each incident is thereby assigned k probabilities, which sum to one.26 Density mapping. Density mapping is a category of methods that involve statistically smoothing over individual crime incidents. A computer algorithm places a symmetrical distribution (known as a kernel function) over points, such as the locations of crime incidents. The individual distributions are then added together to create a single density distribution. The resulting map looks similar to a topographic map; areas with a higher elevation have a higher density of crime.27 As with the other clustering methods, the analyst must make an a priori decision that influences the results. With density estimation, the analyst selects a bandwidth for the kernel function. The bandwidth is the distance from the points, such as crime incidents, to the edge of the distribution. A larger bandwidth results in a smoother final density distribution, but may mask important variation. A smaller bandwidth preserves the variation in crime density, but may prevent the analyst from uncovering big-picture patterns.28 Risk-terrain modeling (RTM) clusters. This (comparatively) new category of clustering methods, also referred to as risk-based clusters, incorporates multiple variables into the detection of hot spots. Rather than relying solely on past crime data, risk terrain models incorporate geocoded data about numerous aspects of a location; this type of data is increasingly available in vast quantities.
RTM is the frontier of predictive policing, as it allows analysts to leverage decades of criminological, sociological, and psychological research:

[W]hile hotspot mapping has allowed police to address the concentration of crime, it has generally turned attention away from the social contexts in which crime occurs. Predictions about crime occurrence are then based on what happened before in locations rather than on the behavioral or physical characteristics of places within communities. This has detached crime analysis from early work done in criminology on the effects that different factors had on the social disorganization of communities, and, in turn, on crime.29

The theory of RTM is that each geographical location has a different propensity for crime based on its spatial environmental characteristics, including those listed in Table 1. Some of these characteristics, such as stadiums and shopping malls, are crime generators, as they create crime opportunities at specific points in time because of the large concentration of people that pass through. Other characteristics, such as a concentration of bars or liquor stores, are referred to as crime attractors, as they appeal to criminals looking for opportune locations.30 Both categories of characteristics, however, contribute to the risk level of a location.
26. Ibid.
27. Ned Levine & Associates and the National Institute of Justice, CrimeStat III: A Spatial Statistics Program.
28. Selecting a bandwidth necessarily involves a trade-off between bias and variance. A smaller bandwidth results in less bias but a larger variance. The reverse is true with a larger bandwidth.
29. Leslie Kennedy, Joel Caplan, and Eric Piza, "Risk Clusters, Hotspots, and Spatial Intelligence: Risk Terrain Modeling as an Algorithm for Police Resource Allocation Strategies," Journal of Quantitative Criminology 27 (2011): 339–362.
30. Patricia Brantingham and Paul Brantingham, "Criminality of Place," European Journal on Criminal Policy & Research 33 (1995): 5–26.
RTM requires the analyst to identify those characteristics that predict the risk of a certain type of crime, such as shootings, burglaries, or robberies. The process of identification is subjective, but should be informed by criminological theories, existing empirical research, and practitioner knowledge.31 After identification, each risk factor is operationalized and linked to a specific unit of geography. The risk factors are then incorporated, in layers, onto a composite map using GIS. Each location on the composite map is assigned a level of risk based on the included risk factors, where a higher risk level indicates a higher probability that the particular crime under analysis will occur in that area. Police departments can then allocate resources based on the distribution of risk in their jurisdictions.

Although academic research has demonstrated the efficacy of RTM, few police departments have adopted this approach to law enforcement. On-the-ground policing requires dynamic response, which can be better achieved through more user-friendly and easily accessible crime mapping programs (in which hot spots are detected using patterns of past crime events). At the time of this writing, RTM requires a substantial amount of time and analytical skill to develop risk layers and generate risk-composite maps. Until this analysis can be performed in (close to) real time and the results translated into actionable recommendations, police departments will be reluctant to embrace this method of crime prediction.

Just as with the other clustering methods, the final map is sensitive to analyst judgment. With RTM, the analyst must determine which risk layers to include in the composite map and which to exclude. If the risk distribution in the composite map is highly sensitive to changes in the risk layers, decision-making based on the map becomes exceedingly difficult.
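The layering arithmetic behind a composite map can be sketched in a few lines. Everything below (the grid, the risk factors, the equal weights) is invented for illustration; real RTM work operationalizes each factor from geocoded data and may weight layers empirically.

```python
import numpy as np

# Hypothetical 4x4 raster covering a jurisdiction. Each layer flags cells
# (1 = exposed, 0 = not) near one operationalized risk factor.
bars = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])

vacant_housing = np.array([[0, 1, 1, 0],
                           [0, 1, 1, 0],
                           [0, 0, 0, 0],
                           [0, 0, 0, 0]])

transit_stops = np.array([[0, 0, 0, 0],
                          [0, 1, 1, 0],
                          [0, 1, 1, 0],
                          [0, 0, 0, 0]])

# Equal-weight sum of the layers: a simple composite risk surface.
composite = bars + vacant_housing + transit_stops

# Cell with the highest combined risk (row, column).
highest_risk = np.unravel_index(np.argmax(composite), composite.shape)
```

Here only one cell is exposed to all three factors, so it tops the composite; swapping layers in and out shows how sensitive the surface is to the analyst's specification choices, the concern raised above.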
31. Joel Caplan, Leslie Kennedy, and Joel Miller, "Risk Terrain Modeling: Brokering Criminological Theory and GIS Methods for Crime Forecasting," Justice Quarterly 28:2 (2011): 360–381.
32. Ned Levine & Associates and the National Institute of Justice, CrimeStat III: A Spatial Statistics Program. Because, by definition, there are no incidents prior to the first or after the last, averages are calculated beginning with the second incident and ending with the penultimate incident.
subset of incidents. For each incident, the averages are calculated using the incidents that occurred just before and just after. A subset generally includes three, five, or seven incidents. The resulting map includes a line through the incidents, which marks the average path taken by the offender.33

To forecast when and where the next crime in a sequence will occur, an analyst can perform a correlated walk analysis (CWA).34 A CWA examines the temporal and spatial relationships between incidents in a given sequence to predict the next incident. The first step in performing a CWA is to determine if there is a systematic pattern in an observed sequence of criminal incidents. This is accomplished by computing the correlation between intervals. An interval is defined by the time, distance, and direction (or bearing) between two events. With one lag, the interval is defined as the time, distance, and direction between an incident at time t and an incident at time t-1. With two lags, an interval is defined as the time, distance, and direction between an incident at time t and one at time t-2. The same logic applies to lags of three, four, and so on.

Once the correlations between intervals for various lags (usually one to seven) have been calculated, the analyst can determine which lag for time, distance, and direction has the highest correlation. To generate a prediction, the analyst can calculate the average time, distance, and direction based on the appropriate interval length.35 For example, if the analyst determines that a lag of three exhibits the highest correlation with respect to time, the average time would be calculated using intervals with three lags. After the average time, distance, and direction are calculated, they are added to the last incident in the sequence; this is the predicted time and location of the next offense.
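The forecasting procedure can be sketched as follows. This is a simplified illustration that tracks x/y displacements in place of distance and bearing, on invented incident data; CrimeStat's CWA additionally offers mean, median, and regression versions of the final step.

```python
import numpy as np

def correlated_walk_forecast(events, max_lag=3):
    """Predict the next incident from lagged interval correlations.

    events: rows of (time, x, y), ordered by time. Displacements in x and y
    stand in for the distance-and-bearing intervals described in the text.
    """
    events = np.asarray(events, dtype=float)
    step = np.empty(3)
    for col in range(3):  # 0 = time, 1 = x, 2 = y
        scores = []
        for lag in range(1, max_lag + 1):
            iv = events[lag:, col] - events[:-lag, col]  # intervals at this lag
            if len(iv) < 3:
                scores.append(-np.inf)
                continue
            # Correlation between successive intervals: high values suggest
            # a systematic pattern at this lag.
            r = np.corrcoef(iv[:-1], iv[1:])[0, 1]
            scores.append(0.0 if np.isnan(r) else r)
        best_lag = int(np.argmax(scores)) + 1
        # Mean interval at the most systematic lag, scaled to a single step.
        step[col] = (events[best_lag:, col] - events[:-best_lag, col]).mean() / best_lag
    return events[-1] + step

# A hypothetical offender striking every two days, moving steadily northeast:
series = [[0, 0, 0], [2, 1, 1], [4, 2, 2], [6, 3, 3], [8, 4, 4]]
prediction = correlated_walk_forecast(series)  # → [10., 5., 5.]
```

On this perfectly regular series the forecast simply extends the walk; as the surrounding text notes, short or irregular sequences give far less conclusive results.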
Although a correlated walk analysis has a strong theoretical basis, it does not always produce conclusive results.36 There are many sequences of incidents in which a clear spatial and temporal pattern fails to emerge. Sequences with a small number of incidents or long timespans between incidents may fail to reveal an underlying pattern. And inaccurate data, such as the exclusion of incidents committed by the offender of interest or inclusion of events committed by another offender, will bias the prediction. In sum, while preliminary research suggests that CWA has potential as a tool for crime prevention, the method requires refinement before it can be regularly used by police departments.
33. This process is similar to fitting a loess curve to a scatterplot.
34. Ned Levine & Associates and the National Institute of Justice, CrimeStat III: A Spatial Statistics Program.
35. Alternatively, one could use the median or a regression to determine the values of time, distance, and direction to add to the last incident in the sequence.
36. Ned Levine & Associates and the National Institute of Justice, CrimeStat III: A Spatial Statistics Program, 36.
smugglers, buyers, and money-launderers. Further, criminal networks are embedded in the social context in which they operate; they are nourished by, and victimize, members of the community, including family, friends, and retailers. SNA is a tool police agencies can use to map these numerous interpersonal connections and mine them for actionable information.

The building blocks of a social network are relationships between two actors (either individuals or entities). Actors are referred to as nodes, and the relationships between them are termed links or edges. The relationship that connects nodes is defined by the research question; it can take the form of an exchange, communication, familial connection, affiliation with an organization, connection to a criminal act (either as an offender or victim), or something else. Once an analyst has identified a set of nodes for potential inclusion and defined the relationship of interest, they can generate a visual display of the social network and statistics to summarize the network. One statistic of interest may be a measure of network density. In a complete network, every node is connected to every other node. The density of a network decreases as the number of links decreases.

In crime-fighting applications, social network analysis is frequently used to identify central nodes: individuals who have a high level of connectivity within the network. There are multiple measures of centrality, including degree, closeness, and betweenness, which are calculated for each node.37 Analysts can then rank all nodes in the data set. Degree centrality is defined as the number of links possessed by a node, closeness centrality is the total distance (measured in links) from a node to all other nodes in the network, and betweenness centrality is the number of instances a given node appears in the shortest path between other nodes.
Measures of centrality indicate a node's level of connectedness (degree), ease of obtaining information from the network (closeness), and relevance to the passage of information within the network (betweenness).38 Using centrality measures, an analyst can identify individuals of interest in the context of a given problem. If a police agency seeks to acquire information about a network without dismantling it, contacting an actor with a high level of closeness might be effective. Alternatively, a goal of inserting information into a network might best be achieved using an actor with a high betweenness measure. If an agency's mission is to take custody of a network's leaders or central actors, the measure of degree may be most useful.

The Richmond Police Department (RPD), in conjunction with researchers at Virginia Commonwealth University, developed a successful pilot program to integrate social network analysis into its crime-solving approaches. (For a more detailed discussion of the Richmond Police Department, see page 29.) The program was initially used to understand why violence had erupted between two groups of males. Using the betweenness measure, analysts identified males who were central connectors to the various sub-networks embedded in the overall social network of the two groups. A visualization of the network revealed that these central males were connected, through aggravated assault, to key females in the network; corroborating work by police detectives confirmed that the victimization of female friends explained the outbreak of violence.39
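The density and centrality statistics described above can be computed with standard tooling. A sketch using the NetworkX library, on an invented six-actor network:

```python
import networkx as nx

# Invented network: actors A-F linked by observed contact or co-offending.
G = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"),
              ("C", "D"), ("D", "E"), ("E", "F")])

density = nx.density(G)                     # links present / links possible
degree = nx.degree_centrality(G)            # connectedness
closeness = nx.closeness_centrality(G)      # ease of reaching the network
betweenness = nx.betweenness_centrality(G)  # brokerage between sub-groups

# Rank actors by betweenness to find the connectors between sub-groups.
connectors = sorted(betweenness, key=betweenness.get, reverse=True)
```

In this toy network, C and D sit on most shortest paths and so top the betweenness ranking, while C (linked to A, B, and D) has the highest degree, the kind of distinction that matters when choosing between a broker and a leader as the point of intervention.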
37. Evelien Otte and Ronald Rousseau, "Social Network Analysis: A Powerful Strategy, Also for the Information Sciences," Journal of Information Science 28 (2002): 441–453.
38. Jennifer Johnson and John David Reitzel, "Social Network Analysis in an Operational Environment: Defining the Utility of a Network Approach for Crime Analysis Using the Richmond City Police Department as a Case Study," International Police Executive Symposium, Geneva Centre for the Democratic Control of Armed Forces (November 2011).
39. Ibid., 8–14.
In a predictive context, Richmond has solved cases through the identification of likely perpetrators. In a homicide case, analysts developed a social network visualization to determine potential points of entry for obtaining access to a critical witness. The social network analysis allowed the detective on the case to efficiently and effectively move his personnel resources to strategically navigate the suspect into the hands of the police.40 In another case, police used SNA to link a person of interest in several convenience store robberies to incidents in other jurisdictions. The social network analysis revealed a connection between 16 robberies previously unknown to be linked.41 More complete information greatly improves an agency's ability to make optimal tactical choices and solve crimes.

The effectiveness of SNA, as with all other forms of analysis, is partially dependent upon the decisions made by the analyst. When defining a network, the analyst makes a series of specification decisions:

• Which initial members (seeds) to include in the network
• Which types of relationships to include
• The number of steps removed from the seeds to include
• How to identify the population of possible nodes for inclusion

These decisions define the boundary of a social network. If a mapped network is too small or too large, the key actors or points of entry will not emerge. Specification decisions should be guided by the nature of the crime under investigation and general domain knowledge.

While social network analysis can be a powerful tool, and is used widely in the commercial sector for marketing and other initiatives, it needs to be used appropriately, and within legal constraints, when applied to law enforcement.
Appropriate safeguards and procedures will need to be put in place to assure the public that such analyses are not misused, either to undermine the privacy of individuals who are not under suspicion or to undercut the due process rights of individuals who are under surveillance. A February 2013 report for the U.S. Department of Justice's Global Justice Information Sharing Initiative recognized the importance of addressing privacy across all forms of analysis impacting social networks conducted by law enforcement. This report, Developing a Policy on the Use of Social Media in Intelligence and Investigative Activities, draws lessons from the practices of a variety of local police authorities and makes a series of recommendations for how to protect privacy in investigative settings.
42. George Mohler (assistant professor of mathematics and computer science at Santa Clara University) and P. Jeffrey Brantingham (professor of anthropology at UCLA) were the chief architects of the algorithms and software used in the SCPD's predictive policing program. More information about the software, PredPol, can be found at www.predpol.com.
It is critical that SCPD find efficient ways to reduce crime, as its current staff level is 20 percent lower than in 2000. Further, the department is not expected to increase the size of its staff in the foreseeable future. As a result, the department must take steps to ensure its officers are each achieving the most benefit possible. The software itself is affordable and requires minimal training. Further, predictive methods supplement experience, thereby standardizing the talent level in a police department between seasoned officers and novices. By simply being in the right place at the right time, as dictated by a hot-spot map, novice officers can make a valuable contribution to reducing crime.

The department currently assesses changes in crime rates to determine whether or not the program is working. Preliminary evidence indicates that the program has been successful, particularly with respect to burglaries. A comparison of burglaries in July 2011 (when the program was first implemented) to July 2010 indicates a 27 percent decline (down to 51 from 70). Aggregating over the six months prior to implementation (January 2011 to June 2011) and comparing this figure to the number of burglaries in the same time period in 2012 (January 2012 to June 2012) reveals a 14 percent decline (down to 263 from 305). It is not surprising that SCPD has experienced the most success with preventing burglaries, as this type of crime lends itself to prediction. Potential burglars carefully design their plan of attack, often taking into consideration the environmental characteristics of the geographical area.

In contrast to Santa Cruz, other departments instead measure success using arrest rates. The concern with this measure is that predictive policing is intended to reduce the incidence of crime through deterrence. When potential criminals see police officers monitoring an area, they are less inclined to commit an offense.
It is, of course, quite difficult to measure deterrence, as we cannot calculate how many crimes would have occurred if not for the increased police presence.43
43. For a detailed explanation about measuring unobserved crimes, see John Whitley, Five Methods for Measuring Unobserved Events: A Case Study of Federal Law Enforcement, IBM Center for The Business of Government (2012).
44. Ned Levine & Associates and the National Institute of Justice, CrimeStat III: A Spatial Statistics Program.
In addition to identifying this street of interest, BCPD generated a surface map based on the apprehension locations of previous robbers. The contour lines on the surface map indicated the probability of criminal apprehension given data from previous cases. The area of highest probability on the surface map included the street BCPD had identified using the CMD analysis. Based on this information, police conducted a stakeout of the street and apprehended the offender.

In a similar case, BCPD responded to a series of grocery store robberies. The department plotted the crimes on a map and identified the directional relationship between each crime (in the order in which they were committed). Moving from the first to the second crime, for example, takes one southeast, and moving from the second to the third takes one east. The analysis yielded a pattern of directions. Analysts then identified the location of the most recent robbery, investigated whether any grocery stores were located in the direction predicted by the pattern, and indeed identified one. Officers staked out the store and intercepted the offender.

The effectiveness of the predictive methods used by BCPD and other police agencies relies on the quality and timeliness of the available data. Inaccuracies and delays in data entry hinder predictive analysis. Further, BCPD believes in data-sharing, as crime prevention often requires a regional perspective. Robbers, burglars, and other criminals frequently offend in multiple jurisdictions, and pattern recognition greatly improves as information completeness increases. BCPD thus relies on several databases:

Digital Information Gateway (DIG) is a customizable repository of data from federal, state, and local agencies. Information related to, for example, felons' changes of address, gun seizures, sex offenders, and traffic stops is stored in DIG and accessible by analysts for use in data mining and analysis applications.
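The directional analysis described above amounts to computing the compass bearing between consecutive incidents. A minimal sketch, with invented coordinates:

```python
import math

def bearing(p, q):
    """Compass bearing in degrees from p to q (x increases east, y north)."""
    return math.degrees(math.atan2(q[0] - p[0], q[1] - p[1])) % 360

# Invented robbery locations, in the order committed.
robberies = [(0, 0), (3, -3), (6, -3)]
bearings = [bearing(robberies[i], robberies[i + 1])
            for i in range(len(robberies) - 1)]
# ≈ [135°, 90°]: southeast, then east. Analysts would then look for grocery
# stores along the bearing the pattern implies from the latest incident.
```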
The Law Enforcement and Information Exchange (LInX) is an incident reporting system that facilitates information-sharing across law enforcement agencies. This program enables analysts to track patterns and offenders over multiple jurisdictions, which can be crucial for crime solving. The program is currently used in 11 regions and by over 800 police agencies.45 The BCPD relies on LInX for incident information and is working on providing on-the-beat access to the program to all officers.

Predictive analytics have not only enhanced BCPD's ability to solve crimes, they have also improved the management of the agency. Using predictive models, the agency is able to forecast its service, and therefore resource, demands. This allows the agency to predict needs such as the hiring of officers and the purchasing of patrol cars. These predictions can be made fairly far in advance (around 18 months) and have proven to be quite accurate. With its history of generating accurate predictions with respect to service demands, the agency has enjoyed a strong negotiating position in budget discussions with the county executive; the agency uses predictive analytics to support its budget requests. In response to its predictive policing initiatives, the agency has received several large grants from the Department of Justice's Office of Community Oriented Policing Services (COPS), in large part because of its credible projections regarding service demands and how the tactical deployment of officers would decrease crime rates. This financial support has further allowed BCPD to develop its crime analysis program and remain on the frontier of predictive policing.
45. Northrop Grumman, Law Enforcement and Information Exchange, accessed January 2, 2013, http://www.northropgrumman.com/Capabilities/PublicSafety/Documents/LInX/LInX_brochure.pdf.
Richmond, Virginia
The Richmond Police Department (RPD) has a variety of analytical tools at its disposal. The department relies heavily on ATAC Workstation, software that allows analysts to use mapping and pattern recognition to forecast criminal activity. To generate forecasts, the software detects series or patterns among potentially related incidents. The data analyzed come from the department's records management system. These data might include victimization descriptions, times of day the crimes were committed, days of the week the crimes were committed, and known offenders. Potential next targets are identified, which aids in tactical decision-making.

Richmond also regularly uses density maps to visualize criminal activity and guide resource deployment. Deployment maps are issued to officers twice weekly, once on Monday mornings and once on weekends. In addition, deployment maps are used during big events, such as the city's Fourth of July and New Year's celebrations. These maps indicate, down to a four-block radius, areas that warrant a heightened police presence.

Analysts for RPD receive extensive training. They often participate in week-long training courses run by the Alpha Group Center for Crime and Intelligence Analysis Training, workshops run by the International Association of Crime Analysts, and on-site training at RPD. New analysts work closely with the supervisor of the crime analysis unit for the first month or so of their employment. More generally, analysts are expected to develop specialty areas as well as facility with a range of tools. Analysts must frequently tackle an analytical problem through several different avenues before they uncover actionable results.

Numerous success stories have resulted from RPD's embrace of the philosophy that crime can be solved and prevented through quantitative analysis. In a string of convenience store shotgun robberies, RPD used tactical forecasting to determine where to deploy officers.
Analysts identified several possible target locations based on the characteristics of the previous crimes that were thought to be part of the series. Officers were then stationed at the target locations, and within the predicted hour, they apprehended three suspects.

To solve a string of parking lot attendant robberies, analysts used analytics to identify commonalities: all of the incidents occurred in parking lots that were located in the downtown area, surrounded by buildings (as opposed to streets), and used during big events (as this was when the attendants were flush with cash). Further, the analysts observed that certain lots were likely to be targeted on weekends and others on weekdays, and that the offender was moving in a counterclockwise direction in selecting his targets. Using this information, police established surveillance at target locations and apprehended the suspect as he was exiting the scene of his most recent crime.

Richmond has also enjoyed success through the use of social network analysis. After a month of searching for a homicide suspect on the run, analysts constructed a network that included the suspect's family and friends, and identified key nodes. Officers then approached the central members of the network and notified them that the suspect was wanted by police and that they should inform police if the suspect made contact. The purpose of these meetings was to isolate the suspect by obstructing his social resources. With no safe havens, the suspect turned himself over to police within hours.

When evaluating the overall success of its crime analysis program, including its predictive policing efforts, RPD examines movements in the crime rates. The right comparisons, however, are not always immediately evident. It might seem intuitive, for example, to judge whether the burglary rate has declined by comparing the June 2013 rate to the June 2012 rate.
Burglary rates, however, are affected by variables that change year to year, such as weather and economic conditions. In some cases, it might be more reasonable to compare
adjoining time periods, such as two 30-day windows. Alternatively, a comparison of rates in neighboring areas might also be useful.

One of the unique features of RPD is the organizational structure of the crime analysis unit. Police agencies typically adopt either a centralized or decentralized organizational structure, with the majority preferring the former. In a centralized system, all analysts report to a crime analysis supervisor and work together in the same location. This encourages intellectual exchange and camaraderie among the analysts. In a decentralized system, analysts are assigned to precincts and report to precinct captains; this ensures that analysts develop deep familiarity with on-the-beat police work and that communication between analysts and officers is frequent.

Richmond has adopted a blend of the centralized and decentralized structures. All analysts report to a crime analysis supervisor, but they spend the majority of their time working in their assigned precincts. Twice per week, analysts meet together to discuss emerging issues and tackle analytical challenges. Each analyst has two workstations: one in their assigned precinct and one in the crime analysis unit. The blended structure promotes both cross-training in analytical methods among analysts and domain knowledge acquired through participation in day-to-day police work. Renee Tate, crime analysis supervisor for RPD, credits this organizational structure of blended centralization and decentralization with enabling much of the unit's success.
Implementation Recommendations
The analysis presented in this report and interviews with crime analysts at police departments across the United States yield the following set of recommendations for cities and counties that are considering a predictive policing program.

Recommendation One: Do it. This exhortation is emphasized by crime analysts in multiple departments. A cost-benefit analysis of the program strongly supports its use. The software is relatively inexpensive, and use of the program has been shown to reduce crime, even when the number of officers in the department remains constant. In these times of financial uncertainty, police departments are being asked by county and city governments to do more with less. The use of a predictive policing program is an effective way to achieve this goal. Although the establishment of a program requires modest startup costs, cost savings are likely to accrue over time from the improved allocation of resources.

Recommendation Two: Treat predictive policing as an addition to, not a substitute for, traditional policing methods. Predictive analytics are best used as complements to officers' judgment and intuition. They provide additional, not substitute, information. Officers are most empowered when they have many tools at their disposal and are trusted to use each as they see fit. When the Santa Cruz Police Department adopted a predictive policing program, the new methods were presented as an additional tool officers could use to reduce crime rates. Officers were encouraged to treat the new hot spot maps as one of the many pieces of information that factor into their on-the-beat decision-making. The maps were not intended to replace other means of crime prevention, including community interaction. Offering the new tools as a complement to existing resources reduced friction with existing policies and allowed time for successful adoption.

Recommendation Three: Avoid top-down implementation.
Departments with effective predictive policing programs stress that buy-in increases when officers' experience and judgment are respected. When officers are trained in the purposes and uses of the new tools and methods, and when these tools and methods are presented as additions to officers' existing capabilities, officers are more likely to embrace the change. The Santa Cruz Police Department observed that officers were most likely to incorporate predictive methods into their decision-making when they were motivated to do so by their peers rather than their supervisors. Not long after initial implementation, several officers using the new tools achieved noticeable decreases in crime within their jurisdictions. Other officers observed this success and, wanting to achieve similar results, began to use the new methods as well.
Recommendation Four: Keep the software accessible to officers on the beat. One of the biggest advances in predictive analytics is the ability to leverage real-time data. This ability is most useful, however, when officers can access output that uses real-time data, such as maps, while out in their cars and on foot. The Baltimore County Police Department is in the process of giving officers greater access to predictive software while they are out on their beats. Several software programs already offer in-car and mobile device capabilities, and departments are working toward implementing them.

Recommendation Five: Consider the geographic and demographic nature of the jurisdiction. Predictive policing can be particularly useful for informing resource deployment in jurisdictions that cover a large amount of geographical space and a dispersed population. Through the detection of hot spots, departments can focus their resources on geographically manageable areas. In areas with high population densities, such as New York City, a police beat covers a relatively small amount of geographic space, such as a few city blocks. In this type of jurisdiction, daily hot spot maps add less value, but time-of-day predictions and social network analyses are likely to be of greater value. Officers whose beats cover comparatively large geographical spaces, in contrast, rely on hot spot maps to make knowledgeable choices about which areas warrant the most attention when only a limited number of locations can be visited in a single day.

Recommendation Six: Collect accurate and timely data. The accuracy of predictions is driven by the completeness of input information. As data-sharing and data-entry software improve and become more accessible, the output generated by predictive analyses will become even more effective and accurate. Police departments are currently investigating new ways to reduce the entry time of data related to criminal incidents.
When backlogs of data that serve as critical inputs build up, the predictions that rely on those data suffer. To address this issue, the Baltimore County Police Department is moving toward a data-entry system that is available to officers in their cars. Further, Baltimore County, along with many other police departments, is making greater use of data-sharing programs, such as the Law Enforcement Information Exchange and Digital Information Gateway. These and similar programs allow departments to access data from other local, state, and federal entities.

Recommendation Seven: Designate leaders committed to the use of analytics. Predictive policing programs require leaders with expertise in analytical methodologies, police work, and communication. Leaders must be able to oversee analytical work, collaborate with officers on analyses, and communicate findings to officers, oversight bodies, and the public. The Richmond Police Department attributes the success of its predictive policing program largely to the leadership and management structure of its crime analysis unit. Renee Tate, crime analysis supervisor for Richmond, has designed a unique hybrid organizational structure that ensures that crime analysts regularly communicate with each other as well as with officers. In the Richmond system, analysts split their on-the-job time between the crime analysis unit and their assigned precinct.

Departments that have implemented predictive policing programs strongly believe in their utility for preventing crime, solving crime, and expending resources more efficiently. As the technology and training become more accessible, more police departments will adopt these programs. Through these data-driven systems, police departments are pioneers in a broader societal shift that is leveraging big data to make better decisions.
Acknowledgments
The author is grateful to the IBM Center for The Business of Government for supporting this work. In addition, the author expresses sincere thanks to those who provided valuable information and insight that informed this work, including Philip Canter (Assistant Professor at Towson University, former Chief Statistician with the Baltimore County Police Department), Brian Cummings (Planning Operations Manager, Richmond Police Department), Zach Friend (Crime Analyst with the Santa Cruz Police Department), Philip McGuire (Assistant Commissioner, Crime Analysis and Program Planning Section, New York City Police Department), Renee Tate (Crime Analysis Supervisor, Richmond Police Department) and Major Mark Warren (Crime Information and Analysis Division, Baltimore County Police Department).
Predictive Policing: Preventing Crime with Data and Analytics — IBM Center for The Business of Government
References
Beattie, John. "The Pattern of Crime in England: 1660–1800." The Past and Present Society, no. 62 (1974): 47–95.

Beck, Charlie. "The Los Angeles Predictive Policing Experiment." 2012. http://www.policyexchange.org.uk/images/pdfs/predictive%20policing%20slides.pdf (accessed January 2, 2013).

Boba, Rachel. Crime Analysis and Crime Mapping. Thousand Oaks, CA: Sage, 2005.

Brantingham, Patricia, and Paul Brantingham. "Criminality of Place." European Journal on Criminal Policy & Research 3, no. 3 (1995): 5–26.

Brantingham, Paul, and Patricia Brantingham. Patterns in Crime. New York, NY: Macmillan, 1984.

Bruce, Christopher W. "Closing the Gap between Analysis and Response." Police Chief Magazine, September 2008.

Bureau of Justice Assistance. Understanding Community Policing: A Framework for Action. U.S. Department of Justice, 1994.

Caplan, Joel, Leslie Kennedy, and Joel Miller. "Risk Terrain Modeling: Brokering Criminological Theory and GIS Methods for Crime Forecasting." Justice Quarterly 28, no. 2 (2011): 360–381.

Farrell, Graham, and William Sousa. "Repeat Victimization and Hot Spots: The Overlap and Its Implications for Crime Control and Problem-Oriented Policing." Crime Prevention Studies, 2001: 221–240.

Groff, Elizabeth R., and Nancy G. La Vigne. "Forecasting the Future of Predictive Crime Mapping." Crime Prevention Studies 13 (2002): 29–57.

Grubesic, Tony H., and Alan T. Murray. "Detecting Hot Spots Using Cluster Analysis and GIS." Proceedings from the Fifth Annual International Crime Mapping Research Conference. Dallas, TX, 2001.

Hanawalt, Barbara. Crime and Conflict in English Communities: 1300–1348. Cambridge, MA: Harvard University Press, 1979.

Hipp, John, Patrick Curran, and Kenneth Bollen. "Crimes of Opportunity or Crimes of Emotion? Testing Two Explanations of Seasonal Change in Crime." Social Forces 82, no. 4 (2004): 1333–1372.
Johnson, Jennifer, and John David Reitzel. "Social Network Analysis in an Operational Environment: Defining the Utility of a Network Approach for Crime Analysis Using the Richmond City Police Department as a Case Study." International Police Executive Symposium, 2011.

Kennedy, Leslie, Joel Caplan, and Eric Piza. "Risk Clusters, Hotspots, and Spatial Intelligence: Risk Terrain Modeling as an Algorithm for Police Resource Allocation Strategies." Journal of Quantitative Criminology 27 (2011): 339–362.

Levine, Ned. "Crime Mapping and the CrimeStat Program." Geographical Analysis 38 (2006): 41–56.

McCue, Colleen. "Data Mining and Predictive Analytics in Safety and Security." IT Professional Magazine, July/August 2006: 12–16.

McCue, Colleen, and Andre Parker. "Connecting the Dots: Data Mining and Predictive Analytics in Law Enforcement and Intelligence Analysis." Police Chief Magazine, July 2012.

Ned Levine & Associates and the National Institute of Justice. CrimeStat III: A Spatial Statistics Program for the Analysis of Crime Incident Locations. 2010.

Otte, Evelien, and Ronald Rousseau. "Social Network Analysis: A Powerful Strategy, Also for the Information Sciences." Journal of Information Science 28 (2002): 441–453.

Roncek, D., and M. Pravatiner. "Additional Evidence That Taverns Enhance Nearby Crime." Sociology and Social Research 79, no. 4 (1989): 185–188.

Sklansky, David Alan. "The Persistent Pull of Police Professionalism." New Perspectives in Policing, March 2011.

Uchida, Craig D. Predictive Policing in Los Angeles: Planning and Development. Justice and Security Strategies, Inc., 2009.

Visher, Christy, and David Weisburd. "Identifying What Works: Recent Trends in Crime Prevention Strategies." Crime, Law and Social Change, 1998: 223–242.

Weisburd, David, and Tom McEwan. "Introduction: Crime Mapping and Crime Prevention." In Crime Mapping and Crime Prevention Studies, edited by David Weisburd and Tom McEwan. Monsey, NY: Crime Prevention Studies, 1997.

Welsh, Brandon, and David Farrington. The Future of Crime Prevention: Developmental and Situational Strategies. National Institute of Justice, 2010.
Reports from the IBM Center for The Business of Government
For a full listing of IBM Center publications, visit the Center's website at www.businessofgovernment.org.
Recent reports available on the website include:

Assessing the Recovery Act
Recovery Act Transparency: Learning from States' Experience by Francisca M. Rojas
Key Actions That Contribute to Successful Program Implementation: Lessons from the Recovery Act by Richard Callahan, Sandra O. Archibald, Kay A. Sterner, and H. Brinton Milward
Managing Recovery: An Insider's View by G. Edward DeSeve
Virginia's Implementation of the American Recovery and Reinvestment Act: Forging a New Intergovernmental Partnership by Anne Khademian and Sang Choi
Improving Performance
The New Federal Performance System: Implementing the GPRA Modernization Act by Donald Moynihan
The Costs of Budget Uncertainty: Analyzing the Impact of Late Appropriations by Philip G. Joyce
Five Methods for Measuring Unobserved Events: A Case Study of Federal Law Enforcement by John Whitley
Forging Governmental Change: Lessons from Transformations Led by Robert Gates of DOD and Francis Collins of NIH by W. Henry Lambright
Strengthening Cybersecurity
A Best Practices Guide for Mitigating Risk in the Use of Social Media by Alan Oxley
Using Technology
Mitigating Risks in the Application of Cloud Computing in Law Enforcement by Paul Wormeli
Challenge.gov: Using Competitions and Awards to Spur Innovation by Kevin C. Desouza
Working the Network: A Manager's Guide for Using Twitter in Government by Ines Mergel
For more information:
Daniel J. Chenok
Executive Director
IBM Center for The Business of Government
600 14th Street NW, Second Floor
Washington, DC 20005
202-551-9342
website: www.businessofgovernment.org
e-mail: businessofgovernment@us.ibm.com