
Crime forecasting is an emerging application area for forecast research. While there have been isolated papers in the literature, it is only recently that major interest, and with it dedicated research programs, has developed in the area. This interest has been fueled by the availability of electronic police records for analysis, the availability of geographic information system (GIS) software and street maps for spatial data processing and display, advances in criminology for model specification, and advances in police management that place the focus on performance measures. Andreas M. Olligschlaeger's Ph.D. dissertation was seminal in opening the field. The National Institute of Justice, and in particular its Mapping and Analysis for Public Safety Program, has funded research grants in the U.S., and the UK's Home Office has had an active research program in the area. This research brings to bear many advances from the forecasting literature, but it also addresses aspects unique to the crime forecasting problem, including the following.

  • Short-Term Crime Forecasting - requires forecasting space- and time-series data, such as monthly crime levels across uniform, square grid cells within a city. The grid cells need to be as small as possible, less than a mile on a side, in order to support targeting patrols and other police interventions. In this setting, it is critical to manage the small-area estimation problem; namely, to find ways to accurately estimate models from small and therefore noisy data aggregates. Data pooling across grid cells, in some form, is necessary to improve accuracy. My feeling, based on results from our current research, is that multivariate models estimated across all grid cells, instead of univariate models for each grid cell, are perhaps the best approach (see the first sketch after this list).

  • Multivariate Crime Forecasting - both short- and long-term, draws on the vast and fascinating criminology literature plus modeling approaches from the field of spatial econometrics. These literatures provide appealing theories for controlling for fixed effects of place (i.e., crime patterns depend on the nature of local populations and land uses, both of which do not change rapidly over time), for incorporating spatial interactions (e.g., using spatial and time lags to represent crime displacement to nearby areas caused by a crackdown on drug dealing), and for specifying leading indicators for use in short-term forecasting (a version of the "Broken Windows" theory suggests that "soft crimes" harden over time to become serious crimes); see the second sketch below.
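
The pooling idea in the first bullet can be illustrated with a small, self-contained sketch. The Python code below uses entirely synthetic data and a made-up covariate (a count of bars per cell); it is only a minimal example of why one model estimated across all grid cells can beat separate per-cell estimates when cell-level counts are small and noisy, not the model used in this research.

```python
# Minimal sketch (not the model from this research): why pooling data across
# grid cells helps with the small-area estimation problem. All numbers and
# variable names below are made-up assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_months = 400, 12        # e.g., a 20 x 20 grid, one year of history

# Hypothetical fixed-effect-of-place covariate (e.g., count of bars per cell).
bars = rng.integers(0, 5, size=n_cells)
true_rate = 0.5 + 0.8 * bars       # assumed true mean monthly crime count

# Monthly counts in small cells are low and therefore noisy.
counts = rng.poisson(true_rate, size=(n_months, n_cells))
cell_means = counts.mean(axis=0)

# Univariate, per-cell estimate: each cell relies only on its own short history.
per_cell_est = cell_means

# Pooled estimate: one regression across all 400 cells on the shared covariate,
# so each cell borrows strength from cells with similar characteristics.
X = np.column_stack([np.ones(n_cells), bars])
beta, *_ = np.linalg.lstsq(X, cell_means, rcond=None)
pooled_est = X @ beta

print("RMSE, per-cell means:", np.sqrt(np.mean((per_cell_est - true_rate) ** 2)))
print("RMSE, pooled model  :", np.sqrt(np.mean((pooled_est - true_rate) ** 2)))
```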

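In the same hedged spirit, the second sketch shows one way a leading-indicator specification with spatial lags might look on a square grid: serious crime in a cell this month regressed on last month's "soft crime" count in that cell and in its rook-contiguity neighbors. The weight matrix, lag structure, and coefficients are illustrative assumptions, not the specification developed in the research described above.

```python
# Minimal sketch (illustrative assumptions only): a leading-indicator model in
# which "soft crime" counts and their spatial lag at month t-1 help predict
# serious crime at month t. Weights, lags, and data are made up for the example.
import numpy as np

rng = np.random.default_rng(1)
side = 20
n_cells, n_months = side * side, 36

# Rook-contiguity weights on a square grid, row-normalized.
W = np.zeros((n_cells, n_cells))
for r in range(side):
    for c in range(side):
        i = r * side + c
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < side and 0 <= cc < side:
                W[i, rr * side + cc] = 1.0
W /= W.sum(axis=1, keepdims=True)

# Synthetic monthly panels in which soft crimes lead serious crimes by one month.
soft = rng.poisson(2.0, size=(n_months, n_cells)).astype(float)
prev = np.roll(soft, 1, axis=0)                 # soft crime at t-1
serious = 0.3 * prev + 0.2 * (prev @ W.T) + rng.poisson(0.5, size=(n_months, n_cells))
soft, serious = soft[1:], serious[1:]           # drop the wrapped-around first month

# Pooled regression across all cells and months:
# serious[t] ~ soft[t-1] + spatial lag of soft[t-1]
y = serious[1:].ravel()
x_own = soft[:-1].ravel()
x_lag = (soft[:-1] @ W.T).ravel()
X = np.column_stack([np.ones_like(x_own), x_own, x_lag])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, own-cell lead, spatial-lag lead:", np.round(beta, 2))
```

On the synthetic data the pooled regression approximately recovers the planted lead coefficients, which is the kind of signal a leading-indicator forecast would exploit.
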
Other issues and materials of interest on this site include:

Crime Data

A primer on standard police data is provided, as well as a discussion of variables and spatial data processing.

Police Management

NYPD Commissioner William Bratton is credited with starting CompStat (Computer Statistics or Comparative Statistics) in 1994. The NYPD credits CompStat and related policies with the more than 70% reduction in murders and the large decreases in other major crimes in New York City.

Tutorials (To be updated)


Software (To be added)

Crime Forecast Audit Tool (To be added)