Fluctuations in aggregate crime rates that run contrary to recent shifts in the age distribution of the U.S. population have cast doubt on the predictive power of the age–crime hypothesis. By examining a longer time horizon, back to the early 1930s, we show that the percentage of the young population is a robust predictor of the observed large swings in the U.S. murder rate over time. However, changes in the misery index (the sum of the inflation and unemployment rates) significantly contribute to explaining changes in the murder rate. This applies, in particular, to those changes that are at odds with the long-run trend of the U.S. age distribution, such as the decline in the murder rate in the latter part of the 1970s or its increase starting around the middle of the 1980s.

That is from this paper by Nunley, Seals, and Zietz, published in the latest issue of the Journal of Socio-Economics [the charts are taken from the working paper].
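Since the misery index enters the argument as a simple formula, here is a minimal sketch of the arithmetic. The function name and the sample rates below are hypothetical, chosen purely for illustration; they are not figures from the paper.

```python
# Minimal sketch of the misery index as defined in the abstract:
# misery index = inflation rate + unemployment rate (both in percent).
# The sample values below are hypothetical, not data from the paper.

def misery_index(inflation_rate: float, unemployment_rate: float) -> float:
    """Return the sum of the inflation and unemployment rates."""
    return inflation_rate + unemployment_rate

# Hypothetical example: 6% inflation and 7% unemployment give an index of 13.
print(misery_index(6.0, 7.0))  # 13.0
```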
It remains to be seen whether this theory actually applies to some Latin American countries (!?).