Jeff Asher is a crime analyst based in New Orleans and co-founder of AH Datalytics. He is the author of a Substack covering crime data and analysis.
A family member recently sent me to a website ranking the “2023 Top 100 Safest Cities in the U.S.” and asked my thoughts on it. My advice any time one of these rankings pops up is consistent: ignore it.
The rankings from NeighborhoodScout purport to show “the 100 safest cities in America with 25,000 or more people, based on the total number of crimes per 1,000 residents.” That description seems straightforward enough, but looking under the hood reveals many problems that help explain why rankings like these should be avoided.
The methodology is flawed at best, deceptive at worst
NeighborhoodScout describes its methodology on its main page, saying “Data used for this research are 1) the number of total crimes reported to the FBI to have occurred in each city, and 2) the population of each city. Based on the latest national data available at the time of publication, representing calendar year 2021 and released in October 2022, this report reveals interesting patterns about safety from crime in America.”
This is obviously a reference to the FBI’s Uniform Crime Report, though NeighborhoodScout does not specify or link to any FBI source. Creating a 2023 ranking of the safest cities based on data from 2021 is certainly misleading, but the problems get worse.
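To see just how thin this methodology is, here is the whole thing sketched in a few lines of Python. The cities and counts below are made up for illustration; this is my reading of the stated method, not NeighborhoodScout’s actual code.

```python
# Minimal sketch of the stated methodology: total reported crimes per
# 1,000 residents, sorted ascending. All figures below are hypothetical.

cities = [
    {"name": "City A", "population": 30_000, "reported_crimes": 450},
    {"name": "City B", "population": 80_000, "reported_crimes": 900},
    {"name": "City C", "population": 25_000, "reported_crimes": 600},
]

for city in cities:
    city["rate_per_1000"] = city["reported_crimes"] / city["population"] * 1_000

# "Safest" = lowest total crime rate. Note that the method never asks
# whether the underlying reports are complete or comparable across cities.
for rank, city in enumerate(sorted(cities, key=lambda c: c["rate_per_1000"]), 1):
    print(f"{rank}. {city['name']}: {city['rate_per_1000']:.1f} per 1,000")
```

Everything that follows is about why that simple division produces numbers that cannot be compared across cities.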
The first reason to avoid hard rankings of cities based on UCR data is that the FBI begs users of its data not to do this. The FBI has long attached a disclaimer cautioning users about the “pitfalls of ranking” cities. Per the FBI:
Data users should not rank locales because there are many factors that cause the nature and type of crime to vary from place to place. UCR statistics include only jurisdictional population figures along with reported crime, clearance, or arrest data. Rankings ignore the uniqueness of each locale.
One of the 13 factors the FBI notes can affect crime statistics is the “crime reporting practices of the citizenry.” The National Crime Victimization Survey (NCVS) annually highlights just how underreported many crimes are: the 2021 NCVS suggests that only 46% of violent crimes and 31% of property crimes were reported to the police. Moreover, longer police response times can lead to crimes being improperly counted.
Around 85–88% of all crimes reported nationally by the FBI each year are property crimes, so NeighborhoodScout’s rankings are already heavily weighted towards the types of crimes that are least likely to be reported to the police.
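A quick back-of-the-envelope calculation, using those NCVS reporting rates and a hypothetical city, shows how much a “total crimes” figure depends on reporting behavior rather than on crime itself:

```python
# Back-of-the-envelope using the NCVS 2021 reporting rates cited above
# (46% of violent and 31% of property crimes reported to police).
# The city counts here are hypothetical.

reported_violent = 130    # hypothetical reported counts for one city
reported_property = 870   # ~87% of reported crime is property crime

# Inverting the reporting rates gives a rough estimate of actual crime:
est_actual_violent = reported_violent / 0.46    # ~283
est_actual_property = reported_property / 0.31  # ~2,806

print(f"Estimated actual violent crimes:  {est_actual_violent:,.0f}")
print(f"Estimated actual property crimes: {est_actual_property:,.0f}")

# If the property-crime reporting rate rose from 31% to 35% with no change
# in actual crime, reported totals would jump by roughly 13%:
print(f"Reporting-rate artifact: {0.35 / 0.31 - 1:.0%}")
```

In other words, a modest shift in how often residents bother to report property crime can move a city’s “total crime” number more than any real change in crime would.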
Differences in non-reporting and response times are not spread homogeneously throughout the country, so apparent changes in crime trends may simply reflect changing reporting practices. In South Bend, Indiana, for example, an apparent surge in violent crime beginning in 2015 reflected a tweak in how the local police department categorized simple vs. aggravated assaults. There was no surge in crime, only a surge in what South Bend was (now correctly) reporting to the FBI as major crimes.
The 2023 rankings are uniquely flawed because of issues with the primary data source. NeighborhoodScout uses the FBI’s Uniform Crime Report data but appears to completely ignore a change in reporting systems that left the 2021 dataset far from complete. Only 66% of cities with populations of 25,000 or more reported data to the FBI in 2021, compared to 95–97% in normal years, and only 53% of those cities reported 12 full months of data. Failing to acknowledge this enormous hole in data collection is reason alone to discount the rankings entirely.
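For anyone curious what a responsible first step would look like, here is a rough sketch of the completeness filter the rankings skipped. The record layout below is invented for illustration, not the FBI’s actual schema:

```python
# A sketch of the completeness check a careful analysis would run before
# ranking: drop any agency that did not report all 12 months of the year.
# The record structure here is hypothetical.

agencies = [
    {"city": "City A", "population": 30_000, "months_reported": 12},
    {"city": "City B", "population": 80_000, "months_reported": 7},
    {"city": "City C", "population": 25_000, "months_reported": 0},
]

eligible = [a for a in agencies
            if a["population"] >= 25_000 and a["months_reported"] == 12]

coverage = len(eligible) / len(agencies)
print(f"Agencies usable for a ranking: {coverage:.0%}")
# In 2021 this kind of filter would have excluded roughly half of all
# cities with 25,000+ residents, which is why the ranking is untenable.
```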
This omission becomes more problematic when another website takes these rankings and creates a map of the safest cities in the U.S. while pointing out that “no cities on the West Coast ranked in the top 50.” That is hardly surprising given that only 300 of 1,250 agencies in California, Oregon, and Washington — including just eight of 236 California cities with populations of 25,000 or more — reported a full year of data to the FBI in 2021. Crime data for 2021 from California’s Department of Justice suggests there are around 10 California cities that should be in the top 100 safest cities ranking, with five coming in the top 50 if we were ranking cities (which we shouldn’t be!). Florida similarly reported virtually no data to the FBI in 2021, and data from the Florida Department of Law Enforcement (FDLE) suggests there are seven Florida cities that should probably be in the top 100 safest cities.
“Most Dangerous” rankings are even worse
The “safest city” methodology is flawed, but NeighborhoodScout’s “most dangerous city” methodology feels downright deceptive. The safest cities list simply ignores cities that did not report data in 2021, while the most dangerous cities methodology adds a second step in the case of missing data, one that isn’t obvious unless you compare the two FAQs.
This is important because Bessemer, Alabama, the city being called the most dangerous in America, did not report data to the FBI in 2021. Its title was therefore bestowed using a “projection of violent crime rates based on prior years’ data.”
NeighborhoodScout projects Bessemer to have a violent crime rate of 33.8 per 1,000. It is not clear how this projection was made: the highest violent crime rate Bessemer has ever reported was 29.9 per 1,000, in 2017, and in 2020 the city reported 16.8 violent crimes per 1,000, about half the rate projected for 2021.
Bessemer’s data is not publicly available, and the police department did not respond to my requests for data. But plenty of places published 2021 violent crime figures without reporting to the FBI, and they highlight the folly of building rankings on projections. San Bernardino, California is projected at 14.9 violent crimes per 1,000, but the city’s own data puts the actual figure closer to 11.9. Birmingham, Alabama’s violent crime rate is projected at 20.6 per 1,000 — the seventh highest in the nation! — but city data points to roughly 15 per 1,000. Panama City, Florida comes in as the 92nd most dangerous city in America with a projected violent crime rate of 9.6 per 1,000, though data from FDLE puts it at a much lower 7.5 per 1,000.
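Using the figures just cited (rates per 1,000 residents), a quick calculation shows how far off those projections run:

```python
# Comparing NeighborhoodScout's projected 2021 violent crime rates with
# figures from local and state sources, using the numbers cited in this
# post (per 1,000 residents).

projections = {
    "San Bernardino, CA": (14.9, 11.9),  # (projected, locally reported)
    "Birmingham, AL":     (20.6, 15.0),
    "Panama City, FL":    (9.6, 7.5),
}

for city, (projected, reported) in projections.items():
    overstatement = (projected - reported) / reported
    print(f"{city}: projected {projected}, reported {reported} "
          f"({overstatement:.0%} overstated)")
```

That works out to projections running roughly 25% to 37% above what the local data shows, in every case in the same direction: too high.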
There are other reasons to be wary of rankings, namely how city boundaries are drawn and how populations are counted. St. Louis is frequently regarded as having one of the nation’s highest murder rates, but that dubious distinction is due in part to the fact that the formal borders used to calculate the city’s crime rates encompass only a fraction of its overall metro area. Just 11% of the St. Louis metro area’s population lives within the city borders used for counting crime statistics, compared to 33% in Houston and 73% in Las Vegas.
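A toy example makes the boundary problem concrete. Hold a metro area’s crime fixed, vary only the share of the metro the city’s borders capture, and the per-capita rate swings wildly. Everything below is invented except the share-of-metro figures, which match those cited above:

```python
# A toy illustration of the boundary problem: the same metro-area crime
# counted against differently drawn city boundaries. The metro size and
# crime count are invented for illustration.

metro_population = 2_000_000
core_crimes = 10_000          # crimes occurring in the dense urban core

for city, share_of_metro in [("St. Louis-style", 0.11),
                             ("Houston-style", 0.33),
                             ("Las Vegas-style", 0.73)]:
    city_population = metro_population * share_of_metro
    # If crime concentrates in the core, a narrowly drawn city counts most
    # of those crimes against a small resident denominator.
    rate = core_crimes / city_population * 1_000
    print(f"{city}: {rate:.1f} crimes per 1,000 city residents")
```

Same crimes, same metro: the narrowly drawn city shows a rate nearly seven times higher than the broadly drawn one.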
Additionally, NeighborhoodScout includes a comment on your “chance of being a victim,” which ignores that not all crime victims are residents of the place where the crime occurred, that not all crimes have a person as the primary victim (who is the victim of a shoplifting?), and that the burden of crime is not distributed evenly throughout a city. But this post is already long enough and probably doesn’t need another 1,000 words going into depth on those issues.
Too many bad lists to rank
While this post has focused on NeighborhoodScout, other websites produce rankings like this using faulty data and poor analysis, too. There’s the Rocket Mortgage 15 Safest Places in the US in 2023, U.S. News Most Dangerous Places in the U.S. in 2023–2024, and MoneyGeek Safest Cities In America 2023. All of these rankings use the same deeply flawed 2021 NIBRS data. The MoneyGeek ranking at least acknowledges the problems with that data and claims to have done “individualized research” to fill in the gaps, though it is not clear how successful that effort was.
There is a real cost to publishing bad rankings, especially for a small city like Bessemer, which has been labeled as the most dangerous place in the United States. Hopefully smart readers will understand the flaws of these rankings and ignore them now and into the future.
If you enjoyed this post, you can find more of Jeff’s work on his Substack.