Mortality in Florida Prisons

A data-driven investigation into deaths inside the Florida Department of Corrections

Abstract

Why this investigation? National policy-makers have long claimed, based on Bureau of Justice Statistics (BJS) reports, that the death rate behind bars is “lower than outside”, even after being adjusted for age, sex, and race. But the statistical approach used by the BJS effectively imports the community’s deepest health and demographic disparities into the baseline for prison mortality.

What I found instead: Using direct age- and sex-standardization to Florida’s general population, I found that prison mortality rates are neither unequivocally lower nor higher than “outside” -- rather, they are starkly more volatile and clearly spike in crisis years. In 2020 (when the COVID-19 pandemic hit Florida prisons), the adjusted death rate spiked from 567 per 100,000 in 2019 to 2,277 per 100,000 -- a fourfold increase, and more than double the crude death rate outside prison during the same period. Other years show prison mortality rates that sometimes track closely with, and occasionally fall below, those seen outside, reinforcing that “safety” in prison is a myth shaped as much by crisis and mismanagement as by demographic adjustment.

On homicide: Florida’s in-prison homicide rate remains extraordinarily high -- averaging more than triple the state’s overall rate, and nearly double the global in-custody average. Institutional claims of safety fall flat when risk is examined honestly.

Bottom line: The real risk of death during incarceration in Florida isn’t “lower than outside” or “about average.” It’s a hazardous environment marked by extreme swings in all-cause death risk, persistent violence, and an institutional tendency to minimize or obscure the truth in official reporting.

Methods & Transparency

Introduction

I’ve started and stopped this investigation more times than I can count. It’s not because the work isn’t important, but because it’s personal -- sometimes painfully so. Incarceration cost me years of my life, and it cost some of my closest friends everything. I’ve lost people I loved to suicide, others to so-called ‘suicide’ that I suspect was really violence at the hands of officers, to overdose, to violence from other incarcerated people, and to what the system calls ‘natural’ causes -- when anyone on the inside knows that most such 'natural' ends are the result of systemic neglect or outright malpractice.

There were times I had to step back. With my first few attempts, the anger and grief were sometimes overwhelming, and I didn’t know how to do justice to the reality behind the numbers. Now, as I start graduate school and find new perspective, I’m ready to look honestly at the data -- to represent it as objectively as I can, and also to tell the truth behind the statistics.

This page is an open investigation. I’m committed to showing my work, my sources, and my methodology, so anyone can see both what’s counted -- and what’s left out. My hope is that, by shining a light on the numbers (and the stories they can obscure), I can give some measure of voice and dignity to those who didn’t make it out.

But there’s another reason for this inquiry. The Florida Department of Corrections and similar agencies are more than happy to point to surprisingly low crude, unadjusted death rates -- numbers that defy what we know of the conditions behind bars. Researchers seeking a deeper, fairer answer often turn to the Bureau of Justice Statistics for Age Adjusted Death Rates. Yet the BJS’s adjustment methods end up producing similar results: the death rates of those incarcerated are found to be consistently lower than those in free society. This statistical reassurance has echoed through policy debates and carceral research, even as it sits uneasily beside the firsthand realities of incarceration. My work doesn’t reject statistical adjustment; it begins with a close examination of the BJS method -- explaining what they did right, where their statistical choices distort the truth, and why those decisions yield such paradoxical results. Then, I put a more transparent, traditional, and public health–aligned alternative to the test, to see if it reveals a more honest picture.

That’s why this project isn’t just about refuting the FDOC’s ("everything is fine here") story -- it’s about looking at prison mortality through a public health lens: measuring all-cause death rates, applying more transparent demographic adjustments, and directly comparing risks inside to those faced by all Floridians. As you’ll see, the true pattern isn’t a simple “safer inside” story -- in some years, death rates in prison spike to crisis levels well above those outside, revealing a risk environment that is volatile, neglected, and too often overlooked.

Where the FDOC Data Comes From

  • FDOC Inmate Mortality Database -- The official, public-facing mortality data provided by the Florida Department of Corrections.
  • FDOC OBIS Database -- The Offender Based Information System (OBIS) is the Florida Department of Corrections’ internal database for all prisoner records, including deaths. Unlike the public-facing mortality reports, OBIS data is much less processed -- think enormous spreadsheets, not polished tables. Dates, facility names, and causes of death aren’t always standardized, and key details may be missing or inconsistent. Using OBIS data means wrangling a messy, unfiltered record: I've included all data wrangling and cleaning scripts in my methods documentation for full transparency.
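To make that concrete, here is a minimal sketch of the kind of cleaning a raw OBIS export needs before any rates can be computed. The column names (death_date, facility, manner_of_death) are hypothetical stand-ins rather than the actual OBIS field names; the full wrangling scripts are in the methods documentation.

```python
# Minimal cleaning sketch for a raw OBIS-style export (hypothetical columns).
import pandas as pd

def clean_obis(path: str) -> pd.DataFrame:
    df = pd.read_csv(path, dtype=str)

    # Dates arrive in mixed formats; coerce anything unparseable to NaT so
    # missing or garbled dates stay visible instead of silently disappearing.
    df["death_date"] = pd.to_datetime(df["death_date"], errors="coerce")

    # Facility names vary in casing and spacing; normalize before grouping.
    df["facility"] = (
        df["facility"].str.strip().str.upper().str.replace(r"\s+", " ", regex=True)
    )

    # Collapse free-text manner-of-death values onto the five categories the
    # FDOC reports publicly; anything unrecognized becomes NaN for review.
    manner_map = {
        "NATURAL": "Natural", "ACCIDENT": "Accident", "HOMICIDE": "Homicide",
        "SUICIDE": "Suicide", "PENDING": "Pending",
    }
    df["manner_of_death"] = df["manner_of_death"].str.strip().str.upper().map(manner_map)
    return df
```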

For full details on how I extract, transform, and visualize the data, see Methods & Transparency.

The Story FDOC Wants to Tell

We're going to start by looking at the Florida Department of Corrections' own presentation of mortality data. The FDOC's public-facing mortality page is a carefully curated narrative that frames deaths in a way that minimizes institutional responsibility. It’s important to understand how the FDOC tells its story, because it shapes public perception and policy.

FDOC pie chart and table displaying manner of death for Florida state prisoners, fiscal years 2019–2024. The chart visually emphasizes 'natural' deaths as a large majority, with much smaller segments for accident, homicide, suicide, and pending cases; the table below it lists counts for each manner of death by fiscal year, alongside annual total deaths and inmate population.
Screenshot of the Florida Department of Corrections – Inmate Mortality page.

The first thing one learns about data visualizations, when studying them formally, is that every plot and graph is an argument. The FDOC’s mortality page is a case study in how institutions tell stories with numbers:

  • A giant pie chart (mostly labeled “natural deaths”) visually drowns out all other causes of death.
  • Total population counts are always shown (with the exception of the erroneous '0' for 2025), quietly suggesting deaths are rare -- acceptable even, by carceral logic. Little number next to big number = good.
  • Introductory “facts” center pre-existing conditions and life before incarceration, with no mention of the extreme conditions within the facilities or systemic negligence.
  • “Natural” is a black box. No nuance, no breakdown, no room for questions about inadequate care, harsh conditions, malpractice, or complicity.

This is not neutral reporting -- it’s institutional storytelling: "Nothing to see here". As someone who’s lived both the numbers and the lives they summarize, I want to push back. The data doesn’t -- and can’t -- speak for itself.

Reframing FDOC’s Presented Data

If you only glance at the FDOC’s charts and tables, you’re nudged toward two key takeaways:

  1. The death rate in Florida’s prisons is low and unremarkable, especially given the size of the incarcerated population.
  2. Most deaths are “natural” -- so by implication, the system isn’t to blame.

But neither of those claims holds up when you put the FDOC’s own numbers in context. A small number beside a big number visually suggests safety, but it doesn’t tell the whole story.

So let’s begin by reframing the most straightforward comparison: homicides. Instead of just seeing how many homicides happened in a prison population of tens of thousands, let’s translate those “little numbers next to big numbers” into actual homicide rates, then compare them with state-wide rates.

Statewide homicide rates are shown for calendar years, while FDOC rates use fiscal years. Both rates are per 100,000 population. The final row shows the mean rate for each group over the interval.
Statewide Homicide Rate Source: FLHealthCHARTS.gov
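The arithmetic behind these rates is deliberately simple: deaths divided by the population at risk, scaled to 100,000. A minimal sketch, using placeholder numbers rather than the actual FDOC counts:

```python
# Deaths per 100,000: count divided by the population at risk, scaled up.
def rate_per_100k(deaths: float, population: float) -> float:
    return deaths / population * 100_000

# Placeholder example (not FDOC figures): 17 homicides in a prison population
# of roughly 84,000 works out to about 20.2 per 100,000.
print(round(rate_per_100k(17, 84_000), 1))
```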

How Does Prison Violence Compare?

To make these numbers real, here’s how Florida’s prison homicide risk stacks up against two major benchmarks:

  • Florida statewide homicide rate (average, 2019–2024):
    6.5 per 100,000
  • Global average in-prison homicide rate (UNODC, 2024):
    12.2 per 100,000
  • FDOC in-prison homicide rate (average, 2019–2024):
    20.3 per 100,000

This means that a person in a Florida prison faces:
  • Over 3 times the risk of being murdered compared with the average Floridian, and
  • Nearly double the homicide risk of the average incarcerated person worldwide.

Sources: FLHealthCHARTS.gov  |  UNODC, 2024  |  FDOC Inmate Mortality Database

How Deadly Is Prison, Really?

Homicide rates tell only part of the story. Most lives lost behind bars are attributed to so-called “natural causes”. Natural causes, however, can be inflicted by the unnatural conditions of incarceration. The bigger question is: How does the overall risk of dying in prison compare to the risk outside? And just as crucial, how do we make that comparison fair and meaningful?

This is where the statistics get trickier -- and where official analyses can mislead as much as illuminate. The Bureau of Justice Statistics (BJS), in its flagship reports, claims that death rates in prisons are lower than in the general population, once you adjust for age, sex, and race. But as I’ll show, how you do that adjustment matters -- and they're doing it wrong.

Comparing Methods: BJS vs. Direct Standardization

What the BJS Study Did

In its December 2021 report, Mortality in State and Federal Prisons, 2001–2019 – Statistical Tables (U.S. Department of Justice, Bureau of Justice Statistics [BJS], NCJ 300953), Carson reported that “State prisoners were less likely to die in 2019 (308 per 100,000) than U.S. residents age 18 or older (435 per 100,000).” The report notes: “To allow for direct comparisons between the two populations, BJS adjusted the U.S. resident population to resemble the sex, race or ethnicity, and age distribution of state prisoners before calculating overall and cause-specific mortality rates.”

To translate: the BJS study used an indirect standardization approach. Instead of adjusting the prison mortality rate to a fixed standard (like the 2000 U.S. population, as is traditionally done for adjusted metrics), they reweighted the general population’s death rate to match the age, sex, and race/ethnicity profile of prisoners -- who are much younger, overwhelmingly male, and more heavily minority.

The Bureau of Justice Statistics’ method works like this: Instead of directly comparing the death rates in prison to those of the general population, they ask, “What would the death rate look like for everyone in the nation if the general population had the same age, sex, and race demographics as the prison population?” That is, they recalculate the nation's overall death rate, but first reshape the population to become far younger, overwhelmingly male, and more heavily minority -- just like the prison population.

Here’s why this matters: by building a comparison group that is disproportionately people of color, and letting the community’s existing racial and social disparities in health outcomes persist within that distorted population, the 'expected' death rate for this group gets pushed astronomically high.

The end result: Prison’s own death rate is compared to this inflated baseline, so it appears low -- regardless of the real dangers and conditions inside. In reality, this isn’t a fair or neutral comparison; it’s one that practically guarantees prisons come out looking “safer,” because the outside benchmark has been artificially set so high.
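A toy illustration may help. The stratum rates and population shares below are invented for demonstration, not BJS's published figures; the point is only the mechanic described above: reweighting the outside benchmark to the prison's racial mix drags community health disparities into the 'expected' rate that the crude prison rate is then judged against.

```python
# Illustrative values only -- not BJS data. Death rates per 100,000 within a
# single age/sex stratum (say, young adult men), split by race.
outside_rates = {"white": 300, "black": 450}

outside_mix = {"white": 0.70, "black": 0.30}  # racial mix of that stratum outside
prison_mix  = {"white": 0.40, "black": 0.60}  # racial mix of the same stratum inside

# Benchmark using the outside population's own mix vs. the prison-reweighted mix.
benchmark_outside_mix = sum(outside_rates[g] * outside_mix[g] for g in outside_rates)  # 345
benchmark_prison_mix  = sum(outside_rates[g] * prison_mix[g] for g in outside_rates)   # 390

crude_prison_rate = 360  # illustrative crude rate inside the prison

# Against the race-reweighted benchmark (390) the prison looks "safer than
# outside"; against the unweighted benchmark (345) it does not.
print(benchmark_outside_mix, benchmark_prison_mix, crude_prison_rate)
```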

Summary Table: Approaches

Method | Population standard used | What it tells you
Standard age-adjusted death rate (FL DOH/CDC AADR) | United States or individual state population (traditional) | “What would the FL death rate look like if Florida had the same demographic structure as the US?”
BJS (2021), indirect standardization | Prison group’s age/sex/race structure | “What would the US death rate look like if the US had the same demographic structure as those incarcerated?”
This analysis | Florida state population (age structure by sex) | “If Florida's population faced Florida prison death risks, how bad would it be?”

Reference: Carson, E. Ann. (2021, December). Mortality in State and Federal Prisons, 2001–2019 – Statistical Tables (NCJ 300953). Bureau of Justice Statistics.

Why Age Adjustment Matters

If you want a fair and complete answer to “How deadly is prison?”, you really can’t just compare death rates for those inside against those outside and call it a day. Look at these two population pyramids: one for people inside Florida state prisons, one for all Floridians.

  • The FDOC prison population: Mostly 25–54, hardly anyone over 65, overwhelmingly male (almost 14:1).
  • The Florida population: Broad at the top, with over a quarter above age 65; a nearly even sex balance at every age group.

These differences aren’t trivial -- they’re at the heart of any valid mortality comparison. Older age means higher risk, so a crude comparison will make prisons look safer simply because their population is so much younger. To see the true impact, you need to “level the field” and ask: what if both groups had the same age and sex structure?

That’s what I do. My method directly standardizes for age and sex, using the real Florida population each year as the baseline.

I do not adjust for race. While race and ethnicity are recognized as major drivers of health disparities in the broader community, once inside the carceral environment, external social determinants of health -- like neighborhood, insurance access, and discrimination -- are diminished and replaced by the standardized environment of prison. Within-prison mortality risk is likely much more evenly distributed across racial groups than in free society, so weighting by race/ethnicity may inappropriately “transfer” community-driven disparities into the expected-risk baseline, rather than truly reflecting differences attributable to incarceration itself.

By directly standardizing to the age and sex structure of Florida residents, my approach supports a genuinely “apples-to-apples” comparison of mortality risk and aligns closely with the methods used in state and national public health surveillance.
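In code, the direct standardization amounts to computing stratum-specific prison death rates and weighting them by Florida's share of residents in each age-sex stratum. A minimal sketch, assuming tidy stratum-level inputs (the column names are hypothetical, not the actual analysis files):

```python
import pandas as pd

def direct_aadr(prison: pd.DataFrame, florida: pd.DataFrame) -> float:
    """Age- and sex-adjusted prison death rate per 100,000, standardized to Florida.

    prison  columns: age_group, sex, deaths, person_years
    florida columns: age_group, sex, population
    """
    merged = prison.merge(florida, on=["age_group", "sex"])

    # Stratum-specific death rates observed inside the prison system...
    merged["rate"] = merged["deaths"] / merged["person_years"]

    # ...weighted by each stratum's share of the Florida resident population.
    weights = merged["population"] / merged["population"].sum()

    return float((merged["rate"] * weights).sum() * 100_000)
```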

What Age Adjustment Reveals: Year-by-Year Death Risk

Here are the actual age- and sex-adjusted death rates for people in Florida state prisons (AADR), alongside the crude death rate for all Floridians, year by year:

Year | Prison age- and sex-adjusted death rate (per 100,000) | Florida crude death rate (per 100,000)
2019 | 567 | 973
2020 | 2,277 | 1,106
2021 | 1,313 | 1,187
2022 | 747 | 1,070
2023 | 1,634 | 1,010
Prison rates are age- and sex-adjusted; Florida crude death rates shown for context.

The chart below tracks these numbers visually -- so you can see not just the overall trend, but the volatility and crisis-driven spikes that crude comparisons completely miss.

Sources: FLHealthCHARTS.gov  |  FDOC OBIS Database
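For anyone who wants to reproduce the chart, here is a minimal sketch using the figures from the table above (matplotlib assumed to be available):

```python
import matplotlib.pyplot as plt

years         = [2019, 2020, 2021, 2022, 2023]
prison_aadr   = [567, 2277, 1313, 747, 1634]    # age- and sex-adjusted, per 100,000
florida_crude = [973, 1106, 1187, 1070, 1010]   # crude, per 100,000

plt.plot(years, prison_aadr, marker="o", label="FL prison (age/sex-adjusted)")
plt.plot(years, florida_crude, marker="o", label="Florida residents (crude)")
plt.ylabel("Deaths per 100,000")
plt.xticks(years)
plt.legend()
plt.title("Mortality: Florida prisons vs. Florida residents")
plt.show()
```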

The main takeaway: Florida prison mortality is volatile. In 2020, the age- and sex-adjusted death rate shot up to 2,277 per 100,000 -- quadruple the previous year and more than double the outside rate. Other years, the gap narrows -- or even appears to flip -- but the pattern is unmistakable: safety behind bars is unpredictable, and people inside are exposed to elevated and rapidly shifting risk whenever policy fails or crisis strikes.
