The Hechinger Report is a national nonprofit newsroom that reports on one topic: education. Sign up for our weekly newsletters to get stories like this delivered directly to your inbox. Consider supporting our stories and becoming a member today.

The Hechinger Report spent the last year investigating a major subset of school discipline: suspensions and expulsions for vague, subjective categories like defiance, disruption and disorderly conduct. 

We started this project with some basic questions: How often were states suspending students for these reasons? What kinds of behavior do educators say constitute defiance or disorder, anyway? And were some students more likely to be punished for these kinds of things than others?

Answering these questions revealed how overwhelmingly common these types of suspensions are for a broad range of behavior, including minor incidents. Here’s how we did it.

How did we get state- and district-level suspension data?

We attempted to get data from all 50 states, but there is no single place to get school discipline data broken down by suspension category. States do not report this information to the federal government. In fact, some states don’t even collect it from their districts. 

When possible, we downloaded the data from the state’s department of education website. When it wasn’t readily available we submitted public records requests.

In the case of New Mexico, we used data obtained and published by ProPublica.

What did we ultimately collect? 

In the end, we obtained the data we were looking for from 20 states: Alabama, California, Georgia, Indiana, Maryland, New Hampshire, New Mexico, Ohio, Vermont, Washington, Minnesota, Mississippi, Massachusetts, Alaska, Colorado, Louisiana, Montana, North Carolina, Oregon and Rhode Island.

In most cases, we received data from 2017-18 to 2021-22. In the case of Vermont, however, we did not have data for 2021-22, and in North Carolina, we had data only for 2019-20 and 2020-21.

We had demographic data that allowed us to examine the racial and special education disparities in California, Indiana, Vermont, New Mexico, Montana, Maryland, Ohio, Rhode Island, Mississippi and Massachusetts.

Was the data uniform?

Far from it. Each state has its own categories for student discipline, ranging from just six suspension categories in California to more than 80 in Massachusetts.

First, we identified any of the categories that had to do with disrespect, disorder or disruption and singled them out. These were the primary focus of our analysis. But we also wanted to know how suspensions for these reasons compared to others. 

To do that, we looked for common threads among suspension categories and created our own larger groupings. For example, any offense category that involved alcohol, drugs or tobacco was grouped into the category “alcohol/drugs/tobacco.” Any offense that involved fighting or physical aggression went into a category called “physical violence.” We made these groupings after researching state discipline codes and internal discussion, and we showed them to experts for feedback. In the end, we had 16 unique categories, and we added up the numbers from all state categories that fell into each of our larger groups.

This allowed for an overall look at how many punishments were assigned for broad types of behavior. Yet because of discrepancies in discipline definitions in each state, direct comparisons between states are not advisable.
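The recategorization step described above amounts to a lookup table. Here is a minimal Python sketch of the idea; the state category names and the `CATEGORY_MAP` crosswalk below are invented for illustration (the real crosswalk spans far more state-specific categories across our 16 groups):

```python
# Hypothetical mapping from state-specific offense categories to our
# broader groupings (illustrative only; not the actual crosswalk).
CATEGORY_MAP = {
    "Alcohol Possession": "alcohol/drugs/tobacco",
    "Tobacco Use": "alcohol/drugs/tobacco",
    "Fighting": "physical violence",
    "Assault on Student": "physical violence",
    "Defiance of Authority": "defiance/disorder/disruption",
    "Disorderly Conduct": "defiance/disorder/disruption",
}

def group_totals(rows):
    """Sum suspension counts from state categories into broad groups.

    `rows` is a list of (state_category, count) pairs; anything not
    in the crosswalk falls into an "other" bucket.
    """
    totals = {}
    for state_category, count in rows:
        group = CATEGORY_MAP.get(state_category, "other")
        totals[group] = totals.get(group, 0) + count
    return totals

rows = [("Fighting", 120), ("Defiance of Authority", 300),
        ("Disorderly Conduct", 150), ("Alcohol Possession", 40)]
print(group_totals(rows))
# {'physical violence': 120, 'defiance/disorder/disruption': 450,
#  'alcohol/drugs/tobacco': 40}
```

Because the sums happen only within one state's own reporting, this kind of rollup supports the broad per-state totals described above without implying that two states' categories mean the same thing.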

How did we deal with missing or redacted data?

In all of the states, suspension counts below a specific threshold (generally fewer than 10, but in some cases fewer than five) were redacted to make sure no student could be identified. We treated those counts as zero, since there was no way to accurately recover the true number. In most states, this did not affect the overall findings. In smaller states or districts, where we saw or expected significant redactions, we looked only at grand totals.
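In code, that redaction rule reduces to a small cleaning function. This is an illustrative sketch; the specific redaction markers shown are assumptions, since each state flags suppressed counts differently:

```python
def to_count(value, redaction_markers=("*", "<10", "<5", "N/A", "")):
    """Convert a reported suspension count to an integer.

    Redacted or missing values are treated as zero, since the true
    (small) number cannot be recovered. The marker strings here are
    hypothetical; real state files use a variety of symbols.
    """
    if value in redaction_markers:
        return 0
    return int(value)

raw = ["42", "<10", "7", "*"]
print([to_count(v) for v in raw])  # [42, 0, 7, 0]
```

Treating redactions as zero deliberately understates totals, which is why heavily redacted states or small districts were summarized only at the grand-total level.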

Did the data have any other limitations?

Yes, once again, we had to contend with a lack of uniformity in how states gather this information. In some places, we obtained information only for suspensions. In others, the data included expulsions. In Alabama, instances of corporal punishment and alternative school placement were also included.

Some states allowed districts to report only a single reason for a suspension; others allowed several reasons to be selected. And, muddying the waters further, some states reported the number of students who were suspended, while others reported the number of incidents that led to suspension. We’ve made a list available with details about individual states.

How did we analyze demographic disparities?

We calculated the rate of suspension by looking at the number of students of a particular race suspended per 100 students of that race in a state or district. The comparisons between rates of suspensions of Black students and white students were made by dividing the rate of suspension for the former by the rate of suspension for the latter. For instance, if Black students were suspended at a rate of four students per 100 Black students in a state and white students were suspended at a rate of two students per 100 white students, then Black students were suspended at twice the rate of suspension of white students (4/2 = 2).

We did the same analysis for students with disabilities relative to their general-education peers.
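The rate and ratio calculations can be expressed in a few lines of Python, using the same worked example as above (the enrollment figures below are invented to produce rates of 4 and 2 per 100 students):

```python
def suspension_rate(suspended, enrolled):
    """Suspensions per 100 students in a demographic group."""
    return suspended / enrolled * 100

def disparity_ratio(group_rate, reference_rate):
    """How many times the reference group's rate the group's rate is."""
    return group_rate / reference_rate

# Worked example from the text: Black students suspended at 4 per 100,
# white students at 2 per 100, for a disparity ratio of 2.
black_rate = suspension_rate(4_000, 100_000)    # 4.0
white_rate = suspension_rate(2_000, 100_000)    # 2.0
print(disparity_ratio(black_rate, white_rate))  # 2.0
```

The same two functions apply to the disability analysis, with students with disabilities as the group and general-education students as the reference.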

How do we know what kind of behavior students were suspended for?

We submitted public records requests to dozens of school districts across the country, asking for the most recent year or two of discipline records for any suspensions assigned in their category of defiance or disorderly conduct.

Most districts denied our request or never responded. Some estimated it would cost tens of thousands of dollars to pull the records. In all, 12 districts in eight states granted our request for free or at an affordable cost, giving us more than 7,000 discipline records to analyze.

So how did we analyze them?

After reading through many of the records to begin to identify patterns, we once again made some broad categories of behavior that kept coming up, including talking back to an educator, swearing or refusing a direct order. 

About 1,700 of the records were in PDFs (including some with handwritten notes) that could not easily be converted to a spreadsheet. We coded all of these by hand, checking whether each incident fit any of our categories and marking yes or no for each. We also hand-coded 1,500 of the remaining records. Each incident could receive as many “yeses” as merited. We checked each other’s work to make sure we were coding consistently.

We then used a machine-learning library to train a model on our labeled dataset, and used the trained model to predict the same categories for the remaining incident reports. The model’s accuracy (measured on a test set held out from the labeled data) varied across categories, but overall it had a low rate of false positives. We also spot-checked the findings to make sure records were not being miscategorized.
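The article does not name the library or model, so the following is only a plausible sketch of this kind of multi-label text classification, not the newsroom's actual code. It uses scikit-learn's TF-IDF features with a one-vs-rest logistic regression; the incident narratives and label columns are invented:

```python
# Sketch of multi-label classification of incident reports.
# Library choice, model, texts and labels are all assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline

# Hand-coded incident narratives, each with yes/no (1/0) labels
# for every behavior category.
texts = [
    "student refused a direct order to sit down",
    "student swore at the teacher repeatedly",
    "talked back to an educator during class",
    "student cursed and refused to leave the room",
]
# Label columns: [refused order, swearing, talking back]
labels = [
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
    [1, 1, 0],
]

model = make_pipeline(
    TfidfVectorizer(),
    OneVsRestClassifier(LogisticRegression()),
)
model.fit(texts, labels)

# Predict category flags for reports that were not hand-coded.
new_reports = ["pupil swore and would not follow directions"]
predictions = model.predict(new_reports)  # binary matrix, one row per report
print(predictions)
```

In practice the labeled records would first be split into training and held-out test sets, and per-category false-positive rates checked on the held-out portion, as described above.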
