This story about attendance-related suspensions was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education, and the Arizona Center for Investigative Reporting, an independent, nonpartisan, nonprofit newsroom dedicated to statewide, data-driven investigative reporting. Sign up for the Hechinger newsletter and the AZCIR newsletter.

“Education Suspended,” a collaboration between the Arizona Center for Investigative Reporting and The Hechinger Report, represents an ambitious, nearly yearlong effort to better understand the impact of school absences on Arizona students. When kids aren’t in class, they aren’t learning, a reality underscored by the COVID-19 pandemic.

This series isn’t just about why students may miss class, though. Our collaborative investigation is more about what Arizona schools do to students who are consistently absent — or, as we ultimately found, what they do even if kids are repeatedly a few minutes late, sometimes through no fault of their own.

Administrators at schools of varying sizes, types and demographic makeups often respond to a range of attendance violations by keeping students out of class altogether, in the form of in- or out-of-school suspensions. Being blocked from class for missing class, however, compounds the problem these officials say they’re trying to solve.

Yet without comprehensive data readily available at either the state or federal level, it was initially impossible to see how widespread the practice of suspending students for attendance issues was, or how many additional school days kids were missing as a result.

We also couldn’t tell which districts most frequently used suspensions in response to attendance problems, how heavily they leaned on out-of-school suspensions, which types of attendance violations were being punished most often, and whether students from historically marginalized groups were overrepresented when it came to disciplinary action.

So we submitted hundreds of records requests and used the responses to create an original database to answer those and other key questions.

Here’s a closer look at how we did it, and the decisions we made along the way.

Why were public records requests needed to access this information?

Districts and charter schools periodically collect and report suspension and expulsion data to the U.S. Department of Education as part of the Civil Rights Data Collection, a federal effort to ensure the country’s public schools do not discriminate against protected classes of students. Though Arizona displays the results online as part of its “school report card” system, CRDC data couldn’t address the questions we sought to answer.

The data does not tie suspensions to violations, making it impossible to see whether a school system suspended students for attendance issues in general, or which specific types of attendance violations prompted those suspensions. CRDC data also does not provide a full picture of suspension lengths, which we needed to determine how long students were being blocked from class as a result of missing class.

The Arizona Department of Education does not collect detailed disciplinary data for all students, either. It does, however, maintain enrollment data and chronic absenteeism data that we used to establish district baselines for comparison.

What information did reporters ask for?

To fill these data gaps — and allow us to see, for the first time, which school systems use suspensions in response to absences, tardies and other attendance issues — AZCIR and The Hechinger Report submitted more than 400 records requests to districts and charter systems throughout the state.

We asked for disciplinary data capturing in-school and out-of-school suspensions, expulsions and transfers that:

  • Spanned the 2017-18, 2018-19, 2019-20, 2020-21 and 2021-22 school years.
  • Included the reason for each disciplinary action.
  • Noted, for suspensions, how many days the punishment lasted.
  • Was broken down by race/ethnicity, gender and disability status.

We also requested information on truancy referrals, since some schools refer students with excessive absences to the court system.

Were records requests sent to every school system in Arizona?

We excluded the following from our analysis:

  • Alternative schools, which explicitly serve at-risk students: those with a history of disruptive behavior issues, for example, or those who have previously dropped out of school or are primary caregivers. Since alternative schools tend to treat discipline differently than traditional campuses (for example, by committing to avoiding highly punitive measures), their data could skew the results of our analysis.
  • Accommodation districts. We used a similar rationale here, since these districts also tend to serve student populations with specific needs (such as students in juvenile detention, students experiencing homelessness, students on military bases or reservations) in nontraditional learning environments.
  • Career and technical education districts, another type of unconventional educational setting that focuses on preparing students for the workforce. CTEDs require a majority of instructional time to be conducted in a field-based or work-based learning environment.
  • Small rural districts that have so few students that the Department of Education redacts any meaningful data.

Did all Arizona school systems provide responsive data?

No. Several school districts were slow to comply, in many cases providing data only after multiple rounds of follow-up. Hundreds of other districts and charters either didn’t provide usable data or didn’t respond at all (the latter a violation of Arizona public records law).

It ultimately took more than six months to get responses from about 200 district and charter systems, and many were incomplete. Some were missing demographic information, or did not specify how long suspensions lasted. A handful did not tie disciplinary actions to violation categories. Others redacted all data points representing numbers smaller than 11, citing student privacy concerns.

School systems also provided responses in a wide range of formats. A portion sent clean databases that could be analyzed immediately, while many more sent spreadsheets or PDFs that required significant standardization before they could be analyzed. Some sent scanned copies of individual incident report forms or other files that included thousands of narrative descriptions. In those cases, we read the descriptions and logged the corresponding data points in spreadsheets.

Why was there so much variation in disciplinary data?

At a basic level, the discrepancies make sense, because schools aren’t required to maintain this data in a uniform way. Districts and charters use a variety of digital student management systems that generate different types of reports, and some smaller schools don’t use a digital system at all.

In other cases, the discrepancies were either deliberate or avoidable. For instance, some school systems, acting on advice from their attorneys, converted spreadsheets to PDFs before providing their data. Some districts claimed current employees were not familiar enough with their student management systems to produce responsive reports — one wanted AZCIR/Hechinger to pay more than $400 for employee training in order to get the data (we declined). Others said they could only provide data for certain years because they had switched student management systems and had not retained records from prior systems.

How did you standardize the data?

We first reviewed each school system’s data to determine whether it had suspended students for attendance violations over the past five school years.

Sometimes, it was easy to identify attendance-related violation categories: “truancy,” “tardy,” “unexcused absence,” “excessive absences,” “ditching,” “skipping,” “other attendance violation,” and so on. Other times, we had to make a judgment call. For instance:

  • We considered “leaving school grounds/campus without permission” and “elopement” (fleeing the campus) attendance violations, as students were being punished for leaving during the school day.
  • We considered skipping an in-school suspension to be an attendance violation, since those occur during class time, unlike out-of-school suspensions.
  • We did not consider skipping an after-school detention to be an attendance violation, since that would happen outside of school hours.

If a school system did, in fact, suspend for attendance-related violations, we worked to clean and standardize its data so we could add it to our master database. This involved (1) making basic fixes to ensure data was consistently formatted and (2) using data analysis software to calculate total in-school suspensions, total out-of-school suspensions and days missed as a result, by violation type and school year. When provided, we also calculated demographic totals — by race/ethnicity, gender and disability status — by violation type and school year.
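
To illustrate those steps, here is a minimal sketch of the categorization and totaling logic in Python with pandas. The file name, column names and keyword list are hypothetical placeholders, not the actual fields districts provided, and the attendance flags stand in for the judgment calls described above.

    import pandas as pd

    # Hypothetical standardized incident table: one row per disciplinary action,
    # with the violation description, suspension type and days missed.
    incidents = pd.read_csv("district_incidents_standardized.csv")

    # Flag violation categories we would treat as attendance-related.
    ATTENDANCE_KEYWORDS = [
        "truan", "tardy", "tardies", "unexcused absence", "excessive absence",
        "ditch", "skip", "leaving school grounds", "leaving campus", "elopement",
    ]
    violation = incidents["violation"].str.lower()
    incidents["attendance_related"] = violation.apply(
        lambda v: any(keyword in v for keyword in ATTENDANCE_KEYWORDS)
    )

    # Total suspensions and days missed by school year, violation category and
    # suspension type (in-school vs. out-of-school) for one district.
    totals = (
        incidents.groupby(["school_year", "violation", "suspension_type"])
        .agg(suspensions=("violation", "size"), days_missed=("days_missed", "sum"))
        .reset_index()
    )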

A few things worth noting:

Data for some school systems indicated they doled out partial-day suspensions, such as sending a student home on out-of-school suspension “for the rest of the day.” When districts calculated the time missed for us, we used their numbers. When they didn’t, we calculated estimates ourselves based on details in the incident description. (For example, if a district noted a student was sent to the in-school suspension room the last two of six class periods, we estimated 0.3 days missed. If the incident time indicated a district sent a student home about halfway through the school day, we estimated 0.5 days missed.)
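
The partial-day arithmetic itself is simple. As a hypothetical helper, not the districts’ own calculation:

    def estimate_days_missed(periods_missed: int, periods_per_day: int = 6) -> float:
        # Fraction of a school day missed, rounded to one decimal place.
        return round(periods_missed / periods_per_day, 1)

    estimate_days_missed(2)  # last 2 of 6 class periods -> 0.3 days
    estimate_days_missed(3)  # sent home about halfway through the day -> 0.5 days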

Some school districts listed a minimum duration of one school day for all suspensions, then claimed that many suspensions were actually for less than one day when contacted about total days missed. If districts could not provide more precise suspension durations, we used the data as submitted.

We did not calculate suspension lengths for districts that only listed start and end dates for each incident. Doing so would have required determining total days included in the range, then reviewing both standard and academic calendars for each year to subtract weekend days and school holidays for every suspension.

In at least five cases, we could see a district suspended students for attendance problems, but data integrity issues did not allow us to glean much more. We ultimately excluded those districts from our database, since their data was not usable for detailed calculations.

What did the final AZCIR/Hechinger database include?

Our database included the number of in-school and out-of-school suspensions each district issued, by school year and violation type, as well as how long the suspensions lasted. We also added demographic data for those suspended when available. We distinguished between true zeroes and missing or redacted data points, since those differences matter.

We flagged each violation category as attendance-related or not, so we could analyze the number and rates of suspensions for attendance issues — the most original part of our data-driven reporting. Among other questions, we wanted to know: What proportion of suspensions were for attendance violations overall? Which districts used them most often? Were school systems using in-school or out-of-school suspensions more to punish kids for missing class time? Which types of attendance violations were being punished most frequently? How much additional class time were kids missing as a result of these suspensions?

How did you analyze the data?

We did topline calculations using data from districts and charters that suspended for attendance violations. We first determined the total number of attendance-related suspensions for the five-year review period, as well as the total days lost to those suspensions. We did the same calculations for in-school suspensions specifically, as well as out-of-school suspensions.

Because we received suspension data organized by incident, not by student ID, totals represented the number of suspensions issued, not the number of students suspended. In a group of 100 suspensions, for instance, one student could account for 10 of them.

Because several districts did not provide the lengths of some or all of the suspensions they issued, we also knew total days missed would likely be an undercount.

Though the analysis did not include all Arizona school systems, or complete data from every district that did respond, the calculations offer a better understanding of the proportion of overall suspensions tied to attendance violations. They illustrate, for the first time, just how pervasive the practice of suspending for attendance issues is across Arizona, and what that means in terms of additional days missed, even if the totals are approximate.

The in-school versus out-of-school suspension comparison revealed that more than 1 in 5 attendance-related suspensions in our sample were served out of school, a practice experts argue is even more detrimental than in-school suspensions when it comes to student disengagement. It also showed, as we explore in part two of this series, that attendance-related suspensions tend to disproportionately affect Arizona’s Black, Hispanic and Indigenous students.

How did you do the district-level analysis?

At a more granular level, we wanted to better understand which district and charter systems most harshly punished students for attendance issues, and whether that was consistent over the five school years analyzed.

Because student populations varied widely from district to district, though, looking at raw numbers of suspensions issued and the resulting days missed — by district and year — wasn’t a fair comparative measure for every question we were trying to answer.

We instead used Arizona Department of Education enrollment data to calculate annual rates of attendance-related suspensions for each school system. Specifically, we analyzed the number of suspensions issued for attendance violations by district, per 1,000 students, per school year. This allowed us to compare districts, and also to see if and how a district’s use of suspensions for attendance violations changed over time.

We then used ADE’s chronic absenteeism data to calculate annual chronic absenteeism rates for each school system (again, per 1,000 students). We used those rates to determine if districts that most heavily relied on suspensions for attendance issues were the same as those with high rates of chronic absenteeism.
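
In rough terms, each district-year rate was a simple normalization. A sketch, with made-up figures and hypothetical column names:

    import pandas as pd

    # Hypothetical district-year table combining our suspension counts with
    # ADE enrollment and chronic absenteeism counts (figures are illustrative).
    district_years = pd.DataFrame({
        "district": ["Example USD", "Example USD"],
        "school_year": ["2018-19", "2019-20"],
        "attendance_suspensions": [120, 95],
        "chronically_absent_students": [400, 520],
        "enrollment": [8000, 7800],
    })

    def per_1000(count, enrollment):
        # Rate per 1,000 enrolled students.
        return count / enrollment * 1000

    district_years["suspension_rate_per_1000"] = per_1000(
        district_years["attendance_suspensions"], district_years["enrollment"]
    )
    district_years["chronic_absence_rate_per_1000"] = per_1000(
        district_years["chronically_absent_students"], district_years["enrollment"]
    )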

To examine specific subcategories of attendance-related violations — for instance, to see where suspending for tardies was most common — we filtered our database using keywords. In cases where districts grouped multiple offenses under a single suspension, we counted that suspension toward each matching subcategory's total. For example, a suspension for “truancy/tardies” would appear in both the total number of tardy suspensions and the total number of truancy suspensions.
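
That filtering amounts to keyword matching in which a single suspension can count toward more than one subcategory. A sketch, with illustrative keywords and subcategory names:

    # Map each subcategory to the keywords we might search for (illustrative).
    SUBCATEGORIES = {
        "tardy": ["tardy", "tardies"],
        "truancy": ["truan"],
        "ditching/skipping": ["ditch", "skip"],
    }

    def matching_subcategories(violation: str) -> list[str]:
        # Return every subcategory whose keywords appear in the description.
        v = violation.lower()
        return [name for name, keywords in SUBCATEGORIES.items()
                if any(k in v for k in keywords)]

    matching_subcategories("Truancy/Tardies")  # -> ["tardy", "truancy"]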

These analyses helped inform our decisions about where to focus our efforts when it came to interviewing district administrators, school officials and students.

How did you check for overrepresentation of certain racial/ethnic groups?

Though roughly 75 school systems provided some level of demographic breakdown for their suspension data, much of the race and ethnicity data was incomplete or heavily redacted.

To keep our analysis as accurate and fair as possible, we examined disproportionality only in the 20 districts with the highest numbers of attendance-related suspensions and the most comprehensive demographic data (for both discipline and overall enrollment). These districts accounted for just over a quarter of the state’s public school population but nearly 90 percent of attendance-related suspensions in the AZCIR/Hechinger sample.

To check for overrepresentation of certain racial/ethnic groups, we compared each group’s share of attendance-related suspensions within a district with its share of district enrollment, as supplied by ADE, for a given year. If the former was higher — for example, if Black students represented 10 percent of a district’s relevant suspensions but only 5 percent of its student population — that group was understood to be overrepresented, and thus disproportionately affected by attendance-related suspensions. This also allowed us to see that white students tended to be underrepresented among those suspended.
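
In code, that comparison is a ratio of shares. A minimal sketch, with illustrative numbers:

    def is_overrepresented(group_suspensions: int, total_suspensions: int,
                           group_enrollment: int, total_enrollment: int) -> bool:
        # A group is overrepresented when its share of attendance-related
        # suspensions exceeds its share of district enrollment that year.
        suspension_share = group_suspensions / total_suspensions
        enrollment_share = group_enrollment / total_enrollment
        return suspension_share > enrollment_share

    # E.g., 10 percent of relevant suspensions vs. 5 percent of enrollment:
    is_overrepresented(10, 100, 400, 8000)  # -> True (0.10 > 0.05)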

Two items worth mentioning:

  • Because data for Indigenous students in particular was even more limited, analysis of that group involved about a dozen of the top 20 districts.
  • Some school systems differed in how they treated students identifying as Hispanic — whether they listed a student identifying as Black and Hispanic under both Black and Hispanic or under “two or more races,” for instance. We generally had to defer to the school system when it came to race/ethnicity categorizations, which means it’s possible a small number of students appeared more than once in a district’s data.
