By Monica Henson
In a recent blog post, Maureen poses a thoughtful question: How should we judge schools?
She cites examples in DeKalb County where families responded to district decisions to close schools for low performance or enrollment. Their criteria for determining whether their local schools were doing a good job hinged on how their children experienced the school day, not on bureaucratic sorting and counting in response to well-meaning state and federal legislative mandates.
During No Child Left Behind, districts were required to offer transportation to “better” schools to families of children in schools deemed low-performing. A minuscule percentage of parents availed themselves of this offer.
I have been pondering this question since the school I have led for six years, Graduation Achievement Charter High School, was judged to be underperforming by the State Charter Schools Commission. As a result of that judgment and the commission’s resistance to revisiting its criteria, the school will close June 30 after graduating our final senior class.
The simple fact of the matter is this: the people who are keenly interested in the measurement of academic performance of a school or system are the people who constructed the measure, not the people whose children attend. Parents and students care about how the child is treated inside the building and what the child’s experience of school is.
This phenomenon is present in the national divide inside the charter school world over the idea of "choice for the sake of choice," a movement that is bubbling underneath the surface here in Georgia and will, I predict, erupt forcefully should the Georgia Cyber Academy meet the same fate as GACHS. I will leave that discussion to another time and limit my remarks here to the issue of "accountability for the sake of accountability."
Without boring the readers with arcane detail, I will summarize a few of my objections to the method of determining “academic effectiveness” used by the State Charter Schools Commission. Simply stated, the commission’s Value-Added Model can answer the question, “How is my school’s performance different from the performance of the typical (elementary, middle, or high) school in Georgia?” The model does not answer the question, “Compared to other schools that look like mine, how is my school’s performance different?”
The first question assumes that all schools serve similar student populations with respect to gender, special education status, socioeconomic status, student age, and so on. The second question acknowledges that schools with abnormal student populations may perform differently than those with typical student populations.
GACHS has always served a highly abnormal student population, with at least 70 percent of its students arriving at the school two or more years behind. The VAM used by the commission does not take this into consideration. The commission insists that it does, because the model includes the difference (in months) from the average student age as a student-level control (thus the commission does acknowledge that student age is related to performance outcomes).
However, to answer the second question above, a two-level model must be used that takes into account the percentage of students who are overage at each school in the state. My educated guess is that the “average” Georgia high school may have up to 5 percent overage students (probably less if using our two-years-behind definition). Unfortunately, this data is not publicly reported, so we cannot say with certainty.
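For readers who want the distinction between the two questions made concrete, here is a toy simulation. Every number in it is hypothetical and illustrative only — the 15-point deficit for overage students, the 5 percent and 70 percent overage shares, and the score scale are my own assumptions, not the commission’s actual model or data:

```python
import random

random.seed(0)

# Toy illustration (hypothetical numbers, NOT the commission's actual VAM).
# Each school enrolls some share of "overage" students, who on average
# arrive with an achievement deficit through no fault of the school.
def school_mean_score(pct_overage, n=200):
    scores = []
    for _ in range(n):
        base = random.gauss(50, 10)   # on-track student score (assumed scale)
        if random.random() < pct_overage:
            base -= 15                # assumed deficit for an overage student
        scores.append(base)
    return sum(scores) / len(scores)

# "Typical" schools with ~5% overage students vs. a school like ours at 70%.
typical = [school_mean_score(0.05) for _ in range(100)]
state_avg = sum(typical) / len(typical)
ours = school_mean_score(0.70)

# Question 1: compared with the average school, we look far behind.
print(f"vs. state average: {ours - state_avg:+.1f} points")

# Question 2: compared with schools that look like ours (70% overage),
# the apparent gap largely disappears.
peers = [school_mean_score(0.70) for _ in range(100)]
peer_avg = sum(peers) / len(peers)
print(f"vs. peer cohort:   {ours - peer_avg:+.1f} points")
```

The same school looks several points “below average” under the first comparison and roughly typical under the second — which is the whole point of demanding a peer-cohort (two-level) analysis rather than a comparison to the statewide average.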
The commission’s VAM analysis of alternative schools ignores the fact that many of them impose enrollment restrictions; consequently, comparing us to those schools is not an apples-to-apples scenario. The best, truest comparison cohort is limited to designated alternative schools serving high school grades without restrictions on enrollment.
We have worked for several years with a highly respected firm in Colorado that works nationally with charter schools serving highly abnormal populations. Our review of the commission’s VAM was performed by their staff of statistical analysts and charter accountability experts. In fact, the commission had initially sought to have this firm produce a report for it several years ago, before discovering that we had already engaged the firm. The source of the critique is therefore compelling and credible.
I am not criticizing the VAM itself — but the application of it and the claim that it adequately levels the field for GACHS, as well as State Charter Schools Commission Deputy Counsel Stevens’ recent claim that the VAM is a “best practice.”
Only five states use a VAM for school and district academic accountability measurement, and none of them uses it in isolation. Of those five, two don’t use it at all for alternative schools, and one allows alternative schools to select the degree of weight the VAM will carry and applies several other measures as well.
Any VAM is nothing more than a tool in a toolkit. The National Association of Charter School Authorizers recommends multiple alternative accountability measures to determine the effectiveness of charter schools serving highly abnormal populations. Two years ago, we provided the commission with a proposed alternative framework for academic accountability that included short-cycle assessments in reading and math. These would provide performance data for all of our students, not just the first-time Milestones test takers enrolled for a full academic year, who represent fewer than 10 percent of the students we serve. Our average length of stay for students is 25 weeks. We were refused the opportunity even to discuss this NACSA-endorsed concept.
Most states use student growth models to judge academic effectiveness. Interestingly, the Georgia Department of Education doesn’t use a VAM and hasn’t seen fit to adopt one despite the commission’s claimed success with it.
The College & Career Readiness Performance Index (CCRPI) is a student growth model. In the Georgia DOE accountability world, GACHS posted a three-year achievement gap closure rate of 60 percent, higher than any other alternative school in Georgia, regardless of whether those schools restrict enrollment in any fashion.
The School Effectiveness division, which oversees the federal school turnaround program, including Title I Priority Schools, saw fit to exit GACHS with sustained support based on our academic improvements over time and our attainment of the Alternative School designation. GACHS is the only commission charter school that holds that designation and the only commission school that has produced independent third-party confirmation of the significant risk factors exhibited by a majority of its student population over time.
The Governor's Office of Student Achievement Report Card for 2017 confirms that our free and reduced-price lunch-eligible population that year was 77 percent, higher than that of any other State Charter Schools Commission high school.
Coincidentally, we also have a higher percentage of college-ready graduates as measured by GOSA than the two other commission high schools to which we are most frequently compared, neither of which is a designated Alternative School. (GACHS is not eligible for removal from the GOSA School Turnaround list because we are an open-enrollment school. GOSA’s definition of “nontraditional” schools precludes open enrollment, an accountability Catch-22 that doesn’t make a whole lot of sense.)
The argument that test scores aren’t the best measure of school effectiveness is leveled by traditional school districts as well. It has been shown that Georgia school systems labeled as “failing” in our state’s measurement systems would not be considered failing in some other states. The same is true of charter schools serving abnormal populations. Our research partner ran analyses of several Georgia charter schools in other states’ alternative accountability frameworks and found that GACHS would be considered meeting, and in at least one case even exceeding, those authorizers’ expectations.
My argument should not be construed as an indictment of the commission staff. They are an earnest, well-meaning group of young people with law and policy degrees who do the best they can with sorely limited experience in the real world of the practitioner in the field. They care about children.
However, they’ve been charged by the Legislature and their commissioners with producing a portfolio of schools that provide an excellent public education. The benchmark that has been selected is “outperforming the state average.” GACHS simply does not fit into that portfolio.
Other struggling commission schools have played the game of adjusting the inputs, closing grade levels that don’t produce high performance. This doesn’t create real school improvement or a true outperforming-the-state scenario; it simply shifts the problem back to local school systems.
We have stayed true to our mission of being a truly open-enrollment statewide high school, open to all students, even those who have been expelled or long-term suspended elsewhere. We have received high praise from the Georgia Association of Hearing Officers as the only school system in the state to maintain this enrollment policy. We have demonstrated with voluminous evidence that we are not a “failing” school.
The State Charter Schools Commission is mistaken in its claim that we are failing, and even more so in attempting to promote the idea that the taxpayers have somehow been served better by our closure. The problems that GACHS has been helping to address are simply being shoved back onto the overflowing plates of the traditional school districts, the social service system, the mental health system, and the juvenile justice system.
The State Charter Schools Commission’s portfolio will have been “improved” by adjusting the inputs. The biggest policy question in this situation is whether a state agency charged with providing excellent public education to all students is living up to that mission, or whether it is gaming the numbers itself by denying an educational choice to the students least able to find options.