Georgia Cyber Academy, the state’s largest school with more than 13,000 students, says the way the state graded it last year simply doesn’t work.
The school has reason to complain: the state gave it low marks for its 2014-15 performance, and as a charter school it must do better or risk losing its authority to operate and to draw tens of millions of dollars in state money.
The Atlanta Journal-Constitution published an in-depth report about the school on Sunday.
The low score is rooted in student performance on standardized state tests. The academy says many of its students are transient, and transient students typically do worse on tests. The school says the state is failing to adequately adjust its scoring for transience. But the state disagrees.
The academy got its charter from the State Charter Schools Commission, which uses test results in two ways to assess its schools’ performance. One method is called “growth” on tests. The commission relies on the Georgia Department of Education to do the growth analysis, which compares each student’s test results only against peers with similar prior test performance. The education department says its growth analysis doesn’t directly take transience into account, but says its results are still “reflective” of that and other student characteristics, since only similar performers are compared.
Another measure used by the commission does specifically address student mobility, though, according to the expert who used it to give the academy similarly low marks. It’s called “value-added,” as in how much value a school added to its students’ performance. This model compares the test performance of demographically similar students.
Tim Sass, of Georgia State University, produced a value-added evaluation of all the commission schools for the commission. He said he included transience as a measurable student characteristic, considering both movement between schools during the school year and movement during summer breaks. He excluded altogether the scores of students who missed 65 percent of the academic year. “This eliminates highly mobile students who do not stay at any one school for more than a few months,” he said.
The Sass report says that in 2014-15 the academy was strong in some areas, such as high school literature. “However, its performance in all elementary subjects, all middle school subjects, and high school Analytic Geometry, Biology, Economics, Physical Science, and U.S. History is weak relative to the state.”
Matt Arkin, the head of the academy, said he didn’t have enough information about Sass’ model to dispute it, though he said he doubts the 2015 results. That was the year Georgia switched from the old Criterion-Referenced Competency Tests, or CRCTs, to the Milestones tests in use now. He said the school’s performance on the two measures, growth versus value-added, was inconsistent in 2015, unlike in prior years. He said that’s likely because of the attempt to compare results between the Milestones and CRCTs; experts say the results of different types of tests shouldn’t be compared. He also said factors such as student transience may have a greater effect on his school than on a traditional school, since new students can take a while to adjust to online learning.
Arkin and the chairman of the school’s board, Ryan Mahoney, argued at a commission performance review in June that they were making progress and that the state’s measures were flawed, especially as they pertain to at-risk students, many of whom are transient.
That prompted an unfavorable reaction from several on the commission, including Tony Lowden. He said the last thing he wanted to hear was an argument about how to measure the performance of at-risk students because “they don’t get a do-over.” The school took them on, he said, “so you can’t argue that we’ve got the worst of the worst yet you don’t want to be measured.”
Arkin said in a later interview that he believes his school’s scores will improve with the 2016 Milestones, which, unlike the 2015 results, will come from a comparison against the same kind of tests the year before. He also said his own review of his student data indicates that the school does better with students the longer they remain at the school. Middle and high school students lag the state average for growth by more than 10 percentage points in their first year, approach the average after that and barely surpass it after two years. He’d like to see the state measure his school’s performance that way, too.
“We’re not saying we shouldn’t be held accountable,” he said. “We just want to be held accountable in a fair and valid way.”