Posted: 6:35 p.m. Saturday, March 24, 2012
The Atlanta Journal-Constitution
We requested average reading and math test results and the count of students tested for each school, grade and test subject from 50 states and the District of Columbia for all years in which grades 3 through 8 were tested.
Because many states suppressed data for groups of fewer than 10 students, we excluded these groups in all states.
We requested the results as scaled scores. States convert raw test scores into scaled scores so results can be compared over years.
We created approximate cohorts by matching results for each school, grade and subject with test results from the previous grade in the previous year. We refer to this grouping of students at a specific school in a single grade and taking a specific test — either reading or math — as a “class.”
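The matching step can be pictured as a self-join of the results table. Below is a minimal sketch in Python; the column names (school_id, grade, subject, year, mean_score, n_students) are illustrative assumptions, not the actual field names in the state files.

```python
import pandas as pd

def build_cohorts(scores: pd.DataFrame) -> pd.DataFrame:
    """Match each class to the previous grade's results in the previous
    year at the same school, for the same subject.

    Assumes hypothetical columns: school_id, grade, subject, year,
    mean_score, n_students.
    """
    prev = scores.rename(columns={"mean_score": "prev_mean_score",
                                  "n_students": "prev_n_students"})
    # Shift last year's grade and year forward so each row lines up
    # with the class those students would feed into this year.
    prev = prev.assign(grade=prev["grade"] + 1, year=prev["year"] + 1)
    return scores.merge(prev,
                        on=["school_id", "grade", "subject", "year"],
                        how="inner")
```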
For each state, grade, cohort and year, we created a linear regression model, weighted by the number of students in a class, and compared the average score for a class with the score predicted by the model based on the previous year’s average score. We then calculated a p-value, an estimated probability that such a difference would occur by chance, using standardized residuals and the Student’s t probability distribution, which adjusts the probability upward for classes with fewer students.
Classes with scores rising or dropping with a probability of less than 0.05 were flagged as unusual.
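Read literally, that procedure amounts to a weighted least-squares fit per state, grade, cohort and year, followed by a tail-probability test on each class’s residual. The sketch below, using statsmodels and scipy, is one plausible rendering; the exact standardization and the degrees-of-freedom choice (here, students minus one) are our assumptions, since the article does not publish code.

```python
import numpy as np
from scipy import stats
import statsmodels.api as sm

def flag_unusual(prev_mean, curr_mean, n_students, alpha=0.05):
    """Flag classes whose score change is improbable under the
    state/grade/cohort/year regression. Inputs are arrays over classes."""
    X = sm.add_constant(prev_mean)                 # intercept + last year's mean
    fit = sm.WLS(curr_mean, X, weights=n_students).fit()
    std_resid = fit.resid / fit.resid.std(ddof=1)  # standardized residuals
    # Two-sided p-value from Student's t; small classes get heavier
    # tails, which adjusts the probability upward as described.
    df = np.maximum(np.asarray(n_students) - 1, 1)
    p = 2 * stats.t.sf(np.abs(std_resid), df=df)
    return p < alpha, p
```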
Finally, we looked for improbable clusters of unusual score changes within districts by calculating the probability that a district would, by random chance, have a number of flagged classes in a year, given the district’s total number of classes and the percentage of classes flagged statewide.
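One natural reading of that cluster test is a binomial tail probability: given a district’s total number of classes and the statewide flag rate, how likely is it to see at least the observed number of flags by chance? A sketch, with the caveat that the AJC’s exact calculation may differ:

```python
from scipy import stats

def district_cluster_p(flagged: int, total: int, statewide_rate: float) -> float:
    """P(at least `flagged` of `total` classes flagged by chance),
    assuming flags are independent with the statewide rate."""
    return stats.binom.sf(flagged - 1, total, statewide_rate)

# Example: 12 flagged classes out of 100 when 5% of classes are
# flagged statewide gives a probability of roughly 0.004.
```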
The district calculations excluded schools identified as charter schools.
Analysis limitations
A statistical analysis cannot prove cheating. It can only identify improbable events that can be caused by cheating and should be investigated.
Ideally, we would track how individual students’ test scores change from year to year, but federal privacy regulations precluded access to that data. The approximate cohorts we used were the only available substitute. The two groups of students matched in a cohort are unlikely to be perfectly identical; urban districts in particular have high student mobility.
But we found that large demographic changes at a school — large increases or decreases in poverty levels, for example — are rare. Approximate cohorts mostly compare similar students.
Because of this, large jumps or dives in test scores should be rare, experts told us.
Matching schools and grades between years in this way makes it impossible to compare groups of students who change schools. Also, school identification codes sometimes change, making it difficult to match schools between years. These classes may be excluded from our analysis.
Other data were excluded from our analysis because of problems at the state level:
● Nebraska was excluded entirely because it did not have a statewide testing system until last year.
● The District of Columbia was excluded because it is not in a state, and state data formed the baseline for comparing district results.
● For Maine, the 2009-10 cohort was excluded because a massive school system reorganization made it impossible to match 2009 classes with 2010 classes.
● In Wyoming, the failure of an online testing system rendered 2010 results invalid, so we used only earlier years.
● In Louisiana, only data from 2008 and later were used because of the disruptions Hurricane Katrina caused in schools.
● On the advice of Massachusetts education officials, we used that state’s raw scores instead of scaled scores.
Consulting experts
With the analysis largely complete, we consulted statisticians and testing experts:
● Gary Phillips, vice president and chief scientist, American Institutes for Research, advised on methodology.
● Jaxk Reeves, director of the University of Georgia Statistical Consulting Center, conducted a detailed independent review of our data and methodology, replicating the analysis for multiple states from scratch.
● James Wollack, director of testing and evaluation services, University of Wisconsin-Madison, reviewed methodology and a sample of the findings.
● Edward Rothman, professor of statistics and director of the Center for Statistical Consultation and Research, University of Michigan, reviewed methodology.
Feedback on analysis
Some school district officials and education consultants have asked whether high student mobility could cause a district to be highlighted in our analysis even if it had no cheating problem.
A high rate of mobility is characteristic of virtually all high-poverty inner-city districts. If our methodology merely flagged mobility rather than potential cheating, all urban districts with high mobility would be flagged.
This was not the case. Cleveland schools, for example, with a mobility rate of better than 30 percent, had an average of 4 percent of classes flagged by our analysis from 2008 to 2011; statewide, about 5 percent of classes were flagged in those years. Chicago, Fresno and Amarillo, Texas, are other examples of districts grappling with high mobility that did not have high concentrations of suspect scores.
Before publishing the analysis, we also looked for large changes in the percentage of poor students. They were so rare that we did not eliminate any data because of them. We also looked for year-over-year changes of more than 25 percent in the number of students in a grade, which would suggest a massive change at a school. We removed classes with such large changes from our data; they amounted to about 6 percent of classes.
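That enrollment filter is simple to express against the cohort table sketched earlier. Whether the 25 percent threshold is measured against the prior year’s count is our assumption:

```python
import pandas as pd

def keep_stable_classes(cohorts: pd.DataFrame) -> pd.DataFrame:
    """Drop classes whose enrollment changed by more than 25% between
    the matched years, suggesting a massive change at the school.
    Column names follow the hypothetical cohort frame above."""
    change = (cohorts["n_students"] - cohorts["prev_n_students"]).abs()
    return cohorts[change / cohorts["prev_n_students"] <= 0.25]
```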
University of Georgia statistics professor Jaxk Reeves, who is also director of the Statistical Consulting Center, reviewed the AJC’s analysis and said that absent a radical change in the makeup of a school — such as a sudden influx of poor or wealthy students — mobility should not have a major impact on how a district fares in the analysis.
Atlanta school district officials also raised this issue when the AJC first began analyzing their scores. The newspaper took an in-depth look at student mobility in Atlanta and in other, similar districts nearby. It showed that the other districts did not have nearly the same concentration of suspicious scores as Atlanta, despite having similar populations of students.
Research notes
50: States provided us with standardized testing data.
14,743: Districts across the country that we examined.
69,000: Schools administered the tests.
13 million: Students took these exams in 2010.
1.6 million: Records we analyzed.
2,400: Statistical models we used to identify unusual scores.