At least three large school districts and two state agencies say they will use an Atlanta Journal-Constitution analysis of test scores to help them root out cheaters.

But officials in many other districts denied that test-tampering ever occurs in their classrooms and said they don’t plan to investigate further. Some districts challenged the AJC’s analysis, suggesting that high rates of students moving from one school to another skewed some results.

But last week, a statistician from the University of Georgia examined the issue of student mobility and found that it does not explain away suspicious scores.

The AJC’s investigation, published March 25, identified nearly 200 school districts nationwide with highly unusual test scores that resembled those found in Atlanta, where a state investigation confirmed widespread cheating last year. The AJC used a statistical analysis to identify improbable jumps and drops on state math and reading tests.

Experts said that while the AJC’s analysis doesn’t prove cheating, school officials should follow up and investigate wild score changes.

School systems across the country contacted the AJC about the findings, many wanting more information about how the analysis was conducted. While some said they will examine their scores more closely based on the results, others emphatically rejected the idea that cheating occurred.

Houston education leaders plan to use the analysis to aid their search for educators who are cheating, a spokesman said this week. Baltimore and Mobile County, Ala., said the data could help improve the reliability of test scores and that they will act on the information.

The three districts were among nine large districts nationwide with the most suspicious concentrations of test scores. In those districts, the odds of such scores occurring without outside intervention were worse than one in 1 billion. Some states are stepping up as well with more aggressive programs to search for cheating.
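The article does not spell out how those odds were computed, but the underlying logic is standard: estimate how often a clean classroom would show an extreme swing, then ask how likely it is that a district would accumulate its observed number of flagged classes by chance alone. A minimal sketch of that kind of calculation, using made-up numbers rather than the AJC’s actual figures:

```python
# Hypothetical sketch of a "one in 1 billion" style calculation.
# The flag rate and counts are illustrative assumptions, not the
# AJC's actual inputs or method.
from scipy.stats import binom

p_flag = 0.05     # assumed chance a clean class shows an extreme swing
n_classes = 200   # assumed number of classes tested in a district
observed = 40     # assumed number of classes actually flagged

# Probability of at least `observed` flags arising by luck alone
p_value = binom.sf(observed - 1, n_classes, p_flag)
print(f"Chance of {observed}+ flags by chance: {p_value:.1e}")
```

With these inputs the probability lands far below one in 1 billion: any single flagged class could be innocent, but a district-wide concentration of improbable results compounds quickly.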

Michigan education officials are planning a more thorough search for cheating.

“We did look at your information and we did think it was troubling,” said Jan Ellis, a spokeswoman for the Michigan Department of Education.

The department plans to review and enhance its testing security measures, she said.

Missouri’s state auditor cited the AJC’s analysis when announcing an investigation of St. Louis’ testing procedures and responses to cheating allegations.

Baltimore officials said they consider the analysis strong enough to provide them another tool to examine their testing results.

Baltimore started an aggressive investigation of cheating in 2007 and continues to police test results. The AJC’s testing data will help the district identify and stamp out cheating, said Andrés Alonso, the district’s chief executive.

“This is important as far as I’m concerned because it helps us check our own data, check our own assumptions,” Alonso said. “It might show we have missed a particular school. It doesn’t prove anything, but it may mean we have to go back and look at it again.”

Other districts said they know there’s no cheating in their schools and there’s no reason to look for any.

Nashville officials told local reporters that despite the AJC’s suggestion of suspect scores, cheating doesn’t happen in their schools. Officials said the newspaper’s methodology was flawed and they don’t intend to investigate.

Before even looking at the AJC’s data, the Tahoma School District in Washington state told a local reporter that steep increases in some of its test scores could not be the result of cheating because the district is vigilant about test security.

Test security protocols, such as monitoring schools to make sure the tests are administered properly, are a cheating deterrent used by most districts.

But few school districts go beyond such protocols to examine unusual test scores and patterns of wrong-to-right erasures to make sure the deterrents are working.

Tahoma has no process to look for cheating after the tests are taken, said Kevin Patterson, a district spokesman. When officials do identify dramatic increases in test scores, he said, they presume the scores are the result of excellent education.

The district reviewed its approach to test security after reading the AJC’s report and found that it aligned with state standards, Patterson said. It also looked at the classes flagged in the analysis and believes they are simply high-performing classes.

“We worked our tails off,” Patterson said.

That is a common response from districts and echoes the reaction from Atlanta Public Schools officials when the AJC first wrote about the district’s highly unusual scores in 2009. But education and testing experts say instruction isn’t likely to move so many scores to the degree seen in the AJC’s analysis.

The AJC’s report two weeks ago garnered national attention, fueling debate about high-stakes testing in American schools.

The analysis was the subject of more than 500 newspaper stories and 400 radio and television stories across the country.

U.S. Secretary of Education Arne Duncan said in an emailed statement after being briefed on the AJC’s analysis that the findings were “concerning.”

“I think everyone is interested to do a better job here so we’ll continue to push hard,” Duncan told an AJC reporter after the investigation was published.

The mobility factor

Numerous school districts said student mobility explains wild fluctuations in test scores. Atlanta Public Schools offered that as an explanation for erratic test score changes found by an AJC analysis.

That explanation was echoed by Gary Miron, a professor of evaluation, measurement and research at Western Michigan University.

Miron, who reviewed the analysis of Ohio’s data at the request of the Ohio Education Association, a teachers’ union, has said in blog posts that the AJC’s analysis is flawed because it does not track individual student test scores year to year.

Because data that allow individual students to be tracked from year to year are not available for all states, the AJC tracked classes of students — for example, fourth-grade test results in 2009 and fifth-grade scores in 2010 at a single school.
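In code terms, that cohort comparison is simple to express. The sketch below is illustrative only; the column names and data layout are assumptions, since the article does not describe the AJC’s actual data files:

```python
# Hypothetical sketch of cohort tracking: compare a school's grade-4
# average in 2009 with its grade-5 average in 2010 (largely the same
# children, one year later). Layout and numbers are illustrative.
import pandas as pd

scores = pd.DataFrame({
    "school":     ["A", "A", "B", "B"],
    "year":       [2009, 2010, 2009, 2010],
    "grade":      [4, 5, 4, 5],
    "mean_score": [210.0, 248.0, 215.0, 219.0],
})

g4 = scores[(scores.year == 2009) & (scores.grade == 4)].set_index("school")
g5 = scores[(scores.year == 2010) & (scores.grade == 5)].set_index("school")

cohort = g4[["mean_score"]].join(g5["mean_score"], rsuffix="_next")
cohort["gain"] = cohort["mean_score_next"] - cohort["mean_score"]
print(cohort)  # an outsized gain (school A here) would merit a closer look
```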

The districts’ concern is that some fourth-graders who took the test in 2009 would leave the school and be replaced by new students who would take the test in 2010. Districts critical of the AJC’s analysis claim these new students cause test results to rise or fall.

The AJC and the experts who advised its reporters considered student mobility while conducting the statistical analysis and dismissed it as a challenge to the results for several reasons.

For student mobility to cause an extraordinary rise in scores, all incoming students would have to be excellent test takers and all exiting students poor test takers. This is unlikely, experts say, because schools generally draw students from the same neighborhoods and students with similar backgrounds generally tend to perform comparably.

Further, if mobility were a major cause of extreme fluctuations in test scores, then most or all districts with high mobility rates would have unusual scores. But high mobility is characteristic of virtually all urban, high-poverty districts, and many such districts, like Chicago, do not have high concentrations of large test-score swings.

Jaxk Reeves, director of the University of Georgia Statistical Consulting Center, who conducted a detailed independent review of the AJC’s data and methodology, tackled the mobility theory from a different perspective this week.

Reeves found that even when comparing completely different student populations within the same school — for example, third-graders in 2009 and third-graders in 2010 — statistical programs still show a strong ability to predict scores from one year to the next.

“So that does sort of debunk that [mobility] theory,” he said.
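Reeves’s check, in essence, regresses each school’s scores in one year on its scores the prior year for the same grade (two entirely different groups of children) and observes that the fit remains strong. A minimal sketch of that kind of check, on synthetic data rather than Reeves’s actual numbers:

```python
# Hypothetical sketch of the year-over-year predictability check:
# regress schools' grade-3 means in 2010 on their grade-3 means in
# 2009 (no shared students) and inspect the fit. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
means_2009 = rng.normal(220, 15, size=100)            # 100 schools, grade 3
means_2010 = means_2009 + rng.normal(0, 5, size=100)  # next year's cohort

fit = stats.linregress(means_2009, means_2010)
print(f"r-squared: {fit.rvalue ** 2:.2f}")  # high despite no shared students

# Schools far outside the prediction band are the statistical outliers
residuals = means_2010 - (fit.intercept + fit.slope * means_2009)
flagged = np.where(np.abs(residuals) > 3 * residuals.std())[0]
```

If scores stay highly predictable even when no students carry over from one year to the next, mobility cannot explain the extreme swings; the schools that fall far outside the prediction band remain the outliers the analysis flags.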

Robert Hauser of the National Research Council, a Washington, D.C., group that seeks to improve public understanding of science, agrees mobility is not a significant issue. Hauser, who did not advise the AJC on its methodology, said those citing mobility as a flaw in the analysis have yet to present any evidence that it causes fluctuations in test scores.

“My take on that is that if he has data on that from the districts he’s studied concerning student mobility, he should produce it,” Hauser said, referring to Miron. “It should be his job to demonstrate that it would make a difference.”

Atlanta district officials also raised the mobility issue after the AJC analyzed their scores in 2009. The newspaper examined student mobility in Atlanta and in similar districts in the metro area. The review showed that districts with similar populations of students did not have a concentration of suspicious scores nearly as high as Atlanta’s.

Taking action

As cheating on standardized tests has become a serious issue nationwide, many states have begun or are considering programs to look for cheating.

A bill was introduced in the Missouri Legislature this week to require the state’s department of education to look for testing irregularities. Washington’s department of education plans to petition the state’s Office of Financial Management for $168,000 to conduct erasure analysis this year, a spokesman for the department said. The department made its request shortly after the AJC published its analysis.

“Right now it’s possible we may need to look into this a little more,” said Christopher Hanczrik, director of assessment operations for the department.

But some districts already know how important it is to examine their test results.

Houston investigated 22 cases of cheating in 2010 and 2011 and spent more than $1 million to look for cheating in 2011.

District officials in Houston believe the AJC’s analysis identified more unusual scores than actually occurred, but Jason Spencer, a spokesman for the district, said even so, he’s sure there are some schools identified in the analysis that need another look.

“We know cheating happens,” Spencer said. “We’re not going to sit here and say the whole list is flawed. It’s going to happen; the difference is how you react to it.”