HOUSTON — Andres Balp’s Texas classroom provides a glimpse of the data-driven future facing Georgia teachers and students.
In his fourth-grade room at Houston’s Lyons Elementary, the focus is clear: measuring exactly how well students are progressing. Children are grouped by how they have done on standardized tests, making it easier for Balp to work with the lower-performing kids. A poster charts daily test scores — black ink for high marks, red for low — to show who’s on track to hit state exam goals. Students follow their own progress too: Taped to each desk is a small square of colored paper with the student’s goal score for a daily 12-question assessment.
Balp, a 17-year classroom veteran, holds the master key to all this data: a three-ring binder filled with graphs showing test scores for every student — and forecasts of how they’re expected to do going forward. It informs the interactions he has with his students.
His income hinges on this data. So does his job.
It’s not this way in most Georgia classrooms. But it will be soon.
Georgia, along with 10 other states and the District of Columbia, is introducing educational reforms like those in Houston, aimed at better measuring teachers’ effectiveness. This fall, the first changes will roll out in 26 Georgia school districts, with plans to expand them statewide in four years.
The engine of these changes is the hotly debated use of “value-added” data. The exact way this data is calculated can vary, but it essentially uses results from students’ past standardized tests to project how much they should progress over the school year, then evaluates teachers on that progress. Ultimately, such data will factor into decisions about how Georgia teachers and principals are hired, fired, licensed, paid and evaluated.
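The mechanics vary by vendor, but the core idea of a value-added score can be sketched in a few lines. The following is an illustrative simplification only, not the model any district actually uses; real systems rely on far more elaborate statistical techniques, and the function and variable names here are invented for the example:

```python
def predicted_score(past_scores):
    """Project this year's score by extending the student's average
    year-over-year gain from prior standardized tests."""
    if len(past_scores) < 2:
        return past_scores[-1]  # no trend to extend; assume no change
    gains = [b - a for a, b in zip(past_scores, past_scores[1:])]
    return past_scores[-1] + sum(gains) / len(gains)

def teacher_value_added(students):
    """Average how far a teacher's students landed above (positive)
    or below (negative) their individual projections.

    `students` maps a student id to (past_scores, actual_score).
    """
    residuals = [actual - predicted_score(past)
                 for past, actual in students.values()]
    return sum(residuals) / len(residuals)

# Two hypothetical students with three years of prior scores each:
roster = {
    "s1": ([60, 65, 70], 78),   # projected 75, beat it by 3
    "s2": ([80, 82, 84], 84),   # projected 86, missed by 2
}
print(teacher_value_added(roster))  # → 0.5
```

In this toy version, a score above zero means the class as a whole outpaced its projections; the controversy described below centers on how fairly the real, much more complex models isolate the teacher's contribution from everything else affecting a child's scores.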
It’s a controversial shift for the teaching profession in Georgia, where currently, nearly everyone gets a positive evaluation regardless of how much students learn. It’s also part of a larger trend in U.S. education — spurred recently by the Obama administration’s Race to the Top grant — toward using data to hold teachers and schools accountable for what students are learning. Sixteen states now require value-added achievement data to be factored into a teacher’s evaluation; 10 require it to be the primary factor, according to the National Council on Teacher Quality.
Proponents say the emphasis on data will make teachers better. Critics fear the system will take public education further down the path of teaching to the test, and say it glosses over the intangible ways teachers influence kids.
Georgia’s added emphasis on test data arrives even as the state is investigating whether educators in DeKalb and Atlanta cheated to help boost test scores. A similar situation is playing out in Washington, where testing irregularities have raised comparable questions.
In Houston, Balp said he was skeptical of the value-added data when it was introduced there in 2006. Now, he said, he finds it an accurate measure of his students’ academic capabilities.
Because the data predicts a student’s future performance based on past results, teachers have better insight into how their students should be scoring in class. If a student isn’t on track, the teacher can offer remediation. It also allows teachers to work with their colleagues to address weaknesses revealed by the data. For example, if one fourth-grade teacher isn’t “adding enough value” in math, and another fourth-grade teacher’s students aren’t scoring well in language arts, the two might swap classes for one subject.
“I don’t know what I would do without it,” Balp said. “I can look at the data at the beginning of the year and know how my class is going to be whereas before, I couldn’t tell if the student was low, or if the student was high. Now I can tell if a low student can do a lot better.”
Georgia on fast track
No other state has enacted the level of data-centered change that Georgia is planning. But a few school districts where reforms are already under way offer an indication of how this shift in measuring effective teaching will play out.
District of Columbia Public Schools has one of the most high-stakes evaluation systems linked to the data; teachers who don’t get high ratings can be fired after a single year. Denver administrators say their pay-for-performance system, one of the oldest in the nation, has helped attract higher-quality teachers.
Georgia is planning changes that would link a teacher’s annual salary increases to value-added data. State leaders hope to select the data system they’ll use by June.
In Houston, school administrators say linking pay to value-added scores shifted the district’s emphasis from whether students were passing or failing exams to how much students were learning from year to year. District passing rates have increased and more students are performing at the highest level in reading and math on state exams.
Balp earned $11,330 in bonus pay for the 2009-10 school year as part of the district’s pay-for-performance program, now in its sixth year. It was the district’s highest bonus award, based primarily on student academic progress. By the district’s measure, some of Balp’s students learned three years of information in one school year.
In class, the energetic Balp flutters from group to group, switching between Spanish and English to ensure that his bilingual students understand the instructions for a lesson on probability. The school where he works is in a low-income neighborhood where chain-link fences protect overgrown lawns filled with stacks of tires. Most of the students are Hispanic and qualify for free or reduced-price lunch. But despite the challenging demographics, Lyons Elementary is celebrated as a leader in student growth.
Houston administrators and faculty are not talking about how to get kids on grade level so they’ll pass state exams; they’re discussing how to push students to grow academically beyond their predicted performance.
Houston a pioneer
The Houston district faces challenges much like those in schools across the country. Eighty percent of students are classified as economically disadvantaged. The board is grappling with a $171 million shortfall. Funding for teachers has been reduced and the board is contemplating closing schools where enrollment is dwindling.
But the district, the nation’s seventh-largest with about 204,000 students, prides itself on being a leader in school reform. It was one of the first in the country to offer bonuses to teachers using value-added data.
Now, administrators here are using the data in new ways. A teacher can be recommended for dismissal based on poor value-added scores, although that has not happened, said Superintendent Terry Grier. But he said there are numerous teachers with poor scores who have retired, and non-tenured teachers who did not have their contracts renewed.
“Our teachers are treated with dignity and respect and their due-process rights are not violated,” Grier said. “But teachers who cannot teach children are not going to be allowed to stay in Houston.”
The goal of this data-based movement is to attract, retain and create a stronger teaching workforce, and Houston administrators say there is evidence that’s happening. Almost 92 percent of teachers retained in 2008-09 received a bonus, compared to 84 percent in 2006-07.
In Houston’s toughest schools, though, there’s been a 16.5 percent decline over the past four years in the number of teachers receiving a bonus based on classroom-level performance, meaning fewer teachers in the most challenging environments are boosting academic performance at the highest levels.
Critics are skeptical of how a teacher’s value-added score is calculated, and say some formulas are too difficult to understand. Nationally, there are only a handful of statistical models that generate value-added data; while some are seen as too complex, the simpler ones are considered less statistically accurate.
Parent Rhonda Jones, who has two children in the Houston district, said she has not noticed a change in the way teachers behave since the use of student data took center stage. She supports the idea of rating educators on what students are learning.
“I realized that there were teachers out there who actually showed zero growth,” she said. “If we hold students accountable, we have to hold teachers accountable as well.”
Data can fall short
Many parents, teachers and administrators agree that for the most part, the value-added data is an accurate reflection of student and teacher performance, said Carla Stevens, assistant superintendent for Houston’s department of research and accountability.
But in some areas — including classes of highly advanced students and students moving from Spanish to English — a teacher’s true impact isn’t showing up in the test data, she said.
“We do have some principals who say, ‘My best teachers in these certain areas are not being identified as good teachers, but we know they are,’” Stevens said. “We’re trying to run the numbers, investigate and see where this is a problem, and where it is a perceived issue but not a real issue.”
Darilyn Krieger, a physics teacher at Carnegie Vanguard High, a Houston school for gifted and talented students considered one of the best in the nation, said she is proof value-added data can be misleading.
Krieger said she earned almost $6,000 last year because of her student-growth scores. But this year, the data showed her students didn’t learn a year’s worth of information, even though 100 percent are passing and doing college-level coursework.
That’s because the district uses a test that doesn’t measure what her students learned in the current year. Instead, it’s an exit exam designed to measure what they have learned in science over 11 years. Her chief concern is the value-added data’s lack of transparency — it isn’t specific enough to show where her students did not meet expectations.
“I am being measured and told I didn’t do my job, but they can’t tell me what I did wrong,” Krieger said. “I don’t care what my scores were. I don’t care if I get a dollar on this stupid award. I want my kids to do well because they know the material.”
Debate comes home
Krieger’s concerns echo those of educators nationwide who believe no test can truly determine who is a good teacher. Georgia education groups have concerns about how quickly the state is introducing its new system. In the next four years, a Georgia teacher’s livelihood — from employment to salary — will be based on a data system that has yet to be chosen.
Tim Callahan, spokesman for the 81,000-member Professional Association of Georgia Educators, said the new changes have the potential to be a “disaster.”
“We’ve read some of the research that tells us that value added has some bugs in it and is not totally worked out,” he said. “It’s troubling because very important decisions about salary, promotion, retention are going to be made on flimsy data.”
Supporters of using student test data to rate and pay teachers almost universally agree that it can’t be the only factor used in decision-making. In Georgia, 50 percent of the evaluation of teachers in “core” subjects — those covered by standardized tests — will be based on the growth scores; the other half will come from classroom observations, lesson plans and student surveys. Core teachers, such as those in math and writing, make up about 30 percent of the educator workforce.
States shouldn’t wait for a perfect system, proponents say. Houston administrators offer a common rebuttal to complaints about its value-added formula: “I don’t know exactly how the engine of my car works, but I trust it’ll get me where I want to go.”
Back in Balp’s classroom, sweat gathers on the teacher’s forehead as he darts from group to group. Balp says he’s not working harder than he used to, only more intelligently.
“More than anything, it is a tool for the teacher,” he said. “This is not a tool for principals to use against you. Whether you get evaluated on your performance, that’s a side-effect. I think this data is useful.”