Originally published in the Florida Times-Union
Following a protracted court fight, Florida on Monday released hundreds of thousands of scores showing how it measures teachers’ effect on student learning.
Parents can now see exactly how Florida’s controversial value-added model, or VAM, gauges the state’s teachers.
The Florida Department of Education’s value-added model looks to quantify how much, or little, individual teachers contribute to students’ progress over the school year.
Local school systems use the scores as part of a teacher’s evaluation, although districts have some latitude on how the data is incorporated. At least two counties locally plan to change their use of the measurements, the validity of which is disputed by many teachers.
The scores, publicly available for the first time, reflect pockets of excellence and low performance.
About 58 percent of school districts in Florida last year saw a majority of their teachers receive aggregated scores below statewide norms, according to a Times-Union analysis of the data.
The analysis of those teachers who received scores also showed:
■ In most Northeast Florida school districts, the majority of teachers were scored above the average, according to aggregated numbers.
■ In St. Johns County, 68 percent of the teachers received above-average evaluations, the highest rate in the state. Overall, St. Johns teachers were credited for helping students make about 15.6 percent more progress than a state model said would have been typically expected.
■ Baker, Clay and Nassau counties also had a majority of teachers outperform the state average. Baker students’ gains – about 9 percent more than typical – led to its teachers having the third-highest aggregate VAM ratings in the state. Clay had the state’s seventh-highest teacher score, about 7.6 percent higher than would have been expected. Nassau County teachers were scored about 2.6 percent higher than the state overall.
■ In Duval and Putnam counties, slightly more than half of the teachers who were rated received below-average VAM scores. The state’s modeling said Duval teachers contributed to about 1.5 percent less progress than would typically be expected, and Putnam teachers’ impact was about 2 percent below normal.
Precise analysis of the teacher data is difficult because the data contains many redacted and duplicate names.
In addition to providing VAM scores for many teachers, the state also provided scores for schools. About 57 percent of Duval County schools scored below average, and 59 percent of schools in Putnam County were below average.
The data was released after the Times-Union sued for access under Florida’s open-records law.
“The commissioner and the Department of Education have been fighting for teachers in an effort to maintain the confidentiality of teachers’ names and their individual value-added data,” said Kathy Hebda, the education department’s chief of staff. “This was important to the commissioner because she believes in the value of the teacher-principal relationship for professional development, which is supported when evaluation information has a period of protection.”
The value-added model tries to predict how much each student will improve each year, using a complex mathematical formula that considers student scores on state reading and math tests as well as other school-wide and statewide data. When a student does better or worse than predicted, the model measures the difference and counts it as the value the teacher added to that student’s education.
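The core idea can be sketched in a few lines of code. This is only an illustration of the predicted-versus-actual logic described above, not the state’s actual formula, which incorporates many more school-wide and statewide factors; the function names and the simple “prior score plus typical gain” prediction are assumptions for the sake of the example.

```python
# Illustrative sketch of the value-added idea -- NOT Florida's actual model.
# Here the "prediction" is just the student's prior score plus the typical
# statewide gain; the real formula uses many more covariates.

def predicted_score(prior_score, statewide_avg_gain):
    """Hypothetical prediction: prior-year score plus the typical statewide gain."""
    return prior_score + statewide_avg_gain

def value_added(teacher_students, statewide_avg_gain):
    """Average gap between actual and predicted scores across a teacher's students."""
    gaps = [
        actual - predicted_score(prior, statewide_avg_gain)
        for prior, actual in teacher_students
    ]
    return sum(gaps) / len(gaps)

# Example: three students, each as (prior-year score, current score),
# in a year when the typical statewide gain is 12 points.
students = [(300, 320), (280, 295), (310, 318)]
print(value_added(students, statewide_avg_gain=12))
```

A positive result means the teacher’s students beat the prediction on average; a negative one means they fell short of it, which is the sense in which some districts’ teachers scored “below” the statewide norm.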
School districts have received value-added data since the 2011-12 school year; the latest was released last summer. It can reflect student test score growth from one, two or in some cases three academic years.
Districts use VAM scores for 40 to 50 percent of a teacher’s annual evaluation. Those evaluations can help decide whether a teacher keeps his or her job, or whether he or she receives a raise.
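The weighting works like any weighted average. The sketch below assumes a district that counts VAM as 40 percent of the evaluation and everything else (observations, goals, parent input) as the remaining 60 percent, with both components on a hypothetical 0-100 scale; the actual scales and weights vary by district.

```python
# Hypothetical weighting sketch: VAM counts for 40 percent of the annual
# evaluation, other measures for the remaining 60 percent. Scales and
# weights here are assumptions, not any district's actual policy.

def overall_evaluation(vam_score, other_score, vam_weight=0.40):
    """Weighted combination of a VAM score and other measures (both 0-100)."""
    return vam_weight * vam_score + (1 - vam_weight) * other_score

# A teacher with a middling VAM score but strong observations:
print(overall_evaluation(vam_score=70, other_score=85))
```

The example shows why the weight itself is contested: moving VAM from 50 percent down to the 25 to 30 percent the union prefers can noticeably shift an overall rating when the two components disagree.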
Hebda stressed that value-added measures are just a portion of a teacher’s evaluation. She also said the measures aren’t new; they are used in the health care and economic sectors as well as in education.
State officials said the scores are accurate and reliable, but when it comes to evaluating teachers, value-added scores are not meant to stand alone. They should be combined with classroom observation reviews, parent input, and principal and teacher professional goals for the school year, Hebda said.
“Looking at this data in isolation can lead to a misunderstanding about an individual teacher’s evaluation,” she said.
State teachers union officials agree with her on that point, but not on many others. The use of value-added scores has drawn harsh criticism from teachers unions.
The Florida Education Association opposes disclosing value-added data to the public and opposes what it says is the over-use of that data in teacher evaluations. Union officials said last week they distrust the state’s value-added formula and believe it is too flawed to make up as much as half of a teacher’s evaluation.
“To my mind the data is not useful for anyone or anything,” said Catherine Boehme, a Pensacola teacher who chairs the union’s personnel problems committee.
The system is moving too quickly, said Andy Ford, union president, and has major flaws like not accounting for poverty and not having a reliable system ensuring teachers’ scores are based on the students they actually taught.
Union officials also criticized the VAM scores because they rely on students’ FCAT reading and math scores and Algebra 1 end-of-course exam scores, even for teachers who don’t teach those subjects.
In other words, more than two-thirds of Florida’s teachers have been evaluated, in part, on scores for students or classes they don’t teach, according to union officials.
That includes teachers in kindergarten through grade three, teachers in high school who don’t teach reading or math, and teachers of art, music, physical education, certain sciences and social studies.
Boehme, who teaches high school biology and advanced chemistry, said her VAM score was based on her students’ ninth-grade Algebra and 10th-grade reading scores, subjects she doesn’t teach. She also said she didn’t get her VAM scores until well into the following school year, making it hard to know what things she should do to improve.
In many school districts, including Duval and St. Johns, educators who teach art, music or physical education had 40 or 50 percent of their total evaluation based on average school-wide VAM scores.
Things are changing, state officials say.
Beginning with this school year, thanks to a court decision, state rules give districts more flexibility to choose how they’ll measure student growth for teachers of subjects not covered by the FCAT or the Algebra I end-of-course exam.
With state approval, districts can develop their own tests or purchase assessments to measure student growth and achievement in non-FCAT courses. Officials in Nassau and Duval counties said they’re developing new assessments or purchasing others to measure student growth in the untested courses.
That doesn’t resolve every concern about VAM. The union says VAM should amount to only 25 to 30 percent of a teacher’s evaluation, while the state favors 40 to 50 percent.
There’s also the nagging question about what happens when Florida again changes its state exams to reflect changing academic standards.
The state has already begun incorporating the more rigorous Common Core state standards, with some statewide alterations, and next expects to develop a new set of state tests for more grades and subjects.
Union officials are seeking a time out.
Using VAM, Ford said, “is just moving too fast to be valid… We’re in a state of flux right now.”
But Hebda, with the Department of Education, said she is confident of a smooth transition because Florida has already made many changes to its grading and testing systems and still produced consistent VAM scores.
“We anticipate everything will go very, very well,” she said. “The school districts have been our partners on this and there has been a lot of hard work on it the last three years.”