12/06/2010 02:41 pm ET | Updated May 25, 2011

Playing Games with Data is Actually Playing with Students' Educations

As Michelle Rhee takes her dog and pony show to Florida, another bastion of data-"driven" accountability, we should take note of the latest study of the misuse of data in the D.C. schools, led by Becky Smerdon for the Council of the Great City Schools.

I am thrilled by the report because, before NCLB, I received timely and invaluable information about whether each student's absence was unexcused or excused due to medical, legal, counseling, or family matters. But it is even easier to manipulate attendance data than it is to jack up graduation rates or test score pass rates. Since NCLB, in my experience, whenever there is a need to keep bad news from moving up the chain of command, a new attendance code is created for the computer, and absences disappear.

If my district, the District of Columbia, Florida, or any school district really wanted to use data to help students, as opposed to making accountability numbers look good, attendance data would be used as an early warning system. The Council of the Great City Schools, however, discovered that attendance data was complete for only 23 percent of the students in the study. The report expressed surprise that no system existed for taking attendance, or even for setting a basic policy on absences. It also found that the schools leanest on staff struggled the most, and that if the system wanted a job done, it had to assign a specific person to the task.

Worse, the study of data in four D.C. schools documented a predictable pattern in which test results were used "to drill on test items and teach ONLY what was tested. In addition, schools were using these results to identify and focus on the students just below the threshold of passing." One school used a trained reading specialist to boost the scores of "bubble kids," who were close to passing and who thus would make the school look better, while the students with the lowest skills were dumped on the P.E. teacher. Nor should it be a surprise that the new tests looked like the old tests, and that the nature and scheduling of the tests made it impossible to use data to inform instruction. As the report concluded, when the real purpose of data is to hang charts on the wall and pretend that reforms are yielding results, it is no surprise that trust is eroded, as educators and students face "burnout or bored-out."

Please read more of my thoughts at Scholastic.