Huffpost Education

Alan Singer

The "Big Lie" Behind the High-Stakes Testing of Student Teachers

On April 30, 2014 the Higher Education Committee and the Education Committee of the New York State Assembly held public hearings on a new teacher certification evaluation system, known as edTPA and administered by Pearson. It was scheduled to take effect on May 1, 2014.

The Assembly hearing became moot before it was even held, at least temporarily, when the New York State Regents, the governing body for education, decided to postpone implementation until July 2015, although this year's student teachers had already completed, submitted, and been graded on the project which included videos and commentary. Nobody discussed whether the student teachers would be refunded their $300 test fee.

State Education Commissioner John King said the delay was in response to requests from college faculty and others for a "safety net" covering teacher candidates. More likely it was a decision by Governor Andrew Cuomo to make peace with the state teachers union as he prepares for his reelection campaign. Education professors at the state university system who are members of the NYSUT, New York State United Teachers, had lobbied hard for the postponement.

Ironically, the New York State decision came at the same time that Arne Duncan and the federal Department of Education announced it would press to make similar teacher certification evaluations a national requirement. My recommendations for preparing edTPA portfolios may quickly become relevant nationally.

Although it is being used to evaluate student teachers for certification, the TPA in edTPA stands for Teacher Performance Assessment. Student teachers in my seminar suggested a better title would be "Torturous Preposterous Abomination," although "Toxic Pearson Affliction" was a close runner-up in the voting.

All of my students passed the edTPA evaluation, including some who I felt were weak. In one case, two student teachers who handed in very similar packages received significantly different scores, which calls into question the reliability of the evaluations.

Statewide, the passing rate was 83%. One graduate student summed up the way the class felt about the procedure. "The whole process took time away from preparing in advance for future lessons . . . It really just added unneeded stress."

Based on my experience this year with edTPA, I made the following recommendations to my university for preparing student teachers for the test in the future.

1. We prepare students as if New York State will eventually enforce edTPA while at the same time we organize to get the requirement dropped.

2. We keep edTPA submission as a university requirement for student teachers for 2014-2015 although those who fail will be permitted to take an optional exam.

3. edTPA and student teaching are not the same thing. Student teaching is about learning to be an effective and creative teacher. edTPA is about following directions. Some weaker student teachers performed very well on edTPA because they followed the format.

4. There were five keys for successful edTPA submissions.

a. Whatever lesson plan format they use on a regular basis, for the edTPA submission student teachers need a very structured format that includes all edTPA requirements. They do not need to do this for every lesson - that would be torture - only for edTPA.

b. Students need to video an appropriate lesson. Students who failed, failed because of poor lesson choices. Sometimes they were too restricted by the cooperating teacher. If a teacher or school will not accommodate them, move them right away.

c. For videoing, set up the room as a television studio. Have the board, smart board, map, etc. right behind the student teacher. Have the students you want on camera right in front of the camera working as a team. Keep the lights on. You do not need field microphones. They make the whole thing too complex.

d. Student teachers must have evidence of informal and formal assessments. As they walk around the room they should interact with students and take notes on a note pad or electronic device. This needs to be seen on camera. Every lesson should end with a final writing assignment that will be submitted as evidence of three levels of performance.

e. After teaching and videoing the lessons, students should rewrite lesson plans to reflect what actually took place. Some colleagues feel that reviewers value writing over teaching since the reviewers do not actually see much teaching. This would explain why some weaker student teachers who are good writers scored higher than expected. I recommend that lesson plans and commentaries include a lot of action words (students will analyze, evaluate, explore, examine, determine, assess) and that student teacher teams review and edit each other's work before they submit portfolios to Pearson.

5. The new Pearson-created ALST and EAS tests were harder than previous Pearson certification tests. Schedule a review as part of the student teaching retreat before the start of each semester. The key to success on the Pearson ALST is to remember they are not asking for the right answer or the best answer on the multiple choice part of the test, but for the answer they embedded in the reading passage. Don't argue with the test. Find their answer in the passage.

6. The Pearson EAS asks students to develop strategies for teaching diverse student populations, English Language Learners, and students with special needs. They get to pick the grade, subject, and topic. These essays are worth 30% of the exam score. Students reported being unable to finish this task. They need to have strategies ready before the test.

Below is the testimony I e-submitted to the New York State Assembly Committee. I demanded that edTPA be dropped rather than postponed. I figure I will probably have to submit my testimony again next year when the postponement runs out.

Did Mike Trout learn to play baseball by writing a fifty to eighty page report explaining how he planned to play baseball, discussing the theories behind the playing of baseball, assessing a video of his playing of baseball, and explaining his plans to improve his playing of baseball?

Did Pablo Picasso learn to paint by writing a fifty to eighty page report explaining how he planned to paint, discussing the theories behind painting, assessing a video of his painting a picture, and explaining his plans to improve his painting?

Did you learn to drive a car by writing a fifty to eighty page report explaining how you planned to drive a car, discussing the theories behind driving a car, assessing a video of your driving a car, and explaining your plans to improve your driving?

Of course the answer in all three cases is a resounding "NO!" You learn to play baseball, paint a picture, or drive a car by playing baseball, painting pictures, and driving cars, not by writing about it.

Yet Stanford University, Pearson, and New York State are trying to sell the public on the idea that you learn to teach, not by teaching, but by writing about it. They also want you to believe that they have perfected a magical algorithm that allows them to quickly, easily, and cheaply assess the writing package and accompanying video and instantly determine who is qualified to teach our children. Maybe they plan to sell the algorithm to Major League Baseball next.

New York State is currently one of only two states that propose to use edTPA to determine teacher certification. Not only should New York State postpone the implementation of edTPA, but it should withdraw from the Pearson, SCALE, Stanford project. edTPA distracts student teachers from the learning they must do on how to connect ideas to young people and undermines their preparation as teachers. Instead of learning to teach, they spend the first seven weeks of student teaching preparing their edTPA portfolios and learning to pass the test. Based on preliminary results from the first round of edTPA, most of our student teachers are pretty good at passing tests, so edTPA actually measured nothing.

Sometimes in the middle of intellectual dishonesty and double-speak, the spin doctors make a mistake and accidentally tell the truth. SCALE, the Stanford Center for Assessment, Learning, & Equity, recently spilled the beans about its partnership with Pearson and the American Association of Colleges for Teacher Education (AACTE) to create edTPA. In a question and answer session at a breakfast meeting held at the annual conference of AACTE on March 2, 2014, Sharon Robinson, president of AACTE, and Ray Pecheone, executive director of SCALE, discussed edTPA. The bottom line is that the companies and organizations pushing for these "innovations" and profiting from them are using students and teachers as their laboratory guinea pigs. They have no proof that these things really work and will improve education in the United States.

SCALE claims that "edTPA is a preservice assessment process designed by educators to answer the essential question: 'Is a new teacher ready for the job?'" However, in a question and answer session at an AACTE breakfast meeting, when asked, "Why hasn't SCALE completed any predictive validity research for edTPA? Has this been linked to variation in classroom performance in any way?" they had a very strange response.

"Predictive validity studies for licensure assessments are routinely conducted after a test or assessment has been in operational use. In fact, examining the validity processes used for other forms of performance assessment of teaching, there is not one instance where predictive validity was established prior to the adoption and operational use of the assessment . . . conducting predictive validity studies during a field trial introduces many sources of error that could compromise the results, including the main concern that candidates are not the teacher of record during clinical practice that certainly would confound the results of the study."

In other words, despite their claims, there is no evidence that edTPA accurately measures anything.

Robinson and Pecheone went on to say that "The implementation of predictive validity requires following candidates into their teaching practice for several years in order to obtain a stable estimate of student learning based on the research findings of value-added studies conducted for teacher evaluation."

Again, in other words, they expect New York State to buy in, but they will have no idea about the predictive validity of edTPA for years, until they can see whether teachers who scored high on the test actually become good teachers.

Meanwhile, student teachers are not being evaluated by trained field supervisors or cooperating teachers, but by temporary evaluators of questionable qualifications (they work online so they can be anywhere in the United States and have no familiarity with New York State) who are hired by Pearson, which is most noted for its famous pineapple-races-a-hare passage on an eighth grade state reading assessment. As of March 17, 2014, Pearson was still trying to hire people to evaluate the portfolios. Pearson was requesting that "scorers possess both strong pedagogical content-specific knowledge and experience in roles that support teaching and learning in the edTPA content area in which they are scoring," but had no procedure in place to evaluate the evaluators. The portfolios contain twenty minutes of video and between fifty and eighty pages of lesson planning and commentary, but evaluators were expected to complete their task in two hours and were being paid $75 per portfolio, or $37.50 an hour if they worked fast.

After reading the pineapple passage, students were asked to decide which animal spoke the wisest words, the hare, the moose, the crow, or the owl. Now it is your chance to speak. As the Grail Knight said to Indiana Jones in the movie "Indiana Jones and the Last Crusade," "Choose wisely."