SCALE and edTPA Fire Back!: Methinks They Doth Protest Too Much

Beverly Falk, a professor of Early Childhood Education at the City College of New York and the director of its Graduate Program, posted a statement defending the Teacher Performance Assessment, better known as edTPA. But there are a few things Dr. Falk neglected to mention in her defense.

On June 13, 2014, Beverly Falk, a professor of Early Childhood Education at the City College of New York (CCNY) and the director of its Graduate Program, posted a statement on Diane Ravitch's blog defending the Teacher Performance Assessment, better known as edTPA. edTPA is currently being implemented in a number of states through a partnership of SCALE (the Stanford Center for Assessment, Learning, and Equity), the American Association of Colleges for Teacher Education (AACTE), and the publishing and testing mega-giant Pearson Education. About the same time, SCALE released "edTPA MYTHS and FACTS," which responds to critics of edTPA and purports to set the "record straight."

The Falk statement was signed by thirteen City University of New York (CUNY) faculty members. It makes clear that the criticism they are most directly responding to is my Huffington Post blog post, "The 'Big Lie' Behind the High-Stakes Testing of Student Teachers." Falk refers to me by name, Alan Singer, four times in her statement, although she did not link readers to the blog.

There are a few things Dr. Falk neglected to mention in her defense of edTPA and SCALE.

1. The first thing Dr. Falk neglected to mention is that the Professional Staff Congress (PSC), the union that represents CUNY faculty, opposes the implementation of edTPA and proposed a resolution rejecting edTPA to NYSUT, the umbrella group representing all teachers in New York State. According to the PSC, edTPA and other proposed teacher certification reforms "fail to take into account the specific communities and populations teacher education programs [serve]"; they are being imposed without adequate research or demonstration of their validity; and they "reduce the practice of teaching to a series of quantifiable behaviors that do not capture the complexity and nuance of teaching."

United University Professions, which represents academics and professionals involved in teacher preparation programs at seventeen SUNY campuses, also strongly opposed New York State's rush to implement edTPA. According to Jamie Dangler, Vice President for Academics, United University Professions, "While most states are at various stages of developing their use of the edTPA or other performance assessments for teacher candidates, New York moved quickly to make the edTPA a high stakes certification requirement."

2. In the statement, Falk and her colleagues describe themselves as "teacher educators from the City University of New York, a university comprised of many campuses across NYC that serve a socioeconomically, culturally, racially, and linguistically diverse population of students. We are advocates for equity and access in education." The twelve full-time faculty who signed the statement come from four CUNY campuses. However, CUNY offers teacher education programs at ten senior colleges as well as at seven community colleges. By my estimate, there are over 500 full-time teacher education faculty at the various CUNY campuses, including at least 49 at the City College of New York, where Dr. Falk teaches. Dr. Falk and her co-signers represent a tiny fraction of those faculty members.

3. In the statement, Dr. Falk neglected to mention her professional connections with SCALE and edTPA. According to her CCNY faculty profile, Dr. Falk is a "partner of the Stanford Center for Equity, Learning, and Assessment (SCALE), an organization that supports innovation in performance-based assessment for teachers and students." In September 2012, Dr. Falk delivered the opening remarks at a SCALE-sponsored conference on "The edTPA as an Educative Enterprise." The conference program listed her as a "Senior Scholar, SCALE." In August 2013, Dr. Falk, again in her capacity as a Senior Scholar for SCALE, represented the organization at an ISNetworkED conference in New York State. She was also the "guest speaker" representing SCALE at a 2013 Long Island, New York, conference on the opportunities and challenges in edTPA, and she was a featured speaker defending edTPA at the 2014 Barnard College National edTPA Conference. I emailed SCALE asking if Beverly Falk is a paid representative of SCALE. At this point I have not received a reply.

Some of Falk's claims in her response to my Huffington Post blog challenge credulity. For example, according to Falk, "researchers found that 96% of teacher candidates reported that the edTPA was a positive influence on their learning, pointing especially to how it made them more self-aware and focused them on student learning. More than 90% of teacher educators reported the experience of supporting the edTPA enabled them to reflect on and improve their program design and instruction." This sounds more like the results of a Soviet-era rigged election than education research. I do not believe 90% of anybody ever agrees on anything, let alone 90% of teacher educators and 96% of teacher education students.

Falk provided a link to an edTPA resource page, but not to a specific study. Following the edTPA link, I located a study, "Developing and Assessing Beginning Teacher Effectiveness: The Potential of Performance Assessments," whose primary author was Linda Darling-Hammond of SCALE. In the study, Darling-Hammond evaluated the Performance Assessment for California Teachers (PACT), an early version of edTPA. I assume this is the study Falk is referring to. Darling-Hammond concluded, "Candidates' feelings that they learned from the assessment were strongest when they also felt well-supported by their program in learning to teach and in completing the assessment process."

However, I found no evidence for the claims made by Falk. The report cited a high level of positive responses from teachers who participated in National Board certification, but not for PACT or edTPA. This is a big difference. National Board certification is voluntary and it is for experienced teachers. edTPA is required for beginning student teachers seeking initial certification. This looks like an example of academic "Bait and Switch" to me.

Not to be too picky, but I also believe Falk made a slight reporting error. "A survey of more than 5,600 National Board candidates found that 92% believe the National Board Certification process helped them become a better teacher," not 96% (13). In addition, "Well over 80 percent of candidates felt that their teacher education coursework and student teaching experiences were helpful in preparing them for the PACT teaching event," but that only shows that the teacher education programs were modified to prepare students for the test (25). I may be reading this all wrong. If I am, I hope Beverly Falk can provide citations with more specific evidence supporting the validity of edTPA.

Dr. Falk focused on my claim that edTPA involves a partnership between Pearson, SCALE, AACTE and the states where it is being used. She disputes state involvement, which I think will be a touchy subject with State Education Departments if she is correct.

Falk describes Pearson as "an operational partner (much like the publisher of a text), responsible for creating and managing the online platform that collects portfolios and delivers them to the teachers and teacher educators who score them." Instead of complaining about me, maybe Beverly Falk needs to complain about Pearson. The Pearson Assessment website lists edTPA as one of their assessment products and services. According to the Pearson Education website, since 2010 Pearson has been "recruiting 1,200 educators to score the new Teacher Performance Assessment (TPA), a nationally available, performance-based assessment instrument for measuring the effectiveness of teacher candidates seeking initial licenses."

Among my major criticisms of edTPA are my questions about the qualifications of the evaluators and the inter-rater reliability of the evaluations. One respondent on the Ravitch website reported, "I received an invitation to serve as an edTPA scorer for secondary math education. I am not certified in mathematics and have never been a secondary math teacher." This could have been a mistake. Mistakes happen.

But other problems are built into the recruitment procedure. According to the Pearson call for evaluators, edTPA evaluators have to be experienced teachers or teacher educators in the subject area or grade where they are applying to evaluate student teachers. However, nothing is said about the academic level of the students or the demographics of the schools. For example, an evaluator can be a teacher with experience working with honor students in an affluent suburban school and be assigned to evaluate student teachers working with academically challenged students from impoverished rural towns or troubled inner-city communities, or vice versa. As far as I can tell from the websites, applicants do not have to provide evidence that they actually were good teachers, worked in inclusive and multicultural classrooms, or are familiar with, support, and use the learning standards of the states where the student teachers are working.

According to the SCALE mythbusters release, it is a myth that (a) "Pearson hires part-time employees who are unqualified and don't score reliably"; (b) "External scorers don't know our candidates and can't/shouldn't judge their teaching"; and (c) "Pearson scores edTPA." The fact, which they acknowledge, is that "Pearson manages edTPA scoring activities" using "scoring training . . . designed by the Stanford Center for Assessment, Learning and Equity (SCALE)."

I accept that "edTPA scorers include teacher educators, clinical supervisors of student teachers, K-12 teachers, administrators and National Board Certified Teachers" with "verified experience," but this does not address my concerns. If SCALE and Pearson are so confident of the quality of their edTPA evaluators, why don't teacher education candidates, State Education Departments, and Schools of Education receive signed evaluations that include the qualifications of the individual reviewers?

SCALE's edTPA mythbusters release also claims that "all scorers complete an extensive 20-plus hour training curriculum that includes multiple checks to ensure that they score consistently" and that the training contains an "anti-bias module."

There are at least three problems here. First, I do not see any evidence to support the claim that a twenty-hour online program ensures consistent scoring or in any way eliminates participant bias. Second, if a simple online module could eliminate bias based on "gender, socioeconomic status, region/location, and language," as they claim, as well as change attitudes toward "instructional context characteristics" such as "classroom setting or context, curriculum constraints, grade level or teaching assignment," we would have a much better society, far fewer conflicts about what should be taught and how it should be assessed, and solutions to all of our country's social and educational problems. Unfortunately, we do not. Maybe Pearson should broadly distribute the anti-bias module!

The third problem is inter-rater reliability. The SCALE mythbusters release claims "scorers are back-read by scoring supervisors and score previously scored validity portfolios to ensure they continue to score consistently and without bias." We are not told how often this happens or how accurate the procedure is. Teacher education candidates pay $300 to be evaluated, and evaluators are paid only $75 per portfolio that they evaluate. Is there enough money available to ensure a rigorous checking mechanism? As I stated in my original blog, I had strong candidates who passed edTPA with relatively low scores and some weak candidates who were judged to have reached mastery level. Because the evaluation process is outsourced to Pearson, there are no legal requirements for transparency or an open evaluation process.

Falk concludes, "Although there is no such thing as a perfect assessment, especially for something as complex as teaching, we believe that edTPA vastly improves the process by which teachers are certified in New York State." She quotes from interviews with three student teachers who are featured on the official edTPA website. Once again, Falk leaves something out as she tries to make her case. She neglected to mention that these students participated in a voluntary edTPA trial and did not face the level of stress experienced by student teachers who were being evaluated for real.

My students have a different story to tell. I think their story is more valid, but I can't prove it. Molly Clingenpeel completed New York State certification in Secondary Education, Social Studies as a graduate student in a Master's program. According to Molly, "My experience with edtpa is not one I'd like to repeat. It distracted me from trying to create lessons that would really connect with my students and the writing portion was a waste of time. A lot of the prompts were repetitive and ones that I was already doing in my classes so I felt I was doing tedious, annoying, repetitive work. The video taping was helpful but having them looked at and judged is just another form of testing. Student teachers will give the state what they want just so they pass."

Michele Dello Iacono, who completed Secondary Education, English Language Arts certification as an undergraduate, wrote: "I think the rebuttal to Alan Singer's post brings up some good points, however, they completely disregard the many problems in the edTPA process! The idea of edTPA is commendable, but the roll-out of the assessment and the way it was first presented to us was degrading. edTPA tries to determine whether someone is 'right' for the teaching profession, but it doesn't go about it in a coherent and justifiable manner. The reflection process was useful, but it was only helpful for the three lessons that I included in my edTPA portfolio. Even that was rather tedious, the same things that had to be 'prompted' into the lesson in a different way about five different times. With constant updates being sprung on us, my classmates and I felt like we were on a hamster wheel: running and exerting ourselves without getting anywhere. This experience only taught me how to ace the test, not how to become a more effective teacher."

Jeffrey Edmundson, a teacher educator at the University of Oregon, wrote as a comment on the Ravitch blog that "I, too, have worked with the edTPA with students, and debriefed them extensively. Not a single one of them found it more valuable than what we were already doing."

Alexandra Miletta, a New York-based teacher educator and another critic of edTPA, described the whole edTPA assessment process as "building a plane as we fly it." I think Miletta is too kind. edTPA is more like someone banging your head with a stick and telling you how good it will feel when they stop.

As for Beverly Falk, I can only quote William Shakespeare. "The lady doth protest too much, methinks."
