Is Pearson (Mis)Education Trying to Intimidate Critic?

In a May 2014 blog post, Pearson CEO John Fallon claimed there are "no gag orders" and that teachers are free to discuss their opinions of Pearson tests. According to Fallon, test design is a "complex system and one that, working with the education community, we're constantly looking to improve. With all that going into creating an assessment, it's understandable when confusions arise. A recent example - that there are "gag orders" in these contracts - is in danger of becoming a popular misconception. Let's be clear: there are no gag orders in testing contracts." Not only that, but "we believe that progress can only be made if everyone contributes to improving the system." Unfortunately, some of Pearson's employees have not gotten the message!

Michael Ludwig is a teacher educator in a health education program in New York State. I recently reported that he wrote to Pearson (Mis)Education alerting the company that its edTPA reviewers were not following grading guidelines.

Completion of edTPA is required for teacher certification in New York State. Student teachers must complete an electronic portfolio that generally includes two ten-minute video segments cut from three to five sequential 40+ minute lessons and approximately sixty pages of commentary with sample student work. They then submit their edTPA portfolio to Pearson for evaluation and receive a grade of pass, fail, or mastery, with limited feedback. It is an arduous task that takes months to complete. Instead of learning how to become teachers, they learn how to master edTPA.

edTPA was created at Stanford University by SCALE, the Stanford Center for Assessment, Learning and Equity. Essentially, SCALE and Pearson decided they could replace student teacher evaluations by university field supervisors and cooperating teachers with an electronic portfolio. The edTPA website claims thirty-eight states and Washington, DC participate in the program, although there is wide variability in how states use it.

Michael Ludwig is an experienced health educator who, with the advent of edTPA, revised the MS in health education at his university to align with the expectations of the tasks that make up the portfolio. After several of his colleagues had difficulty helping his health education students pass edTPA, he was determined to study and master its intricacies. He registered to become a scorer for Pearson so he would be better able to support his health education students. He scored a practice portfolio and was told during the subsequent webinar that he was the best scorer they had worked with to date: he matched the expected score on the majority of the rubrics, and the few he did not match exactly were very close. However, because his time demands increased during the semester, he never submitted a second practice portfolio and did not become a certified scorer.

In his letter to Pearson, Ludwig explained that he and his student teacher followed Pearson's guidelines for appropriate candidate support and "used the 'Thinking Behind the Rubrics' as our guide and were confident that no rubric was below a three and that most of them should have earned a four or five." He requested that Pearson have the submission re-scored without charging the student for resubmission.

Pearson's response, an email to Michael Ludwig from Kerry Seidel, Senior Project Director at Pearson North America (email address kerry.seidel@pearson.com), was basically an effort to intimidate a critic. Seidel wrote:

"Within the body of the email, you've stated you and the candidate both used the 'Thinking Behind the Rubrics' as a guide in supporting your candidate. Please refer to the guidelines documented in the Confidentiality and Acknowledgement Form which is signed by scorers prior to scorer training. Below is the portion on confidentiality:

Both during and after my employment with Pearson, I agree not to use or reveal to others any information about Pearson's products or business except as required by employment to Pearson. This includes information I learn while working for Pearson, which I have been told or reasonably know to be information which is confidential, or which is the subject of reasonable efforts to preserve its confidentiality.

I will not reveal to anyone: 1) training instructions and or procedures; 2) scoring trends; 3) any details about the scoring system; 4) any results of scoring either before or after completion of the scoring.

I agree not to use or reveal any proprietary or confidential information from any customer or other third-party that is made available to me during my employment.

edTPA scorers, supervisors, or trainers may not reveal information that falls under the scorer NDA. This includes providing "scoring hints" or other types of coaching that targets achieving high scores on edTPA through the use of scorer training materials, including 'Thinking Behind the Rubrics'. The slightly modified version of that resource is available to program faculty and candidates as 'Understanding Rubric Level Progressions.' All materials and discussion should be about good teaching in reference to edTPA-related constructs."

But after threatening Ludwig, Seidel concedes that a "slightly modified version of that resource is available to program faculty and candidates as 'Understanding Rubric Level Progressions.'" In fact, what Seidel claims is confidential, and what Ludwig used with his student, is widely available on the Internet. I found versions of "Thinking Behind the Rubrics" on the University of California Santa Barbara and San Jose State websites. One can reasonably conclude that this is not confidential information.

My blog on edTPA also drew an email response from Dr. Andrea Whittaker, director of Teacher Performance Assessment at SCALE. Among her responsibilities, Dr. Whittaker coordinates a national consortium that includes the American Association of Colleges for Teacher Education, the Council of Chief State School Officers, and Stanford University, many of the forces behind the Common Core and high-stakes assessment in the United States.

Dr. Whittaker emailed me: "As national director for edTPA, I would like to offer a few corrections to some inaccuracies in your recent blog (April 25, 2016)."

Dr. Whittaker wrote: "SCALE was an early partner in the development of RESA (used during induction in Ohio) but it is not an "adaptation" of edTPA. In fact, most of the 51 educator preparation programs in Ohio are choosing to use edTPA for preservice program completion. TeachScape and Charlotte Danielson led the development of RESA and SCALE left the partnership before it was launched in Ohio. The two assessments are similar because they are performance-based assessments of teaching, but they are not the same nor an adaptation of one another."

Alan's response: Or maybe they are similar because they were created by the same people before "SCALE left the partnership," and because Ohio is committed to using edTPA in its pre-service teacher evaluations.

Dr. Whittaker wrote: "Pearson does not 'grade' edTPA. Educators with subject-specific expertise score edTPA in a Pearson hosted platform. All scorers are either P-12 teachers with recent classroom experience in that field, or university faculty who supervise or teach methods in that field. All scoring training is developed by SCALE. For more details about the training, please see the administrative report noted above."

Alan's response: I think Dr. Whittaker needs to check with Pearson. The edTPA website includes a Pearson copyright notice and says, "Candidates should allow adequate time prior to their planned submission date to upload and review their files in the Pearson system." Applications to become an edTPA scorer are made to Pearson, and Pearson conducts the scorer training. Based on this, it seems reasonable to say that Pearson is the company responsible for grading edTPA portfolios.

Dr. Whittaker wrote: "It is true that candidates submit video clips, lesson plans, student work samples and commentary responses. In most fields the commentaries have a maximum length of 25 pages and about 5 pages of that are the commentary prompt questions."

Alan's response: Dr. Whittaker needs to check the guidelines. On a recent mastery-level secondary education edTPA submission that I looked at AFTER IT WAS GRADED and DID NOT READ, the Context for Learning was four pages; the Planning commentary was nine pages; the instructional commentary was seven pages; the assessment commentary was eleven pages; the lesson plans were a dozen pages; the instructional materials were another dozen pages; and the evaluation criteria, feedback to students, and student assessments were fifteen additional pages. That is 31 pages of commentary (4 + 9 + 7 + 11) and 39 pages of supporting material (12 + 12 + 15), for a total of 70 pages.
