Putting Common Core Tests to the Test


By Sam Firke

This post was originally published on the TNTP Blog.

Here’s a question you probably know the answer to: Until recently, many standardized tests taken by students in American public schools…

  a. Failed to capture students’ ability to solve in-depth problems.
  b. Did not accurately measure the learning of students working far below and above grade level.
  c. Took months to return results.
  d. All of the above.

If you answered “d,” you’ve identified just a handful of the limitations of the old model of standardized testing. I believe in the importance of measuring the impact of teachers and schools—and the role tests can play—but like many, I’ve been frustrated by the shortcomings of our current testing capabilities. That’s about to change.

This week, states will begin field testing the new, Common Core-aligned assessments developed by two state-led groups, the Smarter Balanced Assessment Consortium (SBAC) and the Partnership for Assessment of Readiness for College and Careers (PARCC). These new tests look different from what we’re used to and have tremendous potential to address many of the biggest problems with standardized tests. It’s critical that we stay focused on the long game here, because in the short term, implementation is going to get messy.

Let’s unpack the good stuff first:

Rigor. The biggest difference you’ll see with the new tests is that the questions are more complex—because they’re designed to match the rigor of the new standards. The questions go beyond multiple-choice items of yore; the new tests contain open-response items that can better assess students’ problem-solving skills, and measure a deeper level of knowledge and ability. These questions are closer to the free-response items on Advanced Placement exams. I used to teach AP Calculus and was a fan of the free-response questions, which made up half of the test, because they required students to think through more in-depth problems, and they allowed me (and the test scorers) to see evidence of my students’ thinking in action, not just whether they got the right final answer.

If we want our kids to be able to think critically, draw connections between source materials and offer original ideas in response to information they’re presented with, we should assess whether or not they’re developing those skills.

Technological advantages. Both SBAC and PARCC are computerized (although PARCC will offer a mix of paper-based and computerized tests during the field testing period). Computer-based tests present several potential advantages over their pencil-and-paper predecessors: With clicks digitally recorded, it will be much more difficult for adults to game the system, the way we’ve seen in test erasure scandals. SBAC is adaptive, too, which means that as kids miss questions, the test gets easier; as they get them right, it gets harder. This eliminates the floor and ceiling effects that hurt our lowest- and highest-performing students, makes the test a more equally challenging experience for kids at all performance levels and provides more accurate estimates of individual student achievement.
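
To make the adaptive idea concrete, here is a minimal sketch, in Python, of a test that steps the difficulty up after a correct answer and down after a miss. It is only an illustration with a made-up item bank and student; real adaptive engines like SBAC’s typically draw on item response theory to choose each next question rather than a simple up-or-down rule.

```python
import random

# A deliberately simplified illustration of computer-adaptive testing:
# difficulty steps up after a correct answer and down after a miss.
# The item bank and the simulated student below are hypothetical.

# Hypothetical item bank: difficulty levels 1 (easiest) through 10 (hardest).
ITEM_BANK = {level: [f"item_{level}_{i}" for i in range(1, 4)]
             for level in range(1, 11)}

def run_adaptive_test(answer_item, num_items=10, start_level=5):
    """Administer num_items questions, adjusting difficulty after each response.

    answer_item(item, level) -> bool stands in for the student answering.
    Returns a list of (item, difficulty, correct) tuples.
    """
    level = start_level
    record = []
    for _ in range(num_items):
        item = random.choice(ITEM_BANK[level])
        correct = answer_item(item, level)
        record.append((item, level, correct))
        # Step difficulty up on a correct answer, down on a miss,
        # staying within the bank's range.
        level = min(10, level + 1) if correct else max(1, level - 1)
    return record

if __name__ == "__main__":
    # Simulated student who reliably handles items up to difficulty 7.
    student = lambda item, level: level <= 7
    for item, level, correct in run_adaptive_test(student):
        print(f"{item} (difficulty {level}): {'correct' if correct else 'missed'}")
```

Notice how the difficulty quickly settles near the simulated student’s level. That is the floor-and-ceiling advantage in miniature: the test converges on each student rather than spending questions that are far too easy or far too hard for them.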

The computerized tests also support question types that allow students to engage more deeply with the problems. For instance, students can plot points or graph a line while answering a math question. And digital tests offer faster turnaround of results, too—which could make the data provided by the new tests more actionable for educators, schools and districts.

This is all good news. But here’s the obvious bad news: The roll-out of these tests is not going to be pretty. Change is always hard, and in this case, the long-term positives of the new tests come with some shorter-term negatives.

Proficiency rates will drop. Even in districts that are providing excellent support to teachers in the transition to the Common Core, teachers are still learning how to teach to the new standards. Getting everyone up to speed is going to take time. And because the Common Core standards are more rigorous than the standards most states have been using until now, the new assessments measure students’ learning against a higher bar. Inevitably, proficiency rates will go down, as we’ve already seen in states that have implemented their own Common Core-aligned tests.

There will be technological glitches. The technology required for the new tests will present both big stumbling blocks—districts lacking sufficient computers or adequate Internet coverage, for example—and small ones, like the logistical drama of scheduling access to a school’s computer lab. Students will need to get familiar with the testing interfaces, so that the tests don’t inadvertently measure computer skills above everything else—and the test developers may need to make improvements along the way, to make the interfaces more student-friendly. I had some trouble entering equations on the PARCC practice items, and if I did, some kids will too. And as we move beyond multiple choice, there will initially be anxiety over entering open-response answers in such a way that they will be accurately scored.


But these issues will arise with any such improvement in the tests, and hitting pause on implementation would only kick the can down the road while delaying the positive returns from better tests. These tests are a big step in the right direction. Let’s use the field testing period to be ruthless about working out as many kinks as possible, so we can minimize the issues when the tests go live. And all of us who are invested in education should go online and try the available practice items, so we can engage in an informed conversation about these tests. When the tests go live for high stakes, year one is going to be ugly. Year two will still be rocky. But by year twenty, we’re going to wonder why we ever did this any other way.

Sam Firke is Site Advisor for Data & Analysis at TNTP.
