Driving Education Accountability: 'I'm Sorry Officer, the Gas Gauge Shows I Wasn't Speeding.'

03/24/2015 05:06 pm ET | Updated May 24, 2015

Imagine being made to follow the speed limit, but denied access to a speedometer -- only learning how fast you are going after the police officer pulls you over and gives you a ticket. And when you protest, you're told you should have been watching your gas gauge.
Too often that's how it is with education data. We post the signs for what we want performance to be and publicly report offenders, but don't provide the drivers of education the right gauges for monitoring and correcting.

The reality is that education reporting and data collection systems focus almost exclusively on metrics like graduation rates, test scores and even employment, which are reported too late to be acted on. While important and representative of goals we as a nation must attain, these metrics are not designed to help those delivering education do the work required to meet those goals. Faculty are simply not in a position to fix graduation rates after they are posted. Like the speedometer, the drivers of education need leading indicators they can respond to in time to make a difference for students.

So, how do we know which measures and data will provide the best feedback to empower faculty and staff to make needed changes? How do we move toward a mindset that focuses on metrics that matter?

Don't assume
Often those determining accountability metrics either miss what is needed to improve or assume that by setting a standard, educators will rise to the occasion.

It's a hazardous assumption that sets educators up to fail. Instead, asking those who work closest to the student to decide what data are collected makes abundant sense. Not only will they identify indicators they have control over, they become invested in using the data to improve. This practice creates habits of reflection and collaboration. While it may take some extra time, it is worth the investment.

Where is this happening? At Odessa College, leaders knew they had a dropout problem, but faculty needed a way to identify and support struggling students earlier.

Use metrics that matter
No one knows better than faculty that students who fall behind after their first class find it nearly impossible to catch up. Selecting leading indicators like in-class retention numbers -- including whether a student showed up for class, completed assignments, and/or met with advisors and faculty -- puts Odessa faculty in the driver's seat to take action to avert students' dropping out. These leading indicators help predict whether students may need more support to stay on track to graduate.

Predictive analytics are catching on and improving student success at more colleges. But approaching anything close to Amazon-esque algorithms recommending products based on past buying behavior is not in most colleges' budgets. Furthermore, higher education data is notoriously inaccurate and inconsistent. Until these systems are more affordable, simply engaging faculty to identify a few leading indicators is more cost-effective and helps ensure any future predictive analytics will be fueled by valid data.

Match solutions to problems
Metrics alone don't provide the answers for reaching students at greatest risk of dropping out, but they do provide the ability to ask more intelligent questions and find ways to address problems.

Successful college leaders provide routine opportunities for faculty to discuss data and how to act on the metrics that matter most. Odessa College studied faculty behavior in classes where students were retained at high levels and found four factors that led to student success:

  1. Establish a connection: Faculty know each student's first name on the first day of class.
  2. Engage: Every student is required to meet with faculty for 15 minutes to discuss expectations and interests.
  3. High standards, applied fairly: Faculty hold students accountable for meeting very high standards, but if a student has a good excuse, they're flexible.
  4. Assess early and often: Faculty don't wait to grade students' work at the midterm. They constantly gain feedback about student engagement with material in the form of assignments, class discussion and assessments.

While a breakthrough, these four factors alone were not enough to effectively address the problem. One more key ingredient was needed.

Create a sense of urgency
Making changes in a culture notoriously resistant to change requires a sense of urgency. Odessa College leaders made it clear that changes were imperative, and all were provided support for incorporating the four factors into daily teaching. Teams of faculty and staff were assigned to follow and support students throughout the semester. The president went public with the school's data and goals so that everyone could help solve the problem.

The results were stunning. Graduation rates increased 55 percent. In-class retention improved from 83 percent to 95 percent in four years. Traditionally struggling groups have also improved: both males and Hispanics increased course success rates by 10 percent.

Odessa's achievements demonstrate what is possible when data and accountability systems are built for those who actually drive the learning process. Instead of receiving results reported too late to change, faculty and advisors are continually provided the feedback needed to respond to students' needs in time to keep them on track to earning a degree.

Brad C. Phillips is president and CEO of the Institute for Evidence-Based Change (@BPhillipsIEBC), which helps education stakeholders use data to make informed decisions, improve practice, and increase student success.