12/22/2015 06:12 pm ET Updated Dec 22, 2016

NCCD, Netflix, and Amazon all use predictive analytics; what is different about child welfare is the consequences of mistakes

Recently, I interviewed NCCD's Chief Program Officer, Dr. Jesse Russell, who has been educating juvenile justice and child welfare stakeholders alike on how to use predictive analytics to improve the child welfare system. He has published several recent papers on predictive analytics and hosted numerous webinars on the topic.

Marquis Cabrera: What are some consistent pain points you see in the current foster care/child welfare system?

Dr. Jesse Russell: Child welfare is currently oriented around three goal areas: safety, permanency, and well-being. Each of these areas could use substantial improvement, but that does not mean the system as a whole fails. It does fail, however, when it does not keep kids safe, support permanency, or promote well-being. It is a system that responds to reports of children being abused or neglected, but it does not always achieve that purpose, leaving much room for improvement.

Marquis Cabrera: But in your expert opinion, what is the persistent problem?

Dr. Jesse Russell: It's answering the toughest questions: Which families should we be most worried about? Which children should we be most concerned about and direct our resources toward? Intake, hotline screening, safety determination, and safety plan building all contribute to answering those questions, but a big part of the process still involves guessing about what will work best, and that guesswork often leads to both over-intervention and under-intervention. Some intervention methods are not completely innocuous, and at other times children are not kept as safe as they ought to be. These misses happen either because there was not enough information to know the children were all right, or because the system intervened more than it should have, such as opening a case and removing a child when it shouldn't have. Errors of both kinds occur regularly.

Marquis Cabrera: How is NCCD helping to innovate the child welfare system?

Dr. Jesse Russell: NCCD continues to advance Structured Decision Making® (SDM) to ensure validity, reliability, and a data-driven culture. Structured decisions are our primary mechanism for improving the field through better, data-informed decision making, and that has been our primary effort. Recently, we have been using predictive analytics to further support SDM®.

Marquis Cabrera: What is NCCD's role in using predictive analytics in child welfare?

Dr. Jesse Russell: Every child welfare agency has more information than it has ever had before. It's important to figure out how to leverage predictive analytics to move the [child welfare] system forward. System improvement needs to be continuous, and data and analytics have the power to push that movement forward. We should keep working on the things we have been working on, using predictive analytics to help keep more kids safe and increase well-being. However, if predictive analytics doesn't do those things, then we shouldn't use it.

At NCCD, we work with partners to engender a culture of using data to answer questions. Within those relationships, we try to drive data-informed decision making using predictive analytics. For example, predictive analytics can help us answer questions such as: Are we approaching this issue in the best way possible? Given our overall population, where should we direct resources? Predictive analytics can show us how to model outcomes, create algorithms, and inform ongoing decisions.

Marquis Cabrera: How can predictive analytics be used to improve the child welfare system?

Dr. Jesse Russell: It's something we're still trying to figure out. It's a worthwhile question to pursue, but the answer hasn't fully emerged yet. We have used predictive analytics to drive some powerful insights, and some others have had success with predictive analytics informing specific decisions. It hasn't happened at scale yet because we still need to figure out the best ways to use predictive analytics. We are really on the learning edge of what predictive analytics in child welfare will look like. The ethical questions around data, technology, algorithms, and some child welfare goals are still being worked out. Those ethics are what we're going to have to wrestle with; how to mitigate the over-intervention that predictive analytics can encourage has yet to be figured out.

If you squeeze a balloon in one place, other parts of it start to swell up. Given that there are multiple goals and they don't always move in the same direction, if we isolate and prioritize one goal, we may actually be moving the wrong way on others. We don't want to let predictive analytics tell us how to prioritize goals; we still need to be making those decisions ourselves and not rely on algorithms.

Whether over-intervention matters depends on the consequences of being wrong. Consider the "successful" recommendation algorithms at Netflix and Amazon. They make guesses for you based on your past history: you might like this movie, or customers who bought this item also bought that one. These algorithms also miss a lot of the time: Netflix recommends a movie you don't enjoy, or Amazon recommends a book you don't like. But there the consequences of a false positive are nominal, just a movie or book you didn't care for. In child welfare, the consequence of a false positive might be a child removed from a home who shouldn't have been. Mistakes about safety decisions, permanency, or child well-being cannot be taken lightly the way Amazon's and Netflix's can. The stakes are too high in child welfare.
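The asymmetry described here can be made concrete: when a false positive is cheap (a bad movie recommendation), a system can flag liberally; when it is costly (an unwarranted removal), the threshold that minimizes expected cost shifts upward. The sketch below is purely illustrative, with made-up risk scores and hypothetical cost weights, not any real agency's model:

```python
# Hedged illustration: how the relative cost of a false positive shifts
# the decision threshold. All scores and costs here are hypothetical.

def expected_cost(threshold, cases, fp_cost, fn_cost):
    """Total cost of flagging every case whose risk score >= threshold."""
    cost = 0.0
    for score, truly_at_risk in cases:
        flagged = score >= threshold
        if flagged and not truly_at_risk:
            cost += fp_cost  # intervened when we shouldn't have
        elif not flagged and truly_at_risk:
            cost += fn_cost  # missed a case that needed attention
    return cost

def best_threshold(cases, fp_cost, fn_cost):
    """Pick the cutoff (0.0 .. 1.0 in steps of 0.1) with lowest cost."""
    thresholds = [i / 10 for i in range(11)]
    return min(thresholds,
               key=lambda t: expected_cost(t, cases, fp_cost, fn_cost))

# (model score, ground truth) pairs -- fabricated example data
cases = [(0.9, True), (0.8, True), (0.7, False), (0.6, True),
         (0.5, False), (0.4, False), (0.3, True), (0.2, False),
         (0.1, False)]

# Netflix-like setting: a false positive costs about as much as a miss.
low_stakes = best_threshold(cases, fp_cost=1, fn_cost=1)    # 0.6
# Child-welfare-like setting: a false positive is far more costly,
# so the optimal threshold rises.
high_stakes = best_threshold(cases, fp_cost=50, fn_cost=10)  # 0.8
```

The point of the toy model is only that the same scores demand different cutoffs once the cost of a wrong intervention is weighted honestly; it says nothing about what those weights should be in practice.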

Marquis Cabrera: What are the limitations of predictive analytics?

Dr. Jesse Russell: One of my biggest worries is that the excitement will make people feel they have been relieved of their responsibilities, rather than setting goals and using predictive analytics to pursue those goals. We need humans to make decisions at the case level and at the strategic level. We need child welfare agencies to remain accountable and not believe they are turning decision making over to technology. Technology can help us as human services workers do our jobs better, but if you want a paradigm shift, you'll have to think bigger than just the technology.

Marquis Cabrera: Can you give me a use case of predictive analytics being used effectively in child welfare?

Dr. Jesse Russell: We created a model to highlight the young people in the child welfare system who are most at risk of becoming involved in the juvenile justice system. Predictive analytics proved very accurate at highlighting those children. We built an email alert and flagging system that notified caseworkers when a young person was at elevated risk, and the agency was then able to leverage an integrated practice model to help those youth.
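A flagging-and-alert workflow of the kind described might look roughly like the sketch below. Everything in it, the field names, the `0.7` cutoff, and the alert wording, is a hypothetical assumption for illustration, not a description of NCCD's actual system:

```python
# Hypothetical sketch of a risk-flagging alert loop. Field names, the
# threshold, and the email text are illustrative assumptions only.
from dataclasses import dataclass

RISK_THRESHOLD = 0.7  # assumed cutoff for "elevated risk"

@dataclass
class Youth:
    case_id: str
    caseworker_email: str
    risk_score: float  # output of the predictive model, 0.0 to 1.0

def flag_elevated_risk(caseload):
    """Return the youth whose model score crosses the alert threshold."""
    return [y for y in caseload if y.risk_score >= RISK_THRESHOLD]

def build_alert(youth):
    """Compose the email body a caseworker would receive."""
    return (f"Case {youth.case_id}: elevated risk of juvenile justice "
            f"involvement (score {youth.risk_score:.2f}). "
            "Please review under the integrated practice model.")

caseload = [
    Youth("A-101", "worker1@agency.example", 0.82),
    Youth("A-102", "worker2@agency.example", 0.35),
]
for youth in flag_elevated_risk(caseload):
    # A real system would send email here; this sketch just prints it.
    print(build_alert(youth))
```

The design choice worth noting is that the model only ranks and flags; the alert routes the case to a human caseworker, which matches the interview's insistence that decisions stay with people.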

Marquis Cabrera: What advice would you give to someone looking to use predictive analytics?

Dr. Jesse Russell: Some people want to dive in and go for it, but you have to have a lot of caution and humility about the insights you can derive from data using predictive analytics. Using data can also build relationships: working together with people, building a community, creating bonds. People don't need to be scared of data; a lot of learning can be done collaboratively. We should be cautious, but not scared. NCCD works with many child welfare agencies, and we try to establish both data relationships and person-to-person relationships. It can be really effective to start with the relationship, and from there treat data as an ongoing process rather than just a once-a-year reporting requirement.