Data…
Data…
Data…
If you are a Head Start manager or director, you have seen this word a lot. You know that the new Performance Standards are going to require a whole new approach to how you think about data. You see data showing up in training programs, guidelines, and pretty much everywhere in the Head Start universe.
But just what does this mean in practice? How does a program use data to demonstrate outcomes?
A simple, easy-to-understand, wrong answer is to collect more data, piling mounds of information about your clients and your programs on top of the data you already collect. When the private sector turned to data-driven decision-making, this was one common approach. The problem is that the “more is better” approach often fails.
More data does not mean good data.
Data is just information. Information can be good or bad. In the case of Head Start programs, improving data quality is going to matter more than increasing data quantity.
Head Start programs have heavy reporting requirements and have collected a vast trove of data, which has fed compliance reports for years. Data systems to manage that data are common: ChildPlus, COPA, and PROMIS have all been around for a long time.
The challenge under the new performance standards will be to put this data to work demonstrating outcomes. To meet these standards, data quality will come to the forefront, driven by the need to demonstrate two things:
- Continuous quality improvement (CQI)
- Evidence-based outcomes (EBO)
To make good decisions based on data, you need good data.
Busy Bees Playgroup Head Start lives in the compliance world. To them, complete data is what matters: all of their health screenings were completed and the data was entered on time. Boxes checked. PIR fed. On to the next review.
Building Blocks Childcare Center Head Start lives in the world of the new performance standards. For CQI and EBO, actionable data is what matters: When are screenings completed? Did all of the children who needed referrals get them? Were some sites better than others at getting kids referred to services? They still have to report the checked boxes (the PIR is always hungry), but they also need to show that the screenings did what they are supposed to do: accurately screen children and generate referrals to services for those who need them.
Building Blocks also has to show how they have improved their process over the years. They know that last year 40% of their children were screened in the last 10 days of the 45-day window, largely due to problems screening at two distant sites. This year, only 30% of children were screened in the last 10 days because they piloted a mobile screening service at one of the two distant sites. As a result, referrals to services increased and more kids got the help they needed. Next year, both distant sites get the mobile hearing screening. Boxes still checked and PIR still fed. But CQI also demonstrated.
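The late-screening figure Building Blocks tracks is easy to compute once screening dates are stored alongside enrollment dates. Here is a minimal sketch in Python; the record layout and dates are hypothetical assumptions, not a real program's data or any particular system's export format.

```python
from datetime import date

# Hypothetical records: (enrollment_date, screening_date) for each child.
# The 45-day window and "last 10 days" cutoff mirror the example above.
WINDOW_DAYS = 45
LATE_CUTOFF = WINDOW_DAYS - 10  # a screening on day 36-45 counts as "late"

def pct_screened_late(records):
    """Percentage of children screened in the last 10 days of the window."""
    late = sum(
        1 for enrolled, screened in records
        if (screened - enrolled).days > LATE_CUTOFF
    )
    return 100 * late / len(records)

# Illustrative data only -- not real program records.
records = [
    (date(2023, 9, 1), date(2023, 9, 15)),   # day 14: early
    (date(2023, 9, 1), date(2023, 10, 10)),  # day 39: late
    (date(2023, 9, 5), date(2023, 10, 18)),  # day 43: late
    (date(2023, 9, 5), date(2023, 9, 25)),   # day 20: early
]
print(f"{pct_screened_late(records):.0f}% screened in the last 10 days")
# → 50% screened in the last 10 days
```

Broken out by site, the same calculation would show which locations drive the late screenings, which is exactly the comparison that justified piloting the mobile service.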
For Head Start managers and directors looking for ways to implement the new performance standards, an evaluation of data quality will be a key starting point. This is a new kind of thinking, and it will require new approaches. It will also mean being flexible enough to look at how you collect data for compliance and asking hard questions about whether the data that lets you complete a checklist really gives you the information you need to make good decisions.
Improving data quality is a challenge. It requires you to ask hard questions and potentially change some of how you collect, store, and use information. The payoff in actionable information is worth the price.