Ten years ago, while working in Ohio, I discovered implementation science and came to realize that effective implementation of an effective practice requires much more than doing what the manual tells us to do. In other words, it’s a whole lot more than checking off boxes.
Back then, the field was just emerging, and even now we know more about what doesn’t work than what does. Still, there has been substantial progress in understanding the key elements of effective program implementation.
The January 8 event at UC Davis, titled “Leadership Symposium on Evidence-Based Practice, Implementation Science: Closing the Gap Between Innovation and Practice,” reported on the latest developments in implementation science. For us implementation junkies, it was a shot in the arm.
The symposium was sponsored by the California Implementation Science Collaborative, a group of dedicated education, child welfare, mental health, academic, and criminal justice professionals, along with organizations that believe the effective use of implementation science is critical to the successful implementation of evidence-based practices (EBPs) and programs. What we know is that it takes both evidence-based programs and evidence-based implementation to get results.
The National Implementation Research Network (NIRN) and Dean Fixsen, Ph.D., whom I consider the leader in this field, simplify the concept this way: Effective Interventions x Effective Implementation = Effective Outcomes. If either the intervention or the implementation is ineffective, the outcome will most certainly be poor.
Significant research now suggests that at least 15 percent of the funding for any intervention intended to achieve positive outcomes must be dedicated to external support. For example, when staff receive the traditional “train and hope” approach to learning and implementing a program, we see 5-10 percent uptake. With consistent, quality coaching, however, uptake rises to 90-95 percent. And when staff implement the program as intended (fidelity to the model), we get better outcomes. Measuring fidelity, coaching, and the like requires resources, which unfortunately are the first cuts made when funding falls short.
EBPs are implemented by people, and people have complex ways of taking in and processing information. Each person brings different experiences, which result in different values, beliefs, and behaviors. These have a huge impact on how practices are implemented, which means the “process” is critical. One of the Partnership for Community Excellence’s objectives is to spread implementation science in criminal justice in California. We will work with our partners in criminal justice and the California Implementation Science Collaborative, of which we are a member, to incorporate implementation science into community corrections.