To What Extent Are You Delivering Services as Intended?

Proper implementation of a bullying prevention program is essential to achieving positive outcomes. Because most school-based bullying prevention programs rely on a set curriculum, implementing a program properly means covering all of the content theorized to produce the desired outcomes. It also means that implementation characteristics such as level of staff training, number of sessions, and mode of delivery are carried out in a way that is consistent with the original design of the program.

Implementing a program exactly as designed is especially important for evidence-based programs with known key ingredients (or “core elements”). However, this rarely happens without an investment of training and resources. Research suggests that teachers are most likely to implement a bullying prevention program as intended if they: (a) perceive bullying to be a problem in their own class; (b) perceive bullying to be a problem in their school; (c) read more of the materials; and (d) identify empathically with students who are bullied (Kallestad & Olweus, 2003).

The terms “implementation fidelity” or “implementation quality” are often used to describe the delivery of a program as originally intended (Domitrovich et al., 2008; Mihalic, 2002). Tracking adherence to program design and content regularly will allow you to assess whether the program is being implemented consistently and with fidelity. 

Provided below is an example of an implementation assessment tool that has been developed for an evidence-based program. This tool is designed to be completed by program administrators, initially over a period of months and then periodically thereafter to track progress.

If you are not implementing this particular program, or an evidence-based program with an existing assessment tool, you can use this tool as a model for developing your own. A note of caution, however: developing assessment tools without a strong background in survey design and measurement is not recommended. If you do develop your own tool, it will need to be refined and improved over time and, ideally, reviewed by an external research firm to establish its reliability and validity.


Sources Cited

Domitrovich, C. E., Bradshaw, C. P., Poduska, J. M., Hoagwood, K., Buckley, J. A., Olin, S., Romanelli, L. H., Leaf, P. J., Greenberg, M. T., & Ialongo, N. S. (2008). Maximizing the implementation quality of evidence-based preventive interventions in schools: A conceptual framework. Advances in School Based Mental Health Promotion, 1, 6–28.

Kallestad, J. H., & Olweus, D. (2003). Predicting teachers' and schools' implementation of the Olweus Bullying Prevention Program: A multilevel study. Prevention and Treatment, 6, 3–21. doi:10.1037/1522-3736.6.1.621a

Mihalic, S. (2002). The importance of implementation fidelity. Blueprints for Violence Prevention Initiative. Center for the Study and Prevention of Violence. Retrieved 12/1/11 from