To What Extent Is the Program Being Delivered as Intended?

Proper implementation of a substance abuse prevention program is essential to achieving positive outcomes (Botvin & Griffin, 2003). Because most family-based drug and alcohol prevention programs rely on a set curriculum, implementing a program properly means covering all of the content theorized to produce the desired outcomes. It also means carrying out implementation characteristics such as level of staff training, number of sessions, and mode of delivery in a way that is consistent with the original design of the program.

Implementing a program exactly as designed is especially important when implementing evidence-based programs with known key ingredients (or “core elements”). However, it rarely happens without an investment of training and resources. Research suggests that teachers are most likely to implement a program as intended if they are newer to the profession, receive proper training, are confident in their ability to teach interactive methods, and are enthusiastic about the program (Dusenbury et al., 2003).

Researchers often use the word “fidelity” to describe the implementation of a program as originally intended, and fidelity assessment tools measure implementation quality. Some tools are designed to be administered after every session, while others can be administered periodically (for example, after every academic quarter). Tracking fidelity regularly will allow you to assess whether the program is being implemented consistently. When a program is implemented consistently over a period of time but participant outcomes are not improving, it may be necessary to adapt your program. For a good resource on implementation fidelity and adaptation in the context of substance abuse prevention, see Backer (2002).
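To make the idea of tracking fidelity over sessions concrete, here is a minimal sketch in Python. The session names, checklist items, and the 80% benchmark are all hypothetical assumptions for illustration; an actual benchmark should come from the program developer or your evaluation plan. The sketch scores each session as the fraction of planned curriculum items an observer marked as covered, then flags sessions that fall below the assumed benchmark.

```python
def fidelity_score(checklist):
    """Fraction of planned curriculum items the observer marked as covered."""
    return sum(checklist.values()) / len(checklist)

# Hypothetical observer checklists: curriculum item -> covered in session?
sessions = {
    "Session 1": {"goal setting": True, "role play": True, "refusal skills": False},
    "Session 2": {"goal setting": True, "role play": True, "refusal skills": True},
}

BENCHMARK = 0.80  # assumed local threshold, not a value from the literature

for name, checklist in sessions.items():
    score = fidelity_score(checklist)
    flag = "" if score >= BENCHMARK else "  <-- below benchmark; review delivery"
    print(f"{name}: {score:.0%}{flag}")
```

Reviewing such per-session scores over a quarter or a year shows whether delivery is consistent, which is the precondition for deciding whether weak outcomes call for adaptation.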

Provided below are examples of implementation assessment tools that have been developed for an evidence-based program; they are designed to be completed by observers after each session. If the program you are implementing does not have an existing assessment tool, you can use these as models when developing your own. A note of caution: developing tools without a strong background in survey design and measurement is not recommended. If you do develop your own tool, plan to refine and improve it over time and, ideally, have an external research firm evaluate its reliability and validity.
Surveys/Assessments
Sources Cited

Backer, T. E. (2002). Finding the balance: Program fidelity and adaptation in substance abuse prevention: A state-of-the-art review. Rockville, MD: Substance Abuse and Mental Health Services Administration (SAMHSA), Center for Substance Abuse Prevention (CSAP). Available at: http://www.eric.ed.gov/PDFS/ED469354.pdf

Botvin, G. J., & Griffin, K. W. (2003). Drug abuse prevention curricula in schools. In Z. Sloboda & W. J. Bukoski (Eds.), Handbook of drug abuse prevention: Theory, science, and practice (pp. 45-74). New York: Kluwer Academic/Plenum Publishers. doi:10.1007/0-387-35408-5_3

Dusenbury, L., Brannigan, R., Falco, M., & Hansen, W. B. (2003). A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Education Research, 18, 237-256. doi:10.1093/her/18.2.237

National Institute on Drug Abuse. (2003). Preventing drug use among children and adolescents: A research-based guide for parents, educators, and community leaders.