To What Extent Are Services Being Delivered as Intended?

Implementing a program as designed is an effective practice associated with positive sex education outcomes (Kirby, 2007). This means that all content expected to produce the desired outcomes is covered and that implementation characteristics (such as level of staff training, number of sessions, and mode of delivery) are consistent with the original design of the program.

This is especially important when implementing evidence-based programs that have known key ingredients (sometimes referred to as core elements or core components). There are now 14 evidence-based sex education programs listed on the Office of Adolescent Health website (http://www.hhs.gov/ash/oah/oah-initiatives/tpp/tpp-database.html). Many of these programs have resources to help measure implementation fidelity, the term researchers often use to describe the delivery of a program as originally intended.

Tracking fidelity regularly with implementation assessment tools can help determine whether the program is being implemented as intended and in a consistent fashion. Some tools are designed to be completed after every session, while others can be completed periodically (for example, after every academic quarter). Tools can be completed by observers, by the program leaders themselves, or, in some cases, by participants.

If you are implementing an evidence-based sex education program, you will want to use the fidelity assessment tools developed for that program. If you are not implementing an evidence-based program with an existing assessment tool, you can consult the tool below as a model for developing your own. A note of caution, however: developing tools without a strong background in survey design and measurement is not recommended. If you do develop your own tool, it will need to be refined and improved over time and, ideally, reviewed by an external research firm to assess its reliability and validity.

Note: When program delivery is consistent and true to the program model but participant outcomes are not improving, it may be necessary to adapt your program.

Surveys/Assessments

Sources Cited

Kirby, D. (2007). Emerging answers: Research findings on programs to reduce teen pregnancy. Washington, D.C.: National Campaign to Prevent Teen Pregnancy. Available at http://www.thenationalcampaign.org/