
While the thought of “evaluation” can be daunting, if not downright intimidating, for many domestic violence (DV) programs, there are many good reasons to evaluate the job we are doing. The most important reason, of course, is to understand the impact of programs and practices on the lives of survivors and their children, so that we can build upon the efforts survivors say are helpful to them and stop putting time and resources into efforts that are not helpful or relevant to them. Evaluation is also important because it provides “hard evidence” to present to funders, policymakers, and allied organizations, encouraging them to continue and increase the resources available to support effective programs and approaches.

The materials and tools highlighted below (and more will be added over the coming year) were selected to assist DV advocates and allied organizations in understanding the different types of evidence that can inform our work, and to provide practical considerations and strategies for approaching evaluation in domestic and sexual violence organizations. These resources also explore the unique challenges of evaluating and making the case for intervention and prevention initiatives.




Evaluation Manuals and Briefs

Evaluation Issue Briefs

Knowledge is power. The more DV service providers and advocates know about designing and conducting evaluation efforts, the better those efforts will be. Well-structured evaluations of DV services can provide the valuable information we need to continually improve our programs. The following Issue Briefs, prepared by the NRCDV, are short discussions of related conceptual and practical topics that can help DV advocates more fully understand the benefits of evaluating their services and identify important considerations before engaging in evaluation.


Outcome Evaluation Guides


Outcome Evaluation Strategies for Sexual Assault Service Programs: A Practical Guide

This Guide was developed in 2000 by Cris Sullivan, Ph.D., and Suzanne Coats, MSW, for the Michigan Coalition Against Domestic and Sexual Violence as a resource to assist service providers in the state in exploring effective outcome evaluation strategies. Some of the content is adapted from the 1998 guide to outcome evaluation developed for the Pennsylvania Coalition Against Domestic Violence (see below).


A Practical Guide: Outcome Evaluation Strategies for Domestic Violence Programs

This 1998 Guide was developed by Cris Sullivan, Ph.D., for the Pennsylvania Coalition Against Domestic Violence as a resource to assist domestic violence service providers in examining the effectiveness of their programs in a straightforward manner. Although many programs feel external pressure from funding sources to conduct outcome evaluation, this guide is useful not just for demonstrating our value to external sources but also for enhancing program effectiveness.



Evaluation for Improvement: A Seven Step Empowerment Evaluation Approach for Violence Prevention Programs

This 2009 manual, developed by the National Center for Injury Prevention and Control at the Centers for Disease Control and Prevention (CDC), is designed to help violence prevention organizations hire an empowerment evaluator who will assist them in building their evaluation capacity through a learn-by-doing process of evaluating their own strategies. It is for state and local leaders and staff members of organizations, coalitions, government agencies, and/or partnerships working to prevent sexual violence, intimate partner violence, youth violence, suicide, and/or child maltreatment.


Tips and Tools for Data Collection

This site contains practical and concrete information and tools to help you evaluate

(1) a variety of services commonly provided to and with survivors, and
(2) Train the Trainer workshops [section 4].

A separate resource is available from the NRCDV to help you explain the Theory of Change guiding your services.

The first three sections focus on services and programs for survivors of domestic violence. The first section contains handouts that you can copy and distribute to your staff; these handouts were designed to help you and your colleagues decide for yourselves how, when, and how often to evaluate your services.

The second section provides some examples of actual surveys you might use or modify to evaluate different types of services.

And the third section includes some standardized (validated) measures of quality of life, social support, hope, life satisfaction, trauma-informed practice, and housing instability. These measures are not necessary for evaluating your programs, but a number of advocates have requested examples of surveys that measure such constructs, and you may find them useful for various projects.

For Word versions of these surveys, please contact the NRCDV.

For those who provide “training of the trainer” (TOT) workshops on any topic, three documents may be useful. The first includes tips and examples to consider when you need to evaluate the effectiveness of your Train the Trainer workshop. The second and third provide templates for creating Pre-Workshop and Post-Workshop surveys.

Additional Resources:

Understanding Evidence Part 1: Best Available Research Evidence: A Guide to the Continuum of Evidence of Effectiveness 

The Best Available Research Evidence enables researchers, practitioners, and policymakers to determine whether or not a program, practice, or policy is actually achieving its intended outcomes in the way it intends. The more rigorous a study’s research design (e.g., randomized control trials, quasi-experimental designs), the more compelling the research evidence. However, the literature also identifies two other forms of evidence, experiential evidence (or practice-based evidence) and contextual evidence, which are distinct from research evidence yet overlap with it.

Continuum of Evidence Chart

Working Protocol for Domestic Violence Programs when Approached by Researchers

This working protocol was drafted by a group of researchers and practitioners connected to Michigan State University and the Michigan Coalition Against Domestic and Sexual Violence. The protocol recognizes that while there is a growing need for domestic violence service providers to become active in assessing the value and process of research, these programs have an obligation to assess both the risks and benefits of each research study and to ensure that the safety and confidentiality of participants will be maintained.