Ensure Use and Share Lessons
Practical Use of Program Evaluation among Sexually Transmitted Disease (STD) Programs

STEP 1: ENGAGE STAKEHOLDERS
1.1 Determine how and to what extent to involve stakeholders in program evaluation
STEP 2: DESCRIBE THE PROGRAM
2.1 Understand your program focus and priority areas
2.2 Develop your program goals and measurable (SMART) objectives
2.3 Identify the elements of your program and get familiar with logic models
2.4 Develop logic models to link program activities with outcomes
STEP 3: FOCUS THE EVALUATION
3.1 Tailor the evaluation to your program and stakeholders' needs
3.2 Determine resources and personnel available for your evaluation
3.3 Develop and prioritize evaluation questions
STEP 4: GATHER CREDIBLE EVIDENCE
4.1 Choose appropriate and reliable indicators to answer your evaluation questions
4.2 Determine the data sources and methods to measure indicators
4.3 Establish a clear procedure to collect evaluation information
4.4 Complete an evaluation plan based on program description and evaluation design
STEP 5: JUSTIFY CONCLUSIONS
5.1 Analyze the evaluation data
5.2 Determine what the evaluation findings say about your program
STEP 6: ENSURE USE OF EVALUATION FINDINGS AND SHARE LESSONS LEARNED
6.1 Share with stakeholders the results and lessons learned from the evaluation
6.2 Use evaluation findings to modify, strengthen, and improve your program

SUGGESTED CITATION: Salabarría-Peña, Y., Apt, B.S., Walsh, C.M. Practical Use of Program Evaluation among Sexually Transmitted Disease (STD) Programs. Atlanta (GA): Centers for Disease Control and Prevention; 2007.

Ensure Use and Share Lessons

Ensuring use of evaluation results to improve your program and sharing lessons learned from the evaluation process is the sixth step in CDC's framework for program evaluation. This step involves (1) providing recommendations for action, and (2) disseminating the evaluation findings to those who need to be aware of the information. Evaluations that are not used or are inadequately disseminated are a waste of time and resources. Step 6 includes two evaluation tools: Tool 6.1 provides information on how you can share the results and lessons learned from the evaluation with stakeholders and other interested audiences. Tool 6.2 provides guidance on using your evaluation findings to modify, strengthen, and improve your program.

TOOL 6.1: SHARE WITH STAKEHOLDERS THE RESULTS AND LESSONS LEARNED FROM THE EVALUATION

INTRODUCTION
Once you have analyzed and interpreted the evaluation data, you are ready to develop recommendations for action and carefully consider the most useful ways to disseminate the evaluation findings to your stakeholders and other audiences. Appropriate dissemination is the key to ensuring that your evaluation findings translate into informed decision making and action. This tool will help you: (1) develop recommendations based on the evaluation results; (2) identify with whom you need to communicate the results; and (3) prepare tangible products of the evaluation (e.g., reports). The flowchart below illustrates where sharing your evaluation results and lessons learned fits in with your other evaluation activities.

LEARNING OBJECTIVES
Upon completion of this tool, you will be able to:
- Identify factors to consider when making evaluation recommendations.
- Determine strategies for informing audiences about relevant aspects of the evaluation.
- Organize and write up the findings.

WHAT ARE SOME FACTORS TO CONSIDER WHEN DEVELOPING YOUR RECOMMENDATIONS?
Recommendations for action should be based on the interpretation of the evaluation findings. How you frame the recommendations depends on the users and the purpose of your evaluation. When you are developing your recommendations, review with your stakeholders the purpose of the evaluation and its users (see Tool 3.2).

Purpose of the evaluation
The purpose(s) of the evaluation, which you identified early in the process, should guide the recommendations you make. For example, is the purpose of your evaluation to identify ways to improve the functioning of your program? To demonstrate program effectiveness? To demonstrate accountability for resources?

Users of the evaluation results
The recommendations need to be relevant to your stakeholders and to those who need to be aware of the evaluation results. These individuals or organization representatives become the audience(s) for the evaluation recommendations; consequently, you need to know what information they want and what is important, relevant, and useful to them. Tailoring the recommendations to your audience(s) increases ownership and motivation to act on what is learned.

The following scenario provides two examples of possible recommendations developed from the findings of an evaluation of a syphilis elimination initiative targeting men who have sex with men (MSM).

Background: For the last two years, Green City has been implementing a syphilis elimination initiative to address a syphilis outbreak in the MSM community. The initiative consists of: (1) risk-reduction counseling for males who self-identify as MSM in order to reduce syphilis transmission; (2) a media campaign, including public service ads placed in the local gay newspapers and posters distributed to businesses and other places frequented by MSM, addressing the need for MSM to obtain syphilis testing; and (3) outreach and education conducted by Community-Based Organizations (CBOs) that serve MSM, including distribution of educational materials and condoms, and individual and group presentations/educational sessions by peers. The evaluation was conducted to identify changes needed to strengthen the initiative. The STD Program Director in Green City would like to determine whether the media campaign has reached the target population. CBOs serving high-risk MSM are interested in knowing whether the media and CBO education components were culturally relevant.

Example 1
PURPOSE OF EVALUATION: Determine if the target population is being reached by the media campaign.
AUDIENCE: Green City's STD Program Director and staff.
USE OF FINDINGS: Improve intervention.
FINDINGS: Approximately 40% (n=400) of surveyed MSM living in the target area recalled the content of the health department's campaign message, and 30% (n=120) of those who recalled the message indicated they sought testing as a result of the message. One individual (0.8%) was diagnosed with secondary syphilis. In contrast, 53% (n=140) of CBO clients recalled the prevention message being promoted by the CBOs (decrease the number of partners and use condoms), and 40% (n=56) of these reported getting tested as a result of the peer intervention. Of those tested, 3.5% (n=2) were positive for syphilis. Three focus groups of MSM (10 participants per group) living in areas targeted by the media campaign found that a majority (60%) did not identify with the visuals and message content in the health department's posters and ads.
More than one third felt that the locations in the community where these posters were placed were not the most appropriate. Focus group participants who subscribed to or read the newspapers (n=15) used in the campaign indicated that the ads were too small and did not have prominent placement (i.e., they were often overlooked by the readership). STD Program staff who were interviewed reported that the posters/ads had not been tested with the community due to a series of glitches on the part of the contractor who was handling the communications component of the initiative. In contrast, focus groups with CBO clients found that participants appreciated the interaction with peers (a human face to ask questions), the credibility of the organizations involved (CBOs trusted in the community), and the access to condoms.
RECOMMENDATION: Based on these findings, the STD program is reaching some individuals in the target population, but it needs to revise the media campaign materials (posters and flyers) with appropriate visuals and message content, and involve members of the target population in that process, in order to reach more at-risk MSM. The program also needs to work with participating community newspapers to revise the ad layout (e.g., size) and to place the ads in more visible sections. Closer collaboration with the peer outreach efforts conducted by the CBOs (e.g., training CBO staff on STD prevention and testing, sharing staff) would also be an asset for the STD program, since peer outreach appears to be an effective way to promote prevention and control messages and syphilis testing in the target population.

Example 2
PURPOSE OF EVALUATION: Determine the cultural appropriateness of the media campaign and improve the intervention accordingly.
AUDIENCE: Partner CBOs and MSM advocates of the initiative.
USE OF FINDINGS: Revise media material and approach, if necessary, to respond to the needs of the target population.
FINDINGS: An overwhelming majority (85%; n=340) of CBO clients interviewed gave high ratings to the peer education sessions conducted by the CBOs. Small-group education sessions where MSM could exchange experiences with peers were favored by most respondents who were Spanish-speaking clients. This group represents a high proportion (70%; n=280) of the CBOs' clients and had been reported as high risk for syphilis in this community based on low condom use, having sex with men while identifying as heterosexual, lack of access to both information and STD-related services, and unfamiliarity with the health care system. Only one of the four CBOs participating in the initiative had a bilingual peer educator, which was not sufficient to meet the demand for services from half of the clients served at the CBOs. A breakdown of respondents by race/ethnicity showed that 35% (n=17) of African-American and 25% (n=70) of Latino MSM found the educational materials appropriate, compared with 75% (n=41) of white MSM. Focus groups conducted with African-American and Latino MSM found that the visuals and images used in the materials were neither culturally nor linguistically relevant to them.
RECOMMENDATION: Revise current materials to address the cultural and linguistic issues identified by the target population(s). If funding is available, hire additional bilingual peer educators (1 per clinic). At a minimum, the STD Program should work with the CBOs to devise a plan to identify and train a cadre of bilingual peer educators to address the needs of the majority of the CBOs and their clients.
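When reviewing findings like the ones above with stakeholders, it helps to confirm that the reported percentages and counts are internally consistent before building recommendations on them. The following is a minimal sketch of such a check, written in Python (the toolkit does not prescribe any software); the figures are taken from the hypothetical Green City examples, and the helper function is our own illustration, not part of the toolkit.

```python
# Illustrative consistency check for reported "X% (n=Y)" figures.

def implied_denominator(n, percent):
    """Given a reported subgroup count and its percentage, estimate the size of the group it came from."""
    return round(n / (percent / 100))

# Example 1: 40% (n=400) of surveyed MSM recalled the campaign message,
# and 30% (n=120) of those sought testing.
print(implied_denominator(400, 40))   # -> 1000 MSM surveyed (implied)
print(implied_denominator(120, 30))   # -> 400, consistent with the recall count above

# Example 1: 53% (n=140) of CBO clients recalled the peer-promoted message.
print(implied_denominator(140, 53))   # -> ~264 CBO clients surveyed (implied)

# Example 2: 85% (n=340) rated peer education highly; 70% (n=280) were Spanish speaking.
print(implied_denominator(340, 85))   # -> 400 CBO clients interviewed (implied)
print(implied_denominator(280, 70))   # -> 400, consistent with the figure above
```

If an implied denominator disagrees sharply with the stated sample size, that discrepancy is worth resolving with the analyst before the findings are disseminated.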
HOW SHOULD YOU SHARE EVALUATION RESULTS WITH THE TARGET AUDIENCE(S)?
Once you have developed the recommendations based on the evaluation findings, you need to communicate the results of the evaluation to your stakeholders and other possible audiences. The methods you select to communicate evaluation findings depend on the information needs of the stakeholders and other users of the evaluation, and on their preferences for format and style.

Methods for Decision Makers and Program Staff
A complete evaluation report, with findings highlighted in an executive summary (a 1-2 page summary of the evaluation), is usually appropriate for decision makers (e.g., STD Director/Manager, Health Commissioner) who require complete information such as a full program description, evaluation methodology/process, and detailed results and recommendations (see the Appendix for organizing a written report). STD program staff, who are both knowledgeable about and invested in the program, may need not only a detailed report, but also in-depth discussion of the findings and their implications for program activities. Once the first draft of the report is ready, you may want to schedule a 2-hour meeting with other stakeholders to discuss the report (e.g., use 30 minutes to present the findings and the rest of the time to create a plan for implementing the recommendations). This is also an opportunity to debrief your stakeholders on the evaluation process. Remember that program staff often carry the burden of the evaluation process, and it is important to maintain staff motivation for future efforts.

Methods for Other Audiences
For other audiences such as program participants, local media, the community, and legislators, you may want to consider other formats (e.g., oral presentations, fact sheets, or local radio). Ask your audiences how they would like to receive the information. The following is guidance on a few reporting formats.

1. Written Reports
A written report is the most common method for disseminating evaluation results. The report must clearly, succinctly, and impartially communicate all parts of the evaluation. When writing an evaluation report, keep the following in mind:
- Know the audience for the report and the information they need. Tailor your report to your audience(s). You may need a different version of the report for each audience, or perhaps different summaries.
- Relate the evaluation findings to decisions that stakeholders may need to make.
- Prepare an executive summary containing the highlights of your findings and your recommendations. Your audience may not have time to read the entire report, so you need to be brief yet informative.
- Highlight important points with boxes, different font sizes, and bold or italic type.
- Use examples, graphics, and stories to illustrate and support your findings (see the charting sketch after this list).
- If you used mixed methods for the evaluation, merge qualitative results (e.g., themes, quotes from interviews, descriptions from observations) and quantitative results (e.g., percentages, means, correlations) so that the audience has a more complete picture of the evaluation findings.
- Present data simply and concisely. For example, instead of including long excerpts from interviews, pick a few short, powerful quotes that make your point and insert them at appropriate sections of the text.
- Use active verbs to shorten sentences. Write short paragraphs, each covering only a single idea.
- Edit the report, weeding out unnecessary words and phrases. Ideally, have someone else edit it as well.
- Verify that the report is accurate. Avoid distortions that can be caused by personal feelings, and make sure your findings and recommendations are supported by the data.
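The tips above call for presenting data simply and using graphics to support key findings. As one way to do this, the sketch below (assuming Python with the matplotlib library is available; the figures and labels come from the hypothetical Green City example and are purely illustrative) turns the recall and testing figures into a simple grouped bar chart that could be dropped into a written report or slide.

```python
# Illustrative sketch: a simple grouped bar chart of message recall and testing
# uptake, using the hypothetical Green City figures. Requires matplotlib.
import matplotlib.pyplot as plt

groups = ["Media campaign\n(surveyed MSM)", "CBO peer outreach\n(CBO clients)"]
recalled_pct = [40, 53]   # percent who recalled the prevention message
tested_pct = [30, 40]     # percent of recallers who then sought testing

x = range(len(groups))
width = 0.35

fig, ax = plt.subplots(figsize=(6, 4))
ax.bar([i - width / 2 for i in x], recalled_pct, width, label="Recalled message")
ax.bar([i + width / 2 for i in x], tested_pct, width, label="Sought testing")

ax.set_xticks(list(x))
ax.set_xticklabels(groups)
ax.set_ylabel("Percent")
ax.set_title("Message recall and testing uptake, by intervention component")
ax.legend()
fig.tight_layout()
fig.savefig("recall_by_component.png", dpi=150)  # image file to insert in the report
```

A chart like this conveys the comparison at a glance; the exact counts and any caveats (e.g., differing denominators) can be reported in the accompanying text or a footnote.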
2. Oral Presentations
If you are giving an oral presentation, the following guidance applies:
- Begin your report by reviewing the goals/objectives of the program component or activity evaluated, why you asked particular evaluation questions, and what you expect to do with the results of the evaluation as they relate to program improvement. It is important to place the evaluation in the larger context of the overall program.
- If you report the findings without explaining how the answers you found can be used by your target audience, you may be inviting a response of "So what?" Be sure to tell your audience what you learned and what you expect the audience to do with this information.
- Consider including exercises that actively involve participants in providing input on how to use the findings for decision making and program improvement.
- Consider using a slide show. Create slides that communicate the key points succinctly and supply the details orally. If possible, print a set of slides to serve as handouts for your audience. This will help them focus on your oral presentation without having to take copious notes.

3. Mass Media
Portions of the written evaluation report can be used as a public relations resource. When distributed to newspapers and other media outlets, this information may increase community awareness and support for your program. Write a carefully worded press release and, if possible, have a credible office or public figure deliver it to the media. If you hold a press conference, include participation from other stakeholders, and use a fact sheet with concise and understandable bullets for the media. In all of these efforts, communicate with the state public affairs official or state media/press contact so that you follow the established protocol for dealing with the media.

SUMMARY CHECKLIST: Sharing Evaluation Findings

CONCLUSION AND NEXT STEPS
An evaluation serves its purpose only if the results are used for program improvement. Therefore, the formulation of recommendations needs to respond to the purpose(s) of your evaluation and the stakeholders' needs. This tool gave you guidance on the factors to consider when developing your recommendations, with whom you should share your findings, the methods you may use for sharing the findings, and tips for writing an evaluation report. The next tool (Tool 6.2) provides additional details on the uses of evaluation findings and strategies to increase the likelihood that findings are used.

ACRONYMS USED IN THIS TOOL
CBO: Community-based organization
CDC: Centers for Disease Control and Prevention
MSM: Men who have sex with men
STD: Sexually transmitted disease
OB/GYN: Physician specializing in obstetrics and gynecology

KEY TERMS
Audience: The individuals (such as stakeholders and other evaluation users) with whom you want to communicate the results of an evaluation.
Dissemination: The process of communicating the procedures, results, and lessons learned from an evaluation.
Executive summary: A 1-2 page summary of the full evaluation report. It provides a concise description of the evaluation activities, procedures, results, conclusions, and recommendations. Since this information can be extracted from sections of the full report, the summary is written last, but presented at the beginning of the report.
Purpose of evaluation: The general intent of the evaluation (e.g., to fine-tune program operations).
Stakeholders: The individuals or organizations directly or indirectly affected by your program and/or the evaluation results (e.g., STD program staff, family planning staff, representatives of target populations).
Users of an evaluation: The specific persons/organizations that will employ the evaluation findings in some way (e.g., STD director, CBO, funder).
Uses of an evaluation: The specific ways that the STD program staff and other stakeholders will apply what is learned from the evaluation (e.g., change STD clinical practice, inform STD prevention policy).

CASE STUDY
Cactus City, a medium-sized city located in the American Southwest, has been implementing a gonorrhea control program targeting Hispanic/Latino males and females, with a particular focus on Mexican Americans. The program consisted of:
a. Development of new, or revision of existing, STD prevention materials by the STD program, to be disseminated by all city clinics serving Mexican Americans and made available to CBOs and other providers of health information in the Latino community.
b. Outreach to health care providers, including physicians (i.e., OB/GYNs and general practitioners) and community-based clinics, to provide updated information about the outbreak and encourage them to report all GC (gonorrhea) infections, as required by law.
c. Expanded efforts by Latino CBOs to distribute STD prevention awareness materials and condoms to their clients.
The initiative was developed a year earlier, under pressure from the mayor's office, in an effort to respond to a local news report on the growing epidemic of gonorrheal infection in the Mexican-American community and to protests from Latino advocates that the health care needs of this community were being ignored. A plan of action was developed with input from the city's STD, family planning, laboratory, surveillance, and budget offices, as well as an advisory committee composed of community leaders, Latino advocates, and various health care providers serving the Latino community. The interventions are being evaluated using the following process evaluation questions, which help assess the implementation of the different activities of the initiative:
1. Are the STD prevention materials (e.g., pamphlets, fact sheets, posters) that were developed for use by the city health clinics culturally appropriate for the target population (i.e., Mexican-American males and females)?
2. To what extent did the city health clinics disseminate the STD prevention materials to the target population?
3. To what extent was updated information about the outbreak provided to health care providers who serve the target population?
4. To what extent did Hispanic/Latino/Mexican/Mexican-American CBOs distribute STD prevention awareness materials (e.g., fact sheets, referral information) as well as condoms to Mexican-American clients?

Answer the following questions using the information provided in the above case study:
- Who are the stakeholders involved in this evaluation?
- What would you say are their interests in this evaluation?
- Who would use the evaluation results, and how would they use them?
- How and when would you share the results of the evaluation with the different audiences/users you identified?
(Note: Table 1 at the end of the tool lists possible answers to the case study.)

REFERENCES
Centers for Disease Control and Prevention. (1999). Framework for program evaluation in public health. MMWR Recommendations and Reports, 48(RR-11), pp. 22-25. Retrieved February 22, 2005, from http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm
Centers for Disease Control and Prevention. (2002). Physical activity evaluation handbook. Retrieved October 17, 2004, from http://www.cdc.gov/nccdphp/dnpa/physical/handbook/index.htm
Jones, D. T., Parker, B., & Scrivner, S. Program evaluation of Milwaukee's sexually transmitted disease clinic. Retrieved June 19, 2005, from http://www.lafollette.wisc.edu/publications/workshops/20002001/spring/PA869/STD.pdf
MacDonald, G., Starr, G., Schooley, M., Yee, S. L., Klimowski, K., & Turner, K. (2001, November). Introduction to program evaluation for comprehensive tobacco control programs. Atlanta: Centers for Disease Control and Prevention. Retrieved October 17, 2004, from http://www.cdc.gov/tobacco/evaluation_manual/Evaluation.pdf
Mathematica Policy Research, Inc. (2005). Abstinence education programs increased youths' support for abstinence; effects on expectations to remain abstinent less clear. Retrieved June 19, 2005, from http://www.mathematica-mpr.com/Press%20Releases/abstinence.asp
W.K. Kellogg Foundation. (1998, January). W.K. Kellogg Foundation evaluation handbook. Retrieved October 17, 2004, from http://www.wkkf.org/Programming/ResourceOverview.aspx?CID=281&ID=770

Table 1: Possible Answers to Case Study Exercise

STAKEHOLDER: STD Program Director/Manager
- Stakeholder interests: Appropriateness of the STD prevention materials for the target population. Whether material was disseminated to the target population. Whether updated material was supplied to health care providers serving the target population.
- Evaluation audiences: STD program director and staff; STD clinic staff; other health department staff (family planning, laboratory).
- Evaluation uses: Improve interventions' implementation. Allocate resources to different activities, if needed. Obtain additional/future funding.
- Communication/dissemination methods: Interim reports and oral presentations from the evaluation team. Full written report (with accompanying PowerPoint/graphics presentation).

STAKEHOLDER: CBOs participating in the intervention
- Stakeholder interests: Effectiveness of CBOs in distributing information and condoms to the target population.
- Evaluation audiences: Participating CBOs and other organizations focusing on Latino health; STD program director and staff; Latino community advocates.
- Evaluation uses: Improve information and condom distribution in the target population. Justify funding by CBOs for continuation of activities.
- Communication/dissemination methods: Interim briefing. Executive summary. Oral presentation with graphics.

APPENDIX: EVALUATION REPORT COMPONENTS
An evaluation report is typically organized as follows:
1. Title Page: includes the title of the evaluation, authors, and reference date.
2. Executive Summary: a 1-2 page document that provides a concise description of the evaluation purpose and procedures, evaluation results, conclusions, and recommendations. Since this information can be extracted from sections of the full report, the summary is written last, but presented at the beginning of the report. Because some audiences may read only the executive summary, it needs to clearly address the key points of the evaluation.
3. Program/Activity Purpose: provides background information and rationale for the program activity or component evaluated, states its goals and process/outcome objectives, and describes the target population(s).
4. Program/Activity Description: includes a logic model of the activity or program component evaluated and the staff involved in the evaluation.
5. Evaluation Design and Methods: includes the specific process and outcome evaluation questions and their related indicators.
This section also elaborates on data collection (instruments used, data sources, sample selection) and data management and analysis issues, and it notes the limitations of the evaluation study.
6. Results: presents quantitative and/or qualitative evaluation findings corresponding to each of the evaluation questions. Graphs and tables are included in this section to illustrate the key findings.
7. Conclusions and Recommendations: provides answers to each of the evaluation questions based on the findings, describes the extent to which the objectives pertaining to the STD program activity/component were reached, and poses action-oriented recommendations.
8. Appendices (e.g., evaluation instrument, observation log).

TOOL 6.2: USE EVALUATION FINDINGS TO MODIFY, STRENGTHEN, AND IMPROVE YOUR PROGRAM

INTRODUCTION
This is the final step of your evaluation process. Because an evaluation serves its purpose only if the results are used, this tool emphasizes how to use evaluation findings and recommendations to make decisions about your STD program activities. The flowchart below summarizes all the evaluation activities that have been presented thus far, and depicts where the task of using your evaluation findings fits with previous program evaluation activities.

LEARNING OBJECTIVE
Upon completion of this tool, you will be able to:
- Use evaluation results to improve your program.

HOW DO YOU ENGAGE STAKEHOLDERS TO PROMOTE THE USE OF EVALUATION FINDINGS?
One of the main purposes of evaluating a program component or activity is to use the findings for program decision making. You want to avoid undertaking an evaluation that simply generates one more report for decision makers without its recommendations being implemented. The following activities can help increase the probability that your evaluation findings will be used:
1. Work with stakeholders throughout the evaluation process so the results are actually used. Planning the evaluation of a program component or activity requires that you focus, at the early planning stages, on stakeholders' key questions, their issues, and how they will use the results. Do not wait until the end of the evaluation to get in touch with stakeholders. Conduct regular meetings with them to plan and address their concerns about the evaluation and to brainstorm possible solutions for issues as they arise. Stakeholders who merely participate in an initial meeting are less likely to be fully engaged and committed to the evaluation; those who are involved throughout are more likely to be wholeheartedly committed, take an interest in the findings, and use them to strengthen the program.
2. Share information about the evaluation in a timely manner. Balance the need to provide complete information to stakeholders with providing limited but important information in a timely manner. For example, a complete evaluation report received by stakeholders in July, after resource allocation decisions were made in June, is less likely to be used than preliminary findings delivered in May.
3. Choose methods of sharing evaluation findings that will encourage evaluation use. See Tool 6.1 for guidance on how to disseminate evaluation findings to stakeholders. Remember that the methods you use to share evaluation findings with different stakeholders may affect their willingness to make programmatic decisions.
4. Follow up with decision makers and other stakeholders on the progress toward implementing recommendations.
Have a post-evaluation meeting with stakeholders six months after the completion of the evaluation process to assess progress toward addressing the recommendations and findings of the evaluation. If some recommendations remain to be implemented, determine the reasons why and develop an action plan with a timeline and the person(s) responsible for their implementation. Then follow up six months later to find out about the status of the action plan and reconvene accordingly.

HOW DO YOU USE EVALUATION FINDINGS FOR PROGRAM IMPROVEMENT?
1. Use the evaluation findings to understand how your program is implemented. Results from process evaluation allow you to determine whether program activities are conducted as planned and, if not, the reasons why, so that you have more information on which to base decisions. You can use these findings to modify approaches to serve the target population(s), increase their access to program activities and services, improve STD program delivery, and reallocate resources. Understanding the adequacy of staffing patterns and resource allocation can provide you with useful information for current and future STD program planning.
2. Use the evaluation findings to get an idea of your program's effectiveness. You can use the results of outcome evaluations to determine the changes produced in the target population (e.g., awareness, knowledge, skills, behaviors) resulting from your STD program activities. Although many designs used in program evaluation do not allow you to determine with certainty whether your program caused a particular change, you can use program evaluation findings to understand who benefited from the program and how, and which program activities most likely contributed or did not contribute to the program's effectiveness. When you understand which strategies did not work or did not result in sufficient change among participants, you can use these valuable lessons to make program modifications. For example: The STD program was providing technical assistance and training to the family planning clinic staff on the implementation of chlamydia screening, counseling, and treatment protocols. The clinical staff was interviewed and medical charts were reviewed to determine if there had been a change in practice since this collaboration started. It was found that 80% of clinical staff were fully implementing the protocols, compared to 55% at baseline. Based on this finding, the STD program staff, the family planning clinic director, and other stakeholders agreed to continue the collaboration and to offer a refresher workshop on chlamydia to clinicians and other clinic staff providing chlamydia screening, counseling, and treatment, and a basic workshop for new staff.
3. Use the evaluation findings to identify training and technical assistance needs. Evaluations often provide insights into what is working well and what is not. You can use these findings to supplement other information sources regarding future training needs of STD program staff.
4. Use the evaluation findings to allocate program resources. Based on the evaluation findings, you can reduce or increase funding for a certain STD program component/activity. Evaluation findings can provide a strong justification for allocating funds to those activities that are producing the desired results and having a positive effect on the target population(s).
5. Use the evaluation findings to identify funding for program continuation.
If the program is achieving the intended outcome(s), the evaluation findings can be used to convince a potential funder that program continuation can make a difference in the target population and therefore is important. Conversely, if the program is not achieving the intended outcome(s) and the evaluation findings point to a lack of resources, the STD program staff can develop a funding proposal using the evaluation not only as justification for the request, but also to address how the opportunities for improvement can be implemented, given the differing levels of funding available.

SUMMARY CHECKLIST: Use Evaluation Findings to Modify, Strengthen, and Improve Your Program
- Work with stakeholders throughout the evaluation process so the results are actually used.
- Share information about the evaluation in a timely manner.
- Choose methods of sharing evaluation findings that will encourage evaluation use.
- Use the evaluation findings to: understand how your program is implemented; get an idea of your program's effectiveness; identify training and technical assistance needs; allocate program resources; and identify funding for program continuation.

CONCLUSION AND NEXT STEPS
In this tool you learned the importance of engaging stakeholders in ongoing discussions about the implications and use of the evaluation findings, and about some ways to use the findings for program decision making. Program evaluation is a cyclical process. The findings of an evaluation should prompt STD program staff and other stakeholders to develop new evaluation questions pertaining to the same or to other program components or activities. Ongoing program evaluation and use of its findings will ultimately lead to stronger STD programs that continue to provide high-quality services to those affected by STDs and positively impact their health.

ACRONYMS USED IN THIS TOOL
CBO: Community-based organization
DIS: Disease Intervention Specialist
STD: Sexually transmitted disease

KEY TERMS
Effectiveness: Relates to outcome evaluation; refers to the contribution a program makes to producing changes in the target population/organization.
Stakeholders: Individuals or organizations directly or indirectly affected by your STD program and/or the evaluation results (e.g., STD program staff, family planning staff, representatives of target populations).

CASE SCENARIO
The following case scenario, although not exhaustive, illustrates some of the ways an STD program used evaluation findings for program decision making. The STD program staff engaged other stakeholders throughout their evaluation process, so the stakeholders were more receptive to implementing the recommendations. (This is a follow-up of the case scenario included in Tool 5.2.)

BACKGROUND: Over the past year, Project Area has reported a low number of sexual contacts initiated for gonorrhea cases (i.e., <1 sexual contact initiated per patient interviewed) among adolescents. Program management decided to intensify efforts to increase the number of contacts identified and found.
PURPOSE OF THE EVALUATION: Determine why the project area is reporting a low number of sexual contacts initiated, in order to take corrective action.
SAMPLE EVALUATION QUESTIONS: (When you develop your evaluation questions, you will probably have more than these three.)
1. Are the three disease intervention specialists (DIS) following standard protocols for eliciting sexual contacts from gonorrhea-infected individuals?
2. Are all contacts being recorded appropriately?
3. What factors contribute to the low number of initiated sexual contacts?
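The "low number of sexual contacts initiated" that prompted this evaluation is commonly tracked as a contact index: the number of sexual contacts initiated per patient interviewed. Before turning to the indicators, here is a minimal sketch, in Python, of how such an index might be computed from interview records; the record layout, field order, and DIS labels are hypothetical and only meant to illustrate the calculation.

```python
# Illustrative sketch: computing a contact index (sexual contacts initiated per
# patient interviewed), overall and per DIS. The records below are made up.
from collections import defaultdict

# Each record: (DIS identifier, patient ID, contacts initiated from that interview)
interviews = [
    ("DIS_A", "P01", 0), ("DIS_A", "P02", 1), ("DIS_A", "P03", 0),
    ("DIS_B", "P04", 2), ("DIS_B", "P05", 0),
    ("DIS_C", "P06", 1), ("DIS_C", "P07", 0), ("DIS_C", "P08", 0),
]

patients_by_dis = defaultdict(int)
contacts_by_dis = defaultdict(int)
for dis, _patient_id, contacts in interviews:
    patients_by_dis[dis] += 1
    contacts_by_dis[dis] += contacts

for dis in sorted(patients_by_dis):
    index = contacts_by_dis[dis] / patients_by_dis[dis]
    print(f"{dis}: {contacts_by_dis[dis]} contacts / {patients_by_dis[dis]} patients "
          f"= contact index {index:.2f}")

overall = sum(contacts_by_dis.values()) / sum(patients_by_dis.values())
print(f"Project area overall: {overall:.2f}")   # values below 1.0 match the scenario
```

Broken out per DIS, a low overall index can point to where protocol adherence, caseload, or training issues are concentrated, which is what the indicators below are designed to capture.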
INDICATORS:
- Number of DIS who follow the elicitation protocols all the time.
- Barriers identified by DIS pertaining to the elicitation process.
- Barriers identified by DIS in following the protocol for recording sexual contacts.
- Barriers and facilitators identified by DIS and their supervisor(s) in eliciting sexual contacts of gonorrhea cases.

FINDINGS: Observations of DIS conducting interviews with gonorrhea cases revealed that many DIS frequently did not follow the elicitation protocols completely. According to DIS supervisors who were interviewed: (1) they were having to spend an increasing amount of time on administrative paperwork and did not have sufficient time for observing and mentoring DIS, (2) there was high staff turnover, and (3) although DIS staff had interviewing experience, they were relatively new (4 months) to the STD program and this job. DIS staff were also interviewed regarding their comfort level in eliciting information from cases, training opportunities, barriers in identifying sexual contacts, and support from their supervisor and program management. In many instances, several visits were needed to identify contacts, and because of each DIS's caseload, following up with each identified case took longer than expected. The number of cases assigned to each DIS was more than they could complete in a timely manner. In addition, many of the gonorrhea cases were adolescents, and the interviews were considered particularly challenging by those three DIS. DIS indicated that they would like to learn about ways to gather more information about adolescents' sexual contacts and their sexual venues, and how to discuss risk prevention and treatment with this population.

INTERPRETATION: DIS are relatively new to the job and need more training on the implementation of elicitation protocols and on interviewing skills, particularly when working with adolescents. They also need to receive more mentoring and guidance from their supervisors, who are often bogged down with administrative duties.

RECOMMENDATIONS:
- Provide ongoing training to DIS on partner elicitation, particularly when interviewing adolescents.
- Develop peer-to-peer education by pairing DIS with varying skill levels to encourage DIS to learn from each other.
- Set up regular mentoring activities between DIS and their supervisors.
- Delegate some of the supervisors' administrative duties to administrative staff so that supervisors can concentrate on improving the quality and output of clinic DIS.

USE OF EVALUATION FINDINGS:
To understand how the program is implemented: The evaluation findings were used to add workshops for the DIS, with an emphasis on adolescent interviews. In addition, a buddy system was developed in which DIS could observe and learn from each other to develop their elicitation skills. Lastly, the supervisors' role was revised to reduce certain administrative tasks and free up their time for observing and mentoring DIS.
To allocate program resources: Funds were allocated for ongoing training for DIS.
To identify training and technical assistance needs: The evaluation findings were used to plan the content of additional training to be offered to the DIS.

REFERENCES
Centers for Disease Control and Prevention. (1999). Framework for program evaluation in public health. MMWR Recommendations and Reports, 48(RR-11). Retrieved October 17, 2004, from http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm
Centers for Disease Control and Prevention. (2002). Physical activity evaluation handbook. Retrieved October 17, 2004, from http://www.cdc.gov/nccdphp/dnpa/physical/handbook/index.htm
MacDonald, G., Starr, G., Schooley, M., Yee, S. L., Klimowski, K., & Turner, K. (2001, November). Introduction to program evaluation for comprehensive tobacco control programs. Atlanta: Centers for Disease Control and Prevention. Retrieved October 17, 2004, from http://www.cdc.gov/tobacco/evaluation_manual/Evaluation.pdf
McKenzie, J. F., & Smeltzer, J. L. (1997). Planning, implementing, and evaluating health promotion programs: A primer (2nd ed.). New York: Macmillan.
Patton, M. Q. (1997). Utilization-focused evaluation: The new century text (3rd ed.). Thousand Oaks, CA: Sage.
Rossi, P. H., Freeman, H. E., & Lipsey, M. W. (1999). Evaluation: A systematic approach (6th ed.). Thousand Oaks, CA: Sage.