Step 8: Improve Program

Evaluation does not end with the writing of a report. To benefit from the evaluation, its findings must be put to use! This section suggests ways you can use your evaluation results and offers advice on encouraging others to help implement desired program changes.


How can I use my evaluation results to benefit my program?

Improve your program's visibility and outreach

  • Advertise the program’s successful results to demonstrate the impact it has on your target audience or the community. This strategy can help you gain additional support from funders and partnering organizations as well as recruit new participants.
  • Improve outreach to your target audience. You may have learned from the evaluation that your program is not reaching as many individuals as intended or is failing to reach a particular group. Use these findings to identify ways to increase awareness of your program’s offerings.

From the Foellinger-Freimann Botanical Conservatory

The Foellinger-Freimann Botanical Conservatory offers field trip programs to local schools.

The evaluation:

After noticing that many school groups came unprepared for their field trips, the Conservatory decided to evaluate the effectiveness of the pre- and post-visit materials it offers teachers. Through surveys of local teachers and school administrators, the Conservatory learned that teachers did not have time to look over the pre-visit materials and that many teachers and administrators were unaware of the Conservatory’s range of programs.

Improvements made:

Based on the evaluation, the Conservatory created pre-visit orientation videos for students and teachers, which were shared with school district libraries. Any teacher planning a field trip was also required to take a personal tour of the Conservatory prior to his or her class’s visit. Additionally, the Conservatory gave presentations at school district meetings to increase administrators’ awareness of its programs. These changes, among others, resulted in smoother field trip experiences, garnered administrative support for the programs, and fostered positive reactions to the Conservatory’s offerings.

Source: Nate Meyer, Regional Extension Educator, Environmental Science Education, University of Minnesota Extension


Improve how your program is delivered or implemented

  • Reallocate resources to support the components of your program that are most effective. You may have learned that certain aspects of your program are more successful at producing desired outcomes than others. If so, consider investing more resources in those aspects and cutting back on other, less successful parts.
  • Address organization and planning problems to make the delivery and implementation of your program more successful and increase participant satisfaction.

From the Chesapeake Bay Foundation

The Chesapeake Bay Foundation (CBF) works cooperatively with government, business, and citizens to protect and restore the Chesapeake Bay. CBF seeks to reduce pollution, restore habitat and replenish fish stocks, and educate and engage constituents to take action for the Bay.

The evaluation:

In 2001, CBF underwent a comprehensive external evaluation of their environmental education (EE) programs aimed at K-12 teachers and students. The evaluation allowed CBF to examine the scope of twenty-five years of programming, specify what program outcomes to measure, and determine whether desired accomplishments occurred. The results allowed CBF to match their expected outcomes with program design components, compare specific components of their overall program, and determine what design changes could increase program effectiveness.

Improvements made:

Results revealed that some CBF education programs were not working synergistically to create the greatest impact:

  • CBF staff-led field trips were identified as a highly effective program component, but many CBF-trained teachers who used CBF curricular materials with their students did not bring their students on field trips.
  • Participation in restoration projects was found to be less effective, likely due to the difficulty associated with implementing the restoration projects in a complete and meaningful way with large groups of students.
  • Students who participated in multiple CBF programs (other than the restoration project), such as engaging in classroom curriculum activities along with a CBF staff-led field trip, had the most positive outcomes.

The evaluation led CBF to focus on making their programs work synergistically to achieve greater program effectiveness with a smaller target number of participants. They developed a training program, Chesapeake Classrooms, requiring teachers to participate in all aspects of CBF’s formal education offerings, including curriculum materials and field trips. Recruitment shifted to focus on teachers willing to make this more involved commitment. Furthermore, CBF reduced their resource investment in K-12 restoration projects and increased their investment in teacher training and the combined programming package.

Source: Don Baugh, Vice President for Education, Chesapeake Bay Foundation



From the Denver Zoo's W.I.N.-W.I.N. program

The Wonders In Nature - Wonders In Neighborhoods (W.I.N.-W.I.N.) program was developed by Denver Zoo and the Colorado Division of Wildlife. Together with the support of more than thirty partners, they provide EE programs for urban students in the Denver metro area through classroom visits, pre- and post-visit activity curricula, field trips, and family days.

The evaluation:

In 2000-2001, a W.I.N.-W.I.N. evaluation system was developed under the guidance of a team of external evaluators. The data gathered from this stakeholder-based evaluation suggested that the program was successful overall. One specific finding, however, was that participants and partners were frustrated with the logistics of registering for and coordinating programs.

Improvements made:

In light of these results, W.I.N.-W.I.N. switched from a paper-based registration process to an electronic database. The new system provided automatic coordination among partners, schools, and the bus company used for field trips, thereby significantly improving communication and reducing registration and scheduling errors.

Source: Chasta Beals, W.I.N.-W.I.N. Logistics Specialist


Improve the content of your program

  • Change content based on participants’ feedback or other evaluation results. This may be appropriate, for example, if participants indicated that they wanted to know more about certain topics or felt that certain topics were overemphasized.

From the Environmental Learning Center

The Environmental Learning Center (ELC) in Florida teaches themed lessons to 1st, 3rd, and 4th graders. For example, the 4th grade “Lagoon Days” program uses volunteer instructors to lead students through six learning stations covering concepts such as benthic ecology, seining, Florida history, canoeing, adaptations, and birding.

The evaluation:

The Lagoon Days program has been extensively evaluated using a pre/post questionnaire for students, a teacher survey, and a volunteer survey. One major finding was that the program was only moderately successful in developing understanding of key concepts that correspond to education benchmarks.

Improvements made:

ELC has used the pre/post student results to improve the program, with special emphasis on subjects that meet key education benchmarks. The evaluation results have also informed the center’s volunteer training. Efforts have focused on increasing volunteers’ understanding of the concepts and creating key concept teaching aids for the volunteers to use.

Source: Heather Stapleton, Education Coordinator

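A pre/post questionnaire like ELC’s lends itself to a simple analysis: compare average scores on each concept before and after the program, and flag the concepts with the smallest gains for extra emphasis. The sketch below, in Python, illustrates the idea; the concept scores and the 10-point cutoff are hypothetical, not ELC’s actual data.

```python
# Hypothetical pre/post scores (percent correct) on four Lagoon Days concepts.
scores = {
    "benthic ecology": {"pre": 42.0, "post": 58.0},
    "seining": {"pre": 35.0, "post": 71.0},
    "adaptations": {"pre": 50.0, "post": 55.0},
    "birding": {"pre": 44.0, "post": 69.0},
}

GAIN_THRESHOLD = 10.0  # assumed cutoff: smaller gains flag concepts to emphasize

# List concepts from smallest to largest gain so weak results stand out first.
for concept, s in sorted(scores.items(), key=lambda kv: kv[1]["post"] - kv[1]["pre"]):
    gain = s["post"] - s["pre"]
    note = "  <- emphasize in volunteer training" if gain < GAIN_THRESHOLD else ""
    print(f"{concept:16} pre={s['pre']:5.1f} post={s['post']:5.1f} gain={gain:+5.1f}{note}")
```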

Inform future evaluations

  • Test alternative ways to assess outcomes.
  • Learn from the mistakes and limitations of past evaluations to avoid these pitfalls in the future. Likewise, take note of what evaluation techniques were particularly worthwhile.

From MEEA & EETAP

The Missouri Environmental Education Association (MEEA) works to advance EE in Missouri through professional development and networking opportunities. The Environmental Education and Training Partnership (EETAP) serves as a leader in delivering environmental education training to education professionals. Grant funds from EETAP allowed MEEA to develop a publicly available, statewide EE resource database of individuals, institutions, and businesses/industry interested in enhancing education and the environment.

The evaluation:

To evaluate the MEEA Resource Database, subscribers were surveyed regarding their knowledge of and attitudes toward the database, as well as their intentions to use it as a means to advertise their resources, programs, and services. While 80% of subscribers agreed or strongly agreed that the database was important for promoting their resources, MEEA felt that there was still more to be learned and that additional information could help refine the database to make it more useful for both current and potential subscribers.

Improvements made:

As part of the resource database evaluation, MEEA made several recommendations to both improve the database and inform future evaluations. These include:

  • conducting in-depth interviews with database subscribers to obtain greater insight through qualitative data,
  • tracking database use to obtain data on visits, hits, and ways the database is being used (a minimal tallying sketch follows this list), and
  • identifying barriers to awareness and use by surveying educators who do not use the database, but would be expected to.
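The second recommendation, tracking database use, can start with a simple tally of the server’s request log. Below is a minimal sketch in Python; the log file name and its columns are assumptions for illustration, not a description of MEEA’s actual system.

```python
import csv
from collections import Counter

# Assumed log format: one row per page request, with columns
# "date", "visitor_id", and "page". The file name is hypothetical.
visits = set()
hits = Counter()

with open("database_usage_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        visits.add((row["date"], row["visitor_id"]))  # one visit per visitor per day
        hits[row["page"]] += 1                        # every request counts as a hit

print(f"Visits: {len(visits)}   Total hits: {sum(hits.values())}")
print("Most-requested pages:")
for page, n in hits.most_common(5):
    print(f"  {page}: {n}")
```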

Source: MEEA and EETAP (site no longer available)

Help advance the field of environmental education

Sharing the results of your evaluation can help other environmental educators and program managers design not only more effective educational experiences but also better evaluations of their own programs. You can offer to share your evaluation reports with other MEERA users by submitting them to Dr. Michaela Zint.

How can I ensure the evaluation is used to improve my program?

At this point, the usability of your evaluation depends largely on the evaluation questions you chose and the extent to which you used a participatory approach. Even now, however, there are things you can do to help ensure that the evaluation will benefit your program:

Remember to work as a team.

Finalize and distribute your report promptly

This way you will not miss opportunities to influence important decisions; if you delay distribution, your findings may no longer be relevant by the time they are shared. There are exceptions, however, when you may want to wait. For example, it may make sense to share your report just before an annual planning meeting or other important decision-making meeting, when participants will have a concrete reason to pay close attention to the findings (RBFF, 2006).

Be strategic about how you share your results

Communicate your findings directly to those you want to use the information, and do so in ways that will appeal to them. For example, rather than disseminating an evaluation report and hoping it will be read, develop tailored presentations of your results for specific individuals or groups. Provide the information that is most relevant to stakeholders’ priorities, suggest ways you plan to address those priorities, and include specific actions they can take to help implement the recommendations.

Follow up!

After sharing your report or recommended changes with intended users, make sure they have a chance to discuss whether, to what extent, and how best to implement them. One way you can help ensure the recommendations are acted upon is by developing an implementation plan (see Tip) and indicating how you will help carry it out.

In what other ways can I get the most from the evaluation?

All of the above will help ensure that your evaluation is noticed and that its recommendations are acted on. Here are some additional ideas for getting the most from your evaluation:

Evaluate the evaluation!

At this stage of the process, take some time to reflect on the evaluation. What went well? What would you do differently if you could do it over? Specifically, what would you do differently to help ensure that recommendations will be acted on? Reflecting on the evaluation and its influence will help to improve future evaluations.

For a description of things to consider when evaluating the evaluation, refer to:

  • Project Evaluation Toolkit (.pdf)
    Center for the Advancement of Learning and Teaching, University of Tasmania
    Beginner
    Section 14 “Evaluating the Evaluation” shows how you can apply the process you used to develop and implement the evaluation to evaluate it, in turn.

Have a plan!

Document changes to the organization that may have occurred as a result of the evaluation

Not only can the evaluation’s recommendations, when acted on, help to improve your program, but so can the very process of conducting the evaluation. A positive evaluation experience can stimulate improvements in organizational culture, teamwork, and relationships. The evaluation process typically increases participants’ understanding of the program and their motivation to help it succeed. The capacity-building effects of the process can lead participants to develop lasting skills and habits in critical thinking, problem-solving, leadership, and the practice of evaluation itself. Documenting benefits to staff and the organization as a whole will help justify further investments in evaluation.

If you want to learn more about how to use evaluations and how to ensure the use of evaluations, review:

  • W.K. Kellogg Foundation Evaluation Handbook (.pdf)
    W.K. Kellogg Foundation (1998).
    Beginner
    Step 9, "Utilizing the Process and Results of Evaluation," provides an overview of how evaluations can be used to improve programs. It stresses the importance of usable results, describes ways to use findings, and explains how to learn from the evaluation process itself.
  • Best Practices Guide to Program Evaluation for Aquatic Educators (.pdf)
    Recreational Boating and Fishing Foundation. (2006).
    Beginner Intermediate
    Pages 86-87 in Chapter 5, “Create Useful Results from Evaluation Data,” describe how evaluations can be used and make the case for monitoring how evaluations are used. Factors that influence the use of evaluations are also described.
  • Measuring Progress: An Evaluation Guide for Ecosystem and Community-Based Projects (.pdf)
    Ecosystem Management Initiative, University of Michigan, 2004.
    Intermediate
    Stage D, “How Will You Use the Information in Decision-Making?” explains how to use the evaluation to create an action plan for improving your program. The proposed approach is based on “trigger points”: predetermined values of an indicator that signal the need to take action. This section guides you through establishing trigger points for indicators, determining what actions to take when trigger points are reached, and specifying who will be responsible for taking these actions. (A minimal sketch of this trigger-point logic appears after this list.)
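To make the trigger-point approach concrete, here is a minimal sketch in Python. The indicators, threshold values, owners, and actions are invented for illustration; the guide walks you through choosing real ones for your program.

```python
# Each trigger point pairs an indicator with a predetermined value,
# the action to take when that value is crossed, and who is responsible.
# All entries below are hypothetical examples.
TRIGGER_POINTS = [
    {"indicator": "participant satisfaction (1-5)", "minimum": 3.5,
     "action": "revise how the program is delivered", "owner": "program coordinator"},
    {"indicator": "field trips per trained teacher", "minimum": 1.0,
     "action": "follow up with trained teachers", "owner": "education director"},
]

# Current indicator values, e.g., drawn from this year's evaluation data.
current_values = {
    "participant satisfaction (1-5)": 3.2,
    "field trips per trained teacher": 1.4,
}

for tp in TRIGGER_POINTS:
    value = current_values[tp["indicator"]]
    if value < tp["minimum"]:  # trigger point reached: action is needed
        print(f"{tp['indicator']} = {value} (trigger: {tp['minimum']}) -> "
              f"{tp['owner']}: {tp['action']}")
```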

References

Worthen, B. R., Sanders, J. R., & Fitzpatrick, J. L. (1997). Program evaluation: Alternative approaches and practical guidelines. New York: Longman.

W.K. Kellogg Foundation. (2006). Using evaluation findings. Downloaded September 25, 2006 from: www.wkkf.org/Default.aspx?tabid=90&CID=281&ItemID=2810022&NID=2820022&LanguageID=0

EMI (Ecosystem Management Initiative). (2004). Measuring progress: An evaluation guide for ecosystem and community-based projects. School of Natural Resources and Environment, University of Michigan. Downloaded September 20, 2006 from: www.snre.umich.edu/ecomgt/evaluation/templates.htm

RBFF (Recreational Boating and Fishing Foundation). (2006). Best practices guide to program evaluation for aquatic educators.
