Lessons from the Field

As you begin a program evaluation, perhaps for the very first time, you may be interested in other environmental educators’ experiences with evaluation. This section offers lessons and insights practitioners shared with us through a series of interviews. Many of the examples summarized here are also included elsewhere in MEERA.

Why should I conduct an evaluation?

Environmental educators have different reasons for conducting evaluations. The following is one practitioner’s view of the importance of evaluation. For more information on this topic, see Evaluation: What is it and why do it?

Audio: Why should environmental educators conduct evaluations? (approx. 2 minutes)

Gus Medina, Project Manager, Environmental Education and Training Partnership

 

Why should I pilot test my instrument?

Pilot testing of data collection instruments is a key component of the evaluation process. Pilot testing can identify problems such as errors and biases. For example, if your participants do not interpret your questions the same way you do, your findings won’t provide the information you are looking for. The following is an example of how pilot testing helped Friends of Coastal South Carolina (formerly the South Eastern Wildlife and Environmental Education Association) discover a cultural bias in their evaluation instrument. For additional information on pilot testing and other aspects of data collection, see Step 5: Collect Data.

From Friends of Coastal South Carolina & EUGENE

The Friends of Coastal South Carolina’s Earth Stewards is an eight-week cross-curricular freshwater wetlands program for 5th graders that consists of multiple in-school and field site lessons.

The evaluation:

To evaluate Earth Stewards students' knowledge gains, Friends of Coastal South Carolina worked with the USDA Forest Service to develop an evaluation tool called Ecological Understanding as a Guideline for Evaluating Nonformal Education, or EUGENE for short. EUGENE assesses learning outcomes related to eight broad ecological principles. Before using EUGENE on a large scale, Friends of Coastal South Carolina pilot tested the instrument and learned that it was economically and culturally biased and was written at a reading level that was too difficult for some students.

For example, the instrument included a true/false statement that read, "No energy is transferred when a lizard sunbathes on a rock." When this question was used in a predominantly African-American school, one 5th grader asked what sunbathing meant, showing that the instrument included words that were not familiar to all of the evaluation participants. The statement was changed to, "No energy is transferred to a lizard that is laying on a rock in the sun."

Improvements made:

Cultural and economic biases were eliminated from EUGENE, and issues related to the reading level were addressed (for instance, students may have statements read aloud to foster understanding). Further piloting and refinement of the instrument is ongoing. Eventually, the EUGENE instrument will be more widely applied and administered by the different Friends of Coastal South Carolina programs.

Source: Karen Beshears, Executive Director, Friends of Coastal South Carolina 

Friends of Coastal South Carolina and EUGENE

Can I evaluate my program without overtaxing my resources?

Environmental educators may worry that they do not have the resources or capacity to conduct evaluations. Evaluations, however, do not have to be resource intensive to be useful. The following is an illustrative example of an evaluation based on a teacher satisfaction survey. To learn more about resources needed for evaluation see Step 1: Before you get started.

From the Elachee Nature Science Center

The Elachee Nature Science Center offers approximately 30 field trip opportunities to students and teachers in addition to programs for the general public. While their programs cater to school groups from pre-k to college, the majority of their field trips are designed for elementary school students.

The evaluation:

Rather than mount a resource-intensive study, Elachee gathers feedback on its field trip programs through a brief teacher satisfaction survey, an approach that requires little staff time or money while still giving the center regular feedback from the teachers it serves.

Source: Becky Jones, Education Program Manager, Elachee Nature Science Center

Elachee Nature Science Center

How can I use my evaluation results?

Environmental educators use evaluation results in a variety of ways. Below are stories from environmental educators about how they have used their evaluation results. For more on how to use evaluation results see Step 8: Improve Program.

From the Walt Disney Corporation

Audio: Results as rationale for internal funding (approx. 30 seconds)

Source: Dr. Jackie Ogden, Director of Animal Programs

Walt Disney Corporation

From the Foellinger-Freimann Botanical Conservatory

The Foellinger-Freimann Botanical Conservatory offers field trip programs to local schools.

The evaluation:

After noticing that many school groups came unprepared for their field trip, the Conservatory decided to evaluate the effectiveness of the pre- and post-visit materials that it offers teachers. Through surveys of local teachers and school administrators, the Conservatory learned that teachers did not have time to look over the pre-visit materials and that many teachers and administrators were unaware of the Conservatory’s range of programs.

Improvements made:

Based on the evaluation, the Conservatory created pre-visit orientation videos for students and teachers, which were shared with school district libraries. Any teacher planning a field trip was also required to take a personal tour of the Conservatory prior to his or her classroom’s visit. Additionally, the Conservatory gave presentations at school district meetings to increase administrators' awareness of its programs. These changes, among others, resulted in smoother field trip experiences, garnered administrative support for the programs, and fostered positive reactions to the Conservatory’s offerings.

Source: Nate Meyer, Regional Extension Educator, Environmental Science Education, University of Minnesota Extension

Foellinger-Freimann Botanical Conservatory

From the Chesapeake Bay Foundation

The Chesapeake Bay Foundation (CBF) works cooperatively with government, business, and citizens to protect and restore the Chesapeake Bay. CBF seeks to reduce pollution, restore habitat and replenish fish stocks, and educate and engage constituents to take action for the Bay.

The evaluation:

In 2001, CBF’s environmental education programs for K-12 teachers and students underwent a comprehensive external evaluation. This comprehensive evaluation allowed CBF to examine the scope of twenty-five years of programming, specify which program outcomes to measure, and determine whether desired accomplishments occurred. Results of the evaluation enabled CBF to match their expected outcomes with program design components, compare specific components of their overall program, and determine what design changes could increase program effectiveness.

Improvements made:

Results revealed that some CBF education programs were not working synergistically to create the greatest impact:

  • CBF staff-led field trips were identified as a highly effective program component, but many CBF-trained teachers who used CBF curricular materials with their students did not bring their students on field trips.
  • Participation in restoration projects was found to be less effective, likely due to the difficulty associated with implementing the restoration projects in a complete and meaningful way with large groups of students.
  • Students who participated in multiple CBF programs (other than the restoration project), such as engaging in classroom curriculum activities along with a CBF staff-led field trip, had the most positive outcomes.

The evaluation led CBF to focus on making their programs work synergistically to achieve greater program effectiveness with a smaller target number of participants. They developed a training program, Chesapeake Classrooms, requiring teachers to participate in all aspects of CBF’s formal education program offerings, including CBF curriculum materials and field trips. Recruitment shifted to focus on teachers willing to make this more involved commitment. Furthermore, CBF reduced their resource investment in K-12 restoration projects and increased their investment in teacher training and the combined programming package.

Source: Don Baugh, Vice President for Education, Chesapeake Bay Foundation

Chesapeake Bay Foundation

From the Denver Zoo's W.I.N.-W.I.N. program

The Wonders In Nature - Wonders In Neighborhoods (W.I.N.-W.I.N.) program was developed by the Denver Zoo and the Colorado Division of Wildlife. Together, with the support of more than thirty partners, they provide EE programs for urban students in the Denver metro area through classroom visits, pre- and post-visit activity curricula, field trips, and family days.

The evaluation:

In 2000-2001, a W.I.N.-W.I.N. evaluation system was developed under the guidance of a team of external evaluators. The data gathered from this stakeholder-based evaluation suggested that the program was successful overall. One specific finding, however, was that participants and partners were frustrated with the logistics of registering for and coordinating programs.

Improvements made:

In light of these results, W.I.N.-W.I.N. switched from a paper-based registration process to an electronic database. The new system provided automatic coordination among partners, schools and the bus company used for field trips, thereby significantly improving communication and reducing registration and scheduling errors.

Source: Chasta Beals, W.I.N.-W.I.N. Logistics Specialist

Denver Zoo’s W.I.N.-W.I.N program

From Michigan Sea Grant: Summer Discovery Cruises

Summer Discovery Cruises are a component of Michigan Sea Grant’s community-based education efforts. The intent of these programs is to improve marine and aquatic scientific literacy, and raise awareness of Great Lakes issues such as coastal habitat, fisheries and invasive species. The cruises are free-choice learning programs, with themes such as nature, history, art, and wetlands.

The evaluation:

Evaluation is a regular component of the Summer Discovery Cruises, and evaluation data is collected on each cruise trip. The assessment tool is a survey completed by participants at each cruise’s conclusion. Questions relate to facilities, perceptions of instructor knowledge, behavioral intentions, and other aspects. At the end of each season, results are compiled, examined, and synthesized in a short report.
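
For readers curious about what the season-end compilation step can look like, the following is a minimal sketch, not drawn from the actual Summer Discovery Cruises instrument: the survey fields, the 1-5 rating scale, and the cruise names are illustrative assumptions only.

```python
# Illustrative sketch: the survey fields, 1-5 rating scale, and cruise names are
# assumptions, not details of the actual Summer Discovery Cruises survey.
from collections import defaultdict
from statistics import mean

# Each record represents one participant's end-of-cruise survey response.
responses = [
    {"cruise": "Wetlands", "facilities": 4, "instructor_knowledge": 5, "intends_to_return": True},
    {"cruise": "Wetlands", "facilities": 2, "instructor_knowledge": 5, "intends_to_return": True},
    {"cruise": "History",  "facilities": 5, "instructor_knowledge": 4, "intends_to_return": False},
]

# Group responses by cruise theme, then average each rated item for the season report.
by_cruise = defaultdict(list)
for r in responses:
    by_cruise[r["cruise"]].append(r)

for cruise, records in sorted(by_cruise.items()):
    print(f"{cruise}: n={len(records)}")
    for item in ("facilities", "instructor_knowledge"):
        print(f"  mean {item}: {mean(r[item] for r in records):.1f}")
    returning = sum(r["intends_to_return"] for r in records) / len(records)
    print(f"  intend to return: {returning:.0%}")
```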

Improvements made:

Results from the surveys have been used for a variety of purposes. Because many of the cruise programs continue from year to year, results allow program staff to incrementally adjust and improve the cruises over time. For instance, responses indicated that the hard seats on the boats were uncomfortable for the passengers, particularly as cruises last from 2½ to 4 hours. To improve participant comfort, which can be a critical factor in program effectiveness, boats were outfitted with cushions. Evaluation results are also shared with program partners, helping to justify the program and support its continuance. Furthermore, results are used to recruit new program partners.

Source: Steve Stewart, District Extension Sea Grant Educator,
Director, Great Lakes Education Program/Summer Discovery Cruises,
Michigan Sea Grant Extension

Michigan Sea Grant: Summer Discovery Cruises

From the Environmental Learning Center

The Environmental Learning Center in Florida teaches themed lessons to 1st, 3rd, and 4th graders. For example, the 4th grade “Lagoon Days” program uses volunteer instructors to lead students through six learning stations covering topics such as benthic ecology, seining, Florida history, canoeing, adaptations, and birding.

The evaluation:

The Lagoon Days program has been extensively evaluated using a pre/post questionnaire for students, a teacher survey, and a volunteer survey. One major finding was that the program was only moderately successful in developing understanding of key concepts that correspond to education benchmarks.
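
As a purely hypothetical illustration of how pre/post questionnaire results can point to the concepts needing more emphasis, the sketch below computes an average gain per concept; the concept names and scores are invented, not ELC data.

```python
# Hypothetical example: the concept names and scores are invented for illustration,
# not actual Lagoon Days evaluation data.
from statistics import mean

# Pre- and post-program scores (percent correct) for a small group of students, by concept.
pre = {"benthic ecology": [40, 55, 50], "adaptations": [60, 70, 65], "birding": [30, 35, 45]}
post = {"benthic ecology": [70, 80, 75], "adaptations": [65, 75, 70], "birding": [40, 45, 50]}

# The average gain per concept shows where understanding improved most and where
# the program was only moderately successful.
for concept in pre:
    gain = mean(post[concept]) - mean(pre[concept])
    print(f"{concept}: average gain {gain:+.1f} points")
```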

Improvements made:

ELC has used the pre/post student results to improve the program, with special emphasis on subjects that meet key education benchmarks. The evaluation results have also informed the center’s volunteer training. Efforts have focused on increasing volunteers’ understanding of the concepts and creating key concept teaching aids for the volunteers to use.

Source: Heather Stapleton, Education Coordinator

Environmental Learning Center

From MEEA & EETAP

The Missouri Environmental Education Association (MEEA) works to advance EE in Missouri through professional development and networking opportunities. The Environmental Education and Training Partnership (EETAP) serves as a leader in delivering environmental education training to education professionals. Grant funds from EETAP allowed MEEA to develop a publicly available, statewide EE resource database of individuals, institutions, and businesses/industry interested in enhancing education and the environment.

The evaluation:

To evaluate the MEEA Resource Database, subscribers were surveyed regarding their knowledge of and attitudes towards the database, and their intentions to use it as a means to advertise their resources, programs, and services. While 80% of subscribers agreed or strongly agreed that the database was important for promoting their resources, MEEA felt that there was still more to be learned, and that additional information could help MEEA refine the database to make it more useful for both current and potential subscribers.
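
The 80% figure is simply the share of respondents who chose "agree" or "strongly agree." The sketch below shows that tally on made-up responses; the five-point scale is an assumption about the survey's format.

```python
# Made-up responses on an assumed five-point agreement scale; only the tallying logic matters.
responses = ["strongly agree", "agree", "neutral", "agree", "strongly agree",
             "disagree", "agree", "agree", "strongly agree", "agree"]

agree_share = sum(r in ("agree", "strongly agree") for r in responses) / len(responses)
print(f"{agree_share:.0%} of subscribers agreed or strongly agreed")  # prints "80% ..."
```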

Improvements made:

As part of the resource database evaluation, MEEA made several recommendations to both improve the database and inform future evaluations. These include:

  • conducting in-depth interviews with database subscribers to obtain greater insight through qualitative data,
  • tracking database use to obtain data on visits, hits, and ways the database is being used, and
  • identifying barriers to awareness and use by surveying educators who do not use the database, but would be expected to.

Source: MEEA and EETAP (site no longer available)