
Evaluation Reporting

This toolkit is part of a series of toolkits on Evaluation Planning, Implementation and Reporting for science festivals. You can find a list of the other toolkits in this series and a description of their contents below.

The phrase “You get out of it what you put into it” is certainly true of evaluation. As powerful a tool as evaluation can be, it is often one of the last considerations of festivals in terms of time and budget. Many festival organizers have limited experience planning for and conducting evaluations and don’t know where to start. Others have had negative prior experiences with evaluation, feeling that they did not get anything useful or insightful out of the effort.

It doesn’t have to be that way! This toolkit is designed to help you plan and implement your festival evaluation, and then report impact to your festival stakeholders.  The resources were developed to be mindful of the unique nature of science festivals. In addition to some of the more general resources found within this toolkit, there are several connected and complementary toolkits that provide more detail and specific tools (such as surveys) that you are free to adapt for your purposes or use as-is.

Evaluation is the systematic appraisal of an initiative or program. Evaluation can be used to summarize what happened at your festival, but that’s just the beginning of what good evaluation should do. In addition to celebrating the good things that happened, the best evaluations spur discussion and lead to new insights about your festival. Effective evaluations can also help you plan your future efforts in the direction of your festival’s goals. For example, the North Carolina Science Festival has the goal of producing a high-quality event within a 30 minute drive of every North Carolinian – each year’s evaluation data help to both document our progress and refine strategies for getting there.

It is important to demonstrate that our festival work matters, but defining what we mean by IMPACT, and then going about measuring it, is both challenging and rewarding! Some organizers think of impact strictly in terms of the number of participants at their festival. Others think of impact as a function of geographic reach and/or the way the festival engages underserved or underrepresented communities. Still other festival organizers see impact as the creation of opportunities for scientists to engage with the public about their research.

Engagement, fun, learning, career awareness and science literacy are some of the many ways festivals define impact. No matter how you define impact for your festival, evaluation is the process that will help you understand your festival’s current impact and assist you with planning for the future.

The evaluation resources in these toolkits will grow substantially in the coming years thanks to the generous support of the National Science Foundation (NSF). In 2014, the NSF funded Collaborative Research: EvalFest (Evaluation Use, Value and Learning through Festivals of Science and Technology) (NSF#1423004) to build a community of practice among festivals to develop and test innovative evaluation methods. EvalFest currently involves 24 U.S. science festivals that are working together to determine what festivals want and need to measure and to devise tools for doing so. As new resources emerge from this community of practice, they will be shared via these toolkits. Because some of the methods are being implemented by all 24 EvalFest festivals, important discoveries about science festivals overall should also come to light.

In addition to this toolkit, which contains general information about festival evaluation, evaluation resources can be found in the following toolkits:

Evaluation Planning: checklists, timelines, and job descriptions related to developing an evaluation plan and hiring field researchers to assist with data collection.

Evaluation Methods: descriptions of the many ways festivals might go about measuring their progress towards their goals, including methods such as surveys and observations. Less traditional methods such as mystery shopping, embedded assessment booths, zip code analysis and social network analysis will also be added.

Evaluation Instruments: copies of actual surveys, recording sheets and other devices used by festivals to capture data during evaluation activities.

Evaluation Reporting: written reports and presentations of science festivals that share evaluative data and are directed at festival collaborators, funders and the community at large.

EvalFest: documents, training materials, presentations, publications and reports specific to the EvalFest project. While anyone is welcome to view and use these resources, they will be most useful to festivals formally involved in the EvalFest project.

EvalFest: Labeling Evaluators and Surveyed Attendees: templates and instructions to help EvalFest partners develop materials to identify their evaluators and to label attendees who have already been surveyed at events.

EvalFest: Training Videos for Field Researchers: training videos designed to help EvalFest partners train field researchers and prepare them for evaluating events and giving the survey to attendees.


Toolkit Resources


Evaluation Report: 2009 San Diego Science Festival (ppt)

This presentation reports data and evaluation findings from the inaugural 2009 San Diego Science Festival.

Evaluation Report: 2009-2012 Science Festival Alliance Evaluation Report

The Science Festival Alliance (SFA) has helped to spark and support dozens of independent festival initiatives in the US and abroad. But the SFA got i...


Evaluation Report: 2012 Science Festival Alliance Evaluation Report (pdf)

Goodman Research Group, Inc. (GRG) is serving as the external evaluator of the NSF-funded Science Festival Alliance (SFA), a collaborative started by ...

Evaluation Report: 2009 Science Chicago Final Report (pdf)

From September of 2008 to August of 2009, approximately 300,000 people of all ages across the region celebrated science through online and in-person...

Evaluation Report: 2006 Small Talk Final Report (pdf)

Small Talk was a three-year collaborative project that looked at the benefits for the science communication community in working together on dia...

Evaluation Report: 2009 San Diego Science Festival Expo Survey Report

This is a report of data collected from surveys administered at the 2009 San Diego Science Festival Expo.


Evaluation Report: 2012 San Diego Science Festival Sponsor Report (pdf)

The fourth annual San Diego Festival of Science & Engineering engaged an estimated 50,000 people in science, technology, engineering and math (STE...

Evaluation Report: 2011 Las Vegas Science Festival Community Report (pdf)

The 2011 Las Vegas Science Festival brought together over 80 organizations that provided over a hundred exciting free STEM-related programs to thousand...

Evaluation Report: 2009 San Diego Science Festival Exhibitor and Performer Survey Report

This report includes results from the San Diego Science Festival Exhibitor and Performer survey. 


Evaluation Report: 2010 Economic Impact of Science Festivals

Most successful festivals of any kind (science, art, food, music, cultural, etc) directly generate local economic activity. In 2010 the Science Festiv...


Evaluation Report: 2014 Arizona SciTech Festival Community Report (pdf)

The community evaluation report from the 2014 Arizona SciTech Festival.

Evaluation Report: Report Layout Checklist (pdf)

A checklist for creating a legible and engaging layout for your evaluation reports by Stephanie Evergreen.

Evaluation Report: 2013 North Carolina Science Festival Executive Summary Brochure (pdf)

A sample of an executive summary brochure from the 2013 North Carolina Science Festival's Thorp Science Night.

Evaluation Report: Data Visualization Checklist (pdf)

A checklist for creating engaging data visualizations for evaluation reports by Ann K. Emery and Stephanie Evergreen.