By Claire Quimby, Jennifer Borland, and Alison Allen
Last week, a handful of our staff converged with other practitioners (both virtually and in person) at the American Evaluation Association’s annual conference in Cleveland. This year’s conference drew over 2700 registered attendees and featured more than 1700 sessions. To say that inspiration and ideas were flowing would be an understatement. It was an amazing opportunity to learn from other practitioners and to share our own experiences. It’s hard to distill all that information into a single blog post, but here are some of our top takeaways:
Talk to clients about how they might react to different outcomes before you conduct the study.
We know it’s not easy to talk to clients about unexpected results when a project nears its conclusion. When a program doesn’t deliver the outcomes we hoped for, or perhaps has unintended consequences, it can be hard to have a productive discussion. What if we instead talked to clients about the range of possible results at the start of our projects? Reflecting ahead of time on different possible outcomes, and discussing how clients might take action in response to each, could set the stage for a more productive evaluation overall. These ideas were shared by Bob Williams (independent evaluation and OD consultant), Miles McNall (Community Evaluation and Research Collaborative), and Emily Gates (Boston College) at their session on using systems approaches to tackle tough issues in evaluation.
Communication and transparency are key for conducting classroom observations.
Amy Puca of Via Evaluation shared a wealth of advice on conducting successful classroom observations, noting that teachers don’t like to be observed (who does?) but that observations can be a great tool when used correctly. Working well with educators in this context requires open communication and transparency, including sharing your observation instrument with the teachers ahead of time, clearly explaining how the data will and will not be used (e.g., we don’t collect data for use in teacher evaluations), and giving the teachers opportunities to ask questions both before and after the observation session.
When planning site visits to collect data, send an info sheet ahead of time.
Collecting data on site for a project always involves some degree of messiness and a need for flexibility, but you can help lay the groundwork for a smooth visit by sending an info sheet to your site partners in advance. This great tip was shared by Karen Franck (University of Tennessee) and Joseph Donaldson (North Carolina State University) who used this technique in a process evaluation of 4-H programs. Their info sheet prepped site partners by describing the purpose of the visit, giving them a sample agenda and a checklist of their responsibilities, and listing sample focus group questions.
Youth evaluators can speak truth to power.
In the spirit of the conference theme, Kate Richards-Schuster and Bianca Montrosse-Moorhead led a panel of five youth evaluators representing EvalYouth, Metro Youth Policy Fellows (Detroit), and Chicago Freedom Schools to help adult evaluators understand the importance of youth voice in evaluations. If you work with youth, consider what your program could gain by inviting them to participate in evaluation design or serve as consultants. Panelists noted that there is a lack of standards and understanding around how to invite youth to the table, in part due to their level of experience and expertise, but their voice is critical for understanding why programs do or don’t work. Other benefits of involving youth in evaluation? “It can give them an opportunity to help other youth,” one panelist stated, and it can help to build a community of evaluators for the future. For an example of what teen evaluators can accomplish, check out this toolkit/report from the After School Matters Youth Advisory Council.
These are some of our favorite insights from the conference. You can find more by browsing the hashtag #Eval18 on Twitter to see what resonated with other conference-goers.