In May 2017, Heather Talley, Tzedek’s Director of Social Justice Leadership and Community Engagement, enrolled in Evaluate for Change’s six-week course Introduction to Measuring Your Impact. During the course, Heather reflected on the challenges of designing program evaluation and collecting data.
Evaluation has a bad reputation for a reason. Sometimes assessment tools are used toward unjust ends—hello, educational tests and tracking! Sometimes, folks solicit feedback and then use that feedback in wonky ways. (Have you ever seen constructive criticism result in the eradication of a program? I have.) It’s no wonder that so many of us are burned out on surveys and focus groups in particular, and skeptical of data in general.
If poorly designed, evaluation can actually cause harm.
Evaluation has been complicit in deepening social inequalities, dehumanizing learners, and fortifying dominant power structures. Sometimes, evaluation is a process in which implicit bias manifests in structural outcomes. Organizations lose funding. Programmatic changes leave particular populations without services. And the very process of providing feedback can be oppressive. Questions are written in inaccessible language and often ignite a sense that our work, learning, or growth doesn’t measure up.
At the same time, Heather recognizes that in order to measure impact and improve programs like the Tzedek Social Justice Fellowship, we have to rely on feedback and data.
At Tzedek, we’ve been asking ourselves what a transformative approach to evaluation looks like. Transformative approaches emerge when we, first, ask how inequalities and violence are enacted through seemingly benign social structures and, then, develop alternative strategies that center human flourishing.
We’re trying to carve out tools using the following questions:
- Does this evaluation tool empower learners to celebrate emerging skills?
- How does our process demonstrate value for multiple forms of knowledge and growth, understanding that success and impact look different for each of us?
- Does this tool measure strengths and resist punitive or stigmatizing language?
- Is this process aligned with our values?
- How will we use this data to create an evolving and responsive program that shifts in response to our Fellows and the times?
We don’t have all the answers, but our hope is that by asking these questions we can resist deploying evaluation in ways that work against the very world we seek to build.
To read Heather’s full article, Working Towards Transformative Evaluation, visit Evaluate for Change. For more information about the evaluation tools Tzedek employs, contact email@example.com.