Title: The “Politics” of Evidence: Knowledge to Action with the Youth Research and Evaluation eXchange (YouthREX)
Organization: The Youth Research and Evaluation eXchange (YouthREX)
Contact: Uzo Anucha, MSW, PhD // anucha@yorku.ca
MAIN FIELDS OF YOUTH WORK ADDRESSED BY THE PRACTICE:
- Education and training of youth workers
- Knowledge development, research, better understanding of youth work
MAIN THEMES or trends in or influencing youth work ADDRESSED BY THE PRACTICE:
- Professionalism and professionalisation of youth work, development of quality in youth work; education, training and competence-building of youth workers and trainers
- Promoting innovation in youth work
- Evidence-based youth work, including leveraging program evaluation
About YouthREX
YouthREX is an Ontario province-wide initiative based at the School of Social Work at York University and funded by the Ontario Ministry of Children, Community and Social Services, in recognition of the role that research and evaluation can play in improving outcomes for youth in Ontario. YouthREX’s mission is to provide the youth sector with research and evaluation resources, tools, supports, and services that promote the wellbeing of Ontario youth.
YouthREX translates and mobilizes evidence and creates opportunities for learning, connection, and sharing through professional development programming for youth workers, both online and in person, tailored to the diverse and intersecting issues and contexts impacting youth and youth worker wellbeing. We also support youth workers in infusing evidence into the design, development, and evaluation of their programs, understanding and improving their impacts, and telling the stories of their work.
YouthREX has been very well served by a community-university model that brings together the expertise, knowledge, and resources of community partners and university researchers to support Ontario’s youth sector. In many ways, YouthREX is an example of how universities can serve the community in deeply meaningful ways while also providing benefits to the university community.
Aims/objectives of the Practice:
This Practice contribution to the 3rd European Academy of Youth Work reflects on the successes, challenges, and lessons from YouthREX’s collaborative approach to building the evaluation capacity of youth organizations in Ontario, Canada. Central to YouthREX’s approach is a nuanced understanding of ‘evidence’ that acknowledges the politics of evaluation and the need for evaluation processes and methods to reflect youth sector realities. The contribution describes the evaluation framework that YouthREX developed to guide program evaluation for community-based youth programs. This framework offers a youth program a three-phase, seven-step process to develop an evaluation plan, implement the plan, and use the findings to improve the program and support the wellbeing of young people.
Description of the Practice:
“Evidence and evaluation support youth programs to do what they do, better”.
Youth programs provide young people with the skills and resources they need to overcome challenging circumstances and make positive contributions to their communities. But youth programs sometimes struggle with how to understand and measure these outcomes and articulate the impact of their programs to stakeholders – including family members, funders, and youth themselves.
Evaluation provides youth programs with the tools to understand, measure, and track whether their programs achieved their intended outcomes and impacts and, equally important, to understand how they are successful and how they can be improved. Program evaluation can support youth programs to be reflective, improve, change, and grow to ensure that the young people who participate in programs are experiencing the outcomes that these programs are working towards.
Despite the benefits of program evaluation, youth organizations frequently struggle with evaluation and ask questions such as:
1/ How can evaluation improve our practice rather than get in the way of it?
2/ What do we need to evaluate, given the tension between what funders want measured and what we are interested in knowing?
3/ How do we evaluate, and what does our evaluation mean?
4/ How can we change based on what we have learned?
This Practice contribution describes YouthREX’s approach to building the evaluation capacity of youth sector organizations. Central to this approach is a nuanced understanding of evaluation that acknowledges that evaluation is sometimes ‘political’ and shaped by paradigms and values. Understanding youth workers’ concerns about evaluation opens up the space to address these concerns thoughtfully and to discuss how evaluation can be ‘leveraged’ to improve outcomes for youth – the goal the sector is passionate about.
YouthREX’s Customized Evaluation Supports (CES) provides a continuum of evidence-based services to support youth programs with the design, development, and evaluation of their work with young people. CES supports youth programs to understand their program theory and intended outcomes so that they can develop and implement evaluation plans that track, measure, and share the impact of their work with young people. CES also supports youth programs not only to understand the impacts of their work but to use the findings to strengthen their programming and improve youth wellbeing.
CES is guided by an evaluation framework that YouthREX developed for program evaluation in youth programs. The framework offers a three-phase, seven-step process to develop an evaluation plan, implement the plan, and use the findings to improve the program and support the wellbeing of young people. It pulls together the key elements of program evaluation in a simple step-by-step process suited to the context of grassroots youth programs, and supports re-imagining evaluation as a storytelling tool, including creative strategies for visualizing and sharing evaluation findings so that stakeholders can understand the real story of a program.
The framework emphasizes three lenses that are uniquely suited to the organizational, social, and political realities of youth programs. The lenses guide youth programs in choosing from the many evaluation options available. The first lens is a Learning-Focused Lens that asks: “Will the evaluation produce insights and findings that the youth program can use to improve and promote youth wellbeing?” Evaluation for grassroots youth sector programs is better focused on improving the program than on proving its worth. Program evaluation is about developing insights and findings that a program can learn from to improve outcomes for youth – evaluation helps a program do what it does, better. ‘Good’ evaluation is not just about gathering accurate evidence about how a program is achieving outcomes for youth wellbeing; it also produces findings and insights that a program can use to learn and do its work better.
The second lens is a Youth-Engaged Lens that asks: “Does the evaluation meaningfully engage youth participants?” This lens recognizes that meaningfully involving youth strengthens evaluations of youth programs. Youth engagement improves the overall quality of an evaluation and, more importantly, meaningful youth involvement in evaluation can support the very outcomes the program is working to achieve. Youth engagement in research and evaluation also provides opportunities for youth to develop relationships with peers, mentors, and adult allies; structures such as adult-youth pairings and youth-friendly environments can provide the context for this relationship building.
The third lens is a Contextualized Methods Lens that asks: “Do the evaluation methods allow a youth program to tell rich and nuanced stories of its processes and outcomes that acknowledge the complexity and dynamism of youth work?” Rather than privileging an experimental approach that views evaluations with randomized control groups as the gold standard, YouthREX embraces the rich, contextual insights that mixed methods, including qualitative methods, can bring to the evaluation of a youth program, and recommends that evaluations of youth programs draw on multiple methods and multiple sources. A program evaluation can build in rigor through multiple lines of evidence: multiple methods (i.e., ways of collecting data), multiple sources (i.e., different types of data), multiple timepoints, and multiple analysers. For instance, a program might collect both quantitative and qualitative data through surveys and interviews with youth participants and their parents or other caregivers, and involve multiple staff members in analysing and interpreting the data.
Were there particular aspects of your practice or questions highlighted by the participants during the discussion at the EAYW event?
Discussions generally focused on how to evaluate and how to leverage the findings to sustain a program.
Links to Explore Further: