Culturally Responsive Extension Program Evaluation
Author: Sara Tomis
sara.tomis@uconn.edu
Reviewer: Dr. Mary Rodriguez, Ohio State University
Publication EXT115 | April 2025
Evaluating programs that are intended to engage stakeholders from distinct cultural backgrounds and realities requires intentional skill development. This factsheet, for professionals working in Extension or other nonformal education programs, describes practical strategies for enhancing the cultural responsiveness of evaluations of community-based educational programs.
Culture is defined as a shared set of values, beliefs, and perspectives held closely by a group of individuals. Learned rather than genetically inherited, culture shapes actions, how meaning is applied, and how the world is viewed. Culture influences how participants interpret programmatic elements and react to program experiences, with effects that extend to both individual participants and the entire community.
Culturally responsive evaluation (CRE) was established on the belief that culture and context should be actively integrated within the evaluation process, in order to acknowledge the cultural specificity of educational assessment and program impact. Within a culturally responsive approach to evaluation, active community participation in the evaluation process is necessary.
CRE can be operationalized in many programmatic contexts. It is especially valuable when engaging in evaluation efforts with communities that experience vulnerabilities or marginalization, or those that have been historically underserved by Extension efforts, because it allows for representation and inclusion of community voices. Emphasizing community input and culture in evaluation promotes democracy and agency within Extension programs. Benefits of a CRE approach also include the increased trustworthiness and usability of evaluation outputs among community members that result from inclusive partnerships.
Using a CRE Approach: Recommendations for Extension Professionals
Applying a culturally responsive approach to program evaluation can happen at any point in the development, implementation, and assessment process. However, CRE is most effective when initiated in the preliminary stages of community partnerships. Recommendations presented here can be applied in context-specific order to best support the evaluator in serving the community.
Understand yourself
Before implementing a CRE approach, develop a strong understanding of your own self-concept. Reflecting on your own cultural background, life experiences, worldview, and any assumptions you bring to program evaluation strengthens your understanding. Working outward, consider how your understanding of self relates to the program being considered. Are you an in-group member, evaluating within a community that you have cultural connection to, or an out-group member who will be evaluating across cultures? What impact will this have on your relationships, communication efforts, and trust development within the community? Consider your experiences with the community and program to identify potential biases that may have been introduced by this history. Self-reflection is an ongoing activity. Continue to reflect as you move through the program evaluation process.
Understand context
Before initiating an evaluation, research the history and present realities of the community you are partnering with as well as any past or current programmatic efforts. Engage with primary and secondary sources, including community members, peers, and media. It may be useful to conduct a review of the literature and existing resources before conversing with others to establish a basic understanding. Consider who is authoring these resources to confirm authenticity and credibility. Determine major historical events, characterize interactions with government and universities, and identify exposure to social, political, economic, and environmental challenges. Rural communities, for example, may experience different challenges and opportunities than urban or suburban counterparts.
Seek community perspectives to determine a representative view of the program context, understanding that some topics may be difficult to talk about. Trust and rapport should be built before attempting to engage in a vulnerable discussion with community members. Applying a strengths-based approach that describes a community through their assets and opportunities rather than their challenges can provide a more respectful and holistic view of a community context.
Operate with humility and respect
Approach interactions with an attitude of humility and mutual learning, recognizing that you are a guest in the community with as much to learn as you have to offer. Understand that there may be perceived or actual risks associated with engaging in programmatic activities, including evaluation. For example, data collection may have negative connotations or consequences for community members. Data collection in voluntary community-based programs, therefore, should be approached as a privilege rather than a right. Respect individual and community culture, sovereignty, and rights to privacy. Reflect on the positional authority you hold within your identity as an individual and/or as an evaluator and determine how you can steward that influence responsibly. Community Institutional Review Board (IRB) processes should always be followed, when applicable. Seek community feedback to ensure that you are operating in culturally respectful ways and always stay true to your word.
Understand community and culture
Learning from community members about their culture, values, and norms is a vital step to engaging a culturally responsive approach to program evaluation. Building partnerships and relationships with community members occurs over time and requires intentional effort on behalf of both parties. Be patient and take advantage of opportunities to immerse yourself in the community when invited, such as attending community events. Show community members through your actions that you are authentically invested in learning and make your intentions clear to enhance transparency and credibility.
Evaluators must respect the community's timeline and work to develop trust through consistent and respectful practice. If you sense that you have not achieved the level of community trust needed to conduct an exploration of community culture, take a step back and start smaller through continued relationship building. When you have established a genuine relationship with the community, seek to develop a comprehensive understanding of the culture, including communication norms, community-identified values, and ways of knowing and doing. Engage with many members of the community, such as youth, older people, and individuals of different economic backgrounds, to yield a representative view of the culture. Use observation skills and ask thoughtful and respectful questions to enhance your understanding.
Engage community in program evaluation
Welcoming community members into program evaluation in a participatory way can promote democracy, relevance, and capacity-building. Community members can collaborate with the evaluator at all stages of the evaluation process, such as defining objectives, determining methods, and interpreting meaning. Engage representative voices to consider how power, funding, and social influences impact program needs and/or outputs, and how these may differ between members of the community.
Determine evaluation specifics
The CRE approach can be used within both formative evaluations (occurring before or during a program to inform development and implementation) and summative evaluations (occurring after a program to identify outcomes and determine if goals have been met). Culture should be actively considered and integrated into each element of the evaluation, from method and approach through interpretation and reporting. Storytelling and other qualitative methodologies, for example, may be appropriate for program evaluations in Indigenous contexts, due to the emphasis on oral communication within many Indigenous cultures.
Traditional logic models may not be representative or conducive to community culture. Consider how you can collaborate with community members to develop a culturally responsive model for the program and its evaluation. These discussions can facilitate community participation in the evaluation process and can help you further identify community resources, culturally responsive activities, and community priorities. Multi-generational approaches to learning and evaluation may yield more holistic views of program outcomes while also acknowledging community culture.
The timing of the evaluation should be aligned with community cadence. Recognition of culture in rural contexts, for example, may require scheduling evaluation activities around peaks in industry timelines, such as harvest or lambing seasons. Select indicators that have value to both community members and the evaluator, and measure them in responsive ways that support community trust and meaning making. Determining culturally responsive indicators and assessment approaches may require intentional exploration in partnership with community members.
Design instruments to minimize imposition and burden on participants: the more aligned with community culture and norms, the better. Conversation or observation, for example, may align with Indigenous cultural methodologies. These approaches have been used by Extension professionals and can be facilitated in a formal way through focus groups, rubrics, or cultural and artistic performances.
When working across languages, additional considerations are needed. Evaluation instruments or other materials requiring translation should be translated by a professional with an adequate understanding of the context, and/or confirmed by a community member, to ensure the most appropriate and accurate outputs. To respect the preference for oral communication and to accommodate challenges associated with literacy, Warrix et al. (2006) recommended the use of focus groups when collecting evaluation data from older Hispanic adults. Similarly, Zoellner et al. (2006) developed an innovative evaluation instrument that paired audio recordings with digital images and asked Hispanic participants to provide information about their nutrition-related behaviors.
Building out a Program
New programs should be informed by a culturally responsive needs assessment. Once a viable and reciprocal partnership has been developed with a community, seek out community-defined perceptions of needs and opportunities that could be addressed by programmatic efforts. Viable CRE strategies within needs assessments can include working with advisory committees and community-identified opinion leaders. Perry and Hoffman (2010), for example, partnered with a Northwestern tribe and established a community advisory board for the creation of an assessment survey to evaluate exercise behaviors among tribal youth. As a result of the participatory nature of this approach, the program was implemented more rapidly and was more relevant to the community.
Evaluating Program Outcomes and Impacts
Communities may have their own priorities for program outcomes, including how they should be assessed and understood. For some cultures, long-term impacts may be more valuable and telling of a program’s effect on community members and conditions. However, funder or university reporting requirements may not readily align with community culture and boundaries. Partnering with community members to identify innovative ways to meet divergent evaluation expectations, such as video-recorded story circles or portfolios, may help to overcome these obstacles. Community input should also be encouraged during the analysis stage. Cross et al. (2011) exemplified this in their culturally responsive study by incorporating community forums, participation by program staff, and member checking into their analysis.
Disseminate findings in ways that benefit the community
Sharing evaluation outputs with community members can promote trust and transparency. It may also be an opportunity to celebrate community growth and learning. Share findings in ways that will be most accessible and valuable for community members by asking them how they would prefer to learn about the evaluation outputs. For example, a storytelling event or a graphic recording may be more engaging than a written report. Follow through on your commitment to community development by creating usable outputs and appropriate programmatic responses after the evaluation has concluded.
Conclusions
Culturally responsive evaluation is a process of learning and reflection, both about oneself and about the communities served. CRE skills can be learned over time as one adopts a mindset of growth and cultural appreciation. By approaching program evaluation in a way that values culture, context, and community input, Extension professionals can enhance evaluation processes and outputs for the benefit of their audiences.
Resources
Adams, A., Miller-Korth, N., & Brown, D. (2004). Learning to Work Together: Developing Academic and Community Research Partnerships. Wisconsin Medical Journal, 103(2), 15–19, PMID: 15139553.
Anderson, K. C., Stern, M. J., Powell, R. B., Dayer, A. A., & Archibald, T. G. (2022). A Culturally Responsive Evaluation Framework and its Application in Environmental Education. Evaluation and Program Planning, 92, 102073. https://doi.org/10.1016/j.evalprogplan.2022.102073
Chouinard, J., & Cram, F. (2020). Culturally Responsive Approaches to Evaluation: Empirical Implications for Theory and Practice. SAGE Publications, Inc. https://doi.org/10.4135/9781506368559
Christopher, S., Watts, V., McCormick, A. K. H. G., & Young, S. (2008). Building and Maintaining Trust in a Community-Based Participatory Research Partnership. American Journal of Public Health, 98(8), 1398–1406. https://doi.org/10.2105/AJPH.2007.125757
Cousins, J. B., & Earl, L. M. (1992). The Case for Participatory Evaluation. Educational Evaluation and Policy Analysis, 14(4), 397-418. https://doi.org/10.3102/01623737014004397
Cross, T. L., Friesen, B. J., Jivanjee, P., Gowen, L. K., Bandurraga, A., Matthew, C., & Maher, N. (2011). Defining Youth Success Using Culturally Appropriate Community-based Participatory Research Methods. Best Practices in Mental Health, 7(1), 94–114.
Dogan, S. J., Sitnick, S. L., & Onati, L. L. (2012). The Forgotten Half of Program Evaluation: A Focus on the Translation of Rating Scales for Use with Hispanic Populations. Journal of Extension, 50(1), Article 1FEA5. https://doi.org/10.34068/joe.50.01.06
Frierson, H., Hood, S., & Hughes, G. (2002). A Guide to Conducting Culturally Responsive Evaluation. In J. Frechtling (Ed.), The 2002 User-Friendly Handbook for Project Evaluation (pp. 63-73). National Science Foundation.
Hassel, C. A. (2007). Can Cross-Cultural Engagement Improve the Land-Grant University? Journal of Extension, 45(5), Article 5FEA7. https://tigerprints.clemson.edu/joe/vol45/iss5/9
Hepler, A. N., Guida, F., Messina, M., & Kanu, M. (2010). Program Evaluation with Vulnerable Populations. In S. A. Estrine, R. T. Hettenbach, H. Arthur, & M. Messina (Eds.), Service Delivery for Vulnerable Populations: New Directions in Behavioral Health (pp. 355–371). Springer Publishing Company.
Hood, S., Hopson, R. K., & Kirkhart, K. E. (2015). Culturally Responsive Evaluation: Theory, Practice, and Future Implications. In K. E. Newcomer, H. P. Hatry, & J. S. Wholey (Eds.), Handbook of Practical Program Evaluation (4th ed.) (pp. 281–317). https://doi.org/10.1002/9781119171386.ch12
Kirkhart, K. E. (2010). Eyes on the Prize: Multicultural Validity and Evaluation Theory. American Journal of Evaluation, 31(3), 400-413. https://doi.org/10.1177/1098214010373645
LaFrance, J., Nichols, R., & Kirkhart, K. E. (2012). Culture Writes the Script: On the Centrality of Context in Indigenous Evaluation. New Directions for Evaluation, 2012(135), 59–74. https://doi.org/10.1002/ev.20027
Letiecq, B. L., & Bailey, S. J. (2004). Evaluating from the Outside: Conducting Cross-Cultural Evaluation Research on an American Indian Reservation. Evaluation Review, 28(4), 342–357. https://doi.org/10.1177/0193841X04265185
Martenson, D. M., Newman, D. A., & Zak, D. M. (2011). Building Community-University Partnerships by Listening, Learning, and Responding. Journal of Extension, 49(5), Article 5FEA4. https://doi.org/10.34068/joe.49.05.05
Perry, C., & Hoffman, B. (2010). Assessing Tribal Youth Physical Activity and Programming Using a Community-Based Participatory Research Approach. Public Health Nursing, 27(2), 104–114. https://doi.org/10.1111/j.1525-1446.2010.00833.x
Richmond, L. S., Peterson, D. J., & Betts, S. C. (2008). The Evolution of An Evaluation: A Case Study Using the Tribal Participatory Research Model. Health Promotion Practice, 9(4), 368–377. https://doi.org/10.1177/1524839906289069
Sahota, P. C., & Kastelic, S. (2012). Culturally Appropriate Evaluation of Tribally Based Suicide Prevention Programs: A Review of Current Approaches. Wicazo Sa Review, 27(2), 99–127. https://doi.org/10.5749/wicazosareview.27.2.0099
Thomas, V. G., & Campbell, P. B. (2021). Evaluation in Today’s World: Respecting Diversity, Improving Quality, and Promoting Usability. SAGE Publications.
Tomis, S. M., Bunch, J. C., Harder, A., & Roberts, T. G. (In review). Culturally Responsive Evaluation in Indigenous Youth Extension Programs. Journal of Human Sciences and Extension.
Warrix, M. B., Nieto, R. D., & Nicolay, M. (2006). Developing Culturally Appropriate Evaluation Instruments for Hispanics with Diabetes. Journal of Extension, 44(6), Article 6TOT1. https://archives.joe.org/joe/2006december/tt1.php
Zoellner, J., Anderson, J., & Martin Gould, S. (2006). Development and Formative Evaluation of a Bilingual Interactive Multimedia Dietary Assessment Tool. Journal of Extension, 44(1), Article 1FEA8. https://tigerprints.clemson.edu/joe/vol44/iss1/10
The information in this document is for educational purposes only. The recommendations contained are based on the best available knowledge at the time of publication. Any reference to commercial products, trade or brand names is for information only, and no endorsement or approval is intended. UConn Extension does not guarantee or warrant the standard of any product referenced or imply approval of the product to the exclusion of others which also may be available. The University of Connecticut, UConn Extension, College of Agriculture, Health and Natural Resources is an equal opportunity program provider and employer.