Community-Based Research for Food Systems Change: A Collaborative Praxis [Award Winning Paper]

Vanessa L. Daether

*Best Paper Award For Contribution to Practice

Abstract

Organizations that operate community food initiatives (CFIs), such as community gardens, farmer training workshops, and food literacy projects, are challenged to provide evidence of the tangible contributions they make towards advancing food security, food sovereignty, and/or just and healthy local food systems. Because CFIs have been critiqued within the literature for failing to address income as a root cause of food insecurity (Martin, 2018, p. 116), for reinforcing inequitable systems of power (Guthman, 2008, pp. 434–436), and for failing to offer meaningful alternatives to the neoliberal economic system (Moragues-Faus, 2017, p. 471), academic and community stakeholders have called for more focused and robust evaluation practices for CFIs (Knight, 2013, p. 30; Levkoe & Blay-Palmer, 2018, p. 50; Loopstra & Tarasuk, 2013, p. 55).

Unfortunately, for many organizations, addressing these critiques through evaluation remains elusive due to a lack of conceptual frameworks, financial supports, and literature (Barbour, Ho, Davidson, & Palermo, 2016, p. 128). Accordingly, this reflective paper reviews the results of an applied doctoral research project investigating how organizations operating CFIs within Central Vancouver Island can evaluate their impacts from a food systems perspective. Structured through community-based research methodology and framed by the food regime analysis and food sovereignty theoretical frameworks, this qualitative study employed semi-structured interviews, document analysis, and an analysis of food systems definitions to identify the evaluation method and indicators required to evaluate CFIs’ food systems impacts. The findings supported the drafting of a food systems-focused evaluation framework for the study’s community research partners, including an evaluation logic model, matrix, and plans. In addition, the process of conducting community-university research revealed avenues for improving the application of CBR and social sciences theory within applied, collaborative doctoral research.

Keywords: community-based research; community food initiatives; evaluation

Introduction

This reflective research paper summarizes the conduct and results of an applied, collaborative doctoral research project that studied how organizations operating community food initiatives (CFIs) within Central Vancouver Island can evaluate their impacts from a food systems perspective. First, an outline of the project’s applied context opens the paper to reveal the study’s purpose, site, and research partners and describe how the community-university research collaboration was founded, nurtured, and maintained through community-based research (CBR) methodology. Second, the study’s theoretical framework is addressed to define the rationale and tensions associated with the application of social science theory in an applied research setting. Third, the presentation of findings describes how the research team (also referred to as the CBR Team) and I applied the results to inform the drafting of a food systems-focused evaluation framework (i.e., the evaluation logic model, matrix, and plans). Fourth, this paper closes with a discussion of the challenges and opportunities the CBR Team and I faced in balancing our distinct expectations (mine as a doctoral candidate and the research partners’ as grassroots community organizations interested in actionable results) with the epistemological demands of CBR and offers recommendations to the academy on how best to prepare students and community stakeholders for collaborative research.

Applied Research Context

This research project grew from an operational challenge my colleagues and I faced while working at the Cowichan Green Community (CGC), a registered not-for-profit organization whose mission is to “cultivate food, community, and resilience” (CGC, 2021, para. 2) within the Cowichan Region of Central Vancouver Island. As program staff responsible for organizing community-based food events, initiatives, research projects, and workshops, our jobs frequently entailed evaluation. As we faced high staff turnover rates, limited evaluation expertise and resources, overambitious workloads, and restrictive funding, our evaluation practices mainly consisted of funder-driven evaluations that often failed to deliver meaningful data to our organization. For example, such evaluations required our team to gather generalized program metrics, including details on outputs, participation rates, program activities, and program expenditures. Few, if any, asked or supported our team to gather data on the broader, systems-level impacts of our work or contemplate how our initiatives met our organization’s mission statement. Although we recognized that funders did not wish to burden CGC with unrealistic evaluation processes and also needed to collect data to evaluate the merits of their funding opportunities, we felt that when organizations direct vital energy to evaluation, the results needed to be impactful.

Consequently, our team sought to develop an evaluation framework that would assist us to evaluate our work in relation to the systems-level critiques we faced, namely: (a) can CFIs genuinely tackle the root causes of food insecurity?; (b) can CFIs present democratic alternatives to neoliberalism?; and (c) can CFIs amend social injustices? Further, we hoped to use the results of such an evaluation framework to communicate our impacts (and limits) to funders and local governments, who, in our experience, failed to recognize that the impacts of our CFIs evolved over a project’s lifespan and occurred outside the scope of our place of work. Thus, aware that CGC had a responsibility to progress our field of practice by addressing this knowledge gap, our staff began to explore how we might evaluate our work’s impacts on our local food system. In pursuit of this goal, we applied (unsuccessfully) for funding to support evaluation, discussed and defined food-based evaluation matrices, partnered with a local university to engage students on the idea of evaluating CFIs, and recruited interns to develop evaluation concepts. With each activity, we made small steps towards our goal. Unfortunately, without dedicated evaluation funding, we were unable to actualize any one idea.

Fortuitously, our investigation into evaluation practices surfaced in tandem with my acceptance to Royal Roads University’s (RRU) Doctor of Social Sciences (DSocSci) program, where I chose to dedicate my research to this applied problem. With CGC willing to form a community-university research partnership, we commenced a five-year, multi-stakeholder research collaboration with the Cowichan Food Security Coalition, which served as an advisory team, and the Nanaimo Foodshare Society (NFS), a registered charity in Nanaimo, BC, that also faced barriers to evaluating their CFIs. In November 2017, we formalized our working relationship and designed a research project to answer: “How can the impacts of CFIs on Central Vancouver Island be evaluated from a food systems perspective?”

Methodological Framework

Our methodological selection rested upon eight requirements: (a) the interdisciplinary trajectory of food studies and food systems research in Canada (Koç, Bancerz, & Speakman, 2017, pp. 3–6); (b) the growing body of literature calling for more community-university research collaborations in the areas of food systems and evaluation (Barbour et al., 2016, p. 128; Barton, Wrieden, & Anderson, 2011, p. 593); (c) the inter/trans-disciplinary focus of RRU’s DSocSci program (RRU, 2014, pp. 3–4); (d) my axiological and epistemological outlook, which favoured applied, inter/trans-disciplinary research that fosters an equitable sharing of power, relationship-building, and reciprocity; (e) the boundaries and expectations of doctoral research that defined our timeline and matters such as intellectual property; (f) the collaborative nature of this study; (g) CGC and NFS’s need for a flexible participation model; and (h) CGC and NFS’s desire for action-oriented results.

To meet these demands, we selected CBR, a branch of action-oriented and participatory research that offers an alternative approach to positivist epistemology and knowledge creation. By dispelling research hierarchies and the notion that the academy is the custodian of knowledge, CBR encourages the active collaboration and participation of academic and community partners to co-design knowledge creation instruments and meaningfully share data collection, analysis, and knowledge mobilization processes. Inspired by the writings of Etmanski, Hall, and Dawson (2014), Israel, Schulz, Parker, and Becker (1998), Ochocka and Janzen (2014), Smith (2008), and previous CBR research within food studies (Andrée et al., 2014; Rojas, Black, Orrego, Chapman, & Valley, 2017), we designed a qualitative study consisting of three data collection methods. Specifically, throughout 2018, we conducted 17 semi-structured interviews with leaders of CFIs operating on Vancouver Island and Coast, a document analysis of six academic and 19 community-published reports on the evaluation of CFIs, and an analysis of 45 definitions of food systems terms (e.g., food justice, food literacy, food security, food sovereignty, Indigenous food sovereignty, and the right to food) retrieved from the public websites and/or social media channels (e.g., Facebook and Twitter) of 27 organizations in British Columbia that operate CFIs.

Though I was an emerging scholar new to CBR, and CGC and NFS worried that our research methodology was too fluid or unbounded (given CBR’s broad interpretations and lack of defined protocols), we established procedures to guide our working relationship and our implementation of CBR. First, we agreed to adapt Kemmis, McTaggart, and Nixon’s (2014) Action Research Cycle (planning, action, observation, and reflection) (p. 18) as part of our methodology. In doing so, we applied the Action Research Cycle to ensure we consistently reviewed our research steps and took the necessary time to plan the study’s subsequent stages. Second, we agreed to formalize the working relationship between CGC, NFS, and me through a voluntary Memorandum of Understanding (MOU). The MOU defined our research team’s operational norms relevant to communication, document and resource sharing, inducements, meeting schedules, project scope, and responsibilities, as well as policies and procedures on anonymity and information use, confidentiality, conflicts of interest, conflict resolution, disclosure and retention of information, knowledge mobilization, possible negative effects, and the right to withdraw. With the support of RRU’s Office of Research Ethics and faculty, the drafting of the MOU required our team to unpack the tenets of RRU’s ethics application and review process, discuss the minutiae of our collaboration, define how to operationalize CBR within the context of a doctoral study, and grapple with issues sensitive to research collaborations, including the delineation and sharing of intellectual property. Although not without its contentions, the MOU served as a beacon to remind ourselves of our individual and shared responsibilities to the research project and each other and as a reference point to inform our collective decisions.

Theoretical Framework

In drawing from Powers’ (2010) description of a theory’s role in research—to outline what society currently knows of a phenomenon, elicit questions for future inquiry, and change and advance to indicate a progression in what is known of a phenomenon (p. 5)—the CBR Team and I searched for a theoretical framework to account for both the historical events that resulted in CFIs and the current circumstances requiring their further evaluation. In consultation with the CBR Team, I met these requirements by pairing food regime analysis and food sovereignty. Equally rooted in political-economic discourse, food regime analysis frames the development of the global, industrial food system as part of the colonial and neocolonial agendas (Friedmann & McMichael, 1989, pp. 93–95; Friedmann, 2009, p. 336), and food sovereignty offers a critique of the historical and contemporary economic and political systems, or regimes, that disenfranchise local or small-scale food producers and consumers (Gürcan, 2018, pp. 324–326; Wiebe & Wipf, 2011, pp. 3–8). Together, they frame the evolution and future trajectory of market-based food systems and offer a historical yet modern lens to analyze local to global food systems and their discontents.

In practice, I applied the theoretical framework post-analysis to contextualize this project’s findings and identify gaps within the evaluation framework. From a myopic focus on agrarian food systems (over aquatic, Indigenous, or non-market food systems) to issues of power that perpetuate food insecurity, I applied theory to reveal the evaluation framework’s limitations and identify opportunities for change. While a positive outcome, the use of theory within this study proved an area of contention as the CBR Team found the theoretical application burdensome. Faced with action-oriented mandates and a lack of time, neither organization possessed the resources to explore the theoretical literature, contemplate Powers’ articulation of theory, or assess how theory could inform our evaluation framework. To this, CGC and NFS acknowledged and respected the need for theory within an academic context and understood why I believed that it was critical to review our findings and the evaluation framework through the perspectives of food regime analysis and food sovereignty. Still, the use of theory distanced both research partners from my analysis and, equally so, our findings—an outcome counter to our understanding of CBR’s tenets. Consequently, as we prepare to close our study, the experience leaves me questioning how I could have improved the implementation of theory within this research context and how academia can make social sciences theory more accessible to community stakeholders.

Results

The analysis generated seven key findings that challenged and broadened our research team’s understanding of evaluation. We drew upon each theme within the design of our evaluation framework, including our approach to the evaluation method and indicator development. These findings included:

  1. The evaluation should be cost-effective, accessible to a broad audience, quick to administer, and accompanied by supports. Should our evaluation framework fail to account for the operational landscape of CFIs and prove too technical to implement, the evaluation would not become a part of CGC or NFS’s work culture.
  2. The evaluation must be culturally safe. As clearly identified in the literature (Blanchet-Cohen, Geoffroy, & Hoyos, 2018, pp. 21–29; Hanberger, 2018, pp. 114–119; Patton, 2011, p. 247; Patton, 2012, p. 76), evaluations must guarantee respect for culture. However, the interview data revealed that many staff of CFIs on Vancouver Island and Coast felt ill-equipped to evaluate from a culturally informed lens. This data, linked to concepts of anti-racism, decolonization, food justice, and Indigenous food sovereignty, suggested that staff require further resources or training to conduct culturally safe evaluations and that organizations bear a responsibility to prepare staff to take on this essential yet sensitive work.
  3. The evaluation should empower staff in their work. The evaluation framework must operate from a position of appreciative inquiry or evaluative inquiry (Preskill & Torres, 1999, p. 44), whereby the process and results are not an exercise focused on deficits but a process of ongoing exploration and learning dedicated to evolving and improving CFIs. Such a focus may increase staff willingness to evaluate and the value they place on their work.
  4. The evaluation should be adaptive to evolving food discourses. One of the most impactful outcomes of this study was a discussion of the breadth of food terms employed to define CFIs. From food justice and food literacy to food sovereignty and food systems, there is a multitude of terms, and countless definitions, to describe a CFI’s purpose and proposed impacts. As such, this evaluation framework needs to contend with food discourse and the ever-changing lexicon employed in food studies and food systems work while remaining accessible to a general audience. Thus, the evaluation framework needs to tacitly connect to key food terms without burdening the process with technical language.
  5. The evaluation needs to reflect the scaled impacts of CFIs. In order to recognize the numerous actors and arenas that CFIs interact with or impact, the evaluation framework needs to account for the various sub-systems (within a local food system) that CFIs intersect, including the participant, organization, community, and local food system. This finding confirmed our systems approach to the evaluation framework.
  6. The evaluation should support organizations to evaluate the intangible systems-level impacts of CFIs. The findings indicated that we could not evaluate the food systems impacts of CFIs alone. Instead, we also need to evaluate their broader, interdependent systems-level impacts. From decolonization to emergency preparedness, the evaluation framework needs to facilitate a process to capture both the food systems indicators and the broader systems indicators that equally affect the attainment of just and healthy local food systems. As identified in Table 1 (Preliminary Logic Model Template), the Indicator Themes and the four Indicator Levels (i.e., participant, organization, community, and local food system) represent this perspective. Likewise, in Table 2 (Preliminary Evaluation Matrix Template), the systems-informed evaluation questions mirror this outlook.
  7. The community evaluated should develop the evaluation. While our interview data revealed that many staff of CFIs would appreciate access to standardized food systems evaluation frameworks, interviewees also described that it is best practice to develop evaluations directly with those engaged or impacted. This sentiment is mirrored in the collaborative and participatory evaluation literature (Cousins & Chouinard, 2012, p. 10; Fetterman, Rodríguez-Campos, Wandersman, & O’Sullivan, 2014, p. 145; Schones, Murphy-Berman, & Chambers, 2000, p. 55): a collaborative approach to evaluation may increase buy-in from all stakeholders and the implementation of evaluation results. For interview participants, this was a critical demand of evaluation within an Indigenous or cross-cultural context and/or with vulnerable populations. In practice, this indicated that the evaluation framework should not offer prescriptive evaluation indicators. Instead, the framework should offer indicator themes related to a food system and the broader systems that CFIs intersect. Users of the evaluation framework are, accordingly, encouraged to draft their initiative-specific indicators through the lens of the diverse indicator themes presented in Table 1.

In summary, the findings of this project informed not only the method required for this study’s evaluation framework for CFIs but also the types of indicators relevant to a food systems-focused evaluation. As a result, this study produced a draft evaluation framework for CGC and NFS that we hope will support each organization to design and evaluate their work in a manner reflective of its scaled impacts and the various stakeholders they engage. However, as this study approaches completion, we have additional plans for our findings and research collaboration. First, the evaluation framework possesses gaps and shortcomings that we need to amend. As highlighted in the above outline of the theoretical framework, our evaluation framework is data-driven and is missing several viewpoints, including anti-racist, gendered, non-agrarian, and non-market perspectives. Second, we need to test the evaluation framework for its efficacy. Although our initial research plan was to test the evaluation framework, our ambitious research goals conflicted with busy work schedules and the onset of the COVID-19 pandemic. This test will now likely take place outside the confines of this dissertation. Third, we need to reimagine the evaluation framework. As exhibited by Tables 1 and 2, the evaluation framework, despite our best efforts, does not fully align with our first finding’s demand for an easy-to-use framework. While the logic model format was selected because this approach speaks to funders and is congruent with best practices in the evaluation field, CGC, NFS, and the Cowichan Food Security Coalition believe it is overly complicated. Therefore, our future work may look to how we can repurpose our data to generate other, equally useful expressions of program measurement or assessment. One option is to develop an online assessment tool for organizations to identify where and how their proposed initiatives intend to impact their local food system.

 

Table 1 Preliminary Logic Model Template

Program Purpose:

Program Outputs:

Column headings: Inputs / Activities / Participants | Indicator Themes1 | Participant Level Indicators | Organization Level Indicators | Community Level Indicators | Systems Level Indicators

Row headings (left column): Inputs; Activities; Participants

Indicator Themes listed in the template: Accessibility; Adequacy; Agency; Availability; Area; Food and Culture; Food Economy; Food Justice; Food Literacy; Food Sovereignty; Indigenous Food Sovereignty; Administration; Capacity Building; Collaboration/Teamwork; Communication; Community Building; Decolonization; Emergency Preparedness; Health; Social Connections; Sustainability
1 As the findings from my analysis (a) identified over 2,000 unique indicators for the evaluation of CFIs and (b) cautioned against the use of a prescriptive approach or indicators, Indicator Themes were developed to capture the essence of the 2,000 indicators identified. Within the full evaluation framework, each Indicator Theme is accompanied by a definition to help users develop their initiative-specific indicators. In practice, in recognition that not all CFIs propose to offer impacts pertinent to each indicator theme, users of this evaluation framework are encouraged to select the most relevant indicator themes for an evaluation.
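As a minimal illustration of how a user might populate the logic model template in practice, the sketch below shows indicator themes paired with drafted, initiative-specific indicators at the template's four levels. The theme names follow Table 1, but every indicator, and the Python representation itself, is a hypothetical example, not part of the published framework.

```python
# Hypothetical sketch: drafting initiative-specific indicators under
# selected indicator themes, then flattening them into rows suitable
# for the logic model template. All indicator text is invented.

LEVELS = ["participant", "organization", "community", "local food system"]

# A user selects only the themes relevant to their CFI (finding 7) and
# drafts an indicator for each level at which impacts are expected.
draft_indicators = {
    "Food Literacy": {
        "participant": "Participants report new food-preparation skills",
        "community": "Workshops reach residents beyond existing programs",
    },
    "Community Building": {
        "organization": "New partnerships formed through the initiative",
        "local food system": "Initiative links producers with local buyers",
    },
}

def logic_model_rows(indicators):
    """Flatten drafted indicators into (theme, level, indicator) rows,
    ordered by the four indicator levels of the template."""
    rows = []
    for theme, by_level in indicators.items():
        for level in LEVELS:
            if level in by_level:
                rows.append((theme, level, by_level[level]))
    return rows

for theme, level, text in logic_model_rows(draft_indicators):
    print(f"{theme} | {level} | {text}")
```

The point of the sketch is the shape of the data, not the code: each theme carries only the levels a given CFI expects to affect, so the template tolerates deliberately empty cells rather than demanding prescriptive, exhaustive indicators.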

 

Table 2 Preliminary Evaluation Matrix Template

Column headings: Evaluation Questions | Intended Results | Participant Level Indicators | Organization Level Indicators | Community Level Indicators | Systems Level Indicators | Data Source | Data Collection Method | Analysis Procedures

Evaluation Questions (rows):

  1. Who/how has this CFI impacted/engaged within the local food system?
  2. How has this CFI contributed to a just and resilient local food system?
  3. What other impacts did this initiative have?
  4. What impacts are missing?
  5. What systemic barriers impede the achievement of intended impacts, and who holds power to affect change?
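To show how a completed row of the evaluation matrix might look, the sketch below fills the template's columns for one evaluation question. The column names follow Table 2; all of the values, and the simple completeness check, are invented illustrations rather than content from the published framework.

```python
# Hypothetical sketch: one completed row of the evaluation matrix
# (Table 2). Column names follow the template; all values are
# invented examples.

matrix_row = {
    "evaluation_question": (
        "How has this CFI contributed to a just and resilient "
        "local food system?"
    ),
    "intended_results": "Stronger producer-consumer relationships",
    "indicators": {
        "participant": "Participants buy from local producers",
        "organization": "Staff maintain producer partnerships",
        "community": "Local producers report new customers",
        "systems": "More local food circulating in the region",
    },
    "data_source": "Participant surveys; partner interviews",
    "data_collection_method": "Annual survey; semi-structured interviews",
    "analysis_procedures": "Descriptive statistics; thematic coding",
}

def missing_cells(row):
    """Return the names of any unfilled matrix columns, a simple
    completeness check before an evaluation plan is finalized."""
    empty = [k for k, v in row.items() if not v]
    empty += [f"indicators.{lvl}"
              for lvl, v in row["indicators"].items() if not v]
    return empty
```

In keeping with finding 7, an organization would draft its own values for each cell; the check only flags cells left blank, it does not prescribe what belongs in them.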

Discussion

In addition to the evaluation framework, the applied nature of this collaborative study generated significant learnings and recommendations for the academy pertinent to the application of CBR and social science theory within an applied, collaborative doctoral project. As documented previously by scholars who explored the challenges and merits of community-university research (see Jansson, Benoit, Casey, Phillips, & Burns, 2010; Klocker, 2012; Lewis et al., 2015; Paré, 2019; Shore, 2007; Travers et al., 2013; Warren, Calderón, Kupscznk, Squires, & Su, 2018), the doctoral process (or at least my comprehension of it) appeared to place my needs ahead of the community partners’ and, therefore, limited our full implementation of CBR. For our research team, we witnessed this on numerous occasions, including: (a) my academic timeline, which directed the pace of the research project at a speed that admittedly was too slow for the CBR Team, who wanted timely, actionable results; (b) my ethics application, whose technical language prevented CGC and NFS’s full review of the document; (c) the purpose and design of the theoretical framework, whose language alienated the research partners and prevented their direct connection to theory, despite its benefits to the study; and (d) my requirement, as the primary investigator, to independently publish original work, a process that excluded CGC and NFS from elements of the analysis and reporting of findings and recommendations.

To address some of these concerns, I made an effort throughout our study to reflect on the purpose of CBR and transfer power from me to the research partners. While inevitably imperfect, five examples are worth highlighting. First, CGC and NFS requested a definition of participation rooted in flexibility, one that would allow each organization to autonomously adjust their level of engagement in the project as it unfolded. As our research took over four years to complete, they required a means to participate that would support varying levels of engagement and adapt to staff turnover and fluid work schedules. Accordingly, in recognizing that CGC and NFS’s participation throughout the project would be more successful if we built flexibility into its definition, we developed a model of collective engagement suited to everyone’s shifting capacities. As a student, this required me to be responsive to CGC and NFS’s participation in the project and adapt my role as the situation required.

Second, during data collection, NFS and I shared the responsibility of conducting the semi-structured interviews. Looking back, I was, perhaps naively, concerned that our distinct interview styles would complicate our data and findings. In practice, the opposite occurred: the interview style of NFS’s staff and the rapport they built with their interviewees drew out data and themes that my interviews did not, and our findings benefited from their participation as interviewers. Equally so, NFS benefited by building new relationships with future community partners, a result that aligns well with the intent of CBR.

Third, between November 2017 and November 2018, the CBR Team and I met once per month to advance the project’s objectives. These meetings typically followed a rigid one-hour agenda that I prepared. However, before the meetings turned to our research project, they typically commenced with more casual conversations related to CGC and NFS’s upcoming grant applications and other projects. Though unrelated to our research question, these discussions served an important CBR function: relationship building, which “means seeing the relationships and the vision, rather than the project, at the heart of the work” (Andrée et al., 2014, p. 50).

Fourth, the project’s timeline and associated expectations required flexibility. In designing this study, we intended to both develop and test the evaluation framework. Instead, in 2019, as I completed the analysis and we drafted the evaluation framework, both organizations faced busy funding cycles, and I was about to start maternity leave. Again, in 2020, the onset of COVID-19 redirected CGC and NFS’s energies, as each organization found itself awash with new projects and demands presented by the pandemic. These circumstances, while challenging, resulted in our collective decision not to pursue a test of the evaluation framework within my doctoral timeline. Instead, I took this period to analyze how the pandemic exposed the broader systemic injustices ubiquitous to food systems and CFIs, re-apply theory to our evaluation framework, review the literature, and revise what we had created. This additional time permitted better results for CGC and NFS and allowed our research team to maintain our mutually supportive relationship.

Fifth, I wrote the dissertation in a manner that attempted to be inclusive of the expertise and voices of my research partners. Although the dissertation is an original work, we selected two processes to bring CGC and NFS’s voices to my knowledge translation. First, I followed Klocker’s (2012, p. 156) decision to write their doctoral dissertation through collective pronouns, a practice readers can see reflected in this paper. Despite this practice leaving me feeling uncomfortable and exposed, having been trained within disciplines that discredit first-person narratives, this act allowed me to give much due credit to CGC and NFS. Second, in alignment with the CBR protocol standards “developed in 1994 by a committee of the Oakland Community-Based Public Health Initiative” (as cited in Brown & Vega, 2008, p. 395), I asked CGC and NFS to review my dissertation and confirm that I accurately and fairly represented their organizations, perspectives, and work. I also sought their feedback on the evaluation framework and the final recommendations.

Overall, while I am content with the work we completed, this collaborative research experience highlighted four strategies for improving the application of CBR and social sciences theory within applied, collaborative doctoral research, including: (a) developing ethics applications for collaborative, community-centred research—ones written in accessible language and focused on the research team’s collective responsibilities to research and each other; (b) creating resources for community partners that deconstruct the technical processes of an ethics review, methodological and theoretical selection, methods selection, data collection, analysis, and reporting; (c) offering a clearer means or more explicit rules for how community research partners can have their authentic voices represented in a doctoral dissertation (without compromising the students’ need to publish original work); and (d) designing a new approach to how universities teach CBR to graduate students—one which draws upon literature published by students using CBR rather than by academic staff, who might not face the same barriers to adequately fulfilling the tenets of the methodology.

Conclusion

As this study and research partnership prepare to close, the parallels between the role of CBR in food studies research and the role of evaluation in CFIs become more transparent. In practice, collaboration takes many forms. However, within this study, collaboration through CBR allowed our research team to delineate our participation needs and address uneven power dynamics. Although imperfect, by defining our collaborative praxis, we demonstrated how applied research can contribute to grassroots food systems change. For this project, this change is presented in a new approach to evaluating CFIs and a deeper understanding of how to form community-doctoral research partnerships. From this, I take away the realization that the positive outcomes associated with both community-university research and CFIs rest in part upon the capacity of individuals and institutions (whether an organization or a university) to embrace the uncertainty and immense freedom that result from sharing power. In doing so, those engaged in applied food studies research may contribute equally to an erosion of the barriers between academic and community stakeholders while contending with the critiques of CFIs and their future trajectories.

References

Andrée, P., Chapman, D., Hawkins, L., Kneen, C., Martin, W., Muehlberger, C., … Stroink, M. (2014). Building effective relationships for community-engaged scholarship in Canadian food studies. Canadian Food Studies/La Revue canadienne des études sur l’alimentation, 1(1), 27–53. doi: 10.15353/cfs-rcea.v1i1.19

Barbour, L. R., Ho, M. Y. L., Davidson, Z. E., & Palermo, C. E. (2016). Challenges and opportunities for measuring the impact of a nutrition programme amongst young people at risk of food insecurity: A pilot study. Nutrition Bulletin, 41(2), 122–129. doi: 10.1111/nbu.12200

Barton, K. L., Wrieden, W. L., & Anderson, A. S. (2011). Validity and reliability of a short questionnaire for assessing the impact of cooking skills interventions. Journal of Human Nutrition and Dietetics, 24(6), 588–595. doi: 10.1111/j.1365-277X.2011.01180.x

Blanchet-Cohen, N., Geoffroy, P., & Hoyos, L. M. (2018). Seeking culturally safe Developmental Evaluation: Supporting the shift in services for Indigenous children. Journal of MultiDisciplinary Evaluation, 14(31), 19–31. Retrieved from https://journals.sfu.ca/jmde/index.php/jmde_1/article/view/497/449

Brown, L., & Vega, W. A. (2008). Appendix: A protocol for community-based research. In M. Minkler, & N. Wallerstein (Eds.), Community-based participatory research for health: From process to outcomes (2nd ed., pp. 395–397). San Francisco, CA: Jossey-Bass.

Cousins, J. B., & Chouinard, J. A. (2012). Participatory evaluation up close: An integration of research-based knowledge. Charlotte, NC: Information Age.

Cowichan Green Community. (2021). About us: Our mission, our vision, our values. Retrieved from https://cowichangreencommunity.org/about-us/mission-and-history/

Etmanski, C., Hall, B. L., & Dawson, T. (Eds.). (2014). Learning and teaching community-based research: Linking pedagogy to practice. Toronto, Canada: University of Toronto.

Fetterman, D., Rodríguez-Campos, L., Wandersman, A., & O’Sullivan, R. G. (2014). Collaborative, participatory, and empowerment evaluation: Building a strong conceptual foundation for stakeholder involvement approaches to evaluation (a response to Cousins, Whitmore, and Shulha, 2013). American Journal of Evaluation, 35(1), 144–148. doi: 10.1177/1098214013509875

Friedmann, H. (2009). Discussion: Moving food regimes forward: Reflections on symposium essays. Agriculture and Human Values, 26(4), 335–344. doi: 10.1007/s10460-009-9225-6

Friedmann, H., & McMichael, P. (1989). Agriculture and the state system: The rise and decline of national agricultures, 1870 to the present. Sociologia Ruralis, 29(2), 93–117. doi: 10.1111/j.1467-9523.1989.tb00360.x

Gürcan, E. C. (2018). Theorizing food sovereignty from a class-analytical lens: The case of agrarian mobilization in Argentina. Agrarian South: Journal of Political Economy, 7(3), 320–350. doi: 10.1177/2277976018800608

Guthman, J. (2008). Bringing good food to others: Investigating the subjects of alternative food practice. Cultural Geographies, 15(4), 431–447. doi: 10.1177/1474474008094315

Hanberger, A. (2018). Democratic caring evaluation for refugee children in Sweden. In M. Visse, & T. Abma (Eds.), Evaluation for a caring society (pp. 105–124). Retrieved from https://books.google.ca/books?id=al9RDwAAQBAJ&lpg=PA120&dq=Evaluating for a caring society&pg=PR6#v=onepage&q=Evaluating for a caring society&f=false

Israel, B. A., Schulz, A. J., Parker, E. A., & Becker, A. B. (1998). Review of community-based research: Assessing partnership approaches to improve public health. Annual Review of Public Health, 19(1), 173–202. doi: 10.1146/annurev.publhealth.19.1.173

Jansson, S. M., Benoit, C., Casey, L., Phillips, R., & Burns, D. (2010). In for the long haul: Knowledge translation between academic and nonprofit organizations. Qualitative Health Research, 20(1), 131–143. doi: 10.1177/1049732309349808

Kemmis, S., McTaggart, R., & Nixon, R. (2014). The action research planner: Doing critical participatory action research. Singapore: Springer.

Klocker, N. (2012). Doing participatory action research and doing a PhD: Words of encouragement for prospective students. Journal of Geography in Higher Education, 36(1), 149–163. doi: 10.1080/03098265.2011.589828

Knight, A. J. (2013). Evaluating local food programs: The case of Select Nova Scotia. Evaluation and Program Planning, 36(1), 29–39. doi: 10.1016/j.evalprogplan.2012.05.003

Koç, M., Bancerz, M., & Speakman, K. (2017). The interdisciplinary field of food studies. In M. Koç, J. Sumner, & A. Winson (Eds.), Critical perspectives in food studies, (2nd ed., pp. 3–18). Don Mills, Canada: Oxford University.

Levkoe, C. Z., & Blay-Palmer, A. (2018). Food counts: Food systems report cards, food sovereignty and the politics of indicators. Canadian Food Studies/La Revue canadienne des études sur l’alimentation, 5(3), 49–75. doi: 10.15353/cfs-rcea.v5i3.277

Lewis, D. Jr., Yerby, L., Tucker, M., Foster, P. P., Hamilton, K. C., Fifolt, M. M., … Higginbotham, J. C. (2015). Bringing community and academic scholars together to facilitate and conduct authentic community based participatory research: Project UNITED. International Journal of Environmental Research and Public Health, 13(1), 1–13. doi: 10.3390/ijerph13010035

Loopstra, R., & Tarasuk, V. (2013). Perspectives on community gardens, community kitchens and the Good Food Box program in a community-based sample of low-income families. Canadian Journal of Public Health, 104(1), 55–59. doi: 10.1007/BF03405655

Martin, M. A. (2018). “Sometimes I feel like I’m counting crackers”: The household foodwork of low-income mothers, and how community food initiatives can support them. Canadian Food Studies/La Revue canadienne des études sur l’alimentation, 5(1), 113–132. doi: 10.15353/cfs-rcea.v5i1.188

Moragues-Faus, A. (2017). Emancipatory or neoliberal food politics? Exploring the “politics of collectivity” of buying groups in the search for egalitarian food democracies. Antipode, 49(2), 455–476. doi: 10.1111/anti.12274

Ochocka, J., & Janzen, R. (2014). Breathing life into theory. Illustrations of community-based research: Hallmarks, functions and phases. Gateways: International Journal of Community Research and Engagement, 7(1), 18–33. doi: 10.5130/ijcre.v7i1.3486

Paré, A. (2019). Re-writing the doctorate: New contexts, identities, and genres. Journal of Second Language Writing, 43, 80–84. doi: 10.1016/j.jslw.2018.08.004

Patton, M. Q. (2011). Developmental evaluation: Applying complexity concepts to enhance innovation and use. New York, NY: Guilford.

Patton, M. Q. (2012). Essentials of utilization-focused evaluation. Thousand Oaks, CA: SAGE.

Powers, C. H. (2010). Making sense of social theory: A practical introduction (2nd ed.). Plymouth, United Kingdom: Rowman & Littlefield.

Preskill, H., & Torres, R. T. (1999). Building capacity for organizational learning through evaluative inquiry. Evaluation, 5(1), 42–60. Retrieved from http://www.stes-apes.med.ulg.ac.be/Documents_electroniques/EVA/EVA-GEN/ELE%20EVA-GEN%207458.pdf

Rojas, A., Black, J. L., Orrego, E., Chapman, G., & Valley, W. (2017). Final Report–Insights from the Think&EatGreen@School project: How a community-based action research project contributed to healthy and sustainable school food systems in Vancouver. Canadian Food Studies/La Revue canadienne des études sur l’alimentation, 4(2), 25–46. doi: 10.15353/cfs-rcea.v4i2.225

Royal Roads University. (2014). DSocSci Handbook. Victoria, Canada: Royal Roads University.

Schnoes, C. J., Murphy-Berman, V., & Chambers, J. M. (2000). Empowerment evaluation applied: Experiences, analysis, and recommendations from a case study. American Journal of Evaluation, 21(1), 53–64. doi: 10.1016/S1098-2140(00)00063-1

Shore, N. (2007). Community-based participatory research and the ethics review process. Journal of Empirical Research on Human Research Ethics, 2(1), 31–41. doi: 10.1525/JERHRE.2007.2.1.31

Smith, L. T. (2008). Decolonizing methodologies: Research and Indigenous peoples (12th ed.). London, England: Zed Books.

Travers, R., Pyne, J., Bauer, G., Munro, L., Giambrone, B., Hammond, R., & Scanlon, K. (2013). ‘Community control’ in CBPR: Challenges experienced and questions raised from the Trans PULSE project. Action Research, 11(4), 403–422. doi: 10.1177/1476750313507093

Warren, M. R., Calderón, J., Kupscznk, L. A., Squires, G., & Su, C. (2018). Is collaborative, community-engaged scholarship more rigorous than traditional scholarship? On advocacy, bias, and social science research. Urban Education, 53(4), 445–472. doi: 10.1177/0042085918763511

Wiebe, N., & Wipf, K. (2011). Nurturing food sovereignty in Canada. In H. Wittman, A. A. Desmarais, & N. Wiebe (Eds.), Food sovereignty in Canada: Creating just and sustainable food systems, (pp. 1–19). Black Point, Canada: Fernwood.

Author Note

This applied research was conducted in partnership with Cowichan Green Community Society and Nanaimo Foodshare Society.
