I’ve just been at an excellent international workshop on evaluating community engagement in research, held in Naivasha, Kenya. It was an exciting opportunity to share experiences and ideas with others grappling with how to make our evaluations of community engagement rigorous, relevant, and useful. Although some of the case studies and papers we discussed focused on public/community engagement for biomedical research, many of the issues are far more widely relevant. Here I share a few points useful for health policy and systems researchers with an interest in gender and ethics.

The potential value, but also complexity, of public/community engagement activities

Public and especially community engagement is widely promoted as a way to strengthen ethical practice in health research, including health policy and systems research. It is seen both as a good in and of itself (for example, as a way of showing respect or ensuring inclusion) and as a way of improving research (for example, by learning from communities how to do more locally appropriate and responsive work). Increasing resources, in terms of time, money and personnel, are being devoted to a diverse range of community engagement initiatives, many of them very innovative and creative. Although few would argue with the broad goals of community engagement, the complex and contested nature of its key elements is increasingly recognised (Which communities should we engage with? Who represents those communities? For what reason(s) are we engaging them? What is a ‘community’ anyway? And what does representation really mean? What kind of interaction ‘counts’ as engagement?).

There is also potential for community engagement to have negative effects, ranging from simply wasting community members’ time to exacerbating inequitable power relations by interacting meaningfully only with those least able to articulate the needs of the most vulnerable in society.

Community engagement evaluations often assess complex social interventions

There have been growing calls for careful evaluations of engagement initiatives. These evaluations have to recognise that community engagement activities are often complex social interventions with the potential for unexpected and unintended effects. This in turn suggests that, as health policy and systems researchers interested in community engagement, we should consider:

What are the community engagement activities we are evaluating?  It is often difficult (but important) to understand the relationship between community engagement activities and the health research study they are linked to. The study itself may well involve community engagement or even – in the case of community-based participatory research – be led by communities. Our health policy and systems research may well have inbuilt community engagement, making it difficult to disentangle the community engagement elements from the intervention(s) in which they are embedded. Unpacking the various elements of the intervention(s), including the public/community engagement elements, and considering what they are expected to achieve, is therefore likely to be a key starting point in designing a community engagement evaluation in health systems research.

Taking multiple perspectives into account.  An interesting analogy was shared at the meeting by Jim Lavery. He said that community engagement was rather like architecture, where a building’s look can be highly appreciated by some members of a community and thoroughly disliked by others (see for example the Royal Ontario Museum below), and where opinions of a building are likely to differ significantly between communities of architects and between communities of users of the building. This analogy suggests that evaluations of community engagement must recognise that different players – researchers, health managers, community members, funders – may have very different perspectives. In deciding whose perspectives to include in evaluations, we should consider competing interests and whether and how the voices of the least vocal members of communities have been included. This is likely to require an intersectional lens, whereby the range of social stratifiers (e.g. gender, race, age, class, (dis)ability and sexuality) that intersect to influence health needs, experiences, and outcomes is considered.

Royal Ontario Museum, Canada. Credit: Grant MacDonald

Recognising fuzzy distinctions between community engagement, public engagement, and wider stakeholder engagement.  To support the sustainability of public/community engagement activities, and to reach wider audiences, health policy and systems researchers increasingly engage stakeholders at multiple levels and from early in the research process. These activities are often seen as supporting uptake of research findings once a study is complete, but they can overlap with community and public engagement; for example, policy makers can be a gateway to broader publics and communities. Consequently, as well as considering multiple perspectives at local levels, evaluations may need to consider perspectives at national or higher levels.

Reflecting on the positionality of the evaluation team.  There can be advantages and disadvantages to evaluators being part of the community engagement intervention team being evaluated. Advantages can include depth of understanding of the intervention (and its context) and levels of trust built over time, whereas disadvantages can include an interest in proving the ‘success’ of activities and an inability to see things more easily observed by an outsider. For health systems research, as for other research, one potentially valuable option is to bring together a team of evaluators that includes both ‘insider’ and ‘outsider’ perspectives and incorporates different backgrounds and types of experience. Critical team members are likely to be people with strong local cultural and language knowledge and the ability to facilitate input and perspectives from those with the least voice locally.

Being proportionate in evaluations, and aiming for depth over breadth.  Recognition of the complexity of community engagement activities can lead to large and expensive evaluations, which also take up the time and energy of those who contribute as participants. Without careful planning, these evaluations do not necessarily provide valuable information. Our meeting discussions highlighted the need to keep evaluations proportionate to the engagement activities, and in some cases to prioritise depth of learning over breadth of topics and issues covered. Depth might be achieved by working out what has contributed to apparent successes and failures with regard to the intended goals of engagement, and by selecting a design that is appropriate to the type of evaluation being planned. Realist evaluation can be particularly useful where there is an interest in understanding ‘what works for whom in what circumstances and why’, encouraging us to identify the underlying mechanisms at play for specific outcomes in particular contexts.

Drawing on the wide range of available resources.  One of the organisers of the workshop was Georgia Bladon, who has facilitated the development of the MESH website, a space for community engagement with health research in low- and middle-income countries. MESH is a repository of 67 guides, tools, reports and papers that help develop the community engagement field. Something I found particularly useful was their map of the evaluation materials. For social science, and especially qualitative and participatory work more generally, another potentially valuable website, hosted under global clinical trials, is global social science. And finally, to support gender analysis and an intersectional lens in analysis, there is a range of materials available on the DFID-funded RinGs website, including a how-to guide on gender analysis and a ‘ten best resources on intersectionality’ guide.

This resource resulted from the March 2017 Mesh Evaluation workshop. For more information and links to other resources that emerged from the workshop (which will be built upon over time) visit the workshop page.

For a comprehensive summary of Mesh's evaluation resources, and to learn how to navigate them, visit the Mesh evaluation page.

This work, unless stated otherwise, is licensed under a Creative Commons Attribution 4.0 International License.
