Published on March 5, 2012
Public involvement in research: assessing impact through a realist evaluation
invoNET
David Evans, Vito Laterza & Rosie Davies
21 February 2012
Focus of this session
• Initial thoughts on reflexivity
• Background to our project, design and conceptual framework
• Reflections on early learning
  – Theory development and testing
  – Objectivity and contamination
  – Working with research partners
Reflexivity and reflective practice
• “Reflexivity” is not the same as reflective practice:
• Thinking about ourselves (i.e. reflective practice) and learning from that (i.e. doing things better through reflective practice)
• But also linking ourselves to our research participants (not just what you can do better, but thinking about others through our own experience)
• Thinking about thinking (i.e. epistemology): reflecting on the basic structures and processes behind the taken-for-granted everyday reality we live in and study in our research contexts
Background
• Team based at the University of the West of England (UWE) and Coventry University
• Grew out of the existing UWE Service User and Carer Involvement in Research initiative
• Nine academic researchers and four research partners (users) as co-applicants/co-researchers
Design
• Realist evaluation framework
• 18-month project
• Eight case studies
• Mainly qualitative methods
  – Semi-structured interviews (c. 5 participants per case study × 3 interviews over one year, i.e. c. 120 interviews in total)
  – Observation
  – Documentary analysis
  – Consensus workshops
• Economic analysis
Realist evaluation
• Policy is driven by an underlying theory of how an initiative is supposed to work
• The evaluator’s role is to compare theory with practice
• “What works for whom in what circumstances and in what respects?”
• Look for regularities of context, mechanism and outcome (CMO)
(Pawson 2006; Pawson & Tilley 1997; 2008)
Levels of public involvement in research theory
• Policy level – what do the DH, NIHR and other senior R&D stakeholders think are effective mechanisms leading to desired policy outcomes?
• Programme/project level – what do stakeholders (e.g. PIs, research teams, research partners) think involvement contributes to their desired outcomes?
• Academic level – what are the dominant academic theories in the literature about public involvement mechanisms in research and whether/how they work?
• Our research team – what do we think are the context-specific and generalisable mechanisms leading to desired policy outcomes?
Our CMO theory – to date

Context:
• Leadership on involvement by the PI or another senior member of the research team
• Attitudes of trust and respect towards the public involved
• Culture of valuing and supporting involvement
• Infrastructure that supports involvement, e.g. policy on payment and expenses

Mechanism:
• Involvement throughout a research project
• Long-term involvement
• Training and support
• Linking involvement to decision making
• Budget for involvement
• Defined roles

Outcome – impact on research design and delivery:
• Project design
• Research tools
• Recruitment
• Data collection
• Analysis
• Writing up
• Dissemination
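The CMO configurations above can be sketched as a simple record structure. This is only an illustrative toy, not part of the project's methods: the class name, field names and helper function are our own inventions, and the example configuration merely recombines entries from the theory table above.

```python
from dataclasses import dataclass

@dataclass
class CMOConfiguration:
    """One context-mechanism-outcome regularity observed in a case study."""
    context: list[str]    # contextual factors, e.g. leadership, culture
    mechanism: list[str]  # mechanisms thought to produce the outcomes
    outcome: list[str]    # observed impacts on research design and delivery
    case_study: str = ""  # which of the eight case studies it came from

# A toy configuration drawn from the theory table (hypothetical pairing)
example = CMOConfiguration(
    context=["Leadership on involvement by the PI"],
    mechanism=["Training and support", "Defined roles"],
    outcome=["Recruitment", "Data collection"],
    case_study="Case study 1",
)

def outcomes_for_mechanism(configs, mechanism):
    """Collect every outcome linked to a given mechanism across configurations."""
    return sorted({o for c in configs if mechanism in c.mechanism
                     for o in c.outcome})

print(outcomes_for_mechanism([example], "Defined roles"))
```

Tabulating configurations this way makes the realist question ("what works for whom in what circumstances?") queryable: one can filter by context or mechanism and compare the outcomes that recur across case studies.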
Our task – articulate and test public involvement in research theory
• Articulate policy-level programme theory from policy documents, actions, etc. – it’s about research quality, not empowerment
• Synthesise what we know about contexts, mechanisms and outcomes in practice from the literature – recognising complexity and uncertainty
• Simplify and express the theory in a testable form for testing in the case studies
• Collect case study data and analyse it
• Revise the theory and repeat the process
Reflections on theory development and testing
• Difficulty of categorising factors as context or mechanism
• Multiple contextual factors and mechanisms make causal attribution difficult
• The data collection period is shorter than the projects’ timescales
• Impact may be diffuse rather than specific
• Outcomes may be quite limited
‘Objectivity’ and ‘contamination’
At this stage, three related reflexive effects emerged:
• Leading members of the team are also experts who provide advice and training on public involvement in research to researchers and PIs (i.e. “now that you study us, can we still ask for advice?”)
• The questions we are asking are triggering processes of reflection that are likely to affect the ongoing shaping of public involvement structures and mechanisms (i.e. “Hmm, that’s a good question, I haven’t thought about that” or “I will certainly consider these issues further”)
• Participants themselves respond in varying ways to potential issues of overlap (or “contamination”): some do not seem particularly concerned, while others are keener to keep our study from influencing the ongoing process of public involvement under study
Reflections on ‘objectivity’ and ‘contamination’
Implications for questions of ‘objectivity’ in qualitative health research:
• Reflexive effects need to be taken into account and productively explored as data, rather than discarded as “unwanted bias”. This is why we decided to keep reflection going rather than adopt a unilateral policy: we will continue to give advice, and we are aware that participants may change their behaviour because of the questions and reflections that emerge during data collection; these effects will be followed up where possible
• We will respect research participants’ wishes on the matter: if they want us to put specific measures in place to reduce any possible influence, we aim to accommodate that
• We will protect research participants’ confidentiality and anonymity in all cases, and this will always take precedence over questions of reflexivity and objectivity
Involving research partners
• Contributed from the design stage onwards
• Each research partner works with one academic researcher and Vito on two of the eight case studies
• Research partners meet together as a group in addition to attending full team meetings
• Involvement in all aspects of the project, including theory building, conducting interviews and analysis
• Reflection on the team’s own process of involvement, issues and outcomes is included as data
• Support from Vito and a named researcher on the team
Reflections on involving research partners
• Role development within the institution at UWE
  – Extending formal arrangements because of our need for research passports
• Difficult to keep track of impact!
• Differences: levels of experience in research, life experiences, kinds of contributions, between case study teams ...
• Complex roles and relationships
Contact details
• David Evans, Professor in Health Services Research (Public Involvement) – David9.Evans@uwe.ac.uk
• Vito Laterza, Research Fellow – Vito.Laterza@uwe.ac.uk
• Rosemary Davies, Research Partner – Rosemary3.Davies@uwe.ac.uk