Improving use of evidence

On 4th November 2015, Iriss facilitated a workshop to scope actions to improve the use of evidence within social services in Scotland. This work takes forward one strand of the  Social Work Services Strategic Forum (SWSSF) Vision and Strategy 2015-2020. The workshop was designed and delivered in partnership with the Scottish Government Office of the Chief Social Work Adviser and Professor Andy Kendrick of Strathclyde University and Scottish University Social Work Education, who leads the Improving use of evidence strand of the strategy.

The workshop generated much discussion and some actions to take forward in 2016. Read the report of the workshop here:

Improving use of evidence_workshop-report

 

Research as if people were human

In this post, guest blogger Dr Cathy Sharp, Director of Research for Real, describes how stories can be used in action research.

Stories are a big part of action research. Drawing on a mix of stories as an explicit inquiry practice and a source of experiential learning can generate great insight and motivation for change. This means not just the stories told by people who live in the communities in which we work, or by those who use services, but also the stories of staff who work in those services. Almost certainly we could all make much better use of this most overlooked source of existing ‘data’, treating stories as a kind of ‘living case study’ to motivate and inform our action.

Stories are descriptive accounts of something that has happened, that bring things to life and, in sharing, compel people to ask questions. They’re best told in the first person. Stories involve relationships, events, notions of cause and effect, and priorities; they may be complex, yet memorable and galvanising for action: ‘When people can locate themselves in the story, their sense of commitment and involvement is enhanced’ (Shaw, 1998). More recently, Brené Brown has called stories ‘data with a soul’ (Brown, 2013).

A friend and former colleague of mine used to say that she didn’t like the term stories ‘because where I come from a story means a lie!’  I said that I didn’t mind too much what she called them as long as she didn’t dismiss them as ‘not proper research’.  A lot of the time stories are told informally and dismissed as anecdotal.  We’d all have our own ideas about what makes a good story or turns a good story into good evidence, but credibility is clearly important.  That seems to stem from a sense of the authenticity of the teller, the plausibility or believability and our ability to empathise with the story.  Action researchers would want to add that the stories should be drawn from multiple voices and should not just generate insight, but should also encourage and enable action for positive change.

A fairly typical response to hearing a story is that you can’t dispute the emotion. People have described stories as ‘honest and compelling’ and are often profoundly moved by the stories they hear, and so are encouraged to find out more. They may be surprised or challenged in their perceptions and assumptions, especially those of which they were previously unaware. In this way, stories provide new insights and help to shape a deeper understanding of other people’s experience, helping us to make sense of the world.

The tone and atmosphere created by stories, by their very nature, is often attractively equal and democratic (Bate and Robert, 2007).  Sharing stories and making links between them often allows new meaning to emerge; in this way they are an important approach to knowledge co-creation and a great way to create a real appetite for evidence, a hunger for more stories, more feedback and shared sense making.  Sharing stories is the bedrock of relationship-centred practice:  when people talk with each other in this way about high points or challenges, they naturally build empathy, mutuality, respect, trust and genuineness – in short, they build high quality relationships (Dewar and Mackay 2010).   Just the qualities we need to preserve and build in health and social care and across the public service reform agenda.[1]  The Chief Medical Officer Sir Harry Burns has described this sharing as a ‘vital ingredient in involving people more in shaping and running public services in the future’.[2]

The recent IRISS Insight provides a good account of some of the issues for researchers and practitioners interested in making more of stories[3].  An important question for action research is about how the stories are analysed, when and by whom; I believe that this is where their real potential as a driver for change lies.

My own ‘story about stories’ is one of a slowly changing acceptance of their validity and value.  Research on the quality of health and well-being in south east Glasgow used a collaborative and appreciative process to gather and analyse the stories collected from the community (Kennedy and Sharp 2010). This was described as an ‘unexpected reality-check’ – quite different from a focus group or survey process.  By focusing first on what was working well, practitioners got some valuable and unusual feedback.  Further exploration of the stories helped them to see their own assumptions about the people and services they encountered. The stories revealed telling messages about the attitudes faced by local people using services, the lack of joined-up practice and the hurdles they faced when trying   to access basic information. The stories prompted more stories from within the system too and enabled a different kind of conversation to happen.

In the evaluation of the Cedar pilot, the ‘Stories of Practice’[4] were composite stories based on themes drawn from two years of data and accounts from children and mothers in recovery from domestic abuse (Sharp and Jones with Humphreys and Netto, 2011). For the final Cedar evaluation event, it was necessary to find a way of retaining the integrity of the voices of children and their mothers and of enabling a small number of them to participate in the event alongside professionals (Sharp, 2014 forthcoming). At the event, the stories were ‘re-told’ by volunteer storytellers for whom they had some resonance. The storytellers and listeners included young people and mothers who had completed the Cedar group work, as well as programme staff and facilitators. The stories were collectively analysed in small groups through a dialogical group process which drew out key themes and learning, and from which the participants together developed recommendations for the Scottish Government. This process was adapted from the Storydialogue method for health promotion knowledge development and evaluation (Labonte and Feather, 1996).

Sharing stories in this way gave access to the real and emotional content of experience in a relatively raw and anonymised form that also enabled wider participation in the analysis process.   It is this kind of co-analysis of stories that is particularly powerful in facilitating a different and more appreciative dialogue which motivates participants to identify existing good practice, brings this to life and propels mutual learning, collaborative action and the embedding of change and practice development (Dewar and Sharp, 2013).

For further guidance and inspiration see ‘Changing Places with Stories’, developed by Glasgow’s Space Unlimited, which provides two ways to enable collective analysis of stories so that they are shared in ways that lead to collaborative learning and change (Space Unlimited, 2013).

At the end of the day, the full potential of these kinds of dialogical methods lies in their use in a wider ‘whole cycle approach’ to research and evaluation that is built into the fabric of what we do and how we design and deliver our services (Wadsworth, 2011). Ultimately they energise and restore people’s connections with work and motivate people to work together. A humane approach to inquiry; truly, ‘research as if people were human’[5].

 

References

Bate, P and Robert, G (2007) Bringing User Experience to Healthcare Improvement, The concepts, methods and practices of experience-based design, Radcliffe

Brown, B (2013) Daring Greatly, Penguin

Dewar, B and Mackay, R (2010) Appreciating compassionate care in acute care setting caring for older people, International Journal of Older People Nursing, 5, pp. 299-308

Dewar, B and Sharp, C (2013) Appreciative dialogue for co-facilitation in action research and practice development, International Practice Development Journal 3 (2) [7]  http://www.fons.org/library/journal.aspx

Kennedy and Sharp (2010) Modest and mighty: stories of health and well being from Langside and Linn, Final report http://library.nhsggc.org.uk/mediaAssets/CHP%20South%20East%20Glasgow/Langside%20and%20Linn%20Final%20Report%20Jan%202010.pdf

Labonte, R. and Feather, J. (1996) Handbook on Using Stories in Health Promotion. Ottawa: Health Canada

Sharp, C (2014 forthcoming) Creating a community of reflective practice – supporting children and mothers in recovery from domestic abuse, in Promoting Change Through Action Research: Current Trends in Education, Social Work, Health Care and Community Development, (eds) Stern, T., Rauch, F., Schuster, A. and Townsend, A., Sense

Sharp, C., Jones, J., with Netto, G. and Humphreys, C. (2011) “We thought they didn’t see”: Children and Mothers Experiencing Domestic Abuse Recovery, Evaluation Report, Scottish Women’s Aid, June

Shaw et al (1998) Harvard Business Review, May-June, 76, pp. 41-48. Quoted in Wadsworth, Y., Wierenga, A. and Wilson, G. (2007, 2nd ed) Writing narrative action evaluation reports in health promotion – manual of guidelines, resources, case studies and Quick Guide, State of Victoria, Department of Human Services and the University of Melbourne, Australia.

http://docs.health.vic.gov.au/docs/doc/62DE26A64376FED8CA257B1E0020CAE4/$FILE/near.pdf

Space Unlimited (2013) Changing Places with Stories http://www.spaceunlimited.org/media/91137/changing%20places%20with%20stories%20(260713).pdf

Wadsworth, Y (2011) Building in Research and Evaluation, Human Inquiry for Living Systems, Allen and Unwin


[1] See for example NHS Education for Scotland (2013) Listen Learn Act: Harnessing the power of personal stories to drive service improvement. Information and guidance for staff on story-gathering. http://lx.iriss.org.uk/sites/default/files/resources/final_pdf_listen_learn_act.pdf

[2] Quoted in Space Unlimited (2013)  Changing Places with Stories

[5] This phrase is used by John Rowan (2001) The Humanistic Approach to Action Research in Reason, P and Bradbury, H (eds) Handbook of Action Research, Sage.

Evidence and Innovation: dance, dance whatever you may be…

I am now two-thirds of the way through an IRISS project exploring the links between evidence and innovation in the context of Scotland’s social services.  Arguably, one of the most important parallels to have emerged so far is that both evidence and innovation are surrounded by voluminous, and at times contradictory, literatures.  In particular, diverse definitions and models have been developed to describe and explain these terms, and how they operate in organisational contexts.

In relation to evidence-based approaches, there is considerable debate surrounding two questions: How do policy-makers and social service practitioners use evidence, and what counts as ‘good’ evidence in this context? In searching the extensive terrain of these debates we might start with Weiss’ (1979) influential seven-fold typology of evidence use, and trace its development into the three-fold model presented by Amara et al (2004), where research use is said to be either instrumental, conceptual or symbolic. In terms of gauging what counts as ‘good’ evidence, we might want to engage in ongoing debates concerning the research design hierarchy in efficacy and evaluation research (see my previous blog post), or to broaden our remit and consider the value of practitioner and service-user experiences as important forms of evidence (Collins and Daly, 2011).

Likewise, our attempts to ‘pin down’ innovation may lead us in diverse and sometimes contradictory directions. We might find the idea of an innovation spectrum a useful way to delineate radical from incremental innovation, or we might enjoy Kirby Ferguson’s refreshing view that ‘everything is a remix’. We might want to focus on public sector orientated debates surrounding the many perceived barriers to innovative policy and practice in this context (Burke, Morris and McGarrigle, 2012), or see what transferable lessons we can learn from a range of other contexts and sectors including technology, music and international development (TED, The Creative Spark Playlist, 2013).

Whichever route we choose to go down (and the above is just a taster of the possibilities available to us), what has become clear during the evidence and innovation project at IRISS is that how we define and conceptualise evidence and innovation influences the relationship(s) we perceive between them. I will use one example as an illustration here.

In evidence-based approaches, evidence is sometimes viewed in an instrumental way.  This is where evidence is seen as the product of research and as a means to an end.  It is a piece of knowledge or information that is of direct practical use, telling us whether or not a policy or practice is capable of achieving a particular outcome (Amara et al, 2004, p. 76). This is the view of evidence which appears to underpin the ‘What Works’ agenda, viewed most recently in the discourse of the What Works Centres.  There are parallels with Weiss’ (1979) Knowledge-driven Model and Problem-solving Model (p. 427-8), where evidence is the “fruit” of research, from which “new policies emerge” (p. 426).

The appeal of this view of evidence is its apparent simplicity, rationality and pragmatism.  It implies that evidence is a bounded and useful entity, which can be directly used to guide decision-making.  Arguably, this fits with the move to create more accountable, transparent and rigorous public and social services throughout the UK since the 1980s (Munro, 2004). This model may better lend itself to the straightforward and transparent defensibility of decisions, enabling policy-makers and practitioners to claim  ‘the evidence told us this, so we did this’.  In the current context of scrutinised public spending, this may be a valued trait.

It is also perceived as a way to make social services, such as social work, more able to articulate what they do and more able to defend practice and decisions in the face of criticism. There is a sense that what would be required here is a commitment to using those methods and practices that have been ‘proven’ to ‘work’, so that social workers can clearly articulate the logical and objective grounds for their decision making if something goes wrong. However, some have suggested that, taken to its logical conclusion, this implies that there is one right way to do something, which could lead to the standardising of, or a ‘cookbook’ approach to, social work (Otto & Ziegler, 2008, p. 273; Forrester, 2010).

This is an interesting debate in itself, but what we are concerned with here is the impact of adopting this instrumental view of evidence when considering the relationship between evidence and innovation.  Theoretically, there may be a tension between the use of evidence as a means of standardising, accounting for, and defending social service practice, and the desire to boost innovation in this context.  These dual aims may sit together rather awkwardly given research that suggests that social service innovation is stifled in rigid, highly standardised and risk-averse contexts (Brown, 2010; Munro, 2011).  This may be particularly true if it is radical and transformative innovation that is being sought, as it is in the context of Scottish social service reform (The Scottish  Government, 2011).

So, when evidence is defined instrumentally, it may be in tension with innovation in a social service context. In contrast, what happens if we adopt Weiss’ (1979) ‘enlightenment’ view of how evidence is used, which has been referred to elsewhere as the ‘conceptual’ use of research (Amara et al, 2004)? What is being referred to here is the role evidence plays in influencing what is on the policy agenda and how it is framed, conceptualised and discussed. Thus it is:

the concepts and theoretical perspectives that social science research has engendered that permeate the policy-making process… the imagery is that of social science generalizations and orientations percolating through informed publics and coming to shape the way in which people think about social issues (Weiss, 1979, p. 429).

So research and evidence can stimulate ideas and curiosity, which can lead to a reframing of the agenda and new ways of understanding the world, which, in turn, may suggest new ways of  “intervening and changing the world” (Nutley et al, 2003, p. 130; Gough, 2013, p. 163).  Adopting this conceptualisation of evidence-use may, theoretically, mean that evidence and innovation are mutually reinforcing agendas.  Both are about engaging with ongoing processes of reframing, rethinking and rearticulating as part of a wider process of change and reform.

The aim here has been to highlight the importance of spending time unravelling the definition, conceptualisation and usage of the key terms of this project, in order to develop a more nuanced understanding of the relationships between evidence and innovation. It also reinforces the importance of considering one reform agenda (e.g. innovation) in the context of other, simultaneous reform agendas (e.g. evidence-based approaches). As parallel reform discourses, evidence and innovation may be compared to dance partners*. They share a space, and so from time to time they will inevitably cross paths and step on one another’s toes. However, the particular dances they perform, and the extent to which they move as one or trip one another up, will, to some extent, be dictated by the ways they are discussed, defined, understood and operationalised by all those involved.

* Thanks to my colleague, Rhiann, for this helpful metaphor.

References:

Amara, N; Ouimet, M; Landry, R (2004) New Evidence on Instrumental, Conceptual, and Symbolic Utilization of University Research in Government Agencies, Science Communication, 26, pp. 75-106.

Brown, L (2010) Balancing Risk and Innovation to Improve Social Work Practice, British Journal of Social Work, 40(4), pp. 1211-1228.

Burke, K; Morris, K; McGarrigle, L (2012) An Introductory Guide to Implementation, Dublin: Centre for Effective Services. http://www.effectiveservices.org/implementation/

Collins, E & Daly, E (2011) Decision making and social work in Scotland: The role of evidence and practice wisdom, Glasgow: IRISS.

Gough, D (2013) Theories, perspectives and research use, Evidence & Policy, 9(2), pp. 163-4.

Munro, E (2004) The impact of audit on social work practice, British Journal of Social Work, 34(8), pp. 1073-1095.

Munro, E (2011) The Munro Review of Child Protection: Final Report, London: Department for Education.

Nutley, S; Walter, I; Davies, H (2003) From Knowing to Doing, Evaluation, 9(2), pp. 125-148.

Otto, H & Ziegler, H (2008) The Notion of Causal Impact in Evidence-Based Social Work: An Introduction to the Special Issue on What Works?, Research on Social Work Practice, 18(4), pp. 273-277.

Weiss, C (1979) The Many Meanings of Research Utilization, Public Administration Review, September/October, pp. 426-431.

What came to the forefront whilst I was researching the background?

I have just completed my first week working on a project with IRISS exploring the links between evidence and innovation in the context of Scotland’s social services.  This blog marks the first of my weekly attempts to chart my progress and experiences of researching this emerging topic. I use the word emerging here, because what has already become apparent is that, although ‘evidence’ and ‘innovation’ have both been explored extensively across the academic, policy and practice literatures, their complex and evolving relationship with one another has not yet been considered in the same detail.  This makes for a project that is both timely and challenging.

Dedicating this week to researching the key debates and ‘buzz words’ associated with evidence and innovation, and their application to social services, has reaffirmed for me the importance of context. It has led me to conclude that the relationship between evidence and innovation cannot be adequately reflected on, in a way that is meaningful to practitioners, without this topic being situated within the current context of social service policy and practice in Scotland. Of particular relevance for the evidence and innovation debate is the drive to reform Scotland’s public services in general, and social services in particular. This imperative is present in influential reports including The Christie Report, the Scottish Government’s response to this in Renewing Scotland’s Public Services, and its report on social services, Changing Lives. Each of these highlights the difficult situation faced by public services in the wake of a financial crisis which has simultaneously reduced public sector resources whilst increasing the demand for them, as the effects of the recession are felt by society’s most vulnerable. This, combined with an ageing population and recruitment difficulties, has culminated in a challenging climate for social services.

The government’s response to these complex demands has been to present an improvement agenda, and there are two aspects of this agenda which I see as crucial to the contextualisation of the evidence and innovation project at IRISS. First, that improvement will be stimulated through radical innovation (rather than incremental innovation), and second that improvements will also be achieved by tightening oversight and accountability, through the application of transparent and rigorous audit mechanisms.  It is easy to get swept along in the power of these separate discourses.  ‘Innovation’ seems to have evolved into a term that has only positive connotations, and has become the ‘go to’ concept for those attempting to improve organisations and services.  The drive towards greater accountability for organisations funded by public money has been a persuasive discourse since the 1980s New Public Management revolution and, arguably, has only been strengthened by recent high profile illustrations of mismanagement and error in the public sector (The NHS debt and the Daniel Pelka case are just two examples).

However, upon closer inspection, the government’s suggestion that social service reform should be pursued through greater accountability and radical innovation is problematic. There is a potential antagonism here between the pursuit of innovation and tighter regulatory regimes, particularly where the latter are based on narrow, high-stakes performance criteria that are used publicly to facilitate competition, to scrutinise and to pass judgements. Innovation risks being stifled in a climate where practitioners have to ‘watch their backs’ and ensure that they are achieving particular performance targets (Munro, 2004; Brown, 2010). This issue may be intensified if it is radical innovation that is being sought. Regulatory regimes that can serve to create a ‘blame culture’ may undermine the desire and willingness to implement radical innovations, as social service practitioners may fear the repercussions for their clients and themselves if something goes wrong.

Given this potential antagonism within the government’s reform discourse, and the likelihood that both radical innovation and greater accountability will continue to propel the improvement agenda, it is important that this context firmly underpins the IRISS project exploring the links between evidence and innovation. This context suggests that there is likely to be a pivotal role for ‘evidence’ throughout the innovation process, not least as a way of mitigating some of the potential risks and concerns this poses for service users and the workforce. Having established this, other questions begin to surface. What kinds of evidence are integral to innovation? Collected by whom? And used when and how? These are just some of the questions I will be pursuing over the forthcoming weeks.

However, what also emerges from this consideration of context is that evidence alone is unlikely to be enough to secure successful innovation in a social service context. Evidence may well prompt, inform and evaluate innovation, but there is clearly an important role for the Government and service leaders in the development of a culture where “well thought through, well managed risks” are an accepted feature of the transformational process (Brown, 2010, p. 1222). Come to think of it, there might be a role for evidence here too, as a way of developing our understanding of the culture and conditions necessary for successful innovation in a social service setting. Undoubtedly, new links between evidence and innovation will continue to emerge as I begin the second week of this thought-provoking project.

Jodie Pennacchia
The University of Nottingham
Follow: @jpennacchia

References

Brown, L (2010), Balancing Risk and Innovation to Improve Social Work Practice, British Journal of Social Work, 40(4), pp. 1211-1228.

Munro, E (2004), The impact of audit on social work practice, British Journal of Social Work, 34(8), pp. 1073-1095.

Contribution Analysis

I recently attended a session on contribution analysis. I was on a fact-finding mission. At the end, I came away thinking it wasn’t anything new and that it didn’t involve anything I hadn’t done before as a researcher or evaluator. It just wasn’t called contribution analysis!

On further reflection, however, I saw that it might offer certain benefits – but before going any further…

What is contribution analysis?

  • It was ‘invented’ by John Mayne at the end of the 1990s, when he was working for the Office of the Auditor General of Canada. He has since become an independent advisor on public sector performance.
  • It works towards or back from evidencing the impact of a project/programme or policy – and fits nicely with the shift to focusing on impact (or ‘outcomes’) as we move away from an evaluation culture where we list all the things we have done or produced! If you are smart – and depending on what your project outcomes are – you should be able to link your outcomes to higher strategic ones, eg those in Scotland’s National Performance Framework. Policymakers will love you?!
  • It acknowledges right up front that your project/programme or policy is only part of a more complicated landscape – with multiple partners, different initiatives targeting the same groups, and spheres of direct and indirect influence all playing a part. You can also use it to explain why your project has, for example, limited impact due to other factors outwith your control, and to postulate alternative explanations of change. In short, it doesn’t ignore the rest of the world.
  • Importantly, Mayne recognised the limitations we work under – that you can’t always prove causality by testing cause and effect in controlled conditions in the positivist tradition – however, YOU CAN provide evidence and reduce uncertainty about the difference you are making.
  • Where projects/programmes or policies don’t have clear outcomes or objectives, targets or timelines, it can help to redress this – by making them explicit and thereby possible to evaluate.

So how do you do it?

Mayne identified a number of steps – which seem to have been slightly amended or reinterpreted by a range of people – but essentially these are:

1. Set out the attribution problem to be addressed – this could be either to identify the outcome you hope to achieve, or to ask whether your project/programme/policy has achieved its stated outcome.

2. Develop a theory of change or logic model – this involves developing a results chain to make explicit the theory of change upon which the project, programme or policy is based, eg if I invest time, money and human resources in delivering information, mentoring schemes, role models and taster events targeted at pupils from schools where few go on to higher education (HE), I can inform pupils and their influencers about the benefits of higher education, raise their aspirations, motivate them to achieve better exam results at school and successfully apply to university – thereby widening access to higher education (the ultimate outcome)!

Results chains can get complicated – with multiple chains or ‘nested logic models’ – but that is for another day.

This should also involve identifying the risks and assumptions you have made, eg that your funding will continue and policymakers believe widening access to HE is a good thing; that you can recruit and retain good staff; that your activities are well received and pupils (and parents) react as you expect; that your investment is significant enough to sustain positive change in behaviour; and that changes outside your sphere of influence, such as the charging or raising of course fees or a reduction in university places, don’t happen. Agreeing risks and assumptions will be an important part of assessing whether implementation has occurred as expected – with lessons to be learnt if assumptions are incorrect. You might also consider ways of reducing risks.

It’s been said that contribution analysis is good for confirming or reviewing a recognised theory of change – but not so good at creating or uncovering one! If so, does this mean it’s not suitable for trial or experimental projects?

3. Gather evidence to substantiate your theory of change or logic model – using data you have been gathering whether statistical or qualitative. You can also draw on others’ evidence eg to back up your assumptions around how people react to activities similar to your own – perhaps in the same or slightly different environments?

4. Assemble a contribution or performance story based on the evidence – this should assess where the evidence is strong, credible, weak or contested (see the sketch after these steps for one illustrative way of recording this).

5. Seek out additional evidence

6. Revise and strengthen your contribution story using the additional evidence – your final version should be short as well as plausible (and perhaps with a visual representation of your theory of change). Policymakers seem to like this, preferring stories to reading vast tomes of research.
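
For readers who find it easier to see these steps in a more structured form, below is a minimal, hypothetical sketch (in Python) of how steps 2 to 4 might be recorded: the results chain as a list of links, each carrying a one-line note on its evidence and a strong/credible/weak/contested rating, from which a short draft contribution story is printed and the weakest links are flagged as priorities for seeking additional evidence. All of the names, example data and ratings are illustrative assumptions, loosely based on the widening access example above; it is a sketch of the idea, not a tool prescribed by the method.

```python
# A rough, hypothetical sketch: a results chain recorded as data, with a rating
# for the strength of the evidence behind each link, and a short draft
# contribution story that flags where more evidence is needed.

from dataclasses import dataclass

RATINGS = ("strong", "credible", "weak", "contested")

@dataclass
class Link:
    """One link in the results chain: what we expect to happen, and the evidence for it."""
    step: str       # the expected result at this point in the chain (step 2)
    evidence: str   # one-line note on the evidence gathered so far (step 3)
    rating: str     # one of RATINGS (step 4)

    def __post_init__(self):
        if self.rating not in RATINGS:
            raise ValueError(f"rating must be one of {RATINGS}")

# Step 2: the theory of change made explicit as a chain of expected results
# (illustrative example data only).
chain = [
    Link("Deliver information, mentoring and taster events to pupils from low-participation schools",
         "Attendance records and session feedback forms", "strong"),
    Link("Pupils and their influencers become better informed about higher education",
         "Before-and-after questionnaires with pupils and parents", "credible"),
    Link("Pupils' aspirations and exam results improve",
         "Small sample of attainment data; no comparison group", "weak"),
    Link("More pupils from these schools apply to and enter university",
         "National admissions statistics; other initiatives target the same schools", "contested"),
]

def contribution_story(links):
    """Step 4: assemble a short contribution story, noting where evidence is weakest."""
    lines = ["Draft contribution story:"]
    for number, link in enumerate(links, start=1):
        lines.append(f"  {number}. {link.step}")
        lines.append(f"     evidence: {link.evidence} [{link.rating}]")
    gaps = [link.step for link in links if link.rating in ("weak", "contested")]
    if gaps:
        lines.append("Priorities for additional evidence (step 5):")
        lines.extend(f"  - {step}" for step in gaps)
    return "\n".join(lines)

print(contribution_story(chain))
```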

Different uses

The session also revealed that people have used contribution analysis in different ways. Some have used it to simply evaluate projects (often retrospectively) while others have used it as a planning tool to identify with partners the outcomes they want to achieve, agreeing a theory of change and results chain to determine their project before it starts.

In the second case, they also identify key measures for evaluating the different points in the chain, building in a system for ‘quality assurance’, ‘performance management’ or ‘continuous improvement’ to track progress over time and inform future design. How ‘informative’ it is will depend on the type of data collected, of course. Nevertheless, this approach offers the benefit of integrating your planning, monitoring and evaluation data into one set. You can also build your evidence year on year.

Of course, this assumes that you have the luxury of long-term funding as it may take time for you to build your evidence. Having said that, contribution analysis can be useful in that you have identified interim steps – or short, medium and longer-term goals – which you can report against rather than trying to demonstrate impact prematurely.

The challenges – of which there are many

  • Contribution analysis doesn’t remove the difficulties in identifying what it is you want to achieve, your theory of change and what indicators you use to measure it. If you’re working with others to plan, it doesn’t make the challenges of partnership working, or the time and commitment involved, any less.
  • Sometimes your measures will be constrained and compromised by the data that is available – which may be a weak proxy or an ill-fitting indicator!
  • Contribution analysis may lead to you discovering evidence that makes you want to revise your key outcome – eg you might find out that it’s better for more women to leave their husbands to protect them from domestic abuse, even though your project is funded to keep families together. This may not sit well with your funders!
  • As mentioned, it can take a very long time to generate, gather and build credible evidence to attribute positive change – which can be frustrating and problematic, especially if the pressure is on to cut projects or activities within these to make savings. Those at the session who had used contribution analysis said they were nowhere near this stage – and didn’t know of anyone else who was either! Linking your impact to financial data to provide an indicator of value for money is yet another challenge!

To sum up

Contribution analysis is emergent practice, but I like that it provides a theoretical framework and set of methods – the end results of which focus on impact and are more likely to be accepted as credible by policy makers and funders. I like that it builds others’ spheres of influence into your contribution story – with Steve Montague to be name-checked for developing thinking in this area. Having said all of that, it doesn’t make the doing any easier!

References

Mayne, J (1999) Addressing attribution through contribution analysis: using performance measures sensibly

Montague, S (2000) Three spheres of performance governance: spanning the boundaries from single-organisation focus towards a partnership. http://evi.sagepub.com/content/13/4/399

Scottish Evaluation Network event, 18 June 2013, Contribution analysis: a half day workshop on the practicalities of using contribution analysis. Contributions from Lisa Cohen (NHS Health Scotland) and Sarah Morton (University of Edinburgh)

 

Making the most of practitioners doing research

Practitioners undertake a considerable amount of research; in fact, Mitchell and colleagues estimate that ‘Practitioner research in social work probably occupies a major part of the total volume of research activity in this field’ (Mitchell et al, 2010: 8).

There is evidence to suggest that practitioner research can be a valuable approach for strengthening the use of research, not just for the individual practitioner undertaking research but potentially for the organisation and perhaps even the sector in which they are based. These benefits vary depending on the support available to the practitioner and how the research endeavour is structured, which can, for instance, involve support being provided by other practitioners, academics or research colleagues based in-house or in external organisations. Some of the benefits of practitioner research for the practitioner and their organisation can include:

  • Delivers research of direct relevance to practice concerns
  • Improves research capacity of individual practitioners and organisations
  • Strengthens the active role of the practitioner in the research process
  • Brings the worlds of policy, practice and research closer together
  • Helps an organisation develop the capacity for critical inquiry and a “learning orientation”
  • Supports the desire for and the use of research done by “outsiders”
  • Reduces the distance knowledge has to travel from research to practice
  • Provides a starting point for further research-practice collaboration

(Armstrong and Alsop, 2010; Roper, 2002; Anderson and Jones, 2000: 430)

However, across social services and health we are not necessarily maximising the impact of research undertaken by practitioners, for several reasons, including:

1) practitioner researchers often lack professional support and training related to the use and application of research methods and theory.

2) practitioners struggle to access existing evidence related to their work, thus potentially affecting the quality of what they are able to produce.

3) practitioners engaged in conducting research into their own team, service or organisation do not usually have the time or capacity to disseminate their research findings or to support its use in other services or organisations.

Along with colleagues at Edinburgh University (Heather Wilkinson and Catherine-Rose Stocks-Rankin), we at IRISS have devised and supported a practitioner research programme known as PROP (practitioner research: older people). PROP focused on research about older people in an attempt to respond to some of the challenges outlined above – for further information see https://blogs.iriss.org.uk/prop/2012/05/04/welcome-to-props/. PROP involved practitioners undertaking small-scale research projects, supported by research and knowledge exchange training, a research mentor, opportunities to engage with their peers (other practitioners undertaking research), support for disseminating materials (from a graphic designer and the project team) and a project fellow to talk to for support and advice. The PROP project has been incredibly well received, though the contribution analysis report currently being finalised will provide us with evidence of the impact of the project and the reasons for this.

We’re currently writing up some of the learning and reflections from this programme and are exploring how to build on this work. Our reflections are at an early stage and at the moment are more like questions than reflections, but include:

1) Research centric: We started from the idea that the lack of research use was a problem in improving support for older people. Would this have been the key problem identified if we’d collectively devised our focus with practitioners, researchers, policy makers and older people themselves? And did this focus encourage an unequal environment for exchange, with one group of collaborators bringing to the endeavour specific knowledge about research?

2) Peer support: Practitioners identified that they particularly valued and benefited from regular contact with their peers, and that they learnt a lot about other health and social care roles and organisations. One of the research projects we supported was conducted by two practitioners based across two organisations, so we wonder whether this type of approach could further maximise the learning across sectors and organisations.

3) Skills based: Our approach focused on developing research skills and was less concerned with encouraging personal and organisational inquiry and reflection.  Would there be value in also exploring what a reflective, inquiring practitioner looks like and what behaviours and attitudes support this?

Any views or observations on this would be very welcome, and we will share our more refined reflections and the contribution analysis report, once they are finalised, on https://blogs.iriss.org.uk/prop/2012/05/04/welcome-to-props/ – watch this space…!

References

Anderson, G. and Jones, F. (2000) Knowledge Generation in Educational Administration From the Inside Out: The Promise and Perils of Site-Based, Administrator Research, Educational Administration Quarterly, 36(3), pp. 428-464

Armstrong, F. and Alsop, A. (2010) ‘Debate: co-production can contribute to research impact in the social sciences’, Public Money & Management, 30 (4): 208-10

Mitchell, F., Lunt, N. and Shaw, I. (2010) Practitioner research in social work: A knowledge review, Evidence and Policy, 6(1), pp. 7-31

Roper, L. (2002) ‘Achieving successful academic-practitioner research collaborations’, Development in Practice, 12 (3-4): 338-345

What improves evidence use in practice?

The notion of evidence-based practice emerged in the early 1990s, first in medicine, before spreading across a range of practitioner professions such as social work and education (Mullen, 2008). To a large extent, consideration of evidence and practice has lagged behind the much more prominent and extensive consideration of evidence-based policy, which was paid considerable attention when New Labour entered government in 1997. Nowadays the concept of being informed by, rather than based on, evidence is more widely accepted, as this recognises the numerous influences on practice (and policy), and the importance of these (public opinion, capacity, local context, individual needs etc). Perhaps the terminology is ready for further development, however; for instance, Nick Andrews (from the All Wales Academic Social Care Research Collaboration) recently referred to ‘evidence-enriched practice’, which seems an even better way to express the desired relationship between evidence and practice.

 

Despite the inevitable evidence gaps (the irony!) there is some evidence about activities which have potential to improve the use of evidence in practice:

  • Strengthen links and relationships between practitioners, researchers, citizens, service users and policy makers – who all bring different evidence and different types of evidence.  Personal contact is key to strengthening evidence use.

  • Nurture active collaborations where people are involved in actually doing something together (inquiring, thinking, reflecting, planning, doing).  Again potentially involving a mix of practitioners, researchers, citizens, service users, policy makers, and more…

  • Produce evidence together, involving practitioners, researchers and others, coming together at the earliest possible stage.

  • Strengthen evidence dissemination by developing targeted messages for key audiences in their language, walking in their shoes and reaching out to them in ways that best meet their needs. Think about dissemination as opening up further exploration and engagement rather than being an end in itself, so ensure methods support the development of links.

  • Provide opportunities to “try out” evidence, perhaps particularly where evidence challenges current attitudes or behaviours.

  • Support evidence role models: identify evidence champions, support them and raise awareness of their activities. Champions need to be credible and, although they are required at all levels of the organisation, it is particularly helpful if leaders champion evidence use.

  • Incentivise evidence use for practitioners and the creation of useful evidence for researchers. This can be done by making evidence use/usefulness a required component in documentation (business cases, project plans etc), a component in job descriptions and a criterion for promotion.

  • Designate evidence responsibilities within organisations, tasking people to interpret or synthesise evidence and develop relationships between evidence and practice.

  • Set aside evidence time to consult, reflect and discuss evidence, perhaps within team meetings or existing structures.

  • Nurture a culture of self-challenge and evaluation, personally value and demonstrate these behaviours, and champion them throughout your organisation.

(This list draws heavily on Nutley, 2003, and is informed by Lavis et al, 2003; Buckley and Whelan, 2009; and personal experience.)

This list is not comprehensive, and there is a variation in the strength of evidence supporting these activities.  It is also worth reflecting on whether behaviours and attitudes matter more than any particular activity.  This is suggested by Landry and colleagues’ study of 1000 Canadian social science scholars, which identified that the behaviour of researchers and research users had more impact on the use of research than any specific product (Landry et al, 2001).  In terms of specific attitudes and behaviours, the importance of enthusiasm in particular has been identified as having significance for supporting evidence use (Nutley, 2003). My experiential evidence also suggests that factors such as being able to think from someone else’s perspective, being understanding and patient of difference, and demonstrating this in your actions, may also have a significant impact.  I’d be interested to know if others have evidence that supports or contradicts this…


References and links

Buckley, H. and Whelan, S. (2009) Putting Research Evidence to Work: Key Issues for Research Utilisation in Irish Children’s Services (CAAB Research: Report No.2) http://www.srsb.ie/Publications/PDFs—Publications/PREW/PREW-Full-Report.aspx

Landry, R., Amara, N. and Lamari, M. (2001) ‘Utilisation of social science research knowledge in Canada’, Research Policy, 30: 333-349

Lavis, J., Robertson, D., Woodside, J., McLeod, C. and Abelson, J. (2003) ‘How Can Research Organisations More Effectively Transfer Knowledge to Decision Makers?’, Milbank Quarterly, 81: 221-248

Mullen, E., Bledsoe, S., Bellamy, J. (2008) ‘Implementing Evidence Based Social Work Practice’, Research on Social Work Practice, 18: 335-338

Nutley, S. (2003) Increasing research impact: early reflections from the ESRC evidence network, ESRC UK Centre for Evidence Based Policy and Practice: Working Paper 16

Share thoughts and experiences

This blog is intended as a space to share and reflect on evidence and practice, and to share what improves the relationship between them.  We would love to hear from you and particularly welcome guest posts if you have something to share.

Contact: Claire Lightowler (Evidence-informed practice: Programme manager, IRISS)