Record the Unexpected

One of the early criticisms of outcomes was that organisations would find supporting evidence for these positive statements regardless of the reality of the situation, and would ignore everything else. This fear proved largely unfounded, but keeping the potential pitfall in mind isn’t a bad idea. We must never allow our outcomes to become a straitjacket that prevents us from capturing the reality of our work.

More and more funders are telling us that they don’t want the reporting fairy tale: Once upon a time we did this and everyone lived happily ever after! They want the real picture of the difference their support is making… the successes, the challenges and the learning. Alongside expressing this desire, some of the more progressive funders have started to address the issue of ‘Funding Assurance’. Working towards the “Harmonising Reporting” principles, helping their funded organisations to improve their evaluation skills or simply being clear about what they want to know are just some of the ways that funders are responding to this challenge, so often raised by voluntary organisations.

What we are really talking about here is trust, and trust requires honesty from funder and funded alike. So when it comes to reporting… tell the whole story. Show the progress you have made towards achieving your outcomes, record and report on all unexpected outcomes from your work, talk about the challenges you faced along the way and, critically, demonstrate the learning that will inform your future plans as well as your future evaluations.

Tom Scott

Tom is the Training Officer for Evaluation Support Scotland, a charity that works with voluntary organisations and funders so that they can evaluate what they do, learn from that evaluation and deliver better services. You can find evaluation tools, support guides and templates on the ESS website.

Discovering the Story

Analysis is a scary word. Qualitative analysis is a scary phrase. Upon reading this phrase I am immediately transported to a lonely desk in a darkened side room where I sit hunched with head in hand… poring over interview responses and client quotes at 11pm on a Sunday. Perhaps your picture is not quite as bleak! Mine has certainly improved over the years as I have learned more about evaluation and analysis, and about how valuable the process can be in telling the story of my clients. In evaluation, analysis is the part of the process that needs time set aside to allow us to discover the meaning behind the messages. Taking the time to make sense of the information we collect improves our reports and leads to better activities and outcomes for our clients. It can also provide a welcome morale boost by reminding us, in rich colour, of the difference we are making.

We have collected data from our clients using multiple methods and now a report is due. We need to discover the story so that we can tell it. This process often starts with theme generation. This is where you sift through a collection of responses (any qualitative data) and identify similarities. The responses can then be grouped by their commonality – this is called coding. Typically this word-based analysis is done by counting word repetition or by identifying key words in the context of our outcomes. Another word-based technique that is useful for the voluntary sector is looking for key indigenous terms. The idea is that human experience is marked by ‘tribal vocabulary’: words and phrases used in a ‘localised’ way. This method fits well with the idea of participatory appraisal – people telling their own story in their own way using their own language.
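To make that counting step concrete, here is a minimal Python sketch of word repetition and key-word-in-context. It assumes your responses are already held as plain text; the sample responses, the tiny stop list and the ‘confidence’ outcome keyword are all hypothetical illustrations, not an ESS tool.

```python
from collections import Counter
import re

# Hypothetical client responses, gathered as plain-text strings.
responses = [
    "The group gave me confidence to speak up at meetings.",
    "I feel more confident and less isolated than before.",
    "Meeting others each week helped with my confidence.",
]

# Common words we don't want to count (a tiny illustrative stop list).
stop_words = {"the", "a", "to", "at", "and", "i", "me", "my", "with",
              "more", "less", "than", "of", "each", "up"}

words = []
for response in responses:
    for word in re.findall(r"[a-z]+", response.lower()):
        if word not in stop_words:
            words.append(word)

# The most repeated words suggest candidate themes to explore.
print(Counter(words).most_common(5))

# Key word in context: show every response containing a term tied
# to one of our outcomes (here 'confiden' covers both word forms).
for response in responses:
    if re.search(r"confiden", response.lower()):
        print("-", response)
```

Even something this simple turns a pile of quotes into a shortlist of themes you can then check against your outcomes.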

While there are other methods of qualitative analysis, the ones described above are simple, clear and not overly time-consuming. If you have more time there are plenty of ways to expand your analysis. Scrutinising your information by comparing it to other data sets will provide more insights, and noting information that doesn’t fit easily into identified themes can provide evidence for unexpected outcomes. Examining in detail why a defined outcome appears unsupported can also yield interesting results… much can be learned from a text by what is not mentioned. All of the above will add value, but your evaluation has to be appropriate to the size and scope of your organisation. Don’t overburden yourself with good intentions! Also, don’t do this on your own! Allowing different people to formulate ideas independently and then come to agreement as a group adds rigour to your findings – a small sketch of that comparison follows below. The “Evaluation Springboard” website has a basic ‘how-to’ guide that includes some tips on making sense of collected information. Finally, remember to keep a copy of your outcomes handy and you won’t go far wrong!
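As a small illustration of that group step, here is a minimal Python sketch comparing the themes two colleagues identified independently, so that agreements are recorded and differences are brought to the group for discussion; the coder names, response identifiers and themes are all hypothetical.

```python
# Themes each colleague assigned to each response, recorded as sets.
coder_a = {
    "response_1": {"confidence", "social contact"},
    "response_2": {"confidence", "reduced isolation"},
}
coder_b = {
    "response_1": {"confidence"},
    "response_2": {"reduced isolation", "transport barriers"},
}

for response_id in coder_a:
    agreed = coder_a[response_id] & coder_b[response_id]      # both saw it
    disputed = coder_a[response_id] ^ coder_b[response_id]    # only one did
    print(response_id)
    print("  agreed:  ", sorted(agreed))
    print("  discuss: ", sorted(disputed))  # settle these as a group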

Tom Scott

Tom is the Training Officer for Evaluation Support Scotland, a charity that works with voluntary organisations and funders so that they can evaluate what they do, learn from that evaluation and deliver better services. You can find advice on analysing and reporting and other supportive resources such as guides and templates on the ESS website.