Evaluation is a journey

The third meeting of Evaluation Exchange was held on 23rd January 2013 at TouchBase, hosted by Sense Scotland in Glasgow. It’s a great, fully accessible space and I’d thoroughly recommend considering it for meetings.

Everyone highlighted positive progress overall, and we were keen to repeat the successful format from the previous meeting, which had capitalised on the knowledge in the room and put peer support at the heart of the process. We quickly split into small groups, with a short period of dedicated time for each person to get support and feedback on their evaluation.

Emerging from these discussions were issues such as: capturing unintended outcomes; tensions between different types of outcomes (e.g. soft/hard, individual/service, service/funder) and data (administrative/statistical/qualitative/stories); the need to be pragmatic; the question of attribution (how do we know it was us that made the difference?); and how to present the results of our evaluations.

We then reviewed the evaluation steps: moving from defining an outcome, to identifying indicators, to deciding how to measure those indicators. You can look at these support guides from ESS to refresh your memory of these steps. Through this lens, we worked through another outcome from one of the Evaluation Exchange members, this time from PLUS Stirling’s Direct Short Break service, which provides short breaks to families and carers using an online booking system, and wants to know that children and families feel in control of the service.

Credible self-evaluation?

During this process we started to ask questions about the credibility of our evaluations – how can we be sure they will be seen as robust, credible and impartial? Particularly if, as self-evaluators, we are undertaking them ourselves rather than getting an outside perspective. We agreed that we could increase the credibility of our work by:
•    Being open and transparent about what we have done in our evaluation
•    Reporting any known shortcomings or gaps in our work
•    Ensuring that all relevant viewpoints and data sources are included
•    Actively encouraging honest responses from the people we support and other stakeholders
•    Consistently highlighting learning points for improvement as well as positive results
•    Having confidence ourselves that we have been as impartial as possible

We agreed to focus on this as a topic in a future session of Evaluation Exchange.

What makes good peer support?

We were interested in exploring different ways of receiving peer support. The group felt that in general they preferred an informal method of support that could be either face-to-face or via the internet.

The group included email in their definition of ‘the internet’ – and they felt that day-to-day they were more likely to email one or two colleagues informally than post on an open forum. They felt that the biggest benefits of peer support came through small personal relationships, which were cultivated through the building up of trust and shared goals. They were also keen on the idea of being able to access support as it was needed, rather than blocking out time for meetings on an on-going basis.

That said, the group also felt that regular meetings and belonging to a formal, closed group had helped them progress their evaluations by keeping evaluation on their priority list, as they knew that they had another meeting coming up where they would be asked how they were getting on.

There was no appetite for an open group, which would lack some of the features that had made a closed group with a static membership successful: continuity, established trust, familiarity with each other’s work, shared understanding and language, and the fact that personal/professional relationships had been built over time. It was acknowledged that over a long period a closed group carries risks, as members drop out and move on and the membership is not renewed.

Seeing the Big Picture – How are we doing?

In order to evaluate the session in a creative way, Tom showed the group a map with a number of features and asked everyone to “choose a vehicle, location or anything else that represents where you are on the evaluation journey”. We used The Big Picture map, produced by SCVO.

Here are some of the responses:

  • Pink car next to the Library and Bank – “I chose that car because I feel like I am gathering everything I need just now like information and support”
  • Hot air balloon – “I am with others, it’s objective, I feel pleasant and supported… and I know there is a direction even though I don’t have the map!”
  • Train at the station – “I am deciding on our direction… and who gets off!”
  • Boat on the river – “I have come by a meandering way but I will get to where I’m going”
  • Hot air balloon – “I’ve got the group, I’ve got the objectives and I’ve got other staff and volunteers… I’m clear on the timescales and the direction – what a doddle!”
  • Double decker bus – “I’m on the journey – not quite sure where I’m going yet but I’m not on the roundabout going in circles”.

Why not look at the map and reflect on where you are on your own evaluation journey, and how you can move forward?


2 thoughts on “Evaluation is a journey”

  1. Nice summary.
    Just thinking in terms of my evaluation of Thorntree Street – what does anyone think of trying to work out where the guys might be if they hadn’t come to TTst? Or does this seem too loaded, do you think?

    1. If you have the time, that thinking process can be incredibly valuable. I was delivering a course on logic modelling yesterday and one of the participants from a youth organisation made a point that, “some of my funders don’t see the value in the progress we make because it’s too small or it doesn’t involve enough participants… but if we weren’t here providing our services then some of these kids would be off the rails”. Evaluating services or activities that are aimed at prevention/early intervention (preventing hospital admissions, diverting from anti-social behaviour, sustaining client participation etc.) is challenging, but it can be worthwhile.

Comments are closed.