{"id":109,"date":"2013-02-01T11:18:18","date_gmt":"2013-02-01T11:18:18","guid":{"rendered":"http:\/\/blogs.iriss.org.uk\/evalexchange\/?p=109"},"modified":"2013-02-01T11:18:18","modified_gmt":"2013-02-01T11:18:18","slug":"evaluation-is-a-journey","status":"publish","type":"post","link":"https:\/\/blogs.iriss.org.uk\/evalexchange\/2013\/02\/01\/evaluation-is-a-journey\/","title":{"rendered":"Evaluation is a journey"},"content":{"rendered":"

The third meeting of Evaluation Exchange was held on 23rd January 2013 at TouchBase<\/a>, hosted by Sense Scotland in Glasgow. It\u2019s a great, fully accessible space and I\u2019d thoroughly recommend considering it for meetings.<\/p>\n

Everyone highlighted positive progress overall, and we were keen to repeat the successful format from our previous meeting, which had capitalised on the knowledge in the room and put peer support at the heart of the process. We quickly split into small groups, with a short period of dedicated time for each person to get support and feedback on their evaluation.<\/p>\n

Emerging from these discussions were issues such as capturing unintended outcomes; tensions between different types of outcomes (e.g. soft\/hard, individual\/service, service\/funder) and data (administrative, statistical, qualitative, stories); the need to be pragmatic; the question of attribution (how do we know it was us that made the difference?); and how to present the results of our evaluations.<\/p>\n

We then reviewed the evaluation steps, moving from defining an outcome, to identifying indicators, to deciding how to measure those indicators. You can look at these support guides from ESS<\/a> to refresh these steps.\u00a0 Through this lens, we worked through another outcome from one of the Evaluation Exchange members, this time from PLUS Stirling\u2019s Direct Short Break<\/a> service, which provides short breaks to families and carers using an online booking system, and wants to know that children and families feel in control of the service.<\/p>\n

Credible self-evaluation?<\/strong><\/p>\n

During this process we started to ask questions about the credibility of our evaluations \u2013 how can we be sure they will be seen as robust, credible and impartial, particularly when, as self-evaluators<\/em>, we undertake them ourselves rather than bringing in an outside perspective? We agreed that we could increase the credibility of our work by:
\n\u2022\u00a0\u00a0 \u00a0Being open and transparent about what we have done in our evaluation
\n\u2022\u00a0\u00a0 \u00a0Reporting any known shortcomings or gaps in our work
\n\u2022\u00a0\u00a0 \u00a0Ensuring that all relevant viewpoints and data sources are included
\n\u2022\u00a0\u00a0 \u00a0Actively encouraging honest responses from the people we support and other stakeholders
\n\u2022\u00a0\u00a0 \u00a0Consistently highlighting learning points for improvement as well as positive results
\n\u2022\u00a0\u00a0 \u00a0Having confidence ourselves that we have been as impartial as possible<\/p>\n

We agreed to focus on this as a topic in a future session of Evaluation Exchange.<\/p>\n

What makes good peer support?<\/strong><\/p>\n

We were interested in exploring different ways of receiving peer support. The group felt that in general they preferred an informal method of support that could be either face-to-face or via the internet.<\/p>\n

\"\"<\/a><\/p>\n

The group included email in their definition of \u2018the internet\u2019 \u2013 and they felt that day-to-day they were more likely to email one or two colleagues informally than post on an open forum. They felt that the biggest benefits of peer support came through small personal relationships, which were cultivated through the building up of trust and shared goals. They were also keen on the idea of being able to access support as it was needed, rather than blocking out time for meetings on an on-going basis.<\/p>\n

That said, the group also felt that regular meetings and belonging to a formal, closed group had helped them progress their evaluations by keeping evaluation on their priority list, as they knew they had another meeting coming up where they would be asked how they were getting on.<\/p>\n

There was no appetite for an open group, which would lack some of the features that made a closed group with a static membership successful: continuity, established trust, familiarity with each other\u2019s work, shared understanding and language, and the fact that personal\/professional relationships had been built over time. It was acknowledged that a closed group carries risks over a long period, as members drop out or move on and the membership is not renewed.<\/p>\n

Seeing the Big Picture – How are we doing?<\/strong><\/p>\n

In order to evaluate the session in a creative way, Tom showed the group a map with a number of features and asked everyone to \u201cchoose a vehicle, location or anything else that represents where you are on the evaluation journey\u201d. We used The Big Picture map, produced by SCVO<\/a>.<\/p>\n

\"\"<\/a>Here are some of the responses:<\/p>\n