On 5th December the Evaluation Exchange group met again for a half-day session, hosted by Capability Scotland (thanks, Sue!). We had a slightly reduced number attending as we were unfortunately beset by winter illnesses, but those who attended felt the day was really productive.
Making progress
Everyone felt that their evaluations had moved on since the first meeting, and that membership of the peer support network had in some respects prompted this. This was partly due to the on-going contact between participants (through the online space Basecamp) and partly due to the motivation of knowing that we’d be reporting back on our progress at the next meeting.
“It’s helpful to have contact in between […] gentle prompts.”
For others, involvement in the project had sparked new ways of thinking about evaluation for themselves and their organisations – particularly those who were now thinking about undertaking user-led evaluations.
“[User-led evaluation is] a revelation, we’ve never done it before […] a big shift, but in a very good way.”
Participation in the Evaluation Exchange had also generated enthusiasm for evaluation within the members’ organisations. One organisation had received great interest in the project following a mention in their staff newsletter, while discussions at a team meeting in another organisation had led to a desire to do peer evaluation internally.
Supporting each other
We were keen to make sure that the expertise of the participants was used to the fullest in the meetings, in order to really embed the peer support element of this project. At this session, this was accomplished in three ways. First, the participants split into small groups where each individual was allocated a set amount of time to present the challenges or issues they were encountering in their project and to receive advice, feedback and ideas from the others in their group. Participants knew to expect this in advance and were encouraged to think about where they most wanted input. This was very successful, and the group was keen to repeat it at all future sessions.
“Really exciting looking at others’ outcomes – you see with clear eyes.”
Second, we ran a structured sharing exercise: the group listed a number of aspects of evaluation they wanted to know more about, then rated their own expertise on each. Those with more expertise in each area were asked to share it with the group, which generated further discussion. As part of this, the group members wrote up on a flip chart all the ‘creative’ evaluation methods they knew of or had used, and we then had time for people to ask about those they wanted to explore further.
Third, during the input on developing soft outcomes (which we had agreed at the end of the previous session would be the theme for session 2), we used a soft outcome that one of the group members wanted to measure as an illustration of how to turn an outcome into measurable indicators. Tom explained the key steps in developing indicators:
- Define your outcome – it should include the elements WHO, WHAT, HOW
- Ask yourself “If this outcome happened what would it look like in real life?”
- Supplementary questions could include: what would we see/know/do/think as a result of this outcome being achieved?
- Write down all the things you come up with, without censorship
- Some of these may need to be broken down further before you can use them as indicators
- Choose around five priority indicators to measure, trying to take a spread of those that:
- Fit best
- Are really important
- Are easy to measure
We also discussed making a list of all the potential stakeholders that you might want to get a view from and ensuring that the key ones are covered by your indicators.
The group was really enthused by this process and, in evaluating the day, said it was one of the most useful elements. So we agreed that over the course of future sessions we would work as a group on one outcome from each participant. At January’s session we plan to take this to the next step: once we have defined indicators, we will move on to the question of how to measure each one.
Benefits of peer support for self-evaluation
A key outcome of the project is to look at how peer support can be valuable in enabling organisations to be self-evaluative. At this early stage, peer support (rather than training-style input or expert advice from IRISS or ESS) does seem to be a driver of the participants’ perceived success of the project. In discussing the usefulness of peer support, the group mentioned that sharing with others and coming together as a group helps in several ways:
- It helps you think things through with others
- Having other people question you (positively) helps you think more clearly or differently about your evaluation issues and challenges
- You can apply other people’s processes to your own evaluation
- Contact in between meetings helps to prompt you to keep thinking about your evaluation and not let it slip off the radar
- Similarly, knowing you have to bring something to discuss with the group keeps the evaluation on your priority list
It’s also important to note, however, that some of the benefits mentioned (including those above) relate to the fact that participating in the group gives people a space to reflect on evaluation and protected time to work on it. This, while unsurprising, provides food for thought when we come to look at how peer support networks can become self-sustaining and what might be needed to allow their creation and existence in the longer term.
I’m also starting to think about other issues related to a successful peer support network for self-evaluation. Do such networks need to be restricted to a certain size to work? Are face-to-face meetings crucial for success and, if so, how important are online (or other) means of keeping in touch between meetings? Do groups of this type, by their nature, have a limited lifespan, given that the evaluations being carried out will eventually be completed? These are some of my first thoughts, and it would be great to hear other reflections on this.
Thanks for providing a really useful summary of where we got to at our last meeting and for generating some thought-provoking ideas we can take forward and develop when we next meet.
This was a really useful session – a good mix of practical application and some great thought-starters to keep things moving. I really liked the creative evaluation tools, which I’m hoping will continue to generate more innovative ideas.