{"id":83,"date":"2012-12-14T16:05:43","date_gmt":"2012-12-14T16:05:43","guid":{"rendered":"http:\/\/blogs.iriss.org.uk\/evalexchange\/?p=83"},"modified":"2012-12-14T16:07:47","modified_gmt":"2012-12-14T16:07:47","slug":"its-beginning-to-look-a-lot-like-peer-support","status":"publish","type":"post","link":"https:\/\/blogs.iriss.org.uk\/evalexchange\/2012\/12\/14\/its-beginning-to-look-a-lot-like-peer-support\/","title":{"rendered":"It’s beginning to look a lot like… peer support"},"content":{"rendered":"


On 5th December the Evaluation Exchange group met again for a half-day session, hosted by Capability Scotland<\/a> (thanks, Sue!). Attendance was slightly reduced as we were unfortunately beset by winter illnesses, but those who came felt the day was really productive.<\/p>\n

Making progress<\/strong><\/p>\n

Everyone felt that their evaluations had moved on since the first meeting, and that membership of the peer support network had in some respects prompted this. This was partly due to the on-going contact between participants (through the online space Basecamp<\/a>) and partly due to the motivation of knowing that we\u2019d be reporting back on our progress at the next meeting.<\/p>\n

\u201cIt’s helpful to have contact in between [\u2026] gentle prompts.\u201d<\/em><\/p>\n

For others, involvement in the project had sparked new ways of thinking about evaluation for themselves and their organisations \u2013 particularly those who were now considering undertaking user-led evaluations.<\/p>\n

\u201c[User-led evaluation is] a revelation, we\u2019ve never done it before [\u2026] a big shift, but in a very good way.\u201d<\/em><\/p>\n

Participation in Evaluation Exchange had also generated enthusiasm for evaluation within members\u2019 organisations. One organisation received great interest in the project following a mention in its staff newsletter, while discussions at a team meeting in another organisation led to a desire to carry out peer evaluation internally.<\/p>\n

Supporting each other<\/strong><\/p>\n

We were keen to make full use of the participants\u2019 expertise in the meetings, in order to really embed the peer support element of this project. At this session, this was accomplished in three ways. First, the participants split into small groups, where each individual was allocated a set amount of time to present the challenges or issues they were encountering in their project and to receive advice, feedback and ideas from the others in their group. Participants knew to expect this in advance and were encouraged to think about where they most wanted input. This was very successful, and the group was keen to repeat it at all future sessions.<\/p>\n

\u201cReally exciting looking at others\u2019 outcomes \u2013 you see with clear eyes.\u201d<\/em><\/p>\n

\"\"<\/a>Second, we had planned a structured sharing exercise, where the group listed a number of aspects of evaluation they wanted to know more about, then rated their own expertise on each of these. Those with more expertise in each area were then asked to share this with the group which generated further discussion. As part of this, the group members wrote up on a flip chart all the \u2018creative\u2019 evaluation methods they knew of or had used and we then had time for people to ask about those they wanted to know more about.<\/p>\n


Third, during the input on developing soft outcomes (a theme for session 2 that we had agreed at the end of the previous session), we used a soft outcome that one of the group members wanted to measure as an illustration of how to turn an outcome into measurable indicators. Tom explained these key steps in developing indicators:<\/p>\n