{"id":83,"date":"2012-12-14T16:05:43","date_gmt":"2012-12-14T16:05:43","guid":{"rendered":"http:\/\/blogs.iriss.org.uk\/evalexchange\/?p=83"},"modified":"2012-12-14T16:07:47","modified_gmt":"2012-12-14T16:07:47","slug":"its-beginning-to-look-a-lot-like-peer-support","status":"publish","type":"post","link":"https:\/\/blogs.iriss.org.uk\/evalexchange\/2012\/12\/14\/its-beginning-to-look-a-lot-like-peer-support\/","title":{"rendered":"It’s beginning to look a lot like… peer support"},"content":{"rendered":"
<\/p>\n
On 5th December the Evaluation Exchange group met again for a half-day session, hosted by Capability Scotland (thanks, Sue!). We had a slightly reduced number attending as we were unfortunately beset by winter illnesses, but those who attended felt the day was really productive.

Making progress

Everyone felt that their evaluations had moved on since the first meeting, and that membership of the peer support network had in some respects prompted this. This was partly due to the ongoing contact between participants (through the online space Basecamp) and partly due to the motivation of knowing that we’d be reporting back on our progress at the next meeting.

“It’s helpful to have contact in between […] gentle prompts.”

For others, involvement in the project had sparked new ways of thinking about evaluation for them and their organisations – particularly those who were now thinking about undertaking user-led evaluations.

“[User-led evaluation is] a revelation, we’ve never done it before […] a big shift, but in a very good way.”

Participation in Evaluation Exchange had also generated enthusiasm for evaluation within the members’ organisations. One organisation had received great interest in the project following a mention in its staff newsletter, while discussions at a team meeting in another organisation had led to a desire to do peer evaluation within that organisation.

Supporting each other

We were keen to make sure that the expertise of the participants was used to the fullest in the meetings, in order to really embed the peer support element of this project. At this session this was accomplished in three ways. First, the participants split into small groups where each individual was allocated a set amount of time to present the challenges or issues they were encountering in their project and to receive advice, feedback and ideas from the others in their group. Participants knew to expect this in advance and were encouraged to think about where they most wanted input. This was very successful and the group was keen to repeat it at all future sessions.

“Really exciting looking at others’ outcomes – you see with clear eyes.”

Second, we had planned a structured sharing exercise, where the group listed a number of aspects of evaluation they wanted to know more about, then rated their own expertise on each of these. Those with more expertise in each area were asked to share it with the group, which generated further discussion. As part of this, the group members wrote up on a flip chart all the ‘creative’ evaluation methods they knew of or had used, and we then had time for people to ask about those they wanted to know more about.

Third, during the input on developing soft outcomes (which we had agreed at the end of the previous session would be the theme for session 2), we used a soft outcome that one of the group members wanted to measure as an illustration of how to turn an outcome into measurable indicators. Tom explained the key steps in developing indicators. We also discussed making a list of all the potential stakeholders that you might want to get a view from, and ensuring that the key ones are covered by your indicators.

The group was really enthused by this process and, in evaluating the day, said it was one of the most useful elements.
So we agreed that over the course of future sessions we would work as a group on one outcome from each participant. At January’s session we plan to take this to the next step and, once we have defined indicators, move on to the question of how to measure each one.

Benefits of peer support for self-evaluation

A key outcome of the project is to look at how peer support can be valuable in enabling organisations to be self-evaluative. At this early stage, peer support (rather than training-style input or expert advice from IRISS or ESS) does seem to be a driver in the participants’ perceived success of the project. In discussing its usefulness, the group mentioned several ways in which sharing with others and coming together as a group helps.

It’s also important to note, however, that some of the benefits mentioned relate to the fact that participating in the group gives people a space to reflect on evaluation and protected time to work on it. This, while unsurprising, provides food for thought when we come to look at how peer support networks can become self-sustaining and what might be needed to allow their creation and existence in the longer term.

I’m also starting to think about other issues related to a successful peer support network for self-evaluation. Do such networks need to be restricted to a certain size to work? Are face-to-face meetings crucial for success and, if so, how important are online (or other) means of keeping in touch between meetings? Do groups of this type, by their nature, have a limited lifespan, given that the evaluations being carried out will eventually be completed? These are some of my first thoughts and it would be great to hear other reflections on this.