
From pants to proactive: supporting self-evaluation in UK helplines
Evaluating the benefits of helplines offers an interesting challenge to even the most
seasoned of evaluators. The combination of the fleeting nature of the contact, strict
confidentiality requirements, and the difficulty of collecting data from callers in acute
emotional and practical distress makes it tricky to measure the difference being made
by advice given at the end of the phone.
Between January and November 2013, the children's charity Coram and Charities
Evaluation Services (CES) worked in partnership to deliver a programme of training,
peer-learning, reflection and mentoring to six UK helplines. This work was funded by
The Dulverton Trust. The programme aimed to support the helplines to access the
outcomes data they needed, both in order to attract funding and to improve the service
they offer. Or, as one participant put it: "We need data that shows us how to improve the
service we offer to young people; anything else, anything less than that, is just pants."

Common problems, common solutions


Each of the six helplines was distinct in terms of size, sector, scope and resources.
Nevertheless, all six were facing very similar problems when it came to measuring their
outcomes.
The first and pivotal barrier was around articulating the outcomes they hoped to achieve
for callers. In some cases, helpline outcomes were tacitly understood by the
organisation, but never written down, and in others, there was no agreement between
funders and service managers about what the outcomes should be. By the end of the
programme, all six helplines had reached agreement internally about the difference they
hoped to make in the short and longer term. This unlocked all sorts of other practical
changes, enabling them not only to build a plan for outcome measurement but also to
open discussions with funders, staff and volunteers about the difference the service
makes.
A second shared area of concern was around data collection. When the peer-learning
programme started, all six helplines were collecting copious amounts of very detailed
information about their services. However, they all agreed that the data was either of the
wrong type, or that there was simply too much of it to be practically useful.
For example, helpline advisers monitored calls on a call sheet, a paper monitoring form
collecting information about who was calling the helpline and what was discussed. One of the
most interesting peer-learning sessions involved a call sheet showdown between the
different organisations, with the winning entries clocking up between 50 and 70
individual tick boxes for call handlers to complete for each interaction! It was time for
some streamlining, and while this ranged from judicious pruning to a more radical
slash-and-burn approach, the way was paved for greater efficiency and better-quality
information.
In addition to more practical problems around missing monitoring and evaluation
frameworks and overpopulated call sheets, the six helplines also shared a less tangible
problem around hearts and minds. How would they get call handlers, many of whom
were volunteers, to help collect the right information from callers? Would it be possible
to bring everyone on board with the idea that changes to evaluation practice were
important and valuable?
There were also concerns about whether or not change would even be possible. Within
the group, there was enthusiasm for gathering outcome data, but also a lurking
suspicion that it simply might not be feasible, given the constraints around time,
resources, and resistance from staff and volunteers. On this score, the combination of
professional and peer support proved a potent one, facilitating the exchange of new
ideas. By the end of the programme all six helplines felt that they had the skills and
confidence to spread enthusiasm for outcomes evaluation within their team, and to
approach data collection in new and creative ways.

Learning points for helplines, funders and facilitators


The programme generated lots of learning points, both for the helplines themselves and
for the CES and Coram team as evaluators and facilitators.

Learning for helplines

Take the initiative and be imaginative

Just one of the helplines had been funded or commissioned to deliver specific
outcomes, but all six were keen to take the initiative to define and measure the
difference they made, in order to attract more sustainable sources of funding and to run
a more effective service. The three helplines that moved on to collect their first
outcomes data tried out a number of different approaches. These included snapshots
(collecting data from every caller over an eight-week timeframe), quotas (asking every
volunteer to collect data from a target number of callers each quarter), online and
phone questionnaires, and text and email prompts. Experimenting with different
approaches allowed them to develop a better sense of what was feasible, proportionate
and appropriate for their setting.

Buy-in from call handlers is absolutely crucial

Almost all of the helplines encountered resistance to a greater or lesser extent from the
call handlers. Asking callers to provide feedback was anathema to some staff and
volunteers, who saw it as contrary to the confidential and anonymous nature of the
helpline service, which expects nothing in return for the support provided to callers. This
resistance caused very practical issues, particularly around asking callers for consent to
follow-up. Successful solutions included: involving call handlers in initial discussions
about defining and measuring outcomes; briefing staff and volunteers fully on the
purpose of evaluation and providing a script for recruiting callers; feeding back to the
team about the data collected, what it showed, and how it would be used to make
improvements.

Changes to practice can take time

The most important factor for implementing change was flexibility, not access to
resources. Of the three helplines that made the most progress, two were very small
teams, and one was in a start-up phase, which allowed greater flexibility to change
systems and approaches. By contrast, the three organisations that made the least
practical progress encountered issues around (respectively) organisational IT, resistance
to an outcomes approach, and the challenges of delivering a helpline as part of a
consortium. This wider context made it slower and harder for them to implement
changes to practice.

Learning for facilitators

A successful community of practice takes practice

There were a number of interesting learning points for us as facilitators. These included
the importance of maintaining pace and momentum when participants were
implementing change at different speeds, as well as the extraordinary added value that
peer-learning brings to capacity building around monitoring and evaluation. We were
also struck by the way in which group accountability can motivate participants to
implement changes to practice.

And finally

Peer support is invaluable

Helpline evaluation is famously difficult, and the types of barriers and issues faced by
the six members of the community of practice will be familiar to people who run,
evaluate and fund this type of work. However, all six helplines had made positive
progress by the end of the programme, indicating that while implementing outcome
evaluation in this type of setting takes dedicated time, effort and outside support,
working together makes overcoming the difficulties and challenges much easier.
