MIDLANDS REGIONAL NETWORK
EVENT SUMMARY
LET'S GET REAL!
AN EVALUATION EXCHANGE ON REALIST EVALUATION
Our first event of 2018 was an evaluation exchange on Realist Evaluation, facilitated by our
Network Convener Karl King (Winning Moves). The session kicked off with an opportunity for
the nine Network members in attendance to discuss why they were attending and what they
hoped to get out of the session. Most of the group had some knowledge and understanding
of the philosophy and principles of Realist Evaluation, but little or no experience of
conducting realist evaluations, so were keen to find out more about how these could be put
into practice.
Karl started by giving a brief overview of realist evaluation, discussing the underlying
philosophy, key principles and types of evaluation question it is intended to answer, before
discussing some of the key insights he had drawn from practical experience of working on
realist evaluations at Winning Moves. This led to a group discussion of how to conceptualise
mechanisms in realist evaluation, ways in which theories could/should be constructed,
presented and tested, and how realist evaluations can be explained to those who may not be
familiar with the philosophy, principles and associated jargon.
Feedback received immediately following the event was positive, with everyone in attendance
saying they had found it interesting and useful. The session closed with an open discussion of
themes and topics for future evaluation exchanges. These were noted by the steering group
for inclusion in future events.
We are aware that some of you with an interest in realist evaluation were unable to attend the
event in March. If so, please get in touch with Karl (karlk@winningmoves.com). If there is
sufficient interest in further events on realist evaluation and putting its philosophy and
principles into practice, we will seek to organise a follow-up event at a later date.
NETWORK UPDATE JUNE 2018
Responses to our joining survey suggest that there are several topical evaluation challenges that people
would like to discuss with other members of the Network. These were many and wide-ranging.
Ahead of our next meeting, we would like those intending to attend to propose specific items or themes
they would like to discuss. These can include the themes raised in the joining survey, or any other topical
evaluation-related challenges you are grappling with at the moment that you would appreciate the
opportunity to discuss with other members of the Network. Please send your suggestions for topics or
themes to Karl King (karlk@winningmoves.com) no later than Friday 6th July 2018, to allow time for the
agenda to be constructed and finalised ahead of the exchange.
If we don’t receive enough suggestions, we will refer to themes raised by respondents to the joining
survey when finalising the agenda.
If we receive more suggestions than we have time to cover, we will organise another evaluation exchange
later in the year to cover them, so please get your suggestions in as soon as you can to avoid
disappointment.
GDPR
All members responding to our joining survey were asked to confirm their contact preferences.
However, we have since issued an email requesting that all Network members opt in to continue to
receive Network updates. If you are reading this and have yet to opt in (or opt out!), please do so at
your earliest convenience.
If you are receiving emails from the Network and no longer wish to do so for any reason, please contact
Karl (karlk@winningmoves.com) or any other member of the Network Steering Group, who will remove
you from our mailing list. Similarly, if you would like to update your email address (e.g. because you
have moved to a different organisation since registering), please get in touch.
EVENT SUMMARY
UKES ANNUAL EVALUATION CONFERENCE 2018
The UK Evaluation Society’s annual evaluation conference provides an opportunity for evaluators,
evaluation commissioners and users of evidence to network, learn about current thinking and discuss
and debate topical issues in evaluation. Those attending the conference this year were treated to an
action-packed programme, with six parallel sessions of presentations stimulating wide-ranging
discussion about the quality of evaluations and the evidence from them. We were also treated to three
excellent keynote presentations, from Professor Ian Boyd, Dr Zenda Ofir and Professor Annette Boaz.

Hamayoon Sultan and Karl King from the UKES MRN steering group share five things they took away
from the conference:

1. Have we reached a ceasefire in the methods war in evaluation? The presentations and discussions
at the conference this year certainly seem to suggest so. There appears to be much greater acceptance
in the evaluation community that there are ‘horses for courses’ and an intrinsic need for mixed-methods
methodologies to deliver high-quality evidence. The quality of evaluations cannot be assessed solely
with reference to the method employed, but rather by the extent to which they meet, in a timely manner,
the learning needs of the commissioners.

2. Are there ‘evaluation ideas that must die’ to make more and faster progress in tackling big societal,
economic and environmental challenges? Zenda Ofir’s keynote presentation challenged delegates to
think more critically about how we do evaluation, warning of evaluation ideas that could be holding us
back from enabling and accelerating positive transformative change. Inspired by John Brockman’s book,
This Idea Must Die, Zenda suggested a number of ‘evaluation ideas that must die’ – e.g. the idea that
dealing with complexity is simply a case of experimentation and adaptive management, and the idea
that conventional theories about development are appropriate or superior to alternative ways of
thinking. It was argued that developing economic prosperity does not need strong institutions, but that
institutions and markets co-evolve with changing contexts. Zenda also highlighted a number of ideas
that are already dying as the need to better deal with complexity in evaluation is acknowledged.

3. Evaluation should be designed to contribute to the wisdom of society, not just to meet the immediate
needs and interests of stakeholders. Points made by Annette Boaz, Zenda Ofir and other speakers
highlighted a need to increase the contribution evaluation evidence makes to the wisdom of society.
Evaluators, it was argued, must be voices of conscience. Designing evaluations to meet stakeholders’
immediate needs and interests can often encourage linear models of thinking; but understanding how
to bring about transformative change will more often require systems thinking, i.e. for evaluators to
engage with the system as a whole (Best and Holmes, 2010) and immerse themselves in the bigger
picture. This is about shifting ways of thinking, but also ways of doing evaluation. For example, relatively
short, pre-scripted interviews can fail to achieve the contextual understanding that immersion in
people’s lives (e.g. through ethnography) can provide. As Zenda put it in her presentation when referring
to work in international development: it’s not about your project, it’s about their country.

4. Adaptive Management vs Flexible Management. A hot topic at the conference, Adaptive Management
is seen as a key tool in managing complexity. However, it is important to distinguish truly adaptive
management from merely flexible management. Building in ‘interim review’ points creates a degree of
flexibility, but genuinely adaptive management requires those delivering the project or programme to
be empowered, and sufficiently free from constraints, to adapt it as the need is identified. Genuinely
adaptive management also allows evaluators to look beyond progress against plans and to explore the
various social, political, economic and technological factors at play in any given intervention.
Importantly, there is also a need to embed evaluative thinking and learning within programme
management to achieve adaptive management in practice. Genuinely adaptive management also
requires evaluators to develop a suite of tailored real-time products and processes, to present learning
in a timely and accessible manner.

5. Independent evaluator vs learning partner. Several presentations talked about different ways of
characterising the role of the evaluator, including Rob Lloyd (Itad), who described his journey from
independent evaluator to learning partner. Clear from the presentations of several speakers was the
need to get involved as early as possible, understand stakeholders’ needs, establish trust and deliver
useful evidence. Dr Dee Jupp (Palladium) also pointed out that, at all levels, there is cognitive overload,
time pressure and risk avoidance. In such conditions, evaluation methods must be appropriate and
help people to learn. To achieve change, evaluations must appeal to people’s minds AND hearts. We
know this, of course, but it is easy to forget or neglect when designing or reporting on evaluations.

The conference also incorporated the UKES Annual General Meeting, which included discussion of
progress in the development of a new value proposition to UKES members and to society. We’ll keep
you in the loop as associated actions to enhance UKES’s offer are progressed.

The above, however, is but a snapshot of the valuable learning on offer at the conference, and we highly
encourage people to attend next year!
References:
Best, A. & Holmes, B. (2010) Systems thinking, knowledge and action: towards better models and methods, Evidence & Policy,
6(2), pp.145-59 [Available at: http://www.stmichaelshospital.com/pdf/crich/hef-best-holmes.pdf]
JOB OPPORTUNITY
RESEARCH AND EVALUATION CONSULTANT: BIRMINGHAM
WINNING MOVES
(£32,000–£40,000 PER ANNUM, DEPENDENT ON EXPERIENCE, PLUS BENEFITS)
Our Consultant will confidently design and deliver:
- Insightful research – using a wide range of methodologies to answer research questions, including
establishing awareness, exploring markets and behaviours, and helping clients to understand their
customers’ experience and satisfaction;
- Robust evaluation – deploying incisive and robust solutions to help our clients make informed,
evidence-based decisions by exploring what works, for whom and under what circumstances, and the
difference they have made.
Conference Presentation
Dr Tracey Wond, steering group member and Head of Research (Business, Law and
Social Sciences, University of Derby), summarises her presentation at the
International Research Society for Public Management (IRSPM) conference in April 2018.
The presentation highlighted the disconnect between the evaluation use literature and public policy
discourse on evidence use. Western evaluation theorists have long highlighted typologies and examples
of mis-use, mis-evaluation and non-use (Alkin, 1990; Christie and Alkin, 1999; Patton, 2005). Yet there
is little to no evidence to suggest that such theoretical and conceptual developments have been adopted
by public policy theorists. The presentation explored this disconnect further, as well as primary data
characterising the mis-use and non-use of evaluation evidence.