- Clarify what is to be evaluated!
- Tailor the approach!
- Position evaluation as a learning function!
These are three keys to better evaluation provided by Steve Montague, Fellow of the Canadian Evaluation Society, who came to Brussels to deliver a number of lively and thought-provoking seminars early this summer.
In this week’s Voices & Views, Bridget Dillon, an Evaluation Manager at DEVCO, writes a guest piece highlighting Steve Montague’s three keys to better evaluation. Montague gave a presentation on this topic at the External Cooperation Infopoint, which can be seen in the video below.*
Click here to download the PowerPoint presentation.
*For best quality, click on the settings button and change the quality to 1080p HD.
Clarify what is to be evaluated
It is difficult to evaluate when you do not know what you are evaluating, why and for whom.
Does this seem obvious?
If you are an evaluator, this point will certainly resonate. So often the Terms of Reference do not make clear exactly what is to be measured. Then, when you, as the evaluator, ask what is wanted, you find that a number of people – whether those who commissioned the evaluation or members of the reference group for the evaluation process – hold quite different views of what is required.
The message here is – those commissioning evaluations need to work with all stakeholders to clarify and set out what is to be evaluated before the Terms of Reference are finalised. Evaluators, in turn, should start their work by checking thoroughly with the commissioners exactly what they are to evaluate.
Tailor the approach
‘I read the book and it said do it this way, so I did, but it has not worked out.’ Heard this before? There is a logic to it. However, there is no blueprint. It makes better sense to apply your thinking and learning to the specific situation you are confronting: you need to tailor your approach for every evaluation you conduct.
When you know what to evaluate, for what purpose, and for whom, you can work out what data and information you need to obtain. This may mean, for example, that you need an approach which relies less on aggregates and averages, and is more oriented towards relevance – addressing what works, to what extent, for whom, in what conditions, and why.
And when presenting a results logic framework, we should avoid an overly linear approach with one-way, unexplained boxes and diagrams that lack context. “We’ve got to stop that, we’re not doing these for ourselves,” said Steve Montague. “We’re doing this to communicate and to have people communicate with us about what it is that works and with whom and why. So let’s again find the models, find the way of depicting reality that works.”
What we need, believes Montague, are more ‘situated,’ well-described, system-oriented models which show how the implementation was undertaken (theory of implementation) as well as the change processes and behaviours of key actors (theories of change).
The message here is - you need to think through what you need and what is reasonable to expect from any given evaluation before you can develop a rigorous approach and methods.
At organisational level, position evaluation as a learning function
Building on current knowledge and contributing to further knowledge is a core aspect of evaluation – it is not an ‘add-on,’ nor an ‘if you like.’ It is therefore important that organisations establish evaluation as a learning function – finding out, analysing, synthesising, and bringing evidence to bear on policy and practice. Such an emphasis on learning does not exclude an accountability function – far from it – but if evaluation is not understood as a learning function, it loses much of its value.
This has implications for everyone involved. Evaluators are at their best when they act as facilitators, educators, or ‘critical friends.’ Evaluation is not a passive exercise; it is more akin to a team sport: it involves close working with stakeholders and beneficiaries, and everyone learns as part of the overall process. Evaluation managers need to recognise and reflect this in the way they co-ordinate the process, by ensuring discussion and exchange of views at key points along the way.
Food for thought.
2015 is the International Year of Evaluation. Visit the Public Group on Design, Monitoring and Evaluation for more information, including links to recent DEVCO evaluation reports. You can also find out more about the EC and EEAS development policy and uptake of evaluations in the articles: Evaluation Matters and Here is the Evaluation Report... so now what do we do?
This collaborative piece was drafted by Bridget Dillon with support from the capacity4dev.eu Coordination Team. Teaser image copyright of mmarchin on Flickr.