Evaluation methodological approach


Synthesis phase

This section describes each step of the synthesis phase according to the respective roles of:

  • The evaluation manager
  • The external evaluation team
Expressing findings

The evaluation team formalises its findings on the basis of the following elements:

  • Analysis undertaken in the first phase (desk)
  • Analysis undertaken in the field phase

Findings only follow from facts, data and analysis. Unlike conclusions, they do not entail value judgments. 

Findings include cause-and-effect statements related to the contribution of the support to observed changes, or the attribution of part of the observed changes to the intervention under evaluation.

Confirming findings

For each question, the evaluation team submits its provisional findings to criticism in order to confirm them. A finding is considered sound if it withstands scrutiny such as:

  • Validity tests for statistical analysis.
  • Cross-checking with other sources of information.
  • Search for biases in the surveys.
  • Search for external factors likely to explain the detected changes even in the absence of intervention.
  • Cross-checking with findings obtained from similar research and evaluations (according to the experts involved).
  • Critical comments received from the Delegation(s) or the reference group members when preliminary findings are discussed.

When a finding entails a cause-and-effect statement, the evaluation team specifies whether it may be generalised or transferred to other contexts.

Judgment and conclusions

For each question, the evaluation team formalises its responses by way of conclusions, on the basis of the following elements:

  • Evidence and findings.
  • Judgment criteria (also called reasoned assessment criteria) adopted in the first phase (desk).
  • Judgment criteria actually applied and justification for the discrepancies, if any.
  • Targets.

Among its conclusions, the evaluation team identifies transferable lessons, in other words, conclusions based on generalisable and transferable findings. 

In addition to answering each question, the evaluation team seeks to articulate all the findings and conclusions in a way that allows for an overall assessment of the intervention.

Version 1 of the report

The evaluation team writes the first version of its report. This document must have the same format and contents as the final version, with the exception of the recommendations, which may be just sketched. The report consists of four parts:

  • Summary, including in particular the main findings, conclusions and recommendations.
  • Introduction presenting the assessed intervention, its logic and context, and the purpose of the evaluation.
  • Presentation of the evaluation method.
  • Detailed findings, conclusions and recommendations.

The report is limited in size (maximum 60 pages) so that it is easy to read. Details are appended in annexes. 

The evaluation team leader checks that the report meets the quality criteria. He/she ensures that the report undergoes thorough quality control by an expert who is not part of the evaluation team. The report is then handed over to the evaluation manager.

Presenting the report

The presentation addresses the following points:

  • Answers to questions.
  • Potential methodological limitations and judgment criteria actually applied.
  • Overall assessment of the intervention.
  • Outline of recommendations.

The evaluation team leader receives three types of comments:

  • Oral comments at the meeting.
  • Written comments after the meeting.
  • Comments from the evaluation manager on the methodological quality of the report.
Version 2 of the report

The evaluation team takes the comments received into account, yet without compromising the independence of its value judgments. The process is as follows:

  • Comments dealing with methodological quality must be taken into account wherever possible. Where they are not, the evaluation team explains its reasons.
  • Comments dealing with the substance of the document, its findings and conclusions may be taken into account or rejected by the evaluation team. The team records the rejected comments, together with the reasons for rejecting them, in a note or annex.

The evaluation team finalises its recommendations, which are clustered and prioritised. As far as possible, alternative options are proposed, including their respective benefits and risks. 

The new version of the report is handed over to the evaluation manager.

Discussion seminar

In the case of a geographic evaluation, a discussion seminar is organised at this point. Its purpose goes beyond mere dissemination: it aims to discuss the substance of the conclusions and recommendations.

  • The evaluation team updates the report's presentation slides prepared for the reference group.
  • The team leader participates in the seminar, possibly together with another team member. He/she presents the report.
  • He/she takes note of the comments received, which are the last opportunity to check factual data, confirm analyses and explicitly justify value judgments.
  • He/she also takes into consideration written comments received after the seminar.

The slides shown are handed over to the manager in PowerPoint format.

A meeting with the other donors may be organised on site immediately after the seminar.

Specific situation

Country- or regional-level evaluation

Barring exceptions, the evaluation comprises a discussion seminar in the partner country or region, with a view to discussing the final report, the substance of the conclusions and the utility of the recommendations in the presence of the evaluation team. 


At least fifty people are invited, including delegation staff, national authorities, civil society, project implementation managers, representatives of the Member States and experts.

  • At the request of the evaluation manager, the Head of Delegation sets the date and place of the seminar.
  • The delegation draws up the list of organisations and institutions to be invited.
  • The participants are invited by the Head of Delegation and receive the latest version of the report in advance.
  • At the seminar, the manager presents the evaluation process, including its dissemination and intended use.
  • He/she participates actively in the seminar.
  • Through the delegation, he/she collects the written comments received after the seminar.
  • He/she writes a mission report and notes down the quality improvement requests made by the participants.
Quality control of the draft report
  • The final report is drawn up by the external evaluation team and submitted as a first draft.
  • The manager checks that the document has the same format and contents as the final version; the only exception being the recommendations, which may be just sketched. He/she checks, in particular, that the report includes the findings and conclusions corresponding to the evaluation questions, as well as a synthesis containing an overall assessment of the intervention.
  • The evaluation manager makes a thorough quality assessment and assigns a score for each of the nine criteria on the quality assessment grid. He/she verifies that data collection and analyses have been rigorously carried out, and that findings, conclusions and recommendations are linked appropriately.
  • The evaluation manager's quality assessment is double-checked by a second person.
Discussion meeting
  • The manager sends the report to the reference group members and convenes a meeting with the participation of the evaluation team.
  • He/she asks the evaluation team to present its report using slides as visual support.
  • The manager chairs the meeting with the following objectives:
    • To discuss the answers to the questions, the overall assessment, and the substance of findings and conclusions.
    • To discuss the factual basis of findings, the validity of analyses, and the judgement criteria actually applied.
    • To discuss the utility of recommendations.
    • To check the readability and clarity of the report.
  • He/she writes the minutes of the meeting, and attaches the quality assessment grid as well as his/her requests for quality improvement.
  • He/she receives the new version of the report drawn up by the evaluation team. This version finalises the recommendations which must be:
    • Linked to the conclusions.
    • Clustered, prioritised and targeted at specific addressees.
    • Useful and operational.
    • If possible, presented as options associated with benefits and risks.
  • The manager verifies that:
    • The reference group's comments on the substance of the conclusions and recommendations have either been taken into account or mentioned with an explanation of why they were not.
    • Requests for methodological quality improvement have been satisfied.
  • He/she approves the draft report and authorises moving on to the next step.
Finalising the report

Following the seminar, the evaluation team drafts the last version of its report and submits it to the expert in charge of quality control before it is handed over to the manager. 

At this point, the evaluation team finalises the annexes and decides to present them in one of the following formats:

  • Printed annexes following the report.
  • Annexes on CD-ROM.

The report is printed according to the instructions provided in the terms of reference.

  • The manager receives the new version of the report submitted by the evaluation team. He/she checks that the last comments have been taken into account and that the annexes are complete.
  • He/she runs a second full quality assessment with the help of the quality assessment grid, giving a new overall score and a qualitative comment on each of the criteria. Again, the assessment is double-checked by a second person.
  • The evaluation team leader receives a new quality assessment from the manager. If necessary, he/she writes a note setting forth the reasons why certain requests for quality improvement have not been acted upon. This response will remain attached to both the quality assessment and the report.
  • The manager sends the final version of the report and the quality assessment to the reference group members.
Checklists
 
Content of the final report

Executive summary

(maximum length: 3 pages, with most sections devoted to findings, conclusions and recommendations)

  • Purpose of the evaluation
  • Method
  • Analysis and main findings
  • Main conclusions
  • Main recommendations

Introduction

  • Objectives, how they are prioritised and ordered, logical links between activities / instruments and expected impacts, connected policies, conditions and risks
  • Brief analysis of the political, economic, social and cultural context of the intervention
  • Purpose of the evaluation: presentation of the evaluation questions and of how they allow the intervention to be assessed as a whole

Methods

  • Judgement criteria related to each question, as agreed upon by the reference group
  • Indicators related to each criterion
  • Data collection process actually implemented and limitations if any
  • Analysis approach actually implemented and limitations if any
  • Judgement approach actually implemented, changes in comparison with first phase report (desk) and limitations, if any

Main findings, conclusions and recommendations

Three distinct chapters:

  • Answers to each evaluation question, indicating findings and judgement criteria supporting them
  • Overall assessment of the intervention covering the main families of evaluation criteria
  • Recommendations which have to be clustered and prioritised, preferably in the form of options with benefits and risks

Annexes (indicative)

  • Diagram(s) displaying the intervention logic
  • Methodology
  • Sheets relating questions, judgement criteria and indicators
  • Detailed chains of arguments for a given question: facts, interpretation and analysis, findings, judgement criteria and benchmarks, conclusions in response to the question
  • Overview of EC intervention
  • Informants met
  • Documents used
  • Terms of Reference
  • Statistical data and context indicators
  • List of the projects and programmes specifically assessed
  • Project assessment fiches, case study monographs
  • Questionnaires and survey reports
  • Acronyms and abbreviations
  • etc.
Quality control in the evaluation reports
Final discussion meeting

Agenda

This is usually the fourth and last reference group meeting.

  • Reminder of the rules regarding the quality assessment of the final report.
  • Presentation of the first version of the report by the evaluation team:
    • Answers to the questions and overall assessment
    • Validity and/or limitations of analyses, findings and conclusions
  • Participants' comments on the substance of findings and conclusions, and on their factual basis.
  • Presentation of the recommendations and transferable lessons by the evaluation team.
  • Discussion on the utility and feasibility of the recommendations.