Desk phase - Inception stage (1a)


This section is structured as follows:

Each step of the desk phase is described according to the respective role of:

  • The evaluation manager
  • The external evaluation team

The inception stage starts as soon as the evaluation team is engaged, and its duration is limited to a few weeks.
[Evaluation team] Collecting basic documents

One of the team members collects a set of basic official documents, such as:

  • Programming documents (e.g. the project fiche) and any subsequent modifications.
  • Ex ante evaluation.
  • EC documents setting the policy framework in which the project/programme takes place (EC development and external relations policy, EU foreign policy, country strategy paper).
  • Government strategy (e.g. the Poverty Reduction Strategy Paper, PRSP).
[Evaluation team] Logic of the project/programme

The evaluation team reviews the logical framework as set up at the beginning of the project/programme cycle. In the absence of such a document, the project/programme manager has to construct one retrospectively. Where necessary, the evaluation team identifies the points which need clarification and/or updating. Any clarification, updating or reconstruction is reported transparently.

The analysis of the project/programme logic covers:

  • The context in which the project/programme was launched, including opportunities and constraints.
  • Needs to be met, problems to be solved and challenges to be addressed.
  • Justification of why the needs, problems or challenges could not be addressed more effectively within another framework.
  • Objectives.
  • Nature of inputs and activities.

Of particular importance are the various levels of objectives and their translation into various levels of intended effects:

  • Operational objectives expressed in terms of short-term results for direct beneficiaries and/or outputs (tangible products or services).
  • Specific objective (project purpose) expressed in terms of sustainable benefit for the target group.
  • Overall objectives expressed in terms of wider effects.
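As a concrete illustration, a reconstructed intervention logic can be recorded as a simple structure linking each level of objective to its intended effects. The Python sketch below is a minimal illustration only; the class, the field names and the water-supply example are assumptions, not an official EC format.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectiveLevel:
    """One level of the intervention logic (illustrative, not an official format)."""
    level: str                      # "operational", "specific" or "overall"
    statement: str                  # objective as worded in the logframe
    intended_effects: list[str] = field(default_factory=list)

# Hypothetical rural water-supply example
logframe = [
    ObjectiveLevel("operational",
                   "Drill and equip 50 boreholes",
                   ["Outputs and short-term results for direct beneficiaries"]),
    ObjectiveLevel("specific",
                   "Sustainable access to safe water in the target villages",
                   ["Sustainable benefit for the target group"]),
    ObjectiveLevel("overall",
                   "Improved health status in the region",
                   ["Wider effects, e.g. lower incidence of water-borne disease"]),
]
```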

Once the analysis has been performed on the basis of official documents, the evaluation team starts interacting with key informants in the project/programme management and EC services. Comments on the project/programme logic are collected.

[Evaluation team] Delineating the scope

The scope of the evaluation includes all resources mobilised and activities implemented in the framework of the project/programme (central scope). 

In addition, the evaluation team delineates a larger perimeter (extended scope) including the main related actions like:

  • Other EC policies, programmes or projects, plus EU policies.
  • Partner country's strategy (e.g. PRSP), or sector policy or programme.
  • Other donors' interventions.

An action is included in the extended scope insofar as it reaches the same groups as the evaluated project/programme does.
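This inclusion rule amounts to a simple overlap test on target groups. The sketch below (Python; the function name and group labels are illustrative assumptions, not part of any EC tool) makes the test explicit:

```python
def in_extended_scope(action_groups: set[str], project_groups: set[str]) -> bool:
    """Include an action insofar as it reaches at least one of the same
    groups as the evaluated project/programme."""
    return bool(action_groups & project_groups)

# Hypothetical example
project_groups = {"smallholder farmers", "rural cooperatives"}
print(in_extended_scope({"rural cooperatives", "traders"}, project_groups))  # True
print(in_extended_scope({"urban youth"}, project_groups))                    # False
```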

[Evaluation team] Management documents

The evaluation team consults all relevant management and monitoring documents/databases so as to acquire a comprehensive knowledge of the project/programme, covering:

  • Full identification of the project/programme.
  • Resources planned, committed, used.
  • Progress of outputs.
  • Names and addresses of potential informants.
  • Ratings attributed through the "result-oriented monitoring" system.
  • Availability of progress reports and evaluation reports.
[Evaluation team] Evaluation questions

The evaluation team establishes the list of questions on the following bases:

  • Themes to be studied, as stated in the ToR.
  • Logical framework.
  • Reasoned coverage of the seven evaluation criteria.
[Evaluation team] Evaluation criteria

The following evaluation criteria correspond to the traditional practice of evaluating development aid, as formalised by the OECD-DAC (the first five criteria), and to specific EC requirements (the last two).

Relevance:

  • The extent to which the objectives of a development intervention are consistent with beneficiaries' requirements, country needs, global priorities and partners' and donors' policies.

Effectiveness:

  • The extent to which the development intervention's objectives were achieved, or are expected to be achieved, taking into account their relative importance.

Efficiency:

  • A measure of how economically resources/inputs (funds, expertise, time, etc.) are converted to results.

Sustainability:

  • The continuation of benefits from a development intervention after major development assistance has been completed. The probability of continued long-term benefits. The resilience to risk of the net benefit flows over time.

Impact:

  • Positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended.

Coherence/complementarity:

  • This criterion may have several dimensions:
    1) Coherence within the Commission's development programme
    2) Coherence/complementarity with the partner country's policies and with other donors' interventions
    3) Coherence/complementarity with the other Community policies

Community value added:

  • The extent to which the project/programme adds benefits to what would have resulted from Member States' interventions in the same context.

Each question is commented on in line with the following points:

  • Origin of the question and potential utility of the answer.
  • Clarification of the terms used.
  • Indicative methodological design (updated), foreseeable difficulties and feasibility problems, if any.

This site presents a list of typical questions associated with evaluation criteria. 

[Evaluation manager and evaluation team] Inception meeting

Within a few weeks after the start of the work, and after a review of basic documents complemented by a few interviews, the evaluation team defines its overall approach.
This approach is presented in a meeting with the evaluation manager and the reference group members. Subjects to be discussed include:

  • Logical framework.
  • Evaluation questions, either from the ToR or proposed by the evaluation team.
  • Provisional methodological design.
  • Access to informants and to documents, and foreseeable difficulties.

The presentation is supported by a series of slides and by a commented list of evaluation questions. Where relevant, the meeting may be supplemented by an email consultation.

Specific guidance in the case of:

  • Multi-country programme

In the case of a multi-country programme, the evaluation builds upon a number of country case studies which should be selected as soon as possible and preferably before the end of the inception stage. 

  • Participatory evaluation

The evaluation team extends its initial interviews in order to understand the expectations of beneficiaries and other outside stakeholders.
A stakeholder analysis is performed and discussed in the inception meeting.

Inception report

[Evaluation team]

The evaluation team prepares an inception report which recalls and formalises all the steps already taken, including an updated list of questions in line with the comments received. 

Each question is further developed into:

  • Indicators to be used for answering the question, and corresponding sources of information.
  • Analysis strategy.
  • Sub-questions.
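A question developed to this level of detail can be captured as a simple record mirroring the three points above. The Python sketch below is illustrative only; the class and field names, and the example question, are assumptions rather than a prescribed EC format:

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationQuestion:
    """One evaluation question as developed in the inception report
    (field names are illustrative, not a prescribed EC format)."""
    text: str
    criterion: str                                        # e.g. "effectiveness"
    indicators: list[str] = field(default_factory=list)   # with their sources
    analysis_strategy: str = ""                           # e.g. "change analysis"
    sub_questions: list[str] = field(default_factory=list)

question = EvaluationQuestion(
    text="To what extent has the programme improved access to safe water?",
    criterion="effectiveness",
    indicators=["% of households within 500 m of a water point (monitoring data)"],
    analysis_strategy="change analysis",
    sub_questions=["Did access improve equally for women-headed households?"],
)
```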

Indicators

The logical framework preferably includes "Objectively Verifiable Indicators" (OVIs) and "Sources of Information", which are useful for structuring the evaluators' work. Insofar as OVIs have been properly monitored, including baseline data, they become a major part of the factual basis of the evaluation.
Indicators may also be available through a performance assessment framework, if the project/programme is linked with such a framework.
Indicators may also be developed in the framework of the evaluation as part of a questionnaire survey, an analysis of a management database, or an analysis of statistical series. 
Indicators may be quantitative or qualitative.
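For instance, an OVI monitored against a baseline might be recorded as follows. This is a minimal sketch under assumed field names, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """A monitored OVI with its source (illustrative field names)."""
    name: str
    source: str                     # e.g. "monitoring database", "survey"
    quantitative: bool = True
    baseline: float | None = None   # value before the intervention
    latest: float | None = None     # most recently observed value
    target: float | None = None     # target set in the logframe, if any

ovi = Indicator(
    name="Primary school enrolment rate (%)",
    source="Ministry of Education statistics",
    baseline=62.0, latest=74.5, target=80.0,
)
```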

Analysis strategy

Indicators and other types of data need to be analysed in order to answer evaluation questions. 

Four strategies of analysis can be considered:

  • Change analysis, which compares indicators over time and/or against targets.
  • Meta-analysis, which extrapolates from the findings of other evaluations and studies, after having carefully checked their validity and transferability.
  • Attribution analysis, which compares the observed changes with a "without intervention" scenario, also called the counterfactual.
  • Contribution analysis, which confirms or invalidates cause-and-effect assumptions on the basis of a chain of reasoning.

The first strategy is the lightest one and fits virtually all types of question. The last three strategies are better suited to answering cause-and-effect questions.
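As an illustration of the first strategy, change analysis boils down to comparing an indicator's observed value with its baseline and, where one is set, with its target. The Python sketch below is a hedged illustration; the function name and the enrolment example are assumptions, not an official procedure:

```python
def change_analysis(baseline: float, latest: float,
                    target: float | None = None) -> dict[str, float]:
    """Compare an indicator over time and, where a target exists, against it.
    Assumes a non-zero baseline and a target distinct from the baseline."""
    result = {
        "absolute_change": latest - baseline,
        "relative_change_pct": 100.0 * (latest - baseline) / baseline,
    }
    if target is not None:
        # Share of the baseline-to-target distance actually covered
        result["target_achievement_pct"] = (
            100.0 * (latest - baseline) / (target - baseline)
        )
    return result

# Hypothetical enrolment-rate example: baseline 62%, observed 74.5%, target 80%
print(change_analysis(62.0, 74.5, 80.0))
# absolute_change 12.5, relative_change_pct about 20.2, target_achievement_pct about 69.4
```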

Indicators, sources of information and sub-questions remain provisional at this stage of the process. However, the inception report includes a detailed work plan for the next step. The report needs to be formally approved in order to move to the next step.

[Evaluation manager]

The evaluation manager receives an inception report which finalises the questions and describes the main lines of the methodological design, including the indicators to be used, the analysis strategy, and a detailed work plan for the next step.
The report is formally approved by an official letter authorising the continuation of the work. If the set of evaluation questions is drawn up at this stage, it becomes part of the ToR.
 
 
Checklists
See for inspiration the checklists for geographic, thematic and other complex evaluations.