Evaluation Handbook

IV. C. Inception Report

 

This Section is addressed to:
- Evaluation teams developing the evaluation methodology for In-depth Evaluations;
- Project Managers judging the quality of the evaluation methodology for Independent Project Evaluations.
An Inception Report summarizes the review of documentation ("desk review") undertaken by an evaluator mandated by UNODC and specifies the evaluation methodology, thereby determining the exact focus and scope of the exercise, including the evaluation questions, the sampling strategy and the data collection instruments. The evaluator is therefore expected to deliver an Inception Report as one of the key deliverables; it is shared with the Project Manager and the Independent Evaluation Unit (IEU) for comments.

The Inception Report provides an opportunity to elaborate on the evaluation methodology proposed in the ToR and its related issues at an early stage of the evaluation exercise. It also ensures that evaluation stakeholders have a common understanding of how the evaluation will be conducted.

The evaluation team develops an Inception Report which contains the methodology used to answer the evaluation questions based on information derived from the ToR, the desk review and the evaluation team briefing.

The UNODC Inception Report Template, presented in paragraph 1 below, sets out the minimum requirements for the development of an evaluation methodology. The evaluation team may therefore go beyond the Inception Report Template depending on the needs of the evaluation (paragraph 2 below).

1. The Inception Report Template

The evaluation team is responsible for the development of the Inception Report before departing on field missions. For In-depth Evaluations, Project Managers provide feedback on Inception Reports and IEU clears them.

For Independent Project Evaluations, Project Managers check the quality of the Inception Report, provide extensive feedback and guidance to the evaluation team and finalize it. IEU clears the Inception Report.

Inception Report Table of Contents

As minimum requirements, the Inception Report should include:

I. Preliminary Findings of the Desk Review;
II. Evaluation Questions;
III. Data Collection Instruments (to answer evaluation questions);
IV. Sampling Strategy;
V. Limitations to the Evaluation;
VI. Timetable.

The template for the Inception Report can be found in the Chapter IV Tools.

Specific components of the Inception Report are elaborated below.

a) Data Collection Instruments

The data collection instruments to be used depend on:

- what needs to be known, i.e. the evaluation questions;
- where the data is located;
- the resources and time available to collect the data;
- the complexity of the data to be collected;
- the frequency of the data collection.

The evaluation team should keep in mind that there must be sufficient data collected to address all evaluation questions.

It is recommended to:

- Use multiple data collection methods (triangulation);
- Use monitoring data;
- Use available data when possible (secondary data);
- Ensure that the data is relevant, credible and reliable.

Please refer to Chapter III, Section C, paragraph on evaluation methodology for further information on data collection methods.

Data collection instruments include, but are not limited to, the following:

- desk reviews;
- questionnaires;
- surveys;
- interviews;
- focus groups;
- workshops;
- field visits;
- observations;
- case studies.

TIP

- It is recommended to pilot the data collection instruments, i.e. test them and correct them accordingly.

TIP for Interviews

- The quality of the information obtained during an interview depends largely on the interviewer's skills;

- Cultural sensitivity is a must for the interviewer, who must pay attention to the respondent's reactions during the interview to avoid causing any offense.

TIP for Joint Interviews

- Joint interviews should be avoided as much as possible to ensure confidentiality of the evaluation process.

TIP for Surveys & Questionnaires

- The questions should focus on the purpose of the evaluation;

- The questions should be short, simple and straightforward (particularly important if translation into different languages is necessary);

- It is recommended to have a mixture of open-ended and closed-ended questions;

- Demographic questions can be used to correlate responses among different sub-groups, as illustrated in the sketch below.
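
As an illustration of the last point, the minimal sketch below (Python standard library only; the question names, response values and sub-groups are invented for illustration) cross-tabulates closed-ended responses by a demographic sub-group:

```python
from collections import Counter

# Hypothetical closed-ended survey responses; "region" is a demographic
# question and "q1_useful" an invented closed-ended question.
responses = [
    {"region": "North", "q1_useful": "yes"},
    {"region": "North", "q1_useful": "no"},
    {"region": "South", "q1_useful": "yes"},
    {"region": "South", "q1_useful": "yes"},
]

# Cross-tabulate answers by region to compare responses across sub-groups.
crosstab = Counter((r["region"], r["q1_useful"]) for r in responses)
for (region, answer), count in sorted(crosstab.items()):
    print(f"{region} | {answer} | {count}")
```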

b) Sampling Strategy

The evaluation team develops the sampling techniques that will be applied to the different data collection instruments. The evaluation team can use a combination of sampling techniques.

Sampling techniques include but are not limited to:
- Random sampling: simple random sampling, random interval sampling, random-start and fixed-interval sampling, stratified random sampling, random cluster sampling, multistage random sampling;
- Non-random sampling: purposeful sampling, snowball sampling, convenience sampling.

The evaluation team should identify the whole population (or stakeholder groups) for each data collection instrument (e.g. the target group of the questionnaire, focus group or interview) and determine a sample that (i) reflects the whole population as closely as possible, and (ii) is of significant size (the larger the sample, the lower the variability). The team should critically discuss whether the chosen sample size is statistically relevant and what sampling errors might occur.
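
To make this concrete, here is a minimal sketch (Python standard library only; the stakeholder group names and sizes are hypothetical, and the Handbook does not prescribe a particular formula) that draws a stratified random sample and applies the common sample-size formula n = z²·p(1−p)/e² for a given margin of error:

```python
import math
import random

# Hypothetical stakeholder population, grouped for stratified random sampling.
population = {
    "government_counterparts": [f"gov_{i}" for i in range(40)],
    "implementing_partners":   [f"partner_{i}" for i in range(25)],
    "beneficiaries":           [f"benef_{i}" for i in range(200)],
}

def required_sample_size(z: float = 1.96, p: float = 0.5, e: float = 0.1) -> int:
    """Standard sample-size formula for a proportion: n = z^2 * p(1-p) / e^2."""
    return math.ceil(z ** 2 * p * (1 - p) / e ** 2)

def stratified_sample(groups: dict, total_n: int) -> dict:
    """Sample each stratum in proportion to its share of the whole population."""
    pop_size = sum(len(members) for members in groups.values())
    sample = {}
    for name, members in groups.items():
        n = max(1, round(total_n * len(members) / pop_size))  # at least 1 per group
        sample[name] = random.sample(members, min(n, len(members)))
    return sample

n = required_sample_size(e=0.1)  # ~97 respondents for +/-10% at 95% confidence
print(f"Target sample size: {n}")
for group, members in stratified_sample(population, n).items():
    print(f"{group}: {len(members)} sampled")
```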

For this purpose the evaluation team could use a Sample Stakeholder Coverage Table; please see Chapter IV Tools for an example of the Sample Stakeholder Coverage Table.


c) Limitations to the Evaluation

The Inception Report must explicitly and clearly state the limitations to the overall evaluation and to the chosen evaluation methods.

A frequently encountered limitation is a lack of data (baseline and monitoring data) to address the evaluation questions. The evaluation team therefore has to find alternative ways to reconstruct the baseline data.

2. Beyond the Inception Report Template

The evaluation team could elaborate on the evaluation methodology beyond the basic requirements listed in the Inception Report Template. The evaluation team could adapt the Inception Report Template to its needs and address the following additional methodological aspects, which are explained in the paragraphs below:

- Methodological approach;
- Design of the evaluation;
- Human Rights and Gender Equality;
- Data analysis methods.

a) Methodological Approach

Special attention shall be paid to designing a methodological approach that results in unbiased and objective evaluation findings. The choice of the evaluation approach depends on the context. Approaches are not necessarily mutually exclusive.

Regardless of the methodological approach chosen, the same steps must be undertaken: defining evaluation questions, identifying indicators, collecting and analysing data, and reporting and using findings.

Approaches may include:

- Participatory: evaluation in which responsibilities for planning, implementing and reporting are shared with stakeholders, who may help define evaluation questions, collect and analyse data, and draft and review the report;

- Utilization-focused: evaluation judged by how useful it is and how it is actually used;

- Theory-based: evaluation that measures the extent to which the theory of a programme/project is adequate;

- Gender and human rights responsive: evaluation that assesses the effect of programmes/projects on gender equality, women's empowerment and human rights.

b) Design of the Evaluation

An evaluation design consists of the evaluation approach, the evaluation questions, the indicators/measures, the data sources, the methodological strategies for the type of data collection to be used (sampling strategies), the type of data collection instruments, the analysis planned and the dissemination strategy.

Designs include:

- Non-experimental: design where no attempt is made to create treatment and control groups;

- Quasi-experimental: design where treatment and comparison groups are formed either ex ante or ex post without random assignment to groups. Groups with similar characteristics are compared or multiple measures of the same group are taken over time;

- Experimental: design requiring random assignment of a population to at least two groups, such that each and every member of the population has an equal chance of being assigned to the treatment group (benefiting from the project) or to the control group (not benefiting from the project); a minimal sketch of random assignment follows the definitions below.

Control group: group in an experiment whose members are not exposed to a project/programme.

Treatment group: group in an experiment whose members are exposed to a project/programme.
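
For illustration, the sketch below (Python standard library only; the participant IDs are hypothetical) randomly assigns a population to treatment and control groups so that every member has an equal chance of ending up in either group, as an experimental design requires:

```python
import random

# Hypothetical eligible population for an experimental design.
participants = [f"participant_{i}" for i in range(100)]

# Shuffle, then split: every member has an equal chance of being
# assigned to either group, as an experimental design requires.
random.shuffle(participants)
midpoint = len(participants) // 2
treatment_group = participants[:midpoint]  # exposed to the project
control_group = participants[midpoint:]    # not exposed to the project

print(f"Treatment: {len(treatment_group)}, Control: {len(control_group)}")
```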

The evaluation team can use an evaluation matrix to represent the evaluation design, as shown below.

| Questions | Indicators | Target | Baseline | Design | Data sources | Sample | Data collection instrument | Data analysis |
|---|---|---|---|---|---|---|---|---|
| Relevance | | | | | | | | |
| Efficiency | | | | | | | | |
| Effectiveness | | | | | | | | |
| Impact | | | | | | | | |
| Sustainability | | | | | | | | |
| [Additional evaluation criteria] | | | | | | | | |

The data collection and analysis methods should be sufficiently rigorous to assess the subject of the evaluation and ensure a complete, fair and unbiased assessment.

c) Human Rights and Gender Equality

Evaluations in the United Nations system are guided by the principles of human rights and gender equality. These principles should be translated into the evaluation design and implementation. If applicable, gender-sensitive data collection methods should be considered. The evaluator should identify the best methods to collect gender-related information in a reasonable and realistic fashion, considering various ways of acquiring data on the gender issues identified.

d) Data Analysis Methods

Data analysis is a systematic process that involves organizing and classifying the information collected, tabulating it, summarizing it, and comparing the results with other appropriate information to extract useful information that responds to the evaluation questions and fulfils the purposes of the evaluation.
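
As a minimal quantitative illustration of the tabulating and summarizing steps (the criteria ratings below are invented for illustration), the sketch organizes collected scores by evaluation criterion and computes summary statistics:

```python
from statistics import mean, stdev

# Hypothetical 1-5 ratings collected per evaluation criterion.
ratings = {
    "relevance":      [4, 5, 3, 4, 4],
    "effectiveness":  [3, 3, 4, 2, 3],
    "sustainability": [2, 3, 3, 2, 4],
}

# Organize, tabulate and summarize: mean, spread and count per criterion.
for criterion, scores in ratings.items():
    print(f"{criterion:>14}: mean={mean(scores):.2f}, "
          f"sd={stdev(scores):.2f}, n={len(scores)}")
```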

Data analysis methods include:

- Qualitative;
- Quantitative;
- Mixed-method;
- Synthesis of all sources of information;
- Inductive: analysis of data involving the discovery of patterns, themes and categories.

Logical and explicit linkages must be provided between the data sources, the data collection methods and the analysis methods.

The evaluation findings, conclusions and recommendations are derived from the analysis.

3. Evaluating Impact

This paragraph is addressed to evaluation teams tasked with assessing the impact of the subject evaluated.

In the event that impact evaluation was planned for at the design stage of a programme or project (with appropriate collection of baseline data, creation of control and treatment groups, and creation of a monitoring system), the evaluation team will be able to adopt the following methodology to measure impact: use existing monitoring data and develop large-scale sample surveys, in which treatment and control groups are compared before and after the project's implementation, as in the sketch below.
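
One common way to implement such a before/after, treatment/control comparison is a difference-in-differences estimate; the Handbook does not prescribe a specific estimator, and the outcome values below are invented for illustration:

```python
from statistics import mean

# Hypothetical outcome measurements (e.g., survey scores) before and
# after project implementation, for treatment and control groups.
outcomes = {
    ("treatment", "before"): [10, 12, 11, 13],
    ("treatment", "after"):  [15, 17, 16, 18],
    ("control", "before"):   [10, 11, 12, 11],
    ("control", "after"):    [11, 12, 13, 12],
}

# Difference-in-differences: the change in the treatment group minus the
# change in the control group estimates the project's contribution,
# assuming both groups would otherwise have followed parallel trends.
treatment_change = mean(outcomes[("treatment", "after")]) - mean(outcomes[("treatment", "before")])
control_change = mean(outcomes[("control", "after")]) - mean(outcomes[("control", "before")])
print(f"Estimated impact (DiD): {treatment_change - control_change:.2f}")
```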

In the event that impact measurement was not planned for at the design stage of a programme or project, the evaluation team will have to develop a methodology which considers data, budget and time constraints. The following methodology could be used: small-scale rapid assessments and participatory appraisals, where estimates of impact are obtained from combining group interviews, key informants, case studies and available secondary data.

To address data constraints the following solutions are proposed:

- reconstruct baseline data;
- reconstruct comparison group.

To address budget constraints the following solutions are proposed:

- simplify the evaluation design;
- clarify client information needs;
- reduce costs by reducing sample size;
- reduce costs of data collection and analysis.

To address time constraints the following solutions are proposed:

- use existing documentary data;
- reduce sample size;
- undertake rapid data collection methods;
- hire more resource people;
- invest in data collection and analysis technology.

Challenges and Proposed Solutions

Non-existent or incomplete baselines:
- Use other data information systems (secondary data) to reconstruct the baseline;
- Use individual recall/retrospective interviewing techniques (respondents are asked to recall the situation at around the time the project began);
- Consult pre- and post-project evaluations, if any;
- Use participatory group techniques to reconstruct the history of the community and to assess the changes that have been produced by the project;
- Undertake interviews with key informants, preferably persons who know the target community as well as other communities and therefore have a perspective on relative changes occurring over time [2].

Projects not designed with a control group:
- Find a matching comparison group, if any.

Attribution problem:
- Analyse contextual factors affecting the results;
- Provide an honest presentation of the UNODC contribution alongside other agencies/institutions.

Gaps/unreliable parts in existing information systems:
- Complement/verify information during primary data collection.

_________________________________

[2] M. Bamberger, J. Rugh and L. Mabry, RealWorld Evaluation: Working Under Budget, Time, Data, and Political Constraints.
