Monitoring and Evaluation
c) Logical frameworks, also known as logframes, are commonly used to
help set clear program objectives and define indicators of success. They
also outline the critical assumptions on which a project is based, similar
to the results framework.
d) Logic models, also known as M&E frameworks, are commonly used to
present a clear plan for the use of resources to meet the desired goals
and objectives. They are a useful tool for presenting programmatic and
evaluation components.
The choice of a particular type of framework—whether a conceptual
framework, results framework, logical framework, or logic model—depends on
the program’s specific needs, the M&E team’s preferences, and donor
requirements.
In particular, the LFA is a systematic planning procedure for complete project
cycle management. It serves as:
• a participatory planning, monitoring, and evaluation tool;
• a tool for planning a logical set of interventions;
• a tool for appraising a programme document;
• a concise summary of the programme;
• a tool for monitoring progress made with regard to the delivery of
outputs and activities;
• a tool for evaluating the impact of programme outputs, e.g. progress
in achieving purpose and goal.
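To make the logframe's structure concrete, here is a minimal Python sketch of
a logframe held as a data structure; the project, indicators, and assumptions
are hypothetical placeholders, not drawn from the course material.

# A minimal sketch of a logical framework (logframe) as a data structure.
# All narratives, indicators, and assumptions below are hypothetical.

from dataclasses import dataclass, field

@dataclass
class LogframeRow:
    level: str                      # Goal, Purpose, Output, or Activity
    narrative: str                  # what this level is expected to achieve
    indicators: list[str] = field(default_factory=list)
    means_of_verification: list[str] = field(default_factory=list)
    assumptions: list[str] = field(default_factory=list)

logframe = [
    LogframeRow("Goal", "Reduced waterborne disease in District X",
                ["% decrease in reported cases"], ["Clinic records"],
                ["Health facilities keep accurate records"]),
    LogframeRow("Purpose", "Households use safe drinking water",
                ["% of households using treated water"], ["Household survey"],
                ["Water treatment supplies remain available"]),
    LogframeRow("Output", "50 boreholes drilled and functional",
                ["# of functional boreholes"], ["Site inspection reports"],
                ["Communities maintain the boreholes"]),
    LogframeRow("Activity", "Drill boreholes; train water committees",
                ["# of training sessions held"], ["Attendance sheets"],
                ["Trained staff remain with the project"]),
]

for row in logframe:
    print(f"{row.level}: {row.narrative} | indicators: {', '.join(row.indicators)}")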
[Flowchart: PLAN and ACTUAL values are COMPAREd to determine STATUS; if
VARIANCES are found (YES), plans are adjusted, otherwise (NO) implementation
continues.]
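The flowchart's comparison logic can be sketched in a few lines; the indicator
names, targets, and the 10% tolerance threshold below are hypothetical choices
for illustration only.

# A minimal sketch of the plan-vs-actual comparison in the flowchart above.
# Indicator names, targets, and actuals are hypothetical illustrations.

planned = {"households_reached": 1000, "trainings_held": 12, "boreholes_drilled": 50}
actual = {"households_reached": 850, "trainings_held": 12, "boreholes_drilled": 42}

TOLERANCE = 0.10  # flag a variance when actuals fall more than 10% short of plan

for indicator, target in planned.items():
    achieved = actual.get(indicator, 0)
    shortfall = (target - achieved) / target
    if shortfall > TOLERANCE:  # YES branch: variance found, adjust plans
        status = f"VARIANCE ({shortfall:.0%} below plan) -> review implementation"
    else:                      # NO branch: on track, continue implementation
        status = "on track"
    print(f"{indicator}: plan={target}, actual={achieved} -> {status}")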
b) Types of monitoring
A project/program usually monitors a variety of things according to its specific
informational needs. These monitoring types often occur simultaneously as part
of an overall monitoring system.
The following are brief descriptions of the most commonly used evaluation
(and research) designs and methods.
a) Evaluation Methods
Informal and less-structured methods
• Conversation with concerned individuals
• Community interviews
• Field visits
• Reviews of records
• Key informant interviews
• Participant observation
• Focus group interviews
b) Participatory M&E
Participatory evaluation is a partnership approach to evaluation in which
stakeholders actively engage in developing the evaluation and in all phases of
its implementation. Participatory evaluations often use rapid appraisal
techniques, a few of which are described below.
• Key Informant Interviews - Interviews with a small number of individuals
who are most knowledgeable about an issue.
• Focus Groups - A small group (8-12) is asked to openly discuss ideas,
issues and experiences.
• Mini-surveys - A small number of people (25-50) are asked a limited
number of questions (see the tally sketch after this list).
• Neighborhood Mapping - Pictures show the location and types of
changes in an area to be evaluated.
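As a rough illustration of how closed-ended mini-survey answers might be
tallied, here is a minimal sketch; the question and responses are hypothetical.

# A minimal sketch of tallying closed-ended mini-survey responses
# (25-50 respondents, a limited number of questions). Data are hypothetical.

from collections import Counter

# One answer per respondent to "Are you satisfied with the service?"
responses = [
    "yes", "yes", "no", "yes", "unsure", "yes", "no", "yes", "yes", "unsure",
]

tally = Counter(responses)
total = len(responses)
for answer, count in tally.most_common():
    print(f"{answer}: {count} ({count / total:.0%})")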
Quantitative data is often considered more objective and less biased than
qualitative data, but recent debates have concluded that both quantitative
and qualitative methods have subjective (biased) and objective (unbiased)
characteristics.
Management Response Template
Prepared by:
Reviewed by:

Evaluation recommendation 1:
Management response:
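For example, a minimal sketch of how such recommendations and responses might
be tracked programmatically; all fields and entries below are hypothetical
placeholders, not a prescribed format.

# A minimal sketch of tracking evaluation recommendations and management
# responses. All entries are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class ManagementResponse:
    recommendation: str
    response: str
    responsible_unit: str
    due_date: str          # ISO date string, e.g. "2025-06-30"
    status: str = "open"   # open / in progress / completed

log = [
    ManagementResponse(
        recommendation="Strengthen data quality checks at field level",
        response="Agreed; field supervisors to run monthly spot checks",
        responsible_unit="M&E Unit",
        due_date="2025-06-30",
    ),
]

for item in log:
    print(f"[{item.status}] {item.recommendation} -> {item.responsible_unit} by {item.due_date}")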
Formal reports developed by evaluators typically include six major sections:
(1) Background
(2) Evaluation study questions
(3) Evaluation procedures
(4) Data analyses
(5) Findings
(6) Conclusions (and recommendations)
Or, in more detail:
I. Summary sections
A. Abstract
B. Executive summary
II. Background
A. Problems or needs addressed
B. Literature review
C. Stakeholders and their information needs
D. Participants
E. Project’s objectives
F. Activities and components
G. Location and planned longevity of the project
H. Resources used to implement the project
I. Project’s expected measurable outcomes
J. Constraints
III. Evaluation study questions
A. Questions addressed by the study
B. Questions that could not be addressed by the study (when relevant)
IV. Evaluation procedures
A. Sample
1. Selection procedures
2. Representativeness of the sample
3. Use of comparison or control groups, if applicable
B. Data collection
1. Methods
2. Instruments
C. Summary matrix
1. Evaluation questions
2. Variables
3. Data gathering approaches
4. Respondents
5. Data collection schedule
V. Findings
A. Results of the analyses organized by study question
VI. Conclusions
A. Broad-based, summative statements
B. Recommendations, when applicable
A typical evaluation report outline:
• Table of contents
• Executive summary
• Introduction
• Evaluation scope, focus, and approach
• Project facts
• Findings and lessons learned
o Findings
o Lessons learned
• Conclusions and recommendations
2. Table of contents
Should always include lists of boxes, figures, tables, and annexes with page
references.
3. List of acronyms and abbreviations
4. Executive summary
A stand-alone section of two to three pages that should:
• Briefly describe the intervention (the project(s), programme(s), policies
or other interventions) that was evaluated.
• Explain the purpose and objectives of the evaluation, including the
audience for the evaluation and the intended uses.
• Describe key aspects of the evaluation approach and methods.
• Summarize principal findings, conclusions, and recommendations.
5. Introduction
Should:
• Explain why the evaluation was conducted (the purpose), why the
intervention is being evaluated at this point, and why it addressed the
questions it did.
• Identify the primary audience or users of the evaluation, what they
wanted to learn from the evaluation and why, and how they are
expected to use the evaluation results.
• Identify the intervention (the project(s), programme(s), policies or other
interventions) that was evaluated—see upcoming section on
intervention.
• Acquaint the reader with the structure and contents of the report and
how the information contained in the report will meet the purposes of
the evaluation and satisfy the information needs of the report’s intended
users.
6. Description of the intervention/project/process/program —Provide the
basis for report users to understand the logic and assess the merits of the
evaluation methodology, and to understand the applicability of the evaluation
results.
The description needs to provide sufficient detail for the report user to derive
meaning from the evaluation. The description should:
• Describe what is being evaluated, who seeks to benefit, and the problem
or issue it seeks to address.
• Explain the expected results map or results framework, implementation
strategies, and the key assumptions underlying the strategy.
• Link the intervention to national priorities, development partner
priorities, corporate strategic plan goals, or other project, programme,
organizational, or country-specific plans and goals.
• Identify the phase in the implementation of the intervention and any
significant changes (e.g., plans, strategies, logical frameworks) that
have occurred over time, and explain the implications of those changes
for the evaluation.
• Identify and describe the key partners involved in the implementation
and their roles.
• Describe the scale of the intervention, such as the number of
components (e.g., phases of a project) and the size of the target
population for each component.
• Indicate the total resources, including human resources and budgets.
• Describe the context of the social, political, economic, and institutional
factors, and the geographical landscape within which the intervention
operates and explain the effects (challenges and opportunities) those
factors present for its implementation and outcomes.
• Point out design weaknesses (e.g., intervention logic) or other
implementation constraints (e.g., resource limitations).
7. Evaluation scope and objectives - The report should provide a clear
explanation of the evaluation’s scope, primary objectives, and main questions.
• Evaluation scope—The report should define the parameters of the
evaluation, for example, the time period, the segments of the target
population included, the geographic area included, and which
components, outputs or outcomes were and were not assessed.
• Evaluation objectives—The report should spell out the types of decisions
evaluation users will make, the issues they will need to consider in
making those decisions, and what the evaluation will need to achieve to
contribute to those decisions.
• Evaluation criteria—The report should define the evaluation criteria or
performance standards used. The report should explain the rationale for
selecting the particular criteria used in the evaluation.
• Evaluation questions—Evaluation questions define the information that
the evaluation will generate. The report should detail the main
evaluation questions addressed by the evaluation and explain how the
answers to these questions address the information needs of users.
8. Evaluation approach and methods - The evaluation report should describe in
detail the selected methodological approaches, theoretical models, methods,
and analysis; the rationale for their selection; and how, within the constraints
of time and money, the approaches and methods employed yielded data that
helped answer the evaluation questions and achieved the evaluation purposes.
The description should help the report users judge the merits of the methods
used in the evaluation and the credibility of the findings, conclusions, and
recommendations.
SESSION 12: BEST PRACTICES, EMERGING TRENDS & M&E CAPACITY BUILDING
IN KENYA
(i) Monitoring Best Practices
Good monitoring should:
• Focus data on specific audiences and uses (only what is necessary
and sufficient).
• Be systematic, based upon predetermined indicators and assumptions.
• Also look for unanticipated changes in the project/program and its
context, including any changes in project/program assumptions/risks;
this information should be used to adjust project/program
implementation plans.
• Be timely, so information can be readily used to inform project/program
implementation.
• Be participatory, involving key stakeholders to reduce costs and build
understanding and ownership.
• Serve not only project/program management, but be shared when
possible with beneficiaries, donors, and any other relevant stakeholders.
EXERCISES
Exercise 1: Identify 5 key indicators and complete an indicator matrix,
covering the levels below, for a project/program you are familiar with (a
minimal sketch of such a matrix follows the list).
• GOAL
• PURPOSE
• OUTPUTS
• ACTIVITIES (INPUTS)
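A minimal sketch of what a completed indicator matrix could look like as a
simple tabular structure; every entry is a hypothetical placeholder showing
typical columns (baseline, target, data source, frequency, responsible person).

# A minimal sketch of an indicator matrix for Exercise 1; every entry is a
# hypothetical placeholder illustrating the expected columns.

indicator_matrix = [
    # (level, indicator, baseline, target, data source, frequency, responsible)
    ("Goal",     "% decrease in malnutrition among under-fives", "22%", "15%",
     "District health survey", "annual",    "M&E officer"),
    ("Purpose",  "% of caregivers practicing improved feeding",  "35%", "70%",
     "Household survey",       "bi-annual", "M&E officer"),
    ("Output",   "# of caregivers trained",                      "0",   "1200",
     "Training records",       "quarterly", "Project officer"),
    ("Activity", "# of training sessions delivered",             "0",   "60",
     "Activity reports",       "monthly",   "Field staff"),
]

header = ("Level", "Indicator", "Baseline", "Target", "Source", "Frequency", "Responsible")
print(" | ".join(header))
for row in indicator_matrix:
    print(" | ".join(row))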
For each of the evaluation criteria below, consider: What aspects of the
training will you evaluate? What are some of the variables you will focus on?
What are some of the limitations of the evaluation and its findings?
• Relevance
• Effectiveness
• Efficiency
• Impact
• Sustainability
==== END ====