Project Evaluation

The document outlines the objectives and importance of project evaluation, distinguishing it from project monitoring. It details various types of evaluations, including process, outcome, and impact evaluations, and discusses methods for conducting evaluations such as the Logical Framework Approach and the Kirkpatrick model. The content emphasizes the role of evaluations in assessing project effectiveness and learning for future improvements.


Project Evaluation Using High Precision Methods

Dr. Ronald Komata (MBChB, MSc)
Module objectives

1. To define project evaluation.
2. To compare project evaluation and project monitoring.
3. To explain the role of project evaluation in the project world.
4. To explain the different types of project evaluation.
5. To introduce the different methods and their application in project evaluation.
What comes to mind when you hear the word evaluation?

The project cycle: a reminder

What are the components of the project cycle?

When does M&E begin and end in the project cycle?


What is project Evaluation?

• Evaluation is defined as the systematic examination of the project’s results, impact, relevance, and efficiency at key stages during the project.
• Project evaluation assesses the extent to
which the project activities resulted in the
achievement of project outcomes or goals.
• This differs from project monitoring, which
is concerned with the ongoing, routine
collection of information about a project’s
activities to measure progress toward
results.
Project evaluation is about
• Have you achieved the objectives of your project activities?
• Have you done the right things
to achieve your objectives?
• A well-designed evaluation can
provide important learning and
knowledge, which you can use
to strengthen your current or
future projects.
Comparison between Project Evaluation and Project Monitoring

Definition
• Monitoring: the ongoing, routine collection of information about a project’s activities to measure progress toward results.
• Evaluation: the assessment of the extent to which the project activities resulted in the achievement of project outcomes or goals.

Sources of data
• Monitoring: data is often extracted from already existing reports or records.
• Evaluation: requires measuring changes at the population or beneficiary level. To do this, you need to collect additional data from the intervention’s target group (i.e., primary data).

Types of participants
• Monitoring: primarily done by people directly involved in implementing the project, such as project staff and project users.
• Evaluation: usually requires consultation with project staff, though they play a lesser role. Instead, evaluation often includes external evaluators, M&E staff, and/or donors to increase objectivity.

Timing of data collection
• Monitoring: data collection is a continuous process from the beginning of the project to the end.
• Evaluation: data are only collected at specific points during or after the project, such as the mid-point, following a major shift in the project’s activities, at the end, or several years after the project if you are interested in impact.
The role of project Evaluation

Evaluations are commonly used to:
• Understand why activities have (or
have not) been implemented as
planned
• Explain whether the project had an
effect, and, if so, why, how, and for
whom? Or, if not, why not?
• Determine the extent to which
measured or observed changes can be
attributed to the intervention
• Describe whether the intervention had
any unintended consequences
• Assess whether a project was cost-
effective
Case study
• Imagine at the beginning of your project cycle, your
project plan includes an activity to conduct six trainings
for Ministry of Health data clerks on facility-level data
entry. Monitoring can tell us how many
trainings were conducted, how many data clerks were
trained from each facility, and how much of the budget
was spent on each training. This information is very
helpful in terms of knowing what the project is doing
and what progress has been made.
• Let’s assume all the trainings were completed on time
and within budget. The trainers also tell you they felt
the trainings went well. The expected
outcomes of the project are that “Ministry of Health
data entry clerks have increased skills and
knowledge in facility-level data entry.”
• Based on the information we have, can we say that the
trainings made a difference in the data clerks’
knowledge and/or skills? What do you think? If you think
this is enough information, why? If you think this isn’t
enough information, why not?
Case study continued
• We can’t say the training increased data clerks’ knowledge or skills because we have no evidence to back up that statement.
• But evaluation can give us that evidence. It helps us to objectively answer the question of whether the training made a difference.
• If we were to evaluate the outcome that “data entry clerks increased their knowledge in facility-level data entry,” we would have to do so after the intervention occurred, using different methods:
  • pre- and post-test
  • written evaluation questions
  • written self-assessment of their learning
  • direct observation
• Together, these methods should tell us the extent to which the outcome of ‘increased knowledge’ was achieved.
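The pre- and post-test method above can be sketched in a few lines of analysis. This is a minimal, illustrative Python example: the clerks, their scores, and the 0–100 scale are all hypothetical, and a real evaluation would use a larger sample and a significance test.

```python
# Illustrative pre/post-test comparison with hypothetical scores (0-100)
# for six data clerks; numbers are invented for demonstration only.
from statistics import mean

pre_scores = [55, 60, 48, 70, 52, 65]
post_scores = [72, 75, 60, 82, 70, 78]

# Paired differences: each clerk's post-training score minus pre-training score
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]

avg_gain = mean(gains)
share_improved = sum(g > 0 for g in gains) / len(gains)

print(f"Average score gain: {avg_gain:.1f} points")
print(f"Share of clerks who improved: {share_improved:.0%}")
```

Comparing each clerk to their own baseline (a paired design) is what lets the evaluation attribute the knowledge gain to the training rather than to differences between individuals.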
Types of Project Evaluation

• There are numerous project evaluation approaches.
• Choosing an evaluation type depends
on how you want to apply or assess
your project data.
• In this module, we will focus on 3
commonly applied types of project
evaluation.
• These are process evaluations,
outcome evaluations, and impact
evaluations.
Process Evaluations
• These demonstrate how well a project is working and
the extent to which it is being implemented as
planned (implementation fidelity).
• They help you understand which project components
act as facilitators or barriers to achieving outcomes
and, as such, describe how, why, and under what
conditions a project works (or doesn’t work).
• Sometimes process evaluation is equated with
monitoring, as both occur during a project’s
implementation and include review of output data.
• However, process evaluations occur at one point in
time and demand supplemental data collection.
• Methods like interviews or focus groups are often used
to ask about participant satisfaction and/or
stakeholders’ perceptions of how well the project is
being delivered.
Process evaluations
continued
• Process evaluations usually result in a comprehensive analysis of a
project’s strengths, challenges, and recommendations for how to make
improvements going forward.
• For example, in the case of the bed net project, you might carry out a
process evaluation at a mid-point in the project to see which elements
are hindering the success of your project and which are having the
strongest effect.
• You might conduct observations of healthcare workers’ interactions
with clients to find out if they are offering the bed nets to pregnant
women; assess the degree to which they are promoting their
importance; see if they complete the registers correctly; and observe
how clients react.
• How often do healthcare workers explain the information to clients and
offer them nets? When they do, is the information factually correct?
How do clients respond? How often do healthcare workers remember
to update the registers?
Process evaluations continued
• Or you might conduct focus group discussions (FGDs) with healthcare workers to ask about the quality of training they received and to find out if this intervention has been burdensome to them in any way.
• Have there been any adverse effects? Are healthcare workers frustrated that they have to enter more data into the registers or must spend extra time with each client?
• You could also interview a sample of the clients to understand their perceptions of the intervention and see how well it’s working so far.
• Were clients offered a bed net? What type of information did healthcare workers share with them? Are the clients using the nets? Why or why not? Have clients seen the posters or brochures? What are their perceptions of each?
Outcome Evaluations

• Outcome evaluations are conducted after the completion of a set of activities or at the end of a project to assess the quality of the intervention and its key results.
• They are used to demonstrate accountability, improve the
design of activities, assess cost-effectiveness, and
promote successful interventions in the future.
• They answer the question: “What changed as a result of
our work?”
• This type of evaluation is often requested by donors
because it tells them whether the project was effective.
• Outcome evaluations typically measure changes in
participants’ or beneficiaries’ awareness, knowledge,
attitudes, values, or skills during or after their
involvement in the intervention.
• They may also evaluate changes in behaviour among
these same groups.
Outcome Evaluations
continued
• Outcome evaluations work well in tandem with
process evaluations.
• By using both types, you can learn whether
outcomes were achieved as well as how and why
they were achieved, which is useful if you aspire
to replicate the project elsewhere.
• In the bed net example, we would do the follow-
up survey to compare baseline and endpoint
knowledge, attitudes, and practices of pregnant
women regarding the use of bed nets.
• This would tell us the extent to which the
intervention changed women’s knowledge about
the importance of using bed nets and how to
properly hang them; the degree to which their
attitudes around using bed nets improved; and
how many women changed their behaviour to use
bed nets.
Impact Evaluations

• These are similar in purpose to outcome evaluations, though they are broader in that they measure the overall effects (intended and unintended) of the entire project, typically years after it ends.
• The purpose is to determine the lasting effect on the target
population and to establish a strong cause-and-effect
relationship between the intervention and outcomes.
• Impact evaluations are more robust in terms of design and
methods because attributing change to your project
becomes increasingly difficult as time passes (e.g., the
“gold standard” is a randomized controlled trial,
where you have a control group that does not receive the
intervention).
• This type of evaluation can be very resource-intensive,
requires high-level evaluation expertise, and, as a result, is
not conducted as often as the other two types of
evaluations.
Impact Evaluations
continued
• Some of the scenarios in which a project might call for an
impact evaluation include if it is a particularly innovative
intervention, if it’s a pilot and the plan is to
substantially scale it up in the future, or if there is little
evidence of impact for such an intervention in the given
context.
• In the case of the bed net intervention, a ‘gold standard’
design would mean you’d have to randomly select a
comparison group from the beginning (i.e., pregnant women
who would not receive the intervention) to determine what
the project’s outcomes would have been in the absence of
the intervention.
• Other designs can work as well: for example, quasi-experimental designs, where participants are matched to similar non-participants and biases are controlled for.
• Or non-experimental designs, where there is no comparison group; these become less robust without randomized selection.
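The comparison-group logic behind these designs can be made concrete with a difference-in-means sketch. In this illustrative Python example, the outcome measure (share of nights slept under a bed net) and all values are hypothetical; a real impact evaluation would also report uncertainty and, in quasi-experimental designs, adjust for bias.

```python
# Illustrative difference-in-means estimate of a project's effect.
# Outcome: hypothetical share of nights each woman slept under a bed net.
from statistics import mean

intervention = [0.90, 0.85, 0.95, 0.80, 0.88]  # received the intervention
comparison = [0.55, 0.60, 0.50, 0.65, 0.58]    # did not receive it

# The comparison group approximates what outcomes would have been in
# the absence of the intervention (the counterfactual).
effect = mean(intervention) - mean(comparison)
print(f"Estimated effect: {effect:.2f}")
```

The whole point of randomization is that it makes this simple subtraction a credible estimate: with random assignment, the two groups differ only by the intervention itself.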
Tools and Methods for Evaluation
• Project cycle reminder: Idea & Analysis → Project design → Implementation and monitoring → Evaluation → Organizational learning → back to Idea & Analysis
Tools or Methods for doing M&E
• Monitoring and evaluation are about learning – in your project and organization. They are all about measuring the change created by your project.
• In today’s session, we have already flagged some of the methods used for evaluating projects under the different types of evaluations.
• Other methods include the Logical Framework Approach (LFA), Team Self Review, Most Significant Change Stories, Impact Grid, the Kirkpatrick model, interviews, surveys and questionnaires, focus group discussions (FGDs), case studies, timelines, and participatory evaluation methods.
1. The Logical Framework Approach (LFA)
• The LFA is useful as a tool for mapping out the logic of your project and planning your activities.
• It is also a valuable tool for monitoring progress and change along the way.
• You can measure progress on the project objectives against the success criteria and decide on means for verifying the changes.
• Use the LFA as a dynamic planning tool that must also be adjusted when the context changes or when you encounter challenges to the original plan.
What is the LFA?

• The Logical Framework Approach (LFA) is a systematic and analytical planning process used for the results-based planning of a project.
• It is a methodology mainly used for designing, monitoring, and evaluating international development projects.
• Variations of this tool are known as Goal Oriented Project Planning (GOPP) or Objectives Oriented Project Planning (OOPP).
• The LFA is a visual approach to designing, executing, and assessing projects which encourages users to consider the relationships between available resources, planned activities, and desired changes or results.
The LFA process:

1. Context analysis: What are the needs?
2. Problem analysis: What problems do you want to solve?
3. Objectives: What do you want to achieve?
4. Activity: Which activities should you do to achieve your goals?
5. Monitoring and evaluation: Are you successful?

LFA helps us to find out how we can make the changes we want to see in a systematic manner.
The LFA Matrix
• Is what you are proposing logical?
• If… then…
• In order to…
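The matrix’s if…then reading can be illustrated with a tiny data structure. The sketch below uses the bed net example from earlier slides; the field names and statements are assumptions for demonstration, not an official LFA schema.

```python
# A minimal logframe sketch: levels ordered top (goal) to bottom (activity).
# Statements borrow the bed net example; this is not an official LFA schema.
logframe = [
    {"level": "Goal", "statement": "reduced malaria among pregnant women"},
    {"level": "Outcome", "statement": "pregnant women sleep under bed nets"},
    {"level": "Output", "statement": "bed nets distributed at ANC clinics"},
    {"level": "Activity", "statement": "healthcare workers trained to offer nets"},
]

# Read bottom-up as "if <lower level>, then <upper level>" to check
# that each step plausibly leads to the one above it.
bottom_up = list(reversed(logframe))
for lower, upper in zip(bottom_up, bottom_up[1:]):
    print(f"If '{lower['statement']}', then '{upper['statement']}'")
```

Walking the chain bottom-up ("in order to achieve the outcome, deliver the output") is the same check in the other direction; if any link in the chain is implausible, the project logic needs revisiting.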
2. Team self-
review
• Team Self Review uses the experiences of the people
who work with the implementation of the project daily.
• This method can be used for reflecting on the project
along the way as well as for final evaluations.
• Whenever you are doing your projects, it is useful to
pose yourself the three universal evaluation
questions:
Are we doing what we said we would do?
Are we making any difference?
Are we doing the right things?
• You can use the questions to run through the project's
promised project activities and results on an overall
level - and assess if you are doing the right thing to
achieve your objectives.
• You can also use the questions to evaluate smaller
parts of the project, e.g. an activity such as a training.
3. Most Significant
Change Stories and
Impact Grid
• Most Significant Change Stories and
Impact Grid is about collecting and
analysing participants’ stories to
understand the changes that have
happened due to a project.
• The stories give examples of the
knowledge, skills, or confidence
participants have gained and how they
use it in their lives.
• The method is well suited for catching
both expected and unexpected
outcomes of a project – and can deliver
a rich picture of what has happened.
4. The Kirkpatrick model

• This helps you to assess the learning outcomes of training and education processes. It highlights the importance of follow-up with participants so that they apply learning back in their organizations.
• The Kirkpatrick Model is probably the best-known
model for analysing and evaluating the results of
training and educational programs. It considers any
style of training, both informal and formal, to
determine aptitude based on four levels of criteria.
• Level 1: Reaction measures how participants react to
the training (e.g., satisfaction?).
• Level 2: Learning analyses if they truly understood
the training (e.g., increase in knowledge, skills, or
experience?).
• Level 3: Behaviour looks at whether they are utilizing
what they learned at work (e.g., change in
behaviours?).
• Level 4: Results determines if the material had a
positive impact on the business/organization.
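The four levels above can be summarized from collected feedback along these lines. In this sketch, the 1–5 rating scale, the scores, and the data sources noted in comments are all hypothetical.

```python
# Illustrative Kirkpatrick-level summary; ratings and sources are invented.
from statistics import mean

# Hypothetical 1-5 ratings gathered for each Kirkpatrick level
responses = {
    "Level 1: Reaction": [4, 5, 3, 4, 4],    # satisfaction survey
    "Level 2: Learning": [3, 4, 4, 5, 3],    # post-test performance
    "Level 3: Behaviour": [3, 3, 4, 3, 2],   # on-the-job observation
    "Level 4: Results": [2, 3, 3, 2, 3],     # organizational indicators
}

for level, ratings in responses.items():
    print(f"{level}: average {mean(ratings):.1f} / 5")
```

A pattern like this one, where scores fall from Level 1 to Level 4, is common in practice: participants can enjoy and understand a training (Levels 1–2) without the organization yet seeing changed behaviour or results (Levels 3–4), which is exactly why the model stresses follow-up.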
Other Methods
5. Interviews
• These are useful for getting in-depth knowledge and context understanding from stakeholders in a project.
6. Surveys and questionnaires
• Surveys and questionnaires are useful for getting responses uniformly from a large group of people or a statistically representative group of respondents.
7. Focus group discussions (FGDs)
• FGDs gather a group of people to get their views on key issues and can be useful when deciding how to work with a specific issue.
8. Case studies
• These are useful for evaluations. They focus on
an aspect of a project and describe how the
project has affected a given group of people. You
can use participatory methods to get a target
group to give their experience of the changes
that have happened due to a project.
Other methods
9. Timelines
• Timelines are useful for recording and talking
about the history of a project or a partnership.
• This is a good tool for identifying key events or
critical decision points and the effects of events
on a project and its participants.
10. Participatory evaluation methods
• These involve project participants in designing
an evaluation process, collecting data, and
assessing the changes created by the project.
• This can help enhance learning and create
ownership of the project outcomes among
participants.
The End

Congratulations!
How to contact me
• LinkedIn: Ronald Komata MD
• Email: komataronald25@gmail.com
• Facebook: Ronald Komata
• X (formerly Twitter): Komata Ronald M.D
• Instagram: Ronald Komata
• Direct call (WhatsApp): +256777275905
