
Monitoring and Evaluation Plan

Mr. Andrew
Introduction
• A document that helps to track and assess the results of interventions throughout the program.
Elements of a Monitoring and Evaluation Plan
• Logical framework
• Theory of change
• Monitoring indicators
Importance of a Monitoring and Evaluation Plan
• Provides a clear plan for the program and answers key questions about it
• Helps staff decide how they will collect data
• Guides data analysis
• Guides dissemination of results
• Guides reporting on results
• Developed by a research team or staff with research experience, with input from program staff
Timeframe for M&E Plan Development
• Developed at the beginning of the program
• Ensures a system is in place to monitor the
program
Steps of M&E Plan Development
Step 1: Identify Program Goals and Objectives
• What is the problem?
• What steps are being taken to solve the problem?
• How will staff know when the program is successful?
Example
• If the program is starting a condom distribution program for adolescents, the answers might look like this:
Problem: High rates of unintended pregnancies and STIs among youth aged 15–19
Solution: Promote and distribute free condoms in the community at youth-friendly locations
Success: Lowered rates of unintended pregnancies and STIs among youth aged 15–19, and a higher percentage of condom use among sexually active youth
Step 2: Define Indicators
• Indicators should be a mix of those that measure process and those that measure outcomes
Process indicators
• Check whether activities are being implemented as planned
• Include: number of trainings held with health providers, outreach activities conducted, condoms distributed, and percentage of youth reached
Outcome indicators
• Track how successful program activities have been at achieving program objectives
• Include: percentage of youth using condoms, trained health providers offering family planning services, and new STIs among youth
Step 3: Define Data Collection Methods and Timeline
• Should be decided among program staff,
stakeholders and donors.
Examples of what data can be collected and how
• Implementation process and progress: program-specific M&E tools
• Service statistics: facility logs, referral cards
• Reach and success of the program intervention in subgroups or communities: small surveys with the primary audience, e.g. provider interviews or client exit interviews
• Reach of media interventions: media ratings data, Google Analytics, broadcaster logs
• Reach and success of the program intervention at population level: nationally representative surveys, omnibus surveys
• Qualitative data about outcomes of the interventions: focus group discussions, in-depth interviews, listener/viewer group discussions, case studies
Indicator | Data source | Timing
Number of trainings held with health providers | Training attendance sheets | Every 6 months
Number of outreach activities conducted at youth-friendly locations | Activity sheet | Every 6 months
Number of condoms distributed at youth-friendly locations | Condom distribution sheet | Every 6 months
Percent of youth receiving condom use messages through the media | Population-based surveys | Annually
Percent of adolescents reporting condom use during first intercourse | DHS or other population-based survey | Annually
Number and percent of trained health providers offering family planning services to adolescents | Facility logs | Every 6 months
Step 4: Identify M&E Roles and Responsibilities
• Decide from the early planning stages who is responsible for collecting the data for each indicator.
• A mix of M&E staff, research staff, and program staff
Indicator | Data source | Timing | Data manager
Number of trainings held with health providers | Training attendance sheets | Every 6 months | Activity manager
Number of outreach activities conducted at youth-friendly locations | Activity sheet | Every 6 months | Activity manager
Number of condoms distributed at youth-friendly locations | Condom distribution sheet | Every 6 months | Activity manager
Percent of youth receiving condom use messages through the media | Population-based survey | Annually | Research assistant
Percent of adolescents reporting condom use during first intercourse | DHS or other population-based survey | Annually | Research assistant
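A minimal sketch (illustrative only, not part of the plan) of how the rows of the table above could be kept as structured records, so that each indicator stays linked to its data source, collection timing, and responsible data manager; the field names and the two sample rows are assumptions drawn from the table, not a prescribed tool:

```python
# Illustrative sketch: keep each indicator's data source, timing, and
# responsible data manager together in one record, mirroring the table above.
from dataclasses import dataclass

@dataclass
class IndicatorPlan:
    indicator: str
    data_source: str
    timing: str        # e.g. "Every 6 months" or "Annually"
    data_manager: str  # role responsible for collecting the data

monitoring_plan = [
    IndicatorPlan(
        indicator="Number of trainings held with health providers",
        data_source="Training attendance sheets",
        timing="Every 6 months",
        data_manager="Activity manager",
    ),
    IndicatorPlan(
        indicator="Percent of adolescents reporting condom use during first intercourse",
        data_source="DHS or other population-based survey",
        timing="Annually",
        data_manager="Research assistant",
    ),
]

# Example use: list everything the activity manager is responsible for collecting
for item in monitoring_plan:
    if item.data_manager == "Activity manager":
        print(item.indicator, "-", item.data_source, "-", item.timing)
```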
Step 5: Create an Analysis Plan and Reporting Templates
• Compile and analyze data to fill in a results table for internal review and external reporting
• Done by either an in-house M&E manager or a research assistant for the program.
Indicator | Baseline | Year 1 | Lifetime target | % of target achieved
Number of trainings held with health providers | 0 | 5 | 10 | 50%
Number of outreach activities conducted at youth-friendly locations | 0 | 2 | 6 | 33%
Number of condoms distributed at youth-friendly locations | 0 | 25,000 | 50,000 | 50%
Percent of youth receiving condom use messages | 5% | 35% | 75% | 47%
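As a worked check of the last column, percent of target achieved in this table appears to be the Year 1 result divided by the lifetime target (for example, 35% reached against a 75% target gives roughly 47%). A minimal sketch in Python, using the table's values and purely for illustration:

```python
# Illustrative sketch: percent of lifetime target achieved,
# computed as the Year 1 result divided by the lifetime target.
# Assumes the result and the target are expressed in the same unit
# (counts for activity indicators, percentage points for percent indicators).

def percent_of_target(year1_result, lifetime_target):
    """Return year1_result / lifetime_target as a whole-number percentage."""
    if lifetime_target == 0:
        raise ValueError("lifetime_target must be non-zero")
    return round(year1_result / lifetime_target * 100)

# Values from the results table above: (indicator, Year 1 result, lifetime target)
rows = [
    ("Trainings held with health providers", 5, 10),
    ("Outreach activities at youth-friendly locations", 2, 6),
    ("Condoms distributed at youth-friendly locations", 25_000, 50_000),
    ("Youth receiving condom use messages (%)", 35, 75),
]

for name, year1, target in rows:
    print(f"{name}: {percent_of_target(year1, target)}% of target achieved")
# Prints 50%, 33%, 50%, and 47%, matching the last column of the table.
```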
Step 6: Plan for Dissemination and Donor Reporting
• Describes how and to whom data will be disseminated.
• How will M&E data be used to inform staff and
stakeholders about the success and progress of the
program?
• How will it be used to help staff make modifications
and course corrections, as necessary?
• How will the data be used to move the field forward
and make program practices more effective?
Components of M&E System
• Organizational Structures with M&E Functions
• Human Capacity for M&E
• Partnerships for Planning, Coordinating and
Managing the M&E System
• M&E frameworks/Logical Framework
• M&E Work Plan and costs
• Communication, Advocacy and Culture for
M&E
Components of M&E System continued
• Routine Programme Monitoring
• Surveys and Surveillance
• National and Sub-national databases
• Supportive Supervision and Data Auditing
• Evaluation and Research
• Data Dissemination and Use
M&E Framework
• A structured and systematic blueprint or plan that outlines how a program or project will monitor and evaluate its progress, performance, and impact over a specific period.
Key Components of an M&E Framework:

• Key indicators
• Means of Verification
• Assumptions and Risks
Key indicators
• Are specific metrics or measures used to
gauge progress, outputs, outcomes, and
impacts of a program.
• For example, in a public health campaign, a
key indicator could be the reduction in the
number of reported cases of a particular
disease.
Significance of M&E Frameworks
• Clarity and precision
• Accountability
• Adaptability
• Informed decision making
Clarity and Precision
• M&E Frameworks provide a clear and precise
roadmap for program evaluation, ensuring
that objectives and goals are well-defined.
• This clarity helps in avoiding vague or
ambiguous assessments.
Accountability
• By specifying key indicators and means of
verification, the framework holds stakeholders
accountable for their roles in the evaluation
process. It reduces subjectivity and enhances
transparency.
Adaptability
• M&E Frameworks are adaptable to changing
circumstances and unexpected challenges.
• They allow for the incorporation of new data
sources or methods as the program evolves.
Informed Decision Making
• The data collected through the M&E Framework empowers decision-makers with valuable insights.
• It aids in resource allocation, strategy adjustments, and evidence-based decision-making.
Means of Verification
• This component outlines the data sources,
methods, and tools that will be used to collect
and verify information related to each key
indicator. It ensures that data collection is
systematic and reliable.
Assumptions and Risks
• An M&E Framework acknowledges external
factors that may affect the program’s success.
Assumptions are conditions or factors considered
favorable for achieving program goals, while risks
are potential challenges or obstacles. For
instance, an assumption might be that
community engagement will positively impact
project outcomes, while a risk could be budget
constraints affecting the availability of resources.
