
Manager’s Guide to Implementing Performance-based Management

Table of Contents

I. Introduction

II. Overview of the Approach

III. Step-by-Step Approach
  1: Develop a Performance Framework
  2: Select Key Performance Areas
  3: Select Performance Measures
  4: Determine Information "Gaps"
  5: Develop and Implement "Measurement Strategy"
  6: Develop Performance Report
  7: Learn from Experience

IV. Conclusion

Appendix A: Glossary

Appendix B: Environment Canada's Performance Measurement Strategy

Appendix C: References



I. Introduction

How will this guide help you?

The purpose of the Guide is to:

• help Environment Canada managers develop a basic understanding of how to develop a Performance Framework
• provide suggestions for identifying key results areas, performance measures and measurement strategies
• develop ideas for credible performance reporting.

What is Performance-based Management?

Simply stated, performance-based management is about managing for results. In a government setting, this implies that the expected results to be achieved through various programs, policies and services should be clearly articulated, that meaningful measures of success are selected, and that accomplishments are reported in a fair and credible manner. Performance information can be used to monitor the progress of a program, policy or service, as well as to make decisions about strategic objectives and resource allocations.

What is the need?

Public sector managers face increasing pressure from all sides to reduce costs, improve service levels, make progress towards priority outcomes, and increase accountability. To accomplish these things, a strong vision of success (goals) is vital. At the outset, managers must clarify the vision and broad objectives that define the desired long-term impacts their policies and programs were designed to achieve. From there they can identify and focus upon the short- and medium-term results that should contribute to achieving those impacts, and the measures that will allow progress to be tracked and reported over time.

In science-based departments, the difficulties in planning, measuring and reporting performance are particularly acute. Science-related changes are often abstract and take place over considerable periods of time. For this reason, it is in fact more important to articulate a clear vision of success for environmental protection and sustainable development initiatives than it would be for less complex programming.

What is the solution?

The required “system” for performance-based management should provide for a precise articulation of priority results, with an emphasis on addressing target group (client) needs. The approach should also allow all key delivery participants to “own” the system (i.e., the people delivering the services believe that the performance system appropriately articulates their results, goals and values). The required system must be useful for all aspects of management, including planning, priority setting, resource allocation, monitoring and adjustment. It should also be useful to all levels of management, from senior executives through to science managers and operating staff.

The system advocated in this Guide, which meets these requirements, is a seven-step analytical and interactive process. It commences with a clear articulation of a Performance Framework or “logic” model of the program, which answers the question “what’s our business?” The Framework then serves as a basis for defining key performance areas, measures, measurement strategies and reporting practices, answering the supplementary question “how’s business?”

Why is it relevant to Environment Canada?

The approach recommended here can help managers respond to a number of specific concerns within Environment Canada:

• implementing the revised Planning, Reporting and Accountability Structure (PRAS)
• handling increased scrutiny of federal programs by groups such as the Office of the Auditor General and the Commissioner of the Environment and Sustainable Development
• coping with the complexity inherent in national integration of highly decentralized programs
• meeting increased demands for service with stable or declining budgets.

Where can you go for help, advice and support?

The development of this Guide was partially funded by the Departmental Learning Fund under the direction of the Commercialization and Management Practices Branch (CAMP), Corporate Services. The Branch has supported a number of departmental groups in implementing a performance-based approach through workshops, advice and assistance. As a next step, CAMP plans to develop a supplementary guide for managers that specifically links the use of the seven-step process to the requirements of the departmental planning and reporting processes.


Exhibit 1: Steps to Performance-based Management

1. Develop a Performance Framework
2. Select Key Performance Areas
3. Identify Performance Measures
4. Conduct Information “Gap” Analysis
5. Implement Measurement Strategy
6. Develop Performance Report
7. Refine Approach

(The seven steps form a continuous cycle.)

II. Overview of the Approach

What are the basic steps involved in implementing a Performance-based Management Approach?

Exhibit 1 shows a high-level diagram of the performance-based management approach. This is a guideline, intended to show the overall process generically. Section III of this Guide provides a detailed description of the elements of each step.

Planning phase

Step 1: Develop a Performance Framework to describe what your program is all about. This involves a description of your organization’s mission or key objectives, activities and outputs, reach (clients, co-delivery agents, stakeholders, etc.) and the desired intermediate and ultimate outcomes.

Step 2: Identify the most important elements, or key performance areas, which are most critical to understanding and assessing your program’s success.

Measurement phase

Step 3: Select the most appropriate performance measures.

Step 4: Determine the “gaps” between what information you need and what’s available.

Step 5: Develop and implement a measurement strategy to address the gaps.

Reporting and continuous improvement phase

Step 6: Develop a performance report which highlights what you accomplished and what you learned.

Step 7: Learn from your experience and refine your approach as required. Keep in mind that the development of a performance management approach is an iterative process.

Exhibit 2: Generic Performance Framework

Mission Statement: Who does what to whom and why.

RESOURCES (How?)
  Activities: What are the key activities carried out to achieve desired results? What resources are required to run the program (people, $, information, other assets)?
  Outputs: What products or services do we provide?

REACH (Who?)
  Users / clients / co-deliverers / beneficiaries: Who must we influence to make progress towards achieving our outcomes? Who must we work with or rely upon to help us achieve our desired results? What role do others play? Do we need to focus on specific groups or segments of the population in order to achieve results?

RESULTS (What do we want?)
  Direct and intermediate outcomes: What level of client service do we want to provide (e.g., address needs, meet/exceed expectations)? What influence do we want to exert on our key target group in the intermediate term (e.g., influence people’s behaviour in order to increase awareness, understanding and knowledge; change attitudes/perceptions; or make decisions and take action)?
  Ultimate outcomes (Why?): Why does our program exist? What results do we ultimately want to achieve? What are the long-term benefits, effects or states we are looking for?

Influencing Factors (affecting all of the above)
  What external forces/factors could affect the achievement of our desired results?

III. Step-by-Step Approach

Step 1: Develop a Performance Framework

A Performance Framework (Exhibit 2) is a tool for developing a logical description of a program.¹ The Framework can be used as a description of how Environment Canada translates its objectives into measurable results. It describes:

• the “results” that EC programs aim to achieve, in terms of a continuum of outcomes from near-term to ultimate
• the “reach” of the program, or its intended scope of influence
• the “resources” being utilized to perform activities and create outputs.

The Framework focuses directly on management needs by responding to stakeholders’ key questions about the value-added of a program in a straightforward manner:

• How have results been accomplished?
• Who has been influenced?
• What has been accomplished? Why?
• What are the relationships between resource utilization, reach and accomplishments?

Elements of the Framework

The Performance Framework approach simplifies government programs and services into the following categories:

• mission statement
• inputs
• activities
• outputs
• reach
• intermediate outcomes
• ultimate impacts
• influencing factors

A brief description of these categories follows. The reader is then encouraged to complete a Performance Framework following a defined process.

¹ This guide uses programs as the case example for the Performance Framework approach. The approach applies equally to a policy, service or organizational area.
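For managers who want to keep a Framework in a structured form (for example, to feed the gap analysis in Step 4), the categories above can be captured as a simple record. This is a minimal sketch, not part of the Guide: the class and field names mirror the category list, and the example values loosely paraphrase the ozone case in Exhibit 5.

```python
from dataclasses import dataclass, field

@dataclass
class PerformanceFramework:
    """One record per program, mirroring the Framework categories."""
    mission: str                                         # who does what to whom and why
    inputs: list = field(default_factory=list)           # funds, labour, skills, competencies
    activities: list = field(default_factory=list)       # deeds/tasks that generate costs
    outputs: list = field(default_factory=list)          # direct products and services
    reach: list = field(default_factory=list)            # clients, partners, stakeholders
    intermediate_outcomes: list = field(default_factory=list)
    ultimate_outcomes: list = field(default_factory=list)
    influencing_factors: list = field(default_factory=list)  # external forces

# Illustrative entry paraphrasing Exhibit 5 (values are assumptions, not official).
ozone = PerformanceFramework(
    mission="Reduce the negative impacts of human activity on the atmosphere.",
    activities=["develop voluntary codes of practice", "product labelling"],
    outputs=["voluntary codes of practice", "public awareness materials"],
    reach=["industry", "institutional partners", "consumers"],
    intermediate_outcomes=["reduction in use of ozone-depleting substances"],
    ultimate_outcomes=["protection of the ozone layer"],
)
print(ozone.ultimate_outcomes[0])
```

Keeping all eight categories together in one record makes it easy to check later that no category has been left empty, which echoes the “balance across the spectrum” advice in Step 2.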

Overall mission

A mission statement should describe “who does what to whom and why” in an organizational entity. It basically reflects the global objectives and mandate of an organization and provides the strategic direction for a policy or program.

HOW? (resources)

Inputs are the funds, labour, skill types and core competencies required to carry out activities.

Activities are the specific deeds, tasks or actions that contribute to the production of goods or the provision of services through which results are achieved. Activities typically generate costs of some kind, which need to be articulated in precise terms along with the activities performed. For example, EC activities might include:

• project management;
• developing educational materials;
• feasibility studies;
• meetings with key stakeholders;
• inspections; and
• reviewing project applications.

Outputs are the direct products and services produced through program activities. Outputs can include communication contacts, which are produced and consumed instantaneously, as well as hard-copy agreements, contracts or other physical evidence which is preserved over time. Outputs are typically considered to flow outside of a service/program function; however, they can include internal communications, plans or services. For example, EC outputs could include:

• financial assistance to community-based organizations;
• voluntary codes of practice;
• emissions standards;
• public awareness materials;
• regulations; and
• legislation.

WHO? (reach)

Reach is defined as the group, or groups, reached by program/service outputs. This may include clients as well as internal staff, co-delivery agents and other stakeholders and beneficiaries. For example, EC’s reach could include:

Primary Clients
• industry, academic institutions

Co-delivery Agents/Partners
• other government departments
• provincial and municipal governments

Other Stakeholders
• environmental groups
• service clubs, associations.

Reach is a scoping category in the Framework. It identifies the breadth and depth of influence being targeted to bring about change. It is often useful to think about primary and secondary groups being influenced by the program’s outputs.

WHAT do we want? (results)

Intermediate outcomes occur in the group(s) immediately reached by program outputs and/or in the environment. Typically, the outcomes are a perceptual, attitudinal and/or behavioural response on the part of the group(s) reached, or an improvement in the health of the natural environment. This response or improvement then leads to longer-term outcomes and impacts along a causal chain (see Exhibit 3). For example, direct outcomes could include:

• increased awareness, understanding, skills and knowledge;
• people practicing environmentally responsible behaviour (e.g., home energy conservation, proper disposal of pesticides, protection of native species/habitats);
• increased capacity of community-based groups to take action on issues of national importance;
• land use practices which demonstrate stewardship;
• reductions in releases of toxic substances; and
• increases in populations of endangered species.

This is a key category of the Performance Framework. It represents the understood “theory” of how a program’s tangible products and services bring about or contribute to broad societal or environmental change.

Exhibit 3: Results Continuum

Knowledge: awareness, understanding, skills
Attitudes: perceptions, disposition
Behaviour: involvement, compliance, desired actions
Early Effects: habitat preserved, reduced emissions
Later Effects: societal change, eco-system health

Knowledge, attitudes and behaviour are intermediate results, over which the program has more (direct) control. Early and later effects are ultimate results, over which the program has less control and works through partnerships and influence.

The ultimate outcomes of a program should relate to the mission and mandate of the program provider (e.g., if the mission relates to the reduction of negative impacts of human activity on the atmosphere, then the ultimate outcomes should show decreases in the use of ozone-depleting substances). In all cases, the ultimate impacts reflect the broader strategic objectives of EC and its role in contributing to the environment.

Influencing Factors

In addition to the above seven categories, it is important to consider outside influences which affect every category but which become most important in the reach, intermediate and ultimate outcomes areas. Indeed, outside influences such as the state of the economy or the political environment can have an overwhelming influence on program/service performance, especially for initiatives whose results occur over long periods of time.

Helpful tips! Advice on “how to” develop a Performance Measurement Framework

• involve key managers and staff in the process (consider including key stakeholders as well)
• use a “facilitator” to guide you through the process; the Commercialization and Management Practices Branch can help you here
• use the questions presented in the “Generic Performance Framework” in Exhibit 2 as prompts
• plan on either a half- or full-day session, depending on your group’s level of comfort with the concepts
• proceed through the Framework categories in whatever direction feels comfortable; experience has shown that for an established program it may be easier to start with activities and progress forward to results, while for a new initiative it may be easier to start with the results you wish to achieve with specific target groups and work backwards to outputs and activities
• be sure to focus your attention on two categories: reach and intermediate results
• after completing the Framework, step back from it and assess its quality using the checklist of questions shown in Exhibit 4
• refer to the sample performance framework in Exhibit 5, which has been developed based on EC’s goal to “contribute to ozone protection”.


Exhibit 4: Performance Framework Discussion Questions

1. Logical Consistency: Is there a logical consistency between the strategic goal/mission and ultimate results statements?

2. Controllable/External: Are intermediate results within the scope of EC’s influence? What other external factors contribute to the outcomes?

3. Timeframe/Clarity: Are the stated intermediate results achievable within a reasonable period of time? Are the results described as end states or conditions?

4. Players: Do we have a clear focus on primary and secondary target groups?

5. Secondary Impacts: To what extent must we watch for other significant consequences that may arise from our actions?

6. Approach/Outputs: Have we got a reasonable set of initiatives, adequately resourced? Can we link all activities to corresponding results? What information, knowledge and skills are required to achieve strategic goals? Are our outputs stated as tangible goods and services?

7. Balance: Have we got an appropriate balance in the strategy among results expectations, reach and resources?

8. Service: How important is client satisfaction to our initiative? Have we reflected these concerns in our intermediate results?

9. Learning: What kinds of goals and targets are relevant to organizational learning and continuous improvement?


Exhibit 5: Example Performance Framework

Mission: Reduce the negative impacts of human activity on the atmosphere and help Canadians better understand, prevent and adapt to the consequences.

RESOURCES (How?): activities / inputs / outputs
  Inputs: $, # employees; science, economic, policy and negotiation skills and competencies
  Voluntary Approach: work with associations and industries in the development of voluntary codes of practice
  Information and Awareness Tools: assist with the development and promulgation of labelling of products containing ozone-depleting substances
  Economic Instruments: contribute to the development and implementation of a “CFC tax” or levy on producers and sellers of products containing ozone-depleting substances
  Direct Government Expenditures: provide financial incentives to organizations to replace ozone-depleting substances in their products
  Command and Control: ban the use of ozone-depleting substances in products used in the marketplace

REACH (Who?): users / clients / co-deliverers / beneficiaries
  Institutional partners; industry; not-for-profit organizations; consumers

RESULTS (What do we want?): direct and intermediate outcomes
  Improved awareness, understanding and opportunities for employing voluntary approaches
  Actions on the part of institutional partners to allow instruments to work
  Reduction in use of ozone-depleting substances
  Appropriate continuance of commercial/economic activity
  Preservation of marketplace fairness

RESULTS (Why?): ultimate outcomes
  Protection of the ozone layer

Step 2: Select Key Performance Areas

Once a Performance Framework has laid out the full spectrum of performance for a program or organizational unit, the next step is to identify the most important elements to focus on in understanding and measuring your program’s success. This is a critical step. It is neither practical nor reasonable to develop a performance measurement system which addresses all aspects of a program. At this point the manager, ideally in consultation with others, needs to hone in on the most critical aspects of the program, or key performance areas.

Identify the key performance areas which are critical to understanding and measuring your program’s success.

Key performance areas are those areas within the Performance Framework which require attention to ensure the overall success of the program. When selecting key performance areas, think about it not only from the standpoint of a manager trying to ensure success but also from the perspective of internal and external stakeholders, interest groups and clients. What dimensions of performance are the focus of their interest and concern?

In the approach advocated by this Guide, focus is initially centered on identifying the key intermediate outcomes being sought by the program. This focus is consistent with current performance reporting trends in the federal government, which require departments to account for the value-added provided by their initiatives and programs. Within the context of the key outcomes selected, it is then useful, and important, to identify corresponding reach and resource concerns.

An important concept in selecting key performance areas is to ensure that attention is focused across the performance spectrum and not isolated in certain areas. It is attention to all dimensions of the Performance Framework that brings balance and coherence to one’s monitoring and reporting efforts.

While we initially center our attention on the progress being made towards the achievement of desired outcomes, it is also crucial to monitor the performance of the key processes (activities and outputs) at which the program must excel in order to ensure success, and the scope of the program’s influence relative to desired target groups.

It may be useful to think in terms of what aspects you have to focus on to “tell a story” about what your program is trying to achieve and how you are going about it. In other words, you would want to describe why the program exists, who you are trying to influence or change, what you want to accomplish over the life of the program, and how you are going to do it using what level of resources.

There may also be concerns related to relationships between categories in the Performance Framework: for example, the relationship between resources and outputs (efficiency), or the relationship between resources and outcomes (cost-effectiveness). Looking at the relationships of results, resources and reach can also allow for an analysis of strategic trade-offs (e.g., wide reach versus high-impact results).


Helpful tips! How to identify Key Performance Areas

Start by looking at the “RESULTS” column of your performance framework. Use the following questions as prompts:

• What difference do we want to make?
• From the Deputy Minister’s and key stakeholders’ perspectives, what are the key intended results that should be monitored and reported?
• What are the key indicators of success that need to be monitored over time as the program moves towards the achievement of its longer-term objectives?

Once you have determined the most critical elements related to your desired results, consider the “REACH” using the following questions:

• What are the key concerns related to coverage of target groups/locations and their acceptance of the program?
• Can you identify primary and secondary target groups whom the program is trying to influence?
• Who are the program’s key “partners”? What commitments need to be managed/monitored?

Use the following questions to consider key aspects of “RESOURCES” in terms of your activities and outputs, as well as the management processes that are important to achieving desired results:

• What are the key business activities at which the program must excel in order to ensure its success? Are there areas of risk or particular ongoing concern?
• What areas within resource management must be managed/monitored to ensure that the program sustains its ability to change and improve? How important are human, financial and technological resource considerations?

Refer to Exhibit 6, which builds on the “contribution to ozone protection” example presented as Exhibit 5 in Step 1. A key performance area would be the “increased awareness, understanding and opportunities for employing voluntary approaches”.


Exhibit 6: New Voluntary Codes of Conduct

Reach
• evidence of progress in target industry sectors (critical mass)
• number of partners (consumers, governments, associations involved in approach)

Results
• increased awareness, understanding and opportunities for employing voluntary approaches

Resources
• quality of research
• level of investment


Step 3: Select Performance Measures

Through the development and use of performance measures, progress towards achieving desired results can be examined. Measures have been defined in many different ways. In this Guide, a performance measure is simply an indicator that will tell us over time how well we are doing in each of the key performance areas.

Measures tell us which direction we are going: up or down, forward or backward, getting better or worse, or staying the same. Measures can be “quantitative” (e.g., quantities, percentages, unit costs, efficiency or productivity ratios) or “qualitative” (e.g., attitudes, opinions, observed behaviours).
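The “which direction are we going” idea can be made concrete by comparing the two most recent observations of a quantitative measure. A minimal sketch, assuming the measure is recorded as a time-ordered list of numbers (the function and variable names below are illustrative, not from the Guide):

```python
def direction(series):
    """Compare the last two observations of a time-ordered series."""
    if len(series) < 2:
        return "insufficient data"
    last, prev = series[-1], series[-2]
    if last > prev:
        return "up"
    if last < prev:
        return "down"
    return "unchanged"

# Hypothetical observations: annual use of ozone-depleting substances (tonnes).
odss_use = [120.0, 112.5, 98.0]
print(direction(odss_use))  # prints "down"
```

Whether “down” is good or bad depends on the measure (declining emissions are good; declining awareness is not), which is why each measure needs an intended user and a stated target.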

Measures must support a story line, first for managers, then for internal and external stakeholders, which explains how the resources committed to specific objectives did or did not make a difference. Exhibit 7 provides sample performance measures selected for the example case, Environment Canada’s contribution to controlling ozone depletion.

Helpful hints! Why should managers develop their own performance measures?

Performance measures should be developed by those responsible for the program. The reason for this is twofold. First, those responsible for the program are also likely to be the resident experts, and the best equipped to determine what constitutes good performance. Second, if the measure is to communicate to and motivate people, it should be something they can identify with, something which has meaning to them.


Exhibit 7: Performance Measures

Mission: Contribute to ozone protection.

Activities / Outputs (How?)

Voluntary Approach
• the EC level of effort in fiscal year xx, based on # FTEs and $000s
• # initiatives for voluntary emission standards undertaken

Information and Awareness Tools
• the EC level of effort in fiscal year xx, based on # FTEs and $000s
• # initiatives undertaken in fiscal year xx

Direct Government Expenditures
• the EC level of funding for fiscal year xx

Reach and Intermediate Outcomes (Who? Where? What do we want?)

New Voluntary Codes of Conduct
• # industry groups agreeing to set up voluntary emission targets for ozone-depleting substances x, y, z
• acceptance and agreement by other nations at conference x to voluntary emission targets

Universal Labelling
• # and types of products containing ozone-depleting substances x, y, z required to place warning information on their packaging
• % level of compliance among products in these categories

Reduction in use of ozone-depleting substances in products x, y, z
• # organizations that received incentive funding in fiscal year xx
• # organizations that received financial incentives and have replaced or reduced their use of ozone-depleting substances

Ultimate Outcomes (Why?)
• estimated level of consumption of ozone-depleting substances

Selecting Measures

In selecting measures it is always useful to consider the maxim “what gets measured gets attention”. Choosing the wrong measures may create dysfunctional behaviour on the part of program personnel trying to optimize the wrong results.

Exhibit 8 identifies some common measures applicable to Environment Canada’s programs. This may be a useful starting point for generating your own versions of key measures.

While creating measures may not be easy, many represent intuitive judgements already made about program quality. Why is this program important? Is this program helping people? How would we know? A common difficulty lies in quantifying “good” or “helpful”, or in selecting a single measure which captures the essence of overall performance.

Choosing measures is not an exact science. The process that managers go through to identify their measures is as important as the final list of measures created. Done well, this process can build commitment for focusing attention on the progress being made towards success.

Good performance measures should be meaningful, reliable, practical, and have an intended user. They should also be limited in number. Once you have developed a set of tentative measures, subject them to a screening test using the criteria set out in Exhibit 9.

Helpful tips! How to select the most appropriate performance measures

• for each key performance area, brainstorm a list of potential indicators for Results, Reach and Resources, using Exhibit 8 as a guide
• screen the list of measures against the criteria suggested in Exhibit 9.

Exhibit 8: Sample Performance Measures

Resources
• direct expenses by service/product
• # FTEs by service/product
• $000’s planned vs. actual
• # of potential contaminated sites
• # of projects funded
• $ contribution (total)
• # of inspections carried out
• acreage of sensitive habitat restored
• # of workshops/seminars held

Reach
• # of users
• % of total target population
• formal linkages (agreements, joint ventures, association memberships)
• # of properties confirmed as contaminated sites
• # of staff trained
• # of property owners requesting assistance with water conservation strategies
• $ leveraged from other parties

Results (Intermediate)
• level of client satisfaction (e.g., relevance, importance, accessibility, value-added)
• level of awareness, understanding and knowledge
• change in perceptions/attitudes
• # of contaminated sites cleaned up
• % level of compliance with regulations
• % of stored hazardous chemicals/waste converted to non-hazardous material
• level of water consumption
• % change in transit users
• # of litres reduction in fuel consumption

Exhibit 9: Screening Criteria for Performance Measures

Indicators:

MUST BE
• Relevant: meaningful; significant (on its own and in combination with others)
• Valid: measures what we want to measure

SHOULD BE
• As cost-effective as possible (collection, analysis, reporting)
• Reliable across tests and over time
• Easy to communicate and therefore insightful

NICE TO BE
• ‘Benchmarkable’ to outside groups
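The screening logic of Exhibit 9 can be sketched as a simple program. The fragment below is illustrative only: the criterion names and the three-way verdict are our own simplification of the MUST/SHOULD rows, not part of the guide.

```python
# Illustrative sketch of the Exhibit 9 screen. Criterion names and the
# verdict scheme are hypothetical simplifications of the MUST/SHOULD rows.

MUST = ("relevant", "valid")
SHOULD = ("cost_effective", "reliable", "easy_to_communicate")

def screen(measure: dict) -> str:
    """Return a verdict for one candidate measure: reject, review, or keep."""
    if not all(measure.get(c, False) for c in MUST):
        return "reject"   # fails a MUST criterion
    if all(measure.get(c, False) for c in SHOULD):
        return "keep"     # meets every SHOULD criterion as well
    return "review"       # MUSTs met, some SHOULDs missing

candidate = {"name": "# of inspections carried out",
             "relevant": True, "valid": True,
             "cost_effective": True, "reliable": True,
             "easy_to_communicate": True}
print(screen(candidate))  # keep
```

Applied to a brainstormed list, a screen like this turns Exhibit 9 into a repeatable filter rather than a one-time judgment.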


Step 4:
Determine Information “Gaps”

Once key performance areas and measures have been selected, EC managers should systematically assess their present ability to collect and report information on the key performance areas. EC managers may discover that information currently being collected is not useful for reporting on program performance. Exhibit 10 provides a model for conducting a “gap” analysis and includes notations as to important factors to be considered. The questions presented below should be considered with respect to each of the performance measures selected in Step 3.

Exhibit 10: Gap Analysis

Key Performance Areas / Indicators: What measurement information is needed to measure success?

Existing Information / Sources: What information is currently available to assess success? How good is the information (quality and consistency)? Where is the information available, and how accessible is it? What systems are in place to produce the information?

Gaps: What gaps exist between the information needed to assess success and the available information?

Methods for Closing Gaps: What methods are needed to close the gaps? Consider: surveys; electronic databases / files; case studies; focus groups; audits / evaluations.

Management Challenges / Resource Considerations: Who would gather the information, and with what frequency? How much will be required in resources? What are the costs involved? How does one resolve trade-offs between depth of information, scope and costs?

Recommended Next Steps: What actions are recommended as the next step in generating necessary performance information?


Step 5:
Develop and Implement “Measurement Strategy”

Once the “gaps” have been identified, it is important for EC managers to develop a measurement strategy, giving consideration to the potential sources of information and data collection methods.

The term sources refers to the actual source of the data to be collected. Sources can include objective or quantitative data types such as financial records, statistics, contract documents, policies, procedures, and guidelines. Qualitative data is often derived from the attitudes, perceptions, and opinions of various individuals or groups such as program managers, staff, partners, collaborators, other stakeholders, as well as users/clients.

Methods refer to the techniques or combinations of approaches which can be used for measuring performance.

Environment Canada managers must tailor their measurement strategy to suit the specific circumstances of their program, organization or service area. For example, measurement methods selected for assessing the performance of a general public awareness campaign would differ from methods used to assess the direct outcomes of a training session targeted to a specific client group, or of a funding incentive program targeted to industrial users of specific substances.

Methods for Assessing Resources

In order to simplify the discussion of performance measurement methods, the various approaches have been categorized according to the main elements of the performance framework. In other words, measurement methods typically used for assessing processes related to activities and outputs, or the use of Resources, would include:

• comprehensive audit;
• control self-assessment;
• financial analysis; and,
• work activity analysis.

Methods for Assessing Reach

Methods for measuring the Reach component of the performance framework typically use source data from user or participant records. This data may include tombstone data such as name, address, phone, fax, and e-mail, as well as data describing important aspects of the user/client interaction with the program. For example, for programs providing financial assistance to individuals or firms, it is critical to capture when the assistance was provided, for what purpose, and at what level of funding.

Methods for Assessing Results

The Results components of the performance framework deal with the measurement of immediate and long-term impacts. The types of measurement methods commonly used to assess results include:

• surveys;
• expert opinions;
• modified peer review;
• case studies;
• socio-economic impact analysis;
• cost-benefit analysis; and,
• environmental impact analysis.


Helpful tips! “How to” develop a measurement strategy:

• after completing the “gap” analysis in Step 4, identify data collection methods and sources
• refer to the chart presented as Exhibit 11 for sample indicators, methods, and sources.

Exhibit 11: Sample Measurement Strategy

Resources
• FTEs: HR database/system
• $ expenditures: financial system
• direct expenses: time reporting systems

Reach
• users: client/project database
• formal linkages: legal documents / agreements / files

Results
• self-assessed impacts (awareness, understanding, knowledge, perception, decision, action): feedback mechanism (e.g., survey); observation/documentation; tracking by third parties
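A measurement strategy of this shape is essentially a lookup from each measure to the method or source that will supply it. The sketch below is a hypothetical encoding of Exhibit 11; the area, measure and source names simply mirror the sample chart.

```python
# Hypothetical encoding of Exhibit 11 as a lookup from measure to its
# data collection method/source, grouped by key performance area.

STRATEGY = {
    "Resources": {
        "FTEs": "HR database/system",
        "$ expenditures": "financial system",
        "direct expenses": "time reporting systems",
    },
    "Reach": {
        "users": "client/project database",
        "formal linkages": "legal documents / agreements / files",
    },
    "Results": {
        "awareness": "feedback mechanism (e.g., survey)",
        "action": "observation/documentation",
    },
}

def source_for(measure: str) -> str:
    """Find the method/source for a measure across all performance areas."""
    for measures in STRATEGY.values():
        if measure in measures:
            return measures[measure]
    raise KeyError(f"no method/source defined for {measure!r}")

print(source_for("users"))  # client/project database
```

A lookup like this also makes missing methods visible at once: any measure without an entry is, by definition, still a gap from Step 4.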


Step 6:
Develop Performance Report

There are no hard and fast rules concerning the format for performance reports. The format will be governed by the information requirements of the manager or stakeholder and by the type of information collected on the performance area. Building on the case presented in Exhibits 5, 6, and 7, a sample performance report is shown as Exhibit 12.

This sample report would produce actual performance information for a set period of time (e.g., a fiscal year quarter). Once a preliminary ‘test’ report has been produced, adjustments and refinements can be made.
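The planned-vs-actual rows of such a report can be generated mechanically once the figures are in hand. The following fragment is a sketch only; the strategy names and dollar figures are invented, in the spirit of the placeholder values used in the sample report.

```python
# Sketch of producing planned-vs-actual report lines for one period,
# in the spirit of Exhibit 12. Strategy names and figures are invented.

def report_line(strategy: str, planned: int, actual: int) -> str:
    """Format one Resources row with a simple variance note."""
    if actual < planned:
        note = "under plan"
    elif actual > planned:
        note = "over plan"
    else:
        note = "on plan"
    return f"{strategy}: planned ${planned:,} vs actual ${actual:,} ({note})"

for line in (report_line("Voluntary approach", 500_000, 430_000),
             report_line("Information and awareness tools", 200_000, 200_000)):
    print(line)
```

Even this crude variance note supports the kind of comment shown in the exhibit (e.g., expenditures below plan because of cost sharing by industry associations).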


Exhibit 12: Sample Performance Report

Vision: At Environment Canada, we want to see a Canada where people make responsible decisions about the environment and where the environment is thereby sustained for the benefit of present and future generations.

Business Line Goal: Protect the health and environment of Canadians by reducing negative impacts on the atmosphere and helping Canadians better understand and adapt to these consequences.

Performance Expectations: Consumption of ozone-depleting substances to be stabilized, reduced or eliminated, and the ozone layer begins to recover.

Situation Assessment: The state of the ozone layer is a relatively mature issue. The result of a decade of science, regulation, and global action is that we are beginning to see reduced loadings of ozone-depleting substances on the environment. However, recovery from the damage done will take decades. EC provides world-class scientific expertise and leadership in atmospheric chemistry aimed at supporting domestic priorities, as well as monitoring the ozone layer and understanding the effectiveness of international policies. EC's updated National Action Plan for CFCs is currently awaiting approval by the Canadian Council of Ministers of the Environment.

Performance Accomplishments:

Voluntary approach (work with associations and industries in the development of voluntary codes of practice)
• Resources, planned vs actual: X FTEs vs Y FTEs; $000 vs $000; x initiatives vs y initiatives. Comment: financial expenditures were less than expected because of increased cost sharing by industry associations.
• Reach and results for FYxx: In fiscal year xx, industry groups a and b agreed to set up voluntary emission targets for ozone-depleting substances x, y, z, which will be at or below the international standards accepted and agreed by other nations at conference x.

Information and awareness tools (assist with development and promulgation of labelling of products containing ozone-depleting substances)
• Resources, planned vs actual: X FTEs vs Y FTEs; $000 vs $000; x initiatives vs y initiatives.
• Reach and results for FYxx: Product types a, b, c containing ozone-depleting substances x, y, z have been required to place warning information on their packaging. A random compliance check performed in fiscal year xx found x% compliance among products in these categories.

Direct government expenditures (provide financial incentives to organizations to replace ozone-depleting substances in their products)
• Resources: EC-level funding for fiscal year xx was $__ plus indirect costs of xx.
• Reach and results for FYxx: CFC emissions have continued to decline, but are not projected to reach Montreal Protocol levels by the required deadline set by the 1987 Montreal Protocol. X organizations received incentive funding totalling $xxxxxx in fiscal year xx. A follow-up review of x projects found that x% had successfully met project technical objectives, leading to y% of those funded replacing or reducing their use of ozone-depleting substances. This amounts to an estimated overall reduction in substance use of x kilograms of CFCs, y kilograms and z, etc. in fiscal year xx.

Lesson Learned: Results from a random compliance check were disappointing for product type b. EC intends to modify its approach in the area of universal labelling by redirecting additional resources to manufacturers of product type b to offset labelling costs for a period of one year.


Step 7:
Learn from Experience

The implementation of a performance-based management approach takes time. It should be viewed as an iterative process.

Performance management is being implemented throughout the world. Lessons learned from experience include:

• Top leadership support is clearly the critical element that can make or break strategic planning and performance measurement efforts;
• Personal involvement of senior line managers is critical;
• Participation of the relevant stakeholders is needed to develop useful plans and performance measures;
• Technical assistance in the design of useful performance measurement systems is often necessary but may not be available when needed;
• Uncertainty about how performance data will be used will inhibit the design of useful performance measurement systems; and,
• Start with a few measures and don’t be too concerned if the measures are not the “best” ones. The measures can be changed once a system is in place and employees are comfortable with it.


Exhibit 13: Conditions of Success

Environmental elements:
• Examination of earlier projects, such as Planning, Programming and Budgeting Systems (PPBS), to analyze why they did not achieve their objectives.

• Promotion of new organizational arrangements which enhance the chances of success.

• Stimulating demand for performance measurement systems.

Human factor:
• Perseverance (since performance measurement is an ongoing process).

• Top management commitment and involvement.

Training needs:
• Familiarization with concepts linked to the underlying principles.

• On-the-job training derived from the practical framework: first learning to frame a diagnosis,
to define objectives and to translate them into action. Then learning the capacity to construct
performance indicators: although at first they may be abstract and general, they still have an
instructive value. Using indicators in practice will also help to provide a better appreciation of
their benefits and limitations.

Management points:
• Group involvement in designing a performance measurement system. Although a system is
seen only as a part of the improvement process, group involvement is essential for the
process as a whole.

• Feedback of the information to the lower executive levels to obtain acceptance.

Methodological points:
• A uniform basis for data collection.

• Standards governing quality of information.

• An investigation of the relevant needs for information-seeking processes.

• Methodological expertise, creativity and "handmade" (and not standardized) solutions in the
system design.

• Coherence and logical relationships in the different levels of the performance measurement
systems. Otherwise, accuracy and relevance may be lost.

• Avoiding having one set of rules for one level of the organization and another for lower levels.
This implies that a performance based budget process for an agency should be paralleled by
performance oriented management within the agency. It will be a difficult task for higher
levels to talk in terms of results and performance with heads of an organization which itself is
rigidly governed by rules and regulations and does not pay much attention to results.

• Clear guidelines issued from the center.

• Awareness that it may not be easy to fit performance measurement systems in areas where
there is inevitably limited management discretion, or where management performance cannot
easily be related to outcomes.


IV. Conclusion

This guide has set out the basic concepts of a Performance Framework, provided you with suggestions on how to focus on key results areas, how to select appropriate performance measures and measurement processes, and finally how to create a credible performance report.

In planning, measuring and reporting on the performance of your program, it is critical to remember that this process takes time, perseverance and commitment by managers at all levels in your organization. It also requires consensus on the part of employees and collaboration with stakeholders.

A recent study confirms that “measurement-managed” organizations significantly outperform their peers. The same study also clarifies the benefits of a measurement strategy (see Exhibit 14).


Exhibit 14: The Value of Measurement Management

Indicators of Success (percentage of measurement-managed vs non-measuring organizations):
• Clearer agreement on strategy among senior management: 90% vs 47%
• Effective communication of strategy to organization: 60% vs 8%
• Open sharing of information: 71% vs 30%
• Good cooperation and teamwork among management: 85% vs 38%
• High levels of self-monitoring of performance by employees: 42% vs 16%
• Willingness of employees to take risk: 52% vs 22%

Source: National study of 203 firms by William Schiemann & Associates, Inc., cited in Seminar Profile: Using Measurement to Transform the Organization (New York: The Conference Board, Inc., 1996).


Appendix A: Glossary

Mission statement: A mission statement should describe ‘who does what to whom and why’ in an organizational entity. A mission reflects the global objectives and mandate of an organization for a policy or program.

Results: Effects, benefits or consequences of outputs in targeted groups over time.

Ultimate outcomes: The ultimate outcomes of a program relate to the mission and mandate of the program/service provider and the long-term effects, benefits, or consequences sought.

Intermediate outcomes: An intermediate outcome is the initial effect, benefit or consequence of an output. Typically, these outcomes are a perceptual, attitudinal and/or behavioural response on the part of the group(s) reached, and/or relate to an improvement in the health of the natural environment.

Reach: Reach is defined as the group, or groups, which are influenced by activities and outputs. The reach category has been segmented into three major groups:

• The primary targets or clients are the groups upon which the outward attention of the program/service is focused and which must be influenced to behave in a certain way in order for the program/service to achieve its mission.
• Co-deliverers, agents and intermediaries are groups which must be relied on to perform in such a way as to influence the primary targets (or clients) to adopt the desired behaviour, or which may deliver complementary services that help achieve direct, intermediate, or ultimate outcomes (e.g., standards bodies).
• Stakeholders or beneficiaries are groups which may benefit from the program/service.

Resources: Financial, human, physical, technical, information, products and processes used to achieve results.

Inputs: Inputs refer to the funds, labour skill types and core competences required to carry out activities.

Activities: Activities are specific deeds, tasks or actions that contribute to the production of goods or the provision of services through which results are achieved.

Outputs: Outputs are the direct products and services produced through program activities.

Performance measure: An indicator that provides information on the extent to which results are being achieved.

Influencing factors: Outside influences affect the achievement of activities and outputs but are most important in the reach and outcomes areas. Outside influences such as the state of the economy or the political environment can have an overwhelming influence on program/service performance, especially for initiatives whose results occur over long time periods.


Appendix B:
Environment Canada’s Performance
Measurement Strategy

Performance Measurement Strategy

Measuring performance is a key element of modern public management. It helps us determine the effectiveness and efficiency of selected strategies, assists in setting priorities, enables more effective demonstration of program impacts, and ultimately is intended to improve Departmental performance.

Some aspects of performance have always been measured. Inputs of monetary and human resources are generally tracked, and outputs such as reports produced or inspections carried out have also been counted. A major challenge in moving to results-based management is the development of measures of outcomes, that is, of the impacts of programs and services on the public and other clients.

Challenges for Performance Measurement

Changes in environmental conditions often take decades to become visible. Most environmental issues progress through a cycle that extends 25 years or more. For example, acid rain was known to have significant effects in the 1970s, yet it was not until 1985 that agreements with the provinces could be reached on cutting emissions levels, and these reductions continue to be implemented now. The actions of Environment Canada and its partners have been successful in reducing emissions, and some improvement in affected lakes has been seen; however, other areas continue to deteriorate and additional controls may yet be required. The length of this issue cycle poses difficulties for performance measurement. If an indicator of the health of aquatic ecosystems is used, it would have shown declines for many years despite effective action by the department. However, if measures of intermediate outcomes are used exclusively, they may give insufficient evidence of the improvement to the environment.


Attribution is difficult in the areas of environment and sustainable development because of the number of players that must be involved to successfully implement solutions. In part this is because jurisdiction is shared across government and between levels of government. But many issues also require the cooperation of other countries, of Aboriginal people, of industry, of community groups and of individual Canadians. Environment Canada has an important role to play in bringing together these partners and ensuring they work together toward the ultimate objective. The challenge is how to attribute responsibility for success in cases where the benefits of joint action may not have been realized without Environment Canada’s intervention, and yet the Department has certainly not achieved the result on its own.

Harms avoided through changed behaviour and preventative action are difficult to demonstrate. A large and increasing portion of Environment Canada’s work is devoted to preventing various harms from occurring. This includes the provision of weather warnings, advice on pollution prevention and eco-efficiency, and the assessment of substances before they enter the marketplace. It is impossible to say with certainty what effects would have occurred had such preventive action not been taken. While the wisdom of prevention over remediation is obvious (we need only look to the cost of cleaning up a single contaminated site or spill), the benefits of action after the fact are easier to show.

Good measures of the impacts of scientific and technological research are not yet available. For most issues, a key strategy involves using Environment Canada’s expertise to increase understanding of the nature of environmental problems, their causes, and the effects on health, property or the environment. This understanding is crucial in building support for regulatory or other control actions, for engaging domestic and international partners, and for selecting the most efficient and effective solutions. Many organizations that engage in scientific research are struggling with this problem of measuring the impacts of scientific research efforts.

Many of the final outcomes that are anticipated with the achievement of sustainable development have not yet been clearly defined. While the Government of Canada and other governments around the world have adopted the goal of sustainable development, there is a lack of clarity and consensus as to what the specific outcomes associated with sustainable development should be, and how progress toward this goal might be measured.

Our Strategy

Environment Canada’s performance measurement strategy is designed to provide meaningful information to Parliament and the public on progress toward departmental objectives while recognizing the above constraints. Specifically, Environment Canada will:

Continue to develop and report measures of the state of the environment, reduction of harm to human health and safety, and economic efficiency. These represent the ultimate outcomes of Environment Canada’s activities: making sustainable development a reality. Our success as a department will inevitably rest on our ability to effect positive change in these areas. Since many of the Department’s activities serve more than one result, outcome measures are needed to assess the combined effects of many program activities.

Develop measures of intermediate outcomes that are more directly attributable to Departmental actions. Ultimate outcomes for environmental issues are typically achieved over many years and through the actions of many players. Intermediate outcomes are effects of Environment Canada’s programs that are considered necessary for achieving ultimate outcomes, but which may not themselves provide direct public benefit.

Adjust measures of intermediate outcomes periodically as issues mature and strategies shift. As environmental issues mature, the strategies used by Environment Canada change. For example, more effort is placed on building public awareness during the middle phase of an issue, once the causes and effects are sufficiently understood but controls are not yet in place. Once controls have been implemented and new practices have been integrated into routines, this activity will decrease. Performance measures should be appropriate to the stage of the issue and to the strategy that the Department has selected.


Report measures of outputs where adequate outcome measures are not available. Measures of outputs provide valuable performance information for internal management, such as for assessing program efficiency. However, output measures are not a replacement for measures of outcomes, as they do not provide a basis for choosing among alternate strategies, or for determining whether programs are having the desired effects. Development of good measures of program outcomes is continuing, but in some cases measures of outputs may need to be used where better measures are not yet available.

Use indirect measures of the impacts of science. Since much of Environment Canada’s contribution is dependent on the quality of its scientific research and development, work is underway to develop measures of the impacts of this activity. Several types of indirect measures have been proposed, based on: the effectiveness of subsequent policies; the quality of services (for example, weather forecasting, whose accuracy depends on understanding how the atmosphere works); how well the public understands environmental issues; and the behavioural changes Canadians make in response to science.

Emphasize the integration of performance measures into decision making. Reporting performance measures externally is important, but their real value lies in promoting a culture of continuous performance improvement within the Department. To do this, measures must become part of management decision making and be “owned” by program managers. The process of determining what constitutes valid measures of performance forces a degree of rigour in thinking about program activities that can inform priority setting and the focusing of effort.

Supplement performance measures with rigorous qualitative assessments to provide a more complete picture of Departmental performance. Not everything that is important can be measured, and not everything that can be measured is important. Well-chosen examples can often convey a better impression of the impact of departmental activities than any number of measures.


Survey public, client and staff opinions of departmental performance, especially in areas where provision of services is paramount. A significant portion of Environment Canada’s programs involve the provision of services to the public or clients (including other federal departments and agencies). One of the best ways to determine whether intended benefits are being achieved is to use opinion surveys and other forms of consultation with the public and clients. A similar approach may also be used for internal administration and other service activities that provide their services within the department.

Use program evaluations and special studies to clarify the relationship between departmental actions and outcomes. Performance measures are derived from an understanding of the logic of program operation, that is, the relations of causation and influence that connect Departmental actions to ultimate effects. For environmental issues these relations are often complex. Many factors, only a few of which are under the Department’s control, affect the achievement of ultimate outcomes. Program evaluations and other studies help to identify these relations. They also provide a much more detailed picture of program performance than is possible through a small set of performance measures.


Appendix C: References

(Source: Canadian Evaluation Society, Performance-based Management


Workshop, course materials)

Monographies
Auld, D., and H. Kitchen, The Supply of Government Services, Vancouver,
The Fraser Institute, 1988.

Beeton, D., Performance Measurement: Getting the Concepts Right, London,


Public Finance Foundation, Discussion Paper 18, 1988.

Benton, Bill, et al., Management Indicators, New Zealand, Department of


Social Services, 1981.

Carley, Michael, Performance Monitoring in a Professional Public Service,


London, Policy Studies Institute, 1988.

Carter, N., R. Klien, and P. Day, How Organizations Measure Success,


London, Routledge, 1991.

Cave, M., M. Kogan, and R. Smith, Output and Performance Measurement in


Government, London, Jessica Kingsley, 1990.

Connolly, M., and S. Richards, Public Money and Management, London,


Public Finance Foundation, Vol 8, No 4, Witner 1988.

Durham, P, Output and Performance Measurement in Central Government:


Some Practical Achievements, London, H.M. Treasury, 1987.

Elkin, R., and M. Molitor, Management Indicators in Nonprofit Organizations,


University of Maryland, Baltimore, Peat, Marwick, Mitchell & Co., 1984.

Epstein, P.D., Using Performance Measurement: A Guide to Improving


Decisions, Performance and Accountability, New York, Van Nostrand

- 43-
ò Implementing Performance-based Management Manager’s Guide

Reinhold, 1984.

Government of Australia, FMIP and Program Budgeting - A Study of


Implementation in Selected Agencies, Commonwealth of Australia,
Department of Finance, Canberra, 1987.

Holzer, Marc, Productivity in Public Organizations, Port Washington, N.Y.


Kennikat Press, 1976.

Jowett, P., and M. Rothwell, Performance Indicators in the Public Sector,


Houndmills, England, MacMillan Press, 1988.

Kanter, Rosabeth M., The Measurement of Organizational Effectiveness,


Productivity, Performance and Success: Issues and Dilemmas in Service
and Non-Profit Organizations, Yale University, Program on Non-Profit
Organizations, Institution for Social and Policies Studies, 1979.

Lewis, S., Output and Performance Measurement in Central Government:


Progress in Department, London, H.M. Treasury, 1986.

MacRae, D. Jr., Policy Indicators: Links Between Social Science and Public
Debate, London, University of North Carolina Press, 1985.

Millar, A., H.P. Hatry, and M. Koss, Monitoring the Outcomes of Social
Services, Washington D.C., The Urban Institute, 1977.

Millar, R., et al., Delivering Client Outcome Monitoring Systems: A Guide for
State and Local Service Agencies, Washington D.C., The Urban Institute,
1981.

Morley, Elaine, A Practitioner's Guide to Public Sector Productivity


Improvement, New York, Van Nostrand, 1986.

OECD, The OECD List of Social Indicators, Paris, 1982.

OECD, Administration as Service to the Public as Client (V.F.


L'Administration au service du public), Paris, 1987, p. 139.

Office of the Comptroller General, Line Managers and Assessing Service to


the Public, Ottawa, April 1991.

– 44 –
Manager’s Guide ò Implementing Performance-based Management

Palmer, Stuart, British Performance Indicator Information System, London,


September 8, 1989.

United Kingdom, Financial Management in Government Departments,


London, Her Majesty's Stationary Office, September 1983, Cmnd. 9085.

U.K. Treasury, Output and Performance Measurement in Central


Government: Progress in Departments, London, S. Lewis Editor, Treasury
Working Paper No 38, 1986.

U.K. Treasury, Measuring Output and Performance: Definitions and


Analytical Framework, London, 1984.

U.K. Treasury and Civil Service Committee, Progress in the Next Step
Initiative, London, 8th Report, Session 1989-90, HMSO, 1990.

U.S.A., Federal Productivity Measurement, Washington D.C., Office of
Personnel Management, 1981.

U.S. Office of Personnel Management, Performance Management Indicators
Report, Washington D.C., OPM, 1987.

Wettenhall, R., and C. O Nuallain, Public Enterprise Performance
Evaluation: Seven Country Studies, Brussels, International Institute of
Administrative Sciences, 1990.

Wholey, J.S., K.E. Newcomer, and Associates, Improving Government
Performance, San Francisco, Cal., Jossey-Bass Publishers, 1989.

Wholey, J.S., et al., Performance and Credibility - Developing Excellence in
Public and Nonprofit Organizations, Lexington Books, Toronto, 1986.

Windle, C., Program Performance Measurement: Demands, Technologies,
and Dangers, DHHS Publication No Adm/84-1357, Washington D.C., U.S.
Government Printing Office, 1984.

Articles
Allen, John R., Sources of Performance Measurement: A Canadian
Perspective, Governmental Finance, March 1983, pp 3-7.

Altman, Stan, Performance Monitoring Systems for Public Managers, Public
Administration Review, Jan-Feb 1979, pp 31-35.

Baker, Walter, The Triple E Movement and Productivity in Canada’s Federal
Public Service, Optimum, Vol XI, No 3, 1980.

Beeton, D., Performance Measurement: The State of the Art, Public Money
and Management, Vol 8, No 1-2, Spring/Summer 1988.

Bens, Charles K., Strategies for Implementing Performance Measurement,
Management Information Service Report, International City Management
Association, Washington D.C., Vol XVIII, No 11, November 1986.

Carley, Michael, Beyond Performance Measurement in a Professional Public
Service, Public Money and Management, Vol 8, No 4, pp 23-27.

Carter, Neil, Learning to Measure Performance: The Use of Indicators in
Organizations, Public Administration, Vol 69, Spring 1991, pp 85-101.

Carter, Neil, Performance Indicators: Backseat Driving or Hands off Control?,
Policy and Politics, Vol XVII, No 2, April 1989, pp 131-38.

Clarke, Peter J., Performance Evaluation of Public Sector Programmes,
Administration, Vol XXXII, No 3, 1984.

Crompton, John L., A Sensitive Approach to Retrenching Services in the
Public Sector, American Review of Public Administration, Vol XVIII, No 1,
March 1988, pp 79-93.

Dalton, T.C., and L.C. Dalton, The Politics of Measuring Public Sector
Performance: Productivity and Public Organization, Promoting Productivity in
the Public Sector - Problems, Strategies and Prospects, New York,
St. Martin’s Press, 1988, pp 19-65.

Ewing, B.G., C. Burstein, and C. Wickman, Meeting the Productivity
Challenge in the Federal Government, National Productivity Review, Vol V,
No 3, Summer 1986.

Felix, G.H., and J.L. Riggs, Productivity Measurement by Objectives,
National Productivity Review, Oregon Productivity Centre, Autumn 1983,
pp 386-93.

Flynn, A., A. Gray, W.I. Jenkins, et al., Accountable Management in British
Central Government: Some Reflections on the Official Record, Financial
Accountability and Management, Vol IV, No 3, Autumn 1988, pp 169-89.

Flynn, A., A. Gray, W.I. Jenkins, and B. Rutherford, Making Indicators
Perform, Public Money and Management, Vol VIII, No 4, pp 35-41.

Harvey, Jean, Measuring Productivity in Professional Services, Public
Productivity Review, No 44, Winter 1987.

Hatry, Harry P., Measuring the Quality of Public Services, Improving the
Quality of Urban Management, Beverly Hills, Sage Publications, 1974,
pp 39-64.

Hatry, Harry P., The Status of Productivity Measurement in the Public Sector,
Public Administration Review, Vol XXXVIII, No 1, 1978.

Hatry, Harry P., Performance Measurement Principles and Techniques - An
Overview for Local Government, Public Productivity Review, Vol IV, No 4,
Dec. 1980, pp 312-39.

Hatry, Harry P., Determining the Effectiveness of Government Services,
Handbook of Public Administration, San Francisco, Cal., Jossey-Bass
Publishers, 1989, pp 469-82.

Hatry, H.P., J.M. Greiner, and M. Swanson, Monitoring the Quality of Local
Government Services, Management Information Service Report, International
City Management Association, Washington D.C., Feb. 1987.

Hepworth, N.P., Measuring Performance in Non-Market Organizations,
International Journal of Public Sector Management, Vol I, No 1, 1988.

Holtham, Clive, Developing a System for Measuring Departmental
Performance, Public Money and Management, Vol 8, No 4, Winter 1988,
pp 29-33.

Hurst, Gerald E. Jr., Attributes of Performance Measures, Public Productivity
Review, Vol IV, No 1, March 1980.

Jackson, Peter, The Management of Performance in the Public Sector, Public
Money and Management, Winter 1988, pp 11-15.

Klein, R., and N. Carter, Performance Measurement: A Review of Concepts
and Issues, Performance Measurement: Getting the Concepts Right, London,
Public Finance Foundation, Discussion Paper 18, 1988.

Le Pen, Claude, La productivité des services publics non marchands:
quelques réflexions méthodologiques, Revue d’économie politique, 96th year,
No 5, 1986, pp 476-89.

Mark, Jerome A., Meanings and Measures of Productivity, Public
Administration Review, Vol XXXII, No ?, Nov./Dec. 1979, pp 747-53.

Martin, P.Y., and B. Whiddon, Conceptualization and Measurement of Staff
Performance, Public Productivity Review, Vol XI, No 3, Spring 1988.

Mayne, John, and Eduardo Zapico-Goñi, Monitoring Performance in the
Public Sector, Transaction Publishers, New Brunswick, New Jersey, 1997.

Mayston, David J., Nonprofit Performance Indicators in the Public Sector,
Financial Accountability and Management, Vol I, No 1, Summer 1985.

Newcomer, K.E., Evaluating Public Programs, Handbook of Public
Administration, Jossey-Bass, San Francisco, 1996.

Neves, C.M.P., J.F. Wolf, and B.B. Benton, The Use of Management
Indicators in Monitoring the Performance of Human Service Agencies,
Performance and Credibility - Developing Excellence in Public and Nonprofit
Organizations, Lexington Books, Toronto, 1986, pp 130-48.

Newell, Terry, Why Can’t Government Be Like... Government?, Public
Productivity Review, Vol XII, No 1, Fall 1988, pp 29-41.

OECD, Measuring Performance and Allocating Resources, Public
Management Studies, Draft, No 5, Paris, 1989.

Palmer, Stuart, Report on Performance Indicators, Ottawa, Treasury Board
Secretariat, Expenditure Analysis Division, September 1989.

Poister, Theodore H., Performance Monitoring in the Evaluation Process,
Evaluation Review, Vol VI, No 5, Oct. 1985.

Pollitt, Christopher, Measuring Performance: A New System for the National
Health Service, Policy and Politics, Vol XIII, No 1, Jan. 1985, pp 1-15.

Pollitt, Christopher, Beyond the Managerial Model: The Case for Broadening
Performance Assessment in Government and the Public Services, Financial
Accountability & Management, Vol II, No ?, Autumn 1986, pp 155-170.

Pollitt, Christopher, Bringing Consumers into Performance Measurement:
Concepts, Consequences and Constraints, Policy and Politics, Vol XVI,
No 2, April 1988, pp 77-88.

Reid, Gary J., Measuring Government Performance: the Case of
Government Waste, National Tax Journal, Vol XLII, No 1, pp 29-44.

Rosen, Ellen Doree, OK Work: Incorporating Quality into the Productivity
Equation, Public Productivity Review, Vol V, No 3, Sept. 1983.

Shaw, Nigel, Productivity Developments in the United States, Management
Services, Vol XXXI, No 10, pp 8-12, and No 11, pp 8-14.

Sink, D.S., T.C. Tuttle, and S.J. DeVries, Productivity Measurement and
Evaluation: What is Available?, National Productivity Review, Summer 1984.

Stewart, J., and M. Clarke, The Public Service Orientation: Issues and
Dilemmas, Public Administration, Vol LXV, Summer 1987, pp 161-77.

Swiss, James E., Holding Agencies Accountable for Efficiency - Learning from
Past Failures, Administration & Society, Vol XV, No 1, May 1983, pp 75-96.

Swiss, James E., Unbalanced Incentives in Government Productivity
Systems: Misreporting as a Case Point, Public Productivity Review, Vol VII,
No 1, March 1983.

Torgovnik, E., and E. Preisler, Effectiveness Assessment in Public Service
Systems, Human Relations, Vol XL, No 2, 1987.

Tuttle, T.C., and D.S. Sink, Taking the Threat out of Productivity
Measurement, National Productivity Review, Winter 1984-85.

U.S. organizations are doing significant work in the area of performance
indicators, EVALTALK. (See Appendix J, p 7.)

Usilaner, Brian, Proceedings of the Symposium on Efficiency Auditing in
Government, January 26, 1989, Office of the Auditor General, Office of the
Comptroller General of Canada.

Weir, Michael, Efficiency Measurement in Government, The Bureaucrat, Vol
XIII, No 2, Summer 1984.

Wholey, Joseph S., A Program for Excellence: Human Services in a
Prosperous and Growing Region, Northern Virginia Challenges - The 1990s
and Beyond, George Mason University, 1989, pp 23-36.

Wholey, Joseph, and Kathryn E. Newcomer, Clarifying Goals, Reporting
Results, Using Performance Measurement to Improve Public and Nonprofit
Programs, Jossey-Bass Inc., San Francisco, 1997.

Williams, Allan, Performance Measurement in the Public Sector: Paving the
Road to Hell?, Arthur Young Lecture No 7, Department of Accountancy,
School of Financial Studies, University of Glasgow, 1985.

Woodward, S.N., Performance Indicators and Management Performance in
Nationalized Industries, Public Administration, Vol 64, Autumn 1986,
pp 303-317.

Wright, K., Output Measurement in Practice, Economic Aspects of Health
Services, Martin Robertson, 1978.

Zedlewski, Edwin W., Performance Measurement in Public Agencies: The
Law Enforcement Evolution, Public Administration Review, Vol XXXIX, No 5,
Sept./Oct. 1979, pp 488-94.

Websites of Interest
http://www.city.grande-prairie.ab.ca/perform.htm

Grande Prairie, Alberta: Emerging as one of the premier sites relating to
performance management, this site contains references for public and private
enterprises.

http://www.dep.state.fl.us/org/ossp/report/intro.htm

State of Florida Department of Environmental Protection: this site contains
the Secretary’s Quarterly Performance Report, which deals with measuring
and reporting environmental results.

http://www.npr.gov/initiati/mfr/index.html

The National Performance Review, in cooperation with OMB, has assembled
a new series of web pages devoted to managing for results and the
implementation of the Government Performance and Results Act. Nineteen
of the twenty case studies on strategic planning and/or performance
measurement are there, as are several agency strategic plans and
accountability reports. The intent is to keep the site up to date with links to
additional strategic plans, annual performance plans, testimony, training
opportunities, and new materials as they become available.

http://www.pmn.net

Performance Management Network web site. This site links to the sites
above and provides more background information on the Three Rs and other
concepts used in this guide.

http://www.tbs-sct.gc.ca/home_e.html

Treasury Board, Quality Services and Review: This site contains specific
Government of Canada references and cross-references to performance
management work.
