DoD Earned Value Management Implementation Guide
Earned Value Management (EVM) is a widely accepted industry best practice for program management
that is used across the Department of Defense (DoD), the Federal government, and the commercial sector.
Government and industry program managers use EVM as a program management tool to provide
situational awareness of program status and to assess the cost, schedule, and technical performance of
programs. EVM is meant to be flexible and mirror the management practices of the contractor, not to
impose burdensome requirements. Whenever possible, the Government should tailor management and
EVM requirements to leverage the contractor’s existing processes and data generated by those processes
to obtain sufficient insight into program cost, schedule, and technical performance. An EVM System
(EVMS) is the management control system that integrates a program’s work scope, schedule, and cost
parameters for optimum program planning and control.
To be effective, EVM practices and competencies must be integrated into the program manager’s
acquisition decision-making process. In addition, the data provided by the EVMS must be timely, accurate,
reliable, and auditable. Finally, the EVMS must be implemented in a disciplined manner consistent with
the 32 Guidelines contained in the Electronic Industries Alliance Standard-748 EVMS (EIA-748) (hereafter
referred to as the “Guidelines”).
The Guidelines represent characteristics and objectives of a management and control system for organizing,
planning, scheduling, budgeting, performance measurement, forecasting, analysis, and baseline change
control. As such, the guidelines are interrelated and foundational in the design, implementation, and
operation of an EVMS. A supplier therefore has the flexibility to design a management and control system that applies these guidelines in a manner that meets the organization’s unique needs for procedural guidance and implementation.
Part 1 of the Earned Value Management Implementation Guide (EVMIG) (hereafter referred to as “this
guide”) describes EVM Concepts and Guidelines. Part 2 provides guidance for Government use of EVM,
including guidance for applying EVM requirements to contracts, an introduction to analyzing performance,
and a discussion of baseline review and maintenance and other post award activities. The appendices
contain additional reference material.
Note that DoD EVM policy applies to contracts with industry, as well as to intra-government activities.
Throughout this document, the term “contract” refers to both contracts with private industry and
agreements with intra-governmental activities that meet the DoD reporting thresholds. Similarly, the term
“contractor” refers to entities within both private industry and government.
This document is intended to serve as the central EVM guidance document for DoD personnel. Throughout
the Earned Value Management Implementation Guide (EVMIG), references are made to additional sources
of information such as EVMS standards, handbooks, guidebooks, and websites. Consult these additional
sources as appropriate (reference Appendix A for a list of these documents and hyperlinks to these
resources).
TABLE OF CONTENTS
FOREWORD
PART 1: EARNED VALUE MANAGEMENT CONCEPTS & GUIDELINES
SECTION 1.1: EARNED VALUE MANAGEMENT
1.1.1 Concepts of Earned Value Management
1.1.2 EVM and Management Needs
1.1.3 Uniform Guidance
SECTION 1.2: EARNED VALUE MANAGEMENT SYSTEM GUIDELINES
1.2.1 Earned Value Management System (EVMS)
1.2.2 EVMS Guidelines Concept
1.2.3 System Compliance and Acceptance
1.2.4 System Documentation
1.2.5 Cost Impacts
1.2.6 Conclusion
PART 2: PROCEDURES FOR GOVERNMENT USE OF EARNED VALUE
SECTION 2.1: APPLYING EARNED VALUE MANAGEMENT
2.1.1 Overview
2.1.2 Government EVM Organizations
2.1.3 Roles and Responsibilities
SECTION 2.2: PRE-CONTRACT ACTIVITIES
2.2.1 Overview
2.2.2 Department of Defense Requirements
2.2.3 General Guidance for Program Managers
2.2.4 Acquisition Strategy/Acquisition Plan
2.2.5 Preparation of the Solicitation
2.2.6 Source Selection Evaluation
2.2.7 Preparation of the Contract
SECTION 2.3: POST-AWARD ACTIVITIES – INTEGRATED BASELINE REVIEWS
2.3.1 Overview
2.3.2 Purpose of the IBR
2.3.3 IBR Policy and Guidance
2.3.4 IBR Focus
2.3.5 IBR Team
2.3.6 IBR Process
2.3.7 IBR Results
SECTION 2.4: POST-AWARD ACTIVITIES – SYSTEM COMPLIANCE
2.4.1 Overview
2.4.2 EVMS Approval
2.4.3 EVMS Surveillance and Maintenance
2.4.4 System Changes
2.4.5 Reviews for Cause (RFCs)
2.4.6 Deficiencies in Approved EVMS
2.4.7 System Disapproval
2.4.8 Deficiencies in Disapproved or Not Evaluated Systems
SECTION 2.5: OTHER POST-AWARD ACTIVITIES
2.5.1 Overview
• Relate time-phased budgets to specific contract tasks and/or Statements of Work (SOWs)
• Objectively measure work progress
• Properly relate cost, schedule, and technical accomplishments
• Enable informed decision making and corrective action
• Be timely, accurate, reliable, and auditable
• Allow for estimation of future costs and schedule impacts
• Supply managers at all levels with status information at the appropriate level
• Be derived from the same Earned Value Management System (EVMS) used by the contractor to
manage the contract
• Integrate subcontract EVMS data into Prime Contractor’s EVMS
• Planning all work scope for the program from inception to completion
• Assignment of authority and responsibility at the work performance level
• Integration of the cost, schedule, and technical aspects of the work into a detailed baseline plan
• Objective measurement of progress at the work performance level with EVM metrics
• Accumulation and assignment of actual direct and indirect costs
• Analysis of variances or deviations from plans
• Summarization and reporting of performance data to higher levels of management for action
• Forecast of achievement of Milestones and completion of contract events
• Estimation of final contract costs
• Disciplined baseline maintenance and incorporation of baseline revisions in a timely manner
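The objective measurement, variance analysis, forecasting, and estimate-at-completion activities listed above rest on a small set of standard earned value formulas (cost and schedule variances and their associated performance indices). The following Python sketch is purely illustrative; the function, variable names, and sample figures are hypothetical and are not taken from this guide or from EIA-748.

# Illustrative calculation of standard earned value metrics.
# BCWS (planned value), BCWP (earned value), ACWP (actual cost), and BAC below are hypothetical sample values.

def evm_metrics(bcws, bcwp, acwp, bac):
    """Return common cumulative EVM measures for a single reporting period."""
    cv = bcwp - acwp        # Cost Variance: negative indicates a cost overrun
    sv = bcwp - bcws        # Schedule Variance (dollars): negative indicates behind plan
    cpi = bcwp / acwp       # Cost Performance Index
    spi = bcwp / bcws       # Schedule Performance Index
    eac = bac / cpi         # A simple CPI-based independent Estimate at Completion
    vac = bac - eac         # Variance at Completion
    return {"CV": cv, "SV": sv, "CPI": cpi, "SPI": spi, "EAC": eac, "VAC": vac}

# Hypothetical cumulative status in $M: planned 10.0, earned 9.0, actuals 9.5, budget at completion 40.0
print(evm_metrics(bcws=10.0, bcwp=9.0, acwp=9.5, bac=40.0))

The CPI-based EAC shown here is only one of several common independent EAC formulas; a compliant EVMS produces these measures routinely as part of its analysis and forecasting processes.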
Private companies utilize business planning and control systems for management purposes. Tailored,
adapted, or developed for the unique needs of companies, these planning and control systems rely on
software packages and other Information Technology solutions. While most of the basic principles of an EVMS are already inherent in good business practices and program management, there are unique EVM guidelines that require a more disciplined approach to the integration of management systems.
While the Guidelines are broad enough to allow for common sense application, they are specific enough
to ensure reliable performance data for the buying activity. The Guidelines do not address all of a
contractor's needs for day-to-day or week-to-week internal controls such as subcontractor status reports.
These important management tools should augment the EVMS as effective elements of program
management.
The Guidelines have been published as the Electronic Industries Alliance (EIA) standard EIA-748, Earned
Value Management Systems. The DoD only recognizes the Guideline statements within the EIA-748 and
periodically reviews the Guidelines to ensure they continue to meet the government’s needs.
The Guidelines provide a consistent basis to assist the government and the contractor in implementing and
maintaining an acceptable EVMS. The DoD Earned Value Management System Interpretation Guide
(EVMSIG) provides the overarching DoD interpretation of the Guidelines where an EVMS requirement
is applied.
The Guideline approach provides contractors the flexibility to develop and implement effective
management systems while nonetheless ensuring performance information is provided to management in
a consistent manner.
It is the contractor’s responsibility to develop and apply the specific procedures for complying with the Guidelines. Current DoD policy (Department of Defense Instruction (DoDI) 5000.02, Table 8, EVM Requirements) requires that contracts meeting certain thresholds use an EVMS that complies with the Guidelines. DoDI 5000.02 also requires the proposed EVMS to be subject to system acceptance under certain conditions (see Section 2.2 for information on thresholds for compliance and Section 2.4 for system acceptance). When the contractor’s system does not meet the intent of the Guidelines, the contractor must make the adjustments necessary to achieve system acceptance.
When the government’s solicitation package specifies compliance with the Guidelines and system
acceptance, an element of the evaluation of proposals is the prospective contractor's proposed EVMS. The
prospective contractor should describe the proposed EVMS in sufficient detail to permit evaluation of its compliance with the Guidelines. Section 2.2, Pre-Contract Activities, includes a discussion of both
government and contractor activities during the period prior to contract award. Refer to the applicable
Defense Federal Acquisition Regulation Supplement (DFARS) clauses for specific EVMS acceptance and
compliance requirements for the contract.
Upon award of the contract, the contractor utilizes the EVMS process description and documentation to
plan and control the contract work. As the government relies on the contractor’s system, it should not
impose duplicative planning and control systems. Contractors are encouraged to maintain and improve
the essential elements and disciplines of the systems and should coordinate system changes with the
government. The Administrative Contracting Officer (ACO) approves system changes in advance for
contracts that meet the threshold for the Guidelines compliance and system acceptance. Refer to DFARS
Subpart 234.2 Earned Value Management System and Paragraph 2.2.6.2.1 of this Guide for more
information on this requirement.
The government PM and EVM analysts are encouraged to obtain copies of the contractor’s System
Description and related documentation and to become familiar with the company’s EVMS. Companies
usually provide training on their systems upon request, enabling the government team to understand how
company processes generate EVMS data, the impacts of EV measurement methodology, and the
requirements for government approval of changes. Government EVMS specialists should have the latest
System Description and related documentation and familiarize themselves with the company’s EVMS
before beginning surveillance activities.
The government and contractor should discuss differences arising from divergent needs (such as the level
of reporting detail) during contract negotiations. While the Guidelines are not subject to negotiation, many
problems concerning timing of EVMS implementation and related reporting requirements are avoided or
minimized through negotiation. The contractor often uses the Work Breakdown Structure (WBS) and
contract data requirements defined in the Request for Proposal (RFP) to establish its planning, scheduling,
budgeting, and management infrastructure, including the establishment of Control Accounts (CAs), Work
Packages (WPs), and charge numbers. The Government should seriously consider the WBS and reporting
levels prior to RFP and during negotiations with the contractor. Decisions made prior to RFP have direct
impact on the resources employed by the contractor in the implementation of the EVMS and data available
to the government through the Integrated Program Management Report (IPMR). The government and
contractor should also periodically review processes and data reporting to ensure that the tailored EVMS
approach continues to provide the appropriate level of performance information to management.
1.2.6 Conclusion
Application of the EVMS Guidelines helps to ensure that contractors have adequate management systems
that integrate cost, schedule, and technical performance. This also provides better overall planning,
control, and disciplined management of government contracts. An EVMS that is compliant with the Guidelines and properly used helps to ensure that valid cost, schedule, and technical performance information is generated, providing the PM with an effective decision-making tool.
2.1.1 Overview
The intent of this guide is to improve the consistency of EVM application across DoD and within industry.
When PMs use EVM in its proper context as a tool to integrate and manage program performance, the
underlying EVMS and processes become self-regulating and self-correcting. PMs should lead this effort,
as the success of the program depends heavily on the degree to which the PM embraces EVM and utilizes
it on a daily basis.
Government PMs recognize the importance of assigning responsibility for integrated performance to the
Integrated Product Teams (IPTs). While PMs and IPTs are ultimately responsible for managing program
performance, EV analysts should assist them in preparing, coordinating, and integrating analysis.
Cooperation, teamwork, and leadership by the PM are paramount for successful implementation and
utilization. There are different support organizations that assist the program team with tailoring and
implementing effective EVM on a program. This section of the guide defines the roles and responsibilities
of the various organizations, offices, and agencies within the DoD.
The Departments of the Air Force, Army, and Navy and the Missile Defense Agency (MDA) all have
component EVM focal points.
As prescribed in DoDI 5000.02 and DFARS, compliance with the Guidelines is required for DoD cost or
incentive contracts and agreements valued at or greater than $20M. Compliance with the Guidelines and
an EVMS that has been determined to be acceptable by the Cognizant Federal Agency (CFA) are required
for DoD cost reimbursement or incentive contracts and agreements valued at or greater than $100M. If
the contract value is less than $100M, then formal compliance determination of the contractor’s EVMS is
not required; however, the contractor needs to maintain compliance with the standard. Contract reporting
requirements are included in Table 9 of the DoDI 5000.02 shown below in Figure 1.
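As a rough illustration of the dollar thresholds described above, the following sketch encodes that decision logic. It is a simplification offered for illustration only: an actual applicability determination also depends on contract type, work scope, duration, risk, and any waivers or deviations, and the function name and return strings are not drawn from DoD policy.

# Simplified sketch of the EVMS applicability thresholds discussed above.
# Real determinations also consider work scope, duration, risk, and waivers/deviations.

def evms_requirement(contract_value_m: float, cost_or_incentive: bool) -> str:
    """Return a notional EVMS requirement for a contract value expressed in $M."""
    if not cost_or_incentive:
        return "No threshold-based EVM requirement (e.g., FFP); EVMS applies only via MDA waiver"
    if contract_value_m >= 100:
        return "EVMS compliant with the Guidelines and formally accepted by the CFA"
    if contract_value_m >= 20:
        return "EVMS compliant with the Guidelines (no formal compliance determination required)"
    return "No EVM requirement; a tailored IPMR may still be used for cost/schedule visibility"

for value in (15, 45, 250):
    print(f"${value}M ->", evms_requirement(value, cost_or_incentive=True))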
EVM should be a cost-effective system that shares program situational awareness between government
and contractor. In an oversight role, a critical function of the government program office is to use all data,
including cost, schedule, and technical performance metrics, to identify early indicators of problems so
that adjustments can be made to influence future program performance. The decision to apply EVM and
the related EVM reporting requirements should be based on work scope, complexity, and risk, along with
the threshold requirements in the DFARS. Misapplication of EVM can unnecessarily increase costs for
the program.
If the government program office does not believe the full application of EVM would be beneficial, it
should contact its applicable Service/Agency EVM focal point to discuss options so that the program will
still receive the necessary and desired insight into program status. If it is agreed that the full application
of EVM is not necessary, the program office should then request a waiver and/or deviation as required by
their Component policies.
The IPMR can be tailored to collect cost and/or schedule data for any contract regardless of
whether EVM is required. For information on tailoring the IPMR, refer to the DoD IPMR
Implementation Guide.
Formats and reporting requirements for the IPMR are determined and managed by USD(A&S)
through the office of AAP.
Non-schedule-based contracts might not permit objective work measurement due to the nature of the work,
most of which cannot be divided into segments that produce tangible and measurable product(s). The
nature of the work associated with the contract is the key factor in determining whether there will be any
appreciable value in obtaining EVM information. Paragraph 2.2.2.8 describes considerations when
determining applicability of work scope.
variance analysis; Quarterly Schedule Risk Assessment (SRA); Quarterly Contract Funds Status Report
(CFSR); and a Cost and Software Data Report (CSDR) as required.
It is appropriate to not apply the EVM requirement in cases where the nature of the work would not lend
itself to meaningful EVM information. Exemptions from the EVM policy should be the exception, not the
rule, as they are necessary only in cases where a cost or incentive contract is being used for non-schedule-
based work. This type of work is typically accomplished using a Firm Fixed Price (FFP) contract. Program
offices should follow the process to obtain an EVM applicability decision.
The DoDI 5000.02 requires that the appropriate authority dependent upon ACAT level (i.e. AAP,
Component EVM focal points, CAE or designee) review and determine EVM applicability. If EVM is
determined not to apply based on the nature of the work, then EVM is not placed on contract. If EVM is
determined to apply, then EVM is placed on contract in accordance with established thresholds unless a
waiver is obtained. The Services/Agencies have the ability to delegate waiver or deviation authority from
the Federal Acquisition Regulation (FAR) or DFARS. PMs and COs should address waivers and
deviations to their applicable Service/Agency focal point for guidance, documentation requirements, and
processes.
Application of EVM methodology and system requirements for Full-Rate Production (FRP) contracts is based on risk and the contractual scope of work. FRP risks are generally low to the government; consequently, EVM deviations are often requested. If EVM is not applied, program management principles as well
well as cost and schedule reporting generally apply. The reporting should include cost information (such
as actuals and top-level schedule information providing delivery dates of end products). Historical data
integrity issues or performance risks may drive additional reporting requirements and/or the application
of EVM.
The EVMS Guidelines provide the basis for determining whether contractors’ management control
systems are acceptable. As management control systems for development and production contracts tend
to differ significantly, it is impossible to provide detailed implementation guidance that specifically
applies to all cases for every contractor. Therefore, users of the guidelines should be alert for areas in
which distinctions in detailed interpretation seem appropriate or reasonable, whether or not they are
specifically identified. Interpretation of the guidelines must be practical as well as sensitive to the overall
requirements for performance measurement. By applying the guidelines instead of specific DoD
prescribed management control systems, contractors have the latitude to meet their unique management
needs. This allows contractors to use existing management control systems or other systems of their
choice, provided they meet the guidelines.
The same EVM reporting requirements in Figure 1 apply to production efforts. However, in more mature production efforts, the risk associated with the contract may not be commensurate with the application of EVM.
Programs are encouraged to consult with EVM focal points to determine if a waiver and/or deviation is
an option and to develop alternative program management and reporting strategies and approaches.
Government EVM stakeholders recognize the significance of M/ERP systems in program management of
production contracts requiring EVM implementation and compliance. The National Defense Industrial
Association (NDIA) Integrated Program Management Division’s white paper, “Earned Value
Management in a Production Environment” indicates that an “MRP system is one example of a tool used
in production that potentially drives differences in how an EVMS is used or explained versus
development.” Understanding these differences is paramount to confirming compliance with the
Guidelines. M/ERP systems affect the operation and/or process of almost every EVMS applied on
development contracts. Examples include work authorization processes, the way the IMS is used, how
parts are moved both within and between contracts, how supplier or material cost and performance are
recorded, Control Account Manager (CAM) involvement in baseline development and performance
assessment, and WBS level.
As contractors are ultimately responsible for demonstrating compliance with the Guidelines, it is expected
that their EVM System Description and related documentation include language that identifies and
describes in detail areas where EVM processes differ for development and production contracts. In
addition, contractors should explain how each process complies with the Guidelines. Contractors should
refer to the DoD EVMSIG when describing guideline compliance in the differences section of their EVMS
Description. However, there is no requirement for the differences section when contractors elect to have
separate EVMS descriptions for production and development contracts.
As described in the applicable Authority and Guidance memo and the DoD Other Transactions Guide for Prototype Projects, OTAs and MTAs have special considerations, but must still be managed and able to produce information needed for effective management control of cost, schedule, and technical risk.
However, in extraordinary cases where cost/schedule visibility is deemed necessary and the contract type
(e.g., FFP) is determined to be correct, the government PM is required to obtain a waiver for individual
contracts from the MDA. In these cases the PM conducts a Business Case Analysis (BCA) that includes
supporting rationale for EVMS application (see Appendix C: Essential Elements of a Business Case
Analysis for guidance). When appropriate, include the BCA in the acquisition approach section of the
program AS report. In cases where the contractor already has an EVMS in place and plans to use it on the
FFP contract as part of its regular management process, negotiate EVM reporting requirements before
applying an EVM requirement. However, government personnel should not attempt to dissuade
contractors that use EVMS on all contracts irrespective of contract type from their use of EV processes to
manage FFP contracts.
See Paragraph 2.2.5.6.3.4 for guidance on tailoring EVM reporting on FFP contracts.
Keep in mind the potential impact to the CFSR, which can be applied to all contract types with the
exception of FFP. It may be advisable to call for separate reporting by contract type in the CFSR. The
following examples illustrate these concepts.
Example 1: The planned contract is a development contract with an expected award value of $200M. At
the time of award, the contract type is entirely Cost Plus Award Fee (CPAF). Subsequent to award, some
additional work is added to the contract on a T&M CLIN.
Solution: Apply full EVM and IPMR reporting at the time of award to the entire contract but exempt the
T&M efforts from IPMR reporting at the time they are added to the contract. However, the T&M efforts
extend over several years, and the PM wishes to have a separate forecast of expenditures and billings. The
CFSR data item is therefore amended to call for separate reports for the CPAF and T&M efforts.
Example 2: The planned contract is a mix of development and production efforts with an anticipated value
of $90M. At the time of award, the development effort is estimated at $10M under a CPAF CLIN, and the
production is priced as FFP for the remaining $80M.
Solution: After conducting a risk assessment, the PM concluded that the risk did not justify EVM and
IPMR reporting on the FFP production effort and that there was not sufficient schedule risk to justify an
IMS. The PM noted that the development effort fell below the mandatory $20M threshold and, based on
a risk evaluation, determined that EVM was not applicable. However, a CFSR is determined to be
appropriate for the development portion of the contract to monitor expenditures and billings. A CFSR
would not be appropriate for production, as it is priced as FFP.
Example 3: A planned contract calls for development and maintenance of software. The overall value of
the development portion is $30M, and the maintenance portion is $170M. Development is placed on a
CPIF CLIN, while maintenance is spread over several Cost Plus Fixed Fee (CPFF) CLINs. It is anticipated
that the majority of the maintenance effort should be LOE. The PM is concerned about proper segregation
of costs between the efforts and has determined that there is significant schedule risk in development. The
PM is also concerned about agreeing up front to exclude the maintenance portion from EVM reporting.
Since there is a specified reliability threshold that is maintained during the operational phase, performance
risk has been designated as moderate. There are key maintenance tasks that can be measured against the
reliability threshold.
Solution: Place EVMS DFARS on the contract and apply IPMR reporting to the development portion at
the time of contract award. Specific thresholds are established at contract award for variance reporting for
the development effort. IPMR reporting is also applied to the maintenance portion of the contract. Format
1 reporting is established at a high level of the WBS, with Format 5 reporting thresholds for maintenance
to be re-evaluated after review of the EVM methodology during the IBR. Variance reporting then
specifically excludes WBS elements that are determined to be LOE. CFSR reporting is also required for
the entire contract with a requirement to prepare separate reports for the development and maintenance
portions, as they are funded from separate appropriations. Format 6 is required for the development effort
but not for the maintenance effort. A CAE waiver is provided to allow for departure from the required 7
Formats.
Example 4: An IDIQ contract is awarded for a total value of $85M. The delivery/task orders include four
delivery/task orders for software development, each under $20M, each with a CPIF or CPFF contract type.
Each delivery/task order’s scope is for a software iteration that culminates in a complete software product.
There is also a material delivery/task order for material purchases of $26M. The estimated contract values
of the delivery/task orders are as follows:
Delivery/Task Order 1: $26M FFP for material purchases (i.e., computers and licenses)
Delivery/Task Order 2: $15M CPIF software development, iteration #1, 12 months
Delivery/Task Order 3: $11M CPIF software development, iteration #2, 12 months
Delivery/Task Order 4: $16M CPFF software development, iteration #3, 12 months
Delivery/Task Order 5: $17M CPFF software development, iteration #4, 12 months
Solution: Each delivery/task order can have different contract types. An IDIQ contract can be awarded to
a single vendor or multiple vendors. Per DoDI 5000.02, for IDIQ contracts, inclusion of EVM
requirements is based on the estimated ceiling of the total IDIQ contract, and then is applied to the
individual task/delivery orders, or group(s) of related task/delivery orders, that meet or are expected to
meet the conditions of contract type, value, duration, and work scope. The EVM requirements should be
placed on the base IDIQ contract and applied to the task/delivery orders, or group(s) of related
task/delivery orders. “Related” refers to dependent efforts that can be measured and scheduled across
task/delivery orders. The summation of the cost reimbursement software development delivery orders is
$59M (i.e., delivery orders 2-5). These are a group of related delivery orders. The EVMS DFARS should
be placed on the base contract and each of the delivery orders within this group. IPMR reporting for all 7
Formats should be applied.
Example 5: A planned contract calls for discrete and LOE type CLINs and is CPAF. This effort is
primarily to provide the execution of Post Shakedown Availabilities for four ships, which includes support for tests and trials; a relatively small amount of materials may also be required. Each Post Shakedown Availability is a discrete effort that lasts for 12-16 weeks, and the Independent Government Estimate states
that on average each Post Shakedown Availability will cost about $17.5M (i.e., $8M under completion
type CLINs and $9.5M under LOE type CLINs). Altogether for four ships, the anticipated contract value
is approximately $70M, of which $32M is completion type and $38M is LOE type. The PM intends to tailor the IPMR in order to gain insight into program status.
Solution: Using the calculations provided, there is a total of $32M of completion type CLINs on this CPAF contract. Using the contract type and dollar thresholds only, the EVMS DFARS would be applied
on the contract since $32M is greater than $20M. However, the scope as described is not the type of scope
that would benefit from adhering to a compliant EVMS. Therefore, an EVM applicability determination
from the cognizant official to not apply EVM should be pursued. The EVM applicability decision should
describe the scope of work and the alternative approach planned to ensure insight into program status. In
this case, the PM has decided to use a tailored IPMR. For the $38M of LOE scope, an applicability
determination from the cognizant official should also be pursued.
In conclusion, every contract is carefully examined to determine the proper application of reporting. The
preceding examples illustrate the various factors to evaluate when determining the appropriate level of reporting. Every contract is different, and the analyst is encouraged to work with the
PM and EV focal points to determine the appropriate requirements.
In some cases, the contract may meet the contract criteria thresholds and EVM applicability determination
based on work scope, but the PM still wishes to exempt EVM for other reasons. In those cases, the
appropriate authority must review and approve the exclusion of DFARS clauses and waivers of mandatory
reporting. A situational example is the award of a “Fixed Price Incentive” contract in a mature, production
environment, which establishes an overall price ceiling and gives the contractor some degree of cost
responsibility in the interim before a firm arrangement can be negotiated. The PM evaluates the risk in the
contract effort and requests an EVM waiver through its component EVM focal point for evaluation by the appropriate authority. However, if a program has received a determination of non-
applicability, then a DFARS waiver or deviation is not required.
*NOTE ON FIGURE 2: DECISION PROCESS FOR EVM APPLICATION: The PM has the option
to make a business case to apply EVM outside the thresholds and application decision.
All formats should be submitted electronically in accordance with the DoD-approved schemas as
described in the IPMR DID. The Contract Data Requirements List (CDRL) specifies reporting
requirements.
Any program with EVM reporting requirements regardless of ACAT level can use the EVM-CR to collect
and store EVM reporting data.
As previously stated, the CDRL defines EVM reporting requirements IAW DI-MGMT-81861 Integrated
Program Management Report (IPMR). The PM should tailor reporting requirements based on a realistic
assessment of management information needs for effective program control within the requirements
prescribed in DI-MGMT-81861 and the IPMR Implementation Guide. The PM can tailor requirements
that optimize contract visibility while minimizing intrusion into the contractor’s operations. Government
reporting requirements are to be specified separately in the contract using a CDRL (DD Form 1423-1 or
equivalent). The solicitation document and the contract should contain these requirements. The PM is also
engaged in the evaluation of the proposed EVMS during source selection. See Appendix D: Sample Award
Fee Criteria for examples that can be used as a summary checklist of implementation actions.
The Acquisition Plan reflects the specific actions necessary to execute the approach established in the
approved AS and guides contractual implementation. The procuring activity should explain in the
management section of the Acquisition Plan the reason for selection of contract type and the risk
assessment results leading to plans for managing cost, schedule, and technical performance. Refer to the
FAR, Subpart 7.1.
NOTE: Until there is a final rule on the new DFARS clauses, use the existing clauses.
For contracts valued less than $100M, inclusion of the following paragraph in the SOW is suggested: “In
regards to DFARS 252.234-7001 and 252.234-7002, the contractor is required to have an EVMS that
complies with the EVMS Guidelines; however, the government will not formally accept the contractor’s
management system (no compliance review).”
The level of detail in the EVM reporting, which is placed on contract in a CDRL referencing the IPMR,
should also be based on scope, complexity, and level of risk. The IPMR’s primary value to the government
is its utility in reflecting current contract status and projecting future contract performance. It is used by
the DoD component staff, including PMs, engineers, cost estimators, and financial management personnel
as a basis for communicating performance status with the contractor. In establishing the cost and schedule
reporting requirements, the PM shall limit the reporting to what can and should be effectively used. The
PM shall consider the level of information to be used by key stakeholders beyond the PMO. When
established comprehensively and consistently with CWBS-based reports, EVM data is an invaluable
resource for DoD analysis and understanding. Consider how the PMO is or may be organized to manage
the effort, and tailor the reporting to those needs.
The government should consider the management structure and reporting levels prior to RFP and during
negotiations with the contractor when the government identifies a WBS and contract data requirements.
The contractor often uses the framework defined in the RFP to establish its planning and management
infrastructure, including the establishment of CAs, WPs, and charge numbers. Decisions made prior to
RFP have direct impact on the resources employed by the contractor in the implementation of the EVMS
and data available to the government.
When finalizing contract documentation, determine the last significant milestone or deliverable early and
include it in the CDRL Block 16. Forward thinking minimizes required contract changes at the end of the
program Period of Performance when it is time to adjust or cease EVM reporting on the contract.
NOTE: The EVM data provided by the contractor can provide a secondary benefit to the cost estimators
during the CSDR planning process. IPMR reporting should be managed by the PMO to include
considerations from the cost, engineering, logistics, and other Government communities in order to ensure
the data will be of use in the future. While the PMO team manages the EVM data process, several other
communities rely on this information to make data-driven predictions of future program costs and
performance characteristics.
The program office should have an internal process to review and approve all CDRLs for the contract.
The EVM analysts at each acquisition command should provide assistance in tailoring the IPMR. The
IPMR is a program management report, and the CDRLs should be prepared by or discussed with the PM.
The IPMR applies to all contracts that meet EVM application requirements. However, for contracts valued
at or greater than $20M but less than $50M, it is recommended that IPMR reporting requirements be
tailored. Tailoring to the specific needs of the program is highly encouraged and is described in greater
detail below. Sample DD Forms 1423-1 for the IPMR are included in Appendix E: Sample CDRL Forms.
In addition, refer to Service or Agency data managers for CDRL templates.
2.2.5.6 Tailoring Guidance for the Integrated Program Management Report (IPMR)
2.2.5.6.1 Introduction
As the IPMR conveys information about the performance of a program or contract, it should always be
carefully tailored to meet the needs of the PM and the program team. As such, the IPMR is a useful means
of communicating program status from the contractor to the government. It should reflect how the
contractor is using EVM as a tool to manage contract performance. This section describes additional
tailoring options beyond tailoring specific IPMR formats that may be considered when preparing contract
data deliverable requirements.
The primary challenge for the joint team is to tailor the report so that it meets these primary needs without allowing it to degenerate into a report that can only be used to analyze historical costs. Careful attention
is therefore required during the proposal and contract definitization stages to tailor the IPMR DID (DI-
MGMT-81861).
2.2.5.6.2.1 Complexity
Complexity factors can usually be attributed to technical risk, schedule risk, or cost risk. An Integrated
Risk Assessment performed by the program team prior to contract award can help identify these risk
factors and their interdependence. This analysis can pinpoint specific WBS elements with the highest risk
that can be highlighted for more detailed reporting (i.e., reporting at lower levels of the CWBS on the
IPMR cost and schedule performance by WBS, narrative of analysis and variances, IMS, and time-phased
historical and forecast cost submission).
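The sketch below is a purely notional illustration of using Integrated Risk Assessment results to select CWBS elements for more detailed reporting; the scoring scale, threshold, and element names are hypothetical and are not prescribed by this guide.

# Notional selection of high-risk CWBS elements as candidates for lower-level IPMR reporting.
# Risk scores (1 = low, 5 = high), the threshold, and the element names are hypothetical.

risk_assessment = {
    "1.1 Air Vehicle": {"technical": 4, "schedule": 3, "cost": 4},
    "1.2 Software":    {"technical": 5, "schedule": 5, "cost": 3},
    "1.3 SE/PM":       {"technical": 1, "schedule": 2, "cost": 1},
    "1.4 Training":    {"technical": 2, "schedule": 2, "cost": 2},
}

def high_risk_elements(assessment, threshold=10):
    """Flag elements whose combined technical, schedule, and cost scores meet or exceed the threshold."""
    return [wbs for wbs, scores in assessment.items() if sum(scores.values()) >= threshold]

# Elements flagged here would be candidates for more detailed reporting in the CDRL.
print(high_risk_elements(risk_assessment))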
Schedule risk is often overlooked for its contribution to driving contract performance and cost overruns.
The IMS requirement supports schedule assessment and identification of Critical Path (CP) impacts.
A thorough SRA, focusing on integration efforts (e.g., hardware/software, subcontractor effort, material,
etc.), should identify elements that require management attention. The PMO should conduct an SRA as
early as possible in the planning phase to aid in refining the contract reporting requirements (see Paragraph
2.2.5.7.5 for related information on the requirement for the contractor to conduct SRA as part of the IMS).
The type and number of risk elements also differ depending on program phase. It is critical for the PMO
to identify any risk areas for the contract to ensure adequate reporting visibility prior to tailoring the
CDRL. Specify areas of risk in the CDRL for more detailed reporting.
Block 12 (Date of first submission): Enter “See Block 16” and describe further in Block 16. Normally,
the first submission is specified to be made no later than 12 working days after the end of the second full
accounting period following the contract ATP.
Block 13 (Date of subsequent submissions): Enter “See Block 16”; describe further in Block 16.
The IPMR DID specifies delivery timing of the IPMR. The default for negotiations should be the timing
specified in the DID. This requirement may be tailored through contract negotiations to allow later
submission as allowed in the DID, provided that the contractor and government agree that the program
complexity and/or integration of subcontractor and vendor performance data warrant additional time and
would yield more accurate performance data. Contractor justification should include reporting data
integration as the primary reason for needing additional time. Highly complex contracts that require a high
degree of integration of performance reporting from contractor partners or subcontractors may require
additional time to integrate performance data. Contractors may also elect to attach subcontractor IPMRs
and/or reference this analysis in the prime contractor’s narrative of analysis and variances to the
government in order to gain time efficiencies and meet submission dates. In addition, the program office, via CDRL language, may explicitly require the contractor to attach subcontractor IPMRs.
Flash Data: If desired by the government and agreed to by the contractor, specify that cost and schedule
performance data should be delivered as flash data within seven working days and that remaining formats
should be delivered later per the delivery timeframe specified in the DID.
Final Submission: Final submission should be specified within Block 16 as well and typically is specified
as “when the last significant milestone/deliverable as defined by the contract has been achieved and
remaining risk areas have been mitigated” with program office agreement/acknowledgement. If no
significant milestone/deliverable can be identified, use 95% complete as the default stopping point, with
a final IPMR delivery at contract completion. Refer to section 2.5.6 for additional items to consider when
pre-planning for final IPMR submission.
Block 16. This block is used to tailor the requirements in the DID. Tailoring can include WBS reporting
levels, required formats, reporting frequencies, designation of time periods for baseline and staffing data,
variance reporting thresholds, and delivery options. These are described below in more detail.
Evaluate and change the reporting level of WBS elements periodically, as necessary, to ensure that the
IPMR continues to satisfy the PM’s needs. Reviewing the amount or type of work remaining is imperative
prior to making decisions to change reporting. Things to consider include type or amount of work
remaining, whether or not remaining work includes risky GFE or contractor-supplied material, anticipated
major modifications, schedule and cost trends, significant milestone completion, percent complete,
risk/opportunities remaining, and phase of program. If the PM is satisfied that the type and amount of work remaining on the contract no longer warrants the current level of EVM reporting, then ceasing or reducing EVM reporting should be considered.
If a CCDR requirement has also been placed on the contract, there may be a difference between the CCDR
and IPMR as to the allocation and reporting of General and Administrative (G&A) indirect costs. CCDR
requires G&A to be collected and reported separately as an “add” item on the CCDR reports. However,
the IPMR DID allows the contractor flexibility in assigning responsibility and allocating costs for all
indirect costs, including G&A, across the WBS elements. If the contractor does allocate G&A to the WBS
elements in the IPMR, the program office may wish to ask for an additional IPMR cost and schedule
performance data report by WBS coinciding with the CCDR report submission that mirrors the non-
allocation of G&A. The purpose of this additional report would be to reconcile with the CCDR reports,
but this should not drive additional variance reporting.
The time-phased historical and forecast cost submission is required at the same level as the cost and schedule performance data by WBS. Optionally, the government may define reporting at a lower level.
2.2.5.6.3.2.4 Designation of Time Periods for IPMR Staffing and Baseline Data
The IPMR DID requires the contractor to provide IPMR staffing and baseline data by specified
periods or periodic increments and as negotiated with the procuring activity. Those specified
periods should be consistent between the two. The CDRL specifies that the next six months are
separately identified and followed by quarterly, six-month, annual, or other increments specified
by the program to complete. The following example demonstrates how the reports may be
specified in the CDRL. EXAMPLE: The baseline data should contain the baseline at the beginning
of the month and changes to that baseline during the reporting period, resulting in the baseline at
the end of the month. The staffing data contains staffing forecasts in Full Time Equivalents (FTEs)
that are consistent with the contractor’s most likely Estimate at Completion (EAC).
The government should require the minimum amount of variance analysis that satisfies its
management information needs but adequately addresses all significant variances. Excessive
variance analysis is burdensome and costly and detracts from the IPMR's usefulness, while too
little information is equally undesirable. However, a formal record of performance issues and
mitigation efforts is a means to maintain transparency and situational awareness. It is important to
consider where there is risk in the program when determining what schedule variances to report;
ideally variance reporting and risk management are aligned. Additionally, the use of contractor formats and informal means (e.g., regular performance meetings) should be maximized to gain the most useful and current insight into program performance.
The CDRL should be explicit as to how the government is notified of the reportable variance pool, and optionally how the government will notify the contractor of the reportable variance WBS elements.
Block 16 should include a statement that cost and schedule variance analysis thresholds be
reviewed periodically (normally semiannually) to determine if they continue to meet the
government's information needs. If they do not, change the thresholds at no cost to the government.
There is no prescribed basis via OSD policy for identification of significant cost and schedule
variances for reporting. The government may specify any one of several ways to identify such
variances, including but not limited to the following:
Fixed Number of Variances. Specify the number of variances to be analyzed. The significance of
these variances can be based on any of the following: current month, cumulative to date,
at-completion estimates, or assessments of risk areas as identified through the
government/contractor management review process. Any number of significant variances may be
selected, but the government should be careful to select only the number needed for effective
program management. Government leads are accountable for all data received and should take
action as appropriate.
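As an illustration of the fixed-number-of-variances approach described above, the sketch below ranks WBS elements by the magnitude of their cumulative cost and schedule variances and selects the top N for formal analysis. The data structure, element names, and dollar values are hypothetical; an actual implementation would draw on the contractor's IPMR data and could just as easily rank by current-period or at-completion variances.

# Illustrative selection of the top-N significant variances from cumulative, IPMR-style data.
# WBS identifiers and dollar values are hypothetical.

wbs_data = [
    # (WBS element, cumulative cost variance $K, cumulative schedule variance $K)
    ("1.1.1 Air Vehicle", -450.0, -120.0),
    ("1.1.2 Propulsion",    75.0,  -30.0),
    ("1.2.1 Software",    -220.0, -510.0),
    ("1.3.1 SE/PM",         10.0,    5.0),
]

def top_variances(data, n=2):
    """Rank elements by the larger of |CV| and |SV| and return the top n for formal variance analysis."""
    ranked = sorted(data, key=lambda row: max(abs(row[1]), abs(row[2])), reverse=True)
    return ranked[:n]

for wbs, cv, sv in top_variances(wbs_data, n=2):
    print(f"{wbs}: CV={cv:+.1f}K, SV={sv:+.1f}K")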
Specific Variances. In this methodology, the PMO selects elements for variance analysis only after
reviewing cost and schedule performance data. Using this method, the IPMR is delivered promptly
after the contractor's accounting period ends with all required information. Once the government
has reviewed this performance data, it selects specific variances for analysis by the contractor.
These variances should align to the risk management process where the PMO sees risk.
Notification will be provided within an agreed upon timeframe prior to the CDRL delivery date.
Given that risk and the critical path may change over the life of the contract, this method may be the most efficient, allowing the government to pinpoint areas to be analyzed. As there may be some months when a review of the performance data yields few or insignificant variance analysis candidates, it is also the most flexible. When using this methodology it is important to consult with the PMO and keep the PMO informed of the variance reports. As one of the key areas of EVM, variance analysis and reporting facilitates true integrated program management. However, this method
should only be used if the government is certain it has sufficient resources to review each monthly
IPMR promptly to select the variances for which explanations are needed.
2.2.5.6.3.3 IPMR Tailoring on Cost or Incentive Contracts Valued at Less Than $20M
There is no EVM requirement for contracts valued less than $20M (see section 2.2.2.3.1).
However, in cases where the IPMR CDRL will be utilized, there are more options available in
tailoring the IPMR. IPMR cost and schedule performance data, narrative of analysis and variances,
IMS, and time-phased historical and forecast cost submission are recommended, and variance
analysis can be scaled down to include the top variances. The level of reporting is dependent on
the contract risk regardless of value. The following tailoring options are available depending on
the level of risk:
• Significant variances can be identified and defined by the contractor
• IPMRs may be submitted entirely online
• Formal variance analysis may be replaced with internal reports or status meetings.
2.2.5.6.3.4 IPMR Tailoring Guidance for Firm Fixed Price (FFP) Contracts
Only the MDA can grant a waiver allowing application of the EVMS DFARS to an FFP contract
(see Paragraph 2.2.2.1). However, a waiver from the MDA is not needed if the government wishes
to receive the IMS only and will not be applying the EVMS DFARS.
Once a waiver is granted, apply only the minimal EVM reporting requirements necessary to provide the
government team with the desired visibility into program performance. Since cost exposure is
minimized in an FFP environment, the government may elect to receive the IMS in order to
manage schedule risk.
In addition to the tailoring guidance described in the preceding paragraphs, the following guidance
should aid in tailoring the IPMR for FFP contracts:
NOTE: These exceptions from standard IPMR reporting do not apply to contracts that have CCDR
requirements. These contracts report costs by CWBS and the total profit/fee as a separate line item
in accordance with DoD 5000.4-M-1, CSDR Manual, and the CWBS DID (DI-MGMT-81334D).
2.2.5.6.3.4.2 Baseline
This report is optional for FFP contracts but may be required when there is a high potential for
significant changes in requirements or sequence of activities. It may be important for the PMO to
understand the changes to time phased resources in the baseline.
2.2.5.6.3.4.3 Staffing
Not recommended for FFP contracts.
If the government is more concerned about schedule performance than cost performance, it may limit or eliminate variance analysis of significant cost variances and Variances at Completion (VAC), focusing attention on schedule
variances. Another alternative is to eliminate the narrative of analysis and variances altogether and
to rely on the written analysis provided as part of the IMS data item.
The narrative of analysis and variances may be optional if the contractor and government agree on
alternate methods of understanding performance (e.g., weekly team status meetings, online access
to contractor internal reports, or line of balance schedules).
The IMS shows “how” and “when” the IMP is accomplished. It should be an extension of the
information contained within the IMP or high-level program plan, reflecting not only the events,
significant accomplishments, and criteria identified in the IMP but also the tasks subordinate to
the criteria. IMS quality should be such that it provides a key tool for ensuring consistency of
actions and unity of purpose among program team members. The IMS should describe a realistic
and supportable schedule consistent with the IMP and the EVM PMB as applicable. The network
should determine the flow of the IMS.
The IMS is an integrated, networked schedule containing all the detailed discrete WPs and
Planning Packages (PPs) (or lower level tasks/activities) necessary to support the events,
accomplishments, and criteria of the IMP (if applicable). The IMP events, accomplishments, and
criteria are duplicated in the IMS. Detailed tasks are added to depict the steps required to satisfy
each criterion. The IMS should be directly traceable to the IMP and should include all the elements
associated with development, production, and/or modification and delivery of the total product
and/or program high level plan. Durations are entered for each discrete WP and PP (or lower level
task/activity), along with predecessor/successor relationships and any constraints that control the
start or finish of each WP and PP (or lower level task/activity). The result is a fully networked
“bottom-up” schedule that supports CP analysis. Note that although durations are assigned at the
WP and PP (or lower level task/activity) level, these durations roll up to show the overall duration
of any event, accomplishment or criterion. When LOE WPs or tasks/activities are included in the
IMS, clearly identify them as such. LOE should not drive the Driving Path(s)/CP.
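To make the notion of a fully networked, bottom-up schedule that supports CP analysis concrete, the sketch below performs a simple forward pass over a hypothetical task network (durations plus predecessor links) and walks back along driving predecessors to identify the longest path. Real IMS tools also handle constraints, calendars, lags, and backward-pass float calculations; the task names and durations here are invented for illustration.

# Minimal forward-pass critical path illustration over a hypothetical task network.
# Each task has a duration in working days and a list of predecessor task names.

tasks = {
    "Design":    {"duration": 20, "preds": []},
    "Fabricate": {"duration": 30, "preds": ["Design"]},
    "Code SW":   {"duration": 25, "preds": ["Design"]},
    "Integrate": {"duration": 10, "preds": ["Fabricate", "Code SW"]},
    "Test":      {"duration": 15, "preds": ["Integrate"]},
}

def forward_pass(tasks):
    """Compute early finish for each task; assumes tasks are listed in a valid topological order."""
    early_finish, driver = {}, {}
    for name, t in tasks.items():
        start = max((early_finish[p] for p in t["preds"]), default=0)
        early_finish[name] = start + t["duration"]
        # Record the predecessor that drives this task's start (None for tasks with no predecessors).
        driver[name] = max(t["preds"], key=lambda p: early_finish[p]) if t["preds"] else None
    return early_finish, driver

finish, driver = forward_pass(tasks)
task = max(finish, key=finish.get)   # last task on the longest (critical) path
path = []
while task is not None:              # walk back along driving predecessors
    path.append(task)
    task = driver[task]
print("Critical path:", " -> ".join(reversed(path)), "| total duration:", max(finish.values()), "days")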
NOTE: When the work is being done in an Agile environment, visibility to lower level detail (e.g.,
stories) is not in the IMS; therefore, there is no network logic applied to the lower level details.
The lower level details are contained in the contractor’s Agile toolset and are necessary for
determining the appropriate percent complete of the capability or next higher level where IMS
visibility lies. The government team must be an integral member of the vendor team in
daily/weekly scrum meetings, using the Agile metrics as a measure of progress. See Appendix A
for a link to the Agile and Earned Value Management: A Program Manager’s Desk Guide.
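As an illustration of the percent-complete roll-up described above, the following minimal sketch (hypothetical story data and function name, not prescribed by any DID) shows one way a capability-level IMS task could derive its percent complete from story status held in the contractor's Agile toolset:

    # Minimal sketch (hypothetical data): rolling Agile story status up to a
    # percent complete for the capability-level IMS task.
    def capability_percent_complete(stories):
        """stories: list of (story_points, is_done) tuples from the Agile toolset."""
        total = sum(points for points, _ in stories)
        done = sum(points for points, is_done in stories if is_done)
        return 0.0 if total == 0 else 100.0 * done / total

    # Hypothetical capability with five stories (points, completed?)
    stories = [(8, True), (5, True), (13, False), (3, True), (5, False)]
    print(f"Capability percent complete: {capability_percent_complete(stories):.1f}%")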
Block 12 (Date of first submission): Tailor the first submission to reflect a higher level of planning
or a detailed IMP, and require subsequent IMS submissions to follow the DID specifications.
Block 13 (Date of subsequent submissions): Enter “See Block 16” and describe further in Block
16. To align with the IPMR submissions, deliver the IMS in accordance with the IPMR delivery
requirements. Please note that the most current schedule should be available as
soon as the statusing process is complete. Additional efforts may be needed to integrate schedule
data with cost performance data.
If specified in the CDRL, the contractor may be required to submit subcontractor IMS reports.
Subcontractors with an EVM flowdown should status twice, once according to their accounting
calendar and once according to the prime contractor’s accounting calendar, if different. As a
minimum, the prime contractor would have to work with the subcontractor to provide current status
for the parallel tasks that are in the prime contractor’s IMS. It is also recommended that, if the
government requires subcontractor IMS reports, the delivery specify the status date. Schedules
statused on the same date support comparison and development of the program critical path(s).
However, subcontractor schedules not statused on the subcontractor’s own accounting date will not
integrate with the subcontractor’s cost performance data in the IPMR.
2.2.5.7.4.1 IMS Tailoring Guidance for Contracts Valued At or Greater Than $20M, But
Less Than $50M
The government monitors the progress of contracts valued at $20M - $50M with the IPMR IMS.
As with the rest of the IPMR, requirements for variance reporting and the SRA can be tailored.
While there is no “standard” size for an IMS, the contractor should strive to build the IMS of
sufficient detail to describe the program for the government’s evaluation and to manage its own
day-to-day execution and monthly control of the program/project and the PMB. The identification
of workflow interdependencies at the appropriate level is of prime importance and basic to all
network schedules. The analysis should include a narrative describing the current CP to the
program and the Driving Path to the next planning block milestone (e.g., Preliminary Design
Review, Critical Design Review, 1st Flight, etc.), changes to the CP and IMP, and/or major
program milestone impacts. The contractor may wish to eliminate the requirement to monitor and
report Near-Critical Path (NCP) or Driving Path progress. Variance reporting, including
thresholds, may be adjusted to reflect the size and complexity of the contract. The contractor may
wish to perform the SRA on a less frequent basis prior to the start of selected critical milestones
like Preliminary Design Review, Critical Design Review, Flight Test, etc.
Driving Path(s) should report all progress and exceptions (e.g., missed baseline starts and finishes)
to date by WBS to facilitate traceability to the IPMR Format 1. The ‘lowest level’ must be defined,
and a requirement linking to the WBS must be established.
The analysis should explain changes to CP/NCP/Driving Path WPs/PPs (or lower level
tasks/activities) from submission to submission as well as any changes to the IMP. The impact of
CP changes on major program milestones or other major schedule risk areas should also be
discussed. Work around, recovery schedules/plans, and associated impacts due to program changes
should also be provided. The schedule narrative should address progress to date and discuss any
significant schedule changes (e.g., added/deleted WPs, PPs, or tasks/activities; significant logic
revisions; and changes in programmatic schedule assumptions).
The IMS may also include Schedule Visibility Tasks: tasks with durations but no assigned resources
that are not part of the budgeted program scope yet could potentially impact the logic-driven network
and the critical path. Schedule Visibility Tasks typically represent external elements, such as GFE,
Customer Furnished Equipment, capital equipment, hardware shipping spans, "Wait" times or
"Scheduled Maintenance" times for equipment or Government activities such as review of
submitted CDRL items per the contract. Schedule Visibility Tasks can also provide insight into
activities being done by subcontractors with an FFP contract.
Additional information on the use of Schedule Margin and Schedule Visibility Tasks may be found
within the NDIA Planning & Scheduling Excellence Guide (PASEG).
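The following is an illustrative sketch only, not a PASEG or IPMR DID requirement, of one way a team might flag Schedule Visibility Task candidates in a task list: tasks that carry duration but no budgeted resources and fall outside the contracted scope. The field names and sample tasks are hypothetical:

    # Illustrative sketch only: flag tasks with duration but no budgeted resources
    # and outside contract scope as Schedule Visibility Task (SVT) candidates.
    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        duration_days: int
        budgeted_hours: float
        in_contract_scope: bool

    tasks = [
        Task("Government CDRL review", 20, 0.0, False),
        Task("GFE delivery wait", 30, 0.0, False),
        Task("Integrate subsystem A", 15, 480.0, True),
    ]

    svt_candidates = [
        t for t in tasks
        if t.duration_days > 0 and t.budgeted_hours == 0.0 and not t.in_contract_scope
    ]
    for t in svt_candidates:
        print(f"SVT candidate: {t.name} ({t.duration_days} days)")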
Finally, the analysis should be able to forecast future potential delays and problems. This type of
analysis should be done as needed and provided to the government and the program team to assist
in the schedule risk mitigation process.
The Program Critical Path is the sequence of discrete tasks/activities in the network that has the
longest total duration through the contract. Accordingly, discrete tasks/activities along the CP have
the least amount of float/slack. The standard for a networked schedule means that all discrete
contractual tasks or activities are logically networked both horizontally and vertically with
predecessor/successor logic, duration, and resources (when available) such that an accurate CP can
be electronically calculated by the scheduling software application. (NOTE: Far term activities
may be held at a higher level of definition but should still be included in the network calculation.)
The CP also includes the associated CP program milestones, key tasks/activities, and IMP events.
Schedule logic should exist at the lowest level within the schedule and minimize the use of
constraining dates. Following these general principles should result in a valid schedule network
and CP. A fully networked schedule is always advisable.
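To illustrate the principle, the sketch below shows how scheduling software typically derives total float and the critical path from durations and finish-to-start predecessor/successor logic. The four-activity network is hypothetical, and real tools handle additional relationship types, lags, and constraints:

    # Minimal sketch (finish-to-start logic only, hypothetical activities) of how a
    # scheduling tool derives total float and the critical path from durations and
    # predecessor/successor relationships.
    durations = {"A": 5, "B": 10, "C": 4, "D": 7}
    preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

    # Forward pass: earliest start/finish, in topological order
    es, ef = {}, {}
    for t in ["A", "B", "C", "D"]:
        es[t] = max((ef[p] for p in preds[t]), default=0)
        ef[t] = es[t] + durations[t]

    # Backward pass: latest finish/start, in reverse topological order
    project_finish = max(ef.values())
    succs = {t: [s for s in preds if t in preds[s]] for t in durations}
    lf, ls = {}, {}
    for t in ["D", "C", "B", "A"]:
        lf[t] = min((ls[s] for s in succs[t]), default=project_finish)
        ls[t] = lf[t] - durations[t]

    total_float = {t: ls[t] - es[t] for t in durations}
    critical_path = [t for t in durations if total_float[t] == 0]
    print("Critical path:", critical_path)   # ['A', 'B', 'D']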
The driving path is the longest sequence of discrete tasks/activities from now to a selected interim
contract milestone. Discrete tasks/activities on the driving path have the least amount of total
float/slack to the interim contract milestone. If a task on a driving path slips, the interim contract
milestone will slip. The driving path may not be part of the contract critical path. The government may
specify which driving path is currently reportable. Without government direction, the contractor
reports the driving path to the next major event, at a minimum.
A detailed network schedule should clearly identify activities, product hand-offs, and deliverables
from internal and external interfaces, from the lowest level of contract tasks/activities up to the
summary level schedule activities and milestones. The determination of external significant and
critical interfaces to be identified within the IMS requires agreement between the contractor and
government and is documented accordingly.
LOE activities may be included or excluded in the network based on contractor standard
procedures. LOE activities should not drive the CP, and this can be avoided by including LOE
activities on the IMS without network logic. If LOE activities are included within the IMS, they
are clearly identified as such. As a best practice, understand that LOE WPs (or lower level
tasks/activities), by definition, cannot influence an event-driven schedule and are not required to
be included in the IMS.
If inclusion is desired to maintain consistency with the cost system, include them in such a way
that they do not yield erroneous CPs. LOE is required to be in the IMS whenever a resource-driven
schedule is constructed utilizing resource limitations/constraints. In these cases, LOE is required
to be included in the schedule along with the interdependencies with discrete work.
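One way to check the guidance above, sketched here with hypothetical data, is to verify after each schedule update that no task flagged as LOE appears on the calculated critical path:

    # Illustrative health check (hypothetical data): verify no LOE work package
    # sits on the calculated critical path, per the guidance above.
    loe_flags = {"A": False, "B": False, "C": False, "D": False, "PM support": True}
    critical_path = ["A", "B", "D"]   # e.g., the result of a CP calculation like the one sketched earlier

    loe_on_cp = [t for t in critical_path if loe_flags.get(t, False)]
    if loe_on_cp:
        print("WARNING: LOE tasks driving the critical path:", loe_on_cp)
    else:
        print("No LOE tasks on the critical path.")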
sequencing of all work authorized on the contract in a manner compatible with IMP events and/or
key milestones. Detailed subordinate schedules include, at a minimum, all discrete WPs and PPs
(or lower level tasks/activities) as determined by the contractor’s internal processes. If difficult to
identify logical ties to other discrete work, the connection to the next succeeding IMP event and/or
key milestone is recommended. The IMS should be defined to the level of detail necessary for day-
to-day execution and monthly control of the program/project and the PMB.
The SRA employs software that uses Monte Carlo simulation, varying the remaining duration of each
WP and PP (or task/activity) within its estimated range, to determine a cumulative confidence
curve. The software performs simulated “runs” of the entire program schedule many times while
randomly varying the remaining durations according to a probability distribution. The results
indicate a “level of confidence” for completing key milestones, events, WPs, PPs (or
tasks/activities) by specific dates. The contractor uses its own SRA software to conduct its
assessment; the government SRA is performed with the SRA software of its choosing.
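The following minimal sketch illustrates the Monte Carlo mechanics described above: sample each remaining duration from a triangular distribution built from three-point (minimum, most likely, maximum) estimates, recompute the completion duration for each run, and report the confidence of meeting a target date. A simple serial chain of three hypothetical tasks stands in for the full network logic that actual SRA tools evaluate:

    # Minimal sketch of the SRA mechanics: sample each remaining duration from a
    # triangular (min, most likely, max) distribution, recompute total duration each
    # run, and report the confidence of meeting a target. A serial chain stands in
    # for the full network logic used by real SRA tools.
    import random

    three_point = {                 # hypothetical remaining-duration estimates (days)
        "Design": (20, 25, 40),
        "Build": (30, 35, 60),
        "Test": (15, 20, 35),
    }
    target_days, runs = 95, 10_000
    random.seed(1)

    hits = 0
    for _ in range(runs):
        total = sum(random.triangular(lo, hi, mode) for lo, mode, hi in three_point.values())
        hits += total <= target_days

    print(f"Confidence of finishing within {target_days} days: {100 * hits / runs:.1f}%")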
When an SRA submission is requested, the prime contractor performs the assessment and submits
to the government at the required CDRL intervals. As part of its SRA requirement, the prime
contractor reports most likely, minimum, and maximum remaining durations for each WP, PP,
and/or task/activity on the CP/NCP and Driving Path/Near-Driving Path to selected major
task(s)/milestone(s) with documentation of the assumption and rationale of the three-point
estimates.
When an SRA is specified in the CDRL as part of the risk management process, the government
conducts periodic SRA with the participation of the prime contractor to provide the program
management team with an understanding of the potential schedule impacts.
The prime contractor conducts an SRA and submits the assessment, three-point duration
estimates, and rationale to the government. Government technical (or other qualified) personnel
should review the three-point remaining duration estimates, supporting rationale, and assumptions.
Where there are questions or differences in opinion, the government technical expert contacts the
CAM to discuss and try to reach an understanding or agreement.
For purposes of efficiency, it is important that the review be completed in the shortest time
possible. The SRA should then be performed again. If differences in three-point duration estimates
or in assumptions and rationale remain, the contractor and government should conduct separate SRAs.
While there is no “standard” size for an IMS, the contractor should strive to build an IMS that is
adequately detailed to describe the program for the government’s evaluation and to manage its
own day-to-day execution. The identification of workflow interdependencies at the appropriate
level to identify the CP is of prime importance and basic to all network schedules.
The statusing and reporting of progress may be less frequent than that of cost type contracts, and
variance reporting, including thresholds, may be adjusted to reflect the size and complexity of the
contract. The contractor may wish to eliminate the requirement to perform an SRA or perform
them on a less frequent basis. Alternative methods of monitoring schedules in an FFP environment
include Line of Balance and MRP reporting. If an IMS is still desired, ensure that there is
traceability between the IMS and the alternate methods.
2.2.6.2.2 Proposal Submissions Greater than $20M and Less than $100M
If the offeror proposes to use an EVMS that has not been previously accepted, the proposal
includes a written description of the management procedures the offeror will use and maintain in
the performance of any resultant contract. The description of the offeror's EVMS should be in
sufficient detail to show how it complies with the Guidelines. Aspects such as manufacturing,
material, and subcontract management should be included. DFARS clause 252.234-7001 describes
the requirements for this documentation. This clause also requires a matrix that cross references
provisions of the EVMS description to the Guidelines.
The offeror may elect to use and apply an accepted EVMS to meet this requirement and can assert
whether a CO has accepted the offeror’s EVMS.
2.2.6.3 Evaluation
The proposal evaluation process typically includes evaluation of the proposed EVMS against the
Guidelines. The source selection team should ensure that the offeror has described provisions to
flow down EVM requirements to the appropriate subcontractors. Each proposal should also be
reviewed for adequate WBS development and resource adequacy for EVM implementation and
support of the IBR. The offeror’s proposed IMS is evaluated for realism and completeness against
the SOW (refer to local source selection policy and procedures for further guidance).
If the offeror asserts that it has an approved EVMS, the CO shall confirm the assertion using
the Contract Business Analysis Repository. If the CO is unable to validate the assertion using the
Contract Business Analysis Repository, the CO shall request the contractor provide documentation
of the approval or plan to obtain compliance. The Procuring Contracting Officer (PCO) shall obtain
the assistance of the administrative contracting officer in determining the adequacy of an EVMS
plan that an offeror proposes for compliance with the Guidelines, under the provision at DFARS
252.234-7001, Notice of Earned Value Management System. The Government will review and
approve the offeror's EVMS plan before contract award.
When an offeror proposes a plan for compliance with the Guidelines, the CO shall forward the
offeror’s plan to the EVMS functional specialist to obtain an assessment of the offeror’s ability to
implement a system compliant with the Guidelines. The EVMS functional specialist shall provide
its assessment of the offeror’s plan to the CO within the timeframe requested.
2.2.6.4 Clarification
An on-site examination of an offeror's proposed system should not generally be required during
proposal evaluation. When any aspect of the system is not clearly understood, however, the offeror
may be requested to provide clarification. This may be done by written communications or an on-
site visit. Such action should be coordinated with other relevant competent authorities, including
the Source Selection Board and Procuring Activity. Care should be exercised during the entire
review process to ensure that the offeror and the government have the same understanding of the
system described in the proposal. If it is necessary to review plans and reports from other contracts
executed by the offeror, concurrence of that procuring activity is to be obtained.
The IBR’s purpose and objectives should be viewed as part of a continuing process. The goal of the IBR is
for the government and contractor to achieve a shared understanding of the risks inherent in the
PMB and the management control processes needed to execute the program. Unlike the CR that
focuses on EVMS compliance with the Guidelines, the IBR focuses on understanding the realism
of performing to the baseline.
The IBR is a tool that should be used as necessary throughout the life of the contract. Key benefits
of the IBR are:
• Laying a solid foundation for mutual understanding of project risks
• Government insight into the contractor’s planning assumptions and the resource constraints
built within the baseline
• Ensuring that the PMO budget can support the funding requirements of the contractor’s PMB
• Comparing expectations of PMs and addressing differences before problems arise
• Correction of baseline planning errors and omissions
• In-depth understanding of developing variances and improved early warning of significant
variances
IBRs shall be initiated as early as practicable and conducted no later than 180 calendar days after
contract award/ATP, the exercise of significant contract options, the incorporation of major
modifications, or as otherwise agreed.
The IBR should not be considered a one-time event or single-point review. IBRs are also
performed at the discretion of the PM or when major events occur within the life of a program.
Such events include a significant shift in the content and/or time-phasing of the PMB and reaching
the start of the production option of a development contract. Other events that affect the PMB and
may prompt a decision to conduct a subsequent IBR include significant baseline changes, major
contract execution risk changes, AS changes, and government directed funding profile changes.
An IBR should also be conducted whenever an OTB or OTS is implemented.
Incremental IBRs are an alternative approach for long, complex development efforts. In an
incremental IBR, the baseline is reviewed for an increment of time that corresponds to the
contractor’s planning cycles. For example, the baseline may be planned in detail from contract
award to Critical Design Review, and this becomes the basis for the first incremental review. The
first incremental review should also include the top-level planning for the remaining effort.
Conducting incremental IBRs does not abrogate the contractor’s responsibility to plan the full
baseline in as much detail as possible. Other incremental reviews occur over time as the remaining
baseline is planned in detail. Incremental IBRs are not suitable for contracts that are only a few
years in duration or for production contracts. Continuous assessment of the remaining PMB and
program risks aids the PM in identifying when to conduct a new IBR.
The incremental IBR approach should be taken in the case of an Undefinitized Contract Action.
The IBR should precede definitization if definitization will not occur within 180 days. A review
of the known work scope should be conducted within the 180-day window. Follow-up IBRs are
scheduled for the remaining work. Any incremental IBR event should not be driven by
definitization but should represent an event driven plan to assess the baseline for the work. A letter
from the CO to the contractor may be needed to clarify initial IBR requirements.
Additional guidance is contained in a guide prepared by a joint OSD / NDIA team, The Program
Manager’s Guide to the Integrated Baseline Review Process. While this is not a detailed how-to
guide, it describes the key attributes of the IBR and establishes a framework for improving
consistency of the IBR across DoD. In addition, the Services and Agencies may have supplemental
guidance.
The government and contractor should begin discussing the coverage of the IBR as soon as
possible after contract award. The IBR focuses on assessing the baseline realism at the lowest level
and other baseline related risk evaluations as necessary. The following section should help in
establishing the focus for the IBR.
Selection of these CAs should result in at least 80% of the PMB value selected for review. Low
dollar value CAs or LOE accounts may be candidates for exclusion.
The contractor should provide a matrix that lists all CAs, names of responsible CAMs, approved
budget amounts, and EV techniques. This listing represents all performance budgets on the
contract. This list should be jointly reviewed for selection of the CAs per the guidance discussed
above.
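One possible way to build the candidate list, sketched below with hypothetical control accounts and budgets, is to select CAs in descending budget order, skipping LOE accounts, until at least 80% of the PMB value is covered:

    # Illustrative approach (hypothetical values): select control accounts in
    # descending budget order until at least 80 percent of the PMB value is
    # covered, treating LOE accounts as candidates for exclusion.
    control_accounts = [                     # (CA, budget $K, is LOE?)
        ("1.1 Air Vehicle", 5200, False),
        ("1.2 Propulsion", 3100, False),
        ("1.3 Software", 2400, False),
        ("1.4 SE/PM", 1800, True),
        ("1.5 Training", 600, False),
    ]
    pmb_value = sum(b for _, b, _ in control_accounts)

    selected, covered = [], 0
    for name, budget, is_loe in sorted(control_accounts, key=lambda ca: ca[1], reverse=True):
        if is_loe:
            continue                         # LOE accounts may be excluded
        selected.append(name)
        covered += budget
        if covered >= 0.8 * pmb_value:
            break

    print(f"Selected {selected}, covering {100 * covered / pmb_value:.0f}% of the PMB")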
Functional disciplines that should be included on the team are program management, subcontract
management, and technical management (i.e., systems engineering, software engineering,
manufacturing, integration and test engineering, and integrated logistics support). Business
managers, cost analysts, schedule analysts, EVMS specialists, and COs provide support. The CMO
and, in particular, the EVMS specialist should actively participate. The size and composition of
the team should reflect the PM’s objectives, expectations, and risk assumptions.
After designation of an IBR team, conduct joint training for all members of the IBR team, including
basic training in EVM baseline concepts as necessary. Give specific training for the IBR three to
four weeks before the review. As part of the IBR training, the contractor should provide a short
overview of the specific baseline documents to be reviewed, using an example of a single thread
trace through a CA. Contractor participation in the government IBR training can be structured to
leave more time for CA discussions during the in-person portion of the IBR.
Preparation includes the development of an IBR plan by the joint team. An IBR planning schedule
can be developed for joint discussion. This schedule should be statused weekly or bi-weekly as the
planning for the IBR commences and include the following elements: IMS iterations and
finalization, CA budgeting, and RAM finalization.
The PMO may wish to hold an IBR workshop with the contractor to develop and agree to the
elements of the IBR plan. This plan should include the following elements:
• Selection of CAs
• Summary level risk discussions
• IBR team membership
• Training schedule
• Further preparation or document review by the team prior to the IBR
• Planned dates and agenda for the review
• Risk evaluation criteria
• Documentation templates
Facilities should be a consideration to ensure that IBR introductory briefings, CA discussions, and
out-brief presentations are comfortably conducted with the required number of attendees. During
IBR preparatory meetings, it will be determined how many concurrent CA discussions will be
necessary based on evaluation of the risk areas by the government PM.
To help facilitate and start the discussion, a baseline discussion starter template is shown in Figure
5. Tailorable to reflect the contractor’s terminology, this template provides a framework to guide
the discussion and review of the CA.
While no formal IBR report is required for external distribution, the PM should write a memo for
the record and attach all documentation for the official program files. Also, while there is no “pass
or fail” to an IBR, the measure of a successful IBR is when both PMs can answer the following
question with confidence, knowing where and which risks lie ahead:
Do we have an understanding of the risks associated with executing this contract (i.e., technical
work scope) given the available schedule and budget constraints?
After the close of the IBR, emphasis shifts to ongoing management processes, including effective
EVM and risk management processes. Completion of the IBR allows the PMO and contractor to
have a better understanding of ongoing performance relative to the baseline. The IBR also enables
a continuous, mutual understanding of program risks. As a result, the PMs can more effectively
manage/mitigate risk and control the cost/schedule performance of the contract.
These objectives can be met through a system approval process for applicable contracts, consistent
surveillance practices, and a controlled approach to system changes for all contracts. Industry
ownership of EVM as an integrated management tool is fostered through corporate commitment,
partnering for joint surveillance, and establishing internal control systems to minimize system
deficiencies. This partnering approach meets the needs of DoD for reliable performance data and
executable contracts while also meeting the needs of industry for a consistent DoD approach to
EVM implementation.
If the offeror submits a proposal greater than $100M, the offeror should assert that it has a system
that has been determined to comply with the Guidelines or prepare a plan for compliance and
submit the plan as part of the proposal. The plan shall:
• Describe the EVMS the offeror intends to use in performance of the contract and how the
proposed EVMS complies with the Guidelines
• Distinguish between the offeror's existing management system and modifications proposed
to meet the Guidelines
• Describe the management system and its application in terms of the Guidelines
• Describe the proposed procedures for administration of the Guidelines as applied to
subcontractors
• Describe the process the offeror will use to determine subcontractor compliance with the
Guidelines
• Provide milestones that indicate when the offeror anticipates that the EVMS will be
compliant with the Guidelines
The government will review and approve the offeror’s EVMS plan before contract award.
If the offeror submits a proposal less than $100M, the offeror should assert that it has a system that
has been determined to comply with the Guidelines or submit a written description of the
management procedures it will use and maintain in the performance of any resultant contract to
comply with the requirements of the EVMS clause. The description shall include:
• A matrix that correlates each Guideline to the corresponding process in the offeror’s written
management procedures
• The process the offeror will use to determine subcontractor compliance with the Guidelines
This guidance directs the contractor to show that the system complies with the Guidelines. The
plan to become compliant includes not only the actions to be taken but also the timeline to achieve
those actions.
2.4.2.3.1.1 CR Team
Within the DoD, the DCMA is responsible for determining EVMS compliance. Assigned to
coordinate review activities between agencies, the Review Director approves the assignment of
the team members and establishes the areas of review to be emphasized at the outset of the review.
The Review Director and team members are formally assigned to the team. It is recommended that
the team include members from the PMO and CMO. Team members should be experienced with
and understand the Guidelines. Knowledge of both the program and the contract is desirable.
Formal training, such as that provided by the member schools of the Defense Acquisition
University (DAU) or other recognized educational institutions, is recommended. Skills may also
be obtained by training and experience in implementing, maintaining, and operating EVMS.
The Review Director should make all necessary arrangements to ensure availability of team
members for the time required for preliminary indoctrination, training, and each review for which
a team member is needed.
2.4.2.3.1.2 CR Process
The CR begins as soon as possible following the implementation of the EVMS. The review
consists of System Description and related documentation reviews, data tests, and interviews with
contractor personnel. The contractor’s EVMS is assessed against each Guideline.
The contractor should have a current approved written System Description available. Applicable
procedures also need to be available at the contractor’s operating levels as necessary to
demonstrate a consistent approach. The review team examines the contractor’s working papers
and other documents to ascertain compliance and to document its findings. The contractor should
make documents used in the contractor’s EVMS available to the team. The documentation needs
to be current and accurate. The contractor demonstrates to the team how the EVMS is structured
and used in actual operation.
The CR may include, but is not limited to, the following activities:
• A data-driven assessment using standard test metrics prior to the review. This data-driven
assessment focuses the on-site assessment of the review team (a minimal sketch of one such data
test follows this list).
• An overview briefing by the contractor to familiarize the review team with the proposed
EVMS.
• A review of the documentation that establishes and records changes to the baseline plan for
the contract, work authorizations, schedules, budgets, resource plans, and change records,
including MR and UB records. The purpose is to verify that the contractor has established
and is maintaining a valid, comprehensive integrated baseline plan for the contract.
• A review, on a sample basis, of the reporting of cost and schedule performance against the
baseline plan, along with appropriate analyses of problems and projection of future costs.
• A test to summarize the cost/schedule performance data from the lowest level of formal
reporting (normally the CA level) to the external performance measurement report. The
purpose of this activity is to verify the adequacy of the control aspects of the system and the
accuracy of the resulting management information.
• Interviews with a selected sample of CAMs, functional and other work teams, and PMs to
discuss issues discovered during the data-driven assessment.
• An exit briefing covering the team's findings. During this briefing, any open system
discrepancies should be discussed along with the contractor's corrective action plan, which
establishes responsibility and a time-frame for corrective action.
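As a minimal sketch of the data-driven assessment referenced in the first bullet above, the following illustrates one commonly cited type of data anomaly check: control accounts reporting earned value (BCWP) in a period with no corresponding planned value (BCWS). The record layout and values are hypothetical and do not represent any specific test metric:

    # Hedged sketch of one common data-quality test a data-driven assessment might
    # apply: flag control accounts reporting earned value (BCWP) in a period with
    # no corresponding planned value (BCWS). Record layout is hypothetical.
    records = [                     # (CA, period, BCWS, BCWP, ACWP) in $K
        ("1.1", "2024-01", 100.0, 90.0, 95.0),
        ("1.2", "2024-01", 0.0, 40.0, 42.0),   # EV claimed with no plan -> anomaly
        ("1.3", "2024-01", 80.0, 0.0, 5.0),
    ]

    anomalies = [(ca, period) for ca, period, bcws, bcwp, _ in records
                 if bcwp > 0 and bcws == 0]
    for ca, period in anomalies:
        print(f"Anomaly: CA {ca}, period {period}: BCWP reported with zero BCWS")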
NOTE: If, at the time of award, the contractor’s EVMS has not been formally approved by the
ACO, the contractor applies its current system to the contract and takes timely action to implement
its plan to obtain compliance. If the contractor does not follow the implementation schedule in the
compliance plan or correct all system deficiencies identified during the CR specified in that plan
within a reasonable time, the CO may take remedial action.
2.4.2.3.1.3 CR Results
At the conclusion of the CR, the Review Director is responsible for a written report. The written
report shall be amended to reflect progress against the contractor’s corrective action plan to resolve
material discrepancies identified during the CR. System approval is granted to the contractor
through the ACO. Contractual actions may be initiated when CR results dictate (see paragraphs
2.4.6.1, 2.4.6.2, 2.4.6.3, and 2.4.6.4).
When a contractor has a previously accepted EVMS, conduct additional EVMS CRs only to
reassess compliance if the contractor’s system approval was withdrawn following a Review for
Cause (RFC). The most important element in ensuring continuing compliance with the Guidelines
is not the "one-time" review leading to system approval but the continuous surveillance process.
In the interest of fostering contractor ownership, the DoD encourages contractors to responsibly
conduct continuous self-evaluation of their EVMS in partnership with the government. The
contractor should use the Guidelines as the basis for assessing its system compliance.
The CMO has the primary responsibility for surveillance of the prime contractor and specified
subcontractor EVMS (see paragraph 2.4.3.5 for a discussion of surveillance of subcontractors with
flow down EVMS requirements).
The surveillance team should establish a communication plan with the buying activity. The
communication plan will allow the program EVMSS to better understand the compliance issues
that are impacting the government reports and the EVMS surveillance specialists to better
understand pertinent program events and quality/utility of surveillance reports. It will also allow
programs with contracts less than $100M to submit issues to the surveillance team that may
warrant further examination.
2.4.3.3.5 Contractor
The contractor is encouraged to conduct its own internal surveillance program to ensure its EVMS
continues to meet the Guidelines, is implemented on a consistent basis, and is used correctly on
all applicable contracts. The contractor’s internal surveillance program should not replace the
government surveillance process.
The contract administration office should coordinate government surveillance efforts with the
contractor. Joint surveillance between the IST and the contractor is encouraged and, if established,
should be documented in a Joint Surveillance Plan.
If deficiencies are discovered in the contractor's compliance with the Guidelines, the EVMS
Functional Specialist documents the problem and then notifies the contractor and PMO of the
problem along with any corrective action required. The EVMS functional specialist follows up to
ensure the deficiency is resolved in a timely manner. EVMS problems that cannot be resolved with
the contractor through the EVMS functional specialist are reported to the ACO for resolution.
The EVMS functional specialist reviews the IPMR and related internal data flow on a recurring
basis. The EVMS functional specialist provides the PM with an independent and complete
assessment of the accuracy and timeliness of IPMR information. These reports specifically
highlight issues that could affect contract milestones or areas of considerable cost, schedule, or
technical risk.
The EVMS functional specialist documents and maintains surveillance results as part of a
chronological record of the contract. The contract administration office may provide surveillance
information to the PM.
The prime contract administration office function normally is limited to evaluating the
effectiveness of the prime contractor's management of the subcontract. However, there may be
occasions when the PM or prime contractor requests, through the ACO, that the government
perform limited or complete EVMS surveillance. Such support administration is not to be
construed as a discharge of the prime contractor's contractual obligations and responsibilities in
subcontract management. Such assistance should generally be provided only when:
• The prime contractor is unable to accomplish the required surveillance because it would
jeopardize the subcontractor's competitive position or proprietary data is involved
• A business relationship exists between the prime contractor and subcontractor that is not
conducive to independence and objectivity, as in the case of a parent-subsidiary or when
prime and subcontracting roles of the companies are frequently reversed
• The subcontractor is sole source and the subcontract costs represent a substantial portion of
the prime contractor’s costs
NOTE: Surveillance of disapproved systems may initially focus on a Corrective Action Plan that
resulted from the system disapproval but may revert to routine surveillance in accordance with the
surveillance plan upon completion of all corrective actions.
The ACO should evaluate the proposed changes and advise the contractor of their acceptability
within 30 calendar days after receipt of the contractor’s notice of proposed changes. When a
proposed change would make the contractor’s
EVMS non-compliant, the ACO should promptly notify the contractor. A flowchart of the system
change process for approved systems is provided at Figure 7.
outputs, files, CA documents, EV techniques, and interfaces among those subsystems. The name
of the software may be mentioned in the System Description or related documentation when the
intent is to clarify and describe the capabilities as mentioned above and thereby reduce the amount
of additional content needed in the System Description.
If the EVMS specialist determines that the changes would cause non-compliance to the Guidelines,
the ACO should formally notify the contractor of this non-compliance and therefore its non-
fulfillment of the contract requirements. The letter should request that the contractor modify the
proposed changes to maintain compliance. If the contractor does not take the appropriate corrective
actions in a timely fashion, the ACO should invoke the appropriate contractual remedies to address
non-compliance with the terms of the contract.
The Review Director, working closely with the EVMS specialist, the PMO, the EVMSS, and the
contractor, should establish the scope of the review. Regardless of cause, the scope and conduct of
the RFC should be limited to only the system processes that are affected. Those portions of the
EVMS designated for review should be identified at the start of the review. Any previous review
findings and surveillance reports should be analyzed to identify areas of special interest.
2.4.6.2 Application
The uniform and consistent application of actions and remedies for EVMS non-compliance is
essential for promoting contractor-initiated corrective action. This requires an awareness and
understanding of regulatory policies, correct identification of the problem areas, and selection and
implementation of appropriate actions and remedies. The appropriate use of contractual actions
and remedies is required to protect the government’s interest if non-compliance occurs. EVMS
value to the government may be significantly greater than its execution cost. The loss of valid
performance measurement data may limit the government’s ability to measure the contractor’s
progress on a contract, which may increase the probability of unearned progress payments. When
DFARS 252.234-7002, Earned Value Management System, is included in a contract, the
contractor’s performance measurement system becomes a material requirement.
2.4.6.3 Actions
The following actions and remedies may be initiated after discussion with the PMO (i.e., PCO)
and CMO (i.e., ACO):
• Issue letter of concern notifying the contractor of a specific problem and requesting
additional information or a corrective action plan with get well dates
• Reduce or suspend progress payments (Fixed Price Incentive Fee (FPIF) contracts) when
contract requirements are not met (FAR 32.503-6 (b) (1))
• Reduce contractor billings when EVMS deliverable reports are unacceptable and payments
should be recouped (cost-type and FPIF contracts)
• Reduce overhead billing rates when overhead payments to the contractor have not been
earned and should be recouped (cost-type and FPIF contracts). Prior to implementing this
action, coordinate with the Defense Contract Audit Agency (DCAA)
• Utilize full compliance with the Guidelines as a possible factor in award fee determination
• Inform the CO that an EVMS non-compliance issue is endangering contract performance
and recommend a Cure Notice be issued
• Inform the CO that a condition or conditions endangering performance (described in CO
Cure Notice) has/have not been corrected and recommend issuance of a Show Cause Notice
(this is a last resort measure and a contract is rarely terminated for EVMS non-compliance)
2.4.6.4 Remedies
The following remedies may be initiated by the CO after discussion with the PMO, CMO, or
EVMSS:
• Negotiate a reduction in contract price
• Issue a Cure Notice
• Issue a Show Cause Notice
An underrun to the budget in the CBB does not automatically mean excess funds have become
available. Practitioners may erroneously treat EVM budget and contract funding in the same way.
Budget and funding are distinct and follow separate rules; budget follows EVM
rules, while the use of funding follows contracting and fiscal rules:
• The term “budget” refers to the resources estimated to be required to complete the
contracted scope of work.
• “Funding” refers to the actual government dollars obligated on the contract and available
for payment for work being accomplished on the contract.
• The amount of obligated funding does not always equal the contract price. There is no rule
that requires the CBB to equal either the amount of obligated funding or the contract price.
When the contract scope has been completed for less than the amount funded, there may exist an
opportunity to use that funding for new scope. The ability to use any underrun for new scope
becomes a contracting action, not an EVM action, and follows applicable laws and regulations.
When funds are available due to an underrun and are then used to acquire new work scope using
proper contracting policies and procedures, budget for the new scope is added to the CBB.
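A simple worked example, using hypothetical dollar amounts, may help illustrate the distinction: an underrun in funding does not change the CBB until new scope is placed on contract.

    # Hypothetical numbers only, illustrating the distinction drawn above between
    # budget (CBB) and obligated funding.
    cbb = 10_000_000                    # budget for the contracted scope of work
    obligated_funding = 11_000_000
    actual_cost_at_completion = 9_500_000

    unexpended_funding = obligated_funding - actual_cost_at_completion   # 1,500,000
    # Using the unexpended funding for new scope is a contracting action; only after
    # the new scope is placed on contract is its budget added to the CBB.
    new_scope_budget = 1_200_000
    cbb_after_mod = cbb + new_scope_budget
    print(f"Unexpended funding: ${unexpended_funding:,}")
    print(f"CBB after contract modification: ${cbb_after_mod:,}")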
The EVMSIG describes flexibility for a variety of program execution and development
methodologies. An important principle of EVMS outlined in the EVMSIG is a disciplined
approach to maintaining EVM baselines. “To ensure the ongoing integrity of the Contract Budget
Base (CBB), budget traceability throughout the life cycle of a program must be maintained.
Current budgets are reconciled to prior budgets in terms of changes to work scope, resources,
schedule, and rates so that the contract changes and internal re-planning on overall program
growth [are] visible to all stakeholders.”
The contractor’s EVMS specifies the management procedures it uses to conduct and approve
internal replanning. The contractor’s system may require government approval for certain
replanning activities. In these cases, the government should promptly review and approve the
changes as appropriate. If the CMO has been given responsibility to authorize these changes, the
CMO should keep the PMO informed of the approved changes (see Paragraph 2.4.4 and supporting
paragraphs). The CMO should include a review of the contractor’s change procedures and
replanning activities in routine surveillance.
2.5.2.4 Over Target Baseline (OTB) and Over Target Schedule (OTS)
2.5.2.4.1 Overview
During contract execution, the contractor may conclude that the budget and schedule for
performing the remaining work are decidedly insufficient and no longer represent a realistic plan.
At this point the contractor should prepare and submit a request to implement an OTB and/or OTS.
An OTB is a new baseline that has been formally reprogrammed to include additional budget in
excess of the contract’s negotiated cost. An OTB increases the performance budget without
modifying the work scope or other constraints of the contract. The value of the OTB incorporated
budget therefore exceeds the CBB and the corresponding value of the contract target cost or
estimated cost target (depending on contract type). The sum of all resulting budgets (i.e., Allocated
Budget, UB, and MR) becomes known as the TAB. The difference between the TAB and the CBB
is the amount of the increase over the previously established budget. See Figure 8.
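The budget relationships described above can be illustrated with hypothetical figures; the over-target amount is simply the difference between the TAB and the CBB:

    # Hypothetical figures illustrating the budget relationships described above.
    allocated_budget = 112_000_000      # budget distributed to control accounts
    undistributed_budget = 3_000_000    # UB
    management_reserve = 5_000_000      # MR
    cbb = 100_000_000                   # Contract Budget Base

    tab = allocated_budget + undistributed_budget + management_reserve
    otb_amount = tab - cbb              # amount over the previously established budget
    assert tab > cbb, "An OTB exists only when the TAB exceeds the CBB"
    print(f"TAB = ${tab:,}; over-target amount = ${otb_amount:,}")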
An OTS condition is created when the contractor’s schedule is time-phased beyond the contract
milestones or delivery dates. While it is possible to have an OTS without a corresponding increase
in cost, normally an OTS is accompanied by increased costs and therefore by an OTB.
Implementing an OTB or OTS is a major management decision for the contractor and requires
government approval at the start of the process. Consequently, the PM should fully understand the
concepts and processes. The PM should consider the factors discussed below when considering
whether an OTB or OTS is appropriate for the contract and when evaluating the contractor’s
request.
Since the primary reason for implementing an OTB/OTS should be to improve the contractor's
ability to manage and control ongoing work, the decision to request an OTB/OTS should originate
with the contractor. However, the government may request that the contractor evaluate the need
for an OTB/OTS if the government is not gaining accurate performance insight. The PM should
not unilaterally determine the specifics, such as the amount of additional budget or degree of
schedule stretch. Before the PM approves the OTB/OTS, the following factors should be
considered:
• Do the contractor and government understand why the current work plan is no longer
valid? The parties should identify the problems that rendered the current work plan
unrealistic and implement measures to prevent these problems in the future.
• Is the existing plan for accomplishing the remaining work valid? The plan should reflect
a realistic schedule of how the remaining work actually is to be done, and the new budget
should be adequate and reflect a realistic estimate and remaining program risks with an
appropriate amount of MR.
• Has contract work progressed sufficiently to warrant an OTB/OTS? The use of an
OTB/OTS may be inappropriate in a contract’s early stages because insufficient work has
been accomplished to verify the need for an OTB/OTS. However, nothing precludes the
contractor from implementing an OTB/OTS at the outset, provided the PM and PCO concur.
• Does sufficient time remain on the contract to warrant an OTB/OTS? If there is little
time remaining, an OTB/OTS may not be worthwhile and may be very disruptive.
• Has an OTB been implemented previously? If multiple OTBs are requested, the above
factors, especially the first two on the list, may not have been adequately considered. This
may indicate significant underlying management problems requiring investigation.
the contractor to provide its managers with realistic budgets and schedules for accomplishing the
remaining work.
The contractor initiates the process by submitting an OTB/OTS request to the PM detailing its
implementation plan. To expedite the return to a realistic baseline, the PM promptly reviews and
negotiates changes, if necessary, to the contractor's request within 30 days. If the request is not
approved within 30 days, the PM should provide specific reasons as to why it was denied and what
is required to obtain approval. If the request is approved, the PCO promptly sends written approval
to the contractor to proceed. The contractor may not implement an OTB/OTS without this written
approval.
Because OTB budgets represent performance budgets only and are implemented solely for
planning, controlling, and measuring performance on already authorized work, a contract
modification is not needed. The OTB budget does not impact the negotiated value of the contract.
For incentive type contracts with a ceiling, the government’s cost liability is still capped at the
ceiling value. For cost reimbursement contracts, however, the government cost liability continues
to increase as actual costs accrue on the contract. Any funding changes would require contract
action.
The PM should seek support from the PMO/CMO technical and support staff in evaluating an
OTB request, ensuring that the OTB approval process is not inhibited by inappropriate or unrelated
issues. The overriding goal should be to allow the contractor to implement in a timely manner a
baseline that allows it to regain proper management control of the ongoing effort.
variances will be retained, and a schedule of implementation for the rebaselining. The
government will acknowledge receipt of the request in a timely manner (generally within 30
calendar days).
• OTB/OTS implementation timeframe. The contractor should fully implement an
OTB/OTS in required reports one to two full accounting periods after receipt of written ATP.
Award fee criteria should be carefully selected to properly motivate the contractor’s management
and performance during the award fee period. Objective criteria tied to identifiable outcomes,
discrete events, or milestones are recommended whenever possible. Clear distinctions should be
established between the performance levels to guide the PMO when evaluating performance. The
PMO should establish the criteria to motivate and encourage improved management processes
during the period, keeping in mind that recognizing improvements in integrated program
management results in longer lasting improvement in cost and schedule performance. If such
qualitative criteria are difficult to support during the evaluation process, the PMO should consider
using subjective criteria for EVMS performance results.
ready for the review. Rather, technical completion of work leading to an established baseline
evaluation criterion is one way of objectively evaluating and rewarding the contractor based on
success against the baseline plan.
Discipline
• Accuracy, timeliness, and consistency of billings
• Accuracy, timeliness, and consistency of cumulative performance data
• Accuracy, timeliness, and consistency of integration of subcontractor data
• Baseline discipline and system compliance
Sample criteria and varying levels of performance are shown in Appendix D. These criteria should
be selected and tailored as appropriate to the nature of the contract.
Analysis is a team effort and is fully integrated into the overall program management process.
Effective analysis considers all impacts, considers all courses of action, synthesizes an integrated
solution and action plan, and allows informed decisions. The real test for effective, forward-
looking analysis is that it is used to manage program performance, not just to report the status and
problems to date.
(Block 8.e) of the IPMR. This value may not agree with the most likely EAC. Any
difference shall be explained in terms of risk and opportunities and senior management
knowledge of current or future contract conditions in the IPMR Format 5.
SV measures work accomplishment compared with the plan. The SV is computed by subtracting
the BCWS from the corresponding BCWP. A negative SV is unfavorable, indicating that some
amount of planned work was not completed as scheduled. A positive SV is favorable, indicating
that more work was completed than originally planned. SV alone is insufficient to determine the
schedule performance of a program. The SV should be compared to the CP and Driving Critical
Paths to determine its true impact.
CPI is a measure of efficiency calculated by dividing BCWP by ACWP. The metric denotes the
cost expended for the work completed. A CPI value greater than 1.0 indicates the work
accomplished cost less than planned, while a value less than 1.0 indicates the work accomplished
cost more than planned.
SPI is a measure of efficiency calculated by dividing BCWP by BCWS. The metric denotes the
amount of work accomplished versus the amount of work planned. An SPI value greater than 1.0
indicates more work was accomplished than planned, while an SPI value less than 1.0 indicates
less work was accomplished than planned.
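As a quick illustration of the calculations defined above, using hypothetical cumulative values:

    # Sketch of the calculations defined above, with hypothetical cumulative values.
    bcws, bcwp, acwp = 500_000.0, 450_000.0, 480_000.0

    sv = bcwp - bcws            # negative: planned work not completed as scheduled
    cpi = bcwp / acwp           # < 1.0: work accomplished cost more than planned
    spi = bcwp / bcws           # < 1.0: less work accomplished than planned

    print(f"SV = ${sv:,.0f}, CPI = {cpi:.2f}, SPI = {spi:.2f}")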
Reference the DAU Gold Card (Appendix A) for a synopsis of EVM terms and formulas.
Because the IPMR is the primary report for communicating integrated contract cost and schedule
performance data, the PM should ensure that it presents accurate and useful information. The PM
should carefully review each IPMR submission, checking for such things as errors, DID
compliance, and data anomalies. The PM should address any concerns or problems and require
prompt correction by the contractor. If left uncorrected, data errors and anomalies may skew and
distort the EVM analysis, government EAC, and resulting program planning.
While some program offices will want to cease reporting entirely when a certain percentage of the
effort is completed, this may not be the best option. The tail end of the contract can take a long
time to complete and tracking progress is desirable. Changes in reporting are ultimately determined
by the level of risk remaining on the project. The entire list of risk factors should be thoroughly
assessed prior to making an informed decision to cease or decrease the level or amount (depth or
breadth) of EVM reporting.
Instead, when the information provided by the contractor “is no longer meaningful” (per the IPMR
DID) or the milestones previously identified and listed in the CDRL have already occurred (as
stated in the IPMR Implementation Guide), resulting in lowered program risk, EVM reporting may
be reduced or suspended altogether. It is important for members of the program team to discuss
risks and reporting, and then determine what is best for the program.
It is important to remember to report any changes in IPMR reporting schedules to the appropriate
government repositories (such as the EVM-CR and Defense Acquisition Management Information
Retrieval (DAMIR)).
Any changes in EVM reporting schedule, if not covered in the CDRL, must be preceded by a
contract modification letter initiated by the program office.
2.5.6.3 Factors to Consider When Deciding Whether to Decrease or Cease EVM Reporting
Prior to initiating the contract letter, the program office should confer with the EVM Specialist to
determine if modifying EVM variance reporting or ceasing EVM reporting is appropriate. The
following criteria should be considered:
quantified. It is important to understand, however, that continuing reporting throughout the risk
mitigation process at the end of the Period of Performance can provide the program office with
useful insight.
If ongoing evaluations of the IMS WPs indicate that variances are increasing, then EVM reporting,
which could provide insight into the reasons for the slippage, should continue to the end of the
contract’s Period of Performance. If the IMS updates indicate a potential milestone slip, it will be
necessary to continue full EVM reporting to ensure the program office has the necessary insight
to manage the remaining schedule.
APPENDIX A
[Appendix A is a table of referenced documents and hyperlinks. Recoverable entries include: Department of the Navy Earned Value Management Implementation Guide (DON EVMIG); Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs (EVM & Risk); Agile and Earned Value Management: A Program Manager's Desk Guide (EVM & Software); NDIA IPMD An Industry Practice Guide for Agile on EVM Programs; DCMA EVMS Site; Surveillance Guide.]
APPENDIX B: GUIDELINES-PROCESS
PROCESS GROUPING
EIA-748 GUIDELINES
ORGANIZATION
2-1a Define authorized work X
2-1b Identify Program Organization Structure X
2-1c Company Integration of EVMS subsystems with WBS and OBS X
2-1d Identify organization/function for overhead X
2-1e Integrate WBS & OBS, create control accounts X
ACCOUNTING CONSIDERATIONS
2-3a Record direct costs from accounting system X
2-3b Summarize direct costs into WBS without allocation X
2-3c Summarize direct costs into OBS without allocation X
2-3d Record indirect costs X
2-3e Identify unit costs, equivalent unit costs, or lot costs X
2-3f Accurate material cost accumulation by control accounts; EV measurement at right time; full accountability of material X
Legend
APPENDIX C
2.0 BCA Contents. The following description contains a generally accepted outline of the contents
of a business case and the BCA report. This is provided as guidance only, and the program office
is encouraged to conduct and tailor the business case in a way that best meets the need of the
individual program. Specific EVM guidance is included as appropriate in the following description.
2.1 Common Elements. BCAs contain a common set of elements that can be tailored according to
the degree of application required for a particular contract. These common elements are problem
definition, data collection, evaluation, and a report or briefing, which are detailed below.
• Problem definition includes establishing an objective for the analysis, stating the assumptions
that frame the analysis, and, as appropriate, laying out alternative solutions to the problem.
NOTE: This should include rationale for the selection of the FFP contract type versus selection
of a cost type or incentive type contract or for application of an EVM requirement to a contract
less than $20M.
• Data collection identifies and obtains the data needed to meet the objective of the analysis (e.g.,
cost, benefits, etc.).
• Evaluation analyzes the data to address the objective of the business case and to develop
findings that specifically relate the data to the objective. Both quantitative and qualitative
benefits for the proposed solution should be evaluated.
• A report or briefing presents the conclusions and recommendations of the BCA.
2.2 BCA Report. The report should document the elements described above. An accompanying
decision briefing should contain the following:
• Charter (i.e., objectives of the BCA)
• Scope (i.e., boundaries of the BCA)
• Assumptions
• Methodology (i.e., description of the data and analysis process)
• Status quo (i.e., description of the status quo- no EVM implementation and baseline costs)
• Proposed solution (i.e., description of EVM implementation, tailoring approach, and costs)
• Summary (i.e., comparison of costs, benefits, and potential drawbacks)
• Recommendation
APPENDIX D
MANAGEMENT EXAMPLE: Realistic and current cost, expenditure, and schedule forecasts.
UNSATISFACTORY: Contractor does not meet the criteria for satisfactory performance.
SATISFACTORY: Contractor provides procedures for delivering realistic and up-to-date cost and schedule forecasts
as presented in the Integrated Program Management Report (IPMR), formal EAC, CFSR, IMS, etc. The
forecasts are complete, consistent with program requirements, and reasonably documented.
GOOD: Contractor meets all SATISFACTORY requirements plus the following:
• Contractor thoroughly documents and justifies all requirements for additional funding and schedule
changes. Contractor creates consistent and logical expenditure forecasts based on program requirements.
Contractor acknowledges cost growth (if any) in the current reporting period and provides well-
documented forecasts.
VERY GOOD Contractor meets all GOOD requirements plus the following:
• Contractor constantly scrutinizes expenditure forecasts to ensure accuracy and currency. Contractor prepares and develops program cost and schedule data that provides clear government visibility into current and forecast program costs and schedule. Schedule milestone tracking and projections are accurate and reflect true program status. Contractor maintains close and timely communications with the government.
EXCELLENT Contractor meets all VERY GOOD requirements plus the following:
• Contractor consistently submits a high quality EAC that is current and realistic. Reported expenditure profiles are accurate. Contractor develops comprehensive and clear schedule data that provides excellent correlation with technical performance measures and cost performance reports and permits early identification of problem areas. Schedule milestone tracking and projections are accurate and recognize potential program impacts.
MANAGEMENT EXAMPLE: Adequacy of cost proposals submitted during award fee period.
UNSATISFACTORY Contractor does not meet the criteria for satisfactory performance.
SATISFACTORY Contractor provides proposal data, including subcontractor data, that is logically organized and provides adequate visibility to the government to support technical review and cost analysis. Contractor documents a basis of estimate for each element. If insufficient detail is provided, the contractor provides the requisite detail to the government on request. The proposal is submitted on time.
GOOD Contractor meets all SATISFACTORY requirements plus the following:
• Contractor provides a detailed analysis for subcontractor and material costs.
VERY GOOD Contractor meets all GOOD requirements plus the following:
• Contractor provides traceable proposal data that supports a detailed technical review and thorough cost analysis by the government. Data requires only minor clarification. Potential cost savings are considered and reported in the proposal.
EXCELLENT Contractor meets all VERY GOOD requirements plus the following:
• Change proposals are stand-alone and require no iteration for government understanding. Contractor communicates during the proposal preparation phase and effectively resolves issues before submission.
• Contractor provides extremely thorough variance analysis. Contractor proactively keeps the government
informed of all problem areas, the causes, emerging variances, impacts, and corrective action.
Contractor keeps the government informed on progress made in implementing the corrective action
plans. Analysis is fully integrated with risk management plans and processes.
DISCIPLINE EXAMPLE: Accuracy, timeliness, and consistency of billing and cumulative performance data and integration of subcontractor data.
SATISFACTORY Billings to the government may have slight delays and/or minor errors. IPMR, CFSR, and IMS reports are complete and consistent with only minor errors. Data can be traced to the WBS with minimum effort. Subcontractor cost and schedule data are integrated into the appropriate reports with some clarification required. Contractor occasionally submits late reports. Contractor submits electronic data correctly.
GOOD Contractor meets all SATISFACTORY requirements plus the following:
• Billings to the government are accurate, although there may be slight delays. Data is complete, accurate, consistent, and traceable to the WBS with minor clarification required. Subcontractor performance data is fully integrated into the appropriate reports with no clarification required, and reports are submitted on time.
VERY GOOD Contractor meets all GOOD requirements plus the following:
• Data is complete, accurate, and consistent with little or no clarification required.
EXCELLENT Contractor meets all VERY GOOD requirements plus the following:
• Contractor submits billings to the government on time. Data is complete, accurate, and consistent with clear traceability to the WBS. Data elements are fully reconcilable between the IPMR and the CFSR. Subcontractor schedule performance is vertically and horizontally integrated with the contractor schedule.
quickly assesses and corrects system deficiencies or baseline planning errors, resulting in minor impacts to
data accuracy. Contractor provides for the continuous review of the baseline to ensure that it is current and
accurate, thereby maintaining its usefulness to management. Cost and schedule baselines are fully
integrated.
VERY GOOD Contractor meets all GOOD requirements plus the following:
• Contractor builds proper baseline in a timely manner. Contractor provides realistic performance baseline. Contractor ensures WPs are detailed and consistent with scope of contract and planned consistently with the schedule. Contractor conducts routine surveillance that reveals minor system deficiencies or minor baseline planning errors, quickly assessed and corrected, resulting in minimal impact to data accuracy. Contractor EVMS is effectively integrated with other management processes.
EXCELLENT Contractor meets all VERY GOOD requirements plus the following:
• Contractor proactively manages baseline. Contractor maintains timely detail planning as far in advance as practical and implements proper baseline controls. Contractor controls and minimizes changes to the baseline, particularly in the near term. System deficiencies or planning errors are few and infrequent. Contractor takes the initiative to streamline internal processes and maintains a high level of EVMS competency and training across the organization.
The public reporting burden for this collection of information is estimated to average 110 hours per response, including the time for reviewing instructions, searching existing data sources, gathering and
maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information including suggestions
for reducing this burden, to Department of Defense, Washington Headquarters Services, Directorate for Information Operations and Reports (0704-0188), 1215 Jefferson Davis Highway, Suite 1204, Arlington,
VA 22202-4302. Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to any penalty for failing to comply with a collection of information if it does not
display a currently valid OMB control number. Please DO NOT RETURN your form to the above address. Send completed form to the Government Issuing Contracting Officer for the Contract/PR No. listed
in Block E.
A. CONTRACT LINE ITEM NO.:   B. EXHIBIT: A   C. CATEGORY: TDP ___ TM ___ OTHER X
D. SYSTEM/ITEM:   E. CONTRACT/PR NO.: (Fill in when known)   F. CONTRACTOR: (Enter Full name of Contractor)
1. DATA ITEM NO.: A0XXXX   2. TITLE OF DATA ITEM: Integrated Program Management Report (IPMR)   3. SUBTITLE:   17. PRICE GROUP:
4. AUTHORITY (Data Acquisition Document No.): DI-MGMT-81861   5. CONTRACT REFERENCE: SOW PARA XXXX   6. REQUIRING OFFICE: PROG/XXXX   18. ESTIMATED TOTAL PRICE:
7. DD250 REQ:   8. APP CODE:   9. DIST STATEMENT REQUIRED:   10. FREQUENCY:   11. AS OF DATE:   12. DATE OF FIRST SUBMISSION:   13. DATE OF SUBSEQUENT SUBMISSION:   14. DISTRIBUTION: Draft ___ Final ___
The Contractor shall provide monthly IPMRs per DID DI-MGMT-81861; modified per the following: CR 1
1. Block 12 - Date of First Submission. The first submission of Formats 1-6 is due 12 working days after the end
of the second full accounting period following Authorization to Proceed (ATP).
2. Block 13 - Date of Subsequent Submissions: Subsequent submissions containing Formats 1 through 6 shall be
provided within 12 working days 2 after the close of the contractor’s monthly or periodic accounting cycle.
Format 7 is due annually on [add date] 3. Final submission is due when the last significant
milestone/deliverable as defined by the contract has been achieved and remaining risk areas have been
mitigated.
3.1. All formats shall be submitted electronically in accordance with the DOD-approved guidance and XML
requirements located in the EVM Central Repository (EVM-CR) at http://cade.osd.mil/tools/evm-tools.
3.1.1. Formats 1-4 shall be submitted using the DoD-approved XML schema and cost guideline.
3.1.3. Format 6 shall be submitted using the DoD-approved XML schema and schedule guideline.
3.1.4. A copy of the IMS in contractor native software format shall also be submitted 4.
3.1.5. Format 7 shall be submitted using the DoD-approved XML schema and time-phased cost
guideline.
3.2. All IPMR files must be electronically forwarded to the EVM-CR 5 at the DCARC Web site at
https://service.cade.osd.mil/DCARCPortal.
DD FORM 1423-1, FEB 2001 PREVIOUS EDITION MAY BE USED 15. TOTAL → 15
G. PREPARED BY H. DATE I. APPROVED BY J. DATE
Page 1 of 3 Pages
2 DID allows for as late as 17 WD where technical or other significant issues exist.
3 Select a timeframe that meets the PMO needs.
4 Formats 1-4 may be required in hours and/or human readable formats as optional items.
5 EVM-CR requirement is only for ACAT I programs with an EVM requirement on contract.
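Blocks 12 and 13 and the numbered items above define which IPMR formats are due with each monthly submission, along with the native-format IMS file and the annual Format 7. The Python sketch below is a purely illustrative completeness check against that breakout; the file names, dictionary layout, and check_package function are invented for the example and are not part of the DID or CDRL.

```python
# Hypothetical monthly IPMR package check based on the CDRL breakout above:
# Formats 1-6 each month, a native-format IMS file, and Format 7 on its annual cycle.
MONTHLY_FORMATS = {f"Format{i}" for i in range(1, 7)}

def check_package(files, format7_due_this_month=False):
    """Report missing pieces of a hypothetical monthly IPMR submission."""
    present = {f["format"] for f in files}
    missing = [f"{fmt} not included" for fmt in sorted(MONTHLY_FORMATS - present)]
    # A native scheduling-tool file accompanies the Format 6 IMS submission.
    if not any(f.get("native_ims") for f in files):
        missing.append("IMS native scheduling-tool file not included")
    if format7_due_this_month and "Format7" not in present:
        missing.append("Format7 (annual time-phased data) not included")
    return missing

package = [
    {"format": "Format1", "path": "ipmr_fmt1.xml"},
    {"format": "Format6", "path": "ipmr_fmt6.xml"},
    {"format": "Format6", "path": "program_schedule.mpp", "native_ims": True},
]
print(check_package(package))  # Formats 2-5 are flagged as missing in this example
```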
4. Block 16 - Remarks:
4.1. Format 1 Instructions: The Work Breakdown Structure (WBS) shall be reported in accordance with the applicable MIL-STD-881 appendix
(latest version at time of award) tailored for execution requirements. The default level of XML reporting is called the “Reporting Level.”
4.2. Format 2 Instructions: Provide the contractor’s functional breakdown structure (e.g., Engineering, Manufacturing, Program Management, Quality, Test, etc.) or other organizational breakdown, such as by Integrated Product Teams (IPTs). Material and major subcontractors with EVM System flow-down requirements shall be included as separate elements. No formal monthly variance analysis is required for Format 2; however, the contractor should correlate the variances in Format 1 to Format 2, as needed.
4.3. Format 3 Instructions:
4.3.1. Significant differences, those that are absolute values exceeding +/- 5% 6, between the Performance Measurement Baseline (PMB) at
the beginning and end of each specified period by month, and in total, shall be explained in Format 5.
4.3.2. Baseline change breakout on the Format should be by month for the next six months and [insert time interval] 7 thereafter.
6 Value to be evaluated by PMO to ensure it meets risk needs.
7 PMO can select breakout of timeframe beyond the 6-month window.
8 Value to be evaluated by PMO to ensure it meets risk needs.
9 PMO can select breakout of timeframe beyond the 6-month window.
10 Thresholds provided here are notional; they should be evaluated by the Government PMO based on program scope and risk.
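One way to picture the Format 3 threshold described above is to compare the time-phased PMB at the beginning and end of the reporting period and flag any month, and the total, whose absolute change exceeds the specified percentage. The short Python sketch below uses hypothetical monthly PMB values and the notional 5% figure; the actual threshold, data source, and layout are whatever the PMO puts on contract.

```python
# Hypothetical time-phased PMB ($K) at the beginning and end of the period.
pmb_begin = {"Jan": 100.0, "Feb": 120.0, "Mar": 150.0}
pmb_end   = {"Jan": 100.0, "Feb": 131.0, "Mar": 149.0}
THRESHOLD = 0.05  # notional +/- 5%; set per the CDRL

def significant_changes(begin, end, threshold):
    """Return periods whose absolute percentage change exceeds the threshold."""
    flagged = {}
    for period, baseline in begin.items():
        change = (end[period] - baseline) / baseline
        if abs(change) > threshold:
            flagged[period] = change
    total_change = (sum(end.values()) - sum(begin.values())) / sum(begin.values())
    if abs(total_change) > threshold:
        flagged["TOTAL"] = total_change
    return flagged

# Feb changed by roughly 9.2%, so it would require a Format 5 explanation.
print(significant_changes(pmb_begin, pmb_end, THRESHOLD))
```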
4.5.4.3. Specific corrective actions, forecasted closure date, and impact to the Estimate at Completion (EAC).
4.5.4.4. If there are no changes to the reportable element issue description, the expected impacts, or corrective action plans, then
specify, “no changes since the last reported analysis” and reference the IPMR date when the original narrative was reported.
4.5.5. IPMRs required from subcontractors will be provided electronically using the DOD-approved XML formats.
4.7. Format 7 Instructions: The following items will be provided at the same level as the Format 1 WBS level 13: BCWS, BCWP, ACWP, and
ETC by month, for the period, from contract start to complete, as applicable.
Page 3 of 3 Pages
11 SRAs can be delivered more frequently, but must be listed here. Also, days before IBR are adjustable.
12 If PMO has specific special fields or flags needed in the submission, they should be listed here.
13 Level of the Format 7 reporting can be down to the control account level, but must be specified here.
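Format 7 carries the monthly time-phased BCWS, BCWP, ACWP, and ETC described in the instructions above. As a rough illustration of how those series relate, the Python sketch below builds cumulative values from hypothetical monthly data and applies the standard EVM relationships for schedule variance, cost variance, and the Estimate at Completion (EAC = cumulative ACWP + ETC); the numbers are invented for the example.

```python
# Hypothetical monthly values ($K) for one WBS element, contract start to date.
monthly = {
    "BCWS": [50.0, 60.0, 70.0],
    "BCWP": [45.0, 55.0, 65.0],
    "ACWP": [48.0, 62.0, 70.0],
}
etc_remaining = 400.0  # estimate to complete for the remaining effort

def cumulative(series):
    """Running totals for a monthly series."""
    out, total = [], 0.0
    for value in series:
        total += value
        out.append(total)
    return out

bcws_cum = cumulative(monthly["BCWS"])
bcwp_cum = cumulative(monthly["BCWP"])
acwp_cum = cumulative(monthly["ACWP"])

schedule_variance = bcwp_cum[-1] - bcws_cum[-1]   # SV = BCWP - BCWS
cost_variance     = bcwp_cum[-1] - acwp_cum[-1]   # CV = BCWP - ACWP
eac               = acwp_cum[-1] + etc_remaining  # EAC = ACWP (to date) + ETC

print(schedule_variance, cost_variance, eac)
```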
The public reporting burden for this collection of information is estimated to average 110 hours per response, including the time for reviewing instructions, searching existing data sources, gathering and
maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information including suggestions
for reducing this burden, to Department of Defense, Washington Headquarters Services, Directorate for Information Operations and Reports (0704-0188), 1215 Jefferson Davis Highway, Suite 1204, Arlington,
VA 22202-4302. Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to any penalty for failing to comply with a collection of information if it does not
display a currently valid OMB control number. Please DO NOT RETURN your form to the above address. Send completed form to the Government Issuing Contracting Officer for the Contract/PR No. listed
in Block E.
A. CONTRACT LINE ITEM NO.:   B. EXHIBIT: A   C. CATEGORY: TDP ___ TM ___ OTHER X
D. SYSTEM/ITEM:   E. CONTRACT/PR NO.: (Fill in when known)   F. CONTRACTOR: (Enter Full name of Contractor)
1. DATA ITEM NO.: A0XXXX   2. TITLE OF DATA ITEM: Integrated Program Management Report (IPMR)   3. SUBTITLE:   17. PRICE GROUP:
4. AUTHORITY (Data Acquisition Document No.): DI-MGMT-81861   5. CONTRACT REFERENCE: SOW PARA XXXX   6. REQUIRING OFFICE: PROG/XXXX   18. ESTIMATED TOTAL PRICE:
7. DD250 REQ:   8. APP CODE:   9. DIST STATEMENT REQUIRED:   10. FREQUENCY:   11. AS OF DATE:   12. DATE OF FIRST SUBMISSION:   13. DATE OF SUBSEQUENT SUBMISSION:   14. DISTRIBUTION: Draft ___ Final ___
The Contractor shall provide monthly IPMRs per DID DI-MGMT-81861, except as modified per the following: CR 1
4. Block 12 - Date of First Submission. The first submission of Formats 5 & 6 is due 12 working days after the
end of the second full accounting period following Authorization to Proceed (ATP).
5. Block 13 - Date of Subsequent Submissions: Subsequent submissions shall be provided within 12 working
days 14 after the close of the contractor’s monthly or periodic accounting cycle. Final submissions are due when
the last significant milestone/deliverable as defined by the contract has been achieved and remaining risk areas
have been mitigated.
6.1. Only Formats 5 and 6 are required. Formats 1-4 and 7 are not required.
6.2. Format 5 shall be submitted in contractor format. Only the portions of Format 5 that pertain to the
overall contract status or Format 6 are required as narrative.
6.3. Format 6 shall be submitted electronically in accordance with the DOD-approved XML schemas
located in the EVM Central Repository (EVM-CR) http://dcarc.cape.osd.mil/EVM/Uncefact.aspx.
6.5. All IPMR files must be submitted to the EVM-CR 16 in accordance with the submission process at the
DCARC Web site at http://dcarc.cape.osd.mil/EVM.
DD FORM 1423-1, FEB 2001 PREVIOUS EDITION MAY BE USED 15. TOTAL → 15
G. PREPARED BY H. DATE I. APPROVED BY J. DATE
Page 1 of 3 Pages
NOTES FOR GOVT USE ONLY:
14 DID allows for as late as 17 WD where technical or other significant issues exist.
15 “Native format” is the scheduling tool format and not another scheduling reporting output such as PDF.
16 EVM-CR requirement only for ACAT I programs with an EVM requirement on contract.
4. Block 16 - Remarks:
4.8. Format 5 Instructions: Discuss root causes of any schedule variance in terms of float and the impact to the program critical path, if any,
and identify significant missed milestones, impact to major milestones, and expected recovery dates.
4.9. Format 6 Instructions
Page 2 of 3 Pages
1.1 Contract Work Breakdown Structure (CWBS). The contractor develops and maintains the
CWBS and CWBS dictionary in accordance with DI-MGMT-81334D, using the WBS
structure contained in the Cost and Software Data Reporting (CSDR) plan. The CWBS
provides the basis for further extension by the contractor to lower levels during the
performance of the contract. The contractor extends the CWBS down to the appropriate
level required to provide adequate internal management, surveillance, and performance
measurement, regardless of the reporting level stipulated in the contract for government
visibility. The contractor uses the CWBS as the primary framework for contract planning,
budgeting, and reporting of the cost, schedule, and technical performance status to the
government. The contractor analyzes the system requirements specified in the SOW and
system specification and translates them into a structure representing the products and
services that comprise the entire work effort commensurate with the acquisition phase and
contract requirements. The contractor team or organizational entity responsible for systems engineering prepares the technical elements of the extended Contract WBS. The contractor, if necessary, updates the CWBS during the execution of the contract.
Changes to the CWBS or associated definitions, at any reporting level, require approval of
the government (DI-MGMT-81334D).
1.2 Performance Management System. The contractor utilizes its existing, internal
performance management system to plan, schedule, budget, monitor, manage, and report
cost, schedule, and technical status applicable to the contract. The contractor’s internal
performance management system serves as the single, formal, and integrated system that
meets both the contractor’s internal management requirements and the requirements of the
government for timely, reliable, and auditable performance information. The application
of these concepts provides for early indication of contract cost, schedule, and technical
challenges. Earned Value assessments correlate with technical achievement. The outputs
of this system are used as the basis to report detailed performance status during program
management reviews and other status meetings. The contractor’s system should satisfy the
industry Guidelines delineated in the EIA-748 EVMS (“the Guidelines”), the general
provisions of the contract, and this SOW. The contractor need not establish a separate or
unique internal performance management system for purposes of planning, scheduling,
directing, statusing, recording or reporting progress under this contract.
1.2.1 Contractor Performance Management System. The contractor’s system shall meet
the Guidelines and be maintained in accordance with the requirements of the
Guidelines as described in this contract, under DFARS Clause 252.242-7002 and
the contractor’s own documented System Description. The Integrated Program
Management Reports (IPMR) are developed, maintained, updated/statused, and
1.2.2 Integrated Baseline Review (IBR). An IBR focusing on the realism of the
contractor’s integrated Performance Measurement Baseline (PMB) and the
appropriateness of the Earned Value methodology to be employed under the
contract occurs as soon as possible after the contract PMB is in place, but, in no
event without specific authorization of the Contracting Officer, is initiation of the
IBR process to be delayed past the sixth month after award of this contract.
Incremental IBRs will be conducted as needed throughout the life of the contract
for initiation of an Undefinitized Contract Action, and subsequently, when required
following major changes to the baseline or replanning. The government verifies
during the IBR, and follow-on IBRs when required, that the contractor has
established and maintains a reliable PMB. The contractor ensures that the baseline
includes the entire contract technical scope of work consistent with contract
schedule requirements and has adequate resources assigned. The contractor assures the government that effective Earned Value methods are used to accurately status contract cost, schedule, and technical performance. The IBR is used to achieve a
mutual understanding of the baseline plan, cost and schedule risk, and the
underlying management processes used for planning and controlling the program.
Participation in the IBR is a joint responsibility of both the government PM and the
contractor. The contractor flows down the IBR requirement to those subcontractors
that meet the applicable thresholds for EVM reporting. The contractor leads the
IBR at subcontractors, with active participation from the government.
1.3 Integrated Program Management Reporting. The contractor reports EVM data as
applicable to this contract in accordance with the requirements stated herein and the CDRL.
All reporting corresponds to applicable Contract WBS elements. The contractor reconciles
reporting elements in the Contract Funds Status Report (CFSR) with the IPMR when these
documents are submitted in the same month. The contractor provides a reconciliation of
the CFSR with IPMR as an addendum to the IPMR. (DI-MGMT-81861 and DI-MGMT-81468)
1.3.2 Electronic Transmission of Data. The contractor formats the deliverable data for
electronic data interchange (EDI) in accordance with the ANSI X12 Standard or
XML equivalent.
1.4 Integrated Master Schedule (IMS). The IMS will have the following characteristics:
electronically in the native digital format (i.e., an electronic file produced by the
contractor’s scheduling tool). (DI-MGMT-81861)
1.5 Over Target Baseline (OTB)/Restructure: The contractor may conclude the baseline no
longer represents a realistic plan in terms of budget/schedule execution. In the event the
contractor determines an OTB/restructuring action is necessary, the contractor obtains
government approval prior to implementing an OTB/restructuring action. The request
should also include detailed implementation procedures as well as an implementation
timeframe. The contractor will not implement the OTB/restructuring prior to receiving
written approval from the Contracting Officer.
Budget at Completion (BAC): The sum of all budgets established for the contract through any given WBS/OBS level. When associated with a level it becomes
19 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/Pages/1382.aspx, (December 30, 2016).
20 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/1398.aspx, (March 6, 2017).
21 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 77.
22 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/1407.aspx, (December 30, 2016).
23 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 77.
24 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 77.
25 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 77.
26 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 77.
27 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 78.
28 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 78.
29 US Department of Defense, DoD Product Support Business Case Analysis Guidebook: 5.
30 DEPARTMENT OF DEFENSE, DEFENSE CONTRACT MANAGEMENT AGENCY, INSTRUCTION Earned Value Management System Compliance Reviews, 20.
31 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/1773.aspx, (December 30, 2016).
32 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 78.
33 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 78.
34 “ACQuipedia: Contract Funds Status Report (CFSR)”, DAU, https://dap.dau.mil/acquipedia/Pages/ArticleDetails.aspx?aid=52872594-480d-4cdf-a01f-ccf5da357c0e, (December 30, 2016).
35 “Subpart 4.10—Contract Line Items”, https://www.acquisition.gov/far/html/Subpart%204_10.html, (December 30, 2016).
36 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/1819.aspx, (December 30, 2016).
37 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 78.
38 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/1648.aspx, (December 30, 2016).
39 “CSDR Overview and Policy, Defense Cost and Resource Center”, http://dcarc.cape.osd.mil/csdr/CSDROverview.aspx, (January 10, 2017).
40 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 78.
41 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 78.
42 “CSDR Overview and Policy, Defense Cost and Resource Center”, http://dcarc.cape.osd.mil/csdr/CSDROverview.aspx, (December 30, 2016).
43 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/1638.aspx, (December 30, 2016).
44 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/Pages/1639.aspx, (December 30, 2016).
45 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/1640.aspx, (December 30, 2016).
46 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 79.
47 “MAS Desk Reference”, GSA, https://www.gsa.gov/MASDESKTOP/section6_2.html, (March 15, 2017).
48 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/3343.aspx, (December 30, 2016).
49 “Defense Acquisition University”, Wikipedia, https://en.wikipedia.org/wiki/Defense_Acquisition_University, (December 30, 2016).
50 “ABOUT DCAA”, DCAA Defense Contract Audit Agency, http://www.dcaa.mil/about_dcaa.html, (December 30, 2016).
51 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/1750.aspx, (December 30, 2016).
52 “DFARS – DEFENSE FEDERAL ACQUISITION REGULATION SUPPLEMENT”, DCAA Defense Contract Audit Agency, http://www.dcaa.mil/dfars.html, (December 30, 2016).
53 “About the Department of Defense (DoD)”, U.S. Department of Defense, http://www.defense.gov/About-DoD, (December 30, 2016).
54 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 80.
55 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 80.
56 “PARCA Earned Value Management (EVM) – Central Repository (CR)”, CADE, http://cade.osd.mil/tools/evm-tools, (March 6, 2017).
57 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 80.
58 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 2.
59 “Electronic Industries Alliance”, Wikipedia, https://en.wikipedia.org/wiki/Electronic_Industries_Alliance, (December 30, 2016).
60 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 80.
61 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 80.
62 “XML”, Wikipedia, https://en.wikipedia.org/wiki/XML, (December 30, 2016).
63 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/1894.aspx, (December 30, 2016).
64 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/1643.aspx, (December 30, 2016).
65 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/Pages/1644.aspx, (December 30, 2016).
66 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/1937.aspx, (December 30, 2016).
67 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/Pages/1976.aspx, (December 30, 2016).
68 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/3027.aspx, (December 30, 2016).
Indefinite Quantity: Provides for furnishing an indefinite quantity, within stated limits, of specific supplies or services, during a specified contract period, with deliveries to be scheduled by the timely placement of orders upon the contractor by activities designated either specifically or by class. 69
Integrated Baseline Review (IBR): Review of a contractor’s Performance Measurement Baseline (PMB). It is conducted by Program Managers (PMs) and their technical staffs, or Integrated Product Teams (IPTs), on contracts requiring compliance with DoD Earned Value Management System (EVMS) criteria requirements within 6 months after contract award. 70
Integrated Master Plan (IMP): An event-driven plan that documents the significant accomplishments necessary to complete the work and ties each accomplishment to a key program event. 71
Integrated Master Schedule (IMS): An integrated, networked schedule containing all of the detailed activities necessary to accomplish the objectives of a program. When coupled with the Integrated Master Plan, it provides the time spans needed to complete the accomplishments and criteria of the Integrated Master Plan events. The IMS normally contains all levels of schedule for the program (master, intermediate, and detailed). 72
69 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/2011.aspx, (December 30, 2016).
70 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/2060.aspx, (December 30, 2016).
71 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/2064.aspx, (December 30, 2016).
72 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 82.
73 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 82.
74 “Integrated Program Management Report”, ACQuipedia, https://dap.dau.mil/acquipedia/Pages/ArticleDetails.aspx?aid=9b577e0d-144a-4622-a5d7-4ba9c3effc21, (January 10, 2017).
75 “EVM Interpretation and Issue Resolution Request”, EVM EARNED VALUE MANAGEMENT, http://www.acq.osd.mil/evm/issueRes.shtml, (December 30, 2016).
76 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 82.
77 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/2178.aspx, (December 30, 2016).
78 “AcqNotes Defense Acquisitions Made Easy,” acqnotes.com, acqnotes.com/acqnote/tasks/line-of-balance, (March 29, 2018).
79 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 82.
80 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/2247.aspx, (December 30, 2016).
81 “AGENCY IN BRIEF”, MDA, https://www.mda.mil/about/about.html, (February 23, 2018).
82 “NDIA”, NDIA, http://www.ndia.org/Pages/default.aspx, (December 30, 2016).
83 “Naval Sea Systems Command”, Wikipedia, https://en.wikipedia.org/wiki/Naval_Sea_Systems_Command, (January 9, 2017).
84 OUSD AT&L (PARCA), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 83.
85 “Office of the Secretary of Defense”, U.S. Department of Defense, http://www.defense.gov/About-DoD/Office-of-the-Secretary-of-Defense, (December 30, 2016).
86 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 83.
87 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 83.
88 “ABOUT PARCA”, PARCA PERFORMANCE ASSESSMENTS AND ROOT CAUSE ANALYSES, http://www.acq.osd.mil/parca/, (December 30, 2016).
89 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 84.
90 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 84.
91 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/2423.aspx, (December 30, 2016).
92 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/2488.aspx, (December 30, 2016).
93 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/Pages/2497.aspx, (December 30, 2016).
94 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/2564.aspx, (December 30, 2016).
95 DEPARTMENT OF DEFENSE, DEFENSE CONTRACT MANAGEMENT AGENCY, INSTRUCTION Earned Value Management System Compliance Reviews, 22.
96 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 85.
97 “Acquisition Community Connection”, DAU, https://acc.dau.mil/CommunityBrowser.aspx?id=526665, (March 15, 2017).
98 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 85.
99 “Supervisor of Shipbuilding, Conversion and Repair”, NAVAL SEA SYSTEMS COMMAND, http://www.navsea.navy.mil/Home/SUPSHIP/, (December 30, 2016).
100 “Glossary of Defense Acquisition Acronyms and Terms,” DAU, https://dap.dau.mil/glossary/pages/3339.aspx, (December 30, 2016).
101 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 86.
102 “Welcome”, Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, http://www.acq.osd.mil/, (December 30, 2016).
103 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 86.
104 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 86.
105 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 87.
106 OUSD A&S (AAP), DEPARTMENT OF DEFENSE EARNED VALUE MANAGEMENT SYSTEM INTERPRETATION GUIDE: 87.