Sopi 6.0 Final
TABLE OF CONTENTS
1.0 PURPOSE..............................................................................................................................4
1.1 Prerequisites .................................................................................................................... 4
2.0 REFERENCES..........................................................................................................................4
3.0 SCOPE ..................................................................................................................................5
4.0 SRB Programmatic Team Structure...........................................................................................6
4.1 SRB Programmatic Team Lead............................................................................................. 7
4.2 Schedule Analyst ............................................................................................................... 7
4.3 Cost Analyst ..................................................................................................................... 7
5.0 SRB Programmatic Team Role..................................................................................................8
6.0 Independent Programmatic Assessment LCR Workflow ...............................................................8
6.1 Life Cycle Review (LCR) Planning.......................................................................................... 9
6.2 LCR Analysis and Feedback ................................................................................................11
7.0 Data Drops ......................................................................................................................... 17
8.0 Tailoring ............................................................................................................................. 18
9.0 LCR ToR.............................................................................................................................. 18
10.0 Dissenting Opinions............................................................................................................ 18
11.0 Requirements Tailoring....................................................................................................... 19
11.1 Program Project Management Board (PPMB) .....................................................................19
11.2 Compliance Matrix .........................................................................................................19
12.0 Cost Assessment ................................................................................................................ 19
12.1 Cost Requirements Review Process ...................................................................................20
12.2 Cost Assessment Process .................................................................................................21
13.0 Schedule Assessment ......................................................................................................... 29
13.1 Schedule Requirements Review Process ............................................................................29
13.2 Schedule Assessment Process ..........................................................................................31
14.0 Confidence Level Requirements Review Process ..................................................................... 40
14.1 Cost and Schedule Range Estimate Assessments (KDP-0/KDP-B) ............................................41
14.2 Joint Confidence Level Assessment ...................................................................................45
Appendix A: SRB Programmatic Team Aides and Product Templates ................................................ 51
Example BOE Assessment Criteria ............................................................................................51
Appendix B: SRB Programmatic Team Planning Schedule ................................................................ 53
Appendix C: Acronyms............................................................................................................... 57
Revision: 6.0 Document No: OCFO-SID-0002
Release Date: May 23, 2017 Page 4 of 59
Title: SRB Programmatic Assessment Process
1.0 PURPOSE
NASA’s Office of the Chief Financial Officer (OCFO) is responsible for functional oversight of independent programmatic assessments and for this Standard Operating Procedure Instruction (SOPI) for Standing Review Board (SRB) Independent Programmatic Assessment Processes. This SOPI documents OCFO best practices for conducting an independent programmatic assessment within the SRB construct. Its purpose is to document the SRB Programmatic Team processes that support completion of an independent assessment of a project throughout the program/project life cycle, per NASA Procedural Requirement (NPR) 7120.5E. The expectation is that the following processes will be followed as part of any programmatic support to an SRB.1
Note that this instruction uses the word “independence” in broad terms, and it encompasses the term
“independent” that is used extensively in NASA policy and requirements documents.
1.1 Prerequisites
Qualified programmatic analysts on an SRB should possess knowledge and/or prior experience in one or more of the following subject areas:2
2.0 REFERENCES
NPR 7120.5E, NASA Space Flight Program and Project Management Requirements
1 See Section 8.0 for guidance on tailoring the programmatic review process.
2 It is recognized that many programmatic analysts will have expertise and experience in a subset of the areas listed here. It is also recognized that the importance of each area to the review depends on which life-cycle phase is being reviewed and on the scope of that review. As such, the OCFO will work with programmatic analysts, Mission Directorates, and SRBs to ensure that the SRB Programmatic Team has the requisite collection of skill sets.
3.0 SCOPE
This SOPI applies to all independent programmatic assessment activities conducted by the SRB Programmatic Team. The independent programmatic assessment consists of three phases: Life Cycle Review (LCR) Planning; LCR Analysis & Feedback; and Final LCR Report, Presentations, and Closeout. These phases cover the independent programmatic assessment from planning the review through the LCR, which culminates with the final report and presentation of materials to the governing Program Management Council (PMC). It is the SRB Programmatic Team lead analyst’s responsibility to ensure that government and contractor personnel supporting the independent programmatic assessment adhere to all SOPI requirements.
The independent programmatic assessment includes a review of strategic goal alignment, development
of project control plans, requirements management, scheduling, workforce planning, resource
management, budgeting, cost estimating, acquisition strategy planning, contract management, risk
management, performance tracking, and performing the project programmatic functions: planning,
execution, tracking, assessment, and reporting out. The Agency does not have a required standard
organizational structure that dictates where these programmatic functions reside. They could reside in
the Business Management Division, Program Planning and Control Office, a technical organization like
Systems Engineering and Integration, or the Office of the Chief Engineer. Wherever they reside, their
processes and products are related and should use the same requirements, work breakdown structure (WBS), and planning assumptions, while adhering to NASA policies and directives.
The SRB Programmatic Team is not only assessing how each functional area performs, but also how the
project coordinates and interacts across each programmatic function to ensure that, for example, both
the budget and scheduling products are using consistent assumptions for planning and analysis purposes.
The SRB Programmatic Team should coordinate with both the project and the SRB to conduct the
independent LCR process through a parallel approach. The parallel approach is for the SRB Programmatic
Team to maximize the use of a project’s existing products and to engage within the project LCR meetings,
boards, and/or products development cycle to minimize the impact to the project while balancing the
requirements of the SRB LCR. All products and information requested by the SRB Programmatic Team are
in accordance with NPR 7120.5E requirements and therefore should be readily available.
The SRB Programmatic Team will develop an Independent Programmatic Assessment Plan (IPAP) to
conduct the independent LCR. The IPAP contains what programmatic assessments will be conducted,
project life cycle product delivery dates, and reporting out requirements for the SRB Programmatic Team
assessment.
The LCRs are essential elements of conducting, managing, evaluating, and approving spaceflight programs and projects. The program manager is responsible for planning and supporting the LCRs. The SOPI focuses on the unique programmatic requirements defined in NPR 7120.5E; however, it should be recognized that the purpose of reviewing those products is to support the Standing Review Board’s
assessment of the following six criteria (as identified in NPR 7120.5E). Results of any assessment should focus on the information that senior management needs to make decisions going forward.
• Alignment with, and contribution to, Agency strategic goals and the adequacy of requirements flow down from those strategic goals.
o The scope of this criterion includes alignment of program/project requirements and designs with Agency strategic goals, constraints, mission needs, and success criteria; allocation of program requirements to projects; and proactive management of changes in program and project scope and shortfalls.
• Adequacy of management approach.
o The scope of this criterion includes program and project authorization, management framework and plans, acquisition strategies, and internal and external agreements.
• Adequacy of technical approach, as defined by NPR 7123.1B entrance and success criteria.
o The scope of this criterion includes flow down of project requirements to systems/subsystems; architecture and design; and operations concepts that respond to and satisfy imposed requirements and mission needs.
• Adequacy of the cost and schedule estimates and funding strategy in accordance with NPD 1000.5B.
o The scope of this criterion includes cost and schedule control plans; cost and schedule baselines that are consistent with the program and project requirements, assumptions, risks, and margins; Basis of Estimate (BOE); Range Estimate and Joint Confidence Level (JCL) (when required); and alignment with planned budgets.
• Adequacy and availability of resources other than budget.
o The scope of this criterion includes planning, availability, competency, and stability of staffing, infrastructure, and industrial base/supply chain requirements (for example, thermal vacuum chamber availability).
• Adequacy of the risk management approach and risk identification and mitigation per NPR 8000.4A.
o The scope of this criterion includes risk management control plans, open and accepted risks, risk assessments, risk mitigation plans, and resources for managing or mitigating risks.
Depending on the milestone, mission category, and life-cycle phase, the size and make-up of the SRB Programmatic Team is tailorable and may be fewer than three analysts, with the assessment responsibilities distributed appropriately. Regardless of the size of the SRB Programmatic Team, the underlying functions of lead, cost, and schedule are still required. For reference, Table 1: SRB Programmatic Team Size provides guidance on team size.
Table 1: SRB Programmatic Team Size
In addition to leading the programmatic analyst team, the SRB Programmatic Team Lead tailors the IPAP,
serves as the primary point of contact for interfacing with the project’s programmatic points of contact,
develops the SRB planning schedule for the independent programmatic assessment, tracks the SRB
Programmatic Team assessment progress, ensures completion of the final report or briefings, and archives
assessment information at the end of the review. The Lead should work closely with the SRB Chair and the Review Manager (RM) throughout the assessment.
The Lead is also responsible for coordinating with the OCFO if additional resources are required to
adequately assess the project life cycle products and to conduct the review. OCFO will coordinate with
Mission Directorate if additional resources are required.
• Pre-Phase A and Phase A: LCRs for projects in the formulation phase, covering activities such as standing up a project, developing requirements and governance control plans, and preparing preliminary cost and schedule estimates (e.g., System Requirements Review [SRR], System Definition Review [SDR], Key Decision Point B [KDP-B])
• Phase B: LCRs that approve a project baseline for cost and schedule (e.g., Preliminary Design Review [PDR], Key Decision Point C [KDP-C], Rebaseline Review)
• Phase C/D/E: LCRs for projects in the implementation phase, where progress can be measured by performance to baseline (e.g., Critical Design Review [CDR], System Integration Review [SIR], Key Decision Point D [KDP-D], Operational Readiness Review [ORR])
The SRB Programmatic Team is officially released from its independent programmatic assessment review
duties once the appropriate PMC closes with no further documented actions for the SRB Programmatic
Team to support.
3 For more information on the ToR, refer to the NASA Standing Review Board Handbook, Appendix H.
practices to complete the LCR process and are tailorable to conduct the independent programmatic
assessment. The Review Manager is responsible for the schedule of the SRB, and the guidance below may
need to be adjusted to align with the SRB planning schedule. These steps do not include additional analysis
or actions the SRB Chair or Review Manager may request the SRB Programmatic Team to perform.
The independent programmatic assessment process contains three phases during the LCR and is detailed
in the NASA Standing Review Board Handbook:
• LCR Planning: For a new SRB Programmatic Team and/or SRB, the early planning stage includes standing up the SRB, completing training, beginning communication with the project, beginning development of an IPAP, and formulating a life cycle review plan for the ToR. For an SRB Programmatic Team already assigned to an existing SRB in a follow-on review, the scope of the early planning includes interfacing with the project to develop/adjust a review plan for the next milestone review as well as developing the IPAP and starting the IPA.
• LCR Analysis & Feedback: This phase begins after the project’s first LCR data drop; the SRB Programmatic Team assesses the project products and provides feedback to the project. The SRB Programmatic Team integrates the programmatic assessment in preparation for the SRB discussions that identify the project strengths and weaknesses. The goal of this phase is open and continuous communication with the SRB and project to ensure a successful LCR.
• Final LCR Report, Presentations, & Closeout: This final phase is the process for the SRB to develop the final out-briefs to the project and governing PMCs in preparation for the KDP. The Agency collects the SRB assessment information for archiving and lessons learned from the LCR.
6.1.2 SRB Programmatic Team Coordination with the SRB Chair and Review Manager
The purpose of this formal meeting between SRB Programmatic Team, SRB Chair, and Review Manager is
to discuss the team’s role and the required SRB technical team inputs to complete the independent
programmatic assessment. The goal is to ensure roles and responsibilities are defined at the beginning of the independent assessment and that the independent assessment aligns with the expectations of the Agency and SRB Chair. The OCFO is available to facilitate communication.
6.1.5 Coordinate with the Project on Data Drops, LCR Timeline Flow, and Feedback Loop
The discussion meetings between the SRB and project establish the framework for the LCR to determine
when the project will have the appropriate LCR products available and how the team will conduct the
independent programmatic assessment. The meeting can be face-to-face, by teleconference, or via email. The SRB Programmatic Team should discuss the data drops and review timeline with the project and make changes as needed.
These meetings create a basis with the project for determining LCR data drops (refer to 6.2.1 Data Drop
1) to finalize the IPAP, planned SRB Programmatic Team independent assessment schedule, and work out
any disconnects between the SRB and the project. Appendix B: SRB Programmatic Team Planning
Schedule contains an example of a team planning schedule for independent programmatic assessment. If
a JCL is required for the LCR, then a JCL agreement may be developed to detail the type of JCL model, data
to be included in the model, and planned delivery or revision dates. These agreements will facilitate
writing the ToR.
6.1.6 Coordinate with the Review Manager on Life Cycle IPAP for ToR
The SRB Programmatic Team should discuss the results of the meetings between the team and project with the SRB Chair and Review Manager. This should include:
• The agreed SRB Programmatic Team and project plan for programmatic LCR data drop products and delivery dates
• The continuous communication plan with the project (e.g., weekly or bi-weekly meetings)
• Any IPAP tailoring of the programmatic LCR data drop products, or of the independent programmatic assessments the SRB Programmatic Team will conduct, that needs to be reflected in the ToR
• The Review Manager’s inputs (e.g., SRB caucus, travel, schedule conflicts) into the SRB Programmatic Team independent assessment planning schedule
• Communication of the IPAP with the OCFO. Per NPR 7120.5E, ToR concurrence includes the OCFO.
4 Please note that consultant contractors supporting the SRB may need additional time to set up data access because some repositories may require a NASA email (.gov) account.
inputs needed to complete the IPA, such as technical risk identification and assessment, or uncertainty boundary analysis. This will ensure the Review Manager has scheduled SRB technical members to provide the appropriate assessments and feedback to the SRB Programmatic Team to complete the IPA. The best practice is to conduct a preliminary risk assessment meeting with the SRB prior to the life cycle review site review.
6.2.6 Develop Final JCL or Cost/Schedule Risk Analysis Model Updates with SRB Inputs (If
Applicable)
Update the IPA for any new risks or changes to the independent risk assessment or the cost and schedule uncertainty boundaries. If applicable, the SRB Programmatic Team should adjust the JCL or cost/schedule risk analysis models and provide model output reports to the SRB for review. Prepare the IPA for the OCFO Checkpoint Review. Please refer to Section 14.0 Confidence Level Requirements Review Process for range estimate and JCL assessment details.
6.3.6 Final Independent Assessment Findings to SRB Chair and the SRB
The SRB Programmatic Team briefs the results of the IPA and associated presentations to the SRB Chair and the SRB. These products typically form the programmatic assessment findings portion of the SRB presentation materials.
5 Strategic Investments Division SRB website (https://community.max.gov/display/NASA/Standing+Review+Board+%28SRB%29+Repository).
include, but is not limited to, BOEs, uncertainty, risks, parametric models, model assumptions, cost and
schedule benchmarks, and review plans.
Refer to Appendix B: SRB Programmatic Team Planning Schedule for Independent Programmatic
Assessment for the detailed timeline of the team LCR process mapped to appropriate workflow processes.
Analysts should use the file naming and archiving file structure listed below. For each folder and individual file name, it is recommended to include the mission directorate, program or project, and review type as the standard prefix during the execution of the LCR. This is important for archiving for follow-on LCRs and for analogous-mission research supporting SRB assessments of future programs and projects. For each LCR, the OCFO establishes a secure website to allow collaboration and file storage for the assigned SRB Programmatic Team. Figure 1: SRB Programmatic Analysis Archive File Structure shows an example of the file structure.
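The recommended standard prefix can be sketched as a small helper. The underscore separator, the space-to-hyphen substitution, and the example names below are illustrative assumptions; the authoritative layout is the one shown in Figure 1.

```python
def archive_name(mission_directorate: str, project: str,
                 review_type: str, artifact: str) -> str:
    """Build a folder or file name using the recommended standard
    prefix: mission directorate, program/project, and review type.
    The separator and ordering here are assumptions for illustration;
    follow the structure shown in Figure 1."""
    parts = (mission_directorate, project, review_type, artifact)
    return "_".join(p.strip().replace(" ", "-") for p in parts)

# Hypothetical example: a PDR cost assessment workbook
print(archive_name("SMD", "Example Project", "PDR", "Cost Assessment.xlsx"))
# -> SMD_Example-Project_PDR_Cost-Assessment.xlsx
```

A consistent, machine-parseable prefix is what makes the archive searchable when later teams mine prior reviews for analogous-mission data.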
Figure 1: SRB Programmatic Analysis Archive File Structure (with example files)
• Data Access: Project provides access to required repositories for the LCR and overview documentation (e.g., project plan, WBS dictionary, latest monthly status briefing) to assist the SRB Programmatic Team in understanding the project prior to the beginning of the LCR
• Data Drop 1: Project provides preliminary required programmatic LCR products
• Data Drop 2: Project provides final required programmatic LCR products
NPR 7120.5 defines project data drop deliverables. Data drop delivery dates should be included in the IPAP and the programmatic section of the ToR. The NASA Standing Review Board Handbook provides recommended timelines for data drops.6
6 All timelines should be documented in the ToR and agreed to between the SRB and the project. Timelines are often negotiated to accommodate project and Center processes.
Data Drop 2: Final range estimate or JCL model and analysis schedule (if range or JCL required) and/or any updates to the risk list, matrix, cost estimate, budget, schedule, and project documents. Applicability: single project programs, loosely coupled projects, uncoupled projects, or tightly coupled programs. Due 20 calendar days prior to LCR.**
* The list of programmatic cost and schedule data for each independent LCR is found in the NASA
Standing Review Board Handbook.
**For two-step LCR. The timeline is with respect to the second step of the independent LCR.
8.0 TAILORING
The criteria documented in NPR 7120.5E provide the emphasis and depth of analysis required. Whenever possible, the general review process for each LCR should be followed. However, in certain cases, the amount of programmatic data for review and the depth of analysis may be more or less than for standard projects or tightly coupled programs, and thus the analysis and reporting can be tailored appropriately. Tailoring should be captured in the ToR.
review and decision by higher level management, and the individual specifically requests that the dissent
be recorded and resolved by the Dissenting Opinion process. For details regarding the dissenting opinion process, please refer to the NASA Governance and Strategic Management Handbook,7 NASA Space Flight Program and Project Management Requirements,8 Section 3.4, Process for Handling Dissenting Opinions, and the NASA Standing Review Board Handbook.9
The Office of the Chief Engineer chairs the Program and Project Management Board with members from the mission directorates, the OCFO, Centers, and the Jet Propulsion Laboratory (JPL).
While some steps of the cost assessment process are mechanistic, often assessment and especially
estimation is a predictive process for which judgment and experience add value. Effective assessment and
7 NPD 1000.0B
8 NPR 7120.5E
9 NASA/SP-2016-3706 RevB
10 NPR 7120.5E, Appendix C, Compliance Matrix
estimation requires an understanding of the technical work to be performed. Please note that programmatic analysts are intended to perform an assessment of the project programmatic processes and products, including life cycle cost estimates; an independent cost estimate (ICE)11 is not required. However, there may be instances where benchmark12 estimates are required of the SRB Programmatic Team as part of the LCR; see Section 12.2.2 Assessment of Reasonableness. For guidance on the process and methodologies for developing benchmark estimates, please refer to the NASA Cost Estimating Handbook.
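The benchmarking idea can be illustrated with a minimal parametric sketch: estimate cost from a driver variable via a cost estimating relationship (CER), then compare the project's estimate against that benchmark. The CER form, coefficients, and the example numbers below are invented for demonstration; real benchmarks should follow the NASA Cost Estimating Handbook.

```python
def cer_cost(mass_kg: float, a: float = 0.9, b: float = 0.92) -> float:
    """Toy mass-based cost estimating relationship (CER):
    cost ($M) = a * mass^b. Coefficients are invented for
    illustration; real CERs are fit to normalized historical data."""
    return a * mass_kg ** b

def benchmark_delta(project_estimate_m: float, mass_kg: float) -> float:
    """Percent difference between the project's estimate and the
    CER benchmark, a simple reasonableness signal."""
    bench = cer_cost(mass_kg)
    return (project_estimate_m - bench) / bench * 100.0

# Hypothetical project: $260M estimate for a 450 kg spacecraft
print(f"Delta vs. benchmark: {benchmark_delta(260.0, 450.0):+.1f}%")
```

A large positive or negative delta does not by itself indicate an error; it is a prompt for the analyst to ask why the project is out of family.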
The intent of the cost assessment is to show the level of confidence that the Agency can commit to
externally to accomplish its technical goals while executing its plan on schedule and within budget.
This section only addresses the cost portion of the above requirement. The schedule section is in 13.0
Schedule Assessment.
12.1.1 Scope
The scope of the cost assessment should include the entire life cycle of the project, or as defined in the ToR. This typically includes Phase A through Phase E. The SRB assesses the Agency’s commitment to the project, so there could be items that the project manager is not actively managing that still need to be part of the assessment.13 Cost contributions to the mission (e.g., foreign contributions) do not need to be directly assessed for JCL purposes; however, costs associated with contribution risks, such as fallback options and delivery schedule risks and uncertainties that could affect the project, should be noted in the analysis.
11 Independent Cost Estimate: A quantitative assessment and estimate, performed independently from the SRB Programmatic Assessment, resulting in an independent cost estimate of the project’s life cycle cost.
12 Benchmarking: Comparative analysis of the project’s cost and schedule plan to determine reasonableness, used to inform the SRB Programmatic Assessment. Benchmarking can be performed using any of the methodologies specified in the NASA Cost Estimating Handbook (including historical, analogy, and parametric estimating).
13 Launch vehicle procurement is an example for Science Mission Directorate missions. Typically, launch vehicle procurement is managed by the Launch Services Program and the mission directorate.
BOEs should include the scope, technical description, cost phasing, estimating methodology, ground rules
and assumptions, exclusions, and risks.
The SRB Programmatic Team should address the following attributes for each BOE:
• Task(s) Description
o Is there a detailed explanation of how the work will be accomplished?
o Does the BOE have any unique ground rules and assumptions to consider?
• Rationale and Methodology
o Is the estimating methodology (e.g., parametric, analogous, grassroots, cost estimating relationships [CERs]) appropriate for the given milestone?
o Are adjustments and assumptions (e.g., complexity factors, learning curve) adequately explained?
• Source Data
o Does the data come from a credible source, and is it representative of the work being estimated?
o Can the assessor verify and/or access the data upon request?
o Is the supporting data current, accurate, and complete?
• Accuracy
o Are any supporting equations documented (e.g., CERs, rates, factors)?
o Are the BOE calculations correct (i.e., has a check been done to ensure the BOE is free of errors)?
• Compile Data
o Request project data
o Compile historical data
• Review
o Read and comprehend the project scope, assumptions, liens, threats, risks, and any exclusions.
o Determine if the estimates add up or contain errors.
• Analysis/Validation
o Determine if the estimates make sense, and if any of the excluded items are required.
o Ascertain if the project is within the family of comparable historical projects.
• Document
o Write up any questions and/or concerns.
o Note which comparable historical project information was used in the assessment.
• Discuss/Brief
o Talk to the project frequently; ask questions; share concerns.
o Talk to the SRB about findings, especially issues, concerns, and observations.
• Iterate
o If/as required
Reviewing estimates is largely covered above in Section 12.1 Cost Requirements Review Process. The estimate “review” is typically qualitative in nature, focused on ensuring the estimate meets requirements and best practices.14 This section specifically covers the Analysis/Validation step discussed above. The Analysis/Validation step is broken into two parts:
• Explain how the estimate for each element was determined (e.g., grassroots and bottom up, parametric, analogy, fixed-price vendor quote, pass-through from another organization).
• Explain why that estimating methodology was chosen and how the estimate was developed.
o For grassroots estimates, identify the data sources used that provide an accurate estimate of the schedule and cost required to complete the project.
o For parametric estimates, identify the model(s) used, the major assumptions that went into the models, and the rationale for those assumptions.
o For analogy-based estimates, identify the missions/systems used and explain why each is an applicable analog. If the project estimate is out-of-family, explain why.
For fixed-price quotes, the SRB Programmatic Team should assess the level of maturity of the hardware
to be delivered as well as the vendors’ history in delivering that type of hardware on time and for the
promised cost. Analysts should work with the projects to obtain the required information to perform this
type of vendor assessment.
The SRB Programmatic Team should examine how workforce estimates were created (e.g., by cost- or resource-loading the schedule or by some other method), and assess the assumptions behind the workforce ramp-up and ramp-down and the outcome of any workforce sensitivity analysis.
To independently assess this information, the programmatic analyst(s) should determine if the project’s
estimate is documented, traceable, complete, reasonable, and consistent with analogous missions or
systems. The estimate should follow the project lead Center guidance (e.g., GSFC Gold Rules) and any
other requirements (e.g., Announcement of Opportunity). The SRB Programmatic Team should consider
the experience level of the project team and identify areas where the project and reviewers agree and
disagree.
14 Best practices include those covered in the NASA Cost Estimating Handbook and by JPL, the Goddard Space Flight Center (GSFC), and Johns Hopkins Applied Physics Laboratory (APL) guidance principles.
When assessing grassroots estimates, assessors should consider the following questions as part of the
assessment:
• Have the project personnel provided sufficient information about the planned work, their
experience in doing or managing similar work, and how they developed their cost and schedule
estimates to provide high confidence in the accuracy of those estimates?
• Is the entire mission content covered in the project’s estimate? If not, what is missing and what
is the rationale for excluding it?
• Are the technical requirements stable?
o Are there potential changes to requirements, whether within a single project element or
handed across an interface from one element to another, that have not been accounted
for in the estimate but could drive cost and schedule changes and cause the grassroots
estimate to be inaccurate?
• Are the hardware/software requirements and designs mature enough to enable an accurate
estimate of the resources required to do the planned work? How mature are the technologies
and/or technical approaches the project plans to use?
o Is there any hardware or software that has not been built and/or flown in space over the
past five years? If so, have viable alternatives been identified?
o Is there a plan regarding how and when a decision to use alternative designs or
technologies will be made?
o Does the project’s estimate fund this plan, including the cost of carrying both alternatives
until the decision is made?
When assessing parametric estimates, assessors should consider the following questions as part of the
assessment:
• Is the modeling approach appropriate to the project’s point in the life cycle?
• Is the model’s database sufficiently analogous to the project or to the individual project element
being estimated that the model can produce a reliable estimate?
• Has the estimator identified all the model inputs (i.e., assumptions and parameters)? Are these
assumptions and parameters reasonable?
• Does the model include the entire project’s content?
• Does the project have external dependencies (e.g., international partners)?
• What assumptions or inputs had the greatest impact on the model’s output?
• Did the estimator do a sensitivity analysis by varying model inputs or using multiple models? If so,
what were the results of the sensitivity analysis?
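The sensitivity-analysis question above can be made concrete with a small, purely illustrative sketch. The CER form, coefficients, and mass values below are invented for illustration and are not drawn from this process or any NASA model:

```python
# Illustrative one-at-a-time sensitivity check on a hypothetical
# weight-based CER of the form cost = a * mass^b (all values notional).
def cer_cost(mass_kg, a=2.5, b=0.8):
    """Notional cost estimating relationship: cost ($M) from dry mass (kg)."""
    return a * mass_kg ** b

def sensitivity(baseline_mass, swing=0.20):
    """Vary the mass input +/- swing and report the resulting cost range."""
    low = cer_cost(baseline_mass * (1 - swing))
    high = cer_cost(baseline_mass * (1 + swing))
    base = cer_cost(baseline_mass)
    return base, low, high

base, low, high = sensitivity(500.0)
# A wide low/high spread relative to the baseline flags the varied input
# as a driver worth documenting in the sensitivity-analysis discussion.
```

In practice the same loop would be run over every major model input to identify which assumptions most move the output.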
When assessing analogy-based estimates, assessors should consider the following questions as part of
the assessment:
• Which analogs were used? Was sufficient information on each chosen analog provided to
determine that it is an appropriate analog?
o Is the analog applicable at the top level or at the detailed subsystem level? How might
that change the estimate?
o Are any of the analogs a poor choice for developing the project’s estimate? Are there
more appropriate analogs that should have been used?
• If information on the chosen analogs was not provided, did the estimator provide a rationale for
analog selection?
• Where does this project fit within the overall envelope of costs and schedule durations, in total
and at lower levels?
• Does the project have external dependencies (e.g., international partners)?
• Is the project at or near an edge of the envelope in key areas of the project (e.g., where there are
known issues and risks)?
When assessing performance-based estimates, assessors should consider the following questions (often
used for later LCRs: CDRs, Systems Integration Reviews [SIRs], and Operations Readiness Reviews [ORRs]):
• Has the technical baseline changed since the last LCR or major planning milestone (e.g., as the
result of the annual planning, programming, budgeting, and execution [PPBE] cycle)?
• Does the project track earned value management (EVM)?
o How is the project’s EVM performance?
• Does the project have external dependencies (e.g., international partners)?
• Is there a launch window that could drive resource allocation?
• What is the nature of the prime contracts (e.g., firm-fixed-price, cost-plus-fixed-fee)?
• If performance is not to plan, what are the causes (e.g., realized risks, incorrect estimates)?
• What are the project risks, threats, and opportunities? Are they captured in the plan?
• Are the project’s sensitivity analyses appropriate based on past performance?
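Where the project tracks EVM, the basic performance indices behind the EVM performance question above can be computed directly. A minimal sketch with notional values; the variable names follow common EVM usage, not a NASA data format:

```python
# Minimal sketch of the standard EVM performance indices the team
# might compute from project-reported values (numbers are notional).
def evm_indices(bcws, bcwp, acwp):
    """Return schedule and cost performance indices (SPI, CPI).

    bcws: budgeted cost of work scheduled (planned value)
    bcwp: budgeted cost of work performed (earned value)
    acwp: actual cost of work performed (actual cost)
    """
    spi = bcwp / bcws   # < 1.0 indicates behind schedule
    cpi = bcwp / acwp   # < 1.0 indicates over cost
    return spi, cpi

spi, cpi = evm_indices(bcws=10.0, bcwp=9.0, acwp=12.0)
# spi = 0.9 (behind schedule), cpi = 0.75 (over cost)
```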
The SRB Programmatic Team should assess whether the planned funding profile adequately supports the
project. The goal of the assessment is to determine if the project’s funding, including unallocated future
expenses (UFE), is available when needed.
One specific area the SRB Programmatic Team should analyze is the annual cost phasing and budget/New
Obligation Authority (NOA) by fiscal year, showing how it supports the project’s proposed schedule and
deliverables. The SRB Programmatic Team should also assess how the phasing plan was developed,
including the assumptions and strategies used, particularly as they relate to the BOE, historical analogs,
the project’s proposed schedule and deliverables, and the SRB’s assessment. The SRB Programmatic Team
should address whether the proposed phasing supports the project’s ability to execute its plan. Lastly,
the SRB Programmatic Team should address how much cost carryover is assumed in each year of the
project, both in absolute dollars and in weeks or months of work.
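The carryover question can be illustrated with a small sketch that converts an assumed year-end carryover into months of work; all figures are hypothetical:

```python
# Notional sketch: express an assumed year-end cost carryover both in
# dollars and in months of work, using the next year's planned monthly
# burn rate (all figures hypothetical, e.g., $M).
def carryover_months(carryover_dollars, next_year_budget):
    """Convert a carryover amount to months of work at next year's rate."""
    monthly_burn = next_year_budget / 12.0
    return carryover_dollars / monthly_burn

months = carryover_months(carryover_dollars=6.0, next_year_budget=36.0)
# 6.0 carried over against a 3.0-per-month plan -> 2.0 months of work
```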
Benchmarking is a cost or schedule analysis conducted to determine the reasonableness of the project’s
submitted estimate (cost or schedule) or to assess a specific input to the estimate. Benchmarking
can be performed using any of the cost methodologies specified in the NASA Cost Estimating Handbook,
Appendix C.
When and where benchmarking is required is left to the SRB Programmatic Team’s discretion. However, it
is recommended that some benchmarking be done for the following project product attributes:
• High-risk subsystems/elements that are significant drivers of cost and schedule
• Elements of the project’s BOE that do not pass the BOE assessment criteria
• Elements of the project where the SRB requests further analysis to fully understand the estimate
and/or risk posture
• Elements of work for which only preliminary or ROM cost and schedule estimates exist (i.e., Phase
E/F estimates at early review gates)
If all or a majority of a project’s estimates have incomplete BOEs or appear unrealistically optimistic,
benchmarking of all activities may be warranted. Conversely, if an element within a project has a well-
defended and traceable BOE, a benchmarking activity may not be warranted.
Benchmarking rationale and results should be communicated to the SRB and the project.
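As a purely illustrative example of the simplest benchmarking check, an element estimate can be compared against the envelope spanned by analogous missions; the analog values below are invented:

```python
# Hedged sketch of a simple benchmarking check: does the project's
# element estimate fall within the range spanned by analogous missions?
# Analog values ($M) are invented for illustration.
def benchmark(estimate, analog_values):
    """Flag an estimate that falls outside the analog min/max envelope."""
    lo, hi = min(analog_values), max(analog_values)
    in_family = lo <= estimate <= hi
    return in_family, lo, hi

in_family, lo, hi = benchmark(42.0, [38.0, 55.0, 47.5, 61.0])
# 42.0 sits inside [38.0, 61.0], so it is "in family"
```

A real benchmarking exercise would use one of the Handbook's methodologies rather than a raw min/max test, but the in-family/out-of-family framing is the same.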
Table 4 summarizes expectations and responsibilities for cost, by phase, for the SRB Programmatic
Team.
Table 4: Cost. Phases: Concept & Technology Development (SRR); Preliminary Design & Technology
Completion (PDR); Final Design & Fabrication (CDR). The source or responsibility for each product is
shown in parentheses.
Compile
• Project Estimates & BOE15 (Project)
o SRR: Exist for the phase the project is entering into; derivation of out-phases will be at a
higher level
o PDR: Exist, detailed estimate
o CDR: Exist, with contract award data and detailed estimates for self-performed work
• Project Summary Schedule (Project)
o SRR: Exist
o PDR and CDR: Exist, aligned with integrated master schedule
• Historical Data (ONCE)
o All phases: Project & SRB Programmatic Team compile
Review (SRB Programmatic Team)
• Project Estimates & BOE: Understand scope, content, & layout
• Historical Data: What is the range of timelines & budgets? Which mission or missions are most
like the project?
Analyze & Validate (SRB Programmatic Team)
• Project Estimates
o SRR: Entire scope covered & documented
o PDR and CDR: Entire scope covered & documented, plus performance to date. Are any
corrections realistic?
• Project Summary Schedule
o SRR: Major tasks aligned to funding
o PDR and CDR: Major tasks aligned to funding, plus performance to date
• Historical Data: Timelines & budgets; is phasing within family?
Document (SRB Programmatic Team)
• Questions: If the SRB Programmatic Team doesn’t understand, the IG/GAO, etc., probably won’t
either. Ask questions to help the project tell its story.
• Findings: Identify any apparent disconnects. Explain them to the project so they can
validate/understand them.
• Historical Comparisons: Show comparisons to all similar missions, note any outliers, and identify
which mission(s) are most like the current project.
Discuss & Brief
• Project Team (SRB Programmatic Team & project): Talk to the project team first; ask questions;
determine whether disconnects are real or just a lack of clarity.
• SRB (SRB Programmatic Team & SRB): Talk to the SRB; see if they have any technical concerns
that may drive costs. If so, add them to the findings and identify them to the project team.
• Management (SRB, SRB Programmatic Team, & project): Identify any unresolved disconnects &
show historical comparisons.
15 BOE maturity may not be homogeneous in detail. For example, at SRR, the BOE for Phase B should have more
detail than the BOEs for Phase D.
The schedule assessment helps to determine whether the project has implemented scheduling best
practices and is in accordance with Agency requirements. The schedule assessment should validate that:
• The schedule control plan aligns with stakeholder objectives, and best practices are being used
to manage the project schedule;
• The availability of resources other than budget has been considered and appropriate resources
have been incorporated; and
• Risks have been identified and are being actively managed, consistently risk-informing the
schedule so that the project can make informed management decisions.
Another intent of the schedule assessment is to quantify the level of confidence with which the Agency
can externally commit to a project completion date, and the confidence that the project will be able to
accomplish its technical goals while executing its schedule.
This section only addresses the schedule portion of the abovementioned requirement. The cost section is
covered in Section 12.0 Cost Assessment.
13.1.1 Scope
The scope of the schedule assessment should include the entire life cycle of the project, or as defined in
the ToR. This typically includes Phases A through E. The foundation of the schedule assessment is the IMS.
Schedule BOEs should include documented rationale for project task durations. They may take a variety
of forms and may not be fully contained in one data product. The SRB Programmatic Team should verify
that all project constraints and assumptions, other supporting historical/analogous data sources, and
mappings to the cost BOE of the same WBS element are identified within the schedule BOE.
The SRB Programmatic Team should address the following attributes for each BOE:
• Task(s) Description
o Are the project schedule BOEs formally documented?
o Does the BOE have any unique ground rules and assumptions to consider?
o Is there a clear trace from the schedule to the costs?
• Rationale/Methodology
o Are the sources for deriving estimates identified: established standards, expert judgment,
analogous comparisons, time estimates based upon historical data from past/related
projects, parametric analysis, team brainstorming, and extrapolations from known data
and trends?
o Is the estimating methodology appropriate for the given milestone?
o Are adjustments and assumptions adequately explained?
o Are changes from previous estimates tracked?
• Source Data
o Is the schedule basis sound, realistic, and executable, such that activity durations are
based upon normal work schedules and calendars and do not contain padding or buffer?
o Are judgments and rationale well justified, analogies appropriate, and schedule estimating
relationships properly applied?
• Accuracy
o Is there any evidence of bias?
o Are activity durations based upon the effort required, available resources, and resource
efficiency?
o Are duration time units (i.e., work days) consistent throughout the schedule?
• Compile
o Request project data
o Compile historical data
• Review
o Read and comprehend the project scope, assumptions, liens, threats, risks, and any
exclusions
• Analysis/Validation
o Check that the schedule captures the project scope
o Check for schedule reasonableness
o Ensure that the project duration is within the family of analogous projects
o Understand how level-of-effort activities are captured in the schedule and treated by
project analysis
• Document
o Document questions and concerns
o Compare to historical project information
• Discuss/Brief
o Talk to the project frequently; ask questions; share concerns
o Talk to the SRB about findings, especially issues, concerns, and observations
Figure 1: SRB Programmatic Analysis Archive File Structure (with example files)
The schedule review focuses on ensuring the estimate meets requirements and best practices.16
Reviewing the schedule estimating methodology is covered in 13.1.2 Basis of Estimates; data drops are
covered in 7.0 Data Drops. This section specifically covers the Analysis/Validation step discussed above.
When assessing project schedules, the SRB Programmatic Team should consider the following areas and
questions as part of the assessment:
• Is the project plan complete and consistent with other project documents and supported by the
project’s scheduling approach?
The SRB Programmatic Team should assess whether the project’s schedule management approach,
whether documented in a stand-alone Schedule Management Plan, in the Project Plan, or in a combined
Technical, Schedule, and Cost Control Plan, captures content per the key areas in the Schedule
Management Plan Template (Schedule Management Handbook, Appendix F),17 and should assess its
general alignment with the scheduling processes and best practices detailed in the NASA Schedule
Management Handbook. The SRB Programmatic Team should review the documented approach,
techniques, and methods the project intends to use in implementing the schedule management process.
• Is schedule management, including tracking and control, being performed in accordance with and
integrated with the institutional EVM processes and methodologies on projects?
• Are there any efficiencies or deficiencies in the project’s processes or any issues with the project’s
ability to follow its processes as the project moves through its life cycle?
• Are appropriate analytical tools, reports, and information provided to managers to make
informed decisions?
The SRB Programmatic Team should understand the project’s major acquisitions in relation to the project
WBS and the integrated master schedule.
16 NASA Schedule Management Handbook, NASA/SP-2010-3403.
17 NASA Schedule Management Handbook, NASA/SP-2010-3403, Appendix F, March 2011.
• Are major deliverable items accurately included in the integrated master schedule?
For projects in phases C and D that are required to perform EVM, 18 the SRB Programmatic Team should
be aware of which contracts require EVM, as these contracts should be delivering Integrated Performance
Management Reports (IPMRs), a WBS, and an integrated master schedule that will inform the project. The
SRB Programmatic Team should also be familiar with any findings coming out of the Integrated Baseline
Review (IBR), which is held in preparation for KDP-C.
It is a NASA best practice for all reporting to trace from a single integrated master schedule dataset and
not from separate schedule sources.
• Does the integrated master schedule accurately reflect accomplished work and planned work?
• How is the project performing integrated master schedule updates, analyzing schedule impacts,
and resolving issues to provide updated schedule reporting to project management and necessary
customers?
The SRB Programmatic Team should review the project schedule reports; these reports are sometimes
contained in Monthly or Quarterly Status Reports (Monthly Status Reports [MSRs] or Quarterly Status
Reports [QSRs]). Schedule reports may include a management summary schedule, logic reports, critical
path reports, total slack reports, schedule risk reports, schedule margin metrics, and performance trends.
18“Projects in phases C and D with a life cycle cost estimated to be greater than $20 million and Phase E project
modifications, enhancements, or upgrades with an estimated development cost greater than $20 million are
required to perform earned value management (EVM) with an EVM system that complies with the guidelines in
ANSI/EIA-748, Standard for Earned Value Management Systems…. EVM system requirements shall be applied to
appropriate suppliers in accordance with the NASA Federal Acquisition Regulation (FAR) Supplement, and to in-
house work elements.” NPR 7120.5E. Chapter 2. Section 2.2.8.
• Do these agreements detail the level of reporting information the project expects to receive and
ensure that the project is incorporating the appropriate level of detail into the integrated master
schedule?
The SRB Programmatic Team should understand the project’s UFE posture as it relates to the integrated
master schedule, milestones, and schedule risk impacts.
The SRB Programmatic Team should verify that the integrated master schedule captures the total scope
of work at an appropriate level of detail. The SRB Programmatic Team should also review the technical
progress against schedule performance, including meeting technology readiness levels (TRLs).
If the project has experienced any technical performance issues, the SRB Programmatic Team should
communicate these with the SRB, as they may indicate additional risk to meeting schedule objectives and
may require further analysis.
• Are there changes in scope that were not part of the baseline plan?
o How are these changes incorporated into the schedule?
o Are there technical issues driving schedule performance?
Schedule credibility is determined by monitoring key indicators within the integrated master schedule
that reflect both good and poor characteristics of schedule structure and maintenance and support
scheduling best practices.
Examples of key indicators within the logic network that should be monitored include, but are not
limited to: missing predecessors and successors, invalid task constraints, omission of task status,
improper status on future tasks, logic ties to and from summary tasks, inaccurate logic ties, and
improperly reflecting tasks as milestones.
The SRB Programmatic Team should use a health and quality check to determine whether the project
schedule has been developed using NASA standard best practices per the NASA Schedule Management
Handbook. Typically, these metrics will have defined thresholds that should be considered as guidelines,
which serve as trigger points for additional analysis.
o Regardless of whether the project provides the SRB with a health and quality check, the
SRB Programmatic Team should perform an independent health/quality check such that
all necessary metrics are reviewed19 (i.e., analysis that focuses on additional metrics may
be necessary to reveal deficiencies not uncovered by the project).
Once the health and quality check of the schedule is performed, the SRB Programmatic Team should work
with the project to resolve issues within the schedule network. The objective is to resolve as many of the
health check issues as possible so that the schedule used for the schedule risk analysis is complete and
well vetted. Not all health check issues may be resolved, so the analysts should understand their impact
on future schedule risk analysis.
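For illustration, a few of the indicator metrics named above can be sketched over a toy task list. The data structure and any thresholds applied to the resulting percentages are assumptions for this sketch, not requirements of this process:

```python
# Illustrative schedule health/quality metrics over a toy task list,
# in the spirit of the indicator list above (data format is invented).
def health_check(tasks):
    """tasks: list of dicts with 'preds', 'succs', 'hard_constraint' keys."""
    n = len(tasks)
    # Tasks missing either predecessors or successors (open-ended logic).
    missing_logic = sum(1 for t in tasks if not t["preds"] or not t["succs"])
    # Tasks carrying a hard date constraint.
    constrained = sum(1 for t in tasks if t["hard_constraint"])
    return {
        "missing_logic_pct": 100.0 * missing_logic / n,
        "hard_constraint_pct": 100.0 * constrained / n,
    }

metrics = health_check([
    {"preds": [], "succs": [3], "hard_constraint": False},   # start task
    {"preds": [1], "succs": [], "hard_constraint": True},    # finish task
    {"preds": [1], "succs": [2], "hard_constraint": False},
])
# Metrics above a chosen guideline threshold would trigger the
# additional analysis described in the handbook.
```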
The SRB Programmatic Team should assess the current critical path as calculated in the integrated master
schedule. The analysis should be performed in several ways, first analyzing the critical path in the project’s
native schedule as calculated by the scheduling software. The team should examine the critical path for
these characteristics:
• Does the schedule critical path(s) start at the time of assessment and proceed as a continuous
path to project completion?
• Does the scheduling software tool generate the same critical path that the project is reporting?
• Does the schedule consist of tasks and milestones linked together with network logic, using
appropriate relationship types, in a sequence that is programmatically feasible or otherwise
makes sense from a workflow standpoint?
• Does the schedule have any unexplainable lags, leads, or constraints that cause unimportant
activities to drive a milestone?
• The schedule critical path(s) contains no level-of-effort activities, summary activities, or other
unusually long activities.
• The schedule has no gaps in time between tasks that cannot be explained.
• The integrated master schedule is derived from the integration of lower-level detailed schedules,
not from preselected activities that management has deemed critical.
Total float is fundamental to the critical path method (CPM) of scheduling. If the task or milestone that
represents the completion of the project has a hard constraint date assigned to it, the critical path could
carry a positive or negative total float value instead of zero. The SRB Programmatic Team should examine
the total float calculations for critical path activities as part of its critical path analysis.
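The forward/backward-pass arithmetic behind total float can be sketched on a toy network. This is a simplified illustration; real IMS tools also handle calendars, lags, and relationship types that this sketch omits:

```python
# CPM total-float sketch: forward pass for early dates, backward pass
# for late dates, total float as the difference (toy network, days).
def total_float(durations, preds):
    """durations: {task: days}; preds: {task: [predecessors]};
    tasks are assumed to be listed in topological order."""
    order = list(durations)
    es, ef = {}, {}                      # early start / early finish
    for t in order:
        es[t] = max((ef[p] for p in preds[t]), default=0)
        ef[t] = es[t] + durations[t]
    end = max(ef.values())               # project finish (no hard constraint date)
    succs = {t: [] for t in order}
    for t in order:
        for p in preds[t]:
            succs[p].append(t)
    ls, lf = {}, {}                      # late start / late finish
    for t in reversed(order):
        lf[t] = min((ls[s] for s in succs[t]), default=end)
        ls[t] = lf[t] - durations[t]
    return {t: ls[t] - es[t] for t in order}   # total float per task

tf = total_float(
    durations={"A": 3, "B": 2, "C": 4, "D": 1},
    preds={"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]},
)
# Critical path A-C-D carries zero total float; B has 2 days of float.
```

Assigning a hard finish date earlier or later than the computed `end` is exactly what produces the negative or positive critical-path float the paragraph above describes.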
19 Several of these software tools are available for download. The Schedule Test and Assessment Tool (STAT) software may be
requested via the NASA software catalog at https://software.nasa.gov/software/MFS-33362-1. Other software applications are
also available via the One NASA Cost Engineering (ONCE) database at
https://oncedata.msfc.nasa.gov/%28S%28bma3ciyzi0pxghnmv21igowf%29%29/default.aspx
• Is the logic automatically updating? Are manual entries and constraints misrepresenting the
critical path and float calculations? What are the results of the Defense Contract Management
Agency critical path test?
• What is the project integrated master schedule critical path length index?
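The critical path length index (CPLI) referred to above is commonly computed as (critical path length + total float) divided by critical path length; a one-line sketch with notional numbers:

```python
# The DCMA critical path length index (CPLI) as commonly defined
# (numbers are notional; units are working days).
def cpli(critical_path_length, total_float):
    """CPLI = (CPL + total float) / CPL. A value near 1.0 suggests a
    realistic path; values below ~0.95 are often treated as a warning."""
    return (critical_path_length + total_float) / critical_path_length

value = cpli(critical_path_length=200, total_float=-10)
# (200 - 10) / 200 = 0.95: the finish date is under pressure
```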
Through each iteration of critical path analysis, the SRB Programmatic Team should monitor the near-
critical paths, also referred to as secondary or tertiary paths, to understand the sensitivity of the schedule,
as well as where the schedule has flexibility and where it does not. Tasks with total float within a narrow
range of the critical path total float are near critical. The team should also review past schedule reports,
including MSRs and QSRs.
• What has the project identified as the critical path and near-critical paths over the course of
project execution, and what is the rationale for any changes in these paths over time?
• Does the project IMS critical path match the SRB Programmatic Team-identified IMS critical
path?
• If a SRB analysis schedule is created, does the SRB critical path match the project-identified
critical path?
The SRB Programmatic Team should examine the metrics that the project is tracking and discuss the
results of those metrics with the SRB with respect to schedule performance. The team should also assess
whether the program or project is tracking performance at least monthly in the LCR window of 120 days
prior to the site review.
o Are there issues or past problems that should be incorporated in the threat, lien, or risk
lists?
The SRB Programmatic Team should perform margin analysis, 20 determining whether the project has
identified enough margin to account for schedule risk impacts and other unknown unknowns
(uncertainty) that may threaten the project completion. The results of the schedule risk analysis help
determine the adequacy and appropriate placement of the schedule margin in the integrated master
schedule.
• Is the margin consistently identifiable within the integrated master schedule, or is it hidden
within the duration of other tasks?
• Does the margin meet the Center’s margin guidelines?
• Are the schedule margins realistic? Have they been validated (i.e., via schedule risk analysis)?
• Are there adequate management reserves to cover unfunded schedule margin?
• Are the schedule margins consistent with similar missions?
• Are margin and slack being tracked synonymously? (Note: Slack/float is not the same as margin.)
• Is there a methodology for how the margins were derived (e.g., expert judgment, rules of thumb,
insight from schedule risk analysis)?
• How does the project manage margin? Is there a burn-down plan? Who controls it?
20 Further information about margin assessment can be found in the NASA Schedule Management Handbook.
• Is there a process to guide how margin will be used to offset scope changes, schedule growth,
and potential risks?
• Is the schedule margin funded?
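A margin burn-down check of the kind asked about above can be sketched as a comparison of remaining margin against a planned burn-down curve; the plan values and month keys are notional:

```python
# Hedged sketch of a margin burn-down check: compare remaining schedule
# margin against a planned burn-down curve (notional data, days).
def margin_status(planned, actual_remaining, month):
    """planned: {month: planned remaining margin (days)}."""
    delta = actual_remaining - planned[month]
    return "ahead of plan" if delta >= 0 else "burning faster than plan"

plan = {1: 60, 6: 45, 12: 30, 18: 15, 24: 0}
status = margin_status(plan, actual_remaining=38, month=12)
# 38 days remain vs. 30 planned at month 12 -> "ahead of plan"
```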
Workforce planning is heavily dependent on the integrated master schedule for time phasing. The SRB
Programmatic Team should examine whether the project has considered potential equipment and facility
conflicts.
• Is the appropriate workforce (skill set and number) available with regard to the planned work?
• Do facility conflicts exist (e.g., for testing in thermal vacuum chambers)? Are there mitigations to
accommodate potential conflicts?
If the appropriate workforce or facilities are not available at the appropriate time, there may be additional
risk to the schedule. Furthermore, augmentation to staffing plans may be needed to cover threats, liens,
and/or possible impacts from schedule risks such as late deliveries. The SRB Programmatic Team needs to
understand the schedule and cost impacts of such threats, liens, and risks as they may reduce UFE and
schedule margin.
Schedule expectations by phase — Scheduling Tool Considerations (the team possesses tool expertise):
• Concept & Technology Completion (SRR, SDR): Intermediate level, capable of schedule
development
• Preliminary Design & Technology Completion (PDR): Intermediate level, capable of complex
critical path analysis
• Final Design & Fabrication, System Assembly, Integration & Test, Launch & Checkout (CDR, SIR,
ORR): Expert level, capable of developing what-ifs, schedule crashing scenarios, etc.
21 NASA Schedule Management Handbook, NASA/SP-2010-3403 (https://www.nasa.gov/pdf/420297main_NASA-SP-
2010-3403.pdf)
This section is divided into two major subsections to cover the requirements above. The first provides
both cost and schedule guidance on assessing the project’s cost and schedule range estimate
requirement. The second provides guidance on assessing the Agency’s JCL requirements.
The SRB Programmatic Team’s assessment of the range estimate will include aspects covered in 12.0 Cost
Assessment and 13.0 Schedule Assessment, but will also include additional factors, such as assessing
whether the uncertainty and risks in the project’s plans are properly accounted for in the project range
estimates. The project models used to generate the range estimates will be taken and adjusted with SRB
technical subject matter expert inputs and evaluated to identify any significant impacts to the project
plan.
Though the general process for evaluation of both products can be similar, the mechanics of evaluating
the products can vary depending on how the project generates the range estimates.
14.1.1.1 Scope
Please refer to 12.1.1 Scope and 13.1.1 Scope for a general discussion of scope. The project’s range
estimate scope should include any risks, opportunities, and uncertainties that the project controls, as
well as items that the project does not manage but that could affect the project’s baseline. For example,
as specified in the NASA Cost Estimating Handbook with regard to international/inter-Agency
contributions, inter-project/program risks, and launch vehicle selection, the project is tasked to include
the programmatic risk of cost and schedule impacts to the project stemming from those systems. The
project should coordinate with the international, inter-Agency, inter-project/program, and appropriate
launch services entities when available, as well as with its mission directorate, to determine the
adjudication and communication of the risks (ownership). Further work should be performed to
determine how those risks will be incorporated and communicated in the range calculations. The SRB
Programmatic Team will have the responsibility to evaluate all aspects of the range, including
international/inter-Agency and inter-project/program relationships.
If the range estimates are generated using parametric or analogous data, then the SRB Programmatic
Team should assess the following:22
• Applicability of the project input data and the model (e.g., tools) being used to support range
estimate calculations. Is the basis for the range estimates analogous to the project? If the project
is using analogous data to support range estimates, then the SRB Programmatic Team should
obtain inputs from the SRB on whether the analogies are reasonable. The team should use inputs
from the technical SRB subject matter experts (approved by the SRB Chair) to adjust the project
range estimates and provide results to the SRB.
• Assessment that the project input data and model (e.g., tools) are being utilized properly. For
example, if the project uses parametric tools, are both input uncertainty and CER uncertainty
being properly utilized? SRB technical subject matter input can be solicited, but the SRB
Programmatic Team will most likely be the SRB subject matter experts on CER best practices.23
• Assess the project input data and model (e.g., tools). Are the project inputs to the cost and
schedule range estimates reasonable?
• Assess if the project risks are captured in the input data or model.
22 It is important to note that the SRB Programmatic Team analysts may be required to generate SRB range estimates
in the event the project does not satisfy range estimate requirements or at the request of the SRB Chair or convening
authorities.
If the range estimates are generated via a schedule risk analysis, then refer to 13.0 Schedule Assessment.
It is important to note that a JCL is not required to fulfill a KDP-0/KDP-B range estimate requirement. The
assessment of a grassroots input looks at the schedule (schedule risk analysis) and cost range estimates
separately.
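For illustration, a range estimate built up from uncertain element costs can be sketched as a small Monte Carlo simulation. The triangular distributions and cost triples below are invented; a real analysis would use the project's models, percentile conventions, and correlation assumptions:

```python
# Illustrative Monte Carlo sketch of a cost range estimate: sample
# uncertain WBS-element costs and read percentiles off the totals.
import random

def simulate_range(elements, trials=20000, seed=1):
    """elements: list of (low, mode, high) triangular cost triples ($M)."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in elements)
        for _ in range(trials)
    )
    pct = lambda p: totals[int(p * (trials - 1))]
    return pct(0.30), pct(0.50), pct(0.70)

p30, p50, p70 = simulate_range([(8, 10, 15), (4, 5, 9), (18, 20, 30)])
# The percentile spread is the kind of cost range the project would
# report against a range-estimate requirement.
```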
The project risk list provided for the LCR should include all risks and appropriate actions to mitigate each
risk. Projects with international or other U.S. Government agency contributions must plan for, assess,
and report on risks due to international or other government partners and plan for contingencies.
The SRB Programmatic Team will conduct an independent risk assessment to determine if the project
follows RIDM and CRM processes in accordance with NPR 8000.4A requirements regarding programmatic
risk. The SRB technical risk SME will conduct an independent risk assessment to determine if the project
follows RIDM and CRM processes regarding safety and mission assurance risk. As examples of areas to
assess, the project:
 Documents risk acceptability criteria and thresholds, as well as elevation protocols (the specific
conditions under which a risk management decision must be elevated through management to
the next higher level).
 Establishes risk communication protocols between management levels, including the frequency
and content of reporting, as well as identification of the entities that will receive risk-tracking data
from the unit's risk management activity.
 Conducts the CRM process:
o Identify: Identify contributors to risk.
o Analyze: Estimate the probability and consequence components of the risk through
analysis, including uncertainty in the probabilities and consequences and, as appropriate,
estimate aggregate risks.
23 NASA Cost Estimating Handbook, v.4.0, February 2015, Section 2.3.3; NASA Schedule Management Handbook, NASA/SP-2010-3403, March 2011, Section 7.9.
24 As identified by NPR 7120.5E; Tables I-1, I-3, I-7 Program Plan Control Plans and Appendix G section 3.3 for programs; Table I-5 Project Plan Control Plans and Appendix H for projects.
o Plan: Decide on risk disposition and handling, develop and execute mitigation plans, and
decide what will be tracked.
o Track: Track observables relating to performance measures (e.g., technical performance
data, schedule variances), as well as the cumulative effects of risk disposition (handling)
decisions.
o Control: Control risk by evaluating tracking data to verify effectiveness of mitigation
plans, adjusting the plans as necessary, and executing control measures.
o Communicate and Document: Communicate and document the above activities
throughout the process.
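The Analyze step above (estimating probability and consequence, then aggregating) can be sketched numerically. The risk register entries and function names below are hypothetical, purely to show the arithmetic; note that aggregation is done by simulation rather than by summing expected values, because the spread of the aggregate distribution (not just its mean) is what informs reserves.

```python
import random

# Hypothetical risk register: (name, probability, cost consequence in $M).
RISKS = [
    ("late instrument delivery", 0.40, 8.0),
    ("workforce shortfall",      0.25, 5.0),
    ("test facility conflict",   0.10, 12.0),
]

def expected_values(risks):
    # Per-risk expected value: probability x consequence.
    return {name: p * c for name, p, c in risks}

def aggregate(risks, n_trials=50000, seed=2):
    # Aggregate risk exposure via simulation: each trial draws which
    # risks occur and sums their consequences.
    rng = random.Random(seed)
    draws = [sum(c for _, p, c in risks if rng.random() < p)
             for _ in range(n_trials)]
    return sum(draws) / n_trials

ev = expected_values(RISKS)   # e.g., 0.40 * 8.0 = 3.2 for the first risk
mean_exposure = aggregate(RISKS)
```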
NASA policy does not require an SRB ICE for any LCR milestone.
Though an ICE is not required, performing an adequate independent assessment may require the SRB
Programmatic Team to generate a benchmarking analysis (e.g., separate analysis, as required, to help
inform the SRB of the reasonableness of the project estimates) to ensure cost and schedule estimates are
reasonable and to facilitate conversations about assessing the project's programmatic products.
14.1.3 Key attributes of a successful cost and schedule range estimate SRB assessment
The SRB Programmatic Team supports the SRB to determine if the project has identified and accurately
quantified all the known risks and if the uncertainty boundaries in the cost and schedule estimates are
appropriate. When assessing the project's risks and confidence level calculation, the SRB should be able
to answer the following questions:25
25 It should be noted that answers to these questions provide a solid basis to communicate the results of the SRB programmatic assessment to NASA decision makers.
 Has the SRB identified new risks for the range estimates, or adjusted risk likelihoods, consequences,
or impacts differently from the project? The SRB should provide the technical rationale for the
differences from the project inputs in the risk assessment.
 Do the SRB's uncertainty boundaries differ from the project's? If so, provide the technical rationale
for the differences from the project inputs in the risk assessment.
 When the project range estimate models are updated using the SRB SME adjustments to risk and
uncertainty distributions, how do the results compare to the project's proposed budget and
schedule?
 After evaluating risk and uncertainty drivers, which changes to the risk list and uncertainty
distribution have the biggest impact compared to the project's proposed cost and schedule?
 How does the ratio of the assessor-identified risks to uncertainty differ from the project's inputs?
 If appropriate, have the risks and impact of missing a launch window been included in the project's
risk list?
 Have risks associated with partner/international contributions been included in the project's risk
list? Has the project identified the impact if partner/international contributions are not provided
or are provided later than planned, and has it assessed alternatives if needed?
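One way to make the budget-versus-confidence comparison in these questions concrete: read each simulation's S-curve at the project's proposed budget and compare the resulting confidence levels. The outcome arrays below are hypothetical placeholders standing in for real project and SRB-adjusted simulation runs.

```python
from bisect import bisect_right

def confidence_at(budget, sorted_outcomes):
    """Confidence level a budget achieves: the fraction of simulated
    outcomes at or below it (its position on the S-curve)."""
    return bisect_right(sorted_outcomes, budget) / len(sorted_outcomes)

# Hypothetical simulated cost outcomes ($M), sorted; stand-ins for
# a project run and an SRB-adjusted run with added risks.
project_run = sorted(100 + 0.01 * i for i in range(5000))
srb_run     = sorted(103 + 0.012 * i for i in range(5000))

budget = 130.0
# SRB adjustments shift the S-curve right, lowering the confidence
# the same budget achieves.
delta = confidence_at(budget, project_run) - confidence_at(budget, srb_run)
```

The size of `delta`, and the risks driving it, is exactly the kind of quantified rationale the questions above ask the SRB to provide.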
The SRB is responsible for analyzing the project JCL to determine the validity of the JCL inputs (e.g., cost,
schedule, risk, uncertainty) and the reasonableness of the assumptions. The assessment will include all
aspects covered in the cost and schedule estimate sections (see 12.0 Cost Assessment and 13.0 Schedule
Assessment), but will also include additional factors that are specific to JCL modeling. The project JCL model
is adjusted with SRB inputs via subject matter expert analysis and evaluated to identify any significant
impacts to the project's current resource plan.
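The joint confidence computation itself can be sketched minimally: the JCL at a (budget, date) pair is the fraction of correlated cost/schedule trials in which both are met. This is an illustration of the statistic, not of any NASA tool; all distributions and numbers are invented for the example.

```python
import random

def jcl(cost_budget, sched_date, n_trials=40000, seed=3):
    """Joint confidence level: fraction of correlated cost/schedule
    trials in which BOTH the budget and the date are met."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        months = rng.gauss(36.0, 4.0)            # schedule outcome (months)
        # Time-dependent cost rises with duration (a simple source of
        # cost/schedule correlation); time-independent cost is fixed.
        cost = 40.0 + 2.0 * months + rng.gauss(0.0, 5.0)
        if cost <= cost_budget and months <= sched_date:
            hits += 1
    return hits / n_trials

joint = jcl(120.0, 38.0)
```

Because cost and schedule are correlated, the joint confidence is lower than either marginal confidence alone, which is why the JCL is assessed as one model rather than two separate estimates.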
26 NASA Cost Estimating Handbook, v.4.0, February 2015, Section 3.1.3.
The content of what is included in the cost and schedule of the JCL to meet this Phase-D requirement is
agreed upon between the project and convening authorities as part of the ToR, or by the PPMB if the
project uses the board to adjudicate differences. As discussed in 14.1 Cost and Schedule Range Estimate
Assessments (KDP-0/KDP-B), the JCL should include any risks, opportunities, and uncertainties that the
project manages, as well as items that the project does not control (e.g., contributions or other funding
lines) that could affect the project's mission.
Per the NASA Cost Estimating Handbook, with regard to international/inter-Agency contributions,
inter-project/program risks, and launch vehicle selection, the project is tasked to include the
programmatic risk of cost and schedule impacts to the project stemming from those systems. The SRB
Programmatic Team should evaluate all aspects of the JCL, including international/inter-Agency
contributions and inter-project/program relationships.
27 For pros and cons associated with the integrated master schedule versus the analysis schedule for JCL, please refer to the NASA Cost Estimating Handbook, Section 3.
The SRB Programmatic Team should assess the adequacy of the project cost inputs to determine whether
they are reasonably categorized as time dependent or time independent. JCL model adjustments by the
SRB should be communicated to the project. Refer to the NASA Cost Estimating Handbook for more
clarification of time-dependent and time-independent costs in JCL modeling.
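The distinction matters because only the time-dependent piece of a phase's cost should grow when a schedule risk stretches the phase. A minimal sketch, with illustrative numbers:

```python
def phase_cost(duration_months, burn_rate, time_independent):
    """Split a phase's cost into a time-dependent piece (scales with
    duration, e.g. labor 'standing army' at a monthly burn rate) and a
    time-independent piece (fixed, e.g. a hardware buy). In a JCL model,
    only the time-dependent piece grows when the schedule slips."""
    return time_independent + burn_rate * duration_months

baseline = phase_cost(24, 1.5, 10.0)   # 24-month plan: 10 + 1.5*24 = 46.0
slipped  = phase_cost(30, 1.5, 10.0)   # 6-month slip:  10 + 1.5*30 = 55.0
growth   = slipped - baseline          # only the TD piece grew: 9.0
```

Misclassifying a fixed hardware buy as time dependent would overstate the cost impact of schedule risk, which is why the categorization itself is an assessment item.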
 Did the project identify all the risks that could affect the project? Cross-reference the risks
modeled in the JCL to the project risk list to ensure that all risks that could influence cost and
schedule are modeled. Furthermore, the SRB Programmatic Team should review the risks with
the SRB for subject matter expert inputs to determine whether additional risks should be added
or existing risks adjusted.
 Are the risks properly linked to the JCL schedule logic? The SRB Programmatic Team should
evaluate and, as appropriate, review with the SRB to verify that the risks are properly linked to
the schedule, and adjust as appropriate per SRB subject matter expert inputs.
 Are the risks properly quantified with regard to likelihood and impact? The SRB Programmatic
Team should review project risk likelihood and impact for all JCL-modeled risks and adjust per SRB
subject matter expert inputs as needed.
The JCL model should model each risk against the current cost and schedule plan. Certain risks can be
quantified as pre-mitigated or post-mitigated in the JCL model. If a risk in the JCL model is a post-mitigated
consequence, then the SRB Programmatic Team should make sure that the mitigation strategy is clearly
baselined in the project plan with a funded mitigation strategy. (Refer to the NASA Cost Estimating
Handbook, Section J.4.1.3, for more discussion of pre- and post-mitigation in JCL models.)
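The pre/post distinction can be shown with simple expected-value arithmetic. The function and numbers below are hypothetical; the point is that a post-mitigated likelihood may be used only if the mitigation's cost is actually carried in the baseline.

```python
def risk_exposure(likelihood, impact, mitigated=False,
                  mitigated_likelihood=None, mitigation_cost=0.0):
    """Expected exposure of one JCL-modeled risk.

    Pre-mitigated: score the raw likelihood and impact. Post-mitigated:
    the reduced likelihood may be used ONLY if the mitigation is
    baselined and funded, so its cost is carried explicitly.
    """
    if mitigated:
        return mitigation_cost + mitigated_likelihood * impact
    return likelihood * impact

pre  = risk_exposure(0.5, 20.0)                  # 0.5 * 20.0 = 10.0
post = risk_exposure(0.5, 20.0, mitigated=True,
                     mitigated_likelihood=0.1,
                     mitigation_cost=3.0)        # 3.0 + 0.1 * 20.0 = 5.0
```

If the $3.0M mitigation were omitted while the reduced likelihood was kept, the model would understate exposure, which is exactly the condition the SRB Programmatic Team is asked to check.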
The SRB Programmatic Team will work with the SRB to assess the risk management approach, starting with
an evaluation of the project Risk Management Plan.28
Data-driven uncertainty:
28 As identified by NPR 7120.5E; Tables I-1, I-3, I-7 Program Plan Control Plans and Appendix G section 3.3 for programs; Table I-5 Project Plan Control Plans and Appendix H for projects.
29 It is important to note there will be instances where there will not be uncertainty. For each instance of uncertainty,
o Was the data normalized? If so, how? If the data was not normalized, some simple
normalization may be warranted (e.g., inflation). For normalized data, outlier events
will oftentimes be "normalized" out. Understanding what the data constitutes is very
important.
o What level is the data, and is the data compatible with the JCL model? Uncertainty metrics
are not always easily transferable from one WBS level to another.
o Is the data relevant to what the project is estimating? Ensure the data is homogeneous to
what is being estimated.
o Is there enough data to support the analysis? Sample size matters. Small samples could
introduce statistical bias in the estimate of population range parameters. This bias should
be considered and accounted for.
Performance-based uncertainty:
o Is past performance relevant to the work going forward? For example, as with financial
mutual funds, past performance may not be a good indicator of the future.
o At what level was the past performance data collected? Performance-based metrics
collected for the BOE should be of the same general fidelity as the JCL model.
Subject matter expert-based uncertainty:
o Where did the subject matter expert input come from? Ensure the right subject matter
expert provided inputs. For example, a person may be quite the expert in a technical field
but may not have a good handle on the cost and schedule uncertainties of that field;
whereas a recent project manager, or Center cost estimator, may not be as competent in
the technical area but have a better feel for cost and schedule impacts.
o Is there confirmation bias? Confirmation bias is the tendency to search for or interpret
information in a way that confirms one's beliefs or hypotheses. For example, a project
may underestimate the negative uncertainty because they want the project to succeed.
o Is there framing bias? Framing bias can lead to using a too-narrow approach and
description of the situation or issue.
o Is there hindsight bias? Hindsight bias is the inclination to see past events as being
predictable.
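The "is there enough data?" question above is worth quantifying: with few data points, the fitted spread systematically understates the true population spread, so range parameters should be widened rather than taken at face value. A minimal demonstration, with a known population standard deviation of 1.0:

```python
import random
import statistics

def avg_sample_stdev(n, n_experiments=4000, seed=4):
    """Average sample standard deviation of n draws from a population
    with known std = 1.0. Small n systematically understates the
    population spread (small-sample bias in range parameters)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_experiments):
        sample = [rng.gauss(0.0, 1.0) for _ in range(n)]
        total += statistics.stdev(sample)
    return total / n_experiments

small = avg_sample_stdev(3)    # noticeably below 1.0
large = avg_sample_stdev(100)  # close to 1.0
```

With three data points the fitted standard deviation averages roughly 0.89 despite a true value of 1.0, illustrating why uncertainty distributions derived from small samples deserve extra scrutiny.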
In addition to providing the above results, the following questions should be answered by the SRB
Programmatic Team analysis:
 What are the schedule activities and critical paths that may be impacted by risks?
 Has the project identified all probable critical schedule activities? Secondary? Tertiary?
 What is the probability that the schedule milestone dates will be met on time?
 What are the risk drivers (e.g., risk tornado chart)? Which risks impact the cost and schedule the
most?
 What is the impact of uncertainty inputs?
 How much margin and UFE is required to achieve specific confidence levels (e.g., to achieve a 70
percent confidence level)?
 How much margin and/or UFE does each risk require for mitigation?
 Is the project carrying sufficient margin and/or UFE in appropriate places within the schedule?
 Does the ratio of time-independent costs to time-dependent costs seem reasonable?
 Is the project using reasonable correlation values in the JCL analysis?
 Does the analysis schedule have a logic network that has minimal constraints and is linked to
major milestones?
 Is the schedule cost-loaded? Are the fixed and variable costs within the schedule properly
identified?
 Is the project risk list properly linked to the schedule activities, with likelihood and cost/schedule
impacts quantified? Does the SRB have adjustments to what risks could affect the project, where
the risks occur, and the likelihood and impact of each risk?
 Have the SRB members identified different risks than the project or ranked the risks differently?
The SRB should provide the technical rationale for the differences in risk assessment and
quantified likelihoods, consequences, and expected values, and should include a BOE for the
likelihoods, consequences, and expected values of the added or changed risks.
 Does the SRB's uncertainty distribution(s) differ from the project's? If so, why?
 When the probabilistic estimating models are run using the SRB's risk list and uncertainty
distribution, how different are the results compared to the project's proposed budget and
schedule?
 What changes to the risk list and uncertainty distribution have the biggest impact compared to
the project's proposed cost and schedule?
 How does the ratio of the assessor-identified risks to uncertainty differ from the project's?
 Have the risks and impact of missing a launch window been included in the project's risk list?
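Two of the questions above, how much UFE a confidence level requires and which risks drive the result, can be answered directly from the simulated distribution. The sketch below is a hypothetical illustration (invented risk list, simple simulation): the UFE is the target percentile of the S-curve above the base estimate, and a tornado-style ranking re-runs the model with each risk removed.

```python
import random

# Hypothetical risk list: (name, likelihood, cost impact in $M).
RISKS = [("mass growth", 0.5, 12.0),
         ("software slip", 0.3, 6.0),
         ("vendor delay", 0.2, 15.0)]

def simulate(risks, base=100.0, n_trials=30000, seed=5):
    # Sorted Monte Carlo cost outcomes (the S-curve).
    rng = random.Random(seed)
    return sorted(base + sum(c for _, p, c in risks if rng.random() < p)
                  for _ in range(n_trials))

def ufe_for_confidence(risks, target=0.70, base=100.0):
    """Margin/UFE above the base estimate needed to reach the target
    confidence level: the target percentile of the simulated S-curve."""
    outcomes = simulate(risks, base)
    return outcomes[int(target * len(outcomes))] - base

def tornado(risks, base=100.0):
    """Rank risk drivers by how much removing each one lowers P70."""
    idx = int(0.7 * 30000)
    p70_all = simulate(risks, base)[idx]
    deltas = {name: p70_all - simulate([r for r in risks if r[0] != name],
                                       base)[idx]
              for name, _, _ in risks}
    return sorted(deltas.items(), key=lambda kv: kv[1], reverse=True)

ufe = ufe_for_confidence(RISKS)   # UFE needed for 70 percent confidence
drivers = tornado(RISKS)          # largest P70 contributor first
```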
30 It should be noted that answers to these questions provide a solid basis to communicate the results of the SRB programmatic assessment to NASA decision makers.
Template Name
 IPA Briefing
 IPA Report
 Review Kickoff
 Checkpoint Review
 SRB Risk and Uncertainty
 SRB Snapshot
 Knowledge Management and Capture
 ToR Project LCR Programmatic Section
Estimating Uncertainty (estimating uncertainty used to inform level of UFE / contingency / reserves / margin):
 Uncertainty is quantitatively assessed (expected value, simulation, S-curve) and linked to level of UFE / reserve / contingency / margin
 Uncertainty is partially applied to some elements and/or not linked to level of UFE
 Uncertainty not applied or linked to level of UFE / reserves
Workflow Activity/Schedule | SOPI Reference | Date | Description
Final IPA Findings to SRB Chair | 6.3.5 | SR End + 10 days | Final IPA findings are delivered to the SRB Chair.
IPA Report Completed | 6.3.7 | SR End + 30 days | The IPA SRB final report is published.
CMC Briefing | 6.3.8 | Pre MDPMC | Brief the designated Center on the SRB results.
MDPMC Briefing | 6.3.8 | Pre APMC | Brief the sponsoring mission directorate on the SRB results.
Final Briefing Package due to APMC Executive | 6.3.8 | At least 7 days prior to APMC | Deliver the final and fully annotated briefing package, with cover letter, to the APMC Executive.
APMC Briefing | 6.3.8 | < SR End + 30 days | Brief the Agency PMC on the SRB results.
OCFO Checkpoint – Knowledge Management and Closeout | 6.3.9 | SR End + 30 days | SRB Programmatic Team documents analysis lessons-learned, issues, and successes. Goal is to inform OCFO on how to prioritize programmatic improvement efforts. Lastly, conduct a completeness review on final products for archival purposes.
APPENDIX C: ACRONYMS
DA Decision Authority
DPMC Division Program Management Council
LCR Life Cycle Review
MD Mission Directorate
MDPMC Mission Directorate Program Management Council