NIST SP 800-53r5-Draft
Warning Notice
The attached draft document has been withdrawn and is provided solely for historical purposes.
It has been superseded by the document identified below.
Superseding Document
Status Final
DOI https://doi.org/10.6028/NIST.SP.800-53r5
March 2020
Authority

This publication has been developed by NIST to further its statutory responsibilities under the Federal Information Security Modernization Act (FISMA), 44 U.S.C. § 3551 et seq., Public Law (P.L.) 113-283. NIST is responsible for developing information security standards and guidelines, including minimum requirements for federal information systems. Such information security standards and guidelines shall not apply to national security systems without the express approval of the appropriate federal officials exercising policy authority over such systems. This guideline is consistent with the requirements of the Office of Management and Budget (OMB) Circular A-130.

Nothing in this publication should be taken to contradict the standards and guidelines made mandatory and binding on federal agencies by the Secretary of Commerce under statutory authority. Nor should these guidelines be interpreted as altering or superseding the existing authorities of the Secretary of Commerce, OMB Director, or any other federal official. This publication may be used by nongovernmental organizations on a voluntary basis and is not subject to copyright in the United States. Attribution would, however, be appreciated by NIST.
All comments are subject to release under the Freedom of Information Act (FOIA) [FOIA96].
Abstract

This publication provides a catalog of security and privacy controls for federal information systems and organizations to protect organizational operations and assets, individuals, other organizations, and the Nation from a diverse set of threats and risks, including hostile attacks, natural disasters, structural failures, human errors, and privacy risks. The controls are flexible and customizable and implemented as part of an organization-wide process to manage risk. The controls address diverse requirements derived from mission and business needs, laws, executive orders, directives, regulations, policies, standards, and guidelines. Finally, the consolidated catalog of controls addresses security and privacy from a functionality perspective (i.e., the strength of functions and mechanisms provided by the controls) and an assurance perspective (i.e., the measure of confidence in the security or privacy capability provided by the controls). Addressing both functionality and assurance ensures that information technology products and the information systems that rely on those products are sufficiently trustworthy.

Keywords

Assurance; availability; computer security; confidentiality; control; cybersecurity; FISMA; information security; information system; integrity; personally identifiable information; Privacy Act; privacy controls; privacy functions; privacy requirements; Risk Management Framework; security controls; security functions; security requirements; system; system security.
Acknowledgements

This publication was developed by the Joint Task Force Interagency Working Group. The group includes representatives from the Civil, Defense, and Intelligence Communities. The National Institute of Standards and Technology wishes to acknowledge and thank the senior leaders from the Departments of Commerce and Defense, the Office of the Director of National Intelligence, the Committee on National Security Systems, and the members of the interagency working group whose dedicated efforts contributed significantly to the publication.
In addition to the above acknowledgments, a special note of thanks goes to Jeff Brewer, Jim Foti, and the NIST web team for their outstanding administrative support. The authors also wish to recognize Kristen Baldwin, Carol Bales, John Bazile, Jennifer Besceglie, Sean Brooks, Ruth Cannatti, Kathleen Coupe, Keesha Crosby, Charles Cutshall, Ja’Nelle DeVore, Jennifer Fabius, Jim Fenton, Matthew Halstead, Kevin Herms, Hildy Ferraiolo, Ryan Galluzzo, Robin Gandhi, Mike Garcia, Paul Grassi, Marc Groman, Scott Hill, Ralph Jones, Martin Kihiko, Raquel Leone, Jason Marsico, Kirsten Moncada, Ellen Nadeau, Elaine Newton, Michael Nieles, Michael Nussdorfer, Taylor Roberts, Jasmeet Seehra, Joe Stuntz, the Federal Privacy Council’s Risk Management Subcommittee, the professional staff from the NIST Computer Security Division and Applied Cybersecurity Division, and representatives from the Federal CIO Council and Interagency Working Group for their ongoing contributions in helping to improve the content of the publication. Finally, the authors gratefully acknowledge the significant contributions from individuals and organizations in the public and private sectors, nationally and internationally, whose insightful and constructive comments improved the overall quality, thoroughness, and usefulness of this publication.
There is an urgent need to strengthen the underlying information systems, component products, and services that we depend on in every sector of the critical infrastructure to help ensure those systems, components, and services are sufficiently trustworthy and provide the necessary resilience to support the economic and national security interests of the United States.

This update to NIST Special Publication 800-53 responds to the call by the Defense Science Board by embarking on a proactive and systemic approach to develop comprehensive safeguarding measures for all types of computing platforms, including general purpose computing systems, cyber-physical systems, cloud and mobile systems, industrial/process control systems, and Internet of Things (IoT) devices. Those safeguarding measures include security and privacy controls to protect the critical and essential mission and business operations of organizations, the organization’s high value assets, and the personal privacy of individuals. The objective is to make the information systems we depend on more penetration resistant to cyber-attacks; limit the damage from those attacks when they occur; make the systems cyber resilient and survivable; and protect the security and privacy of information.

Revision 5 of this foundational NIST publication represents a multi-year effort to develop the next generation security and privacy controls that will be needed to accomplish the above objectives. It includes changes to make the controls more consumable by diverse consumer groups including, for example, enterprises conducting mission and business operations; engineering organizations developing all types of information systems and systems-of-systems; and industry partners developing system components, products, and services. The major changes to the publication include:

• Creating security and privacy controls that are more outcome-based by changing the structure of the controls;
• Fully integrating privacy controls into the security control catalog, creating a consolidated and unified set of controls;
• Adding two new control families for privacy and supply chain risk management;
• Integrating the Program Management control family into the consolidated catalog of controls;
• Separating the control selection process from the controls—allowing controls to be used by different communities of interest, including systems engineers, systems security engineers, privacy engineers, software developers, enterprise architects, and mission/business owners;
• Separating the control catalog from the control baselines;
• Promoting alignment with different risk management and cybersecurity approaches and lexicons, including the Cybersecurity Framework and Privacy Framework;
• Clarifying the relationship between security and privacy to improve the selection of controls necessary to address the full scope of security and privacy risks; and
• Incorporating new, state-of-the-practice controls based on threat intelligence, empirical attack data, and systems engineering and supply chain risk management best practices, including controls to strengthen cybersecurity and privacy governance and accountability; controls to support secure system design; and controls to support cyber resiliency and system survivability.
This collaboration index is a starting point to facilitate discussion between security and privacy programs since the degree of collaboration needed for control implementation for specific systems depends on many factors.

For purposes of review and comment, three control families are identified as notional examples: Access Control (AC); Program Management (PM); and Personally Identifiable Information Processing and Transparency (PT). The notional examples are provided as a Notes to Reviewers Supplement following Appendix D.
Summary

For ease of review, a short summary of all significant changes made to SP 800-53 from Revision 4 to Revision 5 is provided at the publication landing page under Supplemental Material. A number of controls have been changed or renamed, and some include additional discussion to provide context for better privacy integration.

As part of the project to develop the next generation controls, some of the content in previous versions of Special Publication 800-53 will be moved to other publications, new publications, and the NIST web site. For example, control baselines can be found in a new publication, NIST Special Publication 800-53B, Control Baselines for Information Systems and Organizations. Control mapping tables and keywords can be found on the NIST web site as part of the new automated control delivery system debuting in the near future. The content in NIST Special Publication 800-53, Revision 4, will remain active for one year after the new and updated publications are finalized.
We encourage you to use the comment template provided when submitting your comments. Comments on Draft Special Publication 800-53, Revision 5, must be received by May 15. Please submit comments to sec-cert@nist.gov.

Your feedback on this draft publication is important to us. We appreciate each contribution from our reviewers. The very insightful comments from both the public and private sectors, nationally and internationally, continue to help shape the final publication to ensure that it meets the needs and expectations of our customers.
ITL may require from the patent holder, or a party authorized to make assurances on its behalf, in written or electronic form, either:

a) assurance in the form of a general disclaimer to the effect that such party does not hold and does not currently intend holding any essential patent claim(s); or

b) assurance that a license to such essential patent claim(s) will be made available to applicants desiring to utilize the license for the purpose of complying with the guidance or requirements in this ITL draft publication either:

i) under reasonable terms and conditions that are demonstrably free of any unfair discrimination; or

ii) without compensation and under reasonable terms and conditions that are demonstrably free of any unfair discrimination.

Such assurance shall indicate that the patent holder (or third party authorized to make assurances on its behalf) will include in any documents transferring ownership of patents subject to the assurance, provisions sufficient to ensure that the commitments in the assurance are binding on the transferee, and that the transferee will similarly include appropriate provisions in the event of future transfers with the goal of binding each successor-in-interest.

The assurance shall also indicate that it is intended to be binding on successors-in-interest regardless of whether such provisions are included in the relevant transfer documents.
CONTROL BASELINES

The control baselines previously included in NIST Special Publication 800-53 have been relocated to NIST Special Publication 800-53B. Special Publication 800-53B contains control baselines for federal information systems and organizations and provides guidance for tailoring those baselines and for developing overlays to support the security and privacy requirements of stakeholders and their organizations.
There is an urgent need to further strengthen the underlying information systems, component products, and services that the nation depends on in every sector of the critical infrastructure—ensuring those systems, components, and services are sufficiently trustworthy and provide the necessary resilience to support the economic and national security interests of the United States. This update to NIST Special Publication 800-53 responds to the call by the DSB by embarking on a proactive and systemic approach to develop and make available, to a broad base of public and private sector organizations, a comprehensive set of safeguarding measures for all types of computing platforms, including general purpose computing systems, cyber-physical systems, cloud-based systems, mobile devices, and industrial and process control systems. Those safeguarding measures include implementing security and privacy controls to protect the critical and essential operations and assets of organizations and the privacy of individuals. The objective is to make the information systems we depend on more penetration resistant; limit the damage from attacks when they occur; make the systems cyber resilient and survivable; and protect individuals’ privacy.

Revision 5 of this foundational NIST publication represents a multi-year effort to develop the next generation of security and privacy controls that will be needed to accomplish the above objectives. It includes changes to make the controls more usable by diverse consumer groups (e.g., enterprises conducting mission and business operations; engineering organizations developing information systems, IoT devices, and systems-of-systems; and industry partners building system components, products, and services). The most significant changes to the publication include:

• Making the controls more outcome-based by changing the control structure to eliminate the distinction within each control statement regarding whether the control is expected to be satisfied by an information system (i.e., using information technology or other information resources) or by an organization (i.e., through policies or procedures);
• Integrating information security and privacy controls into a seamless, consolidated control catalog for information systems and organizations;
• Establishing a new supply chain risk management control family;
• Separating control selection processes from the controls, thereby allowing the controls to be used by different communities of interest, including systems engineers, security architects, software developers, enterprise architects, systems security and privacy engineers, and mission or business owners;
• Removing control baselines and tailoring guidance from the publication and transferring the content to NIST Special Publication 800-53B, Security and Privacy Control Baselines for Information Systems and Organizations (projected for publication in 2019);
• Clarifying the relationship between requirements and controls and the relationship between security and privacy controls; and
• Incorporating new, state-of-the-practice controls (e.g., controls to support cyber resiliency, controls to support secure systems design, and controls to strengthen security and privacy governance and accountability)—all based on the latest threat intelligence and cyber-attack data.

In separating the process of control selection from the actual controls and removing the control baselines, a significant amount of guidance and other informative material previously contained in Special Publication 800-53 was eliminated from the publication. That content will be moved to other NIST publications, such as Special Publication 800-37 (Risk Management Framework) and Special Publication 800-53B, during the next update cycle. In the near future, NIST also plans to transition the content of Special Publications 800-53, 800-53A, and 800-53B to a web-based portal to provide its customers interactive, online access to all control, control baseline, overlay, and assessment information.
Prologue

“…Through the process of risk management, leaders must consider risk to US interests from adversaries using cyberspace to their advantage and from our own efforts to employ the global nature of cyberspace to achieve objectives in military, intelligence, and business operations…”

“…For operational plans development, the combination of threats, vulnerabilities, and impacts must be evaluated in order to identify important trends and decide where effort should be applied to eliminate or reduce threat capabilities; eliminate or reduce vulnerabilities; and assess, coordinate, and deconflict all cyberspace operations…”

“…Leaders at all levels are accountable for ensuring readiness and security to the same degree as in any other domain…”

“The promise of these new applications often stems from their ability to create, collect, transmit, process, and archive information on a massive scale. However, the vast increase in the quantity of personal information that is being collected and retained, combined with the increased ability to analyze it and combine it with other information, is creating valid concerns about privacy and about the ability of entities to manage these unprecedented volumes of data responsibly…. A key challenge of this era is to assure that growing capabilities to create, capture, store, and process vast quantities of information will not damage the core values of the country….”

“…When systems process personal information, whether by collecting, analyzing, generating, disclosing, retaining, or otherwise using the information, they can impact privacy of individuals. System designers need to account for individuals as stakeholders in the overall development of the solution. …Designing for privacy must connect individuals’ privacy desires with system requirements and controls in a way that effectively bridges the aspirations with development….”
Errata

This table contains changes that have been incorporated into Special Publication 800-53. Errata updates can include corrections, clarifications, or other minor changes in the publication that are either editorial or substantive in nature.
INTRODUCTION

THE NEED TO PROTECT INFORMATION, SYSTEMS, ORGANIZATIONS, AND INDIVIDUALS

Modern information systems 1 can include a variety of computing platforms (e.g., industrial and process control systems; general purpose computing systems; cyber-physical systems; supercomputers; weapons systems; communications systems; environmental control systems; embedded devices; sensors; medical devices; and mobile devices such as smart phones and tablets). The various platforms all share a common foundation—computers with complex software and firmware providing a capability that supports the essential missions and business functions of organizations.

Security controls are the safeguards or countermeasures selected and implemented within an information system or an organization to protect the confidentiality, integrity, and availability of the system and its information and to manage information security risk. Privacy controls are the administrative, technical, and physical safeguards employed within a system or an organization to ensure compliance with applicable privacy requirements and to manage privacy risks. 2 Security and privacy controls are selected and implemented to satisfy security and privacy requirements levied on an information system or organization. The requirements are derived from applicable laws, executive orders, directives, regulations, policies, standards, and mission needs to ensure the confidentiality, integrity, and availability of information processed, stored, or transmitted, and to manage risks to individual privacy. The selection, design, and effective implementation of controls 3 are important tasks that have significant implications for the operations and assets of organizations as well as the welfare of individuals and the Nation. 4

There are several key questions that should be answered by organizations when addressing information security and privacy requirements:

• What security and privacy controls are needed to satisfy security and privacy requirements and to adequately manage risk? 5
• Have the selected controls been designed and implemented, or is there a design and implementation plan in place?
• What is the required level of assurance (i.e., grounds for confidence) that the selected controls, as designed and implemented, are effective? 6
1 An information system is a discrete set of information resources organized for the collection, processing, maintenance, use, sharing, dissemination, or disposition of information.
2 [OMB A-130] defines security controls and privacy controls.
3 In addition to viewing controls solely from a compliance perspective, controls are important tools that provide safeguards and countermeasures in systems security and privacy engineering processes to reduce risk during the system development life cycle.
4 Organizational operations include mission, functions, image, and reputation.
5 Security and privacy risks are ultimately mission/business risks or risks to individuals and must be considered early …
6 Assurance is the grounds for confidence that the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the designated security and privacy requirements for the system.
The answers to these questions are not given in isolation, but rather in the context of an effective risk management process for the organization that identifies, assesses, responds to, and monitors, on an ongoing basis, security and privacy risks arising from its information and systems. The security and privacy controls in this publication are recommended for use by organizations to satisfy their information security and privacy requirements. The control catalog can be viewed as a toolbox containing a collection of mitigations, techniques, and processes to address threats, vulnerabilities, and risk. The controls are employed as part of a well-defined and effective risk management process that supports organizational information security and privacy programs. In turn, those information security and privacy programs are a significant foundation for the success of the missions and business functions of the organization.

It is of paramount importance that responsible officials understand the security and privacy risks that could adversely affect organizational operations, organizational assets, individuals, other organizations, and the Nation. 7 These officials must also understand the current status of their security and privacy programs and the controls planned or in place to protect information, information systems, and organizations in order to make informed judgments and investments that respond to identified risks in an acceptable manner. The objective is to manage these risks through the selection and implementation of security and privacy controls.
7 This includes risk to critical infrastructure and key resources described in [HSPD-7].
8 A federal information system is an information system used or operated by an agency, by a contractor of an agency, or by another organization on behalf of an agency.
9 Information systems that have been designated as national security systems, as defined in 44 U.S.C., Section 3542,
are not subject to the requirements in [FISMA]. However, the controls established in this publication may be selected
for national security systems as otherwise required (e.g., the Privacy Act of 1974) or with the approval of federal
officials exercising policy authority over such systems. [CNSSP 22] and [CNSSI 1253] provide guidance for national
security systems. [DODI 8510.01] provides guidance for the Department of Defense.
10 While the controls established in this publication are mandatory for federal information systems and organizations,
other organizations such as state, local, and tribal governments, as well as private sector organizations are
encouraged to consider using these guidelines, as appropriate. See [SP 800-53B] for federal control baselines.
Finally, the controls in the catalog are independent of the process employed to select those controls. Such selection processes can be part of an organization-wide risk management process, a systems engineering process, 11 the Risk Management Framework (RMF), or the Cybersecurity Framework. 12 The control selection criteria can be guided and informed by many factors, including mission and business needs; stakeholder protection needs; vulnerabilities; threats; and requirements to comply with laws, executive orders, directives, regulations, policies, standards, and guidelines. The combination of a comprehensive set of security and privacy controls and a risk-based control selection process can help organizations comply with stated security and privacy requirements, obtain adequate security for their information systems, and protect privacy for individuals.
11 Risk management is an integral part of systems engineering, systems security engineering, and privacy engineering.
12 [OMB A-130] requires federal agencies to implement the NIST Risk Management Framework for the selection of controls for federal information systems. [EO 13800] requires federal agencies to implement the NIST Framework for Improving Critical Infrastructure Cybersecurity to manage cybersecurity risk.
• The application of system security and privacy engineering principles and practices to securely integrate system components into information systems;
• The employment of security and privacy practices that are well documented and integrated into and supportive of the institutional and operational processes of organizations; and
• Continuous monitoring of information systems and organizations to determine the ongoing effectiveness of controls, changes in information systems and environments of operation, and the state of security and privacy organization-wide.

Organizations continuously assess the security and privacy risks to organizational operations and assets, individuals, other organizations, and the Nation. These risks arise from the planning and execution of their missions and business functions and by placing information systems into operation or continuing system operations. Realistic assessments of risk require a thorough understanding of the susceptibility to threats based on the vulnerabilities in information systems and organizations and the likelihood and potential adverse impacts of successful exploitations of such vulnerabilities by those threats. 13 Risk assessments also require an understanding of privacy risks. 14 To address these concerns, security and privacy requirements are satisfied with the knowledge and understanding of the organizational risk management strategy 15 considering the cost, schedule, and performance issues associated with the design, development, acquisition, deployment, operation, and sustainment of the organizational information systems.
The catalog of security and privacy controls can be effectively used to protect organizations, individuals, and information systems from traditional and advanced persistent threats and privacy risks arising from the processing of personally identifiable information in varied operational, environmental, and technical scenarios. The controls can be used to demonstrate compliance with a variety of governmental, organizational, or institutional security and privacy requirements. Organizations have the responsibility to select the appropriate security and privacy controls, to implement the controls correctly, and to demonstrate the effectiveness of the controls in satisfying security and privacy requirements. 16

Organizational risk assessments are used, in part, to inform the security and privacy control selection process. The selection process results in an agreed-upon set of security and privacy controls addressing specific mission or business needs consistent with organizational risk tolerance. 17 The process preserves, to the greatest extent possible, the agility and flexibility that organizations need to address an increasingly sophisticated and hostile threat space, mission and business requirements, rapidly changing technologies, complex supply chains, and many types of operational environments. Security and privacy controls can also be used in developing specialized baselines or overlays for unique or specialized missions or business applications,
17 Authorizing officials or their designated representatives, by accepting the security and privacy plans, agree to the
security and privacy controls proposed to meet the security and privacy requirements for organizations and systems.
18 [SP 800-53B] provides guidance for tailoring security and privacy control baselines and for developing overlays to
support the specific protection needs and requirements of stakeholders and their organizations.
19 Mapping tables and related information are available at https://csrc.nist.gov.
20 [OMB A-130] establishes policy for the planning, budgeting, governance, acquisition, and management of federal
information, personnel, equipment, funds, IT resources and supporting infrastructure and services.
regarding control implementation and assessment; a list of related controls to show the relationships and dependencies among controls; and a list of references to supporting publications that may be helpful to organizations.

• Supporting appendices provide additional information on the use of security and privacy controls including:
  - General references; 21
  - Definitions and terms;
  - Acronyms; and
  - Summary tables for controls.
21 Unless otherwise stated, all references to NIST publications refer to the most recent version of those publications.
This chapter presents the fundamental concepts associated with security and privacy controls, including the relationship between requirements and controls; the structure of controls; how control flexibility is achieved through well-defined tailoring actions; how controls are organized in the consolidated control catalog; the different ways to designate the types of controls for information systems and organizations; the relationship between security and privacy controls; the purpose of control baselines and how tailoring is used to customize controls and baselines; and the importance of the concepts of trustworthiness and assurance for both security and privacy controls and the effect on achieving trustworthy, secure, and resilient systems.
Organizations may divide security and privacy requirements into more granular categories depending on where the requirements are employed in the System Development Life Cycle (SDLC) and for what purpose. Organizations may use the term capability requirement to describe a capability that the system or organization must provide to satisfy a stakeholder protection need. In addition, organizations may refer to system requirements that pertain to particular hardware, software, and firmware components of a system as specification requirements—that is, capabilities that implement all or part of a control and that may be assessed (i.e., as part of the verification, validation, testing, and evaluation processes). Finally, organizations may use the term statement of work requirements to refer to actions that must be performed operationally or during system development.

Controls can be viewed as descriptions of the safeguards and protection capabilities appropriate for achieving the particular security and privacy objectives of the organization and reflecting the protection needs of organizational stakeholders. Controls are selected and implemented by the organization in order to satisfy the system requirements. Controls can include technical aspects, administrative aspects, and physical aspects. In some cases, the selection and implementation of a control may necessitate additional specification by the organization in the form of derived requirements or instantiated control parameter values. The derived requirements and control parameter values may be necessary to provide the appropriate level of implementation detail for particular controls within the SDLC.
ID  FAMILY                                       ID  FAMILY
AC  Access Control                               PE  Physical and Environmental Protection
AT  Awareness and Training                       PL  Planning
AU  Audit and Accountability                     PM  Program Management
CA  Assessment, Authorization, and Monitoring    PS  Personnel Security
CM  Configuration Management                     PT  PII Processing and Transparency
CP  Contingency Planning                         RA  Risk Assessment
IA  Identification and Authentication            SA  System and Services Acquisition
IR  Incident Response                            SC  System and Communications Protection
MA  Maintenance                                  SI  System and Information Integrity
MP  Media Protection                             SR  Supply Chain Risk Management
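
For organizations building tooling around the catalog, the family identifiers above can be treated as a simple lookup table. The following sketch (Python) is illustrative only and is not part of this publication; the dictionary and helper function names are assumptions used to show how a control identifier such as AU-4(1) can be resolved to its family.

# Illustrative only; not part of the SP 800-53 catalog. Maps the two-letter
# family identifiers from the table above to their family names.
CONTROL_FAMILIES = {
    "AC": "Access Control",                     "PE": "Physical and Environmental Protection",
    "AT": "Awareness and Training",             "PL": "Planning",
    "AU": "Audit and Accountability",           "PM": "Program Management",
    "CA": "Assessment, Authorization, and Monitoring", "PS": "Personnel Security",
    "CM": "Configuration Management",           "PT": "PII Processing and Transparency",
    "CP": "Contingency Planning",               "RA": "Risk Assessment",
    "IA": "Identification and Authentication",  "SA": "System and Services Acquisition",
    "IR": "Incident Response",                  "SC": "System and Communications Protection",
    "MA": "Maintenance",                        "SI": "System and Information Integrity",
    "MP": "Media Protection",                   "SR": "Supply Chain Risk Management",
}

def family_of(control_id: str) -> str:
    """Return the family name for an identifier such as 'AC-2' or 'AU-4(1)' (hypothetical helper)."""
    return CONTROL_FAMILIES[control_id.split("-")[0].strip().upper()]

print(family_of("AU-4(1)"))   # -> Audit and Accountability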
Families of controls contain base controls and control enhancements, which are directly related to their base controls. Control enhancements either add functionality or specificity to a base control or increase the strength of a base control. In both cases, control enhancements are used in information systems and environments of operation that require greater protection than provided by the base control due to the potential adverse organizational or individual impacts or when organizations require additions to the base control functionality or assurance based on organizational assessments of risk. The use of control enhancements always requires the use of the base control.
Security and privacy controls have the following structure: a base control section; a discussion section; a related controls section; a control enhancements section; and a references section.
22 Seventeen of the twenty control families in NIST Special Publication 800-53 are aligned with the minimum security requirements in [FIPS 200]. The Program Management (PM) and Supply Chain Risk Management (SR) families address enterprise-level program management and supply chain risk considerations pertaining to federal mandates that have emerged since FIPS Publication 200.
FIGURE 1: CONTROL STRUCTURE
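
As a reading aid only, the control structure described in this section can be represented as a small data structure. The sketch below (Python) is a hypothetical illustration and not a NIST-defined schema; the class and field names are assumptions.

# Hypothetical model of the control structure described above (base control,
# discussion, related controls, control enhancements, references). Illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ControlEnhancement:
    id: str                          # e.g., "AU-4(1)"
    name: str                        # short subtitle indicating the intended capability
    statement: str                   # capability statement that augments the base control
    discussion: str = ""             # present only when specific to the enhancement
    related_controls: List[str] = field(default_factory=list)

@dataclass
class Control:
    id: str                          # e.g., "AU-4"
    name: str
    statement: str                   # the base control: the capability to be implemented
    discussion: str = ""             # considerations, purpose, and examples
    related_controls: List[str] = field(default_factory=list)
    enhancements: List[ControlEnhancement] = field(default_factory=list)
    references: List[str] = field(default_factory=list)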
The control section prescribes a security or privacy capability to be implemented. Such capability is achieved by the activities or actions, automated or nonautomated, carried out by information systems and organizations. Organizations designate the responsibility for control development, implementation, assessment, and monitoring. Organizations have the flexibility to implement the selected controls in whatever manner satisfies organizational missions or business needs, consistent with law, regulation, and policy.
For some controls, additional flexibility is provided by allowing organizations to define specific values for designated parameters associated with the controls. Flexibility is achieved as part of a tailoring process using assignment and selection statements embedded within the controls and enclosed by brackets. The assignment and selection statements give organizations the capability to customize controls based on stakeholder security and privacy requirements. Determination of the organization-defined parameters can evolve from many sources, including laws, executive orders, directives, regulations, policies, standards, guidance, and mission or business needs. Organizational risk assessments and risk tolerance are also important factors in defining the values for control parameters. 23 Organizations are responsible for assigning the parameter values for each selected control. Once specified, the values for the assignment and selection statements become a part of the control. The implementation of the control is assessed against the completed control statement. In contrast to assignment statements, which allow complete flexibility in the designation of parameter values, selection statements narrow the range of potential values by providing a specific list of items from which organizations must choose.
In addition to assignment and selection statements embedded in a control, additional flexibility is achieved through iteration and refinement actions. Iteration allows organizations to use a control multiple times, with different assignment and selection values, perhaps being applied in different situations or when implementing multiple policies. For example, an organization may have multiple systems implementing a control, but with different parameters established to address different risks for each system and environment of operation. Refinement is the process of providing additional implementation detail to a control. Refinement can also be used to narrow the scope of a control in conjunction with iteration to cover all applicable scopes (e.g., applying different authentication mechanisms to different system interfaces). The combination of assignment and selection statements and iteration and refinement actions, when applied to controls, provides the needed flexibility to allow organizations to satisfy a broad base of security and privacy requirements at the organization, mission/business process, and system levels of implementation.
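
To make the tailoring actions concrete, the sketch below (Python) completes a control statement containing a bracketed assignment parameter and then iterates the same control with two different organization-defined values. The paraphrased statement text, parameter name, and helper function are assumptions used only for illustration.

import re

# Illustrative only: a paraphrased control statement with an embedded
# [Assignment: ...] parameter, completed with organization-defined values.
TEMPLATE = ("Allocate audit log storage capacity to accommodate "
            "[Assignment: organization-defined audit log retention requirements].")

def complete_statement(template: str, values: dict) -> str:
    """Replace each [Assignment: ...] or [Selection: ...] placeholder with the organization-defined value."""
    return re.sub(r"\[(?:Assignment|Selection):\s*([^\]]+)\]",
                  lambda m: values[m.group(1)], template)

# Iteration: the same control completed twice with different parameter values,
# e.g., for two systems that address different risks and environments of operation.
for value in ("a 90-day retention requirement", "a 1-year retention requirement"):
    print(complete_statement(
        TEMPLATE, {"organization-defined audit log retention requirements": value}))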
The discussion section provides additional information about a control. Organizations can use the information as needed when developing, implementing, assessing, or monitoring controls. The information provides important considerations for implementing controls based on mission or business requirements, operational environments, or assessments of risk. The additional information can also explain the purpose of controls and often includes examples. Control enhancements may also include a separate discussion section when the discussion information is applicable only to a specific control enhancement.
The related controls section provides a list of controls from the control catalog that impact or support the implementation of a particular control or control enhancement, address a related security or privacy capability, or are referenced in the discussion section. Control enhancements are inherently related to their base control—thus, related controls that are referenced in the base control are not repeated in the control enhancements. However, there may be related controls identified for control enhancements that are not referenced in the base control (i.e., the related control is only associated with the specific control enhancement). Controls may also be related to enhancements of other base controls. When a control is designated as a related control, a corresponding designation is made on that control in its source location in the catalog to illustrate the two-way relationship.
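
The two-way relationship described above lends itself to a simple consistency check when the catalog is handled as data. The sketch below (Python) is hypothetical; the miniature catalog and the relationships shown are placeholders, not an excerpt of the actual catalog.

# Illustrative only: verify that related-control references are reciprocal,
# i.e., if control A lists B as related, then B also lists A.
catalog = {
    "AC-2": {"related": ["AC-3", "IA-2"]},
    "AC-3": {"related": ["AC-2"]},
    "IA-2": {"related": ["AC-2"]},
}

def missing_reciprocal_links(catalog: dict) -> list:
    """Return (a, b) pairs where a lists b as related but b does not list a."""
    missing = []
    for control_id, entry in catalog.items():
        for related_id in entry["related"]:
            if control_id not in catalog.get(related_id, {}).get("related", []):
                missing.append((control_id, related_id))
    return missing

assert missing_reciprocal_links(catalog) == []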
The control enhancements section provides statements of security and privacy capability that augment a base control. The control enhancements are numbered sequentially within each control so that the enhancements can be easily identified when selected to supplement the base control. 24 Each control enhancement has a short subtitle to indicate the intended function or capability provided by the enhancement. In the AU-4 example, if the control enhancement is selected, the control designation becomes AU-4(1). The numerical designation of a control enhancement is used only to identify that enhancement within the control. The designation is not indicative of the strength of the control enhancement, level or degree of protection, or any hierarchical relationship among the enhancements. Control enhancements are not intended to be selected independently. That is, if a control enhancement is selected, then the corresponding base control must also be selected and implemented.

23 In general, organization-defined control parameters used in assignment and selection statements in the base security and privacy controls apply also to the control enhancements associated with those controls.
24 The numbering or order of the control enhancements does not imply priority or level of importance.
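
Because selecting an enhancement requires selecting its base control, tooling that manages control selections can enforce the rule mechanically. The sketch below (Python) is an illustration only; the identifier pattern and function name are assumptions.

import re

# Illustrative only: given a set of selected control identifiers, add any base
# controls implied by selected enhancements (e.g., "AU-4(1)" implies "AU-4").
ENHANCEMENT_ID = re.compile(r"^([A-Z]{2}-\d+)\(\d+\)$")

def with_required_base_controls(selected: set) -> set:
    """Return the selection with missing base controls added."""
    completed = set(selected)
    for control_id in selected:
        match = ENHANCEMENT_ID.match(control_id)
        if match:
            completed.add(match.group(1))   # add the base control, e.g., AU-4
    return completed

print(sorted(with_required_base_controls({"AU-4(1)", "AC-2"})))
# ['AC-2', 'AU-4', 'AU-4(1)']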
The references section includes a list of applicable laws, policies, standards, guidelines, websites, and other useful references that are relevant to a specific control or control enhancement. 25 The references section also contains hyperlinks to specific publications for obtaining additional information for control development, implementation, assessment, and monitoring.
SECURITY AS A DESIGN PROBLEM

“Providing satisfactory security controls in a computer system is …. a system design problem. A combination of hardware, software, communications, physical, personnel and administrative-procedural safeguards is required for comprehensive security…. software safeguards alone are not sufficient.”

-- The Ware Report
Defense Science Board Task Force on Computer Security, 1970.
Common controls are security or privacy controls whose implementation results in a capability that is inheritable by multiple information systems or programs. A control is deemed inheritable when the information system or program receives protection from the implemented control, but the control is developed, implemented, assessed, authorized, and monitored by an internal or external entity other than the entity responsible for the system or program. The security and privacy capabilities provided by common controls can be inherited from many sources, including mission or business lines, organizations, enclaves, environments of operation, sites, or other information systems or programs. However, the use of common controls can introduce the risk of a single point of failure.

25 References are provided to assist organizations in applying the security and privacy controls and are not intended to be inclusive or complete.
26 [SP 800-37] provides additional guidance on control designations and how the different types of controls are used.
Many of the controls needed to protect organizational information systems, including many physical and environmental protection controls, personnel security controls, and incident response controls, are inheritable—and therefore, are good candidates for common control status. Common controls can include technology-based controls, for example, boundary protection controls, access controls, audit and accountability controls, and identification and authentication controls. The cost of development, implementation, assessment, authorization, and monitoring can be amortized across multiple information systems, organizational elements, and programs.

Controls not designated as common controls are considered system-specific or hybrid controls. System-specific controls are the primary responsibility of information system owners and the authorizing officials for those systems. Organizations can designate a control as hybrid if a part of the control is common (inheritable) and a part of the control is system-specific. For example, an organization may implement control CP-2 using a predefined template for the contingency plan for all organizational information systems, with individual system owners tailoring the plan for system-specific uses where appropriate. The division of a hybrid control into its common (inheritable) and system-specific parts may vary by organization, depending on the types of information technologies employed, the approach used by the organization to manage its controls, and assignment of responsibilities. When a control is designated as a hybrid control, the common control provider is responsible for implementing, assessing, and monitoring the common part of the hybrid control, and the system owner is responsible for implementing, assessing, and monitoring the system-specific part of the hybrid control.
The planning for a control to be common, hybrid, or system-specific is best carried out early in the system development life cycle and is coordinated with the entities providing the control [SP 800-37]. Similarly, if a control is to be inheritable, coordination is required with the inheriting entity to ensure the control meets its needs. This is especially important given the nature of control parameters. An inheriting entity cannot assume that controls are the same and mitigate the appropriate risk to the system just because the control identifiers (e.g., AC-1) are the same. It is essential to examine the control parameters (e.g., assignment or selection statements) when determining if the control is adequate to mitigate system-specific risks.
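
The caution above can be illustrated with a small parameter comparison: two controls with the same identifier may still differ in their organization-defined parameter values. The sketch below (Python) uses hypothetical parameter names and values; it is not drawn from the catalog.

# Illustrative only: compare the parameter values a common control provider
# implements against the values the inheriting system needs.
provider_parameters = {
    "AC-1": {"policy review frequency": "annually"},
}
system_needs = {
    "AC-1": {"policy review frequency": "every 6 months"},
}

def inheritance_gaps(provider: dict, needed: dict) -> list:
    """Return (control, parameter, provided, needed) tuples where values differ."""
    gaps = []
    for control_id, needed_params in needed.items():
        provided_params = provider.get(control_id, {})
        for name, needed_value in needed_params.items():
            if provided_params.get(name) != needed_value:
                gaps.append((control_id, name, provided_params.get(name), needed_value))
    return gaps

# The same identifier (AC-1) does not by itself demonstrate adequate risk mitigation:
print(inheritance_gaps(provider_parameters, system_needs))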
Privacy programs are responsible for ensuring compliance with applicable privacy requirements and for managing risks to individuals associated with the creation, collection, use, processing, storage, maintenance, dissemination, disclosure, or disposal (collectively referred to as “processing”) of personally identifiable information. 27 Security and privacy program objectives overlap with respect to the security of personally identifiable information; therefore, many controls are selected to meet both sets of objectives and are considered both security controls and privacy controls. Moreover, even when an organization selects a particular control to meet security objectives only, the way the control is implemented may impact aspects of individuals’ privacy. Therefore, controls may include privacy considerations in the discussion section so that organizations can take the potential risks to individuals’ privacy into account as they determine the best way to implement the controls.

Selecting and implementing the appropriate controls require close collaboration between information security programs and privacy programs when information systems are processing personally identifiable information. Organizations consider how to promote and institutionalize collaboration between the two programs to help ensure that the objectives of both disciplines are met. When a system processes personally identifiable information, the organization’s information security program and privacy program have a shared responsibility for managing the security risks to the personally identifiable information in the system. Due to this shared responsibility, controls that achieve both security and privacy objectives are considered both privacy and security controls. Identification and Authentication (IA) controls are examples of such controls.
Two fundamental components affecting the trustworthiness of systems are functionality and assurance. Functionality is defined in terms of the security and privacy features, functions, mechanisms, services, procedures, and architectures implemented within organizational systems and programs, and the environments in which those systems and programs operate. Assurance is the measure of confidence that the system functionality is implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security and privacy requirements for the system—thus possessing the capability to accurately mediate and enforce established security and privacy policies.

27 Privacy programs may also choose to consider the risks to individuals that may arise from their interactions with information systems, where the processing of personally identifiable information may be less impactful than the effect the system has on individuals’ behavior or activities. Such effects would constitute risks to individual autonomy and organizations may need to take steps to manage those risks in addition to information security and privacy risks.
28 [SP 800-160 v1] provides guidance on systems security engineering and the application of security design principles.
In general, the task of providing meaningful assurance that a system is likely to do what is expected of it can be enhanced by techniques that simplify or narrow the analysis, for example, by increasing the discipline applied to the system architecture, software design, specifications, code style, and configuration management. Security and privacy controls address functionality and assurance. Certain controls focus primarily on functionality while other controls focus primarily on assurance. Some controls can support functionality and assurance. Organizations can select assurance-related controls to define system development activities, to generate evidence about the functionality and behavior of the system, and to trace the evidence to the specific system elements that provide such functionality or exhibit such behavior. The evidence is used to obtain a degree of confidence that the system satisfies the stated security and privacy requirements—while supporting the organization’s missions and business functions. Assurance-related controls are identified in the control summary tables in Appendix D.
This catalog of security and privacy controls provides protective measures for systems, organizations, and individuals. 30 The controls are designed to facilitate compliance with applicable laws, executive orders, directives, regulations, policies, and standards. The security and privacy controls in the catalog, with few exceptions, are policy, technology, and sector neutral—meaning the controls focus on the fundamental measures necessary to protect information and the privacy of individuals across the information life cycle. While security and privacy controls are largely policy, technology, and sector neutral, that does not imply that the controls are policy, technology, and sector unaware. Understanding policies, technologies, and sectors is necessary so that the controls are relevant when implemented. Employing a policy, technology, and sector neutral control catalog has many benefits. It encourages organizations to:

• Focus on the security and privacy functions and capabilities required for mission and business success and the protection of information and the privacy of individuals, irrespective of the technologies that are employed in organizational systems;
• Analyze each security and privacy control for its applicability to specific technologies, environments of operation, missions and business functions, and communities of interest; and
• Specify security and privacy policies as part of the tailoring process for controls that have variable parameters.
899 In the few cases where specific technologies are referenced in controls, organizations are
900 cautioned that the need to manage security and privacy risks in all likelihood goes beyond the
901 requirements in a single control associated with a technology. Additional protection
902 measures, when needed, are obtained from the other controls in the catalog. Federal Information Processing
903 Standards, Special Publications, and Interagency/Internal Reports provide guidance on security
904 and privacy controls for specific technologies and sector-specific applications, including smart
905 grid, cloud, healthcare, mobile, industrial and process control systems, and IoT devices. NIST
906 publications are cited as references as applicable to specific controls in sections 3.1 through
907 3.20.
908 Security and privacy controls in the catalog are expected to change over time, as controls are
909 withdrawn, revised, and added. To maintain stability in security and privacy plans, controls are
910 not renumbered each time a control is withdrawn. Rather, notations of the controls that have
911 been withdrawn are maintained in the control catalog for historical purposes. Controls may be
912 withdrawn for a variety of reasons, including when the function or capability provided by the control
913 has been incorporated into another control; when the control is redundant to an existing control; or
914 when the control is deemed to be no longer necessary or effective.
30 The controls in this publication are available online and can be obtained in various formats. See [NVD 800-53].
915 New controls are developed on a regular basis using threat and vulnerability information and
916 information on the tactics, techniques, and procedures used by adversaries. In addition, new
917 controls are developed based on a better understanding of how to mitigate information security
918 risks to systems and organizations and risks to the privacy of individuals arising from information
919 processing. Finally, new controls are developed based on new or changing requirements in laws,
920 executive orders, regulations, policies, standards, or guidelines. Proposed modifications to the
921 controls are carefully analyzed during each revision cycle, considering the need for stability of
922 controls and the need to be responsive to changing technologies, threats, vulnerabilities, types
923 of attack, and processing methods. The objective is to raise the level of information security and
924 privacy over time to meet the needs of organizations and individuals.
965 d. Specify:
966 1. Authorized users of the system;
967 2. Group and role membership; and
968 3. Access authorizations (i.e., privileges) and [Assignment: organization-defined attributes
969 (as required)] for each account;
970 e. Require approvals by [Assignment: organization-defined personnel or roles] for requests to
971 create accounts;
972 f. Create, enable, modify, disable, and remove accounts in accordance with [Assignment:
973 organization-defined policy, procedures, and conditions];
974 g. Monitor the use of accounts;
975 h. Notify account managers and [Assignment: organization-defined personnel or roles] within:
976 1. [Assignment: organization-defined time-period] when accounts are no longer required;
977 2. [Assignment: organization-defined time-period] when users are terminated or
978 transferred; and
979 3. [Assignment: organization-defined time-period] when system usage or need-to-know
980 changes for an individual;
981 i. Authorize access to the system based on:
982 1. A valid access authorization;
983 2. Intended system usage; and
984 3. [Assignment: organization-defined attributes (as required)];
985 j. Review accounts for compliance with account management requirements [Assignment:
986 organization-defined frequency];
987 k. Establish and implement a process for changing shared or group account credentials (if
988 deployed) when individuals are removed from the group; and
989 l. Align account management processes with personnel termination and transfer processes.
990 Discussion: Examples of system account types include individual, shared, group, system, guest,
991 anonymous, emergency, developer, temporary, and service. Identification of authorized system
992 users and the specification of access privileges reflect the requirements in other controls in the
993 security plan. Users requiring administrative privileges on system accounts receive additional
994 scrutiny by organizational personnel responsible for approving such accounts and privileged
995 access, including the system owner, mission or business owner, senior agency information security
996 officer, or senior agency official for privacy. External system accounts are not included in the
997 scope of this control. Organizations address external system accounts through organizational
998 policy.
999 Where access involves personally identifiable information, security programs collaborate with
1000 the senior agency official for privacy on establishing the specific conditions for group and role
1001 membership; specifying, for each account, authorized users, group and role membership, and
1002 access authorizations; and creating, adjusting, or removing system accounts in accordance with
1003 organizational policies. Policies can include such information as account expiration dates or other
1004 factors triggering the disabling of accounts. Organizations may choose to define access privileges
1005 or other attributes by account, by type of account, or a combination of the two. Examples of
1006 other attributes required for authorizing access include restrictions on time-of-day, day-of-week,
1007 and point-of-origin. In defining other system account attributes, organizations consider system-
1008 related requirements and mission/business requirements. Failure to consider these factors could
1009 affect system availability.
1010 Temporary and emergency accounts are intended for short-term use. Organizations establish
1011 temporary accounts as a part of normal account activation procedures when there is a need for
1012 short-term accounts without the demand for immediacy in account activation. Organizations
1013 establish emergency accounts in response to crisis situations and with the need for rapid account
1014 activation. Therefore, emergency account activation may bypass normal account authorization
1015 processes. Emergency and temporary accounts are not to be confused with infrequently used
1016 accounts, including local logon accounts used for special tasks or when network resources are
1017 unavailable (may also be known as accounts of last resort). Such accounts remain available and
1018 are not subject to automatic disabling or removal dates. Conditions for disabling or deactivating
1019 accounts include when shared/group, emergency, or temporary accounts are no longer required;
1020 and when individuals are transferred or terminated. Changing shared/group account credentials
1021 when members leave the group is intended to ensure that former group members do not retain
1022 access to the shared or group account. Some types of system accounts may require specialized
1023 training.
1024 Related Controls: AC-3, AC-5, AC-6, AC-17, AC-18, AC-20, AC-24, AU-2, AU-12, CM-5, IA-2, IA-4,
1025 IA-5, IA-8, MA-3, MA-5, PE-2, PL-4, PS-2, PS-4, PS-5, PS-7, SC-7, SC-13, SC-37.
1026 Control Enhancements:
1027 (1) ACCOUNT MANAGEMENT | AUTOMATED SYSTEM ACCOUNT MANAGEMENT
1028 Support the management of system accounts using [Assignment: organization-defined
1029 automated mechanisms].
1030 Discussion: Automated mechanisms include using email or text messaging to automatically
1031 notify account managers when users are terminated or transferred; using the system to
1032 monitor account usage; and using telephonic notification to report atypical system account
1033 usage.
1034 Related Controls: None.
1035 (2) ACCOUNT MANAGEMENT | AUTOMATED TEMPORARY AND EMERGENCY ACCOUNT MANAGEMENT
1036 Automatically [Selection: remove; disable] temporary and emergency accounts after
1037 [Assignment: organization-defined time-period for each type of account].
1038 Discussion: Management of temporary and emergency accounts includes the removal or
1039 disabling of such accounts automatically after a predefined time-period, rather than at the
1040 convenience of the systems administrator. Automatic removal or disabling of accounts
1041 provides a more consistent implementation.
1042 Related Controls: None.
1043 (3) ACCOUNT MANAGEMENT | DISABLE ACCOUNTS
1044 Disable accounts when the accounts:
1045 (a) Have expired;
1046 (b) Are no longer associated with a user or individual;
1047 (c) Are in violation of organizational policy; or
1048 (d) Have been inactive for [Assignment: organization-defined time-period].
1049 Discussion: Disabling expired, inactive, or otherwise anomalous accounts supports the
1050 concept of least privilege and least functionality, which reduces the attack surface of the
1051 system.
1052 Related Controls: None.
1378 assurance. Organizational personnel consult with the senior agency official for privacy and
1379 legal counsel to determine appropriate mechanisms and access rights or limitations.
1380 Related Controls: IA-8, PM-22, PT-3, SI-18.
1381 (15) ACCESS ENFORCEMENT | DISCRETIONARY AND MANDATORY ACCESS CONTROL
1382 (a) Enforce [Assignment: organization-defined mandatory access control policy] over the
1383 set of covered subjects and objects specified in the policy; and
1384 (b) Enforce [Assignment: organization-defined discretionary access control policy] over
1385 the set of covered subjects and objects specified in the policy.
1386 Discussion: Implementing a mandatory access control policy and a discretionary access
1387 control policy simultaneously can provide additional protection against the unauthorized
1388 execution of code by users or processes acting on behalf of users. This helps prevent a single
1389 compromised user or process from compromising the entire system.
1390 Related Controls: SC-2, SC-3, AC-4.
1391 References: [OMB A-130]; [SP 800-57-1]; [SP 800-57-2]; [SP 800-57-3]; [SP 800-162]; [SP 800-
1392 178]; [IR 7874].
1426 guards. Such capabilities are generally not available in commercial off-the-shelf information
1427 technology products. This control also applies to control plane traffic (e.g., routing and DNS).
1428 Related Controls: AC-3, AC-6, AC-16, AC-17, AC-19, AC-21, AU-10, CA-3, CA-9, CM-7, PM-24, SA-
1429 17, SC-4, SC-7, SC-16, SC-31.
1430 Control Enhancements:
1431 (1) INFORMATION FLOW ENFORCEMENT | OBJECT SECURITY AND PRIVACY ATTRIBUTES
1432 Use [Assignment: organization-defined security and privacy attributes] associated with
1433 [Assignment: organization-defined information, source, and destination objects] to enforce
1434 [Assignment: organization-defined information flow control policies] as a basis for flow
1435 control decisions.
1436 Discussion: Information flow enforcement mechanisms compare security and privacy
1437 attributes associated with information (i.e., data content and structure) and source and
1438 destination objects and respond appropriately when the enforcement mechanisms
1439 encounter information flows not explicitly allowed by information flow policies. For
1440 example, an information object labeled Secret would be allowed to flow to a destination
1441 object labeled Secret, but an information object labeled Top Secret would not be allowed to
1442 flow to a destination object labeled Secret. A dataset of personally identifiable information
1443 may be tagged with restrictions against combining with other types of datasets, and
1444 therefore, would not be allowed to flow to the restricted dataset. Security and privacy
1445 attributes can also include source and destination addresses employed in traffic filter
1446 firewalls. Flow enforcement using explicit security or privacy attributes can be used, for
1447 example, to control the release of certain types of information.
1448 Related Controls: None.
1449 (2) INFORMATION FLOW ENFORCEMENT | PROCESSING DOMAINS
1450 Use protected processing domains to enforce [Assignment: organization-defined
1451 information flow control policies] as a basis for flow control decisions.
1452 Discussion: Protected processing domains within systems are processing spaces that have
1453 controlled interactions with other processing spaces, enabling control of information flows
1454 between these spaces and to/from information objects. A protected processing domain can
1455 be provided, for example, by implementing domain and type enforcement. In domain and
1456 type enforcement, system processes are assigned to domains; information is identified by
1457 types; and information flows are controlled based on allowed information accesses (i.e.,
1458 determined by domain and type), allowed signaling among domains, and allowed process
1459 transitions to other domains.
1460 Related Controls: SC-39.
1461 (3) INFORMATION FLOW ENFORCEMENT | DYNAMIC INFORMATION FLOW CONTROL
1462 Enforce [Assignment: organization-defined information flow control policies].
1463 Discussion: Organizational policies regarding dynamic information flow control include
1464 allowing or disallowing information flows based on changing conditions or mission or
1465 operational considerations. Changing conditions include changes in risk tolerance due to
1466 changes in the immediacy of mission or business needs, changes in the threat environment,
1467 and detection of potentially harmful or adverse events.
1468 Related Controls: SI-4.
1469 (4) INFORMATION FLOW ENFORCEMENT | FLOW CONTROL OF ENCRYPTED INFORMATION
1470 Prevent encrypted information from bypassing [Assignment: organization-defined
1471 information flow control mechanisms] by [Selection (one or more): decrypting the
1472 information; blocking the flow of the encrypted information; terminating communications
1520 development of rule sets to address the sensitivity of the information conveyed by the data
1521 or the flow enforcement decisions. Unstructured data consists of bitmap objects that are
1522 inherently non-language-based (i.e., image, video, or audio files); and textual objects that
1523 are based on written or printed languages. Organizations can implement more than one
1524 security or privacy policy filter to meet information flow control objectives.
1525 Related Controls: None.
1526 (9) INFORMATION FLOW ENFORCEMENT | HUMAN REVIEWS
1527 Enforce the use of human reviews for [Assignment: organization-defined information
1528 flows] under the following conditions: [Assignment: organization-defined conditions].
1529 Discussion: Organizations define security or privacy policy filters for all situations where
1530 automated flow control decisions are possible. When a fully automated flow control decision
1531 is not possible, a human review may be employed in lieu of, or as a complement to,
1532 automated security or privacy policy filtering. Human reviews may also be employed as
1533 deemed necessary by organizations.
1534 Related Controls: None.
1535 (10) INFORMATION FLOW ENFORCEMENT | ENABLE AND DISABLE SECURITY OR PRIVACY POLICY FILTERS
1536 Provide the capability for privileged administrators to enable and disable [Assignment:
1537 organization-defined security or privacy policy filters] under the following conditions:
1538 [Assignment: organization-defined conditions].
1539 Discussion: For example, as allowed by the system authorization, administrators can enable
1540 security or privacy policy filters to accommodate approved data types. Administrators also
1541 have the capability to select the filters that are executed on a specific data flow based on the
1542 type of data that is being transferred, the source and destination security or privacy
1543 domains, and other security or privacy relevant features, as needed.
1544 Related Controls: None.
1545 (11) INFORMATION FLOW ENFORCEMENT | CONFIGURATION OF SECURITY OR PRIVACY POLICY FILTERS
1546 Provide the capability for privileged administrators to configure [Assignment:
1547 organization-defined security or privacy policy filters] to support different security or
1548 privacy policies.
1549 Discussion: Documentation contains detailed information for configuring security or privacy
1550 policy filters. For example, administrators can configure security or privacy policy filters to
1551 include the list of “dirty words” that security or privacy policy mechanisms check in
1552 accordance with the definitions provided by organizations.
1553 Related Controls: None.
1554 (12) INFORMATION FLOW ENFORCEMENT | DATA TYPE IDENTIFIERS
1555 When transferring information between different security or privacy domains, use
1556 [Assignment: organization-defined data type identifiers] to validate data essential for
1557 information flow decisions.
1558 Discussion: Data type identifiers include filenames, file types, file signatures or tokens, and
1559 multiple internal file signatures or tokens. Systems allow transfer of data only if compliant
1560 with data type format specifications. Identification and validation of data types is based on
1561 defined specifications associated with each allowed data format. The filename and number
1562 alone are not used for data type identification. Content is validated syntactically and
1563 semantically against its specification to ensure it is the proper data type.
1564 Related Controls: None.
1703 invoked. In general, the use of parallel filtering architectures for content filtering of a single
1704 data type introduces bypass and non-invocation issues.
1705 Related Controls: None.
1706 (29) INFORMATION FLOW ENFORCEMENT | FILTER ORCHESTRATION ENGINES
1707 When transferring information between different security or privacy domains, employ
1708 content filter orchestration engines to ensure that:
1709 (a) Content filtering mechanisms successfully complete execution without errors; and
1710 (b) Content filtering actions occur in the correct order and comply with [Assignment:
1711 organization-defined policy].
1712 Discussion: Content filtering is the process of inspecting information as it traverses a cross
1713 domain solution and determining whether the information meets a pre-defined security policy. An
1714 orchestration engine coordinates the sequencing of activities (manual and automated) in a
1715 content filtering process. Errors are defined as either anomalous actions or unexpected
1716 termination of the content filter process. This is not the same as a filter failing content due to
1717 non-compliance with policy. Content filter reports are a commonly used mechanism to
1718 ensure expected filtering actions are completed successfully.
1719 Related Controls: None.
1720 (30) INFORMATION FLOW ENFORCEMENT | FILTER MECHANISMS USING MULTIPLE PROCESSES
1721 When transferring information between different security or privacy domains, implement
1722 content filtering mechanisms using multiple processes.
1723 Discussion: The use of multiple processes to implement content filtering mechanisms
1724 reduces the likelihood of a single point of failure.
1725 Related Controls: None.
1726 (31) INFORMATION FLOW ENFORCEMENT | FAILED CONTENT TRANSFER PREVENTION
1727 When transferring information between different security or privacy domains, prevent the
1728 transfer of failed content to the receiving domain.
1729 Discussion: Content that failed filtering checks can corrupt the system if transferred to the
1730 receiving domain.
1731 Related Controls: None.
1732 (32) INFORMATION FLOW ENFORCEMENT | PROCESS REQUIREMENTS FOR INFORMATION TRANSFER
1733 When transferring information between different security or privacy domains, the process
1734 that transfers information between filter pipelines:
1735 (a) Does not filter message content;
1736 (b) Validates filtering metadata;
1737 (c) Ensures the content associated with the filtering metadata has successfully completed
1738 filtering; and
1739 (d) Transfers the content to the destination filter pipeline.
1740 Discussion: The processes transferring information between filter pipelines have minimum
1741 complexity and functionality to provide assurance that the processes operate correctly.
1742 Related Controls: None.
1743 References: [SP 800-160 v1]; [SP 800-162]; [SP 800-178].
1878 b. Automatically [Selection (one or more): lock the account or node for an [Assignment:
1879 organization-defined time-period]; lock the account or node until released by an
1880 administrator; delay next logon prompt per [Assignment: organization-defined delay
1881 algorithm]; notify system administrator; take other [Assignment: organization-defined
1882 action]] when the maximum number of unsuccessful attempts is exceeded.
1883 Discussion: This control applies regardless of whether the logon occurs via a local or network
1884 connection. Due to the potential for denial of service, automatic lockouts initiated by systems are
1885 usually temporary and automatically release after a predetermined, organization-defined time
1886 period. If a delay algorithm is selected, organizations may employ different algorithms for
1887 different components of the system based on the capabilities of those components. Responses
1888 to unsuccessful logon attempts may be implemented at the operating system and the application
1889 levels. Organization-defined actions that may be taken when the number of allowed consecutive
1890 invalid logon attempts is exceeded include prompting the user to answer a secret question in
1891 addition to the username and password; invoking a lockdown mode with limited user capabilities
1892 (instead of full lockout); or comparing the IP address to a list of known IP addresses for the user
1893 and then allowing additional logon attempts if the attempts are from a known IP address.
1894 Techniques to help prevent brute force attacks in lieu of an automatic system lockout or the
1895 execution of delay algorithms support the objective of availability while still protecting against
1896 such attacks. Techniques that are effective when used in combination include prompting the user
1897 to respond to a secret question before the number of allowed unsuccessful logon attempts is
1898 exceeded; allowing users to log on only from specified IP addresses; requiring a CAPTCHA to
1899 prevent automated attacks; or applying user profiles such as location, time of day, IP address,
1900 device, or MAC address. Automatically unlocking an account after a specified period of time is
1901 generally not permitted. However, exceptions may be required based on operational mission or
1902 need.
1903 Related Controls: AC-2, AC-9, AU-2, AU-6, IA-5.
1904 Control Enhancements:
1905 (1) UNSUCCESSFUL LOGON ATTEMPTS | AUTOMATIC ACCOUNT LOCK
1906 [Withdrawn: Incorporated into AC-7.]
1907 (2) UNSUCCESSFUL LOGON ATTEMPTS | PURGE OR WIPE MOBILE DEVICE
1908 Purge or wipe information from [Assignment: organization-defined mobile devices] based
1909 on [Assignment: organization-defined purging or wiping requirements and techniques]
1910 after [Assignment: organization-defined number] consecutive, unsuccessful device logon
1911 attempts.
1912 Discussion: A mobile device is a computing device that has a small form factor such that it
1913 can be carried by a single individual; is designed to operate without a physical connection;
1914 possesses local, non-removable or removable data storage; and includes a self-contained
1915 power source. Purging or wiping the device applies only to mobile devices for which the
1916 organization-defined number of unsuccessful logons occurs. The logon is to the mobile
1917 device, not to any one account on the device. Successful logons to accounts on mobile
1918 devices reset the unsuccessful logon count to zero. Purging or wiping may be unnecessary if
1919 the information on the device is protected with sufficiently strong encryption mechanisms.
1920 Related Controls: AC-19, MP-5, MP-6.
1921 (3) UNSUCCESSFUL LOGON ATTEMPTS | BIOMETRIC ATTEMPT LIMITING
1922 Limit the number of unsuccessful biometric logon attempts to [Assignment: organization-
1923 defined number].
1924 Discussion: Biometrics are probabilistic in nature. The ability to successfully authenticate
1925 can be impacted by many factors, including matching performance and presentation attack
1926 detection mechanisms. Organizations select the appropriate number of attempts and
1927 fallback mechanisms for users based on organizationally defined factors.
1928 Related Controls: IA-3.
1929 (4) UNSUCCESSFUL LOGON ATTEMPTS | USE OF ALTERNATE FACTOR
1930 (a) Allow the use of [Assignment: organization-defined authentication factors] that are
1931 different from the primary authentication factors after the number of organization-
1932 defined consecutive invalid logon attempts has been exceeded; and
1933 (b) Enforce a limit of [Assignment: organization-defined number] consecutive invalid
1934 logon attempts through use of the alternative factors by a user during a [Assignment:
1935 organization-defined time-period].
1936 Discussion: The use of alternate authentication factors supports the objective of availability
1937 and allows a user that has inadvertently been locked out to use additional authentication
1938 factors to bypass the lockout.
1939 Related Controls: IA-3.
1940 References: [SP 800-63-3]; [SP 800-124].
1967 users. Organizations also consult with the Office of the General Counsel for legal review and
1968 approval of warning banner content.
1969 Related Controls: AC-14, PL-4, SI-4.
1970 Control Enhancements: None.
1971 References: None.
2130 Discussion: Information is represented internally within systems using abstractions known as
2131 data structures. Internal data structures can represent different types of entities, both active and
2132 passive. Active entities, also known as subjects, are typically associated with individuals, devices,
2133 or processes acting on behalf of individuals. Passive entities, also known as objects, are typically
2134 associated with data structures such as records, buffers, tables, files, inter-process pipes, and
2135 communications ports. Security attributes, a form of metadata, are abstractions representing the
2136 basic properties or characteristics of active and passive entities with respect to safeguarding
2137 information. Privacy attributes, which may be used independently or in conjunction with
2138 security attributes, represent the basic properties or characteristics of active or passive entities
2139 with respect to the management of personally identifiable information. Attributes can be either
2140 explicitly or implicitly associated with the information contained in organizational systems or
2141 system components.
2142 Attributes may be associated with active entities (i.e., subjects) that have the potential to send or
2143 receive information, to cause information to flow among objects, or to change the system state.
2144 These attributes may also be associated with passive entities (i.e., objects) that contain or
2145 receive information. The association of attributes to subjects and objects by a system is referred
2146 to as binding and is inclusive of setting the attribute value and the attribute type. Attributes,
2147 when bound to data or information, permit the enforcement of security and privacy policies for
2148 access control and information flow control, including data retention limits, permitted uses of
2149 personally identifiable information, and identification of personal information within data
2150 objects. Such enforcement occurs through organizational processes or system functions or
2151 mechanisms. The binding techniques implemented by systems affect the strength of attribute
2152 binding to information. Binding strength and the assurance associated with binding techniques
2153 play an important part in the trust organizations have in the information flow enforcement
2154 process. The binding techniques affect the number and degree of additional reviews required by
2155 organizations. The content or assigned values of attributes can directly affect the ability of
2156 individuals to access organizational information.
2157 Organizations can define the types of attributes needed for systems to support missions or
2158 business functions. There are many values that can be assigned to a security attribute. For example,
2159 release markings include US only, NATO (North Atlantic Treaty Organization), or NOFORN (not releasable
2160 to foreign nationals). By specifying the permitted attribute ranges and values, organizations
2161 ensure that attribute values are meaningful and relevant. Labeling refers to the association of
2162 attributes with the subjects and objects represented by the internal data structures within
2163 systems. This facilitates system-based enforcement of information security and privacy policies.
2164 Labels include classification of information in accordance with legal and compliance
2165 requirements; access authorizations; nationality; data life cycle protection (i.e., encryption and
2166 data expiration); personally identifiable information processing permissions; individual consent
2167 to personally identifiable information processing; and affiliation as a contractor. Conversely,
2168 marking refers to the association of attributes with objects in a human-readable form. Marking
2169 enables manual, procedural, or process-based enforcement of information security and privacy
2170 policies. Attribute types include classification level for objects and clearance (access
2171 authorization) level for subjects. An attribute value for both attribute types is Top Secret.
2172 Related Controls: AC-3, AC-4, AC-6, AC-21, AC-25, AU-2, AU-10, MP-3, PE-22, PT-2, PT-5, SC-11,
2173 SC-16, SI-12.
2174 Control Enhancements:
2175 (1) SECURITY AND PRIVACY ATTRIBUTES | DYNAMIC ATTRIBUTE ASSOCIATION
2176 Dynamically associate security and privacy attributes with [Assignment: organization-
2177 defined subjects and objects] in accordance with the following security and privacy policies
2178 as information is created and combined: [Assignment: organization-defined security and
2179 privacy policies].
2227 mitigate the risk of unauthorized exposure of selected information, for example, shoulder
2228 surfing, the outputs display full attribute values when unmasked by the subscriber.
2229 Related Controls: None.
2230 (6) SECURITY AND PRIVACY ATTRIBUTES | MAINTENANCE OF ATTRIBUTE ASSOCIATION BY ORGANIZATION
2231 Require personnel to associate and maintain the association of [Assignment: organization-
2232 defined security and privacy attributes] with [Assignment: organization-defined subjects
2233 and objects] in accordance with [Assignment: organization-defined security and privacy
2234 policies].
2235 Discussion: This control enhancement requires individual users (as opposed to the system)
2236 to maintain associations of defined security and privacy attributes with subjects and objects.
2237 Related Controls: None.
2238 (7) SECURITY AND PRIVACY ATTRIBUTES | CONSISTENT ATTRIBUTE INTERPRETATION
2239 Provide a consistent interpretation of security and privacy attributes transmitted between
2240 distributed system components.
2241 Discussion: To enforce security and privacy policies across multiple system components in
2242 distributed systems, organizations provide a consistent interpretation of security and privacy
2243 attributes employed in access enforcement and flow enforcement decisions. Organizations
2244 can establish agreements and processes to help ensure that distributed system components
2245 implement attributes with consistent interpretations in automated access enforcement and
2246 flow enforcement actions.
2247 Related Controls: None.
2248 (8) SECURITY AND PRIVACY ATTRIBUTES | ASSOCIATION TECHNIQUES AND TECHNOLOGIES
2249 Implement [Assignment: organization-defined techniques and technologies] with
2250 [Assignment: organization-defined level of assurance] in associating security and privacy
2251 attributes to information.
2252 Discussion: The association of security and privacy attributes to information within systems
2253 is important for conducting automated access enforcement and flow enforcement actions.
2254 The association of such attributes to information (i.e., binding) can be accomplished with
2255 technologies and techniques providing different levels of assurance. For example, systems
2256 can bind attributes to information cryptographically using digital signatures supporting
2257 cryptographic keys protected by hardware devices (sometimes known as hardware roots of
2258 trust).
2259 Related Controls: None.
2260 (9) SECURITY AND PRIVACY ATTRIBUTES | ATTRIBUTE REASSIGNMENT — REGRADING MECHANISMS
2261 Change security and privacy attributes associated with information only via regrading
2262 mechanisms validated using [Assignment: organization-defined techniques or procedures].
2263 Discussion: A regrading mechanism is a trusted process authorized to re-classify and re-label
2264 data in accordance with a defined policy exception. Validated regrading mechanisms are
2265 used by organizations to provide the requisite levels of assurance for attribute reassignment
2266 activities. The validation is facilitated by ensuring that regrading mechanisms are single
2267 purpose and of limited function. Since security and privacy attribute changes can directly
2268 affect policy enforcement actions, implementing trustworthy regrading mechanisms is
2269 necessary to help ensure that such mechanisms perform in a consistent and correct mode of
2270 operation.
2271 Related Controls: None.
2272 (10) SECURITY AND PRIVACY ATTRIBUTES | ATTRIBUTE CONFIGURATION BY AUTHORIZED INDIVIDUALS
2273 Provide authorized individuals the capability to define or change the type and value of
2274 security and privacy attributes available for association with subjects and objects.
2275 Discussion: The content or assigned values of security and privacy attributes can directly
2276 affect the ability of individuals to access organizational information. Therefore, it is
2277 important for systems to be able to limit the ability to create or modify attributes to
2278 authorized individuals only.
2279 Related Controls: None.
2280 References: [OMB A-130]; [FIPS 140-3]; [FIPS 186-4]; [SP 800-162]; [SP 800-178].
2316 Discussion: Virtual private networks can be used to protect the confidentiality and integrity
2317 of remote access sessions. Transport Layer Security (TLS) is an example of a cryptographic
2318 protocol that provides end-to-end communications security over networks and is used for
2319 Internet communications and online transactions.
2320 Related Controls: SC-8, SC-12, SC-13.
2321 (3) REMOTE ACCESS | MANAGED ACCESS CONTROL POINTS
2322 Route remote accesses through authorized and managed network access control points.
2323 Discussion: Organizations consider the Trusted Internet Connections initiative [DHS TIC]
2324 requirements for external network connections since limiting the number of access control
2325 points for remote accesses reduces the attack surface.
2326 Related Controls: SC-7.
2327 (4) REMOTE ACCESS | PRIVILEGED COMMANDS AND ACCESS
2328 (a) Authorize the execution of privileged commands and access to security-relevant
2329 information via remote access only in a format that provides assessable evidence and
2330 for the following needs: [Assignment: organization-defined needs]; and
2331 (b) Document the rationale for remote access in the security plan for the system.
2332 Discussion: Remote access to systems represents a significant potential vulnerability that
2333 can be exploited by adversaries. As such, restricting the execution of privileged commands
2334 and access to security-relevant information via remote access reduces the exposure of the
2335 organization and its susceptibility to adversary threats against the remote access capability.
2336 Related Controls: AC-6, SC-12, SC-13.
2337 (5) REMOTE ACCESS | MONITORING FOR UNAUTHORIZED CONNECTIONS
2338 [Withdrawn: Incorporated into SI-4.]
2339 (6) REMOTE ACCESS | PROTECTION OF MECHANISM INFORMATION
2340 Protect information about remote access mechanisms from unauthorized use and
2341 disclosure.
2342 Discussion: Remote access to organizational information by nonorganizational entities can
2343 increase the risk of unauthorized use and disclosure of information about remote access mechanisms. The
2344 organization considers including remote access requirements in the information exchange
2345 agreements with other organizations, as applicable. Remote access requirements can also be
2346 included in rules of behavior (see PL-4) and access agreements (see PS-6).
2347 Related Controls: AT-2, AT-3, PS-6.
2348 (7) REMOTE ACCESS | ADDITIONAL PROTECTION FOR SECURITY FUNCTION ACCESS
2349 [Withdrawn: Incorporated into AC-3(10).]
2350 (8) REMOTE ACCESS | DISABLE NONSECURE NETWORK PROTOCOLS
2351 [Withdrawn: Incorporated into CM-7.]
2352 (9) REMOTE ACCESS | DISCONNECT OR DISABLE ACCESS
2353 Provide the capability to disconnect or disable remote access to the system within
2354 [Assignment: organization-defined time-period].
2355 Discussion: This control enhancement requires organizations to have the capability to
2356 rapidly disconnect current users remotely accessing the system or disable further remote
2357 access. The speed of disconnect or disablement varies based on the criticality of missions or
2358 business functions and the need to eliminate immediate or future remote access to systems.
2359 Related Controls: None.
2448 critical software updates and patches, conducting primary operating system (and possibly other
2449 resident software) integrity checks, and disabling unnecessary hardware.
2450 Usage restrictions and authorization to connect may vary among organizational systems. For
2451 example, the organization may authorize the connection of mobile devices to the organizational
2452 network and impose a set of usage restrictions while a system owner may withhold authorization
2453 for mobile device connection to specific applications or may impose additional usage restrictions
2454 before allowing mobile device connections to a system. The need to provide adequate security
2455 for mobile devices goes beyond the requirements in this control. Many controls for mobile
2456 devices are reflected in other controls allocated to the initial control baselines as starting points
2457 for the development of security plans and overlays using the tailoring process. There may also be
2458 some overlap among the security controls within the different families of controls. AC-20 addresses
2459 mobile devices that are not organization-controlled.
2460 Related Controls: AC-3, AC-4, AC-7, AC-11, AC-17, AC-18, AC-20, CA-9, CM-2, CM-6, IA-2, IA-3,
2461 MP-2, MP-4, MP-5, MP-7, PL-4, SC-7, SC-34, SC-43, SI-3, SI-4.
2462 Control Enhancements:
2463 (1) ACCESS CONTROL FOR MOBILE DEVICES | USE OF WRITABLE AND PORTABLE STORAGE DEVICES
2464 [Withdrawn: Incorporated into MP-7.]
2465 (2) ACCESS CONTROL FOR MOBILE DEVICES | USE OF PERSONALLY OWNED PORTABLE STORAGE DEVICES
2466 [Withdrawn: Incorporated into MP-7.]
2467 (3) ACCESS CONTROL FOR MOBILE DEVICES | USE OF PORTABLE STORAGE DEVICES WITH NO
2468 IDENTIFIABLE OWNER
2469 [Withdrawn: Incorporated into MP-7.]
2470 (4) ACCESS CONTROL FOR MOBILE DEVICES | RESTRICTIONS FOR CLASSIFIED INFORMATION
2471 (a) Prohibit the use of unclassified mobile devices in facilities containing systems
2472 processing, storing, or transmitting classified information unless specifically permitted
2473 by the authorizing official; and
2474 (b) Enforce the following restrictions on individuals permitted by the authorizing official
2475 to use unclassified mobile devices in facilities containing systems processing, storing,
2476 or transmitting classified information:
2477 (1) Connection of unclassified mobile devices to classified systems is prohibited;
2478 (2) Connection of unclassified mobile devices to unclassified systems requires
2479 approval from the authorizing official;
2480 (3) Use of internal or external modems or wireless interfaces within the unclassified
2481 mobile devices is prohibited; and
2482 (4) Unclassified mobile devices and the information stored on those devices are
2483 subject to random reviews and inspections by [Assignment: organization-defined
2484 security officials], and if classified information is found, the incident handling
2485 policy is followed.
2486 (c) Restrict the connection of classified mobile devices to classified systems in accordance
2487 with [Assignment: organization-defined security policies].
2488 Discussion: None.
2489 Related Controls: CM-8, IR-4.
2490 (5) ACCESS CONTROL FOR MOBILE DEVICES | FULL DEVICE AND CONTAINER-BASED ENCRYPTION
2491 Employ [Selection: full-device encryption; container-based encryption] to protect the
2492 confidentiality and integrity of information on [Assignment: organization-defined mobile
2493 devices].
2494 Discussion: Container-based encryption provides a more fine-grained approach to data and
2495 information encryption on mobile devices, including encrypting selected data structures
2496 such as files, records, or fields.
2497 Related Controls: SC-13, SC-28.
2498 References: [SP 800-114]; [SP 800-124].
2583 (5) USE OF EXTERNAL SYSTEMS | PORTABLE STORAGE DEVICES — PROHIBITED USE
2584 Prohibit the use of organization-controlled portable storage devices by authorized
2585 individuals on external systems.
2586 Discussion: Limits on the use of organization-controlled portable storage devices in external
2587 systems include a complete prohibition of the use of such devices.
2588 Related Controls: MP-7, SC-41.
2589 (6) USE OF EXTERNAL SYSTEMS | NON-ORGANIZATIONALLY OWNED SYSTEMS — PROHIBITED USE
2590 Prohibit the use of non-organizationally owned systems or system components to process,
2591 store, or transmit organizational information.
2592 Discussion: Non-organizationally owned systems or system components include systems or
2593 system components owned by other organizations and personally owned devices. There are
2594 potential risks to using non-organizationally owned systems or system components. In some
2595 cases, the risk is sufficiently high as to prohibit such use. In other cases, the use of such
2596 systems or system components may be allowed but restricted in some way (see AC-20(4)).
2597 Related Controls: None.
2598 References: [FIPS 199]; [SP 800-171]; [SP 800-171B].
2626 Discussion: Information search and retrieval services identify information system resources
2627 relevant to an information need.
2628 Related Controls: None.
2629 References: [OMB A-130]; [SP 800-150]; [IR 8062].
2670 monitoring for organizational information that may have been mined or otherwise obtained from
2671 data stores and is available as open source information residing on external sites, for example,
2672 through social networking or social media websites.
2673 [EO 13587] requires the establishment of an insider threat program for deterring, detecting, and
2674 mitigating insider threats, including the safeguarding of sensitive information from exploitation,
2675 compromise, or other unauthorized disclosure. This control requires organizations to identify
2676 appropriate techniques to prevent and detect unnecessary or unauthorized data mining, which
2677 can be used by an insider to collect organizational information for the purpose of exfiltration.
2678 Related Controls: PM-12, PT-2.
2679 Control Enhancements: None.
2680 References: [EO 13587].
2715 situations, user identification information is simply not needed for access control decisions
2716 and, especially in the case of distributed systems, transmitting such information with the
2717 needed degree of assurance may be very expensive or difficult to accomplish. MAC, RBAC,
2718 ABAC, and label-based control policies, for example, might not include user identity as an
2719 attribute.
2720 Related Controls: None.
2721 References: [SP 800-162]; [SP 800-178].
2787 1. As part of initial training for new users and [Assignment: organization-defined
2788 frequency] thereafter; and
2789 2. When required by system changes; and
2790 b. Update awareness training [Assignment: organization-defined frequency].
2791 Discussion: Organizations provide foundational and advanced levels of awareness training to
2792 system users, including measures to test the knowledge level of users. Organizations determine
2793 the content of awareness training based on specific organizational requirements, the systems to
2794 which personnel have authorized access, and work environments (e.g., telework). The content
2795 includes an understanding of the need for security and privacy and actions by users to maintain
2796 security and personal privacy and to respond to suspected incidents. The content addresses the
2797 need for operations security and the handling of personally identifiable information.
2798 Awareness techniques include displaying posters, offering supplies inscribed with security and
2799 privacy reminders, displaying logon screen messages, generating email advisories or notices from
2800 organizational officials, and conducting awareness events. Awareness training after the initial
2801 training described in AT-2a.1 is conducted at a minimum frequency consistent with applicable
2802 laws, directives, regulations, and policies. Subsequent awareness training may be satisfied by one
2803 or more short ad hoc sessions and can include topical information on recent attack schemes; changes
2804 to organizational security and privacy policies; revised security and privacy expectations; or a
2805 subset of topics from the initial training. Updating awareness training on a regular basis helps to
2806 ensure the content remains relevant and effective.
2807 Related Controls: AC-3, AC-17, AC-22, AT-3, AT-4, CP-3, IA-4, IR-2, IR-7, IR-9, PA-2, PL-4, PM-13,
2808 PM-21, PS-7, PT-2, SA-8, SA-16.
2809 Control Enhancements:
2810 (1) AWARENESS TRAINING | PRACTICAL EXERCISES
2811 Provide practical exercises in awareness training that simulate events and incidents.
2812 Discussion: Practical exercises include no-notice social engineering attempts to collect
2813 information, gain unauthorized access, or simulate the adverse impact of opening malicious
2814 email attachments or of invoking, via spear phishing attacks, malicious web links.
2815 Related Controls: CA-2, CA-7, CP-4, IR-3.
2816 (2) AWARENESS TRAINING | INSIDER THREAT
2817 Provide awareness training on recognizing and reporting potential indicators of insider
2818 threat.
2819 Discussion: Potential indicators and possible precursors of insider threat can include
2820 behaviors such as inordinate, long-term job dissatisfaction; attempts to gain access to
2821 information not required for job performance; unexplained access to financial resources;
2822 bullying or sexual harassment of fellow employees; workplace violence; and other serious
2823 violations of policies, procedures, directives, regulations, rules, or practices. Awareness
2824 training includes how to communicate concerns of employees and management regarding
2825 potential indicators of insider threat through channels established by the organization and in
2826 accordance with established policies and procedures. Organizations may consider tailoring
2827 insider threat awareness topics to the role. For example, training for managers may be
2828 focused on changes in behavior of team members, while training for employees may be
2829 focused on more general observations.
2830 Related Controls: PM-12.
2879 removable systems in non-secure settings, and the potential targeting of individuals at
2880 home.
2881 Related Controls: None.
2882 (7) AWARENESS TRAINING | CYBER THREAT ENVIRONMENT
2883 (a) Provide awareness training on the cyber threat environment; and
2884 (b) Reflect current cyber threat information in system operations.
2885 Discussion: Since threats continue to change over time, the threat awareness training by the
2886 organization is dynamic. Moreover, threat awareness training is not performed in isolation
2887 from the system operations that support organizational missions and business functions.
2888 Related Controls: RA-3.
2889 (8) AWARENESS TRAINING | TRAINING FEEDBACK
2890 Provide feedback on organizational training results to the following personnel
2891 [Assignment: organization-defined frequency]: [Assignment: organization-defined
2892 personnel].
2893 Discussion: Training feedback includes awareness training results and role-based training
2894 results. Training results, especially failures of personnel in critical roles, can be indicative of a
2895 potentially serious problem. Therefore, it is important that senior managers are made aware
2896 of such situations so that they can take appropriate response actions. Training feedback
2897 supports the assessment and update of organization training described in AT-2b.
2898 Related Controls: None.
2899 References: [OMB A-130]; [SP 800-50]; [SP 800-160 v2].
2923 related to operations and supply chain security within the context of organizational security and
2924 privacy programs. Role-based training also applies to contractors providing services to federal
2925 agencies. Types of training include web-based and computer-based training, classroom-style
2926 training, and hands-on training (including micro-training). Updating role-based training on a
2927 regular basis helps to ensure the content remains relevant and effective.
2928 Related Controls: AC-3, AC-17, AC-22, AT-2, AT-4, CP-3, IR-2, IR-7, IR-9, IR-10, PL-4, PM-13, PM-
2929 23, PS-7, SA-3, SA-8, SA-11, SA-16, SR-5, SR-6, SR-11.
2930 Control Enhancements:
2931 (1) ROLE-BASED TRAINING | ENVIRONMENTAL CONTROLS
2932 Provide [Assignment: organization-defined personnel or roles] with initial and
2933 [Assignment: organization-defined frequency] training in the employment and operation
2934 of environmental controls.
2935 Discussion: Environmental controls include fire suppression and detection devices or
2936 systems, sprinkler systems, handheld fire extinguishers, fixed fire hoses, smoke detectors,
2937 temperature or humidity, heating, ventilation, and air conditioning, and power within the
2938 facility.
2939 Related Controls: PE-1, PE-11, PE-13, PE-14, PE-15.
2940 (2) ROLE-BASED TRAINING | PHYSICAL SECURITY CONTROLS
2941 Provide [Assignment: organization-defined personnel or roles] with initial and
2942 [Assignment: organization-defined frequency] training in the employment and operation
2943 of physical security controls.
2944 Discussion: Physical security controls include physical access control devices, physical
2945 intrusion and detection alarms, operating procedures for facility security guards, and
2946 monitoring or surveillance equipment.
2947 Related Controls: PE-2, PE-3, PE-4.
2948 (3) ROLE-BASED TRAINING | PRACTICAL EXERCISES
2949 Provide practical exercises in security and privacy training that reinforce training
2950 objectives.
2951 Discussion: Practical exercises for security include training for software developers that
2952 addresses simulated attacks exploiting common software vulnerabilities or spear or whale
2953 phishing attacks targeted at senior leaders or executives. Practical exercises for privacy
2954 include modules with quizzes on handling personally identifiable information in various
2955 scenarios, or scenarios on conducting privacy impact assessments.
2956 Related Controls: None.
2957 (4) ROLE-BASED TRAINING | SUSPICIOUS COMMUNICATIONS AND ANOMALOUS SYSTEM BEHAVIOR
2958 [Withdrawn: Moved to AT-2(4)].
2959 (5) ROLE-BASED TRAINING | ACCESSING PERSONALLY IDENTIFIABLE INFORMATION
2960 Provide [Assignment: organization-defined personnel or roles] with initial and
2961 [Assignment: organization-defined frequency] training on:
2962 (a) Organizational authority for collecting personally identifiable information;
2963 (b) Authorized uses of personally identifiable information;
2964 (c) Identifying, reporting, and responding to a suspected or confirmed breach;
2965 (d) Content of system of records notices, computer matching agreements, and privacy
2966 impact assessments;
2967 (e) Authorized sharing of personally identifiable information with external parties; and
2968 (f) Rules of behavior and the consequences for unauthorized collection, use, or sharing of
2969 personally identifiable information.
2970 Discussion: Role-based training addresses the responsibility of individuals when accessing
2971 personally identifiable information; the organization’s established rules of behavior when
2972 accessing personally identifiable information; the consequences for violating the rules of
2973 behavior; and how to respond to a breach. Role-based training helps ensure personnel
2974 comply with applicable privacy requirements and is necessary to manage privacy risks.
2975 Related Controls: None.
2976 References: [OMB A-130]; [SP 800-50].
3031 b. Coordinate the event logging function with other organizational entities requiring audit-
3032 related information to guide and inform the selection criteria for events to be logged;
3033 c. Specify the following event types for logging within the system: [Assignment: organization-
3034 defined event types (subset of the event types defined in AU-2 a.) along with the frequency of
3035 (or situation requiring) logging for each identified event type];
3036 d. Provide a rationale for why the event types selected for logging are deemed to be adequate
3037 to support after-the-fact investigations of incidents; and
3038 e. Review and update the event types selected for logging [Assignment: organization-defined
3039 frequency].
3040 Discussion: An event is an observable occurrence in a system. The types of events that require
3041 logging are those events that are significant and relevant to the security of systems and the
3042 privacy of individuals. Event logging also supports specific monitoring and auditing needs. Event
3043 types include password changes; failed logons or failed accesses related to systems; security or
3044 privacy attribute changes; administrative privilege usage; PIV credential usage; data action
3045 changes; query parameters; or external credential usage. In determining the set of event types
3046 that require logging, organizations consider the monitoring and auditing appropriate for each of
3047 the controls to be implemented. For completeness, event logging includes all protocols that are
3048 operational and supported by the system.
3049 To balance monitoring and auditing requirements with other system needs, this control also
3050 requires identifying the subset of event types that are logged at a given point in time. For
3051 example, organizations may determine that systems need the capability to log every file access,
3052 both successful and unsuccessful, but not activate that capability except under specific circumstances due
3053 to the potential burden on system performance. The types of events that organizations desire to
3054 be logged may change. Reviewing and updating the set of logged events is necessary to help
3055 ensure that the events remain relevant and continue to support the needs of the organization.
3056 Organizations consider how the types of logging events can reveal information about individuals
3057 that may give rise to privacy risk and how best to mitigate such risks. For example, there is the
3058 potential for personally identifiable information in the audit trail, especially if the logging event is
3059 based on patterns or time of usage.
3060 Event logging requirements, including the need to log specific event types, may be referenced in
3061 other controls and control enhancements. These include AC-2(4), AC-3(10), AC-6(9), AC-16(11),
3062 AC-17(1), CM-3.f, CM-5(1), IA-3(3.b), MA-4(1), MP-4(2), PE-3, PM-21, PT-8, RA-8, SC-7(9), SC-
3063 7(15), SI-3(8), SI-4(22), SI-7(8), and SI-10(1). Organizations include event types that are required
3064 by applicable laws, executive orders, directives, policies, regulations, standards, and guidelines.
3065 Audit records can be generated at various levels, including at the packet level as information
3066 traverses the network. Selecting the appropriate level of event logging is an important part of a
3067 monitoring and auditing capability and can identify the root causes of problems. Organizations
3068 consider, in the definition of event types, the logging necessary to cover related event types such
3069 as the steps in distributed, transaction-based processes and the actions that occur in service-
3070 oriented architectures.
3071 Related Controls: AC-2, AC-3, AC-6, AC-7, AC-8, AC-16, AC-17, AU-3, AU-4, AU-5, AU-6, AU-7, AU-
3072 11, AU-12, CM-3, CM-5, CM-6, CM-13, IA-3, MA-4, MP-4, PE-3, PM-21, PT-2, PT-8, RA-8, SA-8, SC-
3073 7, SC-18, SI-3, SI-4, SI-7, SI-10, SI-11.
3074 Control Enhancements:
3075 (1) EVENT LOGGING | COMPILATION OF AUDIT RECORDS FROM MULTIPLE SOURCES
3076 [Withdrawn: Incorporated into AU-12.]
3118 Discussion: Centralized management of planned audit record content requires that the
3119 content to be captured in audit records be configured from a central location (necessitating
3120 an automated capability). Organizations coordinate the selection of the required audit
3121 record content to support the centralized management and configuration capability
3122 provided by the system.
3123 Related Controls: AU-6, AU-7.
3124 (3) CONTENT OF AUDIT RECORDS | LIMIT PERSONALLY IDENTIFIABLE INFORMATION ELEMENTS
3125 Limit personally identifiable information contained in audit records to the following
3126 elements identified in the privacy risk assessment: [Assignment: organization-defined
3127 elements].
3128 Discussion: Limiting personally identifiable information in audit records when such
3129 information is not needed for operational purposes helps reduce the level of privacy risk
3130 created by a system.
3131 Related Controls: RA-3.
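One way an audit pipeline might realize this enhancement is to remove personally identifiable information fields from records before they are written, retaining only the elements identified in the privacy risk assessment. The following sketch is illustrative only; the field names and the allowed-element list are hypothetical assumptions, not values defined by this control.

    # Hypothetical set of PII elements that the privacy risk assessment
    # permits to appear in audit records (organization-defined elements).
    ALLOWED_PII_ELEMENTS = {"user_id"}

    # Hypothetical field names treated as PII in this example pipeline.
    PII_FIELDS = {"user_id", "full_name", "email", "home_address"}

    def limit_pii(record: dict) -> dict:
        """Return a copy of the audit record with disallowed PII elements removed."""
        return {
            key: value
            for key, value in record.items()
            if key not in PII_FIELDS or key in ALLOWED_PII_ELEMENTS
        }

    record = {"event": "failed_logon", "user_id": "jdoe", "email": "jdoe@example.gov"}
    print(limit_pii(record))  # {'event': 'failed_logon', 'user_id': 'jdoe'}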
3132 References: [OMB A-130]; [IR 8062].
3161 b. Take the following additional actions: [Assignment: organization-defined additional actions].
3162 Discussion: Audit logging process failures include, for example, software and hardware errors;
3163 reaching or exceeding audit log storage capacity; and failures in audit log capturing mechanisms.
3164 Organization-defined actions include overwriting the oldest audit records; shutting down the system;
3165 and stopping the generation of audit records. Organizations may choose to define additional
3166 actions for audit logging process failures based on the type of failure, the location of the failure,
3167 the severity of the failure, or a combination of such factors. When the audit logging process
3168 failure is related to storage, the response is carried out for the audit log storage repository (i.e.,
3169 the distinct system component where the audit logs are stored); the system on which the audit
3170 logs reside; the total audit log storage capacity of the organization (i.e., all audit log storage
3171 repositories combined); or all three. Organizations may decide to take no additional actions after
3172 alerting designated roles or personnel.
3173 Related Controls: AU-2, AU-4, AU-7, AU-9, AU-11, AU-12, AU-14, SI-4, SI-12.
3174 Control Enhancements:
3175 (1) RESPONSE TO AUDIT LOGGING PROCESS FAILURES | STORAGE CAPACITY WARNING
3176 Provide a warning to [Assignment: organization-defined personnel, roles, and/or locations]
3177 within [Assignment: organization-defined time-period] when allocated audit log storage
3178 volume reaches [Assignment: organization-defined percentage] of repository maximum
3179 audit log storage capacity.
3180 Discussion: Organizations may have multiple audit log storage repositories distributed
3181 across multiple system components, with each repository having different storage volume
3182 capacities.
3183 Related Controls: None.
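As a minimal sketch of a capacity-warning check, the following Python example measures the utilization of an assumed audit log repository and flags when the organization-defined percentage is reached. The repository path and the threshold are hypothetical; delivery of the warning to defined personnel, roles, or locations would be handled by the organization's own alerting mechanism.

    import shutil

    # Hypothetical organization-defined values.
    WARNING_THRESHOLD_PERCENT = 80          # percentage of maximum audit log storage capacity
    AUDIT_LOG_PATH = "/var/log/audit"       # assumed location of the audit log repository

    def storage_warning_needed(path: str = AUDIT_LOG_PATH) -> bool:
        """Return True when allocated audit log storage volume reaches the threshold."""
        usage = shutil.disk_usage(path)
        percent_used = usage.used / usage.total * 100
        return percent_used >= WARNING_THRESHOLD_PERCENT

    if storage_warning_needed():
        # The warning itself (e.g., a message to organization-defined personnel or
        # roles within the defined time period) is issued by the alerting system.
        print("WARNING: audit log storage has reached the defined capacity threshold")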
3184 (2) RESPONSE TO AUDIT LOGGING PROCESS FAILURES | REAL-TIME ALERTS
3185 Provide an alert within [Assignment: organization-defined real-time-period] to
3186 [Assignment: organization-defined personnel, roles, and/or locations] when the following
3187 audit failure events occur: [Assignment: organization-defined audit logging failure events
3188 requiring real-time alerts].
3189 Discussion: Alerts provide organizations with urgent messages. Real-time alerts provide
3190 these messages at information technology speed (i.e., the time from event detection to alert
3191 occurs in seconds or less).
3192 Related Controls: None.
3193 (3) RESPONSE TO AUDIT LOGGING PROCESS FAILURES | CONFIGURABLE TRAFFIC VOLUME THRESHOLDS
3194 Enforce configurable network communications traffic volume thresholds reflecting limits
3195 on audit log storage capacity and [Selection: reject; delay] network traffic above those
3196 thresholds.
3197 Discussion: Organizations have the capability to reject or delay the processing of network
3198 communications traffic if audit logging information about such traffic is determined to
3199 exceed the storage capacity of the system audit logging function. The rejection or delay
3200 response is triggered by the established organizational traffic volume thresholds that can be
3201 adjusted based on changes to audit log storage capacity.
3202 Related Controls: None.
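The following sketch shows one way the reject-or-delay selection could be applied to traffic whose volume exceeds a configured threshold. The threshold value, the selected response, and the one-second delay are hypothetical assumptions; an operational implementation would enforce the threshold in the network stack or boundary device rather than in application code.

    import time
    from typing import Optional

    TRAFFIC_THRESHOLD_BPS = 10_000_000   # hypothetical organization-defined threshold (bytes/second)
    RESPONSE = "delay"                   # Selection: "reject" or "delay"

    def handle_traffic(observed_bps: float, payload: bytes) -> Optional[bytes]:
        """Reject or delay network traffic when the observed volume exceeds the threshold."""
        if observed_bps <= TRAFFIC_THRESHOLD_BPS:
            return payload               # below the threshold: process normally
        if RESPONSE == "reject":
            return None                  # reject traffic above the threshold
        time.sleep(1.0)                  # delay processing of traffic above the threshold
        return payload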
3203 (4) RESPONSE TO AUDIT LOGGING PROCESS FAILURES | SHUTDOWN ON FAILURE
3204 Invoke a [Selection: full system shutdown; partial system shutdown; degraded operational
3205 mode with limited mission or business functionality available] in the event of [Assignment:
3206 organization-defined audit logging failures], unless an alternate audit logging capability
3207 exists.
3208 Discussion: Organizations determine the types of audit logging failures that can trigger
3209 automatic system shutdowns or degraded operations. Because of the importance of
3210 ensuring mission and business continuity, organizations may determine that the nature of
3211 the audit logging failure is not so severe that it warrants a complete shutdown of the system
3212 supporting the core organizational missions and business operations. In those instances,
3213 partial system shutdowns or operating in a degraded mode with reduced capability may be
3214 viable alternatives.
3215 Related Controls: AU-15.
3216 (5) RESPONSE TO AUDIT LOGGING PROCESS FAILURES | ALTERNATE AUDIT LOGGING CAPABILITY
3217 Provide an alternate audit logging capability in the event of a failure in primary audit
3218 logging capability that implements [Assignment: organization-defined alternate audit
3219 logging functionality].
3220 Discussion: Since an alternate audit logging capability may be a short-term protection
3221 solution employed until the failure in the primary audit logging capability is corrected,
3222 organizations may determine that the alternate audit logging capability need only provide a
3223 subset of the primary audit logging functionality that is impacted by the failure.
3224 Related Controls: AU-9.
3225 References: None.
3291 (6) AUDIT RECORD REVIEW, ANALYSIS, AND REPORTING | CORRELATION WITH PHYSICAL MONITORING
3292 Correlate information from audit records with information obtained from monitoring
3293 physical access to further enhance the ability to identify suspicious, inappropriate,
3294 unusual, or malevolent activity.
3295 Discussion: The correlation of physical audit record information and the audit records from
3296 systems may assist organizations in identifying suspicious behavior or supporting evidence of
3297 such behavior. For example, the correlation of an individual’s identity for logical access to
3298 certain systems with the additional physical security information that the individual was
3299 present at the facility when the logical access occurred may be useful in investigations.
3300 Related Controls: None.
3301 (7) AUDIT RECORD REVIEW, ANALYSIS, AND REPORTING | PERMITTED ACTIONS
3302 Specify the permitted actions for each [Selection (one or more): system process; role; user]
3303 associated with the review, analysis, and reporting of audit record information.
3304 Discussion: Organizations specify permitted actions for system processes, roles, and users
3305 associated with the review, analysis, and reporting of audit records through system account
3306 management activities. Specifying permitted actions on audit record information is a way to
3307 enforce the principle of least privilege. Permitted actions are enforced by the system and
3308 include read, write, execute, append, and delete.
3309 Related Controls: None.
3310 (8) AUDIT RECORD REVIEW, ANALYSIS, AND REPORTING | FULL TEXT ANALYSIS OF PRIVILEGED
3311 COMMANDS
3312 Perform a full text analysis of logged privileged commands in a physically distinct
3313 component or subsystem of the system, or other system that is dedicated to that analysis.
3314 Discussion: Full text analysis of privileged commands requires a distinct environment for the
3315 analysis of audit record information related to privileged users without compromising such
3316 information on the system where the users have elevated privileges, including the capability
3317 to execute privileged commands. Full text analysis refers to analysis that considers the full
3318 text of privileged commands (i.e., commands and parameters) as opposed to analysis that
3319 considers only the name of the command. Full text analysis includes the use of pattern
3320 matching and heuristics.
3321 Related Controls: AU-3, AU-9, AU-11, AU-12.
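The sketch below illustrates full text analysis by applying pattern matching to the complete command line (command name and parameters) of logged privileged commands. The patterns and sample log lines are hypothetical assumptions for illustration; in practice, the analysis runs on a physically distinct component or dedicated system, as the enhancement requires.

    import re

    # Hypothetical patterns of concern applied to the full text of privileged
    # commands (names and parameters), not just the command name.
    SUSPICIOUS_PATTERNS = [
        re.compile(r"\brm\s+-rf\s+/"),          # recursive forced deletion
        re.compile(r"\buseradd\b.*--uid\s+0"),  # creation of an additional UID 0 account
        re.compile(r"\bchmod\s+777\b"),         # world-writable permission changes
    ]

    def analyze_privileged_commands(log_lines):
        """Return logged privileged commands whose full text matches a pattern of concern."""
        findings = []
        for line in log_lines:
            if any(pattern.search(line) for pattern in SUSPICIOUS_PATTERNS):
                findings.append(line)
        return findings

    sample = ["sudo rm -rf /tmp/scratch", "sudo useradd shadowadmin --uid 0"]
    print(analyze_privileged_commands(sample))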
3322 (9) AUDIT RECORD REVIEW, ANALYSIS, AND REPORTING | CORRELATION WITH INFORMATION FROM
3323 NONTECHNICAL SOURCES
3324 Correlate information from nontechnical sources with audit record information to enhance
3325 organization-wide situational awareness.
3326 Discussion: Nontechnical sources include records documenting organizational policy
3327 violations related to sexual harassment incidents and the improper use of information
3328 assets. Such information can lead to a directed analytical effort to detect potential malicious
3329 insider activity. Organizations limit access to information that is available from nontechnical
3330 sources due to its sensitive nature. Limited access minimizes the potential for inadvertent
3331 release of privacy-related information to individuals who do not have a need to know. Thus,
3332 the correlation of information from nontechnical sources with audit record information
3333 generally occurs only when individuals are suspected of being involved in an incident.
3334 Organizations obtain legal advice prior to initiating such actions.
3335 Related Controls: PM-12.
3336 (10) AUDIT RECORD REVIEW, ANALYSIS, AND REPORTING | AUDIT LEVEL ADJUSTMENT
3337 [Withdrawn: Incorporated into AU-6.]
3381 different time granularities for different system components. Time service can be critical to other
3382 security capabilities such as access control and identification and authentication, depending on
3383 the nature of the mechanisms used to support those capabilities.
3384 Related Controls: AU-3, AU-12, AU-14, SC-45.
3385 Control Enhancements:
3386 (1) TIME STAMPS | SYNCHRONIZATION WITH AUTHORITATIVE TIME SOURCE
3387 (a) Compare the internal system clocks [Assignment: organization-defined frequency]
3388 with [Assignment: organization-defined authoritative time source]; and
3389 (b) Synchronize the internal system clocks to the authoritative time source when the time
3390 difference is greater than [Assignment: organization-defined time-period].
3391 Discussion: Synchronization of internal system clocks with an authoritative source provides
3392 uniformity of time stamps for systems with multiple system clocks and systems connected
3393 over a network.
3394 Related Controls: None.
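A minimal sketch of the comparison and conditional resynchronization follows, assuming the third-party ntplib package is available to query the authoritative source and that the host uses chrony for time service. The time source hostname, the drift threshold, and the use of "chronyc makestep" are hypothetical assumptions; the actual synchronization mechanism is platform- and deployment-specific.

    import subprocess
    import ntplib  # third-party package (pip install ntplib); assumed available

    # Hypothetical organization-defined values.
    AUTHORITATIVE_TIME_SOURCE = "time.example.gov"   # placeholder authoritative time source
    MAX_DRIFT_SECONDS = 1.0                          # organization-defined time period

    def check_and_synchronize() -> float:
        """Compare the internal clock with the authoritative source; resync on excess drift."""
        response = ntplib.NTPClient().request(AUTHORITATIVE_TIME_SOURCE, version=3)
        if abs(response.offset) > MAX_DRIFT_SECONDS:
            # One possible resynchronization step on Linux hosts running chrony.
            subprocess.run(["chronyc", "makestep"], check=False)
        return response.offset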
3395 (2) TIME STAMPS | SECONDARY AUTHORITATIVE TIME SOURCE
3396 (a) Identify a secondary authoritative time source that is in a different geographic region
3397 than the primary authoritative time source; and
3398 (b) Synchronize the internal system clocks to the secondary authoritative time source if
3399 the primary authoritative time source is unavailable.
3400 Discussion: It may be necessary to employ geolocation information to determine that the
3401 secondary authoritative time source is in a different geographic region.
3402 Related Controls: None.
3403 References: [IETF 5905].
3425 Recordable (DVD-R). In contrast, the use of switchable write-protection media, such as on
3426 tape cartridges or Universal Serial Bus (USB) drives, results in write-protected, but not write-
3427 once, media.
3428 Related Controls: AU-4, AU-5.
3429 (2) PROTECTION OF AUDIT INFORMATION | STORE ON SEPARATE PHYSICAL SYSTEMS OR COMPONENTS
3430 Store audit records [Assignment: organization-defined frequency] in a repository that is
3431 part of a physically different system or system component than the system or component
3432 being audited.
3433 Discussion: Storing audit records in a repository separate from the audited system or system
3434 component helps to ensure that a compromise of the system being audited does not also
3435 result in a compromise of the audit records. Storing audit records on separate physical
3436 systems or components also preserves the confidentiality and integrity of audit records and
3437 facilitates the management of audit records as an organization-wide activity. Storing audit
3438 records on separate systems or components applies to initial generation as well as backup or
3439 long-term storage of audit records.
3440 Related Controls: AU-4, AU-5.
3441 (3) PROTECTION OF AUDIT INFORMATION | CRYPTOGRAPHIC PROTECTION
3442 Implement cryptographic mechanisms to protect the integrity of audit information and
3443 audit tools.
3444 Discussion: Cryptographic mechanisms used for protecting the integrity of audit information
3445 include signed hash functions using asymmetric cryptography. This enables the distribution
3446 of the public key to verify the hash information while maintaining the confidentiality of the
3447 secret key used to generate the hash.
3448 Related Controls: AU-10, SC-12, SC-13.
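As an illustration of signing audit records with asymmetric cryptography, the following sketch uses the third-party cryptography package and an Ed25519 key pair (which signs a hash of the message internally). The key handling shown is deliberately simplified and the record content is hypothetical; in practice, the private key is generated, stored, and protected in accordance with SC-12 and SC-13, and only the public key is distributed for verification.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # Key generation would normally occur once; the private key is protected
    # separately from the audit repository.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    def sign_audit_record(record: bytes) -> bytes:
        """Produce a signature over the audit record using the private key."""
        return private_key.sign(record)

    def verify_audit_record(record: bytes, signature: bytes) -> bool:
        """Verify the integrity of the audit record using only the public key."""
        try:
            public_key.verify(signature, record)
            return True
        except InvalidSignature:
            return False

    record = b"2020-03-01T00:00:00Z failed_logon user=jdoe"
    sig = sign_audit_record(record)
    print(verify_audit_record(record, sig))           # True
    print(verify_audit_record(record + b"x", sig))    # False (record was modified)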
3449 (4) PROTECTION OF AUDIT INFORMATION | ACCESS BY SUBSET OF PRIVILEGED USERS
3450 Authorize access to management of audit logging functionality to only [Assignment:
3451 organization-defined subset of privileged users or roles].
3452 Discussion: Individuals or roles with privileged access to a system who are also the
3453 subject of an audit by that system may affect the reliability of the audit information by
3454 inhibiting audit activities or modifying audit records. Requiring privileged access to be
3455 further defined between audit-related privileges and other privileges limits the number of
3456 users or roles with audit-related privileges.
3457 Related Controls: AC-5.
3458 (5) PROTECTION OF AUDIT INFORMATION | DUAL AUTHORIZATION
3459 Enforce dual authorization for [Selection (one or more): movement; deletion] of
3460 [Assignment: organization-defined audit information].
3461 Discussion: Organizations may choose different selection options for different types of audit
3462 information. Dual authorization mechanisms (also known as two-person control) require the
3463 approval of two authorized individuals to execute audit functions. To reduce the risk of
3464 collusion, organizations consider rotating dual authorization duties to other individuals.
3465 Organizations do not require dual authorization mechanisms when immediate responses are
3466 necessary to ensure public and environmental safety.
3467 Related Controls: AC-3.
3468 (6) PROTECTION OF AUDIT INFORMATION | READ-ONLY ACCESS
3469 Authorize read-only access to audit information to [Assignment: organization-defined
3470 subset of privileged users or roles].
3471 Discussion: Restricting privileged user or role authorizations to read-only helps to limit the
3472 potential damage to organizations that could be initiated by such users or roles, for example,
3473 deleting audit records to cover up malicious activity.
3474 Related Controls: None.
3475 (7) PROTECTION OF AUDIT INFORMATION | STORE ON COMPONENT WITH DIFFERENT OPERATING
3476 SYSTEM
3477 Store audit information on a component running a different operating system than the
3478 system or component being audited.
3479 Discussion: Storing audit information on a system component running a different
3480 operating system reduces the risk of a vulnerability specific to the system resulting in a
3481 compromise of the audit records.
3482 Related Controls: AU-4, AU-5, AU-11, SC-29.
3483 References: [FIPS 140-3]; [FIPS 180-4]; [FIPS 202].
3514 Discussion: Validating the binding of the information producer identity to the information
3515 prevents the modification of information between production and review. The validation of
3516 bindings can be achieved, for example, using cryptographic checksums. Organizations
3517 determine if validations are in response to user requests or generated automatically.
3518 Related Controls: AC-3, AC-4, AC-16.
3519 (3) NON-REPUDIATION | CHAIN OF CUSTODY
3520 Maintain reviewer or releaser identity and credentials within the established chain of
3521 custody for information reviewed or released.
3522 Discussion: Chain of custody is a process that tracks the movement of evidence through its
3523 collection, safeguarding, and analysis life cycle by documenting each person who handled
3524 the evidence, the date and time it was collected or transferred, and the purpose for the
3525 transfer. If the reviewer is a human or if the review function is automated but separate from
3526 the release or transfer function, the system associates the identity of the reviewer of the
3527 information to be released with the information and the information label. In the case of
3528 human reviews, maintaining the identity and credentials of reviewers or releasers provides
3529 organizational officials the means to identify who reviewed and released the information. In
3530 the case of automated reviews, it ensures that only approved review functions are used.
3531 Related Controls: AC-4, AC-16.
3532 (4) NON-REPUDIATION | VALIDATE BINDING OF INFORMATION REVIEWER IDENTITY
3533 (a) Validate the binding of the information reviewer identity to the information at the
3534 transfer or release points prior to release or transfer between [Assignment:
3535 organization-defined security domains]; and
3536 (b) Perform [Assignment: organization-defined actions] in the event of a validation error.
3537 Discussion: Validating the binding of the information reviewer identity to the information at
3538 transfer or release points prevents the unauthorized modification of information between
3539 review and the transfer or release. The validation of bindings can be achieved by using
3540 cryptographic checksums. Organizations determine if validations are in response to user
3541 requests or generated automatically.
3542 Related Controls: AC-4, AC-16.
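One way to validate the binding of a reviewer identity to information at a transfer or release point is with a keyed checksum computed over the identity and the content, checked before release. The sketch below uses the Python standard library hmac module; the binding key, identifiers, and the block-and-log action on validation error are hypothetical assumptions, and key management itself is outside the scope of the example.

    import hmac
    import hashlib

    # Hypothetical key shared between the review function and the release point.
    BINDING_KEY = b"example-binding-key"

    def bind_reviewer(reviewer_id: str, information: bytes) -> bytes:
        """Compute a keyed checksum binding the reviewer identity to the information."""
        return hmac.new(BINDING_KEY, reviewer_id.encode() + b"|" + information,
                        hashlib.sha256).digest()

    def validate_binding(reviewer_id: str, information: bytes, checksum: bytes) -> bool:
        """Validate the binding at the transfer or release point prior to release."""
        expected = bind_reviewer(reviewer_id, information)
        return hmac.compare_digest(expected, checksum)

    def release(reviewer_id: str, information: bytes, checksum: bytes) -> bytes:
        if not validate_binding(reviewer_id, information, checksum):
            # Assumed organization-defined action on validation error: block and report.
            raise PermissionError("binding validation failed; release blocked")
        return information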
3543 (5) NON-REPUDIATION | DIGITAL SIGNATURES
3544 [Withdrawn: Incorporated into SI-7.]
3545 References: [FIPS 140-3]; [FIPS 180-4]; [FIPS 186-4]; [FIPS 202]; [SP 800-177].
3690 Discussion: Automatically initiating session audits at system startup helps to ensure that the
3691 information captured on selected individuals is complete and not subject to
3692 compromise through tampering by malicious threat actors.
3693 Related Controls: None.
3694 (2) SESSION AUDIT | CAPTURE AND RECORD CONTENT
3695 [Withdrawn: Incorporated into AU-14.]
3696 (3) SESSION AUDIT | REMOTE VIEWING AND LISTENING
3697 Provide and implement the capability for authorized users to remotely view and hear
3698 content related to an established user session in real time.
3699 Discussion: None.
3700 Related Controls: AC-17.
3701 References: None.
3733 organizations have appropriate knowledge to make such determinations, thus requiring the
3734 sharing of audit information among organizations.
3735 Related Controls: IR-4, SI-4.
3736 (3) CROSS-ORGANIZATIONAL AUDITING | DISASSOCIABILITY
3737 Implement [Assignment: organization-defined measures] to disassociate individuals from
3738 audit information transmitted across organizational boundaries.
3739 Discussion: Preserving identities in audit trails may not be operationally necessary and could
3740 have privacy ramifications, such as enabling the tracking and profiling of individuals.
3741 These risks could be further amplified when transmitting information across organizational
3742 boundaries. Using privacy-enhancing cryptographic techniques can disassociate individuals
3743 from audit information and reduce privacy risk while maintaining accountability.
3744 Related Controls: None.
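As one example of a disassociation measure, the sketch below replaces user identifiers with keyed pseudonyms before audit records are transmitted across an organizational boundary. The field name and key are hypothetical; because the key never leaves the originating organization, the receiving organization cannot reverse the pseudonyms, while the originator can still re-identify a pseudonym if accountability requires it.

    import hmac
    import hashlib

    # Hypothetical key held only by the originating organization.
    PSEUDONYM_KEY = b"example-pseudonym-key"

    def pseudonymize(user_id: str) -> str:
        """Replace a user identifier with a keyed pseudonym before transmission."""
        return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

    def disassociate(record: dict) -> dict:
        """Return an audit record with the individual's identifier disassociated."""
        sanitized = dict(record)
        if "user_id" in sanitized:
            sanitized["user_id"] = pseudonymize(sanitized["user_id"])
        return sanitized

    print(disassociate({"event": "file_access", "user_id": "jdoe"}))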
3745 References: None.
3835 assessments in accordance with organizational continuous monitoring strategies. External audits,
3836 including audits by external entities such as regulatory agencies, are outside the scope of this
3837 control.
3838 Related Controls: AC-20, CA-5, CA-6, CA-7, PM-9, RA-5, SA-11, SC-38, SI-3, SI-12, SR-2, SR-3.
3839 Control Enhancements:
3840 (1) ASSESSMENTS | INDEPENDENT ASSESSORS
3841 Employ independent assessors or assessment teams to conduct control assessments.
3842 Discussion: Independent assessors or assessment teams are individuals or groups
3843 conducting impartial assessments of systems. Impartiality means that assessors are free
3844 from any perceived or actual conflicts of interest regarding development, operation,
3845 sustainment, or management of the systems under assessment or the determination of
3846 control effectiveness. To achieve impartiality, assessors do not create a mutual or conflicting
3847 interest with the organizations where the assessments are being conducted; assess their
3848 own work; act as management or employees of the organizations they are serving; or place
3849 themselves in positions of advocacy for the organizations acquiring their services.
3850 Independent assessments can be obtained from elements within organizations or can be
3851 contracted to public or private sector entities outside of organizations. Authorizing officials
3852 determine the required level of independence based on the security categories of systems
3853 and/or the risk to organizational operations, organizational assets, or individuals. Authorizing
3854 officials also determine if the level of assessor independence provides sufficient assurance
3855 that the results are sound and can be used to make credible, risk-based decisions. Assessor
3856 independence determination also includes whether contracted assessment services have
3857 sufficient independence, for example, when system owners are not directly involved in
3858 contracting processes or cannot influence the impartiality of the assessors conducting the
3859 assessments. During the system design and development phase, the analogy to independent
3860 assessors is having independent subject matter experts (SMEs) involved in design reviews.
3861 When organizations that own the systems are small or the structures of the organizations
3862 require that assessments are conducted by individuals who are in the developmental,
3863 operational, or management chain of the system owners, independence in assessment
3864 processes can be achieved by ensuring that assessment results are carefully reviewed and
3865 analyzed by independent teams of experts to validate the completeness, accuracy, integrity,
3866 and reliability of the results. Assessments performed for purposes other than to support
3867 authorization decisions are more likely to be usable for such decisions when performed by
3868 assessors with sufficient independence, thereby reducing the need to repeat assessments.
3869 Related Controls: None.
3870 (2) ASSESSMENTS | SPECIALIZED ASSESSMENTS
3871 Include as part of control assessments, [Assignment: organization-defined frequency],
3872 [Selection: announced; unannounced], [Selection (one or more): in-depth monitoring;
3873 security instrumentation; automated security test cases; vulnerability scanning; malicious
3874 user testing; insider threat assessment; performance and load testing; data leakage or
3875 data loss assessment; [Assignment: organization-defined other forms of assessment]].
3876 Discussion: Organizations can conduct specialized assessments, including verification and
3877 validation, system monitoring, insider threat assessments, malicious user testing, and other
3878 forms of testing. These assessments can improve readiness by exercising organizational
3879 capabilities and indicating current levels of performance as a means of focusing actions to
3880 improve security and privacy. Organizations conduct specialized assessments in accordance
3881 with applicable laws, executive orders, directives, regulations, policies, standards, and
3882 guidelines. Authorizing officials approve the assessment methods in coordination with the
3883 organizational risk executive function. Organizations can incorporate vulnerabilities uncovered
3884 during assessments into vulnerability remediation processes. Specialized assessments can
3885 also be conducted early in the system development life cycle, for example, during design,
3886 development, and unit testing.
3887 Related Controls: PE-3, SI-2.
3888 (3) ASSESSMENTS | EXTERNAL ORGANIZATIONS
3889 Leverage the results of control assessments performed by [Assignment: organization-
3890 defined external organization] on [Assignment: organization-defined system] when the
3891 assessment meets [Assignment: organization-defined requirements].
3892 Discussion: Organizations may rely on control assessments of organizational systems by
3893 other (external) organizations. Using such assessments and reusing existing assessment
3894 evidence can decrease the time and resources required for assessments by limiting the
3895 independent assessment activities that organizations need to perform. The factors that
3896 organizations consider in determining whether to accept assessment results from external
3897 organizations can vary. Such factors include the organization’s past experience with the
3898 organization that conducted the assessment; the reputation of the assessment organization;
3899 the level of detail of supporting assessment evidence provided; and mandates imposed by
3900 applicable laws, executive orders, directives, regulations, policies, standards, and guidelines.
3901 Accredited testing laboratories supporting the Common Criteria Program [ISO 15408-1], the
3902 NIST Cryptographic Module Validation Program (CMVP), or the NIST Cryptographic Algorithm
3903 Validation Program (CAVP) can provide independent assessment results that organizations
3904 can leverage.
3905 Related Controls: SA-4.
3906 References: [OMB A-130]; [FIPS 199]; [SP 800-18]; [SP 800-37]; [SP 800-39]; [SP 800-53A]; [SP
3907 800-115]; [SP 800-137]; [IR 8062].
3930 Authorizing officials determine the risk associated with system information exchange and the
3931 controls needed for appropriate risk mitigation. The type of agreement selected is based on
3932 factors such as the impact level of the information being exchanged, the relationship between
3933 the organizations exchanging information (e.g., government to government, government to
3934 business, business to business, government or business to service provider, government or
3935 business to individual), or the level of access to the organizational system by users of the other
3936 system. If systems that exchange information have the same authorizing official, organizations
3937 need not develop agreements. Instead, the interface characteristics between the systems (e.g.,
3938 how the information is being exchanged; how the information is protected) are described in the
3939 respective security and privacy plans. If the systems that exchange information have different
3940 authorizing officials within the same organization, the organizations can develop agreements, or
3941 they can provide the same information that would be provided in the appropriate agreement
3942 type from CA-3a in the respective security and privacy plans for the systems. Organizations may
3943 incorporate agreement information into formal contracts, especially for information exchanges
3944 established between federal agencies and nonfederal organizations (including service providers,
3945 contractors, system developers, and system integrators). Risk considerations include systems
3946 sharing the same networks.
3947 Related Controls: AC-4, AC-20, AU-16, CA-6, IA-3, IR-4, PL-2, PT-8, RA-3, SA-9, SC-7, SI-12.
3948 Control Enhancements:
3949 (1) SYSTEM CONNECTIONS | UNCLASSIFIED NATIONAL SECURITY SYSTEM CONNECTIONS
3950 [Withdrawn: Moved to SC-7(25).]
3951 (2) SYSTEM CONNECTIONS | CLASSIFIED NATIONAL SECURITY SYSTEM CONNECTIONS
3952 [Withdrawn: Moved to SC-7(26).]
3953 (3) SYSTEM CONNECTIONS | UNCLASSIFIED NON-NATIONAL SECURITY SYSTEM CONNECTIONS
3954 [Withdrawn: Moved to SC-7(27).]
3955 (4) SYSTEM CONNECTIONS | CONNECTIONS TO PUBLIC NETWORKS
3956 [Withdrawn: Moved to SC-7(28).]
3957 (5) SYSTEM CONNECTIONS | RESTRICTIONS ON EXTERNAL SYSTEM CONNECTIONS
3958 [Withdrawn: Moved to SC-7(5).]
3959 (6) INFORMATION EXCHANGE | TRANSFER AUTHORIZATIONS
3960 Verify that individuals or systems transferring data between interconnecting systems have
3961 the requisite authorizations (i.e., write permissions or privileges) prior to accepting such
3962 data.
3963 Discussion: To prevent unauthorized individuals and systems from making information
3964 transfers to protected systems, the protected system verifies via independent means
3965 whether the individual or system attempting to transfer information is authorized to do so.
3966 This control enhancement also applies to control plane traffic (e.g., routing and DNS) and
3967 services such as authenticated SMTP relays.
3968 Related Controls: AC-2, AC-3, AC-4.
3969 (7) INFORMATION EXCHANGE | TRANSITIVE INFORMATION EXCHANGES
3970 (a) Identify transitive (downstream) information exchanges with other systems through
3971 the systems identified in CA-3a; and
3972 (b) Take measures to ensure that transitive (downstream) information exchanges cease
3973 when the controls on identified transitive (downstream) systems cannot be verified or
3974 validated.
4016 b. Assign a senior official as the authorizing official for common controls available for
4017 inheritance by organizational systems;
4018 c. Ensure that the authorizing official for the system, before commencing operations:
4019 1. Accepts the use of common controls inherited by the system; and
4020 2. Authorizes the system to operate;
4021 d. Ensure that the authorizing official for common controls authorizes the use of those controls
4022 for inheritance by organizational systems;
4023 e. Update the authorizations [Assignment: organization-defined frequency].
4024 Discussion: Authorizations are official management decisions by senior officials to authorize
4025 operation of systems, to authorize the use of common controls for inheritance by organizational
4026 systems, and to explicitly accept the risk to organizational operations and assets, individuals,
4027 other organizations, and the Nation based on the implementation of agreed-upon controls.
4028 Authorizing officials provide budgetary oversight for organizational systems and for common
4029 controls or assume responsibility for the mission and business operations supported by those
4030 systems or common controls. The authorization process is a federal responsibility and therefore,
4031 authorizing officials must be federal employees. Authorizing officials are both responsible and
4032 accountable for security and privacy risks associated with the operation and use of organizational
4033 systems. Nonfederal organizations may have similar processes to authorize systems and senior
4034 officials that assume the authorization role and associated responsibilities.
4035 Authorizing officials issue ongoing authorizations of systems based on evidence produced from
4036 implemented continuous monitoring programs. Robust continuous monitoring programs reduce
4037 the need for separate reauthorization processes. Through the employment of comprehensive
4038 continuous monitoring processes, the information contained in authorization packages (i.e., the
4039 security and privacy plans, assessment reports, and plans of action and milestones) is updated
4040 on an ongoing basis. This provides authorizing officials, system owners, and common control
4041 providers with an up-to-date status of the security and privacy posture of their systems, controls,
4042 and operating environments. To reduce the cost of reauthorization, authorizing officials can
4043 leverage the results of continuous monitoring processes to the maximum extent possible as the
4044 basis for rendering reauthorization decisions.
4045 Related Controls: CA-2, CA-3, CA-7, PM-9, PM-10, SA-10, SI-12.
4046 Control Enhancements:
4047 (1) AUTHORIZATION | JOINT AUTHORIZATION — INTRA-ORGANIZATION
4048 Employ a joint authorization process for the system that includes multiple authorizing
4049 officials from the same organization conducting the authorization.
4050 Discussion: Assigning multiple authorizing officials from the same organization to serve as
4051 co-authorizing officials for the system increases the level of independence in the risk-based
4052 decision-making process. It also implements the concepts of separation of duties and dual
4053 authorization as applied to the system authorization process. The intra-organization joint
4054 authorization process is most relevant for connected systems, shared systems, and systems
4055 with multiple information owners.
4056 Related Controls: AC-6.
4057 (2) AUTHORIZATION | JOINT AUTHORIZATION — INTER-ORGANIZATION
4058 Employ a joint authorization process for the system that includes multiple authorizing
4059 officials with at least one authorizing official from an organization external to the
4060 organization conducting the authorization.
4061 Discussion: Assigning multiple authorizing officials, at least one of whom comes from an
4062 external organization, to serve as co-authorizing officials for the system increases the level
4063 of independence in the risk-based decision-making process. It implements the concepts of
4064 separation of duties and dual authorization as applied to the system authorization process.
4065 Employing authorizing officials from external organizations to supplement the authorizing
4066 official from the organization owning or hosting the system may be necessary when the
4067 external organizations have a vested interest or equities in the outcome of the authorization
4068 decision. The inter-organization joint authorization process is relevant and appropriate for
4069 connected systems, shared systems or services, and systems with multiple information
4070 owners. The authorizing officials from the external organizations are key stakeholders of the
4071 system undergoing authorization.
4072 Related Controls: AC-6.
4073 References: [OMB A-130]; [SP 800-37]; [SP 800-137].
4107 the security categories of systems. Monitoring requirements, including the need for specific
4108 monitoring, may be referenced in other controls and control enhancements, for example, AC-2g,
4109 AC-2(7), AC-2(12)(a), AC-2(7)(b), AC-2(7)(c), AC-17(1), AT-4a, AU-13, AU-13(1), AU-13(2), CM-3f,
4110 CM-6d, CM-11c, IR-5, MA-2b, MA-3a, MA-4a, PE-3d, PE-6, PE-14b, PE-16, PE-20, PM-6, PM-23,
4111 PM-31, PS-7e, SA-9c, SR-4, SC-5(3)(b), SC-7a, SC-7(24)(b), SC-18c, SC-43b, SI-4.
4112 Related Controls: AC-2, AC-6, AC-17, AT-4, AU-6, AU-13, CA-2, CA-5, CA-6, CM-3, CM-4, CM-6,
4113 CM-11, IA-5, IR-5, MA-2, MA-3, MA-4, PE-3, PE-6, PE-14, PE-16, PE-20, PL-2, PM-4, PM-6, PM-9,
4114 PM-10, PM-12, PM-14, PM-23, PM-28, PM-31, PS-7, PT-8, RA-3, RA-5, RA-7, SA-8, SA-9, SA-11, SC-
4115 5, SC-7, SC-18, SC-38, SC-43, SI-3, SI-4, SI-12, SR-6.
4116 Control Enhancements:
4117 (1) CONTINUOUS MONITORING | INDEPENDENT ASSESSMENT
4118 Employ independent assessors or assessment teams to monitor the controls in the system
4119 on an ongoing basis.
4120 Discussion: Organizations maximize the value of control assessments by requiring that
4121 assessments be conducted by assessors with appropriate levels of independence. The level
4122 of required independence is based on organizational continuous monitoring strategies.
4123 Assessor independence provides a degree of impartiality to the monitoring process. To
4124 achieve such impartiality, assessors do not create a mutual or conflicting interest with the
4125 organizations where the assessments are being conducted; assess their own work; act as
4126 management or employees of the organizations they are serving; or place themselves in
4127 advocacy positions for the organizations acquiring their services.
4128 Related Controls: None.
4129 (2) CONTINUOUS MONITORING | TYPES OF ASSESSMENTS
4130 [Withdrawn: Incorporated into CA-2.]
4131 (3) CONTINUOUS MONITORING | TREND ANALYSES
4132 Employ trend analyses to determine if control implementations, the frequency of
4133 continuous monitoring activities, and the types of activities used in the continuous
4134 monitoring process need to be modified based on empirical data.
4135 Discussion: Trend analyses include examining recent threat information addressing the
4136 types of threat events that have occurred within the organization or the federal government;
4137 success rates of certain types of attacks; emerging vulnerabilities in technologies; evolving
4138 social engineering techniques; the effectiveness of configuration settings; results from
4139 multiple control assessments; and findings from Inspectors General or auditors.
4140 Related Controls: None.
4141 (4) CONTINUOUS MONITORING | RISK MONITORING
4142 Ensure risk monitoring is an integral part of the continuous monitoring strategy that
4143 includes the following:
4144 (a) Effectiveness monitoring;
4145 (b) Compliance monitoring; and
4146 (c) Change monitoring.
4147 Discussion: Risk monitoring is informed by the established organizational risk tolerance.
4148 Effectiveness monitoring determines the ongoing effectiveness of the implemented risk
4149 response measures. Compliance monitoring verifies that required risk response measures
4150 are implemented. It also verifies that security and privacy requirements are satisfied. Change
4151 monitoring identifies changes to organizational systems and environments of operation that
4152 may affect security and privacy risk.
4242 b. Document, for each internal connection, the interface characteristics, security and privacy
4243 requirements, and the nature of the information communicated;
4244 c. Terminate internal system connections after [Assignment: organization-defined conditions];
4245 and
4246 d. Review [Assignment: organization-defined frequency] the continued need for each internal
4247 connection.
4248 Discussion: Internal system connections are connections between organizational systems and
4249 separate constituent system components (i.e., connections between components that are part of
4250 the same system). Intra-system connections include connections with mobile devices, notebook
4251 and desktop computers, workstations, printers, copiers, facsimile machines, scanners, sensors,
4252 and servers. Instead of authorizing each individual internal system connection, organizations can
4253 authorize internal connections for a class of system components with common characteristics
4254 and/or configurations, including printers, scanners, and copiers with a specified processing,
4255 transmission, and storage capability; or smart phones and tablets with a specific baseline
4256 configuration. The continued need for an internal system connection is reviewed from the
4257 perspective of whether it provides support for organizational missions or business functions.
4258 Related Controls: AC-3, AC-4, AC-18, AC-19, CM-2, IA-3, SC-7, SI-12.
4259 Control Enhancements:
4260 (1) INTERNAL SYSTEM CONNECTIONS | COMPLIANCE CHECKS
4261 Perform security and privacy compliance checks on constituent system components prior
4262 to the establishment of the internal connection.
4263 Discussion: Compliance checks include verification of the relevant baseline configuration.
4264 Related Controls: CM-6.
4265 References: [SP 800-124]; [IR 8023].
4351 Discussion: Establishing separate baseline configurations for development, testing, and
4352 operational environments protects systems from unplanned or unexpected events related to
4353 development and testing activities. Separate baseline configurations allow organizations to
4354 apply the configuration management that is most appropriate for each type of configuration.
4355 For example, the management of operational configurations typically emphasizes the need
4356 for stability, while the management of development or test configurations requires greater
4357 flexibility. Configurations in the test environment mirror configurations in the operational
4358 environment to the extent practicable so that the results of the testing are representative of
4359 the proposed changes to the operational systems. Separate baseline configurations do not
4360 necessarily require separate physical environments.
4361 Related Controls: CM-4, SC-3, SC-7.
4362 (7) BASELINE CONFIGURATION | CONFIGURE SYSTEMS AND COMPONENTS FOR HIGH-RISK AREAS
4363 (a) Issue [Assignment: organization-defined systems or system components] with
4364 [Assignment: organization-defined configurations] to individuals traveling to locations
4365 that the organization deems to be of significant risk; and
4366 (b) Apply the following controls to the systems or components when the individuals
4367 return from travel: [Assignment: organization-defined controls].
4368 Discussion: When it is known that systems or system components will be in high-risk areas
4369 external to the organization, additional controls may be implemented to counter the
4370 increased threat in such areas. For example, organizations can take actions for notebook
4371 computers used by individuals departing on and returning from travel. Actions include
4372 determining the locations that are of concern, defining the required configurations for the
4373 components, ensuring that components are configured as intended before travel is initiated,
4374 and applying controls to the components after travel is completed. Specially configured
4375 notebook computers include computers with sanitized hard drives, limited applications, and
4376 more stringent configuration settings. Controls applied to mobile devices upon return from
4377 travel include examining the mobile device for signs of physical tampering, and purging and
4378 reimaging disk drives. Protecting information that resides on mobile devices is addressed in
4379 the MP (Media Protection) family.
4380 Related Controls: MP-4, MP-5.
4381 References: [SP 800-124]; [SP 800-128].
4621 platforms as well as instructions for configuring those products or platforms to meet operational
4622 requirements. Common secure configurations can be developed by a variety of organizations,
4623 including information technology product developers, manufacturers, vendors, federal agencies,
4624 consortia, academia, industry, and other organizations in the public and private sectors.
4625 Implementation of a common secure configuration may be mandated at the organization level,
4626 mission/business process level, or system level, or may be mandated at a higher level, including
4627 by a regulatory agency. Common secure configurations include the United States Government
4628 Configuration Baseline [USGCB] and security technical implementation guides (STIGs), which
4629 affect the implementation of CM-6 and other controls such as AC-19 and CM-7. The Security
4630 Content Automation Protocol (SCAP) and the defined standards within the protocol provide an
4631 effective method to uniquely identify, track, and control configuration settings.
4632 Related Controls: AC-3, AC-19, AU-2, AU-6, CA-9, CM-2, CM-3, CM-5, CM-7, CM-11, CP-7, CP-9,
4633 CP-10, IA-3, IA-5, PL-8, RA-5, SA-4, SA-5, SA-8, SA-9, SC-18, SC-28, SC-43, SI-2, SI-4, SI-6.
4634 Control Enhancements:
4635 (1) CONFIGURATION SETTINGS | AUTOMATED MANAGEMENT, APPLICATION, AND VERIFICATION
4636 Centrally manage, apply, and verify configuration settings for [Assignment: organization-
4637 defined system components] using [Assignment: organization-defined automated
4638 mechanisms].
4639 Discussion: Automated tools (e.g., security information and event management tools or
4640 enterprise security monitoring tools) can improve the accuracy, consistency, and availability
4641 of configuration settings information. Automation can also provide data aggregation and
4642 data correlation capabilities; alerting mechanisms; and dashboards to support risk-based
4643 decision making within the organization.
4644 Related Controls: CA-7.
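To illustrate the verification portion of this enhancement, the following sketch compares the settings reported by a managed component against a required baseline and returns the deviations. The setting names and values are hypothetical assumptions; in an operational deployment, the required settings and the component reports would come from the organization's configuration management or SCAP-based scanning tools.

    # Hypothetical required configuration settings (the baseline).
    REQUIRED_SETTINGS = {
        "password_min_length": "14",
        "ssh_root_login": "no",
        "audit_enabled": "yes",
    }

    def verify_component(reported: dict) -> dict:
        """Return the settings on a component that deviate from the required baseline."""
        return {
            name: {"required": required, "actual": reported.get(name)}
            for name, required in REQUIRED_SETTINGS.items()
            if reported.get(name) != required
        }

    component_report = {"password_min_length": "8", "ssh_root_login": "no", "audit_enabled": "yes"}
    print(verify_component(component_report))
    # {'password_min_length': {'required': '14', 'actual': '8'}}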
4645 (2) CONFIGURATION SETTINGS | RESPOND TO UNAUTHORIZED CHANGES
4646 Take the following actions in response to unauthorized changes to [Assignment:
4647 organization-defined configuration settings]: [Assignment: organization-defined actions].
4648 Discussion: Responses to unauthorized changes to configuration settings include alerting
4649 designated organizational personnel, restoring established configuration settings, or in
4650 extreme cases, halting affected system processing.
4651 Related Controls: IR-4, IR-6, SI-7.
4652 (3) CONFIGURATION SETTINGS | UNAUTHORIZED CHANGE DETECTION
4653 [Withdrawn: Incorporated into SI-7.]
4654 (4) CONFIGURATION SETTINGS | CONFORMANCE DEMONSTRATION
4655 [Withdrawn: Incorporated into CM-4.]
4656 References: [SP 800-70]; [SP 800-126]; [SP 800-128]; [USGCB]; [NCPR]; [DOD STIG].
4664 Discussion: Systems provide a wide variety of functions and services. Some of the functions and
4665 services routinely provided by default may not be necessary to support essential organizational
4666 missions, functions, or operations. Additionally, it is sometimes convenient to provide multiple
4667 services from a single system component, but doing so increases risk over limiting the services
4668 provided by that single component. Where feasible, organizations limit component functionality
4669 to a single function per component. Organizations consider removing unused or unnecessary
4670 software and disabling unused or unnecessary physical and logical ports and protocols to prevent
4671 unauthorized connection of components, transfer of information, and tunneling. Organizations
4672 employ network scanning tools, intrusion detection and prevention systems, and end-point
4673 protection technologies such as firewalls and host-based intrusion detection systems to identify
4674 and prevent the use of prohibited functions, protocols, ports, and services. Least functionality
4675 can also be achieved as part of the fundamental design and development of the system (see SA-
4676 8, SC-2, and SC-3).
4677 Related Controls: AC-3, AC-4, CM-2, CM-5, CM-6, CM-11, RA-5, SA-4, SA-5, SA-8, SA-9, SA-15, SC-
4678 2, SC-3, SC-7, SC-37, SI-4.
4679 Control Enhancements:
4680 (1) LEAST FUNCTIONALITY | PERIODIC REVIEW
4681 (a) Review the system [Assignment: organization-defined frequency] to identify
4682 unnecessary and/or nonsecure functions, ports, protocols, software, and services; and
4683 (b) Disable or remove [Assignment: organization-defined functions, ports, protocols,
4684 software, and services within the system deemed to be unnecessary and/or
4685 nonsecure].
4686 Discussion: Organizations review functions, ports, protocols, and services provided by
4687 systems or system components to determine the functions and services that are candidates
4688 for elimination. Such reviews are especially important during transition periods from older
4689 technologies to newer technologies (e.g., transition from IPv4 to IPv6). These technology
4690 transitions may require implementing the older and newer technologies simultaneously
4691 during the transition period and returning to minimum essential functions, ports, protocols,
4692 and services at the earliest opportunity. Organizations can either decide the relative security
4693 of the function, port, protocol, and/or service or base the security decision on the
4694 assessment of other entities. Nonsecure protocols include Bluetooth, FTP, and peer-to-peer
4695 networking.
4696 Related Controls: AC-18.
4697 (2) LEAST FUNCTIONALITY | PREVENT PROGRAM EXECUTION
4698 Prevent program execution in accordance with [Selection (one or more): [Assignment:
4699 organization-defined policies, rules of behavior, and/or access agreements regarding
4700 software program usage and restrictions]; rules authorizing the terms and conditions of
4701 software program usage].
4702 Discussion: Prevention of program execution addresses organizational policies, rules of
4703 behavior, and/or access agreements restricting software usage and the terms and conditions
4704 imposed by the developer or manufacturer, including software licensing and copyrights.
4705 Restrictions include prohibiting auto-execute features; restricting roles allowed to approve
4706 program execution; program blacklisting and whitelisting; or restricting the number of
4707 program instances executed at the same time.
4708 Related Controls: CM-8, PL-4, PM-5, PS-6.
4709 (3) LEAST FUNCTIONALITY | REGISTRATION COMPLIANCE
4710 Ensure compliance with [Assignment: organization-defined registration requirements for
4711 functions, ports, protocols, and services].
4712 Discussion: Organizations use the registration process to manage, track, and provide
4713 oversight for systems and implemented functions, ports, protocols, and services.
4714 Related Controls: None.
4715 (4) LEAST FUNCTIONALITY | UNAUTHORIZED SOFTWARE — BLACKLISTING
4716 (a) Identify [Assignment: organization-defined software programs not authorized to
4717 execute on the system];
4718 (b) Employ an allow-all, deny-by-exception policy to prohibit the execution of
4719 unauthorized software programs on the system; and
4720 (c) Review and update the list of unauthorized software programs [Assignment:
4721 organization-defined frequency].
4722 Discussion: The process used to identify software programs or categories of software
4723 programs that are not authorized to execute on organizational systems is commonly
4724 referred to as blacklisting. Software programs identified can be limited to specific versions
4725 or from a specific source. The concept of blacklisting may also be applied to user actions,
4726 ports, IP addresses, and media access control (MAC) addresses.
4727 Related Controls: CM-6, CM-8, CM-10, PM-5.
4728 (5) LEAST FUNCTIONALITY | AUTHORIZED SOFTWARE — WHITELISTING
4729 (a) Identify [Assignment: organization-defined software programs authorized to execute
4730 on the system];
4731 (b) Employ a deny-all, permit-by-exception policy to allow the execution of authorized
4732 software programs on the system; and
4733 (c) Review and update the list of authorized software programs [Assignment:
4734 organization-defined frequency].
4735 Discussion: The process used to identify specific software programs or entire categories of
4736 software programs that are authorized to execute on organizational systems is commonly
4737 referred to as whitelisting. Software programs identified can be limited to specific versions
4738 or from a specific source. To facilitate comprehensive whitelisting and increase the strength
4739 of protection for attacks that bypass application level whitelisting, software programs may
4740 be decomposed into and monitored at different levels of detail. Software program levels of
4741 detail include applications, application programming interfaces, application modules, scripts,
4742 system processes, system services, kernel functions, registries, drivers, and dynamic link
4743 libraries. The concept of whitelisting may also be applied to user actions, ports, IP addresses,
4744 and media access control (MAC) addresses. Organizations consider verifying the integrity of
4745 whitelisted software programs using cryptographic checksums, digital signatures, or hash
4746 functions. Verification of whitelisted software can occur either prior to execution or at
4747 system startup. Whitelisting of URLs for websites is addressed in CA-3(5) and SC-7.
4748 Related Controls: CM-2, CM-6, CM-8, CM-10, PM-5, SA-10, SC-34, SI-7.
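A minimal sketch of a deny-all, permit-by-exception check with integrity verification follows. The program path and expected digest are placeholders, and the allow list itself would be protected and maintained under the organization's configuration management processes; the sketch only shows the hash comparison performed before execution is permitted.

    import hashlib

    # Hypothetical allow list mapping program paths to expected SHA-256 digests.
    AUTHORIZED_PROGRAMS = {
        "/usr/local/bin/approved-tool": "0" * 64,   # placeholder digest for illustration
    }

    def sha256_of(path: str) -> str:
        """Compute the SHA-256 digest of the program file on disk."""
        with open(path, "rb") as handle:
            return hashlib.sha256(handle.read()).hexdigest()

    def execution_permitted(path: str) -> bool:
        """Allow execution only for listed programs whose digest matches the allow list."""
        expected = AUTHORIZED_PROGRAMS.get(path)
        return expected is not None and sha256_of(path) == expected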
4749 (6) LEAST FUNCTIONALITY | CONFINED ENVIRONMENTS WITH LIMITED PRIVILEGES
4750 Require that the following user-installed software execute in a confined physical or virtual
4751 machine environment with limited privileges: [Assignment: organization-defined user-
4752 installed software].
4753 Discussion: Organizations identify software that may be of concern regarding its origin or
4754 potential for containing malicious code. For this type of software, user installations occur in
4755 confined environments of operation to limit or contain damage from malicious code that
4756 may be executed.
4757 Related Controls: CM-11, SC-44.
4802 Inventory specifications include date of receipt, cost, model, serial number, manufacturer,
4803 supplier information, component type, and physical location.
4804 Related Controls: CM-2, CM-7, CM-9, CM-10, CM-11, CM-13, CP-2, CP-9, MA-2, MA-6, PE-20,
4805 PM-5, SA-4, SA-5, SI-2, SR-4.
4806 Control Enhancements:
4807 (1) SYSTEM COMPONENT INVENTORY | UPDATES DURING INSTALLATION AND REMOVAL
4808 Update the inventory of system components as part of component installations, removals,
4809 and system updates.
4810 Discussion: Organizations can improve the accuracy, completeness, and consistency of
4811 system component inventories if the inventories are updated routinely as part of component
4812 installations or removals, or during general system updates. If inventories are not updated at
4813 these key times, there is a greater likelihood that the information will not be appropriately
4814 captured and documented. System updates include hardware, software, and firmware
4815 components.
4816 Related Controls: PM-16.
4817 (2) SYSTEM COMPONENT INVENTORY | AUTOMATED MAINTENANCE
4818 Maintain the currency, completeness, accuracy, and availability of the inventory of system
4819 components using [Assignment: organization-defined automated mechanisms].
4820 Discussion: Organizations maintain system inventories to the extent feasible. For example,
4821 virtual machines can be difficult to monitor because such machines are not visible to the
4822 network when not in use. In such cases, organizations maintain as up-to-date, complete, and
4823 accurate an inventory as is deemed reasonable. Automated maintenance can be achieved by
4824 the implementation of CM-2(2) for organizations that combine system component inventory
4825 and baseline configuration activities.
4826 Related Controls: None.
4827 (3) SYSTEM COMPONENT INVENTORY | AUTOMATED UNAUTHORIZED COMPONENT DETECTION
4828 (a) Detect the presence of unauthorized hardware, software, and firmware components
4829 within the system using [Assignment: organization-defined automated mechanisms]
4830 [Assignment: organization-defined frequency]; and
4831 (b) Take the following actions when unauthorized components are detected: [Selection
4832 (one or more): disable network access by such components; isolate the components;
4833 notify [Assignment: organization-defined personnel or roles]].
4834 Discussion: Automated unauthorized component detection is applied in addition to the
4835 monitoring for unauthorized remote connections and mobile devices. Monitoring for
4836 unauthorized system components may be accomplished on an ongoing basis or by the
4837 periodic scanning of systems for that purpose. Automated mechanisms can be implemented
4838 in systems or in separate system components. When acquiring and implementing automated
4839 mechanisms, organizations consider whether such mechanisms depend on the ability of the
4840 system component to support an agent or supplicant in order to be detected since some
4841 types of components do not have or cannot support agents (e.g., IoT devices). Isolation can
4842 be achieved, for example, by placing unauthorized system components in separate domains
4843 or subnets or quarantining such components. This type of component isolation is commonly
4844 referred to as sandboxing.
4845 Related Controls: AC-19, CA-7, RA-5, SC-3, SC-39, SC-44, SI-3, SI-4, SI-7.
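The sketch below illustrates the detection-and-response pattern by comparing discovered components against an authorized inventory and applying one or more selected actions. The MAC addresses, the selected actions, and the isolation and notification steps are hypothetical placeholders; in practice, discovery comes from an automated mechanism and the inventory from CM-8.

    # Hypothetical authorized inventory (keyed by MAC address) and selected actions.
    AUTHORIZED_COMPONENTS = {"00:11:22:33:44:55", "66:77:88:99:aa:bb"}
    ACTIONS = ["isolate", "notify"]   # Selection (one or more) of organization-defined actions

    def detect_unauthorized(discovered: set) -> set:
        """Return discovered components that are not in the authorized inventory."""
        return discovered - AUTHORIZED_COMPONENTS

    def respond(unauthorized: set) -> None:
        """Apply the selected actions to each unauthorized component."""
        for component in unauthorized:
            if "isolate" in ACTIONS:
                print(f"moving {component} to a quarantine subnet")        # placeholder isolation step
            if "notify" in ACTIONS:
                print(f"notifying defined personnel about {component}")    # placeholder notification

    respond(detect_unauthorized({"00:11:22:33:44:55", "de:ad:be:ef:00:01"}))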
4936 Organizations can employ templates to help ensure consistent and timely development and
4937 implementation of configuration management plans. Templates can represent a master
4938 configuration management plan for the organization with subsets of the plan implemented on a
4939 system by system basis. Configuration management approval processes include designation of
4940 key management stakeholders responsible for reviewing and approving proposed changes to
4941 systems, and personnel who conduct security impact analyses prior to the implementation of
4942 changes to the systems. Configuration items are the system components, for example, the
4943 hardware, software, firmware, and documentation to be configuration-managed. As systems
4944 continue through the system development life cycle, new configuration items may be identified,
4945 and some existing configuration items may no longer need to be under configuration control.
4946 Related Controls: CM-2, CM-3, CM-4, CM-5, CM-8, PL-2, SA-10, SI-12.
4947 Control Enhancements:
4948 (1) CONFIGURATION MANAGEMENT PLAN | ASSIGNMENT OF RESPONSIBILITY
4949 Assign responsibility for developing the configuration management process to
4950 organizational personnel that are not directly involved in system development.
4951 Discussion: In the absence of dedicated configuration management teams assigned within
4952 organizations, system developers may be tasked to develop configuration management
4953 processes using personnel who are not directly involved in system development or system
4954 integration. This separation of duties ensures that organizations establish and maintain a
4955 sufficient degree of independence between the system development and integration
4956 processes and configuration management processes to facilitate quality control and more
4957 effective oversight.
4958 Related Controls: None.
4959 References: [SP 800-128].
4980 software. From a security perspective, the major advantage of open source software is that
4981 it provides organizations with the ability to examine the source code. However, remediating
4982 vulnerabilities in open source software may be problematic. There may also be licensing
4983 issues associated with open source software, including the constraints on derivative use of
4984 such software. Open source software that is available only in binary form may increase the
4985 level of risk in using such software.
4986 Related Controls: SI-7.
4987 References: None.
5022 Discussion: Information location addresses the need to understand where information is being
5023 processed and stored. Information location includes identifying where specific information types
5024 and associated information reside in the system components, and how information is being
5025 processed so that information flow can be understood, and adequate protection and policy
5026 management provided for such information and system components. The security category of
5027 the information is also a factor in determining the controls necessary to protect the information
5028 and the system component where the information resides (see FIPS 199). The location of the
5029 information and system components is also a factor in the architecture and design of the system
5030 (see SA-4, SA-8, SA-17).
5031 Related Controls: AC-2, AC-3, AC-4, AC-6, AC-23, CM-8, PM-5, RA-2, SA-4, SA-8, SA-17, SC-4, SC-
5032 16, SC-28, SI-4, SI-7.
5033 Control Enhancements:
5034 (1) INFORMATION LOCATION | AUTOMATED TOOLS TO SUPPORT INFORMATION LOCATION
5035 Use automated tools to identify [Assignment: organization-defined information by
5036 information type] on [Assignment: organization-defined system components] to ensure
5037 controls are in place to protect organizational information and individual privacy.
5038 Discussion: The use of automated tools helps to increase the effectiveness and efficiency of
5039 the information location capability implemented within the system. Automation also helps
5040 organizations manage the data produced during information location activities and share
5041 such information organization-wide. The output of automated information location tools can
5042 be used to guide and inform system architecture and design decisions.
5043 Related Controls: None.
5044 References: [FIPS 199]; [SP 800-60 v1]; [SP 800-60 v2].
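A minimal sketch of an automated information location tool follows, assuming a file system scan for one illustrative information type (an SSN-like pattern); the pattern, paths, and file handling are assumptions for illustration rather than a prescribed method.

    # Hedged sketch: walk organization-defined directories and report files that
    # appear to contain the target information type, so protections can be applied.
    import os
    import re

    SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # illustrative information type

    def locate_information(root_dirs, pattern=SSN_PATTERN):
        """Yield (path, match_count) for files containing the target pattern."""
        for root in root_dirs:
            for dirpath, _dirs, files in os.walk(root):
                for name in files:
                    path = os.path.join(dirpath, name)
                    try:
                        with open(path, errors="ignore") as f:
                            count = len(pattern.findall(f.read()))
                    except OSError:
                        continue
                    if count:
                        yield path, count

    if __name__ == "__main__":
        for path, count in locate_information(["./data"]):  # assumed scan root
            print(f"{path}: {count} potential matches")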
5148 Discussion: Plans that are related to contingency plans include Business Continuity Plans,
5149 Disaster Recovery Plans, Critical Infrastructure Plans, Continuity of Operations Plans, Crisis
5150 Communications Plans, Insider Threat Implementation Plans, Cyber Incident Response Plans,
5151 and Occupant Emergency Plans.
5152 Related Controls: None.
5153 (2) CONTINGENCY PLAN | CAPACITY PLANNING
5154 Conduct capacity planning so that necessary capacity for information processing,
5155 telecommunications, and environmental support exists during contingency operations.
5156 Discussion: Capacity planning is needed because different threats can result in a reduction
5157 of the available processing, telecommunications, and support services intended to support
5158 essential missions and business functions. Organizations anticipate degraded operations
5159 during contingency operations and factor the degradation into capacity planning. For
5160 capacity planning, environmental support refers to any environmental factor for which the
5161 organization determines that it needs to provide support in a contingency situation, even if
5162 in a degraded state. Such determinations are based on an organizational assessment of risk,
5163 system categorization (impact level), and organizational risk tolerance.
5164 Related Controls: PE-11, PE-12, PE-13, PE-14, PE-18, SC-5.
5165 (3) CONTINGENCY PLAN | RESUME MISSIONS AND BUSINESS FUNCTIONS
5166 Plan for the resumption of [Selection: all; essential] missions and business functions within
5167 [Assignment: organization-defined time-period] of contingency plan activation.
5168 Discussion: Organizations may choose to conduct contingency planning activities to resume
5169 missions and business functions as part of business continuity planning or as part of business
5170 impact analyses. Organizations prioritize the resumption of missions and business functions.
5171 The time-period for the resumption of missions and business functions may be dependent
5172 on the severity and extent of the disruptions to the system and its supporting infrastructure.
5173 Related Controls: None.
5174 (4) CONTINGENCY PLAN | RESUME ALL MISSIONS AND BUSINESS FUNCTIONS
5175 [Withdrawn: Incorporated into CP-2(3).]
5176 (5) CONTINGENCY PLAN | CONTINUE MISSIONS AND BUSINESS FUNCTIONS
5177 Plan for the continuance of [Selection: all; essential] missions and business functions with
5178 minimal or no loss of operational continuity and sustain that continuity until full system
5179 restoration at primary processing and/or storage sites.
5180 Discussion: Organizations may choose to conduct the contingency planning activities to
5181 continue missions and business functions as part of business continuity planning or as part of
5182 business impact analyses. Primary processing and/or storage sites defined by organizations
5183 as part of contingency planning may change depending on the circumstances associated
5184 with the contingency.
5185 Related Controls: None.
5186 (6) CONTINGENCY PLAN | ALTERNATE PROCESSING AND STORAGE SITES
5187 Plan for the transfer of [Selection: all; essential] missions and business functions to
5188 alternate processing and/or storage sites with minimal or no loss of operational continuity
5189 and sustain that continuity through system restoration to primary processing and/or
5190 storage sites.
5191 Discussion: Organizations may choose to conduct the contingency planning activities for
5192 alternate processing and storage sites as part of business continuity planning or as part of
5193 business impact analyses. Primary processing and/or storage sites defined by organizations
5194 as part of contingency planning may change depending on the circumstances associated
5195 with the contingency.
5196 Related Controls: None.
5197 (7) CONTINGENCY PLAN | COORDINATE WITH EXTERNAL SERVICE PROVIDERS
5198 Coordinate the contingency plan with the contingency plans of external service providers
5199 to ensure that contingency requirements can be satisfied.
5200 Discussion: When the capability of an organization to carry out its missions and business
5201 functions is dependent on external service providers, developing a comprehensive and
5202 timely contingency plan may become more challenging. When missions and business
5203 functions are dependent on external service providers, organizations coordinate contingency
5204 planning activities with the external entities to ensure that the individual plans reflect the
5205 overall contingency needs of the organization.
5206 Related Controls: SA-9.
5207 (8) CONTINGENCY PLAN | IDENTIFY CRITICAL ASSETS
5208 Identify critical system assets supporting [Selection: all; essential] missions and business
5209 functions.
5210 Discussion: Organizations may choose to identify critical assets as part of criticality analysis,
5211 business continuity planning, or business impact analyses. Organizations identify critical
5212 system assets so additional controls can be employed (beyond the controls routinely
5213 implemented) to help ensure that organizational missions and business functions can
5214 continue to be conducted during contingency operations. The identification of critical
5215 information assets also facilitates the prioritization of organizational resources. Critical
5216 system assets include technical and operational aspects. Technical aspects include system
5217 components, information technology services, information technology products, and
5218 mechanisms. Operational aspects include procedures (manually executed operations) and
5219 personnel (individuals operating technical controls and/or executing manual procedures).
5220 Organizational program protection plans can assist in identifying critical assets. If critical
5221 assets are resident within or supported by external service providers, organizations consider
5222 implementing CP-2(7) as a control enhancement.
5223 Related Controls: CM-8, RA-9.
5224 References: [SP 800-34]; [IR 8179].
5240 related activities. Training for contingency roles or responsibilities reflects the specific continuity
5241 requirements in the contingency plan.
5242 Related Controls: AT-2, AT-3, AT-4, CP-2, CP-4, CP-8, IR-2, IR-4, IR-9.
5243 Control Enhancements:
5244 (1) CONTINGENCY TRAINING | SIMULATED EVENTS
5245 Incorporate simulated events into contingency training to facilitate effective response by
5246 personnel in crisis situations.
5247 Discussion: The use of simulated events creates an environment for personnel to experience
5248 actual threat events, including cyber-attacks that disable web sites, ransomware attacks that
5249 encrypt organizational data on servers, hurricanes that damage or destroy organizational
5250 facilities, or hardware or software failures.
5251 Related Controls: None.
5252 (2) CONTINGENCY TRAINING | MECHANISMS USED IN TRAINING ENVIRONMENTS
5253 Employ mechanisms used in operations to provide a more thorough and realistic
5254 contingency training environment.
5255 Discussion: Operational mechanisms refer to processes that have been established to
5256 accomplish an organizational goal or a system that supports a particular organizational
5257 mission or business objective. Actual mission/business processes, systems, and/or facilities
5258 may be used to generate simulated events and/or to enhance the realism of simulated
5259 events during contingency training.
5260 Related Controls: None.
5261 References: [SP 800-50].
5283 Communications Plans, Critical Infrastructure Plans, Cyber Incident Response Plans, and
5284 Occupant Emergency Plans. Coordination of contingency plan testing does not require
5285 organizations to create organizational elements to handle related plans or to align such
5286 elements with specific plans. It does require, however, that if such organizational elements
5287 are responsible for related plans, organizations coordinate with those elements.
5288 Related Controls: IR-8, PM-8.
5289 (2) CONTINGENCY PLAN TESTING | ALTERNATE PROCESSING SITE
5290 Test the contingency plan at the alternate processing site:
5291 (a) To familiarize contingency personnel with the facility and available resources; and
5292 (b) To evaluate the capabilities of the alternate processing site to support contingency
5293 operations.
5294 Discussion: Conditions at the alternate processing site may be significantly different than
5295 the conditions at the primary site. Having the opportunity to visit the alternate site and
5296 experience, firsthand, the actual capabilities available at the site can provide valuable
5297 information on potential vulnerabilities that could affect essential organizational missions
5298 and functions. The on-site visit can also provide an opportunity to refine the contingency
5299 plan to address the vulnerabilities discovered during testing.
5300 Related Controls: CP-7.
5301 (3) CONTINGENCY PLAN TESTING | AUTOMATED TESTING
5302 Test the contingency plan using [Assignment: organization-defined automated
5303 mechanisms].
5304 Discussion: Automated mechanisms facilitate thorough and effective testing of contingency
5305 plans by providing more complete coverage of contingency issues; by selecting more realistic
5306 test scenarios and environments; and by effectively stressing the system and supported
5307 missions and business operations.
5308 Related Controls: None.
5309 (4) CONTINGENCY PLAN TESTING | FULL RECOVERY AND RECONSTITUTION
5310 Include a full recovery and reconstitution of the system to a known state as part of
5311 contingency plan testing.
5312 Discussion: Recovery is executing contingency plan activities to restore organizational
5313 missions and business functions. Reconstitution takes place following recovery and includes
5314 activities for returning systems to fully operational states. Organizations establish a known
5315 state for systems that includes system state information for hardware, software programs,
5316 and data. Preserving system state information facilitates system restart and return to the
5317 operational mode of organizations with less disruption of mission and business processes.
5318 Related Controls: CP-10, SC-24.
5319 References: [FIPS 199]; [SP 800-34]; [SP 800-84].
5326 b. Ensure that the alternate storage site provides controls equivalent to that of the primary
5327 site.
5328 Discussion: Alternate storage sites are sites that are geographically distinct from primary storage
5329 sites and that maintain duplicate copies of information and data if the primary storage site is not
5330 available. In contrast to alternate storage sites, alternate processing sites provide processing
5331 capability if the primary processing site is not available. Geographically distributed architectures
5332 that support contingency requirements may also be considered as alternate storage sites. Items
5333 covered by alternate storage site agreements include environmental conditions at the alternate
5334 sites, access rules for systems and facilities, physical and environmental protection requirements,
5335 and coordination of delivery and retrieval of backup media. Alternate storage sites reflect the
5336 requirements in contingency plans so that organizations can maintain essential missions and
5337 business functions despite disruption, compromise, or failure in organizational systems.
5338 Related Controls: CP-2, CP-7, CP-8, CP-9, CP-10, MP-4, MP-5, PE-3, SC-36, SI-13.
5339 Control Enhancements:
5340 (1) ALTERNATE STORAGE SITE | SEPARATION FROM PRIMARY SITE
5341 Identify an alternate storage site that is sufficiently separated from the primary storage
5342 site to reduce susceptibility to the same threats.
5343 Discussion: Threats that affect alternate storage sites are defined in organizational risk
5344 assessments and include natural disasters, structural failures, hostile attacks, and errors of
5345 omission or commission. Organizations determine what is considered a sufficient degree of
5346 separation between primary and alternate storage sites based on the types of threats that
5347 are of concern. For threats such as hostile attacks, the degree of separation between sites is
5348 less relevant.
5349 Related Controls: RA-3.
5350 (2) ALTERNATE STORAGE SITE | RECOVERY TIME AND RECOVERY POINT OBJECTIVES
5351 Configure the alternate storage site to facilitate recovery operations in accordance with
5352 recovery time and recovery point objectives.
5353 Discussion: Organizations establish recovery time and recovery point objectives as part of
5354 contingency planning. Configuration of the alternate storage site includes physical facilities
5355 and the systems supporting recovery operations, ensuring accessibility and correct execution.
5356 Related Controls: None.
5357 (3) ALTERNATE STORAGE SITE | ACCESSIBILITY
5358 Identify potential accessibility problems to the alternate storage site in the event of an
5359 area-wide disruption or disaster and outline explicit mitigation actions.
5360 Discussion: Area-wide disruptions refer to those types of disruptions that are broad in
5361 geographic scope with such determinations made by organizations based on organizational
5362 assessments of risk. Explicit mitigation actions include duplicating backup information at
5363 other alternate storage sites if access problems occur at originally designated alternate sites;
5364 or planning for physical access to retrieve backup information if electronic accessibility to
5365 the alternate site is disrupted.
5366 Related Controls: RA-3.
5367 References: [SP 800-34].
5457 (b) Request Telecommunications Service Priority for all telecommunications services used
5458 for national security emergency preparedness if the primary and/or alternate
5459 telecommunications services are provided by a common carrier.
5460 Discussion: Organizations consider the potential mission or business impact in situations
5461 where telecommunications service providers are servicing other organizations with similar
5462 priority-of-service provisions. Telecommunications Service Priority (TSP) is a Federal
5463 Communications Commission (FCC) program that directs telecommunications service
5464 providers (e.g., wireline and wireless phone companies) to give preferential treatment to
5465 users enrolled in the program when they need to add new lines or have their lines restored
5466 following a disruption of service, regardless of the cause. The FCC sets the rules and policies
5467 for the TSP program, and the Department of Homeland Security manages the TSP program.
5468 The TSP program is always in effect and not contingent on a major disaster or attack taking
5469 place. Federal sponsorship is required to enroll in the TSP program.
5470 Related Controls: None.
5471 (2) TELECOMMUNICATIONS SERVICES | SINGLE POINTS OF FAILURE
5472 Obtain alternate telecommunications services to reduce the likelihood of sharing a single
5473 point of failure with primary telecommunications services.
5474 Discussion: In certain circumstances, telecommunications service providers or services may
5475 share the same physical lines, which increases vulnerability to a single point of failure. It is
5476 important to have provider transparency for the actual physical transmission capability for
5477 telecommunication services.
5478 Related Controls: None.
5479 (3) TELECOMMUNICATIONS SERVICES | SEPARATION OF PRIMARY AND ALTERNATE PROVIDERS
5480 Obtain alternate telecommunications services from providers that are separated from
5481 primary service providers to reduce susceptibility to the same threats.
5482 Discussion: Threats that affect telecommunications services are defined in organizational
5483 assessments of risk and include natural disasters, structural failures, cyber or physical
5484 attacks, and errors of omission or commission. Organizations can reduce common
5485 susceptibilities by minimizing shared infrastructure among telecommunications service
5486 providers and achieving sufficient geographic separation between services. Organizations
5487 may consider using a single service provider in situations where the service provider can
5488 provide alternate telecommunications services meeting the separation needs addressed in
5489 the risk assessment.
5490 Related Controls: None.
5491 (4) TELECOMMUNICATIONS SERVICES | PROVIDER CONTINGENCY PLAN
5492 (a) Require primary and alternate telecommunications service providers to have
5493 contingency plans;
5494 (b) Review provider contingency plans to ensure that the plans meet organizational
5495 contingency requirements; and
5496 (c) Obtain evidence of contingency testing and training by providers [Assignment:
5497 organization-defined frequency].
5498 Discussion: Reviews of provider contingency plans consider the proprietary nature of such
5499 plans. In some situations, a summary of provider contingency plans may be sufficient
5500 evidence for organizations to satisfy the review requirement. Telecommunications service
5501 providers may also participate in ongoing disaster recovery exercises in coordination with
5502 the Department of Homeland Security, state, and local governments. Organizations may use
5503 these types of activities to satisfy evidentiary requirements related to service provider
5504 contingency plan reviews, testing, and training.
5505 Related Controls: CP-3, CP-4.
5506 (5) TELECOMMUNICATIONS SERVICES | ALTERNATE TELECOMMUNICATION SERVICE TESTING
5507 Test alternate telecommunication services [Assignment: organization-defined frequency].
5508 Discussion: Alternate telecommunications services testing is arranged through contractual
5509 agreements with service providers. The testing may occur in parallel with normal operations
5510 to ensure there is no degradation in organizational missions or functions.
5511 Related Controls: CP-3.
5512 References: [SP 800-34].
5592 Discussion: Dual authorization ensures that deletion or destruction of backup information
5593 cannot occur unless two qualified individuals carry out the task. Individuals deleting or
5594 destroying backup information possess the skills or expertise to determine if the proposed
5595 deletion or destruction of information reflects organizational policies and procedures. Dual
5596 authorization may also be known as two-person control. To reduce the risk of collusion,
5597 organizations consider rotating dual authorization duties to other individuals.
5598 Related Controls: AC-3, AC-5, MP-2.
5599 (8) SYSTEM BACKUP | CRYPTOGRAPHIC PROTECTION
5600 Implement cryptographic mechanisms to prevent unauthorized disclosure and
5601 modification of [Assignment: organization-defined backup information].
5602 Discussion: The selection of cryptographic mechanisms is based on the need to protect the
5603 confidentiality and integrity of backup information. The strength of mechanisms selected is
5604 commensurate with the security category or classification of the information. This control
5605 enhancement applies to system backup information in storage at primary and alternate
5606 locations. Organizations implementing cryptographic mechanisms to protect information at
5607 rest also consider cryptographic key management solutions.
5608 Related Controls: SC-12, SC-13, SC-28.
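A minimal sketch of cryptographic protection for backup information at rest follows, assuming AES-GCM from the third-party Python cryptography package; the library choice and in-line key generation are assumptions, and key management would be handled by the organization's SC-12 solution.

    # Hedged sketch: authenticated encryption of backup data, protecting both
    # confidentiality and integrity (unauthorized modification is detected on decrypt).
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_backup(plaintext: bytes, key: bytes) -> bytes:
        """Return nonce || ciphertext produced with AES-GCM."""
        nonce = os.urandom(12)
        return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

    def decrypt_backup(blob: bytes, key: bytes) -> bytes:
        """Verify integrity and recover the backup; raises if the blob was altered."""
        nonce, ciphertext = blob[:12], blob[12:]
        return AESGCM(key).decrypt(nonce, ciphertext, None)

    if __name__ == "__main__":
        key = AESGCM.generate_key(bit_length=256)  # in practice, managed per SC-12
        protected = encrypt_backup(b"backup contents", key)
        assert decrypt_backup(protected, key) == b"backup contents"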
5609 References: [FIPS 140-3]; [FIPS 186-4]; [SP 800-34]; [SP 800-130]; [SP 800-152].
5754 Discussion: Organizations can satisfy the identification and authentication requirements by
5755 complying with the requirements in [HSPD 12]. Organizational users include employees or
5756 individuals that organizations consider to have a status equivalent to employees (e.g., contractors
5757 and guest researchers). Unique identification and authentication of users applies to all accesses
5758 other than accesses that are explicitly identified in AC-14 and that occur through the authorized
5759 use of group authenticators without individual authentication. Since processes execute on behalf
5760 of groups and roles, organizations may require unique identification of individuals in group
5761 accounts or for detailed accountability of individual activity.
5762 Organizations employ passwords, physical authenticators, or biometrics to authenticate user
5763 identities, or in the case of multifactor authentication, some combination thereof. Access to
5764 organizational systems is defined as either local access or network access. Local access is any
5765 access to organizational systems by users or processes acting on behalf of users, where access is
5766 obtained through direct connections without the use of networks. Network access is access to
5767 organizational systems by users (or processes acting on behalf of users) where access is obtained
5768 through network connections (i.e., nonlocal accesses). Remote access is a type of network access
5769 that involves communication through external networks. Internal networks include local area
5770 networks and wide area networks.
5771 Network connections that use encrypted virtual private networks between organization-
5772 controlled endpoints and non-organization-controlled endpoints may be treated as internal
5773 networks with respect to protecting the confidentiality and integrity of information traversing
5774 the network. Identification and authentication requirements for non-organizational users are
5775 described in IA-8.
5776 Related Controls: AC-2, AC-3, AC-4, AC-14, AC-17, AC-18, AU-1, AU-6, IA-4, IA-5, IA-8, MA-4, MA-
5777 5, PE-2, PL-4, SA-4, SA-8.
5778 Control Enhancements:
5779 (1) IDENTIFICATION AND AUTHENTICATION (ORGANIZATIONAL USERS) | MULTIFACTOR AUTHENTICATION
5780 TO PRIVILEGED ACCOUNTS
5781 Implement multifactor authentication for access to privileged accounts.
5782 Discussion: Multifactor authentication requires the use of two or more different factors to
5783 achieve authentication. The authentication factors are defined as follows: something you
5784 know (e.g., a personal identification number (PIN)); something you have (e.g., a physical
5785 authenticator or cryptographic private key stored in hardware or software); or something
5786 you are (e.g., a biometric). Multifactor authentication solutions that feature physical
5787 authenticators include hardware authenticators providing time-based or challenge-response
5788 authenticators and smart cards such as the U.S. Government Personal Identity Verification
5789 card or the DoD Common Access Card. In addition to authenticating users at the system level
5790 (i.e., at logon), organizations may also employ authentication mechanisms at the application
5791 level, at their discretion, to provide increased information security. Regardless of the type of
5792 access (i.e., local, network, remote), privileged accounts are authenticated using multifactor
5793 options appropriate for the level of risk. Organizations can add additional security measures,
5794 such as additional or more rigorous authentication mechanisms, for specific types of access.
5795 Related Controls: AC-5, AC-6.
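A minimal sketch of verifying a second, "something you have" factor follows, assuming a time-based one-time password (RFC 6238) computed with the Python standard library; the secret handling and the surrounding login flow are illustrative assumptions, not a complete multifactor implementation.

    # Hedged sketch: compute and check a TOTP value as the possession factor,
    # in addition to a first factor verified elsewhere.
    import base64
    import hmac
    import struct
    import time

    def totp(secret_b32, interval=30, digits=6, at=None):
        """Return the TOTP value for a base32-encoded shared secret."""
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int((at if at is not None else time.time()) // interval)
        digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
        offset = digest[-1] & 0x0F
        code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
        return str(code).zfill(digits)

    def verify_second_factor(secret_b32, submitted_code):
        """Accept the code for the current interval only (no drift window, for brevity)."""
        return hmac.compare_digest(totp(secret_b32), submitted_code)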
5796 (2) IDENTIFICATION AND AUTHENTICATION (ORGANIZATIONAL USERS) | MULTIFACTOR AUTHENTICATION
5797 TO NON-PRIVILEGED ACCOUNTS
5798 Implement multifactor authentication for access to non-privileged accounts.
5799 Discussion: Multifactor authentication requires the use of two or more different factors to
5800 achieve authentication. The authentication factors are defined as follows: something you
5801 know (e.g., a personal identification number (PIN)); something you have (e.g., a physical
5891 used to independently verify the authentication and/or requested action. For example, a
5892 user authenticates via a notebook computer to a remote server to which the user desires
5893 access and requests some action of the server via that communication path. Subsequently,
5894 the server contacts the user via the user’s cell phone to verify that the requested action
5895 originated from the user. The user may confirm the intended action to an individual on the
5896 telephone or provide an authentication code via the telephone. Out-of-band authentication
5897 can be used to mitigate actual or suspected man-in-the-middle attacks. The conditions or
5898 criteria for activation can include suspicious activities, new threat indicators or elevated
5899 threat levels, or the impact or classification level of information in requested transactions.
5900 Related Controls: IA-10, IA-11, SC-37.
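A minimal sketch of out-of-band verification follows, assuming a short-lived code generated for a requested action and confirmed over a separate channel; the in-memory store and the delivery step are placeholder assumptions.

    # Hedged sketch: the server issues a one-time code for a pending action and
    # approves the action only when the code returns over the second channel.
    import hmac
    import secrets
    import time

    PENDING = {}  # action_id -> (code, expiry); illustrative in-memory store

    def start_out_of_band(action_id, ttl=300):
        """Create a one-time code for the action; deliver it via the user's second channel."""
        code = f"{secrets.randbelow(10**6):06d}"
        PENDING[action_id] = (code, time.time() + ttl)
        return code

    def confirm_out_of_band(action_id, submitted):
        """Approve the action only if the code matches and has not expired."""
        code, expiry = PENDING.pop(action_id, (None, 0))
        return code is not None and time.time() < expiry and hmac.compare_digest(code, submitted)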
5901 References: [FIPS 140-3]; [FIPS 201-2]; [FIPS 202]; [SP 800-63-3]; [SP 800-73-4]; [SP 800-76-2];
5902 [SP 800-78-4]; [SP 800-79-2]; [SP 800-156]; [SP 800-166]; [IR 7539]; [IR 7676]; [IR 7817]; [IR
5903 7849]; [IR 7870]; [IR 7874]; [IR 7966].
5978 for adversaries to guess user identifiers. Prohibiting account identifiers as public identifiers
5979 without the implementation of other supporting controls only complicates guessing of
5980 identifiers. Additional protections are required for authenticators and attributes to protect
5981 the account.
5982 Related Controls: AT-2.
5983 (2) IDENTIFIER MANAGEMENT | SUPERVISOR AUTHORIZATION
5984 [Withdrawn: Incorporated into IA-12(1).]
5985 (3) IDENTIFIER MANAGEMENT | MULTIPLE FORMS OF CERTIFICATION
5986 [Withdrawn: Incorporated into IA-12(2).]
5987 (4) IDENTIFIER MANAGEMENT | IDENTIFY USER STATUS
5988 Manage individual identifiers by uniquely identifying each individual as [Assignment:
5989 organization-defined characteristic identifying individual status].
5990 Discussion: Characteristics identifying the status of individuals include contractors and
5991 foreign nationals. Identifying the status of individuals by characteristics provides additional
5992 information about the people with whom organizational personnel are communicating. For
5993 example, it might be useful for a government employee to know that one of the individuals
5994 on an email message is a contractor.
5995 Related Controls: None.
5996 (5) IDENTIFIER MANAGEMENT | DYNAMIC MANAGEMENT
5997 Manage individual identifiers dynamically in accordance with [Assignment: organization-
5998 defined dynamic identifier policy].
5999 Discussion: In contrast to conventional approaches to identification that presume static
6000 accounts for preregistered users, many distributed systems establish identifiers at run time
6001 for entities that were previously unknown. When identifiers are established at runtime for
6002 previously unknown entities, organizations can anticipate and provision for the dynamic
6003 establishment of identifiers. Pre-established trust relationships and mechanisms with
6004 appropriate authorities to validate identities and related credentials are essential.
6005 Related Controls: AC-16.
6006 (6) IDENTIFIER MANAGEMENT | CROSS-ORGANIZATION MANAGEMENT
6007 Coordinate with the following external organizations for cross-organization management
6008 of identifiers: [Assignment: organization-defined external organizations].
6009 Discussion: Cross-organization identifier management provides the capability to identify
6010 individuals, groups, roles, or devices when conducting cross-organization activities involving
6011 the processing, storage, or transmission of information.
6012 Related Controls: AU-16, IA-2, IA-5.
6013 (7) IDENTIFIER MANAGEMENT | IN-PERSON REGISTRATION
6014 [Withdrawn: Incorporated into IA-12(4).]
6015 (8) IDENTIFIER MANAGEMENT | PAIRWISE PSEUDONYMOUS IDENTIFIERS
6016 Generate pairwise pseudonymous identifiers.
6017 Discussion: A pairwise pseudonymous identifier is an opaque unguessable subscriber
6018 identifier generated by an identity provider for use at a specific individual relying party.
6019 Generating distinct pairwise pseudonymous identifiers, with no identifying information
6020 about a subscriber, discourages subscriber activity tracking and profiling beyond the
6021 operational requirements established by an organization. The pairwise pseudonymous
6022 identifiers are unique to each relying party, except in situations where relying parties can
6023 show a demonstrable relationship justifying an operational need for correlation, or all
6024 parties consent to being correlated in such a manner.
6025 Related Controls: IA-5.
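A minimal sketch of pairwise pseudonymous identifier generation follows, assuming a keyed-HMAC derivation held by the identity provider; the construction is illustrative rather than a mandated scheme.

    # Hedged sketch: derive an opaque, unguessable identifier per relying party so
    # the same subscriber cannot be correlated across relying parties.
    import hashlib
    import hmac
    import secrets

    PAIRWISE_KEY = secrets.token_bytes(32)  # held only by the identity provider

    def pairwise_identifier(subscriber_id, relying_party):
        """Return a stable pseudonymous identifier for (subscriber, relying party)."""
        msg = f"{subscriber_id}|{relying_party}".encode()
        return hmac.new(PAIRWISE_KEY, msg, hashlib.sha256).hexdigest()

    # The same subscriber yields unrelated identifiers at different relying parties:
    # pairwise_identifier("user-123", "rp-a.example") != pairwise_identifier("user-123", "rp-b.example")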
6026 (9) IDENTIFIER MANAGEMENT | ATTRIBUTE MAINTENANCE AND PROTECTION
6027 Maintain the attributes for each uniquely identified individual, device, or service in
6028 [Assignment: organization-defined protected central storage].
6029 Discussion: For each of the entities covered in IA-2, IA-3, IA-8, and IA-9, it is important to
6030 maintain the attributes for each authenticated entity on an ongoing basis in a central
6031 (protected) store.
6032 Related Controls: None.
6033 References: [FIPS 201-2]; [SP 800-63-3]; [SP 800-73-4]; [SP 800-76-2]; [SP 800-78-4].
6063 hashed or encrypted formats or files containing encrypted or hashed passwords accessible with
6064 administrator privileges.
6065 Systems support authenticator management by organization-defined settings and restrictions for
6066 various authenticator characteristics (e.g., minimum password length, validation time window
6067 for time synchronous one-time tokens, and number of allowed rejections during the verification
6068 stage of biometric authentication). Actions can be taken to safeguard individual authenticators,
6069 including maintaining possession of authenticators; not sharing authenticators with others; and
6070 reporting lost, stolen, or compromised authenticators immediately. Authenticator management
6071 includes issuing and revoking authenticators for temporary access when no longer needed.
6072 Related Controls: AC-3, AC-6, CM-6, IA-2, IA-4, IA-7, IA-8, IA-9, MA-4, PE-2, PL-4.
6073 Control Enhancements:
6074 (1) AUTHENTICATOR MANAGEMENT | PASSWORD-BASED AUTHENTICATION
6075 For password-based authentication:
6076 (a) Maintain a list of commonly-used, expected, or compromised passwords and update
6077 the list [Assignment: organization-defined frequency] and when organizational
6078 passwords are suspected to have been compromised directly or indirectly;
6079 (b) Verify, when users create or update passwords, that the passwords are not found on
6080 the organization-defined list of commonly-used, expected, or compromised
6081 passwords;
6082 (c) Transmit only cryptographically-protected passwords;
6083 (d) Store passwords using an approved hash algorithm and salt, preferably using a keyed
6084 hash;
6085 (e) Require immediate selection of a new password upon account recovery;
6086 (f) Allow user selection of long passwords and passphrases, including spaces and all
6087 printable characters;
6088 (g) Employ automated tools to assist the user in selecting strong password
6089 authenticators; and
6090 (h) Enforce the following composition and complexity rules: [Assignment: organization-
6091 defined composition and complexity rules].
6092 Discussion: Password-based authentication applies to passwords regardless of whether they
6093 are used in single-factor or multifactor authentication. Long passwords or passphrases are
6094 preferable over shorter passwords. Enforced composition rules provide marginal security
6095 benefit while decreasing usability. However, organizations may choose to establish certain
6096 rules for password generation (e.g., minimum character length for long passwords) under
6097 certain circumstances and can enforce this requirement in IA-5(1)(h). Account recovery can
6098 occur, for example, in situations when a password is forgotten. Cryptographically-protected
6099 passwords include salted one-way cryptographic hashes of passwords. The list of commonly-
6100 used, compromised, or expected passwords includes passwords obtained from previous
6101 breach corpuses, dictionary words, and repetitive or sequential characters. The list includes
6102 context-specific words, for example, the name of the service, username, and derivatives
6103 thereof.
6104 Related Controls: IA-6.
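A minimal sketch of items (b) and (d) of this enhancement follows, assuming a small illustrative blocklist and the standard library's salted scrypt hash; the enhancement's preference for a keyed hash could be met by additionally applying an HMAC with a separately stored key.

    # Hedged sketch: reject passwords found on a prohibited list and store accepted
    # passwords only as salted, memory-hard hashes.
    import hashlib
    import hmac
    import os

    BLOCKLIST = {"password", "123456", "qwerty"}  # illustrative; updated per defined frequency

    def password_allowed(candidate):
        """Verify the candidate is not a commonly-used, expected, or compromised password."""
        return candidate.lower() not in BLOCKLIST

    def store_password(candidate):
        """Return (salt, digest) using a salted scrypt hash suitable for storage."""
        if not password_allowed(candidate):
            raise ValueError("password appears on the prohibited list")
        salt = os.urandom(16)
        return salt, hashlib.scrypt(candidate.encode(), salt=salt, n=2**14, r=8, p=1)

    def check_password(candidate, salt, digest):
        """Recompute the salted hash and compare in constant time."""
        return hmac.compare_digest(
            hashlib.scrypt(candidate.encode(), salt=salt, n=2**14, r=8, p=1), digest)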
6105 (2) AUTHENTICATOR MANAGEMENT | PUBLIC KEY-BASED AUTHENTICATION
6106 (a) For public key-based authentication:
6107 (1) Enforce authorized access to the corresponding private key; and
6108 (2) Map the authenticated identity to the account of the individual or group; and
6201 performance requirements include the match rate as this rate reflects the accuracy of the
6202 biometric matching algorithm used by a system.
6203 Related Controls: AC-7.
6204 (13) AUTHENTICATOR MANAGEMENT | EXPIRATION OF CACHED AUTHENTICATORS
6205 Prohibit the use of cached authenticators after [Assignment: organization-defined time-
6206 period].
6207 Discussion: If cached authentication information is out-of-date, the validity of the
6208 authentication information may be questionable.
6209 Related Controls: None.
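A minimal sketch of enforcing a cached-authenticator lifetime follows, assuming an illustrative time period and cache timestamp; the surrounding session logic is not shown.

    # Hedged sketch: refuse to honor cached authentication results older than the
    # organization-defined time period.
    import time

    CACHE_LIFETIME_SECONDS = 900  # assumed organization-defined time period

    def cached_authentication_valid(cached_at, lifetime=CACHE_LIFETIME_SECONDS):
        """Return True only while the cached authenticator remains within its lifetime."""
        return (time.time() - cached_at) < lifetime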
6210 (14) AUTHENTICATOR MANAGEMENT | MANAGING CONTENT OF PKI TRUST STORES
6211 For PKI-based authentication, employ an organization-wide methodology for managing the
6212 content of PKI trust stores installed across all platforms, including networks, operating
6213 systems, browsers, and applications.
6214 Discussion: An organization-wide methodology for managing the content of PKI trust stores
6215 helps improve the accuracy and currency of PKI-based authentication credentials across the
6216 organization.
6217 Related Controls: None.
6218 (15) AUTHENTICATOR MANAGEMENT | GSA-APPROVED PRODUCTS AND SERVICES
6219 Use only General Services Administration-approved and validated products and services
6220 for identity, credential, and access management.
6221 Discussion: General Services Administration (GSA)-approved products and services are the
6222 products and services that have been approved through the GSA conformance program,
6223 where applicable, and posted to the GSA Approved Products List. GSA provides guidance for
6224 teams to design and build functional and secure systems that comply with Federal Identity,
6225 Credential, and Access Management (FICAM) policies, technologies, and implementation
6226 patterns.
6227 Related Controls: None.
6228 (16) AUTHENTICATOR MANAGEMENT | IN-PERSON OR TRUSTED EXTERNAL PARTY AUTHENTICATOR
6229 ISSUANCE
6230 Require that the issuance of [Assignment: organization-defined types of and/or specific
6231 authenticators] be conducted [Selection: in person; by a trusted external party] before
6232 [Assignment: organization-defined registration authority] with authorization by
6233 [Assignment: organization-defined personnel or roles].
6234 Discussion: Issuing authenticators in person or by a trusted external party enhances and
6235 reinforces the trustworthiness of the identity proofing process.
6236 Related Controls: IA-12.
6237 (17) AUTHENTICATOR MANAGEMENT | PRESENTATION ATTACK DETECTION FOR BIOMETRIC
6238 AUTHENTICATORS
6239 Employ presentation attack detection mechanisms for biometric-based authentication.
6240 Discussion: Biometric characteristics do not constitute secrets. Such characteristics can be
6241 obtained by online web accesses; taking a picture of someone with a camera phone to
6242 obtain facial images with or without their knowledge; lifting from objects that someone has
6243 touched, for example, a latent fingerprint; or capturing a high-resolution image, for example,
6244 an iris pattern. Presentation attack detection technologies, including liveness detection, can
6245 mitigate the risk of these types of attacks by making it difficult to produce artifacts intended
6246 to defeat the biometric sensor.
6334 applicable laws, executive orders, directives, policies, regulations, standards, and guidelines.
6335 The result is NIST-issued implementation profiles of approved protocols.
6336 Related Controls: None.
6337 (5) IDENTIFICATION AND AUTHENTICATION (NON-ORGANIZATIONAL USERS) | ACCEPTANCE OF PIV-I
6338 CREDENTIALS
6339 Accept and verify federated or PKI credentials that meet [Assignment: organization-
6340 defined policy].
6341 Discussion: This control enhancement can be implemented by PIV, PIV-I, and other
6342 commercial or external identity providers. Acceptance and verification of Personal Identity
6343 Verification (PIV)-I-compliant credentials applies to both logical and physical access control
6344 systems. Acceptance and verification of PIV-I credentials addresses nonfederal issuers of
6345 identity cards that desire to interoperate with United States Government PIV systems and
6346 that can be trusted by federal government-relying parties. The X.509 certificate policy for
6347 the Federal Bridge Certification Authority (FBCA) addresses PIV-I requirements. The PIV-I
6348 card is commensurate with the PIV credentials as defined in cited references. PIV-I
6349 credentials are the credentials issued by a PIV-I provider whose PIV-I certificate policy maps
6350 to the Federal Bridge PIV-I Certificate Policy. A PIV-I provider is cross-certified with the FBCA
6351 (directly or through another PKI bridge) with policies that have been mapped and approved
6352 as meeting the requirements of the PIV-I policies defined in the FBCA certificate policy.
6353 Related Controls: None.
6354 (6) IDENTIFICATION AND AUTHENTICATION (NON-ORGANIZATIONAL USERS) | DISASSOCIABILITY
6355 Implement the following measures to disassociate user attributes or credential assertion
6356 relationships among individuals, credential service providers, and relying parties:
6357 [Assignment: organization-defined measures].
6358 Discussion: Federated identity solutions can create increased privacy risks due to tracking
6359 and profiling of individuals. Using identifier mapping tables or cryptographic techniques to
6360 blind credential service providers and relying parties from each other or to make identity
6361 attributes less visible to transmitting parties can reduce these privacy risks.
6362 Related Controls: None.
6363 References: [OMB A-130]; [FIPS 201-2]; [SP 800-63-3]; [SP 800-79-2]; [SP 800-116]; [IR 8062].
6568 Discussion: Organizations use automated mechanisms to more thoroughly and effectively
6569 test incident response capabilities. This can be accomplished by providing more complete
6570 coverage of incident response issues; by selecting more realistic test scenarios and test
6571 environments; and by stressing the response capability.
6572 Related Controls: None.
6573 (2) INCIDENT RESPONSE TESTING | COORDINATION WITH RELATED PLANS
6574 Coordinate incident response testing with organizational elements responsible for related
6575 plans.
6576 Discussion: Organizational plans related to incident response testing include Business
6577 Continuity Plans, Disaster Recovery Plans, Continuity of Operations Plans, Contingency Plans,
6578 Crisis Communications Plans, Critical Infrastructure Plans, and Occupant Emergency Plans.
6579 Related Controls: None.
6580 (3) INCIDENT RESPONSE TESTING | CONTINUOUS IMPROVEMENT
6581 Use qualitative and quantitative data from testing to:
6582 (a) Determine the effectiveness of incident response processes;
6583 (b) Continuously improve incident response processes; and
6584 (c) Provide incident response measures and metrics that are accurate, consistent, and in a
6585 reproducible format.
6586 Discussion: To help incident response activities function as intended, organizations may use
6587 metrics and evaluation criteria to assess incident response programs as part of an effort to
6588 continually improve response performance. These efforts facilitate improvement in incident
6589 response efficacy and lessen the impact of incidents.
6590 Related Controls: None.
6591 References: [OMB A-130]; [SP 800-84]; [SP 800-115].
6612 suspicious email communications that can contain malicious code. Suspected supply chain
6613 incidents include the insertion of counterfeit hardware or malicious code into organizational
6614 systems or system components. Suspected privacy incidents include a breach of personally
6615 identifiable information or the recognition that the processing of personally identifiable
6616 information creates potential privacy risk.
6617 Related Controls: AC-19, AU-6, AU-7, CM-6, CP-2, CP-3, CP-4, IR-2, IR-3, IR-6, IR-8, IR-10, PE-6, PL-
6618 2, PM-12, SA-8, SC-5, SC-7, SI-3, SI-4, SI-7.
6619 Control Enhancements:
6620 (1) INCIDENT HANDLING | AUTOMATED INCIDENT HANDLING PROCESSES
6621 Support the incident handling process using [Assignment: organization-defined automated
6622 mechanisms].
6623 Discussion: Automated mechanisms supporting incident handling processes include online
6624 incident management systems; and tools that support the collection of live response data,
6625 full network packet capture, and forensic analysis.
6626 Related Controls: None.
6627 (2) INCIDENT HANDLING | DYNAMIC RECONFIGURATION
6628 Include the following types of dynamic reconfiguration for [Assignment: organization-
6629 defined system components] as part of the incident response capability: [Assignment:
6630 organization-defined types of dynamic reconfiguration].
6631 Discussion: Dynamic reconfiguration includes changes to router rules, access control lists,
6632 intrusion detection or prevention system parameters, and filter rules for guards or firewalls.
6633 Organizations perform dynamic reconfiguration of systems, for example, to stop attacks, to
6634 misdirect attackers, and to isolate components of systems, thus limiting the extent of the
6635 damage from breaches or compromises. Organizations include time frames for achieving the
6636 reconfiguration of systems in the definition of the reconfiguration capability, considering the
6637 potential need for rapid response to effectively address cyber threats.
6638 Related Controls: AC-2, AC-4, CM-2.
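A minimal sketch of one dynamic reconfiguration action follows, assuming iptables is invoked to drop traffic from a suspect address; equivalent router ACL, guard, or firewall API changes could be scripted the same way as part of a predefined response playbook.

    # Hedged sketch: validate the address, then insert a DROP rule at the top of
    # the INPUT chain (requires root privileges and iptables on the host).
    import ipaddress
    import subprocess

    def block_address(suspect_ip):
        """Add a firewall rule dropping traffic from the suspect address."""
        ip = ipaddress.ip_address(suspect_ip)  # raises ValueError on malformed input
        subprocess.run(["iptables", "-I", "INPUT", "-s", str(ip), "-j", "DROP"], check=True)

    # Example: block_address("203.0.113.7")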
6639 (3) INCIDENT HANDLING | CONTINUITY OF OPERATIONS
6640 Identify [Assignment: organization-defined classes of incidents] and take the following
6641 actions in response to those incidents to ensure continuation of organizational missions
6642 and business functions: [Assignment: organization-defined actions to take in response to
6643 classes of incidents].
6644 Discussion: Classes of incidents include malfunctions due to design or implementation
6645 errors and omissions, targeted malicious attacks, and untargeted malicious attacks. Incident
6646 response actions include orderly system degradation, system shutdown, fall back to manual
6647 mode or activation of alternative technology whereby the system operates differently,
6648 employing deceptive measures, alternate information flows, or operating in a mode that is
6649 reserved for when systems are under attack. Organizations consider whether continuity of
6650 operations requirements during an incident conflict with the capability to automatically
6651 disable the system, as specified in IR-4(5).
6652 Related Controls: None.
6653 (4) INCIDENT HANDLING | INFORMATION CORRELATION
6654 Correlate incident information and individual incident responses to achieve an
6655 organization-wide perspective on incident awareness and response.
6656 Discussion: Sometimes a threat event, for example, a hostile cyber-attack, can only be
6657 observed by bringing together information from different sources, including various reports
6658 and reporting procedures established by organizations.
6883 2. Describes the structure and organization of the incident response capability;
6884 3. Provides a high-level approach for how the incident response capability fits into the
6885 overall organization;
6886 4. Meets the unique requirements of the organization, which relate to mission, size,
6887 structure, and functions;
6888 5. Defines reportable incidents;
6889 6. Provides metrics for measuring the incident response capability within the organization;
6890 7. Defines the resources and management support needed to effectively maintain and
6891 mature an incident response capability;
6892 8. Is reviewed and approved by [Assignment: organization-defined personnel or roles]
6893 [Assignment: organization-defined frequency]; and
6894 9. Explicitly designates responsibility for incident response to [Assignment: organization-
6895 defined entities, personnel, or roles].
6896 b. Distribute copies of the incident response plan to [Assignment: organization-defined incident
6897 response personnel (identified by name and/or by role) and organizational elements];
6898 c. Update the incident response plan to address system and organizational changes or
6899 problems encountered during plan implementation, execution, or testing;
6900 d. Communicate incident response plan changes to [Assignment: organization-defined incident
6901 response personnel (identified by name and/or by role) and organizational elements]; and
6902 e. Protect the incident response plan from unauthorized disclosure and modification.
6903 Discussion: It is important that organizations develop and implement a coordinated approach to
6904 incident response. Organizational missions and business functions help determine the structure
6905 of incident response capabilities. As part of the incident response capabilities, organizations
6906 consider the coordination and sharing of information with external organizations, including
6907 external service providers and other organizations involved in the supply chain. For incidents
6908 involving personally identifiable information, include a process to determine whether notice to
6909 oversight organizations or affected individuals is appropriate and provide that notice accordingly.
6910 Related Controls: AC-2, CP-2, CP-4, IR-4, IR-7, IR-9, PE-6, PL-2, SA-15, SI-12, SR-8.
6911 Control Enhancements:
6912 (1) INCIDENT RESPONSE PLAN | PRIVACY BREACHES
6913 Include the following in the Incident Response Plan for breaches involving personally
6914 identifiable information:
6915 (a) A process to determine if notice to individuals or other organizations, including
6916 oversight organizations, is needed;
6917 (b) An assessment process to determine the extent of the harm, embarrassment,
6918 inconvenience, or unfairness to affected individuals and any mechanisms to mitigate
6919 such harms; and
6920 (c) Identification of applicable privacy requirements.
6921 Discussion: Organizations may be required by law, regulation, or policy to follow specific
6922 procedures relating to privacy breaches, including notice to individuals, affected
6923 organizations, and oversight bodies, standards of harm, and mitigation or other specific
6924 requirements.
6925 Related Controls: PT-1, PT-2, PT-3, PT-5, PT-6, PT-8.
7021 b. Approve and monitor all maintenance activities, whether performed on site or remotely and
7022 whether the system or system components are serviced on site or removed to another
7023 location;
7024 c. Require that [Assignment: organization-defined personnel or roles] explicitly approve the
7025 removal of the system or system components from organizational facilities for off-site
7026 maintenance, repair, or replacement;
7027 d. Sanitize equipment to remove the following information from associated media prior to
7028 removal from organizational facilities for off-site maintenance, repair, or replacement:
7029 [Assignment: organization-defined information];
7030 e. Check all potentially impacted controls to verify that the controls are still functioning
7031 properly following maintenance, repair, or replacement actions; and
7032 f. Include the following information in organizational maintenance records: [Assignment:
7033 organization-defined information].
7034 Discussion: Controlling system maintenance addresses the information security aspects of the
7035 system maintenance program and applies to all types of maintenance to system components
7036 conducted by local or nonlocal entities. Maintenance includes peripherals such as scanners,
7037 copiers, and printers. Information necessary for creating effective maintenance records includes
7038 date and time of maintenance; name of individuals or group performing the maintenance; name
7039 of escort, if necessary; a description of the maintenance performed; and system components or
7040 equipment removed or replaced. Organizations consider supply chain issues associated with
7041 replacement components for systems.
7042 Related Controls: CM-2, CM-3, CM-4, CM-5, CM-8, MA-4, MP-6, PE-16, SI-2, SR-3, SR-4, SR-11.
7043 Control Enhancements:
7044 (1) CONTROLLED MAINTENANCE | RECORD CONTENT
7045 [Withdrawn: Incorporated into MA-2.]
7046 (2) CONTROLLED MAINTENANCE | AUTOMATED MAINTENANCE ACTIVITIES
7047 (a) Schedule, conduct, and document maintenance, repair, and replacement actions for
7048 the system using [Assignment: organization-defined automated mechanisms]; and
7049 (b) Produce up-to-date, accurate, and complete records of all maintenance, repair, and
7050 replacement actions requested, scheduled, in process, and completed.
7051 Discussion: The use of automated mechanisms to manage and control system maintenance
7052 programs and activities helps to ensure the generation of timely, accurate, complete, and
7053 consistent maintenance records.
7054 Related Controls: MA-3.
7055 References: [OMB A-130]; [IR 8023].
7064 Organizations have flexibility in determining roles for approval of maintenance tools and how
7065 that approval is documented. Periodic review of maintenance tools facilitates withdrawal of the
7066 approval for outdated, unsupported, irrelevant, or no-longer-used tools. Maintenance tools can
7067 include hardware, software, and firmware items. Such tools can be vehicles for transporting
7068 malicious code, intentionally or unintentionally, into a facility and subsequently into systems.
7069 Maintenance tools can include hardware and software diagnostic test equipment and packet
7070 sniffers. The hardware and software components that support system maintenance and are a
7071 part of the system, including the software implementing “ping,” “ls,” “ipconfig,” or the hardware
7072 and software implementing the monitoring port of an Ethernet switch, are not considered
7073 maintenance tools.
7074 Related Controls: MA-2, PE-16.
7075 Control Enhancements:
7076 (1) MAINTENANCE TOOLS | INSPECT TOOLS
7077 Inspect the maintenance tools used by maintenance personnel for improper or
7078 unauthorized modifications.
7079 Discussion: Maintenance tools can be brought into a facility directly by maintenance
7080 personnel or downloaded from a vendor’s website. If, upon inspection of the maintenance
7081 tools, organizations determine that the tools have been modified in an improper manner or
7082 the tools contain malicious code, the incident is handled consistent with organizational
7083 policies and procedures for incident handling.
7084 Related Controls: SI-7.
7085 (2) MAINTENANCE TOOLS | INSPECT MEDIA
7086 Check media containing diagnostic and test programs for malicious code before the media
7087 are used in the system.
7088 Discussion: If, upon inspection of media containing maintenance diagnostic and test
7089 programs, organizations determine that the media contain malicious code, the incident is
7090 handled consistent with organizational incident handling policies and procedures.
7091 Related Controls: SI-3.
7092 (3) MAINTENANCE TOOLS | PREVENT UNAUTHORIZED REMOVAL
7093 Prevent the removal of maintenance equipment containing organizational information by:
7094 (a) Verifying that there is no organizational information contained on the equipment;
7095 (b) Sanitizing or destroying the equipment;
7096 (c) Retaining the equipment within the facility; or
7097 (d) Obtaining an exemption from [Assignment: organization-defined personnel or roles]
7098 explicitly authorizing removal of the equipment from the facility.
7099 Discussion: Organizational information includes all information owned by organizations and
7100 any information provided to organizations for which the organizations serve as information
7101 stewards.
7102 Related Controls: MP-6.
7103 (4) MAINTENANCE TOOLS | RESTRICTED TOOL USE
7104 Restrict the use of maintenance tools to authorized personnel only.
7105 Discussion: This control enhancement applies to systems that are used to carry out
7106 maintenance functions.
7107 Related Controls: AC-3, AC-5, AC-6.
7237 (b) Develop and implement [Assignment: organization-defined alternate controls] in the
7238 event a system component cannot be sanitized, removed, or disconnected from the
7239 system.
7240 Discussion: Procedures for individuals who lack appropriate security clearances or who are
7241 not U.S. citizens are intended to deny visual and electronic access to classified or controlled
7242 unclassified information contained on organizational systems. Procedures for the use of
7243 maintenance personnel can be documented in security plans for the systems.
7244 Related Controls: MP-6, PL-2.
7245 (2) MAINTENANCE PERSONNEL | SECURITY CLEARANCES FOR CLASSIFIED SYSTEMS
7246 Verify that personnel performing maintenance and diagnostic activities on a system
7247 processing, storing, or transmitting classified information possess security clearances and
7248 formal access approvals for at least the highest classification level and for compartments
7249 of information on the system.
7250 Discussion: Personnel conducting maintenance on organizational systems may be exposed
7251 to classified information during the course of their maintenance activities. To mitigate the
7252 inherent risk of such exposure, organizations use maintenance personnel that are cleared
7253 (i.e., possess security clearances) to the classification level of the information stored on the
7254 system.
7255 Related Controls: PS-3.
7256 (3) MAINTENANCE PERSONNEL | CITIZENSHIP REQUIREMENTS FOR CLASSIFIED SYSTEMS
7257 Verify that personnel performing maintenance and diagnostic activities on a system
7258 processing, storing, or transmitting classified information are U.S. citizens.
7259 Discussion: Personnel conducting maintenance on organizational systems may be exposed
7260 to classified information during the course of their maintenance activities. If access to
7261 classified information on organizational systems is restricted to U.S. citizens, the same
7262 restriction is applied to personnel performing maintenance on those systems.
7263 Related Controls: PS-3.
7264 (4) MAINTENANCE PERSONNEL | FOREIGN NATIONALS
7265 Verify that:
7266 (a) Foreign nationals with appropriate security clearances are used to conduct
7267 maintenance and diagnostic activities on classified systems only when the systems are
7268 jointly owned and operated by the United States and foreign allied governments, or
7269 owned and operated solely by foreign allied governments; and
7270 (b) Approvals, consents, and detailed operational conditions regarding the use of foreign
7271 nationals to conduct maintenance and diagnostic activities on classified systems are
7272 fully documented within Memoranda of Agreements.
7273 Discussion: Personnel conducting maintenance on organizational systems may be exposed
7274 to classified information during the course of their maintenance activities. To mitigate the
7275 inherent risk of such exposure, organizations use maintenance personnel that are cleared
7276 (i.e., possess security clearances) to the classification level of the information stored on the
7277 system.
7278 Related Controls: PS-3.
7279 (5) MAINTENANCE PERSONNEL | NON-SYSTEM MAINTENANCE
7280 Verify that non-escorted personnel performing maintenance activities not directly
7281 associated with the system, but in the physical proximity of the system, have required
7282 access authorizations.
7283 Discussion: Personnel performing maintenance activities in other capacities not directly
7284 related to the system include physical plant personnel and custodial personnel.
7285 Related Controls: None.
7286 References: None.
7388 access to patient medical records in a community hospital unless the individuals seeking access
7389 to such records are authorized healthcare providers is an example of restricting access to non-
7390 digital media. Limiting access to the design specifications stored on compact disks in the media
7391 library to individuals on the system development team is an example of restricting access to
7392 digital media.
7393 Related Controls: AC-19, AU-9, CP-2, CP-9, CP-10, MA-5, MP-4, MP-6, PE-2, PE-3, SC-13, SC-34,
7394 SI-12.
7395 Control Enhancements:
7396 (1) MEDIA ACCESS | AUTOMATED RESTRICTED ACCESS
7397 [Withdrawn: Incorporated into MP-4(2).]
7398 (2) MEDIA ACCESS | CRYPTOGRAPHIC PROTECTION
7399 [Withdrawn: Incorporated into SC-28(1).]
7400 References: [OMB A-130]; [FIPS 199]; [SP 800-111].
7430 compact disks, and digital video disks. Non-digital media includes paper and microfilm. Physically
7431 controlling stored media includes conducting inventories, ensuring procedures are in place to
7432 allow individuals to check out and return media to the library, and maintaining accountability for
7433 stored media. Secure storage includes a locked drawer, desk, or cabinet; or a controlled media
7434 library. The type of media storage is commensurate with the security category or classification of
7435 the information on the media. Controlled areas are spaces that provide physical and procedural
7436 controls to meet the requirements established for protecting information and systems. For
7437 media containing information determined to be in the public domain, to be publicly releasable,
7438 or to have limited adverse impact on organizations, operations, or individuals if accessed by
7439 other than authorized personnel, fewer controls may be needed. In these situations, physical
7440 access controls provide adequate protection.
7441 Related Controls: AC-19, CP-2, CP-6, CP-9, CP-10, MP-2, MP-7, PE-3, PL-2, SC-13, SC-28, SC-34, SI-
7442 12.
7443 Control Enhancements:
7444 (1) MEDIA STORAGE | CRYPTOGRAPHIC PROTECTION
7445 [Withdrawn: Incorporated into SC-28(1).]
7446 (2) MEDIA STORAGE | AUTOMATED RESTRICTED ACCESS
7447 Restrict access to media storage areas and log access attempts and access granted using
7448 [Assignment: organization-defined automated mechanisms].
7449 Discussion: Automated mechanisms include keypads or card readers on the external entries
7450 to media storage areas.
7451 Related Controls: AC-3, AU-2, AU-6, AU-9, AU-12, PE-3.
7452 References: [FIPS 199]; [SP 800-56A]; [SP 800-56B]; [SP 800-56C]; [SP 800-57-1]; [SP 800-57-2];
7453 [SP 800-57-3]; [SP 800-111].
7475 Organizations establish documentation requirements for activities associated with the transport
7476 of system media in accordance with organizational assessments of risk. Organizations maintain
7477 the flexibility to define record-keeping methods for the different types of media transport as part
7478 of a system of transport-related records.
7479 Related Controls: AC-7, AC-19, CP-2, CP-9, MP-3, MP-4, PE-16, PL-2, SC-13, SC-28, SC-34.
7480 Control Enhancements:
7481 (1) MEDIA TRANSPORT | PROTECTION OUTSIDE OF CONTROLLED AREAS
7482 [Withdrawn: Incorporated into MP-5.]
7483 (2) MEDIA TRANSPORT | DOCUMENTATION OF ACTIVITIES
7484 [Withdrawn: Incorporated into MP-5.]
7485 (3) MEDIA TRANSPORT | CUSTODIANS
7486 Employ an identified custodian during transport of system media outside of controlled
7487 areas.
7488 Discussion: Identified custodians provide organizations with specific points of contact during
7489 the media transport process and facilitate individual accountability. Custodial responsibilities
7490 can be transferred from one individual to another, provided that an unambiguous custodian is identified.
7491 Related Controls: None.
7492 (4) MEDIA TRANSPORT | CRYPTOGRAPHIC PROTECTION
7493 [Withdrawn: Incorporated into SC-28(1).]
7494 References: [FIPS 199]; [SP 800-60 v1]; [SP 800-60 v2].
7519 process for controlled unclassified information. NSA standards and policies control the
7520 sanitization process for media containing classified information.
7521 Related Controls: AC-3, AC-7, AU-11, MA-2, MA-3, MA-4, MA-5, PM-22, SI-12, SI-18, SI-19, SR-11.
7522 Control Enhancements:
7523 (1) MEDIA SANITIZATION | REVIEW, APPROVE, TRACK, DOCUMENT, AND VERIFY
7524 Review, approve, track, document, and verify media sanitization and disposal actions.
7525 Discussion: Organizations review and approve media to be sanitized to ensure compliance
7526 with records-retention policies. Tracking and documenting actions include listing personnel
7527 who reviewed and approved sanitization and disposal actions; types of media sanitized; files
7528 stored on the media; sanitization methods used; date and time of the sanitization actions;
7529 personnel who performed the sanitization; verification actions taken and personnel who
7530 performed the verification; and the disposal actions taken. Organizations verify that the
7531 sanitization of the media was effective prior to disposal.
7532 Related Controls: None.
7533 (2) MEDIA SANITIZATION | EQUIPMENT TESTING
7534 Test sanitization equipment and procedures [Assignment: organization-defined frequency]
7535 to verify that the intended sanitization is being achieved.
7536 Discussion: Testing of sanitization equipment and procedures may be conducted by
7537 qualified and authorized external entities, including federal agencies or external service
7538 providers.
7539 Related Controls: None.
7540 (3) MEDIA SANITIZATION | NONDESTRUCTIVE TECHNIQUES
7541 Apply nondestructive sanitization techniques to portable storage devices prior to
7542 connecting such devices to the system under the following circumstances: [Assignment:
7543 organization-defined circumstances requiring sanitization of portable storage devices].
7544 Discussion: Portable storage devices include external or removable hard disk drives (solid
7545 state, magnetic), optical discs, magnetic or optical tapes, flash memory devices, flash
7546 memory cards, and other external or removable disks. Portable storage devices can be
7547 obtained from untrustworthy sources and can contain malicious code that can be inserted
7548 into or transferred to organizational systems through USB ports or other entry portals. While
7549 scanning storage devices is recommended, sanitization provides additional assurance that
7550 such devices are free of malicious code. Organizations consider nondestructive sanitization
7551 of portable storage devices when the devices are purchased from manufacturers or vendors
7552 prior to initial use or when organizations cannot maintain a positive chain of custody for the
7553 devices.
7554 Related Controls: None.
7555 (4) MEDIA SANITIZATION | CONTROLLED UNCLASSIFIED INFORMATION
7556 [Withdrawn: Incorporated into MP-6.]
7557 (5) MEDIA SANITIZATION | CLASSIFIED INFORMATION
7558 [Withdrawn: Incorporated into MP-6.]
7559 (6) MEDIA SANITIZATION | MEDIA DESTRUCTION
7560 [Withdrawn: Incorporated into MP-6.]
7608 removing the capability to write to such devices. Requiring identifiable owners for storage
7609 devices reduces the risk of using such devices by allowing organizations to assign responsibility
7610 for addressing known vulnerabilities in the devices.
7611 Related Controls: AC-19, AC-20, PL-4, PM-12, SC-34, SC-41.
7612 Control Enhancements:
7613 (1) MEDIA USE | PROHIBIT USE WITHOUT OWNER
7614 [Withdrawn: Incorporated into MP-7.]
7615 (2) MEDIA USE | PROHIBIT USE OF SANITIZATION-RESISTANT MEDIA
7616 Prohibit the use of sanitization-resistant media in organizational systems.
7617 Discussion: Sanitization-resistance refers to non-destructive sanitization techniques and
7618 applies to the capability to purge information from media. Certain types of media do not
7619 support sanitization commands, or, if such commands are supported, the interfaces are not
7620 implemented in a standardized way across these devices. Sanitization-resistant media include compact flash,
7621 embedded flash on boards and devices, solid state drives, and USB removable media.
7622 Related Controls: MP-6.
7623 References: [FIPS 199]; [SP 800-111].
7752 1. Verifying individual access authorizations before granting access to the facility; and
7753 2. Controlling ingress and egress to the facility using [Selection (one or more): [Assignment:
7754 organization-defined physical access control systems or devices]; guards];
7755 b. Maintain physical access audit logs for [Assignment: organization-defined entry or exit
7756 points];
7757 c. Control access to areas within the facility designated as publicly accessible by implementing
7758 the following controls: [Assignment: organization-defined controls];
7759 d. Escort visitors and monitor visitor activity [Assignment: organization-defined circumstances
7760 requiring visitor escorts and monitoring];
7761 e. Secure keys, combinations, and other physical access devices;
7762 f. Inventory [Assignment: organization-defined physical access devices] every [Assignment:
7763 organization-defined frequency]; and
7764 g. Change combinations and keys [Assignment: organization-defined frequency] and/or when
7765 keys are lost, combinations are compromised, or when individuals possessing the keys or
7766 combinations are transferred or terminated.
7767 Discussion: Physical access control applies to employees and visitors. Individuals with permanent
7768 physical access authorization credentials are not considered visitors. Organizations determine
7769 the types of guards needed, including professional security staff, system users, or administrative
7770 staff. Physical access devices include keys, locks, combinations, and card readers. Physical access
7771 control systems comply with applicable laws, executive orders, directives, policies, regulations,
7772 standards, and guidelines. Organizations have flexibility in the types of audit logs employed.
7773 Audit logs can be procedural, automated, or some combination thereof. Physical access points
7774 can include facility access points, interior access points to systems requiring supplemental access
7775 controls, or both. Components of systems may be in areas designated as publicly accessible with
7776 organizations controlling access to the components.
7777 Related Controls: AT-3, AU-2, AU-6, AU-9, AU-13, CP-10, IA-3, IA-8, MA-5, MP-2, MP-4, PE-2, PE-
7778 4, PE-5, PE-8, PS-2, PS-3, PS-6, PS-7, RA-3, SC-28, SI-4, SR-3.
7779 Control Enhancements:
7780 (1) PHYSICAL ACCESS CONTROL | SYSTEM ACCESS
7781 Enforce physical access authorizations to the system in addition to the physical access
7782 controls for the facility at [Assignment: organization-defined physical spaces containing
7783 one or more components of the system].
7784 Discussion: Control of physical access to the system provides additional physical security for
7785 those areas within facilities where there is a concentration of system components.
7786 Related Controls: None.
7787 (2) PHYSICAL ACCESS CONTROL | FACILITY AND SYSTEMS
7788 Perform security checks [Assignment: organization-defined frequency] at the physical
7789 perimeter of the facility or system for exfiltration of information or removal of system
7790 components.
7791 Discussion: Organizations determine the extent, frequency, and/or randomness of security
7792 checks to adequately mitigate risk associated with exfiltration.
7793 Related Controls: AC-4, SC-7.
7840 interlocking doors, or partially automated using security guards to control the number of
7841 individuals entering the mantrap.
7842 Related Controls: None.
7843 References: [FIPS 201-2]; [SP 800-73-4]; [SP 800-76-2]; [SP 800-78-4]; [SP 800-116].
7969 such records facilitates record reviews on a regular basis to determine if access authorizations
7970 are current and still required to support organizational missions and business functions.
7971 Related Controls: None.
7972 (2) VISITOR ACCESS RECORDS | PHYSICAL ACCESS RECORDS
7973 [Withdrawn: Incorporated into PE-2.]
7974 References: None.
8053 Discussion: The provision of emergency lighting applies primarily to organizational facilities
8054 containing concentrations of system resources, including data centers, server rooms, and
8055 mainframe computer rooms. Emergency lighting provisions for the system are described in the
8056 contingency plan for the organization. If emergency lighting for the system cannot be provided or
8057 fails, organizations consider alternate processing sites.
8058 Related Controls: CP-2, CP-7.
8059 Control Enhancements:
8060 (1) EMERGENCY LIGHTING | ESSENTIAL MISSIONS AND BUSINESS FUNCTIONS
8061 Provide emergency lighting for all areas within the facility supporting essential missions
8062 and business functions.
8063 Discussion: Organizations define their essential missions and functions.
8064 Related Controls: None.
8065 References: None.
8139 can help to minimize harm to individuals and damage to organizational assets by facilitating
8140 a timely incident response.
8141 Related Controls: None.
8142 References: None.
8177 d. Provide a means for employees to communicate with information security and privacy
8178 personnel in case of incidents.
8179 Discussion: Alternate work sites include government facilities or the private residences of
8180 employees. While distinct from alternative processing sites, alternate work sites can provide
8181 readily available alternate locations during contingency operations. Organizations can define
8182 different sets of controls for specific alternate work sites or types of sites depending on the
8183 work-related activities conducted at those sites. This control supports the contingency planning
8184 activities of organizations.
8185 Related Controls: AC-17, AC-18, CP-7.
8186 Control Enhancements: None.
8187 References: [SP 800-46].
8310 3. Describe the operational context of the system in terms of missions and business
8311 processes;
8312 4. Provide the security categorization of the system, including supporting rationale;
8313 5. Describe any specific threats to the system that are of concern to the organization;
8314 6. Provide the results of a privacy risk assessment for systems processing personally
8315 identifiable information;
8316 7. Describe the operational environment for the system and any dependencies on or
8317 connections to other systems or system components;
8318 8. Provide an overview of the security and privacy requirements for the system;
8319 9. Identify any relevant control baselines or overlays, if applicable;
8320 10. Describe the controls in place or planned for meeting the security and privacy
8321 requirements, including a rationale for any tailoring decisions;
8322 11. Include risk determinations for security and privacy architecture and design decisions;
8323 12. Include security- and privacy-related activities affecting the system that require planning
8324 and coordination with [Assignment: organization-defined individuals or groups]; and
8325 13. Are reviewed and approved by the authorizing official or designated representative
8326 prior to plan implementation.
8327 b. Distribute copies of the plans and communicate subsequent changes to the plans to
8328 [Assignment: organization-defined personnel or roles];
8329 c. Review the plans [Assignment: organization-defined frequency];
8330 d. Update the plans to address changes to the system and environment of operation or
8331 problems identified during plan implementation or control assessments; and
8332 e. Protect the plans from unauthorized disclosure and modification.
8333 Discussion: System security and privacy plans contain an overview of the security and privacy
8334 requirements for the system and the controls selected to satisfy the requirements. The plans
8335 describe the intended application of each selected control in the context of the system with a
8336 sufficient level of detail to correctly implement the control and to subsequently assess the
8337 effectiveness of the control. The control documentation describes how system-specific and
8338 hybrid controls are implemented and the plans and expectations regarding the functionality of
8339 the system. System security and privacy plans can also be used in the design and development of
8340 systems in support of life cycle-based security engineering processes. System security and privacy
8341 plans are living documents that are updated and adapted throughout the system development
8342 life cycle, for example, during capability determination, analysis of alternatives, requests for
8343 proposal, and design reviews. Section 2.1 describes the different types of requirements that are
8344 relevant to organizations during the system development life cycle and the relationship between
8345 requirements and controls.
8346 Organizations may develop a single, integrated security and privacy plan or maintain separate
8347 plans. Security and privacy plans relate security and privacy requirements to a set of controls and
8348 control enhancements. The plans describe how the controls and control enhancements meet the
8349 security and privacy requirements, but do not provide detailed, technical descriptions of the
8350 design or implementation of the controls and control enhancements. Security and privacy plans
8351 contain sufficient information (including specifications of control parameter values for selection
8352 and assignment statements explicitly or by reference) to enable a design and implementation
8353 that is unambiguously compliant with the intent of the plans and subsequent determinations of
8354 risk to organizational operations and assets, individuals, other organizations, and the Nation if
8355 the plan is implemented. Organizations can also apply the tailoring guidance to the control
8356 baselines in [SP 800-53B] to develop overlays for community-wide use or to address specialized
8357 requirements, technologies, missions, business applications, or environments of operation.
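As an illustrative aid only, the sketch below shows one hypothetical way a plan entry could record a selected control, an explicit assignment parameter value, and a reference to the implementing document. The schema, field names, control identifier, and example values are assumptions introduced for this sketch and are not a format prescribed by this publication.

```python
from dataclasses import dataclass, field

@dataclass
class PlanControlEntry:
    """Hypothetical plan record; the fields are illustrative, not a NIST-defined schema."""
    control_id: str                                   # selected control, e.g., "AU-11"
    parameters: dict = field(default_factory=dict)    # assignment/selection parameter values
    implementation_reference: str = ""                # pointer to procedures or design documents

# Example entry: the assignment value is specified explicitly, and the detailed
# implementation is incorporated by reference to an existing procedure document.
entry = PlanControlEntry(
    control_id="AU-11",
    parameters={"audit record retention period": "90 days"},
    implementation_reference="Audit Log Management Procedure, Section 4",
)
print(f"{entry.control_id}: {entry.parameters} (see {entry.implementation_reference})")
```

Recording parameter values explicitly, or by reference to another document, is what allows a design and implementation to be checked unambiguously against the intent of the plan.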
8358 Security and privacy plans need not be single documents. The plans can be a collection of various
8359 documents, including documents that already exist. Effective security and privacy plans make
8360 extensive use of references to policies, procedures, and additional documents, including design
8361 and implementation specifications where more detailed information can be obtained. The use of
8362 references helps to reduce the documentation associated with security and privacy programs
8363 and maintains the security- and privacy-related information in other established management
8364 and operational areas, including enterprise architecture, system development life cycle, systems
8365 engineering, and acquisition. Security and privacy plans need not contain detailed contingency
8366 plan or incident response plan information but instead can provide, explicitly or by reference,
8367 sufficient information to define what needs to be accomplished by those plans.
8368 Security- and privacy-related activities that may require coordination and planning with other
8369 individuals or groups within the organization include: assessments, audits, and inspections;
8370 hardware and software maintenance; patch management; and contingency plan testing.
8371 Planning and coordination includes emergency and nonemergency (i.e., planned or non-urgent
8372 unplanned) situations. The process defined by organizations to plan and coordinate security- and
8373 privacy-related activities can also be included in other documents, as appropriate.
8374 Related Controls: AC-2, AC-6, AC-14, AC-17, AC-20, CA-2, CA-3, CA-7, CM-9, CM-13, CP-2, CP-4,
8375 IR-4, IR-8, MA-4, MA-5, MP-4, MP-5, PL-7, PL-8, PL-10, PL-11, PM-1, PM-7, PM-8, PM-9, PM-10,
8376 PM-11, RA-3, RA-8, RA-9, SA-5, SA-17, SA-22, SI-12, SR-2, SR-4.
8377 Control Enhancements:
8378 (1) SYSTEM SECURITY AND PRIVACY PLANS | CONCEPT OF OPERATIONS
8379 [Withdrawn: Incorporated into PL-7.]
8380 (2) SYSTEM SECURITY AND PRIVACY PLANS | FUNCTIONAL ARCHITECTURE
8381 [Withdrawn: Incorporated into PL-8.]
8382 (3) SYSTEM SECURITY AND PRIVACY PLANS | PLAN AND COORDINATE WITH OTHER ORGANIZATIONAL
8383 ENTITIES
8384 [Withdrawn: Incorporated into PL-2.]
8385 References: [OMB A-130, Appendix II]; [SP 800-18]; [SP 800-37]; [SP 800-160 v1]; [SP 800-160
8386 v2].
8398 d. Require individuals who have acknowledged a previous version of the rules of behavior to
8399 read and re-acknowledge [Selection (one or more): [Assignment: organization-defined
8400 frequency]; when the rules are revised or updated].
8401 Discussion: Rules of behavior represent a type of access agreement for organizational users.
8402 Other types of access agreements include nondisclosure agreements, conflict-of-interest
8403 agreements, and acceptable use agreements (see PS-6). Organizations consider rules of behavior
8404 based on individual user roles and responsibilities, differentiating, for example, between
8405 rules that apply to privileged users and rules that apply to general users. Establishing rules of
8406 behavior for some types of non-organizational users, including individuals who simply receive
8407 information from federal systems, is often not feasible given the large number of such users and
8408 the limited nature of their interactions with the systems. Rules of behavior for organizational and
8409 non-organizational users can also be established in AC-8. The related controls section provides a
8410 list of controls that are relevant to organizational rules of behavior. PL-4b, the documented
8411 acknowledgment portion of the control, may be satisfied by the awareness training and role-
8412 based training programs conducted by organizations if such training includes rules of behavior.
8413 Documented acknowledgments for rules of behavior include electronic or physical signatures
8414 and electronic agreement check boxes or radio buttons.
8415 Related Controls: AC-2, AC-6, AC-8, AC-9, AC-17, AC-18, AC-19, AC-20, AT-2, AT-3, CM-11, IA-2,
8416 IA-4, IA-5, MP-7, PS-6, PS-8, SA-5, SI-12.
8417 Control Enhancements:
8418 (1) RULES OF BEHAVIOR | SOCIAL MEDIA AND EXTERNAL SITE/APPLICATION USAGE RESTRICTIONS
8419 Include in the rules of behavior, restrictions on:
8420 (a) Use of social media, social networking sites, and external sites/applications;
8421 (b) Posting organizational information on public websites; and
8422 (c) Use of organization-provided credentials (e.g., email addresses) for creating accounts
8423 on external sites/applications.
8424 Discussion: Social media, social networking, and external site/application usage restrictions
8425 address rules of behavior related to the use of these sites when organizational personnel are
8426 using such sites for official duties or in the conduct of official business; when organizational
8427 information is involved in social media and networking transactions; and when personnel are
8428 accessing social media and networking sites from organizational systems. Organizations also
8429 address specific rules that prevent unauthorized entities from obtaining, either directly or
8430 through inference, non-public organizational information from social media and networking
8431 sites. Non-public information includes, for example, personally identifiable information and
8432 system account information.
8433 Related Controls: AC-22, AU-13.
8434 References: [OMB A-130]; [SP 800-18].
8482 analysis of alternatives through review of the proposed architecture in the RFP responses, to the
8483 design reviews before and during implementation (e.g., during preliminary design reviews and
8484 critical design reviews).
8485 In modern computing architectures, it is becoming less common for organizations to
8486 control all information resources. There may be key dependencies on external information
8487 services and service providers. Describing such dependencies in the security and privacy
8488 architectures is necessary for developing a comprehensive mission and business protection
8489 strategy. Establishing, developing, documenting, and maintaining, under configuration control, a
8490 baseline configuration for organizational systems is critical to implementing and maintaining
8491 effective architectures. The development of the architectures is coordinated with the senior
8492 agency information security officer and the senior agency official for privacy to ensure that
8493 controls needed to support security and privacy requirements are identified and effectively
8494 implemented.
8495 PL-8 is primarily directed at organizations to ensure that architectures are developed for the
8496 system, and moreover, that the architectures are integrated with or tightly coupled to the
8497 enterprise architecture. In contrast, SA-17 is primarily directed at the external information
8498 technology product and system developers and integrators. SA-17, which is complementary to
8499 PL-8, is selected when organizations outsource the development of systems or components to
8500 external entities, and when there is a need to demonstrate consistency with the organization’s
8501 enterprise architecture and security and privacy architectures.
8502 Related Controls: CM-2, CM-6, PL-2, PL-7, PL-9, PM-5, PM-7, RA-9, SA-3, SA-5, SA-8, SA-17.
8503 Control Enhancements:
8504 (1) SECURITY AND PRIVACY ARCHITECTURES | DEFENSE-IN-DEPTH
8505 Design the security and privacy architectures for the system using a defense-in-depth
8506 approach that:
8507 (a) Allocates [Assignment: organization-defined controls] to [Assignment: organization-
8508 defined locations and architectural layers]; and
8509 (b) Ensures that the allocated controls operate in a coordinated and mutually reinforcing
8510 manner.
8511 Discussion: Organizations strategically allocate security and privacy controls in the security
8512 and privacy architectures so that adversaries must overcome multiple controls to achieve
8513 their objective. Requiring adversaries to defeat multiple controls makes it more difficult to
8514 attack information resources by increasing the work factor of the adversary and increases
8515 the likelihood of detection. The coordination of allocated controls is essential to ensure that
8516 an attack that involves one control does not create adverse unintended consequences by
8517 interfering with other controls. Unintended consequences can include system lockout and
8518 cascading alarms. The placement of controls in systems and organizations is an important
8519 activity requiring thoughtful analysis. The value of organizational assets is an important
8520 consideration in providing additional layering. Defense-in-depth architectural approaches
8521 include modularity and layering (see SA-8(3)); separation of system and user functionality
8522 (see SC-2); and security function isolation (see SC-3).
8523 Related Controls: SC-2, SC-3, SC-29, SC-36.
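As a minimal, illustrative sketch (not an organization-defined allocation), the example below represents an allocation of controls to architectural layers and flags any asset whose access path traverses fewer than two layers that have allocated controls. The layer names, control identifiers, and assets are hypothetical.

```python
# Hypothetical defense-in-depth allocation check: map controls to architectural
# layers, then flag any asset whose access path crosses fewer than two layers
# with allocated controls.
from typing import Dict, List

allocation: Dict[str, List[str]] = {
    "network perimeter": ["SC-7", "SI-4"],
    "internal network": ["AC-4"],
    "host": ["SI-3", "CM-7"],
    "application": ["AC-3", "AU-2"],
}

# Access paths describe which layers must be traversed to reach an asset.
access_paths: Dict[str, List[str]] = {
    "database server": ["network perimeter", "internal network", "host"],
    "public web page": ["network perimeter"],
}

for asset, layers in access_paths.items():
    defended_layers = [layer for layer in layers if allocation.get(layer)]
    if len(defended_layers) < 2:
        print(f"WARNING: '{asset}' relies on fewer than two defended layers")
    else:
        print(f"'{asset}' is covered by controls at {len(defended_layers)} layers")
```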
8524 (2) SECURITY AND PRIVACY ARCHITECTURES | SUPPLIER DIVERSITY
8525 Require that [Assignment: organization-defined controls] allocated to [Assignment:
8526 organization-defined locations and architectural layers] are obtained from different
8527 suppliers.
8528 Discussion: Information technology products have different strengths and weaknesses.
8529 Providing a broad spectrum of products complements the individual offerings. For example,
8530 vendors offering malicious code protection typically update their products at different times,
8531 often developing solutions for known viruses, Trojans, or worms based on their priorities
8532 and development schedules. By deploying different products at different locations, there is
8533 an increased likelihood that at least one of the products will detect the malicious code. With
8534 respect to privacy, vendors may offer products that track personally identifiable information
8535 in systems. Products may use different tracking methods. Using multiple products may result
8536 in more assurance that personally identifiable information is inventoried.
8537 Related Controls: SC-29, SR-3.
8538 References: [OMB A-130]; [SP 800-160 v1]; [SP 800-160 v2].
8576 based on the requirements from [FISMA] and [PRIVACT]. The requirements, along with the NIST
8577 standards and guidelines implementing the legislation, direct organizations to select one of the
8578 control baselines after reviewing the information types and the information that is
8579 processed, stored, and transmitted on the system; analyzing the potential adverse impact of the
8580 loss or compromise of the information or system on the organization’s operations and assets,
8581 individuals, other organizations, or the Nation; and considering the results from system and
8582 organizational risk assessments.
8583 Related Controls: PL-2, PL-11, RA-2, RA-3, SA-8.
8584 Control Enhancements: None.
8585 References: [FIPS 199]; [FIPS 200]; [SP 800-30]; [SP 800-37]; [SP 800-39]; [SP 800-53B]; [SP 800-
8586 60 v1]; [SP 800-60 v2]; [SP 800-160 v1]; [CNSSI 1253].
8617
8638 Discussion: An information security program plan is a formal document that provides an
8639 overview of the security requirements for an organization-wide information security program
8640 and describes the program management controls and common controls in place or planned for
8641 meeting those requirements. Information security program plans can be represented in single
8642 documents or compilations of documents.
8643 Information security program plans document the program management and common controls.
8644 The plans provide sufficient information about the controls (including specification of parameters
8645 for assignment and selection statements explicitly or by reference) to enable implementations
8646 that are unambiguously compliant with the intent of the plans and a determination of the risk to
8647 be incurred if the plans are implemented as intended.
8648 Program management controls are generally implemented at the organization level and are
8649 essential for managing the organization’s information security program. Program management
8650 controls are distinct from common, system-specific, and hybrid controls because program
8651 management controls are independent of any particular information system. The individual
8652 system security plans and the organization-wide information security program plan together
8653 provide complete coverage for the security controls employed within the organization.
8654 Common controls are documented in an appendix to the organization’s information security
8655 program plan unless the controls are included in a separate security plan for a system. The
8656 organization-wide information security program plan indicates which separate security plans
8657 contain descriptions of common controls.
8658 Related Controls: PL-2, PM-8, PM-12, RA-9, SI-12, SR-2.
8659 Control Enhancements: None.
8660 References: [FISMA]; [OMB A-130].
8681 c. Make available for expenditure the planned information security and privacy resources.
8682 Discussion: Organizations consider establishing champions for information security and privacy
8683 and, as part of including the necessary resources, assign specialized expertise and resources as
8684 needed. Organizations may designate and empower an Investment Review Board or similar
8685 group to manage and provide oversight for the information security and privacy aspects of the
8686 capital planning and investment control process.
8687 Related Controls: PM-4, SA-2.
8688 Control Enhancements: None.
8689 References: [OMB A-130].
8805 senior accountable official for risk management, can facilitate consistent application of the risk
8806 management strategy organization-wide. The risk management strategy can be informed by
8807 security and privacy risk-related inputs from other sources, both internal and external to the
8808 organization, to ensure the strategy is broad-based and comprehensive.
8809 Related Controls: AC-1, AU-1, AT-1, CA-1, CA-2, CA-5, CA-6, CA-7, CM-1, CP-1, IA-1, IR-1, MA-1,
8810 MP-1, PE-1, PL-1, PL-2, PM-2, PM-8, PM-18, PM-28, PM-30, PS-1, PT-1, PT-2, PT-3, RA-1, RA-3,
8811 RA-9, SA-1, SA-4, SC-1, SC-38, SI-1, SI-12, SR-1, SR-2.
8812 Control Enhancements: None.
8813 References: [OMB A-130]; [SP 800-30]; [SP 800-39]; [SP 800-161]; [IR 8023].
8848 controls for the organization and the systems. Inherent in defining protection and personally
8849 identifiable information processing needs is an understanding of the adverse impact that could
8850 result if a compromise or breach of information occurs. The categorization process is used to
8851 make such potential impact determinations. Privacy risks to individuals can arise from the
8852 compromise of personally identifiable information, but they can also arise as unintended
8853 consequences or a byproduct of authorized processing of information at any stage of the data
8854 life cycle. Privacy risk assessments are used to prioritize the risks that are created for individuals
8855 from system processing of personally identifiable information. These risk assessments enable the
8856 selection of the required privacy controls for the organization and systems. Mission and business
8857 process definitions and the associated protection requirements are documented in accordance
8858 with organizational policy and procedures.
8859 Related Controls: CP-2, PL-2, PM-7, PM-8, RA-2, RA-3, SA-2.
8860 Control Enhancements: None.
8861 References: [OMB A-130]; [FIPS 199]; [SP 800-60 v1]; [SP 800-60 v2]; [SP 800-160 v1].
8936 b. To maintain currency with recommended security and privacy practices, techniques, and
8937 technologies; and
8938 c. To share current security and privacy information, including threats, vulnerabilities, and
8939 incidents.
8940 Discussion: Ongoing contact with security and privacy groups and associations is important in an
8941 environment of rapidly changing technologies and threats. Groups and associations include
8942 special interest groups, professional associations, forums, news groups, users’ groups, and peer
8943 groups of security and privacy professionals in similar organizations. Organizations select security
8944 and privacy groups and associations based on missions and business functions. Organizations
8945 share threat, vulnerability, and incident information as well as contextual insights, compliance
8946 techniques, and privacy problems consistent with applicable laws, executive orders, directives,
8947 policies, regulations, standards, and guidelines.
8948 Related Controls: SA-11, SI-5.
8949 Control Enhancements: None.
8950 References: [OMB A-130].
9019 The senior agency official for privacy is responsible for designating which privacy controls the
9020 organization will treat as program management, common, system-specific, and hybrid controls.
9021 Privacy program plans provide sufficient information about the privacy program management
9022 and common controls (including the specification of parameters and assignment and selection
9023 statements explicitly or by reference) to enable control implementations that are unambiguously
9024 compliant with the intent of the plans and a determination of the risk incurred if the plans are
9025 implemented as intended.
9026 Program management controls are generally implemented at the organization level and are
9027 essential for managing the organization’s privacy program. Program management controls are
9028 distinct from common, system-specific, and hybrid controls because program management
9029 controls are independent of any particular information system. The privacy plans for individual
9030 systems and the organization-wide privacy program plan together provide complete coverage
9031 for the privacy controls employed within the organization.
9032 Common controls are documented in an appendix to the organization’s privacy program plan
9033 unless the controls are included in a separate privacy plan for a system. The organization-wide
9034 privacy program plan indicates which separate privacy plans contain descriptions of privacy
9035 controls.
9036 Related Controls: PM-8, PM-9, PM-19.
9037 Control Enhancements: None.
9038 References: [PRIVACT]; [OMB A-130].
9062 including privacy impact assessments, system of records notices, computer matching notices and
9063 agreements, [PRIVACT] exemption and implementation rules, instructions for individuals making
9064 an access or amendment request, privacy reports, privacy policies, email addresses for
9065 questions/complaints, blogs, and periodic publications.
9066 Related Controls: PM-19, PT-6, PT-7, RA-8.
9067 Control Enhancements: None.
9068 References: [PRIVACT]; [OMB A-130]; [OMB M-17-06].
9150 information security officer, and senior agency official for privacy. Federal agencies are required
9151 to establish a Data Governance Body with specific roles and responsibilities in accordance with
9152 the [EVIDACT] and policies set forth under [OMB M-19-23].
9153 Related Controls: AT-2, AT-3, PM-19, PM-22, PM-24, PT-8, SI-4, SI-19.
9154 Control Enhancements: None.
9155 References: [EVIDACT]; [OMB A-130]; [OMB M-19-23]; [SP 800-188].
9272 c. Review and update the supply chain risk management strategy on [Assignment:
9273 organization-defined frequency] or as required, to address organizational changes.
9274 Discussion: An organization-wide supply chain risk management strategy includes an
9275 unambiguous expression of the supply chain risk tolerance for the organization, acceptable
9276 supply chain risk mitigation strategies or controls, a process for consistently evaluating and
9277 monitoring supply chain risk, approaches for implementing and communicating the supply chain
9278 risk management strategy, and the associated roles and responsibilities. Supply chain risk
9279 management includes considerations of both security and privacy risks associated with the
9280 development, acquisition, maintenance, and disposal of systems, system components, and
9281 system services. The supply chain risk management strategy can be incorporated into the
9282 organization’s overarching risk management strategy and can guide and inform the system-level
9283 supply chain risk management plan. The use of a risk executive function can facilitate a
9284 consistent, organization-wide application of the supply chain risk management strategy. The
9285 supply chain risk management strategy is implemented at the organizational level, whereas the
9286 supply chain risk management plan (see SR-2) is applied at the system level.
9287 Related Controls: PM-9, SR-1, SR-2, SR-3, SR-4, SR-5, SR-6, SR-7, SR-8, SR-9, SR-11.
9288 Control Enhancements: None.
9289 References: [SP 800-161].
9318 MA-3a, MA-4a, PE-3d, PE-6, PE-14b, PE-16, PE-20, PM-6, PM-23, PS-7e, SA-9c, SC-5(3)(b), SC-7a,
9319 SC-7(24)(b), SC-18c, SC-43b, SI-4.
9320 Related Controls: AC-2, AC-6, AC-17, AT-4, AU-6, AU-13, CA-2, CA-5, CA-6, CA-7, CM-3, CM-4,
9321 CM-6, CM-11, IA-5, IR-5, MA-2, MA-3, MA-4, PE-3, PE-6, PE-14, PE-16, PE-20, PL-2, PM-4, PM-6,
9322 PM-9, PM-10, PM-12, PM-14, PM-23, PM-28, PS-7, PT-8, RA-3, RA-5, RA-7, SA-9, SA-11, SC-5, SC-
9323 7, SC-18, SC-38, SC-43, SI-3, SI-4, SI-12, SR-2, SR-4.
9324 References: [SP 800-37]; [SP 800-137].
9398 Discussion: Position risk designations reflect Office of Personnel Management (OPM) policy and
9399 guidance. Proper position designation is the foundation of an effective and consistent suitability
9400 and personnel security program. The Position Designation System (PDS) assesses the duties and
9401 responsibilities of a position to determine the degree of potential damage to the efficiency or
9402 integrity of the service from misconduct of an incumbent of a position. This establishes the risk
9403 level of that position. This assessment also determines if a position’s duties and responsibilities
9404 present the potential for position incumbents to bring about a material adverse effect on the
9405 national security, and the degree of that potential effect, which establishes the sensitivity level of
9406 a position. The results of this assessment determine what level of investigation is conducted for a
9407 position. Risk designations can guide and inform the types of authorizations individuals receive
9408 when accessing organizational information and information systems. Position screening criteria
9409 include explicit information security role appointment requirements. Parts 1400 and 731 of Title
9410 5, Code of Federal Regulations establish the requirements for organizations to evaluate relevant
9411 covered positions for a position sensitivity and position risk designation commensurate with the
9412 duties and responsibilities of those positions.
9413 Related Controls: AC-5, AT-3, PE-2, PE-3, PL-2, PS-3, PS-6, SA-5, SA-21, SI-12.
9414 Control Enhancements: None.
9415 References: [5 CFR 731].
9443 Discussion: Types of classified information requiring formal indoctrination include Special
9444 Access Program (SAP), Restricted Data (RD), and Sensitive Compartmented Information (SCI).
9445 Related Controls: AC-3, AC-4.
9446 (3) PERSONNEL SCREENING | INFORMATION WITH SPECIAL PROTECTIVE MEASURES
9447 Verify that individuals accessing a system processing, storing, or transmitting information
9448 requiring special protection:
9449 (a) Have valid access authorizations that are demonstrated by assigned official
9450 government duties; and
9451 (b) Satisfy [Assignment: organization-defined additional personnel screening criteria].
9452 Discussion: Organizational information requiring special protection includes controlled
9453 unclassified information. Personnel security criteria include position sensitivity background
9454 screening requirements.
9455 Related Controls: None.
9456 (4) PERSONNEL SCREENING | CITIZENSHIP REQUIREMENTS
9457 Verify that individuals accessing a system processing, storing, or transmitting [Assignment:
9458 organization-defined information types] meet [Assignment: organization-defined
9459 citizenship requirements].
9460 Discussion: None.
9461 Related Controls: None.
9462 References: [EO 13526]; [EO 13587]; [FIPS 199]; [FIPS 201-2]; [SP 800-60 v1]; [SP 800-60 v2]; [SP
9463 800-73-4]; [SP 800-76-2]; [SP 800-78-4].
9692 Discussion: Automated mechanisms augment verification that only authorized processing is
9693 occurring.
9694 Related Controls: CA-6, CM-12, PM-5, PM-22, SC-16, SC-43, SI-10, SI-15, SI-19.
9695 References: [PRIVACT]; [OMB A-130, Appendix II].
9736 Discussion: Data tags support tracking of processing purposes by conveying the purposes
9737 along with the relevant elements of personally identifiable information throughout the
9738 system. By conveying the processing purposes in a data tag along with the personally
9739 identifiable information as the information transits a system, a system owner or operator
9740 can identify whether a change in processing would be compatible with the identified and
9741 documented purposes. Data tags may also support the use of automated tools.
9742 Related Controls: CA-6, CM-12, PM-5, PM-22, SC-16, SC-43, SI-10, SI-15, SI-19.
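As an illustrative sketch only, the example below shows one hypothetical representation of a data tag that conveys authorized processing purposes along with a personally identifiable information element, and a check of whether a proposed processing action is compatible with those purposes. The class, field names, and purpose strings are assumptions for this example.

```python
from dataclasses import dataclass
from typing import FrozenSet

@dataclass(frozen=True)
class PiiDataTag:
    """Hypothetical tag conveyed with a personally identifiable information
    element as it transits a system; the fields shown are illustrative."""
    element_name: str
    authorized_purposes: FrozenSet[str]

def is_compatible(tag: PiiDataTag, proposed_purpose: str) -> bool:
    """Return True only if the proposed processing matches an identified,
    documented purpose carried in the data tag."""
    return proposed_purpose in tag.authorized_purposes

tag = PiiDataTag(
    element_name="home address",
    authorized_purposes=frozenset({"benefits eligibility determination"}),
)

# A change in processing is checked against the purposes conveyed in the tag.
print(is_compatible(tag, "benefits eligibility determination"))  # True
print(is_compatible(tag, "marketing outreach"))                  # False
```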
9743 (2) PERSONALLY IDENTIFIABLE INFORMATION PROCESSING PURPOSES | AUTOMATION
9744 Track processing purposes of personally identifiable information using [Assignment:
9745 organization-defined automated mechanisms].
9746 Discussion: Automated mechanisms augment tracking of the processing purposes.
9747 Related Controls: CA-6, CM-12, PM-5, PM-22, SC-16, SC-43, SI-10, SI-15, SI-19.
9748 References: [PRIVACT]; [OMB A-130, Appendix II].
9824 privacy notices, as well as elements to include in privacy notices and required formats. In
9825 circumstances where laws or government-wide policies do not require privacy notices,
9826 organizational policies and determinations may require privacy notices and may serve as a source
9827 of the elements to include in privacy notices.
9828 Privacy risk assessments identify the privacy risks associated with the processing of personally
9829 identifiable information and may help organizations determine appropriate elements to include
9830 in a privacy notice to manage such risks. To help individuals understand how their information is
9831 being processed, organizations write materials in plain language and avoid technical jargon.
9832 Related Controls: PM-20, PM-22, PT-2, PT-3, PT-5, PT-8, RA-3, SI-18.
9833 Control Enhancements:
9834 (1) PRIVACY NOTICE | JUST-IN-TIME NOTICE
9835 Present notice of personally identifiable information processing to individuals at a time
9836 and location where the individual provides personally identifiable information or in
9837 conjunction with a data action, or [Assignment: organization-defined frequency].
9838 Discussion: Just-in-time notice enables individuals to be informed of how organizations
9839 process their personally identifiable information at a time when such notice may be most
9840 useful to the individual. Individuals’ assumptions about how personally identifiable information
9841 will be processed might not be accurate or reliable if time has passed since the organization
9842 last presented notice or the circumstances under which the individual was last provided
9843 notice have changed. Just-in-time notice can explain data actions that organizations have
9844 identified as potentially giving rise to greater privacy risk for individuals. Organizations can
9845 use just-in-time notice to update or remind individuals about specific data actions as they
9846 occur or highlight specific changes that occurred since last presenting notice. Just-in-time
9847 notice can be used in conjunction with just-in-time consent to explain what will occur if
9848 consent is declined. Organizations use discretion to determine when to use just-in-time
9849 notice and may use supporting information on user demographics, focus groups, or surveys
9850 to learn about users’ privacy interests and concerns.
9851 Related Controls: PM-21.
9852 (2) PRIVACY NOTICE | PRIVACY ACT STATEMENTS
9853 Include Privacy Act statements on forms that collect information that will be maintained in
9854 a Privacy Act system of records, or provide Privacy Act statements on separate forms that
9855 can be retained by individuals.
9856 Discussion: If a federal agency asks individuals to supply information that will become part
9857 of a system of records, the agency is required to provide a [PRIVACT] statement on the form
9858 used to collect the information or on a separate form that can be retained by the individual.
9859 The agency provides a [PRIVACT] statement in such circumstances regardless of whether the
9860 information will be collected on a paper or electronic form, on a website, on a mobile
9861 application, over the telephone, or through some other medium. This requirement ensures
9862 that the individual is provided with sufficient information about the request for information
9863 to make an informed decision on whether or not to respond.
9864 [PRIVACT] statements provide formal notice to individuals of the authority that authorizes
9865 the solicitation of the information; whether providing the information is mandatory or
9866 voluntary; the principal purpose(s) for which the information is to be used; the published
9867 routine uses to which the information is subject; the effects on the individual, if any, of not
9868 providing all or any part of the information requested; and an appropriate citation and link
9869 to the relevant system of records notice. Federal agency personnel consult with the senior
9870 agency official for privacy and legal counsel regarding the notice provisions of the [PRIVACT].
9871 Related Controls: PT-7.
9917 regulations include the specific name(s) of any system(s) of records that will be exempt, the
9918 specific provisions of the [PRIVACT] from which the system(s) of records is to be exempted,
9919 the reasons for the exemption, and an explanation for why the exemption is both necessary
9920 and appropriate.
9921 Related Controls: None.
9922 References: [PRIVACT]; [OMB A-108].
10022 c. Verify that the authorizing official or authorizing official designated representative reviews
10023 and approves the security categorization decision.
10024 Discussion: Clearly defined system boundaries are a prerequisite for security categorization
10025 decisions. Security categories describe the potential adverse impacts or negative consequences
10026 to organizational operations, organizational assets, and individuals if organizational information
10027 and systems are compromised through a loss of confidentiality, integrity, or availability. Security
10028 categorization is also a type of asset loss characterization in systems security engineering
10029 processes carried out throughout the system development life cycle. Organizations can use
10030 privacy risk assessments or privacy impact assessments to better understand the potential
10031 adverse effects on individuals.
10032 Organizations conduct the security categorization process as an organization-wide activity with
10033 the direct involvement of chief information officers, senior agency information security officers,
10034 senior agency officials for privacy, system owners, mission and business owners, and information
10035 owners or stewards. Organizations consider the potential adverse impacts to other organizations
10036 and, in accordance with [USA PATRIOT] and Homeland Security Presidential Directives, potential
10037 national-level adverse impacts.
10038 Security categorization processes facilitate the development of inventories of information assets,
10039 and, along with CM-8, mappings to specific system components where information is processed,
10040 stored, or transmitted. The security categorization process is revisited throughout the system
10041 development life cycle to ensure the security categories remain accurate and relevant.
10042 Related Controls: CM-8, MP-4, PL-2, PL-10, PL-11, PM-7, RA-3, RA-5, RA-7, RA-8, SA-8, SC-7, SC-
10043 38, SI-12.
10044 Control Enhancements:
10045 (1) SECURITY CATEGORIZATION | IMPACT-LEVEL PRIORITIZATION
10046 Conduct an impact-level prioritization of organizational systems to obtain additional
10047 granularity on system impact levels.
10048 Discussion: Organizations apply the “high water mark” concept to each system categorized
10049 in accordance with [FIPS 199] resulting in systems designated as low impact, moderate
10050 impact, or high impact. Organizations desiring additional granularity in the system impact
10051 designations for risk-based decision making can further partition the systems into sub-
10052 categories of the initial system categorization. For example, an impact-level prioritization on
10053 a moderate-impact system can produce three new sub-categories: low-moderate systems,
10054 moderate-moderate systems, and high-moderate systems. Impact-level prioritization and
10055 the resulting sub-categories of the system give organizations an opportunity to focus their
10056 investments related to security control selection and the tailoring of control baselines in
10057 responding to identified risks. Impact-level prioritization can also be used to determine
10058 those systems that may be of heightened interest or value to adversaries or represent a
10059 critical loss to the federal enterprise, sometimes described as high value assets. For such
10060 high value assets, organizations may be more focused on complexity, aggregation, and
10061 interconnections. Systems with high value assets can be prioritized by partitioning high-
10062 impact systems into low-high systems, moderate-high systems, and high-high systems.
10063 Related Controls: None.
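As an illustrative sketch of the concepts described above, the example below applies the [FIPS 199] high water mark to the three impact values and forms a sub-category label such as "high-moderate." The function names and example values are this sketch's own assumptions.

```python
# Illustrative sketch of the "high water mark" concept and impact-level
# sub-categorization; names and values are examples, not prescribed by NIST.
LEVELS = {"low": 1, "moderate": 2, "high": 3}

def high_water_mark(confidentiality: str, integrity: str, availability: str) -> str:
    """Overall system impact level is the highest of the three impact values."""
    return max((confidentiality, integrity, availability), key=lambda lvl: LEVELS[lvl])

def sub_category(intra_level_rank: str, system_impact: str) -> str:
    """Combine an organization-assigned rank within the impact level with the
    overall impact level, e.g., 'low-moderate' or 'high-high'."""
    return f"{intra_level_rank}-{system_impact}"

impact = high_water_mark("low", "moderate", "low")   # -> "moderate"
print(impact)
print(sub_category("high", impact))                  # -> "high-moderate"
```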
10064 References: [FIPS 199]; [FIPS 200]; [SP 800-30]; [SP 800-37]; [SP 800-39]; [SP 800-60 v1]; [SP 800-
10065 60 v2]; [SP 800-160 v1].
10158 References: [OMB A-130]; [SP 800-30]; [SP 800-39]; [SP 800-161]; [IR 8023]; [IR 8062].
10202 Vulnerability Assessment Language (OVAL) to determine the presence of vulnerabilities. Sources
10203 for vulnerability information include the Common Weakness Enumeration (CWE) listing and the
10204 National Vulnerability Database (NVD). Control assessments such as red team exercises provide
10205 additional sources of potential vulnerabilities for which to scan. Organizations also consider using
10206 scanning tools that express vulnerability impact using the Common Vulnerability Scoring System
10207 (CVSS).
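Purely for illustration, the sketch below sorts scan findings by a reported CVSS base score and applies an assumed organization-defined remediation threshold. The identifiers, scores, and threshold are fabricated for the example and do not represent real vulnerabilities.

```python
# Hypothetical prioritization of scan findings by CVSS base score.
findings = [
    {"id": "CVE-0000-0001", "cvss_base_score": 9.8},
    {"id": "CVE-0000-0002", "cvss_base_score": 4.3},
    {"id": "CVE-0000-0003", "cvss_base_score": 7.5},
]

THRESHOLD = 7.0  # assumed organization-defined remediation threshold

priority = sorted(
    (f for f in findings if f["cvss_base_score"] >= THRESHOLD),
    key=lambda f: f["cvss_base_score"],
    reverse=True,
)
for finding in priority:
    print(finding["id"], finding["cvss_base_score"])
```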
10208 Vulnerability monitoring also includes a channel and process for receiving reports of security
10209 vulnerabilities from the public at large. Vulnerability disclosure programs can be as simple as
10210 publishing a monitored email address or web form that can receive reports, including notification
10211 authorizing good-faith research and disclosure of security vulnerabilities. Organizations generally
10212 expect that such research is happening with or without their authorization, and can use public
10213 vulnerability disclosure channels to increase the likelihood that discovered vulnerabilities are
10214 reported directly to the organization for remediation.
10215 Organizations may also employ the use of financial incentives (also known as “bug bounties”) to
10216 further encourage external security researchers to report discovered vulnerabilities. Bug bounty
10217 programs can be tailored to the organization’s needs. Bounties can be operated indefinitely or
10218 over a defined period of time, and can be offered to the general public or to a curated group.
10219 Organizations may run public and private bounties simultaneously, and could choose to offer
10220 partially credentialed access to certain participants in order to evaluate security vulnerabilities
10221 from privileged vantage points.
10222 Related Controls: CA-2, CA-7, CM-2, CM-4, CM-6, CM-8, RA-2, RA-3, SA-11, SA-15, SC-38, SI-2, SI-
10223 3, SI-4, SI-7, SR-11.
10224 Control Enhancements:
10225 (1) VULNERABILITY SCANNING | UPDATE TOOL CAPABILITY
10226 [Withdrawn: Incorporated into RA-5.]
10227 (2) VULNERABILITY MONITORING AND SCANNING | UPDATE SYSTEM VULNERABILITIES
10228 Update the system vulnerabilities to be scanned [Selection (one or more): [Assignment:
10229 organization-defined frequency]; prior to a new scan; when new vulnerabilities are
10230 identified and reported].
10231 Discussion: Due to the complexity of modern software and systems and other factors, new
10232 vulnerabilities are discovered on a regular basis. It is important that newly discovered
10233 vulnerabilities are added to the list of vulnerabilities to be scanned to ensure that the
10234 organization can take steps to mitigate those vulnerabilities in a timely manner.
10235 Related Controls: SI-5.
10236 (3) VULNERABILITY MONITORING AND SCANNING | BREADTH AND DEPTH OF COVERAGE
10237 Define the breadth and depth of vulnerability scanning coverage.
10238 Discussion: The breadth of vulnerability scanning coverage can be expressed, for example,
10239 as a percentage of components within the system, by the particular types of systems, by the
10240 criticality of systems, or by the number of vulnerabilities to be checked. In contrast, the
10241 depth of vulnerability scanning coverage can be expressed as the level of the system design
10242 the organization intends to monitor (e.g., component, module, subsystem). Organizations
10243 can determine the sufficiency of vulnerability scanning coverage with regard to their risk
10244 tolerance and other factors. [SP 800-53A] provides additional information on the breadth
10245 and depth of coverage.
10246 Related Controls: None.
10292 Discussion: An attack vector is a path or means by which an adversary can gain access to a
10293 system in order to deliver malicious code or exfiltrate information. Organizations can use
10294 attack trees to show how hostile activities by adversaries interact and combine to produce
10295 adverse impacts or negative consequences to systems and organizations. Such information,
10296 together with correlated data from vulnerability scanning tools, can provide greater clarity
10297 regarding multi-vulnerability and multi-hop attack vectors. The correlation of vulnerability
10298 scanning information is especially important when organizations are transitioning from older
10299 technologies to newer technologies (e.g., transitioning from IPv4 to IPv6 network protocols).
10300 During such transitions, some system components may inadvertently be unmanaged and
10301 create opportunities for adversary exploitation.
10302 Related Controls: None.
10303 (11) VULNERABILITY MONITORING AND SCANNING | PUBLIC DISCLOSURE PROGRAM
10304 Establish an [Assignment: organization-defined public reporting channel] for receiving
10305 reports of vulnerabilities in organizational systems and system components.
10306 Discussion: The reporting channel is publicly discoverable and contains clear language
10307 authorizing good-faith research and disclosure of vulnerabilities to the organization. The
10308 organization does not condition its authorization on an expectation of indefinite non-
10309 disclosure to the public by the reporting entity, but may request a specific time period to
10310 properly remediate the vulnerability.
10311 Related Controls: None.
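One way to make such a reporting channel publicly discoverable is a security.txt file (RFC 9116) served from a well-known location. The sketch below, using placeholder contact and policy values, is one possible way to generate such a file; it is illustrative only and not required by this enhancement.

# Minimal sketch of generating a security.txt file (RFC 9116).
# The contact address, policy URL, and expiry period are placeholders.

from datetime import datetime, timedelta, timezone

expires = (datetime.now(timezone.utc) + timedelta(days=365)).strftime("%Y-%m-%dT%H:%M:%SZ")

security_txt = "\n".join([
    "Contact: mailto:security@example.gov",                          # monitored reporting address
    "Policy: https://example.gov/vulnerability-disclosure-policy",   # authorization for good-faith research
    f"Expires: {expires}",                                           # required by RFC 9116
])

with open("security.txt", "w") as f:
    f.write(security_txt + "\n")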
10312 References: [SP 800-40]; [SP 800-53A]; [SP 800-70]; [SP 800-115]; [SP 800-126]; [IR 7788]; [IR
10313 8023].
10336 determine an appropriate response to risk before generating a plan of action and milestones
10337 entry. For example, the response may be to accept risk or reject risk, or it may be possible to
10338 mitigate the risk immediately so a plan of action and milestones entry is not needed. However, if
10339 the risk response is to mitigate the risk and the mitigation cannot be completed immediately, a
10340 plan of action and milestones entry is generated.
10341 Related Controls: CA-5, IR-9, PM-4, PM-28, RA-2, RA-3, SR-2.
10342 Control Enhancements: None.
10343 References: [FIPS 199]; [FIPS 200]; [SP 800-30]; [SP 800-37]; [SP 800-39]; [SP 800-160 v1].
10426 and infrastructure for advanced threats. The objective is to track and disrupt cyber adversaries as
10427 early as possible in the attack sequence and to measurably improve the speed and accuracy of
10428 organizational responses. Indications of compromise include unusual network traffic, unusual file
10429 changes, and the presence of malicious code. Threat hunting teams leverage existing threat
10430 intelligence and may create new threat intelligence, which is shared with peer organizations,
10431 Information Sharing and Analysis Organizations (ISAO), Information Sharing and Analysis Centers
10432 (ISAC), and relevant government departments and agencies.
10433 Related Controls: RA-3, RA-5, RA-6.
10434 Control Enhancements: None.
10435 References: [SP 800-30].
10476 b. Determine, document, and allocate the resources required to protect the system or system
10477 service as part of the organizational capital planning and investment control process; and
10478 c. Establish a discrete line item for information security and privacy in organizational
10479 programming and budgeting documentation.
10480 Discussion: Resource allocation for information security and privacy includes funding for system
10481 and services acquisition, sustainment, and supply chain concerns throughout the system
10482 development life cycle.
10483 Related Controls: PL-7, PM-3, PM-11, SA-9, SR-3, SR-5.
10484 Control Enhancements: None.
10485 References: [OMB A-130]; [SP 800-160 v1].
10658 (b) Use the configurations as the default for any subsequent system, component, or
10659 service reinstallation or upgrade.
10660 Discussion: Examples of security configurations include the U.S. Government Configuration
10661 Baseline (USGCB), Security Technical Implementation Guides (STIGs), and any limitations on
10662 functions, ports, protocols, and services. Security characteristics can include requiring that
10663 default passwords have been changed.
10664 Related Controls: None.
10665 (6) ACQUISITION PROCESS | USE OF INFORMATION ASSURANCE PRODUCTS
10666 (a) Employ only government off-the-shelf or commercial off-the-shelf information
10667 assurance and information assurance-enabled information technology products that
10668 compose an NSA-approved solution to protect classified information when the
10669 networks used to transmit the information are at a lower classification level than the
10670 information being transmitted; and
10671 (b) Ensure that these products have been evaluated and/or validated by NSA or in
10672 accordance with NSA-approved procedures.
10673 Discussion: Commercial off-the-shelf IA or IA-enabled information technology products used
10674 to protect classified information by cryptographic means may be required to use NSA-
10675 approved key management. See [NSA CSFC].
10676 Related Controls: SC-8, SC-12, SC-13.
10677 (7) ACQUISITION PROCESS | NIAP-APPROVED PROTECTION PROFILES
10678 (a) Limit the use of commercially provided information assurance and information
10679 assurance-enabled information technology products to those products that have been
10680 successfully evaluated against a National Information Assurance Partnership (NIAP)-
10681 approved Protection Profile for a specific technology type, if such a profile exists; and
10682 (b) Require, if no NIAP-approved Protection Profile exists for a specific technology type
10683 but a commercially provided information technology product relies on cryptographic
10684 functionality to enforce its security policy, that the cryptographic module is FIPS-
10685 validated or NSA-approved.
10686 Discussion: See [NIAP CCEVS] for additional information on NIAP. See [NIST CMVP] for
10687 additional information on FIPS-validated cryptographic modules.
10688 Related Controls: IA-7, SC-12, SC-13.
10689 (8) ACQUISITION PROCESS | CONTINUOUS MONITORING PLAN FOR CONTROLS
10690 Require the developer of the system, system component, or system service to produce a
10691 plan for continuous monitoring of control effectiveness that contains the following level of
10692 detail: [Assignment: organization-defined level of detail].
10693 Discussion: The objective of continuous monitoring plans is to determine if the planned,
10694 required, and deployed controls within the system, system component, or system service
10695 continue to be effective over time based on the inevitable changes that occur. Developer
10696 continuous monitoring plans include a sufficient level of detail such that the information can
10697 be incorporated into continuous monitoring strategies and programs implemented by
10698 organizations. Continuous monitoring plans can include the frequency of control monitoring,
10699 types of control assessment and monitoring activities planned, and actions to be taken when
10700 controls fail or become ineffective.
10701 Related Controls: CA-7.
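The level of detail in a developer continuous monitoring plan can also be captured in a structured, machine-readable form; the sketch below shows one hypothetical layout covering monitoring frequency, assessment types, and failure actions, none of which are mandated by this enhancement.

# Hypothetical structure for a developer continuous monitoring plan.
# Control identifiers, frequencies, and actions are illustrative only.

monitoring_plan = {
    "AC-2": {
        "frequency": "monthly",
        "assessment_types": ["automated account review", "log analysis"],
        "on_failure": "notify the ISSO and open a remediation ticket",
    },
    "SI-4": {
        "frequency": "continuous",
        "assessment_types": ["intrusion detection alert review"],
        "on_failure": "escalate to the incident response team",
    },
}

for control, plan in monitoring_plan.items():
    print(f"{control}: monitored {plan['frequency']}; on failure: {plan['on_failure']}")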
10702 (9) ACQUISITION PROCESS | FUNCTIONS, PORTS, PROTOCOLS, AND SERVICES IN USE
10703 Require the developer of the system, system component, or system service to identify the
10704 functions, ports, protocols, and services intended for organizational use.
10705 Discussion: The identification of functions, ports, protocols, and services early in the system
10706 development life cycle, for example, during the initial requirements definition and design
10707 stages, allows organizations to influence the design of the system, system component, or
10708 system service. This early involvement in the system life cycle helps organizations to avoid or
10709 minimize the use of functions, ports, protocols, or services that pose unnecessarily high risks
10710 and understand the trade-offs involved in blocking specific ports, protocols, or services or
10711 when requiring system service providers to do so. Early identification of functions, ports,
10712 protocols, and services avoids costly retrofitting of controls after the system, component, or
10713 system service has been implemented. SA-9 describes the requirements for external system
10714 services. Organizations identify which functions, ports, protocols, and services are provided
10715 from external sources.
10716 Related Controls: CM-7, SA-9.
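Developer-identified functions, ports, protocols, and services can be recorded in a simple manifest and compared against organization-defined restrictions; the sketch below uses hypothetical entries and a hypothetical deny list, for illustration only.

# Hypothetical manifest of ports, protocols, and services intended for use,
# checked against an organization-defined deny list.

declared_services = [
    {"service": "https",  "port": 443, "protocol": "tcp"},
    {"service": "ssh",    "port": 22,  "protocol": "tcp"},
    {"service": "telnet", "port": 23,  "protocol": "tcp"},  # flagged by the check below
]

denied_ports = {21, 23, 69}  # e.g., ftp, telnet, tftp (illustrative)

for entry in declared_services:
    status = "DENY" if entry["port"] in denied_ports else "allow"
    print(f"{entry['service']:<7} {entry['protocol']}/{entry['port']}: {status}")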
10717 (10) ACQUISITION PROCESS | USE OF APPROVED PIV PRODUCTS
10718 Employ only information technology products on the FIPS 201-approved products list for
10719 Personal Identity Verification (PIV) capability implemented within organizational systems.
10720 Discussion: Products on the FIPS 201-approved products list meet NIST requirements for
10721 Personal Identity Verification (PIV) of Federal Employees and Contractors. PIV cards are used
10722 for multifactor authentication in systems and organizations.
10723 Related Controls: IA-2, IA-8, PM-9.
10724 (11) ACQUISITION PROCESS | SYSTEM OF RECORDS
10725 Include [Assignment: organization-defined Privacy Act requirements] in the acquisition
10726 contract for the operation of a system of records on behalf of an organization to
10727 accomplish an organizational mission or function.
10728 Discussion: When an organization provides, by contract, for the operation of a system of
10729 records to accomplish an organizational mission or function, the organization, consistent
10730 with its authority, causes the requirements of the [PRIVACT] to be applied to the system of
10731 records.
10732 Related Controls: PT-7.
10733 (12) ACQUISITION PROCESS | DATA OWNERSHIP
10734 (a) Include organizational data ownership requirements in the acquisition contract; and
10735 (b) Require all data to be removed from the contractor’s system and returned to the
10736 organization within [Assignment: organization-defined timeframe].
10737 Discussion: Contractors operating a system that contains data owned by an organization
10738 initiating the contract have policies and procedures in place to remove the data from their
10739 systems and/or return the data in a timeframe defined by the contract.
10740 Related Controls: None.
10741 References: [PRIVACT]; [OMB A-130]; [ISO 15408-1]; [ISO 15408-2]; [ISO 15408-3]; [FIPS 140-3];
10742 [FIPS 201-2]; [SP 800-35]; [SP 800-37]; [SP 800-70]; [SP 800-73-4]; [SP 800-137]; [SP 800-160 v1];
10743 [SP 800-161]; [IR 7539]; [IR 7622]; [IR 7676]; [IR 7870]; [IR 8062]; [NIAP CCEVS]; [NSA CSFC].
10837 different functionality, depending on how it is used). Information hiding, also known as
10838 representation-independent programming, is a design discipline to ensure that the internal
10839 representation of information in one system component is not visible to another system
10840 component invoking or calling the first component, such that the published abstraction is
10841 not influenced by how the data may be managed internally.
10842 Related Controls: None.
10843 (2) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | LEAST COMMON MECHANISM
10844 Implement the security design principle of least common mechanism in [Assignment:
10845 organization-defined systems or system components].
10846 Discussion: The principle of least common mechanism states that the amount of mechanism
10847 common to more than one user and depended on by all users is minimized [POPEK74].
10848 Minimization of mechanism implies that different components of a system refrain from
10849 using the same mechanism to access a system resource. Every shared mechanism (especially
10850 a mechanism involving shared variables) represents a potential information path between
10851 users and is designed with great care to be sure it does not unintentionally compromise
10852 security [SALTZER75]. Implementing the principle of least common mechanism helps to
10853 reduce the adverse consequences of sharing system state among different programs. A
10854 single program corrupting a shared state (including shared variables) has the potential to
10855 corrupt other programs that are dependent on the state. The principle of least common
10856 mechanism also supports the principle of simplicity of design and addresses the issue of
10857 covert storage channels [LAMPSON73].
10858 Related Controls: None.
10859 (3) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | MODULARITY AND LAYERING
10860 Implement the security design principles of modularity and layering in [Assignment:
10861 organization-defined systems or system components].
10862 Discussion: The principles of modularity and layering are fundamental across system
10863 engineering disciplines. Modularity and layering derived from functional decomposition are
10864 effective in managing system complexity, by making it possible to comprehend the structure
10865 of the system. Modular decomposition, or refinement in system design, is challenging and
10866 resists general statements of principle. Modularity serves to isolate functions and related
10867 data structures into well-defined logical units. Layering allows the relationships of these
10868 units to be better understood, so that dependencies are clear and undesired complexity can
10869 be avoided. The security design principle of modularity extends functional modularity to
10870 include considerations based on trust, trustworthiness, privilege, and security policy.
10871 Security-informed modular decomposition includes the following: allocation of policies to
10872 systems in a network; separation of system applications into processes with distinct address
10873 spaces; allocation of system policies to layers; and separation of processes into subjects with
10874 distinct privileges based on hardware-supported privilege domains.
10875 Related Controls: SC-2, SC-3.
10876 (4) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | PARTIALLY ORDERED DEPENDENCIES
10877 Implement the security design principle of partially ordered dependencies in [Assignment:
10878 organization-defined systems or system components].
10879 Discussion: The principle of partially ordered dependencies states that the synchronization,
10880 calling, and other dependencies in the system are partially ordered. A fundamental concept
10881 in system design is layering, whereby the system is organized into well-defined, functionally
10882 related modules or components. The layers are linearly ordered with respect to inter-layer
10883 dependencies, such that higher layers are dependent on lower layers. While providing
10884 functionality to higher layers, some layers can be self-contained and not dependent upon
10885 lower layers. While a partial ordering of all functions in a given system may not be possible,
10886 if circular dependencies are constrained to occur within layers, the inherent problems of
10887 circularity can be more easily managed. Partially ordered dependencies and system layering
10888 contribute significantly to the simplicity and the coherency of the system design. Partially
10889 ordered dependencies also facilitate system testing and analysis.
10890 Related Controls: None.
10891 (5) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | EFFICIENTLY MEDIATED ACCESS
10892 Implement the security design principle of efficiently mediated access in [Assignment:
10893 organization-defined systems or system components].
10894 Discussion: The principle of efficiently mediated access states that policy-enforcement
10895 mechanisms utilize the least common mechanism available while satisfying stakeholder
10896 requirements within expressed constraints. The mediation of access to system resources
10897 (i.e., CPU, memory, devices, communication ports, services, infrastructure, data and
10898 information) is often the predominant security function of secure systems. It also enables
10899 the realization of protections for the capability provided to stakeholders by the system.
10900 Mediation of resource access can result in performance bottlenecks if the system is not
10901 designed correctly. For example, by using hardware mechanisms, efficiently mediated access
10902 can be achieved. Once access to a low-level resource such as memory has been obtained,
10903 hardware protection mechanisms can ensure that out-of-bounds access does not occur.
10904 Related Controls: None.
10905 (6) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | MINIMIZED SHARING
10906 Implement the security design principle of minimized sharing in [Assignment:
10907 organization-defined systems or system components].
10908 Discussion: The principle of minimized sharing states that no computer resource is shared
10909 between system components (e.g., subjects, processes, functions) unless it is absolutely
10910 necessary to do so. Minimized sharing helps to simplify system design and implementation.
10911 In order to protect user-domain resources from arbitrary active entities, no resource is
10912 shared unless that sharing has been explicitly requested and granted. The need for resource
10913 sharing can be motivated by the design principle of least common mechanism in the case of
10914 internal entities, or driven by stakeholder requirements. However, internal sharing is
10915 carefully designed to avoid performance and covert storage- and timing-channel problems.
10916 Sharing via common mechanism can increase the susceptibility of data and information to
10917 unauthorized access, disclosure, use, or modification and can adversely affect the inherent
10918 capability provided by the system. To minimize sharing induced by common mechanisms,
10919 such mechanisms can be designed to be reentrant or virtualized to preserve separation.
10920 Moreover, use of global data to share information is carefully scrutinized. The lack of
10921 encapsulation may obfuscate relationships among the sharing entities.
10922 Related Controls: SC-31.
10923 (7) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | REDUCED COMPLEXITY
10924 Implement the security design principle of reduced complexity in [Assignment:
10925 organization-defined systems or system components].
10926 Discussion: The principle of reduced complexity states that the system design is as simple
10927 and small as possible. A small and simple design is more understandable, more analyzable,
10928 and less prone to error. The reduced complexity principle applies to any aspect of a system,
10929 but it has particular importance for security due to the various analyses performed to obtain
10930 evidence about the emergent security property of the system. For such analyses to be
10931 successful, a small and simple design is essential. Application of the principle of reduced
10932 complexity contributes to the ability of system developers to understand the correctness
10933 and completeness of system security functions. It also facilitates identification of potential
10934 vulnerabilities. The corollary of reduced complexity states that the simplicity of the system is
10935 directly related to the number of vulnerabilities it will contain—that is, simpler systems
10936 contain fewer vulnerabilities. An important benefit of reduced complexity is that it is easier
10937 to understand whether the intended security policy has been captured in the system design,
10938 and that fewer vulnerabilities are likely to be introduced during engineering development.
10939 An additional benefit is that any such conclusion about correctness, completeness, and
10940 existence of vulnerabilities can be reached with a higher degree of assurance in contrast to
10941 conclusions reached in situations where the system design is inherently more complex.
10942 Transitioning from older technologies to newer technologies (e.g., transitioning from IPv4 to
10943 IPv6) may require implementing the older and newer technologies simultaneously during the
10944 transition period. This may result in a temporary increase in system complexity during the
10945 transition.
10946 Related Controls: None.
10947 (8) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | SECURE EVOLVABILITY
10948 Implement the security design principle of secure evolvability in [Assignment:
10949 organization-defined systems or system components].
10950 Discussion: The principle of secure evolvability states that a system is developed to facilitate
10951 the maintenance of its security properties when there are changes to the system’s structure,
10952 interfaces, interconnections (i.e., system architecture), functionality, or its configuration (i.e.,
10953 security policy enforcement). Changes include a new, an enhanced, or an upgraded system
10954 capability; maintenance and sustainment activities; and reconfiguration. Although it is not
10955 possible to plan for every aspect of system evolution, system upgrades and changes can be
10956 anticipated by analyses of mission or business strategic direction; anticipated changes in the
10957 threat environment; and anticipated maintenance and sustainment needs. It is unrealistic to
10958 expect that complex systems remain secure in contexts not envisioned during development,
10959 whether such contexts are related to the operational environment or to usage. A system
10960 may be secure in some new contexts, but there is no guarantee that its emergent behavior
10961 will always be secure. It is easier to build trustworthiness into a system from the outset, and
10962 it follows that the sustainment of system trustworthiness requires planning for change as
10963 opposed to adapting in an ad hoc or non-methodical manner. The benefits of this principle
10964 include reduced vendor life-cycle costs; reduced cost of ownership; improved system
10965 security; more effective management of security risk; and less risk uncertainty.
10966 Related Controls: CM-3.
10967 (9) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | TRUSTED COMPONENTS
10968 Implement the security design principle of trusted components in [Assignment:
10969 organization-defined systems or system components].
10970 Discussion: The principle of trusted components states that a component is trustworthy to
10971 at least a level commensurate with the security dependencies it supports (i.e., how much it
10972 is trusted to perform its security functions by other components). This principle enables the
10973 composition of components such that trustworthiness is not inadvertently diminished and
10974 where consequently the trust is not misplaced. Ultimately this principle demands some
10975 metric by which the trust in a component and the trustworthiness of a component can be
10976 measured on the same abstract scale. The principle of trusted components is particularly
10977 relevant when considering systems and components in which there are complex chains of
10978 trust dependencies. A trust dependency is also referred to as a trust relationship and there
10979 may be chains of trust relationships.
10980 The principle of trusted components also applies to a compound component that consists of
10981 subcomponents (e.g., a subsystem), which may have varying levels of trustworthiness. The
10982 conservative assumption is that the trustworthiness of a compound component is that of its
10983 least trustworthy subcomponent. It may be possible to provide a security engineering
10984 rationale that the trustworthiness of a particular compound component is greater than the
10985 conservative assumption; however, any such rationale reflects logical reasoning based on a
10986 clear statement of the trustworthiness objectives, and relevant and credible evidence. The
10987 trustworthiness of a compound component is not the same as increased application of
10988 defense-in-depth layering within the component, or replication of components. Defense-in-
10989 depth techniques do not increase the trustworthiness of the whole above that of the least
10990 trustworthy component.
10991 Related Controls: None.
10992 (10) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | HIERARCHICAL TRUST
10993 Implement the security design principle of hierarchical trust in [Assignment: organization-
10994 defined systems or system components].
10995 Discussion: The principle of hierarchical trust for components builds on the principle of
10996 trusted components and states that the security dependencies in a system will form a partial
10997 ordering if they preserve the principle of trusted components. The partial ordering provides
10998 the basis for trustworthiness reasoning or providing an assurance case or argument when
10999 composing a secure system from heterogeneously trustworthy components. To analyze a
11000 system composed of heterogeneously trustworthy components for its trustworthiness, it is
11001 essential to eliminate circular dependencies with regard to the trustworthiness. If a more
11002 trustworthy component located in a lower layer of the system were to depend upon a less
11003 trustworthy component in a higher layer, this would, in effect, put the components in the
11004 same “less trustworthy” equivalence class per the principle of trusted components. Trust
11005 relationships, or chains of trust, can have various manifestations. For example, the root
11006 certificate of a certificate hierarchy is the most trusted node in the hierarchy, whereas the
11007 leaves in the hierarchy may be the least trustworthy nodes. Another example occurs in a
11008 layered high-assurance system where the security kernel (including the hardware base),
11009 which is located at the lowest layer of the system, is the most trustworthy component. The
11010 principle of hierarchical trust, however, does not prohibit the use of overly trustworthy
11011 components. There may be cases in a system of low trustworthiness, where it is reasonable
11012 to employ a highly trustworthy component rather than one that is less trustworthy (e.g., due
11013 to availability or other cost-benefit driver). For such a case, any dependency of the highly
11014 trustworthy component upon a less trustworthy component does not degrade the
11015 trustworthiness of the resulting low-trust system.
11016 Related Controls: None.
11017 (11) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | INVERSE MODIFICATION THRESHOLD
11018 Implement the security design principle of inverse modification threshold in [Assignment:
11019 organization-defined systems or system components].
11020 Discussion: The principle of inverse modification threshold builds on the principle of trusted
11021 components and the principle of hierarchical trust, and states that the degree of protection
11022 provided to a component is commensurate with its trustworthiness. As the trust placed in a
11023 component increases, the protection against unauthorized modification of the component
11024 also increases to the same degree. Protection from unauthorized modification can come in
11025 the form of the component’s own self-protection and innate trustworthiness, or it can come
11026 from the protections afforded to the component from other elements or attributes of the
11027 security architecture (to include protections in the environment of operation).
11028 Related Controls: None.
11029 (12) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | HIERARCHICAL PROTECTION
11030 Implement the security design principle of hierarchical protection in [Assignment:
11031 organization-defined systems or system components].
11032 Discussion: The principle of hierarchical protection states that a component need not be
11033 protected from more trustworthy components. In the degenerate case of the most trusted
11034 component, it protects itself from all other components. For example, if an operating system
11035 kernel is deemed the most trustworthy component in a system, then it protects itself from
11036 all untrusted applications it supports, but the applications, conversely, do not need to
11037 protect themselves from the kernel. The trustworthiness of users is a consideration for
11038 applying the principle of hierarchical protection. A trusted system need not protect itself
11039 from an equally trustworthy user, reflecting use of untrusted systems in “system high”
11040 environments where users are highly trustworthy and where other protections are put in
11041 place to bound and protect the “system high” execution environment.
11042 Related Controls: None.
11043 (13) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | MINIMIZED SECURITY ELEMENTS
11044 Implement the security design principle of minimized security elements in [Assignment:
11045 organization-defined systems or system components].
11046 Discussion: The principle of minimized security elements states that the system does not
11047 have extraneous trusted components. The principle of minimized security elements has two
11048 aspects: the overall cost of security analysis and the complexity of security analysis. Trusted
11049 components are generally costlier to construct and implement, owing to increased rigor of
11050 development processes. Trusted components also require greater security analysis to qualify
11051 their trustworthiness. Thus, to reduce the cost and decrease the complexity of the security
11052 analysis, a system contains as few trustworthy components as possible. The analysis of the
11053 interaction of trusted components with other components of the system is one of the most
11054 important aspects of system security verification. If the interactions between components
11055 are unnecessarily complex, the security of the system will be more difficult to ascertain than
11056 that of a system whose internal trust relationships are simple and elegantly constructed. In general,
11057 fewer trusted components result in fewer internal trust relationships and a simpler system.
11058 Related Controls: None.
11059 (14) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | LEAST PRIVILEGE
11060 Implement the security design principle of least privilege in [Assignment: organization-
11061 defined systems or system components].
11062 Discussion: The principle of least privilege states that each system component is allocated
11063 sufficient privileges to accomplish its specified functions, but no more. Applying the principle
11064 of least privilege limits the scope of the component’s actions, which has two desirable
11065 effects: a failure, corruption, or misuse of the component will have a minimized security
11066 impact; and the security analysis of the component will be simplified.
11067 Least privilege is a pervasive principle that is reflected in all aspects of the secure system
11068 design. Interfaces used to invoke component capability are available to only certain subsets
11069 of the user population, and component design supports a sufficiently fine granularity of
11070 privilege decomposition. For example, in the case of an audit mechanism, there may be an
11071 interface for the audit manager, who configures the audit settings; an interface for the audit
11072 operator, who ensures that audit data is safely collected and stored; and, finally, yet another
11073 interface for the audit reviewer, who has need only to view the audit data that has been
11074 collected but no need to perform operations on that data.
11075 In addition to its manifestations at the system interface, least privilege can be used as a
11076 guiding principle for the internal structure of the system itself. One aspect of internal least
11077 privilege is to construct modules so that only the elements encapsulated by the module are
11078 directly operated upon by the functions within the module. Elements external to a module
11079 that may be affected by the module’s operation are indirectly accessed through interaction
11080 (e.g., via a function call) with the module that contains those elements. Another aspect of
11081 internal least privilege is that the scope of a given module or component includes only those
11082 system elements that are necessary for its functionality, and that the access modes for the
11083 elements (e.g., read, write) are minimal.
11084 Related Controls: AC-6, CM-7.
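A minimal sketch of the audit example above, with hypothetical class and method names, showing how separate interfaces can expose only the privileges each role requires:

# Sketch of privilege decomposition for an audit mechanism.
# Class and method names are illustrative, not a prescribed API.

class AuditStore:
    def __init__(self):
        self.settings = {"level": "basic"}
        self.records = []

class AuditManagerInterface:          # may change settings only
    def __init__(self, store): self._store = store
    def configure(self, **settings): self._store.settings.update(settings)

class AuditOperatorInterface:         # may append records only
    def __init__(self, store): self._store = store
    def collect(self, record): self._store.records.append(record)

class AuditReviewerInterface:         # may read records only
    def __init__(self, store): self._store = store
    def view(self): return tuple(self._store.records)

store = AuditStore()
AuditOperatorInterface(store).collect({"event": "login", "user": "alice"})
print(AuditReviewerInterface(store).view())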
11085 (15) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | PREDICATE PERMISSION
11086 Implement the security design principle of predicate permission in [Assignment:
11087 organization-defined systems or system components].
11088 Discussion: The principle of predicate permission states that system designers consider
11089 requiring multiple authorized entities to provide consent before a highly critical operation or
11090 access to highly sensitive data, information, or resources is allowed to proceed. [SALTZER75]
11091 originally named this principle the separation of privilege. It is also equivalent to
11092 separation of duty. The division of privilege among multiple parties decreases the likelihood
11093 of abuse and provides the safeguard that no single accident, deception, or breach of trust is
11094 sufficient to enable an unrecoverable action that can lead to significantly damaging effects.
11095 The design options for such a mechanism may require simultaneous action (e.g., the firing of
11096 a nuclear weapon requires two different authorized individuals to give the correct command
11097 within a small time window) or a sequence of operations where each successive action is
11098 enabled by some prior action, but no single individual is able to enable more than one
11099 action.
11100 Related Controls: AC-5.
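A minimal sketch of predicate permission, assuming a hypothetical approver roster, in which a critical operation proceeds only with consent from two distinct authorized entities:

# Sketch of a two-party authorization check (predicate permission).
# The approver roster and required count are illustrative assumptions.

AUTHORIZED_APPROVERS = {"officer_a", "officer_b", "officer_c"}
REQUIRED_APPROVALS = 2

def may_proceed(approvals: set) -> bool:
    """Allow the critical operation only with consent from two distinct
    authorized entities; a single approver is never sufficient."""
    valid = approvals & AUTHORIZED_APPROVERS
    return len(valid) >= REQUIRED_APPROVALS

print(may_proceed({"officer_a"}))               # False
print(may_proceed({"officer_a", "officer_b"}))  # True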
11101 (16) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | SELF-RELIANT TRUSTWORTHINESS
11102 Implement the security design principle of self-reliant trustworthiness in [Assignment:
11103 organization-defined systems or system components].
11104 Discussion: The principle of self-reliant trustworthiness states that systems minimize their
11105 reliance on other systems for their own trustworthiness. A system is trustworthy by default
11106 with any connection to an external entity used to supplement its function. If a system were
11107 required to maintain a connection with another external entity in order to maintain its
11108 trustworthiness, then that system would be vulnerable to malicious and non-malicious
11109 threats that result in loss or degradation of that connection. The benefit to the principle of
11110 self-reliant trustworthiness is that the isolation of a system will make it less vulnerable to
11111 attack. A corollary to this principle relates to the ability of the system (or system component)
11112 to operate in isolation and then resynchronize with other components when it is rejoined
11113 with them.
11114 Related Controls: None.
11115 (17) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | SECURE DISTRIBUTED COMPOSITION
11116 Implement the security design principle of secure distributed composition in [Assignment:
11117 organization-defined systems or system components].
11118 Discussion: The principle of secure distributed composition states that the composition of
11119 distributed components that enforce the same system security policy results in a system that
11120 enforces that policy at least as well as the individual components do. Many of the design
11121 principles for secure systems deal with how components can or should interact. The need to
11122 create or enable capability from the composition of distributed components can magnify the
11123 relevancy of these principles. In particular, the translation of security policy from a stand-
11124 alone to a distributed system or a system-of-systems can have unexpected or emergent
11125 results. Communication protocols and distributed data consistency mechanisms help to
11126 ensure consistent policy enforcement across a distributed system. To ensure a system-wide
11127 level of assurance of correct policy enforcement, the security architecture of a distributed
11128 composite system is thoroughly analyzed.
11176 (20) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | SECURE METADATA MANAGEMENT
11177 Implement the security design principle of secure metadata management in [Assignment:
11178 organization-defined systems or system components].
11179 Discussion: The principle of secure metadata management states that metadata are “first
11180 class” objects with respect to security policy when the policy requires complete protection of
11181 information or it requires that the security subsystem be self-protecting. The principle of
11182 secure metadata management is driven by the recognition that a system, subsystem, or
11183 component cannot achieve self-protection unless it protects the data it relies upon for
11184 correct execution. Data is generally not interpreted by the system that stores it. It may have
11185 semantic value (i.e., it comprises information) to users and programs that process the data.
11186 In contrast, metadata is information about data, such as a file name or the date when the
11187 file was created. Metadata is bound to the target data that it describes in a way that the
11188 system can interpret, but it need not be stored inside of or proximate to its target data.
11189 There may be metadata whose target is itself metadata (e.g., the sensitivity level of a file
11190 name), to include self-referential metadata.
11191 The apparent secondary nature of metadata can lead to a neglect of its legitimate need for
11192 protection, resulting in a violation of the security policy that includes the exfiltration of
11193 information. A particular concern associated with insufficient protections for metadata is
11194 associated with multilevel secure (MLS) systems. MLS systems mediate access by a subject to
11195 an object based on relative sensitivity levels. It follows that all subjects and objects in the
11196 scope of control of the MLS system are either directly labeled or indirectly attributed with
11197 sensitivity levels. The corollary of labeled metadata for MLS systems states that objects
11198 containing metadata are labeled. As with protection needs assessment for data, attention is
11199 given to ensure that the confidentiality and integrity protections are individually assessed,
11200 specified, and allocated to metadata, as would be done for mission, business, and system
11201 data.
11202 Related Controls: None.
11203 (21) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | SELF-ANALYSIS
11204 Implement the security design principle of self-analysis in [Assignment: organization-
11205 defined systems or system components].
11206 Discussion: The principle of self-analysis states that a system component is able to assess its
11207 internal state and functionality to a limited extent at various stages of execution, and that
11208 this self-analysis capability is commensurate with the level of trustworthiness invested in the
11209 system. At the system level, self-analysis can be achieved through hierarchical assessments
11210 of trustworthiness established in a bottom-up fashion. In this approach, the lower-level
11211 components check for data integrity and correct functionality (to a limited extent) of higher-
11212 level components. For example, trusted boot sequences involve a trusted lower-level
11213 component attesting to the trustworthiness of the next higher-level components so that a
11214 transitive chain of trust can be established. At the root, a component attests to itself, which
11215 usually involves an axiomatic or environmentally enforced assumption about its integrity.
11216 Results of the self-analyses can be used to guard against externally induced errors, or
11217 internal malfunction or transient errors. By following this principle, some simple errors or
11218 malfunctions can be detected without allowing the effects of the error or malfunction to
11219 propagate outside the component. Further, the self-test can also be used to attest to the
11220 configuration of the component, detecting any potential conflicts in configuration with
11221 respect to the expected configuration.
11222 Related Controls: CA-7.
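A highly simplified sketch of chained integrity checks in a boot-like sequence follows; in practice, the expected digests are anchored in a hardware root of trust or an earlier trusted measurement, and the stage names and images below are placeholders.

# Highly simplified sketch of chained integrity checks in a boot-like sequence.
# Expected digests stand in for values anchored in a hardware root of trust.

import hashlib

stages = {                    # stage name -> image bytes (placeholders)
    "bootloader": b"bootloader image v1",
    "kernel":     b"kernel image v1",
    "services":   b"system services image v1",
}

# In a real system these come from a trusted earlier measurement, not from
# the images being verified; computed here only to keep the sketch runnable.
expected = {name: hashlib.sha256(img).hexdigest() for name, img in stages.items()}

def verify_chain(stages, expected):
    """Check each stage before control passes to it; stop on any mismatch."""
    for name, image in stages.items():
        if hashlib.sha256(image).hexdigest() != expected[name]:
            print(f"integrity failure at {name}; halting before propagation")
            return False
        print(f"{name}: verified")
    return True

verify_chain(stages, expected)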
11223 (22) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | ACCOUNTABILITY AND TRACEABILITY
11224 Implement the security design principle of accountability and traceability in [Assignment:
11225 organization-defined systems or system components].
11226 Discussion: The principle of accountability and traceability states that it is possible to trace
11227 security-relevant actions (i.e., subject-object interactions) to the entity on whose behalf the
11228 action is being taken. The principle of accountability and traceability requires a trustworthy
11229 infrastructure that can record details about actions that affect system security (e.g., an audit
11230 subsystem). To record the details about actions, the system is able to uniquely identify the
11231 entity on whose behalf the action is being carried out and also record the relevant sequence
11232 of actions that are carried out. The accountability policy also requires that the audit trail itself be
11233 protected from unauthorized access and modification. The principle of least privilege assists
11234 in tracing the actions to particular entities, as it increases the granularity of accountability.
11235 Associating specific actions with system entities, and ultimately with users, and making the
11236 audit trail secure against unauthorized access and modifications provides non-repudiation,
11237 because once an action is recorded, it is not possible to change the audit trail. Another
11238 important function that accountability and traceability serves is in the routine and forensic
11239 analysis of events associated with the violation of security policy. Analysis of audit logs may
11240 provide additional information that may be helpful in determining the path or component
11241 that allowed the violation of the security policy, and the actions of individuals associated
11242 with the violation of security policy.
11243 Related Controls: AC-6, AU-2, AU-3, AU-6, AU-9, AU-10, AU-12, IA-2, IR-4.
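A minimal sketch of an audit trail in which each record identifies the acting entity and is chained to the hash of the previous record, so that after-the-fact modification is detectable; protection of the stored log, the chain head, and any keys is out of scope for this illustration.

# Sketch of an audit trail where each entry names the acting entity and is
# chained to the previous entry's hash, making silent modification detectable.
# Storage protection, key management, and access control are not shown.

import hashlib, json

def append_event(trail, subject, action, obj):
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    entry = {"subject": subject, "action": action, "object": obj, "prev": prev_hash}
    entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    trail.append(entry)

def verify(trail):
    prev = "0" * 64
    for e in trail:
        body = {k: e[k] for k in ("subject", "action", "object", "prev")}
        if e["prev"] != prev or e["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        prev = e["hash"]
    return True

trail = []
append_event(trail, "alice", "read", "/records/42")
append_event(trail, "bob", "write", "/records/42")
print(verify(trail))   # True; altering any earlier entry breaks the chain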
11244 (23) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | SECURE DEFAULTS
11245 Implement the security design principle of secure defaults in [Assignment: organization-
11246 defined systems or system components].
11247 Discussion: The principle of secure defaults states that the default configuration of a system
11248 (to include its constituent subsystems, components, and mechanisms) reflects a restrictive
11249 and conservative enforcement of security policy. The principle of secure defaults applies to
11250 the initial (i.e., default) configuration of a system as well as to the security engineering and
11251 design of access control and other security functions that follow a “deny unless explicitly
11252 authorized” strategy. The initial configuration aspect of this principle requires that any “as
11253 shipped” configuration of a system, subsystem, or system component does not aid in the
11254 violation of the security policy, and can prevent the system from operating in the default
11255 configuration for those cases where the security policy itself requires configuration by the
11256 operational user.
11257 Restrictive defaults mean that the system will operate “as-shipped” with adequate self-
11258 protection, and is able to prevent security breaches before the intended security policy and
11259 system configuration are established. In cases where the protection provided by the “as-
11260 shipped” product is inadequate, stakeholders assess the risk of using it prior to establishing a
11261 secure initial state. Adherence to the principle of secure defaults guarantees that a system is
11262 established in a secure state upon successfully completing initialization. In situations where
11263 the system fails to complete initialization, either it will perform a requested operation using
11264 secure defaults or it will not perform the operation. Refer to the principles of continuous
11265 protection and secure failure and recovery that parallel this principle to provide the ability to
11266 detect and recover from failure.
11267 The security engineering approach to this principle states that security mechanisms deny
11268 requests unless the request is found to be well-formed and consistent with the security
11269 policy. The insecure alternative is to allow a request unless it is shown to be inconsistent
11270 with the policy. In a large system, the conditions that are satisfied to grant a request that is
11271 by default denied are often far more compact and complete than those that would need to
11272 be checked in order to deny a request that is by default granted.
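A minimal sketch of a default-deny decision function follows, in which a request is granted only if it is well-formed and explicitly authorized; the policy table and request fields are illustrative.

# Sketch of a default-deny policy check: a request is granted only if it is
# well-formed and explicitly authorized; everything else is denied.

POLICY = {
    ("alice", "read",  "payroll"): True,
    ("alice", "write", "payroll"): False,
}

def decide(request: dict) -> bool:
    # Malformed requests are denied rather than interpreted permissively.
    if not all(k in request for k in ("subject", "action", "object")):
        return False
    key = (request["subject"], request["action"], request["object"])
    return POLICY.get(key, False)   # absent entry -> deny by default

print(decide({"subject": "alice",   "action": "read", "object": "payroll"}))  # True
print(decide({"subject": "mallory", "action": "read", "object": "payroll"}))  # False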
11369 relevant feedback and warnings when insecure choices are being made. Particular attention
11370 is given to interfaces through which personnel responsible for system administration and
11371 operation configure and set up the security policies. Ideally, these personnel are able to
11372 understand the impact of their choices. The personnel with system administrative and
11373 operation responsibility are able to configure systems before start-up and administer them
11374 during runtime, in both cases with confidence that their intent is correctly mapped to the
11375 system’s mechanisms. Security services, functions, and mechanisms do not impede or
11376 unnecessarily complicate the intended use of the system. There is a trade-off between
11377 system usability and the strictness necessitated for security policy enforcement. If security
11378 mechanisms are frustrating or difficult to use, then users may disable or avoid them, or use
11379 the mechanisms in ways inconsistent with the security requirements and protection needs
11380 the mechanisms were designed to satisfy.
11381 Related Controls: None.
11382 (28) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | ACCEPTABLE SECURITY
11383 Implement the security design principle of acceptable security in [Assignment:
11384 organization-defined systems or system components].
11385 Discussion: The principle of acceptable security requires that the level of privacy and
11386 performance the system provides is consistent with the users’ expectations. The perception
11387 of personal privacy may affect user behavior, morale, and effectiveness. Based on the
11388 organizational privacy policy and the system design, users should be able to restrict their
11389 actions to protect their privacy. When systems fail to provide intuitive interfaces, or meet
11390 privacy and performance expectations, users may either choose to completely avoid the
11391 system or use it in ways that may be inefficient or even insecure.
11392 Related Controls: None.
11393 (29) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | REPEATABLE AND DOCUMENTED PROCEDURES
11394 Implement the security design principle of repeatable and documented procedures in
11395 [Assignment: organization-defined systems or system components].
11396 Discussion: The principle of repeatable and documented procedures states that the
11397 techniques and methods employed to construct a system component permit the same
11398 component to be completely and correctly reconstructed at a later time. Repeatable and
11399 documented procedures support the development of a component that is identical to the
11400 component created earlier that may be in widespread use. In the case of other system
11401 artifacts (e.g., documentation and testing results), repeatability supports consistency and
11402 ability to inspect the artifacts. Repeatable and documented procedures can be introduced at
11403 various stages within the system development life cycle and can contribute to the ability to
11404 evaluate assurance claims for the system. Examples include systematic procedures for code
11405 development and review; procedures for configuration management of development tools
11406 and system artifacts; and procedures for system delivery.
11407 Related Controls: CM-1, SA-1, SA-10, SA-11, SA-15, SA-17, SC-1, SI-1.
11408 (30) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | PROCEDURAL RIGOR
11409 Implement the security design principle of procedural rigor in [Assignment: organization-
11410 defined systems or system components].
11411 Discussion: The principle of procedural rigor states that the rigor of a system life cycle
11412 process is commensurate with its intended trustworthiness. Procedural rigor defines the
11413 scope, depth, and detail of the system life cycle procedures. Rigorous system life cycle
11414 procedures contribute to the assurance that the system is correct and free of unintended
11415 functionality in several ways. First, the procedures impose checks and balances on the life
11416 cycle process such that the introduction of unspecified functionality is prevented.
11417 Second, rigorous procedures applied to systems security engineering activities that produce
11418 specifications and other system design documents contribute to the ability to understand
11419 the system as it has been built, rather than trusting that the component, as implemented, is
11420 the authoritative (and potentially misleading) specification.
11421 Finally, modifications to an existing system component are easier when there are detailed
11422 specifications describing its current design than when source code or schematics must be
11423 studied to understand how it works. Procedural rigor helps to ensure that security functional and
11424 assurance requirements have been satisfied, and it contributes to a better-informed basis
11425 for the determination of trustworthiness and risk posture. Procedural rigor is commensurate
11426 with the degree of assurance desired for the system. If the required trustworthiness of the
11427 system is low, a high level of procedural rigor may add unnecessary cost, whereas when high
11428 trustworthiness is critical, the cost of high procedural rigor is merited.
11429 Related Controls: None.
11430 (31) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | SECURE SYSTEM MODIFICATION
11431 Implement the security design principle of secure system modification in [Assignment:
11432 organization-defined systems or system components].
11433 Discussion: The principle of secure system modification states that system modification
11434 maintains system security with respect to the security requirements and risk tolerance of
11435 stakeholders. Upgrades or modifications to systems can transform secure systems into
11436 systems that are not secure. The procedures for system modification ensure that, if the
11437 system is to maintain its trustworthiness, the same rigor that was applied to its initial
11438 development is applied to any system changes. Because modifications can affect the ability
11439 of the system to maintain its secure state, a careful security analysis of the modification is
11440 needed prior to its implementation and deployment. This principle parallels the principle of
11441 secure evolvability.
11442 Related Controls: CM-3, CM-4.
11443 (32) SECURITY AND PRIVACY ENGINEERING PRINCIPLES | SUFFICIENT DOCUMENTATION
11444 Implement the security design principle of sufficient documentation in [Assignment:
11445 organization-defined systems or system components].
11446 Discussion: The principle of sufficient documentation states that organizational personnel
11447 with responsibility to interact with the system are provided with adequate documentation
11448 and other information such that the personnel contribute to rather than detract from
11449 system security. Despite attempts to comply with principles such as human factored security
11450 and acceptable security, systems are inherently complex, and the design intent for the use of
11451 security mechanisms is not always intuitively obvious. Neither are the ramifications of the
11452 misuse or misconfiguration of security mechanisms. Uninformed and insufficiently trained
11453 users can introduce vulnerabilities due to errors of omission and commission. The availability
11454 of documentation and training can help to ensure a knowledgeable cadre of personnel, all of
11455 whom have a critical role in the achievement of principles such as continuous protection.
11456 Documentation is written clearly and supported by training that provides security awareness
11457 and understanding of security-relevant responsibilities.
11458 Related Controls: AT-2, AT-3, SA-5.
11459 References: [FIPS 199]; [FIPS 200]; [SP 800-53A]; [SP 800-60 v1]; [SP 800-60 v2]; [SP 800-160 v1];
11460 [IR 8062].
11507 (3) EXTERNAL SYSTEM SERVICES | ESTABLISH AND MAINTAIN TRUST RELATIONSHIP WITH PROVIDERS
11508 Establish, document, and maintain trust relationships with external service providers
11509 based on the following requirements, properties, factors, or conditions: [Assignment:
11510 organization-defined security and privacy requirements, properties, factors, or conditions
11511 defining acceptable trust relationships].
11512 Discussion: The degree of confidence that the risk from using external services is at an
11513 acceptable level depends on the trust that organizations place in the external providers,
11514 individually or in combination. Trust relationships can help organizations to gain increased
11515 levels of confidence that participating service providers are providing adequate protection
11516 for the services rendered and can also be useful when conducting incident response or when
11517 planning for upgrades or obsolescence. Trust relationships can be complicated due to the
11518 potentially large number of entities participating in the consumer-provider interactions,
11519 subordinate relationships and levels of trust, and types of interactions between the parties.
11520 In some cases, the degree of trust is based on the level of control organizations can exert on
11521 external service providers regarding the controls necessary for the protection of the service,
11522 information, or individual privacy and the evidence brought forth as to the effectiveness of
11523 the implemented controls. The level of control is established by the terms and conditions of
11524 the contracts or service-level agreements.
11525 Related Controls: SR-2.
11526 (4) EXTERNAL SYSTEM SERVICES | CONSISTENT INTERESTS OF CONSUMERS AND PROVIDERS
11527 Take the following actions to verify that the interests of [Assignment: organization-
11528 defined external service providers] are consistent with and reflect organizational interests:
11529 [Assignment: organization-defined actions].
11530 Discussion: As organizations increasingly use external service providers, it is possible that
11531 the interests of the service providers may diverge from organizational interests. In such
11532 situations, simply having the required technical, management, or operational controls in
11533 place may not be sufficient if the providers that implement and manage those controls are
11534 not operating in a manner consistent with the interests of the consuming organizations.
11535 Actions that organizations take to address such concerns include requiring background
11536 checks for selected service provider personnel; examining ownership records; employing
11537 only trustworthy service providers, including providers with which organizations have had
11538 successful trust relationships; and conducting routine, periodic, unscheduled visits to service
11539 provider facilities.
11540 Related Controls: None.
11541 (5) EXTERNAL SYSTEM SERVICES | PROCESSING, STORAGE, AND SERVICE LOCATION
11542 Restrict the location of [Selection (one or more): information processing; information or
11543 data; system services] to [Assignment: organization-defined locations] based on
11544 [Assignment: organization-defined requirements or conditions].
11545 Discussion: The location of information processing, information and data storage, or system
11546 services that are critical to organizations can have a direct impact on the ability of those
11547 organizations to successfully execute their missions and business functions. The impact
11548 occurs when external providers control the location of processing, storage, or services. The
11549 criteria that external providers use for the selection of processing, storage, or service
11550 locations may be different from the criteria organizations use. For example, organizations
11551 may desire that data or information storage locations are restricted to certain locations to
11552 help facilitate incident response activities in case of information security or privacy incidents.
11553 Incident response activities, including forensic analyses and after-the-fact investigations, may
11554 be adversely affected by the governing laws, policies, or protocols in the locations where
11555 processing and storage occur and/or the locations from which system services emanate.
11600 Controls include protecting the master copies of material used to generate security-relevant
11601 portions of the system hardware, software, and firmware from unauthorized modification or
11602 destruction. Maintaining the integrity of changes to the system, system component, or system
11603 service requires strict configuration control throughout the system development life cycle to
11604 track authorized changes and to prevent unauthorized changes.
11605 The configuration items that are placed under configuration management include: the formal
11606 model; the functional, high-level, and low-level design specifications; other design data;
11607 implementation documentation; source code and hardware schematics; the current running
11608 version of the object code; tools for comparing new versions of security-relevant hardware
11609 descriptions and source code with previous versions; and test fixtures and documentation.
11610 Depending on the mission and business needs of organizations and the nature of the contractual
11611 relationships in place, developers may provide configuration management support during the
11612 operations and maintenance stage of the system development life cycle.
11613 Related Controls: CM-2, CM-3, CM-4, CM-7, CM-9, SA-4, SA-5, SA-8, SA-15, SI-2, SR-3, SR-4, SR-5,
11614 SR-6.
11615 Control Enhancements:
11616 (1) DEVELOPER CONFIGURATION MANAGEMENT | SOFTWARE AND FIRMWARE INTEGRITY VERIFICATION
11617 Require the developer of the system, system component, or system service to enable
11618 integrity verification of software and firmware components.
11619 Discussion: Software and firmware integrity verification allows organizations to detect
11620 unauthorized changes to software and firmware components using developer-provided
11621 tools, techniques, and mechanisms. The integrity checking mechanisms can also address
11622 counterfeiting of software and firmware components. Organizations verify the integrity of
11623 software and firmware components, for example, through secure one-way hashes provided
11624 by developers. Delivered software and firmware components also include any updates to
11625 such components.
11626 Related Controls: SI-7, SR-11.
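The following minimal sketch, in Python, illustrates the kind of hash-based verification described above: a delivered component is compared against a developer-provided SHA-256 value. The file path and expected digest are hypothetical placeholders, and a real verification process would obtain the expected value over an authenticated channel.

    import hashlib
    import hmac

    def sha256_of(path: str, chunk_size: int = 65536) -> str:
        # Compute the SHA-256 digest of a file in fixed-size chunks.
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_component(path: str, expected_hex: str) -> bool:
        # Return True only if the computed digest matches the developer-provided value.
        return hmac.compare_digest(sha256_of(path), expected_hex.lower())

A call such as verify_component("update.bin", expected_digest) would then compare the delivered file against the digest published by the developer for that specific release; both values shown here are placeholders.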
11627 (2) DEVELOPER CONFIGURATION MANAGEMENT | ALTERNATIVE CONFIGURATION MANAGEMENT
11628 Provide an alternate configuration management process using organizational personnel in
11629 the absence of a dedicated developer configuration management team.
11630 Discussion: Alternate configuration management processes may be required, for example,
11631 when organizations use commercial off-the-shelf information technology products. Alternate
11632 configuration management processes include organizational personnel that review and
11633 approve proposed changes to systems, system components, and system services; and that
11634 conduct security and privacy impact analyses prior to the implementation of changes to
11635 systems, components, or services.
11636 Related Controls: None.
11637 (3) DEVELOPER CONFIGURATION MANAGEMENT | HARDWARE INTEGRITY VERIFICATION
11638 Require the developer of the system, system component, or system service to enable
11639 integrity verification of hardware components.
11640 Discussion: Hardware integrity verification allows organizations to detect unauthorized
11641 changes to hardware components using developer-provided tools, techniques, methods, and
11642 mechanisms. Organizations verify the integrity of hardware components, for example, with
11643 hard-to-copy labels and verifiable serial numbers provided by developers, and by requiring
11644 the implementation of anti-tamper technologies. Delivered hardware components also
11645 include hardware and firmware updates to such components.
11646 Related Controls: SI-7.
11692 Discussion: Developmental testing and evaluation confirms that the required controls are
11693 implemented correctly, operating as intended, enforcing the desired security and privacy
11694 policies, and meeting established security and privacy requirements. Security properties of
11695 systems and the privacy of individuals may be affected by the interconnection of system
11696 components or changes to those components. The interconnections or changes, including
11697 upgrading or replacing applications, operating systems, and firmware, may adversely affect
11698 previously implemented controls. Ongoing assessment during development allows for additional
11699 types of testing and evaluation that developers can conduct to reduce or eliminate potential
11700 flaws. Testing custom software applications may require approaches such as manual code
11701 review; security architecture review; penetration testing; and static analysis, dynamic analysis,
11702 binary analysis, or a hybrid of the three analysis approaches.
11703 Developers can use the analysis approaches, along with security instrumentation and fuzzing, in a
11704 variety of tools and in source code reviews. The security and privacy assessment plans include
11705 the specific activities that developers plan to carry out, including the types of analyses, testing,
11706 evaluation, and reviews of software and firmware components, the degree of rigor to be applied,
11707 the frequency of the ongoing testing and evaluation, and the types of artifacts produced during
11708 those processes. The depth of testing and evaluation refers to the rigor and level of detail
11709 associated with the assessment process. The coverage of testing and evaluation refers to the
11710 scope (i.e., number and type) of the artifacts included in the assessment process. Contracts
11711 specify the acceptance criteria for security and privacy assessment plans, flaw remediation
11712 processes, and the evidence that the plans and processes have been diligently applied. Methods
11713 for reviewing and protecting assessment plans, evidence, and documentation are commensurate
11714 with the security category or classification level of the system. Contracts may specify protection
11715 requirements for documentation.
11716 Related Controls: CA-2, CA-7, CM-4, SA-3, SA-4, SA-5, SA-8, SA-15, SA-17, SI-2, SR-5, SR-6, SR-7.
11717 Control Enhancements:
11718 (1) DEVELOPER TESTING AND EVALUATION | STATIC CODE ANALYSIS
11719 Require the developer of the system, system component, or system service to employ
11720 static code analysis tools to identify common flaws and document the results of the
11721 analysis.
11722 Discussion: Static code analysis provides a technology and methodology for security reviews
11723 and includes checking for weaknesses in the code and checking for incorporation of libraries
11724 or other included code with known vulnerabilities or that are out-of-date and not supported.
11725 Static code analysis can be used to identify vulnerabilities and to enforce secure coding
11726 practices. It is most effective when used early in the development process, when each code
11727 change can be automatically scanned for potential weaknesses. Static code analysis can
11728 provide clear remediation guidance along with the defects identified to enable developers
11729 to fix such defects. Evidence of correct implementation of static analysis includes
11730 aggregate defect density for critical defect types; evidence that defects were inspected by
11731 developers or security professionals; and evidence that defects were remediated. A high
11732 density of ignored findings, commonly referred to as false positives, indicates a potential
11733 problem with the analysis process or the analysis tool. In such cases, organizations weigh the
11734 validity of the evidence against evidence from other sources.
11735 Related Controls: None.
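As a minimal illustration of the evidence described above, the following Python sketch computes an aggregate defect density (findings per thousand lines of code) and a per-severity count from a static analysis findings file. The findings-file schema, a JSON list of objects with a "severity" field, is an assumption made for illustration and does not correspond to the output format of any particular tool.

    import json
    from collections import Counter

    def defect_density(findings_path: str, lines_of_code: int) -> dict:
        # Summarize findings by severity and compute findings per thousand lines of code.
        with open(findings_path, encoding="utf-8") as f:
            findings = json.load(f)
        by_severity = Counter(item.get("severity", "unknown") for item in findings)
        density = len(findings) / (lines_of_code / 1000) if lines_of_code else 0.0
        return {"total": len(findings),
                "by_severity": dict(by_severity),
                "per_kloc": round(density, 2)}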
11736 (2) DEVELOPER TESTING AND EVALUATION | THREAT MODELING AND VULNERABILITY ANALYSES
11737 Require the developer of the system, system component, or system service to perform
11738 threat modeling and vulnerability analyses during development and the subsequent
11739 testing and evaluation of the system, component, or service that:
11787 (a) At the following level of rigor: [Assignment: organization-defined breadth and depth
11788 of testing]; and
11789 (b) Under the following constraints: [Assignment: organization-defined constraints].
11790 Discussion: Penetration testing is an assessment methodology in which assessors, using all
11791 available information technology product or system documentation and working under
11792 specific constraints, attempt to circumvent implemented security and privacy features of
11793 information technology products and systems. Useful information for assessors conducting
11794 penetration testing includes product and system design specifications, source code, and
11795 administrator and operator manuals. Penetration testing can include white-box, gray-box, or
11796 black-box testing with analyses performed by skilled professionals simulating adversary
11797 actions. The objective of penetration testing is to discover vulnerabilities in systems, system
11798 components, and services resulting from implementation errors, configuration faults, or
11799 other operational weaknesses or deficiencies. Penetration tests can be performed in
11800 conjunction with automated and manual code reviews to provide greater levels of analysis
11801 than would ordinarily be possible. When user session information and other personally
11802 identifiable information is captured or recorded during penetration testing, such information
11803 is handled appropriately to protect privacy.
11804 Related Controls: CA-8, PM-14, PM-25, PT-2, SA-3, SI-2, SI-6.
11805 (6) DEVELOPER TESTING AND EVALUATION | ATTACK SURFACE REVIEWS
11806 Require the developer of the system, system component, or system service to perform
11807 attack surface reviews.
11808 Discussion: Attack surfaces of systems and system components are exposed areas that
11809 make those systems more vulnerable to attacks. Attack surfaces include any accessible areas
11810 where weaknesses or deficiencies in the hardware, software, and firmware components
11811 provide opportunities for adversaries to exploit vulnerabilities. Attack surface reviews
11812 ensure that developers analyze the design and implementation changes to systems and
11813 mitigate attack vectors generated as a result of the changes. Correction of identified flaws
11814 includes deprecation of unsafe functions.
11815 Related Controls: SA-15.
11816 (7) DEVELOPER TESTING AND EVALUATION | VERIFY SCOPE OF TESTING AND EVALUATION
11817 Require the developer of the system, system component, or system service to verify that
11818 the scope of testing and evaluation provides complete coverage of the required controls at
11819 the following level of rigor: [Assignment: organization-defined breadth and depth of
11820 testing and evaluation].
11821 Discussion: Verifying that testing and evaluation provides complete coverage of required
11822 controls can be accomplished by a variety of analytic techniques ranging from informal to
11823 formal. Each of these techniques provides an increasing level of assurance corresponding to
11824 the degree of formality of the analysis. Rigorous demonstration of control coverage at the
11825 highest levels of assurance can be provided using formal modeling and analysis techniques,
11826 including correlation between control implementation and corresponding test cases.
11827 Related Controls: SA-15.
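One informal technique for the correlation described above is to maintain an explicit mapping between required controls and the test cases that exercise them, and to flag any control with no corresponding test. The following Python sketch illustrates the idea; the control identifiers and test names are placeholders, not an assessment of real coverage.

    def coverage_gaps(required_controls: set, test_map: dict) -> set:
        # Return the controls that no test case claims to cover.
        covered = {control for controls in test_map.values() for control in controls}
        return required_controls - covered

    required = {"SC-7", "SC-8", "SI-7"}
    tests = {"test_boundary_filtering": ["SC-7"],
             "test_tls_in_transit": ["SC-8"]}
    print(coverage_gaps(required, tests))   # {'SI-7'} -> no test exercises SI-7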
11828 (8) DEVELOPER TESTING AND EVALUATION | DYNAMIC CODE ANALYSIS
11829 Require the developer of the system, system component, or system service to employ
11830 dynamic code analysis tools to identify common flaws and document the results of the
11831 analysis.
11832 Discussion: Dynamic code analysis provides run-time verification of software programs,
11833 using tools capable of monitoring programs for memory corruption, user privilege issues,
11834 and other potential security problems. Dynamic code analysis employs run-time tools to
11835 ensure that security functionality performs in the way it was designed. A specialized type of
11836 dynamic analysis, known as fuzz testing, induces program failures by deliberately introducing
11837 malformed or random data into software programs. Fuzz testing strategies derive from the
11838 intended use of applications and the associated functional and design specifications for the
11839 applications. To understand the scope of dynamic code analysis and hence the assurance
11840 provided, organizations may also consider conducting code coverage analysis (checking the
11841 degree to which the code has been tested using metrics such as percent of subroutines
11842 tested or percent of program statements called during execution of the test suite) and/or
11843 concordance analysis (checking for words that are out of place in software code such as non-
11844 English language words or derogatory terms).
11845 Related Controls: None.
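The following minimal fuzz-testing sketch, in Python, illustrates the technique described above by feeding randomly malformed byte strings to a parser and recording inputs that raise unexpected exceptions. The parse_record routine is a hypothetical stand-in for the software under test; production fuzzing typically relies on coverage-guided tools rather than purely random inputs.

    import random

    def parse_record(data: bytes) -> dict:
        # Hypothetical parser under test: expects "key=value" encoded as ASCII.
        key, _, value = data.decode("ascii").partition("=")
        return {key: value}

    def fuzz(iterations: int = 1000, max_len: int = 64, seed: int = 0) -> list:
        rng = random.Random(seed)
        failures = []
        for _ in range(iterations):
            blob = bytes(rng.randrange(256) for _ in range(rng.randrange(max_len)))
            try:
                parse_record(blob)
            except (UnicodeDecodeError, ValueError):
                pass  # Anticipated rejection of malformed input is not a defect.
            except Exception as exc:  # Unexpected failure mode worth investigating.
                failures.append((blob, repr(exc)))
        return failures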
11846 (9) DEVELOPER TESTING AND EVALUATION | INTERACTIVE APPLICATION SECURITY TESTING
11847 Require the developer of the system, system component, or system service to employ
11848 interactive application security testing tools to identify flaws and document the results.
11849 Discussion: Interactive (also known as instrumentation-based) application security testing is
11850 a method of detecting vulnerabilities by observing applications as they run during testing.
11851 The use of instrumentation relies on direct measurements of the actual running applications,
11852 and uses access to the code, user interaction, libraries, frameworks, backend connections,
11853 and configurations to measure control effectiveness directly. When combined with analysis
11854 techniques, interactive application security testing can identify a broad range of potential
11855 vulnerabilities and confirm control effectiveness. Instrumentation-based testing works in
11856 real time and can be used continuously throughout the system development life cycle.
11857 Related Controls: None.
11858 References: [ISO 15408-3]; [SP 800-30]; [SP 800-53A]; [SP 800-154]; [SP 800-160 v1].
11959 documentation includes functional specifications, high-level designs, low-level designs, and
11960 source code and hardware schematics. Criticality analysis is important for organizational
11961 systems that are designated as high value assets. High value assets can be moderate- or
11962 high-impact systems due to heightened adversarial interest or potential adverse effects on
11963 the federal enterprise. Developer input is especially important when organizations conduct
11964 supply chain criticality analyses.
11965 Related Controls: RA-9.
11966 (4) DEVELOPMENT PROCESS, STANDARDS, AND TOOLS | THREAT MODELING AND VULNERABILITY
11967 ANALYSIS
11968 [Withdrawn: Incorporated into SA-11(2).]
11969 (5) DEVELOPMENT PROCESS, STANDARDS, AND TOOLS | ATTACK SURFACE REDUCTION
11970 Require the developer of the system, system component, or system service to reduce
11971 attack surfaces to [Assignment: organization-defined thresholds].
11972 Discussion: Attack surface reduction is closely aligned with threat and vulnerability analyses
11973 and system architecture and design. Attack surface reduction is a means of reducing risk to
11974 organizations by giving attackers less opportunity to exploit weaknesses or deficiencies (i.e.,
11975 potential vulnerabilities) within systems, system components, and system services. Attack
11976 surface reduction includes implementing the concept of layered defenses; applying the
11977 principles of least privilege and least functionality; applying secure software development
11978 practices; deprecating unsafe functions; reducing entry points available to unauthorized
11979 users; reducing the amount of code executing; and eliminating application programming
11980 interfaces (APIs) that are vulnerable to attacks.
11981 Related Controls: AC-6, CM-7, RA-3, SA-11.
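As a small illustration of one measure listed above, deprecating unsafe functions, the following Python sketch flags calls to functions that an organization has designated as unsafe. The banned-function list and the source tree location are placeholders.

    import pathlib
    import re

    BANNED = {"eval", "exec", "pickle.loads"}
    PATTERN = re.compile(r"\b(" + "|".join(re.escape(name) for name in BANNED) + r")\s*\(")

    def find_unsafe_calls(source_root: str) -> list:
        # Return (file, line number, line text) for each call to a banned function.
        hits = []
        for path in pathlib.Path(source_root).rglob("*.py"):
            for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
                if PATTERN.search(line):
                    hits.append((str(path), lineno, line.strip()))
        return hits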
11982 (6) DEVELOPMENT PROCESS, STANDARDS, AND TOOLS | CONTINUOUS IMPROVEMENT
11983 Require the developer of the system, system component, or system service to implement
11984 an explicit process to continuously improve the development process.
11985 Discussion: Developers of systems, system components, and system services consider the
11986 effectiveness and efficiency of their current development processes for meeting quality
11987 objectives and for addressing the security and privacy capabilities in current threat
11988 environments.
11989 Related Controls: None.
11990 (7) DEVELOPMENT PROCESS, STANDARDS, AND TOOLS | AUTOMATED VULNERABILITY ANALYSIS
11991 Require the developer of the system, system component, or system service [Assignment:
11992 organization-defined frequency] to:
11993 (a) Perform an automated vulnerability analysis using [Assignment: organization-defined
11994 tools];
11995 (b) Determine the exploitation potential for discovered vulnerabilities;
11996 (c) Determine potential risk mitigations for delivered vulnerabilities; and
11997 (d) Deliver the outputs of the tools and results of the analysis to [Assignment:
11998 organization-defined personnel or roles].
11999 Discussion: Automated tools can be more effective in analyzing exploitable weaknesses or
12000 deficiencies in large and complex systems; prioritizing vulnerabilities by severity; and
12001 providing recommendations for risk mitigations.
12002 Related Controls: RA-5, SA-11.
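The following Python sketch illustrates one way to post-process automated analysis output: discovered vulnerabilities are ranked by severity and a summary is written for delivery to the designated personnel or roles. The input schema, a JSON list of objects with an "id" and a CVSS-like "score" field, is an assumption made for illustration.

    import json

    def summarize(findings_path: str, report_path: str, top_n: int = 10) -> None:
        # Rank findings by severity score and write a delivery-ready summary.
        with open(findings_path, encoding="utf-8") as f:
            findings = json.load(f)
        ranked = sorted(findings, key=lambda item: item.get("score", 0.0), reverse=True)
        report = {"total_findings": len(findings),
                  "highest_risk": ranked[:top_n]}
        with open(report_path, "w", encoding="utf-8") as out:
            json.dump(report, out, indent=2)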
12003 (8) DEVELOPMENT PROCESS, STANDARDS, AND TOOLS | REUSE OF THREAT AND VULNERABILITY
12004 INFORMATION
12005 Require the developer of the system, system component, or system service to use threat
12006 modeling and vulnerability analyses from similar systems, components, or services to
12007 inform the current development process.
12008 Discussion: Analysis of vulnerabilities found in similar software applications can inform
12009 potential design and implementation issues for systems under development. Similar systems
12010 or system components may exist within developer organizations. Vulnerability information is
12011 available from a variety of public and private sector sources, including the NIST National
12012 Vulnerability Database.
12013 Related Controls: None.
12014 (9) DEVELOPMENT PROCESS, STANDARDS, AND TOOLS | USE OF LIVE DATA
12015 [Withdrawn: Incorporated into SA-3(2).]
12016 (10) DEVELOPMENT PROCESS, STANDARDS, AND TOOLS | INCIDENT RESPONSE PLAN
12017 Require the developer of the system, system component, or system service to provide,
12018 implement, and test an incident response plan.
12019 Discussion: The incident response plan provided by developers may be incorporated into
12020 organizational incident response plans. Developer incident response plans provide
12021 information that is not readily available to organizations. Such information may be extremely
12022 helpful, for example, when organizations respond to vulnerabilities in commercial off-the-
12023 shelf products.
12024 Related Controls: IR-8.
12025 (11) DEVELOPMENT PROCESS, STANDARDS, AND TOOLS | ARCHIVE SYSTEM OR COMPONENT
12026 Require the developer of the system or system component to archive the system or
12027 component to be released or delivered together with the corresponding evidence
12028 supporting the final security and privacy review.
12029 Discussion: Archiving system or system components requires the developer to retain key
12030 development artifacts, including hardware specifications, source code, object code, and
12031 relevant documentation from the development process that can provide a readily available
12032 configuration baseline for system and component upgrades or modifications.
12033 Related Controls: CM-2.
12034 (12) DEVELOPMENT PROCESS, STANDARDS, AND TOOLS | MINIMIZE PERSONALLY IDENTIFIABLE
12035 INFORMATION
12036 Require the developer of the system or system component to minimize the use of
12037 personally identifiable information in development and test environments.
12038 Discussion: Organizations can minimize the risk to an individual’s privacy by using
12039 techniques such as de-identification or synthetic data. Limiting the use of personally
12040 identifiable information in development and test environments helps reduce the level of
12041 privacy risk created by a system.
12042 Related Controls: PM-25.
12043 References: [SP 800-160 v1]; [IR 8179].
12048 Discussion: Developer-provided training applies to external and internal (in-house) developers.
12049 Training of personnel is an essential element to help ensure the effectiveness of the controls
12050 implemented within organizational systems. Types of training include web-based and computer-
12051 based training; classroom-style training; and hands-on training (including micro-training).
12052 Organizations can also request training materials from developers to conduct in-house training or
12053 offer self-training to organizational personnel. Organizations determine the type of training
12054 necessary and may require different types of training for different security and privacy functions,
12055 controls, and mechanisms.
12056 Related Controls: AT-2, AT-3, PE-3, SA-4, SA-5.
12057 Control Enhancements: None.
12058 References: None.
12092 the nature of the behaviors and policies to be described and the available tools. Formal
12093 modeling tools include Gypsy and Zed.
12094 Related Controls: AC-3, AC-4, AC-25.
12095 (2) DEVELOPER SECURITY ARCHITECTURE AND DESIGN | SECURITY-RELEVANT COMPONENTS
12096 Require the developer of the system, system component, or system service to:
12097 (a) Define security-relevant hardware, software, and firmware; and
12098 (b) Provide a rationale that the definition for security-relevant hardware, software, and
12099 firmware is complete.
12100 Discussion: The security-relevant hardware, software, and firmware represent the portion
12101 of the system, component, or service that is trusted to perform correctly to maintain
12102 required security properties.
12103 Related Controls: AC-25, SA-5.
12104 (3) DEVELOPER SECURITY ARCHITECTURE AND DESIGN | FORMAL CORRESPONDENCE
12105 Require the developer of the system, system component, or system service to:
12106 (a) Produce, as an integral part of the development process, a formal top-level
12107 specification that specifies the interfaces to security-relevant hardware, software, and
12108 firmware in terms of exceptions, error messages, and effects;
12109 (b) Show via proof, to the extent feasible with additional informal demonstration as
12110 necessary, that the formal top-level specification is consistent with the formal policy
12111 model;
12112 (c) Show via informal demonstration that the formal top-level specification completely
12113 covers the interfaces to security-relevant hardware, software, and firmware;
12114 (d) Show that the formal top-level specification is an accurate description of the
12115 implemented security-relevant hardware, software, and firmware; and
12116 (e) Describe the security-relevant hardware, software, and firmware mechanisms not
12117 addressed in the formal top-level specification but strictly internal to the security-
12118 relevant hardware, software, and firmware.
12119 Discussion: Correspondence is an important part of the assurance gained through modeling.
12120 It demonstrates that the implementation is an accurate transformation of the model, and
12121 that any additional code or implementation details that are present have no impact on the
12122 behaviors or policies being modeled. Formal methods can be used to show that the high-
12123 level security properties are satisfied by the formal system description, and that the formal
12124 system description is correctly implemented by a description of some lower level, including a
12125 hardware description. Consistency between the formal top-level specification and the formal
12126 policy models is generally not amenable to being fully proven. Therefore, a combination of
12127 formal and informal methods may be needed to demonstrate such consistency. Consistency
12128 between the formal top-level specification and the actual implementation may require the
12129 use of an informal demonstration due to limitations in the applicability of formal methods to
12130 prove that the specification accurately reflects the implementation. Hardware, software, and
12131 firmware mechanisms internal to security-relevant components include mapping registers
12132 and direct memory input and output.
12133 Related Controls: AC-3, AC-4, AC-25, SA-4, SA-5.
12134 (4) DEVELOPER SECURITY ARCHITECTURE AND DESIGN | INFORMAL CORRESPONDENCE
12135 Require the developer of the system, system component, or system service to:
12136 (a) Produce, as an integral part of the development process, an informal descriptive top-
12137 level specification that specifies the interfaces to security-relevant hardware,
12138 software, and firmware in terms of exceptions, error messages, and effects;
12139 (b) Show via [Selection: informal demonstration, convincing argument with formal
12140 methods as feasible] that the descriptive top-level specification is consistent with the
12141 formal policy model;
12142 (c) Show via informal demonstration that the descriptive top-level specification
12143 completely covers the interfaces to security-relevant hardware, software, and
12144 firmware;
12145 (d) Show that the descriptive top-level specification is an accurate description of the
12146 interfaces to security-relevant hardware, software, and firmware; and
12147 (e) Describe the security-relevant hardware, software, and firmware mechanisms not
12148 addressed in the descriptive top-level specification but strictly internal to the security-
12149 relevant hardware, software, and firmware.
12150 Discussion: Correspondence is an important part of the assurance gained through modeling.
12151 It demonstrates that the implementation is an accurate transformation of the model, and
12152 that any additional code or implementation details present have no impact on the behaviors
12153 or policies being modeled. Consistency between the descriptive top-level specification (i.e.,
12154 high-level/low-level design) and the formal policy model is generally not amenable to being
12155 fully proven. Therefore, a combination of formal and informal methods may be needed to
12156 show such consistency. Hardware, software, and firmware mechanisms strictly internal to
12157 security-relevant hardware, software, and firmware include mapping registers and direct
12158 memory input and output.
12159 Related Controls: AC-3, AC-4, AC-25, SA-4, SA-5.
12160 (5) DEVELOPER SECURITY ARCHITECTURE AND DESIGN | CONCEPTUALLY SIMPLE DESIGN
12161 Require the developer of the system, system component, or system service to:
12162 (a) Design and structure the security-relevant hardware, software, and firmware to use a
12163 complete, conceptually simple protection mechanism with precisely defined
12164 semantics; and
12165 (b) Internally structure the security-relevant hardware, software, and firmware with
12166 specific regard for this mechanism.
12167 Discussion: The principle of reduced complexity states that the system design is as simple
12168 and small as possible (see SA-8(7)). A small and simple design is easier to understand and
12169 analyze, and is also less prone to error (see AC-25, SA-8(13)). The principle of reduced
12170 complexity applies to any aspect of a system, but it has particular importance for security
12171 due to the various analyses performed to obtain evidence about the emergent security
12172 property of the system. For such analyses to be successful, a small and simple design is
12173 essential. Application of the principle of reduced complexity contributes to the ability of
12174 system developers to understand the correctness and completeness of system security
12175 functions and facilitates the identification of potential vulnerabilities. The corollary of
12176 reduced complexity states that the simplicity of the system is inversely related to the number
12177 of vulnerabilities it will contain; that is, simpler systems contain fewer vulnerabilities. An
12178 important benefit of reduced complexity is that it is easier to understand whether the
12179 security policy has been captured in the system design, and that fewer vulnerabilities are
12180 likely to be introduced during engineering development. An additional benefit is that any
12181 such conclusion about correctness, completeness, and existence of vulnerabilities can be
12182 reached with a higher degree of assurance in contrast to conclusions reached in situations
12183 where the system design is inherently more complex.
12184 Related Controls: AC-25, SA-8, SC-3.
12185 (6) DEVELOPER SECURITY ARCHITECTURE AND DESIGN | STRUCTURE FOR TESTING
12186 Require the developer of the system, system component, or system service to structure
12187 security-relevant hardware, software, and firmware to facilitate testing.
12188 Discussion: Applying the security design principles in [SP 800-160 v1] promotes complete,
12189 consistent, and comprehensive testing and evaluation of systems, system components, and
12190 services. The thoroughness of such testing contributes to the evidence produced to generate
12191 an effective assurance case or argument as to the trustworthiness of the system, system
12192 component, or service.
12193 Related Controls: SA-5, SA-11.
12194 (7) DEVELOPER SECURITY ARCHITECTURE AND DESIGN | STRUCTURE FOR LEAST PRIVILEGE
12195 Require the developer of the system, system component, or system service to structure
12196 security-relevant hardware, software, and firmware to facilitate controlling access with
12197 least privilege.
12198 Discussion: The principle of least privilege states that each component is allocated sufficient
12199 privileges to accomplish its specified functions, but no more (see SA-8(14)). Applying the
12200 principle of least privilege limits the scope of the component’s actions, which has two
12201 desirable effects. First, the security impact of a failure, corruption, or misuse of the system
12202 component is minimized. Second, the security analysis of the
12203 component is simplified. Least privilege is a pervasive principle that is reflected in all aspects
12204 of the secure system design. Interfaces used to invoke component capability are available to
12205 only certain subsets of the user population, and component design supports a sufficiently
12206 fine granularity of privilege decomposition. For example, in the case of an audit mechanism,
12207 there may be an interface for the audit manager, who configures the audit settings; an
12208 interface for the audit operator, who ensures that audit data is safely collected and stored;
12209 and, finally, yet another interface for the audit reviewer, who has need only to view the
12210 audit data that has been collected but no need to perform operations on that data.
12211 In addition to its manifestations at the system interface, least privilege can be used as a
12212 guiding principle for the internal structure of the system itself. One aspect of internal least
12213 privilege is to construct modules so that only the elements encapsulated by the module are
12214 directly operated upon by the functions within the module. Elements external to a module
12215 that may be affected by the module’s operation are indirectly accessed through interaction
12216 (e.g., via a function call) with the module that contains those elements. Another aspect of
12217 internal least privilege is that the scope of a given module or component includes only those
12218 system elements that are necessary for its functionality, and that the access modes to the
12219 elements (e.g., read, write) are minimal.
12220 Related Controls: AC-5, AC-6, SA-8.
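The audit example above can be illustrated with a simple role-to-operation mapping in which each audit role is granted only the operations it needs. The role and operation names in the following Python sketch are placeholders.

    ROLE_OPERATIONS = {
        "audit_manager": {"configure_settings"},
        "audit_operator": {"collect_records", "store_records"},
        "audit_reviewer": {"view_records"},
    }

    def authorized(role: str, operation: str) -> bool:
        # Permit an operation only if the role's minimal privilege set includes it.
        return operation in ROLE_OPERATIONS.get(role, set())

    print(authorized("audit_reviewer", "view_records"))        # True
    print(authorized("audit_reviewer", "configure_settings"))  # False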
12221 (8) DEVELOPER SECURITY ARCHITECTURE AND DESIGN | ORCHESTRATION
12222 Design [Assignment: organization-defined critical systems or system components] with
12223 coordinated behavior to implement the following capabilities: [Assignment: organization-
12224 defined capabilities, by system or component].
12225 Discussion: Security resources that are distributed, located at different layers or in different
12226 system elements, or are implemented to support different aspects of trustworthiness can
12227 interact in unforeseen or incorrect ways. Adverse consequences can include cascading
12228 failures, interference, or coverage gaps. Coordination of the behavior of security resources
12229 (e.g., by ensuring that one patch is installed across all resources before making a
12230 configuration change that assumes that the patch is propagated) can avert such negative
12231 interactions.
12232 Related Controls: None.
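The patch-propagation example above can be illustrated with a simple pre-change check that verifies every coordinated security resource is at the required version before a dependent configuration change is applied. The resource names and version strings in the following Python sketch are placeholders.

    def all_patched(resource_versions: dict, required_version: str) -> bool:
        # True only when every coordinated resource reports the required patch level.
        return all(version == required_version for version in resource_versions.values())

    resources = {"gateway-fw": "2.4.1", "ids-sensor": "2.4.1", "web-proxy": "2.4.0"}
    if all_patched(resources, "2.4.1"):
        print("Safe to apply the dependent configuration change.")
    else:
        print("Hold the change: at least one resource is not yet patched.")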
12273 Discussion: Organizations determine that certain system components likely cannot be trusted
12274 due to specific threats to and vulnerabilities in those components, and for which there are no
12275 viable security controls to adequately mitigate the resulting risk. Re-implementation or custom
12276 development of such components may satisfy requirements for higher assurance and is carried
12277 out by initiating changes to system components (including hardware, software, and firmware)
12278 such that the standard attacks by adversaries are less likely to succeed. In situations where no
12279 alternative sourcing is available and organizations choose not to re-implement or custom
12280 develop critical system components, additional controls can be employed. Controls include
12281 enhanced auditing; restrictions on source code and system utility access; and protection from
12282 deletion of system and application files.
12283 Related Controls: CP-2, RA-9, SA-8.
12284 Control Enhancements: None.
12285 References: [SP 800-160 v1].
12395 require privileged user access. The separation of user functions from system management
12396 functions is physical or logical. Organizations implement separation of system management
12397 functions from user functions, for example, by using different computers, instances of operating
12398 systems, central processing units, or network addresses; by employing virtualization techniques;
12399 or some combination of these or other methods. Separation of system management functions
12400 from user functions includes web administrative interfaces that employ separate authentication
12401 methods for users of any other system resources. Separation of system and user functions may
12402 include isolating administrative interfaces on different domains and with additional access
12403 controls. The separation of system and user functionality can be achieved by applying the
12404 systems security engineering design principles in SA-8 including SA-8(1), SA-8(3), SA-8(4), SA-
12405 8(10), SA-8(12), SA-8(13), SA-8(14), and SA-8(18).
12406 Related Controls: AC-6, SA-4, SA-8, SC-3, SC-7, SC-22, SC-32, SC-39.
12407 Control Enhancements:
12408 (1) SEPARATION OF SYSTEM AND USER FUNCTIONALITY | INTERFACES FOR NON-PRIVILEGED USERS
12409 Prevent the presentation of system management functionality at interfaces to non-
12410 privileged users.
12411 Discussion: Preventing the presentation of system management functionality at interfaces
12412 to non-privileged users ensures that system administration options, including administrator
12413 privileges, are not available to the general user population. Restricting user access also
12414 prohibits the use of the grey-out option commonly used to eliminate accessibility to such
12415 information. One potential solution is to withhold system administration options until users
12416 establish sessions with administrator privileges.
12417 Related Controls: AC-3.
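The following Python sketch illustrates the approach of withholding system administration options from non-privileged sessions, as described above. The option names and the session model are illustrative placeholders.

    from dataclasses import dataclass

    @dataclass
    class Session:
        user: str
        is_admin: bool = False

    USER_OPTIONS = ["view reports", "change password"]
    ADMIN_OPTIONS = ["manage accounts", "edit audit settings"]

    def visible_options(session: Session) -> list:
        # Non-privileged sessions never see administrative functionality.
        return USER_OPTIONS + (ADMIN_OPTIONS if session.is_admin else [])

    print(visible_options(Session("analyst")))              # user options only
    print(visible_options(Session("ops1", is_admin=True)))  # includes admin options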
12418 (2) SEPARATION OF SYSTEM AND USER FUNCTIONALITY | DISASSOCIABILITY
12419 Store state information from applications and software separately.
12420 Discussion: If a system is compromised, storing applications and software separately from
12421 state information about users’ interactions with an application may better protect
12422 individuals’ privacy.
12423 Related Controls: None.
12424 References: None.
12440 Related Controls: AC-3, AC-6, AC-25, CM-2, CM-4, SA-4, SA-5, SA-8, SA-15, SA-17, SC-2, SC-7, SC-
12441 32, SC-39, SI-16.
12442 Control Enhancements:
12443 (1) SECURITY FUNCTION ISOLATION | HARDWARE SEPARATION
12444 Employ hardware separation mechanisms to implement security function isolation.
12445 Discussion: Hardware separation mechanisms include hardware ring architectures that are
12446 implemented within microprocessors, and hardware-enforced address segmentation used to
12447 support logically distinct storage objects with separate attributes (i.e., readable, writeable).
12448 Related Controls: None.
12449 (2) SECURITY FUNCTION ISOLATION | ACCESS AND FLOW CONTROL FUNCTIONS
12450 Isolate security functions enforcing access and information flow control from nonsecurity
12451 functions and from other security functions.
12452 Discussion: Security function isolation occurs because of implementation. The functions can
12453 still be scanned and monitored. Security functions that are potentially isolated from access
12454 and flow control enforcement functions include auditing, intrusion detection, and malicious
12455 code protection functions.
12456 Related Controls: None.
12457 (3) SECURITY FUNCTION ISOLATION | MINIMIZE NONSECURITY FUNCTIONALITY
12458 Minimize the number of nonsecurity functions included within the isolation boundary
12459 containing security functions.
12460 Discussion: Where it is not feasible to achieve strict isolation of nonsecurity functions from
12461 security functions, it is necessary to take actions to minimize nonsecurity-relevant functions
12462 within the security function boundary. Nonsecurity functions contained within the isolation
12463 boundary are considered security-relevant because errors or malicious code in the software
12464 can directly impact the security functions of systems. The fundamental design objective is
12465 that the specific portions of systems providing information security are of minimal size and
12466 complexity. Minimizing the number of nonsecurity functions in the security-relevant system
12467 components allows designers and implementers to focus only on those functions which are
12468 necessary to provide the desired security capability (typically access enforcement). By
12469 minimizing the nonsecurity functions within the isolation boundaries, the amount of code
12470 that is trusted to enforce security policies is significantly reduced, thus contributing to
12471 understandability.
12472 Related Controls: None.
12473 (4) SECURITY FUNCTION ISOLATION | MODULE COUPLING AND COHESIVENESS
12474 Implement security functions as largely independent modules that maximize internal
12475 cohesiveness within modules and minimize coupling between modules.
12476 Discussion: The reduction in inter-module interactions helps to constrain security functions
12477 and manage complexity. The concepts of coupling and cohesion are important with respect
12478 to modularity in software design. Coupling refers to the dependencies that one module has
12479 on other modules. Cohesion refers to the relationship between functions within a module.
12480 Best practices in software engineering and systems security engineering rely on layering,
12481 minimization, and modular decomposition to reduce and manage complexity. This produces
12482 software modules that are highly cohesive and loosely coupled.
12483 Related Controls: None.
12527 b. Employ the following controls to achieve the denial of service objective: [Assignment:
12528 organization-defined controls by type of denial of service event].
12529 Discussion: Denial of service events may occur due to a variety of internal and external causes
12530 such as an attack by an adversary or a lack of planning to support organizational needs with
12531 respect to capacity and bandwidth. Such attacks can occur across a variety of network protocols
12532 (e.g., IPv4, IPv6). A variety of technologies are available to limit or eliminate the origination and
12533 effects of denial of service events. For example, boundary protection devices can filter certain
12534 types of packets to protect system components on internal networks from being directly affected
12535 by, or the source of, denial of service attacks. Employing increased network capacity and
12536 bandwidth combined with service redundancy also reduces the susceptibility to denial of service
12537 events.
12538 Related Controls: CP-2, IR-4, SC-6, SC-7, SC-40.
12539 Control Enhancements:
12540 (1) DENIAL OF SERVICE PROTECTION | RESTRICT ABILITY TO ATTACK OTHER SYSTEMS
12541 Restrict the ability of individuals to launch the following denial of service attacks against
12542 other systems: [Assignment: organization-defined denial of service attacks].
12543 Discussion: Restricting the ability of individuals to launch denial of service attacks requires
12544 that the mechanisms commonly used for such attacks be unavailable. Individuals of concern
12545 include hostile insiders or external adversaries that have breached or compromised the
12546 system and are using the system to launch a denial of service attack. Organizations can
12547 restrict the ability of individuals to connect and transmit arbitrary information on the
12548 transport medium (i.e., wired networks, wireless networks, spoofed Internet protocol
12549 packets). Organizations can also limit the ability of individuals to use excessive system
12550 resources. Protection against individuals having the ability to launch denial of service attacks
12551 may be implemented on specific systems or on boundary devices prohibiting egress to
12552 potential target systems.
12553 Related Controls: None.
12554 (2) DENIAL OF SERVICE PROTECTION | CAPACITY, BANDWIDTH, AND REDUNDANCY
12555 Manage capacity, bandwidth, or other redundancy to limit the effects of information
12556 flooding denial of service attacks.
12557 Discussion: Managing capacity ensures that sufficient capacity is available to counter
12558 flooding attacks. Managing capacity includes establishing selected usage priorities, quotas,
12559 partitioning, or load balancing.
12560 Related Controls: None.
12561 (3) DENIAL OF SERVICE PROTECTION | DETECTION AND MONITORING
12562 (a) Employ the following monitoring tools to detect indicators of denial of service attacks
12563 against, or launched from, the system: [Assignment: organization-defined monitoring
12564 tools]; and
12565 (b) Monitor the following system resources to determine if sufficient resources exist to
12566 prevent effective denial of service attacks: [Assignment: organization-defined system
12567 resources].
12568 Discussion: Organizations consider utilization and capacity of system resources when
12569 managing risk from denial of service due to malicious attacks. Denial of service attacks can
12570 originate from external or internal sources. System resources sensitive to denial of service
12571 include physical disk storage, memory, and CPU cycles. Controls used to prevent denial of
12572 service attacks related to storage utilization and capacity include instituting disk quotas;
12573 configuring systems to automatically alert administrators when specific storage capacity
12574 thresholds are reached; using file compression technologies to maximize available storage
12575 space; and imposing separate partitions for system and user data.
12576 Related Controls: CA-7, SI-4.
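As a minimal illustration of the storage-related controls listed above, the following Python sketch raises a warning when utilization of a file system crosses a defined threshold. The mount point and threshold value are placeholders to be replaced by organization-defined values.

    import shutil

    def storage_alert(path: str = "/", threshold: float = 0.90) -> bool:
        # Return True (and print a warning) if utilization exceeds the threshold.
        usage = shutil.disk_usage(path)
        utilization = usage.used / usage.total
        if utilization > threshold:
            print(f"WARNING: {path} is {utilization:.0%} full (threshold {threshold:.0%})")
            return True
        return False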
12577 References: [SP 800-189].
12710 and control centers; monitoring for steganography; disassembling and reassembling packet
12711 headers; and employing data loss and data leakage prevention tools. Devices that enforce
12712 strict adherence to protocol formats include deep packet inspection firewalls and XML
12713 gateways. The devices verify adherence to protocol formats and specifications at the
12714 application layer and identify vulnerabilities that cannot be detected by devices operating at
12715 the network or transport layers. Prevention of exfiltration is similar to data loss prevention
12716 or data leakage prevention and is closely associated with cross-domain solutions and system
12717 guards enforcing information flow requirements.
12718 Related Controls: AC-2, SI-3.
12719 (11) BOUNDARY PROTECTION | RESTRICT INCOMING COMMUNICATIONS TRAFFIC
12720 Only allow incoming communications from [Assignment: organization-defined authorized
12721 sources] to be routed to [Assignment: organization-defined authorized destinations].
12722 Discussion: General source address validation techniques should be applied to restrict the
12723 use of illegal and unallocated source addresses and source addresses that should only be
12724 used inside the system boundary. Restricting incoming communications traffic allows organizations
12725 to determine that source and destination address pairs represent authorized or allowed
12726 communications. Determinations can be based on several factors, including the presence of
12727 such address pairs in the lists of authorized or allowed communications; the absence of such
12728 address pairs in lists of unauthorized or disallowed pairs; or meeting more general rules for
12729 authorized or allowed source and destination pairs. Strong authentication of network
12730 addresses is not possible without the use of explicit security protocols, and thus addresses
12731 can often be spoofed. Further, identity-based incoming traffic restriction methods can be
12732 employed, including router access control lists and firewall rules.
12733 Related Controls: AC-3.
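The following Python sketch illustrates an allowlist check on source and destination address pairs, one of the determination factors described above. The example networks are documentation and private address ranges used only as placeholders.

    from ipaddress import ip_address, ip_network

    ALLOWED_PAIRS = [(ip_network("198.51.100.0/24"), ip_network("10.0.0.0/8"))]

    def pair_allowed(src: str, dst: str) -> bool:
        # True when the source/destination pair falls within an allowed network pair.
        source, destination = ip_address(src), ip_address(dst)
        return any(source in src_net and destination in dst_net
                   for src_net, dst_net in ALLOWED_PAIRS)

    print(pair_allowed("198.51.100.7", "10.1.2.3"))   # True
    print(pair_allowed("203.0.113.9", "10.1.2.3"))    # False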
12734 (12) BOUNDARY PROTECTION | HOST-BASED PROTECTION
12735 Implement [Assignment: organization-defined host-based boundary protection
12736 mechanisms] at [Assignment: organization-defined system components].
12737 Discussion: Host-based boundary protection mechanisms include host-based firewalls.
12738 System components employing host-based boundary protection mechanisms include
12739 servers, workstations, notebook computers, and mobile devices.
12740 Related Controls: None.
12741 (13) BOUNDARY PROTECTION | ISOLATION OF SECURITY TOOLS, MECHANISMS, AND SUPPORT
12742 COMPONENTS
12743 Isolate [Assignment: organization-defined information security tools, mechanisms, and
12744 support components] from other internal system components by implementing physically
12745 separate subnetworks with managed interfaces to other components of the system.
12746 Discussion: Physically separate subnetworks with managed interfaces are useful, for
12747 example, in isolating computer network defenses from critical operational processing
12748 networks to prevent adversaries from discovering the analysis and forensics techniques
12749 employed by organizations.
12750 Related Controls: SC-2, SC-3.
12751 (14) BOUNDARY PROTECTION | PROTECT AGAINST UNAUTHORIZED PHYSICAL CONNECTIONS
12752 Protect against unauthorized physical connections at [Assignment: organization-defined
12753 managed interfaces].
12754 Discussion: Systems operating at different security categories or classification levels may
12755 share common physical and environmental controls, since the systems may share space
12756 within the same facilities. In practice, it is possible that these separate systems may share
12757 common equipment rooms, wiring closets, and cable distribution paths. Protection against
12758 unauthorized physical connections can be achieved, for example, by using clearly identified
12759 and physically separated cable trays, connection frames, and patch panels for each side of
12760 managed interfaces with physical access controls enforcing limited authorized access to
12761 these items.
12762 Related Controls: PE-4, PE-19.
12763 (15) BOUNDARY PROTECTION | NETWORKED PRIVILEGED ACCESSES
12764 Route networked, privileged accesses through a dedicated, managed interface for
12765 purposes of access control and auditing.
12766 Discussion: Privileged access provides greater accessibility to system functions, including
12767 security functions. Adversaries typically attempt to gain privileged access to systems through
12768 remote access to cause adverse mission or business impact, for example, by exfiltrating
12769 sensitive information or bringing down a critical system capability. Routing networked,
12770 privileged access requests through a dedicated, managed interface can facilitate strong
12771 access controls (including strong authentication) and a comprehensive auditing capability.
12772 Related Controls: AC-2, AC-3, AU-2, SI-4.
12773 (16) BOUNDARY PROTECTION | PREVENT DISCOVERY OF COMPONENTS AND DEVICES
12774 Prevent the discovery of specific system components that represent a managed interface.
12775 Discussion: This control enhancement protects network addresses of system components
12776 that are part of managed interfaces from discovery through common tools and techniques
12777 used to identify devices on networks. Network addresses are not available for discovery,
12778 requiring prior knowledge for access. Preventing discovery of components and devices can
12779 be accomplished by not publishing network addresses, using network address translation, or
12780 not entering the addresses in domain name systems. Another prevention technique is to
12781 periodically change network addresses.
12782 Related Controls: None.
12783 (17) BOUNDARY PROTECTION | AUTOMATED ENFORCEMENT OF PROTOCOL FORMATS
12784 Enforce adherence to protocol formats.
12785 Discussion: System components that enforce protocol formats include deep packet
12786 inspection firewalls and XML gateways. The components verify adherence to protocol
12787 formats and specifications at the application layer and identify vulnerabilities that cannot be
12788 detected by devices operating at the network or transport layers.
12789 Related Controls: SC-4.
12790 (18) BOUNDARY PROTECTION | FAIL SECURE
12791 Prevent systems from entering unsecure states in the event of an operational failure of a
12792 boundary protection device.
12793 Discussion: Fail secure is a condition achieved by employing mechanisms to ensure that in
12794 the event of operational failures of boundary protection devices at managed interfaces,
12795 systems do not enter into unsecure states where intended security properties no longer
12796 hold. Managed interfaces include routers, firewalls, and application gateways residing on
12797 protected subnetworks commonly referred to as demilitarized zones. Failures of boundary
12798 protection devices cannot lead to, or cause, information external to the devices to enter the
12799 devices, nor can failures permit unauthorized information releases.
12800 Related Controls: CP-2, CP-12, SC-24.
12983 patterns prevents the derivation of intelligence from the system communications patterns.
12984 Alternative physical controls include protected distribution systems.
12985 Related Controls: SC-12, SC-13.
12986 (5) TRANSMISSION CONFIDENTIALITY AND INTEGRITY | PROTECTED DISTRIBUTION SYSTEM
12987 Implement [Assignment: organization-defined protected distribution system] to [Selection
12988 (one or more): prevent unauthorized disclosure of information; detect changes to
12989 information] during transmission.
12990 Discussion: The purpose of a protected distribution system is to deter, detect, and/or make
12991 difficult physical access to the communication lines carrying national security information.
12992 Related Controls: None.
12993 References: [FIPS 140-3]; [FIPS 197]; [SP 800-52]; [SP 800-77]; [SP 800-81-2]; [SP 800-113]; [SP
12994 800-177]; [IR 8023].
13025 example, the <CTRL> + <ALT> + <DEL> keys. Note, however, that any such key combinations are
13026 platform-specific and may not provide a trusted path implementation in every case. Enforcement
13027 of trusted communications paths is typically provided by a specific implementation that meets
13028 the reference monitor concept.
13029 Related Controls: AC-16, AC-25, SC-12, SC-23.
13030 Control Enhancements:
13031 (1) TRUSTED PATH | IRREFUTABLE COMMUNICATIONS PATH
13032 (a) Provide a trusted communications path that is irrefutably distinguishable from other
13033 communications paths; and
13034 (b) Initiate the trusted communications path for communications between the
13035 [Assignment: organization-defined security functions] of the system and the user.
13036 Discussion: An irrefutable communications path permits the system to initiate a trusted path,
13037 which necessitates that the user can unmistakably recognize the source of the communication as
13038 a trusted system component. For example, the trusted path may appear in an area of the display
13039 that other applications cannot access or be based on the presence of an identifier that cannot be
13040 spoofed.
13041 Related Controls: None.
13042 References: [OMB A-130].
13068 Discussion: [SP 800-56A], [SP 800-56B], and [SP 800-56C] provide guidance on cryptographic
13069 key establishment schemes and key derivation methods. [SP 800-57-1], [SP 800-57-2], and
13070 [SP 800-57-3] provide guidance on cryptographic key management.
13071 Related Controls: None.
13072 (3) CRYPTOGRAPHIC KEY ESTABLISHMENT AND MANAGEMENT | ASYMMETRIC KEYS
13073 Produce, control, and distribute asymmetric cryptographic keys using [Selection: NSA-
13074 approved key management technology and processes; prepositioned keying material;
13075 DoD-approved or DoD-issued Medium Assurance PKI certificates; DoD-approved or DoD-
13076 issued Medium Hardware Assurance PKI certificates and hardware security tokens that
13077 protect the user’s private key; certificates issued in accordance with organization-defined
13078 requirements].
13079 Discussion: [SP 800-56A], [SP 800-56B], and [SP 800-56C] provide guidance on cryptographic
13080 key establishment schemes and key derivation methods. [SP 800-57-1], [SP 800-57-2], and
13081 [SP 800-57-3] provide guidance on cryptographic key management.
13082 Related Controls: None.
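As one illustrative ingredient of asymmetric key management, the following Python sketch generates an RSA key pair using the third-party cryptography package (an assumption about the available tooling). Key generation alone does not satisfy the control; distribution, protection, and certificate issuance still follow the selected organization-defined processes and the key-management guidance cited above.

    from cryptography.hazmat.primitives.asymmetric import rsa
    from cryptography.hazmat.primitives import serialization

    # Generate a 3072-bit RSA key pair; the private key must then be protected and
    # managed under the organization's key management processes.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    public_pem = private_key.public_key().public_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    print(public_pem.decode())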
13083 (4) CRYPTOGRAPHIC KEY ESTABLISHMENT AND MANAGEMENT | PKI CERTIFICATES
13084 [Withdrawn: Incorporated into SC-12(3).]
13085 (5) CRYPTOGRAPHIC KEY ESTABLISHMENT AND MANAGEMENT | PKI CERTIFICATES / HARDWARE TOKENS
13086 [Withdrawn: Incorporated into SC-12(3).]
13087 (6) CRYPTOGRAPHIC KEY ESTABLISHMENT AND MANAGEMENT | PHYSICAL CONTROL OF KEYS
13088 Maintain physical control of cryptographic keys when stored information is encrypted by
13089 external service providers.
13090 Discussion: For organizations using external service providers, for example, cloud service
13091 providers or data center providers, physical control of cryptographic keys provides additional
13092 assurance that information stored by such external providers is not subject to unauthorized
13093 disclosure or modification.
13094 Related Controls: None.
13095 References: [FIPS 140-3]; [SP 800-56A]; [SP 800-56B]; [SP 800-56C]; [SP 800-57-1]; [SP 800-57-2];
13096 [SP 800-57-3]; [SP 800-63-3]; [IR 7956]; [IR 7966].
13112 implemented in accordance with applicable laws, executive orders, directives, regulations,
13113 policies, standards, and guidelines.
13114 Related Controls: AC-2, AC-3, AC-7, AC-17, AC-18, AC-19, AU-9, AU-10, CM-11, CP-9, IA-3, IA-7,
13115 MA-4, MP-2, MP-4, MP-5, SA-4, SA-8, SA-9, SC-8, SC-12, SC-20, SC-23, SC-28, SC-40, SI-3, SI-7.
13116 Control Enhancements:
13117 (1) CRYPTOGRAPHIC PROTECTION | FIPS-VALIDATED CRYPTOGRAPHY
13118 [Withdrawn: Incorporated into SC-13.]
13119 (2) CRYPTOGRAPHIC PROTECTION | NSA-APPROVED CRYPTOGRAPHY
13120 [Withdrawn: Incorporated into SC-13.]
13121 (3) CRYPTOGRAPHIC PROTECTION | INDIVIDUALS WITHOUT FORMAL ACCESS APPROVALS
13122 [Withdrawn: Incorporated into SC-13.]
13123 (4) CRYPTOGRAPHIC PROTECTION | DIGITAL SIGNATURES
13124 [Withdrawn: Incorporated into SC-13.]
13125 References: [FIPS 140-3].
13151 (3) COLLABORATIVE COMPUTING DEVICES | DISABLING AND REMOVAL IN SECURE WORK AREAS
13152 Disable or remove collaborative computing devices and applications from [Assignment:
13153 organization-defined systems or system components] in [Assignment: organization-defined
13154 secure work areas].
13155 Discussion: Failing to disable or remove collaborative computing devices and applications
13156 from systems or system components can result in compromises of information, including
13157 eavesdropping on conversations. A secure work area includes a sensitive compartmented
13158 information facility (SCIF).
13159 Related Controls: None.
13160 (4) COLLABORATIVE COMPUTING DEVICES | EXPLICITLY INDICATE CURRENT PARTICIPANTS
13161 Provide an explicit indication of current participants in [Assignment: organization-defined
13162 online meetings and teleconferences].
13163 Discussion: Explicitly indicating current participants prevents unauthorized individuals from
13164 participating in collaborative computing sessions without the explicit knowledge of other
13165 participants.
13166 Related Controls: None.
13167 References: None.
13195 system. The alteration of attributes leads organizations to believe that a greater number of
13196 security functions are in place and operational than have actually been implemented.
13197 Related Controls: SI-3, SI-4, SI-7.
13198 References: [OMB A-130].
13236 processing files with embedded macros when such macros have been determined to be
13237 unacceptable mobile code.
13238 Related Controls: None.
13239 (2) MOBILE CODE | ACQUISITION, DEVELOPMENT, AND USE
13240 Verify that the acquisition, development, and use of mobile code to be deployed in the
13241 system meets [Assignment: organization-defined mobile code requirements].
13242 Discussion: None.
13243 Related Controls: None.
13244 (3) MOBILE CODE | PREVENT DOWNLOADING AND EXECUTION
13245 Prevent the download and execution of [Assignment: organization-defined unacceptable
13246 mobile code].
13247 Discussion: None.
13248 Related Controls: None.
13249 (4) MOBILE CODE | PREVENT AUTOMATIC EXECUTION
13250 Prevent the automatic execution of mobile code in [Assignment: organization-defined
13251 software applications] and enforce [Assignment: organization-defined actions] prior to
13252 executing the code.
13253 Discussion: Actions enforced before executing mobile code include prompting users prior to
13254 opening email attachments or clicking on web links. Preventing automatic execution of
13255 mobile code includes disabling auto execute features on system components employing
13256 portable storage devices such as Compact Disks (CDs), Digital Versatile Disks (DVDs), and
13257 Universal Serial Bus (USB) devices.
13258 Related Controls: None.
13259 (5) MOBILE CODE | ALLOW EXECUTION ONLY IN CONFINED ENVIRONMENTS
13260 Allow execution of permitted mobile code only in confined virtual machine environments.
13261 Discussion: Permitting execution of mobile code only in confined virtual machine
13262 environments helps prevent the introduction of malicious code into other systems and
13263 system components.
13264 Related Controls: SC-44, SI-7.
13265 References: [SP 800-28].
13278 address resolution information obtained through the service. Systems that provide name and
13279 address resolution services include domain name system (DNS) servers. Additional artifacts
13280 include DNS Security (DNSSEC) digital signatures and cryptographic keys. Authoritative data
13281 include DNS resource records. The means to indicate the security status of child zones include
13282 the use of delegation signer resource records in the DNS. Systems that use technologies other
13283 than the DNS to map between host and service names and network addresses provide other
13284 means to assure the authenticity and integrity of response data.
13285 Related Controls: AU-10, SC-8, SC-12, SC-13, SC-21, SC-22.
13286 Control Enhancements:
13287 (1) SECURE NAME/ADDRESS RESOLUTION SERVICE (AUTHORITATIVE SOURCE) | CHILD SUBSPACES
13288 [Withdrawn: Incorporated into SC-20.]
13289 (2) SECURE NAME/ADDRESS RESOLUTION SERVICE (AUTHORITATIVE SOURCE) | DATA ORIGIN AND
13290 INTEGRITY
13291 Provide data origin and integrity protection artifacts for internal name/address resolution
13292 queries.
13293 Discussion: None.
13294 Related Controls: None.
13295 References: [FIPS 140-3]; [FIPS 186-4]; [SP 800-81-2].
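For illustration only, the following sketch checks for the DNSSEC artifacts described above, delegation signer (DS) records indicating the security status of a child zone and DNSKEY records used to validate signatures, assuming the third-party "dnspython" package. The zone name is a placeholder, and the checks shown are a sketch rather than a complete DNSSEC validation.

# Illustrative sketch only (assumes the third-party dnspython package).
import dns.resolver

zone = "example.com"   # hypothetical child zone

# Delegation signer (DS) records in the parent zone indicate the security
# status of the child zone.
try:
    ds_records = dns.resolver.resolve(zone, "DS")
    print(f"{zone}: {len(ds_records)} DS record(s) published")
except dns.resolver.NoAnswer:
    print(f"{zone}: no DS records; secure delegation not indicated")

# DNSKEY records carry the public keys used to verify DNSSEC signatures.
dnskeys = dns.resolver.resolve(zone, "DNSKEY")
print(f"{zone}: {len(dnskeys)} DNSKEY record(s) available for validation")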
13320 typically deploy the servers in two geographically separated network subnetworks (i.e., not
13321 located in the same physical facility). For role separation, DNS servers with internal roles only
13322 process name and address resolution requests from within organizations (i.e., from internal
13323 clients). DNS servers with external roles only process name and address resolution information
13324 requests from clients external to organizations (i.e., on external networks including the Internet).
13325 Organizations specify clients that can access authoritative DNS servers in certain roles, for
13326 example, by address ranges and explicit lists.
13327 Related Controls: SC-2, SC-20, SC-21, SC-24.
13328 Control Enhancements: None.
13329 References: [SP 800-81-2].
13533 also increases the chances that adversaries may inadvertently disclose aspects of tradecraft
13534 while attempting to locate critical organizational resources.
13535 Related Controls: None.
13536 (4) CONCEALMENT AND MISDIRECTION | MISLEADING INFORMATION
13537 Employ realistic, but misleading, information in [Assignment: organization-defined system
13538 components] about its security state or posture.
13539 Discussion: This control enhancement is intended to mislead potential adversaries regarding
13540 the nature and extent of controls deployed by organizations. Thus, adversaries may employ
13541 incorrect and ineffective attack techniques. One technique for misleading adversaries is for
13542 organizations to place misleading information regarding the specific controls deployed in
13543 external systems that are known to be targeted by adversaries. Another technique is the use
13544 of deception nets that mimic actual aspects of organizational systems but use, for example,
13545 out-of-date software configurations.
13546 Related Controls: SC-26.
13547 (5) CONCEALMENT AND MISDIRECTION | CONCEALMENT OF SYSTEM COMPONENTS
13548 Employ the following techniques to hide or conceal [Assignment: organization-defined
13549 system components]: [Assignment: organization-defined techniques].
13550 Discussion: By hiding, disguising, or concealing critical system components, organizations
13551 may be able to decrease the probability that adversaries target and successfully compromise
13552 those assets. Potential means to hide, disguise, or conceal system components include
13553 configuration of routers or the use of encryption or virtualization techniques.
13554 Related Controls: None.
13555 References: None.
13659 techniques requires some supporting isolation measures to ensure that any malicious code
13660 discovered during the search and subsequently executed does not infect organizational systems.
13661 Virtualization is a common technique for achieving such isolation.
13662 Related Controls: SC-26, SC-44, SI-3, SI-4.
13663 Control Enhancements: None.
13664 References: None.
13746 Related Controls: CA-2, CA-7, PL-1, PM-9, PM-12, RA-2, RA-3, RA-5, SC-7, SR-3, SR-7.
13747 Control Enhancements: None.
13748 References: None.
13879 traffic navigation could be misused to track movements of individuals. Measures to mitigate
13880 such activities include additional training to ensure that authorized individuals do not abuse
13881 their authority and, where sensor data or information is maintained by external parties,
13882 contractual restrictions on the use of such data or information.
13883 Related Controls: PT-2.
13884 (3) SENSOR CAPABILITY AND DATA | PROHIBIT USE OF DEVICES
13885 Prohibit the use of devices possessing [Assignment: organization-defined environmental
13886 sensing capabilities] in [Assignment: organization-defined facilities, areas, or systems].
13887 Discussion: For example, organizations may prohibit individuals from bringing cell phones or
13888 digital cameras into certain designated facilities or controlled areas within facilities where
13889 classified information is stored or sensitive conversations are taking place.
13890 Related Controls: None.
13891 (4) SENSOR CAPABILITY AND DATA | NOTICE OF COLLECTION
13892 Employ the following measures to facilitate an individual’s awareness that personally
13893 identifiable information is being collected by [Assignment: organization-defined sensors]:
13894 [Assignment: organization-defined measures].
13895 Discussion: Awareness that organizational sensors are collecting data enables individuals to
13896 more effectively engage in managing their privacy. Measures can include conventional
13897 written notices and sensor configurations that make individuals aware directly or indirectly
13898 through other devices that the sensor is collecting information. Usability and efficacy of the
13899 notice are important considerations.
13900 Related Controls: PT-1, PT-5, PT-6.
13901 (5) SENSOR CAPABILITY AND DATA | COLLECTION MINIMIZATION
13902 Employ [Assignment: organization-defined sensors] that are configured to minimize the
13903 collection of information about individuals that is not needed.
13904 Discussion: Although policies to control for authorized use can be applied to information
13905 once it is collected, minimizing the collection of information that is not needed mitigates
13906 privacy risk at the system entry point and mitigates the risk of policy control failures. Sensor
13907 configurations include the obscuring of human features, such as by blurring or pixelating flesh
13908 tones.
13909 Related Controls: SI-12.
13910 References: [OMB A-130]; [SP 800-124].
14007 Discussion: System owners may require additional strength of mechanism and robustness to
14008 ensure domain separation and policy enforcement for specific types of threats and environments
14009 of operation. Hardware-enforced separation and policy enforcement provide greater strength of
14010 mechanism than software-enforced separation and policy enforcement.
14011 Related Controls: AC-4, SA-8, SC-50.
14012 Control Enhancements: None.
14013 References: [SP 800-160 v1].
14051 security as a foundational property. Connections to and from such devices are generally not
14052 encrypted, do not provide the necessary authentication, are not monitored, and are not logged.
14053 As a result, these devices pose a significant cyber threat. In some instances, gaps in IoT, OT, and
14054 IIoT security capabilities may be addressed by employing intermediary devices that can provide
14055 encryption, authentication, security scanning, and logging capabilities, and preclude the devices
14056 from being accessible from the Internet. But such mitigating options are not always available.
14057 The situation is further complicated because some of the IoT/OT/IIoT devices are needed for
14058 essential missions and functions. In those instances, it is necessary that such devices are isolated
14059 from the Internet to reduce the susceptibility to hostile cyber-attacks.
14060 Related Controls: AC-3, AC-4, SA-8, SC-2, SC-3, SC-49.
14061 Control Enhancements: None.
14062 References: [SP 800-160 v1].
14102 b. Test software and firmware updates related to flaw remediation for effectiveness and
14103 potential side effects before installation;
14104 c. Install security-relevant software and firmware updates within [Assignment: organization-
14105 defined time-period] of the release of the updates; and
14106 d. Incorporate flaw remediation into the organizational configuration management process.
14107 Discussion: The need to remediate system flaws applies to all types of software and firmware.
14108 Organizations identify systems affected by software flaws, including potential vulnerabilities
14109 resulting from those flaws, and report this information to designated organizational personnel
14110 with information security and privacy responsibilities. Security-relevant updates include patches,
14111 service packs, and malicious code signatures. Organizations also address flaws discovered during
14112 assessments, continuous monitoring, incident response activities, and system error handling. By
14113 incorporating flaw remediation into configuration management processes, required remediation
14114 actions can be tracked and verified.
14115 Organization-defined time-periods for updating security-relevant software and firmware may
14116 vary based on a variety of risk factors, including the security category of the system or the
14117 criticality of the update (i.e., severity of the vulnerability related to the discovered flaw); the
14118 organizational mission; or the threat environment. Some types of flaw remediation may require
14119 more testing than other types. Organizations determine the type of testing needed for the
14120 specific type of flaw remediation activity under consideration and the types of changes that are
14121 to be configuration-managed. In some situations, organizations may determine that the testing
14122 of software or firmware updates is not necessary or practical, for example, when implementing
14123 simple malicious code signature updates. Organizations consider in testing decisions whether
14124 security-relevant software or firmware updates are obtained from authorized sources with
14125 appropriate digital signatures.
14126 Related Controls: CA-5, CM-3, CM-4, CM-5, CM-6, CM-8, MA-2, RA-5, SA-8, SA-10, SA-11, SI-3, SI-
14127 5, SI-7, SI-11.
14128 Control Enhancements:
14129 (1) FLAW REMEDIATION | CENTRAL MANAGEMENT
14130 Centrally manage the flaw remediation process.
14131 Discussion: Central management is the organization-wide management and implementation
14132 of flaw remediation processes. It includes planning, implementing, assessing, authorizing,
14133 and monitoring the organization-defined, centrally managed flaw remediation controls.
14134 Related Controls: PL-9.
14135 (2) FLAW REMEDIATION | AUTOMATED FLAW REMEDIATION STATUS
14136 Determine if system components have applicable security-relevant software and firmware
14137 updates installed using [Assignment: organization-defined automated mechanisms]
14138 [Assignment: organization-defined frequency].
14139 Discussion: Automated mechanisms can track and determine the status of known flaws for
14140 system components.
14141 Related Controls: CA-7, SI-4.
14142 (3) FLAW REMEDIATION | TIME TO REMEDIATE FLAWS AND BENCHMARKS FOR CORRECTIVE ACTIONS
14143 (a) Measure the time between flaw identification and flaw remediation; and
14144 (b) Establish the following benchmarks for taking corrective actions: [Assignment:
14145 organization-defined benchmarks].
14146 Discussion: Organizations determine the time it takes on average to correct system flaws
14147 after such flaws have been identified, and subsequently establish organizational benchmarks
14148 (i.e., time frames) for taking corrective actions. Benchmarks can be established by the type
14149 of flaw or the severity of the potential vulnerability if the flaw can be exploited.
14150 Related Controls: None.
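For illustration only, the following sketch measures the time between flaw identification and remediation and compares it with organization-defined benchmarks by severity, as described in the discussion above. The severity labels, benchmark values, and flaw records are hypothetical assumptions, not values prescribed by this publication.

# Illustrative sketch only (hypothetical benchmarks and records).
from datetime import datetime, timedelta

# Hypothetical benchmarks (maximum allowed time to remediate) by severity.
BENCHMARKS = {
    "critical": timedelta(days=15),
    "high": timedelta(days=30),
    "moderate": timedelta(days=90),
}

flaws = [
    # (identifier, severity, identified, remediated) -- example records
    ("FLAW-001", "critical", datetime(2020, 1, 2), datetime(2020, 1, 10)),
    ("FLAW-002", "high", datetime(2020, 1, 5), datetime(2020, 3, 1)),
]

for flaw_id, severity, identified, remediated in flaws:
    elapsed = remediated - identified
    benchmark = BENCHMARKS[severity]
    status = "within benchmark" if elapsed <= benchmark else "exceeds benchmark"
    print(f"{flaw_id} ({severity}): remediated in {elapsed.days} days, {status}")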
14151 (4) FLAW REMEDIATION | AUTOMATED PATCH MANAGEMENT TOOLS
14152 Employ automated patch management tools to facilitate flaw remediation to the following
14153 system components: [Assignment: organization-defined system components].
14154 Discussion: Using automated tools to support patch management helps to ensure the
14155 timeliness and completeness of system patching operations.
14156 Related Controls: None.
14157 (5) FLAW REMEDIATION | AUTOMATIC SOFTWARE AND FIRMWARE UPDATES
14158 Install [Assignment: organization-defined security-relevant software and firmware
14159 updates] automatically to [Assignment: organization-defined system components].
14160 Discussion: Due to system integrity and availability concerns, organizations consider the
14161 methodology used to carry out automatic updates. Organizations balance the need to
14162 ensure that the updates are installed as soon as possible with the need to maintain
14163 configuration management and control with any mission or operational impacts that
14164 automatic updates might impose.
14165 Related Controls: None.
14166 (6) FLAW REMEDIATION | REMOVAL OF PREVIOUS VERSIONS OF SOFTWARE AND FIRMWARE
14167 Remove previous versions of [Assignment: organization-defined software and firmware
14168 components] after updated versions have been installed.
14169 Discussion: Previous versions of software or firmware components that are not removed
14170 from the system after updates have been installed may be exploited by adversaries. Some
14171 products may remove previous versions of software and firmware automatically from the
14172 system.
14173 Related Controls: None.
14174 References: [OMB A-130]; [FIPS 140-3]; [FIPS 186-4]; [SP 800-40]; [SP 800-128]; [IR 7788].
14190 d. Address the receipt of false positives during malicious code detection and eradication and
14191 the resulting potential impact on the availability of the system.
14192 Discussion: System entry and exit points include firewalls, remote-access servers, workstations,
14193 electronic mail servers, web servers, proxy servers, notebook computers, and mobile devices.
14194 Malicious code includes viruses, worms, Trojan horses, and spyware. Malicious code can also be
14195 encoded in various formats contained within compressed or hidden files, or hidden in files using
14196 techniques such as steganography. Malicious code can be inserted into systems in a variety of
14197 ways, including by electronic mail, the world-wide web, and portable storage devices. Malicious
14198 code insertions occur through the exploitation of system vulnerabilities. A variety of technologies
14199 and methods exist to limit or eliminate the effects of malicious code.
14200 Malicious code protection mechanisms include both signature- and nonsignature-based
14201 technologies. Nonsignature-based detection mechanisms include artificial intelligence
14202 techniques that use heuristics to detect, analyze, and describe the characteristics or behavior of
14203 malicious code and to provide controls against such code for which signatures do not yet exist or
14204 for which existing signatures may not be effective. Malicious code for which active signatures do
14205 not yet exist or may be ineffective includes polymorphic malicious code (i.e., code that changes
14206 signatures when it replicates). Nonsignature-based mechanisms also include reputation-based
14207 technologies. In addition to the above technologies, pervasive configuration management,
14208 comprehensive software integrity controls, and anti-exploitation software may be effective in
14209 preventing execution of unauthorized code. Malicious code may be present in commercial off-
14210 the-shelf software and in custom-built software and could include logic bombs, back doors, and
14211 other types of attacks that could affect organizational missions and business functions.
14212 In situations where malicious code cannot be detected by detection methods or technologies,
14213 organizations rely on other types of controls, including secure coding practices, configuration
14214 management and control, trusted procurement processes, and monitoring practices to ensure
14215 that software does not perform functions other than the functions intended. Organizations may
14216 determine that, in response to the detection of malicious code, different actions are warranted.
14217 For example, organizations can define actions in response to malicious code detection during
14218 periodic scans, actions in response to detection of malicious downloads, or actions in response to
14219 detection of maliciousness when attempting to open or execute files.
14220 Related Controls: AC-4, AC-19, CM-3, CM-8, IR-4, MA-3, MA-4, RA-5, SC-7, SC-23, SC-26, SC-28,
14221 SC-44, SI-2, SI-4, SI-7, SI-8, SI-15.
14222 Control Enhancements:
14223 (1) MALICIOUS CODE PROTECTION | CENTRAL MANAGEMENT
14224 Centrally manage malicious code protection mechanisms.
14225 Discussion: Central management addresses the organization-wide management and
14226 implementation of malicious code protection mechanisms. Central management includes
14227 planning, implementing, assessing, authorizing, and monitoring the organization-defined,
14228 centrally managed flaw and malicious code protection controls.
14229 Related Controls: PL-9.
14230 (2) MALICIOUS CODE PROTECTION | AUTOMATIC UPDATES
14231 [Withdrawn: Incorporated into SI-3.]
14232 (3) MALICIOUS CODE PROTECTION | NON-PRIVILEGED USERS
14233 [Withdrawn: Incorporated into AC-6(10).]
14234 (4) MALICIOUS CODE PROTECTION | UPDATES ONLY BY PRIVILEGED USERS
14235 Update malicious code protection mechanisms only when directed by a privileged user.
14236 Discussion: Protection mechanisms for malicious code are typically categorized as security-
14237 related software and as such, are only updated by organizational personnel with appropriate
14238 access privileges.
14239 Related Controls: CM-5.
14240 (5) MALICIOUS CODE PROTECTION | PORTABLE STORAGE DEVICES
14241 [Withdrawn: Incorporated into MP-7.]
14242 (6) MALICIOUS CODE PROTECTION | TESTING AND VERIFICATION
14243 (a) Test malicious code protection mechanisms [Assignment: organization-defined
14244 frequency] by introducing known benign code into the system; and
14245 (b) Verify that the detection of the code and the associated incident reporting occur.
14246 Discussion: None.
14247 Related Controls: CA-2, CA-7, RA-5.
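For illustration only, the following sketch exercises a malicious code protection mechanism by introducing a known benign test file and then verifying that detection and reporting occurred, as described in enhancement (6). The drop location, quarantine directory, and test file content are hypothetical; organizations substitute the benign test artifact and reporting checks appropriate to their deployed protection mechanisms.

# Illustrative sketch only (hypothetical paths and test content).
import os
import time

TEST_FILE = "/tmp/benign-detection-test.txt"           # hypothetical drop location
QUARANTINE_DIR = "/var/quarantine"                     # hypothetical quarantine path
BENIGN_TEST_CONTENT = b"BENIGN-DETECTION-TEST-STRING"  # placeholder content

# Introduce the benign test file into a monitored location.
with open(TEST_FILE, "wb") as handle:
    handle.write(BENIGN_TEST_CONTENT)

# Allow the protection mechanism time to scan, then verify that the file was
# removed or quarantined; incident reporting is verified separately.
time.sleep(60)
detected = (not os.path.exists(TEST_FILE)) or any(
    "benign-detection-test" in name for name in os.listdir(QUARANTINE_DIR)
)
print("detection verified" if detected else "detection NOT verified; investigate")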
14248 (7) MALICIOUS CODE PROTECTION | NONSIGNATURE-BASED DETECTION
14249 [Withdrawn: Incorporated into SI-3.]
14250 (8) MALICIOUS CODE PROTECTION | DETECT UNAUTHORIZED COMMANDS
14251 (a) Detect the following unauthorized operating system commands through the kernel
14252 application programming interface on [Assignment: organization-defined system
14253 hardware components]: [Assignment: organization-defined unauthorized operating
14254 system commands]; and
14255 (b) [Selection (one or more): issue a warning; audit the command execution; prevent the
14256 execution of the command].
14257 Discussion: Detecting unauthorized commands can be applied to critical interfaces other
14258 than kernel-based interfaces, including interfaces with virtual machines and privileged
14259 applications. Unauthorized operating system commands include commands for kernel
14260 functions from system processes that are not trusted to initiate such commands, or
14261 commands for kernel functions that are suspicious even though commands of that type are
14262 reasonable for processes to initiate. Organizations can define the malicious commands to be
14263 detected by a combination of command types, command classes, or specific instances of
14264 commands. Organizations can also define hardware components by component type,
14265 component, component location in the network, or a combination thereof. Organizations may
14266 select different actions for different types, classes, or instances of malicious commands.
14267 Related Controls: AU-2, AU-6, AU-12.
14268 (9) MALICIOUS CODE PROTECTION | AUTHENTICATE REMOTE COMMANDS
14269 Implement [Assignment: organization-defined mechanisms] to authenticate [Assignment:
14270 organization-defined remote commands].
14271 Discussion: This control enhancement protects against unauthorized remote commands and
14272 the replay of authorized commands. This capability is important for those remote systems
14273 whose loss, malfunction, misdirection, or exploitation would have immediate and/or serious
14274 consequences, including, for example, injury or death, property damage, loss of high-value
14275 assets, compromise of classified or controlled unclassified information, or failure of missions
14276 or business functions. Authentication safeguards for remote commands ensure that systems
14277 accept and execute commands in the order intended, execute only authorized commands,
14278 and reject unauthorized commands. Cryptographic mechanisms can be employed, for
14279 example, to authenticate remote commands.
14280 Related Controls: SC-12, SC-13, SC-23.
14324 Depending on the security architecture implementation, the distribution and configuration of
14325 monitoring devices may impact throughput at key internal and external boundaries, and at other
14326 locations across a network due to the introduction of network throughput latency. If throughput
14327 management is needed, such devices are strategically located and deployed as part of an
14328 established organization-wide security architecture. Strategic locations for monitoring devices
14329 include selected perimeter locations and near key servers and server farms supporting critical
14330 applications. Monitoring devices are typically employed at the managed interfaces associated
14331 with controls SC-7 and AC-17. The information collected is a function of the organizational
14332 monitoring objectives and the capability of systems to support such objectives. Specific types of
14333 transactions of interest include Hyper Text Transfer Protocol (HTTP) traffic that bypasses HTTP
14334 proxies. System monitoring is an integral part of organizational continuous monitoring and
14335 incident response programs and output from system monitoring serves as input to those
14336 programs. System monitoring requirements, including the need for specific types of system
14337 monitoring, may be referenced in other controls (e.g., AC-2g, AC-2(7), AC-2(12)(a), AC-17(1), AU-
14338 13, AU-13(1), AU-13(2), CM-3f, CM-6d, MA-3a, MA-4a, SC-5(3)(b), SC-7a, SC-7(24)(b), SC-18c, SC-
14339 43b). Adjustments to levels of system monitoring are based on law enforcement information,
14340 intelligence information, or other sources of information. The legality of system monitoring
14341 activities is based on applicable laws, executive orders, directives, regulations, policies,
14342 standards, and guidelines.
14343 Related Controls: AC-2, AC-3, AC-4, AC-8, AC-17, AU-2, AU-6, AU-7, AU-9, AU-12, AU-13, AU-14,
14344 CA-7, CM-3, CM-6, CM-8, CM-11, IA-10, IR-4, MA-3, MA-4, PM-12, RA-5, SC-5, SC-7, SC-18, SC-26,
14345 SC-31, SC-35, SC-36, SC-37, SC-43, SI-3, SI-6, SI-7, SR-9, SR-10.
14346 Control Enhancements:
14347 (1) SYSTEM MONITORING | SYSTEM-WIDE INTRUSION DETECTION SYSTEM
14348 Connect and configure individual intrusion detection tools into a system-wide intrusion
14349 detection system.
14350 Discussion: Linking individual intrusion detection tools into a system-wide intrusion
14351 detection system provides additional coverage and effective detection capability. The
14352 information contained in one intrusion detection tool can be shared widely across the
14353 organization making the system-wide detection capability more robust and powerful.
14354 Related Controls: None.
14355 (2) SYSTEM MONITORING | AUTOMATED TOOLS AND MECHANISMS FOR REAL-TIME ANALYSIS
14356 Employ automated tools and mechanisms to support near real-time analysis of events.
14357 Discussion: Automated tools and mechanisms include host-based, network-based,
14358 transport-based, or storage-based event monitoring tools and mechanisms or Security
14359 Information and Event Management technologies that provide real time analysis of alerts
14360 and notifications generated by organizational systems. Automated monitoring techniques
14361 can create unintended privacy risks because automated controls may connect to external or
14362 otherwise unrelated systems. The matching of records between these systems may create
14363 linkages with unintended consequences. Organizations assess and document these risks in
14364 their privacy impact assessment and make determinations that are in alignment with their
14365 privacy program plan.
14366 Related Controls: PM-23, PM-25.
14367 (3) SYSTEM MONITORING | AUTOMATED TOOL AND MECHANISM INTEGRATION
14368 Employ automated tools and mechanisms to integrate intrusion detection tools and
14369 mechanisms into access control and flow control mechanisms.
14370 Discussion: Using automated tools and mechanisms to integrate intrusion detection tools
14371 and mechanisms into access and flow control mechanisms facilitates a rapid response to
14416 objectives of organizations. The frequency and depth of testing depends on the types of
14417 tools and mechanisms used by organizations and the methods of deployment.
14418 Related Controls: CP-9.
14419 (10) SYSTEM MONITORING | VISIBILITY OF ENCRYPTED COMMUNICATIONS
14420 Make provisions so that [Assignment: organization-defined encrypted communications
14421 traffic] is visible to [Assignment: organization-defined system monitoring tools and
14422 mechanisms].
14423 Discussion: Organizations balance the need for encrypting communications traffic to protect
14424 data confidentiality with the need for having visibility into such traffic from a monitoring
14425 perspective. Organizations determine whether the visibility requirement applies to internal
14426 encrypted traffic, encrypted traffic intended for external destinations, or a subset of the
14427 traffic types.
14428 Related Controls: None.
14429 (11) SYSTEM MONITORING | ANALYZE COMMUNICATIONS TRAFFIC ANOMALIES
14430 Analyze outbound communications traffic at the external interfaces to the system and
14431 selected [Assignment: organization-defined interior points within the system] to discover
14432 anomalies.
14433 Discussion: Organization-defined interior points include subnetworks and subsystems.
14434 Anomalies within organizational systems include large file transfers, long-time persistent
14435 connections, attempts to access information from unexpected locations, the use of unusual
14436 protocols and ports, the use of unmonitored network protocols (e.g., IPv6 usage during IPv4
14437 transition), and attempted communications with suspected malicious external addresses.
14438 Related Controls: None.
14439 (12) SYSTEM MONITORING | AUTOMATED ORGANIZATION-GENERATED ALERTS
14440 Alert [Assignment: organization-defined personnel or roles] using [Assignment:
14441 organization-defined automated mechanisms] when the following indications of
14442 inappropriate or unusual activities with security or privacy implications occur:
14443 [Assignment: organization-defined activities that trigger alerts].
14444 Discussion: Organizational personnel on the system alert notification list include system
14445 administrators, mission or business owners, system owners, senior agency information
14446 security officer, senior agency official for privacy, system security officers, or privacy officers.
14447 This control enhancement focuses on the security alerts generated by organizations and
14448 transmitted using automated means. In contrast to the alerts generated by systems in SI-4(5)
14449 that focus on information sources that are internal to the systems such as audit records, the
14450 sources of information for this enhancement focus on other entities such as suspicious
14451 activity reports and reports on potential insider threats.
14452 Related Controls: None.
14453 (13) SYSTEM MONITORING | ANALYZE TRAFFIC AND EVENT PATTERNS
14454 (a) Analyze communications traffic and event patterns for the system;
14455 (b) Develop profiles representing common traffic and event patterns; and
14456 (c) Use the traffic and event profiles in tuning system-monitoring devices.
14457 Discussion: Identifying and understanding common communications traffic and event
14458 patterns helps organizations provide useful information to system monitoring devices to
14459 more effectively identify suspicious or anomalous traffic and events when they occur. Such
14460 information can help reduce the number of false positives and false negatives during system
14461 monitoring.
14462 Related Controls: None.
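For illustration only, the following sketch builds a simple profile of common traffic patterns and flags the kinds of anomalies described in enhancements (11) and (13), such as traffic to uncommon ports and unusually large transfers. The record fields and thresholds are hypothetical; operational tuning uses organization-defined profiles.

# Illustrative sketch only (hypothetical flow records and thresholds).
from collections import Counter

# Example flow records: (destination_port, bytes_transferred)
baseline_flows = [(443, 12_000), (443, 9_500), (53, 300), (25, 4_000)]
observed_flows = [(443, 11_000), (6667, 800), (443, 900_000_000)]

# Profile: which destination ports are commonly seen in the baseline.
common_ports = Counter(port for port, _ in baseline_flows)
LARGE_TRANSFER_BYTES = 500_000_000  # hypothetical threshold for "large file transfer"

for port, nbytes in observed_flows:
    if port not in common_ports:
        print(f"anomaly: traffic to uncommon port {port}")
    if nbytes > LARGE_TRANSFER_BYTES:
        print(f"anomaly: unusually large transfer ({nbytes} bytes) on port {port}")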
14598 Discussion: The Cybersecurity and Infrastructure Security Agency (CISA) generates security alerts
14599 and advisories to maintain situational awareness throughout the federal government. Security
14600 directives are issued by OMB or other designated organizations with the responsibility and
14601 authority to issue such directives. Compliance with security directives is essential due to the
14602 critical nature of many of these directives and the potential (immediate) adverse effects on
14603 organizational operations and assets, individuals, other organizations, and the Nation should the
14604 directives not be implemented in a timely manner. External organizations include supply chain
14605 partners, external mission or business partners, external service providers, and other peer or
14606 supporting organizations.
14607 Related Controls: PM-15, RA-5, SI-2.
14608 Control Enhancements:
14609 (1) SECURITY ALERTS, ADVISORIES, AND DIRECTIVES | AUTOMATED ALERTS AND ADVISORIES
14610 Broadcast security alert and advisory information throughout the organization using
14611 [Assignment: organization-defined automated mechanisms].
14612 Discussion: The significant number of changes to organizational systems and environments
14613 of operation requires the dissemination of security-related information to a variety of
14614 organizational entities that have a direct interest in the success of organizational missions
14615 and business functions. Based on information provided by security alerts and advisories,
14616 changes may be required at one or more of the three levels related to the management of
14617 information security and privacy risk, including the governance level, mission and business
14618 process level, and the information system level.
14619 Related Controls: None.
14620 References: [SP 800-40].
14641 (2) SECURITY AND PRIVACY FUNCTION VERIFICATION | AUTOMATION SUPPORT FOR DISTRIBUTED
14642 TESTING
14643 Implement automated mechanisms to support the management of distributed security
14644 and privacy function testing.
14645 Discussion: The use of automated mechanisms to support the management of distributed
14646 function testing helps to ensure the integrity, timeliness, completeness, and efficacy of such
14647 testing.
14648 Related Controls: SI-2.
14649 (3) SECURITY AND PRIVACY FUNCTION VERIFICATION | REPORT VERIFICATION RESULTS
14650 Report the results of security and privacy function verification to [Assignment:
14651 organization-defined personnel or roles].
14652 Discussion: Organizational personnel with potential interest in the results of the verification
14653 of security and privacy functions include systems security officers, senior agency information
14654 security officers, and senior agency officials for privacy.
14655 Related Controls: SI-4, SR-4, SR-5.
14656 References: [OMB A-130].
14683 (2) SOFTWARE, FIRMWARE, AND INFORMATION INTEGRITY | AUTOMATED NOTIFICATIONS OF INTEGRITY
14684 VIOLATIONS
14685 Employ automated tools that provide notification to [Assignment: organization-defined
14686 personnel or roles] upon discovering discrepancies during integrity verification.
14687 Discussion: The employment of automated tools to report system and information integrity
14688 violations and to notify organizational personnel in a timely manner is essential to effective
14689 risk response. Personnel having an interest in system and information integrity violations
14690 include mission and business owners, system owners, senior agency information security
14691 official, senior agency official for privacy, systems administrators, software developers,
14692 systems integrators, information security officers, and privacy officers.
14693 Related Controls: None.
14694 (3) SOFTWARE, FIRMWARE, AND INFORMATION INTEGRITY | CENTRALLY-MANAGED INTEGRITY TOOLS
14695 Employ centrally managed integrity verification tools.
14696 Discussion: Centrally-managed integrity verification tools provide greater consistency in
14697 the application of such tools and can facilitate more comprehensive coverage of integrity
14698 verification actions.
14699 Related Controls: AU-3, SI-2, SI-8.
14700 (4) SOFTWARE, FIRMWARE, AND INFORMATION INTEGRITY | TAMPER-EVIDENT PACKAGING
14701 [Withdrawn: Incorporated into SR-9.]
14702 (5) SOFTWARE, FIRMWARE, AND INFORMATION INTEGRITY | AUTOMATED RESPONSE TO INTEGRITY
14703 VIOLATIONS
14704 Automatically [Selection (one or more): shut the system down; restart the system;
14705 implement [Assignment: organization-defined controls]] when integrity violations are
14706 discovered.
14707 Discussion: Organizations may define different integrity checking responses by type of
14708 information, by specific information, or a combination of both. Types of information include
14709 firmware, software, and user data. Specific information includes boot firmware for certain
14710 types of machines. The automatic implementation of controls within organizational systems
14711 includes reversing the changes, halting the system, or triggering audit alerts when
14712 unauthorized modifications to critical security files occur.
14713 Related Controls: None.
14714 (6) SOFTWARE, FIRMWARE, AND INFORMATION INTEGRITY | CRYPTOGRAPHIC PROTECTION
14715 Implement cryptographic mechanisms to detect unauthorized changes to software,
14716 firmware, and information.
14717 Discussion: Cryptographic mechanisms used to protect integrity include digital signatures
14718 and the computation and application of signed hashes using asymmetric cryptography;
14719 protecting the confidentiality of the key used to generate the hash; and using the public key
14720 to verify the hash information. Organizations employing cryptographic mechanisms also
14721 consider cryptographic key management solutions (see SC-12 and SC-13).
14722 Related Controls: SC-12, SC-13.
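For illustration only, the following sketch shows detection of unauthorized changes using a digital signature over a software or firmware image, assuming the third-party Python "cryptography" package and an Ed25519 key pair. The algorithm choice, key handling, and image content are assumptions for illustration; approved mechanisms and key management follow SC-12 and SC-13.

# Illustrative sketch only (assumptions noted above).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

signing_key = Ed25519PrivateKey.generate()      # held by the software producer
verification_key = signing_key.public_key()     # distributed with the system

firmware_image = b"example firmware contents"   # placeholder content
signature = signing_key.sign(firmware_image)

# At verification time, any modification to the image causes verification to fail.
tampered_image = firmware_image + b"#"
try:
    verification_key.verify(signature, tampered_image)
    print("integrity verified")
except InvalidSignature:
    print("integrity violation detected")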
14723 (7) SOFTWARE, FIRMWARE, AND INFORMATION INTEGRITY | INTEGRATION OF DETECTION AND
14724 RESPONSE
14725 Incorporate the detection of the following unauthorized changes into the organizational
14726 incident response capability: [Assignment: organization-defined security-relevant changes
14727 to the system].
14728 Discussion: This control enhancement helps to ensure that detected events are tracked,
14729 monitored, corrected, and available for historical purposes. Maintaining historical records is
14730 important both for being able to identify and discern adversary actions over an extended
14731 time-period and for possible legal actions. Security-relevant changes include unauthorized
14732 changes to established configuration settings or unauthorized elevation of system privileges.
14733 Related Controls: AU-2, AU-6, IR-4, IR-5, SI-4.
14734 (8) SOFTWARE, FIRMWARE, AND INFORMATION INTEGRITY | AUDITING CAPABILITY FOR SIGNIFICANT
14735 EVENTS
14736 Upon detection of a potential integrity violation, provide the capability to audit the event
14737 and initiate the following actions: [Selection (one or more): generate an audit record; alert
14738 current user; alert [Assignment: organization-defined personnel or roles]; [Assignment:
14739 organization-defined other actions]].
14740 Discussion: Organizations select response actions based on types of software, specific
14741 software, or information for which there are potential integrity violations.
14742 Related Controls: AU-2, AU-6, AU-12.
14743 (9) SOFTWARE, FIRMWARE, AND INFORMATION INTEGRITY | VERIFY BOOT PROCESS
14744 Verify the integrity of the boot process of the following system components: [Assignment:
14745 organization-defined system components].
14746 Discussion: Ensuring the integrity of boot processes is critical to starting system components
14747 in known, trustworthy states. Integrity verification mechanisms provide a level of assurance
14748 that only trusted code is executed during boot processes.
14749 Related Controls: SI-6.
14750 (10) SOFTWARE, FIRMWARE, AND INFORMATION INTEGRITY | PROTECTION OF BOOT FIRMWARE
14751 Implement the following mechanisms to protect the integrity of boot firmware in
14752 [Assignment: organization-defined system components]: [Assignment: organization-
14753 defined mechanisms].
14754 Discussion: Unauthorized modifications to boot firmware may indicate a sophisticated,
14755 targeted attack. These types of targeted attacks can result in a permanent denial of service
14756 or a persistent malicious code presence. These situations can occur, for example, if the
14757 firmware is corrupted or if the malicious code is embedded within the firmware. System
14758 components can protect the integrity of boot firmware in organizational systems by verifying
14759 the integrity and authenticity of all updates to the firmware prior to applying changes to the
14760 system component; and preventing unauthorized processes from modifying the boot
14761 firmware.
14762 Related Controls: SI-6.
14763 (11) SOFTWARE, FIRMWARE, AND INFORMATION INTEGRITY | CONFINED ENVIRONMENTS WITH LIMITED
14764 PRIVILEGES
14765 [Withdrawn: Moved to CM-7(6).]
14766 (12) SOFTWARE, FIRMWARE, AND INFORMATION INTEGRITY | INTEGRITY VERIFICATION
14767 Require that the integrity of the following user-installed software be verified prior to
14768 execution: [Assignment: organization-defined user-installed software].
14769 Discussion: Organizations verify the integrity of user-installed software prior to execution to
14770 reduce the likelihood of executing malicious code or executing code that contains errors
14771 from unauthorized modifications. Organizations consider the practicality of approaches to
14772 verifying software integrity, including availability of checksums of adequate trustworthiness
14773 from software developers or vendors.
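For illustration only, the following sketch verifies a checksum of user-installed software against a value published by the developer or vendor before permitting execution, as described in enhancement (12). The installation path and expected digest are placeholders.

# Illustrative sketch only (hypothetical path and expected digest).
import hashlib
import sys

SOFTWARE_PATH = "/opt/tools/example-tool"      # hypothetical installed software
EXPECTED_SHA256 = "0000000000000000000000000000000000000000000000000000000000000000"

digest = hashlib.sha256()
with open(SOFTWARE_PATH, "rb") as handle:
    for chunk in iter(lambda: handle.read(65536), b""):
        digest.update(chunk)

if digest.hexdigest() != EXPECTED_SHA256:
    sys.exit("integrity verification failed; execution blocked")
print("integrity verified; execution may proceed")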
14859 for format and content. For example, if the organization specifies that numerical values between
14860 1 and 100 are the only acceptable inputs for a field in a given application, inputs of 387, abc, or %K%
14861 are invalid inputs and are not accepted as input to the system. Valid inputs are likely to vary from
14862 field to field within a software application. Applications typically follow well-defined protocols
14863 that use structured messages (i.e., commands or queries) to communicate between software
14864 modules or system components. Structured messages can contain raw or unstructured data
14865 interspersed with metadata or control information. If software applications use attacker-supplied
14866 inputs to construct structured messages without properly encoding such messages, then the
14867 attacker could insert malicious commands or special characters that can cause the data to be
14868 interpreted as control information or metadata. Consequently, the module or component that
14869 receives the corrupted output will perform the wrong operations or otherwise interpret the data
14870 incorrectly. Prescreening inputs prior to passing to interpreters prevents the content from being
14871 unintentionally interpreted as commands. Input validation ensures accurate and correct inputs
14872 and prevents attacks such as cross-site scripting and a variety of injection attacks.
14873 Related Controls: None.
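For illustration only, the following sketch implements the numeric-range check described in the discussion above, accepting whole numbers between 1 and 100 and rejecting inputs such as 387, abc, or %K%. The field name and error handling are hypothetical.

# Illustrative sketch only (hypothetical field and error handling).
def validate_quantity(raw_value: str) -> int:
    """Accept only whole numbers between 1 and 100; reject everything else."""
    if not raw_value.isdigit():
        raise ValueError(f"invalid input rejected: {raw_value!r}")
    value = int(raw_value)
    if not 1 <= value <= 100:
        raise ValueError(f"out-of-range input rejected: {value}")
    return value

for candidate in ["42", "387", "abc", "%K%"]:
    try:
        print(f"accepted: {validate_quantity(candidate)}")
    except ValueError as error:
        print(error)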
14874 Control Enhancements:
14875 (1) INFORMATION INPUT VALIDATION | MANUAL OVERRIDE CAPABILITY
14876 (a) Provide a manual override capability for input validation of the following information
14877 inputs: [Assignment: organization-defined inputs];
14878 (b) Restrict the use of the manual override capability to only [Assignment: organization-
14879 defined authorized individuals]; and
14880 (c) Audit the use of the manual override capability.
14881 Discussion: In certain situations, for example, during events that are defined in contingency
14882 plans, a manual override capability for input validation may be needed. Manual overrides
14883 are used only in limited circumstances and with the inputs defined by the organization.
14884 Related Controls: AC-3, AU-2, AU-12.
14885 (2) INFORMATION INPUT VALIDATION | REVIEW AND RESOLVE ERRORS
14886 Review and resolve input validation errors within [Assignment: organization-defined time-
14887 period].
14888 Discussion: Resolution of input validation errors includes correcting systemic causes of
14889 errors and resubmitting transactions with corrected input.
14890 Related Controls: None.
14891 (3) INFORMATION INPUT VALIDATION | PREDICTABLE BEHAVIOR
14892 Verify that the system behaves in a predictable and documented manner when invalid
14893 inputs are received.
14894 Discussion: A common vulnerability in organizational systems is unpredictable behavior
14895 when invalid inputs are received. This control enhancement ensures that there is predictable
14896 behavior when the system receives invalid inputs by specifying system responses that allow
14897 the system to transition to known states without adverse, unintended side effects. The
14898 invalid inputs are those inputs related to the information inputs defined by the organization
14899 in the base control.
14900 Related Controls: None.
14901 (4) INFORMATION INPUT VALIDATION | TIMING INTERACTIONS
14902 Account for timing interactions among system components in determining appropriate
14903 responses for invalid inputs.
14904 Discussion: In addressing invalid system inputs received across protocol interfaces, timing
14905 interactions become relevant, where one protocol needs to consider the impact of the error
14906 response on other protocols in the protocol stack. For example, 802.11 standard wireless
14907 network protocols do not interact well with Transmission Control Protocols (TCP) when
14908 packets are dropped (which could be due to invalid packet input). TCP assumes packet losses
14909 are due to congestion, while packets lost over 802.11 links are typically dropped due to noise
14910 or collisions on the link. If TCP makes a congestion response, it takes the wrong action in
14911 response to a collision event. Adversaries may be able to use what appears to be acceptable
14912 individual behaviors of the protocols in concert to achieve adverse effects through suitable
14913 construction of invalid input.
14914 Related Controls: None.
14915 (5) INFORMATION INPUT VALIDATION | RESTRICT INPUTS TO TRUSTED SOURCES AND APPROVED
14916 FORMATS
14917 Restrict the use of information inputs to [Assignment: organization-defined trusted
14918 sources] and/or [Assignment: organization-defined formats].
14919 Discussion: This control enhancement applies the concept of whitelisting to information
14920 inputs. Specifying known trusted sources for information inputs and acceptable formats for
14921 such inputs can reduce the probability of malicious activity.
14922 Related Controls: AC-3, AC-6.
14923 (6) INFORMATION INPUT VALIDATION | INJECTION PREVENTION
14924 Prevent untrusted data injections.
14925 Discussion: Untrusted data injections may be prevented using, for example, a parameterized
14926 interface or output escaping (output encoding). Parameterized interfaces separate data from
14927 code so injections of malicious or unintended data cannot change the semantics of the
14928 command being sent. Output escaping uses specified characters to inform the interpreter’s
14929 parser whether data is trusted.
14930 Related Controls: AC-3, AC-6.
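For illustration only, the following sketch contrasts string concatenation with a parameterized interface of the kind described in enhancement (6), using the standard-library sqlite3 module. The table, column, and input value are hypothetical.

# Illustrative sketch only (hypothetical schema and input).
import sqlite3

connection = sqlite3.connect(":memory:")
connection.execute("CREATE TABLE users (name TEXT)")
connection.execute("INSERT INTO users VALUES ('alice')")

untrusted_input = "alice' OR '1'='1"

# Unsafe: concatenating untrusted data into the command changes its semantics.
#   "SELECT * FROM users WHERE name = '" + untrusted_input + "'"

# Safe: the parameterized interface keeps data separate from code, so the
# injected characters are treated as data, not as part of the query.
rows = connection.execute(
    "SELECT * FROM users WHERE name = ?", (untrusted_input,)
).fetchall()
print(rows)   # [] -- the injection attempt matches no rows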
14931 References: [OMB A-130, Appendix II].
15082 Discussion: Trusted sources include software and data from write-once, read-only media or
15083 from selected off-line secure storage facilities.
15084 Related Controls: None.
15085 (2) NON-PERSISTENCE | NON-PERSISTENT INFORMATION
15086 (a) [Selection: refresh [Assignment: organization-defined information] [Assignment:
15087 organization-defined frequency]; generate [Assignment: organization-defined
15088 information] on demand]; and
15089 (b) Delete information when no longer needed.
15090 Discussion: Retaining information longer than it is needed makes the information a
15091 potential target for advanced adversaries searching for high value assets to compromise
15092 through unauthorized disclosure, unauthorized modification, or exfiltration. For system-
15093 related information, unnecessary retention provides advanced adversaries information that
15094 can assist in their reconnaissance and lateral movement through the system.
15095 Related Controls: None.
15096 (3) NON-PERSISTENCE | NON-PERSISTENT CONNECTIVITY
15097 Establish connections to the system on demand and terminate connections after
15098 [Selection: completion of a request; a period of non-use].
15099 Discussion: Persistent connections to systems can provide advanced adversaries with paths
15100 to move laterally through systems, and potentially position themselves closer to high value
15101 assets. Limiting the availability of such connections impedes the adversary’s ability to move
15102 freely through organizational systems.
15103 Related Controls: SC-10.
15104 References: None.
15167 Discussion: The use of automated mechanisms to improve data quality may inadvertently
15168 create privacy risks. Automated tools may connect to external or otherwise unrelated
15169 systems, and the matching of records between these systems may create linkages with
15170 unintended consequences. Organizations assess and document these risks in their privacy
15171 impact assessment and make determinations that are in alignment with their privacy
15172 program plan.
15173 As data is obtained and used across the information life cycle, it is important to confirm the
15174 accuracy and relevance of personally identifiable information. Automated mechanisms can
15175 augment existing data quality processes and procedures and enable an organization to
15176 better identify and manage personally identifiable information in large-scale systems. For
15177 example, automated tools can greatly improve efforts to consistently normalize data or
15178 identify malformed data. Automated tools can also be used to improve auditing of data and
15179 detect errors that may incorrectly alter personally identifiable information or incorrectly
15180 associate such information with the wrong individual. Automated capabilities backstop
15181 processes and procedures at scale and enable more fine-grained detection and correction of
15182 data quality errors.
15183 Related Controls: PM-18, PM-22, RA-8.
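For illustration only, the following sketch shows the kind of automated normalization and malformed-data detection described above for personally identifiable information records. The field names and validation patterns are hypothetical assumptions.

# Illustrative sketch only (hypothetical fields and patterns).
import re

EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def normalize_phone(raw: str) -> str:
    """Keep digits only so equivalent numbers compare consistently."""
    return re.sub(r"\D", "", raw)

records = [
    {"email": "jane.doe@example.gov", "phone": "(555) 123-4567"},
    {"email": "not-an-email", "phone": "555.987.6543"},
]

for record in records:
    record["phone"] = normalize_phone(record["phone"])
    if not EMAIL_PATTERN.match(record["email"]):
        print(f"malformed email flagged for review: {record['email']!r}")
print(records)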
15184 (2) PERSONALLY IDENTIFIABLE INFORMATION QUALITY OPERATIONS | DATA TAGS
15185 Employ data tags to automate the correction or deletion of personally identifiable
15186 information across the information life cycle within organizational systems.
15187 Discussion: Data tagging personally identifiable information includes tags noting processing
15188 permissions, authority to process, de-identification, impact level, information life cycle
15189 stage, and retention or last updated dates. Employing data tags for personally identifiable
15190 information can support the use of automation tools to correct or delete relevant personally
15191 identifiable information.
15192 Related Controls: SC-16.
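For illustration only, the following sketch attaches data tags of the kind described above to a record containing personally identifiable information so that automated tooling can act on the tags across the information life cycle. The tag names and retention rule are hypothetical.

# Illustrative sketch only (hypothetical tags and retention rule).
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class TaggedRecord:
    value: str
    tags: dict = field(default_factory=dict)

record = TaggedRecord(
    value="Jane Doe, 123 Main St.",
    tags={
        "processing_permission": "benefits-determination",
        "impact_level": "moderate",
        "lifecycle_stage": "retention",
        "last_updated": date(2019, 6, 1),
    },
)

RETENTION_PERIOD = timedelta(days=365)   # hypothetical retention rule

# Automation can key correction or deletion decisions off the tags rather than
# inspecting the content itself.
if date.today() - record.tags["last_updated"] > RETENTION_PERIOD:
    print("record eligible for automated deletion or re-verification")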
15193 (3) PERSONALLY IDENTIFIABLE INFORMATION QUALITY OPERATIONS | COLLECTION
15194 Collect personally identifiable information directly from the individual.
15195 Discussion: Individuals, or their designated representatives, can be a source of correct
15196 personally identifiable information about themselves. Organizations consider contextual
15197 factors that may incentivize individuals to provide correct data versus providing false data.
15198 Additional steps may be necessary to validate collected information based on the nature and
15199 context of the personally identifiable information, how it is to be used, and how it was
15200 obtained. Measures taken to validate the accuracy of personally identifiable information
15201 used to make determinations about the rights, benefits, or privileges of individuals under
15202 federal programs may be more comprehensive than those used to validate less sensitive
15203 personally identifiable information.
15204 Related Controls: None.
15205 (4) PERSONALLY IDENTIFIABLE INFORMATION QUALITY OPERATIONS | INDIVIDUAL REQUESTS
15206 Correct or delete personally identifiable information upon request by individuals or their
15207 designated representatives.
15208 Discussion: Inaccurate personally identifiable information maintained by organizations may
15209 cause problems for individuals, especially in those business functions where inaccurate
15210 information may result in inappropriate decisions or the denial of benefits and services to
15211 individuals. Even correct information, in certain circumstances, can cause problems for
15212 individuals that outweigh the benefits of an organization maintaining the information.
15213 Organizations use discretion in determining if personally identifiable information is to be
15214 corrected or deleted, based on the scope of requests, the changes sought, the impact of the
15215 changes, and applicable laws, regulations, and policies. Organizational personnel consult
15216 with the senior agency official for privacy and legal counsel regarding appropriate instances
15217 of correction or deletion.
15218 Related Controls: PM-22.
15219 (5) PERSONALLY IDENTIFIABLE INFORMATION QUALITY OPERATIONS | NOTICE OF COLLECTION OR
15220 DELETION
15221 Notify [Assignment: organization-defined recipients of personally identifiable information]
15222 and individuals that the personally identifiable information has been corrected or deleted.
15223 Discussion: When personally identifiable information is corrected or deleted, organizations
15224 take steps to ensure that all authorized recipients of such information, and the individual
15225 with which the information is associated or their designated representative, are informed of
15226 the corrected or deleted information.
15227 Related Controls: None.
15228 References: [SP 800-188].
15306 Discussion: The mathematical definition for differential privacy holds that the result of a
15307 dataset analysis should be approximately the same before and after the addition or removal
15308 of a single data record (which is assumed to be the data from a single individual). In its most
15309 basic form, differential privacy applies only to online query systems. However, it can also be
15310 used to produce machine-learning statistical classifiers and synthetic data. Differential
15311 privacy comes at the cost of decreased accuracy of results, forcing organizations to quantify
15312 the trade-off between privacy protection and the overall accuracy, usefulness, and utility of
15313 the de-identified dataset. Non-deterministic noise can include adding small random values
15314 to the results of mathematical operations in dataset analysis.
15315 Related Controls: SC-12, SC-13.
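For illustration only, the following sketch adds non-deterministic (Laplace) noise to a count query, the basic mechanism behind the differential privacy approach described above. The dataset, privacy parameter (epsilon), and sensitivity are hypothetical; real deployments quantify the trade-off between privacy protection and accuracy discussed in the enhancement.

# Illustrative sketch only (hypothetical dataset and parameters).
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

records = [42, 57, 63, 18, 71]        # hypothetical dataset
true_count = len(records)

epsilon = 0.5                          # hypothetical privacy parameter
sensitivity = 1.0                      # a count changes by at most 1 per individual
noisy_count = true_count + laplace_noise(sensitivity / epsilon)
print(f"true count: {true_count}, released (noisy) count: {noisy_count:.1f}")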
15316 (7) DE-IDENTIFICATION | VALIDATED SOFTWARE
15317 Perform de-identification using validated algorithms and software that is validated to
15318 implement the algorithms.
15319 Discussion: Algorithms that appear to remove personally identifiable information from a
15320 dataset may in fact leave information that is personally identifiable or data that are re-
15321 identifiable. Software that is claimed to implement a validated algorithm may contain bugs
15322 or may implement a different algorithm. Software may de-identify one type of data, for
15323 example, integers, but not another type of data, for example, floating point numbers. For
15324 these reasons, de-identification is performed using algorithms and software that are
15325 validated.
15326 Related Controls: None.
15327 (8) DE-IDENTIFICATION | MOTIVATED INTRUDER
15328 Perform a motivated intruder test on the de-identified dataset to determine if the
15329 identified data remains or if the de-identified data can be re-identified.
15330 Discussion: A motivated intruder test is a test in which a person or group takes a data
15331 release and specified resources and attempts to re-identify one or more individuals in the
15332 de-identified dataset. Such tests specify the amount of inside knowledge, computational
15333 resources, financial resources, data, and skills that intruders have at their disposal to
15334 conduct the tests. A motivated intruder test can determine if de-identification is insufficient.
15335 It can also be a useful diagnostic tool to assess if de-identification is likely to be sufficient.
15336 However, the test alone cannot prove that de-identification is sufficient.
15337 Related Controls: None.
15338 References: [OMB A-130, Appendix II]; [SP 800-188].
15353 steganographic data in files to enable the data to be found via open source analysis. And finally,
15354 an active tainting approach can include embedding software in the data that is able to “call
15355 home” alerting the organization to its “capture” and possibly its location and the path by which it
15356 was exfiltrated or removed.
15357 Related Controls: None.
15358 Control Enhancements: None.
15359 References: [OMB A-130, Appendix II]; [SP 800-160 v2].
15394 b. Distribute the fragmented information across the following systems or system components:
15395 [Assignment organization-defined systems or system components].
15396 Discussion: One major objective of the advanced persistent threat is to exfiltrate sensitive and
15397 valuable information. Once exfiltrated, there is generally no way for the organization to recover
15398 the lost information. Therefore, organizations may consider taking the information and dividing it
15399 into disparate elements and then distributing those elements across multiple systems or system
15400 components and locations. Such actions will increase the adversary’s work factor to capture and
15401 exfiltrate the desired information and, in so doing, increase the probability of detection. The
15402 fragmentation of information also impacts the organization’s ability to access the information in
15403 a timely manner. The extent of the fragmentation would likely be dictated by the sensitivity (and
15404 value) of the information, the threat intelligence received, and whether data tainting is used
15405 (i.e., information derived from data tainting about the exfiltration of some information could
15406 result in the fragmentation of the remaining information). An illustrative fragmentation sketch follows the references for this control.
15407 Related Controls: None.
15408 Control Enhancements: None.
15409 References: [SP 800-160 v2].
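The following sketch is editorial and illustrative only; it is not part of the control text. It demonstrates one simple fragmentation scheme consistent with the discussion above: XOR-based splitting produces n fragments, each of which reveals nothing about the original data on its own, and all of which are required to reconstruct it. The data and the number of fragments are hypothetical; organizations would also need to decide where fragments are stored and how reconstruction is authorized.

    # Illustrative sketch (not part of the control): XOR-based splitting of sensitive
    # data into fragments intended for storage on separate systems or locations.
    import secrets

    def split(data: bytes, n: int) -> list:
        # n-1 random fragments plus one fragment chosen so that the XOR of all
        # n fragments equals the original data.
        fragments = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
        last = bytearray(data)
        for fragment in fragments:
            for i, byte in enumerate(fragment):
                last[i] ^= byte
        return fragments + [bytes(last)]

    def reconstruct(fragments: list) -> bytes:
        out = bytearray(len(fragments[0]))
        for fragment in fragments:
            for i, byte in enumerate(fragment):
                out[i] ^= byte
        return bytes(out)

    fragments = split(b"sensitive design information", 3)
    # Each fragment would be distributed to a different system or system component.
    assert reconstruct(fragments) == b"sensitive design information"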
15450 disposal of the following systems, system components, or system services: [Assignment:
15451 organization-defined systems, system components, or system services];
15452 b. Implement the supply chain risk management plan consistently across the organization; and
15453 c. Review and update the supply chain risk management plan [Assignment: organization-
15454 defined frequency] or as required to address threat, organizational, or environmental
15455 changes.
15456 Discussion: The growing dependence on products, systems, and services from external
15457 providers, along with the nature of the relationships with those providers, presents an increasing
15458 level of risk to an organization. Specific threat actions that may increase risk include the insertion
15459 or use of counterfeits, unauthorized production, tampering, theft, insertion of malicious software
15460 and hardware, as well as poor manufacturing and development practices in the supply chain that
15461 can create security or privacy risks. Supply chain risks can be endemic or systemic within a
15462 system element or component, a system, an organization, a sector, or the Nation. Managing
15463 supply chain risk is a complex, multifaceted undertaking that requires a coordinated effort across
15464 an organization to build trust relationships and communicate with both internal and external
15465 stakeholders. Supply chain risk management (SCRM) activities involve identifying and assessing
15466 risks, determining appropriate mitigating actions, developing SCRM plans to document selected
15467 mitigating actions, and monitoring performance against plans.
15468 Because supply chains can differ significantly across and within organizations, SCRM plans are
15469 tailored to the individual program, organizational, and operational contexts. Tailored SCRM plans
15470 provide the basis for determining whether a system is fit for purpose and, as such, the controls
15471 need to be tailored accordingly. Tailored SCRM plans help organizations to focus their resources
15472 on the most critical missions and business functions based on mission and business requirements
15473 and their risk environment. Supply chain risk management plans include an expression of the
15474 supply chain risk tolerance for the organization, acceptable supply chain risk mitigation strategies
15475 or controls, a process for consistently evaluating and monitoring supply chain risk, approaches
15476 for implementing and communicating the plan, a description of and justification for supply chain
15477 risk mitigation measures taken, and associated roles and responsibilities. Finally, supply chain risk
15478 management plans address requirements for developing trustworthy secure, privacy-protective,
15479 and resilient system components and systems, including the application of the security design
15480 principles implemented as part of life cycle-based systems security engineering processes (see
15481 SA-8).
15482 Related Controls: CA-2, CP-4, IR-4, MA-2, MA-6, PE-16, PL-2, PM-9, PM-30, RA-3, RA-7, SA-8.
15483 Control Enhancements:
15484 (1) SUPPLY CHAIN RISK MANAGEMENT PLAN | ESTABLISH SCRM TEAM
15485 Establish a supply chain risk management team consisting of [Assignment: organization-
15486 defined personnel, roles, and responsibilities] to lead and support the following SCRM
15487 activities: [Assignment: organization-defined supply chain risk management activities].
15488 Discussion: To implement supply chain risk management plans, organizations establish a
15489 coordinated team-based approach to identify and assess supply chain risks and manage
15490 these risks by using programmatic and technical mitigation techniques. The team approach
15491 enables organizations to conduct an analysis of their supply chain, communicate with
15492 external partners or stakeholders, and gain broad consensus regarding the appropriate
15493 resources for SCRM. The SCRM team consists of organizational personnel with diverse roles
15494 and responsibilities for leading and supporting SCRM activities, including risk executive,
15495 information technology, contracting, information security, privacy, mission or business, legal,
15496 supply chain and logistics, acquisition, and other relevant functions. Members of the SCRM
15497 team are involved in the various aspects of the SDLC and, collectively, have an awareness of,
15498 and provide expertise in, acquisition processes, legal practices, vulnerabilities, threats, and
15499 attack vectors, as well as an understanding of the technical aspects and dependencies of
15500 systems. The SCRM team can be an extension of the security and privacy risk management
15501 processes or can be included as part of a general organizational risk management team.
15502 Related Controls: None.
15503 References: [SP 800-30]; [SP 800-39]; [SP 800-160 v1]; [SP 800-161]; [IR 7622].
15541 (2) SUPPLY CHAIN PROTECTION CONTROLS AND PROCESSES | LIMITATION OF HARM
15542 Employ the following supply chain controls to limit harm from potential adversaries
15543 identifying and targeting the organizational supply chain: [Assignment: organization-
15544 defined controls].
15545 Discussion: Controls that can be implemented to reduce the probability of adversaries
15546 successfully identifying and targeting the supply chain include avoiding the purchase of
15547 custom or non-standardized configurations; employing approved vendor lists of suppliers with
15548 established reputations in industry; following pre-agreed maintenance schedules and update and
15549 patch delivery mechanisms; maintaining a contingency plan in case of a supply chain event; using
15550 procurement carve-outs that provide exclusions to commitments or obligations; using diverse
15551 delivery routes; and minimizing the time between purchase decisions and delivery.
15552 Related Controls: None.
15553 References: [SP 800-30]; [SP 800-161]; [IR 7622].
15588 programs; or other programs, processes, or procedures associated with the production and
15589 distribution of supply chain elements. Supply chain personnel are individuals with specific
15590 roles and responsibilities related to the secure development, delivery, maintenance, and
15591 disposal of a system or system component. Identification methods are sufficient to support
15592 an investigation in case of a supply chain change (e.g., if a supply company is purchased),
15593 compromise, or event.
15594 Related Controls: IA-2, IA-8, PE-16.
15595 (2) PROVENANCE | TRACK AND TRACE
15596 Establish and maintain unique identification of the following systems and critical system
15597 components for tracking through the supply chain: [Assignment: organization-defined
15598 systems and critical system components].
15599 Discussion: Tracking the unique identification of systems and system components during
15600 development and transport activities provides a foundational identity structure for the
15601 establishment and maintenance of provenance. For example, system components may be
15602 labeled using serial numbers or tagged using radio-frequency identification tags. Labels and
15603 tags can help provide better visibility into the provenance of a system or system component.
15604 A system or system component may have more than one unique identifier. Identification
15605 methods are sufficient to support a forensic investigation after a supply chain compromise
15606 or event.
15607 Related Controls: IA-2, IA-8, PE-16, PL-2.
15608 (3) PROVENANCE | VALIDATE AS GENUINE AND NOT ALTERED
15609 Employ the following controls to validate that the system or system component received is
15610 genuine and has not been altered: [Assignment: organization-defined controls].
15611 Discussion: For many systems and system components, especially hardware, there are
15612 technical means to determine if the items are genuine or have been altered, including
15613 optical and nanotechnology tagging; physically unclonable functions; side-channel analysis;
15614 cryptographic hash verifications or digital signatures; and visible anti-tamper labels or
15615 stickers. Controls can also include monitoring for out-of-specification performance, which
15616 can be an indicator of tampering or counterfeits. Organizations may leverage supplier and
15617 contractor processes for validating that a system or component is genuine and has not been
15618 altered, and for replacing a suspect system or component. Some indications of tampering
15619 may be visible and addressable before accepting delivery, including inconsistent packaging,
15620 broken seals, and incorrect labels. When a system or system component is suspected of
15621 being altered or counterfeit, the supplier, contractor, or original equipment manufacturer
15622 may be able to replace the item or provide a forensic capability to determine the origin of
15623 the counterfeit or altered item. Organizations can provide training to personnel on how to
15624 identify suspicious system or component deliveries (a minimal hash-verification sketch follows the references for this control).
15625 Related Controls: AT-3, SR-9, SR-10, SR-11.
15626 References: [SP 800-161]; [IR 7622].
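The following sketch is editorial and illustrative only; it is not part of the control text. It shows the cryptographic hash comparison mentioned in the discussion of the validate-as-genuine enhancement: the digest of a received component image is compared against the digest the supplier publishes through a separate, trusted channel. The file path and expected digest are hypothetical placeholders, and digital signature verification would provide stronger assurance than a bare hash comparison.

    # Illustrative sketch (not part of the control): verifying a received component
    # image against a supplier-published SHA-256 digest. Path and digest are
    # hypothetical placeholders.
    import hashlib

    def sha256_of(path: str, chunk_size: int = 65536) -> str:
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def appears_genuine(path: str, expected_sha256: str) -> bool:
        # A mismatch indicates alteration, corruption in transit, or a counterfeit.
        return sha256_of(path) == expected_sha256.lower()

    # Hypothetical usage; the expected digest is obtained from the supplier out of band:
    # print(appears_genuine("downloads/component-firmware.bin", "<supplier-published digest>"))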
15635 assessment can guide and inform the strategies, tools, and methods that are most applicable to
15636 the situation. Tools and techniques may provide protections against unauthorized production,
15637 theft, tampering, insertion of counterfeits, insertion of malicious software or backdoors, and
15638 poor development practices throughout the system development life cycle. Organizations also
15639 consider providing incentives for suppliers who implement controls; promote transparency into
15640 their processes and security and privacy practices; provide contract language that addresses the
15641 prohibition of tainted or counterfeit components; and restrict purchases from untrustworthy
15642 suppliers. Organizations consider providing training, education, and awareness programs for
15643 personnel regarding supply chain risk, available mitigation strategies, and when the programs
15644 should be employed. Methods for reviewing and protecting development plans, documentation,
15645 and evidence are commensurate with the security and privacy requirements of the organization.
15646 Contracts may specify documentation protection requirements.
15647 Related Controls: AT-3, SA-2, SA-3, SA-4, SA-5, SA-8, SA-9, SA-10, SA-15, SR-6, SR-9, SR-10, SR-11.
15648 Control Enhancements:
15649 (1) ACQUISITION STRATEGIES, TOOLS, AND METHODS | ADEQUATE SUPPLY
15650 Employ the following controls to ensure an adequate supply of [Assignment: organization-
15651 defined critical system components]: [Assignment: organization-defined controls].
15652 Discussion: Adversaries can attempt to impede organizational operations by disrupting the
15653 supply of critical system components or corrupting supplier operations. Organizations may
15654 track system and component mean time to failure to mitigate the loss of temporary or
15655 permanent system function. Controls to ensure an adequate supply of critical system
15656 components include the use of multiple suppliers throughout the supply chain for the
15657 identified critical components; stockpiling spare components to ensure operation during
15658 mission-critical times; and the identification of functionally identical or similar components
15659 that may be used, if necessary (a rough spares-planning sketch follows the references for this control).
15660 Related Controls: None.
15661 (2) ACQUISITION STRATEGIES, TOOLS, AND METHODS | ASSESSMENTS PRIOR TO SELECTION,
15662 ACCEPTANCE, MODIFICATION, OR UPDATE
15663 Assess the system, system component, or system service prior to selection, acceptance,
15664 modification, or update.
15665 Discussion: Organizational personnel or independent, external entities conduct assessments
15666 of systems, components, products, tools, and services to uncover evidence of tampering,
15667 unintentional and intentional vulnerabilities, or evidence of non-compliance with supply
15668 chain controls. These include malicious code, malicious processes, defective software,
15669 backdoors, and counterfeits. Assessments can include evaluations; design proposal reviews;
15670 visual or physical inspection; static and dynamic analyses; visual, x-ray, or magnetic particle
15671 inspections; simulations; white, gray, or black box testing; fuzz testing; stress testing; and
15672 penetration testing (see SR-6(1)). Evidence generated during assessments is documented for
15673 follow-on actions by organizations. The evidence generated during the organizational or
15674 independent assessments of supply chain elements may be used to improve supply chain
15675 processes and to inform the supply chain risk management process. The evidence can be
15676 leveraged in follow-on assessments. Evidence and other documentation may be shared in
15677 accordance with organizational agreements.
15678 Related Controls: CA-8, RA-5, SA-11, SI-7, SR-9.
15679 References: [SP 800-30]; [SP 800-161]; [IR 7622].
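The following sketch is editorial and illustrative only; it is not part of the control text. It shows a rough, back-of-the-envelope spares estimate of the kind the adequate-supply discussion alludes to when it mentions tracking mean time to failure. The fleet size, MTTF, mission duration, and planning margin are hypothetical, and the constant-failure-rate arithmetic behind the estimate often does not hold for real hardware.

    # Illustrative sketch (not part of the control): a rough spares estimate from
    # component mean time to failure (MTTF). All figures are hypothetical and a
    # constant failure rate is assumed.
    fleet_size = 200            # deployed units of the critical component
    mttf_hours = 50_000         # supplier-reported mean time to failure
    mission_hours = 24 * 365    # one year of continuous operation
    planning_margin = 1.5       # 50 percent buffer on top of expected failures

    expected_failures = fleet_size * (mission_hours / mttf_hours)
    spares_to_stock = round(expected_failures * planning_margin)

    print(f"Expected failures over the mission: {expected_failures:.1f}")
    print(f"Spares to stock (with margin): {spares_to_stock}")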
15727 information may expose users or specific uses of the supply chain. Supply chain information
15728 includes user identities; uses for systems, system components, and system services; supplier
15729 identities; security and privacy requirements; system and component configurations; supplier
15730 processes; design specifications; and testing and evaluation results. Supply chain OPSEC may
15731 require organizations to withhold mission or business information from suppliers and may
15732 include the use of intermediaries to hide the end use or users of systems, system components,
15733 or system services.
15734 Related Controls: SC-38.
15735 Control Enhancements: None.
15736 References: [SP 800-30]; [SP 800-161]; [IR 7622].
15811 Discussion: Proper disposal of system components helps to prevent such components from
15812 entering the gray market.
15813 Related Controls: MP-6.
15814 (4) COMPONENT AUTHENTICITY | ANTI-COUNTERFEIT SCANNING
15815 Scan for counterfeit system components [Assignment: organization-defined frequency].
15816 Discussion: The type of component determines the type of scanning to be conducted (e.g.,
15817 web application scanning if the component is a web application).
15818 Related Controls: RA-5.
15819 References: None.
15820 APPENDIX A
15821 REFERENCES
15822 LAWS, POLICIES, DIRECTIVES, REGULATIONS, STANDARDS, AND GUIDELINES 31
31 The references cited in this appendix are those external publications that directly support the FISMA and Privacy
Projects. Additional NIST standards, guidelines, and interagency reports are also cited throughout this publication,
including in the references section of the applicable controls in Chapter Three. Direct links to the NIST website are
provided to obtain access to those publications.
[EO 13587] Executive Order 13587, Structural Reforms to Improve the Security of
Classified Networks and the Responsible Sharing and Safeguarding of
Classified Information, October 2011.
https://obamawhitehouse.archives.gov/the-press-office/2011/10/07/executive-
order-13587-structural-reforms-improve-security-classified-net
[EO 13636] Executive Order 13636, Improving Critical Infrastructure Cybersecurity,
February 2013.
https://obamawhitehouse.archives.gov/the-press-office/2013/02/12/executive-
order-improving-critical-infrastructure-cybersecurity
[EO 13800] Executive Order 13800, Strengthening the Cybersecurity of Federal
Networks and Critical Infrastructure, May 2017.
https://www.whitehouse.gov/presidential-actions/presidential-executive-order-
strengthening-cybersecurity-federal-networks-critical-infrastructure
[USC 552] United States Code, 2006 Edition, Supplement 4, Title 5 - Government
Organization and Employees, January 2011.
https://www.govinfo.gov/content/pkg/USCODE-2010-title5/pdf/USCODE-2010-
title5-partI-chap5-subchapII-sec552a.pdf
[OMB A-108] Office of Management and Budget Memorandum Circular A-108, Federal
Agency Responsibilities for Review, Reporting, and Publication under the
Privacy Act, December 2016.
https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/circulars/A108/omb
_circular_a-108.pdf
[OMB A-130] Office of Management and Budget Memorandum Circular A-130, Managing
Information as a Strategic Resource, July 2016.
https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/circulars/A130/a13
0revised.pdf
[OMB M-08-05] Office of Management and Budget Memorandum M-08-05, Implementation
of Trusted Internet Connections (TIC), November 2007.
https://obamawhitehouse.archives.gov/sites/default/files/omb/assets/omb/memo
randa/fy2008/m08-05.pdf
[OMB M-17-06] Office of Management and Budget Memorandum M-17-06, Policies for
Federal Agency Public Websites and Digital Services, November 2016.
https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/memoranda/2017/
m-17-06.pdf
[OMB M-17-12] Office of Management and Budget Memorandum M-17-12, Preparing for
and Responding to a Breach of Personally Identifiable Information, January
2017.
https://obamawhitehouse.archives.gov/sites/default/files/omb/memoranda/2017
/m-17-12_0.pdf
[OMB M-17-25] Office of Management and Budget Memorandum M-17-25, Reporting
Guidance for Executive Order on Strengthening the Cybersecurity of Federal
Networks and Critical Infrastructure, May 2017.
https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/memoranda/2017/
M-17-25.pdf
[OMB M-19-03] Office of Management and Budget Memorandum M-19-03, Strengthening
the Cybersecurity of Federal Agencies by Enhancing the High Value Asset
Program, December 2018.
https://www.whitehouse.gov/wp-content/uploads/2018/12/M-19-03.pdf
[OMB M-19-15] Office of Management and Budget Memorandum M-19-15, Improving
Implementation of the Information Quality Act, April 2019.
https://www.whitehouse.gov/wp-content/uploads/2019/04/M-19-15.pdf
[OMB M-19-23] Office of Management and Budget Memorandum M-19-23, Phase 1
Implementation of the Foundations for Evidence-Based Policymaking Act of
2018: Learning Agendas, Personnel, and Planning Guidance, July 2019.
https://www.whitehouse.gov/wp-content/uploads/2019/07/M-19-23.pdf
[CNSSD 505] Committee on National Security Systems Directive No. 505, Supply Chain
Risk Management (SCRM), August 2017.
https://www.cnss.gov/CNSS/issuances/Directives.cfm
[CNSSP 22] Committee on National Security Systems Policy No. 22, Cybersecurity Risk
Management Policy, August 2016.
https://www.cnss.gov/CNSS/issuances/Policies.cfm
[CNSSI 1253] Committee on National Security Systems Instruction No. 1253, Security
Categorization and Control Selection for National Security Systems, March
2014.
https://www.cnss.gov/CNSS/issuances/Instructions.cfm
[CNSSI 4009] Committee on National Security Systems Instruction No. 4009, Committee
on National Security Systems (CNSS) Glossary, April 2015.
https://www.cnss.gov/CNSS/issuances/Instructions.cfm
[DODI 8510.01] Department of Defense Instruction 8510.01, Risk Management Framework
(RMF) for DoD Information Technology (IT), March 2014.
https://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodi/851001_201
4.pdf
[DHS NIPP] Department of Homeland Security, National Infrastructure Protection Plan
(NIPP), 2009.
https://www.dhs.gov/xlibrary/assets/NIPP_Plan.pdf
[FIPS 186-4] National Institute of Standards and Technology (2013) Digital Signature
Standard (DSS). (U.S. Department of Commerce, Washington, D.C.), Federal
Information Processing Standards Publication (FIPS) 186-4.
https://doi.org/10.6028/NIST.FIPS.186-4
[FIPS 197] National Institute of Standards and Technology (2001) Advanced Encryption
Standard (AES). (U.S. Department of Commerce, Washington, D.C.), Federal
Information Processing Standards Publication (FIPS) 197.
https://doi.org/10.6028/NIST.FIPS.197
[FIPS 199] National Institute of Standards and Technology (2004) Standards for
Security Categorization of Federal Information and Information Systems.
(U.S. Department of Commerce, Washington, D.C.), Federal Information
Processing Standards Publication (FIPS) 199.
https://doi.org/10.6028/NIST.FIPS.199
[FIPS 200] National Institute of Standards and Technology (2006) Minimum Security
Requirements for Federal Information and Information Systems. (U.S.
Department of Commerce, Washington, D.C.), Federal Information
Processing Standards Publication (FIPS) 200.
https://doi.org/10.6028/NIST.FIPS.200
[FIPS 201-2] National Institute of Standards and Technology (2013) Personal Identity
Verification (PIV) of Federal Employees and Contractors. (U.S. Department
of Commerce, Washington, D.C.), Federal Information Processing Standards
Publication (FIPS) 201-2.
https://doi.org/10.6028/NIST.FIPS.201-2
[FIPS 202] National Institute of Standards and Technology (2015) SHA-3 Standard:
Permutation-Based Hash and Extendable-Output Functions. (U.S.
Department of Commerce, Washington, D.C.), Federal Information
Processing Standards Publication (FIPS) 202.
https://doi.org/10.6028/NIST.FIPS.202
[SP 800-12] Nieles M, Pillitteri VY, Dempsey KL (2017) An Introduction to Information
Security. (National Institute of Standards and Technology, Gaithersburg,
MD), NIST Special Publication (SP) 800-12, Rev. 1.
https://doi.org/10.6028/NIST.SP.800-12r1
[SP 800-18] Swanson MA, Hash J, Bowen P (2006) Guide for Developing Security Plans
for Federal Information Systems. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-18, Rev.
1.
https://doi.org/10.6028/NIST.SP.800-18r1
[SP 800-28] Jansen W, Winograd T, Scarfone KA (2008) Guidelines on Active Content
and Mobile Code. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-28, Version 2.
https://doi.org/10.6028/NIST.SP.800-28ver2
[SP 800-30] Joint Task Force Transformation Initiative (2012) Guide for Conducting Risk
Assessments. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-30, Rev. 1.
https://doi.org/10.6028/NIST.SP.800-30r1
[SP 800-32] Kuhn R, Hu VC, Polk T, Chang S-jH (2001) Introduction to Public Key
Technology and the Federal PKI Infrastructure. (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP)
800-32.
https://doi.org/10.6028/NIST.SP.800-32
[SP 800-34] Swanson MA, Bowen P, Phillips AW, Gallup D, Lynes D (2010) Contingency
Planning Guide for Federal Information Systems. (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP)
800-34, Rev. 1, Includes updates as of November 11, 2010.
https://doi.org/10.6028/NIST.SP.800-34r1
[SP 800-35] Grance T, Hash J, Stevens M, O'Neal K, Bartol N (2003) Guide to Information
Technology Security Services. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-35.
https://doi.org/10.6028/NIST.SP.800-35
[SP 800-37] Joint Task Force (2018) Risk Management Framework for Information
Systems and Organizations: A System Life Cycle Approach for Security and
Privacy. (National Institute of Standards and Technology, Gaithersburg,
MD), NIST Special Publication (SP) 800-37, Rev. 2.
https://doi.org/10.6028/NIST.SP.800-37r2
[SP 800-39] Joint Task Force Transformation Initiative (2011) Managing Information
Security Risk: Organization, Mission, and Information System View.
(National Institute of Standards and Technology, Gaithersburg, MD), NIST
Special Publication (SP) 800-39.
https://doi.org/10.6028/NIST.SP.800-39
[SP 800-40] Souppaya MP, Scarfone KA (2013) Guide to Enterprise Patch Management
Technologies. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-40, Rev. 3.
https://doi.org/10.6028/NIST.SP.800-40r3
[SP 800-41] Scarfone KA, Hoffman P (2009) Guidelines on Firewalls and Firewall Policy.
(National Institute of Standards and Technology, Gaithersburg, MD), NIST
Special Publication (SP) 800-41, Rev. 1.
https://doi.org/10.6028/NIST.SP.800-41r1
[SP 800-45] Tracy MC, Jansen W, Scarfone KA, Butterfield J (2007) Guidelines on
Electronic Mail Security. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-45, Version 2.
https://doi.org/10.6028/NIST.SP.800-45ver2
[SP 800-46] Souppaya MP, Scarfone KA (2016) Guide to Enterprise Telework, Remote
Access, and Bring Your Own Device (BYOD) Security. (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP)
800-46, Rev. 2.
https://doi.org/10.6028/NIST.SP.800-46r2
[SP 800-47] Grance T, Hash J, Peck S, Smith J, Korow-Diks K (2002) Security Guide for
Interconnecting Information Technology Systems. (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP)
800-47.
https://doi.org/10.6028/NIST.SP.800-47
[SP 800-50] Wilson M, Hash J (2003) Building an Information Technology Security
Awareness and Training Program. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-50.
https://doi.org/10.6028/NIST.SP.800-50
[SP 800-52] McKay KA, Cooper DA (2019) Guidelines for the Selection, Configuration,
and Use of Transport Layer Security (TLS) Implementations. (National
Institute of Standards and Technology, Gaithersburg, MD), NIST Special
Publication (SP) 800-52, Rev. 2.
https://doi.org/10.6028/NIST.SP.800-52r2
[SP 800-53A] Joint Task Force Transformation Initiative (2014) Assessing Security and
Privacy Controls in Federal Information Systems and Organizations: Building
Effective Assessment Plans. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-53A, Rev.
4, Includes updates as of December 18, 2014.
https://doi.org/10.6028/NIST.SP.800-53Ar4
[SP 800-53B] National Institute of Standards and Technology Special Publication 800-53B,
Control Baselines and Tailoring Guidance for Federal Information Systems
and Organizations. Projected for publication in 2020.
[SP 800-55] Chew E, Swanson MA, Stine KM, Bartol N, Brown A, Robinson W (2008)
Performance Measurement Guide for Information Security. (National
Institute of Standards and Technology, Gaithersburg, MD), NIST Special
Publication (SP) 800-55, Rev. 1.
https://doi.org/10.6028/NIST.SP.800-55r1
[SP 800-56A] Barker EB, Chen L, Roginsky A, Vassilev A, Davis R (2018) Recommendation
for Pair-Wise Key-Establishment Schemes Using Discrete Logarithm
Cryptography. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-56A, Rev. 3.
https://doi.org/10.6028/NIST.SP.800-56Ar3
[SP 800-56B] Barker EB, Chen L, Roginsky A, Vassilev A, Davis R, Simon S (2019)
Recommendation for Pair-Wise Key-Establishment Using Integer
Factorization Cryptography. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-56B, Rev.
2.
https://doi.org/10.6028/NIST.SP.800-56Br2
[SP 800-56C] Barker EB, Chen L, Davis R (2018) Recommendation for Key-Derivation
Methods in Key-Establishment Schemes. (National Institute of Standards
and Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-56C,
Rev. 1.
https://doi.org/10.6028/NIST.SP.800-56Cr1
[SP 800-57-1] Barker EB (2016) Recommendation for Key Management, Part 1: General.
(National Institute of Standards and Technology, Gaithersburg, MD), NIST
Special Publication (SP) 800-57 Part 1, Rev. 4.
https://doi.org/10.6028/NIST.SP.800-57pt1r4
[SP 800-57-2] Barker EB, Barker WC (2019) Recommendation for Key Management: Part 2
– Best Practices for Key Management Organizations. (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP)
800-57 Part 2, Rev. 1.
https://doi.org/10.6028/NIST.SP.800-57pt2r1
[SP 800-57-3] Barker EB, Dang QH (2015) Recommendation for Key Management, Part 3:
Application-Specific Key Management Guidance. (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP)
800-57 Part 3, Rev. 1.
https://doi.org/10.6028/NIST.SP.800-57pt3r1
[SP 800-58] Kuhn R, Walsh TJ, Fries S (2005) Security Considerations for Voice Over IP
Systems. (National Institute of Standards and Technology, Gaithersburg,
MD), NIST Special Publication (SP) 800-58.
https://doi.org/10.6028/NIST.SP.800-58
[SP 800-60 v1] Stine KM, Kissel RL, Barker WC, Fahlsing J, Gulick J (2008) Guide for
Mapping Types of Information and Information Systems to Security
Categories. (National Institute of Standards and Technology, Gaithersburg,
MD), NIST Special Publication (SP) 800-60, Vol. 1, Rev. 1.
https://doi.org/10.6028/NIST.SP.800-60v1r1
[SP 800-60 v2] Stine KM, Kissel RL, Barker WC, Lee A, Fahlsing J (2008) Guide for Mapping
Types of Information and Information Systems to Security Categories:
Appendices. (National Institute of Standards and Technology, Gaithersburg,
MD), NIST Special Publication (SP) 800-60, Vol. 2, Rev. 1.
https://doi.org/10.6028/NIST.SP.800-60v2r1
[SP 800-61] Cichonski PR, Millar T, Grance T, Scarfone KA (2012) Computer Security
Incident Handling Guide. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-61, Rev. 2.
https://doi.org/10.6028/NIST.SP.800-61r2
[SP 800-63-3] Grassi PA, Garcia ME, Fenton JL (2017) Digital Identity Guidelines. (National
Institute of Standards and Technology, Gaithersburg, MD), NIST Special
Publication (SP) 800-63-3, Includes updates as of March 2, 2020.
https://doi.org/10.6028/NIST.SP.800-63-3
[SP 800-63A] Grassi PA, Fenton JL, Lefkovitz NB, Danker JM, Choong Y-Y, Greene KK,
Theofanos MF (2017) Digital Identity Guidelines: Enrollment and Identity
Proofing. (National Institute of Standards and Technology, Gaithersburg,
MD), NIST Special Publication (SP) 800-63A, Includes updates as of March 2,
2020.
https://doi.org/10.6028/NIST.SP.800-63a
[SP 800-70] Quinn SD, Souppaya MP, Cook MR, Scarfone KA (2018) National Checklist
Program for IT Products: Guidelines for Checklist Users and Developers.
(National Institute of Standards and Technology, Gaithersburg, MD), NIST
Special Publication (SP) 800-70, Rev. 4.
https://doi.org/10.6028/NIST.SP.800-70r4
[SP 800-73-4] Cooper DA, Ferraiolo H, Mehta KL, Francomacaro S, Chandramouli R,
Mohler J (2015) Interfaces for Personal Identity Verification. (National
Institute of Standards and Technology, Gaithersburg, MD), NIST Special
Publication (SP) 800-73-4, Includes updates as of February 8, 2016.
https://doi.org/10.6028/NIST.SP.800-73-4
[SP 800-76-2] Grother PJ, Salamon WJ, Chandramouli R (2013) Biometric Specifications
for Personal Identity Verification. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-76-2.
https://doi.org/10.6028/NIST.SP.800-76-2
[SP 800-77] Frankel SE, Kent K, Lewkowski R, Orebaugh AD, Ritchey RW, Sharma SR
(2005) Guide to IPsec VPNs. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-77.
https://doi.org/10.6028/NIST.SP.800-77
[SP 800-78-4] Polk T, Dodson DF, Burr WE, Ferraiolo H, Cooper DA (2015) Cryptographic
Algorithms and Key Sizes for Personal Identity Verification. (National
Institute of Standards and Technology, Gaithersburg, MD), NIST Special
Publication (SP) 800-78-4.
https://doi.org/10.6028/NIST.SP.800-78-4
[SP 800-79-2] Ferraiolo H, Chandramouli R, Ghadiali N, Mohler J, Shorter S (2015)
Guidelines for the Authorization of Personal Identity Verification Card
Issuers (PCI) and Derived PIV Credential Issuers (DPCI). (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP)
800-79-2.
https://doi.org/10.6028/NIST.SP.800-79-2
[SP 800-81-2] Chandramouli R, Rose SW (2013) Secure Domain Name System (DNS)
Deployment Guide. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-81-2.
https://doi.org/10.6028/NIST.SP.800-81-2
[SP 800-82] Stouffer KA, Lightman S, Pillitteri VY, Abrams M, Hahn A (2015) Guide to
Industrial Control Systems (ICS) Security. (National Institute of Standards
and Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-82,
Rev. 2.
https://doi.org/10.6028/NIST.SP.800-82r2
[SP 800-83] Souppaya MP, Scarfone KA (2013) Guide to Malware Incident Prevention
and Handling for Desktops and Laptops. (National Institute of Standards
and Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-83,
Rev. 1.
https://doi.org/10.6028/NIST.SP.800-83r1
[SP 800-84] Grance T, Nolan T, Burke K, Dudley R, White G, Good T (2006) Guide to Test,
Training, and Exercise Programs for IT Plans and Capabilities. (National
Institute of Standards and Technology, Gaithersburg, MD), NIST Special
Publication (SP) 800-84.
https://doi.org/10.6028/NIST.SP.800-84
[SP 800-86] Kent K, Chevalier S, Grance T, Dang H (2006) Guide to Integrating Forensic
Techniques into Incident Response. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-86.
https://doi.org/10.6028/NIST.SP.800-86
[SP 800-88] Kissel RL, Regenscheid AR, Scholl MA, Stine KM (2014) Guidelines for Media
Sanitization. (National Institute of Standards and Technology, Gaithersburg,
MD), NIST Special Publication (SP) 800-88, Rev. 1.
https://doi.org/10.6028/NIST.SP.800-88r1
[SP 800-92] Kent K, Souppaya MP (2006) Guide to Computer Security Log Management.
(National Institute of Standards and Technology, Gaithersburg, MD), NIST
Special Publication (SP) 800-92.
https://doi.org/10.6028/NIST.SP.800-92
[SP 800-94] Scarfone KA, Mell PM (2007) Guide to Intrusion Detection and Prevention
Systems (IDPS). (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-94.
https://doi.org/10.6028/NIST.SP.800-94
[SP 800-95] Singhal A, Winograd T, Scarfone KA (2007) Guide to Secure Web Services.
(National Institute of Standards and Technology, Gaithersburg, MD), NIST
Special Publication (SP) 800-95.
https://doi.org/10.6028/NIST.SP.800-95
[SP 800-97] Frankel SE, Eydt B, Owens L, Scarfone KA (2007) Establishing Wireless
Robust Security Networks: A Guide to IEEE 802.11i. (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP)
800-97.
https://doi.org/10.6028/NIST.SP.800-97
[SP 800-100] Bowen P, Hash J, Wilson M (2006) Information Security Handbook: A Guide
for Managers. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-100, Includes updates
as of March 7, 2007.
https://doi.org/10.6028/NIST.SP.800-100
[SP 800-101] Ayers RP, Brothers S, Jansen W (2014) Guidelines on Mobile Device
Forensics. (National Institute of Standards and Technology, Gaithersburg,
MD), NIST Special Publication (SP) 800-101, Rev. 1.
https://doi.org/10.6028/NIST.SP.800-101r1
[SP 800-111] Scarfone KA, Souppaya MP, Sexton M (2007) Guide to Storage Encryption
Technologies for End User Devices. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-111.
https://doi.org/10.6028/NIST.SP.800-111
[SP 800-113] Frankel SE, Hoffman P, Orebaugh AD, Park R (2008) Guide to SSL VPNs.
(National Institute of Standards and Technology, Gaithersburg, MD), NIST
Special Publication (SP) 800-113.
https://doi.org/10.6028/NIST.SP.800-113
[SP 800-114] Souppaya MP, Scarfone KA (2016) User's Guide to Telework and Bring Your
Own Device (BYOD) Security. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-114, Rev.
1.
https://doi.org/10.6028/NIST.SP.800-114r1
[SP 800-115] Scarfone KA, Souppaya MP, Cody A, Orebaugh AD (2008) Technical Guide to
Information Security Testing and Assessment. (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP)
800-115.
https://doi.org/10.6028/NIST.SP.800-115
[SP 800-116] Ferraiolo H, Mehta KL, Ghadiali N, Mohler J, Johnson V, Brady S (2018) A
Recommendation for the Use of PIV Credentials in Physical Access Control
Systems (PACS). (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-116, Rev. 1.
https://doi.org/10.6028/NIST.SP.800-116r1
[SP 800-121] Padgette J, Bahr J, Holtmann M, Batra M, Chen L, Smithbey R, Scarfone KA
(2017) Guide to Bluetooth Security. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-121, Rev.
2.
https://doi.org/10.6028/NIST.SP.800-121r2
[SP 800-124] Souppaya MP, Scarfone KA (2013) Guidelines for Managing the Security of
Mobile Devices in the Enterprise. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-124, Rev.
1.
https://doi.org/10.6028/NIST.SP.800-124r1
[SP 800-125B] Chandramouli R (2016) Secure Virtual Network Configuration for Virtual
Machine (VM) Protection. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-125B.
https://doi.org/10.6028/NIST.SP.800-125B
[SP 800-126] Waltermire DA, Quinn SD, Booth H, III, Scarfone KA, Prisaca D (2018) The
Technical Specification for the Security Content Automation Protocol
(SCAP): SCAP Version 1.3. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-126, Rev. 3.
https://doi.org/10.6028/NIST.SP.800-126r3
[SP 800-128] Johnson LA, Dempsey KL, Ross RS, Gupta S, Bailey D (2011) Guide for
Security-Focused Configuration Management of Information Systems.
(National Institute of Standards and Technology, Gaithersburg, MD), NIST
Special Publication (SP) 800-128.
https://doi.org/10.6028/NIST.SP.800-128
[SP 800-130] Barker EB, Smid ME, Branstad DK, Chokhani S (2013) A Framework for
Designing Cryptographic Key Management Systems. (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP)
800-130.
https://doi.org/10.6028/NIST.SP.800-130
[SP 800-137] Dempsey KL, Chawla NS, Johnson LA, Johnston R, Jones AC, Orebaugh AD,
Scholl MA, Stine KM (2011) Information Security Continuous Monitoring
(ISCM) for Federal Information Systems and Organizations. (National
Institute of Standards and Technology, Gaithersburg, MD), NIST Special
Publication (SP) 800-137.
https://doi.org/10.6028/NIST.SP.800-137
[SP 800-147] Cooper DA, Polk T, Regenscheid AR, Souppaya MP (2011) BIOS Protection
Guidelines. (National Institute of Standards and Technology, Gaithersburg,
MD), NIST Special Publication (SP) 800-147.
https://doi.org/10.6028/NIST.SP.800-147
[SP 800-150] Johnson CS, Waltermire DA, Badger ML, Skorupka C, Snyder J (2016) Guide
to Cyber Threat Information Sharing. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-150.
https://doi.org/10.6028/NIST.SP.800-150
[SP 800-152] Barker EB, Branstad DK, Smid ME (2015) A Profile for U.S. Federal
Cryptographic Key Management Systems (CKMS). (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP)
800-152.
https://doi.org/10.6028/NIST.SP.800-152
[SP 800-154] Souppaya MP, Scarfone KA (2016) Guide to Data-Centric System Threat
Modeling. (National Institute of Standards and Technology, Gaithersburg,
MD), Draft NIST Special Publication (SP) 800-154.
https://csrc.nist.gov/publications/detail/sp/800-154/draft
[SP 800-192] Yaga DJ, Kuhn R, Hu VC (2017) Verification and Test Methods for Access
Control Policies/Models. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Special Publication (SP) 800-192.
https://doi.org/10.6028/NIST.SP.800-192
[IR 7539] Cooper DA, MacGregor WI (2008) Symmetric Key Injection onto Smart
Cards. (National Institute of Standards and Technology, Gaithersburg, MD),
NIST Interagency or Internal Report (IR) 7539.
https://doi.org/10.6028/NIST.IR.7539
[IR 7559] Singhal A, Gunestas M, Wijesekera D (2010) Forensics Web Services (FWS).
(National Institute of Standards and Technology, Gaithersburg, MD), NIST
Interagency or Internal Report (IR) 7559.
https://doi.org/10.6028/NIST.IR.7559
[IR 7622] Boyens JM, Paulsen C, Bartol N, Shankles S, Moorthy R (2012) Notional
Supply Chain Risk Management Practices for Federal Information Systems.
(National Institute of Standards and Technology, Gaithersburg, MD), NIST
Interagency or Internal Report (IR) 7622.
https://doi.org/10.6028/NIST.IR.7622
[IR 7676] Cooper DA (2010) Maintaining and Using Key History on Personal Identity
Verification (PIV) Cards. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Interagency or Internal Report (IR) 7676.
https://doi.org/10.6028/NIST.IR.7676
[IR 7788] Singhal A, Ou X (2011) Security Risk Analysis of Enterprise Networks Using
Probabilistic Attack Graphs. (National Institute of Standards and
Technology, Gaithersburg, MD), NIST Interagency or Internal Report (IR)
7788.
https://doi.org/10.6028/NIST.IR.7788
[IR 7817] Ferraiolo H (2012) A Credential Reliability and Revocation Model for
Federated Identities. (National Institute of Standards and Technology,
Gaithersburg, MD), NIST Interagency or Internal Report (IR) 7817.
https://doi.org/10.6028/NIST.IR.7817
[IR 7849] Chandramouli R (2014) A Methodology for Developing Authentication
Assurance Level Taxonomy for Smart Card-based Identity Verification.
(National Institute of Standards and Technology, Gaithersburg, MD), NIST
Interagency or Internal Report (IR) 7849.
https://doi.org/10.6028/NIST.IR.7849
[IR 7870] Cooper DA (2012) NIST Test Personal Identity Verification (PIV) Cards.
(National Institute of Standards and Technology, Gaithersburg, MD), NIST
Interagency or Internal Report (IR) 7870.
https://doi.org/10.6028/NIST.IR.7870
[IR 7874] Hu VC, Scarfone KA (2012) Guidelines for Access Control System Evaluation
Metrics. (National Institute of Standards and Technology, Gaithersburg,
MD), NIST Interagency or Internal Report (IR) 7874.
https://doi.org/10.6028/NIST.IR.7874
[IR 8040] Greene KK, Kelsey JM, Franklin JM (2016) Measuring the Usability and
Security of Permuted Passwords on Mobile Platforms. (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Interagency or Internal
Report (IR) 8040.
https://doi.org/10.6028/NIST.IR.8040
[IR 8062] Brooks S, Garcia M, Lefkovitz N, Lightman S, Nadeau E (2017) An
Introduction to Privacy Engineering and Risk Management in Federal
Systems. (National Institute of Standards and Technology, Gaithersburg,
MD), NIST Interagency or Internal Report (IR) 8062.
https://doi.org/10.6028/NIST.IR.8062
[IR 8179] Paulsen C, Boyens JM, Bartol N, Winkler K (2018) Criticality Analysis Process
Model: Prioritizing Systems and Components. (National Institute of
Standards and Technology, Gaithersburg, MD), NIST Interagency or Internal
Report (IR) 8179.
https://doi.org/10.6028/NIST.IR.8179
15824
15825 APPENDIX B
15826 GLOSSARY
15827 COMMON TERMS AND DEFINITIONS
15828 Appendix B provides definitions for terminology used in NIST Special Publication 800-53. Sources
15829 for terms used in this publication are cited as applicable. Where no citation is noted, the source
15830 of the definition is Special Publication 800-53.
authenticity The property of being genuine and being able to be verified and
trusted; confidence in the validity of a transmission, a message,
or message originator. See authentication.
authorization Access privileges granted to a user, program, or process or the
[CNSSI 4009] act of granting those privileges.
authorization boundary All components of an information system to be authorized for
[OMB A-130] operation by an authorizing official. This excludes separately
authorized systems to which the information system is
connected.
authorization to operate The official management decision given by a senior Federal
[OMB A-130] official or officials to authorize operation of an information
system and to explicitly accept the risk to agency operations
(including mission, functions, image, or reputation), agency
assets, individuals, other organizations, and the Nation based on
the implementation of an agreed-upon set of security and
privacy controls. Authorization also applies to common controls
inherited by agency information systems.
authorizing official A senior Federal official or executive with the authority to
[OMB A-130] authorize (i.e., assume responsibility for) the operation of an
information system or the use of a designated set of common
controls at an acceptable level of risk to agency operations
(including mission, functions, image, or reputation), agency
assets, individuals, other organizations, and the Nation.
availability Ensuring timely and reliable access to and use of information.
[FISMA]
baseline See control baseline.
baseline configuration A documented set of specifications for a system, or a
[SP 800-128, Adapted] configuration item within a system, that has been formally
reviewed and agreed on at a given point in time, and which can
be changed only through change control procedures.
blacklisting The process used to identify software programs that are not
authorized to execute on a system; or prohibited Uniform
Resource Locators (URLs) or websites.
boundary protection Monitoring and control of communications at the external
interface to a system to prevent and detect malicious and other
unauthorized communications, using boundary protection
devices, for example, gateways, routers, firewalls, guards,
encrypted tunnels.
boundary protection A device with mechanisms that facilitate the adjudication of
device different connected system security policies or provide system
boundary protection.
control baseline The set of security and privacy controls defined for a low-impact,
[FIPS 200, Adapted] moderate-impact, or high-impact system or selected based on
the privacy selection criteria that provide a starting point for the
tailoring process.
control effectiveness A measure of whether a given security or privacy control is
contributing to the reduction of information security or privacy
risk.
control enhancement Augmentation of a security or privacy control to build in
additional, but related, functionality to the control; increase the
strength of the control; or add assurance to the control.
control inheritance A situation in which a system or application receives protection
from security or privacy controls (or portions of controls) that
are developed, implemented, assessed, authorized, and
monitored by entities other than those responsible for the
system or application; entities either internal or external to the
organization where the system or application resides. See
common control.
controlled area Any area or space for which an organization has confidence that
the physical and procedural protections provided are sufficient
to meet the requirements established for protecting the
information and/or information system.
controlled interface An interface to a system with a set of mechanisms that enforces
the security policies and controls the flow of information
between connected systems.
controlled unclassified Information that the Government creates or possesses, or that
information an entity creates or possesses for or on behalf of the
[32 CFR 2002]
Government, that a law, regulation, or Government-wide policy
requires or permits an agency to handle using safeguarding or
dissemination controls. However, CUI does not include classified
information or information a non-executive branch entity
possesses and maintains in its own systems that did not come
from, or was not created or possessed by or for, an executive
branch agency or an entity acting for an agency.
counterfeit An unauthorized copy or substitute that has been identified,
[SP 800-161] marked, and/or altered by a source other than the item's legally
authorized source and has been misrepresented to be an
authorized item of the legally authorized source.
countermeasures Actions, devices, procedures, techniques, or other measures that
[FIPS 200] reduce the vulnerability of a system. Synonymous with security
controls and safeguards.
industrial control system General term that encompasses several types of control systems,
[SP 800-82] including supervisory control and data acquisition (SCADA)
systems, distributed control systems (DCS), and other control
system configurations such as programmable logic controllers
(PLC) often found in the industrial sectors and critical
infrastructures. An ICS consists of combinations of control
components (e.g., electrical, mechanical, hydraulic, pneumatic)
that act together to achieve an industrial objective (e.g.,
manufacturing, transportation of matter or energy).
information Any communication or representation of knowledge such as
[OMB A-130] facts, data, or opinions in any medium or form, including textual,
numerical, graphic, cartographic, narrative, electronic, or
audiovisual forms.
information flow control Controls to ensure that information transfers within a system or
organization are not made in violation of the security policy.
information leakage The intentional or unintentional release of information to an
untrusted environment.
information owner Official with statutory or operational authority for specified
[SP 800-37] information and responsibility for establishing the controls for its
generation, collection, processing, dissemination, and disposal.
information resources Information and related resources, such as personnel,
[OMB A-130] equipment, funds, and information technology.
information security The protection of information and systems from unauthorized
[OMB A-130] access, use, disclosure, disruption, modification, or destruction
in order to provide confidentiality, integrity, and availability.
information security An embedded, integral part of the enterprise architecture that
architecture describes the structure and behavior of the enterprise security
[OMB A-130] processes, security systems, personnel and organizational
subunits, showing their alignment with the enterprise’s mission
and strategic plans.
information security Aggregate of directives, regulations, rules, and practices that
policy prescribes how an organization manages, protects, and
[CNSSI 4009]
distributes information.
information security Formal document that provides an overview of the security
program plan requirements for an organization-wide information security
[OMB A-130]
program and describes the program management controls and
common controls in place or planned for meeting those
requirements.
information security risk The risk to organizational operations (including mission,
[SP 800-30] functions, image, reputation), organizational assets, individuals,
other organizations, and the Nation due to the potential for
unauthorized access, use, disclosure, disruption, modification, or
destruction of information and/or systems.
mobile device A portable computing device that has a small form factor such
that it can easily be carried by a single individual, is designed to
operate without a physical connection (e.g., wirelessly transmit
or receive information), possesses local, non-removable data
storage, and is powered on for extended periods of time with a
self-contained power source. Mobile devices may also include
voice communication capabilities, on board sensors that allow
the device to capture (e.g., photograph, video, record, or
determine location) information, and/or built-in features for
synchronizing local data with remote locations. Examples include
smart phones, tablets, and E-readers.
moderate-impact system A system in which at least one security objective (i.e.,
[FIPS 200] confidentiality, integrity, or availability) is assigned a FIPS
Publication 199 potential impact value of moderate and no
security objective is assigned a potential impact value of high.
multifactor An authentication system or an authenticator that requires more
authentication than one authentication factor for successful authentication.
[SP 800-63-3] Multifactor authentication can be performed using a single
authenticator that provides more than one factor or by a
combination of authenticators that provide different factors.
The three authentication factors are something you know,
something you have, and something you are. See authenticator.
multilevel security Concept of processing information with different classifications
[CNSSI 4009] and categories that simultaneously permits access by users with
different security clearances and denies access to users who lack
authorization.
multiple security levels Capability of a system that is trusted to contain, and maintain
[CNSSI 4009] separation between, resources (particularly stored data) of
different security domains.
national security system Any system (including any telecommunications system) used or
[OMB A-130] operated by an agency or by a contractor of an agency, or other
organization on behalf of an agency—(i) the function, operation,
or use of which involves intelligence activities; involves
cryptologic activities related to national security; involves
command and control of military forces; involves equipment that
is an integral part of a weapon or weapons system; or is critical
to the direct fulfillment of military or intelligence missions
(excluding a system that is to be used for routine administrative
and business applications, for example, payroll, finance, logistics,
and personnel management applications); or (ii) is protected at
all times by procedures established for information that have
been specifically authorized under criteria established by an
Executive Order or an Act of Congress to be kept classified in the
interest of national defense or foreign policy.
privacy plan A formal document that details the privacy controls selected for
[OMB A-130] an information system or environment of operation that are in
place or planned for meeting applicable privacy requirements
and managing privacy risks, details how the controls have been
implemented, and describes the methodologies and metrics that
will be used to assess the controls.
privacy program plan A formal document that provides an overview of an agency’s
[OMB A-130] privacy program, including a description of the structure of the
privacy program, the resources dedicated to the privacy
program, the role of the Senior Agency Official for Privacy and
other privacy officials and staff, the strategic goals and
objectives of the privacy program, and the program
management controls and common controls in place or planned
for meeting applicable privacy requirements and managing
privacy risks.
privileged account A system account with authorizations of a privileged user.
privileged command A human-initiated command executed on a system involving the
control, monitoring, or administration of the system, including
security functions and associated security-relevant information.
privileged user A user that is authorized (and therefore, trusted) to perform
[CNSSI 4009] security-relevant functions that ordinary users are not
authorized to perform.
protected distribution Wire line or fiber optic system that includes adequate safeguards
system and/or countermeasures (e.g., acoustic, electric,
[CNSSI 4009] electromagnetic, and physical) to permit its use for the
transmission of unencrypted information through an area of
lesser classification or control.
provenance The chronology of the origin, development, ownership, location,
and changes to a system or system component and associated
data. It may also include personnel and processes used to
interact with or make modifications to the system, component,
or associated data.
public key infrastructure The architecture, organization, techniques, practices, and
[CNSSI 4009] procedures that collectively support the implementation and
operation of a certificate-based public key cryptographic system.
Framework established to issue, maintain, and revoke public key
certificates.
purge A method of sanitization that applies physical or logical
[SP 800-88] techniques that render target data recovery infeasible using
state of the art laboratory techniques.
role-based access control Access control based on user roles (i.e., a collection of access
authorizations a user receives based on an explicit or implicit
assumption of a given role). Role permissions may be inherited
through a role hierarchy and typically reflect the permissions
needed to perform defined functions within an organization. A
given role may apply to a single individual or to several
individuals.
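A minimal sketch of the role, permission, and role-hierarchy structure described above; the role names, permissions, and inheritance relationships are hypothetical.

```python
# Hypothetical roles and permissions.
ROLE_PERMISSIONS = {
    "employee": {"read:timesheet", "submit:timesheet"},
    "auditor": {"read:audit-log"},
    "manager": {"approve:timesheet"},
}
# Role hierarchy: a manager inherits the permissions of an employee.
ROLE_INHERITS = {"manager": ["employee"]}

def permissions_for(role: str) -> set:
    """Collect a role's permissions, including those inherited through the hierarchy."""
    perms = set(ROLE_PERMISSIONS.get(role, set()))
    for inherited_role in ROLE_INHERITS.get(role, []):
        perms |= permissions_for(inherited_role)
    return perms

def is_authorized(user_roles: list, permission: str) -> bool:
    """Access decision based on the roles assumed by the user."""
    return any(permission in permissions_for(r) for r in user_roles)

assert is_authorized(["manager"], "submit:timesheet")       # inherited from employee
assert not is_authorized(["auditor"], "approve:timesheet")
```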
runtime The period during which a computer program is executing.
sanitization A process to render access to target data on the media infeasible
[SP 800-88] for a given level of effort. Clear, purge, and destroy are actions
that can be taken to sanitize media.
scoping considerations A part of tailoring guidance providing organizations with specific
considerations on the applicability and implementation of
security and privacy controls in the control baselines.
Considerations include policy or regulatory, technology, physical
infrastructure, system component allocation, public access,
scalability, common control, operational or environmental, and
security objective.
security A condition that results from the establishment and
[CNSSI 4009] maintenance of protective measures that enable an organization
to perform its mission or critical functions despite risks posed by
threats to its use of systems. Protective measures may involve a
combination of deterrence, avoidance, prevention, detection,
recovery, and correction that should form part of the
organization’s risk management approach.
security attribute An abstraction representing the basic properties or
characteristics of an entity with respect to safeguarding
information; typically associated with internal data structures,
including records, buffers, and files within the system and used
to enable the implementation of access control and flow control
policies, reflect special dissemination, handling or distribution
instructions, or support other aspects of the information security
policy.
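A minimal sketch of how security attributes bound to a data object might drive an access control decision; the attribute values, caveats, and dominance rule shown are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical ordering of classification levels.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2}

@dataclass
class SecurityAttributes:
    classification: str
    caveats: set = field(default_factory=set)   # e.g., special handling instructions

@dataclass
class DataObject:
    name: str
    attributes: SecurityAttributes

def may_read(subject_attrs: SecurityAttributes, obj: DataObject) -> bool:
    """Simple attribute-based decision: level dominance plus caveat coverage."""
    return (LEVELS[subject_attrs.classification] >= LEVELS[obj.attributes.classification]
            and obj.attributes.caveats <= subject_attrs.caveats)

analyst = SecurityAttributes("secret", {"NOFORN"})
report = DataObject("ops-report", SecurityAttributes("confidential", {"NOFORN"}))
assert may_read(analyst, report)
```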
security categorization The process of determining the security category for information
or a system. Security categorization methodologies are described
in CNSS Instruction 1253 for national security systems and in
FIPS Publication 199 for other than national security systems.
See security category.
security category The characterization of information or an information system
[OMB A-130] based on an assessment of the potential impact that a loss of
confidentiality, integrity, or availability of such information or
information system would have on agency operations, agency
assets, individuals, other organizations, and the Nation.
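As a worked illustration, FIPS 199 expresses a security category as an impact value (low, moderate, or high) for each security objective; when a system handles multiple information types, the system-level category is commonly derived by taking the highest impact value per objective (the high-water mark). The information types and impact values below are hypothetical.

```python
IMPACT = {"LOW": 1, "MODERATE": 2, "HIGH": 3}
OBJECTIVES = ("confidentiality", "integrity", "availability")

def system_category(info_type_categories: list) -> dict:
    """High-water mark: highest impact per security objective across all
    information types processed, stored, or transmitted by the system."""
    return {
        obj: max((c[obj] for c in info_type_categories), key=IMPACT.get)
        for obj in OBJECTIVES
    }

# Hypothetical information types handled by one system.
payroll   = {"confidentiality": "MODERATE", "integrity": "MODERATE", "availability": "LOW"}
web_pages = {"confidentiality": "LOW",      "integrity": "MODERATE", "availability": "MODERATE"}

print(system_category([payroll, web_pages]))
# {'confidentiality': 'MODERATE', 'integrity': 'MODERATE', 'availability': 'MODERATE'}
```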
security policy filter A hardware and/or software component that performs one or
more of the following functions: content verification to ensure
the data type of the submitted content; content inspection,
analyzing the submitted content to verify it complies with a
defined policy; malicious content checker that evaluates the
content for malicious code; suspicious activity checker that
evaluates or executes the content in a safe manner, such as in a
sandbox or detonation chamber and monitors for suspicious
activity; or content sanitization, cleansing, and transformation,
which modifies the submitted content to comply with a defined
policy.
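A minimal sketch of the filtering functions listed above arranged as a pipeline; each check is a hypothetical stand-in for policy-specific logic.

```python
from typing import Callable, List, Tuple

def verify_type(content: bytes) -> bool:
    return content.startswith(b"%PDF")          # content verification: expected data type

def inspect_policy(content: bytes) -> bool:
    return b"TOP SECRET" not in content         # content inspection against a defined policy

def check_malicious(content: bytes) -> bool:
    return b"EICAR" not in content              # stand-in for a malicious content checker

def sanitize(content: bytes) -> bytes:
    # Content sanitization/transformation to bring the submission into compliance.
    return content.replace(b"internal-hostname", b"[REDACTED]")

def filter_submission(content: bytes) -> Tuple[bool, bytes]:
    checks: List[Callable[[bytes], bool]] = [verify_type, inspect_policy, check_malicious]
    if not all(check(content) for check in checks):
        return False, b""
    return True, sanitize(content)

ok, released = filter_submission(b"%PDF report from internal-hostname")
```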
security requirement A requirement levied on an information system or an
[FIPS 200, Adapted] organization that is derived from applicable laws, executive
orders, directives, regulations, policies, standards, procedures,
or mission/business needs to ensure the confidentiality,
integrity, and availability of information that is being processed,
stored, or transmitted.
Note: Security requirements can be used in a variety of contexts from high-
level policy-related activities to low-level implementation-related activities in
system development and engineering disciplines.
senior agency official for Senior official, designated by the head of each agency, who has
privacy agency-wide responsibility for privacy, including implementation
[OMB A-130] of privacy protections; compliance with Federal laws,
regulations, and policies relating to privacy; management of
privacy risks at the agency; and a central policy-making role in
the agency’s development and evaluation of legislative,
regulatory, and other policy proposals.
senior information See senior agency information security officer.
security officer
sensitive compartmented Classified information concerning or derived from intelligence
information sources, methods, or analytical processes, which is required to
[CNSSI 4009] be handled within formal access control systems established by
the Director of National Intelligence.
service-oriented A set of principles and methodologies for designing and
architecture developing software in the form of interoperable services. These
services are well-defined business functions that are built as
software components (i.e., discrete pieces of code and/or data
structures) that can be reused for different purposes.
shared control A security or privacy control that is implemented for an
information system in part as a common control and in part as a
system-specific control. See hybrid control.
software Computer programs and associated data that may be
[CNSSI 4009] dynamically written or modified during execution.
spam The abuse of electronic messaging systems to indiscriminately
send unsolicited bulk messages.
special access program A program established for a specific class of classified
[CNSSI 4009] information that imposes safeguarding and access requirements
that exceed those normally required for information at the same
classification level.
split tunneling The process of allowing a remote user or device to establish a
non-remote connection with a system and simultaneously
communicate via some other connection to a resource in an
external network. This method of network access enables a user
to access remote devices and simultaneously access
uncontrolled networks.
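A minimal sketch, using a hypothetical routing-table representation, of the condition described above: a default route that bypasses the remote-access (VPN) interface allows simultaneous access to uncontrolled networks.

```python
from dataclasses import dataclass

@dataclass
class Route:
    destination: str   # e.g., "0.0.0.0/0" for a default route
    interface: str     # e.g., "vpn0", "wlan0"

def split_tunneling_enabled(routes: list, vpn_interface: str = "vpn0") -> bool:
    """True if any default route bypasses the VPN interface, i.e., the device can
    reach uncontrolled networks while the remote connection is active."""
    return any(r.destination == "0.0.0.0/0" and r.interface != vpn_interface
               for r in routes)

# Hypothetical routing tables.
full_tunnel  = [Route("0.0.0.0/0", "vpn0")]
split_tunnel = [Route("10.0.0.0/8", "vpn0"), Route("0.0.0.0/0", "wlan0")]

assert not split_tunneling_enabled(full_tunnel)
assert split_tunneling_enabled(split_tunnel)
```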
spyware Software that is secretly or surreptitiously installed into an
information system to gather information on individuals or
organizations without their knowledge; a type of malicious code.
subject An individual, process, or device causing information to flow
among objects or change to the system state. Also see object.
subsystem A major subdivision or component of an information system
consisting of information, information technology, and
personnel that performs one or more specific functions.
supply chain Linked set of resources and processes between multiple tiers of
[ISO 28001, Adapted] developers that begins with the sourcing of products and
services and extends through the design, development,
manufacturing, processing, handling, and delivery of products
and services to the acquirer.
supply chain element An information technology product or product component that
contains programmable logic and that is critically important to
the functioning of a system.
supply chain risk A systematic process for managing supply chain risk by
management identifying susceptibilities, vulnerabilities, and threats
[CNSSD 505] throughout the supply chain and developing mitigation
strategies to combat those threats whether presented by the
supplier, the supplied product and its subcomponents, or the
supply chain (e.g., initial production, packaging, handling,
storage, transport, mission operation, and disposal).
system Any organized assembly of resources and procedures united and
[CNSSI 4009] regulated by interaction or interdependence to accomplish a set
of specific functions.
Note: Systems also include specialized systems such as industrial/process
control systems, telephone switching and private branch exchange (PBX)
systems, and environmental control systems.
[ISO 15288] Combination of interacting elements organized to achieve one or
more stated purposes.
Note 1: There are many types of systems. Examples include: general and
special-purpose information systems; command, control, and communication
systems; crypto modules; central processing unit and graphics processor
boards; industrial/process control systems; flight control systems; weapons,
targeting, and fire control systems; medical devices and treatment systems;
financial, banking, and merchandising transaction systems; and social
networking systems.
Note 2: The interacting elements in the definition of system include hardware,
software, data, humans, processes, facilities, materials, and naturally occurring
physical entities.
Note 3: System-of-systems is included in the definition of system.
threat modeling A form of risk assessment that models aspects of the attack and
[SP 800-154] defense sides of a logical entity, such as a piece of data, an
application, a host, a system, or an environment.
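A minimal sketch of a threat model as a data structure pairing the attack side (threats) with the defense side (mitigations) for a logical entity; the entity, threats, mitigations, and rating scales are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Threat:                      # attack side
    description: str
    likelihood: int                # hypothetical 1-3 scale
    impact: int

@dataclass
class Mitigation:                  # defense side
    description: str
    covers: list = field(default_factory=list)

@dataclass
class ThreatModel:
    entity: str                    # e.g., an application, a host, an environment
    threats: list = field(default_factory=list)
    mitigations: list = field(default_factory=list)

    def unmitigated(self) -> list:
        covered = {d for m in self.mitigations for d in m.covers}
        return [t for t in self.threats if t.description not in covered]

model = ThreatModel("public web application")
model.threats.append(Threat("credential stuffing against login endpoint", 3, 2))
model.mitigations.append(Mitigation("account lockout and MFA",
                                    covers=["credential stuffing against login endpoint"]))
assert model.unmitigated() == []
```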
threat source The intent and method targeted at the intentional exploitation
[FIPS 200] of a vulnerability or a situation and method that may
accidentally trigger a vulnerability. See threat agent.
trusted path A mechanism by which a user (through an input device) can
communicate directly with the security functions of the system
with the necessary confidence to support the system security
policy. This mechanism can only be activated by the user or the
security functions of the system and cannot be imitated by
untrusted software.
trustworthiness The attribute of a person or enterprise that provides confidence
[CNSSI 4009] to others of the qualifications, capabilities, and reliability of that
entity to perform specific tasks and fulfill assigned
responsibilities.
trustworthiness (system) The degree to which an information system (including the
information technology components that are used to build the
system) can be expected to preserve the confidentiality,
integrity, and availability of the information being processed,
stored, or transmitted by the system across the full range of
threats. A trustworthy information system is a system that is
believed to be capable of operating within defined levels of risk
despite the environmental disruptions, human errors, structural
failures, and purposeful attacks that are expected to occur in its
environment of operation.
user Individual, or (system) process acting on behalf of an individual,
[CNSSI 4009, Adapted] authorized to access a system.
See organizational user and non-organizational user.
virtual private network Protected information system link utilizing tunneling, security
[CNSSI 4009] controls, and endpoint address translation giving the impression
of a dedicated line.
vulnerability Weakness in an information system, system security procedures,
[CNSSI 4009] internal controls, or implementation that could be exploited or
triggered by a threat source.
vulnerability analysis See vulnerability assessment.
vulnerability assessment Systematic examination of an information system or product to
[CNSSI 4009] determine the adequacy of security measures, identify security
deficiencies, provide data from which to predict the
effectiveness of proposed security measures, and confirm the
adequacy of such measures after implementation.
15832 APPENDIX C
15833 ACRONYMS
15834 COMMON ABBREVIATIONS
15836 APPENDIX D
15839 Tables D-1 through D-20 provide a summary of the security and privacy controls and control
15840 enhancements in Chapter Three. Each table focuses on a different control family. A control or
15841 control enhancement that has been withdrawn from the control catalog is indicated by an
15842 explanation of the control or control enhancement disposition in light gray text. A control or
15843 control enhancement that is typically implemented by an information system through technical
15844 means is indicated by an “S” in the implemented by column. A control or control enhancement
15845 that is typically implemented by an organization (i.e., by an individual through nontechnical
15846 means) is indicated by an “O” in the implemented by column. 32 A control or control
15847 enhancement that can be implemented by an organization, a system, or a combination of the
15848 two is indicated by an “O/S”. Finally, controls or control enhancements marked with a “√” in the
15849 assurance column indicate the controls or control enhancements that contribute to the grounds
15850 for justified confidence that a security or privacy claim has been or will be achieved. 33 Each
15851 control and control enhancement in tables D-1 through D-20 is hyperlinked to the text for that
15852 control and control enhancement in Chapter Three.
32 The indication that a certain control or control enhancement is implemented by a system or by an organization in
Tables D-1 through D-20 is notional. Organizations have the flexibility to implement their selected controls and
control enhancements in the most cost-effective and efficient manner while simultaneously complying with the basic
intent of the controls or control enhancements. In certain situations, a control or control enhancement may be
implemented by the system or by the organization or a combination of the two entities.
33 Assurance is a critical aspect in determining the trustworthiness of systems. Assurance is the measure of confidence
that the security and privacy functions, features, practices, policies, procedures, mechanisms, and architecture of
organizational systems accurately mediate and enforce established security and privacy policies.
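As an informal illustration of the implemented-by ("S", "O", "O/S") and assurance ("√") annotations described above, the sketch below represents summary rows as records and queries them. The field names are hypothetical; the example rows mirror entries visible in the table fragments that follow.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlSummary:
    identifier: str                 # e.g., "PE-14(2)"
    title: str
    implemented_by: Optional[str]   # "S" (system), "O" (organization), "O/S", or None if withdrawn
    assurance: bool = False         # True when the row carries the assurance mark ("√")
    withdrawn_to: Optional[str] = None

# Rows mirroring fragments of the Appendix D tables shown below.
rows = [
    ControlSummary("PE-14", "Environmental Controls", "O"),
    ControlSummary("SA-21", "Developer Screening", "O", assurance=True),
    ControlSummary("SA-18", "Tamper Resistance and Detection", None, withdrawn_to="SR-9"),
]

assurance_controls = [r.identifier for r in rows if r.assurance]
active_rows = [r for r in rows if r.withdrawn_to is None]
```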
TABLE D-11: PHYSICAL AND ENVIRONMENTAL PROTECTION FAMILY
PE-13(4) INSPECTIONS O
PE-14 Environmental Controls O
PE-14(1) AUTOMATIC CONTROLS O
PE-14(2) MONITORING WITH ALARMS AND NOTIFICATIONS O
PE-15 Water Damage Protection O
PE-15(1) AUTOMATION SUPPORT O
PE-16 Delivery and Removal O
PE-17 Alternate Work Site O
PE-18 Location of System Components O
PE-18(1) FACILITY SITE W: Moved to PE-23.
PE-19 Information Leakage O
PE-19(1) NATIONAL EMISSIONS AND TEMPEST POLICIES AND PROCEDURES O
PE-20 Asset Monitoring and Tracking O
PE-21 Electromagnetic Pulse Protection O
PE-22 Component Marking O
PE-23 Facility Location O
15884 TABLE D-15: PERSONALLY IDENTIFIABLE INFORMATION PROCESSING AND TRANSPARENCY FAMILY
TABLE D-17: SYSTEM AND SERVICES ACQUISITION FAMILY
SA-17(8) ORCHESTRATION O √
SA-17(9) DESIGN DIVERSITY O √
SA-18 Tamper Resistance and Detection W: Moved to SR-9.
SA-18(1) MULTIPLE PHASES OF SYSTEM DEVELOPMENT LIFE CYCLE W: Moved to SR-9(1).
SA-18(2) INSPECTION OF SYSTEMS OR COMPONENTS W: Moved to SR-10.
SA-19 Component Authenticity W: Moved to SR-11.
SA-19(1) ANTI-COUNTERFEIT TRAINING W: Moved to SR-11(1).
SA-19(2) CONFIGURATION CONTROL FOR COMPONENT SERVICE AND REPAIR W: Moved to SR-11(2).
SA-19(3) COMPONENT DISPOSAL W: Moved to SR-11(3).
SA-19(4) ANTI-COUNTERFEIT SCANNING W: Moved to SR-11(4).
SA-20 Customized Development of Critical Components O √
SA-21 Developer Screening O √
SA-21(1) VALIDATION OF SCREENING W: Incorporated into SA-21.
SA-22 Unsupported System Components O √
SA-22(1) ALTERNATIVE SOURCES FOR CONTINUED SUPPORT W: Incorporated into SA-22.
SA-23 Specialization O √
OPTION 1 OPTION 2
15915 This collaboration index is a starting point to facilitate discussion between security and privacy
15916 programs within organizations, since the degree of collaboration needed to implement controls
15917 for specific systems depends on many factors.
15918 For purposes of review and comment, three control families are identified as notional examples
15919 – Access Control (AC), Program Management (PM), and Personally Identifiable Information
15920 Processing and Transparency (PT). Tables 1 through 3 below provide the sample security and
15921 privacy collaboration rating indices for the three control families selected to demonstrate this
15922 approach.
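As an informal illustration, a collaboration index can be represented as a per-control rating on either scale. The gradient labels and ratings below are hypothetical; the tables themselves define the actual values.

```python
# Hypothetical gradient labels for the two options under consideration.
THREE_GRADIENT = ("low", "moderate", "high")     # Option 1: 3-gradient scale
FIVE_GRADIENT = ("1", "2", "3", "4", "5")        # Option 2: 5-gradient scale

# Notional collaboration ratings for a few Access Control (AC) controls.
collaboration_index = {"AC-1": "high", "AC-2": "high", "AC-7": "low"}

def joint_implementation_needed(control_id: str, threshold: str = "moderate") -> bool:
    """Flag controls whose rating meets or exceeds the threshold on the 3-gradient scale."""
    rating = collaboration_index.get(control_id, THREE_GRADIENT[0])
    return THREE_GRADIENT.index(rating) >= THREE_GRADIENT.index(threshold)

assert joint_implementation_needed("AC-1")
assert not joint_implementation_needed("AC-7")
```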
15923 We are interested in comments in the following areas.
15924 • Does an implementation collaboration index for each control provide meaningful guidance
15925 to both privacy and security professionals? If so, how? If not, what are potential issues and
15926 concerns?
15927 • Which option (3-gradient scale or 5-gradient scale) is preferred and why?
15928 • Are there other recommendations for a collaboration index?
15929 • Are there recommendations on other ways to provide more guidance on collaboration?
15931 TABLE 1: ACCESS CONTROL FAMILY