ISO/IEC TR 27563:2023(E)
TECHNICAL REPORT
First edition
2023-05
Reference number: ISO/IEC TR 27563:2023(E)
© ISO/IEC 2023
Contents
Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
4 Abbreviated terms
5 Analysis of security and privacy
5.1 General
5.2 Application domains in ISO/IEC TR 24030:2021 use cases
5.3 Security in ISO/IEC TR 24030:2021 use cases
5.4 Privacy in ISO/IEC TR 24030:2021 use cases
6 Templates for analysis
7 Supporting information
7.1 Describe ecosystem
7.2 Provide assessment of systems of interest
7.3 Identify security and privacy concerns
7.4 Identify security and privacy risks
7.5 Identify security and privacy controls
7.6 Identify security and privacy assurance concerns
7.7 Identify security and privacy plan requirements
Annex A (informative) Additional use cases
Bibliography
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical
activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international
organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the
work.
The procedures used to develop this document and those intended for its further maintenance
are described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria
needed for the different types of document should be noted. This document was drafted in
accordance with the editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives or
www.iec.ch/members_experts/refdocs).
ISO and IEC draw attention to the possibility that the implementation of this document may involve the
use of (a) patent(s). ISO and IEC take no position concerning the evidence, validity or applicability of
any claimed patent rights in respect thereof. As of the date of publication of this document, ISO and IEC
had not received notice of (a) patent(s) which may be required to implement this document. However,
implementers are cautioned that this may not represent the latest information, which may be obtained
from the patent database available at www.iso.org/patents and https://patents.iec.ch. ISO and IEC shall
not be held responsible for identifying any or all such patent rights.
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and
expressions related to conformity assessment, as well as information about ISO's adherence to
the World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT), see
www.iso.org/iso/foreword.html. In the IEC, see www.iec.ch/understanding-standards.
This document was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology,
Subcommittee SC 27, Information security, cybersecurity and privacy protection.
Any feedback or questions on this document should be directed to the user’s national standards
body. A complete listing of these bodies can be found at www.iso.org/members.html and
www.iec.ch/national-committees.
Introduction
Artificial intelligence (AI) and machine learning (ML) are increasingly being adopted by the digital
industry, using algorithms to make decisions that have the potential to negatively impact the privacy
of individuals and in some cases can even cause harm to some of them, unless adequate safeguards are
deployed. Such safeguards to protect privacy often depend on a variety of factors including the specific
type of process, sensitivity of data used, and potential harm likely to be caused.
This concern has been expressed by:
— Practitioners, who identified 23 principles for AI at the 2017 Asilomar conference[1] covering
research, ethics and values, as well as longer term issues.
— Standard developers, as evidenced by the report on ethically aligned design published by the IEEE
Global Initiative on Ethics of Autonomous and Intelligent Systems[2].
— Policy makers, as exemplified by the appointment by the European Commission of a high-level
expert group on artificial intelligence and the subsequent publication of an assessment list[3].
This document provides an analysis of security and privacy of use cases provided in ISO/IEC TR 24030,
which should be used in parallel. A number of additional use cases are provided in Annex A.
This document also uses concepts from ISO/IEC TR 24028, which addresses trustworthiness in AI
systems, including approaches to establish trust (e.g. transparency, explainability, controllability), and
to achieve trustworthiness properties (e.g. resiliency, reliability, accuracy, safety, security, or privacy).
1 Scope
This document outlines best practices on assessing security and privacy in artificial intelligence use
cases, covering in particular those published in ISO/IEC TR 24030.
The following aspects are addressed:
— an overall assessment of security and privacy on the AI system of interest;
— security and privacy concerns;
— security and privacy risks;
— security and privacy controls;
— security and privacy assurance; and
— security and privacy plans.
Security and privacy are treated separately as the analysis of security and the analysis of privacy can
differ.
2 Normative references
There are no normative references in this document.
3.2
PII controller
privacy stakeholder (or privacy stakeholders) that determines the purposes and means for processing
personally identifiable information (PII) (3.1) other than natural persons who use data for personal
purposes
Note 1 to entry: A PII controller sometimes instructs others [e.g. PII processors (3.4)] to process PII on its behalf
while the responsibility for the processing remains with the PII controller.
4 Abbreviated terms
STRIDE spoofing identity, tampering, repudiation, information disclosure, denial of service, ele-
vation of privilege
UC use case
V2X vehicle-to-everything
5 Analysis of security and privacy
5.1 General
This document includes a security and privacy analysis of ISO/IEC TR 24030:2021 use cases. Two
electronic attachments were used:
— the first is the material used by ISO/IEC TR 24030:2021, available here: https://standards.iso.org/iso-iec/tr/24030/ed-1/en/Use+cases-v05_electronic_attachment_022021.pdf,
— the second is the material used by this document, available here: https://standards.iso.org/iso-iec/tr/27563/ed-1/en/Security-privacy-24030-ed-1-AI-use-cases.pdf.
Annex A provides a list of new use cases.
NOTE 2 The number of use cases per domain (e.g. 1 energy use case compared to 29 healthcare use cases) is not an indication of the potential deployment of AI capabilities in a domain.
NOTE 3 The assignment of a use case to a domain depends on the viewpoint of experts. For instance, use case
132 (Device control using both cloud AI and embedded AI) is classified as manufacturing instead of home.
— the number of use cases for which security concerns can be maximum.
NOTE 1 The assessment is based on the most critical systems of interest. For instance, use case 1 (Explainable artificial intelligence for genomic medicine) involves two systems of interest: the genomic sequence processing system, for which security concerns can be maximum, and the genomic training system, for which security concerns can be significant. The resulting assessment is that security concerns can be maximum.
NOTE 2 The assessment result of each domain is not an indication of the potential privacy concern of AI in a
domain.
7 Supporting information
NOTE 3 Table 5 is based on ISO/IEC TS 27110 and the NIST privacy framework[15].
Table 4 (continued)
Points of attention: Description
Data minimization: Minimize the PII which is processed and the number of privacy stakeholders and people to whom PII is disclosed or who have access to it
Use, retention and disclosure limitation: Limiting the use, retention and disclosure (including transfer) of PII to that which is necessary in order to fulfil specific, explicit and legitimate purposes
Accuracy and quality: Ensuring that the PII processed is accurate, complete, up-to-date (unless there is a legitimate basis for keeping outdated data), adequate and relevant for the purpose of use
Openness, transparency and notice: Providing PII principals with clear and easily accessible information about the PII controller's policies, procedures and practices with respect to the processing of PII
Individual participation and access: Giving PII principals the ability to access and review their PII, provided their identity is first authenticated with an appropriate level of assurance and such access is not prohibited by applicable law
Accountability: Documenting and communicating as appropriate all privacy-related policies, procedures and practices. Assigning to a specified individual within the organization (who can in turn delegate to others in the organization as appropriate) the task of implementing the privacy-related policies, procedures and practices
Information security: Protecting PII under its authority with appropriate controls at the operational, functional and strategic level to ensure the integrity, confidentiality and availability of the PII, and to protect it against risks such as unauthorized access, destruction, use, modification, disclosure or loss throughout the whole of its life cycle
Privacy compliance: Verifying and demonstrating that the processing meets data protection and privacy safeguarding requirements by periodically conducting audits using internal auditors or trusted third-party auditors
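Several of the points of attention above, notably data minimization and the use, retention and disclosure limitation, can also be enforced mechanically in the data-handling layer of an AI system. The following Python sketch is purely illustrative and not part of the controls defined in the referenced standards; the field names, purposes and the 90-day retention period are assumptions chosen for the example.

```python
from datetime import datetime, timedelta

# Illustrative only: field names, purposes and the retention period are assumptions.
ALLOWED_FIELDS_PER_PURPOSE = {
    "fraud_detection": {"account_id", "transaction_amount", "timestamp"},
    "service_improvement": {"timestamp", "feature_used"},
}
RETENTION = timedelta(days=90)

def minimize(record: dict, purpose: str) -> dict:
    """Keep only the fields needed for the declared purpose (data minimization)."""
    allowed = ALLOWED_FIELDS_PER_PURPOSE.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}

def retention_expired(collected_at: datetime, now: datetime | None = None) -> bool:
    """Flag records whose retention period has elapsed (use and retention limitation)."""
    now = now or datetime.utcnow()
    return now - collected_at > RETENTION

record = {"account_id": "A-17", "transaction_amount": 120.0,
          "timestamp": "2023-05-01T10:00:00", "home_address": "..."}
print(minimize(record, "fraud_detection"))  # home_address is dropped
```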
Table 8 and Table 9 show examples of categories of threats that can be used.
NOTE These categories of threats are based on the STRIDE and LINDDUN taxonomies.
Table 8 (continued)
Points of attention: Description
Privacy threats (LINDDUN taxonomy):
Linkability: Establishing the link between two or more actions, identities, and pieces of information
Identifiability: Establishing the link between an identity and an action or a piece of information
Non-repudiation: Inability to deny having performed an action that other parties can neither confirm nor contradict
Detectability: Detecting the PII principal's activities
Disclosure of information: Disclosing the data content or controlled release of data content
Unawareness: PII principals being unaware of what PII about them is being processed; unawareness by PII controllers of life cycle weaknesses that can exist or develop due to greater awareness of the content of the training model or other ML techniques
Non-compliance: The PII controller fails to inform the data subject about the system's privacy policy, or does not allow the PII principal to specify consents in compliance with legislation
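When the threat categories of Table 8 are applied to many use cases, for example in a review tool, it can help to encode them as fixed enumerations so that each use case is tagged consistently. The sketch below is an assumed, illustrative encoding in Python; neither the class names nor the tagging structure are defined by this document, and the example tags reuse the risks identified for the employee monitoring use case in Annex A.

```python
from enum import Enum

class StrideThreat(Enum):
    """Security threat categories (STRIDE)."""
    SPOOFING = "spoofing identity"
    TAMPERING = "tampering"
    REPUDIATION = "repudiation"
    INFORMATION_DISCLOSURE = "information disclosure"
    DENIAL_OF_SERVICE = "denial of service"
    ELEVATION_OF_PRIVILEGE = "elevation of privilege"

class LinddunThreat(Enum):
    """Privacy threat categories (LINDDUN)."""
    LINKABILITY = "linkability"
    IDENTIFIABILITY = "identifiability"
    NON_REPUDIATION = "non-repudiation"
    DETECTABILITY = "detectability"
    DISCLOSURE_OF_INFORMATION = "disclosure of information"
    UNAWARENESS = "unawareness"
    NON_COMPLIANCE = "non-compliance"

# Example: tagging a use case with the threats identified during analysis.
use_case_threats = {
    "AI webcam employee monitoring": {
        "security": [StrideThreat.REPUDIATION, StrideThreat.INFORMATION_DISCLOSURE,
                     StrideThreat.SPOOFING],
        "privacy": [LinddunThreat.IDENTIFIABILITY, LinddunThreat.DETECTABILITY,
                    LinddunThreat.NON_COMPLIANCE],
    }
}
```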
The categories of issues related to privacy consequences listed in Table 9 can be used.
Table 11 (continued)
Category themes Controls
Networks security
Security of network services
Segregation of networks
Web filtering
Use of cryptography
Secure development life cycle
Application security requirements
Secure system architecture and engineering principles
Secure coding
Security testing in development and acceptance
Outsourced development
Separation of development, test and production environments
Change management
Test information
Protection of information systems during audit testing
Table 12 lists control categories as proposed by ISO/IEC 27701 for PII controllers.
Table 12 — Additional supporting information for PII controllers (for information systems)
Category: Supporting information
Conditions for collection and processing: Identify and document purpose; Identify lawful basis; Determine when and how consent is to be obtained; Obtain and record consent; Privacy impact assessment; Joint PII controller; Records related to processing PII
Obligations to PII principals: Determining and fulfilling obligations to PII principals; Determining information for PII principals; Providing information to PII principals; Providing mechanism to modify or withdraw consent; Providing mechanism to object to PII processing; Access, correction and/or erasure; PII controllers' obligation to inform third parties; Handling requests; Automated decision making
Privacy by design and privacy by default: Limit collection; Limit processing; Accuracy and quality; PII minimization objectives; PII de-identification and deletion at the end of processing; Temporary files; Retention; Disposal; PII transmission controls
PII sharing, transfer and disclosure: Identify basis for PII transfer between jurisdictions; Countries and international organizations to which PII can be transferred; Records of transfer of PII; Records of PII disclosure to third parties
Table 13 lists control categories as proposed by ISO/IEC 27701 for PII processors.
Table 13 — Additional supporting information for PII processors (for information systems)
Category: Supporting information
Conditions for collection and processing: Customer agreement; Organization's purposes; Marketing and advertising use; Infringing instruction; Customer obligations; Records related to processing PII
Obligations to PII principals: Obligations to PII principals
Privacy by design and privacy by default: Temporary files; Return, transfer or disposal of PII; PII transmission controls
PII sharing, transfer and disclosure: Basis for PII transfer between jurisdictions; Countries and international organizations to which PII can be transferred; Records of PII disclosure to third parties; Notification of PII disclosure requests; Legally binding PII disclosures; Disclosure of subcontractors used to process PII; Engagement of a subcontractor to process PII; Change of subcontractor to process PII
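Tables 12 and 13 can serve as review checklists for a given use case. The following sketch shows one possible, assumed way to hold such a checklist in code; the ControlCategory class and its methods are illustrative only, and the item lists are abbreviated extracts of Table 12.

```python
from dataclasses import dataclass, field

@dataclass
class ControlCategory:
    """One ISO/IEC 27701 category with its supporting information items (illustrative)."""
    name: str
    items: list[str]
    addressed: dict[str, bool] = field(default_factory=dict)

    def mark(self, item: str, done: bool = True) -> None:
        """Record whether a supporting information item has been addressed."""
        self.addressed[item] = done

    def open_items(self) -> list[str]:
        """Return the items not yet addressed for this category."""
        return [i for i in self.items if not self.addressed.get(i, False)]

controller_checklist = [
    ControlCategory(
        "Conditions for collection and processing",
        ["Identify and document purpose", "Identify lawful basis",
         "Obtain and record consent", "Privacy impact assessment"],
    ),
    ControlCategory(
        "Privacy by design and privacy by default",
        ["Limit collection", "PII minimization objectives", "Retention", "Disposal"],
    ),
]

controller_checklist[0].mark("Identify and document purpose")
print(controller_checklist[0].open_items())
```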
Table 15 (continued)
Points of attention: Comment
Citizen engagement process: The citizen engagement process focuses on consultation with citizens on security and privacy rules and policies at governance level, and on supporting the enforcement of these rules and policies concerning the security and privacy of an ecosystem service.
— Re-identified data (data that was initially categorized as non-PII is now PII)
Annex A
(informative)
Additional use cases
A.1 General
This annex provides additional examples of use cases, elaborated by experts within the scope of this document, which are not listed in ISO/IEC TR 24030.
Figure A.1 shows the impact of the use case on security and privacy (diagram drawn with sequencediagram.org): the system of interest (abnormal transactions monitoring system), the related assets (core banking system, internal control and compliance data), and a security impact and a privacy impact both assessed as significant.
Figure A.1 — UC SC27-1 Abnormal transactions of internal control and compliance employees in bank system
Figure A.2 illustrates the impact of the use case on security and privacy (diagram drawn with sequencediagram.org): the system of interest (financial risk management system), the related assets (financial business management system, financial performance data), and a security impact and a privacy impact both assessed as significant.
— ICT resources
System of interest: AI webcam employee monitoring
Assessment of system of interest (assessment on security and privacy concerns):
— Privacy concerns for employees are significant, as they are monitored while working, especially those in a telecommute setup.
Security and privacy concerns (highlight security and privacy concerns that are impacted by AI):
Security concerns:
— Confidentiality and integrity
Privacy concerns:
— Unlinkability, transparency, purpose legitimacy and proportionality. All privacy concepts are to be considered when monitoring the productivity of employees, to ensure that employees are aware of the processing and that the data being processed are proportional to the declared purpose. (Identify-P, Govern-P, Control-P, Communicate-P, Protect-P)
Security and privacy risks (identify security and privacy risks that are impacted by AI):
Security risks:
— Repudiation, information disclosure, spoofing
Privacy risks:
— Identifiability, detectability, disclosure of information, unawareness, non-compliance, lack of transparency, unsolicited tracking
Security and privacy controls (identify security and privacy controls that are impacted by AI):
— Controls from ISO/IEC 27001 apply (e.g. mobile devices and teleworking, logging and monitoring, user responsibilities, compliance with legal and contractual requirements)
— Controls from ISO/IEC 27701 apply (e.g. identify and document purpose, identify lawful basis, privacy impact assessment, obligations to PII principals, privacy by design and privacy by default, records of PII disclosure to third parties)
Security and privacy assurance (identify security and privacy assurance aspects that are impacted by AI):
— Assurance approach and metrics for assurance
Security and privacy plan (identify security and privacy plan aspects that are impacted by AI):
— All security and privacy plan requirements apply (governance process, data management process, risk management process, engineering process, citizen engagement process).
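Where several use cases are assessed with the same template, the results can be captured in a structured record for comparison across domains. The sketch below is a hypothetical Python encoding of the employee monitoring assessment above; the class and field names are assumptions and not part of the template.

```python
from dataclasses import dataclass, field

@dataclass
class UseCaseAssessment:
    """Illustrative record mirroring the assessment template (field names are assumptions)."""
    system_of_interest: str
    security_concerns: list[str] = field(default_factory=list)
    privacy_concerns: list[str] = field(default_factory=list)
    security_risks: list[str] = field(default_factory=list)
    privacy_risks: list[str] = field(default_factory=list)
    controls: list[str] = field(default_factory=list)
    plan_requirements: list[str] = field(default_factory=list)

webcam_monitoring = UseCaseAssessment(
    system_of_interest="AI webcam employee monitoring",
    security_concerns=["confidentiality", "integrity"],
    privacy_concerns=["unlinkability", "transparency",
                      "purpose legitimacy and proportionality"],
    security_risks=["repudiation", "information disclosure", "spoofing"],
    privacy_risks=["identifiability", "detectability", "disclosure of information",
                   "unawareness", "non-compliance"],
    controls=["ISO/IEC 27001 logging and monitoring",
              "ISO/IEC 27701 privacy impact assessment"],
    plan_requirements=["governance process", "data management process",
                       "risk management process", "engineering process",
                       "citizen engagement process"],
)
```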
Table (continued)
System's threats and vulnerabilities: In the systems considered, humans generally remain in the loop and have the last word, limiting any risk of uncontrolled bias.
Key performance indicators (KPIs): ID, Name, Description, Reference to mentioned use case objectives
— the reaction function to the misbehaving ITS station (revocation, suspension) is expected to mitigate the effects of an attack.
The local and global detection systems are based on machine learning algorithms, which have been shown to outperform rule-based systems (an illustrative sketch follows the list below).
Further issues that are anticipated:
— The deployment of a misbehaviour detection system can open new avenues of attack, such as a malicious actor causing the revocation or suspension of genuine vehicles.
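The use case states that the local and global detection functions rely on machine learning rather than fixed rules. As a purely illustrative sketch (not the detection system described in the use case), a generic unsupervised anomaly detector such as scikit-learn's IsolationForest can flag V2X messages whose reported kinematics deviate from learned normal behaviour; the features, data and contamination rate below are assumptions.

```python
# Illustrative only: a generic anomaly detector standing in for the ML-based
# local misbehaviour detection described in the use case. Features and data
# are invented for the example.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Columns: reported speed (m/s), heading change (deg/s), position jump (m)
normal_msgs = rng.normal(loc=[15.0, 1.0, 0.5], scale=[3.0, 0.5, 0.2], size=(500, 3))
suspect_msgs = np.array([[80.0, 0.1, 40.0],   # implausible speed and position jump
                         [14.0, 0.9, 0.4]])   # plausible message

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_msgs)
labels = detector.predict(suspect_msgs)  # -1 = flagged as misbehaviour, 1 = normal
print(labels)
```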
Societal concerns
Description:
SDG to be achieved: Sustainable Development Goal 9: Industry, Innovation and Infrastructure[34]
Bibliography
[1] Asilomar AI Principles, Asilomar Conference. Future of Life Institute. 2017. Available from: https://futureoflife.org/open-letter/ai-principles/
[2] Ethically Aligned Design, 1st edition. IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. 2019. Available from: https://standards.ieee.org/content/dam/ieee-standards/standards/web/documents/other/ead1e.pdf
[3] The Assessment List for Trustworthy Artificial Intelligence (ALTAI). July 2020. Available from: https://futurium.ec.europa.eu/en/european-ai-alliance/pages/altai-assessment-list-trustworthy-artificial-intelligence
[4] ISO/IEC TR 24028:2020, Information technology — Artificial intelligence — Overview of trustworthiness in artificial intelligence
[5] ISO/IEC TR 24030:2021, Information technology — Artificial intelligence (AI) — Use cases
[6] ISO/IEC/IEEE 15288:2015, Systems and software engineering — System life cycle processes
[7] ISO/IEC 27701:2019, Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for
privacy information management — Requirements and guidelines
[8] ISO/IEC TR 27550:2019, Information technology — Security techniques — Privacy engineering for system life cycle processes
[9] ISO/IEC 29100:2011, Information technology — Security techniques — Privacy framework
[10] ISO/IEC 29134:2017, Information technology — Security techniques — Guidelines for privacy
impact assessment
[11] ISO/IEC/IEEE 29148:2018, Systems and software engineering — Life cycle processes —
Requirements engineering
[12] ISO/IEC 29151:2017, Information technology — Security techniques — Code of practice for
personally identifiable information protection
[13] ISO 31700-1, Consumer protection — Privacy by design for consumer goods and services — Part 1:
High-level requirements
[14] Cavoukian A., "7 Foundational Principles of Privacy by Design", Information & Privacy Commissioner, Ontario, Canada. Available from: https://www.ipc.on.ca/wp-content/uploads/Resources/7foundationalprinciples.pdf
[15] The NIST Privacy Framework, A Tool for Improving Privacy through Enterprise Risk
Management. Version 1.0 (January 2020), https://doi.org/10.6028/NIST.CSWP.01162020
[16] ISO/IEC TS 27570:2021, Privacy protection — Privacy guidelines for smart cities
[17] ISO/IEC 27002:2022, Information security, cybersecurity and privacy protection — Information
security controls
[18] ISO/IEC 27005:2022, Information security, cybersecurity and privacy protection — Guidance on
managing information security risks
[19] ISO/IEC 27006-1, Requirements for bodies providing audit and certification of information security
management systems — Part 1: General
[20] ISO/IEC TS 27006-2:2021, Requirements for bodies providing audit and certification of information security management systems — Part 2: Privacy information management systems
ICS 35.020
Price based on 29 pages