
National Data Index

Version 1.0

Document Control

Version | Revision Date | Contributor | Modification
1.0     | Oct 2023      | SDAIA       | First Issue

Table of Contents
1. Document Objective ......................................................................................................... 4

2. Table of Abbreviations...................................................................................................... 4

3. Introduction ....................................................................................................................... 5

4. NDI Overview..................................................................................................................... 6

5. Scope ................................................................................................................................. 7

6. The NDI Framework .......................................................................................................... 8

6.1. Benchmarking ............................................................................................................ 9

6.2. Index Design ............................................................................................................... 9

6.3. Outcomes ..................................................................................................................12

7. Summary ..........................................................................................................................16

8. Appendix...........................................................................................................................16

8.1. Appendix I – DM Maturity Assessment Questionnaire ...........................................16

8.2. Appendix II – Acceptance Evidence Checklists ......................................................96

8.3. Appendix III – Operational Excellence (OE) ...........................................................243

1. Document Objective
This document presents introductory details about the National Data Index (NDI) for its first round in 2023. It includes the NDI’s overview, scope, aims, framework, and components, with their measurement scores and assessment levels, which are described in the appendix.

2. Table of Abbreviations

Abbreviation: Definition

NDL: National Data Lake (a National Data Platform)
DM: Data Management
DMP: Data Marketplace (a National Data Platform)
DG: Data Governance
DQ: Data Quality
GSB: Government Service Bus (a National Data Platform)
KSA: Kingdom of Saudi Arabia
NDC: National Data Catalog (a National Data Platform)
NDI: National Data Index
NDMO: National Data Management Office
ODP: Open Data Portal (a National Data Platform)
OE: Operational Excellence
PDP: Personal Data Protection
RDP: Reference Data Management Platform (a National Data Platform)
SDAIA: Saudi Data and Artificial Intelligence Authority

3. Introduction
The Saudi Data and Artificial Intelligence Authority (SDAIA) is on a mission to develop a data-driven economy and raise the Kingdom’s data-literacy level so that data can be leveraged for operational efficiency and insightful decision-making. From this perspective, SDAIA launched, as part of its initiatives, the National Data Index (NDI): a dynamic, result-oriented monitoring and evaluation index established to assess the productivity and track the progress of government entities with regard to the Maturity of Data Management (DM) practices, Compliance with Data Management and Personal Data Protection (DM and PDP) Standards, and Operational Excellence (OE). The NDI is built specifically to satisfy the data ecosystem requirements of the Kingdom of Saudi Arabia (KSA). This comprehensive framework serves as a reliable metric and an influential indicator, aiding the realization of Vision 2030's objectives by contributing to becoming a smart nation and building a thriving economy, a vibrant society, and an ambitious nation. The NDI provides government entities with enablers that help them measure current DM practices effectively and achieve high assessment levels, and it aims to achieve the following key objectives:

1. Establish a strong Data Governance (DG) framework and policies to govern Data Management (DM) practices, gauge DM Maturity, and ensure Compliance.

2. Enhance Data Quality (DQ) and integrity to ensure accurate, complete, and consistent data.

3. Accelerate the efficiency and improve the effectiveness of the Data Management (DM) operational processes.

4. Implement data lifecycle management processes to handle data from creation to disposal in a compliant manner.

5. Develop mechanisms for compliance reporting and auditing to track and monitor adherence to regulations.

6. Foster a culture of Data Management (DM) through employee training programs and awareness campaigns for the target audiences.
4. NDI Overview
SDAIA designed the NDI as a data-specific maturity index based on benchmarking research into global data indices to identify best practices. To activate the NDI, it was developed based on the following three components:

• DM Maturity Assessment Questionnaire: Measures the extent to which the entity applies best practices in 14 DM domains with regard to people, technologies, and operational processes. The output reports also provide recommendations based on the results to improve the entity’s DM Maturity level.

• DM and PDP Standards Compliance Assessment: Measures the entity’s commitment to the adoption and implementation of the Data Management and Personal Data Protection (DM and PDP) Standards published by the National Data Management Office (NDMO).

• Operational Excellence (OE) Assessment: Measures the entity’s progress in taking advantage of the National Data Platforms by assessing the entity’s automated processes and operations across 6 DM Domains so far.

The above NDI components will be measured and audited periodically using the NDI platform, the technological infrastructure for the assessments. The key NDI platform functionalities include:

• Assessment Execution: The NDI platform facilitates the automated execution of the measurements, ensuring standardized and efficient evaluation processes.

• Data Analysis: The NDI platform provides detailed reports on the results of the NDI components in an interactive and user-friendly manner.

The NDI platform is a tool designed to facilitate the NDI assessment for the entity. Furthermore, it covers a range of functionalities that provide real-time capabilities, enabling the entity to closely track its incremental progress towards attaining a high NDI score.

5. Scope
The NDI’s scope of work comprises practical auditing procedures to ensure the accuracy, integrity, and conformity of the targeted entities’ DM practices. In its first round, the NDI will cover the following components and DM Domains:

For the Maturity and Compliance components, the NDI scope covers the following 14
DM domains:

1. Data Governance (DG)
2. Data Catalog & Metadata Management (MCM)
3. Data Quality (DQ)
4. Data Operations (DO)
5. Document & Content Management (DCM)
6. Data Architecture & Modelling (DAM)
7. Data Sharing & Interoperability (DSI)
8. Reference & Master Data Management (RMD)
9. Business Intelligence & Analytics (BIA)
10. Data Value Realization (DVR)
11. Open Data (OD)
12. Freedom of Information (FOI)
13. Data Classification (DC)
14. Personal Data Protection (PDP)

For the OE component, the NDI scope covers the following 6 prioritized DM Domains,
as others shall be added gradually based on readiness:

1. Data Catalog & Metadata Management (MCM)
2. Data Quality (DQ)
3. Data Operations (DO)
4. Data Sharing & Interoperability (DSI)
5. Reference & Master Data Management (RMD)
6. Open Data (OD)

This comprehensive approach enhances the empowerment of government entities, mainly in relation to data-based initiatives, by supporting a culture of continuous improvement, risk mitigation, and effective Data Governance. The NDI framework helps provide a correct understanding of strengths and of the gaps to be filled, in order to improve DM practices and maximize the value of data assets.

6. The NDI Framework


The NDI Framework includes the index components, measurement methodologies,
reporting, audits and governance mechanisms as shown in the figure below:

[Figure: The National Data Index (NDI) Framework. Three stages: (1) Benchmarking, covering global data indexes & regulatory standards, legislations, themes, and shortlisted data indexes; (2) Index Design, covering the index components (DM Maturity Assessment Questionnaire, DM and PDP Standards Compliance Assessment, Operational Excellence (OE) Assessment), the measurement methodology, and the reporting design; (3) Outcomes, covering the Maturity Score (levels, band, status), the Compliance Score, and the OE Score.]

The NDI Framework is based on a systematic approach and methodology consisting of the following focal elements:

6.1. Benchmarking
Benchmarking was a strategic process followed to take into consideration the lessons learnt from global indices. The benchmarking outputs gave SDAIA valuable insight into the relative ranking of the methodologies and process efficiencies of these international data-related indices. This comparative analysis served as a crucial starting point for identifying the best practices implemented worldwide in data indices. Benchmarking was built on various sources, such as industry reports, surveys, case studies, best practices, themes, and dimensions, which were used to design the NDI.

6.2. Index Design


The design of the NDI covers the various DM dimensions in alignment with the “DM and
PDP Standards”, thus providing a progress roadmap for each government entity. As
shown in the figure below, the NDI was designed by performing several activities, mainly:
determining its components, formulating its measurement methodology, and designing
its reporting mechanism:

1. Index Components: The definitions of the three NDI components: the DM Maturity Assessment Questionnaire, the DM and PDP Standards Compliance Assessment, and the Operational Excellence (OE) Assessment.

2. Measurement Methodology: Each of the three NDI components has a different methodology used to measure each entity’s DM practices.

3. Reporting Design: The detailed scores and analysis of the three NDI components provide insights and recommendations for the participating government entities to facilitate continuous improvement.

The Index Components

The NDI has three components, which are defined below and described in detail in the appendix. They will be measured through a comprehensive evaluation of each entity's Data Management practices. The NDI components are:

DM Maturity Assessment Questionnaire

• This component measures the entities’ maturity in Data Management (DM).
• It is shared with the entities that will participate in the NDI measurement, so they can answer the questions and provide supporting evidence.
• It is managed through a dedicated measurement platform.
• It covers 14 DM domains published by the National Data Management Office (NDMO).
• The 14 DM domains have scoring weights assigned by SDAIA.

The DM and PDP Standards Compliance Assessment

• This component measures the entities’ Compliance with the 191 Specifications in the DM & PDP Standards published by the NDMO, all of which were mapped in the Maturity Questionnaire.
• The Compliance assessment results are obtained by answering the DM Maturity Questionnaire and completing the acceptance evidence.
• It covers 14 DM domains published by the NDMO.
• The 14 DM domains have scoring weights assigned by SDAIA.
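The domain weighting mentioned above can be illustrated with a short sketch. This is a hypothetical illustration only: the actual weights are assigned by SDAIA and are not published in this document, and the weighted-average aggregation rule shown here is our assumption, not the NDI's published formula.

```python
# Hypothetical sketch: SDAIA's actual weights and aggregation rule are not
# published in this document; a weighted average is assumed for illustration.
def weighted_domain_score(domain_scores: dict[str, float],
                          weights: dict[str, float]) -> float:
    """Combine per-domain scores into one score using per-domain weights."""
    total_weight = sum(weights[d] for d in domain_scores)
    return sum(domain_scores[d] * weights[d] for d in domain_scores) / total_weight

# Example with made-up domain scores and weights:
scores = {"DG": 3.0, "DQ": 4.0}
weights = {"DG": 2.0, "DQ": 1.0}
print(weighted_domain_score(scores, weights))  # (3*2 + 4*1) / 3, i.e. about 3.33
```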

The Operational Excellence (OE) Assessment

• This component measures the operational progress of entities’ processes that are based on the national platforms / technological solutions such as NDC, GSB, etc.
• It currently covers 6 of the DM domains published by the NDMO.

The Measurement Methodology

A measurement methodology is developed for each NDI component, as each one has a different calculation and a separate result used to measure its level for each entity:

• The Maturity Assessment: calculated using the Maturity questionnaire to measure the entity’s maturity in 14 of the DM domains.

• The Compliance Assessment: the Compliance score is deduced from the entity’s answers to the Maturity questionnaire; the result shows the extent of the entity’s commitment to implementing the 191 specifications of the DM domains published by the National Data Management Office (NDMO), namely the “Data Management and Personal Data Protection (DM and PDP) Standards”.

• The Operational Excellence (OE) Assessment: applicable to 6 of the DM domains. It measures how progressive the entity is in terms of its DM operational processes by maximizing the benefit from the National Data Platforms, such as:

 National Data Lake (NDL).
 Data Marketplace (DMP).
 National Data Catalog (NDC).
 Government Service Bus (GSB).
 Reference Data Management Platform (RDP).
 Open Data Portal (ODP).

Reporting Design

A special mechanism has been designed to generate the NDI reports, which present detailed results for each of the index components, and thorough details which allow the entity to understand its strengths, advancement opportunities, and areas of continuous improvement in its DM practices.

6.3. Outcomes
Based on the approved design of the NDI above, the NDI outcomes comprise three scores generated for each entity, described below:

1. The Score of the DM Maturity Assessment Questionnaire.

2. The Score of the DM and PDP Standards Compliance Assessment.

3. The Score of the Operational Excellence (OE) Assessment.

6.3.1. The Score of DM Maturity Assessment Questionnaire

The DM Maturity score is derived from the entity’s answers to the questionnaire. The Maturity assessment has 6 levels, ranging from Level 0 “Absence of Capabilities” to Level 5 “Pioneer”. The entity’s overall maturity level for its DM practices is determined from the percentage scale described below:

Level 0: Absence of Capabilities
- Maturity Score Band: 0 – 0.24; Maturity Percentage Scale: 0% – 4.9%.
- Maturity Level Description: Lack of fundamental data management capabilities, with no established practices.
- Maturity Question Coverage: No practices.
- Objective: None.

Level 1: Establishing
- Maturity Score Band: 0.25 – 1.24; Maturity Percentage Scale: 5% – 24.9%.
- Maturity Level Description: Basic data management practices are introduced, but they are not standardized.
- Maturity Question Coverage: Identification.
- Objective: Building awareness.

Level 2: Defined
- Maturity Score Band: 1.25 – 2.49; Maturity Percentage Scale: 25% – 49.9%.
- Maturity Level Description: Data management practices have been developed and formalized, ensuring consistency and reliability.
- Maturity Question Coverage: Development.
- Objective: Developing best practice capabilities.

Level 3: Activated
- Maturity Score Band: 2.5 – 3.99; Maturity Percentage Scale: 50% – 79.9%.
- Maturity Level Description: Institutional processes, scalable tools and initial stages of automation have been implemented to support data management.
- Maturity Question Coverage: Implementation or execution.
- Objective: Implementing and standardizing the developed practices.

Level 4: Managed
- Maturity Score Band: 4 – 4.74; Maturity Percentage Scale: 80% – 94.9%.
- Maturity Level Description: Centralized governance with end-to-end management. Key Performance Indicators (KPIs) and metrics are in place.
- Maturity Question Coverage: KPIs and metrics.
- Objective: Managing practices and measuring progress.

Level 5: Pioneer
- Maturity Score Band: 4.75 – 5; Maturity Percentage Scale: 95% – 100%.
- Maturity Level Description: Continuous improvement is enabled, with a strong focus on data innovation and a reputation as a benchmark for excellence in data management.
- Maturity Question Coverage: Continuous improvement.
- Objective: Innovation.
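The band boundaries above can be expressed as a small lookup. The sketch below is illustrative only (the function names are ours, not SDAIA's): it maps a maturity score on the 0–5 scale to its level using the score bands from the table, and notes that the percentage scale is simply the score as a share of the 5-point maximum.

```python
# Illustrative sketch (not SDAIA's implementation): map a DM Maturity score
# on the 0-5 scale to its maturity level using the bands in the table above.
def maturity_level(score: float) -> tuple[int, str]:
    if not 0 <= score <= 5:
        raise ValueError("maturity score must be in [0, 5]")
    bands = [
        (0.25, 0, "Absence of Capabilities"),  # 0 - 0.24
        (1.25, 1, "Establishing"),             # 0.25 - 1.24
        (2.50, 2, "Defined"),                  # 1.25 - 2.49
        (4.00, 3, "Activated"),                # 2.5 - 3.99
        (4.75, 4, "Managed"),                  # 4 - 4.74
    ]
    for upper, level, name in bands:
        if score < upper:
            return level, name
    return 5, "Pioneer"                        # 4.75 - 5

# The percentage scale is the score expressed as a share of the 5-point maximum:
def maturity_percentage(score: float) -> float:
    return score / 5 * 100  # e.g. 2.5 -> 50.0
```

Note that the two scales agree at every band boundary (0.25/5 = 5%, 1.25/5 = 25%, 2.5/5 = 50%, 4/5 = 80%, 4.75/5 = 95%).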
6.3.2. The Score of the DM and PDP Standards Compliance Assessment

The Compliance score is also derived from the entity’s answers to the questionnaire, but based on the acceptance evidence submitted. This evidence was mapped to the 191 specifications of the DM and PDP Standards. The NDI defines the following two binary levels of Compliance:

1. Compliant:
• The entity implemented all requirements of the specification to achieve compliance.
• An entity is “Compliant” with a specification if it is fully implemented as stated in the DM and PDP Standards (acceptance evidence to be provided where applicable).

2. Non-Compliant:
• The entity has not implemented all requirements of the specification; thus, compliance is not achieved.
• An entity is “Non-Compliant” with a specification if it is not fully implemented.
• Partial implementation of a specification is also considered “Non-Compliant”.
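The binary rule above can be sketched in a few lines. This is our own illustration (the function name is hypothetical): a specification is Compliant only when every one of its requirements is fully implemented; anything less, including partial implementation, is Non-Compliant.

```python
# Sketch of the binary compliance rule described above: full implementation
# of every requirement -> "Compliant"; anything less -> "Non-Compliant".
def compliance_status(requirements_fully_implemented: list[bool]) -> str:
    if requirements_fully_implemented and all(requirements_fully_implemented):
        return "Compliant"
    return "Non-Compliant"

print(compliance_status([True, True, True]))   # Compliant
print(compliance_status([True, False, True]))  # Non-Compliant (partial implementation)
```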
6.3.3. The Score of the Operational Excellence (OE) Assessment

The OE Assessment component consists of several metrics divided across 6 of the DM domains listed under the scope (i.e., MCM, DQ, DO, DSI, RMD, and OD; more shall be added subsequently). The metric scores use different measurement units (e.g., time, percentage, etc.). In order to normalize the measurement units, each metric output is mapped to a predefined scale that ranges from Level 0 “Unacceptable” to Level 5 “Leader”, as described below:

The Operational Excellence (OE) levels

Level 0: Unacceptable. The entity is unable to meet the minimum OE threshold.

Level 1: Low. The entity barely meets the minimum expected OE threshold. Most of the operations are ad-hoc or performed with minimal established Standard Operating Procedures (SOPs).

Level 2: Fair. The entity has started applying the foundational measures and marginally exceeds the minimum expected OE threshold in Data Management processes and operations.

Level 3: Good. The entity has a clear understanding of its objectives in the context of Data Management operations. The entity applies advanced tools and techniques to improve DM.

Level 4: Excellent. The entity has the ability to consistently exceed the expected OE threshold and Key Performance Indicators (KPIs). The entity has established a culture of continuous improvement.

Level 5: Leader. The entity has a strong focus on innovation and creativity, and acts as a front runner in adopting cutting-edge processes, techniques and methods.
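Because the raw metrics use different units, the normalization onto the 0–5 scale can be sketched as a threshold lookup. The cut points and function below are assumptions for illustration only; the actual per-metric scales are predefined within the NDI and are not published in this document.

```python
from bisect import bisect_right

# Illustrative sketch: map a raw OE metric value (in its own unit) onto the
# 0-5 level scale via five predefined cut points. The thresholds below are
# made up; the real per-metric scales are predefined in the NDI.
OE_LEVEL_NAMES = ["Unacceptable", "Low", "Fair", "Good", "Excellent", "Leader"]

def oe_level(value: float, cut_points: list[float],
             lower_is_better: bool = False) -> int:
    """Return the OE level (0-5) for a metric value.

    cut_points: the five boundaries separating the six levels, expressed
    in the metric's own unit (e.g. a percentage, or hours)."""
    if lower_is_better:  # e.g. a turnaround-time metric, where less is better
        value = -value
        cut_points = [-c for c in cut_points]
    return bisect_right(sorted(cut_points), value)

# A percentage metric where reaching 90% or more means "Leader":
cuts = [50, 60, 70, 80, 90]
level = oe_level(95, cuts)
print(level, OE_LEVEL_NAMES[level])  # 5 Leader
```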
7. Summary
In conclusion, SDAIA developed and launched the National Data Index (NDI) with its three complementary component assessments: Maturity, Compliance, and OE, in pursuit of improving DM practices and processes in government entities and maximizing the value of data as a national asset. This contributes to leveraging data in informed decision-making, optimizing operational efficiency, and achieving the national goal of becoming a global leader in data-driven economies. Furthermore, OE advances the entity’s progress in DM processes and operations with the help of the National Data Platforms.

8. Appendix
This section consists of a detailed description of each NDI component. Each component has a separate score based on different measurements, thus contributing to the NDI outcomes:

8.1. Appendix I – DM Maturity Assessment Questionnaire

This section provides a detailed questionnaire, which contains 42 questions aimed at evaluating entities’ Data Management practices across 14 domains. Each domain has specific questions with maturity levels ranging from Level 0 (Absence of Capabilities) to Level 5 (Pioneer).

To demonstrate commitment to Data Management practices, the entities must provide SDAIA with acceptance evidence for each maturity level. This includes a range of evidence, such as policies, reports, process documentation, implementation records, and other relevant artifacts. This will also help ensure compliance with the Data Management and Personal Data Protection (DM and PDP) Standards.

8.1.1. Data Governance Domain

Maturity Questions – Data Governance Domain

DG.MQ.1: Has the Entity established & implemented a Data Management & Personal Data Protection (DM & PDP) Strategy and a DM & PDP Plan with Key Performance Indicators (KPIs) that can be continuously measured to ensure optimization?

Level 0: Absence of Capabilities
- Level Description: No Data Management & Personal Data Protection (DM & PDP) strategy is in place.
- Acceptance Evidence: Not Applicable.

Level 1: Establishing
- Level Description: The Entity has existing DM practices that are not formalized (not approved).
- Acceptance Evidence: Existing data-related practices.

Level 2: Defined
- Level Description: A cross-Entity DM & PDP strategy is defined for all DM Domains. DM guiding principles have been developed.
- Acceptance Evidence: The approved DM & PDP Strategy (DG.1.1); the DM guiding principles (DG.1.2); the Data Strategy Approval Decision (DG.1.4).

Level 3: Activated (Related Specification: DG.1.3)
- Level Description: The Entity developed a DM & PDP plan based on the defined DM & PDP strategy. The Entity is implementing this DM & PDP plan, which covers all DM Domains.
- Acceptance Evidence: All level 2 acceptance evidence requirements, including: the developed DM & PDP implementation plan; the implementation status report.

Level 4: Managed
- Level Description: The Entity is monitoring the DM & PDP strategy & plan implementation across all DM Domains with the pre-defined KPIs.
- Acceptance Evidence: All level 3 acceptance evidence requirements, including: the monitoring report of the DM & PDP strategy & plan implementation with the pre-defined KPIs.

Level 5: Pioneer
- Level Description: The DM & PDP strategy is periodically updated, improved and optimized across all DM Domains. The DM & PDP implementation plan is periodically reviewed and updated based on both the changes to the DM & PDP strategy and the changing business requirements.
- Acceptance Evidence: All level 4 acceptance evidence requirements, including: the Continuous Improvement Report of the DM & PDP Plan (DG.6.1); the Continuous Improvement Report of the DM & PDP Strategy (DG.7.2).
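Since each level's acceptance evidence includes all lower levels' requirements, an entity's attained level for a question can be modeled as the highest consecutive level whose evidence is accepted. The sketch below is our own reading of that implied cumulative logic, not SDAIA's published scoring algorithm.

```python
# Sketch (assumed logic, not from the source): evidence is cumulative, so the
# attained level is the count of consecutive levels, starting at Level 1,
# whose acceptance evidence was provided. Level 0 requires no evidence.
def attained_level(evidence_accepted: list[bool]) -> int:
    """evidence_accepted[i] is True when all Level-(i+1) evidence is accepted."""
    level = 0
    for accepted in evidence_accepted:
        if not accepted:
            break
        level += 1
    return level

# Levels 1-2 evidenced, level 3 missing -> the entity stands at Level 2,
# even though some higher-level evidence exists:
print(attained_level([True, True, False, True, True]))  # 2
```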

Maturity Questions – Data Governance Domain

DG.MQ.2: Has the Entity established and implemented Data Management (DM) Policies, Standards and Guidelines across all Data Management (DM) Domains?

Level 0: Absence of Capabilities
- Level Description: No Data Management (DM) Policies, Standards or Guidelines are in place.
- Acceptance Evidence: Not Applicable.

Level 1: Establishing
- Level Description: The Entity has existing DM-related policies and standards that are not formalized. The Entity conducted a gap analysis to identify the required DM and PDP policies to be developed.
- Acceptance Evidence: The Data Management & Personal Data Protection (DM & PDP) Policies, Controls and Guidelines Gap Analysis document (DG.2.1).

Level 2: Defined
- Level Description: The Entity developed policies, standards and guidelines for all DM Domains.
- Acceptance Evidence: All level 1 acceptance evidence requirements, including: the developed DM and PDP policies, standards and guidelines covering all DM Domains as required (DG.2.2).

Level 3: Activated
- Level Description: The Entity is implementing the developed DM and PDP policies with defined processes and standards to establish governance, uniformity and manage compliance across the Entity.
- Acceptance Evidence: All level 2 acceptance evidence requirements, including: the implementation status report; a document proving the Entity's approval & adoption of the developed policies, standards & guidelines; the approved Compliance Management Framework (DG.5.1).

Level 4: Managed
- Level Description: The DM and PDP policies, standards and processes are tracked and measured based on the pre-defined KPIs. Periodic monitoring is conducted on the Entity's compliance with the Regulations published by NDMO-SDAIA.
- Acceptance Evidence: All level 3 acceptance evidence requirements, including: the monitoring reports for the developed policies, processes and standards with pre-defined KPIs (DG.7.1 (3 & 6)); the Entity's Compliance Audit Results Report (DG.5.2); the Compliance Monitoring Report (DG.5.3).

Level 5: Pioneer
- Level Description: The DM and PDP policies, standards, DG processes and compliance with regulations are continuously reviewed and optimized.
- Acceptance Evidence: All level 4 acceptance evidence requirements, including: the Continuous Improvement Report of the updated DM and PDP policies, standards and updated DG processes.

Maturity Questions – Data Governance Domain

DG.MQ.3: Has the Entity established and operationalized all roles required for the Data Management Organization as per the NDMO Controls & Specifications?

Level 0: Absence of Capabilities
- Level Description: No formal/approved Data Management Organization structure is in place.
- Acceptance Evidence: Not Applicable.

Level 1: Establishing
- Level Description: The Entity established a Data Management Office with one or two roles only.
- Acceptance Evidence: The Entity's Data Management Office establishment decision (DG.4.1); the Hiring / Appointment decisions of the following roles: (A) Chief Data Officer (CDO) (DG.4.3); (B) Data Management Officer / Data Governance Officer (DG.4.4).

Level 2: Defined
- Level Description: The Entity "Data Management & Data Governance Committee" (an internal executive committee) is established, with a total of six roles (out of the DM Office roles) assigned & operationalized for the Data Management Organization.
- Acceptance Evidence: All level 1 acceptance evidence requirements, including: the Entity Data Management & Data Governance Committee formation decision (DG.4.2); the Hiring / Appointment decisions of the following roles: (A) Compliance Officer (DG.4.6); (B) Business Data Executive (DG.4.8); (C) Legal Advisor (DG.4.11).

Level 3: Activated
- Level Description: All DM roles are established as per the DM and PDP Standards and fully adopted across the Entity. Data Stewardship / Ownership is defined across the Entity, in addition to providing business and technical support for critical Data systems & critical Data elements.
- Acceptance Evidence: All level 2 acceptance evidence requirements, including: the Hiring / Appointment decisions of the following roles: (A) Business Data Steward(s) (DG.4.9); (B) the list of the IT Data Stewards (DG.4.10); (C) Personal Data Protection (PDP) Officer (DG.4.7); (D) Open Data and Information Access Officer (DG.4.5); the documented & approved Data Management Organization structure; the documented Data Stewardship / Ownership structure.

Level 4: Managed
- Level Description: The Data Management Organization & its structure are fully established and operationalized. The roles are tracked through pre-defined KPIs, periodically updated and optimized.
- Acceptance Evidence: All level 3 acceptance evidence requirements, including: the monitoring reports for the Entity's Data Management Organization roles with pre-defined KPIs (DG.7.1 (KPI 1)).

Level 5: Pioneer
- Level Description: The Data Management Organization (& its structure) and Data Stewardship (& its structure) (Data owners, business and technical stewards) are periodically reviewed and updated.
- Acceptance Evidence: All level 4 acceptance evidence requirements, including: the Continuous Improvement Report for the DM Organization and Data Stewardship.

Maturity Questions – Data Governance Domain

DG.MQ.4: Has the Entity established and implemented practices for Change Management including awareness, communication, change control, and capability development?

Level 0: Absence of Capabilities
- Level Description: No formal Change Management practices are in place.
- Acceptance Evidence: Not Applicable.

Level 1: Establishing
- Level Description: Awareness sessions, training courses and communication on DM-related practices are performed on a reactive and ad-hoc basis. The Entity has change control practices that are not formalized.
- Acceptance Evidence: Evidence of communication on DM-related practices; evidence of training and awareness sessions conducted.

Level 2: Defined
- Level Description: The Entity has defined Change Management practices. There is an Entity-wide awareness, training and communication plan. Change control practices are established to ensure coordinated efforts in managing changes on Data systems (implemented within the Entity).
- Acceptance Evidence: All level 1 acceptance evidence requirements, including: the Change Management Plan covering all DM Domains, including: (A) the DM & PDP Training Plan for all DM Domains; (B) the DM Communication plan; (C) the Stakeholders engagement plan; (D) the Change Control plan for Data system changes.

Level 3: Activated
- Level Description: The Entity is implementing the defined Change Management practices for all DM Domains. Formal communication channels are established and adopted for DM issues management, escalations, resolutions & approvals. A version control mechanism is defined and implemented for DM documents and artifacts that the Entity published. The DM & PDP Strategy is being socialized and awareness is conducted across the Entity.
- Acceptance Evidence: All level 2 acceptance evidence requirements, including: the Change Management Implementation Status Report showing the DM & PDP Training activities (DG.3.1); the Change Management Implementation Status Report showing the DM Communication activities (DG.6.2); the Stakeholders Engagement and Socialization plan implementation status report (DG.1.4); the Data Governance Approvals Register (DG.8.1); the Data Management Issue Tracking Register (DG.8.2); evidence that the Entity has DM & DG document & artifact version control practices (DG.8.3).

Level 4: Managed
- Level Description: The Entity is monitoring the progress and the effectiveness of the Change Management practices with pre-defined KPIs.
- Acceptance Evidence: All level 3 acceptance evidence requirements, including: the monitoring report of the Change Management practices with pre-defined KPIs (DG.7.1 (KPIs 2, 4, 5, 7 & 8)).

Level 5: Pioneer
- Level Description: Change Management practices for all DM Domains are periodically reviewed, and continuously updated, optimized & developed.
- Acceptance Evidence: All level 4 acceptance evidence requirements, including: the Continuous Improvement Report of the Change Management Practices for all DM Domains.

8.1.2. Metadata and Data Catalog Domain

Maturity Questions – Metadata and Data Catalog Domain

MCM.MQ.1 Has the Entity developed and implemented a plan to integrate and manage Metadata across the Entity?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities

- No plan is in place to integrate and manage Metadata in the Entity.

Acceptance Evidence: Not Applicable.

Level 1: Establishing

- The Entity manages its Metadata on a reactive and ad hoc basis.

Acceptance Evidence: The Recorded or Documented Metadata Report.

Level 2: Defined

- The Entity developed a plan, an operating model and a structure for the implementation of Metadata and Data Catalog.

- The Metadata management structure should include the requirements to capture, integrate, populate, publish and utilize the required technology solution for the Entity's Metadata (business and technical).

- A roadmap is defined with a list of activities, required resources, and a budget to manage the Data Catalog and Metadata implementation.

Acceptance Evidence:

- The Approved Metadata and Data Catalog Plan. [MCM.1.1]

- The Approved Metadata Structure and Framework. [MCM.4.3]

Level 3: Activated

- The Entity is implementing the defined Metadata and Data Catalog plan and framework.

- This implementation includes the identification and prioritization of the critical systems and attributes.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:

- The Metadata and Data Catalog Plan Implementation Status Report.

- The Metadata Management Framework Implementation Status Report.


Level 4: Managed

- The Entity is monitoring the effectiveness of the Metadata and Data Catalog plan and activities with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: All level 3 acceptance evidence requirements, including:

- The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) for the Metadata and Data Catalog Plan and Activities.

Level 5: Pioneer

- The Metadata and Data Catalog plan and activities are regularly monitored for continuous improvement and optimization.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:

- The Continuous Improvement Report on the Metadata and Data Catalog Plan.

Maturity Questions – Metadata and Data Catalog Domain

MCM.MQ.2 Has the Entity implemented a Metadata Management and Data Catalog tool / solution?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities

- No tools are in place to manage the Entity's Metadata.

Acceptance Evidence: Not Applicable.

Level 1: Establishing

- The Entity manages its Metadata manually or in standalone applications without standardized tools.

Acceptance Evidence: Evidence of Existing Metadata.

Level 2: Defined

- The Entity documented its requirements for the Data Catalog tool / solution and selected an appropriate tool based on business and regulatory requirements.

- The Entity's Data sources to be included in the Data Catalog tool have been identified and prioritized.

- The Entity developed a target Metadata architecture.

Acceptance Evidence:

- The Selected Data Catalog Tool.

- The Approved and Prioritized Data Sources Report. [MCM.1.2]

- The Developed and Approved Target Metadata Architecture. [MCM.1.3]

- The Data Catalog Tool Implementation Requirements Report.

- The Approved and Developed Data Catalog Training Plan.

Level 3: Activated

- The Data Catalog tool has been implemented and populated with the Entity's Metadata.

- The tool is integrated into the Entity's Metadata repository with appropriate access permissions.

- The tool stores the activities and tracking logs for audit trails.

- Power users are identified and trained on the Data Catalog tool usage.

- A training and communication plan is rolled out to raise awareness and adoption of the Data Catalog tool.

- The Data Catalog automated tool is regularly updated to the latest version.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:

- Evidence of the Implemented Data Catalog Tool. [MCM.5.1]

- Data Access Approval Process Documentation (Authorization to Connect the Data Catalog with the Data Sources). [MCM.2.1]

- Metadata Access Approval Process Documentation. [MCM.2.2]

- Evidence of Data Catalog Adoption and Usage Including Metadata Populated on the Tool. [MCM.3.2]

- The Regular Audits Report on the Data Catalog Usage. [MCM.5.3]

- Evidence of Training Conducted for the Identified Data Catalog Users. [MCM.3.1]

- Tool Versioning Report. [MCM.5.4]


Level 4: Managed

- The Entity is monitoring changes to its Metadata through automated notifications.

- The adoption and usage of the implemented Data Catalog tool is closely monitored based on a set of pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: All level 3 acceptance evidence requirements, including:

- The Monitoring Report with Pre-defined KPIs for the Adoption and Usage of the Metadata & Data Catalog Solution / Tool. [MCM.6.1]

- The Approved List of Pre-defined KPIs for the Quality of the Metadata which is Populated on the Data Catalog Tool.

Level 5: Pioneer

- The Entity has fully automated end-to-end Metadata (Business, Technical, and Operational) capture and exchange.

- The Entity continuously monitors and optimizes the Data Catalog.

- The Entity continuously updates the Data Catalog in the event of any changes specified by the National Data Bank (NDB).

Acceptance Evidence: All level 4 acceptance evidence requirements, including:

- The Continuous Improvement Report on the Metadata and Data Catalog Tool's quality.

- The Metadata Management Automation Report.
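To make the capture-and-populate requirement above concrete, the following is an illustrative sketch only: the NDI does not prescribe any tool, and the SQLite source, the catalog-entry fields and the function name here are hypothetical. It shows how a catalog tool might harvest technical Metadata (tables, columns, mandatory flags) from a prioritized Data source, with a timestamp supporting the audit-trail requirement:

```python
import sqlite3
from datetime import datetime, timezone

def harvest_technical_metadata(conn):
    """Collect table and column metadata from a SQLite source for catalog population.

    Returns a list of catalog entries; the entry shape is a hypothetical
    example, not an NDI-mandated schema.
    """
    entries = []
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, default, pk)
        columns = conn.execute(f"PRAGMA table_info({table})").fetchall()
        entries.append({
            "table": table,
            "columns": [
                {"name": c[1], "type": c[2], "mandatory": bool(c[3])}
                for c in columns
            ],
            # Audit-trail field supporting the tracking-log requirement above.
            "harvested_at": datetime.now(timezone.utc).isoformat(),
        })
    return entries

# Usage: harvest from an in-memory demo source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE citizen (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
catalog = harvest_technical_metadata(conn)
print(catalog[0]["table"], [c["name"] for c in catalog[0]["columns"]])
```

In a real deployment this harvest step would run per prioritized source, and each entry would flow into the catalog with the access-approval controls listed in the evidence above.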

Maturity Questions – Metadata and Data Catalog Domain

MCM.MQ.3 Has the Entity defined and implemented formal processes for effective Metadata Management, such as: prioritization, population, access management, and quality issue management, supported & fostered by collaboration across the Entity?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities

- No process is in place to manage Metadata.

Acceptance Evidence: Not Applicable.

Level 1: Establishing

- The Entity manages the Metadata on a reactive and ad hoc basis (e.g. projects / initiatives) without formalized practices.

Acceptance Evidence: Evidence of the Existing Processes Used to Manage Metadata.


Level 2: Defined

- The Entity defined processes to manage the Metadata across the business and technical applications.

- The Entity developed processes to prioritize, populate and manage the Metadata, including access management, annotation, certification, and Metadata quality, by collaboration across the Entity.

Acceptance Evidence:

- Metadata Identification Process Report.

- Metadata Prioritization Process Report.

- Metadata Population Process Report. [MCM.4.2]

- Metadata Update Process Report. [MCM.4.4]

- Metadata Quality Process Report. [MCM.4.5]

- Metadata Annotation Process Report. [MCM.4.6]

- Metadata Certification Process Report. [MCM.4.7]

Level 3: Activated

- The Entity is implementing the defined processes for end-to-end Metadata management.

- The established processes for updating the Metadata are implemented and automated as workflows within the Data Catalog tool.

- The integrated Metamodel is deployed with role-based access.

- The Metadata quality issues are identified and addressed.

- Full end-to-end Metadata is collected and enriched.

- All the Data Governance and Data Management teams, along with business stakeholders, are engaged collaboratively through the automated tool.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:

- Evidence of the Implementation and Adoption of the Approved Processes as Workflows in the Entity's Data Catalog.

- The Logs or the List of Notifications on the Metadata Changes. [MCM.5.2]

- Evidence of Communications to the Data Catalog Users of any Metadata Update.

- The Metadata Stewardship Coverage Model. [MCM.4.1]

Level 4: Managed

- The Entity is monitoring and tracking the Metadata processes and activities with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: All level 3 acceptance evidence requirements, including:

- The Metadata Processes Monitoring Report with Pre-defined KPIs.

- The Metadata Quality Monitoring Report with Pre-defined KPIs. [MCM.6.2]

Level 5: Pioneer

- The Entity's Metadata management processes and practices are regularly monitored for continuous improvement and optimization.

- Optimization techniques are being utilized to improve the processes of developing taxonomies, ontologies, or semantic representations.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:

- The Continuous Improvement Report on the Metadata Management Practices.

8.1.3. Data Quality Domain

Maturity Questions – Data Quality Domain

DQ.MQ.1 Has the Entity developed and implemented a Data Quality (DQ) plan focused on improving the quality of the Entity's Data?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities

- No formal Data Quality (DQ) plan is in place.

Acceptance Evidence: Not Applicable.

Level 1: Establishing

- The Entity implements Data Quality activities on a reactive or ad hoc basis.

Acceptance Evidence: Evidence of the Existing Data Quality (DQ) Related Activities.

Level 2: Defined

- The Entity has a defined and approved DQ management plan.

- A roadmap is defined with a list of activities, required resources, and a budget to manage the DQ implementation.

Acceptance Evidence: The Defined and Approved DQ Implementation Plan. [DQ.1.2]

Level 3: Activated

- The Entity is implementing the defined DQ plan and roadmap.

- Clear roles and responsibilities are defined for the DQ activities in line with the Data Management Organization structure.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:

- A Report on the DQ Plan Implementation Status.

- A Report on the Defined DQ Roles & Responsibilities.

- A Report on the Assigned Resources for the DQ Plan.

Level 4: Managed

- The Entity is monitoring the implementation of the DQ plan and activities with the pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: All level 3 acceptance evidence requirements, including:

- The Monitoring Report of the DQ Plan and Activities with Pre-defined Key Performance Indicators (KPIs).


Level 5: Pioneer

- The DQ plan and activities are regularly monitored for optimization and continuous improvement.

- DQ Management is embedded in the Data Lifecycle and the Software Development Lifecycle (SDLC).

Acceptance Evidence: All level 4 acceptance evidence requirements, including:

- The Continuous Improvement Report of the Data Quality Plan & Activities.

Maturity Questions – Data Quality Domain

DQ.MQ.2 Has the Entity established / developed and implemented practices to manage and improve the quality of the Entity's Data?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities

- No practices are in place for Data Quality (DQ).

Acceptance Evidence: Not Applicable.

Level 1: Establishing

- The Entity has existing DQ-related practices that are not formalized.

- The identification of Data issues and the resolution activities are done on a reactive or ad hoc basis.

Acceptance Evidence:

- The Existing DQ Domain Initiatives.

- The Existing Processes for Detecting DQ Issues.

- The Existing Processes Used for Data Corrections or Data Validations.

Level 2: Defined

- The Entity prioritized Data elements based on the business requirements from the perspective of their importance for DQ Management.

- The Entity defined DQ Dimensions for the prioritized Datasets.

- The Entity defined and documented DQ Rules including formats, mandatory fields, and validations for Data-entry values.

- The Entity established DQ issue management processes to identify and resolve the DQ issues.

Acceptance Evidence:

- The Prioritized List of Data Elements. [DQ.1.1]

- The Defined DQ Dimensions for the Entity's Datasets. [DQ.2.1]

- The Data Quality Rules Report. [DQ.2.1]

- The Developed & Approved Processes for DQ Issue Management & Remediation. [DQ.2.3]

Level 3: Activated

- The Entity is implementing DQ Management.

- An initial DQ assessment was conducted to identify DQ issues. Periodic DQ assessments are established to ensure the improvement of the Entity's DQ.

- Data issues were identified, a root cause analysis was conducted, and remediation activities were developed accordingly to correct the identified Data issues.

- Standardized DQ tools are implemented with automated workflows and defined DQ Rules.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:

- The Planned & Conducted Initial & Periodic Data Quality Assessment Report. [DQ.1.3]

- The Resolution Status Report of the Identified DQ Issues. [DQ.2.3]

- Evidence of DQ Tools Used for Automating DQ Issue Management Workflows. [DQ.2.5]

- The Defined & Implemented DQ Service Level Agreements (SLAs). [DQ.2.4]

- The List of DQ System Enforcements Adopted by the Entity.

Level 4: Managed

- The Entity is monitoring the effectiveness of the DQ management practices with pre-defined KPIs.

- The Entity is monitoring the agreed-upon DQ threshold values with pre-defined SLAs.

- The Entity is monitoring the effectiveness of the issue resolution process.

Acceptance Evidence: All level 3 acceptance evidence requirements, including:

- The Monitoring Report of the DQ Management Practices with Pre-defined KPIs.

- The Monitoring Report of the DQ Threshold Values.

- The Monitoring Report of the DQ Issue Resolution Process. [DQ.3.2]


Level 5: Pioneer

- Proactive profiling and critical Dataset identification are done using advanced Artificial Intelligence / Machine Learning (AI/ML) based methods.

- The DQ tools are continuously optimized to improve the DQ management processes.

- Proactive remediation is done with proper monitoring of critical Data systems, Data entities (Data objects) and Data attributes.

- The Entity implemented DQ Management industry standards or developed Entity-specific standards.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:

- The Continuous Improvement Report on the Tools Used for DQ.

- The Continuous Improvement Report on the DQ Management Practices.

- The Implemented / Adopted DQ Industry Standards.
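The Level 2 requirement above (documented DQ Rules covering formats, mandatory fields, and validations for Data-entry values) lends itself to a declarative rule set. The sketch below is illustrative only: the rule names, fields, and patterns are hypothetical examples, not NDI-mandated standards.

```python
import re

# Illustrative DQ rules for a hypothetical "citizen" dataset.
DQ_RULES = [
    {"field": "national_id", "rule": "mandatory"},
    {"field": "national_id", "rule": "format", "pattern": r"^\d{10}$"},
    {"field": "email", "rule": "format", "pattern": r"^[^@\s]+@[^@\s]+\.[^@\s]+$"},
]

def check_record(record):
    """Return a list of DQ issues for one record, per the defined rules."""
    issues = []
    for r in DQ_RULES:
        value = record.get(r["field"])
        if r["rule"] == "mandatory" and value in (None, ""):
            issues.append((r["field"], "missing mandatory value"))
        elif r["rule"] == "format" and value and not re.match(r["pattern"], str(value)):
            issues.append((r["field"], "format violation"))
    return issues

# Usage: one clean record, one with a format issue.
print(check_record({"national_id": "1234567890", "email": "a@b.sa"}))  # []
print(check_record({"national_id": "12AB", "email": ""}))
```

Issues produced this way would feed the DQ issue management and remediation process listed in the Level 2 evidence.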

Maturity Questions – Data Quality Domain

DQ.MQ.3 Has the Entity established and implemented practices to monitor and report the Entity's Data Quality (DQ) status?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities

- No practices are in place to monitor or report on the quality of Data.

Acceptance Evidence: Not Applicable.

Level 1: Establishing

- The Entity monitors its DQ on a reactive or ad hoc basis.

Acceptance Evidence:

- The Existing DQ Monitoring Practices.

- Evidence of the Entity's Current DQ Status.


Level 2: Defined

- The Entity defined and formalized DQ monitoring and reporting practices to maintain and document the DQ activities on a regular basis.

- The Entity developed DQ processes and a plan for conducting DQ audits at defined checkpoints.

Acceptance Evidence:

- The Defined and Formalized DQ Monitoring Plan. [DQ.2.2]

- The Defined DQ Checkpoints Report.

Level 3: Activated

- The Entity is monitoring the quality of its Data and documenting the quality status on a regular basis.

- DQ reviews are conducted at the defined DQ checkpoints.

- A workflow for reporting DQ issues has been implemented on the Data Catalog automated tool.

- The DQ Rules and the DQ monitoring results are documented as Metadata within the Data Catalog automated tool.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:

- DQ Scorecards or Dashboards. [DQ.2.2]

- A Report on the Data Quality Metadata Logged on the Data Catalog Tool. [DQ.4.3]

- Evidence of a Data Quality Support Process Implemented as a Workflow. [DQ.4.2]

- The Results of the DQ Checkpoint Reviews. [DQ.4.1]

Level 4: Managed

- The Entity monitors the DQ trends and reporting activities based on pre-defined KPIs and established targets.

Acceptance Evidence: All level 3 acceptance evidence requirements, including:

- Trends from the DQ Monitoring Activities with Pre-defined KPIs. [DQ.3.1]

Level 5: Pioneer

- The Entity conducts periodic reviews of the monitoring and reporting practices for continuous improvement in line with the changes in business and technical requirements.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:

- A Continuous Improvement Report on the DQ Monitoring and Reporting Practices.
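The DQ scorecards and dashboards cited as Level 3 evidence typically aggregate scores per DQ Dimension. As a minimal sketch only (completeness is just one dimension; field names and the toy data are hypothetical, not NDI-prescribed), a per-field completeness score might be computed as:

```python
def completeness_scorecard(records, fields):
    """Compute per-field completeness (% of non-empty values) for a DQ scorecard.

    A minimal sketch: real scorecards cover more dimensions (validity,
    uniqueness, timeliness) and are tracked against agreed thresholds.
    """
    total = len(records)
    scorecard = {}
    for f in fields:
        filled = sum(1 for r in records if r.get(f) not in (None, ""))
        scorecard[f] = round(100.0 * filled / total, 1) if total else 0.0
    return scorecard

# Usage with a toy dataset (field names are hypothetical).
records = [
    {"national_id": "1234567890", "city": "Riyadh"},
    {"national_id": "", "city": "Jeddah"},
    {"national_id": "0987654321", "city": None},
    {"national_id": "1111111111", "city": "Dammam"},
]
print(completeness_scorecard(records, ["national_id", "city"]))
# → {'national_id': 75.0, 'city': 75.0}
```

Scores like these, computed at the defined DQ checkpoints, would populate the dashboards and feed the KPI trend monitoring described at Level 4.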

Maturity Questions – Data Quality Domain

DQ.MQ.4 Has the Entity developed Data Quality (DQ) standards, provided definitions for its Datasets, and published / uploaded the definitions on the National Data Catalog (NDC)?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities

- No standards or definitions are in place for the Entity's Datasets.

Acceptance Evidence: Not Applicable.

Level 1: Establishing

- The Entity defines its Datasets and DQ standards on an ad hoc / project basis.

Acceptance Evidence: The Existing List of Data Standards and Data Definitions.

Level 2: Defined

- The Entity has Data definitions and standards for a set of domains / attributes in line with the DQ Dimensions.

- The Entity identified and provided Metadata definitions for all Data attributes in line with the DQ definitions.

- The Entity identified the Datasets for which Metadata should be published on the National Data Catalog (NDC).

Acceptance Evidence:

- The Developed Data Standards for Data Elements.

- The Identified Metadata with their Definitions.

- The Identified Datasets to be Published on the National Data Catalog (NDC).

Level 3: Activated

- The Entity has been onboarded on the NDC and populated the Metadata definitions on the NDC.

- The Entity is developing and implementing standards independently for the Entity-specific domains / attributes.

- The Entity uploads the Metadata on the NDC as required.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:

- A Report about the Data Standards & the Data Definitions which the Entity Uploaded on the NDC.

- The Entity-Specific List of Applied Data Definitions and Applied Data Standards.


Level 4: Managed

- The Entity's Data definitions and Data standardization processes are reviewed and tracked through pre-defined KPIs.

Acceptance Evidence: All level 3 acceptance evidence requirements, including:

- A Monitoring Report for the Data Definitions and Data Standardization with Pre-defined KPIs.

Level 5: Pioneer

- The standards are developed proactively for all new and future systems, and are continuously improved across the Entity's Data systems.

- Regular assessments are conducted to ensure that the standards and definitions are provided and implemented for the Entity's Datasets.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:

- A Continuous Improvement Report for Optimizing the Data Standards and Definitions within the Entity and on the NDC.

8.1.4. Data Operations Domain

Maturity Questions – Data Operations Domain

DO.MQ.1 Has the Entity developed and implemented a plan to manage and satisfy the needs of Data Operations, Data storage and Data retention?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities

- No plan is in place for Data Operations (DO), Data storage, Data retention, or disaster recovery (DR).

Acceptance Evidence: Not Applicable.

Level 1: Establishing

- The Entity has existing practices for Data Operations, storage and retention; however, these are set up reactively on an ad hoc basis.

Acceptance Evidence: The Approved Initial Data Operations and Storage Plan.

Level 2: Defined

- The Entity has a defined and approved Data Operations (DO) and storage plan and roadmap including:

• Data storage forecasting.

• Budget and resource allocation.

• Information systems prioritization.

- This DO plan is in line with:

• Policy development.

• Tool selection.

• Business continuity.

• Plans for backup and disaster recovery.

Acceptance Evidence:

- The Developed and Approved Data Operations and Storage Plan. [DO.1.1]

- The Information Systems Priority List. [DO.1.3]

- The Developed and Approved Policies for Data Operations, Storage, Retention and Business Continuity. [DO.2.1]

- The Periodic Forecasting Plan for Storage Capacity. [DO.1.2]

- The Database Technology Evaluation Process and Selection Criteria. [DO.1.4]

Level 3: Activated

- The Entity is implementing its defined Data Operations (DO) plan.

- The Entity assigned & activated the roles and responsibilities to manage the DO, storage and retention in line with the business requirements and the overall "Data Management Organization" structure.

- The Entity is reporting on the Database performance.

- The Entity is conducting periodic forecasts of its storage capacity.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:

- The Data Operations and Storage Plan with implementation status report.

- The Storage Trend Forecast Document. [DO.1.2]

- The Database Technology Evaluation Report. [DO.1.4]

- The Document on the Entity's Application Performance Assessment.

- The Entity's Applications Development Roadmap with the Status Report on the Implementations.

- The Budget Estimations of the Procurement Transactions of the Future Storage Needs.

- The Data Operations Orchestration Document.

Level 4: Managed

- The Entity is monitoring the progress and effectiveness of the Data Operations (DO) plan implementation with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: All level 3 acceptance evidence requirements, including:

- The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) for the Implementation Progress and Effectiveness of the Data Operations Plan. [DO.5.1]


Level 5: Pioneer

- The Entity's Data Operations (DO), storage, retention and recovery practices are continuously reviewed and optimized.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:

- The Continuous Improvement Report on the Practices of Data Operations, Storage, and Retention.

Maturity Questions – Data Operations Domain

DO.MQ.2 Does the Entity have in place a defined methodology, processes and Standard Operating Procedures (SOPs) for Database operations?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities

- No practices or processes are in place for Database management and operations.

Acceptance Evidence: Not Applicable.

Level 1: Establishing

- The Entity conducts Data Operations, but not in a standardized approach.

- Operations are being managed without clear procedures or with minimal documented procedures.

Acceptance Evidence: Evidence of Data Operations Done within the Entity.

Level 2: Defined

- The Entity has defined and developed practices and processes for Database operations management, monitoring and storage configurations, including access control processes for the Databases.

- All roles and support levels are identified to manage the Database operations.

- Processes and Standard Operating Procedures (SOPs) are documented with clearly defined roles and responsibilities across the operations and the IT teams.

- The Database Management System (DBMS) / tool is regularly updated to the latest version.

- Service Level Agreements (SLAs) are developed to guide the Database operations.

Acceptance Evidence:

- The Process Documentation of the Detailed Practices of Data Operations, including:

A. Database Monitoring. [DO.3.1]

B. Database Access Control. [DO.3.2]

C. Storage Configuration Management. [DO.3.3]

D. DBMS Versioning Mechanism. [DO.3.4]

E. The Database Performance Service Level Agreements and Operational Level Agreements (SLAs & OLAs).

Level 3: Activated

- The Entity is implementing the defined practices and processes of Database operations management and storage configurations.

- Under these processes, the Entity's Standard Operating Procedures (SOPs) include:

• Database performance monitoring and reporting.

• Role-based access control.

• Storage configuration management and DBMS versioning (implementation of Database changes on the production environments).

- The Standard Operating Procedures (SOPs) are activated:

• For all support levels, communication, escalation and resolution.

• With all the involved stakeholders.

- All required Data Operations (DO) roles are activated and onboarded.

- The Service Level Agreements (SLAs) and the Operational Level Agreements (OLAs) are enforced as agreed upon with business stakeholders.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:

- The Database Monitoring Status Report. [DO.3.1]

- The Data Operations Operating Model.

- Evidence of Agreements (SLAs and OLAs). [DO.3.5]

Level 4: Managed

- The Entity is monitoring the DO processes and SOPs with pre-defined Key Performance Indicators (KPIs) and thresholds.

Acceptance Evidence: All level 3 acceptance evidence requirements, including:

- The Monitoring Report with Pre-defined KPIs for Database Operations Management.


Level 5: Pioneer

- DO and storage practices and processes are continuously reviewed and optimized.

- Leading DO practices are evaluated on a regular basis and adopted (e.g., DataOps).

- All operational metrics covering SLAs / OLAs and Key Performance Indicators (KPIs) / Key Quality Indicators (KQIs) are reviewed proactively to improve and optimize the operations.

- Proactive / preventive measures are planned and executed to ensure uninterrupted operations.

- All learning scenarios and experience cases are documented and updated regularly to create a knowledgebase for reusability and future planning.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:

- The Continuous Improvement Report on the Implemented Practices and Processes of Data Operations and Storage.

- The Knowledgebase Document.
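The SLA / OLA enforcement and threshold-based monitoring described above reduce, in practice, to comparing observed operational metrics against agreed limits and escalating breaches. The sketch below is illustrative only: the metric names and threshold values are hypothetical, not values prescribed by the NDI or any SLA.

```python
# Hypothetical SLA thresholds for Database operations monitoring.
SLA_THRESHOLDS = {
    "query_p95_ms": 500.0,      # 95th-percentile query latency
    "replication_lag_s": 30.0,  # maximum tolerated replication lag
    "storage_used_pct": 85.0,   # storage capacity alert level
}

def evaluate_sla(samples):
    """Compare observed metric samples to thresholds; return breaches.

    Each breach is (metric, observed, threshold), ready to feed an
    escalation workflow as described in the SOPs above.
    """
    return [
        (metric, value, SLA_THRESHOLDS[metric])
        for metric, value in samples.items()
        if metric in SLA_THRESHOLDS and value > SLA_THRESHOLDS[metric]
    ]

# Usage: one healthy sample set, one breaching latency and storage limits.
print(evaluate_sla({"query_p95_ms": 210.0, "replication_lag_s": 2.5}))  # []
print(evaluate_sla({"query_p95_ms": 900.0, "storage_used_pct": 91.0}))
```

Breaches collected this way would be logged for the monitoring report and routed through the support-level escalation and resolution procedure.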

Maturity Questions – Data Operations Domain

DO.MQ.3 Does the Entity have in place practices and processes for Business Continuity, such as backup and disaster recovery (DR), and a defined Business Continuity Plan (BCP) for the Data?

Level Name | Level Description | Acceptance Evidence | Related Specification

Level 0: Absence of Capabilities

- No plans, practices or processes are in place for business continuity, such as backups and disaster recovery.

Acceptance Evidence: Not Applicable.

Level 1: Establishing

- The Entity's Data backup and disaster recovery practices are reactive and on an ad hoc basis.

Acceptance Evidence: Evidence of Data Storage Backup Instruction Documents.

Level 2: Defined

- The Entity has defined and developed plans, practices and processes for business continuity, Data backup, and disaster recovery in line with the DM and PDP standards.

- The Recovery Time Objective (RTO) and the Recovery Point Objective (RPO) for the critical systems are defined (e.g., backup frequency, location of backup files, storage medium and scope).

Acceptance Evidence:

- The Developed Processes and Practices for Data Storage Backups and Recovery. [DO.4.1]

- The Business Continuity Plan (BCP) and the Disaster Recovery (DR) Plan and Processes. [DO.4.2]

- A Document of Actions Required to Implement Database Changes and Rollbacks.

Level 3: Activated

- The Entity is implementing its defined Business Continuity Plan (BCP), practices and processes for business continuity, Data backup and recovery.

- The BCP is implemented for the critical systems.

- Role-based access control for production Data and all RTO / RPO details (high availability, load balancing, etc.) are agreed upon with Business and IT stakeholders and implemented.

- The incident response teams are activated.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:

- The Technical Design Document.

- The BCP Run Report.

- The Change Request for Production Data.

- The Production Data Access Control Document. [DO.4.3]

Level 4: Managed

- The Entity is monitoring the practices pertaining to the BCP and DR with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: All level 3 acceptance evidence requirements, including:

- The Monitoring Report with Pre-defined KPIs for Business Continuity (BCP) and Disaster Recovery (DR).

Level 5: - The Entity continuously reviews Data storage backup & recovery All level 4 acceptance Evidence
Pioneer and disaster recovery processes for improvement and requirements, Including:
optimization.
- The Continuous Improvement
- The Entity performs periodic executions and validations of the Report on the Business Continuity
RTO and RPO to ensure that the systems are well prepared for Plan (BCP).
any unexpected interruption or calamity.

- The BCP details are regularly reviewed to incorporate the


learnings, modern trends and business needs / demands.
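The periodic RTO / RPO validation described at Level 5 can be scripted. The sketch below is illustrative only — the objective values, function name, and violation messages are assumptions, not part of the NDI:

```python
from datetime import datetime, timedelta

# Illustrative objectives; real values come from the Entity's BCP (assumption).
RPO = timedelta(hours=4)  # maximum tolerable data loss
RTO = timedelta(hours=2)  # maximum tolerable restore time

def validate_backup(last_backup: datetime, restore_drill_duration: timedelta,
                    now: datetime) -> list[str]:
    """Return the list of BCP violations for one critical system."""
    violations = []
    if now - last_backup > RPO:
        violations.append("RPO breached: latest backup is too old")
    if restore_drill_duration > RTO:
        violations.append("RTO breached: restore drill exceeded the objective")
    return violations

now = datetime(2023, 10, 1, 12, 0)
print(validate_backup(now - timedelta(hours=6), timedelta(hours=1), now))
# -> ['RPO breached: latest backup is too old']
```

Running such a check on a schedule, per critical system, gives the evidence trail the Level 4 KPIs ask for.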

8.1.5. Document and Content Management Domain

Maturity Questions – Document and Content Management Domain

DCM.MQ.1: Has the Entity developed a Document and Content Management (DCM) plan and a Digitization plan to manage the implementation of paperless management activities?

Level 0: Absence of Capabilities
Level Description:
- The Entity doesn't have any plan in place currently for Document and Content Management (DCM).
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity manages documents and content on a reactive or ad hoc basis without formalized DCM practices.
Acceptance Evidence:
- Evidence of Existing DCM Activities.

Level 2: Defined
Level Description:
- The Entity has defined plans to manage the Entity's documents and content lifecycle and digitization activities. A roadmap has been defined with a list of activities, required resources, trainings, awareness and a budget to manage both DCM and Digitization.
- The Entity has defined processes for prioritizing and digitizing documents to be stored in the Entity's Document Management System (DMS).
- The Entity has identified and prioritized the key processes to be implemented as workflows in the DMS to enable automated and paperless management of documents.
Acceptance Evidence:
- The DCM Plan. [DCM.1.1]
- The DCM Digitization Plan. [DCM.1.2]
- The Developed DCM Training Plan.
- The Developed Prioritization Process for Documents and Content.
- The Identified DCM Processes / Procedures. [DCM.1.4]
- The Ranked List of Prioritized Document Workflows to be Implemented. [DCM.1.4]

Level 3: Activated
Level Description:
- The Entity is implementing its DCM plan and Digitization plan, including training and awareness activities.
- The Entity is prioritizing its documents to be stored in the Entity's Document Management System (DMS).
- Roles and responsibilities are defined and initiated to manage the DCM activities and the Digitization activities in line with the overall "Data Management Organization" structure.
Acceptance Evidence (all level 2 requirements, including):
- The DCM Plan Implementation Status Report.
- The Digitization Plan Implementation Status Report.
- The List of Prioritized Documents for Digitization. [DCM.1.3]
- The DCM Roles and Responsibilities.
- The DCM Training Plan Implementation Status Report. [DCM.3.1]

Level 4: Managed
Level Description:
- The Entity is monitoring the effectiveness of the DCM plan and the DCM Digitization plan with pre-defined Key Performance Indicators (KPIs).
Acceptance Evidence (all level 3 requirements, including):
- The Monitoring Reports with Pre-defined KPIs for the Implementation of the DCM Plan & the DCM Digitization Plan.

Level 5: Pioneer
Level Description:
- The Entity continuously reviews and updates both its DCM plan and Digitization plan for improvement and optimization in line with the changing business needs.
Acceptance Evidence (all level 4 requirements, including):
- The Continuous Improvement Report of the DCM and Digitization Plans.

Maturity Questions – Document and Content Management Domain

DCM.MQ.2: Has the Entity implemented policies and processes for Document and Content Management (DCM), including: Backup & Recovery, Retention & Disposal, Document & Content Access Approval, and Metadata Publishing?

Level 0: Absence of Capabilities
Level Description:
- No policies and processes are in place currently for Document and Content Management (DCM).
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity manages its documents and content on a reactive and ad hoc basis without formalized DCM policies and processes.
Acceptance Evidence:
- Evidence of Processes for Retaining & Disposing of Documents.

Level 2: Defined
Level Description:
- The Entity defined a clear set of policies and processes to manage its documents and content.
- Processes are defined for:
  • Backup & Recovery.
  • Retention & Disposal.
  • Document and Content Access Approvals.
  • Metadata Publishing.
Acceptance Evidence:
- The Developed Policy Document for the DCM Lifecycle. [DCM.2.1]
- The Developed DCM Backup & Recovery Procedures. [DCM.4.1]
- The Developed DCM Retention & Disposal Procedures. [DCM.4.2]
- The Developed DCM Role-Based Access Approval Procedures. [DCM.4.3]
- The Developed DCM Metadata Publishing Procedures.

Level 3: Activated
Level Description:
- The Entity is implementing its DCM processes aligned with the policies.
- The Document Management System (DMS) is included in the Entity's Backup & Recovery plan.
- Documents are transferred to the Entity's archival facility (Archival Register).
- The physical destruction and deletion of the documents is done securely.
- The Entity established role-based access authorizations to the Content Management System (CMS) and the Document Management System (DMS).
- Document and content Metadata is published via an automated tool.
- Periodic DCM audits are conducted to ensure that the right information is getting to the right people at the right time.
Acceptance Evidence (all level 2 requirements, including):
- The Implemented Workflow for the DCM Retention & Disposal Process. [DCM.4.2]
- The Implemented Workflow for the DCM Access Approval Process. [DCM.4.3]
- Evidence of Document Transfers to the Archival Facility (Archival Register).
- The Documents Disposal Register.
- The Access Rights Documentation.
- The Report on Document & Content Metadata Publishing. [DCM.4.4]

Level 4: Managed
Level Description:
- The Entity is monitoring the DCM processes aligned with the policies with pre-defined Key Performance Indicators (KPIs).
- The Audit Trail / Audit Framework is tracked based on pre-defined Key Performance Indicators (KPIs).
Acceptance Evidence (all level 3 requirements, including):
- The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) of the DCM Processes Aligned with the Policies. [DCM.5.1]

Level 5: Pioneer
Level Description:
- The Entity regularly reviews and updates its DCM processes for continuous improvement and optimization.
Acceptance Evidence (all level 4 requirements, including):
- The Continuous Improvement Report on the DCM Processes and Practices.
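The role-based access approval called for by DCM.4.3 comes down to checking a requested action against the permission set of the requestor's role. A minimal sketch, with role and action names assumed purely for illustration:

```python
# Role and action names are illustrative assumptions; the real roles come
# from the Entity's access approval procedures (DCM.4.3).
ROLE_PERMISSIONS = {
    "viewer": {"read"},
    "editor": {"read", "write"},
    "records_officer": {"read", "write", "dispose"},  # secure disposal only
}

def is_allowed(role: str, action: str) -> bool:
    """Approve an action on a document only if the role grants it."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("records_officer", "dispose"))  # -> True
print(is_allowed("editor", "dispose"))           # -> False
```

Logging each decision from such a check is one way to produce the Access Rights Documentation and audit trail listed above.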

Maturity Questions – Document and Content Management Domain

DCM.MQ.3: Has the Entity implemented a tool to support Document and Content Management (DCM) processes, including Digitization Management implementation?

Level 0: Absence of Capabilities
Level Description:
- No tools are in place currently for managing documents or content.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity manages documents and content in disparate applications.
Acceptance Evidence:
- Evidence of Document Storage / Retention & Disposal Activities.

Level 2: Defined
Level Description:
- The Entity identified and developed or acquired the required tool to support DCM.
Acceptance Evidence:
- The Documented DCM Tool Requirements.

Level 3: Activated
Level Description:
- The Entity implemented the selected tool for Document and Content Management (DCM), which includes:
  • A Document Management System (DMS).
  • A Web Content Management System (CMS).
  • A collaboration tool.
Acceptance Evidence (all level 2 requirements, including):
- The Implemented Tool for DCM. [DCM.4.5]
- The Record of Digitized Documents.
- The List of Approved Users.

Level 4: Managed
Level Description:
- The Entity is monitoring the implementation of the tool based on pre-defined Key Performance Indicators (KPIs).
Acceptance Evidence (all level 3 requirements, including):
- The Monitoring Report on the Implemented Tool with Pre-defined Key Performance Indicators (KPIs).

Level 5: Pioneer
Level Description:
- For continuous improvement, the Entity regularly reviews:
  • The performance of the implemented Digitization Management tool to ensure automation.
  • The DCM activities to ensure optimization.
Acceptance Evidence (all level 4 requirements, including):
- The Continuous Improvement Report on the DCM Solution / Tool Design and DCM Practices.

8.1.6. Data Architecture and Modelling Domain

Maturity Questions – Data Architecture and Modelling Domain

DAM.MQ.1: Has the Entity developed and implemented a plan to improve its Data Architecture capabilities?

Level 0: Absence of Capabilities
Level Description:
- No plan is in place for Data Architecture & Modelling (DAM).
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity manages its Data Architecture & Modelling (DAM) related activities on a reactive and ad hoc basis without formalized practices.
Acceptance Evidence:
- The Existing DAM Domain Related Practices.

Level 2: Defined
Level Description:
- The Entity has a defined & approved DAM Management plan.
- The Entity developed an Entity-wide baseline current state and target future state Data Architecture in accordance with the Enterprise Architecture (EA).
- The Entity performed a gap analysis resulting in an implementation roadmap with defined priorities.
- The Entity adopted a standard enterprise architecture framework.
- The Entity published its DAM Policies & Guidelines.
Acceptance Evidence:
- The Approved Current State Data Architecture & the Existing Technical Architecture. [DAM.3.1]
- The Approved Target State Data Architecture. [DAM.3.2]
- The Future State Gap Assessment. [DAM.3.3]
- The Approved DAM Plan. [DAM.1.1]
- The Approved Enterprise Architecture Framework.
- The DAM Policy. [DAM.2.1]

Level 3: Activated
Level Description:
- The Entity is implementing the defined implementation roadmap for its Target State Data Architecture.
- The IT architecture is ensuring all business architecture outcomes are addressed by selecting and implementing the right technical capabilities.
- The Business Architecture team works closely with the Entity's Enterprise Architecture team to align the applicable standards (e.g., DMBOK, TOGAF, etc.).
Acceptance Evidence (all level 2 requirements, including):
- The DAM Plan Implementation Progress Report.
- The EA Framework Implementation Report.

Level 4: Managed
Level Description:
- The Entity is monitoring the effectiveness of the DAM Plan & activities with pre-defined Key Performance Indicators (KPIs).
Acceptance Evidence (all level 3 requirements, including):
- The Monitoring Report of the DAM Plan & Activities Implementation with Pre-defined KPIs. [DAM.6.1]

Level 5: Pioneer
Level Description:
- The Entity follows an identified architecture Change Management process to review, approve and implement the changes to the Current and / or Target State Data Architectures.
- The Entity continuously reviews the Target State Data Architecture via checkpoints incorporated into the Software Development Lifecycle (SDLC) process.
- The Entity regularly reviews and refines the Target State roadmap, priorities, and architecture artifacts, reflecting the current Enterprise Architecture strategy and key initiatives.
Acceptance Evidence (all level 4 requirements, including):
- The Data Architecture Plan's Continuous Improvement Report.
- The Architecture Change Management Process. [DAM.5.1]
- The Change Control Document.
- The Data Architecture Checkpoints Report. [DAM.5.2]
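The Level 2 gap analysis between the current state and target state architectures can be pictured as a set difference over capabilities, with each gap feeding one roadmap initiative. The capability names below are illustrative assumptions, not NDI-defined artifacts:

```python
# Illustrative capability inventories for the current and target states.
current_state = {"relational DWH", "nightly batch ETL"}
target_state = {"relational DWH", "data lake", "streaming ingestion"}

# The gap is the set of target capabilities not yet in place; sorting
# gives a stable order for the implementation roadmap.
gaps = sorted(target_state - current_state)
roadmap = [f"Initiative {i}: deliver {gap}" for i, gap in enumerate(gaps, start=1)]
print(roadmap)
# -> ['Initiative 1: deliver data lake', 'Initiative 2: deliver streaming ingestion']
```

In practice the prioritization would weigh business value and dependencies rather than alphabetical order; the sketch only shows the mechanics.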

Maturity Questions – Data Architecture and Modelling Domain

DAM.MQ.2: Has the Entity developed and implemented practices for Data Architecture & Modelling (DAM) activities (including Data Flows, Data Models and Governance considerations)?

Level 0: Absence of Capabilities
Level Description:
- No practices are in place within the Entity for Data Architecture & Modelling (DAM).
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity maps Data Flows to the business processes only on a reactive basis for specific projects / initiatives.
- Foundational Data Flows are identified, defined and designed, but primarily driven by a project / an initiative.
Acceptance Evidence:
- The Current DAM Domain Activities.

Level 2: Defined
Level Description:
- The Entity has identified & defined the requirements for developing a Data Lake environment.
- The Entity has defined a Partitioning Strategy for the target state data architecture.
- The Entity identified and mapped its Data Flows to the business processes.
- The Entity's Enterprise Data Model adopts a standard diagramming method (e.g., UML) for documenting the structure, relationships and notations of business entities at the conceptual, logical and physical levels.
- The Entity stores in a register its Data & Technical Architecture projects, reference documentation materials and Data Model designs.
- The end-to-end Data Modelling lifecycle is defined with required processes and best practices / guidelines such as naming conventions, Data types, physical model deployment considerations and optimizations, etc.
Acceptance Evidence:
- A Document Containing the Approved Business Processes on the Data Architecture and any Related Data Flow.
- A Document Containing the Big Data Considerations, Including the Data Lake Requirements. [DAM.3.4]
- A Document Containing the Data Processing Considerations, Including the Partitioning Strategy. [DAM.3.5]
- Model Representation. [DAM.4.1]
- The DAM Register. [DAM.7.1]
- A Document Containing the Data Model Representation's Technical Standards & Best Practices.

Level 3: Activated
Level Description:
- The Entity is implementing the defined practices for its DAM activities.
- Multiple Data integration and flow patterns are designed, published and operating on defined Data architecture principles such as "Capture Once, Use Many", etc.
- Structured and streamlined information flows across the Entity are done using the defined and published Data integration patterns.
- The Entity is implementing tools and technologies for DAM initiatives.
- Data Models are uploaded in the Data Catalog.
- The Logical Model is linked with business glossaries and technical attributes to provide end-to-end lineage and activate the impact analysis.
- The Entity's Enterprise Data Model is used throughout the Software Development Lifecycle (SDLC).
- The development is done via a Data Model tool.
Acceptance Evidence (all level 2 requirements, including):
- The Data Integration Pattern Implementation Document.
- Evidence of DAM Tools & Technologies in Use. [DAM.4.2]
- Evidence of the Enterprise Data Model Uploaded in the Data Catalog.
- Evidence of Applying the Technical Data Standards.

Level 4: Managed
Level Description:
- The Entity is monitoring the implementation of the DAM practices with pre-defined Key Performance Indicators (KPIs).
Acceptance Evidence (all level 3 requirements, including):
- The Monitoring Report of the Implementation of the DAM Practices with Pre-defined KPIs.

Level 5: Pioneer
Level Description:
- The mapping of the Entity's Data Flow to the business processes is continuously monitored, measured, and improved, including:
  • Multi-latency, multi-format Data Flows & Architecture Patterns defined, adopted, and mapped to the business processes.
  • Entity-wide Data structures and Data services mapped to the business processes.
  • Entity-wide Data structures and Data services shared in real-time.
Acceptance Evidence (all level 4 requirements, including):
- The Continuous Improvement Report, Including the DAM Business Process Documentation & Other Related Data Flow Documentation.
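A Partitioning Strategy, as required at Level 2, assigns each record to a partition by a chosen key so that storage and queries can be pruned to the relevant slices. The sketch below assumes date-based (year-month) partitioning purely for illustration; the NDI does not prescribe a partition key:

```python
from collections import defaultdict
from datetime import date

# Date-based partitioning, one common strategy for a Data Lake (assumption).
def partition_by_month(records: list[tuple[date, str]]) -> dict[str, list[str]]:
    """Group records into partitions keyed by year-month."""
    partitions: dict[str, list[str]] = defaultdict(list)
    for event_date, payload in records:
        partitions[event_date.strftime("%Y-%m")].append(payload)
    return dict(partitions)

records = [(date(2023, 1, 5), "a"), (date(2023, 1, 9), "b"), (date(2023, 2, 1), "c")]
print(partition_by_month(records))
# -> {'2023-01': ['a', 'b'], '2023-02': ['c']}
```

Other keys (region, source system, classification level) follow the same pattern; the strategy document records which key was chosen and why.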

8.1.7. Data Sharing & Interoperability Domain

Maturity Questions – Data Sharing & Interoperability Domain

DSI.MQ.1: Has the Entity developed and implemented a Data Sharing and Integration (DSI) Plan in line with the Data Sharing Policies?

Level 0: Absence of Capabilities
Level Description:
- No plan is in place for Data Sharing or Integration.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity shares Data on a reactive or ad hoc basis.
- Manual Data exchange practices exist within the Entity and with other Entities.
- A DSI practice and initial data integration assessment of the Entity's current state has been conducted to identify pain points and inefficiencies in the data transfer and the integration across the Entity.
Acceptance Evidence:
- The Names of the Current Practices of Data Sharing and Integration (DSI).
- The Updated and Approved Results of the Initial Data Integration Assessment. [DSI.1.1]

Level 2: Defined
Level Description:
- A Target Data Integration Architecture was developed based on the identified pain points.
- The Entity has developed an integration strategy and implementation plan for data sharing and integration to remediate the identified pain points and manage the required integration initiatives.
- The Data Sharing policies and guidelines are defined for DSI within and outside the Entity, in line with the Data Sharing Regulations published by NDMO-SDAIA.
- The Entity has developed a plan for training every employee involved in the Data Sharing initiatives.
Acceptance Evidence (all level 1 requirements, including):
- The Developed Data Integration Strategy Document.
- The Developed Target Data Integration Architecture. [DSI.1.2]
- The Developed Data Integration Plan (Including Data Sharing Activities). [DSI.1.3]
- The Developed Data Sharing Policies.
- The Developed Data Sharing Training Plan.

Level 3: Activated
Level Description:
- The Entity is implementing the defined plan and roadmap for data sharing and integration.
- Roles and responsibilities are assigned within the Entity to manage Data Sharing requests and Data integration activities in line with both the Data Management Organization structure and the Data Sharing regulations published by NDMO-SDAIA.
- The Entity is conducting Data Sharing training in line with the Data Sharing training plan.
Acceptance Evidence (all level 2 requirements, including):
- A Data Integration Plan Implementation Status Report.
- A Report on the Defined Roles and Responsibilities for DSI.
- A Progress Report on Data Sharing Training Programs. [DSI.2.1]

Level 4: Managed
Level Description:
- The Entity is monitoring the implementation of the DSI activities to ensure completion as per the plan with pre-defined Key Performance Indicators (KPIs).
Acceptance Evidence (all level 3 requirements, including):
- The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) for the Data Integration Plan and Data Sharing Activities.

Level 5: Pioneer
Level Description:
- The DSI plan and the activities are continuously optimized to facilitate innovation and to maintain compliance with the Data Sharing Policies.
Acceptance Evidence (all level 4 requirements, including):
- The Continuous Improvement Report on the Data Integration Plan and Data Sharing Activities.

Maturity Questions – Data Sharing & Interoperability Domain

DSI.MQ.2: Has the Entity defined and implemented processes for sharing Data within the Entity and with other Entities?

Level 0: Absence of Capabilities
Level Description:
- No processes are in place for sharing the Entity's Data internally or externally.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity shares Data on a reactive or ad hoc basis within and outside the Entity with no formalized practices.
Acceptance Evidence:
- Evidence of the Current DSI Practices / Processes by the Entity (Internally & Externally).

Level 2: Defined
Level Description:
- The Entity defined and approved a standardized Data Sharing mechanism for sharing Data internally and externally according to the defined classification levels. The mechanism includes forms, processes and agreements to be used to manage the Data Sharing requests in line with the Data Sharing principles and regulation.
- Data Sharing request forms have been developed and are shared with Data requestors as needed.
- The Data Sharing agreements have been developed for internal and external Data Sharing.
Acceptance Evidence:
- The Process Documentation for Data Sharing (Including Data Classification Levels and Timelines). [DSI.5.1]
- The Developed and Approved Data Sharing Request Forms (Internal and External).
- The Developed and Approved Internal Data Sharing Agreement Template. [DSI.7.1]
- The Developed and Approved External Data Sharing Agreement Template. [DSI.7.2]

Level 3: Activated
Level Description:
- The Entity is implementing and operationalizing the defined and approved Data Sharing mechanism to facilitate Data Sharing within and outside the Entity.
- Data is shared with Entities only through SDAIA's certified and approved channels (e.g., the Government Service Bus (GSB), the National Information Center Network, etc.).
- The Entity put in place controls to ensure that only the appropriate authority is allowed to access, obtain and use the shared Data based on the nature and sensitivity of the Data.
- All ongoing Data Sharing agreements are reviewed on a regular basis to accommodate any changes.
- The Entity's official Government website has an established channel to manage Data Sharing requests.
Acceptance Evidence (all level 2 requirements, including):
- Evidence of Operationalization of the Data Sharing Process, e.g., Communication of the Defined Data Sharing Mechanism. [DSI.5.1]
- Evidence of Data Sharing Through SDAIA's Certified and Approved Channels.
- The Access Authorization Controls Document.
- Evidence of Data Sharing Requests Submitted to the Entity and the Requests Submitted by the Entity Through the Established Channel. [DSI.6.1]
- Evidence of the Entity's Responses to the Data Sharing Requests.
- Adoption Evidence of the Developed Data Sharing Agreement Template for an Internal Data Sharing Request. [DSI.7.1]
- Adoption Evidence of the Developed Data Sharing Agreement Template for an External Data Sharing Request. [DSI.7.2]
- The Documented Review Outcomes of the Data Sharing Agreements. [DSI.7.3]

Level 4: Managed
Level Description:
- The Entity is measuring the efficiency of Data Sharing activities, tracking progress, and ensuring compliance with pre-defined KPIs.
Acceptance Evidence (all level 3 requirements, including):
- The Monitoring Report with Pre-defined KPIs for the Data Sharing Processes. [DSI.8.1 (3,4,5,6,7)]
- The Compliance Audit Methodology.

Level 5: Pioneer
Level Description:
- The Entity's Data Sharing process is automated and updated frequently.
- Data Sharing practices are regularly reviewed and adjusted to address the changing business objectives while implementing continuous improvements.
- Data-as-a-Service (DaaS) and Data products are developed and maintained proactively by the Entity.
- DSI is fully automated and scalable with full control on protection and privacy at the Entity level.
Acceptance Evidence (all level 4 requirements, including):
- The Continuous Improvement Report on the DSI Processes.
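A standardized Data Sharing mechanism of the kind described at Level 2 typically routes each request to an approval authority according to the dataset's classification level. A minimal sketch, with classification levels and authority names assumed purely for illustration (the actual levels follow the applicable classification policy):

```python
from dataclasses import dataclass

# Illustrative classification-to-approver routing table (assumption).
APPROVAL_ROUTE = {
    "Public": "data_steward",
    "Restricted": "data_owner",
    "Confidential": "data_governance_committee",
}

@dataclass
class SharingRequest:
    requestor: str
    dataset: str
    classification: str

def route(request: SharingRequest) -> str:
    """Select the approval authority from the dataset's classification level."""
    if request.classification not in APPROVAL_ROUTE:
        raise ValueError(f"Unknown classification: {request.classification}")
    return APPROVAL_ROUTE[request.classification]

req = SharingRequest("Entity B", "population-register", "Confidential")
print(route(req))  # -> data_governance_committee
```

Recording each request and routing decision gives the submitted-requests and responses evidence listed above.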

Maturity Questions – Data Sharing & Interoperability Domain

DSI.MQ.3: Has the Entity defined and implemented a Data integration architecture to manage the Data movement efficiently across Data stores, systems, and applications?

Level 0: Absence of Capabilities
Level Description:
- No practices are in place to manage the Data movement across the Entity's systems and applications.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The data movement across Data stores, systems, and applications is done on an ad hoc basis with no formal architecture.
Acceptance Evidence:
- Evidence of Data Movement / Integration within the Entity and with Other Entities.

Level 2: Defined
Level Description:
- The Entity defines its business requirements for each initiative requiring data integration between Data stores, systems, and applications.
- The Entity develops a solution design document for each data integration initiative based on the target data integration architecture and the Integration Requirements Document.
Acceptance Evidence:
- The Integration Requirements Document. [DSI.3.1]
- The Solution Design Document. [DSI.3.2]

Level 3: Activated
Level Description:
- The Entity is implementing the Data integration architecture following an integration solution development lifecycle for each data integration initiative.
- The Entity performs testing of the developed integration solution before deployment to the production environment.
- Data sources are integrated through established and approved integration patterns.
Acceptance Evidence (all level 2 requirements, including):
- The Document of the Approved Standardized Entity-Wide Integration Solution Development Lifecycle.
- The Developed Test Scripts and the Conducted Tests (Integration, Functional) in Line with the Plan and the Solution Design Document. [DSI.3.3]

Level 4: Managed
Level Description:
- The Entity is monitoring the effectiveness of the integration initiatives with pre-defined KPIs.
Acceptance Evidence (all level 3 requirements, including):
- The Monitoring Report with Pre-defined KPIs for the Data Integration Initiatives. [DSI.8.1 (1,2)]
- The Integration Solution Monitoring and Maintenance Document. [DSI.3.4]

Level 5: Pioneer
Level Description:
- The Entity reviews the integration architecture regularly and optimizes the integration practices by adopting cutting-edge methodologies (e.g., implementing Continuous Integration and Continuous Deployment (CI/CD) pipelines) to automate and streamline the improvement of the integration solution.
Acceptance Evidence (all level 4 requirements, including):
- The Continuous Improvement Report on the Data Integration Practices.
- The Continuous Improvement Mechanisms for Data Integration, e.g., Continuous Integration & Continuous Delivery (CI/CD) Pipeline Details for Automation.
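The Level 3 requirement to test the integration solution before production deployment can be captured as a small functional test over the integration's field mapping. The mapping and test data below are illustrative assumptions, not an NDI-defined schema:

```python
# Illustrative field mapping between a source system and a target
# integration schema (assumption).
def map_record(source: dict) -> dict:
    """Map a source-system record onto the target schema."""
    return {
        "nationalId": source["id"],
        "fullName": f'{source["first"]} {source["last"]}',
    }

def test_mapping() -> None:
    # Functional test run before deploying the integration to production.
    result = map_record({"id": "123", "first": "Amal", "last": "Saleh"})
    assert result == {"nationalId": "123", "fullName": "Amal Saleh"}

test_mapping()
print("mapping test passed")
```

Such tests are the natural seed for the CI/CD pipelines mentioned at Level 5: each change to the integration runs the test scripts automatically before release.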

Maturity Questions – Data Sharing & Interoperability Domain

DSI.MQ.4: Has the Entity developed and implemented Data Sharing controls and processes for efficient Data transformation and movement?

Level 0: Absence of Capabilities
Level Description:
- No controls or processes are in place for data sharing or data transformation and movement.
Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
Level Description:
- The Entity implements controls for data sharing and moves data across applications on an ad hoc basis without formalized controls or processes.
Acceptance Evidence:
- The Existing Data Integration or Data Movement Processes within the Entity.

Level 2: Defined
Level Description:
- The Entity has defined data sharing controls, processes and standards for Data transformation and Data movement across the Entity's applications.
- The Entity has conducted a risk assessment of DSI processes.
- The Entity has defined specific processes to classify, protect and support Data privacy.
Acceptance Evidence:
- The Developed Data Migration Processes (i.e., ETL). [DSI.4.1]
- The Developed Data Migration Processes (i.e., ELT). [DSI.4.2]
- The Risk Assessment Report on the Entity's Datasets to be Shared.
- The Defined Data Sharing Controls.

Level 3: Activated
Level Description:
- The Entity is implementing defined processes for data transformation and movement (ETL / ELT), and controls for Data Sharing. This includes processes for integrating data from disparate sources and loading it into the Data Warehouse store, and processes for storing unstructured data in its raw native format in the Data Lake.
- The Entity is implementing standards for Data development, transformation and movement across all applications.
- The Data security controls are implemented in line with the security controls published by the KSA's National Cybersecurity Authority (NCA).
- Appropriate controls are applied on the Datasets to be shared in line with the Data Sharing regulations and other relevant regulations (e.g., Personal Data Protection (PDP), etc.).
Acceptance Evidence (all level 2 requirements, including):
- A Report on the Implemented Controls (e.g., Data Security & Protection, Data Sharing, Data Integration, Data Access, etc.).
- Evidence of the Implemented Data Migration Processes (i.e., Extract, Transform, Load (ETL)). [DSI.4.1]
- Evidence of the Implemented Data Migration Processes (i.e., Extract, Load, Transform (ELT)). [DSI.4.2]

Level 4: Managed
Level Description:
- The Entity is monitoring the effectiveness of the DSI controls, processes and standards with pre-defined KPIs.
Acceptance Evidence (all level 3 requirements, including):
- The Monitoring Report with Pre-defined KPIs for the DSI Controls, Processes and Standards.

Level 5: Pioneer
Level Description:
- The Entity continuously reviews its Data sharing and integration controls and practices for optimization by adopting advanced interoperability standards (e.g., event-driven architecture, event sourcing, Data mesh, etc.).
- The transformed and merged Data is available upon request as a service (DaaS) for various transactions, reporting and analysis in the Entity for innovation (e.g., Data-driven transformation and products, etc.).
- Proactive periodic audits and reviews of the applications and systems are performed in an automated way.
Acceptance Evidence (all level 4 requirements, including):
- The Continuous Improvement Report on Reviewing the DSI Processes and Mechanisms to Automate the DSI Controls and Practices.

8.1.8. Reference and Master Data Management Domain

Maturity Questions – Reference and Master Data Management Domain

RMD.MQ.1: Has the Entity developed and implemented a plan focused on improving its Reference & Master Data (RMD) Management capabilities?

Level 0: Absence of Capabilities
- The Entity doesn't have a plan in place for Reference & Master Data (RMD) Management.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
- The Entity manages its Reference & Master Data (RMD) on a reactive or ad hoc basis without formalized practices.

Acceptance Evidence:
- The Currently Existing Practices Related to the RMD Management Domain.

Level 2: Defined
- The Entity has a defined and approved RMD Management plan.
- A roadmap is defined with a list of activities, required resources, trainings and a budget to manage the RMD plan implementation.

Acceptance Evidence:
- The Developed and Approved RMD Management Plan. (RMD.1.1)

Level 3: Activated
- The Entity is implementing the defined RMD Management plan and roadmap.
- Clear roles and responsibilities are defined and activated for the RMD activities.
- Business and IT Data stewards are assigned in line with the "Data Management Organization" structure.
- Training is being conducted for all Entity employees responsible for the management of its RMD, as required, in line with the change management practices.
- The RMD change request logs and the decisions are documented.
- The documented RMD Management initiatives registered in the Statement of Architecture Work are being implemented.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:
- The RMD Management Plan Implementation Status Report.
- The RMD Management Training Implementation Status Report. (RMD.3.1)
- The RMD Management Operating Model Showing the RMD Stewardship Coverage. (RMD.4.1)
- The RMD Change Request Logs. (RMD.6.1)
- The RMD Management Documents & Artifacts. (RMD.6.2)


Level 4: Managed
- The Entity is monitoring the effectiveness of the RMD Management plan and activities with pre-defined KPIs.

Acceptance Evidence: All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the RMD Management Plan Implementation with Pre-defined KPIs.

Level 5: Pioneer
- The Entity's RMD Management plan and activities are continuously reviewed and updated for optimization and improvement.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the RMD Management Plan.

Maturity Questions – Reference and Master Data Management Domain

RMD.MQ.2: Has the Entity defined and implemented processes to manage its Reference & Master Data (RMD) objects from creation to archival?

Level 0: Absence of Capabilities
- No processes are in place for Reference & Master Data (RMD) Management.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
- The Entity manages its RMD on a reactive or ad hoc basis (e.g.: projects / initiatives) without formalized practices.

Acceptance Evidence:
- Evidence of Projects / Initiatives with Reference Data and/or Master Data Identified.

Level 2: Defined
- The Entity has defined processes to manage its RMD objects across the business and technical applications.
- RMD requirements are documented in line with business needs.
- RMD objects are identified, prioritized, and categorized (Internal or External).
- Standards and rules for matching and merging Master Data Objects and Reference Datasets are defined.
- Processes are developed for the RMD lifecycle (creation, modification, and archiving).
- Service Level Agreements (SLAs) are defined for the processes.

Acceptance Evidence:
- The Identified, Prioritized and Categorized RMD Objects. (RMD.1.2)
- Reference Data Categorization. (RMD.1.3)
- Master Data Categorization. (RMD.1.4)
- The Reference & Master Data Requirements. (RMD.2.1)
- The Defined SLAs for RMD Lifecycle Management. (RMD.5.1)

Level 3: Activated
- The Entity is implementing its RMD Management Processes.
- Match-and-merge and survivorship rules for Master Data Objects are embedded in the Data transformation rules and implemented in the Entity's Data applications.
- Rules for standardized Reference Data are embedded in the Entity's Data applications and managed centrally.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:
- The RMD Lifecycle Management Process. (RMD.4.2)
- Evidence of the Implementation & Adoption of the National Reference Datasets.
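Survivorship rules of the kind referenced above decide which attribute values survive when matched duplicate records are merged into one golden record. A minimal sketch, assuming a hypothetical record layout and a "most recent non-empty value wins" rule, neither of which is prescribed by the framework:

```python
from datetime import date

def merge_master_records(duplicates):
    """Merge matched duplicate records into one golden record.

    Illustrative survivorship rule: for each attribute, keep the
    non-empty value from the most recently updated source record.
    """
    golden = {}
    # Sort oldest-first so later (newer) assignments overwrite earlier ones.
    for record in sorted(duplicates, key=lambda r: r["updated"]):
        for field, value in record.items():
            if field != "updated" and value not in (None, ""):
                golden[field] = value
    return golden

# Two matched records for the same (hypothetical) customer.
dups = [
    {"id": "C-1", "name": "Ali AlQahtani", "email": "", "updated": date(2023, 1, 5)},
    {"id": "C-1", "name": "", "email": "ali@example.com", "updated": date(2023, 6, 2)},
]
golden = merge_master_records(dups)
```

Real implementations usually make the rule configurable per attribute (e.g., "most trusted source wins" for names, "most recent wins" for contact details).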

Level 4: Managed
- The Entity is monitoring the processes for managing the RMD objects across the Data lifecycle with pre-defined KPIs.

Acceptance Evidence: All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the RMD Management Processes with Pre-defined KPIs & SLAs. (RMD.5.2)

Level 5: Pioneer
- The Entity's RMD Management Processes and Standards are continuously monitored for optimization and improvement.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the RMD Management Processes & Standards.

Maturity Questions – Reference and Master Data Management Domain

RMD.MQ.3: Has the Entity implemented a Data Hub (or Tool) as the trusted Data source to support the RMD Management Processes?

Level 0: Absence of Capabilities
- No tools are in place for managing the Reference & Master Data (RMD).

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
- The Entity manages its RMD within standalone Data sources without standardized tools.

Acceptance Evidence:
- Evidence of Reference & Master Data (RMD) Used for Project Purposes.

Level 2: Defined
- The Entity identified the required Hub / Tool for RMD Management based on business and technical requirements.
- A suitable Data Hub architecture implementation pattern is agreed upon and the Data Hub design is developed.
- A conceptual and information architecture is developed based on the selected Data Hub Architecture Design.
- RMD models and architectures are developed.
- RMD Hub / Tool technical requirements are documented based on the defined target RMD Information Architecture.

Acceptance Evidence:
- The Target RMD Management Architecture Design. (RMD.2.2)
- The Developed RMD Conceptual Architecture. (RMD.2.3)
- The Developed RMD Information Architecture. (RMD.2.4)
- The RMD Hub / Tool Technical Requirements. (RMD.2.5)

Level 3: Activated
- The Entity implemented the selected Data Hub for RMD Management and is operating the hub as the Trusted Source of RMD across the Entity.
- The implemented Data Hub is flexible to accommodate new Data sources and to support workflow capabilities and customizations (e.g., Data localization, privacy, consent management, hierarchy management, exception processing and automated alert capabilities).

Acceptance Evidence: All level 2 acceptance evidence requirements, including:
- The Implemented RMD Management Hub. (RMD.4.3)
- The Workflow Documentation Showing the Establishment of the Data Hub as the Entity's Trusted Source. (RMD.4.4)

Level 4: Managed
- The Entity is monitoring the adoption and usage of the implemented Data Hub with the pre-defined KPIs.

Acceptance Evidence: All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the RMD Management Hub / Tool Capabilities with Pre-defined Key Performance Indicators (KPIs).

Level 5: Pioneer
- The Entity continuously inspects the implemented Data Hub to:
  • Ensure full coverage of all new and updated Data sources.
  • Ensure that all deployed tools and technologies are regularly optimized.
  • Ensure that the level of automation is increased based on industry and global trends.
- A centralized Hub / Tool is developed as the Trusted Data Source wherein all the RMD objects are hosted and maintained.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the RMD Information Architecture and the Implemented Data Hub.

8.1.9. Business Intelligence and Analytics Domain

Maturity Questions – Business Intelligence and Analytics Domain

BIA.MQ.1: Has the Entity developed and implemented a plan to manage and orchestrate its Business Intelligence & Analytics (BIA) activities?

Level 0: Absence of Capabilities
- No formal Business Intelligence & Analytics (BIA) plan is in place.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
- The Entity implements BIA activities on a reactive or ad hoc basis.

Acceptance Evidence:
- The current activities related to BIA.

Level 2: Defined
- The Entity has a defined and approved BIA plan based on the overall Data Management (DM) strategy.
- A roadmap is defined with a list of activities, required resources and budget to manage the BIA implementation.

Acceptance Evidence:
- The Defined & Approved BIA Plan. (BIA.1.1)

Level 3: Activated
- The Entity is implementing the defined & approved BIA plan & roadmap.
- Clear roles and responsibilities are defined for the BIA activities in line with the Data Management Organization structure.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:
- The BIA Plan & Roadmap Implementation Status Report.
- The Defined & Documented Roles & Responsibilities for BIA Activities, Including Data Stewardship Roles.

Level 4: Managed
- The Entity is monitoring the effectiveness of the BIA plan and activities with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: All level 3 acceptance evidence requirements, including:
- The Effectiveness Monitoring Report of the BIA Plan & Activities with Pre-defined Key Performance Indicators (KPIs).


Level 5: Pioneer
- The BIA plan and activities are continuously reviewed and updated for optimization and improvement.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:
- The Regular Reviews and Improvements to the Business Intelligence and Analytics Plan and Roadmap.

Maturity Questions – Business Intelligence and Analytics Domain

BIA.MQ.2: Has the Entity identified BIA use cases and defined a plan for the use case implementation?

Level 0: Absence of Capabilities
- No defined BIA use cases are identified within the Entity.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
- The Entity identifies and implements BIA use cases on an ad hoc basis or for specific projects and initiatives.

Acceptance Evidence:
- The list of the implemented BIA use cases within the Entity.

Level 2: Defined
- The Entity has identified and prioritized its BIA use cases using a defined prioritization framework.
- Shortlisted use cases have been documented in a detailed BIA Use Case Portfolio document.
- An implementation plan has been developed for each shortlisted and approved use case, with implementation priorities based on the Entity's defined use case implementation approach.
- A use case validation process has been developed.

Acceptance Evidence:
- The Approved BIA Business Cases / BIA Use Cases.
- The Approved BIA Use Cases Prioritization Framework. (BIA.1.2)
- The Shortlisted / Prioritized Use Cases Based on Business Needs. (BIA.1.2)
- The BIA Use Case Portfolio Document with Details of Each Use Case. (BIA.1.3)
- The Approved Use Case Implementation Plan. (BIA.1.4)
- The Defined Use Case Implementation Approach (e.g., DevOps, Agile, etc.).
- The Approved Use Case Validation Process. (BIA.3.1)
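A prioritization framework of the kind required at this level is often a weighted scoring model: each candidate use case is scored against agreed criteria and ranked by its weighted total. The criteria, weights and scores below are illustrative assumptions, not part of the Index:

```python
def prioritize(use_cases, weights=None):
    """Rank BIA use case candidates by weighted score (highest first)."""
    weights = weights or {"value": 0.6, "feasibility": 0.4}  # assumed criteria
    score = lambda uc: sum(uc[criterion] * w for criterion, w in weights.items())
    return sorted(use_cases, key=score, reverse=True)

# Hypothetical candidates, scored 1-10 per criterion by stakeholders.
candidates = [
    {"name": "Churn prediction", "value": 8, "feasibility": 5},   # 8*0.6 + 5*0.4 = 6.8
    {"name": "Sales dashboard",  "value": 6, "feasibility": 9},   # 6*0.6 + 9*0.4 = 7.2
]
ranked = prioritize(candidates)
```

The ranked output feeds the Use Case Portfolio document; the validation process then re-checks each shortlisted item against the same criteria after delivery.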

Level 3: Activated
- The Entity is implementing and validating the prioritized use cases in order to transform the Entity into a Data-driven organization.
- New use cases are evaluated and implemented based on the business requirements.
- The implemented BIA use cases and their final outcomes are documented in a register.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:
- The Implemented and New Use Cases.
- The Up-to-date BIA Use Cases Register. (BIA.5.1)
- The Outcomes of the BIA Use Case Validation Processes.

Level 4: Managed
- The Entity is evaluating / monitoring the performance of analytical models and the business impacts of implemented use cases with pre-defined Key Performance Indicators (KPIs).
- The technical or business impacts and the Return on Investment (ROI) from the use cases are monitored with pre-defined KPIs.

Acceptance Evidence: All level 3 acceptance evidence requirements, including:
- The Monitoring Report on the BIA Portfolio Effectiveness with Pre-defined Key Performance Indicators (KPIs). (BIA.4.1)

Level 5: Pioneer
- The Entity's activated and implemented BIA use cases are regularly reviewed and optimized.
- The potential impact, competitive advantage, Total Cost of Ownership (TCO) and Return on Investment (ROI) are continuously reviewed and updated with a defined calculation methodology.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on BIA Use Cases.
- The Updated BIA Use Case Register and the Optimized Use Cases.

Maturity Questions – Business Intelligence and Analytics Domain

BIA.MQ.3: Has the Entity defined and implemented practices to manage and govern the BIA processes?

Level 0: Absence of Capabilities
- No practices are in place for BIA.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
- The Entity manages its BIA processes on a reactive or ad hoc basis without formalized practices.
- Reporting is done either directly on source systems, or different Data Marts are developed based on the business needs with no semantic relationship / link among them.
- BIA governance focuses only on the requirements of individual projects.
- Data Quality (DQ) issues are resolved by the BI team during the Data population for BI.

Acceptance Evidence:
- The Existing BIA Domain Processes & Governance Documentation.
- The List of Existing Reports & Dashboards.

Level 2: Defined
- The Entity has defined formal BIA practices, including the development of processes for a Data Warehouse or a Data Lake with logical models in place to model the business functions.
- A Semantic Layer is developed and maintained to support BI and Advanced Analytics use cases.
- The Entity is developing a change management plan including training programs, awareness campaigns and a release management process to guide the publishing mechanism of reports, dashboards and implemented use cases.

Acceptance Evidence:
- The Developed & Approved Processes of BIA Management.
- The Approved Process of New Data Source Requirements.
- The Approved Demand Management Process.
- The Development and Maintenance Document of the Semantic Layer.
- The Advanced Analytics Management and Governance Process.
- The Identified & Developed Data Sources & Data Marts.
- The Developed & Approved Change Management Plan, including:
  • The Training Programs. (BIA.2.1)
  • The Awareness Campaigns. (BIA.2.2)
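The Semantic Layer required here is, at its core, a mapping from business terms to physical Data structures, so that every report resolves a term like "Active Customers" to the same table, column and filter. A toy sketch, in which all table, column and metric names are hypothetical:

```python
# A minimal semantic layer: business terms mapped to physical structures,
# used to generate one consistent SQL definition per business metric.
SEMANTIC_LAYER = {
    "Active Customers": {
        "table": "dw.customers",
        "column": "customer_id",
        "aggregation": "COUNT(DISTINCT {column})",
        "filter": "status = 'active'",
    },
}

def build_query(term):
    """Translate a business term into SQL via the semantic layer."""
    m = SEMANTIC_LAYER[term]
    agg = m["aggregation"].format(column=m["column"])
    return f"SELECT {agg} FROM {m['table']} WHERE {m['filter']}"
```

Because every dashboard calls `build_query` instead of hand-writing SQL, the "no semantic relationship among Data Marts" problem described at Level 1 disappears: changing the definition in one place changes it everywhere.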

Level 3: Activated
- The Entity is implementing the defined BIA processes and practices which are established to govern BIA activities.
- Self-service analytics are fully activated, covering both BI and Artificial Intelligence / Machine Learning (AI/ML).
- Business users have full access to all types of Data (raw, processed, modelled) in a governed manner.
- End-to-end governance is in place covering emerging topics such as AI ethics, etc.
- Advanced Analytics (AI/ML) governance and management (AI/ML Ops) foundational processes are in place operationally.
- A central Data Science & BIA team has been set up with defined roles and responsibilities.
- The central Data Science & BIA team has a clear process flow to handle the Entity's BIA needs.
- BIA training courses are conducted for all employees involved in the BIA initiatives to upskill the analytics capabilities, and awareness campaigns are conducted to promote the awareness, education & adoption of the BIA capabilities.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:
- Evidence of the Adoption & Implementation of BIA Management and Governance.
- The Approved Operating Model with Defined Roles & Responsibilities for the Data Science Team. (BIA.3.2)
- Evidence of the Training Courses & Awareness Campaigns Conducted.
- The Approved User Acceptance Test (UAT) Documents.
- The Approved Outcomes as Reports & Dashboards Produced for the Business Units.
- The Capacity Planning Document.

Level 4: Managed
- The Entity is monitoring the BIA processes, governance and the performance of the centralized BI team with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: All level 3 acceptance evidence requirements, including:
- The Monitoring Report on the Effectiveness of the Practices and Processes for Managing and Governing Business Intelligence and Analytics through Pre-defined Key Performance Indicators (KPIs).
- The Monitoring Report on the Effectiveness of the Training and Awareness Sessions Conducted through Pre-defined Key Performance Indicators (KPIs). (BIA.4.1)
- The Performance Measurement Report for the Business Intelligence and Analytics Team.

Level 5: Pioneer
- The Entity created a BI Competency Center including the required roles, skills & capabilities to fulfil all BI requirements.
- The BI Competency Center governs the Entity's BI management, including demand management and change management.
- The performance of the BIA team is continuously reviewed, measured and optimized.
- The BIA capacity planning is reviewed regularly and fulfilled to support the growing demand.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:
- The Establishment of the Business Intelligence and Analytics Center Document.
- The Continuous Review and Improvement of Business Intelligence and Analytics Management and Governance Practices.
- The Continuous Review and Improvement of Business Intelligence and Analytics Team Performance.
- The Updated Capacity Planning Document.
- The Revised Demand Management Process.

Maturity Questions – Business Intelligence and Analytics Domain

BIA.MQ.4: Has the Entity implemented the right tools, technologies, and skills to empower users and support the implementation of the BIA use cases?

Level 0: Absence of Capabilities
- No technology for BIA is in place.
- Reporting is done manually.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
- The Entity implements its BIA activities on an ad hoc basis or in individual applications without formalized tools.

Acceptance Evidence:
- The list of BIA tools.
- The list of reports.


Level 2: Defined
- The Entity selected standardized tools to develop standard analytics artifacts (reports and dashboards).
- The Business and IT stakeholders collaborate to develop, enhance and maintain the Semantic Layer.

Acceptance Evidence: All level 1 acceptance evidence requirements, including:
- The list of business units which use the BIA technology tools.

Level 3: Activated
- The Entity selected and implemented tools and technologies that are being adopted across the Entity.
- The Entity has self-service advanced analytics (Data discovery, wrangling, statistical analysis, etc.) and collaborative Artificial Intelligence / Machine Learning (AI/ML) using a low-code / no-code environment.
- Users have the choice of a unified interface in their workflows by using:
  • Tools.
  • Programming languages (e.g., Python, R, Go, etc.).
  • Platforms (e.g., Anaconda, Jupyter, etc.).
  • Libraries (e.g., pandas, NumPy, etc.).
- Project documentation is managed centrally and shared across the Entity.
- Common AI/ML models are versioned and documented through the tools.
- Users have the ability to reuse those AI/ML models in developing new BI analytics & models.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:
- The list of users with roles and privileges.
- The Approved Architecture & Documentation for Advanced Analytics.
- The Advanced Analytics and Communication Project Management Documents.
- The Approved Advanced Analytics Models Documents.


Level 4: Managed
- The Entity is monitoring the adoption and utilization of the BIA tools and technologies with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: All level 3 acceptance evidence requirements, including:
- The Report on the Effectiveness Monitoring of Business Intelligence and Analytics Tools and Techniques through Pre-defined Key Performance Indicators (KPIs).

Level 5: Pioneer
- New technologies are regularly evaluated, tested, and introduced to keep the capabilities aligned with future needs.
- Advanced frameworks (e.g., Spark, TensorFlow, PyTorch, etc.) are used and integrated with the Entity's unified AI/ML environment.
- Advanced capabilities such as AI/ML engineering tools and features are used for complex processing and reusability.
- Tools and technologies which use AI/ML Ops are practiced to manage the end-to-end AI/ML lifecycle.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:
- The Document for Continuous Adoption of Technologies, Tools, Frameworks, and Features.
- The Continuous Improvement Report of the BIA and Advanced Analytics Technology Solutions, including the Proof of Concept (POC) Roadmap.
- A Report on the Proof of Concept (POC) Results.

8.1.10. Data Value Realization Domain

Maturity Questions – Data Value Realization Domain

DVR.MQ.1: Has the Entity developed a plan to identify, document and realize its Data revenue generation potential and implement Data-related cost optimization initiatives and use cases?

Level 0: Absence of Capabilities
- No plan is in place to generate value from Data or to implement Data-related cost optimization initiatives.
- No use cases are identified for Data revenue generation or Data-related cost optimization.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
- The Entity has existing practices for Data Value Realization (DVR), Data-related cost optimization and the identification of use cases; however, this is done on a reactive or ad hoc basis without formalized DVR practices.

Acceptance Evidence:
- The Existing List of Data Value Realization (DVR) Activities.

Level 2: Defined
- The Entity has a defined and approved plan to identify, document and realize value from its Data through revenue generation initiatives and Data-related cost optimization initiatives.
- The Entity identified use cases for revenue generation and cost optimization based on the guidelines for ethical use cases.
- The Entity estimated and projected a payback period and Return on Investment (ROI) for each identified use case.
- Roles and responsibilities to support the DVR implementation have been identified and documented.

Acceptance Evidence:
- The Data Value Realization (DVR) Plan. (DVR.1.2)
- The List of Identified Use Cases for Both Revenue Generation & Cost Optimization. (DVR.1.1)
- A Document Explaining the Payback Period and Return on Investment (ROI) for Each Identified Use Case. (DVR.1.1)
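The payback-period and ROI projections required at this level follow the standard formulas: ROI = (total benefit − total cost) / total cost, and payback period = initial investment / annual net cash inflow. A small worked calculation with hypothetical figures:

```python
def roi(total_benefit, total_cost):
    """Return on Investment as a fraction of cost: (benefit - cost) / cost."""
    return (total_benefit - total_cost) / total_cost

def payback_period_years(initial_investment, annual_net_inflow):
    """Years until cumulative net inflows repay the initial investment."""
    return initial_investment / annual_net_inflow

# Hypothetical use case: 400k to build, 250k net benefit per year over 3 years.
use_case_roi = roi(total_benefit=250_000 * 3, total_cost=400_000)  # (750k - 400k) / 400k = 0.875
use_case_payback = payback_period_years(400_000, 250_000)          # 400k / 250k = 1.6 years
```

Real DVR documents would typically discount future inflows (NPV-based payback) rather than use the simple undiscounted form shown here.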

Level 3: Activated
- The Entity is implementing its DVR plan through use cases for revenue generation and cost optimization.
- Roles and responsibilities for the DVR use case implementations are activated in line with the Data Management Organization structure.
- The Entity is maintaining the DVR use cases.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:
- The Report on DVR Monitoring & Maintenance. (DVR.3.1)

Level 4: Managed
- The Entity is monitoring the efficiency of the plan implementation and the DVR activities with pre-defined Key Performance Indicators (KPIs), in line with the "DM and PDP Standards" document.

Acceptance Evidence: All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the DVR Use Cases with Pre-defined KPIs. (DVR.4.1 (1, 2, 6, 7))
- The Monitoring Report of the DVR Activities & Plan with Pre-defined KPIs.

Level 5: Pioneer
- The DVR plan implementation is regularly reviewed and optimized to ensure continuous improvement.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:
- The Review & Continuous Improvement Document of the DVR Plan.
- The Revised & Updated DVR KPIs.
- Evidence of New Partnerships (e.g., MoU, jointly developed use cases or products, etc.).

Maturity Questions – Data Value Realization Domain

DVR.MQ.2: Has the Entity implemented practices to support a Data revenue generation process?

Level 0: Absence of Capabilities
- No processes are in place for use cases of either Data revenue generation or cost optimization.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
- The Entity has not implemented a formalized process for Data revenue generation; instead, this is done on a reactive or ad hoc basis.

Acceptance Evidence:
- The Existing Practices Related to Supporting the Data Revenue Generation Process.

Level 2: Defined
- The Entity has a defined, documented, and approved process for Data revenue generation covering the details of:
  • The selection of a pricing schema model.
  • The calculation of the total cost.
  • Aligning the adopted charging model with the business needs.

Acceptance Evidence:
- The Defined Pricing Scheme. (DVR.2.1)
- The Data or Data Product Price Calculation. (DVR.2.2)
- The Adopted / Approved Charging Model for Each Data or Data Product. (DVR.2.3)
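One common way to satisfy a price-calculation requirement like DVR.2.2 is cost-plus pricing: sum the cost components of producing the Data product, then apply a margin. The cost components and margin below are hypothetical, and the framework itself does not mandate this scheme:

```python
def data_product_price(cost_components, margin=0.20):
    """Cost-plus pricing: total cost of the Data product plus a margin."""
    total_cost = sum(cost_components.values())
    return round(total_cost * (1 + margin), 2)

# Hypothetical annual cost components for one Data product.
costs = {
    "storage":    12_000,
    "compute":    30_000,
    "staff_time": 58_000,
}
price = data_product_price(costs)  # (100_000) * 1.2 = 120000.0
```

The chosen charging model (per-request, subscription, tiered, etc.) then determines how this total is split across consumers.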

Level 3: Activated
- The Entity is implementing the defined process of Data revenue generation.
- For each Data revenue generation request, the Entity submits the required documentation to NDMO-SDAIA.
- For each Data or Data Product that the Entity is expecting to generate revenue from, the Entity submits a revenue generation request to NDMO-SDAIA.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:
- Evidence of the Revenue Generation Requests Submitted to NDMO-SDAIA. (DVR.2.4)

Level 4: Managed
- The Entity is monitoring the efficiency of the Data revenue generation process with pre-defined KPIs.

Acceptance Evidence: All level 3 acceptance evidence requirements, including:
- The Monitoring Report of the Data Revenue Generation Process with Pre-defined KPIs. (DVR.4.1 (3, 4, 5))

Level 5: Pioneer
- The revenue generation process is regularly reviewed and optimized to ensure continuous improvement.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the Data Revenue Generation Process.

8.1.11. Open Data Domain

Maturity Questions – Open Data Domain

OD.MQ.1: Has the Entity defined, established, and implemented a plan to identify and coordinate the publishing of its Open Datasets?

Level 0: Absence of Capabilities
- No Open Data (OD) plan is in place.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing
- The Entity has existing practices to maintain OD, but these practices are not formalized.

Acceptance Evidence:
- The existing list of Open Datasets.

Level 2: Defined
- The Entity has defined an OD plan and framework.
- An OD Management structure is in place and a change management plan has been developed to conduct training on OD activities and awareness campaigns.

Acceptance Evidence:
- The Approved Open Data Framework.
- The Approved Open Data Plan. (OD.1.1)
- The OD Management Structure.
- The Developed and Approved Plan for Change Management, including:
  • The OD Training Plan.
  • The OD Awareness Campaigns Plan. (OD.2.1)

Level 3: Activated
- The Entity is implementing the activities in the defined Open Data plan, including training & awareness programs.
- An annual report on the Open Data plan & progress is submitted to NDMO-SDAIA.
- The Entity appointed the following roles in line with the Data Management Organization:
  • Open Data & Information Access Officer (ODIAO) to lead the Open Data activities within the Entity.
  • Business Data Executive (BDE).
  • Business Data Steward.

Acceptance Evidence: All level 2 acceptance evidence requirements, including:
- The Open Data Plan Implementation Status Report.
- Evidence of Submission of the Annual Compliance Report to NDMO-SDAIA.
- The Assignment Decisions / Appointees to Job Roles.
- Evidence of Implementation of the Change Management Program (the conducted training courses and the launched awareness campaigns related to Open Data).

Level 4: Managed
- The Entity is monitoring the effectiveness of the Open Data activities plan with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence: All level 3 acceptance evidence requirements, including:
- The Monitoring Report on the Effectiveness of the Open Data Plan through Pre-defined Key Performance Indicators (KPIs). (OD.4.1)

Level 5: Pioneer
- The Open Data plan is revised & optimized for continuous improvement.

Acceptance Evidence: All level 4 acceptance evidence requirements, including:
- The Periodic Reviews and Improvements of the Open Data Plan.

Maturity Questions – Open Data Domain

OD.MQ.2 Has the Entity defined, established, and implemented a process to support the identification of Open Data (OD)?

Level 0: Absence of Capabilities

Level Description:
- No existing process is in place to identify Open Data (OD) within the Entity.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing

Level Description:
- Open Data identification practices are performed on a reactive or ad hoc basis, without formalized OD practices.

Acceptance Evidence:
- The Existing Practices for OD Identification.

Level 2: Defined

Level Description:
- The Entity has developed and documented the processes required to manage the lifecycle of Open Data.
- In alignment with the Open Data Policy, the Entity defined a process to identify public Datasets to be published, including a mechanism to evaluate all Data classified as "Public Data" to determine whether it should be published as OD or not.

Acceptance Evidence:
- The Defined Process Documentation for managing the lifecycle of Open Data. (Related Specification: OD.3.1)
- The Defined Process Documentation for Identifying Open Data.
- The process of evaluating the value and impact of open or public datasets.

Level 3: Activated

Level Description:
- The Entity is implementing the defined processes for identifying the Datasets to be published.
- The Entity is evaluating the identified datasets in terms of value whilst also conducting a risk assessment to ensure there will be no security or privacy threats when published.
- The identified Open Datasets are being aligned with the regulations published by NDMO-SDAIA.
- The Entity has identified & documented metadata of the Entity's open datasets.

Acceptance Evidence:
All level 2 acceptance evidence requirements including:
- The OD Identification Process Implementation Status Report.
- The list of Identified Open Datasets with the assigned priorities. (Related Specification: OD.3.2)
- The Identified & Documented Metadata for the Open Datasets. (Related Specification: OD.3.4)
- Value and Impact Assessment Report for Identified Open or Public Datasets. (Related Specification: OD.3.2)

Level 4: Managed

Level Description:
- The Entity is monitoring the effectiveness of the implemented processes of identifying Open Datasets with pre-defined KPIs.

Acceptance Evidence:
All level 3 acceptance evidence requirements including:
- The Monitoring Report of OD Identification & Prioritization Processes with Pre-defined KPIs. (Related Specification: OD.4.1)

Level 5: Pioneer

Level Description:
- The Entity is conducting continuous improvement, enabled by automated processes to support OD identification.

Acceptance Evidence:
All level 4 acceptance evidence requirements including:
- The Continuous Improvement Report Showing the Documented Periodic Reviews & Outcomes of the OD Identification Processes and the Implemented Automation.

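Level 3 requires identified and documented metadata for each open dataset (OD.3.4), without prescribing a schema. A minimal sketch of such a record and a completeness check — every field name here is an illustrative assumption, not an NDMO requirement:

```python
# Hypothetical minimal metadata record for one open dataset.
# Field names are illustrative assumptions, not an NDMO-mandated schema.
dataset_metadata = {
    "title": "Public Parks Locations",
    "description": "Geolocations of public parks maintained by the Entity.",
    "classification": "Public",       # must be Public before OD publication
    "format": "CSV",                  # machine-readable format
    "license": "KSA Open Data License",
    "update_frequency": "Quarterly",
    "last_updated": "2023-10-01",
    "version": "1.2",
    "owner": "Business Data Steward",
}

def validate_metadata(record: dict) -> list[str]:
    """Return the sorted list of required fields missing from a record."""
    required = {"title", "description", "classification",
                "format", "license", "update_frequency", "version"}
    return sorted(required - record.keys())
```

A documented record passing such a check would be one way of evidencing that the minimum metadata exists for every identified dataset.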
Maturity Questions – Open Data Domain

OD.MQ.3 Has the Entity defined, established, and implemented a process to support publishing its Open Datasets?

Level 0: Absence of Capabilities

Level Description:
- No existing processes are in place for managing Open Datasets within the Entity.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing

Level Description:
- Open Data (OD) is shared on a reactive or ad hoc basis.

Acceptance Evidence:
- The Existing Practices for Publishing Open Datasets.

Level 2: Defined

Level Description:
- The Entity has defined a process to manage the publishing of Open Datasets as part of the documented Open Data processes.
- The Entity has defined a process to regularly review, update and document changes to its published Open Datasets and associated metadata to ensure they meet defined regulatory requirements.

Acceptance Evidence:
- The Defined Process Documentation for Publishing Open Data.
- The Defined Process Documentation for Open Data Maintenance.

Level 3: Activated

Level Description:
- The Entity is implementing the defined process for publishing Open Data by collaborating with the National Information Center (NIC) in SDAIA to publish the Open Datasets on the Saudi Open Data Portal under the KSA Open Data License.
- The Entity utilizes standardized formats when publishing its datasets so that they are in machine-readable form.
- The Entity ensures that the open datasets are published in high quality to ensure fitness for use.
- The Entity is updating the published open datasets and the associated Metadata, and maintaining data traceability and versioning history of the open datasets.
- The Entity is documenting its identified open datasets and activities in a register.

Acceptance Evidence:
All level 2 acceptance evidence requirements including:
- Status Report on the Implementation of the Open Data Publishing Process.
- Evidence of Published Datasets on the Saudi Open Data Portal. (Related Specification: OD.3.3)
- Evidence of Feedback / Comments Received on OD.
- Evidence of formats used to standardize open datasets in machine-readable form. (Related Specification: OD.3.5)
- Evidence of data standards applied on open datasets to ensure high data quality.
- Open Data Maintenance Report. (Related Specification: OD.3.6)
- Open Data Register containing Records of Open Data Activities and Published Open Datasets. (Related Specification: OD.5.1)

Level 4: Managed

Level Description:
- The Entity is monitoring the effectiveness of the Open Data processes with pre-defined KPIs, covering:
  1. The number of downloads per published Open Dataset.
  2. The number of defined, identified and prioritized Open Datasets.
  3. The number of identified Open Datasets that have been published.
  4. The number of updates performed on published Open Datasets.

Acceptance Evidence:
All level 3 acceptance evidence requirements including:
- The Monitoring Report of the OD Publishing Process with Pre-defined KPIs. (Related Specification: OD.4.1)

Level 5: Pioneer

Level Description:
- The Entity is conducting continuous improvement to support the defined OD publishing processes to ensure optimization.
- The published Open Datasets and the associated Metadata are regularly reviewed and updated to the newest version.
- The OD and the associated Metadata changes are documented where necessary.

Acceptance Evidence:
All level 4 acceptance evidence requirements including:
- The Continuous Improvement Report of the OD Publication & Maintenance Practices.

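The four Level 4 KPIs can all be derived mechanically from an open-data register. A hedged sketch, assuming a simple register layout (the field names are illustrative, not taken from the Index):

```python
# Illustrative open-data register; field names are assumptions.
register = [
    {"name": "parks",   "status": "published",  "downloads": 120, "updates": 3},
    {"name": "schools", "status": "published",  "downloads": 45,  "updates": 1},
    {"name": "budgets", "status": "identified", "downloads": 0,   "updates": 0},
]

def od_publishing_kpis(reg):
    """Compute the four Level-4 KPIs from a register of identified datasets."""
    published = [d for d in reg if d["status"] == "published"]
    return {
        "downloads_per_published_dataset": {d["name"]: d["downloads"] for d in published},
        "identified_datasets": len(reg),
        "published_datasets": len(published),
        "updates_on_published_datasets": sum(d["updates"] for d in published),
    }
```

Keeping the register in a structured form like this is what makes the Level 4 monitoring report reproducible rather than hand-counted.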
8.1.12. Freedom of Information Domain

Maturity Questions – Freedom of Information Domain

FOI.MQ.1 Has the Entity defined and established a plan to address its compliance with the requirements of the Freedom of Information (FOI) Regulations?

Level 0: Absence of Capabilities

Level Description:
- No plan is in place to address the Freedom of Information (FOI) regulations within the Entity.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing

Level Description:
- The Entity has existing FOI practices in place to address its compliance with the FOI Regulations, without formalized practices.

Acceptance Evidence:
- The Existing FOI Practices.

Level 2: Defined

Level Description:
- The Entity has a defined and approved FOI plan.
- A roadmap is defined with a list of activities, required resources, training courses & awareness campaigns, and a budget to manage the FOI plan implementation.

Acceptance Evidence:
- The Defined & Approved FOI Implementation Plan & Roadmap. (Related Specification: FOI.1.1)

Level 3: Activated

Level Description:
- The Entity is implementing the defined plan to manage FOI requests.
- The Entity is publishing FOI information on its official Government website with a feedback mechanism for questions or issues raised.
- The Entity appointed an Open Data & Information Access Officer (ODIAO) to manage the Entity's compliance with the FOI regulations in line with the "Data Management Organization" structure.
- The Entity launched awareness campaigns to promote and enhance the culture of transparency and to raise awareness of the FOI regulations published by NDMO-SDAIA.

Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- The FOI Plan Implementation Status Report.
- The Assigned Open Data & Information Access Officer (ODIAO).
- FOI Awareness. (Related Specification: FOI.2.1)

Level 4: Managed

Level Description:
- To comply with the FOI Regulations, the Entity is monitoring the effectiveness of the established plan with pre-defined Key Performance Indicators (KPIs).
- The Entity is conducting periodic audit and review processes on its compliance with the FOI regulations published by NDMO-SDAIA.

Acceptance Evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report on the Entity's FOI Plan & Activities with Pre-defined Key Performance Indicators (KPIs).
- The Internal audit reports on the Entity's Compliance with the FOI Regulations. (Related Specification: FOI.3.6)

Level 5: Pioneer

Level Description:
- The Entity's FOI plans and practices are continuously reviewed and optimized.

Acceptance Evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report of the FOI Plan.

Maturity Questions – Freedom of Information Domain

FOI.MQ.2 Has the Entity defined and implemented the required processes for Freedom of Information (FOI)?

Level 0: Absence of Capabilities

Level Description:
- No existing practices are in place for Freedom of Information (FOI) within the Entity.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing

Level Description:
- The Entity responds to FOI requests on a reactive and ad hoc basis.

Acceptance Evidence:
- The Existing Processes for the FOI Requests & Responses.

Level 2: Defined

Level Description:
- The Entity has defined processes to manage FOI requests (including response processes and appeal / denial processes) in alignment with the National Data Governance Regulations.
- The Entity has developed end-to-end processes, steps and Frequently Asked Questions (FAQs) for acquiring information from the official Gov portal / website.

Acceptance Evidence:
- The Developed & Approved FOI Request Processes & Procedures Documentation. (Related Specification: FOI.3.1)
- The Developed FOI Process Guide & FAQs.

Level 3: Activated

Level Description:
- The Entity is implementing the defined processes and practices for FOI Regulatory Compliance to ensure adoption across the Entity.
- The Entity is publishing, on the official portal / website, the end-to-end process guide and the FAQs required for acquiring the information.
- An FOI Request Management process is activated across the Entity, including access and appeal / denial practices.
- The Entity has a record-keeping system through which the FOI Register is regularly updated.
- The Entity adopted a pricing scheme for Public Information Access Requests.

Acceptance Evidence:
All level 2 acceptance evidence requirements, including:
- The Implementation / Adoption Status Report on the FOI Request Processes. (Related Specification: FOI.3.2)
- Evidence of Entity-wide Communication. (Related Specification: FOI.3.3)
- The Register of the Received Request Forms with the Responses. (Related Specification: FOI.3.4)
- The Identified Public Datasets Shared Under the FOI Regulations.
- Evidence of the Published FOI Communications Including Guidelines & FAQs on the Entity's Official Gov Website in Line with the NDMO Requirements. (Related Specification: FOI.3.3)
- The Pricing Scheme for Public Information Access Requests. (Related Specification: FOI.3.5)
- The Up-to-date FOI Register. (Related Specification: FOI.4.1)

Level 4: Managed

Level Description:
- The Entity is monitoring the completeness and performance of the FOI processes with pre-defined KPIs.

Acceptance Evidence:
All level 3 acceptance evidence requirements, including:
- The Monitoring Report with Pre-defined KPIs for the Entity's Responses to FOI Requests.

Level 5: Pioneer

Level Description:
- The Entity continuously reviews & optimizes FOI processes & practices.
- The FOI request responding process is automated.

Acceptance Evidence:
All level 4 acceptance evidence requirements, including:
- The Continuous Improvement Report on the FOI Processes.
- The Automated Tool for FOI Requests.

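The register of received request forms and responses (FOI.3.4) and the up-to-date FOI Register (FOI.4.1) amount to basic record keeping over the request lifecycle, including appeal / denial outcomes. A sketch under assumed field names and statuses (nothing below is prescribed by the Regulations):

```python
from datetime import date

# Illustrative FOI register; statuses and fields are assumptions.
foi_register = []

def log_foi_request(request_id, subject, received):
    """Record a newly received FOI request form."""
    foi_register.append({
        "id": request_id, "subject": subject,
        "received": received, "status": "open", "response": None,
    })

def respond(request_id, outcome, response):
    """Record a response; outcome might be 'granted', 'denied', or 'appealed'."""
    for entry in foi_register:
        if entry["id"] == request_id:
            entry["status"] = outcome
            entry["response"] = response
            return entry
    raise KeyError(request_id)

log_foi_request("FOI-001", "Road maintenance contracts", date(2023, 10, 2))
respond("FOI-001", "granted", "Records provided under the FOI Regulations.")
```

An up-to-date register of this shape is also what the Level 4 KPIs on response completeness would be computed from.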
8.1.13. Data Classification Domain

Maturity Questions – Data Classification Domain

DC.MQ.1 Has the Entity established a plan for Data Classification (DC) as stipulated by the Data Management and Personal Data Protection (DM & PDP) Standards?

Level 0: Absence of Capabilities

Level Description:
- No plan for Data Classification (DC) is in place.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing

Level Description:
- The Entity has existing DC practices which are not formalized.

Acceptance Evidence:
- The Existing DC Practices.

Level 2: Defined

Level Description:
- The Entity defined a plan for DC to manage and orchestrate its DC activities.

Acceptance Evidence:
- The Defined & Approved Data Classification Plan. (Related Specification: DC.1.1)

Level 3: Activated

Level Description:
- The Entity has implemented the defined DC plan and roadmap on all approved Datasets / artifacts.

Acceptance Evidence:
All level 2 acceptance evidence requirements including:
- The Data Classification Implementation Plan Status Report.

Level 4: Managed

Level Description:
- The Entity is monitoring the effectiveness of the DC plan & activities with the pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence:
All level 3 acceptance evidence requirements including:
- The Implementation Monitoring Report of the DC Plan & Activities with Pre-defined KPIs.

Level 5: Pioneer

Level Description:
- The DC plan & activities are continuously reviewed & updated for optimization & improvement.

Acceptance Evidence:
All level 4 acceptance evidence requirements including:
- The Data Classification Plan Review Report.

Maturity Questions – Data Classification Domain

DC.MQ.2 Has the Entity defined, identified and implemented the required Data Classification (DC) processes?

Level 0: Absence of Capabilities

Level Description:
- No existing processes for Data Classification (DC) are in place within the Entity.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing

Level Description:
- Existing DC activities are performed on a reactive or ad hoc basis.

Acceptance Evidence:
- The Current List of Classified Datasets.

Level 2: Defined

Level Description:
- The Entity developed a prioritization framework to classify its Datasets.
- The Entity developed a framework for conducting the impact assessment.
- The Entity defined Data Handling and Data Protection Controls for the Datasets and artifacts.
- The Entity defined & identified the DC levels in line with the DM & PDP Standards.
- The Entity defined a process for the identification and inventory of its Datasets and artifacts.

Acceptance Evidence:
- Data Classification Policy.
- The Data Handling and Protection Controls. (Related Specification: DC.2.1)

Level 3: Activated

Level Description:
- The Entity is identifying and maintaining an inventory of all datasets and artifacts owned by the Entity.
- The Entity is conducting prioritization based on the identified Datasets.
- The Entity is conducting an impact analysis to assess any potential damage of unauthorized access to its identified Datasets and assigning the defined DC levels in line with the DM & PDP Standards.
- The Entity is conducting an impact analysis for data classified as low impact.
- The Entity is implementing the defined Data Handling and Protection Controls based on the classification level for the Datasets / artifacts.
- The Entity is assigning Data access rights and privileges across the Entity.
- The Entity is maintaining a register of all identified datasets and artifacts.

Acceptance Evidence:
All level 2 acceptance evidence requirements including:
- The Inventory Report of the Identified Datasets and Artifacts. (Related Specification: DC.3.1)
- The Prioritized Datasets and Artifacts. (Related Specification: DC.1.2)
- Evidence of Utilization of the Data Catalog Tool for the Data Inventory.
- The Impact Assessment Report. (Related Specification: DC.3.2)
- The Assessment Report of Low-Impact Data. (Related Specification: DC.3.3)
- The Approved Data Access List of Users with the Assigned Privileges.
- The Data Register. (Related Specification: DC.5.1)

Level 4: Managed

Level Description:
- The Entity is monitoring the effectiveness of the DC processes with pre-defined KPIs.

Acceptance Evidence:
All level 3 acceptance evidence requirements including:
- The Monitoring Report of the DC Processes with Pre-defined KPIs. (Related Specification: DC.4.1)

Level 5: Pioneer

Level Description:
- The Entity is continuously improving the process for DC.
- Innovative solutions and tools are explored to proactively scan and prioritize the classification of Data.

Acceptance Evidence:
All level 4 acceptance evidence requirements including:
- The Data Classification Automation Tool.

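Level 3 ties the assigned DC level to an impact analysis of unauthorized access. A toy sketch of that mapping — the level names and impact categories below are illustrative assumptions, not the normative scheme from the DM & PDP Standards:

```python
# Toy mapping from assessed impact of unauthorized access to a
# classification level. Level names and impact categories are
# illustrative assumptions, not the normative DM & PDP scheme.
IMPACT_TO_LEVEL = {
    "exceptional": "Top Secret",
    "serious": "Secret",
    "limited": "Restricted",
    "none": "Public",
}

def classify(dataset_name: str, assessed_impact: str) -> dict:
    """Assign a classification level based on the impact assessment outcome."""
    level = IMPACT_TO_LEVEL[assessed_impact]
    return {"dataset": dataset_name, "impact": assessed_impact, "level": level}
```

The point of the sketch is only that classification becomes auditable once the impact category, not the level, is the primary input recorded in the register.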
Maturity Questions – Data Classification Domain

DC.MQ.3 Has the Entity reviewed all its classified Datasets and artifacts to ensure that the classification levels assigned to them are the most appropriate ones, as specified by the Data Classification (DC) Policies?

Level 0: Absence of Capabilities

Level Description:
- No existing Data Classification (DC) review processes are in place.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing

Level Description:
- Existing DC review activities are on a reactive or ad hoc basis.

Acceptance Evidence:
- The Current Practices of DC Reviews.

Level 2: Defined

Level Description:
- The Entity defined a review mechanism for all classified Datasets and artifacts to be reviewed.

Acceptance Evidence:
- The DC Review Mechanism.

Level 3: Activated

Level Description:
- The Entity is implementing the defined mechanism for DC reviews of the classified Datasets and artifacts across the Entity.
- The Entity is publishing on the Data Catalog the DC levels for the reviewed Datasets and artifacts as Metadata.

Acceptance Evidence:
All level 2 acceptance evidence requirements including:
- The Data Classification Review Report. (Related Specification: DC.3.4)
- An Evidence Document of the Published Classification Levels as Metadata. (Related Specification: DC.3.5)

Level 4: Managed

Level Description:
- The Entity is monitoring the DC review mechanism with pre-defined KPIs.

Acceptance Evidence:
All level 3 acceptance evidence requirements including:
- The Monitoring Report of the DC Review Mechanism with Pre-defined KPIs.

Level 5: Pioneer

Level Description:
- The Entity is continuously improving its DC review process to ensure optimization.

Acceptance Evidence:
All level 4 acceptance evidence requirements including:
- The DC Review Mechanism Review Report.

8.1.14. Personal Data Protection Domain

Maturity Questions – Personal Data Protection Domain

PDP.MQ.1 Has the Entity performed an initial Personal Data Protection (PDP) assessment and developed a plan to address the strategic and operational Privacy requirements?

Level 0: Absence of Capabilities

Level Description:
- No assessment is done for Personal Data Protection (PDP).
- No plan is in place to address Data Privacy requirements.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing

Level Description:
- The Entity is aware of the requirements for Data Privacy Management.
- The Entity conducted an initial PDP Assessment to evaluate the current state of the Entity's PDP practices and identified gaps against business and regulatory obligations, including the following:
  1. Identification of the types of personal data being collected.
  2. Location & method of personal data storage.
  3. Current processing & uses of the personal data.
  4. Privacy challenges to meet compliance with the PDP Regulations published by NDMO-SDAIA.

Acceptance Evidence:
- Evidence of the existing practices of the PDP Domain and Data Privacy.
- The Initial PDP Assessment Result. (Related Specification: PDP.1.1)

Level 2: Defined

Level Description:
- The Entity developed a plan to address the Strategic and Operational Data-Privacy requirements, including a list of activities, required resources, training courses and a budget.

Acceptance Evidence:
All level 1 acceptance evidence requirements including:
- The Approved PDP Implementation Plan. (Related Specification: PDP.1.2)
- The PDP Training Plan.

Level 3: Activated

Level Description:
- The Entity is implementing the defined PDP & Data Privacy program in a formalized manner.
- This program includes implementing activities which ensure the proactive identification and resolution of potential Privacy issues and risks, including training and awareness for all employees to promote a Personal Data Protection-centric culture.

Acceptance Evidence:
All level 2 acceptance evidence requirements including:
- The PDP Plan Implementation Status Report.
- Evidence of PDP training activities conducted. (Related Specification: PDP.2.1)

Level 4: Managed

Level Description:
- The Entity is monitoring the effectiveness of the PDP plan with pre-defined Key Performance Indicators (KPIs).

Acceptance Evidence:
All level 3 acceptance evidence requirements including:
- The Monitoring Report for the PDP Plan with Pre-defined KPIs.

Level 5: Pioneer

Level Description:
- The Entity periodically reviews its PDP plan to ensure sustained compliance with the applicable regulations and other environmental requirements or impacts.

Acceptance Evidence:
All level 4 acceptance evidence requirements including:
- The Review and Continuous Improvement Report of the Personal Data Protection Plan.

Maturity Questions – Personal Data Protection Domain

PDP.MQ.2 Has the Entity defined and implemented Privacy policies and processes for Personal Data, including Data breach identification, consent management, Data subject rights, and Privacy risk assessments?

Level 0: Absence of Capabilities

Level Description:
- No practices are in place to identify and address Data breaches or Privacy violations.

Acceptance Evidence:
- Not Applicable.

Level 1: Establishing

Level Description:
- The Entity identifies Data breaches or Privacy violations on a reactive basis, without standardized practices or policies.
- The Entity addresses violations when ad hoc Data Privacy interventions occur.

Acceptance Evidence:
- Evidence of the existing initiatives for PDP and Data Privacy.

Level 2: Defined

Level Description:
- The Entity has developed policies and processes for PDP and Data Privacy including the identification of Data breaches, Data Privacy considerations, privacy notice & consent management, and compliance management to comply with the applicable regulations.
- Governance has been defined to manage compliance with the PDP regulations in line with the Data Management Organization structure.
- The Entity has defined a process for Data Subjects' rights management.

Acceptance Evidence:
- The Documented Data Breach Notifications Process.
- The Documented Data Breach Management Process. (Related Specification: PDP.3.2)
- Entity-Specific PDP Policies.
- The PDP & Data Privacy Notice and the Consent Management Process. (Related Specification: PDP.4.1)
- The Data Subjects' Rights Management Processes. (Related Specification: PDP.4.2)

Level 3: Activated

Level Description:
- The Entity is implementing and standardizing the developed and approved practices for PDP, including the Consent Management Workflow, notifications to the regulatory authority as required, and data breach management.
- The PDP policies and standardized processes for data subjects' rights have been published on the Entity's official government website with a feedback mechanism for questions or issues raised by the data subjects.
- The Entity maintains a register of its compliance records including records of any collection and / or processing of any Personal Data (e.g. identified Data breaches, etc.). The register is made available to regulators as required.

Acceptance Evidence:
All level 2 acceptance evidence requirements including:
- The Developed & Adopted Consent Management Workflow. (Related Specification: PDP.4.1)
- Evidence of Notifications Sent to the Regulatory Authority within the Allotted Timeframe. (Related Specification: PDP.3.1)
- Evidence of Data Breach Management Including Identified Data Breaches.
- The Results of the PDP Risk Assessments. (Related Specification: PDP.4.3)
- Evidence of Published Data Subjects' Rights Management Processes and Feedback Received from Data Subjects. (Related Specification: PDP.4.2)
- The PDP Register. (Related Specification: PDP.5.1)

Level 4: Managed

Level Description:
- The Entity is monitoring the effectiveness of its PDP practices and compliance with the rules & regulations through pre-defined KPIs.
- The Entity is conducting periodic audits on compliance with PDP rules & regulations.

Acceptance Evidence:
All level 3 acceptance evidence requirements including:
- The Monitoring Report for the PDP & Data Privacy Practices with Pre-defined KPIs.
- The Compliance Monitoring Report & Audit Results. (Related Specification: PDP.4.4)

Level 5: Pioneer

Level Description:
- The Entity continuously reviews its PDP practices to ensure sustained compliance with regulations.
- Continuous improvement, enabled by automation & change management, is performed in line with the defined regulatory requirements.

Acceptance Evidence:
All level 4 acceptance evidence requirements including:
- The Documented Periodic Reviews & Outcomes for the PDP & Data Privacy Practices.
- Evidence of Automation & Change Management for PDP.

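PDP.3.1 requires evidence that breach notifications were sent to the regulatory authority within the allotted timeframe. A sketch of a deadline check; the 72-hour window is purely an assumed placeholder, since the actual timeframe is whatever the PDP Regulations published by NDMO-SDAIA specify:

```python
from datetime import datetime, timedelta

# Assumed notification window for illustration only; the real deadline
# is set by the PDP Regulations, not by this constant.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_due(breach_detected_at: datetime) -> datetime:
    """Deadline for notifying the regulatory authority of a data breach."""
    return breach_detected_at + NOTIFICATION_WINDOW

def notified_in_time(detected_at: datetime, notified_at: datetime) -> bool:
    """True if the notification timestamp met the deadline."""
    return notified_at <= notification_due(detected_at)
```

Recording both timestamps in the PDP Register is what lets this check, and the Level 4 compliance KPIs built on it, be evidenced rather than asserted.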
8.2. Appendix II – Acceptance Evidence Checklists
The acceptance evidence items outlined in Appendix I are detailed into acceptance criteria in this section. This streamlines the assessment journey and ensures coverage of all aspects required for a systematic evaluation.

8.2.1. Data Governance Domain

Checklist – Data Governance Domain

DG.MQ.1 Has the Entity established & implemented a Data Management & Personal Data Protection (DM & PDP) Strategy and a DM & PDP Plan with Key Performance Indicators (KPIs) that can be continuously measured to ensure optimization?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities
- Not Applicable. - Not Applicable.

Level 1: Establishing
- Existing data-related practices. - Attach a report showing current data practices.

Level 2: Defined
- The approved DM & PDP Strategy. - Attach the Data Management and Personal Data Protection (DM & PDP) Strategy, which must be approved & include the following, as a minimum:

• Current challenges in DM.

• The Strategic requirements including:

• Internal requirements emanating from the Entity's business strategy.

• External requirements emanating from the National Strategy for DM & PDP.

• The DM & PDP Vision, Mission, and Strategic Objectives.

• Strategic and operational performance indicators with targets extending from 3 to 5 years over the strategy’s implementation duration.

• The financial budget required to implement the strategy, divided according to identified initiatives.

- The DM guiding principles. - Attach a copy of the DM & PDP program guidelines, which may be part of the Entity's
DM & PDP Strategy document or a separate document. The guidelines must refer to:

• The principles underlying the culture of DM & Data processing, to enable and spread a unified DM & PDP concept within the Entity itself.


- Data Strategy Approval Decision. - Attach a copy of the entity-specific data strategy approval decision by the Entity’s Data Management Committee and/or other related senior-level executives within the Entity.

Level 3: Activated
- The developed DM & PDP implementation plan. - Attach the implementation plan, which must include, as a minimum:
• Identified initiatives that cover all the DM domains.

• A prioritized list of DM initiatives with descriptions.

• A three-year Implementation roadmap to close the gaps identified between the current state and the target state.

- Implementation status report. - Attach a report on the implementation status including, as a minimum:
• The achievement percentages of the initiatives and projects included in
the DM & PDP implementation plan.

Level 4: Managed
- The monitoring report of the DM & PDP strategy & plan implementation with the pre-defined KPIs. - The report must be prepared based on the data of the Key Performance Indicators (KPIs) (Indicator Cards) which were pre-defined in the DM & PDP strategic plan; each indicator’s data or card should include the following, as a minimum:
• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic/operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

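The indicator-card bullets above describe a fixed-field record, so they map naturally onto a structured template. A sketch — the dataclass layout and the `on_track` helper are assumptions for illustration, not part of the Index:

```python
from dataclasses import dataclass

@dataclass
class IndicatorCard:
    """One KPI 'indicator card' with the minimum fields listed above."""
    code: str
    name: str
    owner: str
    coordinator: str
    description: str
    objective: str            # strategic/operational objective measured
    equation: str
    unit: str                 # Percentage, Number / Quantity, etc.
    baseline: float           # value in the first measurement year
    target: float
    periodicity: str          # Monthly / Quarterly / Biannually / Annually
    data_sources: str
    collection_mechanism: str
    polarity: str             # "+" higher is better, "-" lower is better

    def on_track(self, current: float) -> bool:
        """Whether the current measurement meets the target, given polarity."""
        return current >= self.target if self.polarity == "+" else current <= self.target
```

Keeping each card in a fielded form like this makes the Level 4 monitoring report a straightforward roll-up over the cards instead of a hand-assembled document.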

Level 5: Pioneer
- The Continuous Improvement Report of the DM & PDP Strategy. - Attach a report showing that the Entity identified, implemented and has been monitoring the continuous improvement mechanisms for the DM & PDP Strategy.
- The report shall include the following:

• The updated DM & PDP strategy.

• The updated DG practices that include the Data Management Organization, the roles, the processes, & the technologies.

• The continuous improvement mechanisms for all DM Domains.

- The Continuous Improvement Report of the DM & PDP Plan. - Attach a report showing that the Entity identified, implemented and has been monitoring the continuous improvement mechanisms for the DM & PDP Plan.
- The report shall include the following:

• The documented periodic DM & PDP reviews & results.

• Documented changes to the initial approved plan where applicable.

Checklist – Data Governance Domain

DG.MQ.2 Has the Entity established and implemented Data Management (DM) Policies, Standards and Guidelines across all Data Management (DM) Domains?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities
- Not Applicable. - Not Applicable.

Level 1: - The Data Management & - The gap analysis document shall include the following as a minimum:
Establishing Personal Data Protection
(DM & PDP) Policies, • An analysis of DM & PDP Standards and Guidelines established by the
Controls and Guidelines National Data Management Office (NDMO).
Gap Analysis document.
• Identifying & analyzing all Data-related policies & controls published by
the Oversight Entities & the regulator(s) of the sector to which the Entity
belongs.

• An analysis of the Entity's internal requirements for DM & PDP Controls.

• Recommendations and suggestions to close the identified gaps, as well as a specific list of policies that the Entity will develop in line with the Policies published by NDMO-SDAIA.


• A plan to develop DM & PDP policies and controls, clearly presenting the
implementation timeline.

Level 2: Defined

- The developed DM and PDP policies, standards and guidelines covering all DM Domains as required.

- Every Policy’s document shall include the following, as a minimum:

• Policy Name.
• Release date.

• Version number.

• Document control (Preparation, review, approval).

• Version history.

• Terminology.

• Goal.

• Scope of work.

• Guiding Principles

• Policy Statement.

• Job roles & responsibilities.

• Related Policies.

• References.

• Policy Owner.

Level 3: Activated

- Implementation status report.

- Attach a report showing the implementation status of the developed DM and PDP policies, processes and standards; include, as a minimum:

• Implementation achievement percentages.

- A document proving the Entity’s approval & adoption of the developed policies, standards & guidelines.

- The submitted evidence must be approved or issued by the Entity’s authority holder.

- Approved Compliance Management Framework.

- Attach the Compliance Framework, detailing the following:

• The scope of the compliance audit’s periodic procedure.

• The processes needed to plan and perform the compliance audit procedure.

• The processes & tools needed for reporting compliance audit results.


• The processes & plans for remediating & escalating non-compliance cases.

Level 4: Managed

- The monitoring reports for the developed policies, processes and standards with pre-defined KPIs.

- The report must be prepared based on the data of the Key Performance Indicators (KPIs) (Indicator Cards) pre-defined to measure the Entity’s performance in monitoring the implementation status of the developed Policies & Standards. Each indicator’s data or card should include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic/operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).
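The indicator card fields listed above amount to a small record structure. The following Python sketch is purely illustrative — the class and field names are our own shorthand, not prescribed by the NDI — but it shows how an Entity might hold indicator cards in a machine-readable register:

```python
from dataclasses import dataclass
from enum import Enum


class Polarity(Enum):
    POSITIVE = "+"  # a higher indicator value is the target
    NEGATIVE = "-"  # a lower indicator value is the target


@dataclass
class IndicatorCard:
    """One KPI indicator card, covering the minimum fields listed above."""
    code: str                  # Indicator's Name / Code
    owner: str                 # Indicator's Owner
    coordinator: str           # Indicator's Coordinator
    description: str           # Indicator's Description
    objective: str             # Strategic/operational objective measured
    equation: str              # Indicator's Equation
    unit: str                  # Measurement Unit (e.g. "Percentage")
    baseline: float            # Value in the first measurement year
    target: float              # Target value
    periodicity: str           # Monthly / Quarterly / Biannually / Annually
    data_sources: list[str]    # Data sources used to calculate the indicator
    collection_mechanism: str  # Data collection mechanism
    polarity: Polarity         # Indicator's polarity (+/-)

    def on_track(self, current: float) -> bool:
        """Whether a measurement meets the target, respecting polarity."""
        if self.polarity is Polarity.POSITIVE:
            return current >= self.target
        return current <= self.target
```

A card with positive polarity and a target of 80% treats a measurement of 85% as on track; the same check inverts automatically for negative-polarity indicators such as issue-resolution duration.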

- The Entity’s Compliance Audit Results Report.

- With the compliance report, attach a report documenting the results & outputs of each procedure implemented to monitor compliance. The compliance report shall include the following, as a minimum:

• Compliance or Non-Compliance with each specification.

• An explanation of the compliance results, with sufficient evidence for each specification.

• Recommendations to remediate each instance of non-compliance.

• The accountable stakeholder for each recommendation, and the target date to complete the recommendation.
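The four reporting elements above map naturally onto a per-specification finding record. A minimal sketch follows — the structure and names are illustrative assumptions, not a mandated format:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class ComplianceFinding:
    """One audited specification in a Compliance Audit Results Report."""
    specification: str                    # the specification audited
    compliant: bool                       # compliance or non-compliance
    evidence: str                         # explanation with supporting evidence
    recommendation: Optional[str] = None  # remediation for non-compliance
    accountable: Optional[str] = None     # stakeholder owning the recommendation
    target_date: Optional[date] = None    # date to complete the recommendation

    def remediation_complete(self) -> bool:
        """A non-compliant finding is fully reported only when the
        recommendation, accountable stakeholder and target date are set."""
        if self.compliant:
            return True
        return all((self.recommendation, self.accountable, self.target_date))
```

A register of such records makes it trivial to flag non-compliant findings that still lack an owner or a target date before the report is submitted.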

- Compliance Monitoring Report.

- Attach a report(s) confirming that the Entity has established and monitored compliance points by implementing a periodic compliance audit procedure. To be accepted, the report must:

• Be compliant with the national framework for compliance in data management and governance.


• Include compliance audit scores generated by the periodic compliance audits.

Level 5: Pioneer

- Continuous Improvement Report of the updated DM and PDP policies, standards and updated DG processes.

- Attach a report showing that the Entity identified, implemented & is monitoring continuous improvement mechanisms, including, as a minimum:

• A report containing samples of the updated policies, standards & processes related to DM, DG & PDP.

Checklist – Data Governance Domain

DG.MQ.3 Has the Entity established and operationalized all roles required for the Data Management Organization as per the NDMO Controls & Specifications?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- The Entity’s Data Management Office establishment decision.

- Attach the decision to establish a Data Management Office to supervise the implementation of the national strategy for DM, DG & PDP.

- To accept the attachment, the following requirements must be met:

• The decision must be issued by the Entity’s authority holder.

• The Office’s responsibilities are documented in line with the “Organizational Manual”.

- The appointment/hiring decisions of the following roles:

A. Chief Data Officer (CDO) Hiring / Appointment Decision.

- To accept the attachment, the following requirements must be met:

• The decision must be issued by the Entity’s authority holder.

• The CDO’s responsibilities are clarified in an approved job description.

• The responsibilities are in line with the “Organizational Manual”.


- The appointment/hiring decisions of the following roles:

B. Data Management Officer / Data Governance Officer Hiring / Appointment Decision.

- To accept the attachment, the following requirements must be met:

• The decision must be issued by the Entity’s authority holder.

• The Data Management Officer / Data Governance Officer’s responsibilities are clarified in an approved job description.

• The responsibilities are in line with the “Organizational Manual”.

Level 2: Defined

- Entity Data Management & Data Governance Committee formation decision.

- To accept the attachment, the following requirements must be met:

• The decision must be issued by the Entity’s authority holder.

• The Committee’s Charter is included, covering:

• Roles and Responsibilities.

• Rules of work for the Committee.

- The Hiring / Appointment decisions of the following roles:

A. Compliance Officer Hiring / Appointment Decision.

- To accept the attachment, the following requirements must be met:

• The decision must be issued by the Entity’s authority holder.

• The Compliance Officer’s responsibilities are clarified in an approved job description.

• The responsibilities are in line with the “Organizational Manual”.

- The Hiring / Appointment decisions of the following roles:

B. Business Data Executive Hiring / Appointment Decision.

- To accept the attachment, the following requirements must be met:

• The decision must be issued by the Entity’s authority holder.

• The Business Data Executive’s responsibilities are clarified in an approved job description.

• The responsibilities are in line with the “Organizational Manual”.

- The Hiring / Appointment decisions of the following roles:

C. Legal Advisor Hiring / Appointment Decision.

- To accept the attachment, the following requirements must be met:

• The decision must be issued by the Entity’s authority holder.

• The Legal Advisor’s responsibilities are clarified in an approved job description.

• The responsibilities are in line with the “Organizational Manual”.


Level 3: Activated

- The Hiring / Appointment decisions of the following roles:

A. Business Data Steward(s) Hiring / Appointment Decision.

- To accept the attachment, the following requirements must be met:

• The decision must be issued by the Entity’s authority holder.

• The Business Data Steward’s responsibilities are clarified in an approved job description.

• The responsibilities are in line with the “Organizational Manual”.

- The Hiring / Appointment decisions of the following roles:

B. List of the IT Data Stewards.

- To accept the attachment, the following requirements must be met:

• The decision must be issued by the Entity’s authority holder.

• The IT Data Steward’s responsibilities are clarified in an approved job description.

• The responsibilities are in line with the “Organizational Manual”.

- The Hiring / Appointment decisions of the following roles:

C. Personal Data Protection (PDP) Officer Hiring / Appointment Decision.

- To accept the attachment, the following requirements must be met:

• The decision must be issued by the Entity’s authority holder.

• The PDP Officer’s responsibilities are clarified in an approved job description.

• The responsibilities are in line with the “Organizational Manual”.

- The Hiring / Appointment decisions of the following roles:

D. Open Data and Information Access Officer (ODIAO) Hiring / Appointment Decision.

- To accept the attachment, the following requirements must be met:

• The decision must be issued by the Entity’s authority holder.

• The ODIAO’s responsibilities are clarified in an approved job description.

• The responsibilities are in line with the “Organizational Manual”.

- The documented & approved Data Management Organization structure.

- Attach the documented & approved Data Management Organization structure, including the following, as a minimum:

• Roles & responsibilities.

• Approved job descriptions for each role.

• Authorization: Reviews, approvals and decision making.


- The documented Data Stewardship / Ownership structure.

- Attach the documented & approved Data Stewardship / Data Ownership structure, including the following, as a minimum:

• Identified Data Domains.

• Business Data Executives for each Data Domain.

• Business Data Steward for each Data Domain.

• IT Data Steward for each Data Domain.

• Roles & responsibilities.

Level 4: Managed

- The monitoring reports for the Entity’s Data Management Organization roles with pre-defined KPIs.

- The report must be prepared based on the data of the Key Performance Indicators (KPIs) (Indicator Cards) pre-defined for the Data Management Organization roles. Each indicator’s data or card should include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic/operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report for the DM Organization and Data Stewardship.

- Attach a report showing that the Entity identified, implemented & is monitoring continuous improvement mechanisms within the Entity for:

• The Data Management Organization.

• The Data Stewardship / Ownership Structure.

Checklist – Data Governance Domain

DG.MQ.4 Has the Entity established and implemented practices for Change Management, including awareness, communication, change control, and capability development?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- Evidence of communication on DM-related practices.

- Attach evidence showing that the Entity has communicated regarding Data Management practices, such as:

• E-mails.

• Correspondence.

- Evidence of training and awareness sessions conducted.

- Attach a report on the training courses & awareness programs implemented by the Entity, including the following, as a minimum:
• Names of the implemented training courses related to DM, DG & PDP.

• Awareness programs implemented in the different Domains under DM, DG & PDP.

• A sample certificate of attendance for training the Entity’s employees in Domains related to DM, DG & PDP.

Level 2: - The Change Management - The Change Management Plan covering all DM Domains including:
Defined Plan.
• The DM & PDP Training Plan for all DM Domains.

• The DM Communication plan.

• The Stakeholders engagement plan.

• The Change Control plan for Data system changes.

Level 3: Activated

- The Change Management Implementation Status Report showing the DM & PDP Training activities.

- Attach a report on the implementation status of the change management plan; the report must include, as a minimum:

• The DM & PDP Training implementation status for all DM Domains, showing:

• Evidence of training conducted for the Entity’s employees, covering all DM Domains.

• A list of activities carried out by the Entity to raise awareness regarding the national regulations, laws, policies & controls, standards, and their applicability (such as: e-mails, publications, lectures, workshops, etc.).

• A list of activities carried out by the Entity to raise awareness regarding the national DM, DG & PDP strategy and programs and their applicability to the Entity (such as: e-mails, publications, lectures, workshops, etc.).


• A list of activities carried out by the Entity to raise awareness regarding the data management domains as per the National Data Management and Personal Data Protection Framework, addressed to the related Data Management and Personal Data Protection roles.

- The Change Management Implementation Status Report showing the DM Communication activities.

- Attach a report on the implementation status of the change management plan; the report must include evidence confirming the Entity’s continuous communication regarding the following:
• The DM, DG & PDP Program, activities and main decisions.

• The storage of DM & DG documents & artifacts.

• The measurement of DM & DG performance indicators.

• The updates on DM & DG policies & processes.

• The updates on compliance reports, implementation plans, and the regulatory (legislative) environment related to DM & DG.

- The Stakeholders Engagement and Socialization Plan implementation status report.

- Attach a report showing that the Entity is engaging the identified stakeholders to develop and improve the capabilities of the DM & PDP program.

- The Data Governance Approvals Register.

- The register should include, as a minimum:

• CDO’s DG Decisions with their documented rationale.

• DM & DG Committee Decisions.

- The Data Management Issue Tracking Register.

- The register should include, as a minimum:

• A sample of DM & DG issues reported by business & technical users.

• Evidence of resolution of the reported issues.

- Evidence that the Entity has DM & DG document & artifact Version Control practices.

- Attach a sample of a register / record or document created by the Entity; it must include the following, as a minimum:

• The Domain name of the registry.

• The Registry’s Issuance Date.

• The Registry’s Updating Dates.

• The Registry Versions Control.

• The Updates made to the Registry or document.


Level 4: Managed

- The monitoring report of the Change Management practices with pre-defined KPIs.

- Attach a report on monitoring the Change Management activities & practices, based on pre-defined KPIs that include, as a minimum, the following indicators:

• The Indicator of the periodic meetings of the internal DM, DG & PDP committee.

• The Indicator of the completed training & awareness sessions.

• The Indicator of the attendance rates in the completed training & awareness
sessions.

• The duration Indicator of the DM & PDP issue resolutions.

• The quantity Indicator of the number of resolved & closed change requests.

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic/operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report of the Change Management Practices for all DM Domains.

- Attach a report showing that the Entity identified, implemented and is monitoring mechanisms for continuous improvement of Change Management & for spreading awareness about all Domains under DM, DG & PDP.

8.2.2. Metadata and Data Catalog Domain

Checklist – Metadata and Data Catalog Domain

MCM.MQ.1 Has the Entity developed and implemented a plan to integrate and manage Metadata across the Entity?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- The Recorded or Documented Metadata Report.

- The Entity must attach a report on the Metadata recorded or documented as part of any project’s implementation, or in standalone applications.

Level 2: Defined

- The Approved Metadata and Data Catalog Plan.

- The Entity must attach the approved MCM tool / solution implementation plan, including the following, as a minimum:

• A roadmap that includes the projects and milestones of the technological tool / solution implementation for the Data Catalog. The activities shall incorporate, as a minimum, what is needed to achieve this Domain’s specifications.

• The assignment of the required resources & budget allocation to manage the implementation of the Data Catalog automation tool.

- The Approved Metadata Structure and Framework.

- The Entity must attach an updated and approved report illustrating the approved Metadata architecture and Metadata framework. The report must include the following, as a minimum:

• The Business Metadata fields to be filled in the Data Catalog.

• The additional fields based on the Entity’s requirements.

Level 3: Activated

- The Metadata and Data Catalog Plan Implementation Status Report.

- The Entity must attach an updated and approved report clarifying the MCM Plan implementation status.

- The Metadata Management Framework Implementation Status Report.

- The Entity must attach an updated and approved report clarifying the Metadata Management Framework implementation status.


Level 4: Managed

- The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) for the Metadata and Data Catalog Plan and Activities.

- The Entity must attach an updated and approved monitoring report on the MCM Plan and activities, prepared based on pre-defined KPIs (Indicator Cards).

- For example, the indicators may include measuring the achievement percentages of the plan implementation requirements.
- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report on the Metadata and Data Catalog Plan.

- The Entity must attach an updated & approved report including the following, as a minimum:

• The documents of the periodic reviews & the documented results of the MCM Plan.

• The continuous improvement mechanisms of the MCM Plan.

Checklist – Metadata and Data Catalog Domain

MCM.MQ.2 Has the Entity implemented a Metadata Management and Data Catalog tool / solution?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- Evidence of Existing Metadata.

- The Entity must attach a report proving the existence of the identified Metadata within standalone applications or for a specific project.

Level 2: Defined

- The Selected Data Catalog Tool.

- The Entity must attach an approved report clarifying the technology / tool chosen for the Data Catalog (the information includes: the vendor, version, and specifications).

- The Approved and Prioritized Data Sources Report.

- The Entity must attach an approved report that includes the following, as a minimum:

• The Data Catalog Sources, where each Source has an approved priority.

• The definitions of the Entity’s business Metadata and technical Metadata.

- The Developed and Approved Target Metadata Architecture.

- The Entity must attach an approved report clarifying the developed and approved target Metadata architecture, including (but not limited to) the following:

• Metadata Sources: The Entity’s Data Sources that are sources of Metadata used in the Data Catalog.

• Metadata Repository: The Data Catalog as the Entity’s central Metadata Repository.

• Metadata Flows: A definition of the Metadata Flow between the Metadata Sources and the Metadata Repository.

• Metadata Model: The Metadata Model used by the Entity’s Data Catalog.
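The four architecture elements above (sources, repository, flows, model) can be captured in a small structure. The sketch below is a hedged illustration — the class names, defaults, and the `add_source` helper are our assumptions, not part of the specification:

```python
from dataclasses import dataclass, field


@dataclass
class MetadataFlow:
    """One flow of Metadata from a source system into the central repository."""
    source: str    # e.g. an operational database or BI platform
    target: str    # the Data Catalog acting as the Metadata Repository
    schedule: str  # harvest frequency, e.g. "daily"


@dataclass
class TargetMetadataArchitecture:
    repository: str = "Data Catalog"                         # Metadata Repository
    sources: list[str] = field(default_factory=list)         # Metadata Sources
    flows: list[MetadataFlow] = field(default_factory=list)  # Metadata Flows
    model_fields: list[str] = field(default_factory=list)    # Metadata Model

    def add_source(self, name: str, schedule: str = "daily") -> None:
        """Register a source together with its flow into the repository."""
        self.sources.append(name)
        self.flows.append(MetadataFlow(name, self.repository, schedule))
```

Keeping sources and flows paired in this way makes the architecture report self-consistent: every registered source carries an explicit flow into the central repository.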


- The Data Catalog Tool Implementation Requirements Report.

- The Entity must attach an approved report clarifying the Data Catalog tool implementation requirements, including the following, as a minimum:

• The procurement approval for a tool to automate the Data Catalog.

• The approved plan to implement the Data Catalog automation tool.

- The Approved and Developed Data Catalog Training Plan.

- The Entity must attach the developed and approved training and awareness plan regarding Data Catalog usage, including the following, as a minimum:
• The training and awareness plan’s approval and validity / expiry dates.

• The training and awareness programs plan implementation dates.

• The scope of the training and awareness.

• The objectives of the training and awareness.

• The training shall include the following:

• An overview of the Data Catalog concept and benefits.

• Introductory and advanced lessons on the automated Data Catalog tool and its functionalities.

• Practical (hands-on) exercises based on use cases on the automated Data Catalog tool.

• The training and awareness target audiences.

• The topics of the training programs and awareness campaigns.

• The methods and channels through which the Training plan will be
conducted.

• Identifying the MCM Domain Awareness campaign channels, which include:

• E-mails or mobile phone messages.

• Publications.

• Lectures or workshops.


Level 3: Activated

- Evidence of the Implemented Data Catalog Tool.

- The Entity must attach a report on the Data Catalog tool proving that the tool is implemented in the Entity.

- Data Access Approval Process Documentation (Authorization to Connect the Data Catalog with the Data Sources).

- The Entity must attach an approved report documenting the approval process for granting access authorization to the Entity’s Data by connecting the Data Catalog tool to the Entity’s Data Sources.

- Metadata Access Approval Process Documentation.

- The Entity must attach an approved report documenting the approval process for granting scope-based access authorization to the Metadata, including the following, as a minimum:

• The name of the system / tool and version number.

• A description of the access granting and approval process.

• A sample of role-based access groups.

- Evidence of Data Catalog Adoption and Usage, Including Metadata Populated on the Tool.

- The Entity must attach a report proving the adoption and usage of the Data Catalog, with evidence of populating Metadata in the tool, including the following, as a minimum:

• The identified Data Catalog power users – the Entity’s Data Catalog advanced users who can act as coaches for other users.

• The communication plan between the current Data Catalog power users and the other Data Catalog users to activate Data Catalog usage. The plan must include (but is not limited to) the following:

• A description of the communication procedure actions & processes.

• The required frequency of the communication processes.

• The target audience.

- The Regular Audits Report on the Data Catalog Usage.

- The Entity must attach the periodic audits report on the Data Catalog usage, including the following, as a minimum:
• An audit sample of information about the Data Catalog users.

• An audit sample of processes / operations performed by the users.


• An audit sample of the Data Catalog activity memory and the separately maintained logs of the tracking / audit-trail functionality.

- Evidence of Training Conducted for the Identified Data Catalog Users.

- The Entity must attach a report proving the implementation of the training & awareness programs conducted for the identified Data Catalog users. The report must include the following, as a minimum:
• The list of conducted training programs and awareness campaigns (including the topics of the programs, the dates when they were conducted, and the names of the training attendees).

• Samples of the activities performed by the Entity to raise awareness about the MCM Domain, including:

• E-mails or mobile phone messages.

• Publications.

• Lectures or workshops.

• A sample attendance certificate from the training of the Entity’s employees.

- Tool Versioning Report. - The Entity must attach a Tool Versioning report including the following, as a minimum:

• The name of the developer of the current tool used by the Entity.

• The number of the current version used by the Entity.

• The current version’s release date within the Entity.

• The number and date of the latest tool version published by the developer.

• The methodology of the Version Management Strategy (releasing versions of technological solutions and tools for the Data Catalog) followed in cases where issues prevent upgrading the current version to the latest version.
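At its core, the versioning report above compares the deployed version against the vendor's latest release. A hedged sketch of that comparison follows — it assumes simple dotted numeric version strings, which real vendors do not always use:

```python
def parse_version(v: str) -> tuple[int, ...]:
    """Parse a dotted numeric version string such as '4.2.1' into a tuple."""
    return tuple(int(part) for part in v.split("."))


def upgrade_needed(current: str, latest: str) -> bool:
    """True when the vendor's latest release is newer than the deployed one.

    Tuples compare lexicographically, so (4, 10, 0) > (4, 9, 2) as expected,
    which a plain string comparison of "4.10.0" and "4.9.2" would get wrong.
    """
    return parse_version(latest) > parse_version(current)
```

Comparing numeric tuples rather than raw strings is the design point here: it keeps "4.10.0" correctly ahead of "4.9.2".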

Level 4: Managed

- The Monitoring Report with Pre-defined KPIs for the Adoption and Usage of the Metadata & Data Catalog Solution / Tool.

- The Entity must attach a monitoring report on the Metadata and on the Data Catalog usage, based on pre-defined KPIs (Indicator Cards) which cover:

• The number of registered Data Catalog users.

• The number of active Data Catalog users.

• The number of logins to the Data Catalog.

• The number of performed Metadata queries.


• The number of annotations (tags, comments) added to the Data assets.

• The number of ratings added to data assets.

• The number of trust certificates assigned to the Metadata.

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

- The Approved List of Pre-defined KPIs for the Quality of the Metadata Populated on the Data Catalog Tool.

- The Entity must attach a list of approved KPIs (Indicator Cards) pre-defined to monitor the Metadata Quality, including the following, as a minimum:

• Completeness.

• Accuracy.

• Consistency.

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.


• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: - The Continuous - The Entity must attach an updated & approved report presenting continuous improvement on
Pioneer Improvement Report on the MCM tool, including the following, as a minimum:
the Metadata and Data
Catalog Tool's quality. • The documents of the periodic reviews & documented results.

• The continuous improvement mechanisms.

• The resources assigned for implementing the continuous improvement plan.

- The Metadata - The Entity must attach an updated and approved report proving the full automation of end to
Management Automation end Metadata Management using a fully implemented Data Catalog tool, e.g.: Automating
Report. Metadata collection and exchange (Evidence such as reports, screenshots, etc…).

Checklist – Metadata and Data Catalog Domain

Has the Entity defined and implemented formal processes for effective Metadata Management, such as: prioritization,
MCM.MQ.3 population, access management, and quality issue management, etc., supported & fostered by collaboration across the
Entity?

Levels Acceptance Evidence Acceptance Criteria

Level 0: - Not Applicable. - Not Applicable.


Absence of
Capabilities

Level 1: - Evidence of the Existing - The Entity must attach a document of the current Processes followed for Metadata
Establishing Processes Used to Management (Evidence such as reports, screenshots, etc…).
Manage Metadata.

Level 2: - Metadata Identification - The Entity must attach a report showing the entity’s process for identifying and defining its
Defined Process Report. business and technical metadata that will be included in the data catalog.

- Metadata Prioritization - The Entity must attach a report showing the entity’s process for prioritizing the identified
Process Report. metadata.

- Metadata Population - The Entity must attach a report showing the process for registering and populating the
Process Report. Metadata within the Data Catalog so that the process is implemented as a workflow within
the automated data catalog tool.

- Metadata Update Process - The Entity must attach a report showing the process for updating metadata in its data
Report. catalog so that the process is implemented as a workflow within the automated data
catalog tool.

- Metadata Quality Process - The Entity must attach a report showing the process for identifying and resolving quality
Report. issues with the Metadata. This Metadata Management process should include a
mechanism for reporting identified data quality issues and development of remediation
actions within defined SLAs so that the process is implemented as a workflow within the
automated data catalog tool.

- Metadata Annotation - The Entity must attach a report showing the process for regularly reviewing the metadata
Process Report. annotations (tags, comments) made by users to the Metadata within the Data Catalog so
that the process is implemented as a workflow within the automated data catalog tool.

- Metadata Certification - The Entity must attach a report showing the process for regularly reviewing the trust
Process Report. certificates assigned by users to the Metadata within the Data Catalog so that the process
is implemented as a workflow within the automated data catalog tool.


Level 3: - Evidence of the - The Entity must attach an approved report proving the implementation and adoption of the
Activated Implementation and approved processes, including each process’s workflow chart. The processes must
Adoption of the Approved include the following, as a minimum:
Processes as Workflows in
the Entity's Data Catalog. • Metadata Identification Process.

• Metadata Prioritization Process.

• Metadata Population Process.

• Metadata Updating process.

• Metadata Quality Process.

• Metadata Annotation Process.

• Metadata Certification Process.

- The Logs or the List of - The Entity must attach a report that includes the logs, or a list of alerts caused by
Notifications on the Metadata changes.
Metadata Changes.

- Evidence of - The Entity must attach the implementation report of contacting the Data Catalog users
Communications to the (e.g.: emails) when any Metadata is updated.
Data Catalog Users of any
Metadata Update.

- The Metadata Stewardship - The Entity must attach a report clarifying the Metadata Stewardship Coverage Model.
Coverage Model.

Level 4: - The Metadata Processes - The Entity must attach an updated and approved report on monitoring the Metadata
Managed Monitoring Report with Processes based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card
Pre-Defined KPIs. must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.


• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

- The Metadata Quality - The Entity must attach an updated and approved report on monitoring the Metadata
Monitoring Report with Quality based on pre-defined KPIs (Indicator Cards) including the following, as a minimum:
Pre-Defined KPIs.
• Completeness.

• Accuracy.

• Consistency.

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: - The Continuous - The Entity must attach an updated & approved report including the following, as a
Pioneer Improvement Report on minimum:
the Metadata Management
Practices. • The documents of the periodic reviews & documented results of the
Metadata Management Practices.

• The continuous improvement mechanisms of the Metadata Management Practices.

8.2.3. Data Quality Domain

Checklist – Data Quality Domain

DQ.MQ.1 Has the Entity developed and implemented a Data Quality (DQ) plan focused on improving the quality of the Entity's Data?

Levels Acceptance Evidence Acceptance Criteria

Level 0: - Not Applicable. - Not Applicable.


Absence of
Capabilities

Level 1: - Evidence of the Existing - The Entity must attach a report on the activities that are practiced in the DQ Domain. For
Establishing Data Quality (DQ) Related each activity, the report must contain the following, as a minimum:
Activities.
• Copies / screenshots of e-mail correspondences which support
practicing the activity.

• Copies / screenshots of documents which support practicing the activity.

• Screenshots of the systems which support practicing the activity.

Level 2: - The Defined and Approved - The Entity must attach a defined and approved DQ management plan to implement and
Defined DQ Implementation Plan. manage the activities which aim at improving the Entity’s Data. This plan must contain the
following, as a minimum:

• A roadmap that includes the activities and milestones of implementing the DQ Management practices in the Entity. The activities shall incorporate what is needed to achieve this Domain’s specifications, as a minimum.

• The assignment of the required resources & budget allocation to manage the implementation of the DQ Management Plan.

Level 3: - A Report on the DQ Plan - The Entity must attach an updated & approved report on the DQ plan implementation
Activated Implementation Status. status containing the following, as a minimum:

• The DQ Domain activities and the implementation status of every activity.

- A Report on the Defined - The Entity must attach an updated & approved report presenting the implementation
DQ Roles & status of the DQ Domain’s defined roles and responsibilities containing the following, as a
Responsibilities. minimum:

• The DQ Roles of the Data Stewards.

• The Responsibilities of the Data Stewards in the DQ Domain operations.

- A Report on the Assigned - The Entity must attach an updated & approved report presenting the current status of the
Resources for the DQ resources allocated for the implementation of the DQ activities containing the following, as
Plan. a minimum:

• The people, processes and tools needed to implement the DQ activities.


Level 4: - The Monitoring Report of - The Entity must attach an updated & approved monitoring report on the DQ activities
Managed the DQ Plan and Activities implementation plan based on the KPIs (Indicator Cards) which were pre-defined in the DQ
with Pre-defined Key implementation plan. Each indicator’s data or card must include the following, as a
Performance Indicators minimum:
(KPIs).
• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: - The Continuous - The Entity must attach an updated & approved report presenting continuous improvement
Pioneer Improvement Report of the in the DQ Plan & Activities containing the following, as a minimum:
Data Quality Plan &
Activities. • The review document & the review results.

• The DQ continuous improvement plan.

Checklist – Data Quality Domain

DQ.MQ.2 Has the Entity established / developed and implemented practices to manage and improve the quality of the Entity's Data?

Levels Acceptance Evidence Acceptance Criteria

Level 0: - Not Applicable. - Not Applicable.


Absence of
Capabilities

Level 1: - The Existing DQ Domain - The Entity must attach a report on the current DQ Domain initiatives.
Establishing Initiatives.

- The Existing Processes for - The Entity must attach a report on the current DQ issue detection processes.
Detecting DQ Issues.

- The Existing Processes - The Entity must attach a report on the current processes used for Data corrections or Data
Used for Data Corrections validations.
or Data Validations.

Level 2: - The Prioritized List of Data - The Entity must attach a report listing the Data elements ranked / prioritized based on
Defined Elements. business requirements. The first priority Data must include the Entity’s Master Data, as a
minimum.

- The Defined DQ - The Entity must attach a report clarifying the defined DQ Dimensions for the Entity's
Dimensions for the Entity's Datasets including the following, as a minimum:
Datasets.
• Completeness (The degree of how complete data records are).

• Uniqueness (The degree of how unique data records are without duplicates).

• Timeliness (The degree to which data is up to date and available when it is needed).

• Validity (The degree of records' conformance to the established formats, types and ranges).

• Accuracy (The degree to which data values align to real values).

• Consistency (The degree to which data is consistent across the Entity’s business and across different sources).
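As a non-normative sketch of how two of these dimensions might be measured, the following computes Completeness and Uniqueness over a toy record set (field names are invented for the example):

```python
# Illustrative only: field names, records and the choice of key are hypothetical.
records = [
    {"id": "1001", "name": "Ali",  "email": "ali@example.com"},
    {"id": "1002", "name": "Sara", "email": None},                # incomplete
    {"id": "1001", "name": "Ali",  "email": "ali@example.com"},   # duplicate id
]

def completeness(rows, field):
    """Share of records where `field` is populated (Completeness dimension)."""
    return sum(1 for r in rows if r.get(field) not in (None, "")) / len(rows)

def uniqueness(rows, key):
    """Share of distinct key values over all records (Uniqueness dimension)."""
    return len({r[key] for r in rows}) / len(rows)

email_score = completeness(records, "email")   # 2 of 3 emails populated
id_score = uniqueness(records, "id")           # 2 distinct ids over 3 rows
```

Each score falls in [0, 1] and can feed the DQ Rules and thresholds described in the following specification.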


- The Data Quality Rules - The Entity must attach a report clarifying the list of defined DQ Rules that are aligned with
Report. the DQ Dimensions including the following, as a minimum:

• The DQ rule owner.

• The business description of the requirement to be validated by the rule.

• Regarding the particular data whose quality is being measured: The assignment of Rules to each of the DQ Dimensions including the following, as a minimum:

• Completeness.

• Uniqueness.

• Timeliness.

• Validity.

• Accuracy.

• Consistency.

• The list of Data Attributes validated by the defined rules.

• The metrics that are calculated when validating each DQ rule.

• The escalation threshold that triggers a DQ alert for the rule.
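A hypothetical sketch of a DQ Rule record carrying the fields listed above, including the escalation threshold that triggers a DQ alert (names, signature and threshold values are illustrative, not taken from the Index):

```python
from dataclasses import dataclass

@dataclass
class DQRule:
    """Hypothetical DQ rule record aligned with the report fields above."""
    owner: str
    description: str       # business description of the requirement validated
    dimension: str         # e.g. "Completeness", "Validity"
    attributes: list       # data attributes validated by the rule
    threshold: float       # escalation threshold (minimum acceptable score)

    def evaluate(self, score: float) -> dict:
        """Return the calculated metric and whether a DQ alert is triggered."""
        return {"score": score, "alert": score < self.threshold}

rule = DQRule(
    owner="Business Data Steward",
    description="National ID must match the 10-digit format",
    dimension="Validity",
    attributes=["national_id"],
    threshold=0.98,
)
result = rule.evaluate(0.95)   # below the threshold, so an alert is raised
```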

- The Developed & - The Entity must attach a report clarifying the Developed & Approved Processes for DQ
Approved Processes for Issue Management & Remediation followed for DQ issue management & remediation /
DQ Issue Management & resolution including, as a minimum, the following:
Remediation.
• The development of a remediation plan. The remediation plan must include the following, as a minimum:

• A "Root Cause" analysis to determine the causes of the identified DQ issue.

• An Impact analysis to assess the negative consequences and determine an issue level (whether it is local / limited, or at the Entity level).

• The definition & identification of the DQ Targets set for each of the issues
& challenges, related to each DQ Dimension, depending on the context
of the issue / challenge within the Entity.

• The definition & identification of the options for resolving the issue's
"Root Cause", including a feasibility analysis.


• The specifications of the Data Cleansing process to be performed if the solution does not correct the "Root Cause" of the DQ issue.

• The decision & the logical rationale for selecting the specific option
(chosen) to solve the issue.

• The implementation status of the issue's most suitable resolution choice (including any change).

• The review of the implemented resolution and a verification that the issue
is resolved.

• Establishing & developing a roadmap and milestones for the resolution of the noticed & identified DQ issues.

• Allocating the necessary resources to implement the DQ issue identification plan.

Level 3: - The Planned & Conducted - The Entity must attach a report clarifying the initial & periodic DQ assessment that has
Activated Initial & Periodic Data been planned and conducted. The assessment must include the following, as a
Quality Assessment minimum:
Report.
• Collecting business requirements for the Quality of Data in the scope.

• Establishing & defining DQ Rules based specifically on the collected business requirements.

• Performing a check by Data Profiling based on the pre-defined DQ Rules.

• Reporting the discovered and identified DQ comments & issues to the concerned /
relevant department in the Entity.

• Developing plans (with key milestones) for resolving the discovered & identified DQ
issues.

- The Resolution Status - The Entity must attach a report presenting the resolution status of the identified DQ issues,
Report of the Identified DQ including:
Issues.
• The implementation status of the issue's most suitable resolution choice.

• The review of the implemented resolution and a verification that the issue
is resolved.


- Evidence of DQ Tools - The Entity must attach a report showing the activation of DQ tools used to implement DQ
Used for Automating DQ issue management and presenting the following tools capabilities, as a minimum:
Issue Management
Workflows. 1. Data Profiling: Statistical Data analysis on the following levels: Data attributes, tables, cross Domain and different systems.

2. DQ Rules Management: DQ rules establishment, development and execution.

3. DQ Issues Management: Automation of workflows for reporting and resolving DQ issues.
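The Data Profiling capability (item 1 above) can be sketched as per-attribute statistics; the dataset and field names are invented for the example:

```python
# Minimal column-profiling sketch: per-attribute statistics over a toy dataset.
rows = [
    {"age": 34,   "city": "Riyadh"},
    {"age": None, "city": "Jeddah"},
    {"age": 41,   "city": "Riyadh"},
]

def profile(rows, field):
    """Basic profile of one attribute: counts, null count, distinct values, range."""
    values = [r[field] for r in rows if r[field] is not None]
    numeric = [v for v in values if isinstance(v, (int, float))]
    return {
        "count": len(values),
        "nulls": len(rows) - len(values),
        "distinct": len(set(values)),
        "min": min(numeric) if numeric else None,   # range only for numeric fields
        "max": max(numeric) if numeric else None,
    }

age_profile = profile(rows, "age")
```

A real profiling tool would extend the same idea across tables, Domains and systems, as the capability description requires.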

- The Defined & - The Entity must attach the defined DQ SLAs including the following, as a minimum:
Implemented DQ Service
Level Agreements (SLAs). • A timeline and a deadline for the development of a remediation plan for
the identified DQ issue.

• A timeline and a deadline for the implementation and the review of the
DQ changes.

• The escalation actions to be taken when the SLA is not met.
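A minimal sketch of checking an issue against such SLAs, under assumed deadline values (the checklist requires that deadlines and escalation actions be defined, not these particular numbers):

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical SLA values; real deadlines come from the Entity's approved SLAs.
REMEDIATION_PLAN_SLA_DAYS = 5   # deadline for developing a remediation plan
RESOLUTION_SLA_DAYS = 30        # deadline for implementing and reviewing the fix

def sla_status(reported: date, plan_ready: Optional[date],
               resolved: Optional[date], today: date) -> str:
    """Classify an open DQ issue against the defined SLAs."""
    plan_due = reported + timedelta(days=REMEDIATION_PLAN_SLA_DAYS)
    fix_due = reported + timedelta(days=RESOLUTION_SLA_DAYS)
    if plan_ready is None and today > plan_due:
        return "escalate: remediation plan overdue"
    if resolved is None and today > fix_due:
        return "escalate: resolution overdue"
    return "within SLA"

status = sla_status(date(2023, 10, 1), None, None, date(2023, 10, 10))
```

The returned escalation strings stand in for whatever escalation actions the Entity's SLA actually specifies.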

- The List of DQ System - The Entity must attach a report clarifying the list of DQ System Enforcements adopted by
Enforcements Adopted by the Entity (e.g., The Standard Implementation Mechanism & the Standard Implementation
the Entity. Result).


Level 4: - The Monitoring Report of - The Entity must attach an updated & approved report on monitoring the DQ practices
Managed the DQ Management based on pre-defined KPIs (Indicator Cards).
Practices with Pre-defined
KPIs. - Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

- The Monitoring Report of - The Entity must attach an approved report on monitoring and updating the threshold
the DQ Threshold Values. Values for each DQ Rule.

- The Monitoring Report of - The Entity must attach an updated & approved monitoring report on the DQ issue
the DQ Issue Resolution resolution process including the following, as a minimum:
Process.
• The number of resolved DQ issues vs the number of reported DQ issues.

• The number of DQ issues resolved after the specified deadlines.

• The total time of remediation plan development for a discovered DQ issue.


• The total time of resolving a DQ issue (A “Root Cause” resolution implementation).
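These monitoring figures can be derived from an issue log. The sketch below, over an invented log, computes each of the listed measures:

```python
from datetime import date

# Toy DQ issue log; field names and dates are illustrative only.
issues = [
    {"reported": date(2023, 9, 1),  "plan": date(2023, 9, 4),
     "resolved": date(2023, 9, 20), "deadline": date(2023, 9, 30)},
    {"reported": date(2023, 9, 5),  "plan": date(2023, 9, 12),
     "resolved": date(2023, 10, 20), "deadline": date(2023, 10, 5)},
    {"reported": date(2023, 9, 10), "plan": None,
     "resolved": None, "deadline": date(2023, 10, 10)},
]

resolved = [i for i in issues if i["resolved"]]
planned = [i for i in issues if i["plan"]]
metrics = {
    # Resolved DQ issues vs reported DQ issues.
    "resolved_vs_reported": f"{len(resolved)}/{len(issues)}",
    # Issues whose resolution landed after the specified deadline.
    "resolved_after_deadline": sum(1 for i in resolved if i["resolved"] > i["deadline"]),
    # Average days to develop a remediation plan for a discovered issue.
    "avg_plan_days": sum((i["plan"] - i["reported"]).days for i in planned) / len(planned),
    # Average days from report to "Root Cause" resolution.
    "avg_resolution_days": sum((i["resolved"] - i["reported"]).days for i in resolved) / len(resolved),
}
```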

Level 5: - The Continuous - The Entity must attach an updated & approved report showing that the Entity is
Pioneer Improvement Report on periodically monitoring & regularly optimizing the tools used for DQ.
the Tools Used for DQ.

- The Continuous - The Entity must attach an updated & approved report showing that the Entity identified,
Improvement Report on implemented and is regularly monitoring the continuous improvement mechanisms of the
the DQ Management DQ Domain practices.
Practices.

- The Implemented / - The Entity must provide evidence of the adopted DQ Management Standards (e.g.,
Adopted DQ Industry ISO 8000).
Standards.

Checklist – Data Quality Domain

DQ.MQ.3 Has the Entity established and implemented practices to monitor and report the Entity's Data Quality (DQ) status?

Levels Acceptance Evidence Acceptance Criteria

Level 0: - Not Applicable. - Not Applicable.


Absence of
Capabilities

Level 1: - The Existing DQ Monitoring - The Entity must attach an updated & approved report clarifying the Entity’s current DQ
Establishing Practices. monitoring practices.

- Evidence of the Entity's - The Entity must attach an updated & approved report showing the current DQ situation.
Current DQ Status.


Level 2: - The Defined and Formalized - The Entity must attach the defined DQ monitoring plan showing activities required to
Defined DQ Monitoring Plan. monitor and document the data quality status on a regular basis, with the assigned resources for
implementation.

- The Defined DQ - The Entity must attach a report clarifying specific Checkpoints defined for DQ Monitoring.
Checkpoints Report.

Level 3: - DQ Scorecards or - The Entity must attach a report illustrating DQ scorecards or DQ dashboards. This should
Activated Dashboards. include the following, as a minimum:

• Execution of the defined DQ rules according to the defined triggering conditions (time schedule, event).

• Reporting of the noticed and identified DQ issues to Data Stewards & Owners (as a minimum: a Business Data Steward & a Business Data Executive).

- A Report on the Data Quality - The Entity must attach a report presenting the DQ Metadata registered in the Data Catalog
Metadata Logged on the Tool as per the process identified in the MCM Domain (Data Catalog & Metadata
Data Catalog Tool. Management / Management of the Catalog & Metadata). The DQ Metadata must include
the following, as a minimum:

• The existing DQ Rules.

• The DQ Monitoring process results.

- Evidence of a Data Quality - The Entity must attach a report illustrating DQ Support implemented as a workflow
Support Process process to solve the issues discovered during the DQ reviews. The report must include the
Implemented as a Workflow. following, as a minimum:

• A clear process that enables data users to report DQ issues to Business Data Stewards.

• A diagram of the workflow in the automated Data Catalog tool.

- The Results of the DQ - The Entity must attach a report clarifying the review / audit results of the DQ Checkpoints
Checkpoint Reviews. including the following, as a minimum:

• A log containing the detected DQ issues.

• A remediation plan for the detected DQ issues.

Level 4: - Trends from the DQ - The Entity must attach an updated & approved report about the trends of the DQ
Managed Monitoring Activities with monitoring activities which are based on the data of the pre-defined KPIs (indicator cards)
Pre-defined KPIs. including the following, as a minimum:


• The number of DQ issues reported based on the established & implemented DQ Rules.

• The number of DQ issues reported by the Data Catalog users.

• The number of the DQ Rules deployed.

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: - A Continuous Improvement - The Entity must attach an updated & approved report clarifying the monitoring and
Pioneer Report on the DQ reporting practices that have been reviewed and optimized to raise DQ including the
Monitoring and Reporting following, as a minimum:
Practices.
• The documents of the periodic reviews & documented results of the DQ
monitoring & reporting practices.

• The continuous improvement mechanisms of the DQ monitoring & reporting practices.

Checklist – Data Quality Domain

Has the Entity developed Data Quality (DQ) standards, provided definitions for its Datasets, and published / uploaded the
DQ.MQ.4
definitions on the National Data Catalog (NDC)?

Levels Acceptance Evidence Acceptance Criteria

Level 0: - Not Applicable. - Not Applicable.


Absence of
Capabilities

Level 1: - The Existing List of Data - The Entity must attach an updated & approved report listing the current DQ standards
Establishing Standards and Data and explaining the Data definitions.
Definitions.

Level 2: - The Developed Data - The Entity must attach an updated & approved report clarifying the Standards
Defined Standards for Data Elements. developed for the Data Elements.

- The Identified Metadata with - The Entity must attach an updated & approved report clarifying the identified Metadata
their Definitions. with their definitions.

- The Identified Datasets to be - The Entity must attach an updated & approved report showing the identified datasets to
Published on the National be published on the National Data Catalog (NDC).
Data Catalog (NDC).

Level 3: - A Report about the Data - The Entity must attach an updated & approved report proving that the Entity’s Data
Activated Standards & the Data Standards & Data Definitions have been published on the National Data Catalog (NDC).
Definitions which the Entity
Uploaded on the NDC.

- The Entity-Specific List of - The Entity must attach an updated & approved report listing the approved Entity specific
Applied Data Definitions and definitions & Data standards.
Applied Data Standards.

Level 4: - A Monitoring Report for the - The Entity must attach an updated & approved report on monitoring Data Definitions and
Managed Data Definitions and Data Data Standardizations based on pre-defined KPIs (indicator cards), e.g.:
Standardization with Pre-
defined KPIs. • The percentage of defined Data Elements.

• The percentage of business Metadata attributes which are defined and published to the NDC.


• The percentage of technical Metadata attributes which are defined and published on the NDC.

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: - A Continuous Improvement - The Entity must attach an updated & approved report presenting regular reviews and
Pioneer Report for Optimizing the Data continuous improvements of Data Standards & Data Definitions in both: within the Entity
Standards and Definitions and on the National Data Catalog (NDC), including the following, as a minimum:
within the Entity and on the
NDC. • The documents of the periodic reviews & documented results of the
Data Standards & Data Definitions.

• The continuous improvement mechanisms of the Data Standards & Data Definitions.

8.2.4. Data Operations Domain

Checklist – Data Operations Domain

Has the Entity developed and implemented a plan to manage and satisfy the needs of Data Operations, Data storage and
DO.MQ.1
Data retention?

Levels Acceptance Evidence Acceptance Criteria

Level 0: - Not Applicable. - Not Applicable.


Absence of
Capabilities

Level 1: - The Approved Initial Data Operations - The Entity must attach the initial Data Operations and Storage Plan
Establishing and Storage Plan. including the following, as a minimum:

• A roadmap that includes the activities and milestones.

• The assignment of the required resources & budget allocation.

Level 2: - The Developed and Approved Data - The Entity must attach the developed and approved Data Operations and
Defined Operations and Storage Plan. Storage Plan including the following, as a minimum:

• A roadmap that includes the activities and milestones.

• The assignment of the required resources & budget allocation.

• A prioritized list of the information systems (based on their criticality to the business progress).

- The approved Policies for Data Operations, storage and retention, as well
as business continuity, wherein each Policy document includes the
following:

• Policy title / name.

• Policy owner.

• Release date.

• Version number.

• Version history.

• Objective.

• Identified stakeholders, e.g.: Target audience.

• Communications process.


• Monitoring procedure.

• Document control (Preparation, review, approval).

• Policy Statement.

• Roles & responsibilities.

• Terminology.

• Scope of work.

• Activation mechanisms.

• Approval.

• References.

- Acceptance Evidence: The Information Systems Priority List.
  Acceptance Criteria: The Entity must attach a list of prioritized information systems that must be followed to establish a system recovery order in the Disaster Recovery (DR) plan.
- Acceptance Evidence: The Developed and Approved Policies for Data Operations, Storage, Retention and Business Continuity.
  Acceptance Criteria: The Entity must attach a report on the Policies which are approved for Data Operations, storage, retention, and business continuity, including the following, as a minimum:
  • Storage conditions which ensure Data protection in disaster events.
  • Data retention periods based on Data type, Data Classification (DC) level, Data value for/in the business, and legal requirements.
  • The disposal and destruction rules based on the Data type and classification level.
  • The required actions in the event of an accidental permanent loss of Data.
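Retention criteria of this kind (a period per Data type and classification level, with disposal rules) are often operationalized as a lookup table. The sketch below is illustrative only: the data types, classification levels, and periods are assumptions, not values prescribed by the Index.

```python
# Hypothetical retention rules keyed by (data type, classification level);
# the periods below are illustrative, not prescribed by the Index.
RETENTION_YEARS = {
    ("financial", "confidential"): 10,
    ("operational", "restricted"): 7,
    ("public-content", "public"): 3,
}

def retention_period(data_type: str, classification: str) -> int:
    """Return the retention period in years, defaulting to the longest rule
    when no specific (type, classification) pair is defined."""
    return RETENTION_YEARS.get((data_type, classification),
                               max(RETENTION_YEARS.values()))

print(retention_period("financial", "confidential"))
```

An entity's actual table would be derived from its approved retention policy and the applicable legal requirements, not hard-coded as here.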

- Acceptance Evidence: The Periodic Forecasting Plan for Storage Capacity.
  Acceptance Criteria: The Entity must attach a periodic forecast report on proactive planning to satisfy the expected storage capacity requirements, including the following, as a minimum:
  • Historical Data storage capacities used.
  • Performance evaluation of the Entity’s applications and needs.


  • A list of upcoming applications to be developed and built for the Entity.

- Acceptance Evidence: The Database Technology Evaluation Process and Selection Criteria.
  Acceptance Criteria: The Entity must attach the Database Technology Evaluation Process and Selection Criteria, including the following, as a minimum:
  • The Total Cost of Ownership (TCO) including, at least: licensing, support, training, and hardware.
  • The availability of resources skilled in these technological solutions, both internally and in the job market.
  • The presence of software tools related to the Database technologies being evaluated in the Entity.
  • Volume and velocity limits of the utilized technologies.
  • Reliability provided by the utilized technologies.
  • Scalability of each technology.
  • Security controls provided by the utilized technologies.
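Selection criteria like these are commonly combined into a weighted scoring sheet when comparing candidate technologies. The weights, criterion keys, ratings, and technology names below are assumptions for illustration, not an evaluation method mandated by the Index.

```python
# Illustrative weighted-score comparison of candidate database technologies
# against criteria like those listed above; all weights and 0-5 ratings
# are assumed values.
CRITERIA_WEIGHTS = {"tco": 0.30, "skills": 0.20, "scalability": 0.25,
                    "reliability": 0.15, "security": 0.10}

def weighted_score(scores: dict) -> float:
    """scores: criterion -> 0..5 rating for one candidate technology."""
    return sum(CRITERIA_WEIGHTS[c] * scores.get(c, 0) for c in CRITERIA_WEIGHTS)

candidates = {
    "TechA": {"tco": 4, "skills": 3, "scalability": 5, "reliability": 4, "security": 3},
    "TechB": {"tco": 3, "skills": 5, "scalability": 3, "reliability": 5, "security": 4},
}
best = max(candidates, key=lambda name: weighted_score(candidates[name]))
print(best, round(weighted_score(candidates[best]), 2))
```

A real evaluation would document how each rating was obtained (benchmarks, market surveys, vendor quotes) so the scoring is auditable.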

Level 3: Activated
- Acceptance Evidence: The Data Operations and Storage Plan with an implementation status report.
  Acceptance Criteria: The Entity must attach the Data Operations and Storage Plans with a report clarifying the implementation status, including the following, as a minimum:
  • The achievement percentages of the actions / tasks included in the Data Operations and Storage Plans.

- Acceptance Evidence: The Storage Trend Forecast Document.
  Acceptance Criteria: The Entity must attach a report clarifying the expected storage capacity, including the following, as a minimum:
  • Forecasting the Entity’s future needs of Data storage capacity.
  • Estimating budgets for future purchases of Data storage space.
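A storage trend forecast of this kind can start from the historical capacities already collected for the periodic forecasting plan. The following is a minimal sketch, assuming quarterly usage figures and a naive linear growth model; real forecasts would use richer models and real measurements.

```python
def forecast_capacity(history_gb, periods_ahead=1):
    """Naive linear forecast: extend the average period-over-period growth.
    history_gb: storage used per period, oldest first (assumed sample data)."""
    if len(history_gb) < 2:
        return history_gb[-1] if history_gb else 0.0
    deltas = [b - a for a, b in zip(history_gb, history_gb[1:])]
    avg_growth = sum(deltas) / len(deltas)
    return history_gb[-1] + avg_growth * periods_ahead

# Assumed quarterly usage in GB; forecast the capacity two quarters ahead.
usage = [800, 880, 950, 1040]
print(round(forecast_capacity(usage, periods_ahead=2)))
```

The forecast figure can then be multiplied by a unit storage cost to produce the budget estimate the criteria call for.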

- Acceptance Evidence: The Database Technology Evaluation Report.
  Acceptance Criteria: The Entity must attach a Database technology assessment report including the following, as a minimum:
  • The performance of Database technologies.


  • The availability of technical support for the current version of each Database technology.
  • Technical support availability for software tools and operating systems of the Databases.

- Acceptance Evidence: The Document on the Entity's Application Performance Assessment.
  Acceptance Criteria: The Entity must attach an evaluation report on the performance of the Entity's applications, including the following, as a minimum:
  • The number of transactions by each application used in the Entity.
  • The percentage of application utilization to perform the Entity’s work in an automated manner.
  • Availability – users’ accessibility to the Entity’s applications.

- Acceptance Evidence: The Entity's Applications Development Roadmap with the Status Report on the Implementations.
  Acceptance Criteria: The Entity must attach a roadmap document for the upcoming applications to be developed and a report on their development statuses, including the following, as a minimum:
  • The activities and milestones.
  • Application development prioritization.
  • Each application’s development status.

- Acceptance Evidence: The Budget Estimations of the Procurement Transactions of the Future Storage Needs.
  Acceptance Criteria: The Entity must attach a report clarifying the estimated budgets for future procurements of Data storage processes.
- Acceptance Evidence: The Data Operations Orchestration Document.
  Acceptance Criteria: The Entity must attach a document on Data Operations Orchestration, including assigning teams to operate and maintain systems Data, and to process and analyze Data.

Level 4: Managed
- Acceptance Evidence: The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) for the Implementation Progress and Effectiveness of the Data Operations Plan.
  Acceptance Criteria: The Entity must attach a report prepared based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card should include the following, as a minimum:
  • Indicator’s Name / Code.


  • Indicator’s Owner.
  • Indicator’s Coordinator.
  • Indicator’s Description.
  • The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).
  • Indicator’s Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (Measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).
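The minimum fields of an Indicator Card above can be captured in a simple record structure. The field names, example values, and the `on_track` helper below are an illustrative mapping, not a schema mandated by the Index.

```python
from dataclasses import dataclass, field

@dataclass
class IndicatorCard:
    """One KPI 'Indicator Card' with the minimum fields listed above.
    Field names are an assumed mapping for illustration."""
    code: str
    owner: str
    coordinator: str
    description: str
    objective: str
    equation: str
    unit: str                 # e.g. "Percentage"
    baseline: float
    target: float
    periodicity: str          # Monthly / Quarterly / Biannually / Annually
    collection_mechanism: str
    polarity: str             # "+" higher is better, "-" lower is better
    data_sources: list = field(default_factory=list)

    def on_track(self, measured: float) -> bool:
        """Compare a measurement against the target, respecting polarity."""
        return measured >= self.target if self.polarity == "+" else measured <= self.target

# Hypothetical indicator: backup completion rate for the Data Operations plan.
kpi = IndicatorCard("DO-KPI-01", "Data Office", "DBA Team",
                    "Backup completion rate", "Plan effectiveness",
                    "successful backups / scheduled backups", "Percentage",
                    baseline=90.0, target=99.0, periodicity="Monthly",
                    collection_mechanism="automated", polarity="+",
                    data_sources=["backup logs"])
print(kpi.on_track(99.5))
```

Storing cards this way makes the Level 4 monitoring report reproducible: each measurement can be checked against the card's target and polarity automatically.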

Level 5: Pioneer
- Acceptance Evidence: The Continuous Improvement Report on the Practices of Data Operations, Storage, and Retention.
  Acceptance Criteria: The Entity must attach a report presenting that the Entity specified, implemented, monitored, and reviewed mechanisms for continuous improvement of the Data Operations and storage plans, and satisfying Data retention needs. The report must include the following, as a minimum:
  • The documents of the periodic reviews & documented results of the Data Operations Plans and Data Retention needs.

Checklist – Data Operations Domain

DO.MQ.2: Does the Entity have in place a defined methodology, processes and Standard Operating Procedures (SOPs) for Database operations?

Level 0: Absence of Capabilities
- Acceptance Evidence: Not Applicable.
- Acceptance Criteria: Not Applicable.

Level 1: Establishing
- Acceptance Evidence: Evidence of Data Operations Done within the Entity.
  Acceptance Criteria: The Entity must attach evidence of the Database operations performed by the Entity (e-mails, screenshots).

Level 2: Defined
- Acceptance Evidence: The Process Documentation of the Detailed Practices of Data Operations, including:

  A. Database Monitoring.
  Acceptance Criteria: The Entity must attach the overall process of the detailed practices of Database Monitoring, including:
  • The process for monitoring and reporting database performance on a regular basis.

  B. Database Access Control.
  Acceptance Criteria: The Entity must attach the detailed process for providing the Entity's employees access to the databases.

  C. Storage Configuration Management.
  Acceptance Criteria: The Entity must attach the detailed practices of Storage Configuration Management, including:
  • Configuration identification.
  • Configuration change control.
  • Configuration status accounting.
  • Configuration audits.

  D. DBMS Versioning Mechanism.
  Acceptance Criteria: The Entity must attach the overall process of the detailed practices of the DBMS Versioning / Updating Mechanism, covering:
  • The Management Plan of Updated Releases / Versions.
  • The Strategy Including Analysis and Rationale.


  E. The Database Performance Service Level Agreements and Operational Level Agreements (SLAs & OLAs).
  Acceptance Criteria: The Entity must attach the process for defining Database Performance Service Level Agreements and Operational Level Agreements (SLAs & OLAs).

Level 3: Activated
- Acceptance Evidence: The Database Monitoring Status Report.
  Acceptance Criteria: The Entity must attach a monitoring status report on Database performance, including the following, as a minimum:
  • Capacity – the size of the unused storage.
  • Availability – users’ accessibility to the Entity’s Databases, as & when needed.
  • Queries Execution Performance – query execution times, durations and errors.
  • Tracking Changes – tracking Database changes for root cause analysis, as & when needed.
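The four monitoring dimensions above (capacity, availability, query performance, change tracking) could be summarized in a status-report structure like the following sketch; all figures and field names are assumptions, not a reporting format mandated by the Index.

```python
# Minimal sketch of a database monitoring summary covering the four
# dimensions listed above; the input figures are assumed sample values.
def monitoring_summary(total_gb, used_gb, uptime_pct, slow_queries, changes):
    return {
        "capacity_unused_gb": total_gb - used_gb,       # Capacity
        "availability_pct": uptime_pct,                 # Availability
        "slow_query_count": slow_queries,               # Query performance
        "tracked_changes": len(changes),                # Change tracking
    }

report = monitoring_summary(total_gb=2000, used_gb=1450, uptime_pct=99.95,
                            slow_queries=12,
                            changes=["ALTER TABLE orders ADD COLUMN status"])
print(report)
```

In practice these values would be pulled from the DBMS's own monitoring views or an external monitoring tool rather than passed in by hand.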

- Acceptance Evidence: The Data Operations Operating Model.
  Acceptance Criteria: The Entity must attach the approved Operational Model of Database Operations, including the following, as a minimum:
  • The teams’ roles and responsibilities.
  • The procedures and practices of Database Operations management.
  • The technologies used to support Database Operations.

- Acceptance Evidence: Evidence of Agreements (SLAs and OLAs).
  Acceptance Criteria: The Entity must attach a report on Database performance agreements, e.g.: Service Level Agreements (SLAs) and Operational Level Agreements (OLAs), including the following, as a minimum:
  • The timeframe of making the Database available for users.
  • The maximum time allowed to complete electronic transactions on a specific application.
  • Each agreement must clarify the escalation procedures to be followed when the agreement is violated.


Level 4: Managed
- Acceptance Evidence: The Monitoring Report with Pre-defined KPIs for Database Operations Management.
  Acceptance Criteria: The Entity must attach a report prepared based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card should include the following, as a minimum:
  • Indicator’s Name / Code.
  • Indicator’s Owner.
  • Indicator’s Coordinator.
  • Indicator’s Description.
  • The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).
  • Indicator’s Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (Measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer
- Acceptance Evidence: The Continuous Improvement Report on the Implemented Practices and Processes of Data Operations and Storage.
  Acceptance Criteria: The Entity must attach a periodic evaluation report on continuous improvement of the following, as a minimum:
  • The processes, procedures and practices followed during Database Operations.
  • The operational metrics (Service Level Agreements (SLAs), Operational Level Agreements (OLAs), Key Performance Indicators (KPIs), Key Quality Indicators (KQIs)).


- Acceptance Evidence: The Knowledgebase Document.
  Acceptance Criteria: The Entity must attach a Knowledgebase document including the following, as a minimum:
  • All lessons learnt, test / trial cases and user stories.
  • The log of errors and issues.
  • The solutions of the recorded errors and issues.

Checklist – Data Operations Domain

DO.MQ.3: Does the Entity have in place practices and processes for Business Continuity, such as backup and disaster recovery (DR), and a defined Business Continuity Plan (BCP) for the Data?

Level 0: Absence of Capabilities
- Acceptance Evidence: Not Applicable.
- Acceptance Criteria: Not Applicable.

Level 1: Establishing
- Acceptance Evidence: Evidence of Data Storage Backup Instruction Documents.
  Acceptance Criteria: The Entity must attach a sample document of the Data storage backup instructions (e-mail, guide / manual).

Level 2: Defined
- Acceptance Evidence: The Developed Processes and Practices for Data Storage Backups and Recovery.
  Acceptance Criteria: The Entity must attach the process and practices developed for "Data Storage Backup" and "Data Backup Recovery", including the following, as a minimum:
  • Defining and determining the backup frequency of each information system.
  • Scope of backup for each information system, including the scope of Data and the scope of Database transaction logs.
  • Location of backup files, including a storage medium and a physical location.
  • Periodic validations of backup completions using system copies in non-production environments.


- Acceptance Evidence: The Business Continuity Plan (BCP) and the Disaster Recovery (DR) Plan and the Processes.
  Acceptance Criteria: The Entity must attach processes and plans that ensure work progress without disruption, including the following, as a minimum:
  • The Business Continuity Processes and Plan (BCP), including, as a minimum:
    - Risk assessment and business impact analysis.
    - Roles and responsibilities.
    - Alternative work locations.
    - The infrastructure.
    - Incident management.
    - The communication mechanisms of the continuity procedures to notify the relevant stakeholders, e.g.: employees and customers.
  • The Disaster Recovery (DR) Processes and Plan, including, as a minimum:
    - A prioritized list of information systems defining their recovery order.
    - Assigning the roles responsible for addressing / handling incident cases and responding.
    - Defining and specifying the procedural actions to be taken to activate a response to each incident through the system.
    - Defining and specifying the procedural actions to be taken to reduce the damage and mitigate the incident consequences on the Entity's critical operations.
    - Definition & identification of Recovery Point Objectives (RPO) (the maximum targeted period within which Data might be lost without causing damage to the business) for each information system covered in the plan.
    - Definition of Recovery Time Objectives (RTO) (the maximum targeted duration of time within which the Database can be down without causing damage to the business) for each information system covered in the plan.
    - Definition of Recovery activities.
  • Periodic training courses with scenario-based trial simulations to evaluate the response and identify areas for improvement.
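The RPO and RTO definitions above lend themselves to a simple compliance check: a system's backup interval must not exceed its RPO, and its drill-tested recovery time must not exceed its RTO. The system names and hour values below are assumptions for illustration, not figures from the Index.

```python
# Illustrative check that each system's backup frequency can meet its
# declared RPO, and that its drill-tested recovery time meets its RTO.
systems = [
    {"name": "ERP",    "backup_interval_h": 4,  "rpo_h": 6,
     "last_drill_recovery_h": 3, "rto_h": 4},
    {"name": "Portal", "backup_interval_h": 24, "rpo_h": 12,
     "last_drill_recovery_h": 2, "rto_h": 8},
]

def compliance(sys):
    return {
        "name": sys["name"],
        # Data older than one backup interval can be lost, so the
        # interval must fit inside the RPO.
        "rpo_ok": sys["backup_interval_h"] <= sys["rpo_h"],
        "rto_ok": sys["last_drill_recovery_h"] <= sys["rto_h"],
    }

for s in systems:
    print(compliance(s))
# "Portal" fails the RPO check: a 24-hour backup interval cannot meet a 12-hour RPO.
```

Running such a check after each DR drill gives the Level 4 monitoring report a concrete, per-system pass/fail input.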

- Acceptance Evidence: A Document of Actions Required to Implement Database Changes and Rollbacks.
  Acceptance Criteria: The Entity must attach a document explaining the definitions of the measures and the procedural action steps to be taken to implement the changes on each Database, or to rollback / undo the changes as & when needed.

Level 3: Activated
- Acceptance Evidence: The Technical Design Document.
  Acceptance Criteria: The Entity must attach the Technical Design document including the following, as a minimum:
  • Accurate information about the technical infrastructure of the main location and the Disaster Recovery (DR) location.
  • The processes and procedures which are required for Data recovery.
  • System architecture, including hardware, software, and network structure.
  • Tasks and responsibilities of the technical team responsible for DR.
  • Testing and maintenance procedures of the DR processes.

- Acceptance Evidence: The BCP Run Report.
  Acceptance Criteria: The Entity must attach the BCP Run Report, i.e., the output document produced after BCP execution, explaining the results, recommendations, etc. which ensure the continuous availability of critical business functions during adverse / damaging incidents. The following must be included, as a minimum:
  • The implementation status of the BCP, practices and processes.
  • The implementation status of the DR Plan, practices and processes.


- Acceptance Evidence: The Change Request for Production Data.
  Acceptance Criteria: The Entity must attach the approved Change Request form / template used for requesting a Data change in the production environment.
- Acceptance Evidence: The Production Data Access Control Document.
  Acceptance Criteria: The Entity must attach the Access Control document of Data in the Production Environments (based on the authorizations matrix and the roles), including the following, as a minimum:
  • The change requests initiating the process.
  • Definition and identification of procedural actions to be taken for a controlled implementation of changes in the Databases.
  • Definition and identification of procedural actions to be taken for rollback / reversing the changes in cases of identified issues.

Level 4: Managed
- Acceptance Evidence: The Monitoring Report with Pre-defined KPIs for Business Continuity (BCP) and Disaster Recovery (DR).
  Acceptance Criteria: The Entity must attach a monitoring report prepared based on pre-defined KPIs (Indicator Cards) for the following:
  • Data Storage Capacity Utilization:
    - The percentage of total capacity used.
    - The percentages of capacities used by the type of Database.
    - The percentage of capacity used for backups.
  • The number of performed Data transactions.
  • The average time of queries execution.
  • The BCP implementation monitoring KPIs Report.
- Each indicator’s data or card should include the following, as a minimum:
  • Indicator’s Name / Code.
  • Indicator’s Owner.
  • Indicator’s Coordinator.
  • Indicator’s Description.


  • The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).
  • Indicator’s Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (Measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer
- Acceptance Evidence: The Continuous Improvement Report on the Business Continuity Plan (BCP).
  Acceptance Criteria: The Entity must attach a detailed report with clarity and details on optimizing the following, as a minimum:
  • The BCP.
  • Processes for production environment authorization access control.
  • Processes for Data backup and recovery.
  • The DR plan.
- The continuous improvement report must include the following, as a minimum:
  • The documents of the periodic reviews & documented results.
  • The continuous improvement mechanisms.

8.2.5. Document and Content Management Domain

Checklist – Document and Content Management Domain

DCM.MQ.1: Has the Entity developed a Document and Content Management (DCM) plan and a Digitization plan to manage the implementation of paperless management activities?

Level 0: Absence of Capabilities
- Acceptance Evidence: Not Applicable.
- Acceptance Criteria: Not Applicable.

Level 1: Establishing
- Acceptance Evidence: Evidence of Existing DCM Activities.
  Acceptance Criteria: Evidence must be attached proving the existence of DCM activities, including what supports practicing a DCM activity, such as:
  • Copies / screenshots of e-mails.
  • Copies / screenshots of documents.
  • Screenshots of systems / tools.

Level 2: Defined
- Acceptance Evidence: The DCM Plan.
  Acceptance Criteria: The DCM plan must include the following, as a minimum:
  • A roadmap that includes the activities and milestones for the implementation of Documents and Content Management activities. The activities shall incorporate what is needed to achieve this Domain’s specifications, as a minimum.
  • The assignment of the required resources & budget allocation to manage the implementation of Documents and Content Management processes.
- Acceptance Evidence: The DCM Digitization Plan.
  Acceptance Criteria: A report must be attached containing the DCM Digitization Plan and the implementation status with the necessary details, as a minimum:
  • Roadmap with the activities and key milestones for the migration of the Entity's existing paper-based documents to the electronic format.
  • Roadmap with the activities and key milestones for the implementation of initiatives focused on eliminating the creation of paper-based documents in the Entity and replacing them with electronic documents.
  • Assignment of the required resources and budget to manage the implementation of paperless management initiatives.


- Acceptance Evidence: The Developed Prioritization Process for Documents and Content.
  Acceptance Criteria: The Prioritization Process of Documents and Content must include the following, as a minimum:
  • The definition of "Data Prioritization".
  • The prioritization matrix.
  • The identification and definition of DCM procedures.
  • The ranked list of prioritized document workflows.
- Acceptance Evidence: The Identified DCM Processes / Procedures.
  Acceptance Criteria: Based on the approved policies, the Entity must attach a report on the approved procedures followed for Data Operations (DO), storage, retention and business continuity, containing the following:
  • Storage conditions ensuring Data protection in disaster events.
  • Data Retention periods based on Data type, classification, business value and legal requirements.
  • Disposal and destruction rules based on the Data type & Data classification.
  • The actions required in the event of an accidental permanent loss of Data.
- Acceptance Evidence: The Ranked List of Prioritized Document Workflows to be Implemented.
  Acceptance Criteria: The Entity must attach a report clarifying a ranked list of tasks for each prioritized Workflow of Documents, including:
  • The definition and rating of the workflows based on the level of importance and impact on business operations.
  • The tasks of each prioritized Workflow based on the compliance requirements, risk factors, and operational needs.
  • A clear implementation roadmap for each Workflow, including the timelines and resource assignments.
- Acceptance Evidence: The Developed DCM Training Plan.
  Acceptance Criteria: The Entity must attach the developed DCM training plan, including:
  • The target audience groups identified in the training plan, such as employees, managers, and IT staff.
  • The developed training curricula, including presentations, handouts and online resources, designed / customized specifically to satisfy the identified & defined needs.
  • The training delivery staff’s assigned responsibilities, including trainers or facilitators.
  • The training schedule.


Level 3: Activated
- Acceptance Evidence: The DCM Plan Implementation Status Report.
  Acceptance Criteria: A report must be attached clarifying the DCM Plan’s implementation status, including the following, as a minimum:
  • The achievement percentages of the actions / works included in the DCM Plan.
- Acceptance Evidence: The Digitization Plan Implementation Status Report.
  Acceptance Criteria: A report must be attached clarifying the Digitization Plan’s implementation status, including the following, as a minimum:
  • The achievement percentages of the actions / works included in the Digitization Plan.
- Acceptance Evidence: The List of Prioritized Documents for Digitization.
  Acceptance Criteria: A report must be attached clarifying the prioritized list of documents to be digitized based on the Inventory of Data Elements within the Entity.
- Acceptance Evidence: The DCM Roles and Responsibilities.
  Acceptance Criteria: The Entity must attach the approved DCM Roles & Responsibilities.
- Acceptance Evidence: The DCM Training Plan Implementation Status Report.
  Acceptance Criteria: The DCM Training Plan implementation status report must be attached, including the following, as a minimum:
  • The objectives of the training program, with the topics covering:
    - Introduction of the DCM Policies.
    - Introductory and advanced tutorials about DCM systems used by the Entity and their functionalities.
  • The dates and implementation status.

Level 4: Managed
- Acceptance Evidence: The Monitoring Reports with Pre-defined KPIs for the Implementation of the DCM Plan & the DCM Digitization Plan.
  Acceptance Criteria: The Entity must attach monitoring reports prepared based on pre-defined KPIs (Indicator Cards), including, as a minimum:
  • The Monitoring Report with approved & pre-defined KPIs for the Implementation of the DCM Plan, with the Indicator Card(s) attached.
  • The Monitoring Report with approved & pre-defined KPIs for the Implementation of the Digitization Plan, with the Indicator Card(s) attached.
- In both reports, each indicator’s data or card should include the following, as a minimum:
  • Indicator’s Name / Code.
  • Indicator’s Owner.


  • Indicator’s Coordinator.
  • Indicator’s Description.
  • The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).
  • Indicator’s Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (Measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer
- Acceptance Evidence: The Continuous Improvement Report of the DCM and Digitization Plans.
  Acceptance Criteria: A report must be attached showing that the Entity identified, implemented and is regularly monitoring the Continuous Improvement mechanisms for both the DCM Plan & the Digitization Plan, including the following, as a minimum:
  • The DCM plan continuous improvement mechanisms.

Checklist – Document and Content Management Domain

DCM.MQ.2: Has the Entity implemented policies and processes for Document and Content Management (DCM), including: Backup & Recovery, Retention & Disposal, Document & Content Access Approval, and Metadata Publishing?

Level 0: Absence of Capabilities
- Acceptance Evidence: Not Applicable.
- Acceptance Criteria: Not Applicable.

Level 1: Establishing
- Acceptance Evidence: Evidence of Processes for Retaining & Disposing of Documents.
  Acceptance Criteria: Evidence must be attached covering the Processes of Document Retention and Disposal, including what supports practicing each procedure / process, such as:
  • Copies / screenshots of e-mails.
  • Copies / screenshots of documents.
  • Screenshots of systems / tools.


Level 2: Defined
- Acceptance Evidence: The Developed Policy Document for the DCM Lifecycle.
  Acceptance Criteria: Across the DCM Lifecycle, the Entity must attach the developed Policies for each of the following, as a minimum:
  • The Naming Convention Policies.
  • Policies for assigning classification levels to documents.
  • The Access Approval Policies.
  • The Backup & Recovery Policies.
  • The Retention & Disposal Policies.
- Every Policy’s document must include the following, as a minimum:
  • Policy Name.
  • Policy Owner.
  • Release date.
  • Version number.
  • Document control (Preparation, review, approval).
  • Version history.
  • Terminology.
  • Goal.
  • Scope of work.
  • Principles.
  • Policy Statement.
  • Job roles & responsibilities.
  • Related Policies.
  • References.
- Acceptance Evidence: The Developed DCM Backup & Recovery Procedures.
  Acceptance Criteria: The Entity must attach reports showing evidence that the Entity has included the Document and Content Management Systems within its overall backup and recovery plan for DCM Backup & Recovery.
- Acceptance Evidence: The Developed DCM Retention & Disposal Procedures.
  Acceptance Criteria: The Entity must attach reports containing the documentation of the developed DCM Retention & Disposal Processes / Procedures.
- Acceptance Evidence: The Developed DCM Role-Based Access Approval Procedures.
  Acceptance Criteria: The Entity must attach reports containing the documentation of the developed DCM Access Approval & Role-Based Access Authorization Processes.


- Acceptance Evidence: The Developed DCM Metadata Publishing Procedures.
  Acceptance Criteria: The Entity must attach reports containing the documentation of the developed processes for the following:
  • Publishing the Metadata of Documents & Contents.
  • Classifying Documents & Contents.

Level 3: Activated
- Acceptance Evidence: The Implemented Workflow for the DCM Retention & Disposal Process.
  Acceptance Criteria: Evidence must be attached for the workflow performed for the Retention & Disposal process. This shall include:
  • Evidence of handover of documents to the Entity's archival facility.
  • Evidence of physical destruction of documents, including overwriting and secure deletion.
- Acceptance Evidence: The Implemented Workflow for the DCM Access Approval Process.
  Acceptance Criteria: Evidence must be attached for the workflow performed for the Access Authorization Approval process based on job roles & job tasks (a copy of the work procedure steps).
- Acceptance Evidence: Evidence of Document Transfers to the Archival Facility (Archival Register).
  Acceptance Criteria: Evidence must be attached to prove transferring documents to the Entity’s Archival Facility / Archives Unit (registering the archive in a system), e.g.: copies of Archived Documents.
- Acceptance Evidence: The Documents Disposal Register.
  Acceptance Criteria: Evidence must be attached to prove the existence of a Register for logging the Disposal of Documents (a copy of the Documents Disposal Register).
- Acceptance Evidence: The Access Rights Documentation.
  Acceptance Criteria: The Entity must attach the Access Rights Documentation, including the following, as a minimum:
  • The mechanism of accessing documents and contents, clarifying the dependence on the role and job tasks.
  • The identified Access Groups based on Data Classification Domain Standards & Controls.
- Acceptance Evidence: The Report on Document & Content Metadata Publishing.
  Acceptance Criteria: The Entity must attach a report on Publishing Metadata of Documents & Contents, including the following, as a minimum:
  • The Metadata Standards.
  • The Forms / Templates.
  • The Data Catalog Integration document.
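A report on metadata publishing would typically show concrete records pushed to the Data Catalog. The record below is hypothetical; its field names follow common document-metadata practice and are not a schema mandated by the Index.

```python
# Hypothetical minimal metadata record published to a data catalog for a
# managed document; all field names and values are illustrative assumptions.
import json

doc_metadata = {
    "title": "Data Retention Policy v2.1",
    "owner": "Records Management Unit",
    "classification": "Restricted",
    "created": "2023-05-14",
    "retention_years": 7,
    "format": "PDF",
}

# Serialize for publication, e.g. via a catalog's ingestion endpoint.
print(json.dumps(doc_metadata, indent=2))
```

Whatever the actual schema, the published fields should line up with the Entity's Metadata Standards and its classification policy so catalog consumers can filter by classification and retention rules.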


Level 4: Managed
- The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) of the DCM Processes Aligned with the Policies: The Entity must attach a monitoring report prepared based on pre-defined KPIs (Indicator Cards), including, as a minimum:
  • The size / volume of the Entity's documents stored in the Entity's Document Management System (DMS).
  • The number of system / tool users.
  • The percentage of migrated records.
  • The number of users of the Entity's Document Management System (DMS).
  • The percentage of paper-based documents transformed into electronic formats.

- Each indicator's data or card should include the following, as a minimum:
  • Indicator's Name / Code.
  • Indicator's Owner.
  • Indicator's Coordinator.
  • Indicator's Description.
  • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
  • Indicator's Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (the measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator's polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

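The indicator-card fields above describe a fixed record structure. A minimal sketch of such a card as a data structure, including how polarity determines whether a measurement is on track (the field names are an illustrative mapping, not an official schema):

```python
from dataclasses import dataclass, field

@dataclass
class IndicatorCard:
    """One KPI 'indicator card' carrying the minimum fields listed above."""
    code: str
    name: str
    owner: str
    coordinator: str
    description: str
    objective: str
    equation: str
    unit: str             # e.g. "Percentage" or "Number / Quantity"
    baseline: float       # value in the first measurement year
    target: float
    periodicity: str      # Monthly / Quarterly / Biannually / Annually
    collection: str       # data collection mechanism
    polarity: int         # +1: higher is the target, -1: lower is the target
    data_sources: list = field(default_factory=list)

    def on_track(self, measured: float) -> bool:
        """A measurement is on track when it is at least as good as the
        target, in the direction given by the indicator's polarity."""
        return self.polarity * measured >= self.polarity * self.target
```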
Level 5: Pioneer
- The Continuous Improvement Report on the DCM Processes and Practices: A report must be attached clarifying that the Entity has identified, implemented, and is regularly monitoring the continuous improvement mechanisms for the DCM Processes, including the following, as a minimum:
  • The continuous improvement mechanisms of the DCM Processes.
  • The review documents and the periodic results of the DCM Processes.

Checklist – Document and Content Management Domain

DCM.MQ.3 Has the Entity implemented a tool to support Document and Content Management (DCM) processes, including Digitization Management implementation?

Levels | Acceptance Evidence | Acceptance Criteria

Level 0: Absence of Capabilities
- Not Applicable.

Level 1: Establishing
- Evidence of Document Storage / Retention & Disposal Activities: The Entity must attach a report on the current activities related to document storage, document retention, and document disposal, including one of the following to support the practice of each activity, e.g.:
  • Copies / screenshots of e-mails.
  • Copies / screenshots of documents.
  • Screenshots of systems / tools.

Level 2: Defined
- The Documented DCM Tool Requirements: The Entity must attach the documented DCM Tool requirements, including the following, as a minimum:
  • The functional requirements of the tool.
  • The non-functional requirements of the tool.
  • The tool's use case template / form.

Level 3: Activated
- The Implemented Tool for DCM: The Entity must attach evidence that the implemented DCM tool includes the following, as a minimum:
  • Document Management System (DMS) - an application used to capture, store, and manage documents in an electronic format (electronic documents and digital media). The selected DMS tool shall provide, at minimum, the following capabilities:
    - Storage of documents.
    - OCR (Optical Character Recognition) functionality to analyze imported images.
    - Indexing of documents.
    - Versioning of documents, including tracking of the history of changes.
    - Secured access to documents.
    - Global search and discovery across the registered documents.
    - Development of document workflows.


  • Web Content Management System - an application used to store and manage the website content used by the Entity's portals and internet sites.
  • Collaboration tools - applications providing users with a platform to collaborate in real time on electronic documents, communicate using chat, and track changes in documents.

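The versioning-with-history capability required of the DMS above can be sketched as an append-only version log: each save adds an immutable entry instead of replacing the content. This is an illustrative model, not a specific product's API:

```python
from datetime import datetime, timezone

class VersionedDocument:
    """Minimal sketch of DMS document versioning with change history."""

    def __init__(self, doc_id: str):
        self.doc_id = doc_id
        self.versions = []          # full change history, oldest first

    def save(self, content: str, author: str) -> int:
        """Append a new version; prior versions are never modified."""
        self.versions.append({
            "version": len(self.versions) + 1,
            "content": content,
            "author": author,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
        return self.versions[-1]["version"]

    def current(self) -> str:
        return self.versions[-1]["content"]

    def history(self):
        """Who changed the document, in order — the tracked history of changes."""
        return [(v["version"], v["author"]) for v in self.versions]
```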
- The Record of Digitized Documents: The Entity must attach evidence of the number of digitized documents (a copy of the Record of Digitized Documents).

- The List of Approved Users: The Entity must attach a list of the approved users of the DCM system / tool.

Level 4: Managed
- The Monitoring Report on the Implemented Tool with Pre-defined Key Performance Indicators (KPIs): The Entity must attach a monitoring report prepared based on pre-defined KPIs (Indicator Cards); each indicator's data or card should include the following, as a minimum:
  • Indicator's Name / Code.
  • Indicator's Owner.
  • Indicator's Coordinator.
  • Indicator's Description.
  • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
  • Indicator's Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (the measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator's polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer
- The Continuous Improvement Report on the DCM Solution / Tool Design and Practices: A report must be attached clarifying that the Entity has identified, implemented, and is regularly monitoring the continuous improvement mechanisms for the implemented DCM tool's performance.

8.2.6. Data Architecture and Modelling Domain

Checklist – Data Architecture and Modelling Domain

DAM.MQ.1 Has the Entity developed and implemented a plan to improve its Data Architecture Capabilities?

Levels | Acceptance Evidence | Acceptance Criteria

Level 0: Absence of Capabilities
- Not Applicable.

Level 1: Establishing
- The Existing DAM Domain Related Practices: The Entity must attach a report clarifying the current DAM practices.

Level 2: Defined
- The Approved DAM Plan: The DAM implementation plan must include the following, as a minimum:
  • A roadmap with the activities & milestones for the Target State Data Architecture. The activities shall incorporate what is needed to achieve this Domain's specifications, as a minimum.
  • The assignment of the required resources & budget to manage the implementation of the Target State Data Architecture.

- The Approved Current State Data Architecture & the Existing Technical Architecture: The Entity must attach a report on the Current State Data Architecture and the Technical Architecture to support the development of the Target Architecture. The Current Data Architecture & the existing Technical Architecture shall cover, as a minimum:
  • A data model at the conceptual, logical, and physical levels.
  • The current processes used in conducting business processes & decision making.
  • The Key System Components - the current essential applications, data storages, data processing platforms, and data analytics solutions used in key processes.
  • Data flow and data lineage.

- The Approved Target State Data Architecture: The Entity must attach the Target Data Architecture, including the following, as a minimum:
  • A data model at the conceptual, logical, and physical levels.
  • The targeted processes used in conducting business processes & decision making.
  • The Key System Components - the targeted essential applications, data storages, data processing platforms, and data analytics solutions used in key processes.
  • Data flow and data lineage.

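The conceptual / logical / physical modelling levels required above can be illustrated for a single entity. The sketch below is illustrative (names invented): the conceptual level is a business statement, the logical level lists attributes and keys, and the physical level realises the model as DDL for a concrete engine (SQLite here):

```python
import sqlite3

# Conceptual level: things and relationships, no attributes.
conceptual = "A Document is produced by exactly one Department."

# Logical level: attributes and keys, independent of any storage engine.
logical = {
    "Document":   {"doc_id (PK)", "title", "dept_id (FK -> Department)"},
    "Department": {"dept_id (PK)", "name"},
}

# Physical level: the logical model as engine-specific DDL.
physical_ddl = """
CREATE TABLE department (dept_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE document (
    doc_id  INTEGER PRIMARY KEY,
    title   TEXT NOT NULL,
    dept_id INTEGER NOT NULL REFERENCES department(dept_id)
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(physical_ddl)
tables = {row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")}
```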

- The Future State Gap Assessment: The Entity must attach a report on the Future-State Gap Assessment, including the following, as a minimum:
  • An analysis of the gaps between the Current Data Architecture and the Target Data Architecture.

- The Approved Enterprise Architecture Framework: The Entity must attach the Enterprise Architecture (EA) Framework and the general model of the EA components, so that the Government Entity has a comprehensive business & IT blueprint linked to and aligned with the Entity's strategic objectives.

- The DAM Policy: The Entity must attach the DAM Policies, including the following for each policy, as a minimum:
  • Policy Name.
  • Release Date.
  • Release Number.
  • Document Control (Preparation, Review, Approval).
  • Version History.
  • Terminologies.
  • Goal.
  • Scope of Work.
  • Roles & Responsibilities.
  • References.
  • Policy Owner.
  • Policy Statement, including the following:
    - The Target Data Architecture shall address the strategic requirements defined within the Entity's Data Management & Personal Data Protection (DM & PDP) Strategy.
    - The Target Data Architecture shall be mapped to the overall Enterprise Architecture (EA).
    - The Target Data Architecture shall adopt a widely used Enterprise Architecture Framework, e.g., TOGAF or Zachman.
    - The regular mechanisms for monitoring & updating Data Models.


Level 3: Activated
- The DAM Plan Implementation Progress Report: The Entity must attach a report clarifying the implementation status, including the following, as a minimum:
  • The achievement percentages of the initiatives & projects included in the DAM Implementation Plan.

- The EA Framework Implementation Report: The Entity must attach proof of implementing the approved EA Framework for the Data Architecture.

Level 4: Managed
- The Monitoring Report of the DAM Plan & Activities Implementation with Pre-defined KPIs: The report must be prepared based on pre-defined KPIs (Indicator Cards); each indicator's data or card should include the following, as a minimum:
  • Indicator's Name / Code.
  • Indicator's Owner.
  • Indicator's Coordinator.
  • Indicator's Description.
  • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
  • Indicator's Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (the measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator's polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer
- The Data Architecture Plan's Continuous Improvement Report: The Entity must attach a report showing that the Entity has identified, implemented, and is monitoring the Data Architecture plan's continuous improvement mechanisms, including the following, as a minimum:
  • The documentation of the periodic Data Architecture reviews and the documented results.
  • The continuous improvement mechanisms.


- The Architecture Change Management Process: The Entity must attach a document clarifying a specific Architecture Change Management Process for reviewing, approving, and implementing changes in the Current & Target Data Architectures. The architecture change scope shall include the following, as a minimum:
  • Requests for new DAM initiatives.
  • Modifications to the Current State Architecture documents of the existing initiatives.

- The Change Control Document: The Entity must attach a document clarifying the Change Control Process to be used in changing the Current & Target Data Architectures. The document shall include the following, as a minimum:
  • The scope of the change.
  • The impact assessment of the change.
  • The procedures of the change.
  • The roles & responsibilities.
  • The data of the change approvals.
  • The data of the change requester.

- The Data Architecture Checkpoints Report: The Entity must attach a report showing that the Entity has incorporated Data Architecture checkpoints into its Software Development Lifecycle (SDLC) processes. The checkpoints shall include the following, as a minimum:
  • Investigating the possibilities of reusing existing Data Architecture components to address the business requirements.
  • Validating the conformance of the created Data Models with the Entity's Enterprise Data Model.
  • Verifying whether the project requires any change to the Entity's overall Enterprise Data Model.

Checklist – Data Architecture and Modelling Domain

DAM.MQ.2 Has the Entity developed and implemented practices for Data Architecture & Modelling (DAM) activities (including Data Flows, Data Models, and Governance considerations)?

Levels | Acceptance Evidence | Acceptance Criteria

Level 0: Absence of Capabilities
- Not Applicable.

Level 1: Establishing
- The Current DAM Domain Activities: The Entity must attach the following, as a minimum:
  • Proof of the current processes followed to manage the Data Flow & DAM practices.
  • Documents which clarify the current (conceptual & logical) Data Models.
  • Documents which clarify that the Data Lineages (of Data Flows) are aligned with business requirements.

Level 2: Defined
- A Document Containing the Approved Business Processes on the Data Architecture and Any Related Data Flow: The Entity must attach a document clarifying the business processes on the Data Architecture, identifying the roles & responsibilities to ensure compliance with the DAM Policies and the effective & systematic achievement of Data Architecture activities.

- A Document Containing the Big Data Considerations, Including the Data Lake Requirements: The Entity must attach a document specifying the requirements for developing a Data Storage / Lake environment using a vendor-neutral Big Data Reference Architecture Framework (e.g., NIST) and incorporating Big Data architecture components into its overall Target Data Architecture design. The Data Storage / Lake requirements shall address the following, as a minimum:
  • Ingest - ingesting and converting semi-structured & unstructured datasets into a structured form.
  • Infrastructure - the networking, computing, and storage requirements needed to handle large, diverse formats of data.
  • Platform - a distributed storage solution providing distributed computing / processing capabilities.


- A Document Containing the Data Processing Considerations, Including the Partitioning Strategy: The Entity must attach a document specifying data processing considerations, including the following, as a minimum:
  • A partitioning strategy for its Target State Data Architecture for the efficient processing of various data volumes, data variety & data velocity.
  • The partitioning strategy's coverage of both real-time and batch processing operations.

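The partitioning strategy above must cover both batch and real-time paths. A minimal sketch of one common approach — routing records to date-based partition keys, with a finer hour bucket for real-time data so hot partitions stay small; the field names and bucketing scheme are illustrative:

```python
from datetime import datetime

def partition_key(record: dict) -> str:
    """Daily partitions for batch loads; an extra hour bucket for
    records flagged as real-time."""
    ts = datetime.fromisoformat(record["event_time"])
    key = ts.strftime("%Y-%m-%d")
    if record.get("realtime"):
        key += f"/h={ts:%H}"
    return key

def route(records):
    """Group records by partition key (the batch side of the strategy),
    so each partition can be processed independently and in parallel."""
    parts = {}
    for r in records:
        parts.setdefault(partition_key(r), []).append(r)
    return parts
```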
- Model Representation: The Entity must attach a document demonstrating the selection and documentation of a planning method to document the business structure, relationships, and code at the conceptual, logical, and physical levels, and its use throughout the software application development life cycle.

- The DAM Register: The Entity must attach a register clarifying the saving mechanism for the following:
  • The Data Architecture & Technical Architecture project.
  • The reference documentation / materials.
  • Data Model designs.

- A Document Containing the Data Model Representation's Technical Standards & Best Practices: The Entity must attach a document clarifying that the Entity has adhered to a standardized method in building Data Models according to best practices (such as naming conventions, data types, basic attributes, physical model deployment considerations, improvements, etc.).

Level 3: Activated
- The Data Integration Pattern Implementation Document: The Entity must attach the documents of the Data Integration Pattern diagramming (implemented within the Data Architecture).

- Evidence of DAM Tools & Technologies in Use: The Entity must choose & implement a set of technologies to design, develop, and execute the Entity's DAM initiatives. The set of technological tools shall include, as a minimum:
  • Data Architecture Design - visually representing data and system components along with data flow diagramming.
  • Data Modeling - drawing functionality to create & modify data and system objects, attributes, and relationships, and to reverse-engineer the existing data models.
  • Data Lineage - capturing and maintaining data flows between systems to enable an impact analysis.

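The data-lineage capability above — captured flows between systems that enable an impact analysis — can be sketched as a directed graph queried for everything downstream of a changed source. The system names are hypothetical:

```python
def downstream(flows, source):
    """All systems transitively fed by `source`, i.e. what a change to
    `source` may impact. `flows` is a list of (from_system, to_system) edges."""
    impacted, frontier = set(), {source}
    while frontier:
        nxt = {t for (f, t) in flows if f in frontier and t not in impacted}
        impacted |= nxt
        frontier = nxt
    return impacted

# Hypothetical captured data flows between an Entity's systems.
FLOWS = [("CRM", "DataLake"), ("ERP", "DataLake"),
         ("DataLake", "Warehouse"), ("Warehouse", "Dashboard")]
```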

- Evidence of the Enterprise Data Model Uploaded to the Data Catalog: The Entity must attach proof that the Data Catalog has been updated based on new or updated Data Models.

- Evidence of Applying the Technical Data Standards: The Entity must attach a picture proving the application of the technical standards to the Data Representation Models, as pre-defined in the Data Model Representation's Technical Standards document.

Level 4: Managed
- The Monitoring Report of the Implementation of the DAM Practices with Pre-defined KPIs: The report must be prepared based on the data of the Key Performance Indicators (KPIs) (Indicator Cards) pre-defined to measure the Entity's performance in implementing the DAM practices; each indicator's data or card should include the following, as a minimum:
  • Indicator's Name / Code.
  • Indicator's Owner.
  • Indicator's Coordinator.
  • Indicator's Description.
  • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
  • Indicator's Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (the measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator's polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer
- The Continuous Improvement Report Including the DAM Business Process Documentation & Other Related Data Flow Documentation: The Entity must attach a report containing the following, as a minimum:
  • The documentation of identifying, implementing, and monitoring the continuous improvement mechanisms of the DAM practices with the relevant business processes.
  • The other Data Flow documentation.

8.2.7. Data Sharing & Interoperability Domain

Checklist – Data Sharing and Interoperability Domain

DSI.MQ.1 Has the Entity developed and implemented a Data Sharing and Integration (DSI) Plan in line with the Data Sharing Policies?

Levels | Acceptance Evidence | Acceptance Criteria

Level 0: Absence of Capabilities
- Not Applicable.

Level 1: Establishing
- The Names of the Current Practices of Data Sharing and Integration (DSI): The Entity must attach a list of the names of the current DSI practices and activities, including evidence to support the practice of each activity, being one of the following, as a minimum:
  • Email copies / screenshots.
  • Document copies.
  • System screenshots.

- The Updated and Approved Results of the Initial Data Integration Assessment: The Entity must attach the report of the Data Integration assessment results, including the following, as a minimum:
  • The current IT structure: an inventory of all existing IT components (data sources, systems, applications, and data stores).
  • High-level Data Lineage, including the rules according to which data is changed, and the frequency of changes.
  • The Data Models used by the Entity's IT components.

Level 2: Defined
- The Developed Data Integration Strategy Document: The Entity must attach the developed Data Integration Strategy containing the following, as a minimum:
  • The Strategy's approval and effective date.
  • The Strategy must be recent (approved within the last three years).
  • The objectives, initiatives, projects, and metric indicators of the implementation of the Data Integration activities and milestones. The activities shall incorporate what is needed to achieve this Domain's specifications, as a minimum.
  • The implementation roadmap of the Data Integration initiatives.
  • The assignment of the required resources & budget allocation to manage the implementation of this Domain.


- The Developed Target Data Integration Architecture: The Entity must attach the Target Data Integration Architecture, including the following, as a minimum:
  • The Data Integration requirements.
  • The Data Integration Architecture diagram.
  • The Architecture components.

- The Developed Data Integration Plan (Including Data Sharing Activities): The Entity must attach the developed Data Integration Plan, including the following, as a minimum:
  • The summarized Architecture briefs.
  • The roadmap and covenant charters of the projects and initiatives in the plan.
  • The assignment of the required resources & budget allocation for the projects and initiatives in the plan.

- The Developed Data Sharing Policies: The Entity must attach the policies of the Data Sharing & Interoperability (DSI) Domain, including the following, as a minimum:
  • The developed policies must be aligned with the National Data Governance Policies published by NDMO-SDAIA.
  • Every policy document must include the following, as a minimum:
    - Policy title / name.
    - Policy owner.
    - Release date.
    - Version number.
    - Version history.
    - Objective.
    - Identified stakeholders.
    - Communications process.
    - Monitoring procedure.
    - Document control (preparation, review, approval).
    - Policy statement.
    - Roles & responsibilities.
    - Terminology.
    - Scope of work.
    - Activation mechanisms.


    - Target audience.
    - Approval.
    - References.

- The Developed Data Sharing Training Plan: The Entity must attach the developed and approved DSI Domain training and awareness plan, including the following, as a minimum:
  • The implementation dates of the training and awareness programs plan.
  • The objectives of the training and awareness, including, as a minimum, the topics stated in the “Data Management and Personal Data Protection (DM & PDP) Standards” document, i.e.:
    - An introduction to the applicability of the Data Sharing process.
    - The leading / best Data Sharing practices.
    - The consequences of mishandling data.
    - The Data Sharing Standards, Controls & Principles.
  • The methods and channels through which the trainings will be conducted.
  • The identified DSI Domain awareness campaign channels, which include:
    - E-mails or mobile phone messages.
    - Publications.
    - Lectures or workshops.

Level 3: Activated
- A Data Integration Plan Implementation Status Report: The Entity must attach an updated and approved report on the implementation status of the DSI Plan, containing the following, as a minimum:
  • The activities related to the DSI Domain and the implementation status of each activity.

- A Progress Report on Data Sharing Training Programs: The Entity must attach an updated and approved report clarifying the DSI training & awareness implementation status, containing the following, as a minimum:
  • The training and awareness target audience.
  • The list of conducted training programs and awareness campaigns (including the topics of the programs, the dates when they were conducted, and the names of the training attendants).
  • Samples of the activities performed by the Entity to raise awareness about the DSI Domain, including:
    - E-mails or mobile phone messages.
    - Publications.
    - Lectures or workshops.


- A Report on the Defined Roles and Responsibilities for DSI: The Entity must attach an updated and approved report presenting the DSI Domain's roles and responsibilities, including the following, as a minimum:
  • The stewardship roles of the DSI processes.
  • The stewardship responsibilities of the DSI processes.

Level 4: Managed
- The Monitoring Report with Pre-defined Key Performance Indicators (KPIs) for the Data Integration Plan and Data Sharing Activities: The Entity must attach an updated and approved report on monitoring the DSI activities based on pre-defined KPIs (Indicator Cards); each indicator's data or card should include the following, as a minimum:
  • Indicator's Name / Code.
  • Indicator's Owner.
  • Indicator's Coordinator.
  • Indicator's Description.
  • The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).
  • Indicator's Equation.
  • Measurement Unit (Percentage, Number / Quantity, etc.).
  • Baseline (the measurement value in the first measurement year).
  • Target value.
  • Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).
  • Data sources used to calculate the indicator.
  • Data collection mechanism.
  • Indicator's polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer
- The Continuous Improvement Report on the Data Integration Plan and Data Sharing Activities: The Entity must attach an updated & approved report presenting the DSI Domain's continuous improvements, including the following, as a minimum:
  • The conducted reviews.
  • The results of each review.
  • The implemented enhancements (with supporting documents).

Checklist – Data Sharing and Interoperability Domain

DSI.MQ.2 Has the Entity defined and implemented Processes for Sharing Data within the Entity and with other Entities?

Levels | Acceptance Evidence | Acceptance Criteria

Level 0: Absence of Capabilities
- Not Applicable.

Level 1: Establishing
- Evidence of the Current DSI Practices / Processes by the Entity (Internally & Externally): The Entity must attach a report on the current DSI practices and processes, both internally within the Entity and externally with other Entities, including evidence to support the practice of each activity / process, being one of the following, as a minimum:
  • Email copies / screenshots.
  • Document copies.
  • System screenshots.

Level 2: Defined
- The Process Documentation for Data Sharing (Including Data Classification Levels and Timelines): The Entity must attach the Data Sharing Process documents with the process details, including the following, as a minimum:
  • Data Sharing Request reception.
  • Identification / assignment of roles.
  • Data Classification (DC) level check.
  • Data Sharing Principles assessment.
  • The Data Sharing decision and the reply with feedback.
  • The Business Data Executive's approval.
  • Design and implementation of the Data Sharing Controls.
  • Data Sharing Agreement signing.
  • Sharing the data with the requestor.

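The process steps above form an ordered lifecycle that each request must pass through. A minimal sketch of that lifecycle as an explicit state machine — the step names paraphrase the checklist and are not an official schema:

```python
# Ordered steps of the Data Sharing request lifecycle (paraphrased).
STEPS = [
    "received",               # Data Sharing Request reception
    "roles_assigned",         # identification / assignment of roles
    "classification_checked", # Data Classification level check
    "principles_assessed",    # Data Sharing Principles assessment
    "decision_sent",          # decision and feedback to the requestor
    "executive_approved",     # Business Data Executive's approval
    "controls_implemented",   # Data Sharing Controls in place
    "agreement_signed",       # Data Sharing Agreement signing
    "data_shared",            # data shared with the requestor
]

class SharingRequest:
    def __init__(self, request_id: str):
        self.request_id = request_id
        self.step = 0           # index into STEPS; starts at "received"

    def advance(self) -> str:
        """Move to the next step; steps cannot be skipped or repeated."""
        if self.step + 1 >= len(STEPS):
            raise ValueError("request already completed")
        self.step += 1
        return STEPS[self.step]

    @property
    def state(self) -> str:
        return STEPS[self.step]
```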
- The Developed and Approved Data Sharing Request Forms (Internal and External): The Entity must attach the forms of the internal and external Data Sharing requests using the Entity's approved templates.

- The Developed and Approved Internal Data Sharing Agreement Template: The Entity must attach the internal Data Sharing Agreements between information systems within the Entity.

- The Developed and Approved External Data Sharing Agreement Template: The Entity must attach the external Data Sharing Agreements with other Entities, including the following, as a minimum:
  • The purpose of the Data Sharing.


  • Information about each requesting and sharing Entity.
  • The lawful / legal / regulatory basis for the sharing.
  • The sharing details (date, duration, etc.).
  • Liability provisions.
  • The Data Sharing Agreements signed by the Business Data Executive and the requestor.

Level 3: Activated
- Evidence of the Operationalization of the Data Sharing Process (e.g., Communication of the Defined Data Sharing Mechanism): The Entity must attach a communication operationalization report related to the defined mechanism of the Data Sharing process, including the following, as a minimum:
  • Exchanged correspondence proving the establishment of a communication channel between the communicating Entities.
  • Communication evidence of the approval of a formal mechanism defined for Data Sharing.

- Evidence of Data Sharing Through SDAIA's Certified and Approved Channels: The Entity must attach a report on the electronic communication with the Entity requesting Data Sharing through the SDAIA-certified and SDAIA-approved channels (e.g., the Government Service Bus (GSB)). The report may contain:
  • System screenshots or records (the communication channel usage can include Data Sharing).
  • A report on the communications using the Entity's formal government website.

- The Access Authorization Controls Document: The Entity must attach a document detailing the controls of authorization & usage for accessing the official website.

- Evidence of the Data Sharing Requests Submitted to the Entity and the Requests Submitted by the Entity Through the Established Channel: The Entity must attach the record(s) containing the Data Sharing requests:
  • The list received by the Entity.
  • The list sent / submitted by the Entity.

- Evidence of the Entity's Responses to the Data Sharing Requests: The Entity must attach the notifications acknowledging receipt of the Data Sharing requests, along with the responses.


- An Adoption Evidence of - The Entity must attach a report containing an evidence of the Entity's adoption of a template
The Developed Data developed for the Data Sharing Agreement upon an internal request from within the Entity.
Sharing Agreement
Template for An Internal
Data Sharing Request.

- Adoption Evidence of the Developed Data Sharing Agreement Template for External Data Sharing Requests. - The Entity must attach a report containing evidence of the Entity's adoption of a template developed for the Data Sharing Agreement upon a request from an external Entity.

- The Documented Review Outcomes of the Data Sharing Agreements. - The Entity must attach a report on the results and outcomes of the documented reviews of the Data Sharing Agreements.

Level 4: Managed - The Monitoring Report with Pre-defined KPIs for the Data Sharing Processes. - The Entity must attach an updated and approved report on monitoring the Data Sharing Processes based on pre-defined KPIs (Indicator Cards), covering:

• The number of Data Sharing requests received.

• The number of Data Sharing requests accepted / denied.

• The number of Data Sharing requests sent.

• The number of ongoing Data Sharing agreements.

• The average duration of the Data Sharing request evaluation process, expressed in days.

- Each indicator’s data or card should include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).


• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).
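The indicator-card fields listed above can be captured in a small data structure. The sketch below is illustrative only — all names, values, and the polarity check are assumptions for illustration, not part of the Index:

```python
from dataclasses import dataclass, field

@dataclass
class IndicatorCard:
    """Minimal KPI indicator card covering the checklist's required fields."""
    code: str            # Indicator's Name / Code
    owner: str
    coordinator: str
    description: str
    objective: str       # strategic / operational objective measured
    equation: str
    unit: str            # "Percentage", "Number / Quantity", etc.
    baseline: float      # measurement value in the first measurement year
    target: float
    periodicity: str     # "Monthly" / "Quarterly" / "Biannually" / "Annually"
    data_sources: list = field(default_factory=list)
    collection_mechanism: str = ""
    polarity: str = "+"  # "+": higher value is the target, "-": lower value is the target

    def on_track(self, measured: float) -> bool:
        """Apply the polarity rule: compare a measured value against the target."""
        if self.polarity == "+":
            return measured >= self.target
        return measured <= self.target

# A hypothetical Data Sharing indicator:
card = IndicatorCard(
    code="DSI-01", owner="CDO Office", coordinator="Data Sharing Team",
    description="Share of Data Sharing requests accepted",
    objective="DSI.MQ.2", equation="accepted / received * 100",
    unit="Percentage", baseline=60.0, target=85.0, periodicity="Quarterly",
    data_sources=["Data Sharing request register"],
    collection_mechanism="Automated export", polarity="+",
)
print(card.on_track(90.0))  # True: 90% meets an 85% target with positive polarity
```

The same structure can serve every indicator-card list repeated in this checklist; only the field values change.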

- The Compliance Audit Methodology. - The Entity must attach the Compliance Audit Methodology document.

Level 5: Pioneer - The Continuous Improvement Report on the DSI Processes. - The Entity must attach a continuous improvement report including the following, as a minimum:

• Evidence of DSI Process automation.

• The documents of the periodic reviews, evaluations & documented results of the DSI Domain Processes.

Checklist – Data Sharing and Interoperability Domain

DSI.MQ.3 Has the Entity defined and implemented a Data integration architecture to manage Data movement efficiently across Data stores, systems, and applications?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing - Evidence of Data Movement / Integration within the Entity and with Other Entities. - The Entity must attach evidence of Data Movement or Data Integration / Interoperability:

• Internally within the Entity.

• Externally with other Entities.


Level 2: Defined - The Integration Requirements Document. - The Entity must attach the Integration / Interoperability Requirements document containing the following, as a minimum:

• A clearly defined scope.

• Entity's Business goals / objectives and aims to be achieved.

• Implementation Timeline.

• Resources required.

• Cost estimate.

• Functional requirements.

• Non-Functional requirements.

- The Solution Design Document. - The Entity must attach the Solution Design document including the following, as a minimum:

• Integration Solution Overview.

• Target Data Integration Architecture.

• Data Orchestration – The Data Flow Diagram (DFD).

• Source to Target Mapping – A set of Data Transformation instructions that determine how to convert the structure and content of Data in the source system to the structure and content needed in the target system. The instructions shall include the following, as a minimum:

  • The technical format of Data at the source and the target.

  • The Specifications of transformations required for all intermediate staging points between the source and the target.
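A source-to-target mapping of this kind can be expressed as data rather than prose. The sketch below is a hypothetical illustration — the field names and transformations are invented, not taken from the Index:

```python
# Each rule names the source field, the target field, and the transformation
# applied in between (format conversion, trimming, etc.).
MAPPING = [
    {"source": "natl_id",    "target": "national_id", "transform": str.strip},
    {"source": "birth_date", "target": "dob",         "transform": lambda v: v.replace("/", "-")},
]

def apply_mapping(record: dict) -> dict:
    """Convert one source-system record into the structure needed by the target system."""
    return {rule["target"]: rule["transform"](record[rule["source"]]) for rule in MAPPING}

row = {"natl_id": " 1002003004 ", "birth_date": "1990/01/31"}
print(apply_mapping(row))  # {'national_id': '1002003004', 'dob': '1990-01-31'}
```

Keeping the rules in a declarative table like this makes the mapping itself reviewable evidence, separate from the engine that executes it.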

Level 3: Activated - The Document of the Approved Standardized Entity-Wide Integration Solution Development Lifecycle. - The Entity must attach a lifecycle description document, being the standardized and approved cycle for the development of Integration / Interoperability solutions at the Entity level.

- The Developed Test Scripts and the Conducted Tests (Integration, Functional) in Line with the Plan and the Solution Design Document. - The Entity must attach the developed test scripts and the conducted tests (integration tests & functional tests) in alignment with the Plan and the Solution Design document, containing the following, as a minimum:

• The defined & identified Test Use Cases.

• Evidence of a Test Environment setup.

• The Test Use Cases, executed in a Test Environment, and the documented test results.


Level 4: Managed - The Monitoring Report with Pre-defined KPIs for the Data Integration Initiatives. - The Entity must attach an updated and approved report on monitoring the Data Integration / Interoperability initiatives based on the data of the KPIs (Indicator Cards) pre-defined for the DSI Domain, covering:

• Data transfer rate between systems / applications.

• Latency between Data sources and Data targets.

- Each indicator’s data or card should include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).

- The Integration Solution Monitoring and Maintenance Document. - The Entity must attach a document on monitoring and maintaining the Integration / Interoperability solutions, containing the following, as a minimum:

• Reports of any detected programming flaws or errors.

• Change requests from end users to incorporate changes in business requirements.

Level 5: Pioneer - The Continuous Improvement Report on the Data Integration Practices. - The Entity must attach an updated & approved report including the following for the Data Integration / Interoperability Practices, as a minimum:

• The documents of the periodic reviews & documented results.

• The continuous improvement mechanisms.


- The Continuous Improvement Mechanisms for Data Integration, e.g.: Continuous Integration & Continuous Delivery (CI/CD) Pipeline Details for Automation. - The Entity must attach a document clarifying the details of the automation activities program (Pipeline) in relation to Continuous Integration & Continuous Delivery (CI/CD).

Checklist – Data Sharing and Interoperability Domain

DSI.MQ.4 Has the Entity developed and implemented Data Sharing Controls and Processes for efficient Data transformation and movement?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing - The Existing Data Integration or Data Movement Processes within the Entity. - The Entity must attach a report on the current Data Integration / Interoperability Processes or the current Data Movement Processes within the Entity.

Level 2: Defined - The Developed Data Migration Processes (i.e.: ETL). - The Entity must attach a report on the Processes and Standards used to integrate Data from disparate sources and load it into the Data Warehouse store, i.e.: Extract, Transform, Load (ETL).

- The Developed Data Migration Processes (i.e.: ELT). - The Entity must attach a report on the Processes and Standards used to store unstructured Data in its raw native format in the Data Lake, i.e.: Extract, Load, Transform (ELT).

- The Risk Assessment Report on the Entity's Datasets to be Shared. - The Entity must attach an assessment report on the risks which may arise as a result of sharing the Entity's Datasets.


- The Defined Data Sharing Controls. - The Entity must attach a report on the Controls that have been identified and defined for Data Sharing.

Level 3: Activated - Evidence of the Implemented Data Migration Processes (i.e.: Extract, Transform, Load (ETL)). - The Entity must submit a report to showcase the implemented "Extract, Transform, Load (ETL)" process for integrating Data from disparate sources and loading it into the Data Warehouse store. The process must include the following steps:

• Extract – Data Extraction must include the following, as a minimum:

  • Identification of the Data Sources from which the Data will be extracted.

  • Extracting Data from the Data Sources.

  • Staging the extracted Data temporarily in a physical Data store, e.g.: on disk or in memory.

• Transform – The Data Transformation step must include the following, as a minimum:

  • Data Mapping – Planning the actual transformation process.

  • Data Transformation – Removing duplicate Data, filling in missing values, filtering, sorting, joining and splitting Data.

  • Review – Validating the correctness of the transformation.

• Load – Physically storing the transformed Data in the Data Warehouse / Store.
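The Extract / Transform / Load steps described above can be sketched end to end. This is a minimal, hypothetical illustration using an in-memory SQLite database as the warehouse; the table and field names are assumptions, not part of the Index:

```python
import sqlite3

def extract(rows):
    """Extract: stage source rows (an in-memory list stands in for a source system)."""
    return list(rows)

def transform(staged):
    """Transform: deduplicate, fill missing values, and sort, as the checklist describes."""
    seen, out = set(), []
    for rec in staged:
        if rec["id"] in seen:
            continue  # remove duplicate Data
        seen.add(rec["id"])
        out.append({**rec, "city": rec.get("city") or "UNKNOWN"})  # fill missing values
    return sorted(out, key=lambda r: r["id"])  # sort before loading

def load(conn, rows):
    """Load: physically store the transformed Data in the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS dw_person (id INTEGER PRIMARY KEY, city TEXT)")
    conn.executemany("INSERT INTO dw_person VALUES (:id, :city)", rows)

source = [{"id": 2, "city": "Riyadh"}, {"id": 1, "city": None}, {"id": 2, "city": "Riyadh"}]
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(source)))
print(conn.execute("SELECT id, city FROM dw_person ORDER BY id").fetchall())
# [(1, 'UNKNOWN'), (2, 'Riyadh')]
```

The ELT variant described next differs only in ordering: the raw rows would be loaded into the lake first and transformed afterwards.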

- Evidence of the Implemented Data Migration Processes (i.e.: Extract, Load, Transform (ELT)). - The Entity must submit a report to showcase the implemented "Extract, Load, Transform (ELT)" process for storing unstructured Data in its raw native format in the Data Lake. The process must include the following steps:

• Extract – The Data Extraction step must include the following, as a minimum:

  • Identification of the Data Sources from which the Data will be extracted.

  • Extracting Data from the Data Sources.

• Load – Physically storing Data in its raw native format in the Data Lake.

• Transform – The Data Transformation step must include the following, as a minimum:

  • Data Mapping – Planning the actual transformation process.


  • Data Transformation – Removing duplicate Data, filling in missing values, filtering, sorting, joining and splitting Data.

  • Review – Validating the correctness of the transformation.

- A Report on the Implemented Controls (e.g.: Data Security & Protection, Data Sharing, Data Integration, Data Access, etc.). - The Entity must attach a report clarifying the Controls implemented on the Migration Processes "ETL or ELT" (e.g., Controls for: Data Sharing, Data Integration / Interoperability, and Data Access Authorizations, etc.), including the following, as a minimum:

• Evidence that all stakeholders involved in Data Sharing applied the appropriate Security Controls to protect and share Data in a safe and reliable environment, in alignment with the relevant laws and regulations, as issued by the National Cybersecurity Authority (NCA).

• Evidence that all stakeholders involved in Data Sharing have the following, as a minimum:

  • The authorization to view, obtain / acquire and use this Data (which may require security scanning depending on the nature and sensitivity of the Data, based on the Data Classification (DC) Domain Standards).

  • Knowledgeable, skilled, and qualified people who are properly trained in handling the Shared Data.

• Evidence that all stakeholders involved in Data Sharing applied the controls necessary to appropriately manage and protect the Shared Data, including what is stated in the National Data Governance (DG) Policies issued by SDAIA, e.g.:

  • Legal / Regulatory basis.

  • Delegation / Authorization.

  • Data type.

  • Data pre-processing.

  • Data Sharing methods.

  • Data usage & maintenance.

  • Data Sharing duration.

  • The number of times Data was Shared.

  • Sharing cancellation.

  • Liability provisions.


Level 4: Managed - The Monitoring Report with Pre-defined KPIs for the DSI Controls, Processes and Standards. - The Entity must attach an updated and approved report on monitoring the DSI Controls, Practices and Standards based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card should include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer - The Continuous Improvement Report on Reviewing the DSI Processes and Mechanisms to Automate the DSI Controls and Practices. - The Entity must attach an updated and approved report on reviewing the processes and mechanisms implemented to automate the DSI Domain’s Controls, Standards, and Practices, including the following, as a minimum:

• The documents of the periodic reviews & documented results.

• The continuous improvement mechanisms.

8.2.8. Reference and Master Data Management Domain

Checklist – Reference and Master Data Management Domain

RMD.MQ.1 Has the Entity developed and implemented a plan focused on improving its Reference & Master Data (RMD) Management capabilities?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing - The Currently Existing Practices Related to the RMD Management Domain. - A report must be attached clarifying the current practices related to Reference Data management and/or Master Data management.

Level 2: Defined - The Developed and Approved RMD Management Plan. - An RMD Management Plan must be attached including the following, as a minimum:

• A roadmap that includes the activities and milestones for the Entity’s RMD Management. The activities shall incorporate what is needed to achieve this Domain’s specifications, as a minimum.

• The assignment of the required resources & budget allocation to manage the implementation of the roadmap activities.

Level 3: Activated - The RMD Management Plan Implementation Status Report. - A report must be attached clarifying the implementation status including the following, as a minimum:

• The achievement percentages of the initiatives and projects included in the executive action plan of RMD management.

- The RMD Management Training Implementation Status Report. - A report must be attached clarifying the implementation status including the following, as a minimum:

• Samples of the activities performed by the Entity to raise awareness in the RMD Management Domain (awareness messages, publications, lectures or workshops).

• A sample attendance certificate from training the Entity’s employees in the RMD Management Domain.

- The RMD Management Operating Model Showing the RMD Stewardship Coverage. - A document must be attached clarifying the RMD Management Operating Model including the following, as a minimum:

• Assigning Business Data Stewards.

• Assigning IT Data Stewards.


- The RMD Change Request Logs. - A consolidated log must be attached containing the following, as a minimum:

• The RMD change requests.

• The decisions made on the RMD change requests.

- The RMD Management Documents & Artifacts. - The planning documents & supporting artifacts of the RMD Management initiatives must be attached including the following, as a minimum:

• The register of the Architecture status of all RMD initiatives (e.g., The Statement of Architecture Work (Scope) document).

Level 4: Managed - The Monitoring Report of the RMD Management Plan Implementation with Pre-defined KPIs. - A monitoring report must be attached, prepared based on the KPIs (Indicator Cards) pre-defined in the RMD Management Plan. The data of each indicator or each indicator card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer - The Continuous Improvement Report of the RMD Management Plan. - A report must be attached clarifying that the Entity identified, implemented and is monitoring continuous improvement mechanisms for the RMD Management Plan including the following, as a minimum:

• The documents of the periodic reviews & documented results of RMD Management.

Checklist – Reference and Master Data Management Domain

RMD.MQ.2 Has the Entity defined and implemented processes to manage its Reference & Master Data (RMD) objects from creation to archival?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing - Evidence of Projects / Initiatives with Reference Data and/or Master Data Identified. - Evidence must be attached proving the implementation of any existing project or current initiative that involves identified Reference Data and/or Master Data Objects.

Level 2: Defined - The Identified, Prioritized and Categorized RMD Objects. - Documents must be attached clarifying that the Entity identified, prioritized & categorized the RMD including the following, as a minimum:

• Identifying & Prioritizing the RMD:

  • The identified Master Data objects (internal & external).

  • The identified Reference Data objects (internal & external).

  • The identified Data sources and applications that generate, read, update, and delete RMD Objects.

  • The logically grouped & prioritized RMD Objects.

- Reference Data Categorization. - Documents must be attached showing that the Entity has categorized the Reference Data, including the following, as a minimum:

• Internal Reference Data.

• External Reference Data.

- Master Data Categorization. - Documents must be attached showing that the Entity has categorized the Master Data, including the following, as a minimum:

• Internal Master Data.

• External Master Data.


- The Reference & Master Data Requirements. - Evidence documents must be attached clarifying that the Entity collected the RMD requirements including the following, as a minimum:

• RMD Management roles across the Data lifecycle from creation to archiving.

• Rules for accurate matching and merging of the Master Data Records from different Data sources to create Golden Records.

• Requirements for provisioning Master Data Golden Records to consuming applications.

• Requirements for provisioning Reference Data Objects to consuming applications.

• Data Quality (DQ) requirements for RMD Objects, to be leveraged as input for the Initial Data Quality Assessment detailed in the Data Quality (DQ) Domain.

- The Defined SLAs for RMD Lifecycle Management. - A document should be attached clarifying the RMD lifecycle management process SLAs including, as a minimum:

• The RMD request implementation time frame.

• The escalation procedures to be followed upon SLA violation.
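An SLA time frame and its escalation trigger can be checked mechanically. A minimal sketch, assuming a hypothetical 10-day implementation time frame (the threshold and function names are illustrative):

```python
from datetime import datetime, timedelta

SLA_TIME_FRAME = timedelta(days=10)  # assumed RMD request implementation time frame

def needs_escalation(submitted: datetime, now: datetime, completed: bool) -> bool:
    """Trigger the escalation procedure when an open request exceeds the SLA."""
    return (not completed) and (now - submitted) > SLA_TIME_FRAME

# An open request submitted 14 days ago breaches the 10-day SLA:
print(needs_escalation(datetime(2023, 10, 1), datetime(2023, 10, 15), completed=False))  # True
```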

Level 3: Activated - The RMD Lifecycle Management Process. - A document must be attached clarifying a process for managing RMD Objects across the Data Lifecycle from creation to archival / disposal. The process must cover the roles and procedural actions involved in the following Data Lifecycle steps, as a minimum:

• Creating new RMD Objects & Instances.

• Modifying existing RMD Objects & Instances.

• Archiving RMD Objects & Instances.

- Evidence of the Implementation & Adoption of the National Reference Datasets. - The reference datasets shared with SDAIA must be attached, together with evidence proving the processes for transferring the Reference Data.

Level 4: Managed - The Monitoring Report of the RMD Management Processes with Pre-defined KPIs & SLAs. - A monitoring report must be attached, prepared based on the KPIs (Indicator Cards) pre-defined for the RMD Management Processes and based on the SLAs, including measuring the following indicators, as a minimum:

• The number of incorrect Data values in the Master Data Records.

• The Mean Time to Repair (MTTR) RMD quality issues.

• The volumes of change requests submitted to modify RMD Objects.
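The Mean Time to Repair (MTTR) indicator above can be computed directly from issue timestamps. A minimal sketch; the issue-log format is an assumption for illustration:

```python
from datetime import datetime

def mean_time_to_repair(issues) -> float:
    """MTTR in hours: average of (resolved - opened) over closed RMD quality issues."""
    durations = [
        (resolved - opened).total_seconds() / 3600
        for opened, resolved in issues
        if resolved is not None  # still-open issues are excluded
    ]
    return sum(durations) / len(durations)

issue_log = [
    (datetime(2023, 10, 1, 8, 0), datetime(2023, 10, 1, 12, 0)),  # repaired in 4 h
    (datetime(2023, 10, 2, 9, 0), datetime(2023, 10, 2, 15, 0)),  # repaired in 6 h
    (datetime(2023, 10, 3, 9, 0), None),                          # still open
]
print(mean_time_to_repair(issue_log))  # 5.0
```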


- Each indicator’s data or card should include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer - The Continuous Improvement Report of the RMD Management Processes & Standards. - A report must be attached clarifying that the Entity identified, implemented and is monitoring continuous improvement mechanisms for the RMD Management Processes including the following, as a minimum:

• The documents of the periodic reviews & documented results of the RMD Management Processes.

Checklist – Reference and Master Data Management Domain

RMD.MQ.3 Has the Entity implemented a Data Hub (or Tool) as the trusted Data source to support the RMD Management Processes?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.


Level 1: Establishing - Evidence of Reference & Master Data (RMD) Used for Project Purposes. - Evidence must be attached of existing Reference Data and/or Master Data (e.g., maintained in a tool such as Excel) used for the purposes of a specific, current project.

Level 2: Defined - The Target RMD Management Architecture Design. - A document must be attached illustrating that the Entity designed an RMD Hub to efficiently manage the RMD Objects including, as a minimum:

• Identifying a Data Hub architecture implementation pattern to manage the Master Data Objects.

• The design considers supporting centralized management of Reference Data – the Hub is the single source of Reference Data, with creation and modification operations performed exclusively within the Hub.

- The Developed RMD Conceptual Architecture. - The target RMD environment Conceptual Architecture must be attached as per the Entity's selected Data Hub architecture design, indicating the foundational building-block components and the high-level capabilities associated with them. The conceptual RMD architecture must consist of the following, as a minimum:

• Architecture Description – A description of the overall architecture concept defined.

• Components Definitions and Descriptions – Building blocks (the Data Hub, Data sources, consuming applications, etc.) of the RMD Conceptual Architecture with descriptions of their purposes.

• The General Conceptual Architecture Diagram – A high-level view of how components work together to address the Entity's RMD requirements.

- The Developed RMD Information Architecture. - A document must be attached clarifying an Information Architecture for the target RMD environment based on the defined Conceptual Architecture. The Information Architecture must represent the following components, as a minimum:

• RMD Objects – An inventory of the identified RMD Objects including Metadata definitions.

• Conceptual & Logical Master Data Model – A conceptual & logical Data model for the identified Master Data Objects and their relationships.

• RMD Sources – An inventory of the identified RMD sources.

• Rules for Matching & Merging Master Data Records from different Data sources to create Golden Records.

• RMD Flows.
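Matching & merging rules of the kind listed above are what produce Golden Records. The sketch below shows only the merge (survivorship) step, with a hypothetical rule: for each field, take the first non-empty value from the highest-priority source. The source names, fields, and priority order are invented for illustration:

```python
def merge_golden_record(records, source_priority):
    """Merge already-matched Master Data records field-by-field into one Golden Record."""
    # Rank the matched records by source priority (index 0 = most trusted source).
    ranked = sorted(records, key=lambda r: source_priority.index(r["source"]))
    golden = {}
    for rec in ranked:
        for fld, value in rec.items():
            if fld != "source" and value and fld not in golden:
                golden[fld] = value  # survivorship: first non-empty value wins
    return golden

crm = {"source": "CRM", "name": "Acme Ltd", "phone": "", "city": "Riyadh"}
erp = {"source": "ERP", "name": "ACME LIMITED", "phone": "+966-11-0000000", "city": ""}
print(merge_golden_record([crm, erp], source_priority=["CRM", "ERP"]))
# {'name': 'Acme Ltd', 'city': 'Riyadh', 'phone': '+966-11-0000000'}
```

The matching step (deciding that the CRM and ERP rows describe the same real-world party) is assumed to have happened already; in practice it needs its own documented rules, as the checklist requires.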


- The RMD Hub / Tool Technical Requirements. - A document must be attached defining, identifying & documenting the technical requirements for the RMD Hub platform based on the defined target RMD Information Architecture. The requirements must cover the following areas, as a minimum:

• Management of Workflows:

  • Creating & modifying RMD Records.

  • Assigning Master Data Management (MDM) Stewardship.

  • Versioning Control – Tracking changes in RMD Records over time.

• Functional Capabilities – Detailing the functional capabilities required from the Hub (e.g.: import, export, Data mappings, automation of operational tasks around collection, cleansing, etc.).

• Technical Capabilities – Detailing the technical capabilities required from the Hub (e.g., API integration with upstream & downstream applications and systems).

• Security – Supporting secure Data exchange between the Hub and the connected applications / Data sources.

Level 3: Activated - The Implemented RMD Management Hub. - A document must be attached providing evidence that the Entity implemented an integrated Hub for managing its RMD Objects, to be considered the trusted source of RMD across the entire Entity. The document must provide evidence of the following performed tasks, as a minimum:

• The instantiated physical Data Hub Technical Architecture components necessary to address the Entity's target RMD Information Architecture requirements.

• The established Master Data Model, as defined by the RMD Information Architecture, within the Data Hub.

• The loaded RMD Objects in the Data Hub.

• The necessary replication activated between the Master Data source systems and the Data Hub.

• Synchronization activated between the Data Hub and the consuming applications.

- The Workflow Documentation Showing the Establishment of the Data Hub as the Entity's Trusted Source. - Evidence must be attached proving that the Data Hub is considered the reliable source for each information system or new application that requires the use of specific RMD Objects (for each program).


Level 4: Managed - The Monitoring Report of the RMD Management Hub / Tool Capabilities with Pre-defined Key Performance Indicators (KPIs). - A monitoring report must be attached, prepared based on the KPIs (Indicator Cards) pre-defined to monitor the RMD Management Hub / Tool capabilities, and each indicator’s data or card should include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer - The Continuous Improvement Report of the RMD Information Architecture and the Implemented Data Hub. - A report must be attached clarifying that the Entity identified, implemented and is monitoring continuous improvement mechanisms for the RMD Hub & Architecture including the following, as a minimum:

• The documents of the periodic reviews & documented results of the RMD Hub including the Information Architecture.

8.2.9. Business Intelligence and Analytics Domain

Checklist – Business Intelligence and Analytics Domain

BIA.MQ.1 Has the Entity developed and implemented a plan to manage and orchestrate its Business Intelligence & Analytics (BIA) activities?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing - Current Activities Related to BIA. - A document must be attached, covering:

• Current activities related to BIA (reports and/or dashboards, etc.).

Level 2: Defined
- Acceptance Evidence: The defined & approved BIA Plan.
- Acceptance Criteria: An approved BIA plan must be attached, and the plan must include the following, as a minimum:

• A roadmap of activities and milestones for Data Analytics & Artificial Intelligence (DA & AI) use cases. The activities should include what is necessary to achieve this Domain’s specifications, as a minimum.

• The allocation of the required resources & budget to manage the implementation of the DA & AI use cases.

Level 3: Activated
- Acceptance Evidence: The BIA plan & roadmap implementation status report.
- Acceptance Criteria: A status report of the BIA Plan & roadmap implementation must be attached, which includes, as a minimum:

• The achievement percentages of the initiatives & projects included in the plan & roadmap.

- Acceptance Evidence: The Defined & Documented Roles & Responsibilities for BIA Activities, Including Data Stewardship Roles.
- Acceptance Criteria: A document must be attached showing the roles & responsibilities of the Data team members working on the BIA Activities’ implementation, including, as a minimum:

• The defined & documented roles & authorizations of all data specialists, including the Data Stewards & Data Owners.

Level 4: Managed
- Acceptance Evidence: The Effectiveness Monitoring Report of the BIA plan & activities with pre-defined Key Performance Indicators (KPIs).
- Acceptance Criteria: A monitoring report must be attached showing the effectiveness of the BIA plan & activities; it must be prepared based on pre-defined KPIs, and each indicator’s data or card should include the following, as a minimum:
• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic/operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).


• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer
- Acceptance Evidence: Regular reviews and improvements to the business intelligence and analytics plan and roadmap.
- Acceptance Criteria: A report of the BIA plan & roadmap must be attached, including the following, as a minimum:

• Documented periodic reviews and results, including the updated plan and updated roadmap.

• BIA continuous improvement mechanisms.

Checklist – Business Intelligence and Analytics Domain

BIA.MQ.2 Has the Entity identified BIA use cases and defined a plan for the use case implementation?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities
- Acceptance Evidence: Not Applicable.
- Acceptance Criteria: Not Applicable.

Level 1: Establishing
- Acceptance Evidence: List of the implemented BIA use cases within the Entity.
- Acceptance Criteria: The document must include a list of the use cases implemented within the Entity, and for each use case included in the specified list, the following must be documented, as a minimum:

• The name of the use case.

• The description of the use case.

• The desired & the achieved objectives of the use case.

Level 2: Defined
- Acceptance Evidence: The approved BIA business cases / BIA use cases.
- Acceptance Criteria: A document containing the following must be attached:

• The list of approved BIA Business Cases.


• The list of approved BIA use cases.

- For each business case / use case in the identified lists, the following must be
documented, as a minimum:

• The name of the case.

• The description of the case.

• The stakeholders of the case.

- Acceptance Evidence: The approved BIA Use Cases Prioritization Framework.
- Acceptance Criteria: A document must be attached that includes the approved framework for shortlisting / prioritizing BIA use cases, and the document must include the following, as a minimum:

• Prioritization Identification Criteria: A list of criteria used to prioritize use cases, such as strategic alignment, potential impact, complexity, and resource requirements.

• Weighting Criteria: An explanation of how each criterion is weighted and how weighting is specified. This can be based on opinions of experts, inputs from stakeholders, a data-driven approach, etc.

• Scoring Mechanism: A description of how scores are calculated to weight & specify use case priorities.
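The three elements above (criteria, weights, scoring) can be read as a weighted-sum model. The criteria names, weights, and 1-5 rating scale in the sketch below are hypothetical illustrations of such a mechanism, not values mandated by the framework.

```python
# Hypothetical criteria weights (they sum to 1.0 in this sketch).
WEIGHTS = {
    "strategic_alignment": 0.4,
    "potential_impact": 0.3,
    "complexity": 0.2,        # rated so that lower complexity scores higher
    "resource_requirements": 0.1,
}

def priority_score(ratings: dict) -> float:
    """Weighted sum of 1-5 ratings; a higher score means higher priority."""
    return round(sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS), 2)

# Two invented use cases rated against the criteria.
use_cases = {
    "Churn dashboard": {"strategic_alignment": 5, "potential_impact": 4,
                        "complexity": 3, "resource_requirements": 4},
    "Ad-hoc report cleanup": {"strategic_alignment": 2, "potential_impact": 2,
                              "complexity": 5, "resource_requirements": 5},
}

# Rank use cases from highest to lowest priority score.
ranked = sorted(use_cases, key=lambda n: priority_score(use_cases[n]), reverse=True)
print(ranked)  # ['Churn dashboard', 'Ad-hoc report cleanup']
```

Weights could equally come from stakeholder workshops or a data-driven calibration, as the Weighting Criteria bullet allows; the scoring mechanism itself stays the same.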

- Acceptance Evidence: The shortlisted / prioritized use cases based on business needs.
- Acceptance Criteria: A document must be attached, listing the high-priority use cases based on the pre-approved prioritization framework. For each use case in the prioritization list, the following must be documented, as a minimum:

• The name of the use case.

• The description of the case.

• The priority of the case.

- Note: if a single document contains the approved use cases, the approved framework, and/or the prioritized use cases, that evidence can be attached to this requirement and the previous two requirements.

- Acceptance Evidence: The BIA Use Case Portfolio document with details of each use case.
- Acceptance Criteria: A portfolio document of BIA use cases must be attached, and the following details must be documented for each use case, as a minimum:

• The desired & achieved objectives of each use case.

• Type of Analytics leveraged (among the five Analysis-Maturity levels, which are Discovery, Descriptive, Diagnostic, Predictive, Prescriptive).

• The expected benefits & business value aimed to be derived (Return on Investment (ROI)) through the development of each business case.


• Stakeholders & Entities involved in the implementation of the use case, the responsible 'owner' who’d lead the use case, and the target consumers who’d benefit from the insights generated by the use case.

• A list of business requirements needed to implement the use case.

• Data sources that would feed the use case, with the required data fields.

• The technologies required to implement the use cases.

• Case priority.

- Acceptance Evidence: The Approved Use Case implementation plan.
- Acceptance Criteria: An approved BIA use case implementation plan must be attached, including:

• The implementation plan for each shortlisted / prioritized & approved Use Case, with the sequence of implementation steps from the use case trial / piloting phase, through the production phase, to continuous result monitoring. The implementation plan shall address the following, as a minimum:

• Detailed Functional & Non-Functional Requirements – Use case objectives translated into analytics requirements.

• High-Level Design – Conceptual design of the analytics solution (e.g., Wireframes).

• Staging & Production Environment Preparations – Analytics solution hosting environments during and after development.

• Development – Functional & non-functional requirements to be developed to meet the high-level design.

• Testing – Scopes & types of testing that must be conducted.

• Deployment & Schedule – Timeline for establishing a pilot and / or delivery of the complete use case.

• Required Resources – The Entity’s key personnel who have the needed skills, expertise & knowledge to successfully implement the Data Analytics Use Case.

• Acceptance Criteria – Key criteria for measuring the successful implementation of the Data Analytics use case.

- Acceptance Evidence: The defined use case implementation approach (e.g., DevOps, Agile, etc.).
- Acceptance Criteria: A Use Case Implementation methodology document must be attached, including the following, as a minimum:
• The approach/method’s general description.

• Key practices & tools in the approach.

• The roles & responsibilities of the participants in the approach.

- Acceptance Evidence: The approved use case validation process.
- Acceptance Criteria: A validation process document must be attached showing how to validate use case outcomes, stating the initial intended purpose, and explaining the alignment with the


Entity's overall Data Analytics Plan. The validation of use cases shall address the
following, as a minimum:

• Analytics Use Case functional and non-functional requirements.

• Analytics Use Case Personal Data Protection (PDP) considerations, as prescribed in the Privacy / PDP Domain.

• Data Analytics Use Case Return on Investment (ROI) for each target set.

Level 3: Activated
- Acceptance Evidence: The Implemented and new Use Cases.
- Acceptance Criteria: A document must be provided that includes a list of implemented use cases, as well as another list of new use cases that have been evaluated and implemented according to business requirements. For each use case in the specified list, the following must be documented, as a minimum:

• The name of the use case.

• The description of the case.

• The priority of the case.

- Acceptance Evidence: The up-to-date BIA use cases register.
- Acceptance Criteria: An up-to-date register of use cases must be attached showing the following, as a minimum:

• The Register’s Releases (with the dates).

• BIA use cases.

• The Results of the final reviews of each use case implementation.

• The stakeholders in the Entity who can view the register.

- Acceptance Evidence: The Outcomes of the BIA Use Case Validation Processes.
- Acceptance Criteria: A document must be attached showing the outcomes of the BIA use case validation activities; it must include the outcome of each use case validation process, stating the initial intended purpose, and explaining the alignment with the Entity's overall Data Analytics Plan. The use case validation outcomes document must include the following, as a minimum:

• Analytics Use Case functional and non-functional requirements.

• Analytics Use Case Personal Data Protection (PDP) considerations, as prescribed in the Privacy / PDP Domain.

• Data Analytics Use Case Return on Investment (ROI) for each target set.

Level 4: Managed
- Acceptance Evidence: The Monitoring Report on the BIA Portfolio Effectiveness with pre-defined Key Performance Indicators (KPIs).
- Acceptance Criteria: A monitoring report on the effectiveness of the BIA portfolio must be attached, including the following KPIs, as a minimum:

• The number of the identified / shortlisted use cases.

• The number of use cases in the pilot / beta phase for testing.

• The number of the implemented use cases that are widely utilized.


• The total Return on Investment (ROI) value generated from use case
implementations.

• Other potential Key Performance Indicators (KPIs) include (but are not
limited to):

• Accuracy in achieving desired outcomes from analytical models.

• Number of positive impacts resulting from implementing use cases on business aspects, such as process improvement or enhancing user experience.

• Number of positive impacts resulting from implementing use cases on technical aspects, such as increased adherence to technical standards.

• Number of positive impacts resulting from implementing use cases on strategic decisions.

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic/operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer
- Acceptance Evidence: Continuous Improvement Report on BIA Use Cases.
- Acceptance Criteria: An updated report must be attached on the Continuous Improvement of the BIA Use Cases, including the following for each use case, as a minimum:

• Potential Impact.

• Competitive Advantage.

• Total Cost of Ownership (TCO) Analysis:


• Analyzing all expenses over the life of the project or per task.

• The description of the updated TCO calculation methodology, with any assumptions or estimations applied during the calculation.

• Return on Investment (ROI) Analysis:

• Analyzing all benefits achieved, such as revenue increases, cost savings, or productivity gains, in comparison with the expenses per project or per task.

• The description of the updated ROI calculation methodology, with any assumptions or estimations applied during the calculation.
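Once expenses and benefits are tallied, the TCO and ROI analyses above reduce to simple arithmetic: TCO is the sum of all life-of-project expenses, and ROI is the net benefit divided by that cost. The figures in the sketch below are invented for illustration; an Entity's own methodology document would define what counts as a cost or a benefit.

```python
def total_cost_of_ownership(costs: list) -> float:
    """TCO: sum of all expenses over the life of the use case."""
    return sum(costs)

def return_on_investment(benefits: float, tco: float) -> float:
    """ROI as net benefit over cost, expressed as a percentage."""
    return round((benefits - tco) / tco * 100, 1)

# Hypothetical cost items for one use case: licenses, build effort, operations.
tco = total_cost_of_ownership([120_000, 80_000, 50_000])
roi = return_on_investment(benefits=400_000, tco=tco)
print(tco, roi)  # 250000 60.0
```

Any assumptions or estimations used for the benefit figures (e.g., projected productivity gains) would be recorded alongside the result, as the criteria above require.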

- Acceptance Evidence: The Updated BIA Use Case Register and the Optimized Use Cases.
- Acceptance Criteria: An updated use case record must be attached, including the following, as a minimum:

• The revised & improved use cases.

Checklist – Business Intelligence and Analytics Domain

BIA.MQ.3 Has the Entity defined and implemented practices to manage and govern the BIA processes?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities
- Acceptance Evidence: Not Applicable.
- Acceptance Criteria: Not Applicable.

Level 1: Establishing
- Acceptance Evidence: The Existing BIA Domain Processes & Governance Documentation.
- Acceptance Criteria: A document must be attached covering the following, as a minimum:

• The details of the current BIA Processes.

• The current BIA Governance documents.

- Acceptance Evidence: The List of Existing Reports & Dashboards.
- Acceptance Criteria: A document must be attached covering the following, as a minimum:

• The current Reports.

• The current Dashboards.


Level 2: Defined
- Acceptance Evidence: The Developed & Approved Processes of BIA Management.
- Acceptance Criteria: A document containing details of the developed and approved processes for managing business intelligence and analytics should be attached, which should include, at a minimum, the following:

• Data warehouse or data lake processes with logical models for modeling
business functions.

- Acceptance Evidence: The Approved Process of New Data Source Requirements.
- Acceptance Criteria: A document must be attached containing the approved process management procedures for identifying, evaluating, and approving new sources of data to be used in BIA applications. The document must include, as a minimum:

• Data Source Identification: This includes identifying new sources which may have processes that are related to the Entity's BIA needs, e.g., external data sources or internal data sources from units within the Entity.

• Data Source Evaluation: This involves evaluating new sources which are proposed based on factors such as Data Quality (DQ), alignment with the business objectives, and cost-benefit considerations.

• Prioritizing Data Sources: This includes giving priority to proposed new sources based on the importance of their data and their impact on the Entity’s BIA activities.

• Data Source Approval / Agreement: This includes obtaining the necessary stakeholder approvals, e.g., Data Owners, IT Teams & Business Leaders.


- Acceptance Evidence: The Approved Demand Management Process.
- Acceptance Criteria: The approved process document of BIA Demand Management must be attached, showing the following:

• Demand Analysis: The process of analyzing the BIA service demand, including request volume, frequency, and complexity.

• Request Prioritization: The process of prioritizing BIA service requests, including the criteria that will be used to prioritize each request.

• Resource Allocation: The process of allocating resources to support the BIA service demand.

• Communication: The process of communicating the BIA service demand status to the stakeholders, and about the performance of the Demand Management process.

• Governance & Compliance: The Governance & Compliance process to implement this approved process for BIA service Demand Management.

- Acceptance Evidence: Development and maintenance document of the Semantic Layer.
- Acceptance Criteria: The document for the development and maintenance of the semantic layer in the business intelligence and analytics solution should be attached. It acts as a bridge between data sources and end users by providing a logical representation of the data to facilitate understanding and usage. The document should include, at a minimum, the following:

• Data Source Mapping: The mapping between Data Sources and the
Semantic Layer, i.e., How the data from each source is mapped on to
the Semantic Layer, including any transformations or collations
performed.

• Business Logic: The business logic applied to the data in the Semantic Layer. This shall include Entity-specific definitions of key business concepts.

• Security: This section describes how security is implemented in the Semantic Layer. It shall define the security roles & permissions that apply to the data.

• Performance Optimization: The strategies used to enhance the Semantic Layer’s performance. The strategies shall include technologies such as indexing / cataloging, caching / temporary storage, and clustering to improve query performance.

• Metadata: The Metadata associated with the Semantic Layer, including data descriptions and definitions of business concepts.


• User Interface: The components of the User Interface used to access and interact with the Semantic Layer, e.g., dashboards, reports, and data discovery / exploration tools.

• Record and results of any maintenance performed on the semantic layer.
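The source-mapping, business-logic, and security sections above can be recorded as a declarative mapping document. The sketch below is a hypothetical fragment only: the concept names, source tables, and roles are invented, and a real semantic layer would live inside a BI platform rather than a Python dictionary.

```python
# A minimal, hypothetical semantic-layer mapping: physical source fields are
# exposed as governed business concepts with business logic and role-based access.
SEMANTIC_LAYER = {
    "Customer Lifetime Value": {
        "source": "crm.orders",                      # data source mapping
        "expression": "SUM(order_total)",            # business logic
        "allowed_roles": {"analyst", "executive"},   # security
    },
    "Active Customers": {
        "source": "crm.customers",
        "expression": "COUNT(DISTINCT customer_id) WHERE status = 'active'",
        "allowed_roles": {"analyst"},
    },
}

def resolve(concept: str, role: str) -> str:
    """Return the backing expression if the role may query the concept."""
    entry = SEMANTIC_LAYER[concept]
    if role not in entry["allowed_roles"]:
        raise PermissionError(f"{role} may not query {concept}")
    return f"{entry['expression']} FROM {entry['source']}"

print(resolve("Customer Lifetime Value", "executive"))
# SUM(order_total) FROM crm.orders
```

Keeping the mapping declarative like this makes the metadata section of the document almost free: the concept names, sources, and permissions are the metadata.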

- Acceptance Evidence: Advanced analytics management and governance process.
- Acceptance Criteria: The document for the management and governance of advanced analytics should be attached. The document must include, as a minimum:
• The Advanced Analytics Governance Framework.

• The operations & the procedures of Data Management in the Advanced Analytics context.

• The operations & the procedures of the Development Management of the Models used in Advanced Analytics, deployment, and maintenance.

• The security & compliance procedures in the context of advanced analytics.

• The operational procedures of advanced analytics.

• The continuous improvement operations & procedures of advanced analytics. This shall include procedures for measuring the Advanced Analytics effectiveness to identify improvement areas and implement the modifications.

- Acceptance Evidence: The Developed & Approved Change Management Plan including the training programs.
- Acceptance Criteria: The developed & approved Change Management Plan (for training programs) must be attached, including the following, as a minimum:

• The details of BIA training to be conducted for all employees participating in BIA initiatives to raise the level of analytical capabilities within the Entity. The training shall include the following, as a minimum:

• Methods of collecting & organizing the data required for the analysis.

• Model Development, Analysis Method Implementation, and Analytical Tool Usage.

• The Development of Data Models and Data Flow Paths.

• The Types of Graphic Representations of Data & Information.

• The Analysis Models Evaluation Methods.

- Acceptance Evidence: The Developed & Approved Change Management Plan including awareness campaigns.
- Acceptance Criteria: The developed & approved Change Management Plan (for Awareness Campaigns) must be attached, including the following, as a minimum:

• The details of the BIA Domain’s awareness campaigns to enhance familiarity, knowledge & usage of the Data Analysis (DA) & Artificial Intelligence (AI) capabilities. The Entity shall use one or more of the

communication channels (owned by the Entity) for awareness campaigns to raise awareness of the following:

• The Analytics Assets that the Entity currently owns – previously implemented use cases, analysis models, Application Programming Interfaces (APIs), BI reports, and the monitoring dashboards utilized; all of the above for probable data sharing and potential reuse.

• BIA Success Stories – Quantitative & qualitative benefits and outcomes from recent use case implementation activities.

• The new tools used in the BIA Domain and the technical tool workflows in the Entity, especially those tools that are based on emerging new technologies.

- Acceptance Evidence: The identified & developed Data Sources & Data Marts.
- Acceptance Criteria: A document should be attached that includes, at a minimum, the following:
• Data Sources: e.g., Systems, databases, or applications that create,
store, or manage data that will be used in BIA. They can include Internal
Data Sources, such as transactional databases, data storages, and data
lakes; in addition to External Data Sources, such as social media
communication methods, market research reports, and publicly available
datasets.

• Data Marts: Subsets / partitions of data warehouses designed to serve particular business units or specific job functions.

Level 3: Activated
- Acceptance Evidence: Evidence of the adoption & implementation of business intelligence and analytics management and governance.
- Acceptance Criteria: A document must be attached, including the following:

• Proof of adopting and implementing all business intelligence and analytics management and governance processes.

• Proof of adopting and implementing comprehensive governance that includes emerging topics such as ethics of artificial intelligence.

• Proof of adopting and implementing advanced analytics management and governance processes (AI/ML Ops).

- Acceptance Evidence: The approved Operating Model with defined roles & responsibilities for the Data Science team.
- Acceptance Criteria: A document must be attached showing the following:

• The approved Operating Model of the Data Science team.

• Process flow documentation.

• The roles & responsibilities of the Data Science team which shall include,
as a minimum:

• Data Scientists.

• Data Engineers.

• Visualization Engineers.


- Acceptance Evidence: Evidence of training courses & awareness campaigns conducted.
- Acceptance Criteria: A document must be attached showing the following:

• Proof of all conducted training courses.

• Proof of all conducted awareness campaigns.

- Acceptance Evidence: The approved User Acceptance Test (UAT) Documents.
- Acceptance Criteria: A document must be attached showing the following, as a minimum:

• The approved plan(s) to conduct UATs.

• The approved report(s) on UATs.

- Acceptance Evidence: The approved outcomes as Reports & Dashboards produced for the Business Units.
- Acceptance Criteria: A document must be attached showing the following:

• The reports which were generated & approved for each business unit.

• The dashboards which were generated & approved for each business unit.

• Proof of fully activating self-service analytics, covering both business intelligence and artificial intelligence/machine learning (AI/ML).

- Acceptance Evidence: The Capacity Planning document.
- Acceptance Criteria: The Capacity Planning document must be attached showing the following, as a minimum:
• The Current Status Analysis: Analyzing the current BIA activities
capacity.

• The Future Status Analysis: Analyzing the expected demand for BIA
services.

• The Requirements: Detailing the specific resources required to satisfy the expected BIA Service demands, including hardware devices, software, and employees.

• Capacity Gaps: Identifying the gaps between the current & future
capacities / capabilities, and defining strategies to fill the gaps.

• Resource Allocation: Documenting the process of allocating resources to support the Entity's BIA activities, including capacity / capability allocation & resource management.
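The current-status, future-status, and gap-identification steps above amount to comparing current capacity against forecast demand per resource type. The resource names and figures in the sketch below are invented for illustration; a real plan would also attach filling strategies (hiring, procurement, reallocation) to each gap.

```python
# Hypothetical capacity figures per resource type.
current = {"data_engineers": 4, "bi_developers": 3, "compute_nodes": 10}
forecast = {"data_engineers": 6, "bi_developers": 3, "compute_nodes": 16}

def capacity_gaps(current: dict, forecast: dict) -> dict:
    """Per-resource shortfall: positive values need filling, zero means no gap."""
    return {r: max(forecast[r] - current.get(r, 0), 0) for r in forecast}

gaps = capacity_gaps(current, forecast)
print(gaps)  # {'data_engineers': 2, 'bi_developers': 0, 'compute_nodes': 6}
```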

Level 4: Managed
- Acceptance Evidence: The Monitoring Report on the effectiveness of practices and processes for managing and governing business intelligence and analytics, through predefined key performance indicators (KPIs).
- Acceptance Criteria: A report on the effectiveness of practices and processes for managing and governing business intelligence and analytics should be attached, using predefined KPIs.

- Each indicator’s data or card must include the following, as a minimum:


• Indicator’s Name / Code.
• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic/operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).


- Acceptance Evidence: The Monitoring Report on the effectiveness of training and awareness sessions conducted, through predefined key performance indicators (KPIs).
- Acceptance Criteria: A report on the effectiveness of the training and awareness program should be attached, using predefined KPIs, including, as a minimum:

• Number of training and awareness sessions conducted.

• Other optional indicators include:

• The degree of satisfaction of the participants or target audience upon completion of the training course or awareness campaign.

• The percentage increase in participation in subsequent events or activities, and the percentage of people who successfully completed the training course compared to the total number of participants.

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic/operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

- Acceptance Evidence: Performance Measurement Report for the Business Intelligence and Analytics Team.
- Acceptance Criteria: The report should include the following:


• Team Performance Indicators: This includes tracking the team's performance through specific indicators such as execution level, quality of work produced, and adherence to timelines.

• Goal Achievement: The report analyzes the team's achievement of pre-defined goals, such as the development and implementation of specific analytics.

• Data and Analytics Quality: The quality of the data used in the analytics
and the accuracy of the resulting insights are evaluated.

• Work Efficiency: The overall work efficiency of the team is estimated by analyzing completed tasks and the effective use of tools and techniques.

• Feedback and Satisfaction: The report may also include surveys of feedback from clients or internal users to assess their satisfaction and measure the quality of the services provided.

• These are just examples, and the content of a Business Intelligence and
Analytics Team Performance Measurement Report may vary depending
on the needs and goals of the organization.

Level 5: Pioneer
- Acceptance Evidence: Establishment of Business Intelligence and Analytics Center Document.
- Acceptance Criteria: A document must be attached to confirm the establishment of the Business Intelligence and Analytics Center within the Entity. The document should include the center's name, objectives, and responsibilities.

- Acceptance Evidence: The continuous review and improvement of business intelligence and analytics management and governance practices.
- Acceptance Criteria: A report must be attached showing the Reviews, Continuous Improvement & Governance Plan for BIA practices, including the following, as a minimum:

• The Review Process: It includes the stakeholders’ roles & responsibilities, e.g., the data science / BIA team, business & IT stakeholders. Review methods & tools may be referenced, e.g., Data Collection & Data Analysis methods & tools.

• Schedule: The timetable for conducting the review. The schedule may
be based on a specific timeframe, e.g., Quarterly, semiannually, or
annually, or it may be based on specific incidents/events, such as the
completion of a major project.

• Metrics: The measures used to assess the effectiveness of BIA Practices & BIA Governance, which may include:

• Performance Metrics, such as the accuracy of the reports & dashboards.

• Compliance Metrics, such as conformity with Data Security & Privacy Policies.

• Reporting & Communication: Review reporting & communication procedures. This may include routine reports sent to the Data Science/BIA team and other business & IT stakeholders.

• The improvements made to business intelligence and analytics management and governance practices.


- Acceptance Evidence: Continuous Review and Improvement of Business Intelligence and Analytics Team Performance.
- Acceptance Criteria: The document should include, at a minimum, the following:

• Team Performance Review: Regularly collect and analyze data related to team performance, utilizing any previous measurements of team performance.

• Identify Strengths: Identify areas of strength in performance to enhance and/or leverage them in other tasks or projects.

• Identify Areas for Improvement: Identify weaknesses in performance and determine areas that need improvement.

• Enhance Training and Development: Training and development opportunities provided to team members to enhance their skills and capabilities in the field of business intelligence and analytics, based on regular performance reviews. These opportunities may include workshops, training courses, and individual mentoring.

- The Updated Capacity Planning Document.

- An updated document must be attached showing Capacity Planning including:

• The Current Status Analysis: Analyzing the current BIA activities capacity.

• The Future Status Analysis: Analyzing the expected demand for BIA services.

• The Requirements: Detailing the specific resources required to satisfy the expected BIA Service demands, including hardware devices, software, and employees.

• Capacity Gaps: Identifying the gaps between the current & future capacities / capabilities, and defining strategies to fill the gaps.

• Resource Allocation: Documenting the process of allocating resources to support the Entity's BIA activities, including capacity / capability allocation & resource management.
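The capacity-gap step above can be sketched numerically. The resource names and figures below are hypothetical examples, not values from the Index:

```python
# Hypothetical sketch of a capacity-gap calculation for BIA resources.
# Resource names and figures are illustrative only.

def capacity_gaps(current: dict, forecast: dict) -> dict:
    """Return the shortfall per resource (0 where current capacity suffices)."""
    return {
        resource: max(demand - current.get(resource, 0), 0)
        for resource, demand in forecast.items()
    }

current = {"analysts": 4, "report_servers": 2, "bi_licenses": 50}
forecast = {"analysts": 6, "report_servers": 2, "bi_licenses": 80}

print(capacity_gaps(current, forecast))
# {'analysts': 2, 'report_servers': 0, 'bi_licenses': 30}
```

Resources with a non-zero shortfall are the capacity gaps for which filling strategies would be defined.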

- The Revised Demand Management Process.

- The revised process document of BIA Service Demand Management must be attached showing the following:

• The review results summary of the Demand Management Process.

• Demand Analysis: Updates on the process of analyzing the BIA service demand, including request volume, frequency, and complexity.

• Request Prioritization: Updates on the process of prioritizing BIA service requests, including updates on the criteria that will be used to prioritize each request.

• Resource Allocation: Updates on the process of allocating resources to support the BIA service demand.


• Communication: Updates on the process of communicating the BIA service demand status to the stakeholders, and about the performance of the Demand Management process.

• Governance & Compliance: Updates on the Governance & Compliance process to implement this approved process for BIA service Demand Management.

Checklist – Business Intelligence and Analytics Domain

BIA.MQ.4 Has the Entity implemented the right tools, technologies, and skills to empower users and support the implementation of the BIA use cases?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing

- The list of BIA tools.

- A document must be attached covering the list of tools used for the Entity’s BI activities. For each tool being used, the following must be indicated:

• The name of the tool.

• The target users / audience of the tool.

- The list of reports.

- A document must be attached covering:

• The list of names of the BIA reports which were generated.

• Complete copies of the BIA reports.

Level 2: Defined

- The list of business units which use the BIA technology tools.

- A document must be attached covering:

• A list of names of the business units that use BI technology tools.

• Details of business units that use BI technology tools, which include, as a minimum:

• The name of the business unit.

• The name of the tool utilized.

• The version of the tool.

• The purpose of the usage.


• The Stakeholders Cooperation to provide the Semantic Layer’s details.

Level 3: Activated

- The list of users with roles and privileges.

- A document must be attached covering:

• The list of users with their roles & privileges.

- The Approved Architecture & Documentation for Advanced Analytics.

- A document must be attached describing the Approved Architecture & the Approved Documentation for the Entity’s Advanced Analytics, which may include:

• The Advanced Analytics Architecture, including technologies required to support Advanced Analytics.

• Data Storage: Data Storage solutions required to store and manage large data quantities to be used in Advanced Analytics.

• Data Processing: Processing Algorithms & Technologies used to analyze & model data within an Advanced Analytics system.

• Analytics Applications: The applications & tools used to create & deploy analytical models & insights.

• Governance: The Processes & Policies required to control the use of Advanced Analytics within the Entity, including Data Security, Data Privacy, and Compliance Requirements.

• The Implementation Plan: The steps required to implement the Advanced Analytics system within the Entity, including timetables & requirements such as budgets & resources.

- Advanced Analytics and Communication Project Management Documents.

- The following documents should be included:

• Documents for advanced analytics project management, such as project planning and scheduling, project organization and governance.

• Communication documents, including procedure documents and templates for updating project status reports, progress, and results, and sending them to relevant parties (stakeholders), including executive sponsors, business users, and IT staff.

• Document management and sharing mechanisms.


- The Approved Advanced Analytics Models Documents.

- A document must be attached listing all models developed for Advanced Analytics, model verifications and model approvals for usage. For each Advanced Analytics Model, the following must be documented, as a minimum:

• The purpose of the Model.

• The data sources used.

• The methodology used.

• Verification, Validation, and Testing processes.

Level 4: Managed

- Report on the Effectiveness Monitoring of Business Intelligence and Analytics Tools and Techniques through Predefined Key Performance Indicators.

- The report on the effectiveness monitoring of business intelligence and analytics tools and techniques should include the following:

• Key Performance Indicators: Predefined key performance indicators and actual measurements for the adoption and utilization of business intelligence and analytics tools. These indicators may include the number of participating users, data processing volume, response time, and performance.

• Tool and Technique Usage: Evaluation of the extent of usage of the adopted tools and techniques for data analysis and value extraction. This may include details on usage ratios and distribution among different departments and teams within the organization.

• Results and Analysis: The results, analysis, and insights achieved through the use of business intelligence and analytics tools.

• Recommendations and Improvements: Recommendations for improving the usage of tools and techniques and enhancing their adoption within the organization. Areas for improvement are identified, and appropriate guidance is provided to enhance effectiveness and optimize the benefits of business intelligence and analytics tools.

• Success Evaluation: Evaluation of the success of adopting and utilizing business intelligence and analytics tools. Results are measured against expected objectives and outputs, and an analysis is provided on the achieved benefits and ongoing challenges.


Level 5: Pioneer

- Document for Continuous Adoption of Technologies, Tools, Frameworks, and Features.

- The document for continuous adoption should include the following:

• A list of new tools and technologies that have been evaluated or tested and identified.

• Advanced technologies (such as Spark, TensorFlow, and PyTorch) that have been integrated into the artificial intelligence / machine learning environment.

• A list of advanced capabilities, such as tools and engineering features, in artificial intelligence / machine learning that are utilized for complex processing and reusability.

• A list of tools and techniques used in artificial intelligence / machine learning operations for comprehensive management of the artificial intelligence / machine learning lifecycle.

- The Continuous Improvement Report of the BIA and Advanced Analytics Technology Solutions including the Proof of Concept (POC) Roadmap.

- A document must be attached for the new technological solution’s roadmap of advanced analytics (POC), which is a detailed plan for testing and evaluating new technologies or tools in the advanced analytics field. This is to prove the existence & implementation of a continuous improvement mechanism. The Report must include the following, as a minimum:

• The Scope: Specifying the scope of the new technological solution’s POC, including the technologies being tested, data sources, and the target audience.

• The Success Criteria: Identifying the success criteria and metrics that will be used to assess success.

• The Timeline: Setting realistic timetables, including start & end dates and milestones.

• Resources: Determining the resources needed to implement the new technological solution’s POC, including technologies, data, employees, and budget.

• Risks & Challenges: Identifying potential risks & challenges associated with the new technological solution’s POC and developing a mitigation plan.

• The Communication Plan: Developing a communication plan to keep stakeholders informed about the progress and the results of the new technological solution.

• Next Steps: Determining the next steps following a POC, such as upscaling the new technological solution, performing extra tests, or additional implementations.


- A Report on the Proof of Concept (POC) Results.

- A report must be attached containing the POC results of each new technological solution, including, as a minimum:

• An Executive Summary: A brief overview of the main objectives & key results.

• The Methodology: A description of the methodology used in conducting the POC & performing the periodic tests of the technological solution.

• The Results: A detailed analysis of the POC results & the periodic testing results, including the solution’s strengths & weaknesses, prediction accuracy, and any limitations or issues encountered.

• Recommendations: Any insights or recommendations derived from the results.

• The Conclusion: A conclusion that summarizes the POC results & highlights the business implications.

• The Future Directions: A discussion of the project’s potential next steps, including any recommendations for further development, testing, or deployment.

• Appendices: Supporting materials such as technical documentation, programming code samples, data visualizations, and interfaces.

8.2.10. Data Value Realization Domain

Checklist – Data Value Realization Domain

DVR.MQ.1 Has the Entity developed a plan to identify, document and realize its Data revenue generation potential and implement Data-related cost optimization initiatives and use cases?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing

- The Existing List of Data Value Realization (DVR) Activities.

- A document must be attached covering a Proof-of-Concept (POC), or other temporary tasks performed as practices that support Data Value & Benefit Realization within the Entity. The document must include the following:

• The currently existing DVR Domain practices.

• A separate description for each activity including, as a minimum:

• The name of the activity.

• The purpose of the activity.

• The achieved or expected output of the activity.

Level 2: Defined

- The Data Value Realization (DVR) Plan.

- The DVR Plan must be attached to activate Data revenue generation capabilities (profitable revenue from data) and activate Data-based cost-reduction initiatives. The plan shall include the following, as a minimum:

• A roadmap that includes the activities and milestones for the DVR use cases. The activities shall incorporate what is needed to achieve this Domain’s specifications, as a minimum.

• The assignment of the required resources & budget allocation to manage the implementation of the DVR use cases.

- The List of Identified Use Cases for Both Revenue Generation & Cost Optimization.

- A document must be attached covering the DVR use cases. Each use case should be documented with the following details, as a minimum:

• Name of use case.

• Type of use case, e.g.:

• Data Revenue Generation Use Cases – Data or Data Products which generate revenue for the Entity.

• Cost Saving Use Cases – Data-related cases which will directly or indirectly contribute to reducing expenses and achieving greater efficiency.
achieving greater efficiency.


• The stakeholders required to implement the use cases, the official who will lead the use case, and the target beneficiary who will benefit from the implementation of the use case.

• A list of business requirements needed to implement the use cases.

• Data sources and required data fields.

• The technology required to implement the use cases.

- A Document Explaining the Payback Period and Return on Investment (ROI) for Each Identified Use Case.

- A document must be attached containing the following:

• An estimation of the payback period and the ROI for each DVR use case.
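The payback period and ROI estimates can be illustrated with the standard formulas (payback = investment / annual net inflow; ROI = (gain − cost) / cost). The figures below are hypothetical, not taken from the Index:

```python
# Illustrative calculation of payback period and ROI for a DVR use case.
# All figures are hypothetical and for demonstration only.

def payback_period_years(investment: float, annual_net_inflow: float) -> float:
    """Simple payback period: years until cumulative inflows cover the investment."""
    return investment / annual_net_inflow

def roi(total_gain: float, total_cost: float) -> float:
    """Return on investment expressed as a fraction of the cost."""
    return (total_gain - total_cost) / total_cost

investment = 200_000        # one-off implementation cost of the use case
annual_net_inflow = 80_000  # yearly revenue or cost savings attributed to it

print(payback_period_years(investment, annual_net_inflow))  # 2.5 (years)
print(roi(total_gain=80_000 * 3, total_cost=200_000))       # 0.2 (20% over 3 years)
```

In practice the inflow figure would come from the use case's projected income values in the DVR plan.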

Level 3: Activated

- A Report on DVR Monitoring & Maintenance.

- A document must be attached showing the implementation status of the activities being done for data value realization including the following, as a minimum:

• The DVR plan status report.

• The organizational structure of the DVR activities.

• The use case monitoring and maintenance status report.

- The monitoring and maintenance of the DVR use cases must include the following, as a minimum:

• Measuring and verifying / validating the KPIs (ROI & Payback Period) against the projected income values in the DVR & Income Plan.

• Developing Change Request (CR) documents to accommodate change requirements from the end users.

• Reporting defects & malfunctions in the implemented use cases.


Level 4: Managed

- The Monitoring Report of the DVR Use Cases with Pre-defined KPIs.

- The Monitoring Report of the implemented DVR Use Cases must be attached. The report must include the following indicators, as a minimum:

• The number of Data Products developed.

• The number (i.e., quantity) of Data or Data Products revenue generation requests raised to NDMO.

• The DVR Use Case Payback Period.

• The DVR Use Case Return on Investment (ROI).

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).
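An indicator card with the fields listed above could be captured as a structured record. The sketch below is a hypothetical illustration; the field names and sample values are ours, not prescribed by the Index:

```python
# Hypothetical sketch of an indicator card as a structured record.
# Field names mirror the card fields listed above; sample values are illustrative.
from dataclasses import dataclass

@dataclass
class IndicatorCard:
    code: str
    name: str
    owner: str
    coordinator: str
    description: str
    objective: str            # strategic / operational objective being measured
    equation: str
    unit: str                 # e.g. "Percentage", "Number / Quantity"
    baseline: float           # value in the first measurement year
    target: float
    periodicity: str          # "Monthly" / "Quarterly" / "Biannually" / "Annually"
    data_sources: list
    collection_mechanism: str
    polarity: str             # "+" higher is the target, "-" lower is the target

card = IndicatorCard(
    code="DVR-KPI-01",
    name="DVR Use Case ROI",
    owner="Chief Data Officer",
    coordinator="DVR Team Lead",
    description="Return on investment of implemented DVR use cases.",
    objective="Realize measurable value from data assets.",
    equation="(total gain - total cost) / total cost",
    unit="Percentage",
    baseline=5.0,
    target=20.0,
    periodicity="Quarterly",
    data_sources=["Finance reports", "Use case registry"],
    collection_mechanism="Quarterly extract from the finance system",
    polarity="+",
)
print(card.code, card.target)
```

The same record shape applies to every indicator-card list repeated in this and the following questions.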

- The Monitoring Report of the DVR Activities & Plan with Pre-defined KPIs.

- The Monitoring Report must be attached containing the DVR activities & plan with the pre-defined KPIs (Indicator Cards).

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.


• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target,
Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- Review & Continuous Improvement Document of the DVR Plan.

- The Entity must provide a document for the periodic reviews, results, and continuous improvement mechanisms of the DVR Plan, and a list of any improvements made to the plan.

- The Revised & Updated DVR KPIs.

- A document must be attached including the following, as a minimum:

• The revised & updated DVR KPIs.

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.


• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

- Evidence of New Partnerships (e.g., MoU, jointly developed use cases or products, etc.).

- A document must be attached proving new partnerships, which may contain proofs such as:

• Memorandum of Understanding (MoU): In the event of signing a Memorandum of Understanding, a copy of the MoU must be attached.

• Jointly Developed Use Cases or Data Products: If the partnership contributed to the development of new use cases or products, they must be documented and described. There must be a brief explanation of each use case and each data product, its purpose and the value it may provide.

Checklist – Data Value Realization Domain

DVR.MQ.2 Has the entity implemented practices to support a data revenue generation process?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing

- The Existing Practices Related to Supporting the Data Revenue Generation Process.

- A document must be attached containing the current practices which support Data revenue generation including, as a minimum:

• A list of the current activities (reports and/or dashboards, etc.) related to the DVR Domain practices.

Level 2: Defined

- The Defined Pricing Scheme.

- A document must be attached containing:

• An appropriate Pricing Model for any Data and for any Data Product that’s expected to generate revenue.


- The Data or Data Product Price Calculation.

- A document must be attached containing the total expense calculation for each Data or Data Product which is expected to generate revenue. The total cost shall include the following, as a minimum:

• The Data Collection Cost – The total of the costs incurred for collecting, cleansing, and curating data.

• The Data Development Cost – The total of the costs incurred for developing analytical models, Data visualizations and other value-added services provided on top of the collected data.

- The Adopted / Approved Charging Model for Each Data or Data Product.

- A document must be attached containing the approved Charging Calculation Model for each Data or Data Product that’s expected to generate revenue.

- Examples of Charging Calculation Models include, but are not limited to:

• The Subscription Model.

• The Consumption-Based Model.

• The Freemium / Premium Model.

• The One-Time Fee Model.
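As an illustration of how the cost and charging calculations above might be computed, the sketch below combines the total-expense formula (collection cost + development cost) with two of the listed charging models. All figures and function names are hypothetical:

```python
# Illustrative sketch of a data-product cost and charging calculation.
# Figures and model parameters are hypothetical, for demonstration only.

def total_cost(collection_cost: float, development_cost: float) -> float:
    """Total expense of a data product: collection cost + development cost."""
    return collection_cost + development_cost

def subscription_charge(monthly_fee: float, months: int) -> float:
    """Subscription Model: a fixed fee per period."""
    return monthly_fee * months

def consumption_charge(unit_price: float, units_consumed: int) -> float:
    """Consumption-Based Model: pay per unit of data or API calls consumed."""
    return unit_price * units_consumed

cost = total_cost(collection_cost=120_000, development_cost=80_000)
print(cost)                                                        # 200000
print(subscription_charge(monthly_fee=5_000, months=12))           # 60000
print(consumption_charge(unit_price=0.5, units_consumed=100_000))  # 50000.0
```

Comparing projected charges against the total cost is one way to check whether a proposed unit price follows a cost-recovery scheme.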

Level 3: Activated

- Evidence of Revenue Generation Requests Submitted to NDMO-SDAIA.

- A document must be attached containing proofs of requests sent to NDMO-SDAIA for revenue or income generation from Data or from Data Products.

- Each request shall include the following, as a minimum:

• The Description of the Data or the Data Product.

• The Documentation of the Data Product Pricing Scheme / Model & the Service Pricing Scheme / Model.

• The Proposed Charging Model.

• The Proposed Final Unit Price.

• The Justification if the Final Unit Price does not follow the Cost Recovery Pricing Scheme.

Level 4: Managed

- The Monitoring Report of the Data Revenue Generation Process with Pre-defined KPIs.

- A report must be attached showing the monitoring of the Data Revenue Generation Process efficiency using pre-defined KPIs (Indicator Cards).

- The report must include the following indicators, as a minimum:

• The number of Data Products that generated revenue.

• Total revenue generated from offering Data or Data Products.

• The total cost saved from the implemented Cost Saving Use Cases.


- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- Continuous Improvement Report of the Data Revenue Generation Process.

- A document must be attached containing, as a minimum, the following:

• Continuous improvement mechanisms.

• Documents of the periodic reviews and results of the Pricing Model, the Price Calculation for data or data products, and the Charging Fee Calculation Model.

• The improvement recommendations or updates on the data revenue generation process based on the review results.

8.2.11. Open Data Domain

Checklist – Open Data Domain

OD.MQ.1 Has the Entity defined, established, and implemented a plan to identify and coordinate the publishing of its Open Datasets?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing

- The existing list of Open Datasets.

- A document must be attached covering the current list of open datasets. For each existing open dataset, the following information must be provided, as a minimum:

• Name: The name of the open dataset.

• Description: A brief description of each open dataset, including its source and any related information.

• Download Links: The download link of each open dataset.

Level 2: Defined

- The Approved Open Data Framework.

- A document must be attached including the following, as a minimum:

• The approach / methodology used to make open data available to the public.

- The Approved Open Data Plan.

- The approved OD plan must be attached including the following, as a minimum:

• A roadmap that includes activities & milestones for the implementation of OD initiatives. The activities shall include what is necessary to achieve the specifications of this Domain, as a minimum.

• Allocation of the required resources & budget to manage the implementation of OD initiatives.

- The OD Management structure. - An Open Data Management Structure document must be attached including, as a
minimum:

• The roles & responsibilities.

• The compliance audit framework.


- The Developed and Approved Plan for Change Management (including awareness campaigns).

- A document must be attached containing the change management plan that has been developed & approved including, as a minimum:

• The OD activities training plan.

• The OD activities awareness campaigns plan including the following, as a minimum:

• The usage of open data, and its various positive social & economic benefits.

• Promoting the Entity's open data & the related activities.

Level 3: Activated

- The Open Data Plan Implementation Status Report.

- A report must be attached showing the open data plan implementation status including, as a minimum:

• The achievement percentages of the initiatives & projects included in the OD implementation plan.

- Evidence of Submission of the Annual Compliance Report to NDMO-SDAIA.

- The Entity must attach the Annual Compliance Report submitted to NDMO-SDAIA and attach the evidence of the submission.

- Assignment Decisions / Appointees to Job Roles.

- A document must be attached including the hiring / assignment decisions for the following roles including job descriptions:

• Open Data & Information Access Officer (ODIAO).

• Business Data Executive (BDE).

• Business Data Steward.

- Evidence of Implementation of the Change Management Program (the conducted training courses and the launched awareness campaigns related to Open Data).

- A document should be attached that includes the following:

• Evidence of the implementation of training courses related to Open Data.

• Evidence of the launch of awareness campaigns related to Open Data.


Level 4: Managed

- The Monitoring Report on the Effectiveness of the Open Data Plan through Predefined Key Performance Indicators (KPIs).

- A monitoring report must be attached showing the Open Data Plan Implementation & must be prepared based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target, Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The Periodic Reviews and Improvements of the Open Data Plan.

- A document must be attached containing the periodic reviews and improvements of the open data plan, including, at a minimum, the following:

• Mechanisms for continuous review and improvement of the plan and roadmap.

• Documented reviews and periodic results.

• Any improvements made to the plan and roadmap.

Checklist – Open Data Domain

OD.MQ.2 Has the Entity defined, established, and implemented a process to support the identification of Open Data (OD)?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities - Not Applicable. - Not Applicable.

Level 1: Establishing

- The Existing Practices for OD Identification.

- A document must be attached, explaining:

• The Current OD Identification Practices.

Level 2: Defined

- The Defined Process Documentation for Managing the Lifecycle of Open Data.

- A document must be attached containing the Open Data Lifecycle Management Processes including the following, as a minimum:

• Processes to identify public datasets to be published by the Entity.

• Processes to ensure that datasets are published and maintained in their appropriate format, timeliness, comprehensiveness, and overall high quality, and to ensure the exclusion of any restricted data.

• Processes for gathering feedback, analyzing performance at the Entity level, and improving the overall Open Data national impact.

- The Defined Process Documentation for Identifying Open Data.

- A document must be attached containing the Open Data Identification Process including the following, as a minimum:

• Identifying and documenting all data classified as 'Public' and prioritizing each dataset included under open data.

• Evaluating the identified datasets to determine whether to be published as open data or not.

• Determining whether the combination of any publicly available data and the data to be published would constitute an unauthorized disclosure of Personal Information, or any other security or privacy risk or threat.


- The Process of Evaluating the Value and Impact of Open or Public Datasets.

- A document should be included that outlines the process for assessing the estimated value and potential impact of open or public datasets, which should include, at a minimum, the following:

• Steps and methods for analyzing the value of each open dataset from different dimensions such as economic (e.g., ROI), social, and environmental dimensions.

• Steps and methods for evaluating the potential impact after sharing open or public datasets, such as increasing transparency, improving decision-making processes, and enhancing research and innovation.

• Risk assessment and mitigation methods for identifying potential risks associated with open datasets, such as security threats, data breaches, privacy concerns, Data Quality (DQ) issues, intellectual property violations, etc.

Level 3: Activated

- The OD Identification Process Implementation Status Report: A report must be attached showing the implementation status of the Open Data identification process.

- The List of Identified Open Datasets with the Assigned Priorities: A document must be attached containing the list of public datasets considered for publication as open data, with the classification & prioritization information for each dataset.

- For each dataset, the following information must be provided, as a minimum:

• Name: The name of the dataset.

• Description: A brief description of each dataset, including its source and any related information.

• Size: The size of each dataset in terms of the number of records or file size.

- The Identified & Documented Metadata for the Open Datasets: A document must be attached containing the identified & documented metadata of the open datasets, including, as a minimum:

• The necessary metadata for each open dataset, to easily identify, describe and search for it once published.


- Value and Impact Assessment Report for the Identified Open or Public Datasets: A value and impact assessment report for the identified open or public datasets to be published must be attached, which should include, at a minimum, the following:

• Results of evaluating the value and impact of the identified datasets to decide whether or not to publish them as Open Data.

• A risk assessment report for the identified potential risks associated with publishing the open datasets.

Level 4: Managed

- The Monitoring Report of the OD Identification & Prioritization Processes with Pre-defined KPIs: A report must be attached showing the monitoring of the OD Identification Process. It must be prepared based on pre-defined KPIs (Indicator Cards), including the key performance indicator "Number of identified and prioritized open datasets".

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identification of the Specification or Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).
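The indicator card described above is essentially a small record plus a polarity rule. The following is a minimal sketch of such a card as a data structure, with an illustrative on_track helper; the field names and the helper are assumptions for illustration, not part of the NDI specification:

```python
from dataclasses import dataclass, field
from enum import Enum

class Polarity(Enum):
    POSITIVE = "+"  # a higher indicator value is the target
    NEGATIVE = "-"  # a lower indicator value is the target

@dataclass
class IndicatorCard:
    """Minimum fields of a KPI Indicator Card, per the checklist above."""
    name_code: str
    owner: str
    coordinator: str
    description: str
    objective: str            # strategic / operational objective measured
    equation: str
    unit: str                 # Percentage, Number / Quantity, etc.
    baseline: float           # value in the first measurement year
    target: float
    periodicity: str          # Monthly / Quarterly / Biannually / Annually
    data_sources: list = field(default_factory=list)
    collection_mechanism: str = ""
    polarity: Polarity = Polarity.POSITIVE

    def on_track(self, measured: float) -> bool:
        """Compare a measured value to the target, respecting polarity."""
        if self.polarity is Polarity.POSITIVE:
            return measured >= self.target
        return measured <= self.target

# Hypothetical card for the KPI named in the checklist.
card = IndicatorCard(
    name_code="OD-KPI-01", owner="CDO Office", coordinator="OD Team",
    description="Number of identified and prioritized open datasets",
    objective="Support the OD identification process",
    equation="count(identified and prioritized open datasets)",
    unit="Number", baseline=12, target=30, periodicity="Quarterly",
)
print(card.on_track(35))  # True: positive polarity and 35 >= 30
```

The polarity field is what lets one monitoring routine handle both "more is better" counters and "less is better" indicators (such as response times) uniformly.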


Level 5: Pioneer

- The Continuous Improvement Report Showing the Documented Periodic Reviews & Outcomes of the OD Identification Processes and the Implemented Automation: A continuous improvement document must be attached proving the periodic reviews & outcomes of the OD identification processes and the implemented automation processes, including the following, as a minimum:

• The document of periodic reviews & results which were documented for the OD identification processes and the implemented automations.

• The continuous improvement mechanisms of the OD identification processes and the implemented automations.

• The processes which were implemented & automated to support OD identification.

Checklist – Open Data Domain

OD.MQ.3 Has the Entity defined, established, and implemented a process to support publishing its Open Datasets?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- The Existing Practices for Publishing Open Datasets: A document must be attached, explaining:

• The current practices related to Publishing Open Datasets.

Level 2: Defined

- The Defined Process Documentation for Publishing Open Data: A document outlining the Open Data Publishing Process must be attached, showing steps to ensure that datasets are published and maintained in their appropriate format, timeliness, comprehensiveness, and overall high quality, and to ensure the exclusion of any restricted data.

- The Defined Process Documentation for Open Data Maintenance: A document outlining the Open Data Maintenance Process must be attached, showing activities including, at a minimum, the following:

• Regular updates and documentation of changes to the published Open Datasets and associated metadata whenever changes occur.

• Continuous review of the published Open Datasets to ensure they meet the defined regulatory requirements.

• Maintenance of data traceability by documenting data provenance and maintaining the versioning history of the datasets.


Level 3: Activated

- Status Report on the Implementation of the Open Data Publishing Process: The report must be attached showing the implementation status of the open data publishing process.

- Evidences of Published Datasets on the Saudi Open Data Portal: The proof document should include the datasets published on the National Open Data Portal under the Open Data License in the Kingdom of Saudi Arabia, as outlined in the Open Data Policy issued by NDMO-SDAIA. For each dataset currently published on the National Open Data Portal, the following must be documented, as a minimum:

• Name: The name of the open dataset.

• License data.

• Description: A brief description including its source and any related information.

• Size: The size in terms of the number of records or file size.

• Format: The file format, e.g., CSV, JSON, or XML.

• Download Links: The download link.

• Publishing Date: The date the open dataset was first published.

• Update Date: The last update date.

• Usage and downloads statistics.

- Evidences of Feedback / Comments Received on OD: A document must be attached containing proof that feedback comments were received on the open datasets.

- Evidence of Formats Used to Standardize Open Datasets in Machine-Readable Form: A document must be attached showing the formats of the published open datasets, such as CSV, JSON, or XML, and instructions on how to use each open dataset according to its published format.
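Machine readability of a published dataset can be smoke-tested by checking that the file actually parses in its declared format. Below is a small illustrative check for the CSV and JSON formats named above; the function and its acceptance rules are an assumption for illustration, not an NDI requirement:

```python
import csv
import io
import json

def is_machine_readable(payload: str, fmt: str) -> bool:
    """Return True if the payload parses in its declared format.

    Covers the CSV and JSON formats mentioned above; XML could be
    added the same way with xml.etree.ElementTree.
    """
    try:
        if fmt.lower() == "json":
            json.loads(payload)
            return True
        if fmt.lower() == "csv":
            rows = list(csv.reader(io.StringIO(payload)))
            # Require a header row, at least one data row, and a
            # consistent column count across all rows.
            return len(rows) >= 2 and all(len(r) == len(rows[0]) for r in rows)
        return False
    except (json.JSONDecodeError, csv.Error):
        return False

print(is_machine_readable("name,records\nPopulation 2023,1200", "csv"))  # True
print(is_machine_readable("<html>not a dataset</html>", "json"))         # False
```

A check like this could run as part of the publishing process, rejecting uploads whose contents do not match the format declared in the dataset's metadata.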

- Evidence of Data Standards Applied on Open Datasets to Ensure High Data Quality (DQ): A document must be attached showing evidence of the data standards applied to the open datasets to ensure high data quality.


- Open Data Maintenance Report: A report must be attached showing the Open Data maintenance process outcomes, including, at a minimum, the following:

• Periodic data updates, and documentation of any changes made to the published open datasets and the associated metadata (in the event of any changes).

• Ongoing reviews of published datasets to ensure that they meet the specified regulatory requirements.

• Evidence of data traceability by documenting data provenance and the versioning history of the datasets.

- The Open Data Register Containing Records of Open Data Activities and Published Open Datasets: A document must be attached that includes the following, as a minimum:

• A record that includes artifacts related to Open Data Domain activities and decisions made during data life cycle management.

• A record that includes a list of all open datasets, the reviews and changes associated with them, and their metadata.

Level 4: Managed

- The Monitoring Report of the OD Publishing Process with Pre-defined KPIs: The Entity must attach a report on monitoring the OD Publishing Processes based on pre-defined KPIs (Indicator Cards), including the following indicators, as a minimum:

• The number of downloads per published Open Dataset.

• The number of identified Open Datasets that have been published.

• The number of updates performed on published Open Datasets.

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identification of the Specification or Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report of the OD Publication & Maintenance Practices: A continuous improvement document must be attached showing:

• The documented periodic reviews & outcomes of the OD publication and maintenance practices.

• The continuous improvement mechanisms of the OD publication and maintenance practices.

8.2.12. Freedom of Information Domain

Checklist – Freedom of Information Domain

FOI.MQ.1 Has the Entity defined and established a plan to address its compliance with the requirements of the Freedom of Information (FOI) Regulations?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- The Existing FOI Practices: The Entity must attach a report on the current practices in the FOI Domain containing implementation evidence of these practices.

Level 2: Defined

- The Defined & Approved FOI Implementation Plan & Roadmap: The Entity must attach the approved FOI Domain implementation plan and roadmap, including the following, as a minimum:

• A roadmap that includes the activities and milestones for the achievement of full compliance with the FOI regulations published by NDMO-SDAIA. The activities shall incorporate what is needed to achieve this Domain’s specifications, as a minimum.

• The assignment of the required resources & budget allocation to achieve full compliance with the FOI regulations published by NDMO-SDAIA.

Level 3: Activated

- The FOI Plan Implementation Status Report: The Entity must attach a report clarifying the implementation status, including, as a minimum:

• The achievement percentages of the initiatives and projects included in the executive action plan of FOI.

- The Assigned Open Data & Information Access Officer (ODIAO): The Entity must attach the data of the appointed Open Data & Information Access Officer (ODIAO), including evidence of the employment decision.


- FOI Awareness: The Entity should attach a report proving its implementation of FOI awareness campaigns, with the aim of instilling and promoting a culture of transparency, and raising awareness of the FOI policy issued by NDMO-SDAIA and the right to access public information. Awareness campaigns should include, at a minimum, the following:

• Raising awareness among employees involved in the processing of FOI requests so that they understand the main obligations and requirements of the FOI policies issued by NDMO-SDAIA.

• Raising awareness about the FOI principles and their application to the rights of beneficiaries.

Level 4: Managed

- The Monitoring Report on the Entity's FOI Plan & Activities with Pre-defined Key Performance Indicators (KPIs): The Entity must attach a report prepared based on pre-defined KPIs (Indicator Cards), and each indicator’s data or card should include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identification of the Specification or Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

- The Internal Audit Reports on the Entity's Compliance with the FOI Regulations: The Entity must attach reports including the following:

• The conducted internal audits monitoring compliance with the FOI Regulations published by NDMO-SDAIA.

• The documented audit findings, submitted in a report to the Open Data & Information Access Officer (ODIAO).


• The corrective actions applied in cases of non-compliance, with notifications to the regulatory authority or NDMO (as stated in the Regulations), and documentation of these improvements in the audit findings report.

Level 5: Pioneer

- The Continuous Improvement Report of the FOI Plan: The Entity must attach a report showing periodic FOI plan reviews to ensure continuous compliance with applicable regulations and other environmental requirements or influences. The report must include the following, as a minimum:

• The FOI plan review documents and the documented periodic results.

• The FOI Plan continuous improvement mechanisms.

Checklist – Freedom of Information Domain

FOI.MQ.2 Has the Entity defined and implemented the required processes for Freedom of Information (FOI)?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- The Existing Processes for the FOI Requests & Responses: The Entity must attach evidence of previously executed process(es) and response(s) to information request(s).

Level 2: Defined

- The Developed & Approved FOI Request Processes & Procedures Documentation: The Entity must attach the developed and approved procedures and processes for FOI requests. The Entity must design and document a standardized / unified process for information requests and develop procedures for managing, processing and documenting requests for access to public information, in alignment with the FOI Regulations published by NDMO-SDAIA.


- The Developed FOI Process Guide & FAQs: The Entity must attach the developed FOI process guideline and answers to the Frequently Asked Questions (FAQs). The guide must clarify the following, as a minimum:

• The request preparation mechanism.

• The request sending mechanism.

• The response awaiting mechanism.

• The response review mechanism.

• The appeals mechanism (if required).

- Moreover, the FAQs may include:

• Common questions about the FOI process, requirements, timelines, and other information related to requesting information. Publishing FAQs can help clarify common inquiries and provide useful information for individuals seeking to submit FOI requests.

Level 3: Activated

- The Implementation / Adoption Status Report on the FOI Request Processes: The Entity must attach a report on the FOI request process implementation status covering the following:

• Granting access for public information request(s).

• Denying public information access request(s).

• Extending the time required to respond to specific requests.

• Notifying the requestor(s) if the required information is available on the Entity’s website or is not within its specialty.

- Evidences of Entity-wide Communication: The Entity must attach evidence of Entity-wide communications and public publications, in alignment with the FOI Regulations published by NDMO-SDAIA and without contradicting the applicable regulations in the Kingdom of Saudi Arabia (KSA). The Entity must publish the following information on its official government website or websites linked to it:

• The laws, regulations, instructions and regulatory decisions applicable to / followed within the Entity.

• The Entity's services, with descriptions detailing how to obtain access to those services.

• The Entity's organizational structure, including roles & responsibilities.

• The Entity's job vacancy information, except information on security or military job vacancies as determined by the security or military regulatory authorities or applicable KSA Regulations.

• The Entity's annual strategic and operational reports, including the Entity's financial statements.


• The Entity's general statistics, news and updates on its activities, including the following, as a minimum:

• The number of the Entity's employees.

• The year of the Entity's establishment.

• The number of the Entity's services provided in the last year.

• The Entity's up-to-date activity descriptions.

• The projects provided by the Entity, as stated in the FOI Regulations, regarding the risks which may affect people’s lives, health and properties. The information must contain the contact details of persons with valid licenses granted by the Entity, including the following, as a minimum:

• The names of the persons.

• The postal addresses of the persons.

• The e-mail addresses of the persons.

• Information on projects offered or awarded by the Entity, as prescribed by the FOI Regulations, regarding any risk that may affect people’s lives, health or properties. The information must include the following, as a minimum:

• The names of the recipients.

• The execution periods.

• The technical analysis.

• The guidelines and leaflets that raise people's awareness of their rights to the Entity’s FOI.

• If the information above is not available or applicable, the Entity must provide a justification with the evidence, in line with the FOI Regulations.

- The Register of the Received Request Forms with the Responses: The Entity must attach the prepared request forms for access to Public Information, whether paper or electronic, specifying the required information to be provided by the Requestor. The required information must include the following, as a minimum:

• Information about the Requestor, including name, address and national ID.

• Description of the Public Information requested by the requestor.

• Purpose behind the request for access to public information.

• Legal basis for the request.


• Notice delivery method to the requestor (e-mail, national address).

• Date of the request.
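When the request form is electronic, the minimum fields above can be enforced mechanically before a request enters the register. A hypothetical sketch follows; the field names are illustrative, not prescribed by the Regulations:

```python
from datetime import date

# Minimum fields a request form must capture, per the list above
# (names are illustrative assumptions).
REQUIRED_FIELDS = {
    "requestor_name", "requestor_address", "requestor_national_id",
    "information_description", "purpose", "legal_basis",
    "notice_delivery_method", "request_date",
}

def validate_request_form(form: dict) -> list:
    """Return the names of any required fields that are missing or empty."""
    return sorted(f for f in REQUIRED_FIELDS if not form.get(f))

form = {
    "requestor_name": "Example Requestor",
    "requestor_address": "Riyadh",
    "requestor_national_id": "1234567890",
    "information_description": "Annual operational report",
    "purpose": "Research",
    "legal_basis": "FOI Regulations",
    "notice_delivery_method": "e-mail",
    "request_date": date(2023, 10, 1).isoformat(),
}
print(validate_request_form(form))  # [] -> the form is complete
```

A validated form can then be appended to the register together with the Entity's response, which keeps the register's request records uniform.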

- The Identified Public Datasets Shared Under the FOI Regulations: The Entity must attach evidence of publishing specific Public Datasets under the FOI Regulations.

- Responding to FOI requests can sometimes lead to the publication of open datasets as part of the Open Data (OD) initiatives.

- When a requestor submits an FOI request to obtain specific information that is within the scope of the FOI Regulations, and the request is granted, the government Entity may provide the requested information in the form of a dataset. If a published dataset meets the criteria of being publicly available, machine-readable, and reusable, it can be considered Open Data (OD).

- Evidences of the Published FOI Communications, Including Guidelines & FAQs, on the Entity's Official Gov Website, in Line with the NDMO Requirements: The Entity must attach evidence of the FOI communication publications, including guidelines and Frequently Asked Questions (FAQs), on the Entity’s official website, in line with NDMO requirements.

- The Pricing Scheme for Public Information Access Requests: The Entity must attach a public information access request pricing mechanism. The Entity must calculate and document the processing fees of each granted / approved public information access request, by adopting a Pricing Scheme determined by the Entity and approved by NDMO-SDAIA.

- The Up-to-date FOI Register: The Entity must attach an updated FOI Register, documenting compliance records in a register as instructed in the FOI guidelines published by NDMO-SDAIA. The Register must include the following, as a minimum:

• Information on the current Open Data and Information Access Officer (ODIAO).

• Public Information Access Request Records.

• Public Entity Publications.

• Any other records, including the manner and format, as required by the FOI Regulations published by NDMO-SDAIA.

Level 4: Managed

- The Monitoring Report with Pre-defined KPIs for the Entity's Responses to FOI Requests: The Entity must attach a report on monitoring the Entity’s responses to FOI requests based on pre-defined KPIs (Indicator Cards). Each indicator’s data or card should include the following, as a minimum:


• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identification of the Specification or Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer

- The Continuous Improvement Report on the FOI Processes: The Entity must attach a report on the updated FOI process(es), including the changes, with reference to revised copies of the documents which identify the steps and procedures for a particular FOI process or task, including:

• The review documents and the documented periodic results of the updated processes, with the changes.

• The FOI Processes continuous improvement mechanisms.

- The Automated Tool for FOI Requests: The Entity must attach a report describing (name, version, etc.) the tool(s) used to automate the FOI request response process.

- A tool refers to a program or an application designed to simplify and automate the handling / processing of FOI requests from submission to completion. Such a tool can enhance efficiency, accuracy, and transparency in managing FOI requests by automating various tasks, reducing manual effort, and ensuring compliance with FOI Laws and Regulations.

8.2.13. Data Classification Domain

Checklist – Data Classification Domain

DC.MQ.1 Has the Entity established a plan for Data Classification (DC) as stipulated by the Data Management and Personal Data Protection (DM & PDP) Standards?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- The Existing DC Practices: The Entity must attach a report clarifying the current Data Classification practices.

Level 2: Defined

- The Defined & Approved Data Classification Plan: The Data Classification Plan must include the following, as a minimum:

• The roadmap with the activities & milestones for the classification of the Entity's data. The activities shall incorporate what is needed to achieve this Domain’s specifications, as a minimum.

• The assignment of the required resources & budget to manage the classification of the Entity's data.

Level 3: Activated

- The Data Classification Implementation Plan Status Report: The Entity must attach a report clarifying the implementation status, including, as a minimum:

• The achievement percentages of the initiatives & projects included in the Data Classification Implementation Plan.

Level 4: Managed

- The Implementation Monitoring Report of the DC Plan & Activities with Pre-defined KPIs: The report must be prepared based on the KPI data (Indicator Cards) pre-defined in the Data Classification Plan, and each indicator’s data or card should include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (with identification of the Specification or Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.


• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: a higher indicator value is the target; Negative Polarity: a lower indicator value is the target).

Level 5: Pioneer

- The Data Classification Plan Review Report: The Entity must attach a report showing that the Entity identified, implemented and monitored continuous improvement mechanisms for the Data Classification Plan, including the following, as a minimum:

• The documented periodic reviews & results.

• The continuous improvement mechanisms.

Checklist – Data Classification Domain

DC.MQ.2 Has the Entity defined, identified and implemented the required Data Classification (DC) processes?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- The Current List of Classified Datasets: The Entity must attach the current list of classified Datasets.

Level 2: Defined

- Data Classification Policy: A data classification policy must be attached, which should include, at a minimum:

• Policy name.

• Release date.

• Release number.

• Document control (preparation, revision, approval).

• Version history.

• Terminology.


• Objective.

• Scope of work.

• References.

• Policy owner.

• Policy statement, including as a minimum:

• The main principles of classification.

• The classification prioritization process and criteria.

• The classification levels.

• The classification controls.

• The steps required for classification:

1. Define the data.

2. Appoint the classification officer.

3. Conduct an impact assessment.

4. Conduct an impact assessment for low-impact data.

5. Review the classification.

6. Apply appropriate controls.

7. Define roles and responsibilities.

- The Data Handling and Protection Controls: The Entity must attach a document containing the Handling & Protection Controls for each Dataset & Artifact according to its classification, to ensure secure handling, processing, sharing and disposal of data by following the policies & regulations of the National Cybersecurity Authority (NCA). The document shall include, as a minimum:

• The controls for protection, handling & processing of Public Classification cases.

• The controls for protection, handling & processing of Restricted Classification cases.

• The controls for protection, handling & processing of Confidential / Secret Classification cases.


• The controls for protection, handling & processing of Top Confidential / Top Secret Classification cases.

Level 3: Activated

- The Inventory Report of the Identified Datasets and Artifacts: The Entity must attach a document clarifying an inventory of all Datasets & Artifacts to implement the Data Classification Process, including, as a minimum:

• Identification of the ownership of the Datasets & Artifacts.

• The list of all Datasets & Artifacts.

- The Prioritized Datasets and Artifacts: The Entity must attach a document showing the list of prioritized Datasets & Artifacts to be followed during classification.

- The Impact Assessment Report: The Entity must attach a report proving that the Entity conducts probable-impact assessments (e.g., of any potential damage) when specific Data is disclosed or accessed in an unauthorized way. The Impact Assessment process should be implemented for all identified Datasets & Artifacts, including the following steps:

• The identification of the potentially impacted categories amongst national interests, organizations, individuals and the environment.

• The selection of the potential damage impact level for each identified category amongst 'High', 'Medium', 'Low' and 'None / Insignificant'.

• The assignment of Classification Levels to Datasets & Artifacts based on the selected impact level:

• If the impact level was assessed as 'High', the Datasets & Artifacts shall be classified as Top Confidential / Top Secret.

• If the impact level was assessed as 'Medium', the Datasets & Artifacts shall be classified as Confidential / Secret.

• If the impact level was assessed as 'Low', the Datasets & Artifacts shall be classified as Restricted.

• If the impact level was assessed as 'None / Insignificant', the Datasets & Artifacts shall be classified as Public.

• The alignment of the assessment with the Data Classification Policies & Regulations published by NDMO-SDAIA.
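The impact-to-classification assignment above is a straightforward lookup. The sketch below illustrates it in Python; note that aggregating across the impacted categories by taking the most severe assessed level is an assumed convention for the example, not stated in the checklist:

```python
# Mapping from assessed impact level to classification level, per the checklist.
IMPACT_TO_CLASSIFICATION = {
    "High": "Top Confidential / Top Secret",
    "Medium": "Confidential / Secret",
    "Low": "Restricted",
    "None / Insignificant": "Public",
}

# Order of severity used to aggregate the per-category assessments
# (national interests, organizations, individuals, environment).
SEVERITY = ["None / Insignificant", "Low", "Medium", "High"]

def classify(per_category_impacts: list) -> str:
    """Classify a Dataset / Artifact from its per-category impact levels,
    taking the most severe assessed level across categories."""
    worst = max(per_category_impacts, key=SEVERITY.index)
    return IMPACT_TO_CLASSIFICATION[worst]

# One 'Medium' assessment dominates, so the dataset is Confidential / Secret.
print(classify(["Low", "Medium", "None / Insignificant", "Low"]))
```

Keeping the mapping in one table makes it easy to show assessors exactly which impact level produced a given classification, which supports the review step described above.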


- The Assessment Report of Low-Impact Data. - A report shall be attached proving that the Entity researched / assessed the possibility of classifying Low-impact Data as Public instead of Restricted. The assessment shall include the following:

• Analyzing / evaluating if the disclosure of this Low-impact data breaches any existing KSA regulation, such as the Anti Cyber Crime Regulations and the Electronic Commerce (e-Commerce) Regulations.

• Identifying the potential benefits of disclosing / opening such Datasets & Artifacts and assuring / considering whether those would outweigh the negative impacts.

• Modifying the Low-impact classified data to be considered Public if publishing / releasing would not breach any applicable / existing regulation, especially if the benefits outweigh the negative impacts.

• Aligning the assessment with the Data Classification Policies & Regulations
published by NDMO-SDAIA.
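The low-impact assessment above boils down to a two-condition decision: open the data as Public only if disclosure breaches no regulation and the benefits outweigh the negative impacts. The helper below is a hypothetical sketch; the function and argument names are assumptions for illustration only.

```python
# Hypothetical helper capturing the low-impact reclassification decision
# described above; names are assumptions, not part of the NDMO-SDAIA policies.
def reclassify_low_impact(breaches_regulation: bool,
                          benefits_outweigh_impacts: bool) -> str:
    """Return 'Public' only when disclosure breaches no regulation and the
    benefits of opening the data outweigh the negative impacts; otherwise
    the data stays 'Restricted'."""
    if not breaches_regulation and benefits_outweigh_impacts:
        return "Public"
    return "Restricted"

print(reclassify_low_impact(False, True))  # Public
print(reclassify_low_impact(True, True))   # Restricted
```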

- Evidence of Utilization of the Data Catalog Tool for the Data Inventory. - The Entity must attach a report or proof that the Data Catalog Tool was used for Data Inventorying.

- The Approved Data Access List of Users with the Assigned Privileges. - The Entity must attach a document showing the Data Access list with the assigned permissions / privileges, including the following, as a minimum:
• Specifying the types of users who need Data Access.

• Specifying the Data Access Authorizations (Read, Modify, Delete).


- Data Register. - The Entity must attach a register / log documenting the list of all identified Datasets &
Artifacts, in addition to the activities implemented during the Data Classification process. The
register / log should include the following, as a minimum:

• The list of the Entity's identified Datasets & Artifacts.

• The Classification Levels assigned to the identified Datasets & Artifacts.

• The Assignment Dates of the classification levels to the identified Datasets & Artifacts.

• The mandatory Classification Durations of the identified Datasets & Artifacts.

• The Classification Levels approved / validated during the reviews.

• The Classification Levels' Review Dates.
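As an illustration, one Data Register entry covering the minimum fields listed above might look like the following. All keys and values are hypothetical examples, not a prescribed schema.

```python
# Hypothetical Data Register entry; keys mirror the minimum fields listed
# above, and the values are invented for illustration only.
register_entry = {
    "dataset": "Example Dataset A",
    "classification_level": "Confidential / Secret",
    "assignment_date": "2023-10-01",
    "mandatory_classification_duration": "5 years",
    "approved_in_review": True,
    "review_date": "2024-10-01",
}
```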

Level 4: Managed

- The Monitoring Report of the DC Processes with Pre-defined KPIs. - A monitoring report on the effectiveness of the Data Classification Processes must be attached, including the following KPIs, as a minimum:
• The percentage of Classified Datasets & Artifacts (out of the Entity’s total
Datasets & Artifacts).

• The percentage of Datasets & Artifacts classified with specific classification levels (out of the Entity’s total Classified Datasets & Artifacts).

• The percentage of Low-impact data classified as Restricted.

• The percentage of Classified Datasets & Artifacts that were reviewed, approved & validated.

- Each indicator’s data or card must include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.


• The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).
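The minimum indicator-card fields listed above map naturally onto a small record type. The sketch below uses assumed field names and an invented example KPI; it is an illustration of the card structure, not an official format.

```python
from dataclasses import dataclass

# Minimal sketch of an "indicator card" holding the minimum KPI fields
# listed above. Field names and the example values are assumptions.
@dataclass
class IndicatorCard:
    name_code: str
    owner: str
    coordinator: str
    description: str
    objective: str           # strategic / operational objective measured
    equation: str
    unit: str                # e.g. "Percentage"
    baseline: float          # value in the first measurement year
    target: float
    periodicity: str         # Monthly / Quarterly / Biannually / Annually
    data_sources: list
    collection_mechanism: str
    polarity: str            # "+" higher is better, "-" lower is better

card = IndicatorCard(
    name_code="DC-KPI-01",                      # invented example
    owner="Data Management Office",
    coordinator="DC Process Owner",
    description="Share of classified datasets & artifacts",
    objective="Classify all entity datasets",
    equation="classified_datasets / total_datasets * 100",
    unit="Percentage",
    baseline=40.0,
    target=100.0,
    periodicity="Quarterly",
    data_sources=["Data Register"],
    collection_mechanism="Data Catalog export",
    polarity="+",
)
```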

Level 5: Pioneer

- The Data Classification Automation Tool. - The Entity must prove Continuous Improvement Mechanisms by attaching a report / document issued from the tool used in the automated DC process, clarifying the automation processes & phases.

Checklist – Data Classification Domain

DC.MQ.3 Has the Entity reviewed all its classified Datasets & Artifacts to ensure that the classification levels assigned to them are the most appropriate ones, as specified by the Data Classification (DC) Policies?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- The Current Practices of DC Reviews. - The Entity must attach a report clarifying the current practices of reviewing the classification of all Classified Datasets & Artifacts for the following:

• To ensure that the assigned classification level for each is suitable, in line
with the Data Classification Policies & Regulations.

• To change the classification level if the Data status changes.

Level 2: Defined

- The DC Review Mechanism. - The Entity must attach a document specifying the Data Classification Review Mechanism, including the following, as a minimum:


• Verification of accuracy of the data collected.

• Verification of data classification levels.

• Verification of validity.

• Identifying errors and corrections.

• Documenting the Data Classification Review Process.

Level 3: Activated

- The Data Classification Review Report. - The Entity must attach a report that proves periodic Data Classification reviews and includes, as a minimum:

• The Reviewed Classified Datasets & Artifacts.

• The Decisions made resulting from the Review Process.

- An Evidence Document of the Published Classification Levels as Metadata. - The Entity must attach a document proving that it published the classification levels assigned to the Datasets in the comprehensive Data Catalog. The Metadata must be published according to the process defined in the Metadata & Catalog Management (MCM) Data Management (DM) Domain.

Level 4: Managed

- The Monitoring Report of the DC Review Mechanism with Pre-defined KPIs. - The report must be prepared based on pre-defined KPI data (Indicator Cards) for the Data Classification Review Mechanism, and each indicator’s data or card should include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).


Level 5: Pioneer

- The DC Review Mechanisms Review Report. - The Entity must attach a report showing that it has identified, implemented and monitored mechanisms for the continuous improvement of the Data Classification review processes.

8.2.14. Personal Data Protection Domain

Checklist – Personal Data Protection Domain

PDP.MQ.1 Has the Entity performed an initial Personal Data Protection (PDP) assessment and developed a plan to address the strategic and operational Privacy requirements?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- Evidence of the Existing Practices of the PDP Domain and Data Privacy. - A report must be attached showing current PDP & Data Privacy practices, including proofs for those practices.

- The Initial PDP Assessment Result. - The assessment should be aligned with the national Personal Data Protection Law (PDPL) and include the following, as a minimum:

• Identification of the types of personal data being collected.

• Location & method of personal data storage.

• Current processing & uses of the personal data.

• Privacy challenges to meet compliance with the Personal Data Protection Regulations published by NDMO-SDAIA.

Level 2: Defined

- The Approved PDP Implementation Plan. - The plan should include the following, as a minimum:

• A roadmap of activities & milestones to achieve conformity commitment and maintain full compliance with the PDP Policies published by NDMO-SDAIA. The activities shall include, as a minimum, what is necessary to achieve the PDP Domain specifications.

• Allocating the required resources & budget to achieve full compliance with the PDP Policies published by NDMO-SDAIA.

- The PDP Training plan. - A report must be attached including an approved valid plan for training the Entity’s
employees in the PDP Domain, including, as a minimum:

• The scope & objectives of the training process including the various PDP
Domain topics mentioned in the “Data Management and Personal Data
Protection (DM & PDP) Standards” document.

• The methods & channels through which the training plan will be
implemented.

• The training plan execution dates.


Level 3: Activated

- The PDP Plan Implementation Status Report. - A report must be attached showing the implementation status, including, as a minimum:

• The achievement percentages of the initiatives & projects included in the PDP implementation plan.

- Evidence of PDP Training Activities Conducted. - A report must be attached showing the implementation status of the PDP training for all employees, including, as a minimum:

• The details of the training conducted for the Entity’s employees.

• The training target audience.

• The methods & channels for the training.

• The number of the Entity’s employees who were trained in the PDP
Domain, and a list of training topics including:

• Importance of Personal Data Protection and the impacts and consequences to the Entity and / or Data Subject.

• Definition of Personal Data.

• Data Subject Data Rights.

• Entity and Data Subject Responsibilities.

• Notifications, as for when the Entity and / or Data Subject should be notified, and how to handle inquiries about personal data collection, processing and sharing.

Level 4: Managed

- The Monitoring Report for the PDP Plan with Pre-defined KPIs. - The monitoring report must be prepared based on pre-defined KPIs, and each indicator’s data or card should include the following, as a minimum:
• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.


• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).

Level 5: Pioneer

- The PDP Plan Review & Continuous Improvement Report. - A report must be attached showing the Entity’s periodic review and continuous improvement of the PDP plan to ensure ongoing compliance with applicable regulations and other requirements or environmental influences. The report must include, as a minimum:

• Documentation of periodic reviews of the personal data protection plan and documented results.

• Continuous improvement mechanisms to update the data privacy plan.

Checklist – Personal Data Protection Domain

PDP.MQ.2 Has the Entity defined and implemented Privacy policies and processes for Personal Data, including Data breach identification, consent management, Data Subject rights, and Privacy risk assessments?

Levels Acceptance Evidence Acceptance Criteria

Level 0: Absence of Capabilities

- Not Applicable. - Not Applicable.

Level 1: Establishing

- Evidence of the Existing Initiatives for PDP and Data Privacy. - Proofs must be attached for the current PDP & Data Privacy initiatives performed by the Entity, for example (but not limited to):
• The Entity’s current PDP policies & processes.

• Any documents presenting the personal data breach identifications.

• The documentation of the Entity’s current practices of consent management and the rights of the Data Subjects.

• Privacy risk assessments conducted by the Entity.

Level 2: Defined

- The Documented Data Breach Notifications Process. - The Entity must attach a document describing the Data Breach Notification processes, which require the person in charge of data control or data processing who deals with personal data at the Entity to notify the Regulatory Authority in the event of a personal data breach, within the time frame specified in the Personal Data Protection Policy issued by the National Data Management Office; note that the reporting time frame is 72 hours.

- Please refer to the Personal Data Protection Policy issued by the National Data
Management Office for more detailed requirements.
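The 72-hour notification window above can be expressed as a simple deadline check. The sketch below is an assumption-based illustration, not official tooling.

```python
from datetime import datetime, timedelta

# Illustrative sketch of the 72-hour breach-notification window described
# above; function names and the example timestamps are assumptions.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(breach_detected_at: datetime) -> datetime:
    """Latest time by which the Regulatory Authority must be notified."""
    return breach_detected_at + NOTIFICATION_WINDOW

def is_within_window(breach_detected_at: datetime,
                     notified_at: datetime) -> bool:
    """True when notification happened on or before the 72-hour deadline."""
    return notified_at <= notification_deadline(breach_detected_at)

detected = datetime(2023, 10, 1, 9, 0)
print(is_within_window(detected, datetime(2023, 10, 3, 9, 0)))  # True
print(is_within_window(detected, datetime(2023, 10, 5, 9, 0)))  # False
```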

- The Documented Data Breach Management Process. - A document must be attached containing the procedures & processes of Data Breach Management. The Data Breach Handling & Management Process shall include, as a minimum:

• Conducting an incident review by the Data Controller / Data Protection Controller with the Regulatory Authority.

• Formulating an immediate response to the incident by the Data Controller and / or Data Processor.

• Implementing the permanent corrective actions as issued by the Regulatory Authority.

• Testing the implemented corrective actions to validate the efficiency of the PDP solution(s).

- The PDP & Data Privacy Notice and the Consent Management Process. - The Entity shall attach a document containing the procedures & processes of the PDP / Privacy Notice and Consent Management Process, considering the following components, as a minimum:

• Define and document the processes of providing Data Subjects with notice and requesting consent at all the data lifecycle phases where / when data is collected, as prescribed by the PDP Policies & Regulations published by NDMO-SDAIA.

• The Entity shall provide the Data Subject with all possible options; and
the Entity must get the Data Subject’s (Explicit or Implicit) consent /
approval regarding the collection, use or disclosure of personal data.

• The Entity shall document and make available a PDP/Privacy Notice for
Data Subjects to read / review before or at the time the Entity requests
permission to collect personal data.

- The Data Subjects' Rights Management Processes. - The Entity shall attach the Data Subjects Rights Management procedures & processes document, as it must establish & document the operations that support the rights of Data Subjects, in compliance with the PDP Policies & Regulations published by NDMO-SDAIA, whereby a Data Subject enjoys the following rights:

• Right to be informed.

• Right to access.

• Right to rectify / correct.

• Right to erase or destroy.


• Right to object.

• Right to restrict processing.

• Right to data portability / transfer.

- The Entity should inform the Data Subjects about their rights and provide means by which Data Subjects' requests are submitted, responded to and tracked.
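The Data Subject rights listed above can be captured as a simple enumeration when building a request-tracking process. The member names below are illustrative assumptions; only the right descriptions come from the list above.

```python
from enum import Enum

# Illustrative enumeration of the Data Subject rights listed above;
# the member names are assumptions made for this sketch.
class DataSubjectRight(Enum):
    INFORMED = "Right to be informed"
    ACCESS = "Right to access"
    RECTIFY = "Right to rectify / correct"
    ERASE = "Right to erase or destroy"
    OBJECT = "Right to object"
    RESTRICT = "Right to restrict processing"
    PORTABILITY = "Right to data portability / transfer"

# A request-tracking record could pair a right with a status, e.g.:
request = {"right": DataSubjectRight.ACCESS, "status": "submitted"}
```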

- Entity-Specific PDP Policies. - The Entity shall attach its PDP Policies document including:

• Policy Name.

• Release date.

• Version number.

• Document control (Preparation, review, approval).

• Policy Statement.

• Job roles & responsibilities.

• Version history / record.

• Terminologies.

• Objective.

• Scope of work.

• References.

• Policy Owner.

Level 3: Activated

- The Developed & Adopted Consent Management Workflow. - Evidence shall be attached proving the automation of the developed & approved workflow for the consent management process.

- Evidence of Notifications Sent to the Regulatory Authority within the Allotted Timeframe. - Evidence shall be attached proving that the notifications were sent by the person in charge of data control or data processing who deals with personal data at the Entity to the regulatory / legislative authority within the specified reporting time frame of 72 hours. If no incident occurred, an approved letter should be attached.

- Evidence of Data Breach Management, Including Identified Data Breaches. - Evidence shall be attached proving that Data Breach Management was implemented, including the identification & detection of any data breaches that occurred. If no incident occurred, an approved letter should be attached.

- The Results of the PDP Risk Assessments. - The Entity shall attach a document containing the procedures & processes of the PDP risk assessment plan / operations, as the Entity must conduct a yearly risk assessment of the operation and use of the information systems containing personal data, including the collection & processing of personal data, and the storage & transmission of personal data by each system, whether automated or manual. The risk assessment findings must be, at a minimum:

• Documented.

• Analyzed for impact and the occurrence likelihood.

• Evaluated against current regulatory obligations and resolution criticality.

- Evidence of Published Data Subjects Rights Management Processes and Feedback Received from Data Subjects. - Evidence shall be attached proving that the processes were published clearly and that feedback was collected from users, especially Data Subjects.

- The PDP Register. - The Entity should attach a report of the compliance records maintained (record of any
collection and/or processing of any personal data), and evidence showing that the records
were made available to the regulatory authority (NDMO) as defined in the PDP Policies &
Regulations.

Level 4: Managed

- The Monitoring Report for the PDP & Data Privacy Practices with Pre-defined KPIs. - The report must be prepared based on the data of the pre-defined Key Performance Indicators (KPIs) (Indicator Cards), and each indicator’s data or card should include the following, as a minimum:

• Indicator’s Name / Code.

• Indicator’s Owner.

• Indicator’s Coordinator.

• Indicator’s Description.

• The strategic / operational objective to be measured (identifying the Specification or the Process to which the indicator belongs).

• Indicator’s Equation.

• Measurement Unit (Percentage, Number / Quantity, etc.).

• Baseline (Measurement value in the first measurement year).

• Target value.

• Measurement Periodicity (Monthly / Quarterly / Biannually / Annually).

• Data sources used to calculate the indicator.

• Data collection mechanism.

• Indicator’s polarity (+/-) (Positive Polarity: Higher indicator value is the target; Negative Polarity: Lower indicator value is the target).


- The Compliance Monitoring Report & Audit Results. - The Entity shall conduct internal audits to monitor compliance with the PDP Rules & Data Privacy Regulations and shall document its findings in a report presented to the Data Protection Officer. In non-compliance cases, the Entity shall take corrective actions, notify the Regulatory Authority and the NDMO, and document the corrective actions within the audit findings report.

Level 5: Pioneer

- The Documented Periodic Reviews & Outcomes for the PDP & Data Privacy Practices. - A report must be attached including the updated PDP & Data Privacy processes & practices documents, showing the operations of reviewing the PDP & Data Privacy processes & procedures. The report shall include, as a minimum:

• Based on the identified matters / issues, such as risk re-assessment, the Entity must update the documents (the Policies & Procedures for addressing Data Privacy breaches) and must attach evidence of such reviews.

- Evidence of Automation & Change Management for PDP. - The Entity must address changes to the PDP processes & practices and must attach continuous improvement mechanisms to prove effective Change Management in relation to the required changes.

- The Entity must attach evidence of process automation (an updated workflow, any system changes) to include these changes as part of the implementation.

8.3. Appendix III – Operational Excellence (OE)
The Operational Excellence (OE) component of the NDI utilizes information captured
from various national data platforms (e.g.: ODP, DL, DMP, GSB, NDC, and RDM) to
evaluate the entity's operational efficiency and effectiveness based on the practices
currently applied in 6 Data Management (DM) domains (subject to increase). For the first
round of NDI, each entity will be measured against these metrics: DSI.OE.02, OD.OE.01,
MCM.OE.01, MCM.OE.02, DO.OE.02, and DO.OE.03. Please refer to “The Operational
Excellence (OE) Document” for further details.

8.3.1. Data Sharing and Interoperability

Operational Excellence Metrics

Metric ID Metrics Platforms

- DSI.OE.01 - Percentage of attributes shared on the Government Service Bus but not produced by the agency. - Government Service Bus (GSB); National Data Catalog (NDC)

- DSI.OE.02 - Percentage of systems integrated with National Data Lake. - National Data Lake (NDL)

- DSI.OE.03 - Time taken to process data sharing agreements. - Data Marketplace (DMP)

8.3.2. Open Data

Operational Excellence Metrics

Metric ID Metrics Platforms

- OD.OE.01 - Percentage of datasets published in the Open Data Platform. - Open Data Platform (ODP)

- OD.OE.02 - Delay / Lag in refreshing open datasets. - Open Data Platform (ODP)

- OD.OE.03 - Number of reported issues for the published datasets. - Open Data Platform (ODP)

- OD.OE.04 - Delay in resolving reported issues on published datasets. - Open Data Platform (ODP)

8.3.3. Data Catalog and Metadata (MCM)

Operational Excellence Metrics

Metric ID Metrics Platforms

- MCM.OE.01 - Percentage of systems catalogued in the National Data Catalog. - National Data Catalog (NDC)

- MCM.OE.02 - Percentage of business attributes defined and linked in the National Data Catalog. - National Data Catalog (NDC)

- MCM.OE.03 - Percentage of reporting assets defined in the National Data Catalog. - National Data Catalog (NDC)

- MCM.OE.04 - Percentage of business attributes linked to attribute class standards in the National Data Catalog. - National Data Catalog (NDC)

- MCM.OE.05 - Accuracy percentage of business attribute relationships in the National Data Catalog. - National Data Catalog (NDC)

8.3.4. Reference and Master Data Management

Operational Excellence Metrics

Metric ID Metrics Platforms

- RMD.OE.01 - Percentage of published reference entities. - Reference Data Management Platform (RDP); Government Service Bus (GSB)

- RMD.OE.02 - Time taken to publish new reference entities. - Reference Data Management Platform (RDP); Government Service Bus (GSB)

- RMD.OE.03 - Time taken to fix reported issues in reference entities. - Reference Data Management Platform (RDP); Government Service Bus (GSB)

8.3.5. Data Quality

Operational Excellence Metrics

Metric ID Metrics Platforms

- DQ.OE.01 - Percentage of Data Quality index in the Government Service bus. - Government Service Bus (GSB)

- DQ.OE.02 - Percentage of Data Quality index in National Data Lake. - National Data Lake (NDL)

8.3.6. Data Operations

Operational Excellence Metrics

Metric ID Metrics Platforms

- DO.OE.01 - Percentage of delay in response time of the Government Service Bus APIs. - Government Service Bus (GSB)

- DO.OE.02 - Percentage of failed API calls on the Government Service Bus. - Government Service Bus (GSB)

- DO.OE.03 - Percentage of operational issues from agencies encountered by the National Data Lake. - National Data Lake (NDL)
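Most of the Operational Excellence metrics above are simple percentages (an in-scope count over a total, e.g. systems integrated with the National Data Lake over all systems for DSI.OE.02). The helper below is a hedged sketch of such a computation; the function name and the example counts are assumptions.

```python
# Generic percentage-style OE metric, e.g. DSI.OE.02 ("percentage of systems
# integrated with the National Data Lake"); inputs here are invented examples.
def percentage_metric(in_scope: int, total: int) -> float:
    """Return a percentage, guarding against an empty denominator."""
    if total == 0:
        return 0.0
    return round(in_scope / total * 100, 2)

print(percentage_metric(18, 24))  # 75.0
```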

