
Data-driven verification

The DNV-CG-0557 guideline outlines the framework for data-driven verification (DDV) methods in maritime and offshore industries, emphasizing the use of digital technologies to enhance verification processes. It details various verification methods, including algorithm-based verification and self-verifying systems, and sets requirements for their application in ensuring compliance and safety. The document also addresses the implications of these technologies on traditional verification practices and aims to support the development of more efficient and less intrusive verification systems.


CLASS GUIDELINE

DNV-CG-0557 Edition August 2021

Data-driven verification

The content of this service document is the subject of intellectual property rights reserved by DNV AS (“DNV”). The user
accepts that it is prohibited by anyone else but DNV and/or its licensees to offer and/or perform classification, certification
and/or verification services, including the issuance of certificates and/or declarations of conformity, wholly or partly, on the
basis of and/or pursuant to this document whether free of charge or chargeable, without DNV’s prior written consent. DNV
is not responsible for the consequences arising from any use of this document by others.

The PDF electronic version of this document available at the DNV website dnv.com is the official version. If there
are any inconsistencies between the PDF version and any other available version, the PDF version shall prevail.

DNV AS
FOREWORD

DNV class guidelines contain methods, technical requirements, principles and acceptance criteria
related to classed objects as referred to from the rules.

© DNV AS August 2021

Any comments may be sent by e-mail to rules@dnv.com

This service document has been prepared based on available knowledge, technology and/or information at the time of issuance of this
document. The use of this document by other parties than DNV is at the user's sole risk. Unless otherwise stated in an applicable contract,
or following from mandatory law, the liability of DNV AS, its parent companies and subsidiaries as well as their officers, directors and
employees (“DNV”) for proved loss or damage arising from or in connection with any act or omission of DNV, whether in contract or in tort
(including negligence), shall be limited to direct losses and under any circumstance be limited to 300,000 USD.
Changes - current
This document supersedes the November 2020 edition of DNVGL-CG-0557.
The numbering and/or title of items containing changes is highlighted in red.

Changes August 2021

Topic Reference Description

Rebranding to DNV All This document has been revised due to the rebranding of DNV
GL to DNV. The following have been updated: the company
name, material and certificate designations, and references to
other documents in the DNV portfolio. Some of the documents
referred to may not yet have been rebranded. If so, please see
the relevant DNV GL document. No technical content has been
changed.

Editorial corrections
In addition to the above stated changes, editorial corrections may have been made.

Class guideline — DNV-CG-0557. Edition August 2021 Page 3


Data-driven verification

DNV AS
Contents
Changes – current.................................................................................................. 3

Section 1 General.................................................................................................... 5
1 Introduction.........................................................................................5

Section 2 General requirements............................................................................ 16


1 Requirements for all DDV methods....................................................16

Section 3 Qualifier requirements.......................................................................... 30


1 Specific requirements for SVS, BITE, DSA, AVA and DT.....30

Section 4 Requirements for fleet in service (FIS)................................................. 38


1 General.............................................................................................. 38

Appendix A Explanatory information.....................................................................41


1 Building evidence-based confidence in systems................................ 41

Appendix B Application specific requirements...................................................... 43


1 Introduction.......................................................................................43
2 Dynamic positioning (DP) systems.................................................... 43

Changes – historic................................................................................................ 52

SECTION 1 GENERAL

1 Introduction

1.1 Background
Traditionally, verification of equipment and systems on board vessels has been carried out by deploying
surveyors to perform onboard verification. Today, developments in digital technology and increased
vessel-to-shore connectivity make it possible to leverage data and data centricity to enhance the
conventional verification and validation efforts undertaken by industry stakeholders. When used correctly,
these technologies are expected to support a survey scheme that is more flexible and less intrusive than
methods that have traditionally relied on deploying personnel to the vessel.
It is anticipated that the new technology can be used to develop effective verification and test functionality
which can be incorporated as integral parts of the systems subject to verification. When such functions
have proven to provide genuine and trustworthy evidence, the Society may use this evidence in the
assessment towards the specified acceptance criteria, and base issuance, maintenance, and renewal of
class certificates and notations on methods utilizing these functions. The functionality is often algorithm
driven, and possibly based upon machine learning (ML). With the acceptance of DDV methods, assessment
of specified acceptance criteria may be based upon such evidence. Some parts of the assessment itself may
even become autonomous through the use of algorithms, further increasing the importance of the rigour
and intensity with which these functions must be verified. Such functions are expected to be sophisticated
software intensive systems, posing the same challenges in verification as any other complex software
intensive system. Examples of such challenges may include potential lack of transparency, difficulties in
establishing expected results, and built-in cognitive and societal biases that may decrease the level of
objectivity in the generated evidence. In App.A of this document, additional information on data-driven
verification philosophy is provided. The appendix contains a discussion on building evidence-based confidence
in systems.
Guidance note:
This class guideline is developed in order to support industry development of data-driven verification (DDV) systems and methods.
Many existing maritime and offshore target systems may not be prepared for implementation of DDV methods, and suitable
methods may not yet have been developed. It is recommended to involve the Society in the development of DDV methods intended to be used for
retention of class certificates.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.2 Objective
This guideline is developed in order to support industry development of data-driven verification systems and
methods. A wide range of methods are expected to emerge from the vast development within digitalization,
data analytics and data connectivity. The guideline builds on evolution of current industry practice and seeks
to use established and emerging technologies to address the limitations and disadvantages of traditional
methods. These methods may be accepted as long as they provide the same, or higher, level of assurance as
the traditional methods.
The objective builds upon the principles of:
— ensuring that systems are designed to be verified
— incorporating verify-on-demand capabilities.

The objective also aims at reducing exposure to non-productive time associated with taking vessels out of
service for trials, reducing costs associated with surveyors from various organizations attending the vessel,
reducing redundant verification activities, and reducing the cognitive burden imposed upon vessel crews
carrying out testing. DDV functions are also assumed to be important parts of the anticipated development of
more autonomous and remotely operated vessels.

1.3 Scope
The document provides guidelines for accepting data driven verification methods based on emerging digital
technologies.
The scope includes setting requirements to:
— built in test equipment and self-verifying systems, e.g. for:
— performance verification of active functions
— testing of on demand functions, e.g. protection functions
— testing of the availability of redundant functions
— digital survey applications, i.e. digital tools which can be managed by the crew in order to gather tamper-
free system/vessel data
— advanced data analytics and algorithm based verification functions
— gathering, treatment and delivery of collected data to ensure the truthfulness, correctness and quality of
data use for class assessment.
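The built-in test concept above can be illustrated with a minimal sketch of an on-demand self-test that exercises a protection function and records the outcome as structured evidence. All names here (`BuiltInTest`, `read_sensor`, the threshold check) are hypothetical illustrations; the guideline does not prescribe any particular implementation:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class TestResult:
    """One evidence item produced by a built-in test (BITE) run."""
    test_id: str
    timestamp: float
    passed: bool
    readings: dict

class BuiltInTest:
    """Minimal on-demand self-test of a protection function.

    `read_sensor` and `trip_threshold` stand in for the real target-system
    interface; both are hypothetical placeholders.
    """
    def __init__(self, read_sensor, trip_threshold: float):
        self.read_sensor = read_sensor
        self.trip_threshold = trip_threshold

    def run(self, test_id: str) -> TestResult:
        # Sample the monitored quantity and compare against the trip limit.
        value = self.read_sensor()
        passed = value < self.trip_threshold
        return TestResult(test_id, time.time(), passed, {"sensor": value})

# Example: an operator-requested test of a (simulated) overload protection
bite = BuiltInTest(read_sensor=lambda: 0.82, trip_threshold=1.0)
result = bite.run("overload-protection-001")
print(json.dumps(asdict(result), indent=2))
```

In a real system the result would be transferred over the documented digital interface rather than printed, and the sensor reading would come from the target system itself.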

1.4 Application
This class guideline is intended for vessels provided with arrangements enabling verification by use
of data-driven verification methods, including vessels with a DDV (data-driven verification) class notation as
specified in DNV-RU-SHIP Pt.6 Ch.11 Sec.2.

1.5 Class notations


The DDV class notations and appurtenant qualifiers are specified in DNV-RU-SHIP Pt.6 Ch.11 Sec.2. A DDV
class notation designates that a data-driven verification method is accepted by the Society. Qualifiers will
identify the type of verification method applied.
Qualifiers described in this document are given in Table 1.

Table 1 Verification method [VM] signifying the method of verification

Qualifier for verification method [VMn] Description

AVA Algorithm based verification agent and advanced data analytics: this qualifier designates data-driven
verification functions based on an automatic algorithm with a dedicated role, purpose, function, and
responsibility (i.e. agent) in conducting verification (i.e. generating evidence) by interacting with the
target system through an API or other digital interface using documented methods and data structures.
The AVA shall also support delivery of a tamper-free data-based body of evidence which can be used for
classification purposes, to the Society. The AVA acts as a 'tester' and generates evidence and possibly
also makes decisions and conclusions (e.g. go/no-go).


BITE Verification based on built in test functionality: this qualifier designates data-driven verification functions
(hardware and software) which are built in to, or connected to, the target system with the purpose of
generating verification data, automatically or upon operator request, and to deliver a tamper-free data-
based body of evidence which can be used for classification purposes, to the Society. It is accepted that
the target system may need to be taken out of operation and put into test mode for the verification
activity to be performed. The target system/BITE shall have the capability to transfer the generated BITE
verification data over an API or other digital interface using documented methods and data structures.

DSA Verification based on digital survey applications: this qualifier designates data driven verification functions
utilizing digital test tools where the complete verification scope, or specified parts of it, is incorporated
and managed by the crew or other dedicated test personnel. The tool incorporates test procedures to be
performed for generation and gathering of onboard system(s) generated data and supports delivery of a
tamper-free data-based body of evidence which can be used for classification purposes, to the Society.
The target system may need to be taken out of operation and put into test mode for the verification
activity to be performed. The interaction between DSA and the target system, and the generated target
system/DSA verification data shall be carried over an API or other digital interface using documented
methods and data structures.

DT Verification based on digital twins: this qualifier designates verification functions where verification of a
target system is based on testing or simulations performed by application of a digital twin.

SVS Verification based on self-verifying systems: this qualifier designates data driven verification functions
where the verification function is an integral part of the target system functionality (hardware and
software). I.e. the system shall continuously or at specified intervals, during normal operation,
automatically provide tamper-free data which can be used as body of evidence for classification purposes,
to the Society. The delivered data shall in general provide/indicate conclusions on the acceptance criteria.
The target system/SVS function shall have the capability to transfer the generated SVS verification data
over an API or other digital interface using documented methods and data structures.
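Every method in Table 1 is required to deliver a tamper-free, data-based body of evidence. One common way to make later tampering detectable is a hash-chained evidence log, sketched below. This is an illustrative technique under assumed record shapes, not a mechanism required by the guideline:

```python
import hashlib
import json

def append_evidence(chain: list, record: dict) -> list:
    """Append a record to a hash-chained evidence log.

    Each entry stores the SHA-256 hash of the previous entry combined with
    its own payload, so modifying any earlier record breaks every later
    link and is detectable.
    """
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    chain.append({"record": record, "prev": prev_hash, "hash": entry_hash})
    return chain

def verify_chain(chain: list) -> bool:
    """Recompute every link; returns False if any record was altered."""
    prev = "0" * 64
    for entry in chain:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

log = []
append_evidence(log, {"test": "thruster-failover", "result": "pass"})
append_evidence(log, {"test": "ups-capacity", "result": "pass"})
print(verify_chain(log))          # True on an unmodified chain
log[0]["record"]["result"] = "fail"
print(verify_chain(log))          # tampering is detected: False
```

A production scheme would additionally sign entries or anchor the chain externally, since a hash chain alone does not prevent wholesale regeneration of the log.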

The different verification methods may be applied for verification of a variety of target system (TS) and class
notation requirements. The class rules will therefore also indicate which systems the different DDV methods
are accepted for by use of target system specification qualifiers. Typical examples of qualifiers are shown in
Table 2.

Table 2 Target systems (TS) for DDV

Qualifier for target system (TS) Description Rule reference

DP Dynamic positioning system DNV-RU-SHIP Pt.6 Ch.3 Sec.1 and DNV-RU-SHIP Pt.6 Ch.3 Sec.2

TAM Thruster assisted mooring control system DNV-OS-E301

PMS Main electric power control and monitoring system DNV-RU-SHIP Pt.4 Ch.8

SPT Steering, propulsion and thruster control and DNV-RU-SHIP Pt.4 Ch.5 and
monitoring system DNV-RU-SHIP Pt.4 Ch.10

ICS Integrated control and monitoring system DNV-RU-SHIP Pt.4 Ch.9

DRILL Drilling and well control and monitoring system DNV-OS-E101

BOP Blowout prevention control arrangement DNV-OS-E101


Crane Crane control and monitoring system DNV-RU-SHIP Pt.5 Ch.10 Sec.2, DNV-
ST-0377 and DNV-ST-0378

NAUT Nautical systems DNV-RU-SHIP Pt.6 Ch.3 Sec.3

For each system in Table 2, references to the relevant rules and/or offshore standards (OS) are listed. The rules or
standards identified provide more specific requirements for the target systems, which are normally
considered mandatory. DNV-RU-SHIP Pt.4 Ch.9 and DNV-OS-D202 provide generic common requirements for,
and are applicable to, all systems listed above.
The resulting class notation syntax will then be: DDV(TS1[VM1], … TSn[VMn]).
Guidance note 1:
Example notation, when applying DDV method DSA for DP (dynamic positioning system) and NAUT (nautical system): ✠ 1A
Tanker for oil CSR E0 Bow loading DYNPOS(AUTR) DDV(DP[DSA], NAUT[DSA]).

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---
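For tooling that handles notation strings, the syntax above is simple to validate mechanically. The sketch below parses a DDV notation against the qualifier sets from Table 1 and Table 2; the parser itself is an illustrative assumption, not part of the rules:

```python
import re

# Qualifier sets taken from Table 1 (verification methods) and Table 2
# (target systems) of this guideline
VERIFICATION_METHODS = {"AVA", "BITE", "DSA", "DT", "SVS"}
TARGET_SYSTEMS = {"DP", "TAM", "PMS", "SPT", "ICS", "DRILL", "BOP", "Crane", "NAUT"}

def parse_ddv_notation(notation: str) -> list:
    """Parse a DDV class notation such as 'DDV(DP[DSA], NAUT[DSA])'.

    Returns a list of (target_system, verification_method) pairs, or
    raises ValueError if the syntax or a qualifier is not recognised.
    """
    m = re.fullmatch(r"DDV\((.+)\)", notation.strip())
    if not m:
        raise ValueError(f"not a DDV notation: {notation!r}")
    pairs = []
    for item in m.group(1).split(","):
        pm = re.fullmatch(r"(\w+)\[(\w+)\]", item.strip())
        if not pm:
            raise ValueError(f"malformed qualifier: {item!r}")
        ts, vm = pm.groups()
        if ts not in TARGET_SYSTEMS or vm not in VERIFICATION_METHODS:
            raise ValueError(f"unknown qualifier: {item!r}")
        pairs.append((ts, vm))
    return pairs

print(parse_ddv_notation("DDV(DP[DSA], NAUT[DSA])"))
# [('DP', 'DSA'), ('NAUT', 'DSA')]
```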

Guidance note 2:
Requirements for additional class notation DDV are found in DNV-RU-SHIP Pt.6 Ch.11 and may incorporate additional and
somewhat different requirements than those stipulated in this class guideline. Hence, the referenced rules will be the governing
document when applying a DDV class notation.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.6 Structure
The document is structured in the following sections:
— Sec.1: provides information on intention, objectives, and overall document structure.
— Sec.2: gives general requirements applicable for all verification methods, valid for use both on new
buildings and on vessels in service. This section also includes documentation requirements.
— Sec.3: provides requirements for verification by use of self-verifying systems (SVS), built in test
functionality (BITE), digital survey applications (DSA), data analytics and algorithm-based verification
agents (AVA), and digital twins (DT).
— Sec.4: gives general requirements for verification of vessels in service by use of methods accepted per
this document. This includes:
— requirements on how to do verification of alterations to the target system, verification methods/
functions and equipment
— survey of the DDV system
— requirements to procedures and personnel requirements.
— App.A: proposes a nomenclature and a model to support discussions on (and bring attention to) some
important aspects related to evidence and evidence generation properties when using DDV methods to
build evidence-based confidence in systems.
— App.B: covers additional requirements to specific application areas. In this revision the following specific
applications are included:
— dynamic positioning systems.

1.7 References
Table 3 lists DNV references used in this document.

Table 3 DNV references

Document code Title

DNV-CG-0264 Autonomous and remotely operated ships

DNV-CG-0508 Smart vessel

DNV-CG-0564 Data collection infrastructures

DNV-CP-0203 Electronic and programmable equipment and systems

DNV-CP-0507 System and software engineering

DNV-RP-0497 Data quality assessment framework

DNV-RP-0496 Cyber security resilience management for ships and offshore units in operations

DNV-RP-A204 Qualification and assurance of digital twins

DNV-OS-D202 Automation, safety and telecommunication systems

DNV-RP-D101 Structural analysis of piping systems

DNV-RP-E306 Dynamic positioning vessel design philosophy guidelines

DNV-RP-E307 Dynamic positioning systems - operation guidance

DNV-RP-0510 Framework for assurance of data-driven algorithms and models

DNV-RP-0513 Assurance of simulation models

DNV-RU-SHIP Pt.4 Ch.9 Control and monitoring systems

DNV-RU-SHIP Pt.6 Ch.3 Sec.1 Dynamic positioning systems

DNV-RU-SHIP Pt.6 Ch.3 Sec.2 Dynamic positioning systems with enhanced reliability

DNV-RU-SHIP Pt.6 Ch.5 Sec.21 Class notation cyber secure

DNV-RU-SHIP Pt.6 Ch.5 Sec.24 Smart vessel - Smart

DNV-RU-SHIP Pt.6 Ch.11 Digital features

Table 4 lists external references used in this document.

Table 4 External references

Document code/URL Title

IMO MSC/Circ.645 Guidelines for vessels with dynamic positioning systems

IMO MSC.1/Circ.1580 Guidelines for vessels and units with dynamic positioning (DP) systems

ISO 5725-1 Accuracy (trueness and precision) of measurement methods and results — Part
1: General principles and definitions

ISO 8000-2 Data quality - Part 2: Vocabulary

ISO 8000-8 Data quality - Part 8: Information and data quality: Concepts and measuring

ISO 13379-1 Condition monitoring and diagnostics of machines - Data interpretation and
diagnostics techniques - Part 1: General guidelines

ISO 13381-1 Condition monitoring and diagnostics of machines - Prognostics - Part 1: General
guidelines

ISO 14224 Petroleum, petrochemical and natural gas industries — Collection and exchange
of reliability and maintenance data for equipment (see also API Std. 689)

ISO 18436 series Condition monitoring and diagnostics of machines - Requirements for
qualification and assessment of personnel

ISO 19848 Ships and marine technology - Standard data for shipboard machinery and
equipment

ISO/IEC 27000 Information technology - Security techniques - Information security


management systems - Overview and vocabulary

1.8 Definitions and abbreviations


1.8.1 Definition of verbal forms
The verbal forms in Table 5 are used in this document.

Table 5 Definition of verbal forms

Term Definition

shall verbal form used to indicate requirements strictly to be followed in order to conform to the
document

should verbal form used to indicate that among several possibilities one is recommended as particularly
suitable, without mentioning or excluding others

may verbal form used to indicate a course of action permissible within the limits of the document

1.8.2 Definitions of terms


The terms defined in Table 6 are used in this document.

Table 6 Definitions and abbreviations

Terms Description

Algorithm based verification agent An algorithm with a dedicated role, purpose, function, and responsibility (i.e.
(AVA) agent) in conducting verification (i.e. generating evidence) by interacting with
the target system through an interface
Guidance note:
Such algorithms act as "testers" - they generate evidence and possibly make
decisions and conclusions (e.g. go/no-go).

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

Assurance Grounds for justified confidence that a claim (requirement) has been achieved

Assessment Validation of evidence properties such as correctness, relevance, capability,


quality, integrity

Built-in test functionality (BITE) Data-driven verification functions (hardware and software) which are built into,
or connected to, the target system with the purpose of generating verification
data, automatically or upon operator request, and to deliver a tamper-free data-
based body of evidence that can be assessed by a verifier towards acceptance
criteria

Body of evidence The compiled evidence used in the assessment supporting that the system
complies with the requirements

Cyber physical systems Systems which combine computing, networking and physical processes, with
feedback loops where physical processes affect computations and vice versa

Cyber security Practices, tools and concepts that protect:


— the operational technology against the unintended consequences of a cyber
incident
— information and communications systems and the information contained
therein from damage, unauthorised use or modification, or exploitation
— against interception of information when communicating and using internet

D-class Short for data-driven classification. DNV initiative for acceptance of data-driven
verification as basis for classification services

Data Symbolic representation of something that depends, in part, on its metadata for
its meaning, see ISO 8000-2

Data accuracy Composite of trueness and precision, see ISO 5725-1

Data analytics The process of examining data in order to draw conclusions about the
information they contain

Data completeness Quality of having all data that existed in the possession of the sender at the time
the data message was created, see ISO 8000-2. Alternatively: completeness of data is the
extent to which (i) the relevant data sets, (ii) the expected records of a data
set, and (iii) data elements, attributes, and values in a data set are provided
and reflect the scope and the real world

Data driven model Computation model that is created by applying a suitable training algorithm to a
set of relevant data

Data driven verification (DDV) Verification based on and documented by collection of system generated data
(e.g. gathered during operations, or produced by applying purpose made stimuli
to the system, typically by simulation of specified failure conditions)

Data infrastructure Data infrastructure is used for data creation, consumption and sharing. Data
infrastructure is the sum of functions, methods, mechanisms and tools that
ensure interoperability, transparency and trust in data value chains across
composite hardware, software, network resources and services.

Data precision The closeness of agreement between independent test results obtained under
stipulated conditions, see ISO 5725-1

Data set Logically meaningful grouping of data, see ISO 8000-2

Data trueness The closeness of agreement between the average value obtained from a large
series of test results and an accepted reference value, see ISO 5725-1

Data quality management Coordinated activities to direct and control an organization with regards to data
quality, see ISO 8000-2

Deficiency A failing or shortcoming with respect to applicable requirements

Digital survey application (DSA) A digital tool where the complete verification scope, or specified parts of it, is
incorporated and managed. The tool incorporates methods to gather data which
can be used as a body of evidence for verification by a third party

Digital twin (DT) Virtual representation of a system or physical asset, that makes system
information available or calculates performance through integrated models and
data, with the purpose of providing decision support

DP verification program A system encompassing all DP verification activities to be part of the class
retention scheme. The system shall organize and keep track of all DP
verification activities, including when, what and how to test and keep an historic
record of all results. The system may be built up of smaller blocks/methods
but shall be organized so that it keeps track of the status of the verification
activities for the complete DP system
Guidance note:
A more traditional DP FMEA and FMEA test program may be the core of such a
program, utilizing new verification functionality/methods. It is anticipated that a
digital DP verification program will be more effective compared to the traditional
hard copy FMEA analysis, test program and report format.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

Evidence Information elements, such as test results, used in the assessment supporting
that the system complies with the requirements

Evidence property A capacity or quality that can be attributed to a piece, or body of evidence

Evidence (type) capability The ability of a particular type of evidence to show certain aspects about the
target system

Evidence (instance) quality The degree to which an evidence instance is capable of fulfilling its (theoretical)
potential, e.g. an evidence instance may not be repeatable, or even incorrect
because of deficiencies in the verification process

Evidence (instance) validity To what degree the instance of a particular evidence represents the condition of
the actual target system. A typical cause of invalidating evidence is to generate
evidence from an obsolete version of the target system, another cause may be
that the data on which the verification is based upon has been tampered with

Evidence (instance) trustworthiness The degree of trust that can be placed on a particular instance of evidence that
it is actually showing the true target system state. Trustworthiness is strongly
correlated with the objectivity in the process of generating the evidence.
Objectivity may be challenged by cognitive and societal biases in the verification
process (see verification detachment), and enhanced by increased level of
rigour in the verification process (see verification rigour)

(Body of) Evidence type completeness The degree to which the capability of the different types of evidence fulfills the
required verification scope

(Body of) Evidence instance coverage The degree to which a particular instance of a body of evidence covers the
potential body of evidence type capability (see verification intensity). E.g.
reviewing documents produces a certain type of evidence, but the number
of reviewed documents, and the number of assessed aspects within each
document may vary (e.g. correctness, consistency, validity, relevance etc.),
which will affect the evidence instance coverage

(Body of) Evidence instance depth The number of system conditions, including their combinations, covered by the
evidence instance. A system's condition can be certain input values, system
operational modes, or system states, etc. and combinations of those (see
verification rigour)

Fidelity The degree to which a model or simulation reproduces the state and behaviour
of a real-world object, feature or condition. Fidelity is therefore a measure of
the realism of a model or simulation

Hybrid model Computation model which consists of a combination of both data-driven and
simulation models

Metadata Data that describes and defines other data, see ISO 8000-2

MSC/Circ. Marine Safety Committee Circular

Remote access Use of systems that are inside the perimeter of a security zone being addressed
from a different geographical location with the same/similar rights as when
physically present at the location

Remote survey Survey performed without the Society attending at the actual test location, e.g.
ship or manufacturer's premises

Remote witnessing Testing or other verification activity performed while being witnessed
remotely in real time, via live video and two-way sound communication, by an
inspector

Self-verifying system (SVS) A system which has built-in functionality which can automatically provide
evidence that a specified set of requirements is complied with

Simulation model A digital representation of a part of a process or physical asset which is used
to conduct experiments. The purpose of the simulation experiments is to
understand the behaviour of the system or evaluate strategies for the operation
of the system

Station keeping Simultaneous position and heading keeping

Tamper-free data Genuine system generated data, gathered either from normal system operations
or generated by application of purpose made verification activities/functions,
e.g. testing

Target system (TS) System subject for assurance, under test, or consideration
Guidance note:
The target system, or parts of the target system, can be represented by a digital
twin under the DT qualifier regime.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

Test item validity period The time period for which the result of a test is valid

Test operator The person in charge of operating the test equipment and/or application

Unwitnessed testing Testing performed by crew/manufacturer or other owner/manufacturer
representative without the Society being present and without being remote
witnessed by an inspector

Validation The process of providing evidence that the system, software, or hardware and
its associated products satisfy requirements allocated to it at the end of each
life cycle activity, solve the right problem, and satisfy intended use and user
needs

Class guideline — DNV-CG-0557. Edition August 2021 Page 13


Data-driven verification

DNV AS
Verification The process of providing objective evidence that the system, software, or
hardware and its associated products conform to requirements

Verification activity responsible The party responsible for carrying out the verification and delivering the
verification information to the Society

Verification detachment The verification organization's (VO, see verification roles) level of independence
from the system developer with respect to technical, organizational, and
financial aspects

Verification intensity The size of the verification scope across normal and abnormal system conditions
and system artefacts targeted by the verification effort

Verification method (VM) The method used to gather information (e.g. data) which is used for the
assessment of the system against stated or implied requirements

Verification method design responsible The party responsible for design of the verification method and/or functionality

Verification rigour The degree of formalism with respect to techniques (possibly quantitative) and
documentation used in the verification effort

Verification roles May be divided into verification organization (VO), assessor, and test witness.
Each role is characterised by its tasks, responsibility, intention, motivation,
required competence, attachment to the target system development etc.

Verification system The system implemented (hardware and software) to obtain the verification
body of evidence

1.8.3 Abbreviations
The abbreviations described in Table 7 are used in this document.

Table 7 Abbreviations

Abbreviations Description

DP Dynamic positioning

DPS DNV dynamic positioning system class notations in line with minimum
requirements in IMO Guidelines for vessels with dynamic positioning systems

DPVAD DP verification and acceptance document. See IMO MSC.1/Circ.1580. See also
FSVAD

DYNPOS DNV dynamic positioning system class notations with additional requirements to
achieve higher availability and robustness

FMEA Failure mode and effect analysis

FSVAD Flag state verification and acceptance document. See IMO MSC/Circ.645. See
also DPVAD

GA General arrangement

INS Inertial navigation system

IT Information technology - the application of computers to store, study, retrieve,
transmit, and manipulate data or information, often in the context of a business
or other enterprise

OS Offshore standard

OT Operational technology - the hardware and software dedicated to detecting or
causing changes in physical processes through direct monitoring and/or control
of physical devices such as valves, pumps, etc.

PMS Power management system

PRS Position reference system

RP Recommended practice

SWB Switchboard

UPS Uninterruptible power supply

VFD Variable frequency drive

VT Voltage transformer

WCFDI Worst case failure design intent

SECTION 2 GENERAL REQUIREMENTS

1 Requirements for all DDV methods

1.1 General
1.1.1 This class guideline states requirements which may, upon agreement with the Society, be applied in
the system and component certification and classification processes for application of data-driven verification
(DDV) systems/methods on marine and offshore vessels.
Guidance note:
Systems to be verified by the DDV methods described in this document should be designed based on 'build to test' and 'test on
demand capability' philosophies in order to be able to ensure that they are healthy to operate when needed.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.2 For verification of the DDV target systems on new buildings and other first-time verification of DDV
target systems, the basis shall still be traditional Society verification methods. For such activities the use of
DDV methods shall be specifically agreed if these are to replace the traditional verification methods.

1.1.3 When a DDV notation is applied to a new building the yard responsibility will in general be restricted
to documenting compliance for the equipment to be installed onboard the vessel. For any potential non-yard
furnished (e.g. onshore) equipment which is part of the DDV solution, it will in general be the responsibility of
the owner to document compliance, unless otherwise specifically agreed.

1.1.4 Evidence generated by verification methods complying with this standard may be accepted as
complementary to, or as replacing, evidence generated from traditional verification activities. In general, the
verification coverage and depth provided by the DDV method should cover as much as possible of the
required scope. Additionally, the DDV may also provide verification which can give increased confidence.
In case the coverage is found to be too limited compared to the required verification coverage, the Society
may not be able to assign a DDV notation.
Guidance note:
The verification method/function and target system(s) are specified in the DDV specification document, see Table 2. In case the
DDV method needs to be supported by traditional methods in order to gain sufficient/required verification coverage and/or depth
the verification items to be covered by other methods are also specified in the DDV specification document.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.5 The assurance level provided by any new method, potentially in combination with traditional methods,
shall as a minimum be at the same level as, or better than, the traditional methods. This is
applicable for verification coverage, depth and scope. The verification scope management process should be
risk based and shall, as a minimum, ensure equivalent safety levels when compared to the existing methods.
Guidance note:
Note that this does not imply that the new scope should cover all elements in a traditional scope, but that the approved method(s)
and scope must provide the same level of assurance in relation to the essential attributes:

— operational performance (normal and degraded modes as found applicable)
— protective functions
— failure and degradation detection.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.6 Test methods shall incorporate functions or means of ensuring that the sensor and measuring
equipment are calibrated and have sufficient accuracy and capacity to perform their functions.
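An illustrative sketch of such a pre-test check (all field names, intervals and values below are hypothetical, not taken from this guideline) could be:

```python
from datetime import date

def calibration_ok(sensor, test_date, required_accuracy):
    """Check that a sensor's calibration is still within its interval and that
    its rated accuracy is sufficient for the intended measurement.

    `sensor` is a dict with illustrative fields: 'calibrated_on' (date),
    'calibration_interval_days' (int), 'rated_accuracy' (float, same unit
    as required_accuracy; smaller is better).
    """
    days_since = (test_date - sensor["calibrated_on"]).days
    within_interval = days_since <= sensor["calibration_interval_days"]
    accurate_enough = sensor["rated_accuracy"] <= required_accuracy
    return within_interval and accurate_enough

# Example: a sensor calibrated 100 days ago with a 365-day interval
sensor = {"calibrated_on": date(2021, 1, 1),
          "calibration_interval_days": 365,
          "rated_accuracy": 0.5}
print(calibration_ok(sensor, date(2021, 4, 11), required_accuracy=1.0))  # True
```

A DDV system would run such a check automatically before accepting test data, rejecting evidence gathered with stale or insufficiently accurate instrumentation.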

1.1.7 The hardware and software verification functionality shall comply with relevant parts of the main class
rules for instrumentation and control systems, and electrical systems.
Guidance note:
The most relevant references in this respect are DNV-RU-SHIP Pt.4 Ch.8 Electrical installations and DNV-RU-SHIP Pt.4 Ch.9 Control
and monitoring systems.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.8 Functions for performing normal system control and safety/protective actions shall have priority over
verification and test functions.

1.1.9 The system under test (target system) shall be tested in order to provide evidence of acceptable
functionality (typically during normal, abnormal and degraded conditions) as per stated or implied
requirements.

1.1.10 The verification system shall also comply with the requirements of other class notations assigned
to the vessel, e.g. redundant dynamic positioning class notations, setting requirements to robustness and
response to/effect of failures.

1.1.11 For any method to be accepted, the method shall be able to provide relevant, genuine and trustworthy
results for the Society to be able to objectively assess whether the specified acceptance criteria are met, i.e.
the data shall be able to reveal the actual target system status. For verification of DTs, data may be related
to a vessel or target system model when this is intended. This implies that the body of evidence shall be
delivered in such a way that the Society can assess that the test(s) have been executed in the right manner
and that the evidence is complete and genuine.
Guidance note:
The term complete also includes unsuccessful tests in case several test attempts are needed to produce a successful result.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.12 Test methods shall enable class to perform objective independent verification. This includes ensuring
that the test is applied in the right manner to the correct equipment/system, when in the correct mode, and
enabling the ability to verify that the result is in accordance with all relevant acceptance criteria.
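One common technique for making delivered evidence tamper-evident is a keyed hash over each record. The sketch below (hypothetical record fields; key provisioning is deliberately simplified and would in practice follow the agreed cyber security arrangement) shows the principle:

```python
import hmac
import hashlib
import json

SECRET = b"shared-verification-key"  # illustrative only; real key handling must be secured

def sign_evidence(record: dict) -> dict:
    """Attach an HMAC tag so that modification of the record can be detected."""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return {"record": record, "hmac": tag}

def verify_evidence(signed: dict) -> bool:
    """Recompute the tag and compare in constant time."""
    payload = json.dumps(signed["record"], sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signed["hmac"])

# All attempts are kept, including an unsuccessful first run
signed = sign_evidence({"test_id": "GEN-START-01", "attempt": 1,
                        "result": "fail", "mode": "standby"})
print(verify_evidence(signed))          # True
signed["record"]["result"] = "pass"     # tampering with the result...
print(verify_evidence(signed))          # False: ...is detected
```

Signing each attempt, successful or not, supports the completeness requirement above: a delivery that omits failed attempts or alters outcomes no longer verifies.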

Figure 1 Verification process

1.1.13 The results from the verification activities shall be delivered in such a format and in such details
that the Society can verify that relevant acceptance criteria have been met. The format and content of the
verification results to be delivered to the Society shall be agreed upon as part of the approval process.

1.1.14 The vessel shall be able to ensure a reliable and secure vessel to shore data relay with sufficient
capacity to support the intended function.
Guidance note:
See DNV-RU-SHIP Pt.6 Ch.11 Sec.1 [2.6].

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.15 The data delivered (i.e. body of evidence) shall be agreed in each case and be of such a format, scope
and quality that the Society can perform effective objective assessments and evaluations of the system
subject to verification, towards stated or implied requirements. In general, the Society shall also have access
to the requested sensor data produced from the testing, i.e. the original test data from the vessel.
Guidance note:
Details to be agreed during approval will include what information the delivery shall contain, the format, and how to inform the
Society that the results are ready to be assessed by the Society. The DDV method can only be accepted as basis for the intended
classification service provided that sufficient data delivery can be submitted.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

Guidance note:
Guidance on assurance of data quality can be found in DNV-RP-0497. See also [1.1.26].

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.16 Acceptance criteria shall in general be established for all data used as a basis for acceptance by the
Society.
Guidance note:
It is anticipated that acceptance criteria will be further developed and refined in parallel with increasing experience and
knowledge.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---
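A minimal sketch of evaluating measured data against per-item acceptance bands (the criteria names and limits below are hypothetical, not requirements from this guideline) could be:

```python
def evaluate(measurements: dict, criteria: dict) -> dict:
    """Compare each measured quantity against its (min, max) acceptance band.

    Returns a per-item pass/fail map so the assessor can see exactly which
    criterion failed; a missing measurement counts as a failure.
    """
    results = {}
    for name, (lo, hi) in criteria.items():
        value = measurements.get(name)
        results[name] = value is not None and lo <= value <= hi
    return results

# Illustrative criteria: UPS backup endurance and changeover time
criteria = {"ups_backup_minutes": (30.0, float("inf")),
            "changeover_time_s": (0.0, 45.0)}
measured = {"ups_backup_minutes": 42.0, "changeover_time_s": 51.2}
print(evaluate(measured, criteria))  # changeover_time_s fails its band
```

Keeping the criteria explicit and machine-readable in this way also supports later refinement of the criteria as experience grows.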

1.1.17 Initial qualification of any verification method shall be based on the following main steps:
— plan approval based upon the required documentation
— demonstration of the effectiveness of verification method in a realistic test setup. This may be at the
verification method vendor's premises (e.g. in a simulator set-up), onboard the actual vessel, or at an
approved remote verification centre.
Guidance note:
Effectiveness should in this context be understood as the capability of the verification method to produce the desired result, i.e.
create evidence that relevant standards are met, and that this evidence can be delivered when required. Included in this is the
ability to cover the required/intended verification scope, as relevant:

— normal target system behaviour and capabilities
— the target system's ability to handle failure modes
— ability to detect potential target system defects.
It is expected that the vendor of DDV functionality will be able to demonstrate the system functionality in a realistic test setup
at the vendor's premises, typically during the test at manufacturer as part of the certification activities. This may be specifically
important when e.g.:

— delivery of new systems/methods with no or limited operational experience, or delivery to a new application
— when verification is dependent on vessel specific operational data, as such data will typically be very limited or not available
during initial onboard verification.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.18 For the Society to be able to perform sufficient assessment of the properties of a verification tool/
method, insight into the hardware, software and procedures will be required. When considered necessary to
be able to achieve this, the Society can require documentation additional to the documentation specified in
the documentation requirements of this standard.
Guidance note:
When assessments are based on automated processes and algorithms it is essential that the Society gets sufficient insight into
these functions in order to ensure effective independent and objective verification. Transparency into verification functionality will
be required in order to avoid them becoming 'black boxes', e.g. potentially introducing biases into the systems and/or possibly
masking control system defects. This means, among other things, that the Society must have sufficient access to algorithms and
data used for data driven verification.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.19 A verification and validation plan shall be made in order to evaluate the effectiveness of the new
verification method over time.
Guidance note:
This verification and validation plan may be omitted when it is specifically agreed that such activities will not be
necessary.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.20 The body responsible for the verification activities (this may be e.g. a system manufacturer, the
vessel owner, or a combination) shall have in place:
— operational procedures for operating the DDV system (i.e. execution of all involved verification methods)
— management of change procedures covering the DDV system and the target system

— maintaining competence in the target system domain

— identification of hazards and risks related to test and verification activities
— preparation of instructions for risk control measures to other involved parties
— result reporting, results evaluation, and retesting/reverification
— archiving of results including reports with version control of tests carried out and results during the target
system life cycle
— competence requirements and training for personnel involved in all phases
— requirements verification, as relevant
— qualification of personnel for operation of the verification system
— software development and software quality assurance as relevant.
Guidance note:
The above requirements will be considered in relation to the role and responsibility of the respective body, e.g. yard, system
manufacturer, the vessel owner.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.21 The maker of the test applications and tools shall provide user guidance to all intended users of the
application. Guidance shall also be incorporated inside the tools and applications.

1.1.22 All verification activities requiring manual operations shall be executed by qualified personnel. See
Sec.4 [1.5].

1.1.23 Detailed procedures for performing the verification activities safely and efficiently shall be
established. These shall, as needed, include procedures for test setup and preparation, how to perform the
verification activities, including data management, and for putting the system under test back into normal
operation (restoring).
Guidance note:
To contribute to crew understanding of the system under test, the test equipment and method, and the test results, it
is advised that the test equipment, method, and procedures are designed to involve the onboard crew in such a way that the
activities increase the level of target system understanding.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.24 The power supply arrangement to the DDV system shall be designed to support the intended
functionality. For example, if the testing involves simulation of black-out scenarios, or if the DDV system
is intended to be able to gather data during actual black-out scenarios, then it is required that the system
power supply arrangement supports this functionality, e.g. by arrangement of an uninterruptible power
supply (UPS).

1.1.25 Cyber security of the body of evidence and necessary data collection infrastructure shall as relevant
follow the requirements in DNV-RU-SHIP Pt.6 Ch.11 Sec.1 [2.8].

1.1.26 For DP systems the requirements specified in App.B will apply additionally.

1.2 Verification scope and reporting


1.2.1 The format, scope and frequency used for delivering the verification results to the Society shall be
agreed upon and approved by the Society. The delivery shall enable the Society to verify compliance with the
requirements relevant for the intended use.

Guidance note:
This is a general requirement to the DDV system. The data format, scope and frequency, and the responsibility for delivering
the data will depend on the actual use case, e.g. the verification target system and test/survey requirements (e.g. test at
manufacturer, new building delivery from yard, installation on vessels in operation, class annual surveys, renewal surveys). See
also [1.1.15].

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.2.2 The verification function shall be delivered as an end-to-end function, i.e. the process and functions
shall incorporate all that is needed from initiating the verification process/function on the vessel until the
results are reported in the agreed formats to the Society.

1.2.3 The verification results (i.e. body of evidence) shall be delivered in an agreed format which shall be
suited for efficient inspection and evaluation by the Society in an open-source or approved system-specific
application.

1.2.4 The interaction between the DDV system and the target system, and delivery of the generated target
system/DSA verification data to the Society, shall be carried out over an API or other digital interface using
documented methods and data structures. The data format for delivery of data to the Society shall be agreed
with the Society.
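A documented data structure for such a delivery can be sketched as below. All field names here are illustrative assumptions; the actual envelope structure, schema and transport must be agreed with the Society.

```python
import json

# Illustrative minimum set of keys for one delivery envelope
REQUIRED_KEYS = {"schema_version", "vessel_id", "activity", "results", "metadata"}

def build_delivery(vessel_id, activity, results, metadata):
    """Assemble one delivery envelope (all field names are hypothetical)."""
    return {
        "schema_version": "1.0",
        "vessel_id": vessel_id,
        "activity": activity,    # e.g. a named verification activity
        "results": results,      # list of per-test evidence records
        "metadata": metadata,    # sensors used, calibration references, etc.
    }

def valid_delivery(envelope: dict) -> bool:
    """Minimal structural check before transmission to the Society."""
    return REQUIRED_KEYS <= envelope.keys() and isinstance(envelope["results"], list)

envelope = build_delivery(
    vessel_id="IMO-0000000",
    activity="thruster-failover-test",
    results=[{"test_id": "THR-02", "outcome": "pass"}],
    metadata={"sampling_rate_hz": 10},
)
print(valid_delivery(envelope))              # True
print(json.dumps(envelope, sort_keys=True))  # serialised for the agreed interface
```

Publishing such a structure (e.g. as a JSON schema accompanying the API documentation) is one way to satisfy the "documented methods and data structures" requirement above.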

1.2.5 The information delivered shall incorporate an agreed minimum set of data to be stored in the Society
vessel files for documenting the verification activities and results with respect to the relevant acceptance
criteria. The formats used shall ensure that the information will be available throughout the vessel's entire
operational life.

1.2.6 For each of the verification activities there shall be a minimum frequency with which the activity shall be
performed during the vessel operational phase in order to gather the verification data. In practice this may
be arranged by assigning due dates and/or validity periods to each activity. The execution of the verification
activities may be distributed over time as long as each activity is performed within its specified due date.
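The due-date arrangement can be sketched as follows (the activity names and intervals are hypothetical, not intervals prescribed by this guideline):

```python
from datetime import date, timedelta

def next_due(last_performed: date, interval_days: int) -> date:
    """Due date of a verification activity = last execution + its interval."""
    return last_performed + timedelta(days=interval_days)

def overdue_activities(log: dict, intervals: dict, today: date) -> list:
    """Activities whose due date has passed. Execution may be spread over
    time, as long as each activity stays within its own interval."""
    return sorted(a for a, last in log.items()
                  if next_due(last, intervals[a]) < today)

# Illustrative log of last execution dates and per-activity intervals
log = {"pms-blackout-test": date(2021, 1, 10),
       "prs-accuracy-check": date(2021, 6, 1)}
intervals = {"pms-blackout-test": 365, "prs-accuracy-check": 90}
print(overdue_activities(log, intervals, date(2021, 10, 1)))
# → ['prs-accuracy-check']
```

A DDV system would maintain such a log continuously, so that the distributed execution of activities can be demonstrated to the Society at any time.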

1.3 Change management


1.3.1 Both the operational systems subject to verification and the systems containing the verification
functionality shall comply with the requirement for change handling in DNV-RU-SHIP Pt.4 Ch.9 and relevant
parts of DNV-CP-0505 where applicable.
Guidance note:
See also Sec.4 for additional requirements to change management for systems in operation.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.4 Documentation requirements


1.4.1 The documentation submitted shall include descriptions and particulars of the vessel and this shall
be covered by the requirements given in Table 1. These documentation requirements are in addition to the
requirements for main class and other additional notations applicable for the vessel.

Table 1 Documentation requirements - vessel

Object Documentation type Additional description Info

DDV system Z100 - Specification DDV specification document: the document shall AP
describe the DDV verification method/function
and specify the target system(s). In case the DDV
method must be supported by traditional methods
in order to gain sufficient/required verification
coverage and/or depth, the verification items to be
covered by other methods shall also be specified. If
needed, a reference to where specifications for
performing the verification not covered can be found
shall also be provided.

AP = for approval

1.4.2 The cyber security elements required by [1.1.25] shall be documented in accordance with the
applicable requirements in DNV-RU-SHIP Pt.6 Ch.5 Sec.21.

1.4.3 The documentation submitted shall include descriptions and particulars of the DDV method(s) (system
and functionality). For new verification functionality and methods required to be approved and/or certified,
the manufacturer shall submit the documentation required by Table 2, Table 3, Table 4, and Table 5 as
appropriate. These documentation requirements are in addition to the requirements for main class and other
additional notations applicable for the vessel.

1.4.4 When the DDV functionality (hardware and software) is delivered as an integral part of another
system, i.e. the system it shall deliver verification information on, then the required documentation for the
verification functionality may be delivered as an integral part of the documentation for this system.

Table 2 Documentation requirements for self-verifying systems (SVS) and built in test
functionality (BITE) - required to be approved and/or certified

Object Documentation type Additional description Info

SVS/BITE DDV system I010 - Control system philosophy The document shall describe the verification FI
functionality intentions and objectives, i.e. what
verification information shall be delivered, when and
how it shall be delivered, and how it shall be utilized
for verification of the target system. It shall be
described whether the method is intended to replace
traditional verification activities or if the verification
will be an addition.

I200 - Control and monitoring system documentation The documentation package shall contain information as relevant: AP
— I020 - Control system functional description
— I030 - System block diagram (topology)
— I040 - User interface documentation
— I050 - Power supply arrangement
— I080 - Data sheets with environmental
specifications
— I110 - List of controlled and monitored points
— I150 - Circuit diagrams
— I320 - Software change handling procedure
— Z252 - Test procedure at manufacturer.

Z070 - Failure mode description A functional failure analysis may be requested in R
special cases, see DNV-RU-SHIP Pt.1 Ch.3 for
description of the contents.

Z250 - Procedure Procedures for performing the verification activities, AP
including procedures for test setup, performing the
verification activities and for putting the system
under test back into normal operation (restoring).

Procedures for calibration of sensors and data AP, R
collection equipment.

Description of personnel training and qualification FI, R
requirements, e.g. user guides and training material.

Verification information management and delivery. AP
Specification on how to manage the gathered
verification information, including how to present the
verification results to the Society. This shall include
data management and data quality management
descriptions.

Z251 - Test procedure Onboard test program for demonstration of test AP
method effectiveness. Testing per the approved
programs to be witnessed by the Society. Normal
function, degraded function and failure modes of the
verification HW/SW functionality to be covered as
applicable.

Z268 - Assessment report A plan for verifying and validating the effectiveness FI, R
of the technology when this has been implemented
and used for some time shall be made and approved
by the Society. Includes the test period.

AP = for approval, FI = for information, R = on request

Table 3 Documentation requirements for systems for verification based on data analytics and
algorithm-based verification agents (AVAs) - required to be approved and/or certified

Object Documentation type Additional description Info

AVA DDV system Z050 - Design philosophy Overall role, purpose, and responsibility of the AP
algorithm related to the verification scope and
related to humans in the verification effort. This
includes, but is not limited to, level of algorithm
autonomous decision-making prerequisites,
sensibility and performance requirements.

Z060 - Functional description Description of the logic, mechanisms and reasoning FI
in the algorithm decision-making, if essential.

Z161 - Operational manual Manual with operator instructions. FI

Z252 - Test procedure at manufacturer Test program to demonstrate system functionality
and other relevant requirements. Shall be based on
Z050 and Z060. The program shall be made to fit for
verification of the functions installed on the specific
vessel.

Z251 - Test procedure Onboard test procedure. Test program to AP
demonstrate system functionality and other relevant
requirements. Shall be based on Z050 and Z060.
The program shall be made to fit for verification of
the functions installed on the specific vessel.

Z261 - Test report Test results with conclusions, from the above Z251 - AP
Test procedure.

Z250 - Procedure Verification information management and delivery. AP
Specification on how to manage the gathered
verification information, including how to present the
verification results to the Society. This shall include
data management and data quality management
descriptions, when relevant.

Z050 - Design philosophy Design philosophy modelling base. Type, structure, FI
parameterisation and resulting model behaviours/
properties. Modelling assumptions, and decisions.
The objective function goals and prioritizing between
them. The assumptions (statistical or otherwise)
behind the model. The criteria (e.g. benefit false
positives/negatives) and thresholds for ranking,
classification, or association and their confidence
values. Model limitations, and fault handling
capability, such as response to degraded input
performance (accuracy, performance, failure modes,
etc.). Its generalization capability such as response
to novel (and unseen) system states (true (nominal
or faulty) states or fictitious state caused by input
faults).
Model training/test data: representativeness of the
possible system states including rare (and possibly
dangerous) states. Its size, origin, selection process,
and possible inherent biases. Human agency in
vetting the data.
Input (in operation): sensors input, possibly the
weight of those inputs (if some are deemed more
important that others). Rationale behind the
selection of inputs and their associated weights.
Each input contribution to the model completeness.

AP = for approval, FI = for information

Table 4 Documentation requirements for systems for verification based on digital twins (DTs) -
required to be approved and/or certified

Object Documentation type Additional description Info

DT DDV system Z050 - Design philosophy Overall role, purpose, and responsibility of the AP
digital twin related to the verification scope and
related to humans in the verification effort. This
includes, but is not limited to, level of the digital
twin's autonomous decision-making prerequisites,
sensibility and performance requirements. The
intended use, fidelity level and verification scope
shall be specified in detail.

Z060 - Functional description Description of the logic, mechanisms and reasoning FI
of the digital twin including limitations and intended
use.

Z161 - Operational manual Manual with operator instructions. FI

Z252 - Test procedure at manufacturer Test program to demonstrate system functionality AP
and other relevant requirements. Shall be based
on Z050 and Z060. The program shall be made to
fit for verification of the functions installed on the
specific vessel.

Z251 - Test procedure Onboard test procedure. Test program to AP
demonstrate system functionality and other relevant
requirements. Shall be based on Z050 and Z060.
The program shall be made to fit for verification of
the functions installed on the specific vessel.

Z261 - Test report Test results with conclusions, from the above Z251 -
Test procedure.

Z250 - Procedure Verification information management and delivery. AP
Specification on how to manage the gathered
verification information, including how to present
the verification results to the Society. This shall
include data management and data quality
management descriptions, when relevant.

Z050 - Design philosophy Design philosophy modelling base. Type, structure, FI
parameterisation and resulting model behaviours/
properties. Modelling assumptions and decisions.
The objective function goals and prioritizing
between them. The assumptions (statistical or
otherwise) behind the model. The criteria (e.g.
benefit false positives/negatives) and thresholds
for ranking, classification, or association and their
confidence values. Model limitations, and fault
handling capability, such as response to degraded
input performance (accuracy, performance, failure
modes, etc.). Its generalization capability such
as response to novel (and unseen) system states
(true (nominal or faulty) states or fictitious state
caused by input faults). Model training/test data:
representativeness of the possible system states
including rare (and possibly dangerous) states. Its
size, origin, selection process, and possible inherent
biases. Human agency in vetting the data. Input (in
operation): sensors input, possibly the weight of
those inputs (if some are deemed more important
than others). Rationale behind the selection of
inputs and their associated weights. Each input
contribution to the model completeness.

AP = for approval, FI = for information

Table 5 Documentation requirements for digital survey applications (DSAs) - required to be
approved and/or certified

Object Documentation type Additional description Info

DSA DDV system I010 - Control system philosophy The document shall describe the verification FI
functionality intentions and objectives, i.e. what
verification information shall be delivered, when
and how it shall be delivered, and how it shall be
utilized for verification of the target system. It shall
be described whether the method is intended to
replace traditional verification activities or if the
verification will be an addition.

I200 - Control and monitoring system documentation (AP)
The documentation package shall contain information as relevant:
— I020 - Control system functional description
— I030 - System block diagram (topology)
— I040 - User interface documentation
— I050 - Power supply arrangement
— I080 - Data sheets with environmental specifications
— I110 - List of controlled and monitored points
— I150 - Circuit diagrams
— I320 - Software change handling procedure
— Z252 - Test procedure at manufacturer.

Z250 - Procedure
Procedures for performing the verification activities, including procedures for test setup, performing the verification activities, and putting the system under test back into normal operation (restoring). (AP)

Procedures for personnel training/qualifications, e.g. user guides and training material. (FI, R)

DSA test scope/program. The test program to be executed on the target system(s). (AP) This shall include:
— target system setup and status at start of each test (secondary systems to be included as relevant)
— expected target and secondary target system responses and performance in terms of immediate and long-term effect, on equipment level and target system (e.g. DP system) level during the whole test sequence, from start to end of each individual test, as relevant
— stated acceptance criteria, including class requirements and test expected results.

Class guideline — DNV-CG-0557. Edition August 2021 Page 27


Data-driven verification

DNV AS
Verification information management and delivery. (AP)
Specification on how to manage the gathered verification information, including how to present the verification results to the Society. This shall include data management and data quality management descriptions, when relevant.

Z251 - Test procedure (AP)
Onboard test program. Demonstration of test method (DSA) effectiveness. Testing per the approved programs to be witnessed by the Society. Normal function, degraded function and failure modes of the verification HW/SW functionality to be covered as applicable.

Z268 - Assessment report (FI, R)
A plan for verifying and validating the effectiveness of the technology (when this has been implemented and used for some time) shall be made. The plan shall be submitted on request.

AP = for approval, FI = for information, R = on request

1.5 Certification
1.5.1 The DDV systems, i.e. the products (hardware and software) which constitute the new DDV methods, shall, when used as the basis for classification services, be certified as required in Table 6. Relevant requirements in DNV-RU-SHIP Pt.4 Ch.9 Control and monitoring systems apply. All objects are independent products, and the certificates can hence be issued independently or as combined certificates if delivered by the same manufacturer.

1.5.2 When the verification functionality (hardware and software) is delivered as an integral part of another
system, e.g. the system it shall deliver verification information on, then the certification for the verification
functionality may be delivered as an integral part of the certification for this system, and a combined
certificate may be issued.

Table 6 Certification requirements

Object: Self-verifying systems (SVS), built-in test functionality and advanced test tools (BITE), digital survey applications (DSA), and data analytic systems - algorithm-based verification agents (AVA)
Certification type: PC
Issued by: Society
Certification standard*: DNV-CG-0557
Additional description: This may be an integral part of the system to be verified (i.e. self-verifying system) or a stand-alone system.

Object: Digital twins (DT)
Certification type: PC
Issued by: Society
Certification standard*:
— DNV-CG-0557
— DNV-RP-A204
— DNV-RP-0510
— DNV-RP-0513
Additional description: Documentation related to certification of digital twins is specified in:
— DNV-RP-A204
— DNV-RP-0510
— DNV-RP-0513

Class guideline — DNV-CG-0557. Edition August 2021 Page 28


Data-driven verification

DNV AS
* Unless otherwise specified the certification standard is the DNV rules.

1.5.3 For a definition of the certificate types, see DNV-CG-0550 Sec.5.


Guidance note:
Additionally, components and systems should be certified according to main class requirements.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.6 Survey and test upon completion


1.6.1 Upon onboard completion, the DDV system shall be subjected to final tests according to approved test programs. The program shall include testing and verification to demonstrate the effectiveness of the test method. Normal function, degraded function and failure modes of the verification HW/SW functionality shall be covered as applicable. The program(s) shall contain test procedures and acceptance criteria. Prior to the DDV tests, all target systems and equipment shall be successfully commissioned and tested. Testing per the approved programs shall be witnessed by the Society.

1.6.2 When deemed necessary by the attending surveyor, tests additional to those specified by the test
program may be required.

Class guideline — DNV-CG-0557. Edition August 2021 Page 29


Data-driven verification

DNV AS
SECTION 3 QUALIFIER REQUIREMENTS

1 Specific requirements for SVS, BITE, DSA, AVA and DT

1.1 General
1.1.1 For SVS, BITE, DSA, and AVA, the verification functionality may be built into the actual systems,
delivered as a standalone system connected to it, or as test equipment being connected to the system for the
purpose of performing the verification activity.
Guidance note:
When new methods are introduced with the intention to comply with these requirements it is important to discuss the potential
application, including which qualifier to apply, with the Society.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.2 For DT the verification functionality shall be based on testing or simulation on a digital representation of actual systems. This may be in the form of data-driven models, pure simulation models, or hybrids of such models.
Guidance note:
For verification based on traditional hardware in the loop simulations (HIL), see DNV-RU-SHIP Pt.6 Ch.5 Sec.13.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.3 Built in verification functions and self-verifying system functionality shall be approved and documented
in accordance with the requirements in this document. See Sec.2 [1.4] for documentation requirements.

1.1.4 Vendors delivering verification functionality as an integral part of the system to be verified (e.g. a DP
control system, power management system, thruster systems, generator protection system, switchboard
protection system, or other system) or as standalone verification systems, shall upon request document and
demonstrate compliance with relevant parts of DNV-CP-0507 System and software engineering AOM level 1.

1.1.5 The verification function may be an integral part of the system to be verified or be delivered as
separate hardware and software. When delivered as separate hardware and software the verification system
may be approved to be permanently connected to the system to be verified, or to be connected when the
verification activity shall be performed.

1.1.6 Data collection and delivery to the Society shall in general be by automatic means and performed in such a way that the collection of data does not depend on test personnel capacities or interpretation. The test personnel shall in general not be able to alter the collected data (body of evidence), including selecting/deciding between different data to be submitted to the Society, e.g. as a result of several test attempts.

1.1.7 Playback functionality based on recorded system data may be used as part of the basis for verification.
The data recordings shall be based upon approved manual and/or automatic functions to trigger data
gathering related to specified system states. The gathered data must cover the relevant equipment subject to
verification and document normal system operation and required system response and robustness in relevant
failure scenarios.

1.1.8 Systems shall be in place for handling of change management related to software versions and
parametrization of important parameters. The change management control shall cover both the system
subject to verification and the system containing the verification functionality.

1.1.9 The change management system shall be demonstrated to the Society upon request.

1.2 Digital survey applications (DSA)
1.2.1 This section describes rules for verification based on use of digital survey application (DSA) tools where
the verification scope is incorporated and managed. Typically, these tools will incorporate traditional test
verification activities (e.g. DP system FMEA test programs) designed to be managed by onboard personnel.
The DSA shall gather a specified body of evidence in a digital format, which can later be used by the Society to perform the verification of the target system. The DSA can typically utilize testing performed by crew, guided by specific approved built-in test methods, or automatic test functions, to gather system generated data. If considered necessary, witnessing by the Society may be required. Such witnessing may be required
to be on board or accepted to be performed remotely. See DNV-RU-SHIP Pt.6 Ch.11 Sec.3 and the additional
notation REW for requirements on remote witnessing arrangements.
Guidance note 1:
The DSA may also be used during trials which are directly witnessed by the Society. This typically applies to annual trials, renewal
trials and occasional trials. DSA may incorporate other approved methods to gather the required data; in such cases these functions shall also comply with the relevant rules and relevant parts of this document. Typical examples may be automatic verification functions and data analytics (e.g. based on operational data).

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

Guidance note 2:
Functions may be included to automatically store data in case of actual (e.g. while in normal operation) system/equipment failure
and other relevant system states and incidents. When relevant such data may be accepted as part of the body of evidence.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.2.2 Digital survey applications shall be approved and documented in accordance with the requirements in
this document. See Sec.2 [1.4] for documentation requirements.

1.2.3 The data gathered and used as body of evidence shall be in the form of system generated data, i.e. trending (time series) of system parameters such as vessel position and heading, power and thruster load conditions, control system command and feedback signals, operator commands/input and system status including alarms and events, etc. This also includes the DSA itself, as relevant. In addition, it may be accepted that smaller parts of the body of evidence are based on screen captures, pictures and videos. When pictures, screen dumps and videos are used as evidence there shall be functionality and procedures in place ensuring that these are genuine.
Guidance note:
It is important that the DSA provides a clear description of the required content, and that the quality is sufficient to make it possible to identify the captured information elements, see Sec.2 [1.1.15].

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.2.4 The DSA data shall provide the Society with information enabling the following verification:
— system setup and status at start of each test
— correct execution of the tests
— system response and performance in terms of immediate and long-term effect, on equipment level and
target system (e.g. DP system) level during the whole test sequence, from start to end of each individual
test
— stated acceptance criteria, including class requirements and test expected results.

1.2.5 The tools shall incorporate verification of the SW versions in the major target systems covered by the
tool. OEM service reports for these systems shall also be available or referenced in the tool.

Guidance note:
For DP, the major systems will typically be the DP control, independent joystick control, thruster, propulsion and steering control, and power management control systems.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.2.6 Results delivery content and format (i.e. body of evidence) to the Society shall be agreed. In general,
the Society shall also have access to all the test data, i.e. the original generated test data from the vessel.

1.2.7 The test and verification scope shall be approved, and the application shall incorporate functionality
for revision control and clearly identify the validity of the approval. Any changes to the verification scope are
subject to evaluation, and re-approval will be required when considered necessary by the Society.

1.2.8 For each test the DSA shall be configured to identify a specific set of data which shall normally be used
for evaluating the specified acceptance criteria. The DSA should also gather the other available data produced in the same time frame as the specific test is being executed, which could potentially be used by the verifier (e.g. the Society) to investigate further into the system behaviour if needed, e.g. in case there is uncertainty about the results.
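As an illustration only, the split between the configured test-specific data set and the surrounding context data could be handled along the following lines. The sample record layout, signal names and window margin are assumptions made for this sketch and are not prescribed by this guideline:

```python
from datetime import datetime, timedelta

def extract_test_data(samples, start, end, configured_signals,
                      margin=timedelta(seconds=30)):
    """Split recorded samples into the configured evidence set for a test
    and the remaining context data captured in the same (widened) window.
    Each sample is a (timestamp, signal_name, value) tuple."""
    window = [s for s in samples if start - margin <= s[0] <= end + margin]
    primary = [s for s in window if s[1] in configured_signals]
    context = [s for s in window if s[1] not in configured_signals]
    return primary, context

# Hypothetical recording around a thruster test.
samples = [
    (datetime(2021, 8, 1, 10, 0, 0), "thruster1.feedback", 40.0),
    (datetime(2021, 8, 1, 10, 0, 5), "gyro1.heading", 182.1),
    (datetime(2021, 8, 1, 10, 0, 9), "thruster1.feedback", 42.5),
]
primary, context = extract_test_data(
    samples,
    start=datetime(2021, 8, 1, 10, 0, 0),
    end=datetime(2021, 8, 1, 10, 0, 10),
    configured_signals={"thruster1.feedback"},
)
```

Keeping the context data separate from the configured evidence set lets the verifier start from the specified acceptance data and drill into the surrounding recordings only when needed.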

1.2.9 The application shall have functionality for onboard data storage and be able to communicate with and
deliver results to an off-vessel data storage to secure sufficient data retrieval independent of any temporary
or permanent vessel connectivity capability. See also Sec.2 [1.1.15].

1.2.10 As a minimum the scope required for verification in relation to classification of the target system shall
be incorporated. Additional scope can be added. In case such additional scope shall be verified by the Society
this will be on special agreement. See also Sec.4 [1.1.4] and Sec.4 [1.1.5].
Guidance note:
It may be accepted that parts of the required scope can be handled through other tools, e.g. traditional paper based test
programs. In such cases the application shall include reference to these tools, and identify the revision containing the latest test
results.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.2.11 The DSA shall be password protected and the test operator shall be identified, as a minimum with position, when logged in. Each test shall, when completed, be digitally marked as completed by an onboard authorized signature.

1.2.12 Unless otherwise agreed with the Society, the DSA shall have a specialized user interface for the Society to log in and perform verification based on the gathered body of evidence. Functionality for viewing
the recordings in an efficient manner which allows the surveyor to play, pause, wind and rewind, and have
access to the relevant system parameters shall be provided. For each test the configured and specified data
should be automatically displayed, while other data should be hidden and available for viewing if found
necessary. The interface shall provide system mimics in such a way that it is easy to understand the system
status on equipment, system and vessel level as relevant.

1.2.13 The DSA shall have functionality in place for the owner to request verification of generated body
of evidence from the Society (e.g. via email or other ways of communication, potentially in the DSA tool
itself), and for the Society to provide the result of the verification to the owner via the tool. Each test item
shall contain comments fields for the test crew and for the Society to provide comments and exchange
information. Comments shall be time stamped.

1.2.14 If considered necessary, tests may be required to be witnessed by an onboard surveyor, or by use of
remote witnessing. In cases where this is required as the standard verification method the agreed method
shall be specified and described in the test description in the DSA.

1.2.15 The vessel shall also have the possibility of setting up direct communication with the Society. The communication system shall be suitable for performing remote witnessing of smaller agreed scopes. Activation will typically be based upon case-by-case specific agreements with the Society.
Guidance note:
When considered necessary by the Society it may be requested to witness specific parts of the testing, e.g. in case of retesting
when there are doubts/discussion about a test outcome. The remote witnessing may be provided by a separate system, e.g. by
use of mobile phone based applications. Application of remote witnessing class notation REW may be considered in this respect.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.2.16 The test items shall be handled individually with respect to status, due dates and test item validity
periods, and verification history (as minimum, results and time stamp). All started test activities and all
corresponding results shall be recorded (also in the case that the test session is aborted or in case it fails)
and include time stamps (as a minimum start and end times) necessary for establishment of a test activity
timeline. The status of each test item shall be clearly indicated, e.g. due/not due, started/not started, finished, accepted/not accepted, as applicable.
Guidance note:
The test items should preferably be divided into smaller items, typically one test per equipment unit when this is relevant (e.g. one item for each reference system when testing a DP system). The intention is to make the verification process (including follow-up of potential findings) easier by avoiding test items becoming bigger/longer than necessary.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---
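The per-item bookkeeping described in [1.2.16] (individual status, due dates, validity periods and a complete, append-only attempt history) could be represented roughly as follows; the field names and status values are illustrative assumptions, not requirements:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class TestAttempt:
    started: date
    ended: Optional[date] = None
    result: str = "started"   # e.g. "accepted", "failed", "aborted"

@dataclass
class TestItem:
    item_id: str
    due_date: date
    validity_days: int
    history: list = field(default_factory=list)  # every attempt is kept

    def start(self, when):
        """Record a new attempt; aborted or failed runs remain in history."""
        attempt = TestAttempt(started=when)
        self.history.append(attempt)
        return attempt

    @property
    def status(self):
        if not self.history:
            return "not started"
        last = self.history[-1]
        return last.result if last.ended else "started"

item = TestItem("DP-REF-1", due_date=date(2021, 9, 1), validity_days=365)
attempt = item.start(date(2021, 8, 15))
attempt.ended = date(2021, 8, 15)
attempt.result = "accepted"
```

The essential property is that the history list is only ever appended to, so aborted and failed attempts remain part of the recorded timeline.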

1.2.17 In case tests are aborted, unsuccessful, or for any other reason not completed in the intended manner, the program shall require that the test operator leaves an explanation of why, linked to the unsuccessful attempt.

1.2.18 For each item the application shall provide descriptions, interlocks, functionality and guidance to help ensure that the test operator performs the test and verification activities in the correct and intended manner, and so that each test is repeatable with comparable results between different test sessions.

1.2.19 The application shall include interlocks and functionality to ensure that the test results are tamper-free and genuine. It shall not be possible for the onboard test operators, or others, to alter or delete collected results or otherwise alter the body of evidence subject to verification by the Society.
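One common way to make a collected body of evidence tamper-evident is a hash chain, where each entry embeds the hash of the previous one. The sketch below illustrates the principle only; it is not a mechanism prescribed by this guideline:

```python
import hashlib
import json

GENESIS = "0" * 64

def append_result(chain, record):
    """Append a test result; each entry embeds the hash of the previous
    entry, so any later alteration or deletion breaks verification."""
    prev = chain[-1]["hash"] if chain else GENESIS
    payload = json.dumps(record, sort_keys=True)
    digest = hashlib.sha256((prev + payload).encode()).hexdigest()
    chain.append({"record": record, "prev": prev, "hash": digest})

def verify_chain(chain):
    """Recompute every hash from the start and check the links."""
    prev = GENESIS
    for entry in chain:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

chain = []
append_result(chain, {"test": "thruster1 failover", "result": "passed"})
append_result(chain, {"test": "gyro1 failure", "result": "passed"})
```

Altering or deleting any earlier entry invalidates every subsequent link, so the verifier can detect modification of the evidence after collection.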

1.2.20 When the results are assessed and accepted by the Society, the DSA shall automatically adjust due
date(s) for the verified item(s) according to the test date and the validity period assigned to the test item.
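The due-date adjustment in [1.2.20] is simple date arithmetic over the accepted items. A minimal sketch, with the item structure assumed for illustration:

```python
from datetime import date, timedelta

def adjust_due_dates(items, accepted_ids, test_date):
    """items: {item_id: {"validity_days": int, "due": date}}.
    For each item whose result the Society has accepted, move the due
    date to the test date plus the item's validity period; items not
    accepted are left unchanged."""
    for item_id in accepted_ids:
        item = items[item_id]
        item["due"] = test_date + timedelta(days=item["validity_days"])
    return items

items = {
    "DP-REF-1": {"validity_days": 365, "due": date(2021, 9, 1)},
    "DP-REF-2": {"validity_days": 365, "due": date(2021, 9, 1)},
}
adjust_due_dates(items, accepted_ids=["DP-REF-1"], test_date=date(2021, 8, 15))
```

Only items that have been assessed and accepted get a new due date; pending items keep their original one.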

1.2.21 The digital survey application shall have functionality for, at any time, creating status reports, provide
overview of test item status and required upcoming verification activities (i.e. based on due dates) within
a user defined period. The same functionality shall also be available in the required surveyor verification
interface, see [1.2.12].

1.2.22 It shall be possible to run the application both in connected mode (e.g. connected to the application provider and/or the Society onshore database) and in offline mode. When verification data is collected in offline mode the results shall be uploaded to the Society onshore database automatically, or on crew request, when the system becomes connected again. In connected mode the application shall have functionality to display the collected data online, with as low latency as practicable, in the verification application.
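The connected/offline behaviour could be organised as a simple store-and-forward queue, sketched below; the upload callable stands in for whatever transfer mechanism a given DSA actually uses:

```python
class ResultQueue:
    """Buffers verification results while offline and flushes them, in
    arrival order, once connectivity to the onshore database returns."""

    def __init__(self, upload):
        self.upload = upload      # callable performing the actual transfer
        self.pending = []

    def submit(self, result, connected):
        if connected:
            self.upload(result)
        else:
            self.pending.append(result)

    def on_reconnect(self):
        while self.pending:
            self.upload(self.pending.pop(0))

sent = []
queue = ResultQueue(upload=sent.append)
queue.submit("test A result", connected=False)
queue.submit("test B result", connected=False)
queue.on_reconnect()
queue.submit("test C result", connected=True)
```

Flushing in arrival order preserves the test activity timeline regardless of temporary or permanent gaps in vessel connectivity.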

1.2.23 It shall be clearly marked in the tool and in any DSA generated reports which test items are required
to be verified by the Society and which are not. It shall also be clearly visible which items are verified by the
Society and which are not (i.e. pending Society verification).

Guidance note:
It is common that test programs include test items outside the minimum class required scope. The minimum class required scope will depend on the vessel class notation and other agreements between the Society and the owner. For example, the owner of a DP vessel may have requested an FSVAD according to IMO MSC/Circ. 645, or a DPVAD according to IMO MSC.1/Circ. 1580, or may have agreed with the Society, and potentially also the charterer, that the Society shall perform verification of an extended vessel specific scope.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.2.24 The DSA shall have functionality for the Society to generate reports showing the verification result
and the actual status of the test elements relevant for class.
Guidance note:
This documentation is intended to be uploaded in the Society files as part of class documentation/survey report. This may be in
e.g. PDF format.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.2.25 The DSA shall have functionality for the vessel owner to document the verification result and the
actual status of the complete test scope, i.e. all test elements.

1.2.26 For each test item it shall be possible to track the historic results and test activity, including potential
findings, their status and how potential findings have been handled.

1.3 Algorithm-based verification agent (AVA)


1.3.1 When AVA data algorithms are part of any verification method, it shall be clearly defined what
prerequisites, sensibility and performance requirements apply for the algorithm. It shall also be possible
to verify the sensibility and performance of the algorithms and their presence and agency (role, purpose,
actions, responsibility, and autonomy) shall be documented and made known to the Society. It is also
important to document the AVA in context of human agency, i.e., the split (role, purpose, responsibility, etc.)
between human and algorithm.

1.3.2 AVAs shall be approved and documented in accordance with the requirements in this document. See Sec.2 [1.4] for documentation requirements.

1.3.3 Detailed test programs shall be established as part of the verification. The verification process shall
assure the performance in accordance with required specification for the applicable equipment installed on
the specific unit. Verification shall consider criticality of the relevant function.

1.3.4 AVAs shall be assured together with their requirements. The level of assurance, of both the algorithm
and its governing requirements, will depend upon the safety criticality of the target functions in whose verification the algorithm takes part. The assurance level of the AVA and its requirements increases with the increased level of autonomous decision making that the algorithm exercises in the verification process.
Guidance note:
In relation to this requirement additional assistance concerning various aspects of the development and assurance of algorithms,
digital assets and simulation models can be found in DNV-RP-0510, DNV-RP-A204 and DNV-RP-0513 Assurance of simulation
models, as relevant.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.3.5 The data delivery to the Society shall be such that the Society can perform independent verification. See also Sec.2 [1.1.12].

1.3.6 Logic, and the data used in that reasoning (i.e. inference), that is deemed essential to the assurance level of the target system shall be made readable and understandable (i.e. transparency of inference) to the Society and be subject to assurance. In case data is part of the knowledge base for the algorithm (i.e. machine learning), the training and test data shall also be part of the assurance process and made available (and readable) to the Society.

1.3.7 The algorithm, possibly together with associated training data, constitutes a digital representation of a
system (i.e. a model). When in operation, this model uses inputs from the system and calculates the current
system state. The assurance process of the model shall as a minimum be viewed along the following three
dimensions:
— model architecture (structural verification)
— model training/test data (knowledge verification)
— model input (operational verification).
Artefacts that represent the required information within each dimension, and methods relevant for each
artefact shall be used to generate evidence that the model will operate with adequate capability, quality and
trustworthiness.
Guidance note:
Model architecture: type, structure, parametrization and resulting model behaviours/properties, possibly the objective function,
and all associated modelling assumptions, and decisions. The objective function may represent multiple goals, what are these
goals, and how are they prioritized? The assumptions (statistical or otherwise) behind the model, and where did those assumptions
arise. The criteria (e.g. benefit false positives/negatives) and thresholds for ranking, classification, or association and their
confidence values. Model predicted performance, accuracy and precision. Model limitations, and fault handling capability, such
as response to degraded input performance (accuracy, performance, failure modes, etc.). Its generalization capability such as
response to novel (and unseen) system states (true (nominal or faulty) states or fictitious state caused by input faults).
Model training/test data: if the model is not based upon machine learning, this information shall be coded into the algorithm code.
Its representativeness of the possible system states, including rare (and possibly dangerous) states. Its size,
origin, selection process, and possible inherent biases. Human agency in vetting the data (known source of bias introduced into
training data).
Model input (in operation): what the model uses as input (e.g. sensors). Possibly the weight of those inputs (if some are deemed more important than others). What is the rationale behind the selection of inputs and their associated weights? How does each
input contribute to the model completeness?
Methods for testing data-driven algorithms (ML), are not well established. As a result, the evidence generated from testing data-
driven algorithms must be assessed with respect to the evidence properties listed in App.A.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.3.8 The decision-making logic and its presumptions shall be made available to the Society in a format that
is human readable and understandable.
Guidance note:
Decision-making strategy shall be evaluated in context of the verification scope. What kind of decision does the model make,
and how can a human operator recognize an incorrect decision. In case it is not possible to provide a description of the logic
functionality in a human readable and understandable format it may be considered, on a case by case acceptance basis, to
support, alternatively and/or additionally, this requirement by testing the logic to a level where it is considered that sufficient
verification is provided.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.3.9 Vendors delivering AVA based verification methods shall be approved in accordance with DNV-CP-0507
System and software engineering AOM level 2.

1.4 Digital twin (DT)
1.4.1 When digital twins are part of any verification method, the intended use, fidelity level and verification scope shall be specified in detail. It shall also be clearly defined what prerequisites and performance requirements apply for the digital twin. It shall also be possible to verify the performance and the fidelity of the digital twins, and these shall be documented and made known to the Society.
Guidance note 1:
It is expected that DTs can contribute with high value in assurance processes and that they can be used in combination with traditional verification processes. The introduction of the notation DDV with qualifier DT may also open the possibility of basing classification related acceptance on testing purely performed on a digital asset representing the actual system or parts of a system.
However, the abilities and confidence level of the digital twins in the maritime industry are not yet developed to the degree where
major parts of the classification scope can be based upon this. Until then, it is expected that testing on the actual system must
still form the basis for the majority of the classification scope with a possibility to replace only a few, simple items with testing on
the digital twin. The agreed upon tests will be under continuous consideration and may be revoked at any indication of reduced
confidence with the twin. See also Sec.2 [1.1.16].

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

Guidance note 2:
DTs can potentially be accepted as an enhancement in relation to the Smart notation according to DNV-CG-0508 Sec.5 also when
based on other purposes than classification.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.4.2 DTs shall be approved and documented in accordance with the requirements in this document. See
Sec.2 [1.4] and Sec.2 [1.5] for documentation and certification requirements.
Guidance note:
Documenting compliance to the requirements can alternatively, when agreed with the Society, be done through qualification of
novel technology as per DNV-CG-0508 Sec.5.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.4.3 Detailed test programs shall be established as part of the verification. The verification process shall
assure the performance and fidelity in accordance with required specification for the applicable systems or
parts of a system. Verification shall consider criticality of the relevant function.

1.4.4 DTs shall be assured together with their requirements. The level of assurance, of both the digital twin and its governing requirements, will depend upon the safety criticality of the target functions that the digital twin represents in the verification. The assurance process of the digital twin shall, if not otherwise agreed, be according to DNV-RP-A204 and DNV-RP-0510 or DNV-RP-0513 Assurance of simulation models, for data driven models or simulation models, respectively.
Guidance note:
For hybrid models elements from both DNV-RP-0510 and DNV-RP-0513 Assurance of simulation models are relevant.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.4.5 The data delivery (both results and data management) to the Society shall be so that it is possible for
the Society to perform independent verification. See also Sec.2 [1.1.12].

1.4.6 The decision-making logic and data used shall be made available to the Society in a format that is human readable and understandable. In case data is part of the knowledge base for the digital twin (e.g.
machine learning), the training and test data shall also be part of the assurance process and made available
(and readable) to the Society.

Guidance note:

The decision-making strategy shall be evaluated in the context of the verification scope: what kind of decision does the model
make, and how can a human operator recognize an incorrect decision? In case it is not possible to provide a description of the
logic functionality in a human-readable and understandable format, it may be accepted, on a case-by-case basis, to support
this requirement, alternatively and/or additionally, by testing the logic to a level where sufficient verification is considered to be
provided.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.4.7 Vendors delivering DT based verification methods shall be approved in accordance with DNV-CP-0507
System and software engineering AOM level 2.

SECTION 4 REQUIREMENTS FOR FLEET IN SERVICE (FIS)

1 General

1.1 General requirements


1.1.1 The DDV specification document shall be kept up to date during the vessel lifetime. The document
shall be made available at the Society's request and shall be used to support the verification activities.

1.1.2 In case new DDV methods, or other methods, are introduced in the DDV specification program
or significant changes are made to the test methods and/or tools, these changes shall be approved in
accordance with requirements in this document. It shall be possible to identify whether changes have been
introduced, e.g. by revision control.
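As a purely illustrative sketch of the revision-control idea (the hashing scheme and all names below are assumptions for this example, not something this guideline prescribes), a content fingerprint makes it possible to identify whether changes have been introduced to a test-method description:

```python
# Illustrative only: fingerprint each revision of a DDV test-method
# description so that any introduced change is identifiable. The use of
# SHA-256 and the 12-character truncation are assumptions for the example.
import hashlib

def revision_id(document_text: str) -> str:
    """Stable 12-character fingerprint of a specification revision."""
    return hashlib.sha256(document_text.encode("utf-8")).hexdigest()[:12]

approved = revision_id("Test method A, rev 1: measure insulation resistance.")
current = revision_id("Test method A, rev 2: measure insulation resistance at 1 kV.")
print(approved != current)  # -> True: a change has been introduced
```

In practice a version-control system serves the same purpose; the point is only that every approved revision has a stable identifier against which the current specification can be compared.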

1.1.3 When relevant, procedures shall be incorporated to ensure that the sensor and measuring equipment
are calibrated and have sufficient accuracy and capacity to perform their functions.

1.1.4 It is accepted that the DDV specification document describes and specifies verification activities which
are outside the scope of class. Based on owner’s request and specific agreements the Society may assist in
assessment also of this scope.
Guidance note:
Inspection and test activities part of surveys associated with certificates where the Society has delegated authority by a flag
administration, may include separate approval from the applicable flag administration.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.1.5 Based on owner’s request and specific agreements the Society may assist in performing verification at a
higher frequency than stipulated by the rules for classification.

1.1.6 A minimum verification/test frequency shall be established for each verification/test item.
Guidance note:
The frequency should be derived from type of failure mode (criticality), rate of progression and lead time to failure (LTTF) from
detecting an anomaly/fault syndrome. Data acquisition rate should in general be determined and selected in order to capture a
complete set of data before conditions change significantly.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---
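The frequency reasoning in the guidance note above can be sketched as follows. This is an illustration only: the function name and the per-criticality safety factors are invented for the example and are not prescribed by this guideline; the underlying idea is simply that the test interval must be short enough to catch an anomaly within its lead time to failure (LTTF).

```python
# Illustrative sketch only: derive an upper bound on the test interval from
# the lead time to failure (LTTF) after an anomaly becomes detectable.
# The safety factors per criticality level are assumptions for this example.

CRITICALITY_FACTOR = {  # fraction of LTTF allowed between tests (assumed)
    "high": 0.25,
    "medium": 0.5,
    "low": 0.75,
}

def max_test_interval_days(lttf_days: float, criticality: str) -> float:
    """Upper bound on the test interval so that at least one test falls
    inside the detectable window before functional failure."""
    if lttf_days <= 0:
        raise ValueError("LTTF must be positive")
    return lttf_days * CRITICALITY_FACTOR[criticality]

# Example: an anomaly progressing to failure in ~120 days on a
# high-criticality function would be tested at least every 30 days.
print(max_test_interval_days(120, "high"))  # -> 30.0
```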

1.1.7 Modifications to relevant target control systems, verification systems and other systems used for
storing, transferring and presenting verification data shall follow requirements in DNV-RU-SHIP Pt.7 Ch.1
Sec.2 [3.1.7], DNV-RU-SHIP Pt.7 Ch.1 Sec.6 [12.1.3] and relevant parts of DNV-CP-0507 where applicable.
The change management system shall be demonstrated to the Society upon request.

1.1.8 Arrangements accepted according to this class guideline shall, as relevant, comply with the data
quality and security management requirements found in DNV-RU-SHIP Pt.6 Ch.11 Sec.1 [4].

1.1.9 For applications specifically addressed in App.B the requirements specified in App.B will apply in
addition.
Guidance note:
App.B applies specifically to DP systems.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

1.2 Survey of the DDV system
1.2.1 Evaluation of the functionality and effectiveness of the DDV function is normally assumed to be covered
as part of the evaluations to be performed when the DDV system is in normal use and as part of the activities
performed during evaluation of the body of evidence.

1.2.2 When deemed necessary by the Society, specific activities, e.g. testing, may be required in order to
demonstrate the effectiveness and functionality of the DDV system.

1.3 Verification of alterations to the target system


1.3.1 The rules of DNV-RU-SHIP Pt.7 Ch.1 apply: the owner shall advise the Society of major alterations to
the target system hardware or software. The owner may, however, assign the task of advising the Society
to a responsible body, representing the owner, e.g. a manufacturer. The Society will consider the need for
documentation, re-survey or test.

1.4 Verification of alterations to the DDV system


1.4.1 The owner shall advise the Society of alterations to the verification system and functionality hardware
or software. The owner may, however, assign the task of advising the Society to a responsible body,
representing the owner, e.g. a manufacturer. The Society will consider the need for documentation, re-survey
or test. Relevant requirements in this document apply.

1.4.2 The Society shall be advised when the built-in test system is based upon continuous learning while
in operation. All relevant aspects of its alteration due to the continuous learning capability shall, with respect
to [1.2], be documented.

1.5 Procedural and personnel requirements


1.5.1 The party responsible for performing the DDV activities shall ensure sufficient competence according
to the complexity of the applicable inspection and test methods. There shall be a clear definition of
competence requirements related to all activities.

1.5.2 All inspection and test activities shall have:


— detailed description to the level of detail necessary for a skilled person
— interval/date
— preparation note describing any preparation necessary
— required qualification (competence) level of personnel to perform the inspection and/or test
— applicable documentation (test procedures, service manuals and drawings)
— information of checks and measurements to be recorded
— reporting requirements (failures, condition, positive reporting, pictures, etc.)
— all descriptions in English.
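As a minimal sketch (the record type, field names and checking helper below are assumptions for illustration, not mandated by this guideline), the metadata bullets above can be captured as a structured record whose completeness is checked before an activity is accepted into the program:

```python
# Sketch only: one possible record for an inspection/test activity,
# mirroring the metadata bullets above. All field names are assumptions.
from dataclasses import dataclass, fields

@dataclass
class TestActivity:
    description: str             # detailed, to the level needed by a skilled person
    interval_or_date: str        # e.g. "annual" or an ISO date
    preparation_note: str        # any preparation necessary
    required_competence: str     # qualification level of personnel
    documentation_refs: str      # test procedures, service manuals, drawings
    checks_to_record: str        # checks and measurements to be recorded
    reporting_requirements: str  # failures, condition, positive reporting, ...

def missing_fields(activity: TestActivity) -> list:
    """Return names of metadata fields that are still empty."""
    return [f.name for f in fields(activity) if not getattr(activity, f.name)]

a = TestActivity("Insulation test of bus-tie", "annual", "", "electrician",
                 "procedure P-01", "insulation resistance", "positive reporting")
print(missing_fields(a))  # -> ['preparation_note']
```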

1.5.3 Detailed procedures for performing the verification activities safely and efficiently shall be established.
These shall include procedures for test setup and preparation, how to perform the testing, how to perform
the verification (e.g. by the Society) when applicable, including data management, and for putting the
system under test back into normal operation (restoring).
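Purely as an illustration (the state names and ordering rule are assumptions, not specified by this guideline), the procedural steps above can be thought of as an ordered sequence that must always end with the system restored to normal operation:

```python
# Sketch only: enforce that a test session follows the procedural steps
# above in order and always ends with the system restored to normal
# operation. State names are assumptions for illustration.
ORDER = ["setup", "testing", "verification", "restoring", "normal_operation"]

def session_is_complete(steps: list) -> bool:
    """True if the steps follow the prescribed order and end restored."""
    indices = [ORDER.index(s) for s in steps]
    return indices == sorted(indices) and steps[-1] == "normal_operation"

print(session_is_complete(
    ["setup", "testing", "verification", "restoring", "normal_operation"]))  # -> True
print(session_is_complete(["setup", "testing"]))  # -> False: not restored
```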

Guidance note:
Careful consideration of how the testing should relate to ongoing operations is vital for ensuring a sufficient level of safety. In
case it is the intention to perform vessel operations (e.g. DP operations) while testing is being performed, the implications for the
system must be carefully evaluated. In addition to safety of personnel, these evaluations should also consider vessel capacity in
the test mode, and the failure effects in case an unexpected failure/situation should occur while in test mode.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

Guidance note:
In order to contribute to crew understanding of the system under test, the test equipment and method, and the test results, it
is advised that the test equipment, method, and procedures are designed to involve the onboard crew in such a way that the
activities contribute to increased levels of system understanding.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

APPENDIX A EXPLANATORY INFORMATION

1 Building evidence-based confidence in systems
This appendix proposes a nomenclature and a model to support discussions on (and bring attention to)
some important aspects related to evidence and evidence generation properties. Many of these aspects
in traditional verification schemes (e.g. DP with witnessed trials) have been considered to be managed
satisfactorily without further discussion. However, when verification methods are being altered, these aspects
should be considered and understood.
Evidence is used as basis for establishing and maintaining the necessary level of confidence in the control
systems. Modern technology enables more and more sophisticated verification functionality that can be
implemented in maritime control systems. Consequently, we must bear in mind the methodology and
processes by which such evidence is generated by these (built-in) verification systems.
The evidence provided must give grounds for justified confidence and trust that the control system is safe in
operation. For a (body of) evidence to be able to establish this confidence, the evidence must possess certain
properties. These properties may be attributed to both the evidence type, and the actual evidence instance.
In the following paragraphs, these properties are discussed. The definition of each property (term) can be
found in Sec.1 [1.8].
Every type of evidence possesses certain capabilities, or provides an answer to the question "What can this
type of evidence actually prove?" Moreover, a particular instance of evidence must be of adequate quality:
it must be valid and it must be trustworthy. Furthermore, the entire body of evidence (that is, the collective
set of evidence) must be sufficiently comprehensive (or complete). Both the coverage and the depth of
the body of evidence instance must encompass the system features of interest to the extent necessary
to provide the required confidence that the system is safe to operate. These evidence properties are not
independent dimensions but are interrelated to a greater or lesser extent. As one property cannot be
fully reduced to another, quantifying the overall appropriateness of the evidence is difficult. All properties
should be evaluated through a systems approach in order to identify possible deficiencies.
Evidence properties will be affected by the process by which they are generated, i.e. the verification process.
In order to ensure that the evidence properties are adequate, certain properties of the verification process
must be ensured. A proposed taxonomy for describing these properties is: definition of the roles
involved, assessment of the intensity and rigour with which the verification is conducted, and the level of
detachment or independence of the organization performing the verification.
Other aspects within the verification process that will affect the evidence properties are: the intention,
motivation, responsibility, and competence of each role within the verification process. Moreover, the rules
and standards used as a basis for the system requirements and the development process will typically also
affect some of these properties.
Figure 1 depicts a hierarchy where, at the lowest level, different aspects of the verification effort, such as
the roles, rigour, intensity, and detachment, are positioned. These aspects are affected by the intention,
motivation, responsibility, and knowledge possessed by the different roles within the verification effort
(bottom right). Moreover, these aspects will influence (upward causation) each individual evidence property
at the level above (validity, quality, and trustworthiness), and the entire body of evidence coverage and
depth. At the highest level of the hierarchy, resides the confidence which, based on the provided evidence,
can be placed in the target system.

Figure 1 Evidence and evidence generation properties, and their inter-influence

Thus, different aspects of the verification effort at the lowest level will influence the resulting evidence
properties at the level above. In addition, the aspects within the lowest level may also influence each other
(bidirectional arrows between the entities), e.g. a weak level of detachment may influence the intensity of
the verification due to cognitive or societal biases. Finally, at the highest level, confidence is also influenced
by aspects other than the generated body of evidence alone. These influencers include aspects such as
traditions, supplier maturity, and stakeholder authority.
These aspects and interrelationships should be considered in order to understand and optimize the
effectiveness of the evidence properties and their efficiency in creating sufficient confidence and trust in the
target system.

APPENDIX B APPLICATION SPECIFIC REQUIREMENTS

1 Introduction
The development and application of DDV is expected to increase in the maritime and offshore industry. This
appendix covers additional requirements to specific application areas, and for these specified applications the
requirements described in this appendix are additional to the generic requirements in this class guideline.

2 Dynamic positioning (DP) systems

2.1 Background
Dynamic positioning systems are designed with varying degrees of robustness against loss of station-keeping
abilities in case of failures. This built-in robustness is typically guided by rules and industry standards and,
in addition, vessel specific design intentions. Operators generally aim for a design with built-in capacity and
robustness that matches the risks involved in the relevant industrial tasks. For this approach to be successful,
it is instrumental that the intended capacity and robustness are in place, and that these are both verified and
documented.
Over time, the industry has developed industry practices on how to verify that the intended robustness has
been achieved during the design and building phases, and that it is maintained during a vessel's operational
life. The class rules, both design rules and rules for verification of vessels in service, are important elements
in this. Until now, the methods used for this verification have normally required that a classification surveyor
(often in addition to other experts, like DP FMEA consultants and equipment vendor experts) is present
at predetermined calendar based intervals to witness that the required verification activities are being
performed correctly, and to observe and evaluate the results of these activities. For vessels in operation,
these traditional assurance activities normally require that the vessel is taken out of operation for verification
trials. Especially for vessels with redundant DP class notations, such trials may be seen as overly intrusive.

2.2 Philosophy
This section outlines philosophical and functional objectives that should guide development of proposed
alternate verification schemes when applied for verification of DP systems.
A DP system consists of components and systems acting together to achieve a sufficiently reliable station
keeping capability. The necessary redundancy level for components and systems is determined by the
consequence of a loss of station keeping capability. DP systems requiring redundancy must have a defined
and predictable reduction in station keeping capability following the worst case single failure. The residual
capability is often referred to as the 'post failure capability' or more fully 'the post worst case single failure
capability'. Codes, standards and practices have evolved over the years to achieve this objective by verifying
the residual DP capability which is available after worst case failure.
The verification of the robustness against the potential for a loss of station keeping abilities, relies on proving
DP system:
— performance, i.e. each redundancy group can provide the intended capacity when required
— protection, necessary protective functions are in place to ensure that failure effects are within
requirements necessary to ensure DP system design intentions
— detection and monitoring in order to have control of the DP system status.
Technology development, advances in sensing technology, data centricity, deployment and purposeful
application of the same on DP vessels can help to achieve the following objectives:
— design/build to test – systems are built in a manner that allows verification by testing
— test on demand capability – systems can be tested when required/needed
— systems continuously report their status by self-monitoring.

Purposeful application of the above concepts should help:
— provide confidence in DP system capabilities through effective verification and validation of performance,
protection and detection
— prevent unwarranted human intervention
— reduce the cognitive burden imposed upon personnel tasked with the delivery of DP operations.
Guidance note:
Deployment of systems should not be solely focused on gathering and transmitting data to a shore-based facility to meet:

— assurance requirements (statutory/independent verification, etc.)


— post processing and data analytics.

Information derived from data should also be available on board the vessel in an intuitive format to facilitate:

— anomaly detection, abnormal behaviour, reduction in confidence


— fault finding and decision making by on board personnel.

Access to information on board can e.g. be achieved through the conscious embedding of knowledge-based systems within the DP
system itself.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

— expand the scope of verification and validation to a broader spectrum than that achievable by
traditional means, without an increase in the burden (time and resources)
Guidance note:
Application of digital technologies and data centric approaches should not be limited to conducting the current scope through
alternate means.
It is expected that the efficiencies that can be realized should facilitate effective verification and validation.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---
— facilitate condition based monitoring and response.

2.3 Verification scope and reporting


2.3.1 General

2.3.1.1 The vessel shall at all times keep track of the status of the complete DP verification, consisting of new
and more traditional methods, approved per these requirements and the traditional DP requirements in the
DNV rules for ships. The system arranged for this is hereafter referred to as the DP verification program. See
also [2.3.1.2]. This should preferably take the form of a digital system for administrating DP related
verification activities and their results. Alternatively, new methods can be incorporated into traditional DP
survey schemes and FMEA test programs to form the required DP verification program.
Guidance note:
The traditional verification scheme for a DP system is typically based on annual and 5 yearly verification activities. It is therefore
expected that a DP verification program should cover the planned verification activities over a 5-year period.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

2.3.1.2 The DP verification program shall at all times keep updated records of the verification status of the
complete DP system, i.e. for each verification item, information on what, how and when verification has been
performed and the corresponding status/results. Test intervals and due dates shall also be identified.

2.3.1.3 New verification methods accepted per this document, and incorporated into traditional DP
classification survey schemes and FMEA test programs, guided by DNV-RU-SHIP Pt.7 Ch.1 Sec.6 [12], shall
also comply with the relevant parts of these rules.

2.3.1.4 The DP verification program may fully or partly replace, or be integrated with, the traditional
functional and failure test documentation (e.g. FMEA and FMEA trials documents). The complete DP
verification program shall be documented and demonstrated to the Society upon request. The following shall
be clearly defined:
— the verification scope(s)
— what parts of the scope are covered by which method(s)
— reporting requirements, including how to handle findings
— required qualification (competence) level of personnel to perform the verification activities.
All inspection, data collection and test activities shall have:
— detailed description to the level of detail necessary for a skilled person. Clear and precise language should
be used to allow non-domain experts to understand and avoid misunderstandings
— interval/date
— description of necessary preparations
— applicable documentation (test procedures, service manuals and drawings)
— information of checks and measurements to be recorded
— specification of expected results and acceptance criteria. When possible, clear pass/fail criteria should be
provided for each test.
All test and inspection results shall be satisfactorily documented. A test result history with all recorded results
shall be kept. In case the plan is in a more traditional document format this may be handled as test report
revisions.

2.3.1.5 The DP verification program shall have functionality for detailed reporting of the actual results from
each verification activity and test performed (commonly referred to as 'positive reporting'). The reporting
system shall have functionality for identifying which results/data have been submitted to the Society and
shall keep a record of which results have been accepted or not.
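The positive-reporting requirement above can be sketched as a small ledger. This is an assumption-laden illustration only: the class names, states and example results are invented for the sketch, and a real reporting system would of course be far richer.

```python
# Sketch only: a minimal 'positive reporting' ledger that stores the actual
# result of each verification activity and tracks whether the result has
# been submitted to, and accepted by, the Society. Names are assumptions.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ResultRecord:
    item: str
    result: str                      # actual recorded outcome, not just pass/fail
    submitted: bool = False
    accepted: Optional[bool] = None  # None until the Society has evaluated it

@dataclass
class ReportingLedger:
    records: list = field(default_factory=list)

    def submit(self, record: ResultRecord) -> None:
        """Mark a result as submitted to the Society and keep it on record."""
        record.submitted = True
        self.records.append(record)

    def pending_acceptance(self) -> list:
        """Items submitted but not yet accepted or rejected."""
        return [r.item for r in self.records if r.submitted and r.accepted is None]

ledger = ReportingLedger()
ledger.submit(ResultRecord("thruster 1 failure response", "stopped in 0.8 s"))
print(ledger.pending_acceptance())  # -> ['thruster 1 failure response']
```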

2.3.1.6 For each of the activities in the DP verification program there shall be a minimum frequency at which
the activity shall be performed in order to gather the verification data. In practice this may be arranged by
assigning due dates and/or validity periods to each activity. The execution of the verification activities may
be distributed over time as long as each activity is performed within its specified due date, e.g. according to
annual and 5-yearly renewal schemes. Inspection and testing shall normally not exceed 5-year intervals for
any verification/test item.
Guidance note:
The frequency of the test should be duly arranged with respect to risk. In case verification additional to the class minimum, in line
with international standards and guidelines, is requested, it must be noted that these typically also dictate test scope and intervals/
frequency.
Examples may be IMCA and MTS standards/guidelines, and IMO DP guidelines.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---
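The due-date arrangement in [2.3.1.6] can be sketched as follows. The function name and the 365-day year are assumptions for this illustration; the 5-year ceiling on test intervals comes from the paragraph above.

```python
# Sketch only: check a verification item against its assigned due date,
# enforcing the 5-year ceiling on test intervals stated in 2.3.1.6.
# A 365-day year is an assumption for this example.
from datetime import date, timedelta

MAX_INTERVAL = timedelta(days=5 * 365)  # inspection/testing shall not exceed 5 years

def is_overdue(last_performed: date, interval: timedelta, today: date) -> bool:
    """True if the activity has passed its due date."""
    effective = min(interval, MAX_INTERVAL)
    return today > last_performed + effective

# An annual item last performed in March 2019 is overdue by January 2021.
print(is_overdue(date(2019, 3, 1), timedelta(days=365), date(2021, 1, 1)))  # -> True
```

An interval longer than five years is clamped to `MAX_INTERVAL`, reflecting that the execution may be distributed over time but no item may exceed the 5-year interval.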

2.3.1.7 The scope of the DP verification program shall as a minimum be based on the DP vessel FMEA and
the relevant class rules. The test plans shall comply with all relevant requirements for DP FMEA and the
traditional DP FMEA test programs for the relevant DP class notation.
Guidance note:
When designing a DP verification program based upon new verification methods, the focus should not be on replicating the
traditional scope. Even though it is expected that significant parts of the traditional scope should be kept, the new verification
scope should be based on system analysis (including the vessel DP FMEA) and the capabilities provided by the new available
verification methods. The goal should be to provide a more efficient and better verification, potentially with better coverage, in
order to provide increased confidence in the robustness of the DP system.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

2.3.1.8 The DP verification program (and vessel DP FMEA) shall be kept up to date in order to reflect any
changes made to the DP system or DP system technical operating modes.

2.3.1.9 It is accepted that the DP verification program may involve activities which are outside the scope of
class. Based on owner's request and specific agreements the Society may assist in assessment also of these
parts.
Guidance note:
This applies both to scope and test frequency and may typically be ensuring a scope in compliance with the test requirements in
the IMO DP guidelines, enabling the Society to issue an FSVAD or DPVAD on behalf of the flag state, when accredited by the flag
state to do so. Other industry standards, e.g. IMCA M190, may also be applied.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

2.3.1.10 It should be clearly indicated in the DP verification program and in the test reporting which parts of
the program shall be, and have been, verified by the Society. The program and the reporting systems shall
also be organized in such a way that they are efficient in use.
Guidance note:
E.g. in case the reporting to the Society is automatic, the program must only request class verification for items that are requested
to be part of the class scope. It must also be easy for the class surveyor to access and identify which items are requested to
be verified.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

2.3.1.11 For vessels carrying a DP class notation which requires the DP system to have redundancy the DP
FMEA and the DP FMEA test program shall comply with the requirements for the relevant DP class notation
and cover also the DDV verification system (hardware and software) as relevant. See DNV-RU-SHIP Pt.6 Ch.3
Sec.1 or DNV-RU-SHIP Pt.6 Ch.3 Sec.2 as for the relevant DP class notation. The DP FMEA test program may
also incorporate the DDV verification activities. However, see also Sec.2 [1.1.2]. In case the DDV method is
agreed to replace the traditional methods in an FMEA test program, the program shall be updated to reflect
the use of new and traditional methods, including at which intervals/surveys the methods are intended to be
used. Alternatively, the DP FMEA test program may be replaced by a digital DP verification program. The DP
verification program shall likewise comply with the relevant FMEA test program requirements and incorporate
also the verification system as relevant, in the same way as described.

2.3.1.12 A complete DP verification program shall incorporate verification elements as required in Table 1
below:

Table 1 DP verification program main elements

Main element Additional description

Verification of physical condition This may be done, e.g. when the vessel is in port.

Verification of relevant safety related aspects (HSE issues)   This may be done, e.g. when the vessel is in port.

Protective systems Typically, on demand functionality in power generation and distribution systems.
Typically, part of failure response test programs (e.g. FMEA trial program).
Guidance note:
Generator and bus-tie protective devices are required tested as part of main
class renewal surveys. Test results proving compliance with the approved relay
coordination study should be included as part of this element.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

Normal system functionality Normal function of all systems. DP control, power and actuators.

Verification of back-up functionality Mode change systems, independent joystick, independent manual levers,
back-up DP systems, alternative DP systems, thruster emergency stop systems.
Typically, part of failure response test programs (e.g. FMEA trial program).

Single failure response testing E.g. PRSs, sensors and thruster systems. Typically, part of failure response test
programs (e.g. FMEA trial program).

Redundancy verification Degraded function and failure modes. Typically, part of failure response test
programs (e.g. FMEA trial program).

Separation verification This may be done e.g. when the vessel is in port.

Management of change DP system and DP verification program, including verification methods. This
may be done e.g. when the vessel is in port.

Owner defined elements The Society may upon special agreement perform verification of additional
elements when this is requested by the owner.

2.4 Additional requirements for use of digital survey applications on DP systems

2.4.1 For DSA intended for dynamic positioning systems the required surveyor verification interface in Sec.3
[1.2.12] shall include thruster and power system mimics in such a way that it is easy to understand the
system status in terms of DP redundancy groups. This documentation must have naming of equipment
consistent with the onboard equipment, test program and data collected.

2.4.2 The DSA shall also provide basic information on the DP system operation during testing in terms of
power system and thruster layout mimics, and in addition provide information on the redundancy design
intent and the intended DP technical system configuration of these systems. Mimics shall comply with the
vessel physical configuration when relevant.

2.4.3 The DP system performance in terms of station keeping (i.e. position and heading performance during
testing) shall be clearly indicated as part of the displayed body of evidence.

2.4.4 The DSA shall also provide easy access to the latest revision of the vessel DP FMEA for all users.

2.5 Requirements to DP fleet in service (FIS)


2.5.1 General requirements

2.5.1.1 A DP verification program shall be implemented on the vessel. The program shall organize and keep
track of all DP verification activities, including when, what and how to test, and shall keep a historic record of
all results. The program may be built up of smaller blocks but shall be organized so that it keeps track
of the complete DP system. When accepted by the Society the DP verification program may also cover
the requirement for the DDV specification program, i.e. a separate DDV specification document will not be
needed.
Guidance note:
A traditional DP FMEA and FMEA test program (or e.g. a DP DSA) may be the core of such a system, utilizing the new verification
functionality. It is anticipated that a digital DP verification program will be more effective compared to the traditional hard copy
FMEA analysis, test program and report format.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

Guidance note:
When a DDV class notation is applied to a DP vessel, the Society may accept that the traditional survey arrangement for dynamic
positioning system notations as described in DNV-RU-SHIP Pt.7 Ch.1 Sec.6 [12] is, partly or completely, replaced by arrangements
approved in accordance with this document.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

2.5.1.2 The DP verification program shall be kept up to date during the vessel lifetime. In case new methods
are introduced into the DP verification program or significant changes are made to the test methods and
tools these changes shall be approved in accordance with requirements in this document. The DP verification
program shall be updated accordingly. It shall be possible to identify whether changes have been introduced,
e.g. by revision control.

2.5.1.3 The Society shall on request be provided access to the DP verification program or otherwise be kept
updated on the status.

2.5.1.4 The 5-yearly complete DP system verification scope shall be witnessed on board the vessel by a
Society surveyor at least once within each 5-yearly renewal period. See also [2.5.5] and [2.5.6]. The trial(s)
shall be performed based on an agreed verification program, which may consist of a combination of different
verification methods. For example, both new and more traditional verification methods may be used in
combination to execute the testing, and to gather the body of evidence, while being witnessed by the Society
surveyor.

2.5.1.5 It is accepted that the DP verification program involves activities which are outside the scope of the
Society. Based on owner’s request and specific agreements the Society may assist in assessment also of this
scope.

2.5.2 DP verification program

2.5.2.1 The vessel specific DP verification program shall at all times be kept up to date to cover any changes
made to the DP system, to the verification methods or to the verification equipment. For requirements to the
DP verification program, see Sec.2 [1.2].

2.5.3 Verification of alterations to the DP system

2.5.3.1 The rules in DNV-RU-SHIP Pt.7 Ch.1 Sec.6 [12.4] apply: the owner shall advise the Society of
major alterations to the DP system hardware or software. The owner may, however, assign the task of
advising the Society to a responsible body representing the owner, e.g. a manufacturer. The Society will
consider the need for documentation, re-survey or test.
Guidance note:
In addition to renewal of the DP controller hardware or software, a major alteration might also be:

— installation of a new position reference system or other sensor interfaced to the DP control system
— changes to the thruster system
— software changes
— structural changes
— changes in power system.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

2.5.3.2 Verification of alterations, which in a traditional survey scheme as described in DNV-RU-SHIP Pt.7
Ch.1 Sec.6 [12] would require survey or testing witnessed by the Society, may be based upon test methods
approved according to this document, i.e. when these methods can provide the genuine information, as
required per these rules, for the Society to base its objective evaluations on.

Guidance note 1:
Verification methods for alterations requiring testing to be performed may be based upon, e.g.:

— the required body of evidence gathered by use of approved methods for automatic, remote or manual operator performed tests
— remote witnessing.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

Guidance note 2:
Examples of alterations where witnessing of trials potentially could be replaced by methods mentioned are:

— installation of new reference systems
— installation of INS solutions
— DP control system upgrades, with limited changes in DP control system application functionality.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

2.5.4 Verification to be performed in between renewal trials

2.5.4.1 Vessels operating their annual DP verification scheme according to these requirements need not
perform the traditional 'one session' annual DP surveys and trials. Typically, the body of evidence may
be gathered incrementally, within approved due dates, by use of data-driven methods. These may e.g. be
automatic and/or administrated by the owner. The verification activities described in this section shall as a
minimum provide the same verification level per year as the traditional annual DP surveys.
Guidance note 1:
The scope of the annual survey shall as a minimum be the class annual scope. It is accepted that the DP verification program
involves activities which are outside the scope of the Society. Based on the owner's request and specific agreements, the Society
may also assist in verifying these parts, e.g. according to the requirements for annual testing in IMO MSC/Circ. 645 or IMO
MSC.1/Circ. 1580.
Based on the owner's request and specific agreements, the Society may assist in doing verification at a higher frequency than
stipulated by the classification rules. This can be used to agree on a tailored annual verification scope based upon the owner's
needs and market requests.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

Guidance note 2:
Examples of verification to be performed in between renewal trials may be based upon, e.g.:

— data gathered by use of approved automatic, remote or manual operator performed test programs, including digital survey
applications (DSA)
— analysis of data gathered from normal operation
— analysis of data gathered from incidents and unexpected system behaviour.
The sum of the verification may be used to verify that the DP system is healthy to operate and responds in a robust manner to the
documented incidents and/or test scenarios.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

Guidance note 3:
Even if data gathered from real incidents and unexpected system behaviour can be very useful for verifying robust system
behaviour, it must be expected that such data will be limited. Hence, data gathered from operations and automatic/manual
testing will be needed in order to have a sufficient basis for the verification.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

2.5.4.2 When the vessel has qualifier A attached to the DP class notation (e.g. DYNPOS(AUTR, A)), the
yearly class scope shall comprise a complete verification of the DP system requirements, including visual
inspection, functional, failure, redundancy and separation requirements, in the same way as the 5-yearly
renewal trials. See DNV-RU-SHIP Pt.7 Ch.1 Sec.6 [12.3.1].

2.5.4.3 A yearly onboard visit shall be performed by a DNV qualified surveyor. During this visit, a visual
inspection of all DP related equipment and the DP related verification equipment shall be performed.

2.5.4.4 The DP verification program shall have functionality for detailed reporting of the actual results from
each verification activity and testing performed. The reporting system shall have functionality for identifying
which results/data have been submitted to the Society and shall keep record of which results have been
accepted or not accepted.
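The reporting functionality described above amounts to tracking each result through a small lifecycle (recorded, submitted, reviewed). A minimal sketch follows; the class names, fields and test identifiers are hypothetical and not part of this guideline:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TestResult:
    """Detailed result from one verification activity (illustrative)."""
    test_id: str
    outcome: str                      # e.g. "pass" / "fail"
    submitted_to_society: bool = False
    accepted: Optional[bool] = None   # None until the Society has reviewed it

class VerificationReport:
    def __init__(self) -> None:
        self._results: dict[str, TestResult] = {}

    def record(self, result: TestResult) -> None:
        self._results[result.test_id] = result

    def submit(self, test_id: str) -> None:
        self._results[test_id].submitted_to_society = True

    def review(self, test_id: str, accepted: bool) -> None:
        self._results[test_id].accepted = accepted

    def pending_review(self) -> list[str]:
        """Submitted results the Society has not yet accepted or rejected."""
        return [t for t, r in self._results.items()
                if r.submitted_to_society and r.accepted is None]

report = VerificationReport()
report.record(TestResult("DP-014", "pass"))
report.submit("DP-014")
print(report.pending_review())  # ['DP-014']
```

Keeping the submitted/accepted flags per result is what allows the program to show, at any time, which data the Society has seen and how each item was dispositioned.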

2.5.4.5 The DP verification program status shall be presented to the Society during annual survey if
requested.

2.5.5 Renewal trials (5-yearly)

2.5.5.1 A 5-yearly complete DP system verification trial shall be witnessed onboard by the Society as per
the relevant classification rules for the given notation. The trial shall be performed based on an agreed
verification program, which may consist of a combination of new data-driven and more traditional verification
methods.
Acceptance of methods for information gathering without direct onboard witnessing also for the 5-yearly
renewal trial requires special agreement. A prerequisite will be documented, demonstrated and validated
effectiveness of the new method(s).
Guidance note:
When traditional witnessed annual trials are replaced by methods approved according to these requirements which do not require
direct onboard witnessing, these methods will not be accepted for replacing the requirement for onboard witnessing of the
complete 5-yearly renewal trials.
New verification methods may typically be accepted without witnessing for:

— annual surveys/trials
— survey of upgrades which under the traditional classification regime would require witnessing (e.g. installation of new DP position
reference systems).

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

2.5.5.2 The 5-yearly trials, or specified parts of them, may be required to be performed by traditional
methods, e.g. in combination with the new methods, for verification and validation of the effectiveness of the
new methods.

2.5.5.3 A thorough visual inspection of all DP related equipment shall be performed.


Guidance note:
Parts of this will typically be covered by other class surveys, e.g. main class electrical, propulsion/thruster and machinery surveys.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

2.5.6 Distributed renewal trials (5-yearly)

2.5.6.1 The 5-yearly renewal trials may be distributed and performed incrementally over the five-year period,
e.g. in combination with a traditional annual survey or as separate distributed trial sessions, provided that
the DP verification and reporting system is designed to support such a scheme. The DP verification and
reporting system is subject to approval.
Guidance note:
When such a scheme is designed, it is anticipated that the relevant DP scope will be reviewed and distributed over the five-year
period based upon the need for verification and from a risk perspective. Dividing a traditional complete scope into five (5) equal
parts, without any more detailed analysis of the needed frequency and test method for each test item, may not be considered
sufficient.

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---

2.5.6.2 The DP verification and reporting system shall preferably be in a digital format, and shall keep track
of:
— the total scope

— a detailed plan for when, in the five-year cycle, each test shall be performed; alternatively, due dates for
each test may be implemented, to ensure that the frequency is within the requirements for each test item
— when each item has been tested and the accumulated results, including close-out of findings, from each
test.

2.5.6.3 It shall at any time be easy to generate an overview of the verification status of the complete test
scope, i.e. when each item was last verified, whether it is still within due dates, and the results of the latest
activity related to the specific tests, including potential findings, their status and how they have been
handled.

2.5.6.4 The requirements for witnessing shall follow the same requirements as when the renewal tests are
performed in one session.

2.6 DP surveys performed by use of remote witnessing


2.6.1 For surveys specified in this part, the stated requirements are in addition to the general requirements
valid for all applications.

2.7 General
2.7.1 For remote witnessing of DP system annual and renewal surveys, the application shall in general be
based on REW(HMI2) solutions. For testing and inspections with a specified and relatively small scope and
duration, e.g. occasional surveys and deletion of conditions of class, the remote testing may be based on
REW when agreed with the Society. See DNV-RU-SHIP Pt.6 Ch.11 Sec.3.

2.7.2 For remote verification of non-redundant DP notations, as a minimum, the following equipment shall
be arranged for in the remote location:
— Minimum one DP verification operator station, which shall present a remote presentation of the DP control
system operator screen(s).
— Minimum one vessel control and monitoring system verification operator station, which shall present a
remote presentation of the vessel control and monitoring system screen(s).

2.7.3 For remote verification of redundant DP notations, as a minimum, the following equipment shall be
arranged for in the remote location:
— Minimum two DP verification operator stations, which shall present a remote presentation of the DP
control system operator screens.
— For DPS(3) and enhanced reliability DYNPOS(E/ER) notations these DP verification operator stations
shall also be able to present a remote presentation of the back-up and alternative DP control systems as
relevant.
— Minimum two vessel control and monitoring system verification operator stations, which shall present a
remote presentation of the vessel control and monitoring system screens.
Guidance note:
Alternative arrangements of the DP remote verification centre may be accepted on a case by case basis, provided that the
arrangements can support the needs for doing efficient verification with the required quality. This should be considered in relation
to the scope, types of tests and purpose with the testing. This guidance note applies also to [2.7.2].

---e-n-d---o-f---g-u-i-d-a-n-c-e---n-o-t-e---
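The station minimums in [2.7.2] and [2.7.3] can be summarised in a small table-driven check. This is a sketch for illustration; the function and mapping names are hypothetical, and alternative arrangements may in any case be accepted on a case by case basis per the guidance note above:

```python
# Minimum remote-location operator stations; the values paraphrase
# [2.7.2]/[2.7.3], while the code structure itself is illustrative.
MIN_STATIONS = {
    # notation class: (DP verification stations, vessel control/monitoring stations)
    "non-redundant": (1, 1),
    "redundant": (2, 2),
}

def meets_minimum(notation_class: str, dp_stations: int, vcm_stations: int) -> bool:
    """Check a proposed remote verification centre against the minimums.

    Does not cover the additional back-up/alternative DP control system
    presentation required for DPS(3) and DYNPOS(E/ER) notations.
    """
    min_dp, min_vcm = MIN_STATIONS[notation_class]
    return dp_stations >= min_dp and vcm_stations >= min_vcm

print(meets_minimum("redundant", dp_stations=2, vcm_stations=2))  # True
print(meets_minimum("redundant", dp_stations=1, vcm_stations=2))  # False
```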

CHANGES – HISTORIC
November 2020 edition
This is a new document.

About DNV
DNV is the independent expert in risk management and assurance, operating in more than 100
countries. Through its broad experience and deep expertise DNV advances safety and sustainable
performance, sets industry benchmarks, and inspires and invents solutions.

Whether assessing a new ship design, optimizing the performance of a wind farm, analyzing sensor
data from a gas pipeline or certifying a food company’s supply chain, DNV enables its customers and
their stakeholders to make critical decisions with confidence.

Driven by its purpose, to safeguard life, property, and the environment, DNV helps tackle the
challenges and global transformations facing its customers and the world today and is a trusted
voice for many of the world’s most successful and forward-thinking companies.

WHEN TRUST MATTERS
