
NATIONAL LAW INSTITUTE UNIVERSITY,

BHOPAL

READING MATERIAL

DATA PRIVACY TECHNOLOGY AND LAW

Course Teacher:
Dr. Pooja Kiyawat

B.SC.LLB (HONS.) [CYBER SECURITY]

SECOND SEMESTER

JAN 2025 – MAY 2025


Academic Session 2024-2025

TABLE OF CONTENTS

INTRODUCTION.................................................................................................................... 3

PRIVACY PRINCIPLES ...................................................................................................... 13

THE DATA LIFE CYCLE .................................................................................................... 21

DATA PROTECTION IMPACT ASSESSMENT ............................................................... 38

PRIVACY AND DATA PROTECTION FRAMEWORK .................................................. 55

DATA PROTECTION LAW IN EU ..................................................................................... 62

DATA PROTECTION LAWS IN USA ................................................................................ 78

DATA PROTECTION LAWS IN INDIA........................................................................... 111

COMPLIANCE WITH DATA PRIVACY FRAMEWORK IN INDIA .......................... 154

PRIVACY GOVERNANCE ................................................................................................ 163

PRIVACY POLICIES .......................................................................................................... 189

TRAINING AND AWARENESS ........................................................................................ 205

DATA BREACH INCIDENT PLAN .................................................................................. 218

INTRODUCTION

What is Data?

Data is a collection of raw, unorganised facts and details, such as text, observations, figures, symbols and descriptions of things. In other words, data does not carry any specific purpose and has no significance by itself. Moreover, data is measured in bits and bytes, which are the basic units of information in the context of computer storage and processing.

Data is defined as a collection of individual facts or statistics. (While “datum” is technically the singular form of “data,” it’s not commonly used in everyday language.) Data can come in the form of text, observations, figures, images, numbers, graphs, or symbols. For example, data might include individual prices, weights, addresses, ages, names, temperatures, dates, or distances.

Data is a raw form of knowledge and, on its own, doesn’t carry any significance or purpose. In
other words, you have to interpret data for it to have meaning. Data can be simple—and may
even seem useless until it is analyzed, organized, and interpreted.

There are two main types of data:

• Quantitative data is provided in numerical form, like the weight, volume, or cost of
an item.
• Qualitative data is descriptive, but non-numerical, like the name, sex, or eye color of
a person.

What is Information?

Information is processed, organised and structured data. It provides context for data and enables
decision making. For example, a single customer’s sale at a restaurant is data – this becomes
information when the business is able to identify the most popular or least popular dish.

Information is defined as knowledge gained through study, communication, research, or instruction. Essentially, information is the result of analyzing and interpreting pieces of data. Whereas data is the individual figures, numbers, or graphs, information is the perception of those pieces of knowledge.

For example, a set of data could include temperature readings in a location over several years.
Without any additional context, those temperatures have no meaning. However, when you
analyze and organize that information, you could determine seasonal temperature patterns or
even broader climate trends. Only when the data is organized and compiled in a useful way can
it provide information that is beneficial to others.

The Key Differences Between Data vs Information

• Data is a collection of facts, while information puts those facts into context.
• While data is raw and unorganized, information is organized.
• Data points are individual and sometimes unrelated. Information maps out that data to
provide a big-picture view of how it all fits together.
• Data, on its own, is meaningless. When it’s analyzed and interpreted, it becomes
meaningful information.
• Data does not depend on information; however, information depends on data.
• Data typically comes in the form of graphs, numbers, figures, or statistics. Information
is typically presented through words, language, thoughts, and ideas.
• Data isn’t sufficient for decision-making, but you can make decisions based on
information.

Examples of Data vs Information

To further explore the differences between data and information, consider these examples of
how to turn data into insights:

• At a restaurant, a single customer’s bill amount is data. However, when the restaurant
owners collect and interpret multiple bills over a range of time, they can produce
valuable information, such as what menu items are most popular and whether the prices
are sufficient to cover supplies, overhead, and wages.
• A customer’s response to an individual customer service survey is a point of data. But
when you compile that customer’s responses over time—and, on a grander scheme,
multiple customers’ responses over time—you can develop insights around areas for
improvement within your customer service team.
• The number of likes on a social media post is a single element of data. When that’s
combined with other social media engagement statistics, like followers, comments, and shares, a company can intuit which social media platforms perform the best and which
platforms they should focus on to more effectively engage their audience.
• On their own, inventory levels are data. However, when companies analyze and
interpret that data over a range of time, they can pinpoint supply chain issues and
enhance the efficiency of their systems.
• Competitors’ prices are individual data elements, but processing that data can reveal
where competitors have an advantage, where there may be gaps in the market, and how
a company can rise above its competition.

IT Act, 2000 Section 2 (1)

(o) "data" means a representation of information, knowledge, facts, concepts or instructions which are being prepared or have been prepared in a formalised manner, and is intended to be processed, is being processed or has been processed in a computer system or computer network, and may be in any form (including computer printouts, magnetic or optical storage media, punched cards, punched tapes) or stored internally in the memory of the computer;

(v) "information" includes data, message, text, images, sound, voice, codes, computer programmes, software and data bases or micro film or computer generated micro fiche;

IT SPDI Rules, 2011, Rule 2(1)(i)

"Personal information" means any information that relates to a natural person, which, either
directly or indirectly, in combination with other information available or likely to be
available with a body corporate, is capable of identifying such person.

GDPR Article 4

‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;

Section 43A Explanation

(iii) "sensitive personal data or information" means such personal information as may be
prescribed by the Central Government in consultation with such professional bodies or
associations as it may deem fit.]

SPDI Rules, 2011, Rule 3

Rule 3 lists the categories of sensitive personal data or information: passwords; financial information such as bank account, credit card, debit card or other payment instrument details; physical, physiological and mental health condition; sexual orientation; medical records and history; and biometric information. Information that is freely available in the public domain or furnished under the Right to Information Act, 2005 is not regarded as sensitive personal data or information.

Data vs Information

• Data is unorganised and unrefined facts; information comprises processed, organised data presented in a meaningful context.
• Data is an individual unit that contains raw materials which do not carry any specific meaning; information is a group of data that collectively carries a logical meaning.
• Data does not depend on information; information depends on data.
• Raw data alone is insufficient for decision making; information is sufficient for decision making.
• An example of data is a student’s test score; the average score of a class is information derived from the given data.

What is personal data?

Personal data, sometimes referred to as personal information, is generally defined as, any
information relating to an identified or identifiable living individual. This very broad definition
includes many data elements that are specifically identifiable, such as name; government-
issued identifiers, such as national ID and passport number; DNA profile information;
photographs; geolocation; and biometrics. It also includes data that may be identifiable if
combined with other information, such as date of birth, ZIP code, gender, cookies and other
digital objects, and internet browsing history. Personal data comprises important information
about a person, too, such as medical and genetic information, educational history, political
preferences and more.

Personal data defined in laws and regulations

In laws and regulations, personal data has been referred to and defined in various other ways.
For example, the term personal information (PI) has been defined in the California Consumer
Privacy Act (CCPA) as “information that identifies, relates to, describes, is capable of being
associated with, or could reasonably be linked, directly or indirectly, with a particular consumer
or household.” The term personally identifiable information (PII) is defined by the U.S. Office
of Management and Budget’s revised Circular A-130, “Managing Information as a Strategic
Resource,” as “information that can be used to distinguish or trace an individual’s identity,
either alone or when combined with other information that is linked or linkable to a specific
individual.” The terms personal data, personal information, and personally identifiable
information are often used interchangeably, although, depending on the law or jurisdiction,
they may have slightly different meanings. This course will use the term “personal data.”

Many organizations argue they protect privacy through the use of aggregate, de-identified or
anonymous data. However, do their users understand what the terms mean? What is aggregate
data? Is there a difference between de-identified and anonymous data? For researchers,
which data sets have more value: aggregate or anonymous?

Users often agree to personal data sharing with de-identification, without grasping the details.

If you’ve ever wondered what’s going on, wonder no more. Here’s your guide to data de-
identification, aggregation, and the different levels of anonymity.

Aggregate data: to combine and summarize

So, what is aggregate data? Aggregation refers to a data mining process popular in statistics.
Information is only viewable in groups and as part of a summary, not per the individual. When
data scientists rely on aggregate data, they cannot access the raw information.
Instead, aggregate data collects, combines and communicates details as totals or summaries. Many popular statistics and database languages allow for aggregate functions, with
tutorials available for R, SQL and Python.
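
As an illustration only, the short Python sketch below uses made-up survey rows and field names to show how an aggregate function reports group totals without exposing any individual response.

from collections import Counter

# Hypothetical raw survey rows - in a real system these never leave the analysis stage.
responses = [
    {"age_group": "18-24", "brand": "Ours"},
    {"age_group": "18-24", "brand": "Competitor"},
    {"age_group": "25-34", "brand": "Ours"},
    {"age_group": "25-34", "brand": "Ours"},
]

# Aggregate: report only counts per (age group, brand), never individual rows.
totals = Counter((r["age_group"], r["brand"]) for r in responses)

for (age_group, brand), count in sorted(totals.items()):
    print(f"{age_group}: {count} respondent(s) prefer {brand}")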

Consider the following: a marketing company runs a survey to see if people prefer their
company’s brand, or their competitors’. When they present the data to management, it is in
aggregate form: showing which brand is the most popular. They might include additional information on the groups they talked to, such as voting preference by age or location. With
aggregate information, we can get details on what brands are popular by age or in certain
regions, but the exact details on how individuals voted are never revealed.

Can aggregation protect privacy?

As data aggregation only displays information in groups, many consider it a safeguard to protect personal information. After all, you cannot compromise privacy if the data only shows the results for groups of individuals, right?

Sadly, it is not so easy; with the right analysis, aggregate information can reveal significant personal details. What if you ask a blog’s aggregate analytics data how many visitors come from Ireland and view the blog on a smartphone? What if you ask for the number of visitors from Ireland who use a smartphone in one day? Or visitors from Ireland who use a smartphone and clicked on an Amazon ad for menswear on a single day? By applying multiple, specific filters, it might be possible to single out an individual, intentionally or not. Aggregation can protect privacy, but there is no guarantee that it always does.
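
To make the re-identification risk concrete, here is a small, hypothetical Python sketch: stacking filters on an “aggregate” count can shrink the reported group to a single visitor. The visit-log fields are invented for illustration.

# Hypothetical visit log. Each added filter shrinks the reported group;
# a count of 1 effectively singles out one visitor even though only an
# aggregate number is released.
visits = [
    {"country": "Ireland", "device": "smartphone", "clicked_ad": "menswear"},
    {"country": "Ireland", "device": "desktop",    "clicked_ad": None},
    {"country": "France",  "device": "smartphone", "clicked_ad": None},
]

def aggregate_count(rows, **filters):
    return sum(all(row.get(k) == v for k, v in filters.items()) for row in rows)

print(aggregate_count(visits, country="Ireland"))                      # 2
print(aggregate_count(visits, country="Ireland", device="smartphone")) # 1
print(aggregate_count(visits, country="Ireland", device="smartphone",
                      clicked_ad="menswear"))                          # 1 - identifiable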

For organizations that use data aggregation, Ed Felten of the FTC has a warning: aggregate
data can be useful, but it doesn’t guarantee privacy.

“The simple argument that it’s aggregate data, therefore safe to release, is not by itself
sufficient.”

De-identification: removing personal details

De-identification is a process that removes personal details from a data set. This approach
aims to protect privacy while still providing comprehensive data for analytics. Some of the data
is better at identifying individuals than others. We are easy to identify when the data includes
our name, address, email, birth date or other unique factors. With de-identification, we remove
those unique identifiers from the raw data.

A retail store that uses de-identification may track individual purchases, dates and store locations, but remove the names and addresses. While “Susan Smith from 75 Clark Drive in Great Falls, Montana shops for engineering books”, the store’s database records her as a “user of the Montana location who buys engineering books”. De-identification takes out Susan’s name and identifiers so that her purchase could come from anyone.
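
A minimal Python sketch of de-identification along the lines of the retail example above; the list of direct identifiers is illustrative, and real projects must also weigh quasi-identifiers such as dates and locations.

# Hypothetical purchase record; the identifier list is illustrative only.
DIRECT_IDENTIFIERS = {"name", "address", "email", "birth_date"}

def deidentify(record):
    # Keep every field except the direct identifiers.
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

purchase = {
    "name": "Susan Smith",
    "address": "75 Clark Drive, Great Falls, Montana",
    "store_location": "Montana",
    "category": "engineering books",
}

print(deidentify(purchase))
# {'store_location': 'Montana', 'category': 'engineering books'}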

De-identification is a particularly popular privacy safeguard with clinics and organizations that
process health information. The Health Insurance Portability and Accountability Act (HIPAA)
addresses de-identification under section 164.514. According to HIPAA, information is de-identified when

“there is no reasonable basis the information can be used to identify an individual”.

HIPAA permits some allowances for de-identified data, such as disclosures for research or to
public officials.

From de-identified to re-identified: it might not take much.

Unfortunately for organizations who might hope to use de-identification as a safeguard, many
now see it as poor protection. People can be identifiable by more than names and numbers,
thanks to detailed data sets. If a data subject’s job is ‘Mayor’ and the raw data includes city, it
doesn’t take much to figure out who’s who.

A widely cited case highlighting the flaws of de-identification came in 2006 with Netflix. Per Robert Lemos with SecurityFocus, in a contest to improve the company’s algorithm, Netflix released a data set covering 2 million subscribers. The company de-identified the data set by removing user names. Yet, to their surprise, researchers from the University of Texas at Austin were able to identify users. They did so by using the data available and filling in the blanks from other sources: combining user ratings with a public database of movie scores. Needless to say, according to Epic.org, Netflix cancelled the contest.

De-identification is also flawed because there is no universal agreement over what information is personally identifiable. Is the data de-identified if IP addresses remain? What about dates of birth? Standards exist, including HIPAA’s Safe Harbour, but are they enough? According to Privacy Analytics, part of the IQVIA group of companies, Safe Harbour “does not actually ensure that the risk of re-identification is low except in very limited circumstances.” That is bad news for health organizations that rely on it, since, per HIPAA § 164.514(b)(2)(ii), allowances for de-identified data are only acceptable if there is no evidence the data can be re-identified. Studies over the past ten years, including Risks to Patient Privacy: A Re-identification of Patients in Maine and Vermont Statewide Hospital Data, mean new standards are needed.

What about coded data? Tokenization?

Coded data and tokenization are solid ways to protect sensitive data. For coded data, all
sensitive information is stripped out and replaced with code words, numbers, or unique
identifiers. The codes map to another database or document that works as a key. Information
is re-identified by matching the code with its corresponding sensitive data.

In tokenization, we automate the process, replacing sensitive data with a reference variable.
The token maps to a more secure database that holds the sensitive information. When
processing information, the system analyzes tokens against records in the secure database. If it
finds the token’s corresponding match, processing continues using the sensitive data.
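
A toy Python sketch of tokenization under the description above; the in-memory “vault” stands in for the separate, more secure database, and the function names are invented for illustration.

import secrets

# The "vault" stands in for the separate, more secure database that maps
# tokens back to the sensitive values.
vault = {}

def tokenize(sensitive_value):
    token = secrets.token_hex(8)          # opaque reference variable
    vault[token] = sensitive_value
    return token

def detokenize(token):
    # Re-identification only happens with authorised access to the vault.
    return vault[token]

sale = {"amount": 49.99, "card_number": tokenize("4111 1111 1111 1111")}
print(sale)                               # the card number appears only as a token
print(detokenize(sale["card_number"]))    # recovered solely via the vault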

Coded data and tokens protect information security. They are efficient because they only
hide sensitive data. If an analyst wishes to process the data without referencing personal details,
they can. Likewise, data sets that use code identifiers or tokens are safer against theft. If the
data is compromised, sensitive data remains concealed. For example, an attacker that steals
data on credit card sales cannot see the card numbers if tokens are in use.

Be aware, however, that while tokens, coded data and unique identifiers offer better security, they do not make data anonymous. Data that uses tokens or code identifiers is still subject to privacy regulations. Privacy laws are not solely concerned with data breaches and access. Privacy legislation works to minimize the potential misuse of personal data. So long as the data can, with authorization, be re-identified, privacy agreements must be in place.

Anonymous data: we can’t tell who you are… or can we?

Anonymous data refers to information from which it is impossible to identify individuals. Truly anonymous data sets are a privacy enthusiast’s dream. The ability to collect, store, and analyze data without the capability of recognizing individuals makes it an ideal safeguard. For organizations that manage to keep their data anonymous, the benefits are huge. Anonymous data is easier to sell, process, analyze and retain, as it requires fewer safeguards for protection.

Fewer rules apply: anonymous data is often exempt from privacy legislation, including the EU’s General Data Protection Regulation. According to the GDPR, information “which does
not relate to an identified or identifiable natural person or to personal data rendered anonymous
in such a manner that the data subject is not or no longer identifiable” is not subject to privacy
requirements.

How do you make data anonymous? Most techniques fall into one of three categories:
cryptographic, generalization (also known as recoding), and randomization.

Cryptographic methods encrypt the information in storage, making the data anonymous until
decrypted for use. This protects the data but means re-identification can happen when the data
is decrypted for processing.
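
As a sketch of the cryptographic approach, the example below assumes the third-party Python cryptography package (an assumption of this material, not something the text prescribes): data is unreadable at rest and only becomes identifiable again when decrypted for processing.

# Requires the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # the key itself must be stored and managed securely
f = Fernet(key)

stored = f.encrypt(b"Susan Smith, 75 Clark Drive")   # what sits in storage
print(stored)                   # ciphertext: not identifiable without the key

record = f.decrypt(stored)      # re-identification becomes possible at processing time
print(record)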

Generalization techniques borrow from data aggregation and de-identification to deliberately remove identifiers and reduce the precision of data. Under generalization, for example, an individual’s height or weight becomes a range instead of the exact number.
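
A minimal Python sketch of generalization (recoding): an exact height or weight is replaced by a range. The bucket width is an arbitrary choice.

# Replace a precise value with a coarse range; the bucket width is arbitrary.
def generalize(value, bucket=10):
    low = int(value // bucket) * bucket
    return f"{low}-{low + bucket - 1}"

print(generalize(172))   # height 172 cm -> "170-179"
print(generalize(68))    # weight 68 kg  -> "60-69"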

Randomization skews the results by adding data and moving elements around so that re-
identification results are full of errors. The Finnish Social Science Data Archive’s Data
Management Guidelines provide in-depth explanations on techniques for anonymizing
qualitative and quantitative data.
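
And a minimal Python sketch of randomization: random noise is added to each value so that anyone attempting re-identification works from distorted figures. The noise scale is illustrative.

import random

# Perturb each value with random noise so any re-identification attempt
# works from distorted figures; the noise scale is illustrative.
def randomize(values, scale=2.0):
    return [round(v + random.gauss(0, scale), 1) for v in values]

ages = [34, 41, 29, 57]
print(randomize(ages))   # e.g. [35.1, 39.6, 30.2, 56.8]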

Why we may need to give up the idea of anonymous data altogether

Unfortunately, truly anonymous personal data may no longer be an option. The
ingenuity that can be used to re-identify individuals is utterly astounding. Writing for The
Guardian, Olivia Solon lists examples of using paparazzi shots and nameless taxi logs to
establish celebrity bad tippers. Cory Doctorow writes for BoingBoing.net that journalist Svea
Eckert and data scientist Andreas Dewes identified a German MP’s medication regime through
data collected by browser plug-ins. In July 2019, New York Times journalist Gina
Kolata published evidence that scientists can re-identify ‘anonymized’ U.S. Census data.
Between advances in data science and an increasing trove of data to fill in the gaps, the concept
of anonymous data may become meaningless.

So if none of these techniques fully protect privacy, what do we do?

First, recognize that while aggregate, de-identified and anonymized data sets don’t protect
privacy completely, they do still offer some level of protection. If your data is aggregated, de-
identified or anonymized, there’s less chance of it being read by daily processors. Thankfully,
pulling personal information from this heavily processed data requires tools and skills which
are not available to every individual.

Second, be aware that if you see these phrases in privacy policies or terms of use, your personal information may still be accessible. A service that collects anonymous data can still be gathering
personal information. Companies that share aggregate or de-identified information are still
sharing personal details: what are your feelings on that?

If you operate a business that uses aggregation, de-identification or anonymization, recognize that these can’t be your only safeguards. You should still have other physical,
technical and administrative protection measures in place. A data breach of de-identified data
can still cost you, particularly if there’s evidence that personal details can be collected. Use
these techniques as a tool, but not the end-all of privacy and security programs.

PRIVACY PRINCIPLES
There are few more commonly shared aspects of data protection than the Fair Information
Practices (FIPs), also known by the term Fair Information Practice Principles (FIPPs).

What are the FIPs? The Fair Information Practices create a set of obligations and
responsibilities for the organization that is collecting personal data while also creating a set of
individual rights for the data subject. In this module, we will discuss the FIPs broadly: data
minimization or collection limitation, use limitation, safeguards or security, notice or openness,
access or individual participation and accountability.

History of the FIPs

As previously stated, the FIPs exist to help organizations understand how to consider the rights
of the data subjects while imposing reasonable responsibility and controls around the data life
cycle. The data life cycle is discussed in more depth in module 3. The FIPs were first developed
in the early 1970s, around the time that computers enabled organizations to collect large
amounts of personal data and to store and search it quickly and easily.

The FIPs themselves have been refined and restated over time but have remained remarkably
stable despite the changes in technological capabilities and the evolving ways in which
information is collected. They first appeared in 1973 in the U.S. Department of Health,
Education, and Welfare Advisory Committee’s Report, “Records, Computers, and the Rights
of Citizens” [HEW Report]. The most commonly used version of the FIPs is the 1980
Organisation for Economic Co-operation and Development (OECD) Guidelines on the
Protection of Privacy and Transborder Flows of Personal Data [OECD Guidelines]. There are
other important restatements and variations, including the 1974 U.S. Privacy Act, the 1981
Council of Europe Convention for the Protection of Individuals with regard to Automatic
Processing of Personal Data [Convention 108], the Asia-Pacific Economic Cooperation
[APEC], which in 2004 agreed to a privacy framework, and the 2009 Madrid Resolution—
International Standards on the Protection of Personal Data and Privacy.

FIPs categories

The FIPs were developed to help properly manage privacy and other information management
risks during the data life cycle. They can be organized into four groups or categories that can
be useful for conceptualization, and some principles may have aspects of more than one
category. FIPs by their very design are interrelated.

Some FIPs are specific to managing the data life cycle.

• Data minimization or collection limitation falls under this category. This principle means an
organization should collect the least amount of personal data it needs, should obtain it by lawful
and fair means and should maintain that data for as little time as possible.

• Another key principle is use limitation. The way an organization uses data should be limited
to those uses specified by the organization, unless a data subject has given consent for, or there
is a legal exception for, alternate uses.

• A closely related idea is the concept of purpose specification, in which an organization commits to disclose specific purposes for which it will use data, then only uses data for those compatible purposes.

Other principles are focused on data integrity and protection of the information itself.

• One such principle is data quality and relevance. This is the idea that personal data should be
relevant to the purposes for which it is to be used and should be accurate, complete and timely
to be fair to data subjects.

• Similarly, the safeguards or security principle helps establish that reasonable security
safeguards should protect personal data.

Several FIPs are concerned with data subject rights.

• One principle is notice or openness. It means that an organization should be clear and open
to the extent required by law about how it manages personal data and explains its practices and
policies regarding personal data.

• Another principle is access and individual participation. Access allows a person to understand
the data an organization has about them and to obtain, amend, correct or otherwise challenge
it. Individual participation means a person should consent to the extent possible about data
collection and use.

The final category is principles that relate to management-level controls.

The relevant principle or principles are typically known broadly as accountability. Accountability means that organizations should be accountable for complying with the principles and obligations in the other FIPs. This FIP typically includes activities such as universal and targeted training, control testing activities, auditing, investigations and other reviews and reports.

The FIPs are the basis for most data protection laws you will encounter. Aspects of them are
used in practice in nearly every country. Because of this, anyone who works with personal data,
regardless of their field, should understand the FIPs. They are a versatile tool for examining
any question involving the collection of personal data and can often be helpful in determining
the proper course of action.

Data minimization/Collection limitation

The data minimization or collection limitation principle means an organization should:

• Collect the least amount of personal data it needs

• Obtain it by lawful and fair means

• Only maintain that data for as little time as possible

Collection should be lawful and fair to the data subject, and, when appropriate, with the
person’s knowledge and consent.

This principle can be difficult to understand because it may seem counterintuitive.

Some may wonder, “Is it really a problem to be deceptive to get valuable information?”

Unfair and deceptive techniques may not only be illegal, but they also breach a sense of trust
with data subjects.

Similarly, others may ask, “Why should we get rid of that data? It may be valuable.” While
that may be true, as with many questions in data protection, it is often both a question of fairness
and risk.

Fairness: Older data may not be accurate as circumstances change.

Risk: The more personal data an organization has, the more difficult a breach would be to
mitigate. So, eliminating older data is generally a good step to reduce risk.

You may have experienced a time when an organization asked for data that felt unnecessary
for the task. This situation can be especially problematic if the requested data is sensitive or
has a higher risk of causing harm.

For example, does an organization need to collect a recipient’s mailing address or age to email
them a newsletter? Discussing this type of hypothetical situation can help stakeholders and
business partners understand the logic in seeking only the personal data the organization needs.

Use limitation

The use limitation principle means that how an organization says it will use data should limit
the way the organization uses that data unless a data subject has given consent for, or there is
a legal exception for, alternate uses.

How an organization shares, uses or otherwise makes personal data available should be
consistent with and limited to what it says in its notice, or as otherwise made clear by law,
unless the data subject consents to alternate uses.

A closely related idea is the concept of purpose specification, in which an organization commits
to disclose specific purposes for which it will use data, then only uses data in ways compatible
with those purposes.

In other words, an organization should explain to the data subject the purposes for which
personal data is being collected when collecting that data and then limit uses of the data to
those purposes or others that are compatible with those specified.

Much like purpose specification, data quality or relevance is often expressed as a separate
principle.

It is the idea that personal data should be relevant to the purposes for which it is to be used and,
to the extent necessary for those purposes, should be accurate, complete and timely to be fair
to data subjects.

Safeguards/Security

For purposes of data protection, the safeguards or security principle establishes that personal
data should be protected by reasonable physical, technical and administrative security
mechanisms.

Personal data should be protected against risks such as loss or unauthorized access, destruction,
use, modification or disclosure in ways that uphold its confidentiality, integrity and availability.
Notice/Openness

Also called transparency, the notice or openness principle means that an organization should
be clear and open to the extent required by law about how it manages personal data and explain
its practices and policies with respect to personal data.

This principle has to do with the power imbalance between an individual and an organization.

Because organizations can collect information from multiple sources or through multiple
methods, including passively without the data subject’s awareness, for fundamental fairness,
an individual should be able to learn and understand what personal data an organization has
and what the organization plans to do with it.

The data subject should have a way to establish:

• The existence and nature of personal data that an organization maintains

• The purposes for which it is used

• Whom to contact within the organization for questions about how the data is managed

At its core, processing activities should not be a secret from those whose data is being
processed.

Access/Individual participation

The access or individual participation principle allows a person to understand the data an
organization has about that person and how that data will be used. In other words, the person
should have the ability to know that data exists and should be able to obtain access to the data,
amend it, correct it or otherwise challenge it. It also means, as the name suggests, that a person
should have the opportunity, to the extent possible, to consent in advance to the collection and
use of their data.

The principle is based on the notion that the organization should involve the individual in the
process of collecting and using personal data. It should seek to obtain the data subject’s consent
for collection, use, sharing, retention and maintenance of this sort of data. It should not be a
secret.

The access or individual participation principle provides some of the most important data
subject rights, and it can be a powerful tool in helping to ensure that an individual’s data is
accurate.

Many laws provide data subjects with the right to file requests and an appeal, if needed, to
organizations to obtain a copy of their data and to file requests to amend or correct it as well as
delete it. These laws often contain specific, enforceable timelines for response. Similarly, data
should be provided to the data subject in a clear and readable format. It is customary to limit or waive fees and to prohibit any adverse action against those who ask for this type of redress.

Accountability

The accountability principle means organizations are responsible for complying with the
principles and obligations in the other FIPs.

Accountability is not a difficult concept to state, but it can be difficult to enforce within a large
and diffuse organization.

• One key role is someone to establish the rules and oversee compliance. That person is often
called a privacy officer, chief privacy officer, data protection officer or something similar.

• Programs can establish employee accountability for failing to follow policies, such as
discipline and termination.

• Vendor accountability can mean termination of the business relationship

There are many ways to help enhance accountability. Some of the most common, and most
visible, are training programs. One of the most important parts of a data protection program is
ensuring employees, vendors and other business partners are appropriately trained on relevant
data protection standards.

While general training helps ensure everyone understands certain key concepts, role-based
training can help those in key roles have a deeper understanding of privacy obligations relevant
to their jobs.

Another key aspect of accountability is auditing. Using an independent review to determine how the organization is performing against its commitments is an excellent way to ensure your organization is being accountable. Audits can provide a view into where improvements can be made and provide an opportunity to make changes in advance of any regulatory enforcement.

How do organizations use the FIPs?

The FIPs provide a set of commonly recognized guidelines that can be used by any organization
to help it establish a framework to align with national, local and industry data protection
standards, build a compliance framework and create tools to help understand privacy failures.

FIPs inconsistencies

FIPs are not self-implementing or self-enforcing. While FIPs have often been incorporated into
law, the way legislatures and regulators do so is not always entirely consistent and can lead to
different results at the statutory or regulatory level. Individual data controllers do not always
behave in the same way, even within the same industry or same country or under the same legal
framework. Different data types, whether about health, biometrics, gender or gender identity,
political or union affiliation or other sensitive data can sometimes attract heightened
requirements. Types of data and business goals can be significant factors in how the FIPs are
incorporated, as can an organization’s general tolerance for risk.

Penalties and enforcement

Some laws provide for direct judicial redress or private rights of action for violations and
accountability failures. The U.S. Privacy Act of 1974 is an example of such a law. Other laws
provide for powerful third-party data protection authorities to act, such as the General Data
Protection Regulation. Other approaches may involve express criminal or civil penalties for
violations. In some cases, particularly in the United States, much of the private industry is self-
governed by various industry codes of conduct and commitments, subject to regulatory
enforcement. Closely examining each data protection law will reveal differences in approach
based on the country, the entities subject to the law, the type of data and other goals or factors.

How do organizations use the FIPs?

The most critical lesson is that there is no single “correct” approach to implementing the FIPs,
just different ones. It is most important to be internally consistent with any established policy or procedure that aligns to the FIPs, and to comply with all local and national laws that may
apply. Rarely is implementing the FIPs a mechanical exercise, but instead an analysis that
requires careful balance of the context and other factors.

THE DATA LIFE CYCLE

What is the data life cycle?

Organizations that process personal data must determine how best to safeguard that data and
how the data can and will be used throughout the data life cycle. The data life cycle involves
every stage of data processing—from the moment it is collected, throughout its use and storage
and until it is deleted. Understanding the data life cycle is essential to incorporating privacy by
design into an organization’s privacy strategy.

To properly initiate privacy by design and follow privacy engineering practices, it is necessary
to understand the stages of the data life cycle. The data life cycle that will be presented here
provides a generic, high-level overview of how data flows through an organization. Since
organizations have different levels of data use and types of technology used to process personal
data, the finer details of each stage will vary from organization to organization. Understanding
the general process, however, is critical to identify where the potential risks are when
processing personal data. Typically, privacy professionals identify five stages in the data life
cycle:

1. Collection

2. Use

3. Disclosure, also referred to as sharing or disseminating

4. Retention, also referred to as storing or storage, and

5. Destruction, also referred to as deletion

Despite being referred to as a life cycle, data rarely moves through an organization in a
straightforward, stage-by-stage manner. Once initial collection has occurred, additional
information may be added to a particular individual’s profile over time. The profile may be
shared externally or internally multiple times.

Some data in the profile may be used for one purpose, while other data from the same profile
may be used for a different purpose entirely. Even deletion may be conditional. Recognizing
the fluid nature of data processing helps organizations identify specific data flows.

Regardless of how personal data is being used, it is vital to recognize that organizations never
truly own any individual’s personal data. When organizations forget this simple fact, the
individuals to whom the data belongs, or data subjects, suffer for the organization’s lack of
care and diligence. Incorporating privacy into everything the organization does concerning
personal data must be foremost during the entire data life cycle for the data to be properly
respected and protected.

What is privacy by design (PbD)?

Privacy by design (PbD) is the concept of embedding privacy throughout the entire life cycle
of processing personal data, including technologies, systems, processes, practices and policies,
from the early design stage to deployment, use and ultimately disposal. Privacy should be
incorporated into all levels of operations organically, rather than viewed as a trade-off or
something to consider after a product, system, service or process has been built.

What is privacy engineering?

Privacy engineering combines privacy by design with the organization’s privacy values and
principles, individuals’ expectations, social and ethical considerations and the legal
environment, and applies all of these considerations to the technological systems and programs
an organization is developing and using, while assessing the risks involved in processing
personal data. When privacy is ignored or overlooked while developing software or selecting
technologies that process personal data, organizations run the risk of both legal repercussions
and reputational harm, as well as potentially creating lasting damage for the individuals whose
data they process.

Consent

Before personal data is collected, data subjects should provide consent for the collection and
use of their personal data. Consent may be explicit or implied. Explicit consent requires the
data subject to act in a way that specifically communicates consent. Some examples of explicit
consent include requiring the individual to:

• Click or mark a checkbox on a web-based or data-entry form

• Respond to an automatically generated email

• Provide verbal authorization

Implied (passive) consent does not require specific action—there is no checkbox to mark or
paper to sign. Instead, there could be a sign at the entrance to a building stating that surveillance
cameras are in use. Entering the premises implies the individual gives consent to be recorded.

Notice

A privacy notice should precede any personal data collection. Privacy notices are both good
practice and are increasingly required by law in various jurisdictions.

Privacy notices should provide the individual with answers to the following questions:

• Who is collecting the personal data and by which methods?

• What personal data is being collected?

• How will the personal data be used?

• How can consent for collection and use of the personal data be provided and removed?

• With whom will the personal data be shared?

• How long will the personal data be retained?

The notice should contain details about the data life cycle:

• How data is collected and by whom

• How data will be used

• With whom data will be disclosed, and

• How long data will be retained

Details about how data is destroyed are not generally disclosed in a privacy notice.

Where notice is required by law, organizations under that jurisdiction must comply with the
specific legal requirements. These requirements may include additional details about the
organization, such as contact information, specific methods for providing or removing consent
for data collection and use, and details about individuals’ rights under the law. To ensure
compliance, it is vital to involve the legal department in developing privacy notices!

The data life cycle

By understanding the data life cycle and considering the impacts that using an individual’s personal data has on that individual throughout each stage, privacy professionals can build privacy into their policies and procedures.

Collection

The data life cycle begins with collection, which refers to the process of gathering data that
relates to an individual.

Just as with consent, data may be collected actively (where the data subject is aware of the
collection) or passively (where the collection occurs but the individual is not directly involved
and may not even be aware of it).

Data can be collected in different ways. Organizations must identify every method for
obtaining personal data so it can be properly tracked, processed and protected.

Collection Method

In first-party collection, individuals provide data about themselves directly to the organization,
such as by filling out and submitting a web form.

Through surveillance, the organization observes the individual (or data about them) without
direct input from that individual. Note that this does not always mean visual observation. It
includes methods such as web tracking through cookies and GPS tracking of cell phones.

Through repurposing or secondary use, data is used for a different purpose than the one it was
originally collected for. For example, an organization sends marketing material to an address
provided for shipping purposes. When data is repurposed, it is a good practice, and may be
required by law, to obtain new or additional consent. Being clear about how personal data will
be used, especially when new uses are introduced, shows respect for the individual’s data and
avoids potential accusations of misuse.

Third-party collection happens when data collected by one organization is transferred to another organization for processing. Some examples are data that is purchased by the second organization, transferred to a partner or subsidiary or exchanged as part of a merger. Another example is when a marketing service aggregates and provides information about website users that helps online behavioral advertising tools provide relevant ads to individuals.

Rarely does collection only occur upon initial contact with an individual. Generally, additional
data is collected about an individual through various methods after the initial collection.
Regardless of when data is collected, organizations should apply the principle of data
minimization and collection limitation, striving to collect only the data they need for specific
identified purposes.

Passively collected data is not necessarily harmful to an individual and may, in fact, be useful. For example, a pharmacy loyalty program algorithm allows the pharmacy to provide the shopper with instant coupons when they are in the store for items they regularly buy, or notifications to order a refill when a prescription is running low. Generally, however, the more personal data
an organization gathers, the easier it is to identify and categorize the individual, and the higher
the risk to that individual, as well as to the organization, in the event of a data incident or
breach. To reduce that risk, it is critical that organizations only collect and retain the data they
need for specific purposes.

Use

Use is the processing or sharing of personal data for any purpose beyond simple storage or
deletion. Use is very broad and encompasses many actions that may not seem like using the
personal data. Any individual accessing personal data is using it—even if they do nothing more
than read it. Processing data for security or fraud prevention is use. Sharing the data is also use,
but is typically considered a separate life cycle stage.

Before any personal data is collected, organizations must determine what they need the data
for; in other words, how will it be used? It is common for some data about a data subject to be
used for one purpose while other data is used for entirely different purposes. Has the data
subject been informed of and consented to the different ways their data will be used? Using
data in ways data subjects do not expect and have not consented to can create privacy risks or
harm to them and may even be illegal. To help minimize the risk, consider whether the data
can be de-identified to reduce risk. In other words, is some data used strictly for statistical or
analytical purposes? If so, is it necessary to have it associated with a specific individual?

Consider all aspects of what the personal data is or could be used for and incorporate these uses
into the privacy notice to help minimize risk and inform data subjects. Additionally, limit access to the data to those who need it for its specified purposes to reduce the risk of data being
used in ways that were not disclosed.

Disclosure

Disclosure is sharing or providing access to personal data. Disclosure may be internal, that is,
within the organization, or external, as when data is shared with third parties or others outside
of the organization.

Data minimization, or collection limitation, is critically connected to disclosure, both internally and externally. The recipient should only have access to the data necessary for them to perform their tasks related to the purpose for collection.

Granting necessary access to personal data within an organization is not considered sharing. If
an organization is large or has subsidiaries, however, some circumstances may require
providing notice to the data subject that their data is being shared internally. For example, a
large organization has multiple subsidiaries, one that provides kitchen goods and another that
provides clothing. An individual who purchases clothing may not expect their data to be shared
with the kitchen goods company, even though it is under the same parent company. Obtaining
consent, or providing notice, before disclosing the personal data to the kitchen goods company
reduces access to, and allows the data subject control of, that personal data.

Sharing data externally may also mean processing by third parties, companies the data subject
does not have a relationship with and may not be familiar with. It is generally considered a best
practice, and is in some cases the law, that organizations disclose to data subjects details about
any third parties their personal data will be shared with.

Retention

Retention, or storage, refers to the process of saving the collected data to enable its specified
use until it is destroyed—or, in other words, how long an organization is going to keep the
personal data. Data should be retained only as long as is reasonably necessary. Retention must
comply with legal and regulatory requirements, applicable industry standards and business
objectives. This applies to data stored while it is in use or being shared, as well as data that is
archived for legal or other requirements.

Personal data should only be stored (1) when it is practical and beneficial, and (2) when the
benefit and duration of storing the data aligns with the data subject’s expectations. The
organization should delete personal data once it has fulfilled the purposes of collection and met
any applicable legal requirements.

When deciding whether to retain personal data, organizations must consider the original
disclosed purpose for collecting the personal data. Has that purpose changed over time? Has
the data subject been provided with new notices as the use changed? Remember that the
personal data collected may increase due to passive collection, such as surveillance or third-
party sharing.

The organization must assess its retained data and determine whether it has a legitimate interest
in keeping that data. If it does not, the data should be deleted.

There are risks to retaining data that go beyond legal implications. In the event of a breach, any
data retained by the organization may be subject to exposure and misuse. Therefore, by limiting
data retained, organizations reduce the risk of legal repercussions, but, more importantly,
reduce the risk to the data subjects involved.

To that end, organizations must establish clear policies determining how long specific data will
be retained. These retention policies should be developed prior to collection, must comply with
legal requirements, and should address the organization’s business objectives and data
subjects’ expectations.

Retention may be based on an organization’s storage constraints. However, privacy professionals must consider legal obligations for retaining certain types of personal data for specified time periods. If required by law, the organization may need to use an alternate means of storage.

Destruction

Data destruction refers to removing or otherwise making unrecoverable any saved personal
data from digital systems and shredding or otherwise destroying any hard copies. Data
destruction is, of necessity, linked closely to data retention. It is essential to establish (1) how
long to retain the data, (2) the method of notification (and who must be notified) once the
retention period is over and (3) the proper action for destroying the data.

Personal data that is no longer needed should be properly destroyed. For example, improperly
disposing of paper medical files may make them accessible to unauthorized individuals. If
paper files are not shredded into tiny particles, personal data can be recovered by putting the
shreds back together. This similarly applies to digital files. Deleting files and even emptying a
digital trash bin does not destroy the files. Deleted digital files can often be retrieved using
recovery software.

Data destruction becomes more complex when it involves other parties. It has become
exceedingly common to leverage third-party services in storing personal data, such as cloud
storage providers. To accommodate the ongoing protection of data, organizations must agree
beforehand on how these third parties will meet or facilitate data destruction requirements.
Organizations must ensure that, upon request, personal data has been fully removed from the
third party’s systems.

As with anything involving personal data, creating an appropriate plan for data destruction, and following through on that plan, is key:

• Know where the data is stored and with whom it is shared
• Have policies in place for internal destruction and contacts for external destruction
• Ensure those involved know their roles, and
• Verify that data has been destroyed as expected

Privacy by Design

Privacy by design is the practice of embedding privacy into all stages of the development of technologies, systems, processes, practices and policies that involve personal data. The data life cycle plays a critical role in privacy by design. Without knowing how an organization collects, uses, discloses, retains and destroys personal data, it is impossible to build privacy into data processing. The Fair Information Practices discussed earlier also play a key role in privacy by design, helping organizations to build good privacy practices.

For example, a global retailer wishing to gain insights into customer satisfaction by region
could ask its customers to review its services anonymously and gather consumer statistics by
letting them select their region instead of deducing it from their home address or IP address.
Offline, the retailer may grant customers the ability to press a “satisfied” or “dissatisfied”
button at the stores’ exits to indicate the ratio of satisfied customers by region.

Privacy by design: Use

Only use collected data for the purposes communicated to the data subject.*

Any new or additional uses require additional data subject consent through clear
communication. For example, when an online shopper wishes to be notified by email when a
specific shirt is back in stock, the organization may not send them unsolicited emails about
similar items unless the individual explicitly consented to these notifications.

*Related FIPs: notice/openness and use limitation

Privacy by design: Disclosure

Limit the disclosure of personal data with others to avoid potential misuse.*

For example, for internal disclosure, only the HR department may access employee records,
whether stored digitally or physically. For external disclosure, it is not necessary for an
employer to provide a payroll processing company with an employee’s email address or contact
phone number.

*Related FIPs: use limitation and data minimization/collection limitation

Privacy by design: Retention

Store personal data only to fulfill the original purpose for collection.*

An example of retention to fulfill the original purpose is an online pet supply shop that allows
recurring customers to create accounts to facilitate faster checkouts. Data may need to be
archived for a limited time to meet business or regulatory compliance needs. For example, a
business may wish to track which pet supplies are most popular by season over the last three
years to predict future stocking needs. Through privacy by design, this data may be aggregated
and then archived without any attached personal data.

*Related FIPs: use limitation and data minimization/collection limitation
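
As a hedged sketch of the aggregate-then-archive idea, the fragment below reduces order records to per-product, per-season counts and drops the customer identifier before archiving; the field names are illustrative assumptions, not a prescribed schema.

```python
from collections import Counter

orders = [
    {"customer_id": "C-1041", "product": "dog chew", "season": "winter"},
    {"customer_id": "C-2093", "product": "cat litter", "season": "winter"},
    {"customer_id": "C-1041", "product": "dog chew", "season": "spring"},
]

def aggregate_for_archive(rows):
    """Return per-product, per-season counts with all identifiers stripped."""
    counts = Counter((row["product"], row["season"]) for row in rows)
    return [
        {"product": product, "season": season, "units_sold": n}
        for (product, season), n in counts.items()
    ]

archive = aggregate_for_archive(orders)   # safe to retain for trend analysis
```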

Privacy by design: Destruction

When deleting and destroying data, purge all personal data from all information systems (with
the possible exception of backup systems) to prevent accidental or intentional recovery.*

For example, to avoid human errors, a business could automate a process that removes all data
from connected systems when no longer needed. Additionally, it could anonymize all stored
records before deletion to limit any recovery.

*Related FIP: data minimization/collection limitation
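
One way such automation might look is sketched below: a scheduled job that purges records whose retention period has passed. The table name, column name and three-year retention rule are assumptions for illustration; a real job would also need to reach backups and downstream systems, as noted above.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=3 * 365)      # assumed three-year retention rule

def purge_expired(conn: sqlite3.Connection) -> int:
    """Delete records whose last activity is older than the retention period.
    Assumes timestamps are stored as ISO-8601 text in 'last_activity'."""
    cutoff = (datetime.now(timezone.utc) - RETENTION).isoformat()
    cursor = conn.execute(
        "DELETE FROM customer_accounts WHERE last_activity < ?", (cutoff,)
    )
    conn.commit()
    return cursor.rowcount               # number of records removed

# Typically run on a schedule, e.g. nightly:
# purge_expired(sqlite3.connect("shop.db"))
```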

Privacy by design: Accountability

Throughout all stages of the data life cycle, the FIP of accountability is an important
consideration for the following steps:
• Ensuring policies and contracts are followed
• Running audits and internal checks
• Reviewing the personal data being processed to ensure the only data collected and used is needed for the disclosed purposes
• Destroying extraneous data
• Knowing the privacy risks that affect the personal data being processed.

PRIVACY BY DESIGN

The 7 Foundational Principles of Privacy by Design are:

1. Proactive not Reactive; Preventative not Remedial

The Privacy by Design approach is characterized by proactive rather than reactive
measures. It anticipates and prevents privacy invasive events before they happen. PbD does
not wait for privacy risks to materialize, nor does it offer remedies for resolving privacy
infractions once they have occurred − it aims to prevent them from occurring. In short,
Privacy by Design comes before-the-fact, not after.

2. Privacy as the Default

We can all be certain of one thing − the default rules! Privacy by Design seeks to deliver
the maximum degree of privacy by ensuring that personal data are automatically protected
in any given IT system or business practice. If an individual does nothing, their privacy still
remains intact. No action is required on the part of the individual to protect their privacy −
it is built into the system, by default.
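
A minimal sketch, with hypothetical setting names, of what privacy as the default can mean in code: a new account starts with every optional form of data sharing switched off, so no action is required from the individual.

```python
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    marketing_emails: bool = False       # off unless the user opts in
    share_with_partners: bool = False
    usage_analytics: bool = False
    public_profile: bool = False

@dataclass
class Account:
    username: str
    privacy: PrivacySettings = field(default_factory=PrivacySettings)

# The account is privacy-protective without any action by the individual.
new_user = Account("j.doe")
```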

3. Privacy Embedded into Design

Privacy by Design is embedded into the design and architecture of IT systems and business
practices. It is not bolted on as an add-on, after the fact. The result is that privacy becomes
an essential component of the core functionality being delivered. Privacy is integral to the
system, without diminishing functionality.

4. Full Functionality – Positive-Sum, not Zero-Sum

Privacy by Design seeks to accommodate all legitimate interests and objectives in a
positive-sum "win-win" manner, not through a dated, zero-sum approach, where
unnecessary trade-offs are made. Privacy by Design avoids the pretence of false
dichotomies, such as privacy vs. security, demonstrating that it is possible, and far more
desirable, to have both.

5. End-to-End Security – Lifecycle Protection

Privacy by Design, having been embedded into the system prior to the first element of
information being collected, extends securely throughout the entire lifecycle of the data
involved — strong security measures are essential to privacy, from start to finish. This
ensures that all data are securely retained, and then securely destroyed at the end of the
process, in a timely fashion. Thus, Privacy by Design ensures cradle to grave, secure
lifecycle management of information, end-to-end.

6. Visibility and Transparency

Privacy by Design seeks to assure all stakeholders that whatever the business practice or
technology involved, it is in fact, operating according to the stated promises and objectives,
subject to independent verification. Its component parts and operations remain visible and
transparent, to both users and providers alike. Remember, trust but verify!

7. Respect for User Privacy

Above all, Privacy by Design requires architects and operators to keep the interests of the
individual uppermost by offering such measures as strong privacy defaults, appropriate
notice, and empowering user-friendly options. Keep it user-centric!

What is Privacy Engineering?


Privacy engineering is a methodological framework for integrating privacy into the life cycle of
IT system design and development. It operationalizes the Privacy by Design (PbD) framework
by bringing together methods, tools and metrics, so that we can have privacy-protecting
systems. With the pandemic, digital innovation became the need of the hour and thus brought
PbD even more into the limelight. The goal of privacy engineering is to make Privacy
by Design the de-facto standard for IT systems.

Different bodies have different definitions of privacy engineering, but the gist is the same: to
address the complete life cycle of individual privacy, not just data storage and analysis.
Privacy engineering incorporates a more holistic approach covering legalities, risk analysis and
user sentiment.

US-based National Institute of Standards and Technology (NIST) defines privacy engineering
as “a specialty discipline of systems engineering focused on achieving freedom from conditions
that can create problems for individuals with unacceptable consequences that arise from the
system as it processes PII." NIST describes the objectives of privacy engineering as
predictability, manageability and disassociability.

Privacy engineering, by making privacy an integral part of the design and development
process (SDLC), tries to reduce risks and to protect privacy at scale.

As per Gartner's definition, "Privacy engineering is an approach to business process and
technology architecture that combines various methodologies in design, deployment and
governance. Properly implemented, it yields an end result with both:

1. Easily accessible functionality to fulfill the Organisation for Economic Co-operation
and Development (OECD) eight privacy principles and,
2. Mitigation against the impact of a breach of personal data by reimagining defense in
depth from a privacy-centric vantage.

The process involves ongoing re-calculation and re-balancing of the risk to the individual data
owner while preserving optimum utility for personal data-processing use cases."

Bridging the gap between IT, Risk and Compliance, Privacy, Security and Business

Thus, privacy engineering is the foundation of holistic privacy. It will help to build a structured
framework and make privacy a mainstream concept for organizations to focus on.

Privacy protection continues to be a very critical issue for individuals, businesses and
governments all across the globe. People, as consumers, want personalized content
and service delivery, but at the same time they want privacy protections to be maintained at
all costs; they expect organizations and businesses to take action to protect consumers, and
governments to protect citizens' data.

A few common things that I believe are true regarding this scenario are:

• Consumers want transparency about how businesses are storing, processing and
utilizing their data.
• They are very concerned about how their personal information is used by advanced
technologies like AI and any kind of abuse erodes their trust – completely.
• Many consumers don't trust that private businesses will follow regulations and have
compliance measures in place to keep their data secure. So, they look to their government to
protect their data with laws, policies and other enforcement mechanisms.
• Once the trust is lost, consumers take action to protect themselves and their data. They
even switch companies or providers and move to the ones whom they trust can keep
their data safe. Many terminate relationships with traditional and online businesses over
data privacy.

With the advent of different privacy laws like the EU's GDPR, frameworks have been
formulated for Data Subject Access Requests (DSARs). Many privacy laws enable consumers
to raise requests concerning their data and place control in the hands of consumers, so that
they can take action if they are dissatisfied with how their data is stored, processed or utilized.

Privacy engineering, which bonds innovation with PbD, ensures that every IT system provides
the highest possible privacy for personal data. This increases consumers' trust that their data is
safe because privacy has been ingrained in the system.

PROS OF PRIVACY ENGINEERING

• Reduces dependence on external security enforcements.
• Privacy is the default setting as it is embedded in design. It provides proactive protection, not remedial protection.
• It provides end-to-end security with complete protection of the system lifecycle.
• It respects user privacy – ensures that the technology and systems remain user centric.

CONS OF PRIVACY ENGINEERING

• Requires laws and policies – most are still under development.
• Violations are possible because of mistakes during the design or development phase, bad actors, government mandates, availability of new technologies etc.
• Some may find it expensive to implement as this requires skilled engineers.
• Some may find it restrictive to innovation.

Privacy Design Strategies

Privacy by design requires that privacy concerns are tackled throughout the full life cycle of
system development. In this paradigm, privacy protection is a system requirement like any
other functional requirement, that must be addressed from the beginning, and that shapes the
final design and implementation of the system. To support privacy by design, we therefore
need guiding principles to resolve privacy requirements throughout the system development
life cycle, in particular during concept development, analysis, design and implementation
phases. To this end, this blog post (and the accompanying full paper) presents eight privacy
design strategies.

Strategies, patterns and technologies

Privacy issues can be resolved at several levels of abstraction. At the lowest level, privacy
enhancing technologies implement specific privacy protection mechanisms. These
technologies are important during the implementation phase. Design patterns are tools at a
higher level of abstraction that "describe a commonly recurring structure of communicating
components that solves a general design problem within a particular context". They apply to
the design phase and do not necessarily play a role in the earlier, concept development and
analysis, phases of the software development cycle. The main reason is that such design
patterns are already quite detailed in nature, and more geared towards solving an
implementation problem. To guide the development team in the earlier stages, privacy design
strategies at a higher level of abstraction are needed.

Privacy design patterns can be mapped to privacy design strategies. This mapping is not unique:
a pattern may implement more than one strategy. Similarly, privacy enhancing technologies
implement one or more design patterns. By studying these mappings (of which the current
research is just a first step) we can identify for which strategies adequate patterns are missing,
and similarly, for which patterns new technologies need to be developed.

Deriving privacy design strategies

A natural starting point to derive some privacy preserving strategies is to look at when and how
privacy is violated, and then consider how these violations can be prevented. I have taken
Daniel Solove's taxonomy [Solove, D.J. A taxonomy of privacy. University of Pennsylvania
Law Review 154, 3 (2006), 477-564] as point of departure here. His general subdivision of
activities that affect privacy (information collection, information processing, information
dissemination and invasions) inspired us to look at IT systems at a higher level of abstraction
to determine where and how privacy violations can be prevented.

Current data protection legislation in general views an IT system as an information storage
system, i.e. a database system where personal identifiable information about people is stored
in one or more database tables. Applying this legislation to such systems, the following general
observations can be made. Data collection should be minimised, for example by not storing
individual rows in a database table for each and every individual, and the number of attributes
stored should correspond to the purpose. Data collected for one purpose should be
stored separately from data stored for another purpose, and linking of these database tables
should not be easy. When data about individuals is not necessary for the purpose,
only aggregate data should be stored. Personal data should be properly protected, and strict
access control procedures should limit access to authorised persons only. A data subject should
be informed about the fact that data about her is being processed, and she should be able to
request modifications and corrections where appropriate. In fact, the underlying principle of
information self-determination dictates that she should be in control. Finally, the collection and
processing of personal data should be done in accordance with a privacy policy, which should be
actively enforced. The current proposal for the revision of the European privacy directive (into
a regulation) also stresses the fact that data controllers should be able to demonstrate
compliance with data protection legislation.

Privacy design strategies in the database metaphor

Given this analysis from the legal point of view, we can distinguish the following eight
privacy design strategies:

minimise, separate, aggregate, hide, inform, control, enforce and demonstrate.

A graphical representation of these strategies, when applied to a database system, is given in
this figure.
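
As a hedged illustration of the database metaphor, the sketch below shows how three of these strategies (minimise, aggregate and hide) might shape a simple reporting query; the table and column names are assumptions, not a reference implementation.

```python
import sqlite3

def regional_age_report(conn: sqlite3.Connection):
    # MINIMISE: select only the attributes the report needs, never the full record.
    # AGGREGATE: return counts per region and age band, not rows about individuals.
    # HIDE: suppress small groups that could single out an individual.
    query = """
        SELECT region,
               (age / 10) * 10 AS age_band,
               COUNT(*)        AS customers
        FROM customers
        GROUP BY region, age_band
        HAVING COUNT(*) >= 5
    """
    return conn.execute(query).fetchall()
```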

DATA PROTECTION IMPACT ASSESSMENT

4.3 Assessments and Impact Assessments

Three types of assessments and impact assessments are of concern at this point: privacy
assessments, PIAs and DPIAs. Although these terms are often used in other contexts to refer
to the same concept, you can also differentiate them.

4.3.1 Privacy Assessment: Measuring Compliance


Privacy assessments measure an organization’s compliance with laws, regulations, adopted
standards and internal policies and procedures. Their scope includes education and awareness;
monitoring and responding to the regulatory environment; data, systems and process
assessments; risk assessments; incident response; contracts; remediation; and program
assurance, including audits.
Privacy assessments are conducted internally by the audit function, the DPO or a business
function, or externally by a third party. They can happen at a predefined time period or be
conducted in response to a security or privacy event or at the request of an enforcement authority.
The standards used can be subjective, such as employee interviews, or objective, such as
information system logs.

4.3.2 Privacy Impact Assessment


A PIA is an analysis of the privacy risks associated with processing personal information in
relation to a project, product or service. To be an effective tool, a PIA also should suggest or
provide remedial actions or mitigations necessary to avoid or reduce/minimize those risks.
Requirements regarding PIAs emanate from industry codes, organizational policy, laws,
regulations or supervisory authorities.
PIAs can help facilitate privacy by design, which is the concept of building privacy directly
into technology, systems and practices at the design phase. It helps ensure privacy is considered
from the outset, and not as an afterthought. Privacy by design will be covered in greater detail
in Chapter 8.
To be an effective tool, a PIA should be accomplished early, in other words:

• Prior to deployment of a project, product or service that involves the collection of personal information

• When there are new or revised industry standards, organizational policies, or laws and regulations

• When the organization creates new privacy risks through changes to methods by which personal information is handled

Below are some of the events that may trigger the need for a PIA:

• Conversion of information from anonymous to identifiable format

• Conversion of records from paper-based to electronic format

• Significant merging, matching and manipulation of multiple databases containing personal information

• Application of user-authentication technology to a publicly accessible system

• System management changes involving significant new uses and/or application of new technologies

• Retiring of systems that held or processed personal data

• Incorporation of personal information obtained from commercial or public sources into existing databases

• Significant new interagency exchanges or uses of personal information

• Alteration of a business process resulting in significant new collection, use and disclosure of personal information

• Alteration of the character of personal information due to addition of qualitatively new types

• Implementation of projects using third-party service providers

Regardless of the geographical location or the requirements based in law, regulation or
guideline, the PIA is a risk management tool used to identify and reduce the privacy/data
protection risks to individuals and organizations and aimed at ensuring a more holistic risk
management strategy. Details of PIAs, how they are used, and formats, methodologies and
processes around the assessments will vary depending on industry, private- or public-sector
orientation, geographical location or regional requirements, and sensitivity or type of data. The
privacy professional should identify the appropriate methodology and approaches based on
these various factors and tailor the model to the specific needs of the organization.
One of the biggest challenges for privacy professionals is to prioritize the projects, products
or services that should be submitted to a PIA. To identify the data-processing activities that
represent a higher privacy risk, some organizations first conduct an express PIA, which consists
of a small questionnaire that assesses the need for a full and more comprehensive PIA. This
approach enables all stakeholders to dedicate their resources to the areas where the risks and
potential harms for individuals are most significant and to mitigate these risks, creating better
outcomes and more effective protection for individuals.13
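
A simple sketch of such an express PIA screening step is shown below. The questions paraphrase common trigger criteria and the two-answer threshold mirrors the rule of thumb later applied to DPIAs; both are illustrative assumptions, not an official checklist.

```python
SCREENING_QUESTIONS = [
    "Does the project collect new personal information?",
    "Will special categories or highly personal data be processed?",
    "Is the processing carried out on a large scale?",
    "Will individuals be systematically monitored or profiled?",
    "Will datasets be matched, combined or shared with third parties?",
    "Does the project use new or innovative technologies?",
]

def needs_full_assessment(answers: list[bool], threshold: int = 2) -> bool:
    """Flag a full PIA when the number of 'yes' answers meets the threshold;
    when in doubt, err on the side of carrying out the assessment."""
    return sum(answers) >= threshold

# Example: two 'yes' answers route the project to a full PIA.
print(needs_full_assessment([True, False, False, True, False, False]))
```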

4.3.3 PIAs in the United States


The U.S. government, under the E-Government Act of 2002, required PIAs from government
agencies when developing or procuring IT systems containing personally identifiable
information (PII) of the public or when initiating an electronic collection of PII. 14 This
requirement is preceded by a privacy threshold analysis (PTA) to determine if a PIA is needed.
The PTA would seek to determine from whom data is collected, what types of personal data
are collected, how such data is shared, whether the data has been merged, and whether any
determinations have been made as to the information security aspects of the system.15 The
Privacy Act requirements include the rights to receive timely notice of location, routine use,
storage, retrievability, access controls, retention and disposal; rights of access and change to
personal information; consent to disclosure; and maintenance of accurate, relevant, timely and
complete records.16 As such, the PIA will describe in detail the information collected or
maintained, the sources of that information, the uses and possible disclosures, and potential
threats to the information.
The uses to which the information is put by the system are described next, including the legal
authority for collecting the data, the retention periods and eventual destruction, and any
potential threats based on use of the data. Also included are any information dissemination and
the controls used, the rights listed above, the information security program used, and
compliance with the Privacy Act.17 Under implementation guidance, the following were
reasons for initiating a PIA:

• Collection of new information about individuals, whether compelled or voluntary

• Conversion of records from paper-based to electronic format

• Conversion of information from anonymous to identifiable format

• System management changes involving significant new uses and/or application of new technologies

• Significant merging, matching or other manipulation of multiple databases containing PII

• Application of user-authentication technology to a publicly accessible system

• Incorporation into existing databases of PII obtained from commercial or public sources

• Significant new interagency exchanges or uses of PII

• Alteration of a business process resulting in significant new collection, use and/or disclosure of PII

• Alteration of the character of PII due to the addition of qualitatively new types of PII

• Implementation of projects using third-party service providers18

4.3.4 International Organization for Standardization (ISO)


ISO 29134 is a set of guidelines for the process of running a PIA and the structure of the
resulting report.19 It is not a standard for PIAs, unlike, say, a standard for information security.
These guidelines define a PIA as a process for identifying and treating, in consultation with
stakeholders, risks to PII in a process, system, application or device. They reiterate that PIAs
are important not only for controllers and their processors but also for the suppliers of digitally
connected devices. They also specify that the PIA should start at the earliest design phase and
continue until after implementation. The process first involves conducting a threshold analysis
to determine whether a PIA is needed, then preparing for a PIA, performing a PIA and
following up on the PIA. The performing phase consists of five steps:

1. Identifying information flows of PII

2. Analyzing the implications of the use case

3. Determining the relevant privacy-safeguarding requirements

4. Assessing privacy risk using steps of risk identification, risk analysis and risk
evaluation

5. Preparing to treat privacy risk by choosing the privacy risk treatment option;
determining the controls using control sets such as those available in ISO/IEC
27002 and ISO/IEC 29151, and creating privacy risk treatment plans

The follow-up phase consists of:

• Preparing and publishing the PIA report

• Implementing the privacy risk treatment plan

• Reviewing the PIA and reflecting changes to the process

The structure of the PIA report should include sections on the scope of the assessment,
privacy requirements, the risk assessment, the risk treatment plan, and conclusions and
decisions. The risk assessment includes discussion of the risk sources, threats and their
likelihood, consequences and their level of impact, risk evaluation, and compliance analysis.
There should also be a summary that can be made public.20

4.3.5 Data Protection Impact Assessments


When an organization collects, stores or uses personal data, the individuals whose data is being
processed are exposed to risks. These risks range from personal data being stolen or
inadvertently released and used by criminals to impersonate the individual, to worry being
caused to individuals that their data will be used by the organization for unknown purposes. A
DPIA describes a process designed to identify risks arising out of the processing of personal
data and to minimize these risks as much and as early as possible. DPIAs are important tools
for negating risk and for demonstrating compliance with the GDPR.21
Under the GDPR, noncompliance with DPIA requirements can lead to fines imposed by the
competent supervisory authority. Failure to carry out a DPIA when the processing is subject to
a DPIA, carrying out a DPIA in an incorrect way, or failing to consult the competent
supervisory authority where required can result in an administrative fine of up to €10 million,
or in the case of an undertaking, up to 2 percent of the total worldwide annual revenue of the
preceding financial year, whichever is higher.22
DPIAs are required only when an organization is subject to the GDPR and although the term
PIA is often used in other contexts to refer to the same concept, a DPIA has specific triggers
and requirements under the GDPR.

4.3.6 When is a DPIA Required?


In case the processing is “likely to result in a high risk to the rights and freedoms of natural
persons,” the controller shall, prior to processing, carry out a DPIA.23 The nature, scope,
context, purpose, type of processing, and use of new technologies should, however, be
considered. The use of new technologies, in particular, whose consequences and risks are less
understood, may increase the likelihood that a DPIA should be conducted. Article 35 provides
some examples when a processing operation is “likely to result in high risks:”
a. Systematic and extensive evaluation of personal aspects relating to natural
persons which is based on automated processing, including profiling, and on
which decisions are based that produce legal effects concerning the natural
person or similarly significantly affect the natural person;
b. Processing on a large scale of special categories of data, or of personal data
relating to criminal convictions and offences; or
c. A systematic monitoring of a publicly accessible area on a large scale.24
This abovementioned list is nonexhaustive and the Article 29 Working Party (WP29)25
provides a more concrete set of processing operations that require a DPIA due to their inherent
high risk:26

• Evaluation or scoring: includes profiling and predicting, especially from
aspects concerning the data subject's performance at work, economic situation,
health, personal preferences or interests, reliability or behavior, and location or
movements.

• Automated decision-making with legal or similar significant effect:
processing that aims at taking decisions on data subjects producing "legal effects
concerning the natural person” or that “similarly significantly affects the natural
person.”

• Systematic monitoring: processing used to observe, monitor or control data
subjects, including data collected through networks or a systematic monitoring
of a publicly accessible area.

• Sensitive data or data of a highly personal nature: this includes special
categories of personal data, as well as personal data relating to criminal
convictions or offences.

• Data processed on a large scale: the WP29 recommends that the following
factors, in particular, be considered when determining whether the processing is
carried out on a large scale: (1) the number of data subjects concerned, either as
a specific number or as a proportion of the relevant population; (2) the volume
of data and/or the range of different data items being processed; (3) the duration,
or permanence, of the data processing activity; and (4) the geographical extent of
the processing activity.

• Matching or combining datasets: for example, originating from two or more
data processing operations performed for different purposes and/or by different
data controllers in a way that would exceed the reasonable expectations of the
data subject.

• Data concerning vulnerable data subjects: the processing of this type of data
is a criterion because of the increased power imbalance between the data subjects
and the data controller, meaning the individuals may be unable to easily consent
to, or oppose, the processing of their data, or exercise their rights. Vulnerable
data subjects may include children, employees and more vulnerable segments of
the population requiring special protection (e.g., persons with mental health
concerns, asylum seekers, the elderly).

• Innovative use or application of new technological or organizational
solutions, such as the combined use of fingerprints and face recognition for
improved physical access control.

• When the processing in itself prevents data subjects from exercising a right
or using a service or a contract.27

In most cases, a data controller can consider that a processing that meets two criteria would
require a DPIA to be carried out. In general, the WP29 considers that the more criteria are met
by the processing, the more likely it is to present a high risk to the rights and freedoms of data
subjects, and therefore to require a DPIA, regardless of the measures the controller envisions
adopting.
However, in some cases, a data controller can consider that a processing meeting only one of
these criteria requires a DPIA. In cases where it is not clear whether a DPIA is required, the
WP29 recommends that a DPIA be carried out.28 A DPIA is a useful tool to help controllers
build and demonstrate compliance with data protection law.29
Conversely, a processing operation may still be considered by the controller not to be “likely
to result in a high risk.” In such cases, the controller should justify and document the reasons
for not carrying out a DPIA and include/record the views of the data protection officer.

In addition, as part of the accountability principle, every data controller “shall maintain
a record of processing activities under its responsibility” including inter alia the
purposes of processing, a description of the categories of data and recipients of the data
and “where possible, a general description of the technical and organizational security
measures referred to in Article 32(1)" (Article 30(1)) and must assess whether a high
risk is likely, even if they ultimately decide not to carry out a DPIA.30

4.3.6.1 What Should a DPIA Include?
The GDPR sets out the minimum features of a DPIA:

• A description of the processing, including its purpose and the legitimate interest
being pursued

• The necessity of the processing, its proportionality and the risks that it poses to
data subjects

• Measures to address the risks identified31

Figure 4-1: Generic Iterative Process for Carrying Out a DPIA32

The Information Commissioner’s Office (ICO) in the UK has available on its website several
guidelines on DPIAs.33 Similarly, the Commission nationale de l’informatique et des libertés
(CNIL) in France has updated its PIA guides as well as its PIA tool.34 The method is consistent
with the WP29 guidelines and with risk management international standards. CNIL’s PIA
method is composed of three guides:

1. The method explains how to carry out a PIA

2. The models help to formalize a PIA by detailing how to handle the different
sections introduced in the method

3. The knowledge base is a code of practice that lists measures to be used to treat
the risks35

4.3.6.2 When Must the Supervisory Authority be Contacted?

Whenever the data controller cannot find sufficient measures to reduce the risks to an
acceptable level (i.e., the residual risks are still high), consultation with the supervisory
authority will be necessary.36

Scenarios that should lead to contact with a supervisory authority include:

• An illegitimate access to data leading to a threat to the life of the data subjects,
a layoff or financial jeopardy

• Inability to reduce the number of people accessing the data because of its sharing,
use or distribution modes, or when a well-known vulnerability is not patched

Moreover, the controller will have to consult the supervisory authority whenever Member
State law requires controllers to consult with, and/or obtain prior authorization from, the
supervisory authority in relation to processing by a controller for the performance of a task
carried out by the controller in the public interest, including processing in relation to social
protection and public health.37

4.3.6.3 Components of a DPIA


Different methodologies could be used to assist in the implementation of the basic requirements
set out in the GDPR. Annex 1 of the WP29’s Guidelines on Data Protection Impact Assessment
lists several examples of data protection and privacy impact assessment methodologies. To
allow these different approaches to exist while allowing controllers to comply with the GDPR,
common criteria have been identified in Annex 2 of the same guidelines. They clarify the basic
requirements of the Regulation but provide enough scope for different forms of
implementation. These criteria can be used to show that a particular DPIA methodology meets
the standards required by the GDPR. It is up to the data controller to choose a methodology,
but this methodology should be compliant with the criteria provided in Annex 2 of the
guidelines mentioned earlier in this paragraph.38

4.3.7 Attestation: A Form of Self-Assessment
Attestation is a tool for ensuring functions outside the privacy team are held accountable for
privacy-related responsibilities. Once you have determined the privacy responsibilities of each
department and the relevant privacy principles against which their personal information
practices will be assessed, you can use this information to craft questions related to each
responsibility. The designated department is required to answer the questions and, potentially,
provide evidence. Attestation questions should be specific and easy to answer, usually with yes
or no responses. Regular self-assessments can form part of an organization's privacy
management systems and demonstrate a responsible privacy management culture.
In the United States, one example of attestation involves NIST 800-60, a guide from the
National Institute of Standards and Technology (NIST) and the U.S. Department of Commerce
(DOC).39 The guide maps types of information and information systems to security categories.
For example:

• Task—classify data.

• Owner—IT.

• Questions—Has the NIST 800-60 classification system been reviewed to ensure
understanding of each category? Has each type of data within the information
system been mapped to a category? Have data types that cannot be easily
categorized been flagged, analyzed and classified by the chief information
security officer (CISO)?

• Evidence—spreadsheet with data inventory, categories and classifications.
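
The attestation example above could be captured in a simple data structure so that answers and supporting evidence are tracked per department; the following sketch is illustrative only, and the field values are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class AttestationItem:
    task: str
    owner: str
    questions: list[str]
    answers: dict[str, bool] = field(default_factory=dict)
    evidence: list[str] = field(default_factory=list)   # links or file names

    def outstanding(self) -> list[str]:
        """Questions still unanswered or answered 'no', needing follow-up."""
        return [q for q in self.questions if not self.answers.get(q, False)]

classify_data = AttestationItem(
    task="Classify data",
    owner="IT",
    questions=[
        "Has the NIST 800-60 classification system been reviewed?",
        "Has each data type in the system been mapped to a category?",
        "Have hard-to-categorize data types been escalated to the CISO?",
    ],
    evidence=["data_inventory.xlsx"],
)
print(classify_data.outstanding())   # all three, until the owner attests
```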

The Irish Office of the Data Protection Commission (DPC) has published a guide to such
audits with helpful appendices for self-assessment.40

4.4 Physical and Environmental Assessment

Information security is the protection of information for the purpose of preventing loss,
unauthorized access or misuse. Information security requires ongoing assessments of threats
and risks to information and of the procedures and controls to preserve the information,
consistent with three key attributes:

1. Confidentiality: access to data is limited to authorized parties

2. Integrity: assurance that the data is authentic and complete

3. Availability: knowledge that the data is accessible, as needed, by those who are
authorized to use it

Information security is achieved by implementing security controls, which need to be
monitored and reviewed to ensure that organizational security objectives are met. It is vital to
both public and private sector organizations.
Security controls are mechanisms put in place to prevent, detect or correct a security incident.
The three types of security controls are physical controls, administrative controls and technical
controls.41 For now, we are going to focus on the assessment of physical and environmental
controls. The other information security practices will be covered in greater detail in Chapter
8.
Physical and environmental security refers to methods and controls used to proactively
protect an organization from natural or manmade threats to physical facilities and buildings as
well as to the physical locations where IT equipment is located or work is performed (e.g.,
computer rooms, work locations). Physical and environmental security protects an
organization’s personnel, electronic equipment and data/information. Key terms and concepts
found in this competency include:

• Access cards

• Access control

• Alarms

• Assessment

• Asset disposal, including document destruction, media sanitization (e.g., hard
drives, USB drives)

• Biometrics

• Defense-in-depth

• Environmental threats

• Identification and authentication

• Inventory

• Manmade threats

• Natural threats

• Perimeter defense

• Risk management

• Threat and vulnerability

• Video surveillance

4.5 Assessing Vendors

A procuring organization may have specific standards and processes for vendor selection. A
prospective vendor should be evaluated against these standards through questionnaires, privacy
impact assessments and other checklists. Standards for selecting vendors may include:

• Reputation. A vendor's reputation with other companies can be a valuable gauge
of the vendor's appropriate collection and use of personal data. Requesting and
contacting references can help determine a vendor’s reputation.

• Financial condition and insurance. The vendor's finances should be reviewed
to ensure the vendor has sufficient resources in case of a security breach and
subsequent litigation. A current and sufficient insurance policy can also protect
the procuring organization in the event of a breach.

• Information security controls. A service provider should have sufficient
security controls in place to ensure the data is not lost or stolen.

• Point of transfer. The point of transfer between the procuring organization and
the vendor is a potential security vulnerability. Mechanisms of secure transfer
should be developed and maintained.

• Disposal of information. Appropriate destruction of data and/or information in
any format or media is a key component of information management—for both
the contracting organization and its vendors. The Disposal Rule under the Fair
and Accurate Credit Transactions Act of 2003 (FACTA) sets forth required
disposal protections for financial institutions. The Disposal Rule requirements
provide a good baseline for disposal of personal information more generally.

• Employee training and user awareness. The vendor should have an established
system for training its employees about its responsibilities in managing personal
or sensitive information.

• Vendor incident response. Because of the potentially significant costs
associated with a data breach, the vendor should clearly explain in advance its
provisions for responding to any such breach with the cooperation needed to meet
the organization’s business and legal needs.

• Audit rights. Organizations should be able to monitor the vendor's activities to
ensure it is complying with contractual obligations. Audit needs can sometimes
be satisfied through periodic assessments or reports by independent trusted
parties regarding the vendor’s practices.42

Evaluating vendors should involve all relevant internal and external stakeholders, including
internal audit, information security, physical security and regulators. Results may indicate
improvement areas that may be fixed or identify higher-level risks that may limit the ability of
that vendor to properly perform privacy protections. Once risk is determined, the organization's
best practices may also be leveraged to assist a vendor that is too small or has limited resources
by offering help with security engineering, risk management, awareness and education training,
auditing, and other tasks.
Contract language should be written to call out privacy protections and regulatory
requirements within the statement of work and then mapped to service-level agreements to
ensure there are no questions about the data privacy responsibilities, breach response, incident
response, media press releases on breaches, possible fines, and other considerations, as if the
vendor were part of the organization. The following list gives a few examples of the kind of
information you may want to consider, including:

• Specifics regarding the type of personal information to which the vendor will
have access at remote locations

• Vendor plans to protect personal information

• Vendor responsibilities in the event of a data breach

• Disposal of data upon contract termination

• Limitations on the use of data that ensure it will be used only for specified
purposes

• Rights of audit and investigation

• Liability for data breach

The purpose of the vendor contract is to make certain all vendors are in compliance with the
requirements of your organization’s privacy program.
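
As a hedged sketch of how such requirements might be tracked during contract review, the fragment below checks a draft contract against a set of required privacy clauses; the clause names are assumptions drawn from the considerations above, not a legal template.

```python
REQUIRED_CLAUSES = {
    "personal_data_scope",     # what personal data the vendor may access
    "security_measures",       # vendor plans to protect personal information
    "breach_notification",     # responsibilities in the event of a data breach
    "data_disposal",           # disposal of data upon contract termination
    "purpose_limitation",      # data used only for specified purposes
    "audit_rights",            # rights of audit and investigation
    "breach_liability",        # liability for data breach
}

def missing_clauses(contract_clauses: set[str]) -> set[str]:
    """Return the required clauses that the draft contract does not yet cover."""
    return REQUIRED_CLAUSES - contract_clauses

draft = {"security_measures", "breach_notification", "audit_rights"}
print(sorted(missing_clauses(draft)))   # clauses still to negotiate
```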

4.5.1 Assessing Vendors under the GDPR
Article 28 of the GDPR uses the device of limiting the controller’s use of processors to those
who can provide “sufficient guarantees” about the implementation of appropriate technical and
organizational measures for compliance with the Regulation and for the protection of the rights
of data subjects. This idea of sufficient guarantees encompasses much more than the creation
of contracts, but the use of contracts is a key control mechanism. The focus is on obtaining
proof of the processor’s competence.
To make any sense and to be truly effective, the idea of sufficient guarantees must encompass
assurance mechanisms. There must be appropriate checking and vetting of the processor by the
supplier via a third-party assessment or certification validation, both before a contract is created
and afterward. In appropriate circumstances, the processes of assurance must include audit
processes, which are made clear in Article 28(3)(h). If the controller is unable to establish proof
of the processor’s competence, it must walk away; otherwise, it will be in automatic breach of
Article 28. All this must work in a commercial context. The defining feature of the controller-
processor relationship is that the processor can act only on the instructions of the controller. As
can be seen from Article 28(10), if the processor steps outside the boundaries of its instructions,
it risks being defined as a controller, with all the attendant obligations.
None of these provisions are new to European data protection law; they carry over from the
previous directive to the GDPR. What is new to the legislation is the duty to assist the controller with achieving
compliance and reducing risk, which includes assisting the controller with the handling of the
personal data breach notification requirements (Article 28(3)(f)). Given this more practical
application, it is clear that the controller and the processor will need to work closely to ensure
effective incident detection and response. Arguably, Article 28 poses many challenges for the
established order of things, particularly when there is an imbalance between the controller and
processor in the market. When the processor is a technology giant, there may be a risk that it
may use its more powerful position in a way that the GDPR says triggers controllership under
Article 28(10). Thus, for the processor industry, there are very good incentives to behave
flexibly during contract formation and procurement.43

4.6 Mergers, Acquisitions and Divestitures: Privacy Checkpoints

Mergers, acquisitions and divestitures contain many legal and compliance aspects, with their
own sets of concerns related to privacy. Mergers form one organization from others, while
acquisitions involve one organization buying one or many others; in divestitures, companies
sell one division of an organization for reasons that may include the parent company's desire
to rid itself of divisions not integral to its core business.
An organization can be exposed to corporate risk by merging with or acquiring companies
that have different regulatory concerns. Merger and acquisition processes should include a
privacy checkpoint that evaluates:

• Applicable new compliance requirements

• Sector-specific laws [e.g., Health Insurance Portability and Accountability Act
(HIPAA) in the United States]

• Standards [e.g., the Payment Card Industry Data Security Standards (PCI DSS)]

• Jurisdictional laws/regulations [e.g., the Personal Information Protection and
Electronic Documents Act (PIPEDA), GDPR]

• Existing client agreements

• New resources, technologies and processes to identify all actions that are required
to bring them into alignment with privacy and security policies before they are
integrated into the existing system

With respect to both partial and total divestitures, the organization should conduct a thorough
assessment of the infrastructure of all, or any part of, the entity being divested prior to the
conclusion of the divestiture. These activities are performed to confirm that no unauthorized
information, including personal information, remains on the organization’s infrastructure as
part of the divestiture, with the exception of any preapproved proprietary data.

4.7 Summary

In recent years, with the proliferation of information communication technologies and the
complex data protection problems they raise, data assessments and risk management have taken
an even more prominent role in various privacy law regimes. The risk-based approach, which
is now fully enacted by the GDPR, confirms this trend. Nevertheless, the necessity of
conducting data assessments goes beyond compliance with certain legal requirements. They
are an important risk management tool with clear financial benefits. Identifying a problem early
will generally require a simpler and less costly solution. Moreover, data assessments allow
organizations to reduce the ongoing costs of a project by minimizing the amount of information
being collected or used and devising more straightforward processes for staff. Finally, data
assessments improve transparency and accountability, making it easier for data subjects and
supervisory authorities to understand how and why the personal data is being used.

GDPR Data Protection Impact Assessment

The instrument for a privacy impact assessment (PIA) or data protection impact assessment
(DPIA) was introduced with the General Data Protection Regulation (Art. 35 of the GDPR).
This refers to the obligation of the controller to conduct an impact assessment and to document
it before starting the intended data processing. One can bundle the assessment for several
processing procedures.

Basically, a data protection impact assessment must always be conducted when the processing
could result in a high risk to the rights and freedoms of natural persons. The assessment must
be carried out especially if one of the rule examples set forth in Art. 35(3) of the GDPR is
relevant. In order to specify the open-ended wording of the law regarding the basic obligation
to perform a privacy impact assessment, the supervisory authorities are involved. In a first
draft, the Article 29 Working Party created a catalogue of ten criteria which indicate that the
processing bears a high risk to the rights and freedoms of a natural person. These are for
example scoring/profiling, automatic decisions which lead to legal consequences for those
impacted, systematic monitoring, processing of special personal data, data which is processed
on a large scale, the merging or combining of data which was gathered by various processes,
data about incapacitated persons or those with limited ability to act, use of newer technologies
or biometric procedures, data transfer to countries outside the EU/EEA and data processing
which hinders those involved in exercising their rights. A privacy impact assessment is not
absolutely necessary if a processing operation only fulfils one of these criteria. However, if
several criteria are met, the risk for the data subjects is expected to be high and a data protection
impact assessment is always required. If there is doubt and it is difficult to determine a high
risk, a DPIA should nevertheless be conducted. This process must be repeated at least every
three years.

In addition, the national supervisory authorities have to establish and publish a list of
processing operations which always require a data protection impact assessment in their
jurisdiction (positive list). They are also free to publish a list of processing activities which
specifically do not require a privacy impact assessment (negative list). If a company has
appointed a Data Protection Officer, the officer's advice must be taken into account when conducting a
DPIA. How and by what criteria the consequences and risks for the data subjects are assessed
remains largely unanswered. The first templates were guided by the inspection schemes of ISO
standards or the Standard Data Protection Model.

PRIVACY AND DATA PROTECTION FRAMEWORK
Evolution of Data Privacy

Data Privacy Day is held on 28 January each year on the anniversary of when the Council of Europe's
('CoE') Convention of Protection of Individuals with regard to Automatic Processing of Personal Data
('Convention 108') was opened for signature, representing the first legally binding international treaty
for privacy and data protection. OneTrust Data Guidance has compiled a short legal history of the right
to privacy and data protection across the globe, highlighting the fundamental historical and legal
developments, which have laid the foundations for the right of privacy and data protection as it exists
today.

A brief timeline

The right to privacy integrated into the rule of law through constitutions
The right to privacy had legal relevance long before the recent data protection regulations,
finding some of its foundations in constitutional law. References to the protection of this right
have often been included in the constitutional charters of countries around the world.

1789: US Bill of Rights

The Bill of Rights details a 'right of people to be secure in their persons, houses, papers, and
effects, against unreasonable searches and seizures'.

1890: The right to be let alone

American lawyers Samuel Warren and Louis Brandeis publish their ground-breaking Article,
'The Right to Privacy', in the Harvard Law Review, where privacy is described as 'the right to
be let alone'.

Brandeis and Warren identified technology as the driving force behind the development and
subsequent protection of privacy, warning that 'instantaneous photographs and newspaper
enterprise have invaded the sacred precincts of private and domestic life'.

1948: United Nations Declaration of Human Rights

The United Nations ('UN') Declaration of Human Rights, a milestone document in the history
of human rights drafted by a UN committee chaired by Eleanor Roosevelt, enshrines a
rudimentary right to privacy. Article 12 provides that 'No one shall be subjected to arbitrary
interference with his privacy, family, home, or correspondence, nor to attacks upon his honour
and reputation. Everyone has the right to the protection of the law against such interference or
attacks'.

1950: European Convention on Human Rights

Drafted by the CoE, Article 8 of the European Convention on Human Rights ('ECHR') follows
the UN Declaration of Human Rights and provides protection for an individual's 'private and
family life, his home and his correspondence', although subject to certain restrictions that are
'in accordance with law' and 'necessary in a democratic society'.

1974: FERPA and Privacy Act

Enacted in 1974, the American Family Educational Rights and Privacy Act ('FERPA') protects
the privacy of student education records. Among other things, it provides parents the right to
have access to their children's education records, the right to seek to have the records amended,
and the right to have some control over the disclosure of personally identifiable information
from the education record.

In the same year, the Privacy Act of 1974 was enacted in the US and established a Code of
Fair Information Practice on the collection, maintenance, use, and dissemination of personally
identifiable information by federal agencies. This created one of the first frameworks for
balancing the need to process information about individuals with the rights of individuals to be
protected against unjustified invasions of their privacy.

1980: OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data

The OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data
('OECD Guidelines') is one of the first international efforts toward a harmonised privacy
framework, although they are not legally binding for members. In particular, among other
things, the OECD guidelines established the seven principles of notice, purpose, consent,
security, disclosure, access, and accountability.

1981: Convention 108

Convention 108 is the first binding international instrument aimed at protecting individuals
against abuses derived from the collection and processing of personal data, and sought to
regulate the cross-border flow of personal data. Convention 108 contains foundational concepts
which are reflected in modern data protection and privacy laws today.

1995: EU Data Protection Directive

The European Union's Data Protection Directive (Directive (EU) 95/46/EC) ('the Data
Protection Directive') is the predecessor to the General Data Protection Regulation (Regulation
(EU) 2016/679) ('GDPR') and was the first instrument aimed at harmonised data protection
within the union. It established foundational data protection principles that would later be
enshrined in the GDPR, such as, transparency and proportionality. The Data Protection
Directive created a baseline of data protection that was echoed in data protection legislation
globally.

1996-1999: United States' HIPAA, COPPA, and GLBA

The late-1990s saw increased sector-specific privacy regulation in the US. Three significant
privacy legislations which continue to shape the US privacy landscape were enacted, namely:
the Health Insurance Portability and Accountability Act of 1996 ('HIPAA'), the Children's
Online Privacy Protection Act ('COPPA'), and the Gramm–Leach–Bliley Act ('GLBA').

HIPAA is applicable in the health care sector and establishes requirements for the processing
of healthcare information and protects personally identifiable information processed by
healthcare and health insurance industries. COPPA, on the other hand, provides protections for
minors and regulates the processing of personal information of children under 13 years of age, both in
and outside of the US. It includes information about what organisations must include in their
privacy policies, how to verify consent from parents, and how websites should protect
children's safety and privacy online.

The GLBA, which is applicable to the financial sector, stipulates that companies are under a
duty to outline their information sharing practices to all customers, including the kind of
information they collect and the third parties with whom they may share the information. The
GLBA also requires that financial institutions provide customers the right to opt out of third-
party disclosures of this kind.

2002: ePrivacy Directive

In force since 2002, the Directive on Privacy and Electronic Communications (2002/58/EC)
(as amended) ('the ePrivacy Directive') was designed to meet the needs of digital technologies,
complement the Data Protection Directive, and cover all issues of private electronic
communication, while also improving transparency and security for users. Importantly, the
ePrivacy Directive established a regulation governing cookies and tracking technology, which
was largely unregulated at the time.

2005: APEC Privacy Framework

The APEC Privacy Framework, published in 2005, is intended to provide clear guidance and
direction to businesses and government entities in APEC economies on common privacy issues
and the impact of privacy upon the way legitimate business practices and government functions
are to be conducted. Moreover, the APEC Privacy Framework was modelled on the OECD
Guidelines, although it has been shaped to tackle the different legal characteristics and context
of the APEC region. In particular, the APEC Privacy Framework establishes nine principles:
preventing harm, notice, collection limitation, uses of personal information, choice,
integrity of personal information, security safeguards, access and correction, and
accountability.

2012: European Charter of Fundamental Rights of the European Union

The Charter of Fundamental Rights of the European Union ('the Charter') is the
second legal tool to ensure the protection of fundamental and human rights in Europe after the
ECHR. While the ECHR was drafted by the CoE and applies to 47 Member States, the Charter
applies only to the EU Member States. Interestingly, Article 7 of the Charter and the
abovementioned Article 8 of the ECHR both provide for a similar right of privacy for 'private
and family life, home and communications'; however Article 8 of the Charter goes further and
provides a separate and distinct right to data protection, stating that 'everyone has the right to
the protection of personal data concerning him or her'.

2013-2020: Schrems I and II

On 6 October 2015, in Maximillian Schrems v. Data Protection Commissioner (C-362/14) ('the Schrems I Case'), the Court of Justice of the European Union ('CJEU') held that the European Commission's Safe Harbor adequacy decision failed to guarantee adequate data protection safeguards, and invalidated the Safe Harbor framework. Subsequently, in 2020, the CJEU, in its decision Data Protection
Commissioner v. Facebook Ireland Limited, Maximillian Schrems (C-311/18) ('the Schrems II
Case'), declared invalid the European Commission's decision on the adequacy of the protection
offered by the EU-US Privacy Shield, the mechanism that replaced the Safe Harbor regime in
2016.

Specifically, the CJEU ruled that the US regulations on access and use by US authorities of
data originating in the EU had limitations that did not meet the standards of adequacy required
by EU law, in light of the principle of proportionality. Indeed, the CJEU considered that
surveillance programmes based on US law were not limited to what is strictly necessary and
proportional as required by EU law.

Despite this ruling, the CJEU upheld the general validity of Standard Contractual Clauses
('SCCs'), but emphasised that organisations relying on SCCs must 'verify, on a case-by-case
basis and, where appropriate, in collaboration with the recipient of the data, whether the law of
the third country ensures adequate protection, under EU law, pursuant to the SCC, and where
necessary, adopt additional safeguards to those offered by those clauses'.

2014: Malabo Convention

The African Union Convention on Cyber Security and Personal Data Protection ('the Malabo
Convention'), adopted in 2014, is an important data protection international agreement in
Africa, and it aims to establish a legal framework for cybersecurity and data protection within
the African Union Member States, as well as defines objectives for the same. Moreover, the
preamble of the Malabo Convention further highlights that it seeks to address the need for
harmonised legislation in the area of electronic commerce, personal data protection, and
cybersecurity in Member States, and establish in each Member State a mechanism capable of
combating violations of privacy that may be generated by personal data collection, processing,
transmission, storage, and use.

2016-2018: Introduction of the GDPR

In 2016, the EU adopted the GDPR, which entered into effect on 25 May 2018, replacing the
Data Protection Directive and updating EU privacy legislation for the age of the internet. The
GDPR is considered a privacy benchmark, due to the comprehensive nature of the Regulation.
Its definitions, data protection principles, data subject rights, as well as obligations for
controllers and processors have been replicated in numerous laws and initiatives around the
world. The GDPR's extra-territorial application encourages national supervisory authorities to
increase their enforcement, which has led to increased enforcement actions and case law from
national courts as well as the CJEU.

2017: European Commission proposal for ePrivacy Regulation

The Draft ePrivacy Regulation was originally proposed in 2017, following which there have
been several discussions and new drafts released. The Draft ePrivacy Regulation has been
designed to update requirements related to privacy and electronic communications and
harmonise these with the GDPR. Currently, negotiations are still ongoing and it remains to be
seen when the Draft ePrivacy Regulation will be finalised and become law.

2020: CCPA

In 2020, California became the first US State with a comprehensive data protection law in effect.
The California Consumer Privacy Act ('CCPA') creates obligations for certain businesses
operating in California and provides certain rights for consumers, such as the right of access,
the right of deletion, and the right to opt-out of the sale of their personal information. The
CCPA has subsequently inspired data protection legislation passed and introduced in other US States, and at the federal level.

2021: China's Personal Information Protection Law

The Personal Information Protection Law ('PIPL') is China's first comprehensive data protection law and outlines requirements for personal information handlers. In line with international standards, the PIPL establishes duties for personal information handlers, such as the appointment of a personal information protection officer, includes provisions on conducting personal information protection impact assessments, creates restrictions on international data transfers, and provides individual rights.

The right to privacy in 2023 and beyond

Privacy and data protection is an ever-changing field with new and amended data protection
and privacy laws emerging to keep pace with technological advancements. In particular, 2023
saw and will continue to see the expansion of privacy laws with a number of privacy laws
entering into effect and progressing through national legislatures. A notable mention is the
American Data Privacy and Protection Act ('ADPPA'), which represents the first bipartisan federal data protection legislation in the US and is currently under consideration in the U.S. House of Representatives. If adopted, the ADPPA would significantly change the existing privacy landscape in the US.

On the other side of the ocean, the European legislator has not stood idly by; the future of privacy regulation in Europe also promises to be eventful. More specifically, the following regulations, among other things, will start to apply:

• Regulation (EU) 2022/868 of 30 May 2022 on European Data Governance and amending Regulation (EU) 2018/1724 (Data Governance Act) will apply in full from 24 September 2023.
• Regulation (EU) 2022/1925 of 14 September 2022 on Contestable and Fair Markets in
the Digital Sector and Amending Directives (EU) 2019/1937 and (EU) 2020/1828
(Digital Markets Act) will move into its implementation phase and start to apply as of
2 May 2023.
• Regulation (EU) 2022/2065 of 19 October 2022 on a Single Market For Digital Services and Amending Directive 2000/31/EC (Digital Services Act) will apply across the territory of the EU from 15 months after its entry into force or from 1 January 2024, whichever is later.

DATA PROTECTION LAW IN EU
The EU's data protection laws have long been regarded as a gold standard all over the world.
Over the last 25 years, technology has transformed our lives in ways nobody could have
imagined, so a review of the rules was needed.

In 2016, the EU adopted the General Data Protection Regulation (GDPR), one of its greatest
achievements in recent years. It replaces the 1995 Data Protection Directive, which was adopted
at a time when the internet was in its infancy.

The GDPR is now recognised as law across the EU. Member States had two years to ensure that it was fully implementable in their countries by May 2018.

The timeline below contains key dates and events in the data protection reform process from
1995 to 2018.

22 June 2011

EDPS Opinion on EC Communication 'A comprehensive approach on personal data protection in EU'

The European Data Protection Supervisor publishes an Opinion on the European Commission's
Communication.

Did you know

The GDPR kickstarts the updating of other regulations...

• Member States are entitled to provide specific rules or derogations to the GDPR, where
freedom of expression and information is concerned; or in the context of employment
law; or to preserve scientific or historical research.
• Similarly, the entry into force of the GDPR requires the updating of other EU
regulations, such as the revision of the ePrivacy directive which regulates the
confidentiality of communications and the use of cookies, or Regulation 45/2001 which
applies to the EU institutions when they process personal data.

25 January 2012

EC proposal to strengthen online privacy rights and digital economy

The European Commission proposes a comprehensive reform of the EU's 1995 data protection
rules to strengthen online privacy rights and boost Europe's digital economy.

Did you know

Expanded territorial reach

• Organisations established outside the EU, offering goods and services to, or monitoring
individuals in the EU, must comply with the GDPR and designate a representative in
the EU.

7 March 2012

EDPS Opinion on EC data protection reform package

The European Data Protection Supervisor adopts an Opinion on the Commission's data
protection reform package.

Did you know

Accountability

• The accountability principle means that organisations and any third parties who help
them in their data processing activities must be able to demonstrate that they comply
with data protection principles. This includes for instance, documenting their
processing activities to prove that they adopted appropriate measures and steps to
implement their obligations. In certain cases, organisations will have to carry out a data
protection impact assessment.

23 March 2012

WP29 Opinion on data protection reform proposal

The Article 29 Working Party adopts an Opinion on the data protection reform proposal.

Did you know

Consent

• Consent of the individual is one of the few circumstances under which an organisation
may lawfully process personal data. Consent must be freely given, informed and
unambiguous. Individuals may withdraw their consent at any time. In addition, consent
to process sensitive personal data as well as consent to transfer personal data outside
the EU must be explicit.
• Parental consent is required for children below the age of 16, although Member States may lower this threshold to an age no lower than 13.

5 October 2012

WP29 update on data protection reform

The Article 29 Working Party provides further input on the data protection reform discussions.

Did you know

Data breach notification

• Organisations must notify data breaches to their data protection authority within 72
hours unless the breach is unlikely to pose a risk for individuals. In specific cases, they
will have to inform the affected individuals.

12 March 2014

EP adopts GDPR

The European Parliament demonstrates strong support for the GDPR by voting in plenary with
621 votes in favour, 10 against and 22 abstentions.

Did you know

One-Stop-Shop & Consistency Mechanism

• The GDPR introduces a single point of contact for cross-border data protection matters.
Where the processing organisation is established in several Member States and/or
where individuals in several Member States are affected, the supervisory authority in
the Member State where the organisation has its main establishment will be the lead
authority, responsible for adopting measures directed at the organisation, in cooperation
with all involved supervisory authorities.

15 June 2015

The Council reaches a general approach on the GDPR

Did you know

The European Data Protection Board

• The European Data Protection Board will replace the Article 29 Working Party. The
European Data Protection Supervisor will provide the secretariat for this new,
independent European body of which all European data protection authorities will be
members. The role of the EDPB will be to ensure the consistency of the application of
the GDPR throughout the Union, through guidelines, opinions and decisions.

27 July 2015

EDPS recommendations on the final text of the GDPR

The European Data Protection Supervisor publishes his recommendations to the European co-
legislators negotiating the final text of the GDPR in the form of drafting suggestions. He also
launches a mobile app comparing the Commission's proposal with the latest texts from the
Parliament and the Council.

Did you know

Fines

• The GDPR introduces fines for organisations breaching EU data protection law, which can amount to €20 million or 4% of the company’s worldwide annual turnover, whichever is higher.

15 December 2015

EP, Council and EC reach an agreement on the GDPR

The European Parliament, the Council and the Commission reach an agreement on the GDPR.

Did you know

International Data Transfers

• The GDPR ensures that the rights and safeguards it provides to individuals in the EU
are preserved when their data are transferred outside of the Union
• The European Commission will continue to adopt adequacy decisions where a country
offers a legal framework for data protection that is essentially equivalent to the EU.
• Without an adequacy decision, data can be transferred to a country if the processing
organisation puts in place binding corporate rules, contractual clauses or other
appropriate safeguards.
• Without these, transfers can only take place under strict circumstances, for example,
with the consent of the individual or where the transfer is necessary for the conclusion
or the performance of a contract.

2 February 2016

The Article 29 Working Party issues an action plan for the implementation of the GDPR

Did you know

Your Data Protection Rights

• The GDPR reinforces a wide range of existing rights and establishes new ones for
individuals. These include the:
• Right of data portability: You have the right to receive your personal data from an
organisation in a commonly used form so that you can easily share it with another.
• Right not to be profiled: Unless it is necessary by law or a contract, decisions affecting
you cannot be made on the sole basis of automated processing.

27 April 2016

General Data Protection Regulation

Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on
the protection of natural persons with regard to the processing of personal data and on the free
movement of such data, and repealing Directive 95/46/EC (General Data Protection
Regulation)

Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on
the protection of natural persons with regard to the processing of personal data by competent
authorities for the purposes of the prevention, investigation, detection or prosecution of
criminal offences or the execution of criminal penalties, and on the free movement of such
data, and repealing Council Framework Decision 2008/977/JHA

24 May 2016

The Regulation enters into force, 20 days after publication in the Official Journal of the EU

Did You Know

Your Data Protection Rights

• The GDPR reinforces a wide range of existing rights and establishes new ones for
individuals including:
• the right to erasure (right to be forgotten); you can request that an organisation delete
your personal data, for instance where your data are no longer necessary for the
purposes for which they were collected or where you have withdrawn your consent.

10 January 2017

EC proposes two new regulations on privacy and electronic communications and on the data
protection rules applicable to EU institutions

The European Commission proposes two new regulations on privacy and electronic
communications (ePrivacy) and on the data protection rules applicable to EU institutions
(currently Regulation 45/2001) that align the existing rules to the GDPR.

Did You Know

Privacy By Default

• Organisations processing personal data must take measures to ensure that the data is
protected by default.
Privacy by default requires that technical and organisational measures are put in place
so that only the necessary personal information is processed, used or accessed for a
specified purpose.

6 May 2018

Data Protection Directive for the police and justice sectors into national legislation applicable
from this day

Member States must have transposed the Data Protection Directive for the police and justice
sectors into national legislation. It will be applicable from this day.

Did You Know

Privacy by Design

• Organisations processing personal data must take measures to ensure that the data is
protected by design.
The aim of privacy by design is to build privacy and data protection into the design and
architecture of information and communication systems and technologies so that they
comply with privacy and data protection principles.

22 May 2018

Proposal for a Regulation on the protection of personal data in EU institutions

Proposal for a Regulation of the European Parliament and of the Council on the protection of
individuals with regard to the processing of personal data by the Union institutions, bodies,
offices and agencies and on the free movement of such data, and repealing Regulation (EC) No
45/2001 and Decision No 1247/2002/EC [First reading] - Preparation for the trilogue

25 May 2018

Corrigendum

Corrigendum to Regulation (EU) 2016/679 of the European Parliament and of the Council of
27 April 2016 on the protection of natural persons with regard to the processing of personal
data and on the free movement of such data, and repealing Directive 95/46/EC (General Data
Protection Regulation)

Corrigendum to Directive (EU) 2016/680 of the European Parliament and of the Council of 27
April 2016 on the protection of natural persons with regard to the processing of personal data
by competent authorities for the purposes of the prevention, investigation, detection or
prosecution of criminal offences or the execution of criminal penalties, and on the free
movement of such data, and repealing Council Framework Decision 2008/977/JHA

The General Data Protection Regulation will apply from this day

Did you know

Appointment of a Data Protection Officer

• Some organisations, for instance those whose core activities involve regular and
systematic monitoring of personal or sensitive data on a large scale as well as public
sector organisations, will have to appoint a Data Protection Officer to ensure they
comply with the GDPR.

What is the GDPR? Europe’s new data privacy and security law includes hundreds of
pages’ worth of new requirements for organizations around the world. This GDPR
overview will help you understand the law and determine what parts of it apply to you.

The General Data Protection Regulation (GDPR) is the toughest privacy and security law
in the world. Though it was drafted and passed by the European Union (EU), it imposes
obligations onto organizations anywhere, so long as they target or collect data related to people
in the EU. The regulation was put into effect on May 25, 2018. The GDPR will levy harsh fines
against those who violate its privacy and security standards, with penalties reaching into the
tens of millions of euros.

With the GDPR, Europe is signaling its firm stance on data privacy and security at a time when
more people are entrusting their personal data with cloud services and breaches are a daily
occurrence. The regulation itself is large, far-reaching, and fairly light on specifics, making
GDPR compliance a daunting prospect, particularly for small and medium-sized enterprises
(SMEs).

We created this website to serve as a resource for SME owners and managers to address specific
challenges they may face. While it is not a substitute for legal advice, it may help you to
understand where to focus your GDPR compliance efforts. We also offer tips on privacy
tools and how to mitigate risks. As the GDPR continues to be interpreted, we’ll keep you up to
date on evolving best practices.

If you’ve found this page — “what is the GDPR?” — chances are you’re looking for a crash
course. Maybe you haven’t even found the document itself yet (tip: here’s the full regulation).
Maybe you don’t have time to read the whole thing. This page is for you. In this article, we try
to demystify the GDPR and, we hope, make it less overwhelming for SMEs concerned about
GDPR compliance.

History of the GDPR

The right to privacy is part of the 1950 European Convention on Human Rights, which
states, “Everyone has the right to respect for his private and family life, his home and his
correspondence.” From this basis, the European Union has sought to ensure the protection of
this right through legislation.

As technology progressed and the Internet was invented, the EU recognized the need for
modern protections. So in 1995 it passed the European Data Protection Directive, establishing
minimum data privacy and security standards, upon which each member state based its own
implementing law. But already the Internet was morphing into the data Hoover it is today. In
1994, the first banner ad appeared online. In 2000, a majority of financial institutions offered
online banking. In 2006, Facebook opened to the public. In 2011, a Google user sued the
company for scanning her emails. Two months after that, Europe’s data protection authority
declared the EU needed “a comprehensive approach on personal data protection” and work
began to update the 1995 directive.

The GDPR entered into force in 2016 after passing the European Parliament, and as of May 25,
2018, all organizations were required to be compliant.

Scope, penalties, and key definitions

First, if you process the personal data of EU citizens or residents, or you offer goods or services
to such people, then the GDPR applies to you even if you’re not in the EU. We talk more
about this in another article.

Second, the fines for violating the GDPR are very high. There are two tiers of penalties,
which max out at €20 million or 4% of global revenue (whichever is higher), plus data subjects
have the right to seek compensation for damages. We also talk more about GDPR fines.

The GDPR defines an array of legal terms at length. Below are some of the most important
ones that we refer to in this article:

Personal data — Personal data is any information that relates to an individual who can be
directly or indirectly identified. Names and email addresses are obviously personal data.
Location information, ethnicity, gender, biometric data, religious beliefs, web cookies, and
political opinions can also be personal data. Pseudonymous data can also fall under the
definition if it’s relatively easy to ID someone from it.

Data processing — Any action performed on data, whether automated or manual. The
examples cited in the text include collecting, recording, organizing, structuring, storing, using,
erasing… so basically anything.

Data subject — The person whose data is processed. These are your customers or site visitors.

Data controller — The person who decides why and how personal data will be processed. If
you’re an owner or employee in your organization who handles data, this is you.

Data processor — A third party that processes personal data on behalf of a data controller.
The GDPR has special rules for these individuals and organizations. They could include cloud
servers like Tresorit or email service providers like Proton Mail.

What the GDPR says about…

For the rest of this article, we will briefly explain all the key regulatory points of the GDPR.

Data protection principles

If you process data, you have to do so according to seven protection and accountability
principles outlined in Article 5.1-2:

1. Lawfulness, fairness and transparency — Processing must be lawful, fair, and transparent to the data subject.
2. Purpose limitation — You must process data for the legitimate purposes specified
explicitly to the data subject when you collected it.
3. Data minimization — You should collect and process only as much data as absolutely
necessary for the purposes specified.
4. Accuracy — You must keep personal data accurate and up to date.
5. Storage limitation — You may only store personally identifying data for as long as
necessary for the specified purpose.
6. Integrity and confidentiality — Processing must be done in such a way as to ensure
appropriate security, integrity, and confidentiality (e.g. by using encryption).
7. Accountability — The data controller is responsible for being able to demonstrate
GDPR compliance with all of these principles.

Accountability

The GDPR says data controllers have to be able to demonstrate they are GDPR compliant. And
this isn’t something you can do after the fact: If you think you are compliant with the GDPR
but can’t show how, then you’re not GDPR compliant. Among the ways you can do this:

• Designate data protection responsibilities to your team.


• Maintain detailed documentation of the data you’re collecting, how it’s used, where it’s
stored, which employee is responsible for it, etc.
• Train your staff and implement technical and organizational security measures.
• Have Data Processing Agreement contracts in place with third parties you contract to
process data for you.
• Appoint a Data Protection Officer (though not all organizations need one — more on
that in this article).

Data security

You’re required to handle data securely by implementing “appropriate technical and organizational measures.”

Technical measures mean anything from requiring your employees to use two-factor
authentication on accounts where personal data are stored to contracting with cloud providers
that use end-to-end encryption.

Organizational measures are things like staff trainings, adding a data privacy policy to your
employee handbook, or limiting access to personal data to only those employees in your
organization who need it.

If you have a data breach, you have 72 hours to notify the supervisory authority, and you must inform affected data subjects without undue delay where the breach puts them at high risk, or face penalties. (The requirement to notify data subjects may be waived if you use technological safeguards, such as encryption, to render the data useless to an attacker.)
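As an illustration of the kind of technical measure referred to above, the following is a minimal Python sketch of encrypting a personal-data record at rest. It assumes the third-party 'cryptography' package is installed, and the record contents and key-handling approach are hypothetical examples rather than a prescribed GDPR control.

from cryptography.fernet import Fernet

# In practice the key would live in a secrets manager or key vault, never in source code.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"name=Jane Doe;email=jane@example.com"  # hypothetical personal data
encrypted = cipher.encrypt(record)                # only the ciphertext is stored
decrypted = cipher.decrypt(encrypted)             # readable only with the key
assert decrypted == record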

Data protection by design and by default

From now on, everything you do in your organization must, “by design and by default,”
consider data protection. Practically speaking, this means you must consider the data protection
principles in the design of any new product or activity. The GDPR covers this principle
in Article 25.

Suppose, for example, you’re launching a new app for your company. You have to think about
what personal data the app could possibly collect from users, then consider ways to minimize
the amount of data and how you will secure it with the latest technology.

When you’re allowed to process data

Article 6 lists the instances in which it’s legal to process personal data. Don’t even think about
touching somebody’s personal data — don’t collect it, don’t store it, don’t sell it to advertisers
— unless you can justify it with one of the following:

1. The data subject gave you specific, unambiguous consent to process the data. (e.g.
They’ve opted in to your marketing email list.)
2. Processing is necessary to execute or to prepare to enter into a contract to which the
data subject is a party. (e.g. You need to do a background check before leasing property
to a prospective tenant.)
3. You need to process it to comply with a legal obligation of yours. (e.g. You receive
an order from the court in your jurisdiction.)
4. You need to process the data to save somebody’s life. (e.g. Well, you’ll probably know
when this one applies.)
5. Processing is necessary to perform a task in the public interest or to carry out some
official function. (e.g. You’re a private garbage collection company.)
6. You have a legitimate interest to process someone’s personal data. This is the most
flexible lawful basis, though the “fundamental rights and freedoms of the data subject”
always override your interests, especially if it’s a child’s data. (It’s difficult to give an
example here because there are a variety of factors you’ll need to consider for your
case. The UK Information Commissioner’s Office provides helpful guidance here.)

Once you’ve determined the lawful basis for your data processing, you need to document this
basis and notify the data subject (transparency!). And if you decide later to change your
justification, you need to have a good reason, document this reason, and notify the data subject.

Consent

There are strict new rules about what constitutes consent from a data subject to process their
information.

• Consent must be “freely given, specific, informed and unambiguous.”


• Requests for consent must be “clearly distinguishable from the other matters” and
presented in “clear and plain language.”
• Data subjects can withdraw previously given consent whenever they want, and you
have to honor their decision. You can’t simply change the legal basis of the processing
to one of the other justifications.
• Children below the age of 16 (or a lower age set by the Member State, which cannot be below 13) can only give consent with permission from a parent.
• You need to keep documentary evidence of consent.

Data Protection Officers

Contrary to popular belief, not every data controller or processor needs to appoint a Data
Protection Officer (DPO). There are three conditions under which you are required to appoint
a DPO:

1. You are a public authority other than a court acting in a judicial capacity.
2. Your core activities require you to monitor people systematically and regularly on a
large scale. (e.g. You’re Google.)
3. Your core activities are large-scale processing of special categories of data listed
under Article 9 of the GDPR or data relating to criminal convictions and offenses
mentioned in Article 10. (e.g. You’re a medical office.)

You could also choose to designate a DPO even if you aren’t required to. There are benefits to
having someone in this role. Their basic tasks involve understanding the GDPR and how it
applies to the organization, advising people in the organization about their responsibilities,
conducting data protection trainings, conducting audits and monitoring GDPR compliance, and
serving as a liaison with regulators.

We go in depth about the DPO role in another article.

People’s privacy rights

You are a data controller and/or a data processor. But as a person who uses the Internet, you’re
also a data subject. The GDPR recognizes a litany of new privacy rights for data subjects,
which aim to give individuals more control over the data they loan to organizations. As an
organization, it’s important to understand these rights to ensure you are GDPR compliant.

Below is a rundown of data subjects’ privacy rights:

1. The right to be informed
2. The right of access
3. The right to rectification
4. The right to erasure
5. The right to restrict processing
6. The right to data portability
7. The right to object
8. Rights in relation to automated decision making and profiling.

Conclusion

We’ve just covered all the major points of the GDPR in a little over 2,000 words.
The regulation itself (not including the accompanying directives) is 88 pages. If you’re
affected by the GDPR, we strongly recommend that someone in your organization reads it and
that you consult an attorney to ensure you are GDPR compliant.

Evolution of Data Protection Laws in UK

In the UK, the key pieces of legislation governing data protection are the UK General Data Protection Regulation (Regulation (EU) 2016/679) ('UK GDPR') and the Data Protection Act
2018 ('the Act').

The current version of the legislative framework (as amended, following the withdrawal of the
UK from the European Union on 31 January 2020) has applied in the UK since 1 January 2021.

In respect of electronic communications (in particular marketing activities), the Privacy and
Electronic Communications (EC Directive) Regulations 2003 ('PECR') sit alongside the UK
GDPR and the Act, providing a further set of specialised rules.

The Retained EU Law (Revocation and Reform) Bill ('REUL'), currently before the UK
Parliament ('Parliament'), will significantly impact the UK data protection framework. At
present, the REUL will 'sunset' most of EU laws which were retained as part of UK law after
Brexit. The effect of this will be that the current UK data protection framework (in particular
the UK GDPR and PECR) will expire on 31 December 2023, unless the law is specifically
'assimilated' into domestic law or the 'sunset' is extended (potentially until 2026).

The Act will be unaffected by the operation of the REUL, but its provisions are supplementary to the UK GDPR and cannot stand alone as a full framework for processing carried out by the vast majority of controllers and processors (the exception being law enforcement and intelligence services processing). Unless the Act is amended, it is to be expected that the UK Government ('the Government') will announce a programme for data protection law reform later in 2023.

A previously proposed law, the Data Protection and Digital Information Bill, which would amend the UK GDPR and the Act, was withdrawn in September 2022 after a change of Government. It is unclear whether it will return to Parliament in the same or an amended form.

This note therefore sets out the known UK law position until 31 December 2023.

DATA PROTECTION LAWS IN USA
1. Relevant Legislation and Competent Authorities
1.1 What is the principal data protection legislation?
There is no single principal data protection legislation in the United States (U.S.). Rather, a
jumble of hundreds of laws enacted on both the federal and state levels serve to protect the
personal data of U.S. residents. At the federal level, the Federal Trade Commission Act (FTC
Act) (15 U.S. Code § 41 et seq.) broadly empowers the U.S. Federal Trade Commission (FTC)
to bring enforcement actions to protect consumers against unfair or deceptive practices and to
enforce federal privacy and data protection regulations. The FTC has taken the position that
“deceptive practices” include a company’s failure to comply with its published privacy
promises and its failure to provide adequate security of personal information, in addition to its
use of deceptive advertising or marketing methods.

As described more fully below, other federal statutes primarily address specific sectors, such
as financial services or healthcare. In parallel to the federal regime, state-level statutes protect
a wide range of privacy rights of individual residents. The protections afforded by state statutes
often differ considerably from one state to another, and some are comprehensive, while others
cover areas as diverse as protecting library records to keeping homeowners free from drone
surveillance.

1.2 Is there any other general legislation that impacts data protection?
Although there is no general federal legislation impacting data protection, there are a number
of federal data protection laws that are sector-specific (see question 1.3 below), or focus on
particular types of data. By way of example, the Driver’s Privacy Protection Act of 1994
(DPPA) (18 U.S. Code § 2721 et seq.) governs the privacy and disclosure of personal
information gathered by state Departments of Motor Vehicles. Children’s information is
protected at the federal level under the Children’s Online Privacy Protection Act (COPPA) (15
U.S. Code § 6501), which prohibits the collection of any information from a child under the
age of 13 online and from digitally connected devices, and requires publication of privacy
notices and collection of verifiable parental consent when information from children is being
collected. The Video Privacy Protection Act (VPPA) (18 U.S. Code § 2710 et seq.) restricts
the disclosure of rental or sale records of videos or similar audio-visual materials, including
online streaming. Similarly, the Cable Communications Policy Act of 1984 includes
provisions dedicated to the protection of subscriber privacy (47 U.S. Code § 551). Finally,
even in the absence of legislation, presidential administrations are often active with
rulemaking, executive orders, and other authorities. For example, in March 2023, the Biden-
Harris administration released its National Cybersecurity Strategy.

State laws also may impose restrictions and obligations on businesses relating to the collection,
use, disclosure, security, or retention of special categories of information, such as biometric
data, medical records, social security numbers, driver’s licence information, email addresses,
library records, television viewing habits, financial records, tax records, insurance information,
criminal justice information, phone records, and education records, just to name some of the
most common.

Every state has adopted data breach notification legislation that applies to certain types of
personal information about its residents. Even if a business does not have a physical presence
in a particular state, it typically must comply with the state’s laws when faced with the
unauthorised access to, or acquisition of, personal information it collects, holds, transfers or
processes about that state’s residents. The types of information subject to these laws vary, with
most states defining personal information to include an individual’s first name or first initial
and last name, together with a data point including the individual’s SSN, driver’s licence or
state identification card number, financial account number or payment card information.

Some states are more active than others when it comes to data protection. Massachusetts, for
example, has strong data protection regulations (201 CMR 17.00), requiring any entity that
receives, stores, maintains, processes, or otherwise has access to “personal information” of a
Massachusetts resident in connection with the provision of goods or services, or in connection
with employment, (a) to implement and maintain a comprehensive written information security
plan (WISP) addressing 10 core standards, and (b) to establish and maintain a formal
information security programme that satisfies eight core requirements, which range from
encryption to information security training.

In 2019, New York expanded its data breach notification law to include the express requirement
that entities develop, implement and maintain “reasonable” safeguards to protect the security,
confidentiality and integrity of private information. Significantly, New York’s Stop Hacks and
Improve Electronic Data Security Act (SHIELD Act) (N.Y. Gen Bus. Law § 899-bb) identified
certain administrative, technical, and physical safeguards which, if implemented, are deemed
to satisfy New York’s reasonableness standard under the law. Previously, New York
prioritised the regulation of certain financial institutions doing business in the state, by setting
minimum cybersecurity standards, with requirements for companies to perform periodic risk
assessments and file annual compliance certifications (23 NYCRR 500).

Illinois has a uniquely expansive state law (740 ILCS 14/), which imposes requirements on
businesses that collect or otherwise obtain biometric information. The Illinois Biometric
Information Privacy Act (BIPA) is notable as, at the time of writing, the only state law
regulating biometric data usage that allows private individuals to sue and recover damages for
violations. In January 2019, the Illinois Supreme Court offered an expansive reading of the
protections of BIPA, holding that the law does not require individuals to show they suffered
harm other than a violation of their legal rights to sue. Recent decisions have continued the
trend toward an expansive reading of BIPA. In February 2023, the Illinois Supreme Court held
that a company violated BIPA every time it took employees’ fingerprints to clock in and out
of their shifts, not simply once for each affected employee. Also in February 2023, the Illinois
Supreme Court held that claims brought under BIPA are subject to a five-year statute of
limitations.

California has a long history of adopting privacy-forward legislation, and in 2018, the state
enacted the California Consumer Privacy Act (CCPA), which became effective on January 1,
2020. The law introduced new obligations on covered businesses, including requirements to
disclose the categories of personal information the business collects about consumers, the
specific pieces of personal information the business collected about the consumer, the
categories of sources from which the personal information is collected, the business or
commercial purpose for collecting or selling personal information, and the categories of third
parties with which the business shares personal information. It also introduced new rights for
California residents, including the right to request access to and deletion of personal
information and the right to opt out of having personal information sold to third parties.

More recently, we have seen a number of states push towards enacting similar comprehensive
consumer data privacy laws. Specifically, in 2020, California amended the CCPA with the
California Privacy Rights Act (CPRA), which expanded the rights granted to consumers and
increased compliance obligations on businesses. In 2021, Virginia enacted the Consumer Data
Protection Act (Virginia CDPA) becoming the second state with a comprehensive data privacy
law, followed shortly thereafter by Colorado, which enacted the Colorado Privacy
Act. Continuing this trend, in 2022, Utah enacted the Utah Consumer Privacy Act and
Connecticut enacted an Act Concerning Personal Data Privacy and Online Monitoring
(Connecticut Privacy Act). In 2023, Iowa and Indiana enacted comprehensive data privacy laws, and the state legislatures of Montana and Tennessee passed comprehensive data privacy laws that await their respective Governors’ signatures. In the
absence of a data privacy framework at the federal level, states continue to pursue
legislation. In addition, state regulators are also actively making rules to implement these laws,
with final rulemaking from California and Colorado, for example, becoming effective in 2023.

1.3 Is there any sector-specific legislation that impacts data protection?

Key sector-specific laws include those covering financial services, healthcare, telecommunications, and education.

The Gramm Leach Bliley Act (GLBA) (15 U.S. Code § 6802(a) et seq.) governs the protection
of personal information in the hands of banks, insurance companies and other companies in the
financial service industry. This statute addresses “Non-Public Personal Information” (NPI),
which includes any information that a financial service company collects from its customers in
connection with the provision of its services. It imposes requirements on financial service
industry companies for securing NPI, restricting disclosure and use of NPI and notifying
customers when NPI is improperly exposed to unauthorised persons.

The Fair Credit Reporting Act (FCRA), as amended by the Fair and Accurate Credit
Transactions Act (FACTA) (15 U.S. Code § 1681), restricts use of information with a bearing
on an individual’s creditworthiness, credit standing, credit capacity, character, general
reputation, personal characteristics or mode of living to determine eligibility for credit,
employment or insurance. It also requires the truncation of credit card numbers on printed
receipts, requires the secure destruction of certain types of personal information, and regulates

81
the use of certain types of information received from affiliated companies for marketing
purposes.

In addition to financial industry laws and regulation, the major credit card companies require
businesses that process, store or transmit payment card data to comply with the Payment Card
Industry Data Security Standard (PCI-DSS).

The Health Insurance Portability and Accountability Act, as amended (HIPAA) (29 U.S.
Code § 1181 et seq.) protects information held by a covered entity that concerns health status,
provision of healthcare or payment for healthcare that can be linked to an individual. Its
Privacy Rule regulates the collection and disclosure of such information. Its Security Rule
imposes requirements for securing this data.
The Telephone Consumer Protection Act (TCPA) (47 U.S. Code § 227) and associated
regulations regulate calls and text messages to mobile phones, and regulate calls to residential
phones that are made for marketing purposes or using automated dialling systems or pre-
recorded messages. Relatedly, the Controlling the Assault of Non-Solicited Pornography and
Marketing Act (CAN-SPAM Act) (15 U.S. Code § 7701 et seq.) and associated regulations set
basic rules for sending commercial emails, including providing an opt-out right to recipients.

The Family Educational Rights and Privacy Act (FERPA) (20 U.S.C. § 1232g) provides
students with the right to inspect and revise their student records for accuracy, while also
prohibiting the disclosure of these records or other personal information on the student, without
the student’s or parent’s (in some instances) consent.

Where a federal statute covers a specific topic, the federal law may pre-empt any similar state
law on that topic. However, certain federal laws, like the GLBA for instance, specify that they
are not pre-emptive of state laws on the subject.

1.4 What authority(ies) are responsible for data protection?

While the United States has no plenary data protection regulator, the FTC’s authority is very
broad, and often sets the tone on federal privacy and data security issues. In addition, a variety
of other agencies regulate data protection through sectoral laws, including the Office of the
Comptroller of the Currency (OCC), the Department of Health and Human Services (HHS),
the Federal Communications Commission (FCC), the Securities and Exchange Commission
(SEC), the Consumer Financial Protection Bureau (CFPB) and the Department of
Commerce. At the state level, the CPRA established the first dedicated privacy regulator in
the United States, the California Privacy Protection Agency (CPPA). The CPPA’s
responsibilities will include enforcement of the CPRA with the California Attorney General,
rulemaking under the CPRA, and promoting public awareness of privacy issues. Other states
have continued to authorise their Attorneys General to conduct rulemaking or to bring
enforcement actions related to violations of their respective data privacy law.

2. Definitions
2.1 Please provide the key definitions used in the relevant legislation:

• “Personal Data”: In the United States, information relating to an individual is typically referred to as “personal information” (rather than personal data), though
notably, recent privacy legislation in Virginia, Colorado, Utah and Connecticut
use the term “personal data”. The definition of personal information in the U.S. is
not uniform across all states or all regulations. In addition, certain data may be
considered personal information for one purpose but not for another.
• “Processing”: The definition of processing in the U.S. is not uniform across all
states or all regulations. The general concept encompasses operations performed
on personal information or data, including collection, use, storage, disclosure,
analysis, etc.
• “Controller”: Unlike California, more recent states, including Virginia, Colorado, Utah, and Connecticut, have incorporated this term in their data privacy legislation. Though definitions may vary, the general concept refers to the entity that determines the purpose and means of processing personal information.
• “Processor”: Unlike California, more recent states, including Virginia, Colorado, Utah, and Connecticut, have incorporated this term in their data privacy legislation. Though definitions may vary, the general concept refers to an entity that processes personal information on behalf of a controller.
• “Data Subject”: The state data protection statutes typically cover a “consumer”
residing within the state. The definition of “consumer” differs by state. Under
many state data protection statutes, a “consumer” is an individual who engages
with a business for personal, family or household purposes. In contrast, under the
CCPA a “consumer” is defined broadly as a “natural person who is a California
resident”.
• “Sensitive Personal Data”: The definition of sensitive personal data in the U.S. is not uniform across all states or all regulations. For those jurisdictions that address sensitive personal data, it refers to personal information of heightened concern, potentially including racial or ethnic origin, genetic or biometric data, or precise geolocation data.
• “Data Breach”: The definition of a Data Breach depends on the individual state
statute, but typically involves the unauthorised access or acquisition of
computerised data that compromises the security, confidentiality, or integrity of
personal information.

3. Territorial Scope
3.1 Do the data protection laws apply to businesses established in other jurisdictions?
If so, in what circumstances would a business established in another jurisdiction be
subject to those laws?

Businesses established in other jurisdictions may be subject to both federal and state data
protection laws for activities impacting U.S. residents whose information the business collects,
holds, transmits, processes or shares.

4. Key Principles
4.1 What are the key principles that apply to the processing of personal data?

• Transparency: The FTC has issued guidelines espousing the principle of transparency, recommending that businesses: (i) provide clearer, shorter, and more
standardised privacy notices that enable consumers to better comprehend privacy
practices; (ii) provide reasonable access to the consumer data they maintain that is
proportionate to the sensitivity of the data and the nature of its use; and (iii) expand
efforts to educate consumers about commercial data privacy practices.
• Lawful basis for processing: While there is no “lawful basis for processing”
requirement under U.S. law, the FTC recommends that businesses provide notice
to consumers of their data collection, use and sharing practices and obtain consent
in limited circumstances where the use of consumer data is materially different
than claimed when the data was collected, or where sensitive data is collected for
certain purposes. Certain new state laws require obtaining consent in certain
circumstances, such as prior to processing sensitive personal data.
• Purpose limitation: The FTC recommends privacy-by-design practices that
include limiting “data collection to that which is consistent with the context of a
particular transaction or the consumer’s relationship with the business, or as
required or specifically authorized by law”.
• Data minimisation: See above.
• Proportionality: See above.
• Retention: The FTC recommends privacy-by-design practices that implement
“reasonable restrictions on the retention of data”, including disposal “once the data
has outlived the legitimate purpose for which it was collected”. Additionally, state
laws may also specify specific retention parameters, for example, Texas’s Capture
or Use of Biometric Identifier Act (CUBI) requires the destruction of biometric
identifiers within a reasonable time, but not more than a year after the purpose for
capturing the biometric identifiers has ended.

5. Individual Rights
5.1 What are the key rights that individuals have in relation to the processing of their
personal data?

• Right of access to data/copies of data: These rights are statute-specific. For example, under certain circumstances, employees are entitled to receive copies of
data held by employers. In other circumstances, parents are entitled to receive
copies of information collected online from their children under the age of
13. Under HIPAA, individuals are entitled to request copies of medical
information held by a health services provider. At the state level, the CCPA
provides a right of access to California residents for personal information held by
a business relating to that resident. Recent state privacy laws, including the
CPRA, Virginia CDPA, the Colorado Privacy Act, the Utah Consumer Privacy
Act, and the Connecticut Privacy Act, provide a similar right.
• Right to rectification of errors: These rights are statute-specific. Some laws, such
as the FCRA, provide consumers with a right to review data about the consumer
held by an entity and request corrections to errors in that data. At the state level,
the right to correct information commonly attaches to credit reports, as well as
criminal justice information, employment records, and medical records. State data
privacy legislation, including the CPRA, the Virginia CDPA, the Colorado
Privacy Act, and the Connecticut Privacy Act, provide a consumer the right to
correct inaccuracies in personal data held by a business.
• Right to deletion/right to be forgotten: These rights are statute-specific. By way
of a federal law example, COPPA provides parents the right to review and delete
their children’s information and may require that data be deleted even in the
absence of a request. Some state laws, such as the CCPA, provide a right of
deletion for residents of the respective states, with certain exceptions. Recent state
privacy laws, including the CPRA, Virginia CDPA, the Colorado Privacy Act, the
Utah Consumer Privacy Act, and the Connecticut Privacy Act, provide a similar
right to delete.
• Right to object to processing: These rights are statute-specific. Individuals are
given the right to opt out of receiving commercial (advertising) emails under
CAN-SPAM and the right to not receive certain types of calls to residential or
mobile telephone numbers without express consent under the TCPA. Some states
provide individuals with the right not to have telephone calls recorded without
either consent of all parties to the call or consent of one party to the call.
• Right to restrict processing: These rights are statute-specific. Certain laws
restrict how an entity may process consumer data. For example, the CCPA allows
California residents, and the Nevada Privacy Law allows Nevada residents to
prohibit a business from selling that individual’s personal information. Recent
state privacy laws, including the Virginia CDPA, Colorado Privacy Act, and
Connecticut Privacy Act, provide a right to restrict processing for the purposes of
sale, targeted advertising, and profiling. The Utah Consumer Privacy Act provides
a slightly narrower right to restrict processing for the purposes of sale or targeted
advertising.
• Right to data portability: These rights are statute-specific. Examples of
consumer rights to data portability exist under HIPAA, where individuals are
entitled to request that medical information held by a health services provider be
transferred to another health services provider. In addition, the CCPA currently
provides a right of data portability for their respective state residents. Recent state
privacy laws, including the CPRA, Virginia CDPA, the Colorado Privacy Act, the
Utah Consumer Privacy Act, and the Connecticut Privacy Act, provide a similar
right to data portability.
• Right to withdraw consent: These rights are statute-specific. By way of example,
under the TCPA, individuals are permitted to withdraw consent given to receive
certain types of calls or texts to residential or mobile telephone lines. For an
example under state law, the Colorado Privacy Act requires that consumers have
the right to withdraw consent, because the regulations do not consider consent that
a consumer cannot easily withdraw, to “be freely given”. Other state laws, such
as the CCPA, address the right to withdraw consent by empowering users to limit
the processing of sensitive personal data at any time.
• Right to object to marketing: These rights are statute-specific. Several laws
permit consumers to restrict marketing activities involving their personal
data. Under CAN-SPAM, for example, individuals may opt out of receiving
commercial (advertising) emails. Under the TCPA, individuals must provide
express written consent to receive marketing calls/texts to mobile telephone
lines. California’s Shine the Light Act requires companies that share personal
information for the recipient’s direct marketing purposes to either provide an opt-
out or make certain disclosures to the consumer of what information is shared, and
with whom. Recent state privacy laws, including the CPRA, Virginia CDPA, the
Colorado Privacy Act, the Utah Consumer Privacy Act, and the Connecticut
Privacy Act, provide consumers with the right to opt out of processing of their
personal information for targeted advertising.
• Right protecting against solely automated decision-making and profiling: State
privacy rules against profiling, including in California, Virginia, Colorado, and
Connecticut, became effective for the first time in 2023.
• Right to complain to the relevant data protection authority(ies): These rights are
statute-specific. By way of example, individuals may report unwanted or
deceptive commercial email (“spam”) directly to the FTC, and telemarketing
violations directly to the FCC. Similarly, anyone may file a HIPAA complaint
directly with the HHS. At the state level, California residents may report alleged
violations of the CCPA to the California Attorney General. Under the CPRA,
California residents will be able to report alleged violations to the
CPPA. Similarly, the Utah Consumer Privacy Act provides that Utah residents
may report alleged violations to the state’s Consumer Protection Division.

5.2 Please confirm whether data subjects have the right to mandate not-for-profit
organisations to seek remedies on their behalf or seek collective redress.

Data subjects generally cannot mandate not-for-profit organisations to seek remedies on their behalf in the US. A few US data privacy laws allow for individuals to
institute an action for violations of data privacy statutes or regulations, including actions that
could take the form of a class action or collective redress. However, most US data privacy
laws do not authorise such individual actions. Rather, the trend under US data privacy laws is
to restrict enforcement to regulators.

6. Children’s Personal Data


6.1 What additional obligations apply to the processing of children’s personal data?

Children’s information is protected at the federal level under COPPA (15 U.S. Code §
6501). COPPA requires operators of: (a) commercial websites and online services directed to children under the age of 13; or (b) general or mixed audience commercial websites or online services with actual knowledge that they are collecting personal information from children under the age of 13, to meet specific compliance obligations where they collect personal information from children under the age of 13. Specifically, COPPA requires that covered operators: (1)
publish certain privacy notices, including a COPPA-compliant privacy policy and “direct
notice” to parents prior to the collection of personal information from their child; (2) obtain
parental consent prior to collecting personal information from a child under the age of 13; (3)
provide parents a choice regarding disclosure of a child’s information to third parties under
certain circumstances; (4) provide parents access to their child’s personal information and
opportunities to delete that information or prevent further use or collection of a child’s
information; and (5) maintain the confidentiality, security, and integrity of the information
collected. At the time of writing, additional federal legislation that would increase protections
for children’s privacy online has been introduced and is currently pending.

At the state level, the CCPA alters its right to opt out of sale of personal information for
consumers under the age of 16. Businesses are prohibited from selling personal information
of consumers under the age of 16 without affirmative authorisation from a consumer aged 13–
15 or from the parent or legal guardian of a consumer under the age of 13. Recent privacy
laws, including in Virginia, Colorado, Utah, and Connecticut, consider the personal data of a
child below the age of 13 as sensitive personal data. In Virginia, Utah, and Connecticut,
controllers must process a child’s data in accordance with COPPA. The Colorado Privacy Act
requires consumer consent before processing sensitive personal data, but notably exempts
personal data subject to COPPA. In 2022, California enacted the Age-Appropriate Design
Code Act, which imposes requirements addressing transparency, default settings
and data protection impact assessments. Notably, the law, effective July 1, 2024, applies to
children under 18, not under 13 like COPPA or other laws involving children’s data.

7. Registration Formalities and Prior Approval


7.1 Is there a legal obligation on businesses to register with or notify the data
protection authority (or any other governmental body) in respect of its processing
activities?

Both Vermont and California require data brokers to register with the respective Attorneys
General. The Vermont requirement defines a “data broker” to include entities that knowingly
collect and sell or license to third parties the personal information of a consumer with whom
the business does not have a direct relationship (9 V.S.A. chapter 62). California’s data broker
definition similarly encompasses the knowing collection and sale of personal information
regarding consumers with which the business does not have a direct relationship (Cal. Civ.
Code § 1798.99.80(d)).

7.2 If such registration/notification is needed, must it be specific (e.g., listing all processing activities, categories of data, etc.) or can it be general (e.g., providing a broad description of the relevant processing activities)?

The states that have mandated data broker registration generally do not require a specific
description of relevant data processing activities. California makes it optional for the data
broker to provide within its registration any information concerning its data collection practices
(Cal. Civ. Code § 1798.99.82). Vermont, in contrast, is more demanding and requires
registrants to disclose information regarding consumer opt-out, whether the data broker
implements a purchaser credentialing process, and the number and extent of any data broker
security breaches it experienced during the prior year. Where data brokers knowingly possess
information about minors, Vermont law requires that they detail all related data collection
practices, databases, sales activities, and opt-out policies (9 V.S.A. § 2446).

7.3 On what basis are registrations/notifications made (e.g., per legal entity, per
processing purpose, per data category, per system or database)?

Data broker registrations are made on a “per legal entity” basis.

7.4 Who must register with/notify the data protection authority (e.g., local legal
entities, foreign legal entities subject to the relevant data protection legislation,
representative or branch offices of foreign legal entities subject to the relevant data
protection legislation)?

Within the states for which it applies, registrations are required based on the business falling
within the definition of a “data broker” pursuant to state law. Generally, a “data broker” is
defined as a business that knowingly collects and sells the personal information of a consumer
with whom the business does not have a direct relationship.

7.5 What information must be included in the registration/notification (e.g., details of the notifying entity, affected categories of individuals, affected categories of personal data, processing purposes)?

See question 7.2 above.

7.6 What are the sanctions for failure to register/notify where required?

In Vermont, the penalty is US$50 per day in addition to the registration fee of US$100. In
California, a data broker that fails to register is liable for civil penalties, fees, and costs of
US$100 for each day the data broker fails to register and an amount equal to the fees that were
due during the period it failed to register.

7.7 What is the fee per registration/notification (if applicable)?

Fees vary by state. The data broker registration fee in Vermont is US$100 and in California it
is US$400.
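To make the arithmetic in questions 7.6 and 7.7 concrete, the short sketch below computes the notional exposure of a data broker that registers late. The 90-day delay is a hypothetical figure chosen purely for illustration; actual liability is determined by the state regulators on the facts.

```python
# Illustrative arithmetic only, using the figures quoted in questions 7.6 and 7.7.

VT_DAILY_PENALTY = 50      # Vermont: US$50 per day of non-registration
VT_REGISTRATION_FEE = 100  # Vermont: US$100 registration fee still owed

CA_DAILY_PENALTY = 100     # California: US$100 per day of non-registration
CA_ANNUAL_FEE = 400        # California: fees that were due during the unregistered period


def vermont_exposure(days_unregistered: int) -> int:
    """US$50 per day plus the US$100 registration fee."""
    return days_unregistered * VT_DAILY_PENALTY + VT_REGISTRATION_FEE


def california_exposure(days_unregistered: int) -> int:
    """US$100 per day plus an amount equal to the fees due while unregistered."""
    return days_unregistered * CA_DAILY_PENALTY + CA_ANNUAL_FEE


days = 90  # hypothetical delay
print(f"Vermont exposure:    US${vermont_exposure(days):,}")     # US$4,600
print(f"California exposure: US${california_exposure(days):,}")  # US$9,400
```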

7.8 How frequently must registrations/notifications be renewed (if applicable)?

In both Vermont and California, data brokers are required to register annually.

7.9 Is any prior approval required from the data protection regulator?

Data broker registration submissions require Attorney General approval in both Vermont and
California.

7.10 Can the registration/notification be completed online?

Data broker registration for both Vermont and California may be completed online.

7.11 Is there a publicly available list of completed registrations/notifications?

Vermont and California maintain publicly available lists of registered data brokers.

7.12 How long does a typical registration/notification process take?

Neither Vermont nor California publish information concerning the typical amount of time for
the data broker registration process.

8. Appointment of a Data Protection Officer
8.1 Is the appointment of a Data Protection Officer mandatory or optional? If the
appointment of a Data Protection Officer is only mandatory in some circumstances,
please identify those circumstances.

Appointment of a Data Protection Officer is not required under U.S. law, but certain statutes
require the appointment or designation of an individual or individuals who are charged with
compliance with the privacy and data security requirements under the statute. These include
the GLBA, HIPAA, and the Massachusetts Data Security Regulation, for example.

8.2 What are the sanctions for failing to appoint a Data Protection Officer where
required?

Potential sanctions are statute/regulator-specific.

8.3 Is the Data Protection Officer protected from disciplinary measures, or other
employment consequences, in respect of his or her role as a Data Protection Officer?

This is not applicable in our jurisdiction.

8.4 Can a business appoint a single Data Protection Officer to cover multiple entities?

This is not applicable in our jurisdiction.

8.5 Please describe any specific qualifications for the Data Protection Officer
required by law.

This is not applicable in our jurisdiction.

8.6 What are the responsibilities of the Data Protection Officer as required by law or
best practice?

This is not applicable in our jurisdiction.

8.7 Must the appointment of a Data Protection Officer be registered/notified to the relevant data protection authority(ies)?

This is not applicable in our jurisdiction.

8.8 Must the Data Protection Officer be named in a public-facing privacy notice or
equivalent document?

This is not applicable in our jurisdiction.

9. Appointment of Processors
9.1 If a business appoints a processor to process personal data on its behalf, must the
business enter into any form of agreement with that processor?

Under certain state laws and federal regulatory guidance, if a business shares certain categories
of personal information with a vendor, the business is required to contractually bind the vendor
to reasonable security practices. HIPAA, for example, requires the use of Business Associate
Agreements for the transfer of protected health information to vendors. The CCPA, Virginia
CDPA, Colorado Privacy Act, the Utah Consumer Privacy Act, and the Connecticut Privacy
Act require written contracts for certain entities that process personal information.

9.2 If it is necessary to enter into an agreement, what are the formalities of that
agreement (e.g., in writing, signed, etc.) and what issues must it address (e.g., only
processing personal data in accordance with relevant instructions, keeping personal data
secure, etc.)?

The form of the contract typically is not specified. HIPAA, however, is an example of a statute
with minimum requirements for provisions that must be included within Business Associate
Agreements. These agreements must include limitations on use and disclosure, and require
vendors to abide by HIPAA’s Security Rule, to provide breach notification and report on
unauthorised use and disclosure, to return or destroy protected data, and to make its books,
records, and practices available to the federal regulator. Requirements under state data privacy
legislation vary by jurisdiction. Under the CCPA, the contract must restrict the service
provider from retaining, using, or disclosing personal information for any purpose other than
performance of the services specified in the contract. Additional mandatory contract
provisions applicable to both service providers and contractors prohibit the service provider from selling or sharing personal information and from retaining, using, or
disclosing personal information outside of the direct business relationship between the business
and the service provider. Additionally, the Virginia CDPA, Colorado Privacy Act, the Utah
Consumer Privacy Act, and the Connecticut Privacy Act each require that a contract set forth
instructions for processing, including the type of data subject to processing and the nature and
purpose of processing, and set specific requirements regarding engagement of
subcontractors. The Colorado Privacy Act further requires that controllers and processors
implement appropriate technical and organisational safeguards related to security.

10. Marketing
10.1 Please describe any legislative restrictions on the sending of electronic direct
marketing (e.g., for marketing by email or SMS, is there a requirement to obtain prior
opt-in consent of the recipient?).

Prior express written consent is required under the TCPA before certain marketing texts may
be sent to a mobile telephone line. Other federal statutes have opt-out rather than opt-in consent
requirements. For instance, under CAN-SPAM, marketing emails – or emails sent for the
primary purpose of advertising or promoting a commercial product or service – may be sent to
those not opting out, provided the sender is accurately identified, the subject line and text of
the email are not deceptive, the email contains the name and address of the sender, the email
contains a free, simple mechanism to opt out of future emails, and the sender honours opt-outs
within 10 days of receipt.

10.2 Are these restrictions only applicable to business-to-consumer marketing, or do they also apply in a business-to-business context?

The TCPA and CAN-SPAM Act apply to both business-to-consumer and business-to-business
electronic direct marketing. In contrast, business-to-business telephone communications,
except those intended to induce the retail sale of non-durable office or cleaning supplies, are
exempt from the Telemarketing Sales Rule described in question 10.3 below.

10.3 Please describe any legislative restrictions on the sending of marketing via other
means (e.g., for marketing by telephone, a national opt-out register must be checked in
advance; for marketing by post, there are no consent or opt-out requirements, etc.).

Marketing by telephone is regulated on the national level by the Telemarketing Sales Rule, a
regulation under the Telemarketing and Consumer Fraud and Abuse Prevention Act. This act
established the national Do Not Call list of telephone numbers that cannot be used for
marketing communications (calls and texts) and disclosure requirements for companies
engaging in telephone marketing. It also imposes limitations on the use of telephone
marketing, including, for instance, limiting the time of day for marketing calls, requiring the
caller to provide an opt-out of future calls, and limiting the use of pre-recorded
messages. There are no consent or opt-out requirements for sending marketing materials
through postal mail. In addition, with the growing prevalence of telemarketers using spoofed
caller IDs, the FCC is becoming more aggressive with its enforcement of the Truth in Caller
ID Act. Furthermore, several states maintain an independent Do Not Call list and regulations
with which telemarketers must comply.

It is noted that the FTC, which regulates deceptive practices, has brought enforcement actions
relating to the transmission of marketing emails or telemarketing calls by companies that have
made promises in their publicly posted privacy policies that personal information will not be
used for marketing purposes. Additionally, many states apply deceptive practices statutes to
impose penalties or injunctive relief in similar circumstances, or where violation of a federal
statute is deemed a deceptive practice under state law. Finally, recent comprehensive state data
privacy laws, including in California, Virginia, Colorado, Utah, and Connecticut, offer
consumers an opt-out of sale, disclosure, or processing of personal information in relation to
targeted advertising or profiling.

10.4 Do the restrictions noted above apply to marketing sent from other jurisdictions?

Potentially, depending on whether the entity sending the marketing is subject to the jurisdiction of a U.S. court and whether the recipient is within the U.S.

10.5 Is/are the relevant data protection authority(ies) active in enforcement of breaches of marketing restrictions?

The FTC, FCC, and the Attorneys General of the states are active in enforcement in this area.

10.6 Is it lawful to purchase marketing lists from third parties? If so, are there any
best practice recommendations on using such lists?

Yes. However, the purchaser of the list should check it against the national Do Not Call list
and the purchaser’s email opt-out lists. Some states forbid the sale of email addresses of
individuals who have opted out of receiving marketing emails, and some forbid the sale of
information obtained in connection with a consumer’s purchase transaction.

10.7 What are the maximum penalties for sending marketing communications in
breach of applicable restrictions?

The penalties under CAN-SPAM can reach as high as US$50,120 per email. The penalties
under the TCPA are US$500 per telephone call/text message violation, US$1,500 for each
wilful or knowing violation, and additional civil forfeiture fees with a penalty of up to
US$10,000 for intentional violations (based on the Telephone Robocall Abuse Criminal
Enforcement and Deterrence Act (TRACED Act), passed in 2019), plus fines that can reach
US$16,000 for each political message or call sent in violation of the Act. Once a consumer’s
telephone number has been registered on the DNC registry for 31 days, DNC laws prohibit marketing calls to that number. A business can be fined up to US$11,000 per call by the New York
Department of State, as well as by the FTC and FCC. By way of example, the FTC and the
attorneys general of several states obtained a judgment of US$280 million in 2017 for a
company’s repeated violation (involving over 66 million calls) of the TCPA, the FTC’s
Telemarketing Sales Rule, and state law. Similarly, in March 2021, the FCC issued a US$225
million fine – the largest in the history of the agency – against telemarketers based in Texas for
violations of the TCPA and the Truth in Caller ID Act in connection with approximately 1
billion robocalls.

Many states have their own deceptive practices statutes, which impose additional state penalties
where violations of federal statutes are deemed to be deceptive practices under the state statute.
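As a purely illustrative exercise, the sketch below applies the per-violation figures quoted above to a hypothetical campaign. The message volumes are invented, and actual penalties are fact-specific and are set by regulators or courts, not by simple multiplication.

```python
# Rough, illustrative arithmetic only, using the per-violation figures quoted above.

CAN_SPAM_MAX_PER_EMAIL = 50_120    # US$ per non-compliant email
TCPA_PER_VIOLATION = 500           # US$ per call/text violation
TCPA_PER_WILFUL_VIOLATION = 1_500  # US$ per wilful or knowing violation

emails_sent = 10_000  # hypothetical non-compliant marketing emails
texts_sent = 2_000    # hypothetical texts sent without prior express written consent

print(f"CAN-SPAM maximum exposure:      US${emails_sent * CAN_SPAM_MAX_PER_EMAIL:,}")
print(f"TCPA exposure (ordinary):       US${texts_sent * TCPA_PER_VIOLATION:,}")
print(f"TCPA exposure (wilful/knowing): US${texts_sent * TCPA_PER_WILFUL_VIOLATION:,}")
```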

11. Cookies
11.1 Please describe any legislative restrictions on the use of cookies (or similar
technologies).

The federal Computer Fraud and Abuse Act has been used to assert legal claims against the use
of cookies for behavioural advertising, where the cookies enable “deep packet” inspection of
the computer on which they are placed. At least two states, California and Delaware, require
disclosures to be made where cookies are used to collect information about a consumer’s online
activities across different websites or over time. The required disclosure must include how the
operator responds to so-called “do not track” signals or other similar mechanisms. In addition,
the CCPA’s broad definition of “sale” (which includes when a consumer’s personal
information is made available for collection by third-party cookies for monetary “or other
valuable” consideration) and “sharing” (which encompasses the collection of data for use in
cross contextual advertising), imposes obligations on businesses to provide certain notice and
choice mechanisms (e.g., opt out) to consumers.
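The statutes described above do not prescribe any particular technical implementation. The following is a minimal, hypothetical sketch of how a website operator might honour a browser opt-out signal (such as the legacy “do not track” header or the Global Privacy Control header) and a “Do Not Sell or Share” election before setting cookies used for cross-context advertising. The header names and decision logic are assumptions for illustration only, not requirements of the CCPA or any other law.

```python
# Minimal, hypothetical sketch: honouring a browser opt-out signal before setting
# third-party advertising cookies. Header names and logic are illustrative assumptions.

OPT_OUT_HEADERS = {
    "DNT": "1",       # legacy "do not track" signal
    "Sec-GPC": "1",   # Global Privacy Control signal
}


def may_set_tracking_cookies(request_headers: dict[str, str],
                             user_opted_out_of_sale: bool) -> bool:
    """Return False if the visitor has signalled an opt-out of tracking/'sale'."""
    if user_opted_out_of_sale:  # e.g. the visitor used a "Do Not Sell or Share" link
        return False
    for header, opt_out_value in OPT_OUT_HEADERS.items():
        if request_headers.get(header) == opt_out_value:
            return False
    return True


# Example: a visitor sending a GPC signal should not receive advertising cookies.
print(may_set_tracking_cookies({"Sec-GPC": "1"}, user_opted_out_of_sale=False))  # False
```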

In addition, the FTC Act and state deceptive practices acts have underpinned regulatory
enforcement and private class action lawsuits against companies that failed to disclose or
misrepresented their use of tracking cookies. One company settled an action in 2012 with a
payment of US$22.5 million to the FTC, and in 2016 agreed to pay US$5.5 million to settle a
private class action involving the same conduct. In 2022, one company settled with the
California Attorney General for US$1.2 million for failing to disclose to consumers that it was
selling their personal information by making it available to third party advertisers via online
cookie trackers.

11.2 Do the applicable restrictions (if any) distinguish between different types of
cookies? If so, what are the relevant factors?

The Computer Fraud and Abuse Act and the Electronic Communications Privacy Act, as well
as state surveillance laws, may come into play where cookies collect information from the
computer on which they are placed and report that information to the entity placing the cookies
without proper consent.

11.3 To date, has/have the relevant data protection authority(ies) taken any
enforcement action in relation to cookies?

Yes, the FTC has brought regulatory enforcement actions against companies that failed to
disclose or misrepresented their use of cookies. The California Attorney General has also
initiated regulatory investigations for violations of the CCPA rules on “selling” and “sharing”
personal information that came about through the use of cookies, which in one instance,
resulted in a settlement with the California Attorney General for US$1.2 million.

11.4 What are the maximum penalties for breaches of applicable cookie restrictions?

Maximum fines are not set by statute.

12. Restrictions on International Data Transfers


12.1 Please describe any restrictions on the transfer of personal data to other
jurisdictions.

The U.S. does not place restrictions on the transfer of personal data to other jurisdictions.

12.2 Please describe the mechanisms businesses typically utilise to transfer personal
data abroad in compliance with applicable transfer restrictions (e.g., consent of the data
subject, performance of a contract with the data subject, approved contractual clauses,
compliance with legal obligations, etc.).
This is left to the discretion of the company, as the U.S. does not place restrictions on the
transfer of personal data to other jurisdictions. With respect to receiving data from abroad,
prior to Schrems II, the EU–US Privacy Shield Framework provided a mechanism to comply
with data protection requirements when transferring personal data from the EU to the
U.S. However, since the invalidation of the Privacy Shield Framework in Schrems II, and with
the proposed Trans-Atlantic Data Privacy Framework awaiting an adequacy decision, the
mechanisms to govern data transfers from the EU to the U.S. are limited largely to use of
standard contractual clauses (SCCs) or binding corporate rules (BCRs).
12.3 Do transfers of personal data to other jurisdictions require
registration/notification or prior approval from the relevant data protection
authority(ies)? Please describe which types of transfers require approval or notification,
what those steps involve, and how long they typically take.

No such registration/notification is required.

12.4 What guidance (if any) has/have the data protection authority(ies) issued
following the decision of the Court of Justice of the EU in Schrems II (Case C-311/18)?
Although the FTC has not issued formal guidance following the decision in Schrems II, it has
nevertheless provided an update stating that it continues “to expect companies to comply with
their ongoing obligations with respect to transfers made under the Privacy Shield Framework”,
and encouraging those businesses to adhere to “robust privacy principles”.
Additionally, the Department of Commerce, Department of Justice (USDOJ), and the Office
of the Director of National Intelligence issued a White Paper in September 2020 that provides
guidance in light of the Schrems II decision. This White Paper provides a framework to inform
companies’ assessment of the protections afforded by U.S. law in connection with relying on
SCCs and advice to companies who have received orders authorised under FISA 702 requiring
the disclosure of data to U.S. intelligence agencies.

On March 25, 2022, the U.S. and the European Commission announced that they had reached
an agreement in principle to replace the Privacy Shield Framework with a new data transfer
framework, the Trans-Atlantic Data Privacy Framework. Under this framework, the U.S. has
committed to strengthen privacy and civil liberties safeguards governing signals intelligence
activities, establish a multi-layer redress mechanism including an independent Data Protection
Review Court available to EU citizens, and enhance oversight. On October 7, 2022, President
Joe Biden signed an Executive Order, “Enhancing Safeguards for United States Signals
Intelligence Activities”, intending to incorporate these commitments. The European
Commission has not released an adequacy decision concerning the proposed framework.

12.5 What guidance (if any) has/have the data protection authority(ies) issued in
relation to the European Commission’s revised Standard Contractual Clauses published
on 4 June 2021?

While public authorities in the U.S. have not issued formal guidance in relation to the European
Commission’s revised SCCs, the U.S. did submit comments on the draft SCCs issued in
November 2020. The comments do not provide any specific guidance for companies, but
rather reflect a concern that the draft revised SCCs may interfere with government efforts to
protect public safety and national security along with joint US–EU cooperation on these
issues. The U.S. also remains concerned with the ways that the draft revised SCCs create
different standards for data requests by the U.S. government in comparison to similar requests
from EU Member States.

13. Whistle-blower Hotlines


13.1 What is the permitted scope of corporate whistle-blower hotlines (e.g., restrictions
on the types of issues that may be reported, the persons who may submit a report, the
persons whom a report may concern, etc.)?

The federal Whistleblower Protection Act of 1989 protects federal employees, and some states
have similar statutes protecting state employees. Public companies subject to the Sarbanes-
Oxley Act also are required to have a whistle-blower policy which must be approved by the
board of directors and create a procedure for receiving complaints from whistle-blowers.

13.2 Is anonymous reporting prohibited, strongly discouraged, or generally permitted? If it is prohibited or discouraged, how do businesses typically address this issue?

Anonymous reporting generally is permitted. Rule 10A-3 of the Securities Exchange Act of
1934, for example, requires that audit committees of publicly listed companies establish
procedures for the confidential, anonymous submission by employees of concerns regarding
questionable accounting or auditing matters.

14. CCTV
14.1 Does the use of CCTV require separate registration/notification or prior approval
from the relevant data protection authority(ies), and/or any specific form of public notice
(e.g., a high-visibility sign)?

The use of CCTV must comply with federal and state criminal voyeurism/eavesdropping
statutes, some of which require signs to be posted where video monitoring is taking place,
restrict the use of hidden cameras, or prohibit videotaping altogether if the location is inherently
private (including places where individuals typically get undressed, such as bathrooms, hotel
rooms and changing rooms). Litigation has been instituted alleging that CCTV may also
violate biometrics laws such as BIPA. For example, in 2022 a doorbell camera provider faced
allegations that its cameras’ recording of passers-by without consent violates BIPA.

14.2 Are there limits on the purposes for which CCTV data may be used?

There generally are no restrictions on the use of lawfully collected CCTV data, subject to a
company’s own stated policies or labour agreements.

15. Employee Monitoring


15.1 What types of employee monitoring are permitted (if any), and in what
circumstances?

Employee privacy rights, like those of any individual, are based on the principle that an
individual has an expectation of privacy unless that expectation has been diminished or
eliminated by context, agreement, notice, or statute. Monitoring of employees generally is
permitted to the same extent as it is with the public, including when the employer makes clear
disclosure regarding the type and scope of monitoring in which it engages, and subject to
generally applicable surveillance laws regarding inherently private locations as well as
employee-specific laws such as those regarding the privacy of union member activities.

15.2 Is consent or notice required? Describe how employers typically obtain consent
or provide notice.

Consent and notice rights are state-specific, as is the use of hidden cameras. When required or
voluntarily obtained, employers typically obtain consent for employee monitoring through
acceptance of employee handbooks, and may provide notice by appropriately posting
signs. Furthermore, the CCPA provision exempting employee personal information expired in
2023, with employee personal information now treated like consumer personal information
under the CCPA.

15.3 To what extent do works councils/trade unions/employee representatives need to be notified or consulted?

The National Labor Relations Act prohibits employers from monitoring their employees while
they are engaged in protected union activities.

15.4 Are employers entitled to process information on an employee’s COVID-19 vaccination status?

Yes. There are no laws prohibiting employers from requesting information or documentation
on an employee’s COVID-19 vaccination status. Under the Americans with Disabilities Act,
employers are required to keep medical information, such as vaccination status, confidential
and stored separately from an employee’s personnel file.

16. Data Security and Data Breach


16.1 Is there a general obligation to ensure the security of personal data? If so, which
entities are responsible for ensuring that data are kept secure (e.g., controllers,
processors, etc.)?

In the consumer context, the FTC has stated that a company’s data security measures for
protecting personal data must be “reasonable”, taking into account numerous factors, including
the volume and sensitivity of information the company holds, the size and complexity of the
company’s operations, and the cost of the tools that are available to address
vulnerabilities. Certain federal statutes and certain individual state statutes also impose an
obligation to ensure security of personal information. For example, the GLBA and HIPAA
impose security requirements on financial services and covered healthcare entities (and their
vendors). In 2021, the FTC announced its revisions to its Safeguards Rule under GLBA with
major updates taking effect in December 2022. The updated rule requires highly prescriptive
safeguards including a written incident response plan, penetration testing and vulnerability
assessments, encryption of customer information, and multi-factor authentication, among other
safeguards. Some states impose data security obligations on certain entities that collect, hold
or transmit limited types of personal information. For example, the New York Department of
Financial Services (NYDFS) adopted regulations in 2017 that obligate all “regulated entities”
to adopt a cybersecurity programme and cybersecurity governance processes. The regulations
also mandate reporting of cybersecurity events, like data breaches and attempted infiltrations,
to regulators. Covered entities include those banks, mortgage companies, insurance
companies, and cheque-cashers otherwise regulated by the NYDFS. Enforcement of the
NYDFS regulation began in early 2021. In 2022, NYDFS announced a consent order against
a company that was subject to four cybersecurity incidents. The company agreed to pay a
US$5 million monetary penalty, to surrender its insurance provider licences, and to stop selling
insurance to New York residents.

16.2 Is there a legal requirement to report data breaches to the relevant data
protection authority(ies)? If so, describe what details must be reported, to whom, and
within what timeframe. If no legal requirement exists, describe under what circumstances
the relevant data protection authority(ies) expect(s) voluntary breach reporting.

At the federal level, other than breach notification requirements pertaining to federal agencies
themselves, HIPAA requires “Covered Entities” to report impermissible uses or disclosures
that compromise the security or privacy of protected health information to the HHS. Under the Breach Notification Rule, if the breach involves more than 500 individuals, such notification must be made
within 60 days of discovery of the breach. Information to be submitted includes information
about the entity suffering the breach, the nature of the breach, the timing (start and end) of the
breach, the timing of discovery of the breach, the type of information exposed, safeguards in
place prior to the breach, and actions taken following the breach, including notifications sent
to impacted individuals and remedial actions. In 2022, the U.S. enacted the Cyber Incident
Reporting for Critical Infrastructure Act (CIRCIA). This law requires companies considered
part of the U.S. critical infrastructure to report substantial cybersecurity incidents to the
Cybersecurity and Infrastructure Security Agency (CISA) of the U.S. Department of Homeland
Security within 72 hours and to report ransomware payments within 24 hours.

While not specifically a data breach notification obligation, the Securities Exchange Act
and associated regulations, including Regulation S-K, require public companies to disclose in
filings with the SEC when material events, including cyber incidents, occur. To the extent
cyber incidents pose a risk to a registrant’s ability to record, process, summarise and report
information that is required to be disclosed in SEC filings, management should also consider
whether there are any deficiencies in its disclosure controls and procedures that would render
them ineffective. In early 2022, the SEC proposed rules to solidify the guidance the SEC has
issued over the years regarding cybersecurity disclosures. Those rules were set to receive final
approval in April 2023, but to date the rules have not been finalised.

Some state statutes require the reporting of data breaches to a state agency or Attorney General
under certain conditions. The information to be submitted varies by state but generally includes
a description of the incident, the number of individuals impacted, the types of information
exposed, the timing of the incident and the discovery, actions taken to prevent future
occurrences, copies of notices sent to impacted individuals, and any services offered to
impacted individuals, such as credit monitoring.
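The regulator-notification deadlines described above can be summarised, in highly simplified form, as a small lookup structure. The sketch below is illustrative only and omits the many statute- and state-specific conditions discussed in this section.

```python
# Simplified summary of the regulator-notification deadlines described above.
# Illustrative only: actual obligations depend on the statute and the facts.

REGULATOR_NOTIFICATION_DEADLINES = {
    "HIPAA (breach of 500+ individuals)": {
        "notify": "HHS",
        "deadline": "within 60 days of discovery",
    },
    "CIRCIA (covered critical infrastructure)": {
        "notify": "CISA",
        "deadline": "substantial cyber incidents within 72 hours; ransomware payments within 24 hours",
    },
    "State breach statutes (varies by state)": {
        "notify": "State Attorney General or designated agency",
        "deadline": "varies; submissions generally describe the incident, individuals affected, "
                    "data types exposed, and remediation steps",
    },
}

for regime, detail in REGULATOR_NOTIFICATION_DEADLINES.items():
    print(f"{regime}: notify {detail['notify']} ({detail['deadline']})")
```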

16.3 Is there a legal requirement to report data breaches to affected data subjects? If
so, describe what details must be reported, to whom, and within what timeframe. If no
legal requirement exists, describe under what circumstances the relevant data protection
authority(ies) expect(s) voluntary breach reporting.

At the federal level, HIPAA requires covered entities to report data breaches to impacted
individuals without unreasonable delay, and in no case later than 60 days. Notice should
include a description of the breach, to include: the types of information that were involved; the
steps individuals should take to protect themselves, including who they can contact at the
covered entity for more information; as well as what the covered entity is doing to investigate
the breach, mitigate the harm, and prevent further breaches. For breaches affecting more than
500 residents of a state or jurisdiction, covered entities must provide local media notice, in
addition to individual notices.

As of May 2018, all 50 states, the District of Columbia, Guam, Puerto Rico and the U.S. Virgin
Islands have statutes that require data breaches to be reported, as defined in each statute, to
impacted individuals. These statutes are triggered by the exposure of personal information of
a resident of the jurisdiction, so if a breach occurs involving residents of multiple states, then
multiple state laws must be followed. Most statutes define a “breach of the security of the
system” as involving unencrypted computerised personal information, but some states include
personal information in any format. Triggering personal information varies by statute, with
most including an individual’s first name or first initial and last name, together with a data
point, including the individual’s Social Security Number, driver’s licence or state identification
card number, financial account number or payment card information. Some states include
additional triggering data points, such as date of birth, mother’s maiden name, passport number,
biometric data, employee identification number or username and password. The standard for
when notification is required varies from unauthorised access to personal information, to
unauthorised acquisition of personal information, to misuse of or risk of harm to personal
information. Most states require notification as soon as is practical, and often within 30 to 60
days of discovery of the incident, depending on the statute. The information to be submitted
varies by state but generally includes a description of the incident, the types of information
exposed, the timing of the incident and its discovery, actions taken to prevent future
occurrences, information about steps individuals should take to protect themselves, information
resources, and any services offered to impacted individuals such as credit monitoring.

16.4 What are the maximum penalties for data security breaches?

Penalties are statute- and fact-specific. Under HIPAA, for example, monetary fines can range
from US$100 to US$50,000 per violation (or per record), with a maximum penalty of US$1.75
million per year for violations of an identical provision. By way of example, in 2020, the HHS and the Attorneys
General of 42 states entered into a US$39.5 million settlement with a health insurer in relation
to a data breach affecting the health records of over 79 million individuals. Marking the current
high point for enforcement, in 2019, a company agreed to pay a record penalty of at least
US$575 million, and potentially up to US$700 million in a data breach settlement reached with
the FTC, the CFPB, 48 states, the District of Columbia, and the Commonwealth of Puerto Rico.
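Using only the HIPAA figures quoted above, the following sketch shows how a per-violation amount interacts with the annual cap. The record count and per-record amount are hypothetical.

```python
# Illustrative only: HIPAA civil penalty arithmetic using the figures quoted above
# (US$100-US$50,000 per violation/record, capped at US$1.75 million per year).

PER_VIOLATION_MIN = 100
PER_VIOLATION_MAX = 50_000
ANNUAL_CAP = 1_750_000


def hipaa_civil_penalty(records: int, per_record: int) -> int:
    """Clamp the per-record amount to the quoted range, then apply the annual cap."""
    per_record = max(PER_VIOLATION_MIN, min(per_record, PER_VIOLATION_MAX))
    return min(records * per_record, ANNUAL_CAP)


# Example: 2,000 exposed records at US$1,000 each would notionally be US$2,000,000,
# but the annual cap limits the penalty to US$1,750,000.
print(hipaa_civil_penalty(2_000, 1_000))  # 1750000
```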

17. Enforcement and Sanctions
17.1 Describe the enforcement powers of the data protection authority(ies).

The U.S. does not have a central data protection authority. As such, the enforcement powers of
the regulators will depend on the specific statute in question. Some laws only permit federal
government enforcement, some allow for federal or state government enforcement, and some
allow for enforcement through a private right of action by aggrieved consumers. Whether the
sanctions are civil and/or criminal depends on the relevant statute. For example, HIPAA
enforcement permits the imposition of civil and criminal penalties. While HIPAA’s civil
remedies are enforced at the federal level by HHS, and at the state level by Attorneys General,
USDOJ is responsible for criminal prosecutions under HIPAA. At the state level the CPPA
has the power to enforce consumer rights and business obligations under the CPRA.

a. Investigative Powers: Depending on the applicable data protection laws, regulators in the U.S. may have the authority to conduct investigations into potential violations of data protection requirements.
b. Corrective Powers: Depending on the applicable data protection laws, regulators
in the U.S. may have the authority to correct non-compliance actions of businesses
through injunctive relief or under consent orders.
c. Authorisation and Advisory Powers: Depending on the applicable data protection
laws, regulators in the U.S. will often provide a method for businesses to consult
with the regulators for additional and specific guidance.
d. Imposition of administrative fines for infringements of specified GDPR
provisions: This is not relevant for our jurisdiction.
e. Non-compliance with a data protection authority: Depending on the applicable
data protection laws, non-compliance with a data protection authority will
generally attract renewed or additional enforcement against the business.

17.2 Does the data protection authority have the power to issue a ban on a particular
processing activity? If so, does such a ban require a court order?

The U.S. does not have a central data protection authority. Enforcement authority, including
whether a regulator may ban a particular processing activity, is specified in the relevant
statutes. For example, 18 states have adopted the Insurance Data Security Model Law
developed by the National Association of Insurance Commissioners. Among other things,
these laws empower state insurance commissioners to issue cease-and-desist orders pertaining
to data processing violations in the insurance industry, and even to suspend or revoke an
insurance institution’s or agent’s licence to operate. The FTC may also prohibit a particular
company from engaging in a particular processing activity through a negotiated consent decree
as part of a settlement.

17.3 Describe the data protection authority’s approach to exercising those powers,
with examples of recent cases.

In the U.S., this depends on the relevant statutory enforcement mechanism and the agency
conducting the enforcement measures. The FTC, for example, in addition to publishing on its
website all of the documents filed in FTC cases and proceedings, publishes an annual summary
of key data privacy and data security enforcement actions and settlements, which provides
guidance to businesses on its enforcement priorities. Similarly, HHS publishes enforcement
highlights, summarises the top compliance issues alleged across all complaints and, by law,
maintains a website that lists mandatorily reported breaches of unsecured protected health
information affecting 500 or more individuals. By way of an example, in 2022, the FTC
entered into a consent decree that required an online marketplace to destroy improperly
obtained or unnecessary data, limit future data collection, and implement an information
security programme. Such requirements are commonplace in FTC consent decrees.

17.4 Does the data protection authority ever exercise its powers against businesses
established in other jurisdictions? If so, how is this enforced?

Extraterritorial enforcement of a U.S. law would depend on a number of factors, including whether the entity is subject to the jurisdiction of the U.S. courts, the impact on U.S. commerce and the impact on U.S. residents, among other factors.

18. E-discovery / Disclosure to Foreign Law Enforcement Agencies


18.1 How do businesses typically respond to foreign e-discovery requests, or requests
for disclosure from foreign law enforcement agencies?

When made pursuant to Mutual Legal Assistance Treaties, information requests are typically
processed through the USDOJ, which works with the local U.S. Attorney’s Office and local
law enforcement, prior to review by a federal judge and service on the U.S. company. In
addition, under the Clarifying Lawful Overseas Use of Data (CLOUD) Act, businesses may
also receive requests for electronic communications, including personal data within its
possession, custody, or control directly from foreign governments and agencies that maintain
agreements with the U.S., without regard to where the business stores such data.

18.2 What guidance has/have the data protection authority(ies) issued?

This is not applicable in the USA.

19. Trends and Developments


19.1 What enforcement trends have emerged during the previous 12 months? Describe
any relevant case law or recent enforcement actions.

In 2022, the FTC announced a more aggressive approach to enforcement with regard to data
privacy and cybersecurity. The FTC’s recent activity is consistent with that approach.

For example, in one landmark case in 2023, the FTC announced a joint proposed settlement of
its enforcement action against a telehealth and prescription drug retailer for allegedly sharing
sensitive personal information with advertising companies and platforms without notice to or
authorisation from its customers. Under this order, the company is permanently prohibited
from sharing users’ health information with advertisers. The order also marked the first time
the FTC has brought an enforcement action under the Health Breach Notification Rule
(HBNR).

In addition, in August 2022, the FTC sued a company providing app analytics for its alleged
sale of geolocation data from hundreds of millions of mobile devices. The FTC alleged that
the company allowed anyone to obtain a significant sample of sensitive personal data with no
limitations on how that data was used. The litigation is pending.

In December 2022, the FTC announced its largest fine ever, in a settlement with a video game
company over alleged violations of the COPPA Rule, including that it failed to obtain parental consent prior
to knowingly collecting personal information from children under the age of 13.

In addition, in 2022 the SEC also increased its cybersecurity enforcement activity, including
doubling the size of its Enforcement Division’s Cyber and Crypto Assets Unit in May
2022. For example, in September 2022, an investment bank agreed to pay a US$35 million
penalty to the SEC for the firm’s improper data disposal that risked the personal identifying
information (PII) of approximately 15 million customers.

Notable litigation in 2022 includes an October 2022 decision finding a ride-sharing company’s Chief Security Officer guilty of obstruction of justice and misprision of a felony relating to an attempted cover-up of a 2016 data breach. The result of this case should signal to executive management the importance of a company’s prompt and informed response to a data breach, including the involvement of executive management.

State Attorneys General also played a key role in the U.S. data privacy enforcement environment
under specific U.S. state laws in 2022. For example, in October 2022, an online clothing
retailer agreed to pay US$1.9 million in penalties to New York after two of its e-commerce
brands suffered a data breach in 2018 in which hackers stole personal data from 46 million
accounts. The company also agreed to maintain a comprehensive, written “Information
Security Program”, improve customer password management consistent with the National Institute of Standards and Technology’s standards, monitor network activity, run quarterly
vulnerability scans, and maintain an incident response plan. In addition, for five years the
company will also provide the New York Attorney General with copies of assessments of the
company’s information security and “Payment Card Industry” compliance.

As addressed above, the California Attorney General also announced a landmark settlement of
US$1.2 million relating to an online retailer’s alleged failure to disclose its practice of selling
consumer personal information. Notably, the California Attorney General determined that the
company’s practice of permitting third parties to use tracking tools (i.e., cookies) to track users across websites amounted to a “sale” of personal information warranting additional consumer
rights and controls.

Finally, class action litigation under BIPA persisted in 2022. Notably, the first BIPA class
action to go to trial resulted in a judgment of US$228 million. A class of over 45,000 truck
drivers claimed that a railroad company violated BIPA (1) by not informing class members that
it was collecting or storing their biometric identifiers or information prior to collection, (2) by
not informing class members of the specific purpose and length of term for which it was
collecting their biometric identifiers or information, and (3) by not obtaining class members’
informed written consent prior to collection. The jury ultimately found that the company
recklessly or intentionally violated BIPA 45,600 times, with each violation coming with
statutory damages of US$5,000. Other large BIPA settlements in 2022 included amounts of
US$35 million with a photo-sharing platform, US$92 million with a social media platform, and
US$100 million with the provider of an online photo service.
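The headline judgment figure follows directly from the statutory damages arithmetic described above, as the following one-line check illustrates.

```python
# Arithmetic check of the BIPA verdict described above:
# 45,600 reckless or intentional violations x US$5,000 statutory damages per violation.
violations = 45_600
statutory_damages = 5_000
print(violations * statutory_damages)  # 228000000 -> the US$228 million judgment
```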

19.2 What “hot topics” are currently a focus for the data protection regulator?

We anticipate that the following topics will remain hot over the next year: state-level consumer
data privacy law initiatives as more states move laws through their legislatures, possibly
driving action at the federal level, including possible rulemaking proceedings by the FTC;
issues surrounding the collection and protection of biometric information (especially in relation
to student privacy); issues surrounding the privacy and security of healthcare data; consumers’
access to financial relief and other remedies when their data protection rights are violated, even
in the absence of a showing of harm; issues surrounding AdTech and targeted behavioural
advertising; issues relating to automated decision making fuelled by artificial intelligence and
machine learning; an increased focus by legislators and regulators alike on cybersecurity issues,
particularly in the wake of data breaches and ransomware attacks involving significant
technology vendor software and industrial operations; and targeting of cryptocurrency and
digital assets such as non-fungible tokens by cybercriminals.

DATA PROTECTION LAWS IN INDIA
Since the Supreme Court of India declared the "right to privacy" as a fundamental right in a
landmark 2017 judgment and urged the national government to establish a data protection
regime, policymakers have worked toward passing central legislation to protect privacy. And
on 11 Aug. 2023, India finally achieved this goal with the enactment of the Digital Personal Data
Protection Act.

The DPDPA replaces a set of rules made under section 43A of the Information Technology
Act, 2000, which superficially resembled a data protection law but had a nonfunctioning enforcement system and no reported cases to date.

In crafting the DPDPA, the Indian government reviewed established privacy frameworks in
other countries including the EU General Data Protection Regulation, whose influence is
evident through some of the legal concepts in the act. That said, while individual data privacy
and consumer rights lie at the heart of the GDPR, and similar data protection laws elsewhere,
the DPDPA appears to have also been driven by India's concerns around national security and
other political issues. This may explain the unique and distinct features of the act that depart
from the GDPR and similar data privacy regimes.

Scope

The DPDPA covers any entity that processes digital personal data within Indian territory. Data
in nondigitized forms are excluded. The act also imposes extraterritorial jurisdiction and covers
data processed outside of India, if done with the intent to offer goods and services to individuals
within India.

However, the act differs from the GDPR by excluding from its purview profiling carried out from outside the territory of India, where it is not connected with offering goods or services to the data subject. For instance, profiling of individuals located in India, conducted from outside India for statistical purposes, may not trigger any obligations of data processing entities under the act.

Key definitions

"Data fiduciary" is defined as any person who, alone or in conjunction with other persons,
determines the purpose and means of processing of personal data. This concept is directly
borrowed from the GDPR.

"Data principal" is an individual to whom personal data relates. Where such an individual is a
child, the term includes their parent or lawful guardian. Where the individual is disabled, it
includes their lawful guardian acting on their behalf.

"Data processor" is defined as any person who processes personal data on behalf of a data
fiduciary. Notably, unlike the GDPR, the act does not impose such obligations directly on the
data processor. The act instead expects data fiduciaries to ensure compliance by data processors
they engage through data processing agreements.

Special category of data and the significant data fiduciary

In a clear departure from the GDPR and the previous rules, which both categorize data based
on sensitivity, the DPDPA applies uniformly to all types of personal data — defined as "any
data about an individual who is identifiable by or in relation to such data."

In what might come as good news to covered entities, the DPDPA does not impose additional
obligations on data processing entities that process sensitive personal data (as identified under
the rules) or critical personal data, as was proposed in an earlier draft of the law. Neither does
it refer to any special category of data expressly mentioned in the GDPR, such as racial or
ethnic origin, political opinions, or sexual orientation, which require heightened protection
under the European regulation.

However, companies do need to consider whether they are a "significant data fiduciary," as
these data processing entities have a higher compliance burden. Significant data fiduciaries are
classified as such based on volume and sensitivity of the personal data and other prescribed
criteria. This means companies routinely dealing with sensitive or large volumes of personal
data are likely to be classified as such and so should particularly focus on reviewing their data privacy practices to ensure compliance with the act.

Who and what is exempted?

Besides excluding from its application the processing by an individual for personal or domestic
purposes, the DPDPA also specifically excludes most publicly available personal data, as long
as it was made public: by the data principal (for example, views made public by a social media
user); or by someone else under a legal obligation to publish the data (such as personal data of
directors that regulated companies must publicly disclose by law). The first form of publicly
available information appears to permit external companies to scrape the data from social
networks and process it.

The act also exempts the processing of personal data necessary for research or statistical
purposes, which is an extremely broad exception. But the act will still apply to such processing
if research or statistical activity is used to make "any decision specific to the data principal."

Moreover, the act provides broad exceptions for government entities, while also exempting
processing for specific purposes, such as activities that are in the interest of the sovereignty
and integrity of India, security of the state, friendly relations with foreign states, maintenance
of public order, and the prevention of incitement to commit crimes. However, these latter exemptions are available only pursuant to a notification issued by the government.

Finally, in a provision appearing to promote new businesses, the DPDPA's Section 17(3)
empowers the government to exempt any category of data fiduciaries from certain or all
compliance obligations under the act, while categorically referring to "startups" as one such
class or business which may be exempted.

Grounds for processing

The DPDPA hinges on consent as grounds for processing personal data, although additional
narrowly defined or situation-based lawful grounds are also available. These are defined as
"certain legitimate uses" listed under Section 7, and among the most likely to be relevant to the
private sector are: specified purposes for which the data principal has voluntarily provided their personal data and has not indicated any objection to the use of such personal data for that purpose; fulfilment of any legal/judicial obligations of a specified nature; medical emergencies and health services; breakdown of public order; and employment.

Notably, the act does not include "contractual necessity" and "legitimate interests," which
appear in the GDPR and developed data protection laws elsewhere as legal grounds for data
processing — and are probably the most common grounds for processing utilized by
organizations today, particularly global companies that treat the GDPR as the gold standard to
process personal data. The lack of these as express grounds for processing may pose a serious
challenge to businesses, especially large organizations already relying on these grounds to
process personal data for routine or necessary business operations.

Consent and notice

Like the GDPR, the DPDPA requires that consent for the processing of personal data be "free, specific, informed, unambiguous and unconditional with a clear affirmative action." Further, the consent should be limited to such personal data as is necessary for the specified purpose in the request for consent. In practice, this may mean that data fiduciaries cannot rely on "bundled consent."

The notice for consent must inform the data principal of the personal data to be processed and
the purpose for which such data is to be processed; the manner in which the individual may
exercise their rights under the act; and the manner in which the data principal may make a
complaint to the data protection board of India. Importantly from an operational perspective,
where a data principal has given consent to processing prior to the act, the data fiduciary needs
to provide notice with the said details "as soon as it is reasonably practicable."

In what is perhaps one of the most important rights from the perspective of data subjects,
similar to the GDPR, data principals have a right to withdraw their consent at any time and data
fiduciaries are required to ensure that withdrawing consent is as easy as giving consent. Once
consent is withdrawn, personal data must be deleted unless a legal obligation to retain data
applies. Additionally, data fiduciaries must ask any processors to stop processing the data for
which consent has been withdrawn.
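
To make this concrete, the sketch below models, in Python, one hypothetical way a data fiduciary's consent registry might handle a withdrawal: record the withdrawal, ask downstream processors to stop processing, and delete the data unless a legal retention obligation applies. The class, field and callable names are illustrative assumptions, not terms prescribed by the act.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    principal_id: str
    purpose: str
    granted_at: datetime
    withdrawn_at: datetime | None = None
    legal_retention_obligation: bool = False   # e.g., tax or record-keeping laws
    processors: list[str] = field(default_factory=list)

def withdraw_consent(record: ConsentRecord, notify_processor, erase_data) -> None:
    """Handle a DPDPA-style consent withdrawal for one principal/purpose pair.

    notify_processor(processor_id, principal_id, purpose) asks an engaged processor
    to stop processing; erase_data(principal_id, purpose) deletes the data from the
    fiduciary's own systems. Both callables are assumptions about internal tooling.
    """
    record.withdrawn_at = datetime.now(timezone.utc)

    # Ask every engaged processor to cease processing the withdrawn data.
    for processor_id in record.processors:
        notify_processor(processor_id, record.principal_id, record.purpose)

    # Erase the personal data unless a legal obligation requires retention.
    if not record.legal_retention_obligation:
        erase_data(record.principal_id, record.purpose)
```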

The right to access information about personal data

The act permits a data principal to seek the following information from a data fiduciary:

A summary of personal data being processed, and the processing activities being undertaken
by the data fiduciary.

The identities of all third parties with whom the data fiduciary has shared such personal data,
with a description of the personal data that has been shared.

India's government may prescribe additional information that data fiduciaries will be obliged
to share with data principals upon an exercise of their access rights.

The act does not prescribe the granularity of the information data fiduciaries must make
available to data principals, nor does it prescribe the modalities of making such information
available. At this stage, it is unclear whether data fiduciaries will be required to offer data
principals copies of the information or whether a central portal that enables data principals to
view information about the processing of their personal data would suffice. Broadly, data fiduciaries will require data inventories that map how a data principal's personal data is stored within their organizations, the details of each third party the data is shared with, and the purposes for processing.
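
As a rough illustration of what such an inventory could capture, the hypothetical structure below records, per data principal and data category, where the data is stored, the purpose of processing and every third party it has been shared with, and assembles the information an access request would draw on. Field names are assumptions for illustration, not statutory requirements.

```python
from dataclasses import dataclass, field

@dataclass
class SharingRecord:
    recipient: str          # third-party data fiduciary or processor
    data_description: str   # description of the personal data shared
    purpose: str

@dataclass
class InventoryEntry:
    principal_id: str
    data_category: str      # e.g., "contact details", "payment information"
    storage_system: str     # internal system or vendor holding the data
    processing_purpose: str
    shared_with: list[SharingRecord] = field(default_factory=list)

def access_request_summary(entries: list[InventoryEntry], principal_id: str) -> dict:
    """Assemble a summary of processing plus the identities of recipients of the
    data, mirroring the information an access request under the act would seek."""
    relevant = [e for e in entries if e.principal_id == principal_id]
    return {
        "processing_summary": [
            {"category": e.data_category, "purpose": e.processing_purpose}
            for e in relevant
        ],
        "recipients": [
            {"recipient": s.recipient, "data": s.data_description}
            for e in relevant for s in e.shared_with
        ],
    }
```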

Unfortunately, the right to access has been structured narrowly under the act. Effectively, a data principal can only exercise this right if the data fiduciary relies on their consent to process personal data.

While consent is the primary grounds to process personal data under the act, and most
businesses do rely on consent for day-to-day data processing in business operations, the act
offers additional grounds for the processing of personal data. For instance, data fiduciaries may
process personal data for employment purposes or to comply with judgments, decrees and court
orders. In such cases, where consent is not the grounds for processing personal data, data
principals will have no access rights, thereby limiting the usefulness of this right.

The act also exempts data fiduciaries from sharing information about personal data that may be transferred
to other data fiduciaries, including the government and state agencies, for purposes relating to
the prevention, detection, or investigation of offences or cybersecurity incidents. From an EU-
India and U.K.-India data-transfer perspective, data principals will not be notified of or have
the legal right to confirm that their personal data is subject to interception or is being transferred
to government bodies. This limits their ability to challenge such interception or access, raises
concerns about an effective grievance redressal process in India, and challenges EU and U.K. standards of data protection. Effectively, data transfers to India post-'Schrems II' remain
challenging and will require additional safeguards.

The right to correction and erasure of personal data

Data principals have the right to correct inaccurate or misleading personal data that data
fiduciaries may process about them, complete any incomplete personal data and update
personal data. Correspondingly, where data fiduciaries choose to use personal data to make
decisions about data principals or otherwise share personal data with third party data
fiduciaries, they have an obligation to ensure the personal data they process is complete,
accurate and consistent.

Data principals also have the right to seek the erasure of their personal data. In such instances,
as well as in cases where data principals withdraw consent initially provided for the processing
of their personal data, data fiduciaries will be obliged to erase such personal data, unless
retention is necessary for the specified purpose for which it was processed or for compliance
with applicable laws. At this stage, the law is silent on whether personal data may be retained
after an exercise of the right to erasure for the establishment, exercise or defense of legal claims.

Practically, data fiduciaries have a three-fold responsibility. First, data fiduciaries must employ
systems that enable data accuracy principles, like offering data principals verification
mechanisms to recheck and confirm data sets where data is sourced directly from individuals.
Second, they must use technical tools that enable effective correction, completion, updating or
erasure of personal data. These tools should permit data fiduciaries to ensure that parties with which such personal data has been shared comply with such requests as well.
evolve complex data-retention strategies that can demonstrate adequate justifications for data
retention.
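
A minimal sketch of the third element, a retention check that records a documented justification for keeping each data set and flags records due for erasure, is shown below. The retention categories, periods and justifications are invented purely for illustration.

```python
from datetime import date, timedelta

# Hypothetical retention schedule: purpose -> (retention period, documented justification).
RETENTION_SCHEDULE = {
    "invoicing": (timedelta(days=8 * 365), "tax record-keeping obligation"),
    "marketing": (timedelta(days=0), "no justification once consent is withdrawn"),
}

def records_due_for_erasure(records: list[dict], today: date) -> list[dict]:
    """Return records whose documented retention justification has lapsed.

    Each record is assumed to carry 'purpose' and 'collected_on' (a date).
    """
    due = []
    for record in records:
        period, justification = RETENTION_SCHEDULE.get(
            record["purpose"], (timedelta(days=0), "no documented justification")
        )
        if record["collected_on"] + period <= today:
            due.append({**record, "justification_lapsed": justification})
    return due
```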

As with the right to access information about personal data, the rights to correction and erasure
of personal data only apply if a data fiduciary relies on consent as a basis of processing personal
data. India's government is expected to expand the scope of rights available to data principals
where data fiduciaries rely on grounds other than consent to process personal data. Separately,
the government will prescribe the modalities of exercising such rights.

The right of grievance redressal

Data principals have the right of grievance redressal in relation to a business's processing of their personal data. From an enforcement perspective, aggrieved data principals will be required to exhaust all grievance redressal processes before approaching the Data Protection Board of India, established under the act, to file complaints.

Data fiduciaries, therefore, have an opportunity to create effective and tiered redressal
mechanisms. As a part of this process, such entities will be required to appoint grievance-
redressal officers to front-end relationships with aggrieved individuals and adopt internal
standard operating procedures for resolution, escalation and workflows.
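
One hypothetical way to encode such a tiered workflow, with the internal grievance officer as the first tier and escalation to the Data Protection Board only after internal remedies are exhausted, is sketched below. The tiers and transitions are assumptions rather than prescribed rules.

```python
from enum import Enum

class GrievanceStage(Enum):
    RECEIVED = 1
    WITH_GRIEVANCE_OFFICER = 2
    INTERNAL_ESCALATION = 3
    RESOLVED = 4
    REFERRED_TO_BOARD = 5   # only after internal remedies are exhausted

def next_stage(current: GrievanceStage, resolved: bool,
               internal_remedies_exhausted: bool) -> GrievanceStage:
    """Advance a grievance through a tiered redressal workflow."""
    if resolved:
        return GrievanceStage.RESOLVED
    if current is GrievanceStage.RECEIVED:
        return GrievanceStage.WITH_GRIEVANCE_OFFICER
    if current is GrievanceStage.WITH_GRIEVANCE_OFFICER:
        return GrievanceStage.INTERNAL_ESCALATION
    if current is GrievanceStage.INTERNAL_ESCALATION and internal_remedies_exhausted:
        return GrievanceStage.REFERRED_TO_BOARD
    return current
```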

The right to nominate

Data principals have the right to nominate other individuals to act on their behalf in the event
of their death or incapacity. An incapacity can include any unsoundness of mind or body. The
act does not permit an individual to exercise rights on behalf of another individual in any other
case besides death or incapacity.

As with most aspects of the law, the modalities of how this right will be exercised and
implemented, including questions on whether powers of attorney will suffice, paperwork and
verification processes for nominees, will be prescribed by the government.

Right to withdraw consent

Where consent is the grounds for processing personal data, the right to withdraw consent must
be provided to data principals. The way consent-withdrawal processes are structured should be
as simple as the way consent requests are made available to data principals.

The manner of withdrawal of consent must also be communicated to data principals in the
privacy notice accompanying consent requests. Upon withdrawing consent, all data processing
occurring on the basis of such withdrawn consent must cease, unless processing is permitted on other grounds under the act or under the provisions of another law.

The act is also clear that any consequences resulting from the withdrawal of consent must be
borne by the data principal — indicating businesses may stop offering goods and services to
individuals once consent is withdrawn.

Children's data

Where a data principal is below the age of 18, the act includes their parent or guardian within
the definition of the term "data principal." It is unclear whether a minor data principal will be
permitted to exercise rights under the act or whether their parent or guardian will have to act
on their behalf.

The mechanisms data fiduciaries will have to adopt to establish the identities of parents or
guardians and verify their relationships with the minor data principal, as well as the question
of whether both parents can act on behalf of a child and, if so, processes for resolutions in the
event of conflicting exercises of rights, are also unclear. However, the government is expected
to prescribe detailed rules on how children's personal data will be treated under the act.

Exceptions

The act offers sweeping exceptions, which may dilute a data principal's ability to effectively exercise their rights. Data principals have no rights regarding the processing of personal data they choose to make publicly available, or of personal data processed for research, archival or statistical purposes that is not used to make decisions specific to them.

Data principals may, therefore, find it almost impossible to exercise rights in respect to large-
scale data mining and processing for the training of artificial intelligence and machine learning
tools.

Employees will be unable to exercise rights against employers to seek information, correction
or erasure of their personal data. Most state-based processing is also exempt from the act's
provisions.

Separately, the act applies extraterritorially. While a data fiduciary that undertakes any
processing in India is subject to the law, certain exemptions apply regarding processing
undertaken by Indian companies involving non-Indian data principals on the basis of contracts with non-Indian persons. Practically, these non-Indian data principals would be unable to
exercise rights with regards to such Indian companies.

Additionally, data fiduciaries are under no obligation to recognize data principals' rights where
the underlying processing is for:

• Enforcement of a legal right or claim.


• Prevention, detection, investigation or prosecution of any offense.
• Mergers, amalgamations or restructuring approved by relevant courts in India.
• Ascertaining financial information of individuals who are loan defaulters.

Data principal duties

In a first, data principals are subject to certain duties. For example, they are obligated to comply
with the act, not impersonate other data principals, not suppress material information while
providing personal information for government identifiers and other documents issued by the
state or its agencies, not register false or frivolous complaints, and provide only verifiably
authentic information when exercising the right to correction or erasure. Penalties for
noncompliance include fines of up to INR10,000. It appears duties were imposed on data
principals to mitigate vexatious complaints.

Obligations of Data Processing Entities

India's data privacy law, the Digital Personal Data Protection Act, is unique in that it eschews
the EU General Data Protection Regulation's model of data privacy legislation in favor of a
simpler, less prescriptive law.

Consequently, the DPDPA leaves out several concepts found in the European regulation,
including some details that make the GDPR more comprehensive legislation.

One area in the DPDPA where we see little regulation relates to data processors, with only a
handful of provisions on the topic. The act defines a data processor as anyone who processes
personal information on behalf of a data fiduciary, the term used under the law to refer to a
data controller. Correspondingly, a data fiduciary is defined as any person who "alone or in
conjunction with other persons determines the purpose and means of processing of personal
data."

The law focuses almost entirely on data fiduciaries, including their obligations to fulfil data principals' rights related to access, correction and deletion of personal information. Provisions on special protections for children's personal data, the implementation of a procedure for addressing data principals' grievances, and several other requirements also apply to data fiduciaries.

The act requires a valid contract to be in place under which personal information is transferred
to a data processor. Significantly, the data fiduciary is responsible for ensuring the data
processor's compliance with the act. As such, it may be interpreted that the act does not cast an
obligation directly on data processors but instead on the data fiduciary.

This means if there is a violation of the act by the data processor, it is possible only the data
fiduciary will be held liable. However, this is not entirely clear. In the schedule for penalties,
only two provisions refer specifically to the obligation of the data fiduciary, including the
requirement to maintain reasonable security safeguards. The penalty provision refers to a
"person," rather than a data fiduciary. Though individual data principals can also be held liable
for failure to observe their obligations under the law, these provisions create some doubt about
liability being imposed only on the data fiduciary.

The data fiduciary should be extra cautious in negotiating contracts with data processors, as
the data fiduciary must assume they will be held liable for any violation by the data processor.
As such, the data fiduciary will want to include an indemnity clause holding the data processor
liable for any penalties paid by the data fiduciary due to violations of the act by the data
processor.

Significantly, this means the data fiduciary will initially be liable for violations by data
processors. Moreover, the Data Protection Board of India or the appellate authority may
conclude a violation has occurred, but may not allocate the degree of blame between the two
relevant parties — the data fiduciary and the data processor — requiring litigation before Indian
courts in the absence of an arbitration clause. This could potentially involve substantial
evidentiary proceedings to determine who was responsible and to what extent.

It should also be highlighted that the act does not deal specifically with situations of multiple data fiduciaries or joint data fiduciaries as the GDPR does. Going by the definition of a data fiduciary, multiple data fiduciaries may exist if multiple entities determine the means and
purpose of data processing. In this case, a party that processes personal data on behalf of a data
fiduciary may be a data fiduciary, not a data processor, and would be directly liable. This normally occurs when the concerned entity determines the purpose and means of processing
independently from the data fiduciary.

In India, where data privacy compliance is still relatively nascent, data fiduciaries may have a
heightened sense of fear about the likely consequences of privacy law violations by the data
processor. Liability aside, it may be more important for data fiduciaries to ensure data
processors simply do not violate the act. Hence, data fiduciaries may need to impose strict
standards on data processors, including periodic audits. This could also increase the costs of
outsourcing.

A global impact

India plays a key role in the digital economy with its huge outsourcing and offshore industry.
The country processes a significant part of the world’s data. So how does the DPDPA apply to
data processors in India?

One of the act's provisions exempts most personal data of people outside India that is processed in India under a cross-border contract. This means most of the law does not apply to personal data of people outside India that is processed in India.

This may initially raise eyebrows — after all, one of the reasons for having a data privacy law
is to ensure personal data is protected in India. However, when personal data is collected in the
country of the data subject, it is collected under that country's laws. Applying the law of the
processor would lead to confusion, especially where the laws of the data processor are
substantially different.

To illustrate, given the varying bases for collecting personal data under the GDPR and the
DPDPA, a controller in the EU may collect and process personal data under legitimate interest
whereas, under India’s law, the same controller would need to obtain consent. A key provision
in this regard is the requirement that the data fiduciary ensure reasonable security safeguards
are in place to protect personal data. Consequently, if there is negligence in safeguarding
personal data, the legal system of the data processor is well-placed to hold the data processor
responsible.

Some clarification may still be necessary with respect to the web of provisions related to the
personal information of people outside India. Two provisions of the act impose important obligations on the data fiduciary: mandating compliance with the act, including in respect of any processing undertaken on its behalf by a data processor, and requiring the establishment of reasonable security safeguards. Whether an Indian data processor's failure to maintain reasonable security
safeguards could result in liability on the foreign data fiduciary rather than on the processor
remains to be seen.

EU controllers need to consider issues arising from the 'Schrems II' judgment, while allowing
processing of personal information in India. This has not been a serious challenge so far, though
there are some concerns, chiefly the lack of independent oversight of government surveillance.
The act gives the government very broad power to call for any information from a data fiduciary or intermediary, although this power may be limited by language that suggests such power is to be exercised only for the purposes of the act. The lack of specificity could also mean the government can call for large databases of personal information if it so chooses. Moreover, even though the law
does not apply directly to the data processor in India, and the data fiduciary is very likely to be
a customer outside India, a data processor could be considered to be an intermediary and still
be subject to the exercise of this power.

Further, protections relating to government surveillance under existing telecommunications and information technology laws are not replicated in the act. Once the law comes into force
and Indian industry faces challenges relating to its interpretation, these issues will hopefully be
sorted out. There is some concern, though, that India's apparent data protection authority, the Data Protection Board of India, is merely an adjudicatory body without regulatory powers, such as the power to issue guidance notes as the European Data Protection Board does. This may
hamper India's ability to develop its own jurisprudence around the new legislation concerning
these and other issues.

Implementation challenges

The DPDPA is far more exhaustive than the existing data protection framework covered under
the Information Technology Act, 2000 and the Information Technology (Reasonable Security
Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011. As a result,
as with the implementation of any new legislation, organizations are likely to face several challenges.

One is the lack of clarity on the time frame for implementation. Despite the government's
indicated time frame, privacy professionals, especially at larger organizations, may desire more
clarity. They will need to understand further compliance requirements that could be imposed,
how and when such provisions will take effect, and the extent to which they should edit existing
privacy programs before taking significant steps to transition.

Heavy compliance costs, both in terms of manpower and resources, are another major cause of
concern, especially for smaller organizations that may need to onboard technology and
resources to ensure compliance. Organizations will likely also need to update technology or
procure software to aid in compliance with several of the act's requirements, like data
portability, consent mechanisms and provision of the right to data erasure.

Many of these requirements become even more challenging for organizations that use emerging
technologies, like artificial intelligence and blockchain, as part of business offerings where
personal data is inadvertently processed.

Authority and grievance redressal process

Any grievance raised by a data principal in relation to data processing must first be addressed
by the internal grievance redressal mechanism adopted by a data fiduciary. If this fails, the Data
Protection Board of India is vested with the powers to receive and investigate complaints raised
by data principals.

The data protection board, established as an independent supervisory authority under the act,
will serve as a "digital office," the first of its kind in India. The board is proposed to be led by
a chairperson and will have government-appointed members serving two-year renewable
terms.

The board has been bestowed broad powers to initiate inquiries, investigate complaints, impose
fines and penalties, and take other actions as required upon receiving a complaint from a data
principal, consent manager, the government or an intimation from the data fiduciary itself. The
board has also been granted the power to refer disputing parties to mediation and accept
voluntary undertakings from data fiduciaries to take, or refrain from certain actions, as
settlement. The act, however, does not confer the board with any lawmaking powers to issue
directions or regulations.

Broad powers of the government

The government has been granted broad powers and discretion to stipulate rules, prescribe the
manner and timelines for data fiduciaries to respond to requests from data principals, authorize
details relating to data-breach notifications, adopt delegated legislation, formulate the
requirements of valid notice for obtaining a data principal's consent for processing data, and
other actions required for implementation.

Besides the foregoing, the act also empowers the government to request access to any
information from a data fiduciary, any entity processing personal data, an intermediary (as
defined by the IT Act) or from the board. This authority is extremely broad and is subject to
fewer restrictions than those provided for under the existing IT Act and SPDI Rules. Further,
the government is also empowered to order or direct any government agency and intermediary
to block information from public access "in the interests of the general public," after the board
sanctions the concerned data fiduciary at least two times and advises the government to issue
such an order.

Appellate body

While the board has been granted the powers of a civil court under the Code of Civil Procedure,
1908, with respect to the powers to summon and enforce the attendance of any person, receive
affidavits, require discovery and produce and inspect documents, the act expressly forecloses
individuals' access to civil courts for relief under the law. It will be interesting to see how this
interplays with the Supreme Court decision that found citizens have a fundamental right to
privacy under Article 21 of India's constitution. The act instead grants any person aggrieved by
an order of the board the right to file an appeal before the Telecom Disputes Settlement and
Appellate Tribunal.

While the TDSAT derives authority from the Department of Telecommunications, the Ministry
of Electronics and Information Technology spearheaded the adoption of the act. Accordingly,
given that the TDSAT was originally set up to handle disputes pertaining to
telecommunications and information technology — in contrast to the board which is proposed
to be constituted purely to regulate the processing of digital personal data in India — the act's appeals process begs the question of whether the TDSAT is the right appellate body to handle
data privacy appeals.

Further, while Section 43A of the IT Act will be repealed when the DPDPA comes into effect,
the rest of the IT Act's provisions remain in force. As a result, in case of a data breach where
multiple provisions of the IT Act are triggered, a data principal or any impacted party may
indulge in forum shopping by seeking recourse from tribunals/authorities that are most likely
to provide favorable outcomes. This may lead to confusion and conflict among affected parties
and regulatory authorities.

That said, the Cyber Appellate Tribunal, which was the appellate body under the IT Act for
certain notified matters, was merged with the TDSAT in 2017. Accordingly, the TDSAT may
be the logical choice to entertain appeals of decisions passed by the board. This, however, does
not address the concern that the primary role of the TDSAT has historically been to serve as
the appellate body for telecom disputes.

Penalties

The DPDPA stipulates varying penalty amounts depending on the violation. A data fiduciary may be fined up to INR50 crore (approximately USD6 million) for breach of any provision of the act or the implementing rules for which no specific penalty is stipulated, and up to INR250 crore (approximately USD30 million) for failure to fulfill the obligation to implement reasonable security safeguards to prevent a personal data breach.

The act also sets out general parameters that may be considered to determine the appropriate penalty, such as the nature, gravity and duration of the breach, the type and nature of personal data affected, and the repetitive nature and implications of the breach, among others. Under Section 43A
of the IT Act, a company breaching its obligations in respect of personal data, thereby causing
wrongful loss or gain, is liable to pay damages to the affected individual. The data principal's
right to receive compensation has been done away with under the act. However, given that the
board has been granted extensive powers to issue directions to discharge its functions under
the act, including the powers vested in a civil court, it is unclear whether such powers extend
to granting compensation to data principals.

Further, the government retains broad powers to implement additional rules. Accordingly, it
remains to be seen whether any rules or directions related to compensation of data principals
will be implemented when the act comes into effect.

Additionally, unlike the E.U. General Data Protection Regulation and the California Consumer
Privacy Act, the DPDPA permits the board to levy penalties on data principals, to ensure they
do not take undue advantage of any noncompliance under the law that may be attributable to
their own action. The board can prescribe a penalty of up to INR10,000 (approximately
USD120) on a data principal if they fail to perform duties stipulated under the act.
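
Purely as an illustration of the tiered structure described above, the following sketch maps simplified violation categories to the maximum penalties mentioned in this section; the category labels and the residual default are assumptions, not the act's full penalty schedule.

```python
# Maximum penalties discussed above, in INR (illustrative mapping only).
MAX_PENALTIES = {
    "failure_to_maintain_reasonable_security_safeguards": 250 * 10**7,  # INR250 crore
    "breach_of_any_other_provision": 50 * 10**7,                        # INR50 crore
    "data_principal_breach_of_duties": 10_000,                          # INR10,000
}

def maximum_penalty(violation: str) -> int:
    """Return the maximum penalty (in INR) for a simplified violation category,
    defaulting to the residual category for unspecified breaches."""
    return MAX_PENALTIES.get(violation, MAX_PENALTIES["breach_of_any_other_provision"])
```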

Data localization

Section 40 of the first draft of India's data protection legislation, the Personal Data Protection Bill, 2018, required a copy of all personal data to be stored within the territory of India at all times. A wide cross-section of stakeholders stridently objected to this data localization approach, as it fundamentally disrupts the way business is conducted online.

As a result, subsequent versions of the law, namely Section 17 of the Digital Personal Data
Protection Bill, 2021, progressively diluted this stance. The first step in this direction was
permitting transfers of personal data to certain "whitelisted" countries that the government
would separately provide.

This was further diluted in the version that was finally enacted into law. DPDPA Section 16
permits personal data to be freely transferred to all countries or territories outside India, except
those the central government specifically identifies.

With that, a law that initially looked like it would impose an absolute prohibition on the cross-border transfer of data ended up taking a "blacklist" approach to cross-border data transfers, under which transfer restrictions apply only to specifically enumerated countries or territories.

Basis for permitted data transfers

Under the GDPR, data transfers are permitted where the jurisdiction or entity receiving the data
offers a level of protection that is sufficient to safeguard the personal data of the European residents whose data is being transferred. Thus, data transfers are permitted to countries the European Commission has determined ensure an adequate level of protection under Article 45, or between entities that are subject to binding corporate rules under Article 47 or appropriate safeguards under Article 46. These provisions set out the principles according to which the permissibility of cross-border data transfers can be evaluated.

The DPDPA, on the other hand, offers no such basis for determining the countries to which data transfers will be prohibited. It only states that the countries to which data transfers are restricted will be notified. There is no obligation to provide justifications of adequacy, nor is there any other mechanism, equivalent to standard contractual clauses or binding corporate rules, by which data transfers may be permitted to entities in such prohibited jurisdictions.

Significant data fiduciaries

The act also introduced the concept of a significant data fiduciary — an entity that is subject to
a higher threshold of compliance on account of it processing high volumes of data, processing
high-risk data or operating in a politically sensitive industry.

It reserved the right to subject significant data fiduciaries to additional compliance requirements determined by the central government.

That being the case, it is conceivable that the central government could use this power to restrict
significant data fiduciaries from transferring personal data outside the country or to specified
jurisdictions.

Exemptions

Even as the act lays the groundwork for country-specific restrictions on data transfers, Section
17 clarifies that these restrictions may not apply in relation to certain processing activities.
Examples of such exempted processing activities, along with indicative use cases where such
exemptions may be utilized by both the government and private entities, include:

Prevention, detection, investigation or prosecution of offences under Indian law.


Even if restrictions on the cross-border transfer of personal data to specified jurisdictions exist,
Indian police and law enforcement agencies will not be subject to them in relation to
international criminal investigations or extradition mandates. Arguably, private companies
could also avail this exemption when data needs to be transferred in relation to ongoing internal
investigations or fraud.

Enforcement of a legal right or claim.

Restrictions on the cross-border transfer of personal data to a specified jurisdiction will not
come in the way of transfers that are necessary to enforce legal rights, such as property disputes,
matrimonial disputes, immigration cases, financial claims, etc.

Processing pursuant to a contract with a foreign entity.

Restrictions on the cross-border transfer of personal data will not apply to any processing
pursuant to a contract with a foreign entity. This is particularly relevant to the portion of the
Indian outsourcing industry that deals primarily with non-Indian personal data, which they
process for their foreign clients.

Processing pursuant to legally approved mergers, demergers, acquisitions and other such
arrangements between companies.

Any Indian entity that enters any such arrangement with a foreign company will be able to
avail of this exemption to transfer employee information and other personal data to such foreign
company, even if it is located in a jurisdiction to which data transfers are prohibited.

Processing to ascertain the financial position of a defaulter to a financial institution.

The fact that the cross-border transfer of personal data has been prohibited to a specified
jurisdiction will not operate to prevent such transfers when financial institutions need to
ascertain financial assets and liabilities of defaulting customers.

The performance of regulatory, supervisory, or judicial functions.

Regulatory authorities will not be prohibited from transferring personal data as needed for
cross-border enforcement, regulation, or supervision, even if the jurisdictions in question are
listed as those to which data transfers are prohibited.
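
A simple sketch of how a compliance team might operationalize this blacklist-plus-exemptions logic is given below: a transfer is blocked only if the destination is on the government's notified list and no Section 17 exemption applies. The exemption labels and the check itself are illustrative assumptions rather than a prescribed mechanism.

```python
# Countries notified by the central government as restricted (hypothetical placeholder list).
RESTRICTED_DESTINATIONS: set[str] = set()

# Simplified labels for the exempted processing activities discussed above.
EXEMPTIONS = {
    "offence_prevention_or_investigation",
    "enforcement_of_legal_right_or_claim",
    "contract_with_foreign_entity",
    "approved_merger_or_restructuring",
    "ascertaining_defaulter_financials",
    "regulatory_supervisory_or_judicial_function",
}

def transfer_permitted(destination_country: str, exemption: str | None = None) -> bool:
    """Blacklist approach: transfers are free unless the destination is restricted,
    and even then an applicable Section 17 exemption lifts the restriction."""
    if destination_country not in RESTRICTED_DESTINATIONS:
        return True
    return exemption in EXEMPTIONS
```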
Continued application of sectoral laws

Section 16(2) of the DPDPA explicitly states that restrictions providing additional requirements or higher degrees of protection under existing laws will also continue to apply. This suggests any
restrictions set by the central government under Section 16(1) will serve as the baseline
protection across all categories of personal data, but sectoral regulators could, if they so choose,
prescribe additional restrictions or protections, as required depending on the nature of data or
the needs of the industry.

At present, India has cross-border data transfer restrictions in multiple sectors. For instance,
the country's banking regulator — the Reserve Bank of India — stipulated certain categories
of payment data, such as transaction information and customer credentials, can only be stored
within India.

Similarly, certain categories of telecommunications data, including accounting information related to subscribers, cannot be transferred outside India. There are equivalent localization
requirements in the insurance sector. All these restrictions will continue to operate,
notwithstanding the lack of data localization obligations set out under the act.

COMPARATIVE ANALYSIS WITH THE EU GENERAL DATA PROTECTION REGULATION AND OTHER MAJOR DATA PRIVACY LAWS

India's soon-to-be enforced Digital Personal Data Protection Act seeks to balance individual
privacy and the country's emerging digital economy.

Unlike most global data protection laws, the DPDPA applies only to digital personal data, excluding non-digital personal data unless subsequently digitized. Perhaps inspired by Singapore's Personal Data Protection Act, the DPDPA creates a broad exception for personal data made public either by an individual or under a law. Unlike the EU General Data Protection Regulation, the act does not exclude processing for journalistic purposes from its scope.

The DPDPA treats all personal data uniformly, without imposing heightened obligations for
sensitive personal data. Entities that determine the means and purposes of processing personal
data are termed "data fiduciaries," instead of "data controllers." Individuals identifiable by or
in relation to any data are termed "data principals," rather than "data subjects" — implying a
fiduciary relationship of trust in India's digital economy. Notably, in relation to children and
persons with disabilities, the act includes parents or lawful guardians under its definition of
data principals, raising questions on how overlapping rights between such data principals may
be reconciled.

The act additionally allows data principals to provide or withdraw consent through consent
managers, data-blind entities that facilitate interoperable data sharing, to enable seamless
sharing of data inter alia within India's digital public infrastructure. Consent managers will be
accountable to data principals under the act, a requirement that exists perhaps to address a
potential conflict of interest, such as in case of monetary dependence on data fiduciaries.
Consent managers may be subject to additional obligations notified through forthcoming rules.

Unlike the GDPR and the California Consumer Privacy Act, which apply certain obligations
to data processors directly, the DPDPA applies only to data fiduciaries, requiring them to
execute valid contracts with data processors. The nature of contractual protections that should
be passed on to data processors is not specified.

SCOPE AND APPLICATION


1. Territorial Scope

GDPR
Applies to:

• Organizations that have an establishment in the EU and process personal data "in the context
of" the EU establishment.

• Organizations that are not established in the EU but process personal data related to either
offering goods or services in the EU or monitoring the behavior of individuals in the EU.

DPDPA
Applies to digital personal data processed:

• Within the territory of India.

• Outside India, in connection with the offering of goods or services in India.

Except data security requirements, Indian entities performing offshore processing are exempt from the DPDPA when:

• The entity processes personal data on behalf of a foreign data fiduciary.

• The personal data relates only to foreign data principals.

2. Subject-Matter Scope

GDPR
Applies to:

• Personal data.

• Automated processing or nonautomated processing where personal data forms part of a filing
system.

Does not apply to:

• Anonymous data.

• Personal data processed by natural persons for purely personal or household purposes.

• Processing by law enforcement and national security agencies.

DPDPA
Applies to:

• Automated and nonautomated processing of digital personal data.

• Automated and nonautomated processing of nondigitized personal data that is subsequently digitized.

Does not apply to:

• Anonymous data (implicitly, since applicability is limited to personal data).

• Personal data processed by an individual for purely personal or domestic purposes.

• Personal data made publicly available, either by a data principal, or by another person under an obligation of Indian law to publicize such data.

Except data security requirements, does not apply to processing pursuant to:

• Enforcing a legal right or claim.

• The performance of a judicial, regulatory or supervisory function.

• Prevention, detection or investigation of offences.

• Processing by an Indian data processor on behalf of a foreign data fiduciary, where the
personal data only relates to foreign data principals.

• Certain mergers and acquisitions approved by a competent authority.

• Ascertaining the assets and liabilities of any person who has defaulted in payment of a loan
or advance taken from a financial institution.

Also allows the government to additionally exempt from its scope:

• Classes of data fiduciaries, including startups, considering the nature and volume of personal
data processed.

• Government agencies in the interest of national security, public order, investigation of offences, etc.

3. Definition of Personal Data

GDPR

Defines personal data as any information related to an identified or identifiable natural person,
the data subject. An identifiable natural person is one who can be identified, directly or
indirectly, taking "all of the means reasonably likely to be used" into account.

DPDPA

Defines personal data as information about a natural person identifiable by or in relation to such data.

4. Definition of Sensitive Personal Data

GDPR
Defines "special categories of personal data" as personal data revealing:

• Racial or ethnic origin.

• Political opinions, religion or philosophical beliefs.

• Trade union membership.

• Genetic data.

• Biometric data, for the purpose of uniquely identifying a natural person.

• Health.

• Sex life or sexual orientation.

Personal data related to criminal convictions and offenses, while not special category data, is
subject to distinct rules defined by EU or member state law.

DPDPA
Treats all personal data uniformly, without separately classifying special or sensitive categories
of personal data.

5. Relevant Parties

GDPR
Controller: The natural or legal person, public authority, agency or other body that, alone or
jointly with others, determines the purposes and means of the processing of personal data.

Processor: A natural or legal person, public authority, agency or other body that processes
personal data on behalf of the controller.

Data subject: An identified or identifiable natural person.

DPDPA
Data fiduciary: Any individual, company or juristic entity that alone, or in conjunction with
another, determines the means and purposes of processing personal data.

Data processor: Any state, company, juristic entity or individual who processes personal data
on behalf of a data fiduciary.

Data principal: The natural person to whom the personal data relates. In relation to children
and persons with disabilities, the definition includes the parent or lawful guardian.

Consent manager: The DPDPA allows data principals to give, manage, review, or withdraw
their consent through a "consent manager." Consent managers are accountable to the data
principal and are to act on their behalf in such manner as is prescribed through rules.

Significant data fiduciaries: Classes of data fiduciaries notified as such by the government considering certain factors. These include the nature and volume of personal data processed,
the risk posed to data principals from their processing activities, risk to electoral democracy,
threat to the country's sovereignty and integrity, and security of state. Significant data
fiduciaries are subject to additional obligations under the DPDPA.

LAWFULNESS OF PROCESSING

1. General Principles

GDPR
Sets out seven principles in Article 5:

• Lawfulness, fairness and transparency.

• Purpose limitation.

• Data minimization.

• Accuracy.

• Storage limitation.

• Integrity and confidentiality.

• Accountability.

DPDPA
Reflects the following commonly accepted data protection principles in its various
requirements:

Lawfulness: Personal data may be processed only pursuant to a lawful purpose.

Fairness: Consent must be free, specific, informed, unconditional and unambiguous. Consent
should be provided through a clear affirmative act of the data principal, signifying an agreement
to such processing.

Data minimisation: Where consent is the basis for processing, personal data collected should
be limited to what is necessary for a specified purpose.

Storage limitation: Data collected should be retained only until necessary for the specified
purpose unless further retention is required by an Indian law.

Purpose limitation: Where consent or voluntarily provided personal data is the basis for processing, personal data should only be processed pursuant to specified purposes.

Integrity: Where personal data is likely to be disclosed to another data fiduciary or used to make decisions about a data principal, the data fiduciary must ensure the processing of such personal data ensures its completeness, accuracy and consistency.

Confidentiality: Data fiduciaries are required to protect the personal data in their possession or control, and implement reasonable security safeguards to prevent a personal data breach.
Data fiduciaries are also required to implement appropriate technical and organisational
measures.

Accountability: Data fiduciaries are required to, irrespective of an agreement to the contrary,
ensure compliance with DPDPA provisions.

Significant data fiduciaries are required to carry out periodic audits through an independent
data auditor, data protection impact assessments in accordance with the DPDPA and rules
prescribed, etc.

2. Legal Basis for Processing Personal Data

GDPR
Includes six lawful bases for processing personal data, subject to additions by member states:

• Consent.

• Performance of a contract.

• Legal obligation.

• Legitimate interests.

• Life protection and vital interests.

• Public interest.

DPDPA
Prescribes nine additional grounds for processing personal data beyond consent, defined as
legitimate uses, including the following:

• Use of voluntarily provided data by the data principal for a specified purpose, where the data
principal has not objected to such use.

• Performance of a state function.

• Performance of legal obligation, or in the interests of the sovereignty and integrity of India.

• To fulfil any legal obligation.

• To comply with a judicial order.

• To respond to medical emergencies involving a threat to an individual's life.

• During a threat to public health.

• For undertaking measures to ensure public safety or provide assistance during a disaster or
public order breakdown.

• For employment purposes, or to safeguard the employer from loss or liability such as
corporate espionage, to maintain confidentiality of proprietary information or to provide any
service or benefit sought by an employee.

3. Consent

GDPR
Imposes a number of requirements for obtaining valid consent:

• Consent must be freely given, specific and informed.

• It must be granted by an unambiguous affirmative action.

• Generally, the provision of a service cannot be made conditional on obtaining consent for
processing that is not necessary for the service.

• A request for consent must be distinct from any other terms and conditions.

• Consent for separate processing purposes must be provided separately.

• Individuals have the right to withdraw consent at any time "without detriment" and it should
be as easy to withdraw consent as it was to give it.

DPDPA
Requires consent to be:

• Freely given, specific and informed.

• Unconditional. This possibly implies the provision of a service cannot be conditioned on providing consent for collecting any unnecessary data.

• Unambiguous.

• Capable of being withdrawn with comparable ease to which consent was given.

• In clear and plain language.

• Accessible in English as well as in all the official languages as prescribed in the Indian
Constitution.

4. Legitimate Interests

GDPR
Processing is permitted without consent where it is necessary for the controller's or a third
party's legitimate interests, provided such interests are not overridden by the rights and interests
of the data subject.

It is the controller's responsibility to determine whether the interests it pursues under this basis are legitimate and proportionate. Controllers are expected to document these assessments.

DPDPA
Does not recognize legitimate interests as a basis for processing personal data; the closest analogues are the "certain legitimate uses" listed under Section 7.

5. Conditions for Processing Sensitive Data

GDPR
Includes 10 lawful bases for processing sensitive data, subject to additions by member states:

• Explicit consent.

• Compliance with obligations and exercising rights in the employment and social security
context.

• Life protection and vital interests.

• Legitimate activities by foundations, associations or other not-for-profit bodies with political, philosophical, religious or trade union aims that process data about members.

• Establishment, exercise or defense in legal claims.

• Manifestly made public by the individual.

• Substantial public interest defined by law.

• Preventive or occupational medicine, assessment of the working capacity of the employee, medical diagnosis, and the provision of health or social care or treatment.

• Substantial public interest in health.

• Archiving, scientific or historical research purposes.

DPDPA

Treats all personal data uniformly, without creating special categories of personal data, so the
grounds for processing all personal data remain the same.

PROTECTIONS FOR CHILDREN

GDPR
Imposes additional obligations when collecting consent from children under age 16 or at an
age set between 13 and 16 by member state law.

Where providing certain electronic services at a distance, i.e., "information society services,"
directly to a child and where the processing is based on consent, consent must be provided by
a parent or guardian.

Processing personal data of children is pertinent to other GDPR requirements, so notices must
be tailored to children. The fact that data subjects are children could tip the balance of the
legitimate interest test or trigger a DPIA.

One recital states significant automated decisions should not be taken concerning children.

DPDPA
Defines a child as an individual under age 18.

There is a general obligation to obtain "verifiable consent" from the parent or lawful guardian
of children and persons with disabilities. The government can notify rules specifying the
manner of obtaining such verifiable parental consent.

Data fiduciaries are prohibited from undertaking processing that:

• Is likely to have a detrimental effect on the wellbeing of a child.

• Involves tracking, behavioral monitoring of children or targeted advertising directed at children.

These restrictions would not apply for certain classes of data fiduciaries, or for certain purposes
as prescribed through rules by the government.

The government may notify classes of data fiduciaries exempt from these restrictions upon
satisfaction that their processing activities are verifiably safe.

INDIVIDUAL RIGHTS

1. Transparency Requirements

GDPR
Requires information to be provided in a concise, transparent, intelligible and easily accessible
form, using clear and plain language.

Where personal data is collected directly from the individual, notice must be provided at or
before the time of collection.

For personal data collected indirectly (i.e., from another source), notice must be provided
within one month or upon first contact with the individual, if earlier, unless providing notice
would be impossible or require disproportionate effort.

Sets out detailed requirements for the content that must be included in notices.

DPDPA
Requires consent to be obtained using clear and plain language.

Requires a notice to be provided before or at the time of personal data collection, disclosing:

• Personal data processed and the purpose for such processing.

• The manner in which data principals may exercise the rights available under the DPDPA.

• The manner in which the data principal can make a complaint to the Data Protection Board
of India in such manner as prescribed.

Requires the consent language to be accessible in English as well as all official languages set
out in the Indian Constitution.

Data fiduciaries are required to publish the business contact information of a point of contact,
and a data protection officer in the case of significant data fiduciaries, to answer data principals'
questions regarding personal data processing.
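
By way of illustration only, a consent notice satisfying the disclosures listed above might be represented as a structured object like the following; the field names, the completeness check and the handling of translations are assumptions, not requirements drawn from the act or its rules.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentNotice:
    """Illustrative structure for a DPDPA-style consent notice."""
    personal_data_items: list[str]          # personal data to be processed
    purpose: str                            # purpose of processing
    rights_exercise_instructions: str       # how data principals may exercise their rights
    complaint_instructions: str             # how to complain to the Data Protection Board
    contact_point: str                      # published business contact / DPO contact
    translations: dict[str, str] = field(default_factory=dict)  # language code -> notice text

def is_notice_complete(notice: ConsentNotice) -> bool:
    """Basic completeness check against the disclosures listed above."""
    return all([
        notice.personal_data_items,
        notice.purpose,
        notice.rights_exercise_instructions,
        notice.complaint_instructions,
        notice.contact_point,
    ])
```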

2. Right to Access

GDPR
Gives individuals the right to receive information about how their personal data is processed
and a copy of their personal data.

Personal data must be provided:

• Free of charge, except where requests are manifestly unfounded, excessive or for additional
copies.

• In electronic form when requested.

• Within one month unless an extension applies.

Exceptions apply when providing the information would adversely affect the rights and
freedoms of others, including intellectual property rights.

DPDPA
Where consent or voluntary use is the basis for processing personal data, gives the data
principal the right of access in the manner prescribed to:

• A summary of personal data processed by the data fiduciary and the processing activities
undertaken by that data fiduciary with respect to such personal data.

• The identities of all other data fiduciaries and data processors with whom personal data has
been shared, with a description of the personal data.

• Any other information related to their personal data and its processing, as may be
prescribed.

3. Right to Correct and Update

GDPR
Grants data subjects the right to:

• Correct inaccurate personal data.

• Complete incomplete personal data.

When personal data is updated, it must be communicated to each recipient to which it was
disclosed, unless this would involve disproportionate effort.

The controller must restrict processing where the accuracy of the data is disputed for the time
needed to verify the request.

DPDPA
When consent or voluntary use is the basis for processing personal data, grants data principals
the right to:

• Correct inaccurate or misleading personal data.

• Complete incomplete personal data.

• Update outdated personal data.

4. Right to be Forgotten

GDPR
Grants data subjects the right to request the deletion of personal data processed by the
controller, when the data is no longer needed for the purpose of processing, when the data
subject withdraws consent or objects, and when processing is unlawful or deletion is required
by law.

If the controller grants a request for the deletion of data that was previously made public, the
controller needs to "take reasonable steps" to inform any third parties that may be processing
the data of the data subject's request.

There is also an obligation to communicate the request directly to any known recipients of the
data unless it would be impossible or would require disproportionate effort.

Controllers may rely on a number of exceptions, including establishing, exercising or defending legal claims, conducting research that meets certain conditions, and other compelling legitimate interests to override a request.

DPDPA
Does not expressly recognize a right to be forgotten.

However, it allows data principals to withdraw their consent related to personal data
processing.

Where consent is the basis for processing personal data, and such consent is withdrawn, the
data fiduciary would be required to cease the processing of personal data, including its retention
or public disclosure, within a reasonable time, unless the data is required to be retained under
an Indian law. Additionally, the data principal may exercise the right to erasure of their personal
data where its further retention is not necessary for the specified purpose, and is not required
by law.

5. Rights Related to Profiling


GDPR
Gives data subjects the right not to be subject to solely automated decisions, including
profiling, that produce legal or significant effects, unless certain conditions are met.

Where such decisions are permitted, data subjects have a right to obtain human intervention
and contest the decision.

Controllers must also provide meaningful information about the logic of decisions and take
reasonable steps to prevent bias, error or discrimination.

DPDPA
Does not include an overarching right not to be subject to profiling or significant decisions,
except the restriction against behavioral monitoring, tracking or offering targeted ads to
children.

ACCOUNTABILITY REQUIREMENT

1. Appointment of a Data Protection Officer

GDPR
Requires controllers and processors not established in the EU, that are subject to the GDPR, to
appoint a representative in the EU, except if processing is occasional and does not involve large
scale processing of sensitive data.

Requires private entities to appoint a DPO only when a "core activity" of the controller or
processor involves either the regular and systematic monitoring of data subjects on a large scale
or the large-scale processing of sensitive data.

The DPO must have sufficient independence and skill to carry out its functions and must be
able to report to the highest levels of management within the organization.

DPOs may be outsourced.

Guidance from EU regulators recommends DPOs should be based in the EU.

DPDPA
Requires all significant data fiduciaries to appoint DPOs based in India. Significant data
fiduciaries are classes of data fiduciaries notified by the government considering the nature and
volume of personal data processed, the risk posed to the rights of data principals, risk to
electoral democracy, potential impact on the sovereignty and integrity of India, security of the
state, etc.

The DPO must represent the significant data fiduciary and be accountable to the board of
directors or governing body of the significant data fiduciary.

There are no express skill requirements for DPOs. Guidance on terms for appointing DPOs
may be provided through rules under the DPDPA.

2. Record of Processing

GDPR
Requires controllers and processors to retain detailed records of their processing activities
unless very narrow exceptions apply.

DPDPA
Does not require data fiduciaries to maintain a record of processing activities. However, data
fiduciaries may practically be required to maintain records of their processing activities when
demonstrating compliance with the DPDPA.

Notably, during any proceeding, the data fiduciary should be able to prove a notice was given
to the data principal and consent was obtained in accordance with DPDPA provisions.

3. Data Protection Impact Assessment

GDPR
Requires controllers to conduct a DPIA for certain "high risk" activities, including:

• Systematic and extensive profiling.

• Large-scale processing of sensitive data.

• Systematic monitoring of a publicly accessible area on a large scale.

In cases where the risks cannot be mitigated, the controller must consult with the data protection
authority before engaging in the processing.

DPDPA
Requires significant data fiduciaries to carry out DPIAs. Such assessment should consider the
impact of processing on data principals' rights, the purpose of processing, the assessment and
management of risks to data principals' rights, and other prescribed matters.

4. Privacy by Design

GDPR
Includes a requirement to implement appropriate compliance processes through the lifecycle
of any product, service or activity.

By default, only personal data necessary for a purpose should be processed and personal data
should not be publicly disclosed without an individual's affirmative action.

DPDPA

Notably, does not require data fiduciaries to implement privacy by design.

Where consent or voluntary use is the basis for processing personal data, the data fiduciary is required to ensure that the personal data collected is limited to what is necessary for the specified purpose.

Like the GDPR, the DPDPA requires all data fiduciaries to implement appropriate technical
and organizational measures.

5. Audit Requirement

GDPR
Does not include audit requirements applicable to controllers.

Processors must agree to audit provisions in contracts with controllers.

DPDPA
Requires significant data fiduciaries to appoint an independent data auditor and carry out
periodic audits.

6. Appointment of Processors

GDPR
Subjects processing by processors to detailed contracts, with requirements set out in Article 28.

DPDPA
Requires data fiduciaries to be responsible for compliance, without any provision being
applicable directly to data processors. However, data fiduciaries are required to engage data
processors only pursuant to a valid contract.

SECURITY AND BREACH NOTIFICATIONS

1. Information Security

GDPR
Requires controllers and processors to implement appropriate technical and organizational
measures to protect the security of personal data.

DPDPA
Requires data fiduciaries to protect the personal data under their control or possession, including in respect of any processing undertaken by them or on their behalf, and to implement necessary security safeguards to prevent a personal data breach. This requirement is likely to be passed on to data processors.

2. Breach Notification

GDPR
Requires controllers to notify the DPA of a breach within 72 hours unless the breach is unlikely
to result in a risk to individuals.

Notification may be made in stages as information becomes available.

Controllers must notify individuals of a breach without undue delay only if it is likely to result
in a "high risk" to individuals.

Processors must notify a controller of a breach without undue delay.

DPDPA
Requires data fiduciaries to notify the board and the affected data principals in the event of a
personal data breach, in a manner prescribed through rules.

The time period for notifying breaches may be established by rules.

INTERNATIONAL DATA TRANSFERS

1. Data Localisation Requirements

GDPR
Does not require localization unless international data transfer requirements are not met.

DPDPA
Does not generally require data localization. However, the government may impose restrictions
on transfers of personal data to specific countries through notification. Additionally, any
stricter law such as sector-specific data localization requirements, e.g., in respect of payment
data, insurance data, or telecommunications subscriber data, will continue to apply.

2. International Data Transfers

GDPR
Only permits the transfer of personal data outside the European Economic Area when:

• The recipient is in a territory considered by the European Commission to offer an adequate level of protection for personal data after an assessment of its privacy laws and law enforcement access regime.

• Appropriate safeguards are put in place, such as European Commission-approved standard
contractual clauses or binding corporate rules approved by DPAs.

• A derogation applies, such as where data subjects provide explicit consent, the transfer is
necessary to fulfil a contract, or there is a public interest founded in EU or member state law,
among others.

DPDPA
Generally permits international data transfers. The government may restrict transfers of
personal data to specific countries notified through rules under the act. The rules will also
specify the nature of such restrictions.

ENFORCEMENT

Penalties

GDPR
Does not stipulate criminal liability, but permits member states to impose criminal penalties
for violations of the regulation and applicable national rules.

Administrative fines up to 20 million euros or 4% of annual global revenue.

DPAs may also issue injunctive penalties, which include the ability to block processing, restrict
international transfers and require the deletion of personal data.

Individuals may bring claims in court for compensation and mechanisms exist for
representative actions on behalf of a class of individuals.

DPDPA
Does not impose any criminal penalties, but imposes monetary penalties for "significant"
breaches. In calculating the amount of penalty, relevant factors to be considered include:

• The nature, gravity and duration of the breach.

• The type and nature of personal data affected by the breach.

• The repetitive nature of the breach.

• Whether, as a result of the breach, the data fiduciary realized a gain or avoided any loss.

• Whether the monetary penalty imposed is proportionate and effective.

• The likely impact of the imposition of the monetary penalty on such data fiduciary.
During any stage of a proceeding for compliance with the DPDPA, the board may accept a
voluntary undertaking by a data fiduciary. This may include a commitment to take appropriate
action within a stipulated timeframe determined by the board. The board's acceptance would
bar all proceedings under the DPDPA against the data fiduciary.

In addition to monetary penalties, the Central Government may direct any of its agencies or
any online intermediary to block access to any information which enables a data fiduciary to
offer goods or services in India, based on a written reference from the board intimating that the
data fiduciary has been subject to a monetary penalty in two or more cases, and advising the
Central Government that blocking access to the data fiduciary's offerings is in the public's
general interests.

MISCELLANEOUS PROVISION

1. Anonymised Data

GDPR
Does not expressly define anonymous data; however, data that cannot be used to identify an individual by means reasonably likely to be used falls outside the scope of the law. In practice, anonymization is a high standard to meet.

DPDPA
Does not define anonymized personal data.

2. Exemptions for Research

GDPR
Permits a number of exemptions for scientific or historical research, archiving in the public
interest, and statistical purposes, including:

• Further processing for such purposes may be considered "compatible."

• EU or member state law may permit controllers to process sensitive data for such purposes.

• EU or member state law may provide derogations from certain individual rights.

For the research exemptions to apply, controllers must implement appropriate safeguards,
which may be specified by law, such as pseudonymization.

DPDPA

Does not apply its provisions to the processing of personal data for research, archiving or statistical purposes, if such processing is:

• Not used to make a decision about a data principal.

• Carried out in accordance with the standards prescribed by the government.

3. Rulemaking Authority

GDPR
Gives rulemaking authority to national DPAs and the European Data Protection Board to issue nonbinding guidance clarifying the application of its provisions.

Some limited areas are left to national law, such as clarifying the conditions for processing
criminal-record data or adopting additional derogations from certain provisions.

DPDPA
Permits the Central Government to promulgate additional rules or regulations that may clarify its requirements and/or specify additional requirements.

4. Application to Public Authorities

GDPR
Applies to public entities, subject to narrow exemptions:

• Law enforcement and other "competent authorities" are subject to a separate, but similar
framework when they process personal data for law enforcement purposes.

• EU institutions are subject to a separate but similar framework.

• Activities that fall outside the scope of EU law, such as national security and intelligence
services, are subject only to national law.

DPDPA
Generally applies to public agencies, as well as private parties.

However, the Central Government has the broad authority to exempt any government agency
from any or all provisions in the interest of sovereignty, security, public order, integrity of the
state and friendly relations with foreign states, or for preventing incitement of identifiable
offenses.

Data protection principles

Instead of listing out data protection principles, the DPDPA internalizes principles of
lawfulness, purpose limitation, storage limitation, integrity and confidentiality, and
accountability through its various provisions.

However, the principle of purpose limitation only applies when consent or voluntary use is the
basis for processing personal data. Similarly, the requirement of data minimisation —
collecting only as much information as is necessary for a specified purpose — only applies
where consent is the basis for processing personal data.

Notably, the DPDPA does not impose a general obligation to comply with the principle of
fairness in processing personal data, as required under the GDPR.

Lawful bases

The DPDPA excludes contractual necessity and legitimate interest as grounds for processing
personal data. Consent remains the primary basis for processing, except for certain legitimate
uses, where obtaining consent may not be possible. Such situations include complying with
legal obligations, performance of state functions, complying with judicial orders, responding
to medical emergencies, and maintaining public safety and order.

The act recognizes processing for broadly defined employment purposes as an independent
basis. It also envisions the use of personal data voluntarily provided by a data principal for a
specified purpose, where the data principal does not object to such use. Voluntary use as a basis
is possibly inspired by the deemed consent ground under Singapore's PDPA, where a notice
and consent mechanism may not be practical in transactional settings.

However, the voluntary use basis is much narrower than the legitimate interest grounds for
processing, which is flexible and can be relied on beyond specified purposes, considering
broader commercial interests of the data controller, as long as the individual can reasonably
expect such processing.

Classification of data fiduciaries

Unlike the GDPR, which requires all entities to carry out data protection impact assessments
under specific circumstances, for instance when high-risk processing is involved, the DPDPA

only imposes this requirement on specific data fiduciaries classified as "significant data
fiduciaries." The government may classify data fiduciaries as significant considering the
volume and extent of personal data processed and risks posed to data principals, electoral
democracy, national security and public order.

The GDPR, by default, requires all public bodies and entities carrying out large-scale
processing of sensitive data and systematic monitoring of individuals as their core activity to
appoint a data protection officer. The DPDPA, meanwhile, imposes the requirement to appoint
an India-based DPO only on data fiduciaries that are classified as significant through rules —
likely to include global businesses collecting significant volumes of personal data. While the
GDPR requires the DPO to act independently, the DPDPA requires the DPO to be responsible
to the board of directors or similar governing body of the significant data fiduciary. The act
allows the government to notify additional obligations on significant data fiduciaries, the nature
of which remains unclear.

Scope of rights

While the GDPR and CCPA allow individuals to exercise a broader array of rights, under the
DPDPA, the rights available to data principals are limited to the rights of access, correction,
completion, nomination (such as of a representative to exercise rights in case of death or
incapacity), erasure, consent withdrawal and grievance redressal. Further, rights to access,
correction, completion, and erasure can only be exercised where consent or voluntary use is
the basis for processing personal data.

While the act does not explicitly provide for a right to be forgotten, it is possible the withdrawal
of consent, where consent is the basis for processing, would require the data fiduciary to delete
the personal data collected. The requirement to provide a notice to data principals only applies
when consent is the basis for processing personal data.

Crucially, the right to data portability and the right against solely automated decision-making
are excluded. However, the act does require personal data used to make a decision about a data
principal to be accurate, complete and consistent — which may make it difficult for data
fiduciaries to implement solely automated decision-making processes that could result in
inaccurate or discriminatory results.

Duties of a data principal

Unlike most data laws, the act imposes duties on data principals, against raising frivolous
complaints, impersonating another person and suppressing material information in identifying
oneself, such as during age-verification measures. Additionally, the act requires data principals
to comply with applicable laws.

International data transfers

Unlike the GDPR, which generally restricts data transfers unless a country is deemed adequate,
the DPDPA generally allows data transfers, unless the government restricts such transfers to
specific countries. While the nature of these restrictions remains unclear, they could mean a
stringent ban against transfers to blacklisted countries or soft obligations akin to adequacy-like
arrangements, such as binding corporate rules or standard contractual clauses, for specific
countries. Additionally, sector-specific restrictions on data transfers to regulated entities —
banking and finance, insurance, etc. —may apply as relevant.

Exemptions

The act allows the government to exempt classes of data fiduciaries from its scope, considering
the nature and volume of personal data processed, including startups. This addresses the long-
standing criticism of the GDPR for imposing excessive regulatory costs on small businesses.

The act also exempts processing pursuant to research, archival or statistical purposes, when
carried out in accordance with standards prescribed by the government.

Additionally, except data security requirements, the act exempts data processing carried out
under unique conditions, such as: to ascertain the assets and liabilities of persons who may
have defaulted in payment due on account of a loan or advance taken from a financial institution
(enabling financial institutions and fintech businesses to conduct their business); processing
where it is necessary in the context of mergers and acquisitions approved by a competent
authority in certain circumstances; and, in the context of outsourcing, where the data relates
only to foreign residents and is processed by an Indian data processor on behalf of a foreign
data fiduciary, allowing India to retain its prowess as an outsourcing hub.

Powers of the board

Notably, the Data Protection Board of India, the regulatory body to be formed under the act,
has powers including the ability to carry out inquiries and direct urgent or remedial measures.

However, unlike national supervisory authorities under the GDPR, the board does not have the
power to initiate a proceeding on its own. Similarly, unlike EU supervisory authorities, the
board cannot issue recommendations or codes of conduct, and such prescriptive powers are
retained by the government. While the board is required to act independently, unlike the
structural and functional independence with which EU supervisory authorities operate, the
government exercises considerable control over its composition, powers and functions. This
could have been India's opportunity to strengthen its case for an adequacy finding under the GDPR.

Perhaps again inspired by Singapore’s PDPA, the act allows the board to accept a voluntary undertaking to address any alleged noncompliance by data fiduciaries and bar associated legal proceedings against such data fiduciaries. Such a provision for voluntary undertakings is absent from most global data laws.

Significantly, the board can recommend the government exercise blocking powers against
noncompliant data fiduciaries, restricting access to the data fiduciary's online goods or services,
which could lead to a virtual sales stop.

Enforcement and sanctions

While the GDPR allows member states to impose criminal penalties for certain noncompliance
with data protection law, the act does not impose any criminal penalties. The sanctions are
monetary penalties which, unlike the turnover-based penalties under the GDPR, may extend to
INR 250 crore (approximately USD 30 million) in some cases.

The DPDPA only provides for the imposition of penalties for non-compliances that are
"significant" in nature. In determining the monetary penalty in case of a significant non-
compliance, instead of the turnover of the business, relevant factors to consider include the
nature, gravity and duration of the breach, type and nature of personal data affected by the
breach, and the repetitive nature of the breach, as well as mitigation measures undertaken by
the data fiduciary.

Notably, composite penalties may be imposed under the act for more than one instance of
noncompliance. For example, penalties for failing to undertake reasonable security safeguards
to prevent a personal data breach could add up to the penalty for being noncompliant with
child-related processing obligations.

Notably, the act does not provide for a right to compensation to data principals in case of a
noncompliance with the act.

Contrary to global data laws, the act only applies monetary penalties in case of significant
breaches, but the threshold of what constitutes a "significant" breach remains unclear.

COMPLIANCE WITH DATA PRIVACY FRAMEWORK IN INDIA
Compliance Journey

With the Digital Personal Data Protection Act, 2023 (“DPDP Act”) already notified and its enforcement impending, organizations will need to start putting in place checks and balances and implementing measures to ensure compliance with the DPDP Act. Key steps to be undertaken are laid down below:

Understanding the Law: An organization and its relevant departments/resources must apprise themselves of the applicability and provisions of the DPDP Act. This includes first identifying whether the DPDP Act applies to the organization and, if so, which provisions apply and how.

Gap-Assessment: This involves analysing the life cycle of the personal data sets collected and the processing of personal data within and outside the organization, and thereafter identifying the processes and policies of the organisation that are not in compliance with the provisions of the DPDP Act.

Updating Internal Processes and Policies: Aligning internal processes and policies, privacy
policies, terms and conditions of usage, external personal data sharing procedures and
grievance redressal mechanism to comply with the DPDP Act.

Updating Contracts / Arrangements: Updating/amending existing personal data processing/sharing agreements and other contracts involving personal data to comply with the DPDP Act.

Updating User Interface: Analysing and updating the user interface to ensure that it meets appropriate notice and consent requirements and enables realization of the rights granted to a data principal under the DPDP Act.

Adopting a Consent Mechanism: Implementing a consent management mechanism to collect, maintain, track, and update consent from individuals for personal data collected previously and going forward; a minimal sketch of such a mechanism follows below. An organization should also have a mechanism for disposal of personal data in case the data principal withdraws consent, or once the purpose for which it was collected is fulfilled.
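
For illustration only, the sketch below shows one way such a consent register could be structured in code. The names (ConsentRecord, ConsentRegister) and fields are hypothetical assumptions, not a format prescribed by the DPDP Act; the sketch simply records purpose-specific consent, supports withdrawal, and checks whether processing for a purpose remains permitted.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    principal_id: str                  # internal identifier for the data principal
    purpose: str                       # the specific purpose stated in the notice
    notice_version: str                # which notice text the consent refers to
    given_at: datetime                 # when affirmative consent was captured
    withdrawn_at: Optional[datetime] = None


class ConsentRegister:
    """Tracks purpose-specific consent and its withdrawal (illustrative only)."""

    def __init__(self):
        self._records = []

    def record_consent(self, principal_id, purpose, notice_version):
        rec = ConsentRecord(principal_id, purpose, notice_version, datetime.now(timezone.utc))
        self._records.append(rec)
        return rec

    def withdraw(self, principal_id, purpose):
        # Withdrawal should be as easy as giving consent; here it is a single call.
        for rec in self._records:
            if rec.principal_id == principal_id and rec.purpose == purpose and rec.withdrawn_at is None:
                rec.withdrawn_at = datetime.now(timezone.utc)

    def may_process(self, principal_id, purpose):
        # Processing is permitted only where live (non-withdrawn) consent exists for that purpose.
        return any(r.principal_id == principal_id and r.purpose == purpose and r.withdrawn_at is None
                   for r in self._records)

On this sketch, any purpose not recorded in the register simply fails the may_process check, which in practice would force a fresh notice-and-consent cycle before that processing begins.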

Implementation of Adequate Grievance Redressal Processes: Deploying accessible and effective mechanisms to handle complaints of data principals.

Protecting Personal Data: Establishing and maintaining reasonable technical and organisational security measures to protect personal data, such as encryption and monitoring of data processing.

Implement Data Breach Response: Installing pre-breach measures to prevent personal data breaches, as well as post-breach measures such as containment and remedial procedures. Organizations should also have a prompt notification mechanism for informing the Data Protection Board and data principals of any personal data breach.

Employee Sensitization: Conduct awareness programs for employees to enable them to distinguish between personal and non-personal data, understand internal processes and breach measures, and become familiar with the various compliance requirements under the DPDP Act.

Additional Obligations in case you are a ‘Significant Data Fiduciary’:

Appointing a Data Protection Officer: A significant data fiduciary is required to appoint a Data Protection Officer (“DPO”) who will be responsible for compliance with the DPDP Act. A DPO will also serve as a point of contact for the grievance redressal mechanism for data principals.

Appointing an Independent Auditor: Significant data fiduciaries are also required to appoint an independent auditor for conducting a ‘periodic’ audit of all the processes and policies of the data fiduciary to ensure that the data fiduciary is in compliance with the DPDP Act.

Data Impact Assessment: Significant data fiduciaries are required to undertake a data impact assessment which will assess (i) the manner in which the personal data is processed; (ii) the
purpose of processing personal data; (iii) the risk and harm in relation to the processing of
personal data and the measures for managing the risk and harm; and (iv) any other process
prescribed.

Implementation Of a Data Protection Program

A data fiduciary is required to have reasonable safeguards for protection of personal data as
per the DPDP Act. A data fiduciary should have the following systems in place for implementing safeguards:

• Understanding the life cycle of data processing in your organisation and installing organisational and technological safeguards at every stage.
• Conducting tests to identify and assess internal risks that can lead to a potential personal data breach.
• Installing technological solutions for mitigation of identified risks of any personal data breach.
• Framing a security protocol specifying a phased plan of action for addressing a personal data breach.
• This security protocol should specify the roles of different individuals, including a data protection officer in the case of a significant data fiduciary.
• Ensuring that employees only have access to personal data on a need-to-know basis.
• Installing an authentication process to ensure that only permitted individuals have access to personal data.
• Pseudonymization of personal data to limit access to identifiable personal data of data principals (a simple illustration follows this list).
• Sensitising your employees as to which incidents would be construed as a personal data breach under the DPDP Act and how to handle them so as to prevent such a breach.
• Ensuring that the privacy policies and terms of service of the data fiduciary are updated as per the applicable laws.
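
The pseudonymization item above can be illustrated with a minimal sketch: direct identifiers are replaced with a keyed pseudonym, and the table linking pseudonyms back to identities is held separately under stricter access controls. The key handling and names here are assumptions for illustration, not a prescribed technique.

import hashlib
import hmac

SECRET_KEY = b"example-key-held-by-a-restricted-service"  # hypothetical; real keys need proper management


def pseudonymize(identifier: str) -> str:
    # Derive a stable pseudonym from a direct identifier (e.g., an email address).
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()


# The working data set holds only pseudonyms, not direct identifiers.
record = {
    "principal_ref": pseudonymize("asha@example.com"),
    "city": "Bhopal",
    "purchase_total": 1499,
}

# The re-identification table is stored and governed separately, on a need-to-know
# basis, so staff working with `record` never see the underlying identity.
reidentification_table = {record["principal_ref"]: "asha@example.com"}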

How To Undertake Data Protection Impact Assessments


A Data Protection Impact Assessment is an assessment process to help identify and minimize the risks in a business model or a project. It is imperative to implement this process to determine the acceptable levels of risk for a particular project before undertaking it.
If the risks in a project are not acceptable and pose a threat to the rights of data principals, a significant data fiduciary should then employ measures to minimize the risk.

Data Impact Assessment as per the DPDP Act:


Significant Data Fiduciary
As per the DPDP Act, a significant data fiduciary is obligated to undertake a Data Protection Impact Assessment (DPIA). However, the DPDP Act does not define who would qualify as a significant data fiduciary; it merely provides that entities notified by the government, based on certain factors, will be considered significant data fiduciaries.
The DPDP Act provides that an entity should conduct the data protection impact assessment process in the following manner:
• Assessment of the manner in which the personal data is processed;
• Assessment of the purpose of processing personal data;
• Assessment of potential harm in relation to the processing of personal data;
• Assessment of the measures for managing the risk to the rights of data principals; and
• The central government has been empowered to prescribe further matters or other steps
as part of the data impact assessment.
A significant data fiduciary is required to conduct this data impact assessment periodically; however, the DPDP Act does not specify the timelines, and the same will be prescribed subsequently by the government. An illustrative record structure for documenting such an assessment is sketched below.
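
Purely as an illustration of the factors listed above, a DPIA could be documented as a structured record along the following lines. The field names are hypothetical assumptions; the actual form and periodicity of the assessment will follow from the rules to be prescribed.

dpia_record = {
    "processing_activity": "Customer onboarding via mobile app",
    "manner_of_processing": ["collection via web form", "cloud storage", "sharing with a KYC vendor"],
    "purpose": "Identity verification and account creation",
    "personal_data_categories": ["name", "contact details", "government ID number"],
    "risks_to_data_principals": [
        {"risk": "unauthorised access to ID documents", "likelihood": "medium", "severity": "high"},
    ],
    "risk_management_measures": ["encryption at rest", "vendor contract with security clauses",
                                 "need-to-know access controls"],
    "residual_risk_acceptable": True,   # if False, further mitigation is needed before processing
    "next_review_due": "2025-12-31",    # periodic review; the interval is to be prescribed by rules
}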

Importance of Data Protection Officers In Organizations

Data Protection Officer: The DPDP Act requires a significant data fiduciary to appoint a data protection officer. As per the act, a DPO must be based in India and is tasked with overseeing data protection activities and ensuring compliance. A DPO has the following responsibilities:

• A DPO is responsible to the board of directors or any other such governing body of a
significant data fiduciary. Thereby, a DPO reports solely to the governing body of the
significant data fiduciary.
• A DPO will also be responsible to external authorities notified under the DPDP Act for
meeting the regulatory requirements under the act.
• A DPO is also responsible for representing the significant data fiduciary under the DPDP Act; thus, the DPO is also a single point of contact who should be aware of all internal procedures and policies, such as those for storing, protecting, retaining and disposing of personal data.
• A DPO must be involved in identifying the processes and the manner in which personal
data is being stored and whether the same is in compliance with the DPDP Act.

• A DPO must be consulted by the significant data fiduciaries to understand the ways to
mitigate risks of a personal data breach and confirm whether the established measures
are appropriate.
• A DPO must ensure that data impact assessment has been undertaken in an efficient
manner and the significant data fiduciary has adopted the necessary measures identified
in the process.
• A DPO is also the first point of contact for grievance redressal for data principals, who may raise with the DPO all issues related to the processing of their personal data and the exercise of their rights under the DPDP Act.

It can be concluded that a DPO plays a central role in addressing concerns of all the
stakeholders.

Mechanism For Consent Management

• A data fiduciary is required to take consent of past and present data principals for
processing personal data. Thus, the data fiduciaries should ensure that they have notice-
consent mechanisms in place to obtain such consent.
• This will require a data fiduciary to contact each data principal, and thereby, it should
have automated systems in place to contact and process consent from data principals.
• Any consent taken is restricted to the purpose defined in a notice by the data fiduciaries.
For any purposes other than the ones provided in the notice, an organization should take
a separate consent from the data principal.
• A data fiduciary should ensure that it has notice templates in place enabling it to obtain ‘free’, ‘specific’, ‘informed’, ‘unconditional’ and ‘unambiguous’ consent from the data principal.
• Further, consent should be accompanied by an affirmative action on part of the data
principal to signify that consent has been given freely by the data principal. This can
include checking a box or completion of an OTP verification process or clicking on ‘I
agree’ tab.
• Every notice for consent should also be accessible in languages specified in the Eighth
Schedule to the Indian Constitution. Thus, a data fiduciary should provide drafts of all
the notices in the scheduled languages.

• The consent withdrawal process should be as easy as it was to provide consent in the first place. Thus, if consent could be provided with a single click, such consent should be similarly retractable.
• Once the purpose specified in a consent notice is fulfilled or becomes redundant, data fiduciaries should have mechanisms to dispose of the data unless its retention is required by applicable laws (a simple sketch of such a disposal check follows this list).
• A data fiduciary should also have appropriate mechanisms to ensure that data principals can exercise their rights, such as updation, completion, erasure and other rights associated with personal data.
• A data fiduciary should also have systems in place to ensure that it can co-ordinate with the consent managers registered under the DPDP Act, in order to enable a data principal to give, manage, review or withdraw consent for processing personal data.
• A data fiduciary should ensure that all procedures for consent management adhere to timelines for obtaining consent, disposing of data and complying with the requests of data principals.
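
The disposal check referenced above could, under stated assumptions, look like the following sketch: a record is erased once consent is withdrawn or the specified purpose has been served, unless an applicable law still requires retention. The retention register and field names are hypothetical.

from datetime import date

LEGAL_RETENTION_UNTIL = {            # illustrative statutory retention obligations
    "tax_invoice": date(2032, 3, 31),
}


def should_erase(record: dict, today: date) -> bool:
    if record.get("consent_withdrawn") or record.get("purpose_served"):
        hold_until = LEGAL_RETENTION_UNTIL.get(record.get("retention_category", ""))
        # Erase unless an applicable law still requires the data to be kept.
        return hold_until is None or today > hold_until
    return False


# Example: the purpose is served and no statutory hold applies, so the record qualifies for erasure.
print(should_erase({"purpose_served": True, "retention_category": "marketing"}, date(2025, 6, 1)))  # True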

Mechanism for Protection of Data Principal Rights


• The data fiduciaries should be able to provide a summary of the data principal’s data
that has been or is being processed.
• The data fiduciaries should have appropriate grievance redressal mechanisms to give
data principals a right to register a grievance effectively and obtain a resolution for the
same.
• The data fiduciaries should also provide an option to data principals to withdraw their
consent.
• The data fiduciaries should have appropriate mechanisms to correct inaccuracies in the personal data of data principals and to comply with requests of data principals to erase the data processed by them.
• The data principals have the right to nominate another individual to exercise their rights
on their behalf in the event of death or incapacity, which should be duly facilitated by
the data fiduciaries.
• The data fiduciaries should also have the ability to readily share details of data
processors that have access to personal data of data principals.

• The data fiduciaries should also have a proper data disposal mechanism for disposal of
data that is no longer permissible to be stored under the DPDP Act.
• The data fiduciaries should also have effective protection and breach mitigation
mechanism.
• The data fiduciaries should also have a contract with appropriate safeguards before
sharing or transferring data to another entity.

Breach Management

A data fiduciary is required to ensure that it (and any data processors engaged on its behalf) adopts reasonable security safeguards to prevent data breaches.

Breach Management: A data fiduciary is responsible for implementing reasonable safeguards to prevent data breaches. This includes ensuring that the data fiduciary takes appropriate technical and organisational measures to ensure effective compliance.

Breach Reporting:

• Notice to Data Principals: In the event of a breach, a data fiduciary must have appropriate mechanisms to inform each affected data principal of the breach of their personal data and the nature of the personal data affected.
• Notice to the Data Protection Board: A data fiduciary is also required to notify the Data Protection Board (‘DPB’) that there has been a breach of personal data. The DPB has powers to instruct a data fiduciary to take measures to mitigate a personal data breach.
• Time Period: The form and manner of reporting are to be notified by way of rules to be issued. A simple internal notification-record sketch follows this list.
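
As a purely illustrative sketch, an internal breach-notification record could capture the details needed to prepare both notices referenced above. Field names are hypothetical; the actual form, content and timelines will be set by rules under the DPDP Act.

from datetime import datetime, timezone

breach_record = {
    "detected_at": datetime(2025, 4, 2, 10, 15, tzinfo=timezone.utc),
    "description": "Misconfigured storage bucket exposed customer records",
    "personal_data_affected": ["name", "email address", "phone number"],
    "affected_principal_ids": ["P-1042", "P-2210"],   # drives the individual notices
    "containment_steps": ["bucket access revoked", "credentials rotated"],
    "notified_data_principals_at": None,              # filled in when notices are sent
    "notified_board_at": None,                        # filled in when the DPB is informed
}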

Data Breach Mitigation Measures:

• Training its employees to increase their security awareness, such as by making them acquainted with the internal security procedures and policies.
• A data fiduciary should also maintain offline data backups to prevent data loss and
recover quickly in case of a breach.

• It should also conduct mock incident response exercises to reduce costs and breach containment time.
• It should have a response team which is well versed in response protocols so that it can
immediately control and contain data breach.
• It should also have a notification template as per the format to be notified under the
DPDP Act to intimate the affected data principals and the data protection board.
• It should also have a mechanism in place to identify and fix vulnerabilities in the system
that were exposed in a personal data breach.

Challenges in Compliance

Implementation Timeline:

The timeline for implementation is regarded as very short, given that compliance with the DPDP Act requires overhauling of existing organizational, business and technical structures. Furthermore, the government is yet to notify detailed rules for compliance with the DPDP Act.

Increased Operation Cost:

With the promulgation of the DPDP Act, organizations will incur increased costs due to the review and modification of their policies and procedures, conducting awareness programs, implementing mechanisms for tracking and deletion of personal data, and complying with the consent requirements and other requirements under the DPDP Act.

Ensuring compliance by Data Processors:

As per the DPDP Act, Data Fiduciary shall, irrespective of any agreement to the contrary or
failure of a Data Principal to carry out the duties provided under this Act, be responsible for
complying with the provisions of this Act and the rules made thereunder in respect of any
processing undertaken by the Data Processor for it.

Processing of Children’s personal data:

Under the DPDP Act, personal data of minors (person under 18 years of age) cannot be
processed without ‘verifiable consent’ of their guardians. Thus, the DPDP Act has restricted
access to a key demographic which forms a huge consumer base for data fiduciaries. Further,
installing mechanisms to verify the age of individuals will also increase the operational cost
for a data fiduciary.

Lack of Guidance/Clarity:

The DPDP Act in its present form has left several queries unanswered, such as the categorization of significant data fiduciaries and the interpretation of terms such as ‘verifiably safe’ processing of personal data of children. Several other clarifications are expected to come through rules to be notified; however, as of now the DPDP Act does not provide detailed guidance for compliance with its provisions.

PRIVACY GOVERNANCE

By: Ron De Jesus, CIPP/A, CIPP/C, CIPP/E, CIPP/US, CIPM, CIPT, FIP

Building a strong privacy program starts with establishing the appropriate governance of the
program. The term privacy governance will be used here to generally refer to the components
that guide a privacy function toward compliance with privacy laws and regulations and enable
it to support the organization’s broader business objectives and goals. These components
include:

• Creating the organizational privacy vision and mission statement

• Defining the scope of the privacy program

• Selecting an appropriate privacy framework

• Developing the organizational privacy strategy

• Structuring the privacy team

Privacy professional is a general term used to describe a member of the privacy team
who may be responsible for privacy program framework development, management and
reporting within an organization.

2.1 Create an Organizational Privacy Vision and Mission Statement

The privacy vision or mission statement of an organization is critically important and the key
factor that lays the groundwork for the rest of the privacy program. The privacy vision should
align with the organization’s broader purpose and business objectives and be refined with
feedback from key partners. It is typically composed of a few short sentences that succinctly
describe the privacy function’s raison d’être. Alternatively, privacy can be covered in an
organization’s overall mission statement or code of conduct.

A privacy mission statement describes the purpose and ideas in just a few sentences. It
should take less than 30 seconds to read.

As Herath states, “In just a few clear sentences, it communicates to stakeholders across all
your different lines of business—from legal to human resources to sales and marketing—where
the organization stands on privacy … your customers and partners and the auditors and

regulators with whom you deal need to feel confident that they understand how your privacy
policies and procedures will affect them, that you are meeting any legal requirements and that
you are protecting their interests.” 1 The following examples illustrate some variations in
privacy vision statements.

2.1.1 Stanford University

The Stanford University Privacy Office works to protect the privacy of university,
employee, patient, and other confidential information. Our office helps to ensure the
proper use and disclosure of such information, as well as, foster a culture that values
privacy through awareness. The Privacy Office provides meaningful advice and
guidance on privacy “Best Practices” and expectations for the University community.2

2.1.2 Microsoft

At Microsoft, our mission is to empower every person and every organization on the
planet to achieve more. We are doing this by building an intelligent cloud, reinventing
productivity and business processes and making computing more personal. In all of this,
we will maintain the timeless value of privacy and preserve the ability for you to control
your data.

This starts with making sure you get meaningful choices about how and why data is
collected and used, and ensuring that you have the information you need to make the
choices that are right for you across our products and services.

We are working to earn your trust every day by focusing on six key privacy principles:

• Control: We will put you in control of your privacy with easy-to-use tools and
clear choices.
• Transparency: We will be transparent about data collection and use so you can
make informed decisions.
• Security: We will protect the data you entrust to us through strong security and
encryption.
• Strong legal protections: We will respect your local privacy laws and fight for
legal protection of your privacy as a fundamental human right.
• No content-based targeting: We will not use your email, chat, files or other
personal content to target ads to you.

• Benefits to you: When we do collect data, we will use it to benefit you and to
make your experiences better.3

2.1.3 International Conference of Data Protection and Privacy Commissioners

The Conference’s vision is an environment in which privacy and data protection authorities around the world are able effectively to fulfil their mandates, both individually and in concert, through diffusion of knowledge and supportive connections. This vision is part of a Conference strategic plan that also includes a mission statement, strategic priorities and an action plan.4

2.1.4 An Coimisiún um Chosaint Sonraí | Data Protection Commission

Protecting data privacy rights by driving compliance through guidance, supervision and
enforcement.5

2.1.5 Information Commissioner’s Office (ICO)

Mission: To uphold information rights for the UK public in the digital age.

Vision: To increase the confidence that the UK public have in organisations that process
personal data and those which are responsible for making public information available.

Strategic Goals

1. To increase the public’s trust and confidence in how data is used and made
available.

2. Improve standards of information rights practice through clear, inspiring and targeted engagement and influence.

3. Maintain and develop influence within the global information rights regulatory
community.

4. Stay relevant, provide excellent public service and keep abreast of evolving
technology.

5. Enforce the laws we help shape and oversee.6

2.1.6 Data Protection Authority

The Authority’s vision (Belgium)

In its reflections and activities the Authority aims to safeguard the balance between the
right to privacy protection and other fundamental rights.7

2.2 Define Privacy Program Scope

After establishing a privacy mission statement and vision, you’ll need to define the scope of
the privacy program. Every organization has its own unique legal and regulatory compliance
obligations, and you’ll need to identify the specific privacy and data protection laws and
regulations that apply to it. A typical approach to identifying scope includes the following two
steps:

1. Identify the personal information collected and processed

2. Identify in-scope privacy and data protection laws and regulations

2.2.1 Identify the Personal Information Collected and Processed


The first step in gaining assurance that you are complying with your regulatory obligations is
to know what personal information your organization collects, uses, stores and otherwise
processes. There are several ways to ascertain this. Initially, you can take a less structured
approach to identifying where data lives throughout the organization by setting up information-
gathering interviews with the typical functions that usually collect, use, store and otherwise
process personal information—human resources (HR), marketing, finance, and IT/information
security. Taking a lighter touch can at least help you to determine the general categories and
locations of personal information, which will be key pieces of data for the next step.
A more robust approach includes engaging an outside consultancy to assess where personal
information is collected, stored, used and shared, or engaging other internal resources (e.g.,
internal audit) to assist the privacy team with the discovery. This more structured exercise of
identifying data throughout the data lifecycle drives the development of more accurate and
detailed data inventories, maps and other helpful documentation. Further, since maintaining
written documentation about personal information (including information about how the
organization processes the data, the categories of individuals impacted, and the recipients of
data) has become formalized through Article 30 of the EU General Data Protection Regulation
(GDPR), organizations that are subject to the GDPR should consider this more thorough,
holistic approach to the initial personal information identification efforts. Chapter 4 covers
developing data inventories, conducting data mapping and establishing these “records of
processing” in further detail.
Some key questions that should be asked to help define the scope of the privacy program are set out below (a simple data-inventory sketch follows the list):

• Who collects, uses and maintains personal information relating to individuals, customers and employees? In addition to your own legal entity, this group includes your service providers—so you need to understand these roles and obligations too.

• What types of personal information are collected and what is the purpose of
collection?

• Where is the data stored physically?

• To whom is the data transferred?

• When (e.g., during a transaction or hiring process) and how (e.g., through an
online form) is the data collected?

• How long is data retained and how is it deleted?

• What security controls are in place to protect the data?
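
One simple way to capture the answers to these questions, sketched below under hypothetical field names, is a data-inventory entry per processing activity; organizations subject to the GDPR typically extend such entries toward the Article 30 records of processing mentioned above.

inventory_entry = {
    "processing_activity": "Employee payroll",
    "business_owner": "Human Resources",
    "data_subjects": ["employees"],
    "personal_data_elements": ["name", "bank account", "salary", "tax ID"],
    "purpose": "Salary payment and statutory reporting",
    "collection_point": "HR onboarding form",
    "storage_location": "Payroll SaaS (hosted data center)",
    "recipients": ["payroll provider", "tax authority"],
    "retention_period": "7 years after employment ends",
    "deletion_method": "Automated purge from payroll system",
    "security_controls": ["encryption at rest", "role-based access"],
}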

2.2.2 Identify In-Scope Privacy and Data Protection Laws and Regulations
After you have identified key metadata about the personal information your organization
collects (e.g., the specific data elements collected, from whom, where it’s stored), the next step
is to identify the organization’s privacy obligations related to that data. Most global
organizations are subject to many data protection and privacy laws—and some personal
information collected and processed may be subject to more than one regulation. For example,
a healthcare services company may be subject to domestic regulations governing the handling
of personal health information. The company may also handle financial transactions and
therefore be subject to financial reporting regulations as well. Even further, organizations that
offer services to individuals located in other countries or that have locations overseas will likely
be subject to global privacy obligations. Since no two entities are alike, you will need to
determine the true scope for your situation.

If your organization plans to do business within a jurisdiction that has inadequate or no data protection regulations, institute your organization’s requirements, policies and procedures instead of reducing them to the level of the country in which you are doing business. Choose the most restrictive policies—not the least restrictive.

2.2.3 Scope Challenges


Determining the scope of your privacy program can be challenging, regardless of whether the
program is domestic or global. Purely domestic privacy programs may need to monitor only
state and/or regional laws, while global programs will need to be cognizant of cultural norms,

differences and approaches to privacy protection. A key example is the U.S. versus EU
approach. The former takes a limited sectoral approach, with laws that apply to specific
industry sectors or categories of data, like the Health Insurance Portability and Accountability
Act (HIPAA), the Gramm-Leach-Bliley Act (GLBA), and the Children’s Online Privacy
Protection Rule (COPPA). The latter takes a comprehensive approach (i.e., EU GDPR
generally applies to all personal data, regardless of sector).
In addition to Europe, key Asian countries, the United States, Canada and Australia have
either enacted some form of data protection legislation or are in the process of establishing
their own laws, often modeling their laws on those established by the EU.
These laws may apply to your organization, whether it is located and operates in a particular
country itself or just transfers personal information from that country to its home location.
Determining the applicability of these laws to your organization is a key responsibility of the
privacy professional, as is ensuring that relevant laws, regulations and other factors are
considered from the start of the privacy program and throughout its lifecycle. Table 2-1
describes the different privacy approaches by jurisdiction, and Chapter 3 details, by
jurisdiction, the specific privacy and data protection laws, regulations and frameworks you
should understand.

Table 2-1: Sample Approaches to Privacy and Data Protection around the Globe

Country: United States
Protection Model: Sectoral Laws
Approach to Privacy Protection: Enactment of laws that specifically address a particular industry sector, such as:
• Financial transactions
• Credit records
• Law enforcement
• Medical records
• Communications

Country: EU member states, Canada
Protection Model: Comprehensive Laws
Approach to Privacy Protection: Govern the collection, use and dissemination of personal information in public and private sectors, with an official oversight enforcement agency that:
• Remedies past injustices
• Promotes electronic commerce
• Ensures consistency with pan-European laws

Country: Australia
Protection Model: Co-Regulatory Model
Approach to Privacy Protection: Variant of the comprehensive model, where industry develops enforcement standards that are overseen by a privacy agency

Country: United States, Japan, Singapore
Protection Model: Self-Regulated Model
Approach to Privacy Protection: Companies use a code of practice developed by a group of companies known as industry bodies. The Online Privacy Alliance (OPA), TrustArc (formerly TRUSTe), BBBOnline and WebTrust are examples of this type of model.
Organizations operating in the United States face domestic privacy challenges that include
determining whether your organization constitutes an entity that is subject to a law or industry
standard that regulates data or the collection of data from certain individuals. “Financial
institutions,” as defined by the Gramm-Leach-Bliley Act, are subject to GLBA.8 Certain types
of organizations and entities known as “covered entities,” such as healthcare providers (e.g.,
hospitals, clinics, pharmacies) and health plans (e.g., medical plans, organization benefit plans)
are subject to HIPAA.9 Websites collecting information from children under the age of 13 are
required to comply with the Federal Trade Commission’s (FTC’s) COPPA.10 A merchant of
any size that handles cardholder information for debit, credit, prepaid, e-purse, and ATM and
point of sale (POS) cards must follow the Payment Card Industry Data Security Standard (PCI
DSS), which is a global standard.11
As the name suggests, PCI DSS is an industry security standard, not a law, but it still imposes
certain data protection requirements on organizations, as well as certain notification obligations
in the event of breaches; some U.S. states have adopted PCI DSS as part of legislated
requirements.
Domestic U.S. privacy challenges also extend from federal laws and regulations to the states;
up to 46 states now have data breach notification laws.12 Accordingly, if you process the
personal information of any resident of a state that has adopted a breach notification law,
understand that to the extent that nonencrypted data has been compromised, your compliance
obligations may include notifying the residents of the state as well as government bodies or
state attorneys general offices.
Outside of the United States, many countries put more stringent privacy requirements on the
government than on the private sector and impose separate requirements on certain industry

sectors (e.g., telecommunications companies face stricter record-keeping requirements) or on
data collected on employees.
Appropriately scoping your organization’s privacy program is a challenging exercise. A
successful approach requires:

• Understanding of the end-to-end personal information data lifecycle

• Consideration of the global perspective in order to meet legal, cultural and personal expectations

• Customizing of privacy approaches from both global and local perspectives

• Awareness of privacy challenges, including translations of laws and regulations and enforcement activities and processes

• Monitoring of all legal compliance factors for both local and global markets

2.3 Develop and Implement a Framework

Once you’ve determined which laws apply, you must design a manageable approach to
operationalizing the controls needed to handle and protect personal information. Implementing
and managing a program that addresses the various rights and obligations of each privacy
regulation on a one-off basis is a nearly impossible task. Instead, using an appropriate privacy
framework to build an effective privacy program can:

• Help achieve material compliance with the various privacy laws and regulations in-scope for your organization

• Serve as a competitive advantage by reflecting the value the organization places on the protection of personal information, thereby engendering trust

• Support business commitment and objectives to stakeholders, customers, partners and vendors

2.4 Frameworks

The term framework is used broadly for the various processes, templates, tools, laws and
standards that may guide the privacy professional in privacy program management. Privacy
frameworks began emerging in the 1970s. They can be broadly grouped into three categories:
principles and standards; laws, regulations and programs; and privacy program management
solutions. Examples include:

2.4.1 Principles and Standards
Fair Information Practices provide basic privacy principles central to several modern
frameworks, laws and regulations.13 Practices and definitions vary across codifications: rights
of individuals (notice, choice and consent, data subject access), controls on information
(information security, information quality), information lifecycle (collection, use and retention,
disclosure), and management (management and administration, monitoring and enforcement).
The Organisation for Economic Co-operation and Development (OECD) Guidelines on
the Protection of Privacy and Transborder Flows of Personal Data are the most widely
accepted privacy principles; together with the Council of Europe’s Convention 108, they are
the basis for the EU Data Protection Directive and the GDPR.14
The American Institute of Certified Public Accountants (AICPA) and Canadian Institute of
Chartered Accountants (CICA), which have formed the AICPA/CICA Privacy Task Force,
developed the Generally Accepted Privacy Principles (GAPP) to guide organizations in
developing, implementing and managing privacy programs in line with significant privacy laws
and best practices.15
The Canadian Standards Association (CSA) Privacy Code became a national standard in
1996 and formed the basis for the Personal Information Protection and Electronic Documents
Act (PIPEDA).16
The Asia-Pacific Economic Cooperation (APEC) Privacy Framework enables Asia-
Pacific data transfers to benefit consumers, businesses and governments.17
Binding corporate rules (BCRs) are legally binding internal corporate privacy rules for
transferring personal information within a corporate group. Article 47 of the GDPR lists
requirements of BCRs (e.g., application of GDPR principles).18 Under the GDPR, BCRs must
be approved by the competent supervisory authority.
The European Telecommunications Standards Institute (ETSI) is an independent, not-
for-profit, standardization organization in the telecommunications industry and produces
globally applicable standards for information and communications technologies, including
fixed, mobile, radio, converged, broadcast and internet technologies.19

2.4.2 Laws, Regulations and Programs


The Canadian PIPEDA provides a well-developed and current example of generic privacy principles implemented through national law.20
EU data protection legislation includes the GDPR, which offers a new framework for data
protection with increased obligations for organizations and far-reaching effects.21
The EU-U.S. Privacy Shield was officially adopted in 2016 by the European Commission
and establishes a cross-border data transfer mechanism between the two regions that replaces
the previous Safe Harbor Framework.22
HIPAA is a U.S. law passed to create national standards for electronic healthcare
transactions, among other purposes.23 HIPAA required the U.S. Department of Health and
Human Services to promulgate regulations to protect the privacy and security of personal health
information. The basic rule is that patients must opt in before their information can be shared
with other organizations—although there are important exceptions, such as for treatment,
payment and healthcare operations.
Local data protection authorities, such as France’s Commission nationale de
l’informatique et des libertés (CNIL), provide guidance on legal frameworks.24

2.4.3 Privacy Program Management Solutions


Privacy by design (PbD) calls for privacy to be taken into account throughout the whole
product engineering process to ensure consideration of consumers’ privacy protections.25 This
approach includes reasonable security for consumer data, limited collection and retention of
such data, and reasonable procedures to promote data accuracy.
The European Union Agency for Network and Information Security (ENISA) provides
recommendations on cybersecurity, supports policy development and its implementation, and
collaborates with operational teams throughout Europe.26
The National Institute of Standards and Technologies (NIST) has published An
Introduction to Privacy Engineering and Risk Management in Federal Systems, introducing
concepts of privacy engineering and risk management for federal systems, including a common
vocabulary to facilitate better understanding and communication of privacy risk within federal
systems and effective implementation of privacy principles.27 NIST also published the
Framework for Improving Critical Infrastructure Cybersecurity Version 1.1.28 The framework
enables organizations—regardless of size, degree of cybersecurity risk, or cybersecurity
sophistication—to apply the principles and best practices of risk management to improving
security and resilience. It provides a common organizing structure for multiple approaches to
cybersecurity by assembling standards, guidelines and practices that are working effectively
today.
The different frameworks have varying objectives based on business needs, commercial
grouping, legal/regulatory aspects, and government affiliations. The privacy questions most
frameworks answer primarily include:

• Are privacy and the organization’s privacy risks properly defined and identified
in the organization?

• Has the organization assigned responsibility and accountability for managing a
privacy program?

• Does the organization understand any gaps in privacy management?

• Does the organization monitor privacy management?

• Are employees properly trained?

• Does the organization follow industry best practices for data inventories, risk
assessments and privacy impact assessments (PIAs)?

• Does the organization have an incident response plan?

• Does the organization communicate privacy-related matters and update that
material as needed?

• Does the organization use a common language to address and manage
cybersecurity risk based on business and organizational needs?

2.4.4 Rationalizing Requirements


Once an organization decides on a framework or frameworks, it will be easier to organize the
approach for complying with the plethora of privacy requirements mandated by the laws and
regulations that are applicable to it. One option is to rationalize requirements, which essentially
means identifying the obligations that different laws have in common and implementing a single
solution that materially addresses them. This activity is made simpler
by several factors. First, at a high level, most data privacy legislation imposes many of the same
types of obligations on regulated entities, and much of this regulation requires entities to offer
similar types of rights to individuals. Among these shared obligations and rights, data
protection regulations typically include: notice, choice, consent, purpose limitations, limits on
retaining data, individual rights to access, correction and deletion of data, and the obligation to
safeguard data—duties that are generally covered by the privacy frameworks previously
identified. Further, there seems to be a growing consensus among data protection regulators
and businesses on the actions and activities that meet these regulatory obligations.
Note that a rationalized approach to creating a privacy strategy also necessitates addressing
requirements that fall outside of the common obligations (often termed outliers) on a case-by-
case basis. Outliers result when countries’ local laws exceed the requirements of national law,
or when countries have industry- or data-specific requirements.

For example, rationalizing the common legal obligation of providing individuals with a right
of access to their personal information means the organization must also identify the time
frames within which data must be provided to individuals per applicable privacy law. In the
EU, as a result of GDPR, prescribed time frames within which an organization must provide
access to individuals (e.g., employees, consumers) now exist. In countries where no legal
requirements exist (and the granting of access may be merely an organization policy), or where
there is a generous amount of time extended to provide data, the organization can adopt a
procedure that sets a common time period within which data must be provided. A rationalized
approach that seeks to address both sets of requirements would result in the organization
establishing a standard access process that generally meets the demands of many countries,
with a local process that meets specific time frame requirements for individuals in EU countries
only.
Another approach organizations employ, when possible, is to look to the strictest standard
when seeking a solution, provided it does not violate any data privacy laws, exceed budgetary
restrictions, or contradict organization goals and objectives. This approach is used more
frequently than most organizations realize. In the example above, rather than responding to
access requests of only EU-based individuals within a
30-day period, the organization would provide all individuals globally with access to their data
within a prescribed, GDPR-compliant time frame. Other examples are shredding everything
versus shredding only documents that contain personal or confidential information, or rolling
out laptop encryption for the entire employee population as opposed to targeting only
individuals who may perform functions that involve personal information.
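
To make the rationalized access-request example above concrete, the short Python sketch below
shows one way an organization might record per-jurisdiction response deadlines and apply the
strictest-standard rule so that a single global process satisfies every jurisdiction in scope. The
jurisdiction names, the 30-day operational target for the GDPR’s one-month deadline, and the
default policy figure are illustrative assumptions, not statements of the law.

# Hypothetical strictest-standard calculation for access-request deadlines.
# All figures are illustrative assumptions; confirm actual time frames against
# the applicable law before relying on them.
ACCESS_DEADLINES_DAYS = {
    "EU (GDPR)": 30,     # assumed operational target for the GDPR's one-month rule
    "Country A": 45,     # assumed local statutory deadline
    "Country B": None,   # no legal requirement; organization policy applies
}

DEFAULT_POLICY_DAYS = 45  # organization's own policy where no law applies


def rationalized_deadline(jurisdictions):
    """Return the single deadline a global access-request process should meet.

    Strictest-standard approach: take the shortest deadline among the
    jurisdictions in scope, substituting the organization's policy where
    no legal deadline exists.
    """
    deadlines = [ACCESS_DEADLINES_DAYS.get(j) or DEFAULT_POLICY_DAYS
                 for j in jurisdictions]
    return min(deadlines)


print(rationalized_deadline(["EU (GDPR)", "Country A", "Country B"]))  # prints 30

Run against all three hypothetical jurisdictions, the function returns 30, so a rationalized
program would answer every access request globally within that period, while the per-jurisdiction
table remains available for the outlier cases discussed above.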

2.5 Privacy Tech and Governance, Risk and Compliance Vendors and Tools

Some organizations choose to use privacy tech vendors to help them achieve compliance within
their selected privacy program framework. Privacy tech vendors offer a range of solutions,
from assessment management to data mapping, de-identification and incident response.
Note that a product or solution is not, in and of itself, compliant. When deployed as part of a
properly thought-out privacy program, the solution is a tool that assists the organization with
GDPR and multijurisdictional and regulatory compliance requirements.

2.5.1 Categories of Privacy Tech Vendors


Privacy tech vendors in the category of privacy program management typically work directly
with the privacy office. Vendors may manage:

• Assessment

• Consent

• Data mapping

• Incident response

• Privacy information

• Website scanning/cookie compliance

Enterprise program management services require buy-in from the privacy office, IT and C-
suite. Services include:

• Activity monitoring

• Data discovery

• De-identification/pseudonymization

• Enterprise communications29

2.5.2 Governance, Risk and Compliance Tools


According to a recent survey completed by the IAPP and EY, at least 35 percent of privacy
professionals surveyed use governance, risk and compliance (GRC) tools as part of their privacy
framework.30 GRC is an umbrella term whose scope touches the privacy office as well as other
departments, including HR, IT, compliance and the C-suite. GRC tools aim to synchronize
various internal functions toward “principled performance”—integrating the governance,
management and assurance of performance, risk and compliance activities. While many IT
vendors provide capabilities to meet a single compliance requirement, true GRC vendors
provide tools to oversee risk and compliance across the entire organization, helping to automate
GRC initiatives that are mostly manual or beyond an organization’s current capabilities.
GRC tools are generally used to:

• Create and distribute policies and controls and map them to regulations and
internal compliance requirements

• Assess whether the controls are in place and working, and fix them if they are
not

• Ease risk assessment and mitigation31
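
Building on the uses listed above, the sketch below illustrates, as a hypothetical simplification
rather than a depiction of any actual GRC product, how a control-to-regulation mapping might be
represented and checked for gaps: each control maps to the requirements it satisfies, and any
requirement left unmapped is flagged for follow-up. The control and requirement names are
invented for illustration.

# Hypothetical control-to-requirement mapping of the kind a GRC tool maintains.
# The control names and requirement labels below are invented for illustration.
CONTROL_MAP = {
    "Encrypt laptops": ["Safeguard data"],
    "Records of processing": ["Purpose limitation", "Retention limits"],
    "DSAR workflow": ["Access", "Correction", "Deletion"],
}

REQUIRED = {
    "Safeguard data", "Purpose limitation", "Retention limits",
    "Access", "Correction", "Deletion", "Breach notification",
}


def unmapped_requirements(control_map, required):
    """Return the requirements not covered by any control (the compliance gaps)."""
    covered = {req for reqs in control_map.values() for req in reqs}
    return required - covered


print(unmapped_requirements(CONTROL_MAP, REQUIRED))  # {'Breach notification'}

In a real GRC deployment, the same mapping would also carry evidence of whether each control is
operating effectively, supporting the assessment and remediation uses noted above.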

2.6 Develop a Privacy Strategy

Now that the framework through which the organization will organize its privacy requirements
has been identified, the next consideration is the privacy strategy. Essentially, a privacy
strategy is the organization’s approach to communicating and obtaining support for the privacy
program. Personal information may be collected and used across an organization, with many
individuals responsible for protecting this information. No one solution mitigates all privacy
risk, and there is no one-size-fits-all strategy that can be adopted. There are clear benefits to
implementing a privacy strategy in today’s environment, among them management’s growing
awareness of the importance of protecting personal information and the financial impact of
mismanagement. With that said, getting buy-in at the appropriate levels can still be difficult.
Building a privacy strategy may mean changing the mindset and perspective of an entire
organization. Everyone in an organization has a role to play in protecting the personal
information an organization collects, uses and discloses. Management needs to approve
funding to resource and equip the privacy team, fund important privacy-enhancing resources
and technologies, support privacy initiatives such as training and awareness, and hold
employees accountable for following privacy policies and procedures. Sales personnel must
secure business contact data and respect the choices of these individuals. Developers and
engineers must incorporate effective security controls, build safe websites, and create solutions
that require the collection or use of only the data necessary to accomplish the purpose. All staff
must understand and employ fundamental practices required to protect personal data—from
secure methods of collecting, storing and transmitting personal data (both hard-copy and
electronic) through secure methods of destruction. The adage “the chain is only as strong as its
weakest link” truly reflects the way an organization must approach its privacy program. There
are no shortcuts, and every individual within an organization contributes to the success of the
privacy program.
Before an organization can embark on this journey, the management team will need to
understand why their involvement and support is so critical. It is important to know the ultimate
destination before beginning, and to have a roadmap for the journey. These factors and more
must be contained in the privacy strategy to ensure success, buy-in and ownership from the
widest possible pool of stakeholders.

2.6.1 Identify Stakeholders and Internal Partnerships
One of the most challenging aspects of building a privacy program and the necessary
supporting strategy is gaining consensus from members of the organization’s management on
privacy as a business imperative. Building and gaining this consensus in stages is a must.
The first major step in building a coalition of supporters is to conduct informal one-on-one
conversations with executives within the organization who have accountability for information
management and/or security, risk, compliance or legal decisions. Internal partners, such as HR,
legal, security, marketing, risk management and IT should also be included in conversations,
as they too will have ownership of privacy activities, and their buy-in will be necessary.
Depending on the organization’s industry and corporate culture, the executives, managers and
internal partners will each play a role in the development and implementation of the privacy
strategy for the privacy program.
From these conversations, you should start to get a sense for which executive will serve as
the program sponsor, or “champion” for the privacy program, or whether an executive is even
necessary. The program sponsor should be someone who understands the importance of
privacy and will act as an advocate for you and for the program. Effective program sponsors
typically have experience with the organization, the respect of their colleagues, and access to
or ownership of a budget. Final budgetary decision makers are the preferred program sponsors,
but if they are unavailable, it is best to obtain approval from executive management closest to
the organization’s top executive. Frequently, sponsors function as risk or compliance
executives within the organization. Sometimes chief operating officers (COOs) or chief
information officers (CIOs) serve as program sponsors.

A privacy champion at the executive level acts as an advocate and sponsor to further
foster privacy as a core organization concept.

Most organizations, regardless of their size, industry and specific business, use personal
information for roughly the same bundle of activities—for example, staff recruitment and
ongoing employment administration, customer relationship management and marketing, and
order fulfillment. Further, the use of this personal information is managed by a similar array of
executives—regardless of the organization or its activities. It is common to refer to the
individual executives who lead the relevant activities and own responsibility for them as
stakeholders. Typically, in a larger organization, an executive privacy team will include some
or all of the following individuals: senior security executive [e.g., chief security officer (CSO)];
senior risk executive [e.g., chief risk officer (CRO)]; senior compliance executive [e.g., chief
compliance officer (CCO)]; senior HR executive; senior legal executive (e.g., general counsel);
senior information executive (e.g., CIO); senior physical security/business continuity
executive; senior marketing executive (CMO); and a senior representative of the business.
Several best practices when developing internal partnerships include:

• Become aware of how others treat and view personal information

• Understand their use of the data in a business context

• Assist with building privacy requirements into their ongoing projects to help
reduce risk

• Offer to help staff meet their objectives while offering solutions to reduce risk of
personal information exposure

• Invite staff to be a part of the privacy advocate group to further privacy best
practices

In smaller organizations, a legal department may create contract requirements if there is no
procurement function.

2.6.2 Conduct a Privacy Workshop for Stakeholders


With the support of the privacy program sponsor, you should plan to conduct a workshop for
the stakeholders who will support efforts to develop and launch a privacy program. Don’t
assume that all stakeholders have the same level of understanding about the regulatory
environment or the complexity of the undertaking—there will invariably be different levels of
privacy knowledge among the group. This is an opportunity to ensure everyone has the same
baseline understanding of the risks and challenges the organization faces, the data privacy
obligations that are imposed on it, and the increasing expectations in the marketplace regarding
the protection of personal information.

Conduct a privacy workshop for stakeholders to level the privacy playing field by
defining privacy for the organization, explaining the market expectations, answering
questions, and reducing confusion.

2.6.3 Keep a Record of Ownership


Once the importance of the privacy program has been established, key internal stakeholders
may form a steering committee to ensure clear ownership of assets and responsibilities. Keep
a record of these discussions as a tool for communication, and to ensure stakeholders can refer
to what was decided. Such documentation also helps support accountability requirements of
the GDPR and serves as the privacy program’s due diligence in terms of which functions and
individuals should be held accountable for privacy compliance.

2.7 Structure the Privacy Team

Structuring the privacy team is the last objective to formalizing the organization’s approach to
privacy. This section will focus on the many factors that should be considered to assist with
the decisions involved in structuring the team and to ensure that the foundation for those
decisions aligns with business objectives and goals. This final step aligns privacy governance
for the organization with its privacy strategy.

2.8 Governance Models

There are different approaches and strategies for creating privacy office governance models.
This text is not intended to educate thoroughly on the idiosyncrasies of various governance
models, but to provide examples of types of governance models that should be examined when
structuring your privacy program. Give thoughtful consideration to the models. They will be a
basis for the decisions your privacy team makes and the policies it will need to establish.
You should consider whether to apply the model only within given geographical regions or
globally, depending on your operations. Many large organizations find they need to consider
global implications when structuring privacy teams.
The positioning of the privacy team within an organization should depend on the authority it will
receive under the governance model it follows. Positioning the privacy team under the
corporate legal umbrella may be substantially different from aligning the team under the IT
umbrella. Executive leadership support for the governance model will have a direct impact on
the level of success when implementing privacy strategies.
No matter which model is chosen, there are some important steps to integrate into it:

• Involve senior leadership

• Involve stakeholders

• Develop internal partnerships

• Provide flexibility

• Leverage communications

• Leverage collaboration

Privacy governance models include centralized, local and hybrid versions, but are not limited
to only these options. Whichever model is chosen, its objectives should ensure information is
controlled and distributed to the right decision makers. Because decision making must be based
on accurate and up-to-date management data, a well-designed and well-allocated governance
model will foster intelligent and more accurate decisions.

2.8.1 Centralized
Centralized governance is a common model that fits well in organizations accustomed to
single-channel functions (where the direction flows from a single source) with planning and
decision making completed by one group. A centralized model will leave one team or person
responsible for privacy-related affairs. All other persons or organizations will flow through this
single point. Often this single point is the chief privacy officer (CPO) or corporate privacy
office.

2.8.2 Local or Decentralized


Decentralization is the policy of delegating decision-making authority down to the lower levels
in an organization, at a distance from and below a central authority. A decentralized
organization has fewer tiers in the organizational structure, a wider span of control, and a
bottom-to-top flow of decision making and ideas.
In a more decentralized organization, the top executives delegate much of their decision-
making authority to lower tiers of the organizational structure. Correspondingly, the
organization is likely to run on less-rigid policies and wider spans of control for each officer
of the organization. The wider spans of control also reduce the number of tiers within the
organization, giving its structure a flat appearance. One advantage of this structure, if the
correct controls are in place, will be the bottom-to-top flow of information, allowing decisions
about lower-tier operations to be well-informed. For example, if an experienced technician at
the lowest tier of an organization knows how to increase the efficiency of production, the
bottom-to-top flow of information can allow this knowledge to pass up to the executive
officers.

2.8.3 Hybrid
A hybrid governance model allows for a combination of centralized and local governance. This
is most typically seen when a large organization assigns a main individual (or department)
responsibility for privacy-related affairs and for issuing policies and directives to the rest of the
organization. The local entities then fulfill and support the policies and directives from the
central governing body. Members of the privacy team may also sit locally; for example, with
regional compliance hubs in large multinationals. Each region may have a privacy manager
who reports in to local management and/or the CPO at the global level.

2.8.4 Advantages and Disadvantages


Centralized management offers many advantages, with streamlined processes and procedures
that allow the organization to create efficiency by using the same resources throughout the
organization. Since decisions are made at the top layer, individual employees or groups cannot
make their own decisions and must seek approval from a higher level.
With fewer layers of management, decentralized managers create and manage their own
business practices. This may be inefficient, because each process may be reproduced many
times instead of once. On the other hand, employees are also tasked with solving problems with
which they are closest and most familiar.
The hybrid approach uses a decentralized decision-making process that subjects employees to
less outside influence, yet offers the advantage of the organizational resources of a larger,
centralized organization. Typically, the hybrid model dictates core values and lets employees
decide which practices to use to achieve the organization’s goals. Working groups, individual offices
and other groups are encouraged to make business decisions that consider revenue, operating
costs and operations. Such models allow an organization to function in a global environment
yet maintain common missions, values and goals.
Mixing centralized and decentralized management approaches into a hybrid approach enables
the organization to achieve desired results that may span the globe or locations across town.
Employees gain a sense of ownership from their contributions, which encourages them to
perform more efficiently and effectively, consistent with top management’s direction.

2.9 Establish the Organizational Model, Responsibilities and Reporting Structure

In establishing the overall organizational privacy model, one must consider the organizational
structure as related to strategy, operations and management for responsibilities and reporting.
The privacy professional should know how each major unit functions and should understand
its privacy needs. The following is a short list of roles found in both large and small
organizational structures:

• CPO

• Privacy manager

• Privacy analysts

• Business line privacy leaders

• First responders (i.e., incident response team members)

• Data protection officers (DPOs), including those for whom privacy is not their
only responsibility, if applicable to the organization

Organizational structures function within a framework by which the organization
communicates, develops goals and objectives, and operates daily. Companies can use one of
several structures or switch from one to another based on need. Principles within that
framework allow the organization to maintain the structure and develop the processes
necessary to do so efficiently. Considerations include:

• Hierarchy of command. The authority of senior management, leaders and the
executive team to establish the trail of responsibility.

• Role definition. Clear definition of the responsibilities to create individual
expectations and performance.

• Evaluation of outcomes. Methods for determining strengths and weaknesses
and correcting or amplifying as necessary.

• Alteration of organizational structure. Ability to remain dynamic and change
as necessary to meet current objectives, adopt new technology or react to
competition.

• Significance. Complex structure typical for large organizations; flat structures
for smaller organizations.

• Types of structures. Product organizational structures, functional organizational
structures and others.

• Customers. Consider the different needs depending on the nature of products and
services the organization offers.

• Benefits. To the organization, customers and stakeholders, as aligned to the
objectives and goals.

2.9.1 Titles Used for Privacy Leaders
The titles an organization uses to denote its privacy leaders reveal much information about its
approach to privacy, its reporting structure, and its industry. According to a recent survey
completed by the IAPP and EY, the terms privacy, chief and officer are the most popular terms
used in privacy management titles and are more often used than terms like counsel, director or
global.32 Further, a larger percentage of U.S.-headquartered organizations use the terms
privacy, vice president and director for privacy management roles when compared with their
European counterparts; similar roles in the EU are more likely to use the term data in such
titles (likely a result of the “privacy” versus “data protection” divide between U.S. and EU-
headquartered firms).
Some companies are asking their CPO to serve in the role of DPO (discussed in section 2.9.5),
with or without adding the title. According to the survey, such companies are most likely in
unregulated industries and those with business-to-business (B2B) models, suggesting these
companies tend to appoint fewer but perhaps more educated or qualified personnel to privacy
leadership roles and then ask more of them.33

2.9.2 Typical Educational and Professional Backgrounds of Privacy Leaders


Regardless of the title an organization chooses for its privacy leader, it’s important that the
individual possess the requisite skills and qualifications. While a legal background is a common
requirement for most privacy positions, project management, controls implementation, audit,
and information security experience have emerged as key qualities of privacy professionals.
The company’s specific industry, where the privacy leader is placed within the organization,
and to whom the leader reports also influence the desired background and skills of privacy
leaders. For example, a privacy leader reporting into the general counsel would likely be
expected to possess legal qualifications, while a privacy leader reporting to the chief
information security officer (CISO) may be expected to have a certain level of security and
technical knowledge, in addition to privacy expertise.
As a relatively new field, and given the breadth of skills required outside of knowledge of
privacy laws and regulations, privacy professionals have come from a diverse range of
educational backgrounds. More recently, however, degree programs specializing in privacy
law, cybersecurity, and privacy engineering have become available, including:

• Carnegie Mellon’s Master of Science in Information Technology—Privacy
Engineering (MSIT-PE)34

• Ryerson University’s Certificate in Privacy, Access and Information
Management35

• Brown University’s Executive Master in Cybersecurity36

The emergence of these programs has further legitimized privacy as a distinct profession
requiring increased attention, resourcing, executive support and credentialing.

2.9.3 Professional Certifications


Like other professional certifications, those offered by the IAPP provide a way for individuals
in the industry to demonstrate that they possess a fundamental understanding of global privacy
laws, concepts, best practices and technologies.37 IAPP certifications, which are accredited by
the American National Standards Institute (ANSI) under the International Organization for
Standardization (ISO) standard 17024:2012, are increasingly listed as minimum requirements
in privacy job descriptions.

2.9.4 Conferences and Seminars


Conferences and seminars are rich resources for information and expert presentations on
effective ways to build a privacy program and address privacy governance. Individuals learn
from privacy experts about approaches to privacy management by attending sessions, working
groups, and/or panel discussions that are assembled specifically to address this topic. Other
topics include governance structures. Presentations on managing security incidents, creating a
sustainable training and awareness program, and designing and implementing programs
educate the audience on the subject matter itself while also providing industry insights into
how an organization manages these issues and assigns accountability. Information is also
obtained through informal exchanges of ideas among privacy professionals and those interested
in this industry. Learning from experts and peers is an incredibly valuable method for acquiring
information about privacy approaches.

2.9.5 The DPO Role


Designation of a DPO is a new requirement under Article 37 of the GDPR. The concept of
designating an individual accountable for an organization’s privacy compliance, however, is
not new. For example, Canada’s PIPEDA requires that an organization appoint someone to be
accountable for its compliance with the act’s fair information principles; South Korea’s Data
Protection Act mandates the appointment of a DPO with specific responsibilities;38 and
Germany, under its implementation of the EU Data Protection Directive (95/46/EC), required
organizations to appoint a DPO under certain circumstances (e.g., if the company employed
more than nine persons). With the GDPR, this requirement is formalized, and so are key criteria
with respect to the need, reporting position and qualifications of the DPO.

2.9.5.1 When is a DPO Required?


Article 37 of the GDPR establishes the specific criteria triggering the requirement for an
organization to designate a DPO.39
Subject to some exceptions, designation of a DPO is required (a brief decision sketch follows
this list):

• By public authorities or bodies

• Where the organization’s “core” activities consist of processing operations that require
“regular and systematic monitoring of data subjects on a large scale”

• Where the organization’s “core” activities consist of processing “special” categories of
data on a large scale40
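
As signposted above, the sketch below encodes the three Article 37(1) triggers as a simple
yes/no check. It is a hypothetical intake aid only: the field and function names are assumptions,
and judgments such as what counts as “core” activities or “large scale” processing still require
legal analysis.

from dataclasses import dataclass


@dataclass
class ProcessingProfile:
    """Answers from a hypothetical DPO-assessment questionnaire."""
    is_public_authority: bool                   # Article 37(1)(a)
    core_large_scale_monitoring: bool           # Article 37(1)(b)
    core_large_scale_special_categories: bool   # Article 37(1)(c)


def dpo_required(profile):
    """Return True if any Article 37(1) trigger applies.

    Exceptions and member-state variations are deliberately omitted; treat
    the result as a prompt for legal review, not a conclusion.
    """
    return (profile.is_public_authority
            or profile.core_large_scale_monitoring
            or profile.core_large_scale_special_categories)


# Example: a private company whose core business is large-scale behavioral
# advertising would answer "yes" to the monitoring question.
print(dpo_required(ProcessingProfile(False, True, False)))  # True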

Even if it’s determined that a DPO is not required, the organization may choose to voluntarily
appoint one. Keep in mind that formally appointing a DPO will subject the organization to the
following DPO requirements:

• Reporting structure and independence. The position of the DPO is formally elevated by
Article 38, whereby the DPO is required to “report to the highest management level of the
controller or the processor.” While “highest management level” is not further defined by the
GDPR, its literal interpretation would be at the level of C-level management or the board of
directors.41 In practice, such a reporting line may not be feasible or practical, depending on
several factors such as the size of the company, the accessibility of the CEO, and the
likelihood that the reporting line will affect the DPO’s independence. Organizations should
consider these key factors when deciding the DPO’s reporting lines.

• Qualifications and responsibilities. Article 37 mandates several requirements for the
DPO’s qualifications and position, including that the DPO possess “expert knowledge of
data protection law and practices.” Quantifying “expert knowledge” is subjective—a
reasonable interpretation of someone possessing expert knowledge in the field would be the
privacy professional who has spent most of their career practicing privacy law or
operationalizing privacy programs, for example.42 Such expertise is likely required as a
result of Article 39, which requires the DPO to perform certain activities, including
monitoring the company’s compliance with the GDPR, providing advice during data
protection impact assessments (DPIAs) and cooperating with supervisory authorities.43

Designating a DPO is no trivial task given the role’s specific qualifications, responsibilities
and organizational visibility. It’s important to create a position that is “fit for purpose,” in other
words, one that considers the company’s unique requirements in light of the criteria expected
of DPOs by the GDPR.

2.10 Summary

Defining the appropriate governance of a privacy program is complex and challenging. Once
adopted and implemented, proper governance ensures that an organization’s approach to
privacy adequately supports its compliance with legal obligations, aligns with broader business
objectives and goals, is fully supported at all levels across the company, and culminates in the
protection of personal information.

Endnotes
1 Kirk M. Herath, Building a Privacy Program: A Practitioner’s Guide, p. 75, (Portsmouth, NH: IAPP, 2011).
2 Mission Statement, University Privacy Office, Stanford University, https://privacy.stanford.edu/about-
us/mission-statement (accessed November 2018).
3 Privacy at Microsoft, Microsoft, https://privacy.microsoft.com/en-US/ (accessed November 2018).
4 Mission and Vision, International Conference of Data Protection and Privacy Commissioners,
https://icdppc.org/the-conference-and-executive-committee/strategic-direction-mission-and-vision/ (accessed
November 2018).
5 Mission Statement, An Coimisiún um Chosaint Sonraí | Data Protection Commission, https://www
.dataprotection.ie/docs/Mission-Statement/a/7.htm (accessed November 2018).
6 Mission, vision and goals, ICO, https://ico.org.uk/about-the-ico/our-information/mission-and-vision/ (accessed
November 2018).
7 The Authority’s vision and mission, Data Protection Authority, https://www
.dataprotectionauthority.be/vision-and-mission (accessed November 2018).
8 GLBA, 15 U.S.C, Subchapter I, § 6809 (1999).
9 HIPAA of 1996, 45 C.F.R. §§ 160.102, 160.103.
10 COPPA of 1998, 15 U.S.C. 6501–6505.
11 PCI DSS, PCI Security Standards Council, https://www.pcisecuritystandards.org/documents/PCI_DSS_v3-2-
1.pdf (accessed November 2018).

12 National Conference of State Legislatures, State Security Breach Notification Laws,
www.ncsl.org/research/telecommunications-and-information-technology/security-breach-notification-
laws.aspx (accessed November 2018).
13 The Code of Fair Information Practices, Epic.org, https://epic.org/privacy/consumer/code_fair_info.html
(accessed November 2018).
14 OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, OECD,
http://www.oecd.org/internet/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofperson
aldata.htm (accessed November 2018).
15 Generally Accepted Privacy Principles: CPA and CA Practitioner Version, August 2009, IAPP,
https://iapp.org/media/presentations/11Summit/DeathofSASHO2.pdf (accessed November 2018).
16 Principles Set Out in the National Standard of Canada Entitled Model Code for the Protection of Personal
Information, CAN/CSA-Q830-96, Government of Canada, Justice Laws website,
https://laws-lois.justice.gc.ca/eng/acts/P-8.6/page-11.html#h-26 (accessed November 2018).
17 APEC Privacy Framework (2015), APEC, https://www.apec.org/Publications/2017/08/APEC-Privacy-
Framework-(2015) (accessed November 2018).
18 GDPR, Article 47, http://www.privacy-regulation.eu/en/article-47-binding-corporate-rules-GDPR
.htm (accessed November 2018).
19 ETSI, https://www.etsi.org/standards (accessed November 2018).
20 Personal Information Protection and Electronic Documents Act, (S.C. 2000, C.5), Government of Canada,
Justice Laws website, laws-lois.justice.gc.ca/eng/acts/P-8.6/index.htm (accessed November 2018).
21 GDPR, http://www.privacy-regulation.eu/en/index.htm (accessed November 2018).
22 Privacy Shield Framework, https://www.privacyshield.gov/EU-US-Framework (accessed November 2018).
23 HIPAA of 1996, 45 C.F.R. §§ 160.102, 160.103.
24 CNIL, https://www.cnil.fr/en/home (accessed November 2018).
25 Ann Cavoukian, “Privacy by Design: The 7 Foundational Principles,” https://iab.org/wp-content/IAB-
uploads/2011/03/fred_carter.pdf (accessed November 2018).
26 ENISA, https://www.enisa.europa.eu/ (accessed November 2018).
27 Sean Brooks, Michael Garcia, Naomi Lefkovitz, Suzanne Lightman, Ellen Nadeau, “An Introduction to
Privacy Engineering and Risk Management in Federal Information Systems,” NIST, U.S. Department of
Commerce (DOC), https://nvlpubs.nist.gov/nistpubs/ir/2017/NIST.IR.8062.pdf (accessed November 2018).
28 Framework for Improving Critical Infrastructure Cybersecurity Version 1.1, April 16, 2018, NIST, U.S. DOC,
https://nvlpubs.nist.gov/nistpubs/CSWP/NIST.CSWP.04162018.pdf (accessed November 2018).
29 2018 Privacy Tech Vendor Report, IAPP, https://iapp.org/resources/article/2018-privacy-tech-vendor-report/
(accessed November 2018).
30 IAPP-EY Annual Privacy Governance Report of 2016, IAPP, https://iapp.org/resources/article/iapp-ey-annual-
governance-report-2017/ (accessed November 2018); to see the IAPP-EY Annual Privacy Governance Report
of 2018, visit, https://iapp.org/resources/article/iapp-ey-annual-governance-report-2018/ (accessed November
2018).

31 Neil Roiter, “IT GRC tools: Control your environment,” CSO from IDG, https://www.csoonline
.com/article/2127514/compliance/it-grc-tools--control-your-environment.html (accessed November 2018).
32 IAPP-EY Annual Privacy Governance Report of 2017, IAPP, https://iapp.org/resources/article/iapp-ey-annual-
governance-report-2017/ (accessed November 2018); To see the IAPP-EY Annual Privacy Governance Report
of 2018, visit https://iapp.org/resources/article/iapp-ey-annual-governance-report-2018/ (accessed November
2018).
33 IAPP-EY Annual Privacy Governance Report of 2017, IAPP, https://iapp.org/resources/article/iapp-ey-annual-
governance-report-2017/ (accessed November 2018); To see the IAPP-EY Annual Privacy Governance Report
of 2018, visit, https://iapp.org/resources/article/iapp-ey-annual-governance-report-2018/ (accessed November
2018).
34 MSIT—Privacy Engineering, Carnegie Mellon University, http://privacy.cs.cmu.edu/ (accessed November
2018).
35 Privacy, Access and Information Management, Ryerson University, https://ce-
online.ryerson.ca/ce/default.aspx?id=3778 (accessed November 2018).
36 Executive Master in Cybersecurity, Brown University, https://professional.brown.edu/cybersecurity/ (accessed
November 2018).
37 IAPP, https://iapp.org/ (accessed November 2018).
38 Cynthia Rich, “Privacy and Security Law Report,” Bloomberg BNA, https://media2.mofo.com/
documents/150518privacylawsinasia.pdf, accessed November 2018.
39 Thomas J. Shaw, Esq., The DPO Handbook: Data Protection Officers under the GDPR, (Portsmouth, NH:
IAPP, 2018).
40 GDPR, Article 37, www.privacy-regulation.eu/en/article-37-designation-of-the-data-protection-officer-
GDPR.htm (accessed November 2018).
41 GDPR, Article 38, www.privacy-regulation.eu/en/article-38-position-of-the-data-protection-officer-
GDPR.htm (accessed November 2018).
42 GDPR, Article 37.
43 GDPR, Article 39, www.privacy-regulation.eu/en/article-39-tasks-of-the-data-protection-officer-GDPR.htm
(accessed November 2018).

PRIVACY POLICIES
By: Edward Yakabovicz, CIPP/G, CIPM, CIPT

Policies provide a deliberate system of principles to guide decisions by dictating a course of
action and providing clear instructions for implementation through procedures, protocols or
guidance documents. The objective of this chapter is to review the basic construct of an
organizational policy specific to privacy management. It will define the components of a
privacy policy and the practices necessary to make that policy successful by discussing the
importance of communication and the way other organizational policies support and reinforce
the privacy policy. As Bob Siegel, the founder and president of Privacy Ref, Inc., explains:1

It is not enough for a business to create a privacy policy and place it on its website; a
business must define policies and practices, verify that their employees are following
the practices and complying with policies, and confirm that third-party service
providers are adequately protecting any shared information as well. As customer
demands and regulatory requirements change, the business’ privacy practices and
policies must be reviewed and revised to meet this changing business environment.

5.1 What is a Privacy Policy?

A privacy policy governs the privacy goals and strategic direction of the organization’s privacy
office. As discussed in Chapter 2, it is important that the organization first develop a privacy
vision or mission statement that aligns with its overall strategy and objectives. This statement
helps guide management of the privacy program and allocation of resources to support the
program. It also serves as the foundation for developing effective privacy policies. Depending
on the industry and organization’s customers, the privacy policy could also be dictated by law
and regulations or by industry standard.
Policies become difficult to create if there is no clear definition of how they influence, and can
be used by, an organization. They should be considered at the highest level of governance for any
organization. In addition, they should be clear and easy to understand, accessible to all
employees, comprehensive yet concise, action-oriented, and measurable and testable. Policies
should align to organizational standards for format, structure and intent to meet organizational
goals.
The privacy policy is a high-level policy that supports documents such as standards and
guidelines that focus on technology and methodologies for meeting policy goals through
manuals, handbooks and/or directives. Examples of documents supported by the privacy policy
include:

• Organization standards, such as uniforms, identification badges and physical building
systems

• Guidelines on such topics as the use of antivirus software, firewalls and email
security

• Procedures to define and then describe the detailed steps employees should
follow to accomplish tasks, such as hiring practices and the creation of new user
accounts2

The privacy policy also supports a variety of documents communicated internally and
externally that:

• Explain to customers how the organization handles their personal information

• Explain to employees how the organization handles personal information

• Describe steps for employees handling personal information

• Outline how personal data will be processed

5.2 Privacy Policy Components

Although policy formats will differ from organization to organization, a privacy policy should
include the following components:
Purpose. This component explains why the policy exists as well as the goals of the privacy
policy and program, which could be used to meet a privacy standard based on national, regional
or local laws. This component could also meet other nonbinding standards or frameworks that
answer the needs of the organization.
Scope. Scope defines which resources (e.g., facilities, hardware and software, information,
personnel) the policy protects.
Risk and responsibilities. This section assigns privacy responsibilities to roles throughout
the organization, typically overseen by a privacy program office or manager. The
responsibilities of leaders, managers, employees, contractors, vendors and all users of the data
at the operations, management and use levels are delineated. Most importantly, this component
serves as the basis for establishing all employee and data user accountability.

Compliance. Compliance issues are a main topic in privacy policy. Sometimes they are
found in the relevant standard—such as the applicable data protection law—and are not written
into the organization’s privacy policy document. Potential compliance factors include the
following:

• General organization compliance to ensure the privacy policy assigns roles and
responsibilities at the proper level in the organization to create an oversight
group. This group has responsibility for monitoring compliance with the policy,
conducting enforcement activities, and aligning with the organization’s
priorities.

• The ability to apply penalties and disciplinary actions with authorization for
the creation of compliance structures that may include disciplinary actions for
specific violations.

• Understanding of the penalties for noncompliance with laws and regulations. Legal and
regulatory penalties are typically imposed within any industry to
enforce behavior modification needed to rectify previous neglect and lack of
proper data protection. Privacy is no different; organizations are now held
accountable for protecting the privacy of the data with which they have been
entrusted. As penalties for violation of privacy laws and regulations become more
serious, the privacy professional must be prepared to address, track and
understand any penalty that could affect the organization.

The privacy policy should not be confused with detailed process manuals and practices that
are typically outlined in standards, guidelines, handbooks and procedures documents.
Remember, the privacy policy is the high-level governance that aligns with the privacy vision
or mission statement of the organization.
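
One lightweight way to keep a draft policy aligned with the components described above is a
structured completeness check. The sketch below is a hypothetical illustration of such a check,
not a prescribed format; the section keys simply mirror this chapter’s list, and the sample draft
is invented.

# Hypothetical completeness check against the privacy policy components
# described above; the sample draft and function name are illustrative only.
REQUIRED_SECTIONS = {"purpose", "scope", "risk_and_responsibilities", "compliance"}


def missing_sections(policy_document):
    """Return the required components that are absent or empty in a draft."""
    present = {name for name, text in policy_document.items() if text}
    return REQUIRED_SECTIONS - present


draft = {
    "purpose": "Why the policy exists and the goals of the privacy program",
    "scope": "Facilities, hardware, software, information and personnel covered",
    "risk_and_responsibilities": "",  # still empty in this draft
    "compliance": "Oversight group, penalties and disciplinary actions",
}

print(missing_sections(draft))  # {'risk_and_responsibilities'}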

5.2.1 Privacy Notice versus Privacy Policy


A privacy policy is an internal document addressed to employees and data users. This
document clearly states how personal information will be handled, stored and transmitted to
meet organizational needs as well as any laws or regulations. It will define all aspects of data
privacy for the organization, including how the privacy notice will be formed, if necessary, and
what it will contain.
A privacy notice is an external communication to individuals, customers or data subjects
that describes how the organization collects, uses, shares, retains and discloses their personal
information based on the organization’s privacy policy. This is discussed in more detail in
Chapter 6.

5.3 Interfacing and Communicating with an Organization

Protecting personal information and building a program that drives privacy principles into the
organization cannot be the exclusive job of the privacy officer or the privacy team, any more
than playing a symphony is the exclusive responsibility of the conductor. As with an orchestra,
many people, functions and talents will merge to execute the privacy vision or mission of the
organization.
Many organizations create a privacy committee or council composed of stakeholders (or
representatives of functions). These individuals may launch the privacy program and manage
it throughout the privacy policy lifecycle. They can be instrumental in making strategic
decisions that may affect the vision, change key concepts, or determine when alterations are
needed. Because of their experience and knowledge, they play a critical role in communicating
the privacy policy, which is almost as important as having a solid privacy policy; without
informed communications, the policy will simply sit on a shelf or hard drive.
Organizations with a global footprint often create a governance structure composed of
representatives from each business function and from every geographic region in which the
organization has a presence to ensure that proposed privacy policies, processes and solutions
align with local laws and are tailored to them as necessary. This governance structure also
provides a communication chain, both formally and informally, that the privacy professional
should continue to use in performing key data protection activities.

5.4 Communicating the Privacy Policy within the Organization

The privacy program management team should answer the following questions when
developing an effective internal communications plan:

• What is the purpose of the communication? For example, does it simply communicate
the existence of a policy or spread knowledge about the policy, or is it intended to train
employees and cause behavior modification concerning privacy?

• How will the privacy team work with the communications team? What
methods—such as meetings, phone calls, and conference calls—will be used?

• Who is the audience for the communication relating to policy? Are there different
potential user groups such as production or administrative staff, managers, and
vendors?

• What existing communication modes—such as a company intranet—can be employed?
What assets, such as posters, flyers, mouse pads and other awareness tools, will be
needed?

• Which functional areas most align with the privacy program, and how should one
best communicate with each? For example, production, administrative,
information assurance, cybersecurity (sometimes called information security)
and human resources (HR) may all need to be in close coordination.

• What is the best way to motivate training and awareness for the organization?
Which metrics are best for tracking effectiveness and demonstrating the return
on investment?

• Has the privacy team conducted a privacy workshop for stakeholders to define
privacy for the organization, explain the market expectations, answer questions,
and reduce confusion?

Communications should include the formal privacy policy to help ensure that everyone
(including third-party service providers) in an organization receives the same guidance and
adheres to the same privacy mission and vision.

5.5 Policy Cost Considerations

Several potential costs are associated with developing, implementing and maintaining policies.
The most significant are related to implementing the policy and addressing the impacts on the
organization that potentially limit, reduce, remove or change the way data is protected.
Historically, privacy has been governed by information security, but it now requires resources
assigned directly to privacy management. The establishment and upkeep of a privacy
management program comes at a non-negligible cost to the organization and has an impact on
its people. Limiting any business function has a direct and sometimes measurable impact on
employees’ or data users’ ability to perform certain tasks. The privacy professional should be
cognizant that all changes made to any policy affect the organization.3
Other costs are incurred through the policy development and management process.
Administrative and management functions are required to draft, develop, finalize and then
update the policy. Beyond that, the policy must be disseminated and then communicated
through training and awareness activities. Although the cost is unavoidable for policy
management, in most cases due to regulations, there must be a balance between practical
protections to meet any regulations and laws, the organization’s privacy vision or mission, and
the organization’s need to perform the intended business transactions.4

5.6 Design Effective Employee Policies

An article by Ronald Breaux and Sam Jo reminds us:

“…that employees and data users are typically the most common cause of data
breaches, data loss and data misappropriation if appropriate safeguards are not
instituted and enforced. To mitigate these risks, develop comprehensive policies and
procedures that dictate which employees have access to particular data by category to
… include instructions on reporting impermissible uses or violations of policies related
to confidentiality and security, and contain onboarding and exit procedures to protect
against information misappropriation upon termination of employment.” 5

Comprehensive privacy policies must align with supporting documents, including additional
policies that respond to the needs and intent of the organization to fix an issue, serve a specific
purpose, or meet a specific goal. Higher-level policies and procedures include items such as
security configurations and responsibilities, while examples of those that address issues include
behavior modification, proper usage of organization property, newer technology threats, social
media use, email use and internet use. Documents addressing these issues should be reviewed
and updated regularly. Regardless of the intent, supporting policies may contain the following
elements.
Issue/objective statement. To formulate a policy on an issue, the information owner/steward
must first define the issue with any relevant terms, distinctions and conditions included. It is
often useful to specify the goal or justification for the policy to facilitate compliance. For
example, an organization might want to develop an issue-specific policy on the use of
“unofficial software,” which might be defined to mean any software not approved, purchased,
screened, managed or owned by the organization. The applicable distinctions and conditions
might then need to be included for some software, such as software privately owned by
employees but approved for use at work or owned and used by other businesses under contract
to the organization.

Statements of the organization’s position. Once the issue is stated and related terms and
conditions are detailed, this section is used to clearly state the organization’s position (i.e.,
management’s decision) on the issue. In the previous example, this would mean stating whether
the use of unofficial software as defined is prohibited in all or some cases; whether there are
further guidelines for approval and use; or whether case-by-case exceptions may be granted,
by whom, and on what basis.
Applicability. Issue-specific policies also need to include statements of applicability. This
means clarifying where, how, when, to whom and to what a policy applies. For example, it
could be that the hypothetical policy on unofficial software is intended to apply only to the
organization’s own on-site resources and employees and not to contractors with offices at other
locations. Additionally, the policy’s applicability might need to be clarified as it pertains to
employees travelling among different sites, working from home, or needing to transport and
use hardware at multiple sites.
Roles and responsibilities. The assignment of roles and responsibilities is also usually
included in issue-specific policies. For example, if the policy permits employees to use
privately owned, unofficial software at work with the appropriate approvals, then the approval
authority granting such permission would need to be stated. (The policy would stipulate who,
by position, has such authority.) Likewise, the policy would need to clarify who would be
responsible for ensuring that only approved software is used on organizational system
resources and, possibly, for monitoring users regarding unofficial software.
Compliance. Some types of policies may describe unacceptable infractions and the
consequences of such behaviors in greater detail. Penalties may be explicitly stated and
consistent with organizational personnel policies and practices. When used, they can be
coordinated with appropriate officials, offices, and even employee bargaining units. A specific
office in the organization may be tasked with monitoring compliance.
Points of contact and supplementary information. For any issue-specific policy, indicate
the appropriate individuals to contact in the organization for further information, guidance and
compliance. Since positions tend to change less often than the individuals occupying them,
specific positions may be preferable as the point of contact. For example, for some issues, the
point of contact might be a line manager; for others, it might be a facility manager, technical
support person, system administrator or security program representative. Using the above
example once more, employees would need to know whether the point of contact for questions
and procedural information would be their immediate superior, a system administrator or an
information security official.
Many offices in the organization may be responsible for selecting, developing, updating and
finalizing policies and all supporting documents, including the privacy office, legal, HR and
information security. This distribution of responsibility helps ensure a clear and accurate policy
that meets the needs of the organization and any regulatory or external standards.
The following section presents several high-level examples of supporting documents that
affect data protection and the privacy vision or mission of the organization, including materials
on acceptable use, information security, procurement, and data retention and destruction. These
represent only a small subset of topics that can be considered as privacy-supporting policies.

5.6.1 Acceptable Use Policies: Guest Wireless Access


An acceptable use policy (AUP) stipulates rules and constraints for people within and outside
the organization who access the organization’s network or internet connection. It outlines
acceptable and unacceptable use of the network or internet connections to which the user agrees
either in written or electronic form. Violation typically leads to loss of use and/or punitive
action either by the organization or by law enforcement if necessary. People affected include
employees, students, guests, contractors and vendors.
The information security function usually plays a major role in developing acceptable use
policies. This type of policy considers the following:

• Others’ privacy

• Legal protections (e.g., copyright)

• Integrity of computer systems (e.g., anti-hacking rules)

• Ethics

• Laws and regulations

• Others’ network access

• Routing patterns

• Unsolicited advertising and intrusive communications

• User responsibilities for damages

• Security and proprietary information

• Virus, malware protection and malicious programs

• Safeguards (e.g., scanning, port scanning, monitoring) against security breaches
or disruptions of network communication

5.6.2 Information Security Policies: Access and Data Classification


Internal information security policies serve several purposes:

• To protect against unauthorized access to data and information systems

• To provide stakeholders with information efficiently, while simultaneously maintaining
confidentiality, integrity and availability (CIA)

• To promote compliance with laws, regulations, standards and other organizational
policies

• To promote data quality

An information security policy establishes what is done to protect the data and information
stored on organization systems, including the following:

• Risk assessments

• User and password policies

• Administrative responsibilities

• Email policies

• Internet policies

• Intrusion detection

• Antivirus and malware policies

• Firewall rules and use

• Wireless management

5.7 Procurement: Engaging Vendors

Vendors should be held to the same privacy standards as the organization. When engaging
vendors, an organization may:

• Identify vendors and their legal obligations

• Evaluate risk, policies and server locations

• Develop a thorough contract

• Monitor vendors’ practices and performance

• Use a vendor policy

An organization must exercise similar due diligence for mergers, acquisitions and
divestitures. More information on these can be found in Chapter 4.

5.7.1 Create a Vendor Policy


Vendor policies should guide an organization in working with third parties from procurement
through termination. Policy components may include requirements for vendors, logistics (e.g.,
where work should be conducted), and onboarding and employee training. A vendor policy
may require identification and inventories of all vendors and entry points, such as free survey
tools, personal information the vendor can access, and legal obligations on the organization
and vendor. The vendor policy may stipulate that the procuring organization evaluate its
processes for risk assessment, its risk profile, and categories of vendors based on risk. This
may include evaluating the vendor’s internal policies; affiliations and memberships with other
organizations; mandatory and nonmandatory certifications; location of data servers; and data
storage, use, and transport.

5.7.2 Develop a Vendor Contract


It’s important to work with the organization’s legal and HR departments on any contract,
including the following:

• Standard contract language

• Requirement to inform the organization when any privacy/security policies change

• Prohibition against making policy changes that weaken privacy/security protections

• Data migration/deletion upon termination

• Vendor security incident response procedures

• Vendor liability

• Right to audit

5.7.3 Monitor Vendors
After the basic vendor policy and contract are complete, the procuring organization should
consider the vendor in its monitoring plan to ensure crossover with its audit and compliance
functions. This may include recurring on-site visits, attestations, and/or periodic reassessments.

5.7.4 Implement Procurement/Information Security Policies: Cloud Computing Acceptable Use
Cloud computing can be implemented in a wide variety of architectures, models,
technologies and software design approaches. The privacy challenges of cloud computing
present difficult decisions when choosing to store data in a cloud. Public, private and hybrid
clouds offer distinct advantages and disadvantages.
The privacy aspects of any potential cloud choices should be considered before engaging
vendors or external parties. Working from requirements, an organization should determine the
purpose and fit of a cloud solution, obtain advice from experts, then contact external cloud
vendors. Furthermore, understanding the policies, procedures and technical controls used by a
cloud provider is a prerequisite to assessing the security and privacy risks involved.
With the increased use of cloud computing and other offsite storage, vendors that provide
cloud computing services may pose distinct privacy challenges, especially because of
compliance requirements and security risks. An organization should ensure its acceptable use
policy for cloud computing requires the privacy and security of its data as well as compliance
with policies, laws, regulations and standards. Risks of processing data using cloud-based
applications and tools should be mitigated. The policy should stipulate approval of all cloud
computing agreements by appropriate leadership, such as the chief information officer (CIO).
Both information security and privacy teams should agree on the policy and vendor of choice
before final decisions are made. This ensures alignment of the stakeholders and the policy that
will be used to protect the organization. It may also outline specific cloud services that may be
used, restrictions for processing sensitive information in the cloud, restrictions for personal use,
and data classification and rules for handling.

5.7.5 Implement HR Policies


HR handles diverse employee personal information and typically will have policies to guide
processing. HR policies often provide rules regarding who may access employee data and
under what circumstances. Employee data includes any data the employee has created in the
process of performing normal business efforts for the organization, including emails, phone
calls, voice mail, internet browsing, and use of systems.
When creating or updating any HR policy that concerns privacy, HR should consult with
legal and information security to ensure all laws, regulations, and other possible organization
policies are met. Especially regarding updates to laws and regulations such as the EU General
Data Protection Regulation (GDPR), the privacy professional should consult with all
stakeholders prior to creating or updating any HR policy.
HR privacy concerns can be addressed through several types of HR policies. These policies
may address the following privacy concerns:

• Employee communications, including employee browser histories, contact lists, phone recordings and geolocations

• Employee hiring and review, including performance evaluations, background checks, and the handling of resumes

• Employee financial information, such as bank account information, benefits information and salary

5.7.5.1 Types of HR Policies


Typical HR privacy policies to consider include the following:

• Handling of applicant information

• Employee background checks

• Access to employee data

• Termination of access

• Bring your own device (BYOD)

• Social media

• Employee/workplace monitoring

• Employee health programs

5.8 Data Retention and Destruction Policies

Data retention and destruction policies should support the idea that personal information should
be retained only for as long as necessary to perform its stated purpose. Data destruction triggers
and methods should be documented and followed consistently by all employees. These should
align with laws, regulations and standards, such as time limits for which records must be saved.
Ownership of a data retention/destruction policy may vary and intersect with privacy, legal, IT,
operations, finance and the business function.
Actions an organization can take to develop a data retention policy include:

• Determine what data is currently being retained, how and where

• Work with legal to determine applicable legal data retention requirements

• Brainstorm scenarios that would require data retention

• Estimate business impacts of retaining versus destroying the data

• Work with IT to develop and implement a policy

Data management requires answers to questions such as why we have the data, why we are
keeping it, and how long we need to keep it. The process begins with identifying all the data
contained in the organization and determining how it is used. Next, the organization should
match the data to the legal obligations around retention. Data retention and data deletion should
be executed with caution. Keeping the data for as long as the organization has a legitimate
business purpose is a common best practice. To comply with legal requirements and
organization governance standards, the organization should review all associated policies,
standards, guidelines and handbooks. This includes every relevant country’s required minimum
retention time. Legal requirements could change if the company is involved in litigation and
discovery actions. Thus, the policy and all supporting standards and technical controls should
be flexible.
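
To make these destruction triggers concrete, the following is a minimal sketch in Python, assuming a hypothetical retention schedule and a litigation-hold flag; the categories, retention periods and function names are invented for illustration and are not a substitute for legal review of each jurisdiction's requirements.

```python
from datetime import date, timedelta

# Hypothetical retention schedule: record category -> minimum retention period in days.
# Real periods must come from legal counsel for each applicable jurisdiction.
RETENTION_SCHEDULE = {
    "payroll_records": 7 * 365,
    "job_applications": 2 * 365,
    "marketing_contacts": 1 * 365,
}

def disposition(category, created, legal_hold, today=None):
    """Return 'retain' or 'eligible_for_destruction' for a single record.

    A litigation or discovery hold always overrides the schedule, mirroring the
    point above that legal requirements change when litigation is anticipated.
    """
    today = today or date.today()
    if legal_hold:
        return "retain"  # holds suspend normal destruction triggers
    minimum_days = RETENTION_SCHEDULE.get(category)
    if minimum_days is None:
        return "retain"  # unknown categories default to retention pending review
    if today - created >= timedelta(days=minimum_days):
        return "eligible_for_destruction"
    return "retain"

# Example: a marketing contact collected in early 2023, with no hold in place.
print(disposition("marketing_contacts", date(2023, 1, 15), legal_hold=False))
```

The value of documenting the schedule in this way is only that destruction decisions become repeatable and auditable; the policy itself remains the authoritative source.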

5.8.1 Implementing Policies


Privacy-related policies will not be effective if individuals do not care about or follow them.
An organization should seek ways to enable employees to integrate the policies into their daily
tasks. The privacy team can achieve this objective by aligning policies with existing business
procedures, training employees and raising awareness.

5.8.2 Aligning with Procedures


Multinational and multisector organizations have additional challenges to ensure policies are
consistent and uniform across all locations while satisfying local laws, regulations and industry
guidance. Different business functions may have diverse policy needs. The organization should
document and review policies of the following functions and others to ensure alignment:

• HR

• Business development (when assessing proposed projects)

• Project management

• Procurement and contract management

• Risk management

• Incident management

• Performance management

Inconsistencies between policies should be explained fully to ensure there are no gaps or
misunderstandings.

5.9 Implementing and Closing the Loop

Once policies have been created, approved and put in place, they must be communicated to the
organization. Raising awareness and properly training employees and data users is key to
knowledge transfer and retention.
Awareness means to be vigilant or watchful. From a privacy perspective, achieving
awareness requires communicating the various components of an organization’s privacy
program, thus creating a vigilant or watchful attitude toward the protection of privacy data.
Everyone who handles privacy information must be alert to the constant need to protect data.
Yet no one is immune to the daily pressures and deadlines that can distract attention from the
big picture. This reality underscores the need for organizations to put reminders in front of their
workforces to keep attention focused on the proper handling and safeguarding of personal
information. These reminders may take the form of awareness tools such as Data Privacy Day on
January 28, infographics, tip sheets, comics, posters, postcards, stickers, blogs, wikis,
simulations, email campaigns, announcements on the intranet, web sessions, drop-in sessions,
and lunch-and-learns. Raising workforce awareness on a consistent basis should be one of the
top activities considered by any privacy management team.
Formal training practices are also part of closing the communication loop. Training may be
delivered through dedicated classroom, instructor-led courses or online platforms. Employees
and data users may be required to train regularly. This is where the privacy vision for the
organization and policy enforcement is communicated clearly and consistently. Training and
awareness reinforce policies by educating personnel with constant and consistent reminders
about why they are important, who they affect, and how they are accomplished. Training is
covered in greater detail in Chapter 7.
Finally, policies apply to everyone in the organization. One loophole or one break in the
organization’s protection can lead to a hack on the entire organization. Leadership,
management, the privacy office, and the information security office do not have a waiver to
break any policy they believe does not apply to them. Individual choices that breach policy can
place the entire workforce at risk for significant legal consequences and loss of credibility. If a
policy is disconnected from reality, it needs to be corrected—and all risk factors mitigated—
as soon as possible to protect the data owners’ privacy rights and to protect the organization
from crippling loss and impacts.

5.10 Summary

A privacy policy should be considered a living document that adapts over time based on the
needs of the organization, the evolving business environment, regulatory updates, changing
industry standards and many other factors. This could be considered the lifecycle of the policy
that continues to be reviewed and updated on a regular basis. Part of this lifecycle should be
the communication of the policy through effective training and awareness practices that should
also be recurring and mandatory for every employee, vendor, contractor or other data user
within the organization.
The privacy policy should contain at a minimum the purpose, scope, responsibilities and
compliance reasons to allow the reader a full understanding of how privacy will be managed.
In some cases, the privacy policy may also address risks, other organizational responsibilities,
data subject rights, data use rules and other privacy-related information and practices. The
composition of the policy should align with the needs of the organization in meeting national,
state and local laws or other standards for data privacy protection.
Beyond the privacy policy are other supporting policies that provide practical guidance on
potential issues or specific intent. These include information security policies that also protect
data, but for a different purpose and with potentially different tools, people, and processes to
support common goals between privacy and security.
It is important to understand that information security is a complex topic that will span the
organization and overlap privacy management. By becoming familiar with information
security practices and stakeholders, the privacy professional will open channels of
communication with those key players throughout the organization and during any incident
response.
Managing privacy within an organization requires the contribution and participation of many
members of that organization. Because privacy should continue to develop and mature over
time within an organization, functional groups must understand just how they contribute and
support the overall privacy program, as well as the privacy principles themselves. Importantly,
individual groups must have a fundamental understanding of data privacy because, in addition
to supporting the vision and plan of the privacy officer and the privacy organization, these
groups may need to support independent initiatives and projects from other stakeholders.
The privacy professional should have awareness of other policies and standards that support
privacy or offer other data protections. An example is the data retention/records management
strategies that reinforce the basic concept that data should only be retained for the length of
time the business needs to use the data. Records management and data retention should meet
legal and business needs for privacy, security and data archiving.
Creating privacy policies does not mean employees or other internal data users will know
and follow them or understand their purpose and intent. The same is true for any organization
policy, standard, guideline or handbook. The privacy policy, like many other business-related
policies, has a specific intent to protect data privacy during and after business use. To meet the
privacy intent, users of the data will need to be educated and reminded on a regular basis of the
organization’s vision and mission. Because data users focus on their primary objectives and jobs rather than on privacy, education and reminders about what privacy is, and about the how and why of privacy management, are important for the continued success of the organization.

Endnotes

1 Bob Siegal, “Kick-Starting a Privacy Program,” The Privacy Advisor, IAPP, February 2013,
https://iapp.org/news/a/2013-01-22-kick-starting-a-privacy-program/ (accessed February 2019).
2 “An Introduction to Computer Security,” NIST Special Publication 800-12 Revision 1, National Institute of
Standards and Technology, U.S. Department of Commerce, https://doi.org/10.6028/NIST.SP.800-12r1
(accessed November 2018).
3 Id.
4 Id.
5 Ronald Breaux and Sam Jo, “Designing and Implementing an Effective Privacy and Security Plan,” The Privacy
Advisor, IAPP, March 2014, IAPP, https://iapp.org/news/a/designing-and-implementing-an-effective-privacy-
and-security-plan/ (accessed November 2018).

TRAINING AND AWARENESS
By: Chris Pahl, CIPP/C, CIPP/E, CIPP/G, CIPP/US, CIPM, CIPT, FIP

Employees have many issues to consider as they perform their daily duties. While privacy
offices believe appropriate collection and use of personal information is the most important
priority for employees, to the employee, the focus is on task completion. This disconnect
between expected and actual behavior can frustrate the privacy office, but closing the gap
requires ongoing and innovative efforts to keep privacy integrated with everyday
responsibilities. Management needs to approve funding to support privacy initiatives such as
training and awareness and hold employees accountable for following privacy policies and
procedures. Building a privacy strategy may mean changing the mindset and perspective of an
entire organization through training and awareness. Effectively protecting personal information
within an organization means all members of the organization must do their share.
Ponemon’s 2018 Cost of a Data Breach Study estimates the average cost of a data breach at USD $148 per record, a 4.8 percent increase over 2017, with an average total cost of $3.86 million per incident. The United
States, Canada and Germany have the highest per capita costs at $233, $202 and $188
respectively, with Turkey, India and Brazil at $105, $68 and $67. The study finds the likelihood
of a recurring material breach in the next two years is 27.9 percent. These figures do not include
the impact on productivity, resource reassignment, delays in executing the strategic plan, and
civil or regulatory lawsuits. However, deployment of an incident response team reduces costs
by $14 per record.1
Verizon’s 2018 Data Breach Investigations Report cites 53,308 security incidents and 2,216
data breaches in 65 countries, with criminals continuing to exploit the same weakness—the
human. In 76 percent of the cases, cybercriminals were financially motivated, and 28 percent
of the attacks were committed by insiders. Verizon’s report states that 17 percent are due to
employee errors such as “failing to shred confidential information, sending an email to the
wrong person or misconfiguring web servers.”2 Most shocking is that 4 percent of employees
will click on any given phishing campaign.
The frequency with which large-scale issues are triggered by clicking on suspicious links
should come as no surprise to privacy professionals. In early 2014, a Yahoo! employee
allegedly opened a “spear-phishing email that created a massive vulnerability for the company.”3
Half a billion Yahoo! accounts were exposed to Russian hackers, who allegedly forged cookies
to directly access more than 6,500 Yahoo! accounts. The hackers sought access to the accounts
of Russian and U.S. government officials as well as high-ranking international executives. For
two years, Yahoo! was pillaged of user data and its own technology. A simple click has turned
into an ongoing nightmare for the company. It is estimated that three billion Yahoo! accounts
have been affected, and courts are allowing data breach victims and Yahoo! investors to sue
the company.4
Failures to protect personal information can become expensive. On September 26, 2018, the
New York attorney general reached the largest data breach settlement to date with Uber
Technologies. Uber agreed to pay $148 million to settle a 2016 data breach in which hackers
stole data on 57 million Uber customers, including 25.6 million riders and drivers in the United
States. Although the company was aware of the breach, it chose to conceal it from regulators
and paid the hackers $100,000 to delete the stolen data and keep the incident quiet. This
decision later resulted in the firing of Uber’s chief security officer.5
Corporate culture has a profound impact on the effectiveness of a compliance program;
however, making employees aware of their obligations to observe data minimization principles
and safeguard personal information begins with training and awareness.

7.1 Education and Awareness

Education and awareness reinforce the organization’s privacy policy and practices. Education
allows for communication and social acceptance of the privacy policy and supporting
processes. It is critical to the successful delivery of the privacy message and sets the stage for
reception and acceptance throughout the organization. Education efforts may be recorded in
employee records and include formal and informal methods such as:

• Classroom training

• Online learning through streaming, videos and websites

• Poster campaigns

• Booklets

• Workshops

The education strategy and budget typically determine the best or approved methods for
education within the organization. The privacy professional should first understand these areas
to ensure they align with and meet corporate standards before offering any solutions.

Tip: Have a regular coffee and catch-up on one privacy topic via a 15-minute web
conference or a face-to-face meeting.

Training is a key control, and under some regulations—such as the U.S. Health Insurance
Portability and Accountability Act (HIPAA) of 1996—it is required. However, training must
go beyond checking a box. It must address applicable laws and policies, identify potential
violations, address privacy complaints and misconduct, and include proper reporting
procedures and consequences for violating privacy laws and policies. Where appropriate, the
training delivered internally must extend to business partners and vendors. Companies should
require trainees to acknowledge in writing that they have received training and agree to abide
by company policies and applicable law, which should also be overseen by the privacy team.6
This step can easily be accomplished by concluding a course with a simple signoff, requiring
the employees to check multiple boxes regarding three to five principles they must follow at
all times.
If people are not aware of what they are processing, they are also unaware of the
consequences and liabilities that may result from mishandling data. The privacy office cannot
assume understanding without ensuring there are sufficient learning opportunities. The words
training and awareness are used interchangeably, but they serve different functions. Training
communicates the organization’s privacy message, policies and processes, including those for
data usage and retention, access control and incident reporting. Training must be engaging—
for example, using gamification or creating friendly competitive contests—to motivate
individuals to protect information. In some cases, training should be personal, teaching
employees how to implement appropriate privacy controls at their homes, which will make
them more aware in the office. Its impact must be measured using attendance and other metrics.
Metrics give leaders a powerful picture of what is occurring in the company.
An organization’s privacy awareness program reinforces the privacy message through
reminders; continued advertisement; and mechanisms such as quizzes, posters, flyers, and
lobby video screens. Reinforcement of this message ensures greater privacy awareness, which
can effectively reduce the risk of privacy data breach or accidental disclosure. If implemented
effectively, training and awareness programs can communicate beyond what is written in
privacy policies and procedures to shape expected behaviors and best practices. Where
possible, integration with other training and awareness programs reinforces the messaging.
Some mistakes typically associated with education and awareness include:

• Equating education with awareness

• Using only one communication channel

• Lacking effectiveness measurements

• Eliminating either education or awareness due to budget concerns

Awareness-raising is one of the key aspects of the privacy framework and should be
prioritized for all organizations. It can come in different forms, none of which require
huge budgets. If people are not aware of what they are processing, they are also
unaware of the consequences and liabilities that result from not knowing.

7.2 Leveraging Privacy Incidents

Most privacy compliance programs have mechanisms for gathering information regarding
privacy incidents. While these incidents result in investigative work by the privacy office, they
also provide training opportunities. Where possible, the privacy office should provide targeted
training to the affected department. When privacy incidents occur, it is important to consider
the following:

• Where possible, leverage lessons learned from events that make the
headlines. Use the events as learning opportunities, including discussions of how
the incidents described suggest ways to improve your company’s processes.

• Doing business means mistakes will happen. Use mistakes as learning opportunities to improve processes rather than as cause for complaint. Mistakes are best handled when they are approached constructively.

• Use stories. It is human nature to want to hear other people’s stories. Share a
privacy incident with others, or ask a victim of identity theft to speak about their
experience.

• Hold “lunch and learn” sessions. Lunch and learn is a perfect way to educate
employees during their lunch hour. Allow them to bring their lunch and listen to
an expert speaker on a topic of personal interest, such as how to protect families
from identity theft. These sessions could be held on one of the dedicated privacy
and cybersecurity days sponsored by the cybersecurity industry. For example,
ask a law enforcement expert to speak during lunch on worldwide Data Privacy
Day, January 28, about data breaches or identity theft and make free resources,
such as information available through the Federal Trade Commission (FTC) or
StaySafe Online, available to attendees. At the end of the lunch, connect personal
privacy with the responsibilities each employee has to protect the organization’s
data.

• Make it fun. Admit it: Privacy training is not fun, and those around you have no
idea why you are passionate about your job. However, take that passion and share
it through games, stickers, competitions and giveaways. The IAPP can assist by
sending you a six-foot foam superhero cutout of Prudence the Privacy Pro, with
her sidekick, Opt-Out, for a nominal charge.7

• Develop slogans that can be used in presentations to capture the essence of the message. For example, the word security is frequently used. However, privacy professionals know the human element is the concern. Consider playing off the word like this: “there can be all the security in the world, but at the end SECU-R-ITY.” The letters S, E, C and Y fade away, and employees are told U-R-IT.

7.3 Communication

Communication is one of the most effective tools an organization has for strengthening and
sustaining the operational lifecycle of its privacy program. Privacy information is dynamic and
constantly changing, so for privacy policies and procedures to remain effective, organizations
must continually communicate expectations and policy requirements to their representatives—
including contractors and vendors—through training and awareness campaigns.
Improvements to the privacy program will also depend on the organization providing ongoing
communication, guidance and awareness to its representatives regarding proper handling and
safeguarding of all privacy data. All available means should be used to take the message to
everyone who handles personal information on behalf of the organization. A good question to
ask regularly is: How effectively are we communicating the expectations of our privacy
program to the workforce—everyone who is using the data? Measure understanding through
metrics or other objective means. This requires use of multiple metrics to assess an overall
trend, which will demonstrate to the privacy office where additional, or refined, training is
required.
Each organization needs a communications strategy to create awareness of its privacy
program and a specific, targeted training program for all employees. A goal of this
communications strategy is to educate and develop privacy program advocates for each
affected business unit within the organization. One of the best ways to accomplish this goal is
by employing a variety of methods to communicate the message.
The privacy office is responsible for updating employees’ knowledge when changes occur.
However, employees cannot be expected to be trained on every aspect of a privacy regulation—
just on the guiding principles of compliance and expected behavioral outcomes. Additionally,
training to the details of a regulation will require more frequent retraining when changes are
made. Taking a big-picture approach for protecting personal data is easier to manage than
addressing the details of what constitutes personally identifiable information (PII).
Creating a strategic activities plan for the year is a good way to provide for regular updates.
Some groups specifically build into their plans a calendar of workforce communications to
ensure ongoing reinforcement throughout the year. For example, the plan might specify that
“every quarter we will produce a targeted email campaign that will instruct employees on how
to do x, y, z. We will conduct knowledge tests (contests) to assess learning.”

7.4 Creating Awareness of the Organization’s Privacy Program

As discussed in Chapter 5, awareness means to be vigilant or watchful. From a privacy perspective, achieving awareness requires communicating the various components of an
organization’s privacy program, thus creating a vigilant or watchful attitude toward the
protection of personal data. The need for the privacy office to constantly put reminders in front
of their workforce requires innovative thinking to identify different reminder techniques.
Trying new approaches should not be impeded by the fear of failure—sometimes the best
planned reminders fail, but the failures spark new, more effective ideas.

7.4.1 Internally
How does an organization build an awareness program internally? A good place to start is
through interdepartmental cooperation working toward the shared goal of privacy protection.
For example, the marketing department could work with information security and tie in its
campaign with the awareness program. You may also look at including your organization’s
ethics and integrity department, as well as human resources (HR), in planning effective ways
for departments to share their awareness programs and experiences. Discuss how different
groups can work together to reinforce the privacy message with the workforce, creating an even
greater awareness of your privacy program.
You could also take an interdepartmental approach to assessing the various privacy
awareness programs throughout the organization. This can reveal both strengths and
weaknesses in individual programs, contributing to an overall strengthening of all internal
awareness programs.
Conferences and seminars can be rich sources of information and expert suggestions on
effective ways to build a privacy program and address privacy governance. An individual may
learn about various approaches from privacy professionals by attending presentations or panel
discussions on these topics. And often, a person learns about governance structures and
approaches to privacy through presentations on other topics.
Managing security incidents, creating a sustainable training and awareness program, and
designing and implementing programs or presentations on privacy challenges can educate the
workforce on privacy topics and provide insights into how an organization manages these
issues and assigns accountability.
Information can also be obtained through informal exchanges of ideas. Most privacy
professionals are engaged in some phase of launching a privacy program. The challenge is that
technologies are always changing, new laws are always being adopted, and processes can
always be improved.

7.4.2 Externally
Creating external awareness of a privacy program requires different resources and methods
than building internal awareness. External awareness is more directed toward building
confidence through brand marketing. This occurs, for example, when an organization makes
statements such as, “We respect your personal information, and we take steps to make sure that
your information is secure with us.” Increasing external awareness will also require obtaining
partner agreements and, in certain cases, providing training or obtaining attestations of
compliance. The challenge is to meet the reasonable expectations of consumers and regulators
and provide proof of compliance if challenged, otherwise state agencies or the FTC may file
civil penalties against your company for misleading its consumers.
External awareness is aimed at building consumer confidence in a brand by creating
recognition of a corporation’s commitment to security or to fulfilling a legal requirement.
Organizations must have integrity when it comes to handling personal information if customers
are to remain loyal.
An example of creating external awareness is found in the growing cloud computing industry.
Many corporations are now exclusively, or at least heavily, involved in providing
infrastructure, platform and software services for individuals and businesses. The marketing of
cloud services is built on the consumers’ perception of the ability of the host organization to
protect their personal information.
Much of this information is personal information that other organizations are transferring to
an external site for storage. The most successful cloud-hosting organizations are those that
inspire confidence in their ability to provide security for the personal data consumers entrust
to them.

7.5 Awareness: Operational Actions

The privacy team, along with all relevant departments, can take the following operational
actions to ensure ongoing awareness:

• Develop and use internal and external communication plans to ingrain operational accountability

• Communicate information about the organization’s privacy program

• Ensure policy flexibility for incorporating changes to compliance requirements from laws, regulations and standards

• Identify, catalog and maintain all documentation requirements, updating them as privacy requirements change

7.6 Identifying Audiences for Training

Staff, managers, contractors and other third parties may need privacy training. The key is to
identify who has access to personal information and provide targeted training to those people.
Targeted training implies there may be a variety of training programs, depending on the
department, the type of information that is being handled, how that information is processed,
and who handles it.

7.7 Training and Awareness Strategies

Training and awareness must have the intention of changing bad behaviors and reinforcing
good ones that are integral to the success of the privacy program. Many organizations have a
learning and development group managing activities related to employee training. This
function enables policies and procedures to be translated into teachable content and can help
contextualize privacy principles into tangible operations and processes. In smaller companies,
these responsibilities may fall on the privacy function. Whatever the size of the organization,
the privacy team will always need to approve the training materials that have been produced.

Steps for a successful communication and awareness campaign include:

• Assessing the organization’s education and awareness initiatives

• Sustaining communication via awareness and targeted employee, management and contractor training

• Partnering with HR or training functions, or an organizational change management expert

• Using badges and slogans

• Repeating training over a predetermined period (e.g., annually, biannually)

• Using microlearning or blended learning

• Inserting privacy messaging into other department trainings

• Going to road shows and staff meetings

• Tracking participation and comprehension

The communications group can assist by publishing periodic intranet content, communicating via email, and providing posters and other collateral that reinforce good privacy practices.

7.8 Training and Awareness Methods

Companies must think of innovative ways to communicate training and awareness opportunities to their employees. Methods may differ based on the company’s culture and budget. Some methods are low-cost. It is not uncommon for companies to use different ways for delivering messaging. Examples include:

• Formal education utilizes a classroom environment to deliver official training that may be recorded and documented. Training may also be delivered just-in-time; for instance, an organization might provide a brief training session when an individual has been authorized to perform a new task. Instructors can use out-of-the-box modules (e.g., IAPP Privacy Core®).

o Different training methods include instructor/classroom, online, hybrid and gamification

o Awareness includes email, posters, blogs, internal social media, games and expos

• E-learning includes computer-based training (CBT), internet-based training
(IBT) or web-based training (WBT). E-learning can be self-paced or live (with
the active support of an instructor). Simulations can be used.

• Road shows and department team meetings offer opportunities to provide


training pamphlets or other material for individuals to pick up at booths or to
receive at staff meetings.

• Newsletters, emails and posters have been used for many years. Delivery
methods include email, websites, print materials, and physical displays among
other options. Stickers are also a unique way to deliver messages.

• Handouts containing frequently asked questions and tip sheets are helpful for
answering common questions or dispelling myths.

• Slogans and comics can be used to summarize important aspects of the program
and promoted as giveaway items.

• Video teleconferencing delivers content via videos that can be recorded live and
replayed.

• Web pages can be used to communicate data, knowledge bases, and frequently
asked questions.

• Voicemail broadcast provides an automated means to deliver a broadcast message to all employees without having to contact each one separately.

Communication should be consistent at all levels of the organization and among all
stakeholders to ensure they understand the framework and how it affects and improves the
organization.

7.9 Using Metrics

Privacy programs are generally not seen as revenue-generating; however, they can reduce risk.
For privacy programs to show how they support the company’s mission and prove to regulators
they are actively addressing compliance risks, they must keep records regarding training and
awareness programs, including any remediation taken after an event. Metrics must go beyond
the numbers enrolled in a training or communication event and tell a story regarding process
improvement. They should be linked to other program metrics, such as reduction in the number
of privacy events. It may take time to measure the impact of training and awareness activities.

Sample training and awareness metrics include:

• Number of training or awareness opportunities by topic

• Number of individuals who enrolled or received awareness communication

• Training method (e.g., live, online, poster, road shows)

• Percent of training completed

• Results of quizzes or knowledge tests

• Changes to the number of privacy incident reports or requests for consultation or additional training

Table 7-1: Metric Template Example: Awareness and Training Measure8

Measure ID: Security Training Measure 1 (or a unique identifier to be filled out by the organization)

Goal:
• Strategic goal: ensure a high-quality workforce supported by modern and secure infrastructure and operational capabilities
• Privacy goal: ensure that the organization’s personnel are adequately trained to carry out their assigned information-security-related duties and responsibilities

Measure: Percentage of information system security personnel who have received security training (see NIST SP 800-53 Controls: AT-3: Security Training for definitions)

Measure Type: Implementation

Formula: (Number of personnel who have completed security training within the past year / total number of information system security personnel) * 100

Target: High percentage defined by the organization

Implementation Evidence:
• Training records maintained
• Percentage of those with significant privacy responsibilities who have received the required training

Frequency:
• Collection frequency: organization-defined (e.g., quarterly)
• Reporting frequency: organization-defined (e.g., annually, monthly, weekly)

Responsible Parties:
• Information owner: organization-defined (e.g., training manager)
• Information collector: organization-defined (e.g., information system security officer (ISSO), training manager, privacy officer)
• Information customer: chief information officer (CIO), ISSO, chief information security officer (CISO)

Data Source: Training and awareness-tracking records

Reporting Format: Pie chart illustrating the percentage of personnel who have received training versus those who have not received training; if performance is below target, pie chart illustrating causes of performance falling short of targets

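As a minimal illustration of the formula in Table 7-1, the following Python sketch computes the percentage from hypothetical training records and compares it with an organization-defined target; the names, dates and target value are invented for the example.

```python
from datetime import date, timedelta

TODAY = date(2025, 3, 1)      # hypothetical reporting date
TARGET_PERCENT = 95           # organization-defined target

# Hypothetical training records: person -> date of last completed security training.
last_training = {
    "alice": date(2024, 9, 10),
    "bob": date(2023, 1, 5),   # more than a year ago, so not counted
    "carol": date(2024, 11, 30),
    "dev": None,               # never trained
}

# (Personnel trained within the past year / total security personnel) * 100
trained = sum(
    1 for completed in last_training.values()
    if completed is not None and TODAY - completed <= timedelta(days=365)
)
percent = trained / len(last_training) * 100

print(f"Security training measure: {percent:.1f}% (target {TARGET_PERCENT}%)")
if percent < TARGET_PERCENT:
    print("Below target: investigate causes and plan refresher training.")
```

A result below target feeds directly into the reporting format described in the table, where causes of the shortfall are then broken out for leadership.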
Training programs dealing with privacy policies should be based on clear policies and
standards and have ongoing mechanisms and processes to educate and guide employees in
implementation. Everyone who handles personal information needs to be trained in privacy
policies and how to deploy them within their area to ensure compliance with all policy
requirements. This applies to employees, management, contractors, and other entities with
which the organization might share personal information.

7.10 Summary

As companies continue to closely monitor training seat time, the privacy compliance program
should seek out innovative ways to ensure its message continues to be heard. This means the
program must build alliances with other similar organizations, such as cybersecurity and
physical security, to ensure a consistent message is carried through all applicable training.
Where possible, the topic of privacy should become a core topic within the company, ensuring
its importance is emphasized in the code of conduct. Awareness is an ongoing journey, during
which the privacy program can leverage company technology to build a privacy coalition and
facilitate friendly competitions but, more importantly, make protecting information personal
through practical application. An effective training and awareness program makes a complex
topic comprehensible and enables people to integrate key aspects of it effortlessly into their
daily routines.

Endnotes

1 Ponemon Institute, 2018 Cost of a Data Breach Study: Global Overview, IBM, July 2018,
https://www.ibm.com/security/data-breach (accessed November 2018).
2 2018 Data Breach Investigations Report, Verizon,
https://enterprise.verizon.com/resources/reports/DBIR_2018_Report_execsummary.pdf (accessed February
2019).
3 James Tennent, “Users Affected by Yahoo’s Massive Data Breach Will be Able to Sue, A Judge Has Ruled,”
Newsweek, March 12, 2018, https://www.newsweek.com/users-affected-yahoos-massive-data-breach-will-be-
able-sue-judge-has-ruled-841799 (accessed November 2018).
4 Id.
5 Austin Carr, “Uber to Pay $148 Million in Settlement Over 2016 Data Breach,” Bloomberg Law News,
September 26, 2018, https://www.bloomberg.com/news/articles/2018-09-26/uber-to-pay-148-million-in-
settlement-over-2016-data-breach (accessed November 2018).
6 “Developing a Privacy Compliance Program,” Practical Law, n.d., https://content.next.westlaw.com/5-617-5067?transitionType=Default&contextData=(sc.Default)&__lrTS=20180705135935578&firstPage=true&bhcp=1 (accessed November 2018).
7 “Prudence the Privacy Pro,” IAPP, https://iapp.org/resources/article/prudence-the-privacy-pro/ (accessed
November 2018).
8 “National Institute of Standards and Technology, Special Publication 800-55, revision 1, Performance
Measurement Guide for Information Security,” 32–33, http://csrc.nist.gov/publications/nistpubs/800-55-
Rev1/SP800-55-rev1.pdf (accessed November 2018).

DATA BREACH INCIDENT PLAN
By: Liisa Thomas

For corporations, having a clear plan to respond to a possible data breach is often a core and
critical issue. There are a wide range of laws that apply when a company is responding to a
data breach. In the United States, there are laws in every state as well as industry-specific
federal laws. In Europe, the General Data Protection Regulation (GDPR) addresses how a
company responds to a breach, and other countries have laws as well. After addressing
notification requirements, corporations often find themselves exposed to post-notice scrutiny.
This can take the form of regulatory inquiries or lawsuits, including lawsuits from class action
lawyers.
How, then, can corporations best handle these choppy waters? Every corporation needs to be
prepared to respond to a potential data breach. No matter the type of inquiry or claim that follows, they need to be
prepared to properly receive, assess and respond to it.

Breach notification laws in the United States are numerous, and lawsuits often arise
post-notification.

9.1 Incident Planning

9.1.1 What’s at Risk


In the United States, there are laws that require companies to provide notification to affected
individuals and/or government authorities in the event of a data breach. When notification must
be given depends on the jurisdiction and the law in question. How notification must be given,
and the contents of the notification, similarly may vary. Failure to give notice properly can give
rise to liability and exposure.
Even a company that notifies properly faces risks. Those risks may be in the form of public
relations (PR) scrutiny and bad press. Or they may be in the form of follow-on lawsuits or
regulatory action accusing the company of having failed to take proper actions to protect
information.

9.1.2 Legal Exposure and Liability


Companies that suffer a data breach may face litigation exposure, reputational liability and potential regulatory scrutiny. Reputational liability is difficult to anticipate.
Similarly, it is not always clear when a fact pattern will result in a lawsuit or regulatory
investigation. Should a company face such scrutiny, factors that will be considered include:

• A purported obligation to prevent unauthorized access to or use of the data

• Whether the company satisfied an applicable industry standard of care

• Whether there were damages or injury, and if the organization’s conduct (or lack
thereof) was the proximate cause of the damages

9.1.3 Costs When Addressing an Incident


Another significant exposure to an organization is the underlying cost of a data incident.
According to the Ponemon Institute’s 2018 Cost of a Data Breach Study, the average cost of
an incident is $3.86 million and the cost per individual record lost or compromised is $148.1
Translating statistics to monetary values can help senior executives see the value of planning
for a data incident or breach.
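
For instance, a rough translation using the per-record figure quoted above, with a hypothetical record count and the $14-per-record response-team saving cited from the same Ponemon study in Chapter 7, might be sketched as follows.

```python
# Rough cost translation using the Ponemon 2018 figures quoted in this chapter and Chapter 7.
# The record count and the resulting estimates are illustrative only; actual exposure varies
# widely by industry, jurisdiction and incident type.
COST_PER_RECORD = 148          # USD, average per record lost or compromised
RESPONSE_TEAM_SAVINGS = 14     # USD per record when an incident response team is deployed

records_exposed = 50_000       # hypothetical incident size

baseline_estimate = records_exposed * COST_PER_RECORD
with_response_team = records_exposed * (COST_PER_RECORD - RESPONSE_TEAM_SAVINGS)

print(f"Estimated cost without a response team: ${baseline_estimate:,}")
print(f"Estimated cost with a response team:    ${with_response_team:,}")
```

Even a back-of-the-envelope estimate like this gives executives a concrete figure to weigh against the cost of preparedness measures.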
Other risks include not just the cost of the incident itself, but potential loss of revenue. If a
company does face litigation or regulatory scrutiny, it may find itself subject to fines. Harder
to quantify, but no less risky, is the possible loss of existing and potential business. There might
also be an impact on business relationships and third-party contracts, which can be especially
problematic during mergers.
In addition to costs to the company, there are arguably also potential costs to the affected
individuals, who might suffer identity theft or personal reputational harm. They might also
have financial damage from misuse of financial account information.

9.2 How Breaches Occur

The same Ponemon Institute study revealed that malicious actors or criminal attacks are the
most common cause of data breaches.2 The root causes of breaches cited in the study include
malicious or criminal attack (48 percent), human error (27 percent) and systems glitch (25
percent).3
Employee error or negligence is reported to be one of the biggest causes of privacy breaches.
Even malicious and criminal attacks often take the form of phishing attacks, which rely on
unsuspecting employees. Ongoing training and awareness-raising around information security
policies and practices is therefore essential in reducing the risk of a privacy breach.
In sum, breaches can occur in many ways, including through hacking or malware, device loss
or theft, and unintended disclosure of information. Breaches are more than just a technical or
IT issue; everyone in an organization can and should play a role in following responsible data
privacy and collection practices.

9.3 Terminology: Security Incident versus Breach

When faced with a potential security incident, there is often a temptation to call the situation a
data breach. However, that term is a legal one, defined in different ways under various laws
around the globe. Until a lawyer has made a determination that a fact pattern meets the legal
definition, corporations should refer to a security incident as just that, an incident or a potential
incident.
An incident is a situation in which the confidentiality, integrity or availability of personal
information may potentially be compromised. For a data breach to exist, typically there must
be some sort of unauthorized access or acquisition of the information, although the definition
of breach varies. If a breach exists, impacted individuals and, in many cases, regulatory
authorities must be notified.
In sum, all breaches are incidents, but not all incidents are breaches. Only the privacy office
or legal office can declare a breach, based on certain triggers.

9.4 Getting Prepared

Companies generally recognize they may be subject to a data incident. The common phrase is
“not if, but when.” With this in mind, what measures can a company take to prepare for an
incident? Preparedness does not prevent an incident. Instead, while prevention focuses on tasks
and technologies that stop a breach from occurring, preparedness focuses on measures a
company can take to respond optimally—in other words, to answer the question, “What will
the company do when prevention fails?”
Preparedness falls into five categories: training, getting an incident response plan in place,
understanding key stakeholders, getting insurance coverage where appropriate, and managing
vendors who might be a part of an incident.

9.4.1 Training
Organizations typically face the following questions when they’re making the case for training
or planning its execution.
Why train? The answer to this is straightforward. Training exposes gaps in applications,
procedures and pre-incident plans. It can also cultivate greater overall security for customers,
partners and employees. As a result, training has the potential to reduce financial liability and
regulatory exposure while lowering breach-related costs, including legal counsel and consumer
notification. If appropriate training has been put in place, it can help a company get through an
incident with brand reputation and integrity preserved.
Which function within an organization should fund training? Leaders often disagree, and
what is appropriate will vary by company. Considerations to take into account include where
most of the data is housed, how other similar projects have been funded, what is driving the
compliance efforts, and what functions would be most negatively affected by an incident. Many
companies find it helpful to consider a shared-cost arrangement, for example, between
information technology (IT), finance, human resources (HR) and legal. Quantify the benefits
of training by calculating return on investment (ROI) and savings.
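
One hedged way to express that calculation is sketched below; the training cost and the assumed risk reduction are purely illustrative, while the breach-cost and likelihood statistics are those already quoted from the Ponemon study.

```python
# Illustrative ROI estimate for a training program. The training cost and the assumed
# reduction in breach likelihood are hypothetical; the breach cost and two-year breach
# likelihood are the Ponemon 2018 figures quoted elsewhere in this material.
annual_training_cost = 75_000          # USD: content, platform and staff time (hypothetical)
average_breach_cost = 3_860_000        # USD: Ponemon 2018 average cost of a breach
two_year_breach_likelihood = 0.279     # Ponemon 2018 likelihood of a recurring material breach
assumed_risk_reduction = 0.25          # hypothetical relative reduction attributed to training

expected_savings = average_breach_cost * two_year_breach_likelihood * assumed_risk_reduction
roi = (expected_savings - annual_training_cost) / annual_training_cost

print(f"Expected savings attributed to training: ${expected_savings:,.0f}")
print(f"Estimated ROI: {roi:.0%}")
```

The point is not the specific numbers but that framing training spend against avoided breach costs makes the funding conversation concrete.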
Who should receive training? The entire organization will likely need some form of
training. Many may only need to learn how to report a potential incident. Others may require
more in-depth training. For example, the incident response team will need thorough training.
The IT and security teams will similarly need in-depth training. While there will need to be—
and should be—different levels and programs for different employee groups, all employees
should have a basic understanding of security procedures and how to report a suspected
incident.
What form should training take? Training will take various forms, and content should be
customized to the audience. It might be a short video or a structured readiness-testing
simulation. A training exercise could also simulate an actual incident, for instance, circulating
a fake phishing email. Regardless of form, record results and update the plan accordingly.

9.4.2 Creating an Incident Response Plan


A key step in incident preparation is the formal creation of an incident response plan. To create
the plan, the drafting team will need to gather a vast amount of information and then use the
information they have gathered to develop processes and procedures. This team should be led
by the privacy office and the legal department and include help from IT, communications, HR
and senior management. The exact stakeholders will vary by organization.
When you put together the plan, you should be thinking about some key factors. These
include what types of personal information your organization collects, and in what format you
collect that information, as well as the method of collection. You will, of course, also want to
consider the applicable laws, which is one of the many reasons working with legal is so
important. Third-party relationships are also critical. What vendors are most likely to have a
breach that would affect you? Another key factor is internal administration: What works for
another company may not work for you. Have there been prior incidents? If so, there may be
learnings you can take away from them as you put together your plan.
The plan should touch on some key areas, including how to protect privilege, the roles and
responsibilities of team members, how to escalate possible issues and report suspicious
activities, severity rankings (i.e., what triggers escalation and what type of escalation), and
interactions with external parties (e.g., regulators, vendors, investigators for impacted
individuals, insurance providers). A good plan will also consider integration with business
continuity plans (BCPs) and provide a mechanism for engaging in learnings post-incident.
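
Purely as an illustration of how severity rankings and escalation triggers might eventually be translated into tooling, the sketch below uses invented tiers, thresholds and contact lists; a real plan's triggers must come from the organization's own risk assessment and legal review.

```python
# Hypothetical severity tiers for an incident response plan. The tier names, record
# thresholds and escalation contacts are invented for illustration only.
SEVERITY_TIERS = [
    # (tier name, minimum records potentially affected, who is notified)
    ("low", 0, ["privacy office"]),
    ("medium", 100, ["privacy office", "legal", "information security"]),
    ("high", 10_000, ["privacy office", "legal", "information security",
                      "communications/PR", "senior management"]),
]

def classify(records_affected, sensitive_data=False):
    """Pick a tier by record count, bumping up one tier if sensitive data is involved."""
    index = 0
    for i, (_, threshold, _) in enumerate(SEVERITY_TIERS):
        if records_affected >= threshold:
            index = i
    if sensitive_data:
        index = min(index + 1, len(SEVERITY_TIERS) - 1)
    tier, _, escalate_to = SEVERITY_TIERS[index]
    return tier, escalate_to

# Example: 250 records of sensitive data bumps a 'medium' incident up to 'high'.
print(classify(250, sensitive_data=True))
```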
The purpose of a good plan is to map out for people in an organization what to do. The
lawyers within your organization will also want to ensure the plan appropriately includes any
regulatory requirements. This is not the place to list every law that applies, but to help the team
understand what they may be facing if an incident occurs.
There are several schools of thought about listing specific team members and contact
information. Many plans include these, but keep in mind that the information changes. Do you
want to be updating your plan constantly depending on new contact details or new job roles?
That may make sense for many organizations. For others, having the key starting point may be
enough. As always, your plan will be customized to the realities of your organization.

9.4.3 Know Your Roster of Stakeholders


Effective incident response requires systematic, well-conceived planning before a breach
occurs. The success of an incident-response plan ultimately depends on how efficiently
stakeholders and constituent teams execute assigned tasks as a crisis unfolds.
The potential size and scope of breach-related consequences cannot be overstated. At issue
are current and future revenue streams, brand equity and marketplace reputation. Other risks
resulting from bad publicity include opportunity costs, such as high churn and diminished rates
of new customer acquisitions.
These high stakes demand the inclusion and expertise of stakeholders from a wide range of
job functions and disciplines. The most common locations of personal or sensitive information
within an organization are:

• IT or information security

• HR

• Marketing

• Customer relationship management (CRM) systems of customer care and sales
departments

• Audit and compliance

• Shareholder management

Reasons for including stakeholders from these functions in incident response planning are
obvious. However, it’s also essential to involve other senior leaders in formulating and
executing a plan that minimizes a breach’s financial and operational impact. Their involvement
will ultimately result in a stronger, more richly multidisciplinary plan that enables breached
companies to effectively restore security, preserve the evidence, and protect the brand.
Examples of departments that should be involved include:

• Business development (BD)

• Communications and PR

• Union leadership

• Finance

• President, chief executive officer (CEO)

• Board of directors

9.4.4 Insurance Coverage


Insurance may be a viable source of funding to offset breach-response and recovery costs.
While traditional policies may provide a certain level of protection, they do not normally cover
expenses resulting from a data compromise. To reduce exposure, risk managers must work
closely with insurance carriers and finance stakeholders to select the right type and level of
coverage prior to an incident. A relatively new type of coverage, called cyber-liability
insurance, may cover many breach-related expenses, including:

• Forensic investigations

• Outside counsel fees

• Crisis management services

• PR experts

• Breach notification

• Call center costs

• Credit monitoring

• Fraud resolution services

Any preparedness process should take insurance coverage into account. When looking at
coverage, keep in mind that you will be asked to fill out questionnaires that speak to your level
of preparedness. Before completing them, coordinate with the legal department, as you will be
making disclosures about internal operations to external third parties.

9.4.5 Management of Vendors Who May Be the Source of an Incident


Often, vendors are the ones that suffer a data breach. But because of the way data breach
notification laws are drafted, the obligation to notify may fall on your company, not the vendor.
For this reason, it’s important to have a good understanding of what information your vendors
have, how they use it, and what they will do if they suffer an incident. This exercise goes beyond merely
reviewing contracts and updating language. For vendors with key information, it may be
necessary to do on-the-ground diligence to understand their preparedness level and to ensure
their coordination in the event of an incident.

9.5 Roles in Incident Response Planning, by Function

This section covers the core elements of incident response planning, incident detection,
incident handling, and consumer notification. The focus is on the U.S. approach to responding
to data breaches, since the United States has some of the world’s strictest and most financially
consequential breach notification requirements. The section begins by identifying the roles and
responsibilities that the previously identified stakeholders may play during a breach.
Different stakeholder teams have different responsibilities in planning for a possible breach.
Table 9-1 details sample departmental responsibilities for the planning period. In reviewing
these responsibilities, remember that your organization is unique; just because a plan worked
in one corporate structure does not guarantee that it will work in yours. Nevertheless, Table 9-
1 provides guidelines for common planning expectations by function.

Table 9-1: Sample Departmental Responsibilities

Function                Planning Role

Information security    Provide guidance regarding how the organization addresses detection, isolation, removal and preservation of affected systems

Legal                   Ensure the response program is designed to protect privilege and think about and design the program with an eye toward limiting legal liability

HR                      Provide an employee perspective

Marketing               Advise about customer relationship management

BD                      Represent knowledge in handling and keeping the account

PR                      Plan strategic and tactical communication to inform and influence

Union leadership        Represent union interests

Finance                 Calculate and manage the bottom-line impact of containment and correction

President/CEO           Demonstrate value of preventing breaches through actions

Customer care           Offer insight on customer/caller behavior

9.5.1 Information Security and/or Information Technology


Knowledge of enterprise-wide configurations, networking and protocols, and security
measures gives information security a broad enough perspective on the organization’s
electronic assets to help it identify vulnerabilities before criminals exploit them. As part of the
incident response planning process, the information security group will provide guidance
regarding the detection, isolation, removal and preservation of affected systems.

9.5.2 Legal
When developing an incident response plan, companies should always seek the advice of
competent counsel experienced in the field of data breach response. If it’s uncertain whether
your legal department possesses the requisite knowledge, an assessment, overseen by the senior
legal stakeholder, should be undertaken.
Legal stakeholders are central to incident response planning because they, more than any
other executives, understand the legal precedents and requirements for handling data and
reporting a breach. Their guidance helps companies limit the liability and economic
consequences of a breach, including avoidance of litigation and fines. In addition, most data
breach legislation requires intensive legal knowledge to implement a proper procedure. During
incident response planning, organization attorneys may negotiate any requirements the
organization wishes to impose upon its business partners. Conversely, the organization may

also use attorneys to help determine what it is willing to do in the event data belonging to a
client is compromised.
Finally, legal involvement in planning for an incident is critical given the need to protect
privilege during an incident investigation process, as well as the level of legal exposure and
risk that can arise depending on how a company handles an incident. After notification,
companies may often find themselves subject to regulatory scrutiny or class action lawsuits.
The involvement of a lawyer who understands these risks is a key part of successfully handling
an incident.

9.5.3 Human Resources


Given the extensive amount of personal information that typical HR departments have on hand,
it is highly advisable to include HR team members when discussing incident response planning.
HR staff may also be included because of their unique perspective regarding employees or for
notification of current or past employees.
During incident response planning, the HR stakeholders will normally address topics such as
employee data handling, security awareness training, and/or incident recognition and response.

9.5.4 Marketing
The typical marketing department has spent years, even decades, gathering, slicing, dicing and
warehousing vast amounts of customer data, much of it personal information, individually or
in the aggregate (e.g., name, address, date of birth, Social Security number, driver’s license
number). Through segmentation and analysis of such data, they gain the necessary insight to
be both the voice of the brand to external audiences and the voice of the customer to
engineering, research and development, and other internal teams.
However, being stewards of such a rich data storehouse also increases marketing’s
vulnerability to hacking and unintentional breaches. This exposure, combined with the team’s
access to campaign and CRM databases, more than qualifies marketing decision makers for a
role in incident response planning.

9.5.5 Business Development


The BD stakeholder, often aided by a dedicated account support team, monitors and manages
vital business relationships. Client companies with a certain level of value or prestige receive regular,
personalized attention aimed at building trust, nurturing loyalty, and sustaining the bond over
time.

Stakeholders in this position gain firsthand knowledge of handling and keeping key
accounts, developing an understanding of each account’s corporate culture, organizational strengths and
weaknesses, decision-makers’ personalities, and management styles.
These insights can prove invaluable in incident response planning, which is why BD
stakeholders should have a seat at the table when the planning process begins.

9.5.6 Communications and Public Relations


PR and communications stakeholders are usually senior, media-savvy professionals who are
highly adept at media relations and crisis management. They serve as stewards of public image
and reputation, overseeing the development of strategic and tactical programs aimed at
informing and influencing audiences.

9.5.7 Union Leadership


Though their numbers have declined since the 1980s, unionized workers still comprise a sizable
percentage of the American workforce. According to the Bureau of Labor Statistics, the number
of wage and salary workers belonging to unions stood at 14.8 million, or 10.7 percent, in 2017.4
The AFL-CIO, the U.S.’s most prominent and well-known union, is a labor federation
consisting of more than 12.5 million members of 55 different unionized entities.5
Data belonging to union workers, like that of all employees, is stored on an organization’s
servers and is vulnerable to breach by accidental or unauthorized access. If their employer
reports a data breach, union members will naturally look to stewards or other union leaders for
information and guidance.
These individuals represent union interests and are authorized to act and speak on members’
behalf—both to internal groups and to the media at large. For these reasons, any organization
whose workers are unionized should consider including a senior union stakeholder in data
breach planning and response.

9.5.8 Finance
In their response-planning capacity, the main role of finance stakeholders is to calculate and
manage the bottom-line impact of breach containment and correction. Once the potential costs
of responding to a breach are computed, it is up to finance to allocate the necessary resources
to fund resolution and recovery. The chief financial officer (CFO) should also champion more
cost-effective measures that might help mitigate the risk of having a breach in the first place.
To further aid in containing costs, finance executives or procurement leaders can help negotiate
agreements with new or returning data-breach-resolution providers.

9.5.9 President, Chief Executive Officer
Executives lead; employees and stakeholders follow. In central business functions, the
president/CEO’s attitude and behavior set the tone for the entire organization. This is especially
true with policies and practices surrounding data security. Through actions taken or not taken,
and training funded or not, employees can easily discern the value their leaders truly place on
preventing breaches. Once data is compromised and the shortcomings of an organization’s
security practices become public, it is the top executive who will ultimately bear the blame.

9.5.10 Board of Directors


Boards of directors are becoming more aware of—and concerned about—companies’ level of
preparedness for an incident. Many board members have received training about data incidents,
and all are concerned about fulfilling their fiduciary obligations to their organizations. While
boards are not tasked with running the day-to-day operations of a company, many members
will want to make sure their business is ready in the event of an incident.
Boards often have a wealth of knowledge about handling a data incident, sometimes from
their direct experiences with other incidents.

9.5.11 Customer Care


The head of the customer care operation must contend with issues such as high employee
turnover and access to large amounts of potentially sensitive CRM data. These factors make
customer care teams susceptible to various forms of attacks by intruders looking to access
personal information.
Social engineering is an increasingly prevalent threat that can surface in a call center, as
criminals call repeatedly to probe and test how security procedures are applied and how often
they are enforced. According to Wombat Security, a subsidiary of Proofpoint, 76 percent of
organizations experienced a phishing attack in 2017.6 Respondents to Wombat’s survey
reported various impacts stemming from the phishing attacks, including malware infections,
compromised accounts and a loss of data.7
Aside from deployment of the necessary technology as a first line of defense, employee
training and awareness of phishing attacks can help to reduce the potential instances of an
attack. Companies are increasingly training their employees and, according to the survey, 54
percent say training has led to a quantified reduction in phishing susceptibility.8

When trained to recognize unusual employee or caller behaviors or to notice trends in certain
types of calls, customer care teams can help deter criminal activity and prevent potential
breaches.

9.6 Integrating Incident Response into the Business Continuity Plan

To help operations run smoothly in a time of crisis, many companies depend on a BCP. The
plan is typically drafted and maintained by key stakeholders and spells out departmental
responsibilities and actions teams must take before, during and after an event. Situations
covered in a BCP often include fires, natural disasters (e.g., tornadoes, hurricanes, floods), and
terrorist attacks.
To ensure proper execution of the BCP, all planning and response teams should know which
stakeholder is responsible for overseeing the plan and who, within a specific job function, will
lead them during an event. Knowledge of the plan and preparation for executing it can mean
the difference between a successful response and a failed one, especially during the first 24
hours.
In terms of overall organizational impact, a serious or protracted data breach can rival big
disasters. Like a fire, tornado or terrorist attack, a breach can strike unexpectedly at any time
and leave in its wake damages of immeasurable cost and consequence. As with other
calamitous events, cleaning up a digital disaster can take weeks, months or longer; in the
aftermath, victims’ lives may be changed forever.
In a 2016 study of more than 350 executives, 63 percent said their companies had a BCP in
place.9 An additional 17 percent of respondents said they were in the process of developing a
plan.10 The remaining 20 percent either did not have a business continuity management
program or were just beginning to develop such a program and had not yet reached a point
where they could write a plan.11 Two-thirds of respondents reported having to invoke the plan
within the previous two years.12
Considering a breach’s potential repercussions and the benefits that can result from informed
and thoughtful preparation, it’s imperative that companies integrate breach response planning
into their broader BCP.

9.6.1 Tabletop Exercises


Once breach preparedness is integrated into the BCP, or if the company decides to have a
standalone incident response plan, incident response training will likely be required. This
training may take many forms, including workshops, seminars and online videos, but often

includes tabletop exercises, a strategic mainstay of corporate trainers and business continuity
planners.
A tabletop exercise is a structured readiness-testing activity that simulates an emergency
situation (such as a data breach) in an informal, stress-free setting. Participants, usually key
stakeholders, decision makers and their alternates, gather around a table to discuss roles,
responsibilities and procedures in the context of an emergency scenario.
The focus is on training and familiarization with established policies and plans. Most
exercises last between two and four hours and should be conducted at least semiannually—
more often if resources and personnel are available.

9.6.2 Updating the Plan


Soon after concluding the exercise, results should be summarized, recorded and distributed to
all participants. Perhaps most importantly, fresh or actionable insights gained from the exercise
should be added to the BCP.
It’s imperative to keep the incident response plan (or the BCP) current. There is little
strategic, practical or economic value to a plan that is painstakingly developed but seldom
tested or improved. Those responsible should always ensure the plan includes the most up-to-
date timeline, action steps, policies and procedures, and current emergency contact information
(vital, but often overlooked) for all plan participants. All those involved should be notified of
any changes or updates to the plan.

9.6.3 Budgeting for Training and Response


Breach preparedness training, especially in a large organization, represents a significant
investment. Creating an environment that ingrains data security into the corporate culture and
prepares teams to respond effectively requires an organization-wide commitment backed by
the resources to see it through.
In most cases, the long-term financial and operational benefits of teaching employees to
prevent, detect, report and resolve data breaches far outweigh the costs. The strategic upside of
investing in breach preparedness includes:

• Exposure of critical gaps in applications, procedures and plans in a pre-incident phase

• Greater overall security for customers, partners and employees

• Reduced financial liability and regulatory exposure

• Lower breach-related costs, including legal counsel and consumer notification

• Preservation of brand reputation and integrity in the marketplace

Though organization leaders often agree about the value of breach awareness and training,
there is rarely consensus about who should foot the bill. Many businesses utilize a shared-cost
arrangement that equitably splits training costs among participating stakeholder groups, such
as IT, finance and HR. Negotiations between them can include everything from funding levels
and oversight to allocation of unused funds.
However costs are divided, companies should ensure that adequate funding is available to
support business continuity and breach preparedness training. To facilitate the negotiation,
parties should focus on quantifying benefits, ROI and savings, rather than the bottom-line
expense to any individual group.

9.6.4 Breach Response Best Practices


Allocating funds for breach response is just as important as training, perhaps even more so.
Typical costs incurred in responding to a breach include threat isolation, forensic investigation,
engaging of legal counsel, PR communications and media outreach, and reporting and
notification (including printing, postage and call center).
Without a breach response budget in place, companies may be forced to redistribute funds
from other critical projects or initiatives. Having to openly debate the merits and value of one
department’s initiatives over another’s may lead to tension between groups and ultimately
delay or detract from optimal breach response.

9.7 Incident Handling

The process of responding to a breach involves tasks that are not necessarily linear. Companies
facing a potential incident will deal with incident detection, ensure that stakeholders collaborate
and know their roles, investigate, ask their legal teams to conduct a legal analysis, address
reporting obligations, and recover from the situation. While these steps are all part of a well-
run response, many of them must happen in parallel. It can be helpful to think about breach
response tasks in broad categories: secure operations, notify appropriate parties, and fix
vulnerabilities.
While these groupings help keep you organized, they are not necessarily meant to be used as
a checklist, as many steps will happen concurrently. For example, a company’s CEO needs to

know about a breach as soon as possible—even if the breach has not yet been contained. In this
case, notifying appropriate parties would happen simultaneously with containment efforts.

9.7.1 Incident Detection


Unfortunately, there’s not one definitive way to detect a breach. Customer calls or news reports
may alert an organization to trouble before internal sources even recognize a problem.
Consider, for your organization, how you will determine whether to classify an event as an
incident or a breach. Remember also that privacy is a business function—not a technical
function—and relies on other organizations and departments to execute breach detection and
response.
Generally, a privacy incident may be described as any potential or actual compromise of
personal information in a form that facilitates intentional or unintentional access by
unauthorized third parties.
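As a minimal illustration of how that working definition could be turned into a first-pass triage check, consider the sketch below. The field names and the rule are assumptions for illustration only; they do not replace the legal analysis discussed later in this chapter.

# Minimal triage sketch: flag an event as a potential privacy incident when
# personal information may have been accessible to an unauthorized party.
# Field names and the rule itself are illustrative assumptions.

def is_potential_privacy_incident(event: dict) -> bool:
    involves_personal_information = event.get("involves_personal_information", False)
    possible_unauthorized_access = event.get("possible_unauthorized_access", False)
    return involves_personal_information and possible_unauthorized_access

event = {
    "description": "Laptop reported lost by a sales employee",
    "involves_personal_information": True,
    "possible_unauthorized_access": True,  # device not known to be encrypted
}
if is_potential_privacy_incident(event):
    print("Escalate to the privacy office for investigation")
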

9.7.2 Employee Training


From their first day at an organization, new employees should be taught and encouraged to
assume a privacy-first mindset. When they observe that leaders and fellow associates are
genuinely committed to data security and privacy protection, new hires are more likely to
respect and comply with established reporting and data-handling policies.
Initial security indoctrination and training should also teach employees to recognize
vulnerabilities and to capture and report basic information when encountering a potential or
actual breach. Employees must understand when and how to report suspicious incidents to their
supervisors, who, in turn, should know how to properly escalate the incident to internal
authorities, such as the privacy office.

9.7.3 Reporting Worksheets


To emphasize employees’ personal responsibilities when encountering a breach, policies and
procedures should be a regular component of security training and refreshers. The following
worksheet provides a foundation for developing your own incident-reporting or privacy-
training worksheets. These are merely suggestions and not intended to be a comprehensive list.
Keep in mind as well how these materials are distributed. Does the incident involve a bad actor
who has possibly accessed your email system? If so, then reporting should not be occurring
through that potentially compromised system!
All breach planning and preparedness resources should be reviewed and approved by internal
or external legal counsel or by an expert privacy team. Part of the analysis by the legal team

will be to think about the issues of privilege and determine what content should be documented
and how it should be documented. In many circumstances, materials will need to be prepared
at the direction of counsel.

Sample Worksheet—Prepared at the Direction of Counsel


Facts as they are known:

• Name and contact information of person discovering the incident

• Date and time the incident was discovered or brought to your attention

• Incident date, time and location

• Type of data suspected to be involved

o Internal organization or employee data

o Client or customer data

o Third-party partner or vendor data

Employee’s description of what occurred:

• Brief description of how the incident or breach was discovered.

• Does the incident involve paper records, electronic information or both?

• What type of records or media do you believe were involved?

o Paper: letter, office correspondence, corporate document, fax or copies thereof?

o Electronic: data file or record, email, device such as laptop, desktop, or pad-style computer, hard drives in other electronic equipment (e.g., copy machines)

o Media: external hard drive, flash/thumb drive, USB key

• Do you know if the device or information was password-protected?

• Do you know if the device or information was encrypted?

• Do you believe personally identifiable information (PII), such as Social Security numbers, account information, user names or passwords, was exposed?

• Can you estimate how many records were involved?

• To the best of your knowledge, has the incident been contained? (That is, has
the data leak or loss stopped or is there still potential for additional data to be
lost?)
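The same facts the worksheet asks for can also be captured as a structured record, which makes escalation and tracking easier as an investigation proceeds. Below is a minimal sketch; the field names mirror the worksheet above but are otherwise assumptions, and any such template should still be prepared and reviewed at the direction of counsel.

# Minimal sketch of the reporting worksheet as a structured record.
# Field names mirror the worksheet above; all are illustrative assumptions.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class IncidentReport:
    reporter_name: str
    reporter_contact: str
    discovered_at: str                      # date/time the incident was discovered
    incident_location: str
    data_types: List[str] = field(default_factory=list)      # employee, customer, vendor
    description: str = ""
    media_involved: List[str] = field(default_factory=list)  # paper, electronic, removable media
    password_protected: Optional[bool] = None
    encrypted: Optional[bool] = None
    pii_suspected: Optional[bool] = None
    estimated_records: Optional[int] = None
    contained: Optional[bool] = None

report = IncidentReport(
    reporter_name="A. Employee",
    reporter_contact="ext. 1234",
    discovered_at="2025-01-15 09:30",
    incident_location="Regional sales office",
    data_types=["customer"],
    description="Unencrypted laptop reported missing",
    media_involved=["electronic"],
    encrypted=False,
    pii_suspected=True,
)
print(report.estimated_records)  # None until the investigation establishes a count
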

9.7.4 Collaboration Among Stakeholders


Within any organization, data is viewed and handled by any number of individuals and groups
and is often stored in several disparate locations—even across multiple states or continents.
The potential for compromising sensitive data exists throughout every business of every size
in every industry.
Regardless of organization size, however, all employees have a vested interest in being
vigilant about safeguarding data. The cost of recovering from a single breach could potentially
cripple an organization or business unit and render it unable to operate or fully employ its
workforce.
For example, whenever IT conducts security training, instructors may keep logs to record
who has attended and who has not. IT may then share this information with HR to help ensure
every employee receives the instruction required by organization policies.

The potential for compromising sensitive data exists throughout every business of every
size in every industry.

Another example of cooperation between departments is how IT and HR might work together
following detection of a virus or other cybersecurity threat. Typically, IT would detect the
intrusion and prepare very specific containment instructions for all employees. They could
autonomously issue these instructions or work with HR or communications to assure
distribution to the complete employee base via all available channels.

9.7.5 Physical Security


In many organizations, the level of technical integration between IT and facilities is so deep
and so extensive that regular contact through established lines of communication is essential to
maintaining security.
As technology advances, the lines of responsibility can begin to blur. Computers and systems
managed by IT, for example, directly control doors, electromechanical locks, remote cameras
and other access-limiting security measures maintained by facilities staff. This close
association demonstrates the need for ongoing collaboration if the safety and integrity of
physical and digital assets are to be maintained.

9.7.6 Human Resources
Hiring, transfers, promotions or other changes in employment status may require revisions to
an individual’s data access privileges. When such changes are needed, HR, IT and facilities
should follow established policies for monitoring and managing data access.
Other occasions requiring group coordination are employee layoffs or terminations. These
unfortunate but not uncommon events can affect thousands of individuals or just a handful. In
either case, the resulting threat to data security can take many forms, for which HR and other
teams must prepare.
Disgruntled or resentful employees, for example, may try to exact revenge for the dismissal
by stealing or destroying sensitive information. Others may attempt to obtain organization
secrets or intellectual property to sell to or gain favor with key competitors. Before employees
are terminated, HR must inform IT and facilities so that physical and electronic access for those
departing may be turned off immediately after, or in some cases even simultaneously with, the
announcement. Phones, equipment and other employee devices must also be wiped of login
and password credentials.
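A minimal sketch of such a coordinated deprovisioning checklist appears below. The task names and owners are assumptions for illustration; the point is that HR’s notification should trigger a tracked set of IT and facilities actions rather than ad hoc requests.

# Minimal sketch of a termination deprovisioning checklist coordinated
# between HR, IT and facilities. Task names and owners are illustrative.

OFFBOARDING_TASKS = [
    ("Disable network and email accounts", "IT"),
    ("Revoke VPN and remote-access credentials", "IT"),
    ("Wipe login and password credentials from phones and other devices", "IT"),
    ("Deactivate badge and building access", "Facilities"),
    ("Collect laptops, portable storage devices and media", "HR"),
]

def offboarding_checklist(employee_id: str) -> list:
    """Build a tracked checklist for a departing employee."""
    return [
        {"employee_id": employee_id, "task": task, "owner": owner, "done": False}
        for task, owner in OFFBOARDING_TASKS
    ]

for item in offboarding_checklist("E-1042"):
    print("[" + item["owner"] + "] " + item["task"])
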

Every organization must ensure that it has a procedure for retrieving portable storage
devices or media from departing employees.

9.7.7 Third Parties


Sensitive data is seldom handled or processed in a single location. In today’s global economy,
huge volumes of personal information for which companies are directly responsible reside in
systems and facilities managed by outside vendors, partners and contractors. These groups
should always be accounted for in incident detection and planning.
For example, companies should include a standard clause requiring third parties to notify them
within a certain time frame when servers, websites or other business-critical systems are taken
offline. It goes without saying that companies should always require third parties to promptly
communicate any breach of data so that contingencies can be made to mitigate resulting threats.
Conversely, it’s vital for companies that work with third parties to remember that such
communication flows both ways. If the organization’s network is hit with a virus or comes
under a cyber attack, or if there are changes to call center procedures or employee data-handling
policies, the organization has an obligation to notify its partners immediately.

9.7.8 Tools of Prevention
To those on the front lines, prevention and detection bear many similarities to defending an
occupied fortress. They must protect sensitive information against treachery and attacks that
could come at any time. Regardless of how they originate, if the fortress is to remain secure,
threats must be detected and eliminated before it’s too late.
Today, there are numerous weapons in a security team’s arsenal of prevention. Some
techniques are familiar but still quite effective, while others are emerging and showing
tremendous promise. The successful privacy professional will be mindful of the need to
understand various prevention techniques and their intended applications and to be purposeful
about keeping up with new ones as security technology advances.
Once breach investigators conclude that an actual compromise of sensitive information has
occurred, the prenotification process is triggered. Steps taken may vary depending on several
factors, but the purpose is to confirm that the event does indeed constitute a reportable breach.

9.8 Team Roles During an Incident

Immediately following the decision to notify affected parties, tactical portions of the incident
response plan begin to unfold. Companies dealing with an incident may find themselves
balancing two possibly conflicting issues: containment and legal exposures. Companies want
to contain and remediate the problem. At the same time, should the situation be viewed as a
data breach, impacted individuals and, potentially, government agencies must be notified.
These notices often result in lawsuits or regulatory scrutiny.
An incident response process will need to balance these objectives. A successfully handled
plan will be directed by legal (to address the legal exposures and privilege concerns), who will
work hand in hand with an IT leader who is focused on containment and remediation. Other
key stakeholders will also need direct involvement.
Depending on your organization’s structure, your incident program will need a clearly
delineated leader to rally the troops and keep the process on track. When thinking about
incident response leadership, it is important to keep legal risks in mind. While an organization
may have many lawyers on staff, not all are engaged in the practice of law. The chief privacy
officer (CPO) or chief compliance officer (CCO), for example, may have legal degrees but not
be functioning as lawyers (i.e., providing legal advice to the organization). General counsels
may at times also be concerned about whether their roles will be questioned during litigation.
For this reason, many turn to outside counsel, who are often best positioned to advise on breach-

related matters. The distinction of individuals with law degrees who are not serving in the role
of lawyers is important, because in order to protect attorney-client privilege, the investigation
will need to be done at the direction of counsel. Thus, in many organizations, the leader is, of
necessity, legal counsel.
Organizations often have many individuals with extensive knowledge about data breach
matters. In addition to legal counsel, who are concerned with privilege, the CPO or CCO wants
to ensure that a breach is handled correctly from a compliance standpoint, and the chief
information security officer (CISO) will be focused on the nuts and bolts of investigation and
containment. The CISO’s role may include recommending outside forensic experts to help
ascertain the incident’s cause, size and scope. The CISO, in connection with the rest of the IT
department, may also oversee evidence preservation, taking affected systems offline and
correcting vulnerabilities that facilitated the incident.
Team leadership, keeping containment and privilege in mind, will include contacting and
activating appropriate response team members or their alternates. Meetings during the
investigation may be necessary, and the team leadership should think about how best to
schedule these to gather and analyze status reports and provide guidance as needed. Convening
with individual stakeholders to discuss lawsuits, media inquiries, regulatory concerns and other
pressing developments is another of the team leader’s tasks.
During the breach, team leaders will also:

• Keep individual response-team members on track to meet their performance objectives and timelines

• Track budget adherence for all response activities

• Contact outside incident response resources to confirm engagement and monitor performance

• Prepare a final analysis of the response effort and lead the post-event evaluation
process

The team leader may also choose to provide senior executives with an overview of the event
and of the team’s expected course of action. The breach response team leader must manage
expectations around communications, so executives know they are always as informed as
possible and do not need to continually check in during the response process, which would
hinder the team leader’s work.
Below is a list of tips to help manage expectations and communicate with executives:

• Manage executive leaders’ expectations by establishing the frequency of
updates/communications

• Determine what is appropriate for the situation and communicate when/if the
frequency needs to change

• Hold a kickoff meeting to present the team with the known facts and
circumstances

• Provide senior executives with an overview of the event and of the team’s
expected course of action

• Engage remediation providers to reduce consumers’ risk of fraud or identity theft

• Convene with individual stakeholders to discuss lawsuits, media inquiries, regulatory concerns and other pressing developments

• Keep individual response-team members on track to meet their performance objectives and timelines

• Track budget adherence for all response activities

• Contact outside incident response resources to confirm engagement and monitor performance

• Prepare a final analysis of the response effort and lead the post-event evaluation

There is sometimes confusion about who—between legal, CPO/CCO and CISO—should be directing and leading an incident response. The best incident response teams are those in which the three work together, ensuring maintenance of privilege, containment and swift investigations.

9.8.1 Legal
In addition to ensuring the protection of privilege during the investigation, legal will be focused
on determining whether there is a duty to notify under breach notification laws, and if so, what
form that notice should take. The entities to notify vary by breach.
Legal stakeholders may also recommend forensically sound evidence collection and
preservation practices and engage or prepare statements for state attorneys general, the Federal
Trade Commission (FTC) and other regulators. Stakeholders’ knowledge of laws and legal
precedents helps teams more effectively direct and manage the numerous related elements of
incident investigation and response.

Drafting and reviewing contracts is another vital area in which legal stakeholders should be
involved. If compromised data belongs to a client, legal can interpret contractual notification requirements and
reporting and remediation obligations. Should the organization become the target of post-
breach litigation, the legal stakeholder may also guide or prepare the defense.

9.8.2 Information Security


While some data incident matters do involve paper records, given the cyber nature of most
incidents, it is almost certain that the information security group will be engaged to address
data compromises. The CISO or the chief technology officer (CTO) or their designated person
on the incident team will focus the group’s expertise on facilitating and supporting forensic
investigations, including evidence preservation. Information security will also likely be tasked
with overseeing the deletion of embedded malware and hacker tools and correcting
vulnerabilities that may have precipitated the breach. Larger companies may establish a
computer emergency response team (CERT) to promptly address security issues.
While internal IT resources may have the experience and equipment to investigate incidents,
it is often more advantageous to bring in outside experts to identify the cause and scope of the
breach and the type and location of compromised data.
To support other groups with their breach response efforts, the technology team may also:

• Provide a secure transmission method for data files intended for the print vendor
or incident call center

• Identify the location of potentially compromised data (e.g., test, development and production environments)

• Determine the number of records potentially affected and the types of personal information they contain (see the sketch after this list)

• Clean up mailing lists to help facilitate the printing process

• Sort through data to identify populations requiring special handling (e.g., minors,
expatriates, deceased)

• Monitor systems for additional attacks

• Fix the gaps in the IT systems, if applicable
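As a simple illustration of the tallying and special-handling tasks referenced above, the sketch below counts potentially affected records by data type and flags populations that may need separate treatment. The record fields, category labels and sample data are assumptions; real affected-data sets and the tooling around them will differ.

# Minimal sketch: tally potentially affected records by data type and flag
# populations that may need special handling. All field names, labels and
# sample records are illustrative assumptions.

from collections import Counter

affected_records = [
    {"id": 1, "data_types": ["name", "email"], "population": "customer"},
    {"id": 2, "data_types": ["name", "ssn"], "population": "minor"},
    {"id": 3, "data_types": ["name", "address"], "population": "expatriate"},
]

SPECIAL_HANDLING = {"minor", "expatriate", "deceased"}

type_counts = Counter(t for record in affected_records for t in record["data_types"])
special_handling_ids = [
    record["id"] for record in affected_records
    if record["population"] in SPECIAL_HANDLING
]

print("Records potentially affected:", len(affected_records))
print("Counts by data type:", dict(type_counts))
print("Records needing special handling:", special_handling_ids)
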

9.8.3 Other Response Team Members
There are typically two levels to a response team. First are the leaders who will make the key
decisions about how an incident is handled. Second are the individuals who will be providing
input and support to the core team. Those in the second group will vary depending on the type
of incident. A balance should be struck between ensuring that the appropriate stakeholders are
included but that communications are controlled to avoid legal exposure. Legal counsel can be
very helpful in this regard.
Core team members will be the legal lead as well as the IT or security head investigating and
handling containment. Additional support may be needed from other areas as described in
Table 9-2.

Table 9-2: Incident Response Support Roles by Function

Function                Potential Role

HR                      Serve as information conduit to employees

Finance                 Secure resources to fund resolution

Marketing               Establish and maintain a positive and consistent message

PR                      Assume positions on the front line

Customer care           Handle breach-related call traffic

BD                      Notify key accounts

Union leadership        Communicate and coordinate with union

President/CEO           Promptly allocate funds and personnel and publicly comment on breach

The following describes the typical roles these functions fulfill in most organizations. Every
company is unique, however, and care should be taken to ensure you are following the
protocols of your own organization when thinking about the roles of team members.

9.8.3.1 Human Resources


Whether breaches affect employees’ data or not, the chief human resources officer (CHRO) or
vice president of HR must guide the HR team’s response activities. Concerns over the
organization’s solvency or stock value can make it necessary to inform employees of the
incident. Moreover, employees might be contacted regarding the incident by affected persons,
the media or other parties and need to be directed on how to respond.

If employee data is compromised, the CHRO’s role will become more prominent, including
directing the HR team in identifying the cause and scope and overseeing communications
between management and staff. If the breach is attributed to an employee, the HR group will
take one or more of the following actions: provide training, make procedural changes,
administer the appropriate corrective action, or terminate the individual. If criminal behavior
is discovered, the legal department and/or law enforcement officials may become involved.
During and after a breach, the HR team may be called upon to perform a variety of other
corrective or educational tasks, such as:

• Facilitating employee interviews with internal and external investigators

• Identifying individuals who need training

• Holding daily meetings to summarize breach updates and create appropriate communications for employees

• Escalating concerns to the appropriate department heads

In the aftermath of a breach, the HR stakeholder may serve as the organization’s informational conduit, working closely with PR or corporate communications to inform and
update employees about the incident. During the breach, employees may become concerned
about the effects an event might have on their employment, organization, stock or strategic
business relationships. Therefore, HR might work with internal or external resources to address
and allay such concerns.
If an incident affects employee records, the HR team might also help investigators determine
the location, type and amount of compromised data. If the breach is traced to an organization
employee, HR would be expected to collaborate with the individual’s manager to document
the individual’s actions and determine the appropriate consequences.

9.8.3.2 Finance
The CFO or the chief financial and operating officer (CFOO) will be responsible for guiding
the organization’s post-breach financial decisions. Since breaches tend to be unplanned,
unbudgeted events, the CFO should work closely with senior management to allocate and
acquire the funds necessary to fully recover from the event.
The CFO may help negotiate with outside providers to obtain favorable pricing and terms of
service. The finance team may also collaborate with the legal group to create cost/benefit
models that identify the most practical or economical approaches.
Tasks commonly undertaken by the finance team during a breach include:

• Setting aside and managing appropriate reserves to pay for rapidly mounting
expenses

• Working with vendors to extend payment terms and secure potential discounts

• Promptly paying invoices for breach-related activities

• Meeting daily with the response team leader to track incident expenses

• Requesting ongoing reports from breach providers to manage and track call
center, printing and credit-monitoring costs

During a data breach, finance stakeholders apply their knowledge of the organization’s
financial commitments, obligations and cash position to recommend budget parameters for
responding to the event.
In companies where incident response is an unbudgeted expense, the finance team is often
tasked with being both proactive and creative in securing the resources necessary to fund
resolution and notification. This sum can range from several thousand to several million
dollars.
Before or after a breach, finance executives may work with insurance carriers to negotiate
insurance policy updates, including improvements to the general commercial liability (GCL)
policy and the addition of cyber insurance coverage.
Cyber insurance is a relatively new form of protection that fills gaps typically not covered by
the GCL plan. Organizations seeking first-party cyber insurance coverage have a surprisingly
diverse range of choices, including protection against losses stemming from data destruction
and theft, extortion and hacking, and revenue lost from network intrusion or interruption.
Notification expenses such as printing, mailing, credit monitoring, and call center support
may be included in a policy, along with third-party cyber liability coverage for vendors and
partners. The CFO or other finance stakeholder can offer invaluable assistance in assessing the
necessity and cost of updating insurance coverage.

9.8.3.3 Marketing and Public Relations


The chief marketing officer (CMO) is the person best qualified to help mitigate brand and
reputational damage that can follow a data breach. By collaborating with the PR team or crisis
management firm, the CMO can oversee content development for press releases, blog and
website updates, and victim notification letters. Monitoring and responding to media coverage
and arranging spokesperson interviews will also fall to members of the CMO’s team.

Since the marketing department may already have the expertise and infrastructure in place to
support large-scale mailings, the CMO could divert resources necessary to facilitate the
notification process. In support of the effort, the team may also:

• Suggest direct marketing best practices to maximize notification letter open rates

• Perform address/database hygiene to improve breach notification delivery and response rates

• Analyze media coverage and report relevant developments to the response team

• Draft scripts for the incident response call center

• Develop customer retention and win-back campaigns to minimize churn and encourage loyalty

Marketers are expert communicators, especially skilled at researching and crafting highly
targeted, consumer-driven messaging. Marketing can work with management and PR teams to
establish and maintain a positive, consistent message during both the crisis and the post-breach
notification.
Direct mail expertise may also prove beneficial in supporting the data-breach response.
Depending on organization size, marketing may control the physical infrastructure to help
launch and manage a high-volume email or letter notification outreach. Gaining internal
agreement on the post-breach allocation of marketing resources is an essential element of
breach response planning.
When a data breach occurs, and response teams are thrust into the fray regardless of the
severity, PR and communications stakeholders quickly assume positions on the front lines,
preparing for the response to potential media inquiries and coordinating internal and external
status updates.
Among their chief roles is to oversee the preparation and dissemination of breach-related
press releases, interviews, videos and social media content. As the crisis develops, they also
work to ensure message accuracy and consistency and to minimize leaks of false or inaccurate
information.
During and after a breach, PR and communications teams closely monitor online and offline
coverage, analyzing what’s being said and to what degree negative publicity is shaping public
opinion. Resulting analysis and recommendations are shared among key stakeholders and used
to adapt or refine PR messaging.

9.8.3.4 Customer Care
In the aftermath of a breach, customer service can recommend ways of using internal sources
to serve the needs of breach victims and identify an appropriate outsourced partner. This
stakeholder is also likely to work with others to coordinate the release of breach-related
communications with call center readiness and activities.
Given the customer service training and experience of most call center teams, using existing
staffing and assets to address breach-related inquiries may be a viable time- and cost-saving
option for some companies. If an outsourced provider is retained to answer incoming calls, the
customer service executive can play a crucial role in determining acceptable service levels,
reporting duties and necessary service-representative training.
As part of their normal duties, customer care reps are trained to remain calm when confronted
and to defuse potentially volatile encounters before they escalate. Such training, along with
experience working and delivering scripted messages in a pressure-filled environment, can
enable deployment of these team members to effectively handle breach-related call traffic.
Using internal resources in this manner, however, could potentially degrade service quality
for other incoming service calls, so the prospect of leveraging existing resources to minimize
breach response expenditures may only be attractive for certain organizations.
In companies where using in-house employees to answer breach-related calls is not an option,
the executive of customer service should consider hiring experienced outsourcers to handle call
overflow or perhaps manage the entire initiative.

9.8.3.5 Business Development


In the hands of a skilled sales or BD executive, high-value relationships can flourish for many
years. Because of their unique association with customers and the bond of trust built carefully
over time, BD decision makers are often asked to notify key accounts when their data has been
breached. Receiving unfavorable news from a trusted friend and partner may lessen the impact
and mitigate any potential backlash, such as a loss of confidence or flight to a competitor.
After obtaining the facts from IT, legal, PR or other internal teams, the BD stakeholder should
contact the account and carefully explain what happened. Accuracy and transparency are
essential. The stakeholder should stick to the known facts and under no circumstances speculate
about or downplay any aspect of the breach.
Whenever possible, updates or special instructions regarding the breach should be promptly
delivered by the stakeholder in charge of the account. This will provide reassurances that

someone with executive authority is proactively engaged in safeguarding the account’s
interests and security.

9.8.3.6 Outside Resources


In addition to the support of internal functional leaders, a successful response may depend
heavily on the aid of outside specialists retained to manage notification, call center and breach
remediation activities. It is a best practice to negotiate agreements with experienced breach
response providers prior to having to respond to an incident.
Professional forensic firms prepare themselves to deploy at a moment’s notice. Once on the
scene, investigators work closely with the organization’s IT group to isolate compromised
systems, contain the damage, preserve electronic evidence, establish a chain of custody, and
document any actions taken.
Depending on the type of evidence uncovered, the affected organization may need to confer
with outside counsel regarding its legal obligations. Breach definition and applicable reporting
requirements usually depend on a variety of state and federal laws and international regulations,
as well as the compromised organization’s industry. Healthcare, for example, is subject to a
different set of regulations than non-healthcare businesses. With so many variables influencing
the notify/don’t notify decision, advice from an experienced breach or privacy attorney can
prove invaluable in meeting legal obligations and mitigating unnecessary costs.
As the forensic and legal analysis concludes, the decision whether to notify affected parties
must be made. If notification is indicated, the incident response plan must be activated and “go-
live” preparations quickly initiated. While the organization’s focus shifts to executing the
incident response plan, it is also important to continue addressing the cause of the breach.
Whether through training employees, replacing equipment, installing new software, adding
staff, creating a new oversight position, or replacing the responsible vendor, some action must
be taken, and quickly. The situation that led to the breach should not be allowed to continue
unchecked, or the entire costly exercise may be repeated unnecessarily.

9.8.3.6.1 Print Vendors


A reputable print provider, for example, can be invaluable in leveraging its equipment and
assets to produce, stuff, mail and track large volumes of letters. The print vendor may also
guide the breach response team leader and appropriate support staff through the notification
effort’s many technical details. Important but less obvious support activities such as gathering

logos, sample signatures, letter copy, and address files must also be completed as production
and delivery deadlines approach.

9.8.3.6.2 Call Center


Once notification letters are delivered, recipients will begin calling and emailing the
organization to inquire about the event and its impact on their lives. In situations where
projected call volume is large enough for call center outsourcing, it is crucial that the team
leader fully understand the vendor’s overall capabilities. As soon as possible, agreements
should be reached and the timeline set for training and assignment of agents, call-routing
programming, message recording, preparation of service level agreements (SLAs) and call
center reporting.

9.8.3.6.3 Remediation Providers


Depending on the nature of the information compromised, breached organizations may choose
to engage remediation providers to reduce consumers’ risk of fraud or identity theft. This may
include a third-party service to monitor credit activity. The service should be offered free to
the consumer and include, at minimum: daily monitoring and alerts of activity from all three
national credit bureaus, identity theft insurance, and fraud resolution services. In some cases,
supplemental services, such as internet scanning (for compromised information), may also be
deployed to help protect consumers.

9.8.3.7 Union Leadership Role During an Incident


In preparation for a breach, union stakeholders should identify appropriate contacts within the
organization and become familiar with its overall breach response plan. Specifically, they
should know the roles, responsibilities and sequence of events to be taken by other nonunion
stakeholders and response teams.
After a breach occurs, the primary roles for the union stakeholder are communication and
coordination. Working with IT, HR or PR executives, the union steward may oversee the use
of electronic communication channels, such as social media or union intranet or website, to
provide members with timely updates and instructions. If member directories and databases are
supplied ahead of time, marketing and call center teams can notify or update members directly
through mail, email or phone calls.

9.8.3.8 President, Chief Executive Officer
One of the first and arguably most critical steps taken by the top executive is to promptly
allocate the funds and manpower needed to resolve the breach. Having resources readily
available helps teams quickly contain and manage the threat and lessen its overall impact.
In the period immediately after a breach, PR or communications teams will handle most of
the media interaction. At some point, however, top executives could be called upon to publicly
comment on the breach’s cause or status. As with any organization attempting to manage a
crisis, accuracy, authenticity and transparency are essential. Regular status updates from IT and
legal, as well as coaching support from PR/communications can prepare the president/CEO for
scrutiny from a potentially hostile media corps.
When addressing the public, executives would do well to follow messaging recommendations
set forth by the communications team. This helps ensure message consistency and reduces the
risks of speaking in error or going off topic. The CEO, supported by the privacy team, might
also be well-advised to get in contact with the responsible data protection authorities or
regulators to discuss the incident and assure them that the breach is being handled from the top
down. With personal information exposed, people’s lives and even livelihoods are at risk.
Therefore, language and tone used to address the public should always be chosen with great
care. The sensitivity with which an organization responds to a breach and executives’ actions
during the event will affect how quickly the organization’s brand trust and customer relations
are restored afterward.

9.8.3.9 Board of Directors


The board is responsible for ensuring that a company is well run, with a particular focus on how
the company handles its risk exposure. During an incident, company personnel and
management will find themselves in frequent communication with the board. The board will
ask questions about how an incident is being handled and whether the company is properly
mitigating its risks.

9.9 Investigating an Incident

Breach investigation is a subset of breach response and occurs once breach investigators have
concluded that sensitive information has been compromised. Professional forensic
investigators can capture forensic images of affected systems, collect and analyze evidence,
and outline remediation steps. During an investigation, on the containment side, the focus is on
isolating compromised systems, containing the damage, and documenting any actions taken.

On the legal side, the focus is on determining whether the event constitutes a “breach” as
defined by the relevant laws, preserving electronic evidence, and establishing a chain of
custody.

9.9.1 Containment
During the investigation phase of an incident, containment will be top of mind for the
IT/information security team. The need to prevent further loss by taking appropriate steps is
critical. These include securing physical areas and blocking bad actors’ access to impacted
data. The approach to these issues, however, needs to be balanced with the legal steps discussed
in the next section.
Another part of containment is fixing the vulnerabilities that allowed the bad actor to access
the systems in the first place. After ensuring any breach is contained, begin analyzing
vulnerabilities and addressing third parties that might have been involved. Where appropriate,
it may be necessary to share learnings, but this should be done in conjunction with the legal
steps discussed in the next section. Factors to consider include:

• Service providers. Were they involved? Is there a need to change access privileges? What steps do they need to take to prevent future breaches? How can you verify they have taken these steps?

• Network segmentation. Ensure your segmentation plan was effective in containing the breach.

9.9.2 The Importance of Privilege


When investigating an incident, a company will want to make sure that its investigation and
related communications and work product are protected by attorney-client privilege. Attorney-
client privilege protects any communication between a lawyer and their client made for the
purpose of giving or obtaining legal advice. The attorney work product doctrine protects
documents or analyses made by a lawyer or at the direction of a lawyer in anticipation of
litigation. A company should involve its attorneys as soon as it discovers a breach has occurred
and ensure that the attorneys are directing the investigation for the purpose of legal advice or
in anticipation of litigation. (CC’ing an attorney on an email is not enough to create privilege.)
It is better to have the process directed by outside counsel than by inside counsel, because
courts have in some instances ruled that there was no privilege where inside counsel appeared
to be acting in a business rather than a legal capacity. A proper investigation may generate
communications and documents containing facts and opinions that reflect badly on the
company, or sensitive materials such as trade secrets. An investigation directed by counsel will
maintain privilege so the company has the freedom to perform a thorough and effective
investigation into the incident without fearing that the communications and documents created
during that process will be used against it in later litigation.

9.9.3 Notification and Cooperation with Insurer


After a cyber incident, a company should notify all its insurance providers, because there may
be coverage under more than just a standalone cyber policy. After notification, the company
should receive a coverage letter from the insurer outlining the scope of coverage. Depending
upon the policy, the insurer may require the company to use the insurer’s preferred service
providers during the investigation. The costs of breach response including notification, credit
monitoring, PR and data recovery can add up quickly. Accordingly, it is important for
companies to seek ongoing updates on the status of their coverage levels.

9.9.4 Credit Card Incidents and Card Schemes


Companies that have contracted with credit card companies to accept credit card payments
must notify those card companies in the event of a breach. The contract should be consulted
because it is likely to contain specific requirements not only about notification but also about
post-breach procedures and cooperation with the card company.

9.9.5 Third-Party Forensics


It may be necessary to engage outside forensics vendors in a complex breach. To ensure the
investigation is privileged, those vendors should be engaged by the attorneys (preferably by
outside counsel) rather than the company. Furthermore, to the extent the company is insured,
it should check with the insurer in case the insurer requires the company to use particular
vendors. The engagement letter with the vendors should specify that their work is undertaken
at the direction of counsel as “work for hire” for the purpose of providing legal advice, and all
documents and communications with the vendor should be labeled confidential and
proprietary.

9.9.6 Involve Key Stakeholders During Investigations


Think carefully about how it is most appropriate to involve key stakeholders. What is the
culture of your company? What are the legal and practical risks in your situation of involving
large groups in potentially sensitive matters? In some cases, daily meetings with your core
response team will be needed. There may also need to be some reporting out to other
stakeholders, especially company leadership.

9.10 Reporting Obligations and Execution Timeline

Not all breaches require notification. There are various types of notification requirements to
regulators and affected individuals. If data was encrypted, or if an unauthorized individual
accidentally accessed but did not misuse the data, the potential harm and risk may be minimal,
and companies may not need to notify under the applicable laws. In some cases, however,
notification may be required even without harm to an individual. Coordinating with legal
counsel to understand notification obligations is therefore critical.
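
The screening logic described above can be captured in a simple checklist so that incidents are triaged consistently before counsel is engaged. The following Python sketch is a minimal illustration; the field names and outcomes are assumptions, not requirements of any particular statute, and the final determination always rests with legal counsel.

from dataclasses import dataclass

@dataclass
class IncidentFacts:
    """Facts gathered during the investigation (illustrative fields only)."""
    personal_data_involved: bool
    data_encrypted: bool                         # was the affected data encrypted?
    evidence_of_misuse: bool                     # any indication the data was actually misused?
    statute_requires_notice_without_harm: bool   # some laws require notice regardless of harm

def preliminary_notification_screen(facts: IncidentFacts) -> str:
    """Rough triage only; legal counsel makes the final determination."""
    if not facts.personal_data_involved:
        return "Likely no notification duty - document the analysis"
    if facts.statute_requires_notice_without_harm:
        return "Notification likely required regardless of harm - escalate to counsel"
    if facts.data_encrypted and not facts.evidence_of_misuse:
        return "Potential harm may be minimal - confirm any safe-harbor conditions with counsel"
    return "Treat as a potential breach - engage counsel on timing and content of notice"

# Example: an encrypted laptop is lost and there is no sign of misuse.
print(preliminary_notification_screen(IncidentFacts(
    personal_data_involved=True,
    data_encrypted=True,
    evidence_of_misuse=False,
    statute_requires_notice_without_harm=False)))

A checklist like this is only a triage aid; it documents the questions asked, not the legal conclusion.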
Breach-reporting obligations for legal compliance vary by jurisdiction, but tend to adhere to
certain principles, including harm prevention, collection limitation, accountability, and
monitoring and enforcement. The legal team will determine, based on the facts, whether the
situation constitutes a breach as defined by relevant laws such that notification is necessary.
If notification is needed, it must occur in a timely manner. (The specific time frame is often
dictated under the relevant statutes.) To best accomplish notification, a timeline to guide the
execution and administration of breach resolution activities is critically helpful. Close
coordination among internal and external stakeholders will also help ensure that all plan
elements are executed in the proper sequence.
No strategy is bulletproof, and no timeline perfect. But the crucial execution phase of the
incident response plan is particularly susceptible to setbacks if realistic, properly sequenced
timelines are not observed.
Because of organizations’ vastly differing cultural, political and regulatory considerations, it
is usually not practical to prescribe a rigid, one-size-fits-all breach event timeline. There is
value, however, in including some or all the following communication tactics when
formulating a breach response.

9.10.1 Notification Requirements and Guidelines


Escalation refers to the internal process whereby employees alert supervisors about a security-
related incident, who in turn report the details to a predefined list of experts—typically the
privacy office—which will then engage IT, information security, facilities or HR. Notification
is the process of informing affected individuals that their personal data has been breached.

During the management of a privacy incident, it is imperative that all internal
communications are locked down so that inaccurate or incomplete details regarding
the incident are not sent around the organization. The incident response team should
be responsible for all internal communications regarding the incident; these
communications should only be forwarded to staff on a need-to-know basis.

Many statutes prescribe specific time frames for providing notification to impacted individuals,
relevant regulators or both. The legal requirements change regularly. For planning
purposes, however, it is enough to know that when investigating an incident, time is of the
essence. Timing is even more critical once the incident has been confirmed to be a breach.
Organization privacy professionals and those charged with incident response planning and
notification should be intimately familiar with the prevailing notification requirements and
guidelines and should work with qualified legal counsel to assist in making the legal
determination about the need to give notice.

Incident response teams should always confirm requirements with legal counsel
experienced in data privacy litigation prior to initiating or forgoing any notification
campaign.

Because of the potential consequences to the organization and to those whose data has been
exposed, organizations must quickly initiate the notification process. This includes verifying
addresses; writing, printing and mailing notification letters; setting up a call center; and
arranging support services such as identity theft protection for affected individuals.
In the United States, some states mandate that notification letters contain specific verbiage or
content, such as toll-free numbers and addresses for the three major credit bureaus, the FTC
and a state’s attorney general. Multiple state laws may apply to one breach, and notification
may be delayed if law enforcement believes it would interfere with an ongoing investigation.
The notification deadline weighs heavily on top of the public scrutiny and already
stressful ordeal of a data breach. Mishandling notifications can lead to severe consequences,
including fines and other unbudgeted expenses. For extra support, some companies enlist the
services of a third-party breach resolution provider to assist with notification, call-handling and
credit-monitoring offers. Lining up providers in advance can sometimes reduce response times
and related costs. Coordinating with your legal counsel—who will be familiar with a wide
range of providers—as well as your insurance carrier (if you have insurance) is important in
making these assessments.

9.10.2 Internal Announcements
Attempting to keep employees from learning of a data loss is neither prudent nor possible. On
the contrary, transparency is typically paramount to maintaining integrity and credibility. When
a breach occurs, in most situations all employees should receive properly worded
communications about the event, along with specific guidelines and prohibitions about
externally disseminating information. Employees should be told who the designated press
contact is. When it comes to external breach inquiries, employees should always defer to those
authorized to speak about the incident and not provide information themselves.
Internal breach announcements should be timed to avoid conflict with other organizational
initiatives and to avoid negative legal exposure. To minimize the chance of leaks, align
messaging, and demonstrate transparency, these announcements should also be delivered at the
same time as external statements.
A breach may affect an organization’s real or perceived financial viability, so the HR team
should prepare to address a range of employee concerns. How these concerns should be
addressed must be considered in light of a company’s legal risks and exposures. If an event has
occurred but does not affect employees directly, the following activities may help supplement
the internal announcement process:

• Creation, approval and posting of employee-only FAQs

• Response training for HR personnel and call center staff

• Creation, approval and distribution of explanatory letter, email or intranet communications

9.10.3 External Announcements


The creation and release of external communications should be closely coordinated with the
call center and, of course, with legal counsel. In addition to notification letters and
press releases, other external strategies and tactics may be deployed to announce and manage
breach communications. Among the most important of these is to engage a professional crisis
management or communications firm (if none are available internally) and designate a senior,
media-trained executive as the organization’s spokesperson.
Talking points should be developed as quickly as possible, so the spokesperson may
confidently address the media. For consistency, foundational message points can be used to
create content for press releases, intranets, the organization’s website, and FAQ sheets for call
centers. A dedicated toll-free number should be secured and routed to the correct call center to
properly handle incoming calls.
While there’s no single correct way to communicate about a breach, messaging should always
be consistent. Potential consequences of inconsistent messaging include public
misunderstandings and assumptions, legal liability issues, loss of trust and consumer
confidence, and evidence of poor planning. Organizations should also consider call center FAQ
review and training, staffing-level assessment to ensure adequate coverage.

9.10.4 Regulator Notifications


Legal counsel should provide guidance on which state, federal or international regulatory
agencies require notification in the event of a data breach. In many instances in the United
States, it is appropriate to contact the state attorney general and, in some cases, the FTC. In the
healthcare industry, the Department of Health and Human Services (DHHS) may need to be
notified as well. Notification to these agencies would be determined on a case-by-case basis,
depending on the size and scope of the data breach; work with legal counsel experienced in
data breaches to provide such notices.

9.10.5 Letter Drops


Letters and emails are the most common forms of breach notification. Once an organization
decides to notify, it must meet specific deadlines under applicable laws while working within
the constraints of complex production and delivery processes, which can be unwieldy and
difficult to reconcile.
Unlike outputting documents from a computer, industrial-level printing requires a great deal
of preparation and quality control. Verifying mailing file completeness, format consistency,
and age of mailing list data can add days to the production timeline.
Moreover, changing content during production or delivering assets (e.g., logos, signatures,
copy) after specified deadlines can unnecessarily delay notification and burn precious days in
an already accelerated schedule.
Here are some time-proven methods for ensuring a more efficient process:

• If appropriate, establish a secure data transfer channel

• Create letter copy and put it into Microsoft Word or another preferred format

• Obtain any necessary content approvals from the compliance and/or legal team

• Send usable data files to the print shop, including a properly formatted logo and
electronic signature

• Supply a return address for undeliverable mail

• Review final letter layout for a legible, aesthetically pleasing appearance

When planning letter drops, remember that a data breach may also involve criminal activity
and, therefore, law enforcement personnel. If officials determine that the notification will
impede their investigation or threaten national security, delays can be expected.
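
Some of the quality-control checks described above, such as verifying mailing file completeness and format consistency, can be scripted before files are sent to the print shop. The following Python sketch is illustrative only; the file layout and column names are assumptions rather than any standard format.

import csv

# Hypothetical column names; adjust to the layout agreed with the print shop.
REQUIRED_FIELDS = ("first_name", "last_name", "address_line1", "city", "state", "postal_code")

def validate_mailing_file(path):
    """Flag records that would stall industrial printing (missing columns or blank fields)."""
    problems = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        missing = [c for c in REQUIRED_FIELDS if c not in (reader.fieldnames or [])]
        if missing:
            return ["Mailing file is missing columns: " + ", ".join(missing)]
        for line_no, row in enumerate(reader, start=2):   # row 1 is the header
            blanks = [c for c in REQUIRED_FIELDS if not (row.get(c) or "").strip()]
            if blanks:
                problems.append(f"Row {line_no}: blank field(s) {', '.join(blanks)}")
    return problems

# Example usage with a hypothetical file name:
# for issue in validate_mailing_file("notification_mailing_list.csv"):
#     print(issue)

Catching blank or missing fields before handoff helps avoid the production delays noted above.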

9.10.6 Call Center Launches


Call centers that are already in place normally have the infrastructure, policies and procedures
needed to switch seamlessly from providing general customer service to answering breach-related
calls. For the switch to be successful, proper preparation of every call center component is
required. Adequately staffing the incident response team is one particularly critical consideration.
To increase headcount, temp agencies or outsourcers may be retained. The next steps are
drafting phone scripts (sometimes in multiple languages), conducting call-handling training,
and recording a message for the call tree. A dedicated toll-free number should be assigned and
a call escalation process identified. Other preparations may include the following (a rough
staffing arithmetic sketch appears after this list):

• Creating, approving and uploading email templates

• Training the quality assurance team on the details of the initiative

• Pulling and analyzing reports

• Monitoring call levels to determine staffing needs
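
Because staffing needs scale with notification volume, a back-of-the-envelope calculation can help size the call center before detailed forecasting is available. The Python sketch below uses purely illustrative assumptions (response rate, peak-hour share, handle time and occupancy are not figures from this chapter) and is a rough heuristic rather than a formal queuing model.

# Back-of-the-envelope staffing estimate; every figure below is an assumption.
notified_individuals = 50_000
expected_call_rate = 0.10      # assume ~10% of recipients call during the first week
peak_hour_share = 0.06         # assume ~6% of the week's calls arrive in the busiest hour
avg_handle_minutes = 8         # assumed average handle time per call
agent_occupancy = 0.80         # assume agents spend 80% of their time actively on calls

total_calls = notified_individuals * expected_call_rate
peak_calls_per_hour = total_calls * peak_hour_share
workload_hours = peak_calls_per_hour * avg_handle_minutes / 60
agents_needed = workload_hours / agent_occupancy

print(f"Expected calls in week one: {total_calls:,.0f}")       # 5,000
print(f"Peak-hour calls: {peak_calls_per_hour:,.0f}")          # 300
print(f"Agents needed in the peak hour: {agents_needed:.0f}")  # 50

In practice, call volumes spike in the days immediately after a letter drop, so the assumed peak-hour share should be revisited once real call data arrives.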

9.10.7 Remediation Offers


Besides trying to protect incident victims’ identities, companies tend to offer remediation
services to soften the blow of a breach. If a remediation offer is made, the organization should
facilitate the dialog between the parties involved, which typically include the credit-monitoring
provider, letter print shop, and call center.
As a best practice, the notification letter should feature a full description of the remediation
product, enrollment instructions, and a customer service phone number or email address. An
activation code, by which the recipient may redeem the remediation product, should also be
included. To ensure close collaboration among the three groups, the following steps are highly
recommended (a brief code sketch of the activation-code step follows these lists):

Remediation Organization

• Create one activation code per affected person for inclusion in notification letters

• Provide a full product description to the printer and the call center vendor, along
with a toll-free number and an enrollment website URL

• Launch and test the website for enrollments

• Ramp up and train internal call center staff to enable phone enrollments and
answer product questions

• Approve the final letter copy as it pertains to the accuracy of the offer details

Print Shop

• Obtain product description and activation codes from the remediation firm

• Merge product copy and activation codes into notification letters

• Print and mail letters according to agreed-upon standards and timelines

Call Center

• Receive product description and, as appropriate, train internal staff on basic product questions

• Determine and institute call transfer procedures between the vendor call center,
remediation firm and affected organization
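
For the activation-code step referenced above, the remediation organization needs one unique, hard-to-guess code per affected person. The following Python sketch is illustrative; the code length and alphabet are arbitrary choices, not requirements of any remediation provider.

import secrets
import string

# Uppercase letters and digits keep codes easy to read over the phone (illustrative choice).
ALPHABET = string.ascii_uppercase + string.digits

def generate_activation_codes(affected_ids, length=12):
    """Create one unique, hard-to-guess activation code per affected person."""
    codes = {}
    used = set()
    for person_id in affected_ids:
        code = "".join(secrets.choice(ALPHABET) for _ in range(length))
        while code in used:   # collisions are extremely unlikely, but cheap to guard against
            code = "".join(secrets.choice(ALPHABET) for _ in range(length))
        used.add(code)
        codes[person_id] = code
    return codes

# Example: codes keyed by an internal record identifier, ready to merge into letters.
for record_id, code in generate_activation_codes(["rec-001", "rec-002", "rec-003"]).items():
    print(record_id, code)

Using a cryptographically secure random generator keeps codes from being guessed, which matters because the code is what entitles the recipient to the remediation product.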

9.10.8 Progress Reporting


There is some debate about the level and type of progress reporting that is needed for an
incident. Keep in mind that every situation is different. That said, making sure the incident
team is well-informed and moving toward a unified goal is critical.
For complex or large-scale data breaches where notification is required (as determined by
legal), there will be a significant number of letters mailed, calls received, and credit-monitoring
enrollments. Keeping track of this information and being prepared to report up (or down) is
important, and having a strong reporting structure plays a pivotal role in distilling the chaotic
flow of reports into a clearer, more manageable stream.
You will need to give different types of reports to different stakeholders based on their need
to know. Regardless of audience, progress reporting during the breach recovery period should
focus on the question, “What data do they need, and when do they need it?”
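
One way to answer that question consistently is to track a small, agreed set of daily metrics in one place and roll them up for each audience. The following Python sketch is illustrative; the metric names and roll-up choices are assumptions, not a prescribed reporting format.

from dataclasses import dataclass
from datetime import date

@dataclass
class DailyBreachMetrics:
    """One day of response metrics (names are illustrative, not a prescribed format)."""
    day: date
    letters_mailed: int = 0
    calls_received: int = 0
    calls_escalated: int = 0
    monitoring_enrollments: int = 0

def weekly_summary(days):
    """Roll daily figures up into the kind of summary leadership typically requests."""
    total_calls = sum(d.calls_received for d in days)
    return {
        "letters_mailed": sum(d.letters_mailed for d in days),
        "calls_received": total_calls,
        "escalation_rate": sum(d.calls_escalated for d in days) / max(1, total_calls),
        "monitoring_enrollments": sum(d.monitoring_enrollments for d in days),
    }

# Example with made-up figures for two days:
print(weekly_summary([
    DailyBreachMetrics(date(2024, 3, 4), letters_mailed=5000, calls_received=420,
                       calls_escalated=18, monitoring_enrollments=310),
    DailyBreachMetrics(date(2024, 3, 5), letters_mailed=4800, calls_received=390,
                       calls_escalated=11, monitoring_enrollments=295),
]))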

During the breach notification period, the incident team may be called upon to provide
metrics about how the event is being received by affected individuals, the press, regulators and
the public generally. Requests may come from company leadership, the board, impacted
departments, or even regulators who are closely watching the company. Typically, stakeholders
will want weekly updates, although in some circumstances daily reports are requested.
The type of reporting and frequency should always be customized to each individual event.
When putting together a reporting plan, keep in mind who is asking, what they need to know,
and legal issues of privilege and risk. The answers to these questions will inform the format of
the reporting content and structure.
During the active notification period, mail drops should be reviewed at least daily to ensure
alignment with approved delivery deadlines. Additionally, mailing and call center activities
should be closely coordinated to ensure response staffing levels are optimal. In situations where
victims receive credit activity monitoring or other remediation, it may be beneficial to track
enrollments and customer escalations daily for at least the first few weeks.
In the first days or weeks (depending on the severity of the incident), senior management
may request briefings on the developments daily. Similarly, the PR group will often track daily
breach-related news coverage to confirm that the organization’s event narrative is being
interpreted as intended. To mitigate public backlash, clarifying responses to inaccuracies or
negative press should be prepared and practiced in advance.
Investors and other external stakeholders will naturally want to keep abreast of all breach-
related developments. If the organization is publicly traded, a good practice is to update senior
management at least weekly for the first few months after breach notification.
Regular reviews should be scheduled to update functional leaders, senior managers and other
key stakeholders about the status and impact of the incident response effort. A breach’s effects
on employee productivity and morale should not be underestimated, so keeping workers
informed about how the incident is being handled is always a top priority.

9.11 Recovering from a Breach

9.11.1 Response Evaluation and Modifications


Incident response can be tested with a variety of scenarios. But even a well-written plan can
falter when the theory behind it collides with realities on the ground. As teaching tools, real-
life breaches are far superior to hypothetical scenarios, so lessons learned from all incidents
must afterward be captured, recorded and incorporated into the plan.

Once the initial chaos of a breach has subsided, the affected organization should carefully
evaluate its incident response plan. Even the most well-thought-out responses can benefit from
the lessons learned after a live event.
Among the most beneficial questions to answer about the response are:

• Which parts of the process clearly worked as intended?

• Which worked only after some modification?

• Which did not work at all?

• What did the team do exceptionally well? What didn’t go well?

• Were any unforeseen complications encountered? How could they have been
avoided?

• How well was the team prepared for the unexpected?

• How realistic were the plan’s response timelines?

• What was the difference between actual and budgeted costs?

• Was the team sufficiently staffed?

• Were all relevant parties part of the team?

• What could be learned, and what could be improved upon, for the next potential breach?

9.11.2 Calculating and Quantifying the Costs


While many breach-related costs can be identified and tallied using actual invoices, others are
less apparent. Lost business opportunities and damage to brand equity are examples of costs
that may affect the bottom line for years following a breach. Table 9-3 includes typical
categories of breach-related expenses in cases where costs can be traced to specific activities.

Table 9-3: Breach-Related Expenses

Legal Costs

• Punitive Costs: Fines, lawsuits and other penalties stemming from negligence in preventing or improperly responding to the breach

Internal Costs

• Outside Counsel: Legal review of the organization’s contractual and regulatory obligations after a breach; may include defense costs if litigation results

• Crisis Management/PR: Experts to help the organization craft and deliver cohesive, properly timed and customer-friendly communications about the incident

• Forensic Investigators: Specialists to confirm, contain and eliminate the cause of the breach and determine the size, scale and type of records affected

• Call Center Support: Staffing, training and support of the customer care team responsible for handling calls and emails related to the incident and its aftermath

• Equipment Replacement and Security Enhancements: Equipment changes, system upgrades and physical security improvements to mitigate the current breach and prevent future incidents

• Insurance: Retention (deductible) payments and fee increases associated with the breach

• Card Replacement: The cost of issuing new cards (in incidents when credit card numbers have been compromised)

• Employee Training: Educational activities intended to improve upon previous programs that facilitated the breach

Remediation Costs

• Victim Notification: Creation and delivery of letters, emails, web pages and other methods/channels to notify affected individuals about the incident

• Remediation Offers: Provision of services such as credit monitoring, fraud resolution and identity theft insurance to breach victims

• Victim Damages: Costs related to correcting damages incurred by breach victims

Intangible Costs

• Customer Retention: Marketing campaigns designed to prevent customer attrition and win back lost business following an incident

• Lost Revenue and Stock Value: Reductions in stock price, lost customers and other revenue decreases directly related to the loss

• Opportunity Costs: Lost productivity and revenues, as employees suspend regularly assigned tasks to assist with breach response

According to the Ponemon Institute, the probability of a data breach in a 24-month period is
almost 28 percent.13 The numbers shown in Table 9-4 can be helpful when a privacy manager
is attempting to conduct a cost-benefit analysis or get buy-in or budget from organizational
leadership for breach preparedness measures. Several factors can affect the per-capita cost of
a data breach—both positively and negatively. Knowing this can help organizations prioritize
their spending to mitigate potential costs of a breach.

Table 9-4: Average Cost Saved per Record in the Event of a Breach14
Incident response team +$14.00

Extensive use of encryption +$13.10

Employee training +$9.30

Business continuity management (BCM) involvement +$9.30

Participation in threat-sharing +$8.70

Artificial intelligence platform +$8.20

Use of security analytics +$6.90

Extensive use of data loss protection (DLP) +$6.80

Board-level involvement +$6.50

CISO appointed +$6.50

Data classification schema +$5.10

Insurance protection +$4.80

CPO appointed +$1.80

Provision of ID protection -$1.20

Consultants engaged -$3.70

Rush to notify -$4.90

Extensive use of internet-of-things (IoT) devices -$5.40

Lost or stolen devices -$6.50

Extensive use of mobile platforms -$10.00

Compliance failures -$11.90

Extensive cloud migration -$11.90

Third-party involvement -$13.40


NOTE: These figures indicate money saved; e.g., having an employee training program saves,
on average, $9.30 per record in the event of a data breach.15
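
For budgeting discussions, the per-record savings in Table 9-4 can be combined with the breach probability cited above to produce a rough expected-value estimate. In the Python sketch below, the number of records at risk and the baseline cost per record are illustrative assumptions; only the 28 percent probability and the two per-record savings figures come from the Ponemon data cited in this chapter.

# Rough expected-value estimate; record count and baseline cost per record are
# illustrative assumptions, not figures from this chapter.
breach_probability_24_months = 0.28     # Ponemon figure cited above
records_at_risk = 100_000               # assumed exposure for this example
baseline_cost_per_record = 148.00       # assumed baseline cost per record (illustrative)

savings_per_record = {
    "incident response team": 14.00,    # from Table 9-4
    "employee training": 9.30,          # from Table 9-4
}

expected_breach_cost = breach_probability_24_months * records_at_risk * baseline_cost_per_record
expected_savings = breach_probability_24_months * records_at_risk * sum(savings_per_record.values())

print(f"Expected breach cost over 24 months: ${expected_breach_cost:,.0f}")   # $4,144,000
print(f"Expected savings from the two measures: ${expected_savings:,.0f}")    # $652,400

An estimate framed this way can help a privacy manager make the cost-benefit case for preparedness spending described above, provided the assumed exposure figures are replaced with the organization's own data.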

9.12 Benefiting from a Breach

While no organization would choose to experience a data breach, failures breed opportunity
for organizational change and growth. How can you ensure you walk away from a breach better
prepared for the future? Be sure to conduct a breach or incident response review, or a post-
incident assessment. At minimum, review these items:

• Staffing and resourcing

• Containment, including timing and processes

• The C-suite commitment, including signoff on new measures and allocation of resources

• Clarity of roles of the response team and others

• The notification process for individuals, regulatory bodies and others

Your organization’s objectives for breach management will likely change after an incident.
Take this time to renew your funding, focus and commitment.

9.13 Summary

A proper breach response plan provides guidance for meeting legal compliance, planning for
incident response, and handling privacy incidents. An organization needs to be prepared to
respond to its internal and external stakeholders—including regulators. The privacy
professional and related team members need to be prepared to respond appropriately to each
incoming request to reduce organizational risk and bolster compliance with regulations.

Endnotes

1 Ponemon Institute, 2018 Cost of Data Breach Study, IBM, p. 3, July 2018, https://public.dhe.ibm.com/common/ssi/ecm/55/en/55017055usen/2018-global-codb-report_06271811_55017055USEN.pdf (accessed November 2018).
2 Id., p. 19.
3 Id.
4 U.S. Department of Labor, Bureau of Labor Statistics (2017), https://www.bls.gov/news.release/archives/union2_01192018.pdf (accessed November 2018).
5 AFL-CIO, www.aflcio.org/About/AFL-CIO-Unions (accessed November 2018).
6 State of the Phish Report, Wombat Security, p. 3, https://www.wombatsecurity.com/blog/2018-state-of-the-phish-phishing-data-insights-and-advice (accessed November 2018).
7 Id., p. 6.
8 Id., p. 7.
9 The 2016 Continuity Insights and KPMG LLP Global Business Continuity Management (BCM) Program Benchmarking Study, p. 10, https://assets.kpmg.com/content/dam/kpmg/kz/pdf/2016-CI-KPMG-Report.pdf (accessed November 2018).
10 Id.
11 Id.
12 Id., p. 13.
13 Ponemon Institute, 2018 Cost of Data Breach Study, IBM, p. 1, July 2018, https://public.dhe.ibm.com/common/ssi/ecm/55/en/55017055usen/2018-global-codb-report_06271811_55017055USEN.pdf (accessed November 2018).
14 Id., p. 22.
15 Id., p. 22.
