
2/22/2021 CCSP 2019: Implementing Data Discovery & Classification Transcript

CCSP 2019: Implementing Data Discovery & Classification


Proper data governance begins with labeling data and applying security controls based on those labels. Explore information rights management (IRM)
and challenges associated with data discovery, as well as the roles played by PKI (public key infrastructure) security certificates and virtual private
networks (VPNs) in the cloud. This 6-video course prepares learners for the (ISC)2 Certified Cloud Security Professional (CCSP) exam. Begin with
IRM objectives such as data rights, provisioning, and access models. Examine data discovery approaches and techniques for structured and
unstructured data, and challenges of data discovery in the cloud. Then examine data classification, enabled by using Microsoft Azure Information
Protection for sensitive data such as Protected Health Information (PHI), Personally Identifiable Information (PII), and cardholder data. Recognize
how PKI provides security for digital IT solutions; how to use PowerShell to create PKI certificates; and how to generate certificates in a Microsoft
Azure Key Vault. Learn how VPNs are used for secure cloud resource access. Then configure a Microsoft Azure point-to-site VPN and a custom
Microsoft Azure Key Vault key for storage account encryption.

Objectives
discover the key concepts covered in this course
list IRM objectives such as data rights, provisioning, and access models
recognize data discovery approaches and techniques for structured and unstructured data
list challenges associated with data discovery in the cloud
enable data classification using Microsoft Azure Information Protection for sensitive data such as Protected Health Information, Personally Identifiable Information, and cardholder data
recognize how PKI provides security for digital IT solutions
use PowerShell to create PKI certificates
generate certificates in a Microsoft Azure Key Vault
define how VPNs are used for secure cloud resource access
configure a Microsoft Azure point-to-site VPN
configure a custom Microsoft Azure Key Vault key for storage account encryption
summarize the key concepts covered in this course

Table of Contents
1. Course Overview
2. Information Rights Management
3. Data Discovery
4. Data Discovery Challenges
5. Data Classification
6. Public Key Infrastructure
7. PowerShell Certificate Generation
8. Cloud Certificate Generation
9. VPNs and the Cloud
10. Configuring a Cloud VPN
11. Custom Cloud Storage Encryption Keys
12. Course Summary

file:///C:/Disk D/CISM_CISSP_CIS_CCSP/CCSP/4. Implementing Data Discovery & Classification Transcript.html 1/15



Course Overview
[Video description begins] Topic title: Course Overview [Video description ends]

Hi I'm Dan Lachance.

[Video description begins] Your host for this session is Dan Lachance . He is an IT trainer and consultant. [Video description ends]

I've worked in various IT roles since the early 1990s including as a technical trainer, as a programmer, a consultant, as well as an IT tech author and
editor. I've held and still hold IT certifications related to Linux, Novell, Lotus, CompTIA, and Microsoft. Some of my specialties over the years have
included networking, IT security, cloud solutions, Linux management, and configuration and troubleshooting across a wide array of Microsoft
products.

CCSP, or Certified Cloud Security Professional, proves to the world that you have the cloud security skills necessary to use the best practices and
guidelines set out by (ISC)², to properly design, manage and secure applications, infrastructure and data in the cloud. In this course, I'll explore
how to implement data discovery and classification.

This includes the objectives, tools and challenges related to data discovery and classification. I'll begin by describing key data discovery approaches
and techniques, and how to implement data discovery for structured as well as unstructured data. Then I'll discuss the challenges associated with data
discovery in the cloud.

I'll then shift gears and explore data classification. This will include how to implement data classification mapping, implementing data classification
labeling, and implementing data classification for sensitive data, such as protected health information, otherwise called PHI, as well as personally
identifiable information or PII, as well as cardholder data.

Finishing up, I'll provide an overview of information rights management, IRM, and I'll list the key IRM objectives such as data rights, provisioning
and access models. And finally, I'll explore IRM tools such as the issuing and revocation of certificates.

Information Rights Management


[Video description begins] Topic title: Information Rights Management. Your host for this session is Dan Lachance . [Video description ends]

As the name implies, Information Rights Management, otherwise called I-R-M or IRM focuses on who has access to specific types of information and
through which methods. So we're going to talk about resource permission such as making sure that people have read or write capabilities to the
appropriate resources in the cloud. As well as the access model to access those resources.

For example, they might be using roles that are assigned to assistant cloud administrators so that they can manage a subset of cloud resources.
Information Rights Management applies to many different types of information. Including files and folders that might be hosted on an on-premises
file server. Or files and folders hosted in Cloud Storage. Or even a hybrid of both, where you're synchronizing an on-premises file server to the cloud.
It also includes web pages, e-mail messages as well as database objects.


These include the databases themselves, as well as tables, views, stored procedures and query types. Cloud information management comes in many different categories.
Including identity and access management pictured here at the top. Identity and access management is often simply referred to as I-A-M, IAM. And
Including identity and access management pictured here at the top. Identity and access management is often simply referred to as I-A-M, IAM. And
here we're talking about where user accounts will exist. You might, for instance, create user accounts in a cloud repository. Or you might replicate or
synchronize your on-premises user accounts that you already have to the cloud.

That would allow users to authenticate with their on premises credentials. Yet still be able to access cloud resources without a different set of
credentials. Identity and access management can also include single sign-on or SSO. So that users don't have to keep entering in the same credentials
when they access different resources. The next item here that we see on the right are access control lists, or ACLs. ACLs apply to many different
types of data security models, including firewalls.

You can have an ACL, or an access control list. Which is a set of rules that controls what traffic is allowed into or out of a network, or even a virtual
machine network interface. But an access control list can also be applied to files or rows in a table. Where we can determine who has access to a
resource and what those specific permissions are. At the bottom here with cloud information management, we see data loss prevention otherwise
called DLP.
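
The ACL evaluation described above, an ordered set of allow/deny rules checked against a request, can be sketched in Python. This is a teaching model only, not any particular cloud provider's ACL semantics; the rule fields and first-match-wins behavior are assumptions chosen for illustration:

```python
# Illustrative ACL model: ordered rules, first match wins (a common
# firewall-style evaluation; real ACL semantics vary by product).
def evaluate_acl(rules, principal, action):
    """Return True if access is allowed, False otherwise."""
    for rule in rules:
        if rule["principal"] in (principal, "*") and rule["action"] in (action, "*"):
            return rule["effect"] == "allow"
    return False  # implicit deny when no rule matches

acl = [
    {"principal": "alice", "action": "read",  "effect": "allow"},
    {"principal": "alice", "action": "write", "effect": "deny"},
    {"principal": "*",     "action": "read",  "effect": "allow"},
]
```

With these rules, alice can read but not write, and everyone else falls through to the wildcard read rule or the implicit deny.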

The idea with DLP is to put policies into motion. So that we don't have sensitive data that leaks outside of the organization. For example, sending an
email message that contains a file attachment with sensitive information. That might be fine within the organization, but not being sent outside. So we
could force encryption. Or prevent that from happening in the first place through the configuration of data loss prevention or DLP policies. Finally,
another aspect of cloud information management would be to encrypt it.

Encryption keys can either be provided by a cloud service provider if you want to encrypt the data stored in cloud storage. Or you might generate your
own keys that are under your control to do the same thing.

Data Discovery
[Video description begins] Topic title: Data Discovery. Your host for this session is Dan Lachance . [Video description ends]

Protecting data assets in the cloud, first means you have to know what data is out there and how it's stored as well as where it's stored. So first, let's
start with the definition of terms. We want to distinguish the difference between data and information. Because technically, they're not exactly the
same thing. So when we talk about data, we're really talking about things like raw text or numbers. As an example, imagine that we have an individual
cloud-based virtual machine performance metric item. Let's say the CPU utilization for a point in time, now that's data.

But information is data that has been organized in some way. So it's been ordered, it’s been structured, it’s been contextualized. So to further our first
example, information in this case might include the average of a virtual machine performance metric over time. That gives more insight. It's
informational as opposed to just a raw single data point that's not relative to anything else. Now, data discovery also means understanding how to pull
data out of databases. And we can do that using Structured Query Language, or SQL.

SQL is a language for accessing relational database objects and the data stored within them, so accessing things like tables and the records or rows
within those tables. With a structured query language or a relational type of database environment, you can have multiple tables that are related via
common fields. This is referred to as normalization to reduce repeated data. So imagine you want to store customer information.


But then you also want to store each customer transaction. Well, for each transaction, you're not going to want to repeat all the customer information
like their mailing address, their email address, their phone number. Instead, that would be a separate table linked to a transaction table, perhaps
through a customer ID field. SQL objects include the servers themselves that host databases, SQL databases as well as the rows which are stored
within tables. The rows are the actual data. When we talk about servers, databases, tables and fields, these are the constructs that allow the storage of
the actual data.
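
The customer/transaction normalization described above can be sketched with an in-memory SQLite database. The table and column names here are invented for illustration:

```python
import sqlite3

# Two normalized tables: customer details live once in 'customers';
# each transaction carries only a customer_id foreign key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name TEXT, email TEXT
    );
    CREATE TABLE transactions (
        transaction_id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(customer_id),
        amount REAL
    );
""")
conn.execute("INSERT INTO customers VALUES (1, 'Bob', 'bob@example.com')")
conn.executemany("INSERT INTO transactions VALUES (?, ?, ?)",
                 [(10, 1, 25.00), (11, 1, 99.50)])

# A join reassembles the related rows without repeating customer data.
rows = conn.execute("""
    SELECT c.name, t.amount
    FROM transactions t JOIN customers c ON t.customer_id = c.customer_id
    ORDER BY t.transaction_id
""").fetchall()
```

The customer's mailing details are stored once, yet every joined row can still present them, which is exactly the repetition that normalization avoids.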

In the cloud, as well as on premises, you can also work with NoSQL databases. Now, this is different than your standard relational database because it
has a less rigid schema. Now, when we say less rigid schema, what we're talking about is the blueprint on how data will be stored is much less strict.
In other words, you can store many different types of data, even within a single table, without having to define the specific columns and their data
types.

It also is designed for working with big data. So NoSQL databases are highly scalable. NoSQL databases share a number of characteristics, such as
high performance and availability. The fact that each row can store a completely different type of data than the previous row. And that we've got a
multitude of different types of NoSQL storages. That would be key-value, document, or graph. Now, remember that this is non-relational.

Unlike a standard relational environment, we don't have to normalize data and link it in different tables via a common field. The other aspect of
NoSQL is that it's designed to work on a very large scale. It’s designed for big data analytic processing. And so it supports horizontal scaling. Now,
technically, so does a relational database environment. But most NoSQL environments are designed with clustering in mind right from the beginning.
So clustering, load balancing to distribute incoming app requests among multiple back end nodes, and even replicating data.
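
The schema flexibility just described can be illustrated with a minimal key-value/document model in Python. This is a teaching sketch, not a real NoSQL engine; the document ids and fields are invented:

```python
# Each "document" is a free-form dict: documents need not share fields,
# unlike rows in a relational table with a fixed schema.
store = {}  # key-value store: document id -> document

store["vm-001"] = {"type": "vm", "cpu_pct": 72, "region": "eastus"}
store["user-42"] = {"type": "user", "name": "Bob", "roles": ["reader"]}

def find(store, **criteria):
    """Naive scan: return ids of documents matching all given fields."""
    return [k for k, doc in store.items()
            if all(doc.get(f) == v for f, v in criteria.items())]
```

Note that the two documents share no columns at all, yet both live happily in the same store and remain queryable.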

Data Discovery Challenges


[Video description begins] Topic title: Data Discovery Challenges. Your host for this session is Dan Lachance . [Video description ends]

Data discovery is important in that organizations need to be aware of the data that they have and how it's being stored in a cloud computing
environment. As well as an on-premises environment, especially when you've got a hybrid, you're synchronizing data between both. The first thing to
consider is retention policies.

Retention policies can be related to backups and long-term archiving. A lot of the times, that will be influenced by laws and regulations that stipulate
details about how long certain types of data need to be retained. But then we have to think about metadata accuracy. First of all, what's metadata?
Metadata is additional data that you would tag items with, like files, folders, cloud resources like virtual machines and storage accounts.

Now, you do this, you add this metadata, so that you can categorize items in a similar manner. So if things aren't labeled correctly in the first place,
how are you supposed to find anything quickly? So that is one of the challenges with data discovery, is making sure that metadata added to items is in
fact accurate and that it's been done in the first place. The other challenge is going to be the location of data. We know that with the public cloud,
cloud service providers have points of presence all over the globe in different areas.
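
The metadata tagging and accuracy problem described above can be sketched as a simple tag filter. The resource and tag names are invented for illustration; real cloud providers expose tag queries through their own APIs:

```python
# Resources tagged with metadata; discovery then filters on tags.
resources = [
    {"name": "vm-prod-01",  "tags": {"env": "prod", "data": "pii"}},
    {"name": "vm-test-01",  "tags": {"env": "test"}},  # 'data' tag missing
    {"name": "storacct-01", "tags": {"env": "prod", "data": "public"}},
]

def discover(resources, tag, value):
    """Return names of resources whose metadata matches tag=value."""
    return [r["name"] for r in resources if r["tags"].get(tag) == value]

# Surfacing items that were never labeled is part of keeping metadata accurate.
untagged = [r["name"] for r in resources if "data" not in r["tags"]]
```

A discovery query for PII-tagged items only finds what was labeled; the `untagged` check surfaces the items where labeling was never done in the first place.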

And so when we deploy cloud resources, like virtual machines or databases or storage accounts, we can deploy them in a specific part of the world.
But we can then have it replicated to a totally different region elsewhere in the world. So understanding the data replication configuration and patterns
is going to be important to discover where data is. And when we say data, we have to think about the fact that there might be multiple copies of it due
to this replication.

The other thing to think about is content delivery networks. In the same way as replication with standard cloud storage can exist in different parts of
the world, so too can cached content that you've configured to be pushed out to different regions. So that when users request content, it's local,
therefore it's quicker. It improves the user experience. We then have to think about having the necessary permissions to access data in the first place.

Now, this is from the perspective, for example, of law enforcement that might be following through with a court order to discover data, even in the
form of eDiscovery. The appropriate permissions to access data are important once we know what types of data we have and where they are located.
The next thing to think about are audit trails. Not just normal audit activity, but also discovery activity.

The very nature of data discovery means that we're going to be accessing data which could result in audit log entries, and that should be noted and
everyone should be made aware of this. The other thing to consider is documentation that might be related to data discovery. It could also be
indirect documentation, like network diagrams that outline where cloud resources are stored and whether or not they're replicated in
different parts of the world.

Data Classification
[Video description begins] Topic title: Data Classification. Your host for this session is Dan Lachance . [Video description ends]

Data classification involves categorizing items, data assets, so that we can properly apply security controls such as those that are designed to protect
sensitive information. Personally Identifiable Information, otherwise called PII, P-I-I, is really anything that can uniquely identify an individual. That
could be one or more pieces of information. So when we have combined pieces of information used together, they can link back to an individual. It's
still considered PII.

So this is all about data privacy for the collection of data, the transmission of data, and the storage of data as well as the sharing of data and its
ultimate usage and processing. And that's why, when it comes to various laws and regulations, they will make reference to personally identifiable
information and ways in which it must be treated. Personally identifiable information would include items such as a web browser cookie. Technically,
it really depends on what's in the cookie, but generally speaking, it is considered PII. User location, such as through GPS coordinates, is considered
personally identifiable information, at least at that point in time.

IP addressing, credit card numbers, so anything like that. Now, we can conduct a PII audit assurance review to ensure that the current security controls
that are in place are effective. So this would involve, first, reviewing existing PII policies, evaluating the PII control efficacy. So, are things effective
in protecting that sensitive data the way they need to be protected? Usually, in accordance with laws and regulations. Then we can identify any
deficient PII controls. And then we can implement change, whether it's reconfiguring an existing solution, or putting something in place that's brand
new.
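
As one concrete example of a control for cardholder data, many DLP scanners validate candidate card numbers with the Luhn checksum before flagging them. This is a simplified sketch; real products combine the checksum with context and pattern matching:

```python
def luhn_valid(number: str) -> bool:
    """Luhn checksum check, as used by payment card numbers."""
    digits = [int(d) for d in number if d.isdigit()]
    total = 0
    # From the rightmost digit, double every second digit;
    # subtract 9 when doubling yields a two-digit value.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return len(digits) >= 12 and total % 10 == 0
```

The well-known test number 4111 1111 1111 1111 passes; changing a single digit fails the checksum, which cuts down false positives when scanning for cardholder data.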

We then have protected health information, otherwise called PHI. This is similar to PII, but it's focused instead on the medical industry in the United
States. It deals with both past and current health related information, and also future health details that might be related to health care, how it will be
paid for, such as through insurance payments. Protected health information would include things such as a patient's name, their Social Security
number, medical records, blood type, lab test results, phone number, e-mail address.

Pretty much anything that could trace back to an individual. So classifying data, then, means we have to have a sense of what types of data we have
out there and where they're stored. So we need to conduct a data inventory to determine that. Then we have to think about any security clearances that

we might have to define within the organization to allow limited or restricted access to sensitive information. And as always, we have to think about
existing controls and whether they are effective in protecting those data assets against threats.

Data classification can happen internally. So for example, when it comes to user on-boarding for new hires and working with budgets or
organizational charts, all of this can be considered internal types of items or data assets that can be classified. Then we can classify or categorize
sensitive items like medical patient information, financial information, protected health information and PII. Even things like cloud storage that might
not be allowed due to where the data would reside and which laws would be applicable.

We can also look at classifying data, for example, as being public. That would be marketing information, or product and service documentation that
might even be made available on a product website. After you've classified data and determined which is sensitive and needs to be protected, you can
think about the security controls that will protect that information. You might look at using multifactor authentication to enhance user sign in security.

You might look at encryption, whether it's encryption of data in transit over the network, or data at rest when it's stored. You might enable detailed
auditing or turn on auditing for items that are currently not being audited, that are related to sensitive information. You might change the configuration
of malware scanning.

You might enable data loss prevention or DLP policies, and you might reconsider your strategy for data synchronization and replication, such as to
different countries where cloud provider data centers might exist. And then you need to think about storage sanitization. So maybe certain types of
important information that is considered sensitive needs to be removed or deleted in a specific manner compared to non-sensitive information.
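
Mapping classification labels to required controls, as described above, might be sketched like this. The labels and control names are illustrative, not from any specific product or standard:

```python
# Hypothetical baseline: each classification label implies a set of controls.
CONTROLS = {
    "public":    set(),
    "internal":  {"auth"},
    "sensitive": {"auth", "mfa", "encryption_at_rest",
                  "encryption_in_transit", "audit", "dlp"},
}

def compliant(label: str, deployed: set) -> bool:
    """Are all controls required for this classification in place?"""
    return CONTROLS[label].issubset(deployed)
```

A gap analysis then falls out naturally: the set difference between required and deployed controls lists exactly the deficient controls to remediate.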

Public Key Infrastructure


[Video description begins] Topic title: Public Key Infrastructure. Your host for this session is Dan Lachance . [Video description ends]

PKI is one of those digital security solutions that can be used both on-premises as well as in a cloud computing environment. PKI or Public Key
Infrastructure is a hierarchy of digital security certificates. These certificates get issued and are managed by certificate authorities or CAs. Now, you
could have a private CA within your own organization that you have created, that in turn, will issue certificates for users or devices. But you could
also go to a public certificate authority, a public CA, and pay a fee to have certificates issued from it.

The benefit of doing that is that it will be trusted by devices. For example, if we issue our own private CA web server certificate for a public website,
well, yes, it can be enabled for HTTPS secure connectivity. The problem is, people's web browsers aren't going to trust your private CA as an issuer of
certificates. Where they would trust a public CA. So they're going to get a warning message that basically says not to proceed to that site. That's bad
for business.

So with the PKI hierarchy, at the very top we have the root certificate authority. Now, underneath that, optionally, you can also have a collection of
subordinate CAs. Now, these subordinate CAs might be for different departments within the organization. Or different business units, different
projects, even different countries. However, under the subordinate CA, you would then have the certificates issued from that subordinate CA.

Now, you certainly can have certificates issued directly from the root CA. You don't have to have this subordinate CA level in the PKI hierarchy. So
with PKI, the issuing certificate authority digitally signs issued certificates. Now, root CAs are very important. In other words, if the root CA gets
compromised, then all certificates underneath it are compromised.

So the root CA should be brought offline so it's not available unless it's absolutely needed. Because if it's offline, it's harder to get into and to hack
into. So what about a compromised subordinate CA, how would that work? Well, it's a hierarchy, so then certificates issued under that CA would be
compromised. But not certificates issued by other subordinate CAs or the root CA. So the PKI certificate then is issued by a certificate authority. And
it could be issued to users, or devices, or even software.

Now, we issue certificates, or the CA issues certificates for the purpose of encryption, integrity. So making sure the data hasn't been tampered with in
an authorized way. And authentication, such as ensuring an email message came from who it says it came from. Now, PKI certificates can be stored as
files on a device, or they can also be burned into smartcards. Now, what exactly is inside a PKI certificate? Well, here's a small sampling to give you
an idea. The X.509 version number, because PKI certificates are also called X.509 certificates.

They adhere to the X.509 certificate standard. The digital signature of the CA is also contained within an issued certificate, as well as the signature
algorithm that was used. Such as SHA-256 for example, secure hashing algorithm, 256 bits. The certificate has a unique serial number, it's got an
issue date and an expiry date after which it is not valid. Just like a driver's license, it's not valid after it expires.

Also the certificate's intended use is stored within the certificate. The subject name, that one's important because that might be a user's email address or
it might be a website URL. And also a unique, mathematically related public and private key pair that is used for security operations like encryption and
digital signatures.

So the public key can be shared publicly with any user or device. That's why it's called public key, there's no security risk in doing this. So what's
interesting is that if you encrypt a message to somebody, let's assume that you are sending an email message to your colleague, Bob. Well, what
happens is you need Bob's public key to encrypt the message for him. Now, the public key is also used to verify a digital signature.

If Bob digitally signed an email and sent it to you, you would use the public key for Bob to verify his digital signature. Now, private keys must be
made available only to the owner of the key, hence the name private. They can also be embedded in cards like smartcards. So let's go back to our
examples of encryption and digital signatures. So encrypted messages are decrypted or unscrambled using a private key.

So in our example, we said we need Bob's public key to encrypt an email message sent to him. Well, he needs the related private key to decrypt it. The
private key is used to create a digital signature. The mathematically related public key is used to verify that signature on the other end.
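
The mathematical relationship between the public and private keys can be seen in a textbook RSA example with tiny numbers. This is purely illustrative and completely insecure; real PKI uses 2048-bit or larger keys generated by a cryptography library, with padding schemes this sketch omits:

```python
# Classic textbook RSA parameters (far too small for real use).
p, q = 61, 53
n = p * q                  # 3233, the modulus shared by both keys
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent: modular inverse of e (Python 3.8+)

# Encrypt with the public key (e, n); decrypt with the private key (d, n).
message = 65
ciphertext = pow(message, e, n)
decrypted = pow(ciphertext, d, n)

# Sign with the private key; verify with the public key.
signature = pow(message, d, n)
verified = pow(signature, e, n)
```

Because `d` is derived from `e` modulo `phi`, the two exponents undo each other: whatever one key scrambles, only its mathematically related partner unscrambles, which is exactly the encrypt/decrypt and sign/verify pairing described above.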

PowerShell Certificate Generation


[Video description begins] Topic title: PowerShell Certificate Generation. Your host for this session is Dan Lachance . [Video description ends]

Part of security in the cloud can involve the use of PKI certificates. And there are plenty of ways that you can generate certificates, whether it's for a
web server, or for a client device for VPN use, or for a user for encrypted email. So here we're going to use Microsoft PowerShell to generate a root
certificate and a client certificate. And we're generating a root self-signed certificate that will be used to issue the client certificate. And you could use
the client certificate, for example, to connect to a client-to-site VPN, to authenticate the client to the VPN appliance.

[Video description begins] The screen displays a Windows PowerShell ISE window. This window is divided mainly into two parts. The upper half of
the window shows various kinds of code, and it also has a toolbar. The lower half of the window has an output message section. [Video description
ends]

So here in the PowerShell ISE, in line 3, I'm creating a variable called $cert, or certificate.

[Video description begins] He refers to the following line of code in Line 3: $cert = New-SelfSignedCertificate -Type Custom -KeySpec Signature '.
[Video description ends]

And what we're putting in that variable is the result of running the New-SelfSignedCertificate PowerShell cmdlet. You use this to generate, as the
name implies, a self-signed certificate.

[Video description begins] He refers to the following line of code in Line 4: -Subject "CN=RootCert" -KeyExportPolicy Exportable. [Video
description ends]

That's as opposed to going out on the Internet, and getting certificates issued from a trusted public certificate authority. So the type here is Custom,
and we can see that the subject parameter is using CN=RootCert. CN simply means common name, the name that you want to assign for this. I'm
calling it RootCert.

[Video description begins] He refers to the following line of code in Line 5: -HashAlgorithm sha256 -KeyLength 2048. [Video description ends]

I'm also setting the HashAlgorithm here, used for digital signing, because this certificate will digitally sign certificates it issues, like client certificates. The
HashAlgorithm is sha256, KeyLength is 2048 bits.

[Video description begins] He refers to the following line of code in Line 6: -CertStoreLocation "Cert:\CurrentUser\My" -KeyUsageProperty Sign -
KeyUsage CertSign. [Video description ends]

And the certificate location specified with the -CertStoreLocation parameter is going to be in the PowerShell certificate drive Cert:, and it's going to
go under CurrentUser\My. And the usage property is set to Sign, and the KeyUsage is certificate signing, CertSign.

[Video description begins] He refers to the following line of code in Line 10: New-SelfSignedCertificate -Type Custom -DnsName ClientCert -
KeySpec Signature '. [Video description ends]

Now the next thing we're doing in line 10 is we're using the same PowerShell cmdlet but for a different reason. In line 10, we're using New-
SelfSignedCertificate because we want to generate a client certificate from the root. So we're setting a DnsName here, I'm calling it ClientCert.

[Video description begins] He refers to the following line of code in Line 11: -Subject "CN=ClientCert" -KeyExportPolicy Exportable. [Video
description ends]

And I'm setting the subject, common name or CN =, in this case to ClientCert. That's the name of my choosing.

[Video description begins] He refers to the following line of code in Line 12: -HashAlgorithm sha256 -KeyLength 2048. [Video description ends]

Again, the HashAlgorithm sha256, KeyLength 2048 bits.

[Video description begins] He refers to the following line of code in Line 13: -CertStoreLocation "Cert:\CurrentUser\My" '. [Video description ends]


The certificate location is once again in the CertificateStore, or the PowerShell Cert:\CurrentUser\My.

[Video description begins] He refers to the following line of code in Line 14: -Signer $cert -TextExtension @ ("2.5.29.37={text}1.3.6.1.5.5.7.3.2").
[Video description ends]

But notice what I'm adding here, which is very important, is the Signer. Who is digitally signing this client certificate? I specify that with the -Signer
parameter. And I'm going to simply pass it my $cert variable that we created way up here beginning in line 3. That's our self-signed root certificate.
Then I've got -TextExtension with all of these numeric items separated by dots.

So the purpose of working with this type of syntax, with the New-SelfSignedCertificate cmdlet is to specify certain types of certificate extensions.
Things that you want to support or be usable through the, in this case, client certificate that's going to be generated. So let's go ahead and create these
two certificates by clicking the Run Script button here in the PowerShell ISE.
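Putting lines 3 through 14 together, the script being run can be sketched as follows. This is a reconstruction from the fragments shown on screen, not a verbatim copy of the instructor's script, though it follows the common Azure point-to-site certificate pattern:

```powershell
# Create a self-signed root certificate capable of signing other certificates
$cert = New-SelfSignedCertificate -Type Custom -KeySpec Signature `
    -Subject "CN=RootCert" -KeyExportPolicy Exportable `
    -HashAlgorithm sha256 -KeyLength 2048 `
    -CertStoreLocation "Cert:\CurrentUser\My" `
    -KeyUsageProperty Sign -KeyUsage CertSign

# Generate a client certificate signed by the root certificate above
New-SelfSignedCertificate -Type Custom -DnsName ClientCert -KeySpec Signature `
    -Subject "CN=ClientCert" -KeyExportPolicy Exportable `
    -HashAlgorithm sha256 -KeyLength 2048 `
    -CertStoreLocation "Cert:\CurrentUser\My" `
    -Signer $cert `
    -TextExtension @("2.5.29.37={text}1.3.6.1.5.5.7.3.2")
```

In the -TextExtension string, 2.5.29.37 is the enhanced key usage extension and 1.3.6.1.5.5.7.3.2 is the client authentication OID, which is what makes the generated certificate usable for authenticating a VPN client.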

[Video description begins] The screen displays a Console1 window. This window is divided into three parts. The left section shows a folder, Console
Root. The middle section shows an empty section with the message that there are no items to show in this view. The right section has Actions heading.
[Video description ends]

After that's been done, our machine should now have a self-signed root certificate called RootCert, and a client certificate called ClientCert. On my
local machine, I've gone to the Start menu and searched for mmc.exe, Microsoft Management Console executable, and this is what pops up. So here
I'm going to go to File, Add/Remove Snap-in because I want to manage certificates. So I'm going to choose the Certificate snap-in, I'll click Add, it's
from my user account, Finish, and OK.

[Video description begins] After he selects the Certificates, the left section shows a Certificates - Current User folder. He expands this folder, which in
turn shows multiple folders, such as Personal, Trusted Publishers and so on. He clicks to expand the Personal folder, which shows a Certificates
folder within itself. He clicks the Certificates folder, which in turn shows various certificates in the middle section, such as ClientCert and RootCert.
[Video description ends]

I'm going to drill down in the navigator now under Certificates-Current User under Personal Certificates. And notice the presence of our self-signed
root certificate called RootCert, and the client certificate signed by the RootCert called ClientCert.
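As an alternative to mmc.exe, the same verification can be done directly in PowerShell against the certificate drive; a quick sketch:

```powershell
# List the two demo certificates in the current user's personal store
Get-ChildItem Cert:\CurrentUser\My |
    Where-Object { $_.Subject -in 'CN=RootCert', 'CN=ClientCert' } |
    Select-Object Subject, Issuer, Thumbprint
```

ClientCert should show CN=RootCert as its issuer, confirming the signing relationship.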

Cloud Certificate Generation


[Video description begins] Topic title: Cloud Certificate Generation. Your host for this session is Dan Lachance . [Video description ends]

There are many ways that you can generate PKI certificates using both on-premises tools as well as cloud-based tools. In this particular case, we're
going to take a look at how to generate PKI certificates in the Microsoft Azure Cloud Environment.

[Video description begins] The Azure portal interface displays. The screen title is Azure services. Thumbnail images are present on the screen for
Virtual machines, App services, Storage accounts, SQL databases, and other services. There are different clickable boxes available for Microsoft
Learn, Azure Monitor, Security Center, and Cost Management. There is a Navigation pane to the left of the screen. It has various options such as

Create a resource, Home, Dashboard, All Services, All resources, Resource groups and so on. The host clicks the Create a resource option from the
Navigation pane. The screen is titled New. It has a Search field, followed by an alphabetical listing of apps. [Video description ends]

So here in the Microsoft Azure Portal, the web GUI tool, I'm going to click Create a resource in the upper left. And if you want to work with
certificates and generating and storing them in the Azure Cloud, you can do that using a Key Vault. I'll search key, I'll choose Key Vault, and then I'll
choose Create.

[Video description begins] He types in key in the Search bar, and select Key Vault from the drop-down list, which in turn opens a Key Vault page. In
this page, he clicks the Create button. When he clicks the Create button, a Create key vault page opens. The Create key vault page shows five tabs,
Basics, Access policy, Virtual network, Tags, and Review + create. Currently, the Basics tab is active. This tab shows various menus, such as
Subscription, Resource group, Key vault name and so on. [Video description ends]

So we're going to have to specify a resource group that we want to deploy this item into. Resource groups are used to manage related items as a single
unit in the Azure Cloud. I'm going to call this KeyVault490123. And I'm going to specify the region, geographical location, in this particular case as
Canada East. And that's it, that's all I'm going to do. I'm going to click the Review + Create button. It'll validate my selections, validation has passed,
so I'll create the key vault.
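For reference, the same key vault could be created with the Az PowerShell module instead of the portal. This is a sketch; the resource group name RG1 is a placeholder, since the group used in the video isn't named:

```powershell
# Requires the Az module and an authenticated session (Connect-AzAccount)
New-AzResourceGroup -Name "RG1" -Location "canadaeast"

# Create the key vault in the same region chosen in the portal
New-AzKeyVault -VaultName "KeyVault490123" -ResourceGroupName "RG1" -Location "canadaeast"
```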

[Video description begins] When he clicks the Create button in the Review + create tab, an Overview page opens for the key vault. In this page, there
is a left ribbon showing various options, such as Overview, Inputs, and so on. This page shows that the deployment is complete, along with a Go to
resource button. [Video description ends]

After a moment we have a message about the deployment being completed, so I'll click Go to resource.

[Video description begins] When he clicks the Go to resource button, a page opens with the heading as the name of the key vault. In this page, there is
a left ribbon showing various options, such as Overview, Activity log, Access control (IAM), Tags, and Diagnose and solve problems in the top
section, and options such as Keys, Secrets, Certificates and so on under the Settings heading. [Video description ends]

That takes us into the properties of our newly created key vault. And one of the things that we'll see in the left-hand navigator is Certificates where I
can generate or import them. I'm going to click the Generate/Import button.

[Video description begins] When he clicks the button, a Create a certificate page opens. This page shows various options, such as Method of
Certificate Creation menu, Certificate Name field bar, Type of Certificate Authority (CA) menu, Subject and so on. [Video description ends]

We're going to generate a certificate here which I'm just going to call Cert1. It will be a self-signed certificate, although we could choose a certificate
issued by an integrated CA or a non-integrated CA, a CA being a certificate authority. And I'm going to set the subject name, here it's going to be
CN=. If this is for a server called server1.quick24x7.com, then I would specify that as the common name of the subject. The subject could even be the
user.

So therefore, it could also be an email address. The validity period is going to be set for 12 months. And the percentage lifetime here, where it'll
automatically renew, is set to 80% of the lifetime, which in this case is 12 months. I'm going to go ahead and click Create to generate the certificate.
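The portal selections above map onto the Az PowerShell certificate cmdlets roughly as follows; a sketch assuming the vault name from earlier and the example subject name just discussed:

```powershell
# Self-signed certificate policy matching the portal choices:
# 12-month validity, auto-renew at 80% of lifetime
$policy = New-AzKeyVaultCertificatePolicy -IssuerName "Self" `
    -SubjectName "CN=server1.quick24x7.com" `
    -SecretContentType "application/x-pkcs12" `
    -ValidityInMonths 12 -RenewAtPercentageLifetime 80

# Kick off certificate creation in the vault
Add-AzKeyVaultCertificate -VaultName "KeyVault490123" -Name "Cert1" -CertificatePolicy $policy

# Poll the pending operation, like clicking the refresh button in the portal
Get-AzKeyVaultCertificateOperation -VaultName "KeyVault490123" -Name "Cert1"
```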

[Video description begins] When he clicks the Create button, the Certificates page opens again. [Video description ends]

We have a message that the creation is currently pending. Click here to go to the certificate operations to monitor progress. I'm going to go ahead and
click on that.

[Video description begins] When he clicks the link, a Cert1 side panel opens, and it shows the Status as In Progress. [Video description ends]

And we can see the status is that it's currently in progress. I can keep clicking the refresh button until I see that it's been completed. So we've now
generated a PKI certificate that can be used by any cloud resources that would need it. For example, we might have a cloud hosted web application
that requires a PKI certificate to enable HTTPS.

VPNs and the Cloud


[Video description begins] Topic title: VPNs and the Cloud. Your host for this session is Dan Lachance . [Video description ends]

When the average computing user thinks about a VPN, a virtual private network, they're probably thinking about a way to anonymize their original
location. So they might be able to watch Netflix movies in a different country than they reside in, and that type of thing. But in this context, we're
talking about a virtual private network, or a VPN, that serves as an encrypted point-to-point network tunnel that is used for business purposes, such as
getting clients securely connected over the Internet to a site, such as a public cloud provider, or could be also a private cloud provider.

We can also use site-to-site VPNs to link an on-premises network to a public cloud provider network. So essentially, to extend our on-prem network
into the cloud. A client-to-site VPN means that we've got a VPN client, that would be software running on a device, for example, a VPN app on a
smartphone, that establishes an encrypted VPN tunnel with a cloud VPN gateway.

A cloud VPN gateway is, at least from the perspective of the cloud customer, simply a software configuration that defines a connection point for VPN
purposes. With a site-to-site VPN, our on-premises network must have a VPN appliance. But that could also be software
running on a server. It doesn't necessarily have to be a physical hardware VPN device.

But in the same way, we end up with an encrypted VPN tunnel over the Internet by making a connection to the cloud VPN gateway. Now, client-to-
site VPNs mean that we don't need a public IP address for the client device. That's because the client might be behind a NAT router. However, the
client can authenticate to the cloud VPN gateway in many different ways. Authentication could be by PKI certificate.

So without a PKI certificate, the client device can't even attempt to authenticate to the VPN to establish the tunnel. Or we might use centralized
RADIUS authentication, where the centralized authentication server is stored on an internal protected network. When you configure client-to-site
VPN configs inside the cloud, you have to think about the client VPN IP address pool. So the range of IPs that will be assigned when incoming client
connections to the VPN are successful.

Configuring a Cloud VPN


[Video description begins] Topic title: Configuring a Cloud VPN. Your host for this session is Dan Lachance . [Video description ends]


A VPN is a great way to make sure traveling users or those working from home have a secured encrypted tunnel to get to cloud resources. Now, to do
that in the Microsoft Azure cloud computing environment, the first thing that we need to create in Azure is a virtual network gateway resource. It
represents the VPN appliance, or the VPN side of the connection, in the Azure cloud.

[Video description begins] The screen displays the Home page of Microsoft Azure. [Video description ends]

So to do that, in the Azure portal, I'm going to click Create a resource in the upper left. And I'm going to search for the word, network gateway or the
term network gateway, I'm going to choose Virtual network gateway. A local network gateway would only be applicable if you were linking an
on-premises VPN to Azure. In other words, to link two sites together with a site-to-site VPN. That's not the case here, we're just linking individual
users to Azure through the VPN. So I'm going to choose Virtual network gateway and I'll choose Create.

[Video description begins] When he clicks the Create button, a Create virtual network gateway page opens. This page shows three tabs, Basics, Tags,
and Review + create. Currently, the Basics tab is active. This tab shows various menus, such as Subscription, Resource group, Name and so on.
[Video description ends]

Now I'm going to give this a name and go through the normal procedure of deploying it in a region and so on. So I'm going to call this GW1 for
gateway one. And I'm going to deploy this in a region where most of those users reside, so let's say in this particular case, Canada East. It's for VPN,
and I'm going to go down and define this for a virtual network called VNet1. So that's fine.

And I'm going to make sure that we create a new public IP address for this, GW1_PubIP. Because it needs to be publicly reachable over the Internet to
establish the VPN tunnel in the first place. And then I'm just going to click Review + create, I'm not going to make any other changes. So the
validation has passed based on my selection for the configuration. So let's actually click the Create button to create this virtual network gateway in the
Azure cloud. And the thing about this is you need to be patient, it can take a few minutes before it's ready to configure further.
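The portal steps above can be approximated with Az PowerShell. In this sketch, the resource group name RG1 and the VpnGw1 SKU are assumptions, since neither is shown in the video:

```powershell
# Assumes VNet1 already exists and contains a subnet named GatewaySubnet,
# which Azure requires for virtual network gateways
$vnet   = Get-AzVirtualNetwork -Name "VNet1" -ResourceGroupName "RG1"
$subnet = Get-AzVirtualNetworkSubnetConfig -Name "GatewaySubnet" -VirtualNetwork $vnet

# Public IP so the gateway is reachable over the Internet
$pip = New-AzPublicIpAddress -Name "GW1_PubIP" -ResourceGroupName "RG1" `
    -Location "canadaeast" -AllocationMethod Dynamic

$ipconf = New-AzVirtualNetworkGatewayIpConfig -Name "gwipconfig" `
    -SubnetId $subnet.Id -PublicIpAddressId $pip.Id

# As noted, be patient: this deployment can take a long time to complete
New-AzVirtualNetworkGateway -Name "GW1" -ResourceGroupName "RG1" -Location "canadaeast" `
    -IpConfigurations $ipconf -GatewayType Vpn -VpnType RouteBased -GatewaySku VpnGw1
```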

[Video description begins] When he clicks the Create button in the Review + create tab, an Overview page opens for the key vault. In this page, there
is a left ribbon showing various options, such as Overview, Inputs, and so on. This page shows that the deployment is complete. [Video description
ends]

Once the deployment is complete, we'll then be able to go and work with the virtual network gateway to configure our client-to-site VPN. So I'm
going to click All resources, here on the left in the portal. And I'm going to filter the list by clicking Type, where it currently says equals all, and I'm
going to deselect the Select all check box. And I'm going to go all the way down to Virtual network gateway, and I'll click outside of it to put it into
effect, and there it is, GW1.

[Video description begins] In the All resources page, he applies the filter to show the newly created virtual network gateway. [Video description ends]

Going to click on it to open up its properties.

[Video description begins] When he clicks this gateway, a page opens with the same heading as its name. In this page, there is a left ribbon showing
various options, such as Overview, Activity log, Access control (IAM), Tags, and Diagnose and solve problems in the top section, and options such as
Configuration, Connections, Point-to-site configuration and so on under the Settings heading. [Video description ends]


What I want to do is configure what is called a point-to-site configuration. That means a client device that connects to the Azure VPN. So I'm going to
click on Point-to-site configuration. It states on the right that point-to-site is not configured, so I'm going to go ahead and click on Configure now.

[Video description begins] When he clicks the Configure now link, the Point-to-site configuration page shows various configuration options, such as
Address pool, Tunnel type, Authentication type and so on. [Video description ends]

I have to assign an address pool that will be used by incoming VPN clients. I'm going to specify 10.0.1.0, that's the network address, and /24, that's
the number of bits in the subnet mask, so the first three octets here, 10.0.1, identify the network. Then I can choose the tunnel type, whether it's
OpenVPN, so an SSL type of VPN, or a Microsoft SSTP VPN, which again uses SSL, so it's an HTTPS type of transport.

Or I can use IKEv2, IKEv2 and OpenVPN, or IKEv2 and SSTP. Now, what I select here will determine whether Azure certificate, meaning PKI
certificate, or RADIUS authentication is available. So notice with OpenVPN (SSL), it's only Azure certificate that is an option. If I choose SSTP
(SSL), then we've got Azure certificate and RADIUS authentication available also.

[Video description begins] When he clicks the RADIUS authentication radio button, a RADIUS authentication section appears. This section shows
two field bars, Server IP address and Server secret. [Video description ends]

So with Azure certificate, we would have to paste in the root certificate data here. That root could come from a public root certificate authority or
from a private root certificate authority, meaning self-signed, but either way you need the public certificate data from it. You paste the public portion
of the root or CA certificate here, and the client device then gets issued a client certificate, which allows it to authenticate to the VPN.

[Video description begins] He refers to the Root certificates options available for Azure certificate as Authentication type. [Video description ends]

However, with RADIUS authentication, what you're doing is eliminating authentication from happening on the edge device. And in this context, the
edge device is the VPN clients. So I'm going to go ahead and fill in here an address of an internal RADIUS authentication server that I would have
configured already that is available in the Azure cloud. And then I would specify, of course, a server secret.

The server secret is a passphrase of some kind that is used to authenticate to that host. So this way, the edge device, the VPN appliance or config, is not
doing the authentication. Because it's publicly accessible, we don't want that; if it's compromised, we have a problem. So at this point, I would
just save my point-to-site configuration by clicking the Save button. Now after that's been saved, you'll then be able to download the VPN client.
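The point-to-site settings chosen here can also be applied with Set-AzVirtualNetworkGateway. A sketch, with placeholder RADIUS values; the real server address and secret are whatever you configured on your internal RADIUS host:

```powershell
$gw = Get-AzVirtualNetworkGateway -Name "GW1" -ResourceGroupName "RG1"  # RG1 is a placeholder

# Client address pool plus SSTP tunnel type with RADIUS authentication
Set-AzVirtualNetworkGateway -VirtualNetworkGateway $gw `
    -VpnClientAddressPool "10.0.1.0/24" `
    -VpnClientProtocol "SSTP" `
    -RadiusServerAddress "10.0.0.10" `
    -RadiusServerSecret (ConvertTo-SecureString "PlaceholderSecret" -AsPlainText -Force)
```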

[Video description begins] When he clicks the Save option, a Download VPN client option appears in the top ribbon. He clicks this option, which
opens a Download VPN client side panel. This side panel has a Download button. [Video description ends]

Now because of the way that we've configured our tunnel type with SSTP (SSL), we have to select whether we want to download the VPN client
configuration for EAP (Extensible Authentication Protocol) with Microsoft CHAP, that is, Challenge Handshake Authentication Protocol v2, or
EAP-TLS.

I'm going to choose EAP-TLS, in which case I have to choose a RADIUS root certificate and a client root certificate, whereas with EAP-MSCHAPv2,
which is considered less secure than the EAP-TLS option, we don't. So when I click Download, we get prompted to download a zip file. We can
unzip it once we download it to our on-premises VPN client device, after which we can then install the VPN client configuration.


Custom Cloud Storage Encryption Keys


[Video description begins] Topic title: Custom Cloud Storage Encryption Keys. Your host for this session is Dan Lachance . [Video description ends]

Many public cloud providers will automatically encrypt data that you have stored in the cloud, using keys that they generate and manage. But as a
cloud customer, you can also use keys that you generate. In this example, we're going to generate keys in the cloud, and use them to encrypt a
Microsoft Azure storage account.

[Video description begins] The screen displays the Home page of Microsoft Azure. [Video description ends]

So to get started, we first need a key vault. So here in Microsoft Azure, I'm going to click on All Resources in the left-hand navigator. Currently the
type is set to all, we're seeing all types of cloud resources. I'm going to click on that to filter it. So I'm going to deselect Select All, and I'm going to go
to the Ks and I'm going to choose Key vault. Apparently we have five of them. So I'll click outside of that drop-down list to activate the filter.

[Video description begins] The All resources page shows the key vaults in a tabulated format. [Video description ends]

So here I've got a key vault called KeyVault490123. I'm going to click on it, because that's where I want to create a custom key.

[Video description begins] He clicks the vault, and a page opens with the same heading as its name. In this page, there is a left ribbon showing
various options, such as Overview, Activity log, Access control (IAM), Tags, and Diagnose and solve problems in the top section, and options such as
Keys, Secrets, Certificates and so on under the Settings heading. [Video description ends]

So, on the navigator, I'll click Keys, and any existing keys will be shown.

[Video description begins] The Keys page shows that there are no keys. He clicks the Generate/Import button, which opens a Create a key page. This
page shows various settings, such as Options menu which is currently set to Generate, Name field bar, Key Type toggle button, RSA Key Size toggle
button and so on. [Video description ends]

I'm going to click the Generate/Import button up at the top. And from here we can choose to generate a new key, import it, or restore from backup. I'm
going to generate, and I'm going to call this Key1, it's going to be an RSA 2048 bit key. I'm not going to set a date where it is active and when it
expires and it's enabled by default. So I'll just go ahead and click Create.

[Video description begins] When he clicks the Create button, the Keys page opens again. This page shows the newly created key, along with its status
as Enabled. [Video description ends]

And we can now see we have Key1 and its status is enabled.
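Generating the same key without the portal is a one-liner with the Az module; a sketch using the names from the demo:

```powershell
# Software-protected 2048-bit RSA key, enabled by default,
# with no activation or expiry dates set
Add-AzKeyVaultKey -VaultName "KeyVault490123" -Name "Key1" `
    -Destination Software -KeyType RSA -Size 2048
```

Choosing -Destination HSM instead would create a hardware-protected key, at additional cost on premium-tier vaults.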

[Video description begins] He switches to the Storage accounts page of Microsoft Azure. This page shows a list of various storage accounts. [Video
description ends]


So I've got the Storage Accounts view on the left. On the right, I'm going to open up an existing storage account where I want to use that custom key
for encryption. And I'm going to scroll down and choose Encryption.

[Video description begins] The Encryption page shows some brief information, and it also has a Use your own key check box. [Video description
ends]

By default, content in the storage account is encrypted using Microsoft Managed Keys. So down below, I'm going to turn on the checkmark for Use
Your Own Key.

[Video description begins] When he checks the Use your own key check box, various settings appear, such as Encryption key, Key Vault, Encryption
key and so on. [Video description ends]

It's set to select from a key vault, so under Key Vault, I will click the Select link.

[Video description begins] A Key vault side panel opens. This side panel shows a list of key vaults, along with a Create a new vault option. [Video
description ends]

And in this case, I'm going to choose our key vault where we created the key, it's called KeyVault490123. So I will click on it and down under
Encryption key, I will click Select.

[Video description begins] A Key Picker side panel opens. This side panel shows a list of keys, along with a Create a new key option. [Video
description ends]

And from here we're going to choose Key1. And then I'm going to choose Save. Not only will this apply to new content that is stored in the storage
account, but existing content will also be encrypted using this key, via a background process.
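The same customer-managed key assignment can be scripted. This sketch assumes placeholder storage account and resource group names, since those aren't shown in the video, and it assumes the storage account's managed identity has already been granted get, wrapKey, and unwrapKey permissions on the vault:

```powershell
$vault = Get-AzKeyVault -VaultName "KeyVault490123"
$key   = Get-AzKeyVaultKey -VaultName "KeyVault490123" -Name "Key1"

# Switch the account from Microsoft-managed keys to the customer-managed key
Set-AzStorageAccount -ResourceGroupName "RG1" -Name "storacct490123" `
    -KeyvaultEncryption -KeyName $key.Name -KeyVersion $key.Version `
    -KeyVaultUri $vault.VaultUri
```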

Course Summary
[Video description begins] Topic title: Course Summary [Video description ends]

So in this course, we've examined the implementation of data discovery and classification. We did this by exploring data-discovery approaches and
techniques and the implementation of data discovery for structured and for unstructured data. We also explored the challenges associated with data
discovery in the cloud. We looked at data classification, including how to implement data-classification mapping, data-classification labelling and
data classification for sensitive data.

We also explored Information Rights Management, IRM, including the key IRM objectives such as data rights, provisioning and access models. We
then explored IRM tools related to the issuing and replication of certificates. In our next course, we'll move on to explore meeting regulatory-
compliance needs through the planning and implementation of data retention, deletion and archiving policies, as well as the use of data events for
analyzing and troubleshooting problems.

