CCSP 2019 - Data Security Technologies

The document discusses data security technologies related to cloud computing. It describes different types of data storage in cloud services like AWS and Azure, including long-term, ephemeral, and raw disk storage. It also covers topics around data ownership, encryption, key management, hashing, masking, tokenization, and data loss prevention.


2/22/2021 CCSP 2019: Data Security Technologies Transcript

CCSP 2019: Data Security Technologies


Responsibility for managing data falls on the cloud customer. In this 7-video course, learners explore data storage, threats, and security mitigations to
help ensure data protection. Examine security techniques such as hashing, data masking, data tokenization, and data loss prevention. This course can
be used in preparation for the (ISC)2 Certified Cloud Security Professional (CCSP) exam. Begin by looking at various technologies associated with
data asset security and protection. Examine Amazon Web Services storage types including long-term, ephemeral, and raw-disk. Learn how to
differentiate between data owner and data custodian, including risk profile, risk appetite, and responsibility. Look at potential threats associated with
storage types including ISO/IEC 27040. Learn about encryption for Microsoft Azure virtual machine disks, and about key management, which
involves creating an Azure Key Vault and key. Discover how to generate file hashes using Microsoft PowerShell. Look at data masking, or enabling
Microsoft Azure SQL Database dynamic masking (obfuscation), and data tokenization technologies. Finally, learn about data loss prevention by
configuring Microsoft Azure Information Protection.

Objectives

- discover the key concepts covered in this course
- define the various technologies associated with data asset security and protection
- describe Amazon Web Services storage types including long-term, ephemeral, and raw-disk
- differentiate between data owner and data custodian, including risk profile, risk appetite, and responsibility
- describe potential threats associated with storage types including ISO/IEC 27040
- enable encryption for Microsoft Azure virtual machine disks
- create a Microsoft Azure Key Vault and key
- generate file hashes using Microsoft PowerShell
- enable Microsoft Azure SQL Database dynamic masking (obfuscation)
- describe data tokenization technologies
- configure Microsoft Azure Information Protection
- summarize the key concepts covered in this course

Table of Contents
1. Course Overview
2. Data Asset Security and Associated Technologies
3. Storage Types
4. Data Owner vs. Data Custodian
5. Storage Type Threat
6. Encryption of Data Assets
7. Key Management
8. Hashing
9. Data Masking
10. Data Tokenization
11. Data Loss Prevention
12. Course Summary

Course Overview
[Video description begins] Topic title: Course overview [Video description ends]
file:///C:/Disk D/CISM_CISSP_CIS_CCSP/CCSP/3. Data Security Technologies Transcript.html 1/13

Hi, I'm Dan Lachance. I've worked in various IT roles since the early 1990s, including as a technical trainer, as a programmer, a consultant, as well as
an IT tech author and editor. I've held and still hold IT certifications related to Linux, Novell, Lotus, CompTIA, and Microsoft.

[Video description begins] Your host for the session is Dan Lachance. He is an IT Trainer / Consultant. [Video description ends]

Some of my specialties over the years have included networking, IT security, cloud solutions, Linux management, and configuration and
troubleshooting across a wide array of Microsoft products. CCSP, or Certified Cloud Security Professional, proves to the world that you have the cloud
security skills necessary to use the best practices and guidelines set out by (ISC)2 to properly design, manage, and secure applications, infrastructure,
and data in the cloud.

In this course, I'll explore a variety of issues related to the management and protection of data assets hosted on a cloud platform, as well as data in
transit to and from the cloud. I'll begin by defining the various life stages of cloud hosted data, as well as talking about various technologies associated
with data security and protection.

I'll also take a look at the storage types used in the cloud computing environment including long term, ephemeral, and raw disk. Then I'll describe
potential threats associated with storage types in relation to ISO/IEC 27040. Then, I'll talk about threat mitigation technology and techniques as well
as how to use encryption in a cloud environment.

This will include topics such as key pair management as it applies to cloud hosted data, as well as hashing algorithms such as MD5 and SHA1.
Finally, I will also demonstrate data masking, and then I'll talk about data tokenization technologies.
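The one-way property behind hashing algorithms like MD5 and SHA-1 can be illustrated outside of PowerShell as well. Here is a minimal Python sketch (the function name and sample inputs are my own, purely for illustration; they are not part of the course):

```python
import hashlib

def file_hashes(data: bytes) -> dict:
    """Compute MD5 and SHA-1 digests of the same input.

    Any change to the input, however small, produces a completely
    different digest -- the property that makes hashes useful for
    detecting corruption or tampering.
    """
    return {
        "MD5": hashlib.md5(data).hexdigest(),
        "SHA1": hashlib.sha1(data).hexdigest(),
    }

original = file_hashes(b"quarterly-report-v1")
tampered = file_hashes(b"quarterly-report-v2")

# A single changed byte yields entirely different digests.
assert original["MD5"] != tampered["MD5"]
assert original["SHA1"] != tampered["SHA1"]
```

Comparing the stored digest of a file against a freshly computed one is exactly the integrity check the course later demonstrates with PowerShell's hashing cmdlets.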

Data Asset Security and Associated Technologies


[Video description begins] Topic title: Data Asset Security and Associated Technologies. Your host for the session is Dan Lachance. [Video
description ends]

In the cloud, your data can assume multiple states, such as when the data is being created or accessed. And during all of these states, security needs to
be put in place to mitigate threats against those data assets. So with data creation or access, for example, imagine that we're using Microsoft
SharePoint Online. So we've got a cloud-based centralized document repository. We want to limit who can get to it and who can submit content, and
also who can submit specific types of content.

For example, you might only let developers upload script files or executable files to a certain part of a SharePoint site. We then need to consider how
data is processed by IT systems in the cloud. Such as big data being fed into a cluster of virtual machine nodes in the cloud for big data analytic
processing. So we have to think about the data source to make sure that it's protected. Which feeds into data storage to make sure it's encrypted when
it's at rest. Then it can be encrypted while it's in transit, for example, to a cluster for analysis.

Pictured on the screen, we've got our user on the left using HTTPS or a VPN to securely connect over the network to data assets stored in the cloud.
Pictured here is a yellow folder. Down at the bottom, we see a hardware security module or HSM. This isn't required, but it is an option with many of
the big public cloud providers. The purpose of an HSM is to have a secure place using firmware to store cryptographic keys that would be used to
encrypt and decrypt data at rest.


Next in our diagram in the middle, we have replication for high availability. By replicating our data to other data centers, or even to other
geographical regions that have data centers. We protect ourselves by making sure our data will be available in the event of a disaster in the primary
location. Now when you think about that, you should also be thinking about the recovery level objective when it comes to restoration from backups,
the RLO.

The RLO is designed so that you can plan the restoration of specific items as opposed to restoring an entire backup set. We should also be considering
the recovery service level, or RSL. In the event of a disaster or a problem, the RSL is expressed as a percentage value that represents how much
compute horsepower you're going to need in order to go through a disaster recovery process.
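To make that percentage concrete, here is a hypothetical worked example (the function and the numbers are mine, not from the course): if production runs 40 vCPUs and the RSL is 50%, the recovery site must provide at least 20 vCPUs.

```python
def dr_capacity(production_vcpus: int, rsl_percent: float) -> float:
    """Capacity needed at the recovery site for a given recovery
    service level (RSL), expressed as a percentage of the normal
    production compute footprint."""
    return production_vcpus * (rsl_percent / 100)

# 40 production vCPUs at a 50% RSL -> 20 vCPUs needed for DR.
print(dr_capacity(40, 50))  # 20.0
```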

So that's all part of high availability. Then we have to think about auditing data access and that might even be required for legal or regulatory
compliance. So we can audit accessing data, the creation or importation of data, the modification, and of course the deletion of data.

Storage Types
[Video description begins] Topic title: Storage Types. Your host for the session is Dan Lachance. [Video description ends]

When you configure the virtual disks that will be used by virtual machines, whether on-premises or in the cloud, there are a few types to consider.
Such as raw disks, ephemeral or temporary disks, and long term or persistent type of disk storage. The first thing we'll take a look at here on premises
is using VMware Workstation to build a new virtual machine and seeing some of the disk options.

[Video description begins] A VMware Workstation Homepage titled Workstation 10 displays. Underneath the title is a toolbar that contains the
following tools: File, Edit, View, VM, Tabs, and Help. There is a list of following clickable icons in the middle of the screen: Create a New Virtual
Machine, Open a Virtual Machine, Connect to a Remote Server, Virtualize a Physical Machine, and Software Update. [Video description ends]

So here in VMware Workstation, if I were to create a new virtual machine, I'll just go through the wizard here. I'm going to choose Custom, so we can
see the advanced options, and I'll click Next. I'll click Next again until I get to the Guest Operating System screen.

[Video description begins] A New Virtual Machine Wizard dialog box displays. There are radio buttons for Typical, and Custom (advanced)
configurations with Help and Cancel buttons at the bottom along with Back and Next tabs. [Video description ends]

Where I'm just going to choose, I will install the operating system later. I'll click Next, I'll leave it on Microsoft Windows and Windows 7, I'll click
Next on that. And I'll accept the defaults for the virtual machine name and the location on the file system on this host for that VM.

I'll click Next on the processors screen again, and memory, and network, and disk I/O controller type, and disk type. Until finally on the Select a Disk
screen, I have a couple of options. I can create a new virtual disk, use an existing one or use a physical disk.

[Video description begins] The host now points at three radio buttons located on the Select a disk option. As soon as he selects Use a physical disk
radio button, a User Account Control window, with Yes and No buttons at the bottom opens. [Video description ends]

Now, when I use a physical disk, what we're talking about doing is letting the virtual machine directly access a raw disk partition. Where there's no
virtual hard disk file being used in that sense. So that is an option that is available. Now, I'm going to cancel out of that. If I open up an existing virtual
machine, then we can also reconfigure some of the disk storage for it.

[Video description begins] The host switches back to the Vmware Workstation interface and selects the Open a Virtual Machine option. [Video
description ends]

So here I've opened up an existing virtual machine, although I've not started it. But that's okay, because I can now go to the VM menu and choose
Settings to see all the virtual hardware.

[Video description begins] The Srv2016-1 - VMware Workstation interface opens. The host selects the VM option from the toolbar of the interface and
a new dialog box titled Virtual Machine Settings opens. There is sample information present under the following two tabs: Hardware, and Options
with OK, Cancel, and Help buttons at the bottom. [Video description ends]

What I'm interested in looking at is selecting one of the hard disks currently configured for this virtual machine and clicking the Advanced button on
the right. Now, currently, the virtual machine is not running and we don't have access to configure it. But notice that we do, or we would, have the
option to configure non-persistent storage.

What this means, as it says, is that when changes are made to the disk, they are discarded after the virtual machine is powered off. Now, persistent, of
course, means as a normal virtual machine would run, when you write things to disk they are retained even between reboots.

Now let's switch over to Amazon Web Services, or AWS, for a moment. Here in the AWS Management Console, I'm going to start by clicking on
EC2 under Compute. Where we can not only work with virtual machine instances, but also with the disk volumes that they use.

[Video description begins] As the host selects the EC2 Dashboard, a corresponding interface titled Resources opens on the right. The interface
contains sample information and clickable links with a Launch Instance button in the middle. On the Left, is a Navigation bar with the options such
as: Events, Tags, Reports, Limits, and so on. [Video description ends]

So on the left hand navigator, I'm just going to scroll down and click on Volumes. So any existing volumes will be listed here. I'm going to click
Create Volume. The reason I want to look at this is because one of the options we have is Cold HDD, hard disk drive.

[Video description begins] As the host selects the Volumes option, a corresponding interface, titled Create Volume opens. The interface contains
various drop-down menus for the options Volume Type, Size (GIB), IOP S, and so on. [Video description ends]

In cloud environments, when you're looking at cold storage, you're really looking at longer term storage of data. In this case, it's a disk volume for a
virtual machine, and usually it's for infrequent access. Because you have less performance available than if you were to use a newer option that uses
solid state drives, or SSDs.

So depending on the disk I/O requirements of your virtual machines, you will make an appropriate selection: raw disk partition access for a virtual
machine, which you would normally do on-premises; longer-term storage, such as Cold HDD; or non-persistent storage that discards changes when the
virtual machine powers off.

Data Owner vs. Data Custodian



[Video description begins] Topic title: Data owner vs. data Custodian. Your host for the session is Dan Lachance. [Video description ends]

With cloud computing, there is normally a shared responsibility between the cloud service provider and the cloud customer depending on the specific
cloud service being used. That also extends to data ownership where there are two roles. They are the data owner and the data custodian. The data
owner has the ultimate responsibility for the data overall. So essentially they are the decision maker as to how that data is managed. You could think
about it as being kind of an administrative type role.

Whereas the data custodian actually manages the data based upon the direction set by the data owner. So they would manage things like permissions,
backup and restore settings, replication settings, whether or not the data is encrypted, and how it's encrypted. So really the data custodian has technical
control to implement data management based on direction from the data owner. And so depending on how you're dealing with data in the cloud, the
responsibility for these roles could be shared between the cloud customer and the cloud service provider.

The other thing to consider is what influences how data will be managed. First of all, laws and regulations. Depending on where data physically
resides will determine which laws, or even in some cases, which regulations might even apply to that data. And so that will have an influence on how
that data is treated. How it's managed. How it's backed up. How it's replicated, if at all. How it's encrypted.

That in turn will feed into organizational security policies. Organizational security policies always have outside influences, or factors, that determine
exactly what security controls are put in place. And if we've got laws, for example, that require encryption of sensitive data well then that's going to
feed into our organizational security policies.

As you might imagine, there needs to be a periodic review of organizational security policies to make sure that they are effective. But at the same time
to also make any required changes as data privacy laws and regulations change constantly.

Storage Type Threat


[Video description begins] Topic title: Storage Type Threat. Your host for the session is Dan Lachance. [Video description ends]

There are always threats to store data, whether you're talking about storage on-premises or data stored in the cloud. The first thing to consider is
privilege escalation. This usually results from a malicious user compromising an account of some kind, maybe by infecting a machine with malware.
And so they have elevated privileges that they otherwise would not have with a regular user account.

Then there are user accounts that legitimately have permissions to resources, to data, but privilege abuse means that those permissions are not being
used properly in accordance with job tasks. For example, it could be a user accessing sensitive data, such as medical information or taxpayer
information, when they really have no business accessing it. Then there's the threat of data corruption.

Corrupted data can be propagated through replication or backups. So you need to make sure you perform periodic data recovery drills to make sure
that the data you've backed up can be recovered and that it's in good shape. It's consistent and the data integrity is there. The other consideration is
accidental deletion. If you've got backups, then of course, you can restore from backups.


But if you're depending solely on replication to other locations in the cloud as your backup mechanism, bear in mind that, depending on the specific
cloud storage solution you’re using, deleting a record in a table or a file in the file system could be propagated to all other replicas. Then there’s the
theft of sensitive data. There could be some liability involved there if it can be shown that we, as a cloud customer, have not performed our due
diligence in terms of securing data properly in accordance with laws and regulations.

These days, ransomware is a big threat. It starts with the user inadvertently clicking a link or opening a file attachment that's infected that they think is
benign. Of course, it's loaded with malware, in this case a ransomware payload. So in step 2, files on the local machine that the user has write access
to would be encrypted. But the scary part is step 3: any remote files that the user also has write access to from that device would also be
encrypted.

And so what happens then in step 4 is that a ransom is demanded. This usually comes in the form of a pop-up screen on the infected station. And in
order to receive the decryption key, that ransom needs to be paid. However, you're dealing with nefarious people in this case. So even if you make the
ransom payment, which is usually done through some anonymous means, such as Bitcoin, there's no guarantee you're going to get those decryption
keys.

Now, what can we do about these storage threats? Well, one thing we can do is harden web apps that might use back end databases. An example of
this would be to use input validation techniques on the front end web interface to mitigate database query injection attacks, where queries can actually
be constructed, for example, within a form field, and then sent to the server.
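The input-validation point can be sketched with a parameterized query. This example uses Python's built-in sqlite3 purely as a stand-in for whatever back-end database the web app talks to; the table and the payload are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# Attacker-controlled form field attempting an injection.
user_input = "x' OR '1'='1"

# Unsafe: string concatenation lets the payload rewrite the query.
unsafe = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: a parameterized query treats the payload as literal data.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()

print(len(unsafe))  # 1 -- injection matched every row in the table
print(len(safe))    # 0 -- no user is literally named "x' OR '1'='1"
```

The parameterized form is what server-side hardening looks like; client-side input validation on the web form is a complementary, not a substitute, control.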

We should encrypt virtual machine disks for protection of data at rest. We can enable immutable archive storage. When you configure data retention
policies for archiving, you can enable immutable as a characteristic, which means that archived data is read-only. It cannot be modified. We should
also be thinking about our use of content delivery networks, or CDNs.

Content delivery networks essentially copy data to different geographical locations to put that data near users that need it. So when you request it, for
example, from a website, it's quick. And so we have to think about encrypting any cached content that might be replicated throughout the content
delivery network if it's being used.

Encryption of Data Assets


[Video description begins] Topic title: Encryption of Data Assets. Your host for the session is Dan Lachance. [Video description ends]

One way to protect the contents of virtual machine disks in the cloud is to encrypt those virtual machine disks. To do that here in Microsoft Azure,
we're going to start by taking a look at an existing virtual machine here in the virtual machines view, named jumpbox. Currently, the virtual machine
is running.

If I click on the virtual machine to open up its properties, one of the things I'll see in the Properties navigation bar is Disks. And when I click that, I'll
see Operating System disks as well as any additional Data disks that might have been added.

[Video description begins] An MS Azure interface titled jumpbox - Disks opens. A corresponding pane with a tabular list for OS disk option opens on
the right. Other options present are Data disks and None. There are Edit, Refresh, and Swap OS Disk buttons within the Command bar and an + Add
data disk button at the bottom. [Video description ends]

There are no Data disks for this particular virtual machine. But whether you're talking about Operating System or Data disks for a virtual machine,
there is an Encryption column here. And we can see here for the OS disk, it's currently Not enabled. Now, we can use a key from an Azure Key Vault
to encrypt virtual machine disks.

Let's go look at that. So if I go to the All resources view here in Microsoft Azure in the portal, and if I go to Type, I'm going to filter by type. I'm
going to deselect the Select all item. I'm going to go down to the k's and choose Key Vault.

[Video description begins] The host now selects the All resources option from the Navigation bar on the left and a corresponding interface opens on
the right. It contains the following filter fields: Subscription set to Pay-As-You-Go and Resource group set to all. Underneath it, is a tabular list of
records with the following column headers: NAME, TYPE, SUBSCRIPTION, and TAGS. [Video description ends]

And I'll click outside of it to put that into effect. Now the key vault of interest for us here is KeyVault44490. The reason is because it simply happens
to have a key, if I click on Keys, that we want to use. And the key we're going to use here is called VMDiskKey1. And if I click on that to open it up,
we can see the current version of the key is listed, it's Enabled. If I click on that, we can see some details. It's an RSA 2048-bit type of key.

[Video description begins] The respective interface for the selected VMDiskKey1 displays which contains the corresponding Properties of the key.
There is a Key identifier field, Setting the activation and expiration date checkboxes and an Enabled toggle button which is set to Yes. [Video
description ends]

So I'm going to go ahead and close that out, and we're going to use that key to encrypt the jumpbox virtual machine disk. To do that, I'm going to
switch over to the PowerShell ISE on my computer where I've got a script that will encrypt virtual machine disks. I've already run the
Connect-AzAccount PowerShell cmdlet to authenticate to Azure. So I've downloaded and installed the PowerShell module that allows me to work with
Microsoft Azure items, hence the Az portion of the nomenclature that you'll see here.

[Video description begins] A Windows PowerShell ISE interface titled Encrypt _ VM _ Disk.ps1 containing a sample code displays. The second half of
the interface displays the blank command pane. [Video description ends]

So in line 1, I'm creating a variable, and PowerShell variables are prefixed with a dollar sign, so $KeyVault. And I'm using the Get-AzKeyVault
PowerShell cmdlet, and I'm referring to the KeyVault that we just looked at, KeyVault44490, which is in a ResourceGroup called Rg1.

[Video description begins] The host now refers to the code in line 1 of the Powershell ISE interface. [Video description ends]

Next, I'm getting the $diskEncryptionKeyVaultUrl, because now that we've got the key vault variable from line 1, I can call upon the .VaultUri
property. And so that will then be stored in the $diskEncryptionKeyVaultUrl variable. Then I'm creating a variable called $keyVaultResourceId. Once
again, I'm calling upon our $keyVault variable from line 1, but this time a different property, calling upon the .ResourceId property because we're
going to need that. And in line 4, making yet another variable, $KeyEncryptionKeyUrl.

And in parentheses here, I've got a statement or an expression, because I want that to be treated as its own expression and executed first. I'm using
Get-AzKeyVaultKey because we have to refer to an encryption key. And the vault name here is KeyVault44490, the name of the key we were looking
at a moment ago is VMDiskKey1. And then outside of the closing parentheses, I'm going to call upon the .Key.Kid property. So that's going to be
stored in the $KeyEncryptionKeyUrl variable.


[Video description begins] The final code in line 4 reads: $KeyEncryptionKeyUrl = (Get-AzKeyVaultKey -VaultName KeyVault44490 -Name
VMDiskKey1).Key.Kid; [Video description ends]

Finally, this is where the action happens in line 6 where we actually do the encryption using the Set-AzVMDiskEncryptionExtension cmdlet. So we
have to specify the ResourceGroup that's in question, the virtual machine or VMName, in this case jumpbox.

And then we've got a number of parameters we're using for that cmdlet such as DiskEncryptionKeyVaultUrl, DiskEncryptionKeyVaultId,
KeyEncryptionKeyUrl, and finally KeyEncryptionKeyVaultId. And we're passing it the respective variables that we established in the first four lines
of this script.

[Video description begins] The following parameters from line 8 to 11: DiskEncryptionKeyVaultUrl, DiskEncryptionKeyVaultId,
KeyEncryptionKeyUrl, and KeyEncryptionKeyVaultId contain the following assigned variables: $diskEncryptionKeyVaultUrl, $keyVaultResourceId,
$KeyEncryptionKeyUrl, and $keyVaultResourceId respectively. [Video description ends]

So doing this will encrypt the virtual machine disk. Let's go ahead and click the Run Script button at the top in the toolbar here in the ISE. After a
moment, you'll be presented with this message. It says that This cmdlet prepares the virtual machine and enables encryption which might cause it to
reboot and can take 10 to 15 minutes to finish. Please save your work. Do you want to continue? I am going to click Yes. And so it's currently in the
midst of encrypting the virtual machine disks associated with the jumpbox virtual machine.

Key Management
[Video description begins] Topic title: Key Management. Your host for the session is Dan Lachance. [Video description ends]

A Microsoft Azure Key Vault is a Microsoft Azure Cloud computing resource that serves as a central secured repository to store secrets. Things like
PKI certificates, passphrases, and even keys. And in this example, we're going to create a key within the key vault. Now, these items, these secrets,
can be used by software components or they could be used by other cloud resources.

For example, you might use a key that you generate to encrypt the contents of an Azure storage account. So let's get started with creating this key
vault. So here in the Azure portal, I'm going to click Create a resource in the upper left. I'm going to search for Key Vault and I'll choose Key Vault
from the search results. And then I'll click Create.

[Video description begins] A portal.azure.com webpage titled Create key vault displays. [Video description ends]

The first thing to do is to specify the resource group where this key vault will be created. The resource group is nothing more than a collection of
related cloud resources that you want treated as one manageable unit.

[Video description begins] The Create key vault interface contains the "Subscription", and "Instance details" fields. There following sub-fields:
Resource group, Key vault name, Region, and Pricing tier are also present. The Review + create button and Previous, and Next: Access policy tabs
are also present at the bottom of the pane. [Video description ends]


So I'm going to choose an existing resource group from the dropdown list. And I'm going to call this KeyVault, appending some numbers to make it a
unique name in accordance with organizational naming standards. I need to specify the region where this will be deployed. So in this particular case,
because it'll be used in eastern Canada, I'm going to use Canada East. And that's it. I'm going to click Review + create to create the vault.

[Video description begins] An affirmation message about the Validation passed appears at the top of the Working pane now. Also, the Review + create
button changes to a Create button. [Video description ends]

So we're creating the Key Vault, but not the key when we click the Create button. And after a moment the deployment is complete. So I'll click the Go
to resource button, which opens up the key vault's properties. And in the navigation bar, we can work with keys. Now, you'll notice that there's a
Generate as well as an Import button, Generate/Import, same button.

[Video description begins] The host now navigates to the Keys option from the Properties blade of the KeyVault6745690. The corresponding Working
pane on the right displays that there are no keys available. There are Generate/Import, Refresh, and Restore Backup buttons located within the
Command bar of the Working pane. [Video description ends]

So you can either generate keys or import them. I can go to Secrets and generate or import those. And also PKI certificates. Then I've got access
policies that I can work with here to determine which service principals or security items should have access to these.

So whether it's a software component or maybe a virtual machine and so on. So here I'm going to go to Keys. And I'm going to click Generate/Import
up at the top. Now, we can either generate a key, import it, or restore from backup. I'm going to generate a key that I'm going to call Key2.

[Video description begins] The Create a key interface opens. It contains the fields: Options and Name. Underneath it is the Key Type, and RSA Key
Size toggle buttons, which are followed by two checkboxes for setting the activation and expiration date. The toggle button for Enabled? is set to Yes.
There is a Create button at the bottom of the pane. [Video description ends]

It can either be RSA or EC for the types. And it can also have a variety of different key sizes in terms of number of bits. I'm going to go with the
standard RSA 2048-bit key. Notice we can also set an activation date and time of when the key can be used, as well as an expiration date when it no
longer can be used. And the default is that the key is enabled. It's set to a value of Yes.

One interesting thing here is that if you change the key type from RSA to the newer elliptic curve or EC, notice that it looks like the key sizes are
smaller. Well, one of the great things about using elliptic curve cryptography is that you get the same type of security strength of protection but with
smaller keys.
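
To put numbers to that trade-off, here's a small Python sketch. The strength equivalences are the approximate NIST SP 800-57 comparisons, not values pulled from the Azure portal, so treat them as illustrative:

```python
# Approximate NIST SP 800-57 comparable key strengths (security bits).
# An EC key delivers roughly the same strength as a much larger RSA key.
COMPARABLE_STRENGTHS = {
    112: {"RSA": 2048, "EC": 224},
    128: {"RSA": 3072, "EC": 256},
    192: {"RSA": 7680, "EC": 384},
    256: {"RSA": 15360, "EC": 521},
}

def rsa_equivalent(ec_bits):
    """Return the RSA key size with roughly the same strength as an EC key."""
    for strength, sizes in COMPARABLE_STRENGTHS.items():
        if sizes["EC"] == ec_bits:
            return sizes["RSA"]
    raise ValueError(f"No comparison table entry for EC-{ec_bits}")

# A P-256 elliptic curve key is comparable to a 3072-bit RSA key.
print(rsa_equivalent(256))  # 3072
```

This is why the portal's EC key sizes look smaller than the RSA ones: they're delivering comparable protection with fewer bits.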

So depending on what your needs are will determine whether you choose RSA or EC. I'm going to go ahead and click Create. And after a moment, we
can see Key2 is enabled and has been created here within our Azure Key Vault.

Hashing
[Video description begins] Topic title: Hashing. Your host for the session is Dan Lachance. [Video description ends]

file:///C:/Disk D/CISM_CISSP_CIS_CCSP/CCSP/3. Data Security Technologies Transcript.html 9/13


2/22/2021 CCSP 2019: Data Security Technologies Transcript

Hashing algorithms are used to feed data through a one-way function that results in an effectively unique value. And that unique value is called the hash. Now, the purpose of hashing is to make sure that the file you started off with, perhaps acquired through backing up or through downloading over the Internet, hasn't been corrupted in transit.

Or that it hasn't been modified, especially by unauthorized users. And so one way to work with hashing is through PowerShell. So here in Microsoft
PowerShell, I'm going to go ahead and do a dir for the current directory we're in on drive D, which is named SampleFiles. And as we can see, we have
a number of sample files listed here.

[Video description begins] The PowerShell ISE with the following command displays: PS D:\SampleFiles> dir. A corresponding list of sample
directories displays underneath it. [Video description ends]

Now what I want to do is generate a hash for the first file listed here, called CustomerDatabase.accdb. To do that in PowerShell, I'm going to run get-filehash, space, and then I'm just going to specify the name of that file. So I'll type in enough of the file name to make it unique, and then I'll press Tab, and PowerShell will complete the rest of it for me.

[Video description begins] The host now enters the new command: get - filehash .\customerDatabase.accdb [Video description ends]

So I'm running get-filehash against the CustomerDatabase.accdb file. Now if I press Enter, what gets returned is the hash value. So this hash value
needs to be stored somewhere, because this represents the state of the file at this point in time.

[Video description begins] A tabular list with the following column headers: Algorithm, Hash, and Path displays. Underneath the headers, are the
corresponding values. [Video description ends]

And if there's a modification, the hash will change. Let's verify that by opening up one of the CSV files, there's one here called
Regional_Spending_2016.csv. So first we'll generate a hash, then we'll open it up and modify it.

[Video description begins] He now enters a new command to perform the new function. The command reads: Get -FileHash .\Regional _ Spending _
2016.csv. A corresponding list of Algorithm with their Hash file and Path displays underneath it. [Video description ends]

So we can see the unique hash value here, it starts with 016B3D94, and so on. So I'm going to use Notepad to open up the regional spending file, and
we're going to make a change to it.

[Video description begins] A notepad titled Regional _ Spending _ 2016.csv displays. Underneath the toolbar of the notepad, a sample text along with
numeral values is present. [Video description ends]

So I'm just going to change one of the numbers, and I'm going to close and save the change. And I'm just going to use my up arrow key here in
PowerShell to bring up the previous command history. And we're going to run the Get-FileHash cmdlet again against the regional spending file.
Notice that when I do, we can tell immediately by the first couple of characters in the hash that we get a different hash generated.

And the reason for this is because the file contents have changed. And so hashing then has successfully detected that something's different, so the hash
is different. Now we know that the file was modified.
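
The same detect-a-change workflow can be scripted outside of PowerShell. Here's a minimal Python sketch using the standard hashlib module; the file name and contents are made up for the demo, but the chunked-read pattern matches what Get-FileHash does (SHA-256 by default, uppercase hex):

```python
import hashlib
import os
import tempfile

def file_hash(path, algorithm="sha256"):
    """Hash a file in chunks, like PowerShell's Get-FileHash."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest().upper()

# Create a throwaway file, hash it, modify it, and hash it again.
path = os.path.join(tempfile.mkdtemp(), "Regional_Spending_2016.csv")
with open(path, "w") as f:
    f.write("Region,Spending\nEast,1000\n")
before = file_hash(path)

with open(path, "w") as f:
    f.write("Region,Spending\nEast,9000\n")  # one number changed
after = file_hash(path)

print(before != after)  # True: the content changed, so the hash changed
```

Storing the original hash somewhere safe, then re-running the comparison later, is the core of any file-integrity check.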


Data Masking
[Video description begins] Topic title: Data Masking. Your host for the session is Dan Lachance. [Video description ends]

Data masking is a security technique that's used to protect sensitive data, such as when it's being displayed on the screen or even printed on paper. For
instance, if you've ever used a credit card to make a purchase and received a paper copy of the receipt, you might notice that it does not list your entire credit card number; most of the characters are masked out. That's data masking.
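
As a rough illustration of the idea (not Azure's actual implementation), here's how credit card and email masks might look in Python. The output formats mimic the default masks the portal shows, such as XXXX-XXXX-XXXX-1234 and aXX@XXXX.com:

```python
def mask_credit_card(number: str) -> str:
    """Expose only the last four digits, as on a printed receipt."""
    digits = [c for c in number if c.isdigit()]
    return "XXXX-XXXX-XXXX-" + "".join(digits[-4:])

def mask_email(address: str) -> str:
    """Expose only the first character of the mailbox name."""
    local = address.split("@", 1)[0]
    first = local[0] if local else "X"
    return first + "XX@XXXX.com"

print(mask_credit_card("4111 1111 1111 1234"))  # XXXX-XXXX-XXXX-1234
print(mask_email("dlachance@example.com"))      # dXX@XXXX.com
```

The key point is that masking changes only what's displayed; the stored value is untouched, which is exactly how dynamic data masking behaves in the demo that follows.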

So here in the Microsoft Azure Cloud, I've already deployed a SQL database called db173. This is visible when I use the Azure portal and go to the
SQL database's view over here on the left. So I'm going to click on db173 to open up its properties, and I'm going to scroll down under Security and
click on Dynamic Data Masking.

[Video description begins] A portal.azure.com webpage, with db173 (srv173/db173) SQL database displays. The host selects Dynamic Data Masking
from the Properties blade and corresponding information displays in the Working pane, in tabular form, on the right. [Video description ends]

So here, I can see the schema from a database called SalesLT, in which there is a table called Customer. And we can see a lot of the columns, or fields, for the Customer table. Let's say I want to add masking to the customer email address to protect it.

[Video description begins] He now points at the tabular data within the Working pane. The column headers are SCHEMA, TABLE, COLUMN. An Add
mask button is present next to each row value. [Video description ends]

So I'm going to click Add mask. And what that does is puts it up at the top here. So we can see that Customer_EmailAddress is listed under Masking
rules.

[Video description begins] The selected row data moves to the top within the Working pane under the header of the Masking rule. [Video description
ends]

So what I want to do is click on it. And from here, I can then choose the Masking field format.

[Video description begins] As the host selects the Masking rule, a Properties blade titled Edit Masking Rule opens. There is a Select how to mask
heading and a drop-down menu underneath. [Video description ends]

So you have masking options for numeric values, but if I open the drop-down list, I've also got masking for Credit card values, where only the last four digits are shown and everything else is masked out. And I have the same type of thing available here for Email, which is perfect for what we want to use it for.

So I'm going to choose the Email field format. Now, if there's nothing here that suits your needs, you can also use a Custom string prefix. But I'm
going to go ahead and Update this masking rule for the email address.

[Video description begins] The Update, Discard, and Delete buttons are located on the top within the toolbar of the Properties blade, titled Edit
Masking Rule. [Video description ends]


After which, I'm going to go ahead and close out of it. So we now have a masking rule for this table, specifically this column within the table for the
Customer_EmailAddress. And we can see that the MASK FUNCTION or the masking format has been applied.

[Video description begins] The host switches back to the SQL database interface. [Video description ends]

Now, the next thing to do to save this is to click the Save button, and then it will be put in place.

Data Tokenization
[Video description begins] Topic title: Data Tokenization. Your host for the session is Dan Lachance. [Video description ends]

Data tokenization is an important topic for both developers and cloud administrators. With data tokenization, what we're really doing is taking some
kind of sensitive data and replacing it with some kind of a token that can be used by other services. So in the upper-left here, we've got a credit card
number with an expiry date and the CVC code on the back of the card.

Now, that's obviously sensitive data that we don't want to send around the Internet if we don't have to. And so what can happen is a tokenization
service can be involved that will take that sensitive information and generate a unique token. It maps that sensitive data to a token. Now in order for
this to work properly, the user needs to trust the tokenization service. Really, what that means is a payment platform, for example, like Apple Pay, Google Pay, or Masterpass, has to trust the tokenization service.

And the user by extension trusts payment platforms such as Apple Pay or Google Pay. Now what then happens is we get a token that represents that sensitive
information. And the token is what is being sent around, for example, in this case, to pay for things using that specific credit card information. Now
some things to bear in mind are that once a token is created, it's irreversible.

There is no way that, given a token, we can somehow reverse engineer what has happened to create the token to get back to the original data. Also,
tokens are unique per retailer. So even though you might use the same credit card for multiple different payment gateways, you're going to have a
unique token for each one. So the idea is that the sensitive data is not sent over the network, and at the same time, the sensitive data, in our example
that's credit card information, is also not stored on the user device, such as a smartphone.

So instead, the digital token is what gets stored on the user device. Now, of course, if the user device is compromised, we want to make sure that
digital tokens are invalidated. Because if a malicious user does compromise a user device and has access to all of the digital tokens, then of course,
they could make payments using those digital tokens as per our example.

So you might wonder, well, where is the real benefit? The benefit is you wouldn't have to cancel, in this case, the credit card and get a new one just
because a device was compromised. It's just a matter of removing the tokens.
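
To make those properties concrete, here's a toy tokenization service in Python. This is a sketch only — real payment tokenization is governed by standards such as EMVCo's, and every name here is invented for illustration:

```python
import secrets
from typing import Optional

class TokenVault:
    """Toy tokenization service: maps sensitive data to random tokens."""

    def __init__(self):
        self._store = {}  # token -> (retailer, sensitive_value)

    def tokenize(self, retailer: str, sensitive: str) -> str:
        # The token is random, so it cannot be reversed without the vault.
        token = secrets.token_urlsafe(16)
        self._store[token] = (retailer, sensitive)
        return token

    def detokenize(self, retailer: str, token: str) -> Optional[str]:
        # Tokens are scoped per retailer; another retailer's token is useless.
        entry = self._store.get(token)
        if entry and entry[0] == retailer:
            return entry[1]
        return None

    def invalidate(self, token: str) -> None:
        # If a device is compromised, kill the token -- not the card.
        self._store.pop(token, None)

vault = TokenVault()
card = "4111-1111-1111-1234"
t1 = vault.tokenize("RetailerA", card)
t2 = vault.tokenize("RetailerB", card)
print(t1 != t2)  # True: same card, unique token per retailer
```

Notice that invalidating t1 leaves the card itself, and the token held by RetailerB, completely unaffected — which is exactly the benefit described above.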

Data Loss Prevention


[Video description begins] Topic title: Data Loss Prevention. Your host for the session is Dan Lachance. [Video description ends]


Data loss, according to Wikipedia, is the intentional or unintentional release of secure or private/confidential information to an untrusted environment. Data loss prevention, otherwise called DLP, is about putting controls in place to prevent exactly that. So imagine, for example, that we've got an employee that somehow takes some sensitive information and makes it available. Perhaps through a chat application, or through SMS texting, or through social media.

So we want to put some controls in place to try to reduce that possibility, whether it's intentional or unintentional. Data loss prevention begins with
user awareness and training. Users need to be aware of social engineering scams, where they might be tricked into divulging sensitive information, for
example, through an SMS text message. New employee on-boarding needs to include training so that there's an awareness about things like new
scams that are out there and how to prevent the leak of sensitive data.

There should be periodic training updates that do the same type of thing, that talk about the latest scams, and also that have updates that result from
past incidents and lessons learned, incidents related to security breaches. Data loss prevention can be configured on-premises or in the cloud. So when
we've got sensitive data that is being used, that's the first consideration. We need a way to identify that the particular data is sensitive and should have
DLP policies applied to it.

That means you need to have a scheme in place to classify or categorize your data. In other words, to flag it to say, this is sensitive credit card
information or this is sensitive medical research information. So the DLP policy configuration listed on the bottom-right of our screen deals with conditions, which are checked against the type of data and how it's being used, and actions, which are taken when a condition matches.

So we can see, in the middle of our screen, the policy configuration can include limited removable media use. So in other words, maybe disallowing
the use of USB thumb drives that are removable. Preventing the printing of sensitive data or the forwarding through email. Or preventing certain types
of files from being attached. Or maybe if they are allowed to be attached, perhaps only if the message is encrypted. And also limiting social media
content.
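
The condition-and-action model can be sketched in a few lines of Python. This is only an illustration of the pattern — a real DLP product inspects far more than a single regular expression, and the pattern below will produce false positives:

```python
import re

# Condition: content that looks like a 13-16 digit payment card number,
# optionally separated by spaces or dashes.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){12,15}\d\b")

def dlp_action(outbound_message: str) -> str:
    """Return the action a DLP policy might take on an outbound message."""
    if CARD_PATTERN.search(outbound_message):
        return "BLOCK"   # e.g. prevent the email from being forwarded
    return "ALLOW"

print(dlp_action("Invoice attached, thanks!"))                 # ALLOW
print(dlp_action("My card is 4111 1111 1111 1234, use that"))  # BLOCK
```

In a production policy, the BLOCK branch would map to the actions described above: stopping the send, stripping an attachment, or requiring encryption before the message leaves the organization.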

Course Summary
[Video description begins] Topic title: Course Summary [Video description ends]

So in this course, we've examined issues related to securing data assets, including their management and protection when hosted on or in transit to and
from a cloud platform. We did this by exploring the various life stages of cloud-hosted data assets, and technologies associated with data asset
security and protection.

We also talked about storage types used in a cloud computing environment, potential storage type threats, how to enable and use encryption for cloud-
hosted data. And we also took a look at current and developing data protection techniques, including data tokenization. In our next course, we'll move
on to explore the objectives, tools, and challenges related to data discovery and classification.
