Google Cloud
Professional Cloud Security Engineer Journey
Course Workbook

Certification Exam Guide Sections
1 Configuring access
2 Securing communications and establishing boundary protection
3 Ensuring data protection
4 Managing operations
5 Supporting compliance requirements
Courses
● Security in Google Cloud: M2 Securing Access to Google Cloud
● Managing Security in Google Cloud: M2 Securing Access to Google Cloud

Documentation
● Active Directory user account provisioning | Identity and access management | Google Cloud
● What is Configuration Manager? - Google Workspace Admin Help
● Manage membership automatically with dynamic groups - Google Workspace Admin Help
● Creating and updating a dynamic group | Cloud Identity
● Create and manage groups using APIs - Google Workspace Admin Help
1.2 Diagnostic Question 03

Cymbal Bank leverages Google Cloud storage services, an on-premises Apache Spark cluster, and a web application hosted on a third-party cloud. The Spark cluster and web application require limited access to Cloud Storage buckets and a Cloud SQL instance for only a few hours per day. You have been tasked with sharing credentials while minimizing the risk that the credentials will be compromised.

What should you do?

A. Create a service account with appropriate permissions. Authenticate the Spark Cluster and the web application as direct requests and share the service account key.
B. Create a service account with appropriate permissions. Have the Spark Cluster and the web application authenticate as delegated requests, and share the short-lived service account credential as a JWT.
C. Create a service account with appropriate permissions. Authenticate the Spark Cluster and the web application as a delegated request, and share the service account key.
D. Create a service account with appropriate permissions. Have the Spark Cluster and the web application authenticate as a direct request, and share the short-lived service account credentials as XML tokens.
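The short-lived credential approach mentioned in these options can be exercised from the gcloud CLI. A minimal sketch, assuming a hypothetical service account storage-reader@my-project.iam.gserviceaccount.com and a caller who holds the Service Account Token Creator role on it:

# Mint a short-lived OAuth access token by impersonating the service account
# (no long-lived service account key is ever downloaded or shared).
gcloud auth print-access-token \
    --impersonate-service-account=storage-reader@my-project.iam.gserviceaccount.com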
1.2 Diagnostic Question 04

Cymbal Bank recently discovered service account key misuse in one of the teams during a security audit. As a precaution, going forward you do not want any team in your organization to generate new external service account keys. You also want to restrict every new service account's usage to its associated Project.

What should you do?

A. Navigate to Organizational policies in the Google Cloud Console. Select your organization. Select iam.disableServiceAccountKeyCreation. Customize the applied to property, and set Enforcement to 'On'. Click Save. Repeat the process for iam.disableCrossProjectServiceAccountUsage.
B. Run the gcloud resource-manager org-policies enable-enforce command with the constraints iam.disableServiceAccountKeyCreation, and iam.disableCrossProjectServiceAccountUsage and the Project IDs you want the constraints to apply to.
C. Navigate to Organizational policies in the Google Cloud Console. Select your organization. Select iam.disableServiceAccountKeyCreation. Under Policy Enforcement, select Merge with parent. Click Save. Repeat the process for iam.disableCrossProjectServiceAccountLienRemoval.
D. Run the gcloud resource-manager org-policies allow command with the boolean constraints iam.disableServiceAccountKeyCreation and iam.disableCrossProjectServiceAccountUsage with Organization ID.
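For reference, the gcloud org-policy syntax these options point at looks roughly like the sketch below, using a hypothetical organization ID. Boolean constraints are switched on with enable-enforce:

# Enforce a boolean organization policy constraint for the whole organization.
gcloud resource-manager org-policies enable-enforce \
    iam.disableServiceAccountKeyCreation --organization=123456789012

# Confirm the effective policy at the organization node.
gcloud resource-manager org-policies describe \
    iam.disableServiceAccountKeyCreation --organization=123456789012 --effective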
Courses
● Security in Google Cloud: M3 Identity and Access Management (IAM)
● Managing Security in Google Cloud: M3 Identity and Access Management (IAM)

Documentation
● SAML overview | Apigee X | Google Cloud
● Set up single sign-on for managed Google Accounts using third-party Identity providers - Google Workspace Admin Help
● Assign SSO profile to organizational units or groups - Google Workspace Admin Help
● Network Mapping results - Google Workspace Admin Help
● Creating and managing custom roles | IAM Documentation
● Understanding IAM custom roles | IAM Documentation | Google Cloud
● Understanding roles | IAM Documentation
1.4 Diagnostic Question 07

Cymbal Bank's organizational hierarchy divides the Organization into departments. The Engineering Department has a 'product team' folder. This folder contains folders for each of the bank's products. Each product folder contains one Google Cloud Project, but more may be added. Each project contains an App Engine deployment.

Cymbal Bank has hired a new technical product manager and a new web developer. The technical product manager must be able to interact with and manage all services in projects that roll up to the Engineering Department folder. The web developer needs read-only access to App Engine configurations and settings for a specific product.

How should you provision the new employees' roles into your hierarchy following principles of least privilege?

A. Assign the Project Editor role in each individual project to the technical product manager. Assign the Project Editor role in each individual project to the web developer.
B. Assign the Project Owner role in each individual project to the technical product manager. Assign the App Engine Deployer role in each individual project to the web developer.
C. Assign the Project Editor role at the Engineering Department folder level to the technical product manager. Assign the App Engine Deployer role at the specific product's folder level to the web developer.
D. Assign the Project Editor role at the Engineering Department folder level to the technical product manager. Create a Custom Role in the product folder that the web developer needs access to. Add the appengine.versions.create and appengine.versions.delete permissions to that role, and assign it to the web developer.
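Folder-level role grants like the ones described in options C and D are made with the Resource Manager CLI. A minimal sketch, using hypothetical folder IDs and user emails:

# Grant Project Editor on everything under the Engineering Department folder.
gcloud resource-manager folders add-iam-policy-binding 987654321 \
    --member=user:tpm@cymbalbank.example --role=roles/editor

# Grant a narrower App Engine role on a single product folder.
gcloud resource-manager folders add-iam-policy-binding 111222333 \
    --member=user:webdev@cymbalbank.example --role=roles/appengine.deployer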
1.4 Diagnostic Question 08

Cymbal Bank's organizational hierarchy divides the Organization into departments. The Engineering Department has a 'product team' folder. This folder contains folders for each of the bank's products. One folder titled "analytics" contains a Google Cloud Project that contains an App Engine deployment and a Cloud SQL instance.

A team needs specific access to this project. The team lead needs full administrative access to App Engine and Cloud SQL. A developer must be able to configure and manage all aspects of App Engine deployments. There is also a code reviewer who may periodically review the deployed App Engine source code without making any changes.

What types of permissions would you provide to each of these users?

A. Create custom roles for all three user types at the "analytics" folder level. For the team lead, provide all appengine.* and cloudsql.* permissions. For the developer, provide appengine.applications.* and appengine.instances.* permissions. For the code reviewer, provide the appengine.instances.* permissions.
B. Assign the basic 'App Engine Admin' and 'Cloud SQL Admin' roles to the team lead. Assign the 'App Engine Admin' role to the developer. Assign the 'App Engine Code Viewer' role to the code reviewer. Assign all these permissions at the analytics project level.
C. Create custom roles for all three user types at the project level. For the team lead, provide all appengine.* and cloudsql.* permissions. For the developer, provide appengine.applications.* and appengine.instances.* permissions. For the code reviewer, provide the appengine.instances.* permissions.
D. Assign the basic 'Editor' role to the team lead. Create a custom role for the developer. Provide all appengine.* permissions to the developer. Provide the predefined 'App Engine Code Viewer' role to the code reviewer. Assign all these permissions at the "analytics" folder level.
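The custom-role approach in options A and C maps to the gcloud iam roles commands. A minimal sketch with hypothetical IDs (custom roles can only be created on projects or on the organization, not on folders):

# Create a custom role at the project level from a short permission list.
gcloud iam roles create appCodeReviewer --project=analytics-prj \
    --title="App Engine Code Reviewer" \
    --permissions=appengine.versions.getFileContents,appengine.versions.list

# Bind the custom role to the code reviewer on the project.
gcloud projects add-iam-policy-binding analytics-prj \
    --member=user:reviewer@cymbalbank.example \
    --role=projects/analytics-prj/roles/appCodeReviewer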
Courses
● Security in Google Cloud: M3 Identity and Access Management (IAM)
● Managing Security in Google Cloud: M3 Identity and Access Management (IAM)

Skill Badge
● Implement Cloud Security Fundamentals on Google Cloud

Documentation
● Access control for projects with IAM | Resource Manager Documentation | Google Cloud
● Access control for organizations with IAM | Resource Manager Documentation | Google Cloud
● Access control for folders with IAM | Resource Manager Documentation | Google Cloud
● Understanding roles | IAM Documentation
1.5 Diagnostic Question 09

Cymbal Bank is divided into separate departments. Each department is divided into teams. Each team works on a distinct product that requires Google Cloud resources for development.

How would you design a Google Cloud organization hierarchy to best match Cymbal Bank's organization structure and needs?

A. Create an Organization node. Under the Organization node, create Department folders. Under each Department, create Product folders. Under each Product, create Teams folders. In the Teams folder, add Projects.
B. Create an Organization node. Under the Organization node, create Department folders. Under each Department, create Product folders. Add Projects to the Product folders.
C. Create an Organization node. Under the Organization node, create Department folders. Under each Department, create Teams folders. Add Projects to the Teams folders.
D. Create an Organization node. Under the Organization node, create Department folders. Under each Department, create a Teams folder. Under each Team, create Product folders. Add Projects to the Product folders.
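Whichever shape is chosen, the hierarchy itself is built with Resource Manager commands. A minimal sketch with hypothetical names and IDs:

# Create a department folder directly under the organization.
gcloud resource-manager folders create \
    --display-name="Engineering" --organization=123456789012

# Create a nested folder under that department folder (ID assumed from the previous step).
gcloud resource-manager folders create \
    --display-name="Payments" --folder=456789123456

# Create a project inside the nested folder.
gcloud projects create payments-dev-001 --folder=456789123456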
1.5 Diagnostic Question 10
Cymbal Bank has a team of A. Deny Serial Port Access and Service Account Creation at the
developers and administrators Organization level. Create an ‘admin’ folder and set enforced:
working on different sets of false for constraints/compute.disableSerialPortAccess.
Create a new ‘dev’ folder inside the ‘admin’ folder, and set enforced: false for
Google Cloud resources. The
constraints/iam.disableServiceAccountCreation. Give developers access to the ‘dev’ folder, and
Bank’s administrators should be
administrators access to the ‘admin’ folder.
able to access the serial ports on
B. Deny Serial Port Access and Service Account Creation at the organization level. Create a ‘dev’ folder and
Compute Engine Instances and
set enforced: false for constraints/compute.disableSerialPortAccess. Create a new ‘admin’ folder inside
create service accounts.
the ‘dev’ folder, and set enforced: false for constraints/iam.disableServiceAccountCreation. Give
Developers should only be able to developers access to the ‘dev’ folder, and administrators access to the ‘admin’ folder.
access serial ports.
C. Deny Serial Port Access and Service Account Creation at the organization level. Create a ‘dev’ folder and
set enforced: true for constraints/compute.disableSerialPortAccess and enforced: true for
constraints/iam.disableServiceAccountCreation. Create a new ‘admin’ folder inside the ‘dev’ folder, and
How would you design the set enforced: false for constraints/iam.disableServiceAccountCreation. Give developers access to the
organization hierarchy to provide ‘dev’ folder, and administrators access to the ‘admin’ folder.
the required access?
D. Allow Serial Port Access and Service Account Creation at the organization level. Create a ‘dev’ folder and
set enforced: true for constraints/iam.disableServiceAccountCreation. Create another ‘admin’ folder that
inherits from the parent inside the organization node. Give developers access to the ‘dev’ folder, and
administrators access to the ‘admin’ folder.
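The "enforced: false" overrides described in these options correspond to disabling enforcement of a boolean constraint on a folder while it remains enforced higher up. A minimal sketch with hypothetical IDs:

# Enforce both constraints for the whole organization.
gcloud resource-manager org-policies enable-enforce \
    compute.disableSerialPortAccess --organization=123456789012
gcloud resource-manager org-policies enable-enforce \
    iam.disableServiceAccountCreation --organization=123456789012

# Relax (enforced: false) serial port access on a specific folder only.
gcloud resource-manager org-policies disable-enforce \
    compute.disableSerialPortAccess --folder=456789123456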
Courses
● Security in Google Cloud: M3 Identity and Access Management (IAM)
● Managing Security in Google Cloud: M3 Identity and Access Management (IAM)

Documentation
● Understanding hierarchy evaluation | Resource Manager Documentation | Google Cloud
● Creating and managing organizations | Resource Manager Documentation | Google Cloud
● Best practices for enterprise organizations | Documentation | Google Cloud
Section 2:
Securing communications and establishing boundary protection
What should you do?

D. Import a self-managed SSL certificate. Attach a global static external IP address to the external proxy Network Load Balancer. Validate that an existing URL map will route the incoming service to your managed instance group backend. Load your certificate and create an SSL proxy routing to your URL map. Create a global forwarding rule that routes incoming requests to the proxy.
2.1 Diagnostic Question 03

Your organization has a website running on Compute Engine. This instance only has a private IP address. You need to provide SSH access to an on-premises developer who will debug the website from the authorized on-premises location only.

How do you enable this?

A. Set up Cloud VPN. Set up an unencrypted tunnel to one of the hosts in the network. Create outbound or egress firewall rules. Use the private IP address to log in using a gcloud ssh command.
B. Use SOCKS proxy over SSH. Set up an SSH tunnel to one of the hosts in the network. Create the SOCKS proxy on the client side.
C. Use the default VPC's firewall. Open port 22 for TCP protocol using the Google Cloud Console.
D. Use Identity-Aware Proxy (IAP). Set up IAP TCP forwarding by creating ingress firewall rules on port 22 for TCP using the gcloud command.
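IAP TCP forwarding (option D) needs an ingress rule that admits Google's IAP forwarding range, after which SSH is tunneled through IAP. A minimal sketch with hypothetical names:

# Allow IAP's TCP-forwarding source range to reach port 22.
gcloud compute firewall-rules create allow-ssh-from-iap \
    --network=prod-vpc --direction=INGRESS --action=ALLOW \
    --rules=tcp:22 --source-ranges=35.235.240.0/20

# SSH to the private-IP-only instance through the IAP tunnel.
gcloud compute ssh website-vm --zone=us-central1-a --tunnel-through-iap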
D. Create user accounts for the application and database. Create a firewall rule using:
gcloud compute firewall-rules create ALLOW_MONGO_DB
--network network-name
--deny UDP:27017
--source-service-accounts web-application-user-account
--target-service-accounts database-admin-user-account
2.2 Diagnostic Question 06
Cymbal Bank has designed an A. Use subnet isolation. Create a service account for the fraud detection VM.
application to detect credit card fraud Create one service account for all the teams’ Compute Engine instances that
will access the fraud detection VM. Create a new firewall rule using:
that will analyze sensitive information.
gcloud compute firewall-rules create ACCESS_FRAUD_ENGINE
The application that’s running on a
--network <network name>
Compute Engine instance is hosted in --allow TCP:80
a new subnet on an existing VPC. --source-service-accounts <one service account for all teams>
Multiple teams who have access to --target-service-accounts <fraud detection engine’s service account>
other VMs in the same VPC must
access the VM. You want to configure B. Use target filtering. Create two tags called ‘app’ and ‘data’. Assign the ‘app’ tag to the Compute Engine instance hosting
the access so that unauthorized VMs the Fraud Detection App (source), and assign the ‘data’ tag to the other Compute Engine instances (target). Create a
firewall rule to allow all ingress communication on this tag.
or users from the internet can’t access
the fraud detection VM. C. Use subnet isolation. Create a service account for the fraud detection engine. Create service accounts for each of the
teams’ Compute Engine instances that will access the engine. Add a firewall rule using:
gcloud compute firewall-rules create ACCESS_FRAUD_ENGINE
--network <network name>
--allow TCP:80
What should you do?
--source-service-accounts <list of service accounts>
--target-service-accounts <fraud detection engine’s service account>
D. Use target filtering. Create a tag called ‘app’, and assign the tag to both the source and the target. Create a firewall rule to
allow all ingress communication on this tag.
2.2 Configuring boundary segmentation

What should you do?

C. Add ingress firewall rules to allow NAT and Health Check ranges for the App Engine standard environment in the Shared VPC network. Create a client-side connector in the Service Project using the Shared VPC Project ID. Verify that the connector is in a READY state. Create an ingress rule on the Shared VPC network to allow the connector using Network Tags or IP ranges.
D. Add ingress firewall rules to allow NAT and Health Check ranges for App Engine standard environment in the Shared VPC network. Create a server-side connector in the Host Project using the Shared VPC Project ID. Verify that the connector is in a READY state. Create an ingress rule on the Shared VPC network to allow the connector using Network Tags or IP ranges.
2.3 Diagnostic Question 08
Cymbal Bank’s Customer Details API runs on a A. Use a Content Delivery Network (CDN). Establish direct peering with
Compute Engine instance with only an internal one of Google’s nearby edge-enabled PoPs.
IP address. Cymbal Bank’s new branch is B. Use Carrier Peering. Use a service provider to access their enterprise
co-located outside the Google Cloud grade infrastructure to connect to the Google Cloud environment.
points-of-presence (PoPs) and requires a
low-latency way for its on-premises apps to C. Use Partner Interconnect. Use a service provider to access their
enterprise grade infrastructure to connect to the Google Cloud
consume the API without exposing the
environment.
requests to the public internet.
D. Use Dedicated Interconnect. Establish direct peering with one of
Google’s nearby edge-enabled PoPs.
Which solution would you recommend?
2.3 Diagnostic Question 09
An external audit agency needs to A. Use a Cloud VPN tunnel. Use your DNS provider to
perform a one-time review of Cymbal create DNS zones and records for private.googleapis.com.
Bank’s Google Cloud usage. The auditors Connect the DNS provider to your on-premises network.
should be able to access a Default VPC Broadcast the request from the on-premises environment.
containing BigQuery, Cloud Storage, and Use a software-defined firewall to manage incoming and outgoing requests.
Compute Engine instances where all the
B. Use Partner Interconnect. Configure an encrypted tunnel in the auditor's on-premises
usage information is stored. You have
environment. Use Cloud DNS to create DNS zones and A records for
been tasked with enabling the access
private.googleapis.com.
from their on-premises environment,
which already has a configured VPN. C. Use a Cloud VPN tunnel. Use Cloud DNS to create DNS zones and records for
*.googleapis.com. Set up on-premises routing with Cloud Router. Use Cloud Router
custom route advertisements to announce routes for Google Cloud destinations.
What should you do? D. Use Dedicated Interconnect. Configure a VLAN in the auditor's on-premises environment.
Use Cloud DNS to create DNS zones and records for restricted.googleapis.com and
private.googleapis.com. Set up on-premises routing with Cloud Router. Add custom static
routes in the VPC to connect individually to BigQuery, Cloud Storage, and Compute Engine
instances.
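The Cloud DNS and Cloud Router pieces mentioned in options C and D can be sketched as follows. Names are hypothetical; 199.36.153.8/30 is the documented private.googleapis.com range:

# Private zone that resolves *.googleapis.com to the Private Google Access VIPs.
gcloud dns managed-zones create googleapis-zone --dns-name=googleapis.com. \
    --visibility=private --networks=default --description="PGA for on-prem auditors"
gcloud dns record-sets create private.googleapis.com. --zone=googleapis-zone \
    --type=A --ttl=300 --rrdatas=199.36.153.8,199.36.153.9,199.36.153.10,199.36.153.11
gcloud dns record-sets create "*.googleapis.com." --zone=googleapis-zone \
    --type=CNAME --ttl=300 --rrdatas=private.googleapis.com.

# Advertise the range to on-premises over the VPN with Cloud Router custom advertisements.
gcloud compute routers update vpn-router --region=us-central1 \
    --advertisement-mode=CUSTOM --set-advertisement-ranges=199.36.153.8/30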
2.3 Diagnostic Question 10
An ecommerce portal uses Google A. Cloud DNS, subnet primary IP address range for nodes, and subnet
Kubernetes Engine to deploy its secondary IP address range for pods and services in the cluster
recommendation engine in Docker B. Cloud VPN, subnet secondary IP address range for nodes, and subnet
containers. This cluster instance does not secondary IP address range for pods and services in the cluster
have an external IP address. You need to
provide internet access to the pods in the C. Nginx load balancer, subnet secondary IP address range for nodes, and
subnet secondary IP address range for pods and services in the cluster
Kubernetes cluster. What configuration
would you add? D. Cloud NAT gateway, subnet primary IP address range for nodes, and
subnet secondary IP address range for pods and services in the cluster
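A Cloud NAT gateway for a private GKE cluster is configured on a Cloud Router in the cluster's region. A minimal sketch with hypothetical names:

# Router in the cluster's VPC and region.
gcloud compute routers create gke-nat-router --network=prod-vpc --region=us-central1

# NAT gateway covering the subnet's primary (node) and secondary (pod) ranges.
gcloud compute routers nats create gke-nat-config --router=gke-nat-router \
    --region=us-central1 --auto-allocate-nat-external-ips \
    --nat-all-subnet-ip-ranges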
2.3 Establishing private connectivity

Courses
● Networking in Google Cloud
  ● M2 Sharing VPC Networks
  ● M5 Private Connection Options
  ● M13 Connectivity Options
  ● M14 Cloud VPN
● Security in Google Cloud
  ● M4 Configuring VPC for Isolation and Security
  ● M5 Securing Compute Engine: Techniques and Best Practices
● Networking in Google Cloud: Fundamentals
  ● M2 Sharing VPC Networks
● Networking in Google Cloud: Hybrid and Multicloud
  ● M1 Connectivity Options
  ● M2 Cloud VPN
● Managing Security in Google Cloud
  ● M4 Configuring VPC for Isolation and Security
● Security Best Practices in Google Cloud
  ● M1 Securing Compute Engine: Techniques and Best Practices

Skill Badges
● Build and Secure Networks in Google Cloud
● Implement Cloud Security Fundamentals on Google Cloud

Documentation
● Configuring Serverless VPC Access | Google Cloud
● Overview of VPC Service Controls | Google Cloud
● Choosing a Network Connectivity product | Google Cloud
● Private Google Access | VPC
● Manage zones | Cloud DNS
● Private Google Access for on-premises hosts | VPC | Google Cloud
● Simplifying cloud networking for enterprises: announcing Cloud NAT and more | Google Cloud Blog
● Example GKE setup | Cloud NAT
● Cloud NAT overview
Section 3:
Ensuring data protection
3.1 Diagnostic Question 01 Discussion

Cymbal Bank has hired a data analyst team to analyze scanned copies of loan applications. Because this is an external team, Cymbal Bank does not want to share the name, gender, phone number, or credit card numbers listed in the scanned copies. You have been tasked with hiding this PII information while minimizing latency.

What should you do?

A. Use the Cloud Data Loss Prevention (DLP) API to make redact image requests. Provide your project ID, built-in infoTypes, and the scanned copies when you make the requests.
B. Use the Cloud Vision API to perform optical character recognition (OCR) from scanned images. Redact the text using the Cloud Natural Language API with regular expressions.
C. Use the Cloud Vision API to perform optical character recognition (OCR) from scanned images. Redact the text using the Cloud Data Loss Prevention (DLP) API with regular expressions.
D. Use the Cloud Vision API to perform text extraction from scanned images. Redact the text using the Cloud Natural Language API with regular expressions.
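A redact-image request against the DLP API (option A) looks roughly like the sketch below; the project ID, file name, and infoType list are placeholders:

# Build the request with the base64-encoded scan, then call the image:redact method.
IMG=$(base64 -w0 loan_application.png)
cat > redact-request.json <<EOF
{"byteItem": {"type": "IMAGE_PNG", "data": "${IMG}"},
 "inspectConfig": {"infoTypes": [{"name": "PERSON_NAME"}, {"name": "GENDER"},
                                 {"name": "PHONE_NUMBER"}, {"name": "CREDIT_CARD_NUMBER"}]}}
EOF
curl -s -X POST "https://dlp.googleapis.com/v2/projects/my-project/image:redact" \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d @redact-request.json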
3.1 Diagnostic Question 02 Discussion

Cymbal Bank needs to statistically predict the days customers delay the payments for loan repayments and credit card repayments. Cymbal Bank does not want to share the exact dates a customer has defaulted or made a payment with data analysts. Additionally, you need to hide the customer name and the customer type, which could be corporate or retail.

How do you provide the appropriate information to the data analysts?

A. Generalize all dates to year and month with bucketing. Use the built-in infoType for customer name. Use a custom infoType for customer type with a custom dictionary.
B. Generalize all dates to year and month with bucketing. Use the built-in infoType for customer name. Use a custom infoType for customer type with regular expression.
C. Generalize all dates to year and month with date shifting. Use a predefined infoType for customer name. Use a custom infoType for customer type with a custom dictionary.
D. Generalize all dates to year and month with date shifting. Use a predefined infoType for customer name. Use a custom infoType for customer type with regular expression.
3.1 Diagnostic Question 03 Discussion
Cymbal Bank stores customer information in a A. Create separate datasets for each department.
BigQuery table called ‘Information,’ which Create views for each dataset separately.
belongs to the dataset ‘Customers.’ Various Authorize these views to access the source
departments of Cymbal Bank, including loan, dataset. Share the datasets with departments.
credit card, and trading, access the Provide the bigquery.dataViewer role to each department’s required users.
information table. Although the data source
B. Create an authorized dataset in BigQuery’s Explorer panel. Write Customers’ table
remains the same, each department needs to
metadata into a JSON file, and edit the file to add each department’s Project ID and
read and analyze separate customers and
Dataset ID. Provide the bigquery.user role to each department’s required users.
customer-attributes. You want a
cost-effective way to configure departmental C. Secure data with classification. Open the Data Catalog Taxonomies page in the
access to BigQuery to provide optimal Google Cloud Console. Create policy tags for required columns and rows. Provide
performance. the bigquery.user role to each department’s required users. Provide policy tags
access to each department separately.
D. Create separate datasets for each department. Create authorized functions in each
dataset to perform required aggregations. Write transformed data to new tables for
What should you do?
each department separately. Provide the bigquery.dataViewer role to each
department’s required users.
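The authorized-view pattern in option A can be driven from the bq CLI. A minimal sketch with hypothetical project, dataset, and view names:

# Dataset that will be shared with the loan department, plus a filtered view in it.
bq mk --dataset my-project:loan_views
bq mk --use_legacy_sql=false \
    --view 'SELECT customer_id, delayed_days FROM `my-project.Customers.Information` WHERE customer_type = "loan"' \
    my-project:loan_views.loan_customers

# Authorize the view on the source dataset by adding a "view" entry to its access list.
bq show --format=prettyjson my-project:Customers > customers.json
#   (edit customers.json: append {"view": {"projectId": "my-project",
#    "datasetId": "loan_views", "tableId": "loan_customers"}} to the "access" array)
bq update --source customers.json my-project:Customers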
3.1 Diagnostic Question 04 Discussion
Cymbal Bank has a Cloud SQL instance A. Use Secret Manager. Use the duration attribute to set the expiry period to one year. Add
that must be shared with an external the secretmanager.secretAccessor role for the group that contains external developers.
agency. The agency’s developers will be
B. Use Cloud Key Management Service. Use the destination IP address and Port attributes to
assigned roles and permissions through a
provide access for developers at the external agency. Remove the IAM access after one
Google Group in Identity and Access
year and rotate the shared keys. Add cloudkms.cryptoKeyEncryptorDecryptor role for
Management (IAM). The external agency
the group that contains the external developers.
is on an annual contract and will require a
connection string, username, and C. Use Secret Manager. Use the resource attribute to set a key-value pair with key as
password to connect to the database. duration and values as expiry period one year from now. Add secretmanager.viewer role
for the group that contains external developers.
D. Use Secret Manager for the connection string and username, and use Cloud Key
Management Service for the password. Use tags to set the expiry period to the
How would you configure the
timestamp one year from now. Add secretmanager.secretVersionManager and
group’s access?
secretmanager.secretAccessor roles for the group that contains external developers.
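Option A's duration-based expiry and group grant map to flags on the Secret Manager CLI. A minimal sketch with hypothetical names:

# Secret that expires one year (in seconds) after creation.
gcloud secrets create sql-connection-string --replication-policy=automatic --ttl=31536000s
echo -n "host=10.1.2.3;db=loans;user=agency" | \
    gcloud secrets versions add sql-connection-string --data-file=-

# Let the external agency's Google Group read the secret payload.
gcloud secrets add-iam-policy-binding sql-connection-string \
    --member=group:agency-devs@cymbalbank.example \
    --role=roles/secretmanager.secretAccessor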
3.1 Protecting sensitive data and preventing data loss

Courses
● Security in Google Cloud
  ● M5 Securing Compute Engine: Techniques and Best Practices
  ● M6 Securing Cloud Data: Techniques and Best Practices
  ● M7 Application Security: Techniques and Best Practices
  ● M10 Content-Related Vulnerabilities: Techniques and Best Practices
● Security Best Practices in Google Cloud
  ● M1 Securing Compute Engine: Techniques and Best Practices
  ● M2 Securing Cloud Data: Techniques and Best Practices
  ● M3 Application Security: Techniques and Best Practices
● Mitigating Security Vulnerabilities in Google Cloud
  ● M2 Content-Related Vulnerabilities: Techniques and Best Practices

Documentation
● Image inspection and redaction | Data Loss Prevention Documentation | Google Cloud
● Redacting sensitive data from images | Data Loss Prevention Documentation | Google Cloud
● InfoType detector reference | Data Loss Prevention Documentation | Google Cloud
● Pseudonymization | Data Loss Prevention Documentation | Google Cloud
● Authorized views | BigQuery | Google Cloud
● Authorized datasets | BigQuery | Google Cloud
● Sharing across perimeters with bridges | VPC Service Controls | Google Cloud
● Creating a perimeter bridge | VPC Service Controls | Google Cloud
● Context-aware access with ingress rules | VPC Service Controls | Google Cloud
● Frequently asked questions | IAM Documentation
● Access control with IAM | Secret Manager Documentation | Google Cloud
3.2 Diagnostic Question 05 Discussion
Cymbal Bank calculates employee A. Import the spreadsheets to BigQuery, and
incentives on a monthly basis for the create separate tables for Sales and Marketing.
sales department and on a quarterly Set table expiry rules to 365 days for both tables.
basis for the marketing department. Create jobs scheduled to run every quarter for Marketing and every month for Sales.
The incentives are released with the
B. Upload the spreadsheets to Cloud Storage. Select the Nearline storage class for the
next month’s salary. Employee’s
sales department and Coldline storage for the marketing department. Use object
performance documents are stored as
lifecycle management rules to set the storage class to Archival after 365 days. Process
spreadsheets, which are retained for
the data on BigQuery using jobs that run monthly for Sales and quarterly for Marketing.
at least one year for audit. You want to
configure the most cost-effective C. Import the spreadsheets to Cloud SQL, and create separate tables for Sales and
storage for this scenario. Marketing. For Table Expiration, set 365 days for both tables. Use stored procedures to
calculate incentives. Use App Engine cron jobs to run stored procedures monthly for
Sales and quarterly for Marketing.
D. Import the spreadsheets into Cloud Storage and create NoSQL tables. Use App Engine
What should you do? cron jobs to run monthly for Sales and quarterly for Marketing. Use a separate job to
delete the data after 1 year.
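The lifecycle rule referenced in option B is attached to a bucket as a small JSON policy. A minimal sketch with a hypothetical bucket name (the storage class is named ARCHIVE in the API):

# lifecycle.json: move objects to the Archive class once they are 365 days old.
cat > lifecycle.json <<'EOF'
{"rule": [{"action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
           "condition": {"age": 365}}]}
EOF
gsutil lifecycle set lifecycle.json gs://cymbal-sales-incentives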
3.2 Diagnostic Question 06 Discussion
Cymbal Bank uses Google Kubernetes A. In the Google Cloud console, navigate to
Engine (GKE) to deploy its Docker Google Kubernetes Engine. Select your cluster
containers. You want to encrypt the boot and the boot node inside the cluster. Enable
disk for a cluster running a custom customer-managed encryption. Use Cloud HSM to generate random bytes and
image so that the key rotation is provide an additional layer of security.
controlled by the Bank. GKE clusters will
B. Create a new GKE cluster with customer-managed encryption and HSM enabled.
also generate up to 1024 randomized
Deploy the containers to this cluster. Delete the old GKE cluster. Use Cloud HSM to
characters that will be used with the
generate random bytes and provide an additional layer of security.
keys with Docker containers.
C. Create a new key ring using Cloud Key Management Service. Extract this key to a
certificate. Use the kubectl command to update the Kubernetes configuration.
Validate using MAC digital signatures, and use a startup script to generate random
bytes.
What steps would you take to apply the
encryption settings with a dedicated D. Create a new key ring using Cloud Key Management Service. Extract this key to a
hardware security layer? certificate. Use the Google Cloud Console to update the Kubernetes configuration.
Validate using MAC digital signatures, and use a startup script to generate random
bytes.
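Creating a cluster whose node boot disks use a customer-managed, HSM-protected key (the approach option B describes) can be sketched as follows with hypothetical names:

# HSM-backed key whose rotation schedule the bank controls.
gcloud kms keyrings create gke-keys --location=us-central1
gcloud kms keys create boot-disk-key --keyring=gke-keys --location=us-central1 \
    --purpose=encryption --protection-level=hsm

# New cluster whose node boot disks are encrypted with that key.
gcloud container clusters create fraud-scoring-cluster --zone=us-central1-a \
    --boot-disk-kms-key=projects/my-project/locations/us-central1/keyRings/gke-keys/cryptoKeys/boot-disk-key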
3.2 Diagnostic Question 07 Discussion

Cymbal Bank needs to migrate existing loan processing applications to Google Cloud. These applications transform confidential financial information. All the data should be encrypted at all stages, including sharing between sockets and RAM. An integrity test should also be performed every time these instances boot. You need to use Cymbal Bank's encryption keys to configure the Compute Engine instances.

A. Create a Confidential VM instance with Customer-Supplied Encryption Keys. In Cloud Logging, collect all logs for sevLaunchAttestationReportEvent.
B. Create a Shielded VM instance with Customer-Supplied Encryption Keys. In Cloud Logging, collect all logs for earlyBootReportEvent.
C. Create a Confidential VM instance with Customer-Managed Encryption Keys. In Cloud Logging, collect all logs for earlyBootReportEvent.
D. Create a Shielded VM instance with Customer-Managed Encryption Keys. In Cloud Logging, collect all logs for sevLaunchAttestationReportEvent.
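One way to combine a Confidential VM with a customer-managed boot-disk key, roughly matching the options above, is sketched below. Names are hypothetical and the image family must support AMD SEV:

gcloud compute instances create loan-processor-1 --zone=us-central1-a \
    --machine-type=n2d-standard-4 --confidential-compute \
    --maintenance-policy=TERMINATE \
    --image-family=ubuntu-2004-lts --image-project=ubuntu-os-cloud \
    --boot-disk-kms-key=projects/my-project/locations/us-central1/keyRings/vm-keys/cryptoKeys/boot-key \
    --shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring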
Courses
● Managing Security in Google Cloud
  ● M4 Configuring Virtual Private Cloud for Isolation and Security
● Security Best Practices in Google Cloud

Documentation
● Using Cloud KMS with other products
● Rotating keys | Cloud KMS Documentation
● Confidential VM and Compute Engine | Google Cloud
You are building an AI model on Google Cloud to analyze customer data and predict purchase behavior. This model will have access to sensitive information like purchase history and demographics.

A. Enable Google Cloud Armor on your deployed model to block malicious requests.
B. Store all model training data in BigQuery with public access for transparency.
C. Configure IAM roles to grant full access to the model for all Google Cloud users.
D. Deploy the model in a region with the highest data security standards.
E. Monitor the model's performance for anomalies and biases, then manually intervene if needed.
You're building a machine learning model on Google Cloud. You're choosing between two options: managing the infrastructure yourself (IaaS) or using Google's managed services (PaaS).

A. Network traffic inspection and intrusion detection
B. Compliance with internal security policies
C. Data location and residency restrictions
D. Granular access controls and permissions
E. Physical server hardening and security patches
You are tasked with developing an AI system on Google Cloud for a telecommunications business. This AI system will conduct sentiment analysis on conversations agents have with customers, and provide conversational recommendations to improve customer satisfaction in the future.

A. Select Google Cloud AI services that leverage a PaaS model. These are the only ones that can guarantee a secure-by-design foundation.
B. Deploy your AI solution using managed instance groups (MIGs). These have baked in security controls specific to running AI workloads.
C. Leverage an AI model-specific threat detection scanner. Threats between AI systems and non-AI systems have very little in common.
D. AI systems are more interconnected than non-AI systems. Prepare for new attack vectors, as attackers can exploit vulnerabilities in one system to attack another.
Courses
● Security in Google Cloud
  ● M2 Securing Access to Google Cloud
  ● M6 Securing Cloud Data: Techniques and Best Practices
  ● M10 Content-Related Vulnerabilities: Techniques and Best Practices
  ● M11 Monitoring, Logging, Auditing, and Scanning
● Managing Security in Google Cloud
  ● M2 Securing Access to Google Cloud
● Security Best Practices in Google Cloud
  ● M2 Securing Cloud Data: Techniques and Best Practices
● Mitigating Security Vulnerabilities on Google Cloud
  ● M2 Content-Related Vulnerabilities: Techniques and Best Practices
  ● M3 Monitoring, Logging, Auditing and Scanning

Documentation
● How-sensitive-data-protection-can-help-secure-generative-ai-workloads
● Paas-vs-iaas-vs-saas
Section 4:
Managing operations
4.1 Diagnostic Question 01 Discussion
Cymbal Bank has received Docker A. Prepare a cloudbuild.yaml file. In this file, add four steps in order—build, scan, severity
source files from its third-party check, and push—specifying the location of Artifact Registry repository. Specify severity
developers in an Artifact Registry level as CRITICAL. Start the build with the command gcloud builds submit.
repository. These Docker fil2es will
be part of a CI/CD pipeline to update B. Prepare a cloudbuild.yaml file. In this file, add four steps in order—scan, build, severity
Cymbal Bank’s personal loan check, and push—specifying the location of the Artifact Registry repository. Specify
offering. The bank wants to prevent
severity level as HIGH. Start the build with the command gcloud builds submit.
the possibility of remote users
arbitrarily using the Docker files to
C. Prepare a cloudbuild.yaml file. In this file, add four steps in order—scan, severity check,
run any code. You have been tasked
build, and—push specifying the location of the Artifact Registry repository. Specify
with using Container Analysis’
On-Demand scanning to scan the
severity level as HIGH. Start the build with the command gcloud builds submit.
images for a one-time update.
D. Prepare a cloudbuild.yaml file. In this file, add four steps in order—build, severity check,
scan, and push—specifying the location of the Artifact Registry repository. Specify
What should you do?
severity level as CRITICAL. Start the build with the command gcloud builds submit.
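The scan and severity-check steps in these options wrap the On-Demand Scanning commands; outside a pipeline the same scan can be run directly. A minimal sketch with a hypothetical image path:

# Scan an image already pushed to Artifact Registry, then list the severities it found.
gcloud artifacts docker images scan \
    us-central1-docker.pkg.dev/my-project/loan-repo/loan-app:v1 --remote
# Use the scan name returned by the previous command:
gcloud artifacts docker images list-vulnerabilities SCAN_NAME \
    --format="value(vulnerability.effectiveSeverity)"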
4.1 Diagnostic Question 02 Discussion
Cymbal Bank’s management is A. Set an organization-level policy that requires all Compute Engine VMs to be configured as
concerned about virtual machines Shielded VMs. Use Secure Boot enabled with Unified Extensible Firmware Interface (UEFI).
being compromised by bad actors. Validate integrity events in Cloud Monitoring and place alerts on launch attestation events.
More specifically, they want to
B. Set Cloud Logging measurement policies on the VMs. Use Cloud Logging to place alerts
receive immediate alerts if there
whenever actualMeasurements and policyMeasurements don’t match.
have been changes to the boot
sequence of any of their Compute C. Set an organization-level policy that requires all Compute Engine VMs to be configured as
Engine instances. Shielded VMs. Use Measured Boot enabled with Virtual Trusted Platform Module (vTPM). Validate
integrity events in Cloud Monitoring and place alerts on late boot validation events.
D. Set project-level policies that require all Compute Engine VMs to be configured as Shielded VMs.
Use Measured Boot enabled with Virtual Trusted Platform Module (vTPM). Validate integrity
What should you do?
events in Cloud Monitoring and place alerts on late boot validation events.
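The organization-level Shielded VM requirement in options A and C is a boolean constraint. A minimal sketch with a hypothetical organization ID:

# Require that all new Compute Engine VMs in the organization are Shielded VMs.
gcloud resource-manager org-policies enable-enforce \
    compute.requireShieldedVm --organization=123456789012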
4.1 Diagnostic Question 03 Discussion
Cymbal Bank runs a Node.js A. Prepare a shell script. Add the command gcloud compute
application on a Compute Engine instances stop with the Node.js instance name. Set up
instance. Cymbal Bank needs to certificates for secure boot. Add gcloud compute images
share this base image with a create, and specify the Compute Engine instance’s persistent disk and zone and the certificate
‘development’ Google Group. This files. Add gcloud compute images add-iam-policy-binding and specify the ‘development’ group.
base image should support secure B. Start the Compute Engine instance. Set up certificates for secure boot. Prepare a cloudbuild.yaml
boot for the Compute Engine configuration file. Specify the persistent disk location of the Compute Engine and the
instances deployed from this ‘development’ group. Use the command gcloud builds submit --tag, and specify the configuration
image. How would you automate file path and the certificates.
the image creation? C. Prepare a shell script. Add the command gcloud compute instances start to the script to start the
Node.js Compute Engine instance. Set up Measured Boot for secure boot. Add gcloud compute
images create, and specify the persistent disk and zone of the Compute Engine instance.
D. Stop the Compute Engine instance. Set up Measured Boot for secure boot. Prepare a
How would you automate cloudbuild.yaml configuration file. Specify the persistent disk location of the Compute Engine
the image creation? instance and the ‘development’ group. Use the command gcloud builds submit --tag, and specify
the configuration file path.
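A shell-script version of the flow option A describes might look like the sketch below; names are hypothetical, and the certificate flags are the secure-boot key files prepared beforehand:

# Stop the source instance, create a shareable secure-boot image, and grant the group access.
gcloud compute instances stop nodejs-app --zone=us-central1-a
gcloud compute images create nodejs-base-image \
    --source-disk=nodejs-app --source-disk-zone=us-central1-a \
    --guest-os-features=UEFI_COMPATIBLE \
    --platform-key-file=PK.der --key-exchange-key-file=KEK.der \
    --signature-database-file=db.der
gcloud compute images add-iam-policy-binding nodejs-base-image \
    --member=group:development@cymbalbank.example --role=roles/compute.imageUser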
4.1 Diagnostic Question 04 Discussion
Cymbal Bank uses Docker A. Build a foundation image. Define a job on Jenkins for the Docker
containers to interact with APIs image. Upload the deployment configuration and container
for its personal banking definition Packer template to a Git repository. In the template,
application. These APIs are under include a pre-processor attribute to tag the image with Git repository
PCI-DSS compliance. The and Container Registry. Use Jenkins to build the container, and deploy it in Google Kubernetes Engine. Use
Kubernetes environment running Container Registry to distribute the image.
the containers will not have B. Build an immutable image. Define a job on Jenkins for the Docker image. Upload the deployment
internet access to download configuration and container definition Packer template to a Git repository. In the template, include a
required packages. post-processor attribute to tag the image with Git repository and Container Registry. Use Jenkins to build
the container, and deploy it in Google Kubernetes Engine. Use Container Registry to distribute the image.
C. Build a foundation image. Store all artifacts and a Packer definition template in a Git repository. Use
Container Registry to build the artifacts and Packer definition. Use Cloud Build to extract the built
container and deploy it to a Google Kubernetes Engine (GKE) cluster. Add the required users and groups
How would you automate the to the GKE project.
pipeline that is building these
D. Build an immutable image. Store all artifacts and a Packer definition template in a Git repository. Use
containers? Container Registry to build the artifacts and Packer definition. Use Cloud Build to extract the built
container and deploy it to a Google Kubernetes Engine Cluster (GKE). Add the required users and groups
to the GKE project.
Cymbal Bank wants to use Cloud Storage and BigQuery to store safe deposit usage data. Cymbal Bank needs a cost-effective approach to auditing only Cloud Storage and BigQuery data access activities.

How would you use Cloud Audit Logs to enable this analysis?

A. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE at the service level for BigQuery and Cloud Storage.
B. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE at the organization level.
C. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE for Cloud Storage. All Data Access Logs are enabled for BigQuery by default.
D. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE for BigQuery. All Data Access Logs are enabled for Cloud Storage by default.
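Data Access audit logs are switched on per service through the auditConfigs block of an IAM policy. A minimal sketch, assuming a hypothetical project ID:

# Download the current policy, add an auditConfigs entry, and write it back.
gcloud projects get-iam-policy my-project --format=json > policy.json
#   (add to policy.json:
#    "auditConfigs": [{"service": "storage.googleapis.com",
#      "auditLogConfigs": [{"logType": "ADMIN_READ"},
#                          {"logType": "DATA_READ"},
#                          {"logType": "DATA_WRITE"}]}] )
gcloud projects set-iam-policy my-project policy.json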
4.2 Diagnostic Question 08 Discussion
Cymbal Bank has suffered a remote A. Use Event Threat Detection. Trigger the IAM Anomalous Grant detector to detect all admins
botnet attack on Compute Engine and users with admin or system permissions. Export these logs to the Security Command
instances in an isolated project. The Center. Give the external agency access to the Security Command Center.
affected project now requires
B. Use Cloud Audit Logs. Filter Admin Activity audit logs for only the affected project. Use a
investigation by an external agency.
Pub/Sub topic to stream the logs from Cloud Audit Logs to the external agency’s forensics
An external agency requests that you
tool.
provide all admin and system events
to analyze in their local forensics tool. C. Use the Security Command Center. Select Cloud Logging as the source, and filter by
You want to use the most category: Admin Activity and category: System Activity. View the Source property of the
cost-effective solution to enable the Finding Details section. Use Pub/Sub topics to export the findings to the external agency’s
external analysis. forensics tool.
D. Use Cloud Monitoring and Cloud Logging. Filter Cloud Monitoring to view only system and
admin logs. Expand the system and admin logs in Cloud Logging. Use Pub/Sub to export the
What should you do?
findings from Cloud Logging to the external agency’s forensics tool or storage.
4.2 Diagnostic Question 09 Discussion
The loan application from Cymbal A. Set up a logging export dataset in BigQuery to collect data from Cloud Logging and
Bank’s lending department collects Cloud Monitoring. Create table expiry rules to delete logs after three years.
credit reports that contain credit B. Set up a logging export dataset in BigQuery to collect data from Cloud Logging and
payment information from customers. the Security Command Center. Create table expiry rules to delete logs after three
According to bank policy, the PDF years.
reports are stored for six months in
C. Set up a logging export bucket in Cloud Storage to collect data from the Security
Cloud Storage, and access logs for the
Command Center. Configure object lifecycle management rules to delete logs after
reports are stored for three years. You
three years.
need to configure a cost-effective
storage solution for the access logs. D. Set up a logging export bucket in Cloud Storage to collect data from Cloud Audit
Logs. Configure object lifecycle management rules to delete logs after three years.
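The export-plus-lifecycle pattern in option D can be sketched as follows with hypothetical names (1095 days is roughly three years):

# Route Data Access audit logs to a Cloud Storage bucket
# (grant the sink's writer identity roles/storage.objectCreator on the bucket afterwards).
gcloud logging sinks create access-log-sink \
    storage.googleapis.com/cymbal-access-logs \
    --log-filter='logName:"cloudaudit.googleapis.com%2Fdata_access"'

# Delete exported log objects after three years.
cat > delete-rule.json <<'EOF'
{"rule": [{"action": {"type": "Delete"}, "condition": {"age": 1095}}]}
EOF
gsutil lifecycle set delete-rule.json gs://cymbal-access-logs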
Cymbal Bank uses Compute Engine instances for its APIs, and recently discovered bitcoin mining activities on some instances. The bank wants to detect all future mining attempts and notify the security team. The security team can view the Security Command Center and Cloud Audit Logs.

How should you configure the detection and notification?

A. Use Event Threat Detection's threat detectors. Export findings from 'Suspicious account activity' and 'Anomalous IAM behavior' detectors and publish them to a Pub/Sub topic. Create a Cloud Run function to send notifications of suspect activities. Use Pub/Sub notifications to invoke the Cloud Run function.
B. Enable the VM Manager tools suite in the Security Command Center. Perform a scan of Compute Engine instances. Publish results to Cloud Audit Logging. Create an alert in Cloud Monitoring to send notifications of suspect activities.
C. Enable Anomaly Detection in the Security Command Center. Create and configure a Pub/Sub topic and an email service. Create a Cloud Run function to send email notifications for suspect activities. Export findings to a Pub/Sub topic, and use them to invoke the Cloud Run function.
D. Enable the Web Security Scanner in the Security Command Center. Perform a scan of Compute Engine instances. Publish results to Cloud Audit Logging. Create an alert in Cloud Monitoring to send notifications for suspect activities.
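Exporting Security Command Center findings to a Pub/Sub topic, as several of these options describe, is done with a notification config. A minimal sketch with hypothetical IDs:

# Publish new active findings to a Pub/Sub topic that downstream functions can subscribe to.
gcloud scc notifications create mining-findings \
    --organization=123456789012 \
    --pubsub-topic=projects/my-project/topics/scc-findings \
    --filter='state="ACTIVE"'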
4.2 Configuring logging, monitoring, and detection

Courses
● Security in Google Cloud
  ● M11 Monitoring, Logging, Auditing, and Scanning
● Mitigating Security Vulnerabilities in Google Cloud
  ● M3 Monitoring, Logging, Auditing, and Scanning

Documentation
● Security controls and forensic analysis for GKE apps | Cloud Architecture Center
● Scenarios for exporting logging data: Security and access analytics | Cloud Architecture Center | Google Cloud
● Cloud Audit Logs overview
● Cloud Audit Logs with Cloud Storage | Google Cloud
● Configure Data Access audit logs
● Scenarios for exporting Cloud Logging: Compliance requirements | Cloud Architecture Center | Google Cloud
● Security sources for vulnerabilities and threats | Security Command Center | Google Cloud
● Configuring Security Command Center
● Enabling real-time email and chat notifications
Section 5:
Supporting compliance
requirements
5.1 Diagnostic Question 01
Cymbal Bank’s lending department stores A. Generate an AES-256 key as a 32-byte bytestring. Decode
sensitive information, such as your it as a base-64 string. Upload the blob to the bucket using
customers’ credit history, address and phone this key.
number, in parquet files. You need to upload B. Generate an RSA key as a 32-byte bytestring. Decode it as
this personally identifiable information (PII) a base-64 string. Upload the blob to the bucket using this
key.
to Cloud Storage so that it’s secure and
compliant with ISO 27018. C. Generate a customer-managed encryption key (CMEK)
using RSA or AES256 encryption. Decode it as a base-64
string. Upload the blob to the bucket using this key.
How should you protect this sensitive
D. Generate a customer-managed encryption key (CMEK)
information using Cymbal Bank’s encryption
using Cloud KMS. Decode it as a base-64 string. Upload
keys and using the least amount of the blob to the bucket using this key.
computational resources?
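Uploading with a customer-supplied AES-256 key, the mechanism behind option A, can be sketched as follows; the bucket and file names are placeholders:

# 32 random bytes, base64-encoded, used as a customer-supplied encryption key (CSEK).
CSEK=$(openssl rand -base64 32)
gsutil -o "GSUtil:encryption_key=${CSEK}" \
    cp loan_applications.parquet gs://cymbal-lending-pii/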
5.1 Diagnostic Question 02
You are designing a web application for A. Use customer-supplied encryption keys (CSEK) and Cloud
Cymbal Bank so that customers who have Key Management Service (KMS) to detect and encrypt
credit card issues can contact dedicated sensitive information.
support agents. Customers may enter their B. Detect sensitive information with Cloud Natural Language
complete credit card number when chatting API.
with or emailing support agents. You want to C. Use customer-managed encryption keys (CMEK) and
ensure compliance with PCI-DSS and Cloud Key Management Service (KMS) to detect and
prevent support agents from viewing this encrypt sensitive information.
information in the most cost-effective way. D. Implement Sensitive Data Protection using its REST API.
Documentation
● Upload an object by using CSEK | Cloud Storage
● Customer-managed encryption keys (CMEK) | Cloud KMS Documentation
● Customer-supplied encryption keys | Cloud Storage
● Data encryption options | Cloud Storage
● ISO/IEC 27018 Certified Compliant | Google Cloud
● Automating the Classification of Data Uploaded to Cloud Storage | Cloud Architecture Center | Google Cloud
● Sensitive Data Protection overview
● Sensitive Data Protection client libraries | Data Loss Prevention Documentation
● Data Loss Prevention Demo
● Overview of VPC Service Controls | Google Cloud
● Getting to know the Google Cloud Healthcare API: Part 1
● Sharing and collaboration | Cloud Storage
● Google Cloud Platform HIPAA overview guide
● Setting up a HIPAA-aligned project | Cloud Architecture Center
● PCI Data Security Standard compliance | Cloud Architecture Center
When will you take the exam?

Suggested preparation path:
1. Google Cloud Fundamentals: Core Infrastructure
2. Networking in Google Cloud (Fundamentals; Routing and Addressing; Network Architecture; Network Security; Load Balancing; Hybrid and Multicloud)
3. Build and Secure Networks in Google Cloud Skill Badge
4. Managing Security in Google Cloud
5. Security Best Practices in Google Cloud
6. Mitigating Security Vulnerabilities on Google Cloud
7. Implement Cloud Security Fundamentals on Google Cloud Skill Badge
8. Google Kubernetes Engine Best Practices: Security Skill Badge
9. Review documentation
10. Sample questions
11. Take the certification exam
Weekly study plan
Now, consider what you’ve learned about your knowledge and skills
through the diagnostic questions in this course. You should have a
better understanding of what areas you need to focus on and what
resources are available.
Use the template that follows to plan your study goals for each week.
Consider:
● What exam guide section(s) or topic area(s) will you focus on?
● What courses (or specific modules) will help you learn more?
● What Skill Badges or labs will you work on for hands-on practice?
● What documentation links will you review?
● What additional resources will you use - such as sample
questions?
You may do some or all of these study activities each week.
Area(s) of focus:
Courses/modules
to complete:
Skill Badges/labs
to complete:
Documentation
to review:
Additional study: