Read First
Salesforce
DATA-CLOUD-CONSULTANT Exam
Salesforce Certified Data Cloud Consultant
Question: 1
Northern Trail Outfitters (NTO) creates a calculated insight to compute recency, frequency, and
monetary (RFM) scores on its unified individuals. NTO then creates a segment based on these scores
that it activates to a Marketing Cloud activation target.
Which two actions are required when configuring the activation?
Choose 2 answers
Answer: B, C
Explanation:
To configure an activation to a Marketing Cloud activation target, you need to choose a segment and
select contact points. Choosing a segment allows you to specify which unified individuals you want to
activate. Selecting contact points allows you to map the attributes from the segment to the fields in
the Marketing Cloud data extension. You do not need to add additional attributes or add the
calculated insight in the activation, as these are already part of the segment
definition. Reference: Create a Marketing Cloud Activation Target; Types of Data Targets in Data
Cloud
Question: 2
A customer is concerned that the consolidation rate displayed in the identity resolution is
quite low compared to their initial estimations.
Which configuration change should a consultant consider in order to increase the consolidation rate?
Answer: B
Explanation:
The consolidation rate is the amount by which source profiles are combined to produce unified
profiles, calculated as 1 - (number of unified individuals / number of source individuals). For
example, if you ingest 100 source records and create 80 unified profiles, your consolidation rate is
20%. To increase the consolidation rate, you need to increase the number of matches between
source profiles, which can be done by adding more match rules. Match rules define the criteria for
matching source profiles based on their attributes. By increasing the number of match rules, you can
increase the chances of finding matches between source profiles and thus increase the consolidation
rate. On the other hand, changing reconciliation rules, including additional attributes, or reducing
the number of match rules can decrease the consolidation rate, as they can either reduce the
number of matches or increase the number of unified profiles. Reference: Identity Resolution
Calculated Insight: Consolidation Rates for Unified Profiles, Identity Resolution Ruleset Processing
Results, Configure Identity Resolution Rulesets
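As a quick illustration of the formula above, the following minimal Python sketch computes the consolidation rate for a ruleset run; the record counts are hypothetical and match the 100-to-80 example described in the explanation.

# Consolidation rate = 1 - (unified individuals / source individuals)
source_individuals = 100
unified_individuals = 80

consolidation_rate = 1 - (unified_individuals / source_individuals)
print(f"Consolidation rate: {consolidation_rate:.0%}")  # prints "Consolidation rate: 20%"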
Question: 3
A customer is trying to activate data from Data Cloud to an Amazon S3 Cloud File Storage
Bucket.
Which authentication type should the consultant recommend to connect to the S3 bucket from Data
Cloud?
Answer: D
Explanation:
To use the Amazon S3 Storage Connector in Data Cloud, the consultant needs to provide the S3
bucket name, region, and access key and secret key for authentication. The access key and secret key
are generated by AWS and can be managed in the IAM console. The other options are not supported
by the S3 Storage Connector or by Data Cloud. Reference: Amazon S3 Storage Connector -
Salesforce, How to Use the Amazon S3 Storage Connector in Data Cloud | Salesforce Developers Blog
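To illustrate the access key and secret key authentication model, the sketch below uses the AWS SDK for Python (boto3) to verify that a key pair can reach a bucket. The bucket name and key values are placeholders; this mirrors the credentials Data Cloud asks for and is not a Data Cloud API call.

import boto3

# Placeholder credentials -- in practice these are generated in the AWS IAM console
# and entered into the Data Cloud S3 data stream or activation target configuration.
s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...EXAMPLE",
    aws_secret_access_key="wJalr...EXAMPLEKEY",
    region_name="us-east-1",
)

# A simple check that the key pair has access to the target bucket.
response = s3.list_objects_v2(Bucket="example-activation-bucket", MaxKeys=1)
print(response.get("KeyCount", 0))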
Question: 4
A consultant has an activation that is set to publish every 12 hours, but has discovered that
updates to the data prior to activation are delayed by up to 24 hours.
Which two areas should a consultant review to troubleshoot this issue?
Choose 2 answers
D. Review calculated insights to make sure they're run after the segments are refreshed.
Answer: B, C
Explanation:
The correct answer is B and C because calculated insights and segments are both dependent on the
data ingestion process. Calculated insights are derived from the data model objects and segments
are subsets of data model objects that meet certain criteria. Therefore, both of them need to be
updated after the data is ingested to reflect the latest changes. Data transformations are optional
steps that can be applied to the data streams before they are mapped to the data model objects, so
they are not relevant to the issue. Reviewing calculated insights to make sure they’re run after the
segments are refreshed (option D) is also incorrect because calculated insights are independent of
segments and do not need to be refreshed after them. Reference: Salesforce Data Cloud Consultant
Exam Guide, Data Ingestion and Modeling, Calculated Insights, Segments
Question: 5
Northern Trail Outfitters wants to use some of its Marketing Cloud data in Data Cloud.
Which engagement channel data will require custom integration?
A. SMS
B. Email
C. CloudPage
D. Mobile push
Answer: C
Explanation:
CloudPage is a web page that can be personalized and hosted by Marketing Cloud. It is not one of the
standard engagement channels that Data Cloud supports out of the box. To use CloudPage data in
Data Cloud, a custom integration is required. The other engagement channels (SMS, email, and
mobile push) are supported by Data Cloud and can be integrated using the Marketing Cloud
Connector or the Marketing Cloud API. Reference: Data Cloud Overview, Marketing Cloud
Connector, Marketing Cloud API
Question: 6
Which permission setting should a consultant check if the custom Salesforce CRM object is
not available in New Data Stream configuration?
A. Confirm the Create object permission is enabled in the Data Cloud org.
B. Confirm the View All object permission is enabled in the source Salesforce CRM org.
C. Confirm the Ingest Object permission is enabled in the Salesforce CRM org.
D. Confirm that the Modify Object permission is enabled in the Data Cloud org.
Answer: B
Explanation:
To create a new data stream from a custom Salesforce CRM object, the consultant needs to confirm
that the View All object permission is enabled in the source Salesforce CRM org. This permission
allows the user to view all records associated with the object, regardless of sharing
settings1. Without this permission, the custom object will not be available in the New Data Stream
configuration2. Reference:
Manage Access with Data Cloud Permission Sets
Object Permissions
Question: 7
Which two common use cases can be addressed with Data Cloud?
Choose 2 answers
A. Understand and act upon customer data to drive more relevant experiences.
B. Govern enterprise data lifecycle through a centralized set of policies and processes.
C. Harmonize data from multiple sources with a standardized and extendable data model.
D. Safeguard critical business data by serving as a centralized system for backup and disaster
recovery.
Answer: A, C
Explanation:
Data Cloud is a data platform that can help customers connect, prepare, harmonize, unify, query,
analyze, and act on their data across various Salesforce and external sources. Some of the common
use cases that can be addressed with Data Cloud are:
Understand and act upon customer data to drive more relevant experiences. Data Cloud can help
customers gain a 360-degree view of their customers by unifying data from different sources and
resolving identities across channels. Data Cloud can also help customers segment their audiences,
create personalized experiences, and activate data in any channel using insights and AI.
Harmonize data from multiple sources with a standardized and extendable data model. Data Cloud
can help customers transform and cleanse their data before using it, and map it to a common data
model that can be extended and customized. Data Cloud can also help customers create calculated
insights and related attributes to enrich their data and optimize identity resolution.
The other two options are not common use cases for Data Cloud. Data Cloud does not provide data
governance or backup and disaster recovery features, as these are typically handled by other
Salesforce or external solutions.
Reference:
Learn How Data Cloud Works
About Salesforce Data Cloud
Discover Use Cases for the Platform
Understand Common Data Analysis Use Cases
Question: 8
Where is value suggestion for attributes in segmentation enabled when creating the DMO?
A. Data Mapping
B. Data Transformation
C. Segment Setup
D. Data Stream Setup
Answer: C
Explanation:
Value suggestion for attributes in segmentation is a feature that allows you to see and select the
possible values for a text field when creating segment filters. You can enable or disable this feature
for each data model object (DMO) field in the DMO record home. Value suggestion can be enabled
for up to 500 attributes for your entire org. It can take up to 24 hours for suggested values to appear.
To use value suggestion when creating segment filters, you need to drag the attribute onto the
canvas and start typing in the Value field for an attribute. You can also select multiple values for some
operators. Value suggestion is not available for attributes with more than 255 characters or for
relationships that are one-to-many (1:N). Reference: Use Value Suggestions in
Segmentation, Considerations for Selecting Related Attributes
Question: 9
A Data Cloud customer wants to adjust their identity resolution rules to increase their
accuracy of matches. Rather than matching on email address, they want to review a rule that joins
their CRM Contacts with their Marketing Contacts, where both use the CRM ID as their primary key.
Which two steps should the consultant take to address this new use case?
Choose 2 answers
A. Map the primary key from the two systems to Party Identification, using CRM ID as the
identification name for both.
B. Map the primary key from the two systems to party identification, using CRM ID as the
identification name for individuals
coming from the CRM, and Marketing ID as the identification name for individuals coming from the
marketing platform.
C. Create a custom matching rule for an exact match on the Individual ID attribute.
D. Create a matching rule based on party identification that matches on CRM ID as the party
identification name.
Answer: A, D
Explanation:
To address this new use case, the consultant should map the primary key from the two systems to
Party Identification, using CRM ID as the identification name for both, and create a matching rule
based on party identification that matches on CRM ID as the party identification name. This way, the
consultant can ensure that the CRM Contacts and Marketing Contacts are matched based on their
CRM ID, which is a unique identifier for each individual. By using Party Identification, the consultant
can also leverage the benefits of this attribute, such as being able to match across different entities
and sources, and being able to handle multiple values for the same individual. The other options are
incorrect because they either do not use the CRM ID as the primary key, or they do not use Party
Identification as the attribute type. Reference: Configure Identity Resolution Rulesets, Identity
Resolution Match Rules, Data Cloud Identity Resolution Ruleset, Data Cloud Identity Resolution
Config Input
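Conceptually, the mapping and match rule described above can be pictured with the short Python sketch below. This is illustrative matching logic, not Data Cloud configuration syntax, and the record values and field names are invented.

# Each source row is mapped to a Party Identification record with the same
# identification name ("CRM ID"), so an exact-match rule on that name/value pair
# links CRM Contacts and Marketing Contacts that share the same primary key.
crm_contacts = [{"id_name": "CRM ID", "id_value": "003XX0001", "source": "CRM"}]
marketing_contacts = [{"id_name": "CRM ID", "id_value": "003XX0001", "source": "Marketing"}]

matches = [
    (c, m)
    for c in crm_contacts
    for m in marketing_contacts
    if c["id_name"] == m["id_name"] and c["id_value"] == m["id_value"]
]
print(len(matches))  # 1 -> the two source profiles would be unified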
Question: 10
Which consideration related to the way Data Cloud ingests CRM data is true?
A. CRM data cannot be manually refreshed and must wait for the next scheduled synchronization.
B. The CRM Connector's synchronization times can be customized to up to 15-minute intervals.
C. Formula fields are refreshed at regular sync intervals and are updated at the next full refresh.
D. The CRM Connector allows standard fields to stream into Data Cloud in real time.
Answer: D
Explanation:
The correct answer is D. The CRM Connector allows standard fields to stream into Data Cloud in real
time. This means that any changes to the standard fields in the CRM data source are reflected in Data
Cloud almost instantly, without waiting for the next scheduled synchronization. This feature enables
Data Cloud to have the most up-to-date and accurate CRM data for segmentation and activation1.
The other options are incorrect for the following reasons:
A) CRM data can be manually refreshed at any time by clicking the Refresh button on the data stream
detail page2. This option is false.
B) The CRM Connector’s synchronization times can be customized to up to 60-minute intervals, not
15-minute intervals3. This option is false.
C) Formula fields are not refreshed at regular sync intervals, but only at the next full refresh4. A full
refresh is a complete data ingestion process that occurs once every 24 hours or when manually
triggered. This option is false.
Reference:
1: Connect and Ingest Data in Data Cloud article on Salesforce Help
2: Data Sources in Data Cloud unit on Trailhead
3: Data Cloud for Admins module on Trailhead
4: [Formula Fields in Data Cloud] unit on Trailhead
[Data Streams in Data Cloud] unit on Trailhead
Question: 11
A. Includes data from sources where the data is most frequently occurring
B. Identifies which individual records should be merged into a unified profile by setting a priority for
specific data sources
C. Identifies which data sources should be used in the process of reconciliation by prioritizing the
most recently updated data source
D. Sets the priority of specific data sources when building attributes in a unified profile, such as a first or last name
Answer: D
Explanation:
The Source Sequence reconciliation rule sets the priority of specific data sources when building
attributes in a unified profile, such as a first or last name. This rule allows you to define which data
source should be used as the primary source of truth for each attribute, and which data sources
should be used as fallbacks in case the primary source is missing or invalid. For example, you can set
the Source Sequence rule to use data from Salesforce CRM as the first priority, data from Marketing
Cloud as the second priority, and data from Google Analytics as the third priority for the first name
attribute. This way, the unified profile will use the first name value from Salesforce CRM if it exists,
otherwise it will use the value from Marketing Cloud, and so on. This rule helps you to ensure the
accuracy and consistency of the unified profile attributes across different data
sources. Reference: Salesforce Data Cloud Consultant Exam Guide, Identity Resolution, Reconciliation
Rules
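The behavior of a Source Sequence rule can be sketched in a few lines of Python; the source names, priority order, and values are hypothetical and mirror the CRM-then-Marketing-Cloud example above.

# Candidate first-name values from different sources for one unified profile.
values_by_source = {
    "Marketing Cloud": "Jon",
    "Google Analytics": None,
    "Salesforce CRM": "Jonathan",
}

# Priority order defined by the Source Sequence reconciliation rule.
source_priority = ["Salesforce CRM", "Marketing Cloud", "Google Analytics"]

# Pick the first non-empty value in priority order.
first_name = next(
    (values_by_source[s] for s in source_priority if values_by_source.get(s)),
    None,
)
print(first_name)  # "Jonathan" -- falls back to "Jon" only if the CRM value is missing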
Question: 12
Answer: B, C
Explanation:
To delete a data stream in Data Cloud, the underlying data lake object (DLO) must not have any
dependencies or references to other objects or processes. The following two dependencies prevent a
data stream from being deleted1:
Data transform: This is a process that transforms the ingested data into a standardized format and
structure for the data model. A data transform can use one or more DLOs as input or output. If a DLO
is used in a data transform, it cannot be deleted until the data transform is removed or modified2.
Data model object: This is an object that represents a type of entity or relationship in the data model.
A data model object can be mapped to one or more DLOs to define its attributes and values. If a DLO
is mapped to a data model object, it cannot be deleted until the mapping is removed or changed3.
Reference:
1: Delete a Data Stream article on Salesforce Help
2: [Data Transforms in Data Cloud] unit on Trailhead
3: [Data Model in Data Cloud] unit on Trailhead
Question: 13
What should a user do to pause a segment activation with the intent of using that segment
again?
Answer: A
Explanation:
The correct answer is A. Deactivate the segment. If a segment is no longer needed, it can be
deactivated through Data Cloud and applies to all chosen targets. A deactivated segment no longer
publishes, but it can be reactivated at any time1. This option allows the user to pause a segment
activation with the intent of using that segment again.
The other options are incorrect for the following reasons:
B) Delete the segment. This option permanently removes the segment from Data Cloud and cannot
be undone2. This option does not allow the user to use the segment again.
C) Skip the activation. This option skips the current activation cycle for the segment, but does not
affect the future activation cycles3. This option does not pause the segment activation indefinitely.
D) Stop the publish schedule. This option stops the segment from publishing to the chosen targets,
but does not deactivate the segment4. This option does not pause the segment activation
completely.
Reference:
1: Deactivated Segment article on Salesforce Help
2: Delete a Segment article on Salesforce Help
3: Skip an Activation article on Salesforce Help
4: Stop a Publish Schedule article on Salesforce Help
Question: 14
When creating a segment on an individual, what is the result of using two separate
containers linked by an AND as shown below?
GoodsProduct | Count | At Least | 1
Color | Is Equal To | red
AND
GoodsProduct | Count | At Least | 1
PrimaryProductCategory | Is Equal To | shoes
A. Individuals who purchased at least one of any 'red' product and also purchased at least one pair
of 'shoes'
B. Individuals who purchased at least one 'red shoes' as a single line item in a purchase
C. Individuals who made a purchase of at least one 'red shoes’ and nothing else
D. Individuals who purchased at least one of any 'red' product or purchased at least one pair of
'shoes'
Answer: A
Explanation:
When creating a segment on an individual, using two separate containers linked by an AND means
that the individual must satisfy both the conditions in the containers. In this case, the individual must
have purchased at least one product with the color attribute equal to ‘red’ and at least one product
with the primary product category attribute equal to ‘shoes’. The products do not have to be the
same or purchased in the same transaction. Therefore, the correct answer is A.
The other options are incorrect because they imply different logical operators or conditions. Option B
implies that the individual must have purchased a single product that has both the color attribute
equal to ‘red’ and the primary product category attribute equal to ‘shoes’. Option C implies that the
individual must have purchased only one product that has both the color attribute equal to ‘red’ and
the primary product category attribute equal to ‘shoes’ and no other products. Option D implies that
the individual must have purchased either one product with the color attribute equal to ‘red’ or one
product with the primary product category attribute equal to ‘shoes’ or both, which is equivalent to
using an OR operator instead of an AND operator.
Reference:
Create a Container for Segmentation
Create a Segment in Data Cloud
Navigate Data Cloud Segmentation
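The container logic can be expressed as two independent existence checks joined by AND. The Python sketch below is purely illustrative of that logic, with made-up purchase rows.

# Purchases for one individual; each row is a separate line item.
purchases = [
    {"color": "red", "category": "hats"},
    {"color": "blue", "category": "shoes"},
]

# Container 1: at least one product with Color = red.
bought_red = any(p["color"] == "red" for p in purchases)
# Container 2: at least one product with PrimaryProductCategory = shoes.
bought_shoes = any(p["category"] == "shoes" for p in purchases)

# AND between containers: both conditions must hold, but not necessarily on the same row.
in_segment = bought_red and bought_shoes
print(in_segment)  # True -- a red hat plus blue shoes still qualifies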
Question: 15
What should an organization use to stream inventory levels from an inventory management
system into Data Cloud in a fast and scalable, near-real-time way?
Answer: C
Explanation:
The Ingestion API is a RESTful API that allows you to stream data from any source into Data Cloud in
a fast and scalable way. You can use the Ingestion API to send data from your inventory management
system into Data Cloud as JSON objects, and then use Data Cloud to create data models, segments,
and insights based on your inventory data. The Ingestion API supports both batch and streaming
modes, and can handle up to 100,000 records per second. The Ingestion API also provides features
such as data validation, encryption, compression, and retry mechanisms to ensure data quality and
security. Reference: Ingestion API Developer Guide, Ingest Data into Data Cloud
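As a rough illustration of streaming records into Data Cloud, the sketch below POSTs JSON to an Ingestion API streaming endpoint. The tenant URL, connector API name, object name, payload fields, and access token are all placeholders; the exact path and payload shape should be confirmed against the Ingestion API Developer Guide.

import json
import urllib.request

# Placeholder values -- the real tenant endpoint, source API name, object name,
# and OAuth access token come from the Ingestion API connector setup in Data Cloud.
endpoint = "https://example.c360a.salesforce.com/api/v1/ingest/sources/Inventory_Connector/InventoryLevel"
access_token = "REPLACE_WITH_ACCESS_TOKEN"

payload = {"data": [{"sku": "TENT-001", "quantity_on_hand": 42, "warehouse": "DFW-1"}]}

request = urllib.request.Request(
    endpoint,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    },
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(response.status)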
Question: 16
Northern Trail Outfitters (NTO), an outdoor lifestyle clothing brand, recently started a new
line of business. The new business specializes in gourmet camping food. For business reasons as well
as security reasons, it's important to NTO to keep all Data Cloud data separated by brand.
Which capability best supports NTO's desire to separate its data by brand?
Answer: C
Explanation:
Data spaces are logical containers that allow you to separate and organize your data by different
criteria, such as brand, region, product, or business unit1. Data spaces can help you manage data
access, security, and governance, as well as enable cross-cloud data integration and activation2. For
NTO, data spaces can support their desire to separate their data by brand, so that they can have
different data models, rules, and insights for their outdoor lifestyle clothing and gourmet camping
food businesses. Data spaces can also help NTO comply with any data privacy and security
regulations that may apply to their different brands3. The other options are incorrect because they
do not provide the same level of data separation and organization as data spaces. Data streams are
used to ingest data from different sources into Data Cloud, but they do not separate the data by
brand4. Data model objects are used to define the structure and attributes of the data, but they do
not isolate the data by brand5. Data sources are used to identify the origin and type of the data, but
they do not partition the data by brand. Reference: Data Spaces Overview, Create Data Spaces, Data
Privacy and Security in Data Cloud, Data Streams Overview, Data Model Objects Overview, [Data
Sources Overview]
Question: 17
Cumulus Financial created a segment called High Investment Balance Customers. This is a
foundational segment that includes several segmentation criteria the marketing team should
consistently use.
Which feature should the consultant suggest the marketing team use to ensure this consistency
when creating future, more refined segments?
Answer: A
Explanation:
Nested segments are segments that include or exclude one or more existing segments. They allow
the marketing team to reuse filters and maintain consistency in their data by using an existing
segment to build a new one. For example, the marketing team can create a nested segment that
includes High Investment Balance Customers and excludes customers who have opted out of email
marketing. This way, they can leverage the foundational segment and apply additional criteria
without duplicating the rules. The other options are not the best features to ensure consistency
because:
B) A calculated insight is a data object that performs calculations on data lake objects or CRM data
and returns a result. It is not a segment and cannot be used for activation or personalization.
C) A data kit is a bundle of packageable metadata that can be exported and imported across Data
Cloud orgs. It is not a feature for creating segments, but rather for sharing components.
D) Cloning a segment creates a copy of the segment with the same rules and filters. It does not allow
the marketing team to add or remove criteria from the original segment, and it may create confusion
and redundancy. Reference: Create a Nested Segment - Salesforce, Save Time with Nested Segments
(Generally Available) - Salesforce, Calculated Insights - Salesforce, Create and Publish a Data Kit Unit
| Salesforce Trailhead, Create a Segment in Data Cloud - Salesforce
Question: 18
Cumulus Financial uses Service Cloud as its CRM and stores mobile phone, home phone,
and work phone as three separate fields for its customers on the Contact record. The company plans
to use Data Cloud and ingest the Contact object via the CRM Connector.
What is the most efficient approach that a consultant should take when ingesting this data to ensure
all the different phone numbers are properly mapped and available for use in activation?
A. Ingest the Contact object and map the Work Phone, Mobile Phone, and Home Phone to the
Contact Point Phone data map object from the Contact data stream.
B. Ingest the Contact object and use streaming transforms to normalize the phone numbers from
the Contact data stream into a separate Phone data lake object (DLO) that contains three rows,
and then map this new DLO to the Contact Point Phone data map object.
C. Ingest the Contact object and then create a calculated insight to normalize the phone numbers,
and then map to the Contact Point Phone data map object.
D. Ingest the Contact object and create formula fields in the Contact data stream on the phone
numbers, and then map to the Contact Point Phone data map object.
Answer: B
Explanation:
The most efficient approach that a consultant should take when ingesting this data to ensure all the
different phone numbers are properly mapped and available for use in activation is B. Ingest the
Contact object and use streaming transforms to normalize the phone numbers from the Contact data
stream into a separate Phone data lake object (DLO) that contains three rows, and then map this new
DLO to the Contact Point Phone data map object. This approach allows the consultant to use the
streaming transforms feature of Data Cloud, which enables data manipulation and transformation at
the time of ingestion, without requiring any additional processing or storage. Streaming transforms
can be used to normalize the phone numbers from the Contact data stream, such as removing
spaces, dashes, or parentheses, and adding country codes if needed. The normalized phone numbers
can then be stored in a separate Phone DLO, which can have one row for each phone number type
(work, home, mobile). The Phone DLO can then be mapped to the Contact Point Phone data map
object, which is a standard object that represents a phone number associated with a contact point.
This way, the consultant can ensure that all the phone numbers are available for activation, such as
sending SMS messages or making calls to the customers.
The other options are not as efficient as option B. Option A is incorrect because it does not normalize
the phone numbers, which may cause issues with activation or identity resolution. Option C is
incorrect because it requires creating a calculated insight, which is an additional step that consumes
more resources and time than streaming transforms. Option D is incorrect because it requires
creating formula fields in the Contact data stream, which may not be supported by the CRM
Connector or may cause conflicts with the existing fields in the Contact object. Reference: Salesforce
Data Cloud Consultant Exam Guide, Data Ingestion and Modeling, Streaming Transforms, Contact
Point Phone
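The kind of normalization a streaming transform would apply can be sketched as follows. This is plain Python showing the intended logic (strip formatting, add a country code, fan one Contact row out into three phone rows), not the actual transform expression syntax, and the field names are assumptions.

import re

contact_row = {
    "MobilePhone": "(415) 555-0101",
    "HomePhone": "415.555.0102",
    "WorkPhone": "+1 415 555 0103",
}

def normalize(phone: str, default_country_code: str = "1") -> str:
    """Keep digits only and prefix a country code if one is missing."""
    digits = re.sub(r"\D", "", phone)
    if not phone.strip().startswith("+") and len(digits) == 10:
        digits = default_country_code + digits
    return "+" + digits

# One Contact row fans out into three rows in the Phone DLO, one per phone type.
phone_rows = [
    {"phone_type": field.replace("Phone", ""), "phone_number": normalize(value)}
    for field, value in contact_row.items()
]
print(phone_rows)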
Question: 19
A customer has a Master Customer table from their CRM to ingest into Data Cloud. The
table contains a name and primary email address, along with other personally identifiable
information (PII).
How should the fields be mapped to support identity resolution?
A. Create a new custom object with fields that directly match the incoming table.
B. Map all fields to the Customer object.
C. Map name to the Individual object and email address to the Contact Phone Email object.
D. Map all fields to the Individual object, adding a custom field for the email address.
Answer: C
Explanation:
To support identity resolution in Data Cloud, the fields from the Master Customer table should be
mapped to the standard data model objects that are designed for this purpose. The Individual object
is used to store the name and other personally identifiable information (PII) of a customer, while the
Contact Phone Email object is used to store the primary email address and other contact information
of a customer. These objects are linked by a relationship field that indicates the contact information
belongs to the individual. By mapping the fields to these objects, Data Cloud can use the identity
resolution rules to match and reconcile the profiles from different sources based on the name and
email address fields. The other options are not recommended because they either create a new
custom object that is not part of the standard data model, or map all fields to the Customer object
that is not intended for identity resolution, or map all fields to the Individual object that does not
have a standard email address field. Reference: Data Modeling Requirements for Identity
Resolution, Create Unified Individual Profiles
Question: 20
A. Delete the data from the incoming data stream and perform a full refresh.
B. Add the Individual ID to a headerless file and use the delete from file functionality.
C. Use Data Explorer to locate and manually remove the Individual.
D. Use the Consent API to suppress processing and delete the Individual and related records from
source data streams.
Answer: B, D
Explanation:
To honor a Request to be Forgotten by a customer, a consultant should use Data Cloud in two ways:
Add the Individual ID to a headerless file and use the delete from file functionality. This option allows
the consultant to delete multiple Individuals from Data Cloud by uploading a CSV file with their
IDs1. The deletion process is asynchronous and can take up to 24 hours to complete1.
Use the Consent API to suppress processing and delete the Individual and related records from
source data streams. This option allows the consultant to submit a Data Deletion request for an
Individual profile in Data Cloud using the Consent API2. A Data Deletion request deletes the specified
Individual entity and any entities where a relationship has been defined between that entity’s
identifying attribute and the Individual ID attribute2. The deletion process is reprocessed at 30, 60,
and 90 days to ensure a full deletion2. The other options are not correct because:
Deleting the data from the incoming data stream and performing a full refresh will not delete the
existing data in Data Cloud, only the new data from the source system3.
Using Data Explorer to locate and manually remove the Individual will not delete the related records
from the source data streams, only the Individual entity in Data Cloud. Reference:
Delete Individuals from Data Cloud
Requesting Data Deletion or Right to Be Forgotten
Data Refresh for Data Cloud
[Data Explorer]
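For the delete-from-file option, the file is simply a headerless list of Individual IDs, one per line. The Python sketch below writes such a file; the IDs are invented placeholders.

import csv

# Hypothetical Individual IDs collected from Right to be Forgotten requests.
individual_ids = ["0a1XX0000001ABC", "0a1XX0000001DEF"]

# The upload expects a headerless file: no column names, one Individual ID per row.
with open("individuals_to_delete.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for individual_id in individual_ids:
        writer.writerow([individual_id])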
Question: 21
Cumulus Financial uses Data Cloud to segment banking customers and activate them for
direct mail via a Cloud File Storage activation. The company also wants to analyze individuals who
have been in the segment within the last 2 years.
Which Data Cloud component allows for this?
A. Segment exclusion
B. Nested segments
C. Segment membership data model object
D. Calculated insights
Answer: C
Explanation:
Data Cloud allows customers to analyze the segment membership history of individuals using the
Segment Membership data model object. This object stores information about when an individual
joined or left a segment, and can be used to create reports and dashboards to track segment
performance over time. Cumulus Financial can use this object to filter individuals who have been in
the segment within the last 2 years and compare them with other metrics.
The other options are not Data Cloud components that allow for this analysis. Segment exclusion is a
feature that allows customers to remove individuals from a segment based on another segment.
Nested segments are segments that are created from other segments using logical operators.
Calculated insights are derived attributes that are created from existing data using formulas.
Reference:
Segment Membership Data Model Object
Data Cloud Reports and Dashboards
Create a Segment in Data Cloud
Question: 22
Answer: A
Explanation:
Data Cloud is a platform that enables you to activate all your customer data across Salesforce
applications and other systems. Data Cloud allows you to create a unified profile of each customer by
ingesting, transforming, and linking data from various sources, such as CRM, marketing, commerce,
service, and external data providers. Data Cloud also provides insights and analytics on customer
behavior, preferences, and needs, as well as tools to segment, target, and personalize customer
interactions. Data Cloud’s primary value to customers is to provide a unified view of a customer and
their related data, which can help you deliver better customer experiences, increase loyalty, and
drive growth. Reference: Salesforce Data Cloud, When Data Creates Competitive Advantage
Question: 23
During an implementation project, a consultant completed ingestion of all data streams for
their customer.
Prior to segmenting and acting on that data, which additional configuration is required?
A. Data Activation
B. Calculated Insights
C. Data Mapping
D. Identity Resolution
Answer: D
Explanation:
After ingesting data from different sources into Data Cloud, the additional configuration that is
required before segmenting and acting on that data is Identity Resolution. Identity Resolution is the
process of matching and reconciling source profiles from different data sources and creating unified
profiles that represent a single individual or entity1. Identity Resolution enables you to create a 360-
degree view of your customers and prospects, and to segment and activate them based on their
attributes and behaviors2. To configure Identity Resolution, you need to create and deploy a ruleset
that defines the match rules and reconciliation rules for your data3. The other options are incorrect
because they are not required before segmenting and acting on the data. Data Activation is the
process of sending data from Data Cloud to other Salesforce clouds or external destinations for
marketing, sales, or service purposes4. Calculated Insights are derived attributes that are computed
based on the source or unified data, such as lifetime value, churn risk, or product affinity5. Data
Mapping is the process of mapping source attributes to unified attributes in the data model. These
configurations can be done after segmenting and acting on the data, or in parallel with Identity
Resolution, but they are not prerequisites for it. Reference: Identity Resolution Overview, Segment
and Activate Data in Data Cloud, Configure Identity Resolution Rulesets, Data Activation
Overview, Calculated Insights Overview, [Data Mapping Overview]
Question: 24
Northern Trail Outfitters (NTO) wants to connect their B2C Commerce data with Data Cloud
and bring two years of transactional history into Data Cloud.
What should NTO use to achieve this?
Answer: D
Explanation:
The B2C Commerce Starter Bundles are predefined data streams that ingest order and product data
from B2C Commerce into Data Cloud. However, the starter bundles only bring in the last 90 days of
data by default. To bring in two years of transactional history, NTO needs to use a custom extract
from B2C Commerce that includes the historical data and configure the data stream to use the
custom extract as the source. The other options are not sufficient to achieve this because:
A) B2C Commerce Starter Bundles only ingest the last 90 days of data by default.
B) Direct Sales Order entity ingestion is not a supported method for connecting B2C Commerce data
with Data Cloud. Data Cloud does not provide a direct-access connection for B2C Commerce data,
only data ingestion.
C) Direct Sales Product entity ingestion is not a supported method for connecting B2C Commerce
data with Data Cloud. Data Cloud does not provide a direct-access connection for B2C Commerce
data, only data ingestion. Reference: Create a B2C Commerce Data Bundle - Salesforce, B2C
Commerce Connector - Salesforce, Salesforce B2C Commerce Pricing Plans & Costs
Question: 25
A. Flow
B. Report
C. Activation alert
D. Dashboard
Answer: C
Explanation:
The feature that the consultant should use to solution for this use case is C. Activation alert.
Activation alerts are notifications that are sent to users when an activation fails or succeeds for a
segment. Activation alerts can be configured in the Activation Settings page, where the consultant
can specify the recipients, the frequency, and the conditions for sending the alerts. Activation alerts
can help the customer to monitor the status of their activations and troubleshoot any issues that may
arise. Reference: Salesforce Data Cloud Consultant Exam Guide, Activation Alerts
Question: 26
Which two steps should a consultant take if a successfully configured Amazon S3 data
stream fails to refresh with a "NO FILE FOUND" error message?
Choose 2 answers
A. Check if correct permissions are configured for the Data Cloud user.
B. Check if the Amazon S3 data source is enabled in Data Cloud Setup.
C. Check if the file exists in the specified bucket location.
D. Check if correct permissions are configured for the S3 user.
Answer: A, C
Explanation:
A “NO FILE FOUND” error message indicates that Data Cloud cannot access or locate the file from
the Amazon S3 source. There are two possible reasons for this error and two corresponding steps
that a consultant should take to troubleshoot it:
The Data Cloud user does not have the correct permissions to read the file from the Amazon S3
bucket. This could happen if the user’s permission set or profile does not include the Data Cloud Data
Stream Read permission, or if the user’s Amazon S3 credentials are invalid or expired. To fix this
issue, the consultant should check and update the user’s permissions and credentials in Data Cloud
and Amazon S3, respectively.
The file does not exist in the specified bucket location. This could happen if the file name or path has
changed, or if the file has been deleted or moved from the Amazon S3 bucket. To fix this issue, the
consultant should check and verify the file name and path in the Amazon S3 bucket, and update the
data stream configuration in Data Cloud accordingly. Reference: Create Amazon S3 Data Stream in
Data Cloud, How to Use the Amazon S3 Storage Connector in Data Cloud, Amazon S3 Connection
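Both checks described above can be approximated outside Data Cloud. For example, the boto3 sketch below confirms whether the expected file actually exists at the configured bucket path and whether the S3 credentials can read it; the bucket, key, and local AWS credentials are placeholders.

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")  # assumes AWS credentials are configured locally

bucket = "example-ingest-bucket"
key = "exports/customers/customers_2024-01-15.csv"  # path the data stream expects

try:
    s3.head_object(Bucket=bucket, Key=key)
    print("File found -- check the Data Cloud user's permissions next.")
except ClientError as error:
    code = error.response["Error"]["Code"]
    if code == "404":
        print("No file at that key -- this explains the 'NO FILE FOUND' error.")
    else:
        print(f"S3 returned {code} -- likely a permissions problem for the S3 user.")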
Question: 27
A consultant is discussing the benefits of Data Cloud with a customer that has multiple
disjointed data sources.
Which two functional areas should the consultant highlight in relation to managing customer data?
Choose 2 answers
A. Data Harmonization
B. Unified Profiles
C. Master Data Management
D. Data Marketplace
Answer: A, B
Explanation:
Data Cloud is an open and extensible data platform that enables smarter, more efficient AI with
secure access to first-party and industry data1. Two functional areas that the consultant should
highlight in relation to managing customer data are:
Data Harmonization: Data Cloud harmonizes data from multiple sources and formats into a common
schema, enabling a single source of truth for customer data1. Data Cloud also applies data quality
rules and transformations to ensure data accuracy and consistency.
Unified Profiles: Data Cloud creates unified profiles of customers and prospects by linking data across
different identifiers, such as email, phone, cookie, and device ID1. Unified profiles provide a holistic
view of customer behavior, preferences, and interactions across channels and touchpoints. The other
options are not correct because:
Master Data Management: Master Data Management (MDM) is a process of creating and
maintaining a single, consistent, and trusted source of master data, such as product, customer,
supplier, or location data. Data Cloud does not provide MDM functionality, but it can integrate with
MDM solutions to enrich customer data.
Data Marketplace: Data Marketplace is a feature of Data Cloud that allows users to discover, access,
and activate data from third-party providers, such as demographic, behavioral, and intent data. Data
Marketplace is not a functional area related to managing customer data, but rather a source of
external data that can enhance customer data. Reference:
Salesforce Data Cloud
[Data Harmonization for Data Cloud]
[Unified Profiles for Data Cloud]
[What is Master Data Management?]
[Integrate Data Cloud with Master Data Management]
[Data Marketplace for Data Cloud]
Question: 28
A retailer wants to unify profiles using a Loyalty ID, which is different from the unique ID of
their customers.
Which object should the consultant use in identity resolution to perform exact match rules on the
Loyalty ID?
Answer: A
Explanation:
The Party Identification object is the correct object to use in identity resolution to perform exact
match rules on the Loyalty ID. The Party Identification object is a child object of the Individual object
that stores different types of identifiers for an individual, such as email, phone, loyalty ID, social
media handle, etc. Each identifier has a type, a value, and a source. The consultant can use the Party
Identification object to create a match rule that compares the Loyalty ID type and value across
different sources and links the corresponding individuals.
The other options are not correct objects to use in identity resolution to perform exact match rules
on the Loyalty ID. The Loyalty Identification object does not exist in Data Cloud. The Individual object
is the parent object that represents a unified profile of an individual, but it does not store the Loyalty
ID directly. The Contact Identification object is a child object of the Contact object that stores
identifiers for a contact, such as email, phone, etc., but it does not store the Loyalty ID.
Reference:
Data Modeling Requirements for Identity Resolution
Identity Resolution in a Data Space
Configure Identity Resolution Rulesets
Map Required Objects
Data and Identity in Data Cloud
Question: 29
Which data model subject area defines the revenue or quantity for an opportunity by
product family?
A. Engagement
B. Product
C. Party
D. Sales Order
Answer: D
Explanation:
The Sales Order subject area defines the details of an order placed by a customer for one or more
products or services. It includes information such as the order date, status, amount, quantity,
currency, payment method, and delivery method. The Sales Order subject area also allows you to
track the revenue or quantity for an opportunity by product family, which is a grouping of products
that share common characteristics or features. For example, you can use the Sales Order Line Item
DMO to associate each product in an order with its product family, and then use the Sales Order
Revenue DMO to calculate the total revenue or quantity for each product family in an
opportunity. Reference: Sales Order Subject Area, Sales Order Revenue DMO Reference
Question: 30
Which configuration supports separate Amazon S3 buckets for data ingestion and activation?
Answer: A
Explanation:
To support separate Amazon S3 buckets for data ingestion and activation, you need to configure
dedicated S3 data sources in Data Cloud setup. Data sources are used to identify the origin and type
of the data that you ingest into Data Cloud1. You can create different data sources for each S3 bucket
that you want to use for ingestion or activation, and specify the bucket name, region, and access
credentials2. This way, you can separate and organize your data by different criteria, such as brand,
region, product, or business unit3. The other options are incorrect because they do not support
separate S3 buckets for data ingestion and activation. Multiple S3 connectors are not a valid
configuration in Data Cloud setup, as there is only one S3 connector available4. Dedicated S3 data
sources in activation setup are not a valid configuration either, as activation setup does not require
data sources, but activation targets5. Separate user credentials for data stream and activation target
are not sufficient to support separate S3 buckets, as you also need to specify the bucket name and
region for each data source2. Reference: Data Sources Overview, Amazon S3 Storage Connector, Data
Spaces Overview, Data Streams Overview, Data Activation Overview
Question: 31
A customer wants to use the transactional data from their data warehouse in Data Cloud.
They are only able to export the data via an SFTP site.
How should the file be brought into Data Cloud?
Answer: A
Explanation:
The SFTP Connector is a data source connector that allows Data Cloud to ingest data from an SFTP
server. The customer can use the SFTP Connector to create a data stream from their exported file and
bring it into Data Cloud as a data lake object. The other options are not the best ways to bring the file
into Data Cloud because:
B) The Cloud Storage Connector is a data source connector that allows Data Cloud to ingest data from
cloud storage services such as Amazon S3, Azure Storage, or Google Cloud Storage. The customer
does not have their data in any of these services, but only on an SFTP site.
C) The Data Import Wizard is a tool that allows users to import data for many standard Salesforce
objects, such as accounts, contacts, leads, solutions, and campaign members. It is not designed to
import data from an SFTP site or for custom objects in Data Cloud.
D) The Dataloader is an application that allows users to insert, update, delete, or export Salesforce
records. It is not designed to ingest data from an SFTP site or into Data Cloud. Reference: SFTP
Connector - Salesforce, Create Data Streams with the SFTP Connector in Data Cloud - Salesforce, Data
Import Wizard - Salesforce, Salesforce Data Loader
Question: 32
When performing segmentation or activation, which time zone is used to publish and refresh data?
Answer: D
Explanation:
The time zone that is used to publish and refresh data when performing segmentation or activation
is D. Time zone set by the Salesforce Data Cloud org. This time zone is the one that is configured in
the org settings when Data Cloud is provisioned, and it applies to all users and activities in Data
Cloud. This time zone determines when the segments are scheduled to refresh and when the
activations are scheduled to publish. Therefore, it is important to consider the time zone difference
between the Data Cloud org and the destination systems or channels when planning the
segmentation and activation strategies. Reference: Salesforce Data Cloud Consultant Exam
Guide, Segmentation, Activation
Question: 33
Cumulus Financial is currently using Data Cloud and ingesting transactional data from its
backend system via an S3 Connector in upsert mode. During the initial setup six months ago, the
company created a formula field in Data Cloud to create a custom classification. It now needs to
update this formula to account for more classifications.
What should the consultant keep in mind with regard to formula field updates when using the S3
Connector?
A. Data Cloud will initiate a full refresh of data from S3 and will update the formula on all records.
B. Data Cloud will only update the formula on a go-forward basis for new records.
C. Data Cloud does not support formula field updates for data streams of type upsert.
D. Data Cloud will update the formula for all records at the next incremental upsert refresh.
Answer: D
Explanation:
A formula field is a field that calculates a value based on other fields or constants. When using the S3
Connector to ingest data from an Amazon S3 bucket, Data Cloud supports creating and updating
formula fields on the data lake objects (DLOs) that store the data from the S3 source. However, the
formula field updates are not applied immediately, but rather at the next incremental upsert refresh
of the data stream. An incremental upsert refresh is a process that adds new records and updates
existing records from the S3 source to the DLO based on the primary key field. Therefore, the
consultant should keep in mind that the formula field updates will affect both new and existing
records, but only after the next incremental upsert refresh of the data stream. The other options are
incorrect because Data Cloud does not initiate a full refresh of data from S3, does not update the
formula only for new records, and does support formula field updates for data streams of type
upsert. Reference: Create a Formula Field, Amazon S3 Connection, Data Lake Object
Question: 34
Luxury Retailers created a segment targeting high value customers that it activates through
Marketing Cloud for email communication. The company notices that the activated count is smaller
than the segment count.
What is a reason for this?
A. Marketing Cloud activations apply a frequency cap and limit the number of records that can be
sent in an activation.
B. Data Cloud enforces the presence of Contact Point for Marketing Cloud activations. If the
individual does not have a related Contact Point, it will not be activated.
C. Marketing Cloud activations automatically suppress individuals who are unengaged and have not
opened or clicked on an email in the last six months.
D. Marketing Cloud activations only activate those individuals that already exist in Marketing Cloud.
They do not allow activation of new records.
Answer: B
Explanation:
Data Cloud requires a Contact Point for Marketing Cloud activations, which is a record that links an
individual to an email address. This ensures that the individual has given consent to receive email
communications and that the email address is valid. If the individual does not have a related Contact
Point, they will not be activated in Marketing Cloud. This may result in a lower activated count than
the segment count. Reference: Data Cloud Activation, Contact Point for Marketing Cloud
Question: 35
Northern Trail Outfitters wants to implement Data Cloud and has several use cases in mind.
Which two use cases are considered a good fit for Data Cloud?
Choose 2 answers
A. To ingest and unify data from various sources to reconcile customer identity
B. To create and orchestrate cross-channel marketing messages
C. To use harmonized data to more accurately understand the customer and business impact
D. To eliminate the need for separate business intelligence and IT data management tools
Answer: A, C
Explanation:
Data Cloud is a data platform that can help customers connect, prepare, harmonize, unify, query,
analyze, and act on their data across various Salesforce and external sources. Some of the use cases
that are considered a good fit for Data Cloud are:
To ingest and unify data from various sources to reconcile customer identity. Data Cloud can help
customers bring all their data, whether streaming or batch, into Salesforce and map it to a common
data model. Data Cloud can also help customers resolve identities across different channels and
sources and create unified profiles of their customers.
To use harmonized data to more accurately understand the customer and business impact. Data
Cloud can help customers transform and cleanse their data before using it, and enrich it with
calculated insights and related attributes. Data Cloud can also help customers create segments and
audiences based on their data and activate them in any channel. Data Cloud can also help customers
use AI to predict customer behavior and outcomes.
The other two options are not use cases that are considered a good fit for Data Cloud. Data Cloud
does not provide features to create and orchestrate cross-channel marketing messages, as this is
typically handled by other Salesforce solutions such as Marketing Cloud. Data Cloud also does not
eliminate the need for separate business intelligence and IT data management tools, as it is designed
to work with them and complement their capabilities.
Reference:
Learn How Data Cloud Works
About Salesforce Data Cloud
Discover Use Cases for the Platform
Understand Common Data Analysis Use Cases
Question: 36
A. To provide transparency and security for data gathered from individuals who provide consent
for its use and receive value in exchange
B. To provide trusted, first-party data in the Data Cloud Marketplace that follows all compliance
regulations
C. To ensure opt-in consents are collected for all email marketing as required by law
D. To obtain competitive data from reliable sources through interviews, surveys, and polls
Answer: A
Explanation:
Building a trust-based, first-party data asset means collecting, managing, and activating data from
your own customers and prospects in a way that respects their privacy and preferences. It also means
providing them with clear and honest information about how you use their data, what benefits they
can expect from sharing their data, and how they can control their data. By doing so, you can create a
mutually beneficial relationship with your customers, where they trust you to use their data
responsibly and ethically, and you can deliver more relevant and personalized experiences to them. A
trust-based, first-party data asset can help you improve customer loyalty, retention, and growth, as
well as comply with data protection regulations and standards. Reference: Use first-party data for a
powerful digital experience, Why first-party data is the key to data privacy, Build a first-party data
strategy
Question: 37
What is the result of a segmentation criteria filtering on City | Is Equal To | 'San José'?
A. Cities containing 'San José', 'San Jose', 'san jose', or 'san josé'
B. Cities only containing 'San Jose' or 'san jose’
Answer: D
Explanation:
The result of a segmentation criteria filtering on City | Is Equal To | 'San José' is cities only containing
'San José' or 'san josé'. The Is Equal To operator ignores letter case but respects accents, so it matches
values that are identical to the filter value apart from capitalization. Cities containing 'San Jose' or
'san jose' (without the accent) are not included in the result because they do not match the accented
filter value. To include other variations of the name 'San José', you would need to use the OR operator
and add multiple filter values, such as 'San José' OR 'San Jose' OR 'san jose' OR
'san josé'. Reference: Segmentation Criteria, Segmentation Operators
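A small Python comparison illustrates the behavior described above: ignoring case still leaves accented and unaccented spellings distinct.

candidates = ["San José", "san josé", "San Jose", "san jose"]
filter_value = "San José"

# Case-insensitive comparison (casefold) still treats 'é' and 'e' as different characters,
# so only the accented spellings match the filter value.
matches = [c for c in candidates if c.casefold() == filter_value.casefold()]
print(matches)  # ['San José', 'san josé']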
Question: 38
During a privacy law discussion with a customer, the customer indicates they need to honor
requests for the right to be forgotten. The consultant determines that Consent API will solve this
business need.
Which two considerations should the consultant inform the customer about?
Choose 2 answers
Answer: C, D
Explanation:
When advising a customer about using the Consent API in Salesforce to comply with requests for the
right to be forgotten, the consultant should focus on two primary considerations:
Data deletion requests are submitted for Individual profiles (Answer C): The Consent API in
Salesforce is designed to handle data deletion requests specifically for individual profiles. This means
that when a request is made to delete data, it is targeted at the personal data associated with an
individual's profile in the Salesforce system. The consultant should inform the customer that the
requests must be specific to individual profiles to ensure accurate processing and compliance with
privacy laws.
Data deletion requests submitted to Data Cloud are passed to all connected Salesforce clouds
(Answer D): When a data deletion request is made through the Consent API in Salesforce Data Cloud,
the request is not limited to the Data Cloud alone. Instead, it propagates through all connected
Salesforce clouds, such as Sales Cloud, Service Cloud, Marketing Cloud, etc. This ensures
comprehensive compliance with the right to be forgotten across the entire Salesforce ecosystem. The
customer should be aware that the deletion request will affect all instances of the individual’s data
across the connected Salesforce environments.
Question: 39
To import campaign members into a campaign in Salesforce CRM, a user wants to export
the segment to Amazon S3. The resulting file needs to include the Salesforce CRM Campaign ID in
the name.
What are two ways to achieve this outcome?
Choose 2 answers
Answer: A, C
Explanation:
The two ways to achieve this outcome are A and C. Include campaign identifier in the activation
name and include campaign identifier in the filename specification. These two options allow the user
to specify the Salesforce CRM Campaign ID in the name of the file that is exported to Amazon S3. The
activation name and the filename specification are both configurable settings in the activation
wizard, where the user can enter the campaign identifier as a text or a variable. The activation name
is used as the prefix of the filename, and the filename specification is used as the suffix of the
filename. For example, if the activation name is “Campaign_123” and the filename specification is
“{segmentName}_{date}”, the resulting file name will be “Campaign_123_SegmentA_2023-12-
18.csv”. This way, the user can easily identify the file that corresponds to the campaign and import it
into Salesforce CRM.
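A small Python sketch of how the exported file name could be composed from the activation name and filename specification, assuming the underscore join and .csv extension shown in the example above; the helper name and parameters are illustrative, not a Data Cloud API:

```python
from datetime import date

# Illustrative sketch: activation name as prefix, filename specification as suffix,
# following the example in the explanation above. Placeholder names are assumptions.
def build_file_name(activation_name: str, filename_spec: str, segment_name: str) -> str:
    suffix = filename_spec.format(segmentName=segment_name, date=date(2023, 12, 18).isoformat())
    return f"{activation_name}_{suffix}.csv"

print(build_file_name("Campaign_123", "{segmentName}_{date}", "SegmentA"))
# Campaign_123_SegmentA_2023-12-18.csv
```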
The other options are not correct. Option B is incorrect because hard coding the campaign identifier
as a new attribute in the campaign activation is not possible. The campaign activation does not have
any attributes, only settings. Option D is incorrect because including the campaign identifier in the
segment name is not sufficient. The segment name is not used in the filename of the exported file,
unless it is specified in the filename specification. Therefore, the user will not be able to see the
campaign identifier in the file name.
Question: 40
How can a consultant modify attribute names to match a naming convention in Cloud File
Storage targets?
Answer: C
Explanation:
A Cloud File Storage target is a type of activation target in Data Cloud that allows sending data to a
cloud storage service such as Amazon S3 or Google Cloud Storage. When configuring an activation to
a Cloud File Storage target, a consultant can modify the attribute names to match a naming
convention by setting preferred attribute names in Data Cloud. Preferred attribute names are aliases
that can be used to control the field names in the target file. They can be set for each attribute in the
activation configuration, and they will override the default field names from the data model object.
The other options are incorrect because they do not affect the field names in the target file. Using a
formula field to update the field name in an activation will not change the field name, but only the
field value. Updating attribute names in the data stream configuration will not affect the existing data
lake objects or data model objects. Updating field names in the data model object will change the
field names for all data sources and activations that use the object, which may not be desirable or
consistent. Reference: Preferred Attribute Name, Create a Data Cloud Activation Target, Cloud File
Storage Target
Question: 41
Northern Trail Outfitters wants to be able to calculate each customer's lifetime value (LTV)
but also create breakdowns of the revenue sourced by website, mobile app, and retail channels.
What should a consultant use to address this use case in Data Cloud?
A. Flow Orchestration
B. Nested segments
C. Metrics on metrics
D. Streaming data transform
Answer: C
Explanation:
Metrics on metrics is a feature that allows creating new metrics based on existing metrics and
applying mathematical operations on them. This can be useful for calculating complex business
metrics such as LTV, ROI, or conversion rates. In this case, the consultant can use metrics on metrics
to calculate the LTV of each customer by summing up the revenue generated by them across
different channels. The consultant can also create breakdowns of the revenue by channel by using
the channel attribute as a dimension in the metric definition. Reference: Metrics on Metrics, Create
Metrics on Metrics
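A conceptual Python sketch of the underlying aggregation, summing revenue per customer overall (LTV) and per channel; Data Cloud expresses this with calculated insights and metrics on metrics rather than Python, and the sample records below are invented for illustration:

```python
from collections import defaultdict

# Conceptual illustration only: total revenue per customer (LTV) plus a
# per-channel breakdown, using made-up order records.
orders = [
    {"customer": "C1", "channel": "website", "revenue": 120.0},
    {"customer": "C1", "channel": "retail", "revenue": 80.0},
    {"customer": "C2", "channel": "mobile app", "revenue": 50.0},
]

ltv = defaultdict(float)         # total revenue per customer
by_channel = defaultdict(float)  # revenue per (customer, channel)
for o in orders:
    ltv[o["customer"]] += o["revenue"]
    by_channel[(o["customer"], o["channel"])] += o["revenue"]

print(dict(ltv))         # {'C1': 200.0, 'C2': 50.0}
print(dict(by_channel))  # {('C1', 'website'): 120.0, ('C1', 'retail'): 80.0, ('C2', 'mobile app'): 50.0}
```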
Question: 42
A consultant wants to ensure that every segment managed by multiple brand teams
adheres to the same set of exclusion criteria that are updated on a monthly basis.
What is the most efficient option to allow for this capability?
Answer: B
Explanation:
The most efficient option to allow for this capability is to create a reusable container block with
common criteria. A container block is a segment component that can be reused across multiple
segments. A container block can contain any combination of filters, nested segments, and exclusion
criteria. A consultant can create a container block with the exclusion criteria that apply to all the
segments managed by multiple brand teams, and then add the container block to each segment.
This way, the consultant can update the exclusion criteria in one place and have them reflected in all
the segments that use the container block.
The other options are not the most efficient options to allow for this capability. Creating, publishing,
and deploying a data kit is a way to share data and segments across different data spaces, but it does
not allow for updating the exclusion criteria on a monthly basis. Creating a nested segment is a way
to combine segments using logical operators, but it does not allow for excluding individuals based on
specific criteria. Creating a segment and copying it for each brand is a way to create multiple
segments with the same exclusion criteria, but it does not allow for updating the exclusion criteria in
one place.
Reference:
Create a Container Block
Create a Segment in Data Cloud
Create and Publish a Data Kit
Create a Nested Segment
Question: 43
A. Streaming transforms
B. Data model triggers
C. Sales and Service bundle
D. Data actions and Lightning web components
Answer: A
Explanation:
The correct answer is A. Streaming transforms. Streaming transforms are a feature of Data Cloud that
allows real-time data integration with Salesforce CRM. Streaming transforms use the Data Cloud
Streaming API to synchronize micro-batches of updates between the CRM data source and Data
Cloud in near-real time1. Streaming transforms enable Data Cloud to have the most current and
accurate CRM data for segmentation and activation2.
The other options are incorrect for the following reasons:
B) Data model triggers. Data model triggers are a feature of Data Cloud that allows custom logic to be
executed when data model objects are created, updated, or deleted3. Data model triggers do not
integrate data with Salesforce CRM, but rather manipulate data within Data Cloud.
C) Sales and Service bundle. Sales and Service bundle is a feature of Data Cloud that allows pre-built
data streams, data model objects, segments, and activations for Sales Cloud and Service Cloud data
sources4. Sales and Service bundle does not integrate data in real time with Salesforce CRM, but
rather ingests data at scheduled intervals.
D) Data actions and Lightning web components. Data actions and Lightning web components are
features of Data Cloud that allow custom user interfaces and workflows to be built and embedded in
Salesforce applications5. Data actions and Lightning web components do not integrate data with
Salesforce CRM, but rather display and interact with data within Salesforce applications.
Reference:
1: Load Data into Data Cloud
2: [Data Streams in Data Cloud]
3: [Data Model Triggers in Data Cloud] unit on Trailhead
4: [Sales and Service Bundle in Data Cloud] unit on Trailhead
5: [Data Actions and Lightning Web Components in Data Cloud] unit on Trailhead
[Data Model in Data Cloud] unit on Trailhead
[Create a Data Model Object] article on Salesforce Help
[Data Sources in Data Cloud] unit on Trailhead
[Connect and Ingest Data in Data Cloud] article on Salesforce Help
[Data Spaces in Data Cloud] unit on Trailhead
[Create a Data Space] article on Salesforce Help
[Segments in Data Cloud] unit on Trailhead
[Create a Segment] article on Salesforce Help
[Activations in Data Cloud] unit on Trailhead
[Create an Activation] article on Salesforce Help
Question: 44
Answer: A
Explanation:
To create a multi-dimensional metric to identify unified individual lifetime value (LTV), the sequence
of data model object (DMO) joins that is necessary within the calculated Insight is Unified Individual
> Unified Link Individual > Sales Order. This is because the Unified Individual DMO represents the
unified profile of an individual or entity that is created by identity resolution1. The Unified Link
Individual DMO represents the link between a unified individual and an individual from a source
system2. The Sales Order DMO represents the sales order information from a source system3. By
joining these three DMOs, you can calculate the LTV of a unified individual based on the sales order
data from different source systems. The other options are incorrect because they do not join the
correct DMOs to enable the LTV calculation. Option B is incorrect because the Individual DMO
represents the source profile of an individual or entity from a source system, not the unified profile4.
Option C is incorrect because the join order is reversed, and you need to start with the Unified
Individual DMO to identify the unified profile. Option D is incorrect because it is missing the Unified
Link Individual DMO, which is needed to link the unified profile with the source
profile. Reference: Unified Individual Data Model Object, Unified Link Individual Data Model
Object, Sales Order Data Model Object, Individual Data Model Object
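A conceptual Python sketch of that join path, showing why the Unified Link Individual object is needed to bridge unified profiles to the source individual IDs referenced by sales orders; the field names are simplified assumptions for illustration:

```python
# Conceptual sketch of Unified Individual > Unified Link Individual > Sales Order.
unified_individuals = [{"unified_id": "U1"}]
unified_links = [  # bridges a unified profile to its source-system individual IDs
    {"unified_id": "U1", "source_individual_id": "CRM-001"},
    {"unified_id": "U1", "source_individual_id": "WEB-042"},
]
sales_orders = [  # orders reference the source individual, not the unified profile
    {"source_individual_id": "CRM-001", "amount": 150.0},
    {"source_individual_id": "WEB-042", "amount": 75.0},
]

ltv = {}
for ui in unified_individuals:
    linked = {l["source_individual_id"] for l in unified_links if l["unified_id"] == ui["unified_id"]}
    ltv[ui["unified_id"]] = sum(o["amount"] for o in sales_orders if o["source_individual_id"] in linked)

print(ltv)  # {'U1': 225.0} -- revenue from both source systems rolls up to one profile
```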
Question: 45
Cumulus Financial created a segment called Multiple Investments that contains individuals
who have invested in two or more mutual funds.
The company plans to send an email to this segment regarding a new mutual fund offering, and
wants to personalize the email content with information about each customer's current mutual fund
investments.
How should the Data Cloud consultant configure this activation?
A. Include Fund Type equal to "Mutual Fund" as a related attribute. Configure an activation based on
the new segment with no additional attributes.
B. Choose the Multiple Investments segment, choose the Email contact point, add related attribute
Fund Name, and add related attribute filter for Fund Type equal to "Mutual Fund".
C. Choose the Multiple Investments segment, choose the Email contact point, and add related
attribute Fund Type.
D. Include Fund Name and Fund Type by default for post processing in the target system.
Answer: B
Explanation:
To personalize the email content with information about each customer’s current mutual fund
investments, the Data Cloud consultant needs to add related attributes to the activation. Related
attributes are additional data fields that can be sent along with the segment to the target system for
personalization or analysis purposes. In this case, the consultant needs to add the Fund Name
attribute, which contains the name of the mutual fund that the customer has invested in, and apply a
filter for Fund Type equal to “Mutual Fund” to ensure that only relevant data is sent. The other
options are not correct because:
A) Including Fund Type equal to “Mutual Fund” as a related attribute is not enough to personalize the
email content. The consultant also needs to include the Fund Name attribute, which contains the
specific name of the mutual fund that the customer has invested in.
C) Adding related attribute Fund Type is not enough to personalize the email content. The consultant
also needs to add the Fund Name attribute, which contains the specific name of the mutual fund that
the customer has invested in, and apply a filter for Fund Type equal to “Mutual Fund” to ensure that
only relevant data is sent.
D) Including Fund Name and Fund Type by default for post processing in the target system is not a
valid option. The consultant needs to add the related attributes and filters during the activation
configuration in Data Cloud, not after the data is sent to the target system. Reference: Add Related
Attributes to an Activation - Salesforce, Related Attributes in Activation - Salesforce, Prepare for Your
Salesforce Data Cloud Consultant Credential
Question: 46
A consultant is integrating an Amazon S3 activated campaign with the customer's destination system.
In order for the destination system to find the metadata about the segment, which file on the S3 bucket will contain this metadata?
Answer: B
Explanation:
The file on the Amazon S3 that will contain the metadata about the segment for processing is B. The
json file. The json file is a metadata file that is generated along with the csv file when a segment is
activated to Amazon S3. The json file contains information such as the segment name, the segment
ID, the segment size, the segment attributes, the segment filters, and the segment schedule. The
destination system can use this file to identify the segment and its properties, and to match the
segment data with the corresponding fields in the destination system. Reference: Salesforce Data
Cloud Consultant Exam Guide, Amazon S3 Activation
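A hedged Python sketch of how a destination system might read such a metadata file; the key names used here are assumptions for illustration and are not the documented schema:

```python
import json

# Illustrative only: parsing the metadata (.json) file that accompanies the
# activated .csv on S3. The key names below are assumptions, not a documented schema.
sample = '{"segmentName": "Multiple Investments", "segmentId": "0Qx-EXAMPLE", "attributes": ["Email", "FirstName"]}'
metadata = json.loads(sample)
print(metadata["segmentName"], metadata["attributes"])
```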
Question: 47
A customer notices that their consolidation rate has recently increased. They contact the
consultant to ask why.
What are two likely explanations for the increase?
Choose 2 answers
A. New data sources have been added to Data Cloud that largely overlap with the existing profiles.
B. Duplicates have been removed from source system data streams.
C. Identity resolution rules have been removed to reduce the number of matched profiles.
D. Identity resolution rules have been added to the ruleset to increase the number of matched
profiles.
Answer: A, D
Explanation:
The consolidation rate is a metric that measures the amount by which source profiles are combined
to produce unified profiles in Data Cloud, calculated as 1 - (number of unified profiles / number of
source profiles). A higher consolidation rate means that more source profiles are matched and
merged into fewer unified profiles, while a lower consolidation rate means that fewer source profiles
are matched and more unified profiles are created. There are two likely explanations for why the
consolidation rate has recently increased for a customer:
New data sources have been added to Data Cloud that largely overlap with the existing profiles. This
means that the new data sources contain many profiles that are similar or identical to the profiles
from the existing data sources. For example, if a customer adds a new CRM system that has the same
customer records as their old CRM system, the new data source will overlap with the existing one.
When Data Cloud ingests the new data source, it will use the identity resolution ruleset to match and
merge the overlapping profiles into unified profiles, resulting in a higher consolidation rate.
Identity resolution rules have been added to the ruleset to increase the number of matched profiles.
This means that the customer has modified their identity resolution ruleset to include more match
rules or more match criteria that can identify more profiles as belonging to the same individual. For
example, if a customer adds a match rule that matches profiles based on email address and phone
number, instead of just email address, the ruleset will be able to match more profiles that have the
same email address and phone number, resulting in a higher consolidation rate.
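A short worked example of the consolidation rate formula quoted above, in Python:

```python
# Worked example of: rate = 1 - (number of unified profiles / number of source profiles).
def consolidation_rate(source_profiles: int, unified_profiles: int) -> float:
    return 1 - (unified_profiles / source_profiles)

# More matches (overlapping new sources or added match rules) means fewer unified
# profiles for the same source profiles, so the rate goes up.
print(round(consolidation_rate(1000, 750), 2))  # 0.25 -> 25% before the change
print(round(consolidation_rate(1000, 600), 2))  # 0.40 -> 40% after more profiles are matched
```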
Reference: Identity Resolution Calculated Insight: Consolidation Rates for Unified Profiles, Configure
Identity Resolution Rulesets
Question: 48
A client wants to bring in loyalty data from a custom object in Salesforce CRM that contains
a point balance for accrued hotel points and airline points within the same record. The client wants
to split these point systems into two separate records for better tracking and processing.
What should a consultant recommend in this scenario?
Answer: B
Explanation:
Batch transforms are a feature that allows creating new data lake objects based on existing data lake
objects and applying transformations on them. This can be useful for splitting, merging, or reshaping
data to fit the data model or business requirements. In this case, the consultant can use batch
transforms to create a second data lake object that contains only the airline points from the original
loyalty data object. The original object can be modified to contain only the hotel points. This way, the
client can have two separate records for each point system and track and process them
accordingly. Reference: Batch Transforms, Create a Batch Transform
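A conceptual Python sketch of the split the batch transform would produce, turning one combined loyalty record into one record per point system; this mimics the outcome only and is not Data Cloud transform syntax:

```python
# Conceptual illustration: split one combined loyalty record into two records,
# one per point system. Record and field names are made up for illustration.
loyalty = [{"member_id": "M1", "hotel_points": 1200, "airline_points": 340}]

hotel_records = [{"member_id": r["member_id"], "point_type": "hotel", "points": r["hotel_points"]} for r in loyalty]
airline_records = [{"member_id": r["member_id"], "point_type": "airline", "points": r["airline_points"]} for r in loyalty]

print(hotel_records + airline_records)
# [{'member_id': 'M1', 'point_type': 'hotel', 'points': 1200},
#  {'member_id': 'M1', 'point_type': 'airline', 'points': 340}]
```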
Question: 49
A segment fails to refresh with the error "Segment references too many data lake objects
(DLOS)".
Which two troubleshooting tips should help remedy this issue?
Choose 2 answers
Answer: A, B
Explanation:
The error “Segment references too many data lake objects (DLOs)” occurs when a segment query
exceeds the limit of 50 DLOs that can be referenced in a single query. This can happen when the
segment has too many filters, nested segments, or exclusion criteria that involve different DLOs. To
remedy this issue, the consultant can try the following troubleshooting tips:
Split the segment into smaller segments. The consultant can divide the segment into multiple
segments that have fewer filters, nested segments, or exclusion criteria. This can reduce the number
of DLOs that are referenced in each segment query and avoid the error. The consultant can then use
the smaller segments as nested segments in a larger segment, or activate them separately.
Use calculated insights in order to reduce the complexity of the segmentation query. The consultant
can create calculated insights that are derived from existing data using formulas. Calculated insights
can simplify the segmentation query by replacing multiple filters or nested segments with a single
attribute. For example, instead of using multiple filters to segment individuals based on their
purchase history, the consultant can create a calculated insight that calculates the lifetime value of
each individual and use that as a filter.
The other options are not troubleshooting tips that can help remedy this issue. Refining
segmentation criteria to limit up to five custom data model objects (DMOs) is not a valid option, as
the limit of 50 DLOs applies to both standard and custom DMOs. Spacing out the segment schedules
to reduce DLO load is not a valid option, as the error is not related to the DLO load, but to the
segment query complexity.
Reference:
Troubleshoot Segment Errors
Create a Calculated Insight
Create a Segment in Data Cloud
Question: 50
An organization wants to enable users with the ability to identify and select text attributes
from a picklist of options.
Which Data Cloud feature should help with this use case?
A. Value suggestion
B. Data harmonization
C. Transformation formulas
D. Global picklists
Answer: A
Explanation:
Value suggestion is a Data Cloud feature that allows users to see and select the possible values for a
text field when creating segment filters. Value suggestion can be enabled or disabled for each data
model object (DMO) field in the DMO record home. Value suggestion can help users to identify and
select text attributes from a picklist of options, without having to type or remember the exact values.
Value suggestion can also reduce errors and improve data quality by ensuring consistent and valid
values for the segment filters. Reference: Use Value Suggestions in Segmentation, Considerations for
Selecting Related Attributes
Question: 51
A consultant is working in a customer's Data Cloud org and is asked to delete the existing
identity resolution ruleset.
Which two impacts should the consultant communicate as a result of this action?
Choose 2 answers
Answer: B, C
Explanation:
Deleting an identity resolution ruleset has two major impacts that the consultant should
communicate to the customer. First, it will permanently remove all unified customer data that was
created by the ruleset, meaning that the unified profiles and their attributes will no longer be
available in Data Cloud1. Second, it will eliminate the ruleset's dependencies on the data model objects it
used, meaning those data model objects can afterward be modified or deleted without a ruleset
dependency blocking the change1. These impacts can have significant consequences for the customer's data
quality, segmentation, activation, and analytics, so the consultant should advise the customer to
carefully consider the implications of deleting a ruleset before proceeding. The other options are
incorrect because they are not impacts of deleting a ruleset. Option A is incorrect because deleting a
ruleset will not remove all individual data, but only the unified customer data. The individual data
from the source systems will still be available in Data Cloud1. Option D is incorrect because deleting a
ruleset will not remove all source profile data, but only the unified customer data. The source profile
data from the data streams will still be available in Data Cloud1. Reference: Delete an Identity
Resolution Ruleset
Question: 52
Northern Trail Outfitters uploads new customer data to an Amazon S3 Bucket on a daily
basis to be ingested in Data Cloud.
In what order should each process be run to ensure that freshly imported data is ready and available
to use for any segment?
Answer: D
Explanation:
To ensure that freshly imported data from an Amazon S3 Bucket is ready and available to use for any
segment, the following processes should be run in this order:
Refresh Data Stream: This process updates the data lake objects in Data Cloud with the latest data
from the source system. It can be configured to run automatically or manually, depending on the
data stream settings1. Refreshing the data stream ensures that Data Cloud has the most recent and
accurate data from the Amazon S3 Bucket.
Identity Resolution: This process creates unified individual profiles by matching and consolidating
source profiles from different data streams based on the identity resolution ruleset. It runs daily by
default, but can be triggered manually as well2. Identity resolution ensures that Data Cloud has a
single view of each customer across different data sources.
Calculated Insight: This process performs calculations on data lake objects or CRM data and returns a
result as a new data object. It can be used to create metrics or measures for segmentation or analysis
purposes3. Calculated insights ensure that Data Cloud has the derived data that can be used for
personalization or activation.
Reference:
1: Configure Data Stream Refresh and Frequency - Salesforce
2: Identity Resolution Ruleset Processing Results - Salesforce
3: Calculated Insights - Salesforce
Question: 53
Data Cloud receives a nightly file of all ecommerce transactions from the previous day.
Several segments and activations depend upon calculated insights from the updated data in order to
maintain accuracy in the customer's scheduled campaign messages.
What should the consultant do to ensure the ecommerce data is ready for use for each of the
scheduled activations?
A. Use Flow to trigger a change data event on the ecommerce data to refresh calculated insights
and segments before the activations are scheduled to run.
B. Set a refresh schedule for the calculated insights to occur every hour.
C. Ensure the activations are set to Incremental Activation and automatically publish every hour.
D. Ensure the segments are set to Rapid Publish and set to refresh every hour.
Answer: A
Explanation:
The best option that the consultant should do to ensure the ecommerce data is ready for use for
each of the scheduled activations is A. Use Flow to trigger a change data event on the ecommerce
data to refresh calculated insights and segments before the activations are scheduled to run. This
option allows the consultant to use the Flow feature of Data Cloud, which enables automation and
orchestration of data processing tasks based on events or schedules. Flow can be used to trigger a
change data event on the ecommerce data, which is a type of event that indicates that the data has
been updated or changed. This event can then trigger the refresh of the calculated insights and
segments that depend on the ecommerce data, ensuring that they reflect the latest data. The refresh
of the calculated insights and segments can be completed before the activations are scheduled to
run, ensuring that the customer’s scheduled campaign messages are accurate and relevant.
The other options are not as good as option A. Option B is incorrect because setting a refresh
schedule for the calculated insights to occur every hour may not be sufficient or efficient. The refresh
schedule may not align with the activation schedule, resulting in outdated or inconsistent data. The
refresh schedule may also consume more resources and time than necessary, as the ecommerce data
may not change every hour. Option C is incorrect because ensuring the activations are set to
Incremental Activation and automatically publish every hour may not solve the problem. Incremental
Activation is a feature that allows only the new or changed records in a segment to be activated,
reducing the activation time and size. However, this feature does not ensure that the segment data is
updated or refreshed based on the ecommerce data. The activation schedule may also not match the
ecommerce data update schedule, resulting in inaccurate or irrelevant campaign messages. Option D
is incorrect because ensuring the segments are set to Rapid Publish and set to refresh every hour
may not be optimal or effective. Rapid Publish is a feature that allows segments to be published
faster by skipping some validation steps, such as checking for duplicate records or invalid values.
However, this feature may compromise the quality or accuracy of the segment data, and may not be
suitable for all use cases. The refresh schedule may also have the same issues as option B, as it may
not sync with the ecommerce data update schedule or the activation schedule, resulting in outdated
or inconsistent data. Reference: Salesforce Data Cloud Consultant Exam Guide, Flow, Change Data
Events, Calculated Insights, Segments, [Activation]
Question: 54
Which two requirements must be met for a calculated insight to appear in the
segmentation canvas?
Choose 2 answers
A. The metrics of the calculated insights must only contain numeric values.
B. The primary key of the segmented table must be a metric in the calculated insight.
C. The calculated insight must contain a dimension including the Individual or Unified Individual Id.
D. The primary key of the segmented table must be a dimension in the calculated insight.
Answer: C, D
Explanation:
A calculated insight is a custom metric or measure that is derived from one or more data model
objects or data lake objects in Data Cloud. A calculated insight can be used in segmentation to filter
or group the data based on the calculated value. However, not all calculated insights can appear in
the segmentation canvas. There are two requirements that must be met for a calculated insight to
appear in the segmentation canvas:
The calculated insight must contain a dimension including the Individual or Unified Individual Id. A
dimension is a field that can be used to categorize or group the data, such as name, gender, or
location. The Individual or Unified Individual Id is a unique identifier for each individual profile in
Data Cloud. The calculated insight must include this dimension to link the calculated value to the
individual profile and to enable segmentation based on the individual profile attributes.
The primary key of the segmented table must be a dimension in the calculated insight. The primary
key is a field that uniquely identifies each record in a table. The segmented table is the table that
contains the data that is being segmented, such as the Customer or the Order table. The calculated
insight must include the primary key of the segmented table as a dimension to ensure that the
calculated value is associated with the correct record in the segmented table and to avoid duplication
or inconsistency in the segmentation results.
Reference: Create a Calculated Insight, Use Insights in Data Cloud, Segmentation
Question: 55
Answer: D
Explanation:
The Data Rights Subject Request tool is a feature that allows Data Cloud users to manage customer
requests for data access, deletion, or portability. The tool provides a user interface and an API to
create, track, and fulfill data rights requests. The tool also generates a report that contains the
customer’s personal data and the actions taken to comply with the request. The consultant should
use this tool to accommodate the customer’s request for data deletion in Data
Cloud. Reference: Data Rights Subject Request Tool, Create a Data Rights Subject Request
Question: 56
Answer: B
Explanation:
The Ignore Empty Value option in identity resolution allows customers to ignore empty fields when
running reconciliation rules. Reconciliation rules are used to determine the final value of an attribute
for a unified individual profile, based on the values from different sources. The Ignore Empty Value
option can be set to true or false for each attribute in a reconciliation rule. If set to true, the
reconciliation rule will skip any source that has an empty value for that attribute and move on to the
next source in the priority order. If set to false, the reconciliation rule will consider any source that
has an empty value for that attribute as a valid source and use it to populate the attribute value for
the unified individual profile.
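A simplified Python sketch of a source-priority reconciliation rule with and without Ignore Empty Value; the source names, field values, and the function itself are illustrative assumptions, not Data Cloud configuration:

```python
# Simplified sketch of reconciliation with a source priority order.
def reconcile(values_by_source, priority, ignore_empty):
    for source in priority:
        value = values_by_source.get(source, "")
        if value == "" and ignore_empty:
            continue  # skip the empty value and consult the next source in priority order
        return value or None  # with ignore_empty=False the empty value is accepted (None here)
    return None

phone = {"Service Cloud": "", "Marketing Cloud": "+1 555 0100"}
print(reconcile(phone, ["Service Cloud", "Marketing Cloud"], ignore_empty=True))   # +1 555 0100
print(reconcile(phone, ["Service Cloud", "Marketing Cloud"], ignore_empty=False))  # None
```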
The other options are not correct descriptions of what the Ignore Empty Value option does in identity
resolution. The Ignore Empty Value option does not affect the custom match rules or the standard
match rules, which are used to identify and link individuals across different sources based on their
attributes. The Ignore Empty Value option also does not ignore individual object records with empty
fields when running identity resolution rules, as identity resolution rules operate on the attribute
level, not the record level.
Reference:
Data Cloud Identity Resolution Reconciliation Rule Input
Configure Identity Resolution Rulesets
Question: 57
Northern Trail Outfitters (NTO) is configuring an identity resolution ruleset based on Fuzzy
Name and Normalized Email.
What should NTO do to ensure the best email address is activated?
Answer: B
Explanation:
: NTO is using Fuzzy Name and Normalized Email as match rules to link together data from different
sources into a unified individual profile. However, there might be cases where the same email
address is available from more than one source, and NTO needs to decide which one to use for
activation. For example, if Rachel has the same email address in Service Cloud and Marketing Cloud,
but prefers to receive communications from NTO via Marketing Cloud, NTO needs to ensure that the
email address from Marketing Cloud is activated. To do this, NTO can use the source priority order in
activations, which allows them to rank the data sources in order of preference for activation. By
placing Marketing Cloud higher than Service Cloud in the source priority order, NTO can make sure
that the email address from Marketing Cloud is delivered to the activation target, such as an email
campaign or a journey. This way, NTO can respect Rachel’s preference and deliver a better customer
experience. Reference: Configure Activations, Use Source Priority Order in Activations
Question: 58
A customer wants to create segments of users based on their Customer Lifetime Value.
However, the source data that will be brought into Data Cloud does not include that key performance
indicator (KPI).
Which sequence of steps should the consultant follow to achieve this requirement?
A. Ingest Data > Map Data to Data Model > Create Calculated Insight > Use in Segmentation
B. Create Calculated Insight > Map Data to Data Model> Ingest Data > Use in Segmentation
C. Create Calculated Insight > Ingest Data > Map Data to Data Model> Use in Segmentation
D. Ingest Data > Create Calculated Insight > Map Data to Data Model > Use in Segmentation
Answer: A
Explanation:
To create segments of users based on their Customer Lifetime Value (CLV), the sequence of steps that
the consultant should follow is Ingest Data > Map Data to Data Model > Create Calculated Insight >
Use in Segmentation. This is because the first step is to ingest the source data into Data Cloud using
data streams1. The second step is to map the source data to the data model, which defines the
structure and attributes of the data2. The third step is to create a calculated insight, which is a
derived attribute that is computed based on the source or unified data3. In this case, the calculated
insight would be the CLV, which can be calculated using a formula or a query based on the sales
order data4. The fourth step is to use the calculated insight in segmentation, which is the process of
creating groups of individuals or entities based on their attributes and behaviors. By using the CLV
calculated insight, the consultant can segment the users by their predicted revenue from the lifespan
of their relationship with the brand. The other options are incorrect because they do not follow the
correct sequence of steps to achieve the requirement. Option B is incorrect because it is not possible
to create a calculated insight before ingesting and mapping the data, as the calculated insight
depends on the data model objects3. Option C is incorrect because it is not possible to create a
calculated insight before mapping the data, as the calculated insight depends on the data model
objects3. Option D is incorrect because it is not recommended to create a calculated insight before
mapping the data, as the calculated insight may not reflect the correct data model structure and
attributes3. Reference: Data Streams Overview, Data Model Objects Overview, Calculated Insights
Overview, Calculating Customer Lifetime Value (CLV) With Salesforce, [Segmentation Overview]
Question: 59
During discovery, which feature should a consultant highlight for a customer who has
multiple data sources and needs to match and reconcile data about individuals into a single unified
profile?
A. Data Cleansing
B. Harmonization
C. Data Consolidation
D. Identity Resolution
Answer: D
Explanation:
Identity resolution is the feature that allows Data Cloud to match and reconcile data about
individuals from multiple data sources into a single unified profile. Identity resolution uses rulesets
to define how source profiles are matched and consolidated based on common attributes, such as
name, email, phone, or party identifier. Identity resolution enables Data Cloud to create a 360-
degree view of each customer across different data sources and systems12. The other options are not
the best features to highlight for this customer need because:
A) Data cleansing is the process of detecting and correcting errors or inconsistencies in data, such as
duplicates, missing values, or invalid formats. Data cleansing can improve the quality and accuracy of
data, but it does not match or reconcile data across different data sources3.
B) Harmonization is the process of standardizing and transforming data from different sources into a
common format and structure. Harmonization can enable data integration and interoperability, but it
does not match or reconcile data across different data sources4.
C) Data consolidation is the process of combining data from different sources into a single data set or
system. Data consolidation can reduce data redundancy and complexity, but it does not match or
reconcile data across different data sources5. Reference: 1: Data and Identity in Data Cloud |
Salesforce Trailhead, 2: Data Cloud Identity Resolution | Salesforce AI Research, 3: [Data Cleansing -
Question: 60
Northern Trail Outfitters (NTO) wants to send a promotional campaign for customers that
have purchased within the past 6 months. The consultant created a segment to meet this
requirement.
Now, NTO brings an additional requirement to suppress customers who have made purchases within
the last week.
What should the consultant use to remove the recent customers?
A. Batch transforms
B. Segmentation exclude rules
C. Related attributes
D. Streaming insight
Answer: B
Explanation:
The consultant should use B. Segmentation exclude rules to remove the recent customers.
Segmentation exclude rules are filters that can be applied to a segment to exclude records that meet
certain criteria. The consultant can use segmentation exclude rules to exclude customers who have
made purchases within the last week from the segment that contains customers who have purchased
within the past 6 months. This way, the segment will only include customers who are eligible for the
promotional campaign.
The other options are not correct. Option A is incorrect because batch transforms are data processing
tasks that can be applied to data streams or data lake objects to modify or enrich the data. Batch
transforms are not used for segmentation or activation. Option C is incorrect because related
attributes are attributes that are derived from the relationships between data model objects. Related
attributes are not used for excluding records from a segment. Option D is incorrect because
streaming insights are derived attributes that are calculated at the time of data ingestion. Streaming
insights are not used for excluding records from a segment. Reference: Salesforce Data Cloud
Consultant Exam Guide, Segmentation, Segmentation Exclude Rules
Question: 61
A new user of Data Cloud only needs to be able to review individual rows of ingested data and
validate that it has been modeled
successfully to its linked data model object. The user will also need to make changes if required.
What is the minimum permission set needed to accommodate this use case?
Answer: C
Explanation:
The Data Cloud User permission set is the minimum permission set needed to accommodate this use
case. The Data Cloud User permission set grants access to the Data Explorer feature, which allows
the user to review individual rows of ingested data and validate that it has been modeled
successfully to its linked data model object. The user can also make changes to the data model object
fields, such as adding or removing fields, changing field types, or creating formula fields. The Data
Cloud User permission set does not grant access to other Data Cloud features or tasks, such as
creating data streams, creating segments, creating activations, or managing users. The other
permission sets are either too restrictive or too permissive for this use case. The Data Cloud for
Marketing Specialist permission set only grants access to the segmentation and activation features,
but not to the Data Explorer feature. The Data Cloud Admin permission set grants access to all Data
Cloud features and tasks, including the Data Explorer feature, but it is more than what the user
needs. The Data Cloud for Marketing Data Aware Specialist permission set grants access to the Data
Explorer feature, but also to the segmentation and activation features, which are not required for
this use case. Reference: Data Cloud Standard Permission Sets, Data Explorer, Set Up Data Cloud Unit
Question: 62
Which data stream category should be assigned to use the data for time-based operations in
segmentation and calculated insights?
A. Individual
B. Transaction
C. Sales Order
D. Engagement
Answer: B
Explanation:
Data streams are the sources of data that are ingested into Data Cloud and mapped to the data
model. Data streams have different categories that determine how the data is processed and used in
Data Cloud. Transaction data streams are used for time-based operations in segmentation and
calculated insights, such as filtering by date range, aggregating by time period, or calculating time-to-
event metrics. Transaction data streams are typically used for event data, such as purchases, clicks,
or visits, that have a timestamp and a value associated with them. Reference: Data Streams, Data
Stream Categories
Question: 63
Which data model subject area should be used for any Organization, Individual, or Member in the
Customer 360 data model?
A. Engagement
B. Membership
C. Party
D. Global Account
Answer: C
Explanation:
The data model subject area that should be used for any Organization, Individual, or Member in the
Customer 360 data model is the Party subject area. The Party subject area defines the entities that
are involved in any business transaction or relationship, such as customers, prospects, partners,
suppliers, etc. The Party subject area contains the following data model objects (DMOs):
Organization: A DMO that represents a legal entity or a business unit, such as a company, a
department, a branch, etc.
Individual: A DMO that represents a person, such as a customer, a contact, a user, etc.
Member: A DMO that represents the relationship between an individual and an organization, such as
an employee, a customer, a partner, etc.
The other options are not data model subject areas that should be used for any Organization,
Individual, or Member in the Customer 360 data model. The Engagement subject area defines the
actions that people take, such as clicks, views, purchases, etc. The Membership subject area defines
the associations that people have with groups, such as loyalty programs, clubs, communities, etc.
The Global Account subject area defines the hierarchical relationships between organizations, such
as parent-child, subsidiary, etc.
Reference:
Data Model Subject Areas
Party Subject Area
Customer 360 Data Model
Question: 64
Which method should a consultant use when performing aggregations in windows of 15 minutes on
data collected via the Interaction SDK or Mobile SDK?
A. Batch transform
B. Calculated insight
C. Streaming insight
D. Formula fields
Answer: C
Explanation:
Streaming insight is a method that allows you to perform aggregations in windows of 15 minutes on
data collected via the Interaction SDK or Mobile SDK. Streaming insight is a feature that enables you
to create real-time metrics and insights based on streaming data from various sources, such as web,
mobile, or IoT devices. Streaming insight allows you to define aggregation rules, such as count, sum,
average, min, max, or percentile, and apply them to streaming data in time windows of 15 minutes.
For example, you can use streaming insight to calculate the number of visitors, the average session
duration, or the conversion rate for your website or app in 15-minute intervals. Streaming insight
also allows you to visualize and explore the aggregated data in dashboards, charts, or
tables. Reference: Streaming Insight, Create Streaming Insights
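A conceptual Python sketch of aggregating events into 15-minute windows, which is the kind of computation a streaming insight performs; the event data and field names below are made up for illustration:

```python
from datetime import datetime

# Conceptual illustration: bucket engagement events into 15-minute windows and count them.
events = [
    {"ts": datetime(2024, 1, 1, 10, 3), "value": 1},
    {"ts": datetime(2024, 1, 1, 10, 14), "value": 1},
    {"ts": datetime(2024, 1, 1, 10, 20), "value": 1},
]

def window_start(ts, minutes=15):
    # Truncate a timestamp to the start of its 15-minute bucket.
    return ts.replace(minute=(ts.minute // minutes) * minutes, second=0, microsecond=0)

counts = {}
for e in events:
    key = window_start(e["ts"])
    counts[key] = counts.get(key, 0) + e["value"]

for start, count in sorted(counts.items()):
    print(start.strftime("%H:%M"), count)  # 10:00 2 / 10:15 1
```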
Question: 65
Northern Trail Outfitters is using the Marketing Cloud Starter Data Bundles to bring Marketing Cloud
data into Data Cloud.
What are two of the available datasets in Marketing Cloud Starter Data Bundles?
Choose 2 answers
A. Personalization
B. MobileConnect
C. Loyalty Management
D. MobilePush
Answer: B, D
Explanation:
The Marketing Cloud Starter Data Bundles are predefined data bundles that allow you to easily ingest
data from Marketing Cloud into Data Cloud1. The available datasets in Marketing Cloud Starter Data
Bundles are Email, MobileConnect, and MobilePush2. These datasets contain engagement events
and metrics from different Marketing Cloud channels, such as email, SMS, and push
notifications2. By using these datasets, you can enrich your Data Cloud data model with Marketing
Cloud data and create segments and activations based on your marketing campaigns and journeys1.
The other options are incorrect because they are not available datasets in Marketing Cloud Starter
Data Bundles. Option A is incorrect because Personalization is not a dataset, but a feature of
Marketing Cloud that allows you to tailor your content and messages to your audience3. Option C is
incorrect because Loyalty Management is not a dataset, but a product of Marketing Cloud that allows
you to create and manage loyalty programs for your customers4. Reference: Marketing Cloud Starter
Data Bundles in Data Cloud, Connect Your Data Sources, Personalization in Marketing Cloud, Loyalty
Management in Marketing Cloud
Question: 66
A customer has a custom Customer_Email__c object related to the standard Contact object in
Salesforce CRM. This custom object stores the email address of a Contact that they want to use for activation.
To which data entity should this object be mapped?
A. Contact
B. Contact Point_Email
C. Custom Customer_Email__c object
D. Individual
Answer: B
Explanation:
The Contact Point_Email object is the data entity that represents an email address associated with
an individual in Data Cloud. It is part of the Customer 360 Data Model, which is a standardized data
model that defines common entities and relationships for customer data. The Contact Point_Email
object can be mapped to any custom or standard object that stores email addresses in Salesforce
CRM, such as the custom Customer_Email__c object. The other options are not the correct data
entities to map to because:
A) The Contact object is the data entity that represents a person who is associated with an account
that is a customer, partner, or competitor in Salesforce CRM. It is not the data entity that represents
an email address in Data Cloud.
C) The custom Customer_Email__c object is not a data entity in Data Cloud, but a custom object in
Salesforce CRM. It can be mapped to a data entity in Data Cloud, such as the Contact Point_Email
object, but it is not a data entity itself.
D) The Individual object is the data entity that represents a unique person in Data Cloud. It is the core
entity for managing consent and privacy preferences, and it can be related to one or more contact
points, such as email addresses, phone numbers, or social media handles. It is not the data entity
that represents an email address in Data Cloud. Reference: Customer 360 Data Model: Individual and
Contact Points - Salesforce, Contact Point_Email | Object Reference for the Salesforce Platform |
Salesforce Developers, [Contact | Object Reference for the Salesforce Platform | Salesforce
Developers], [Individual | Object Reference for the Salesforce Platform | Salesforce Developers]
Question: 67
During discovery, which feature should a consultant highlight for a customer who has multiple data
sources and needs to match and reconcile data about individuals into a single unified profile?
A. Harmonization
B. Data Cleansing
C. Data Consolidation
D. Identity Resolution
Answer: D
Explanation:
The feature that the consultant should highlight for a customer who has multiple data sources and
needs to match and reconcile data about individuals into a single unified profile is D. Identity
Resolution. Identity Resolution is the process of identifying, matching, and reconciling data about
individuals across different data sources and creating a unified profile that represents a single view of
the customer. Identity Resolution uses various methods and rules to determine the best match and
reconciliation of data, such as deterministic matching, probabilistic matching, reconciliation rules,
and identity graphs. Identity Resolution enables the customer to have a complete and accurate
understanding of their customers and their interactions across different channels and
touchpoints. Reference: Salesforce Data Cloud Consultant Exam Guide, Identity Resolution
Question: 68
Cumulus Financial uses Data Cloud to segment banking customers and activate them for direct mail
via a Cloud File Storage
activation. The company also wants to analyze individuals who have been in the segment within the
last 2 years.
Which Data Cloud component allows for this?
A. Nested segments
B. Segment exclusion
C. Calculated insights
D. Segment membership data model object
Answer: D
Explanation:
The segment membership data model object is a Data Cloud component that allows for analyzing
individuals who have been in a segment within a certain time period. The segment membership data
model object is a table that stores the information about which individuals belong to which segments
and when they were added or removed from the segments. This object can be used to create
calculated insights, such as segment size, segment duration, segment overlap, or segment retention,
that can help measure the effectiveness of segmentation and activation strategies. The segment
membership data model object can also be used to create nested segments or segment exclusions
based on the segment membership criteria, such as segment name, segment type, or segment date
range. The other options are not correct because they are not Data Cloud components that allow for
analyzing individuals who have been in a segment within the last 2 years. Nested segments and
segment exclusions are features that allow for creating more complex segments based on existing
segments, but they do not provide the historical data about segment membership. Calculated
insights are custom metrics or measures that are derived from data model objects or data lake
objects, but they do not store the segment membership information by
themselves. Reference: Segment Membership Data Model Object, Create a Calculated Insight, Create
a Nested Segment
Question: 69
Every day, Northern Trail Outfitters uploads a summary of the last 24 hours of store transactions to a
new file in an Amazon S3
bucket, and files older than seven days are automatically deleted. Each file contains a timestamp in a
standardized naming convention.
Which two options should a consultant configure when ingesting this data stream?
Choose 2 answers
Answer: B, C
Explanation:
When ingesting data from an Amazon S3 bucket, the consultant should configure the following
options:
The refresh mode should be set to “Upsert”, which means that new and updated records will be
added or updated in Data Cloud, while existing records will be preserved. This ensures that the data
Question: 70
Which solution provides an easy way to ingest Marketing Cloud subscriber profile attributes into Data
Cloud on a daily basis?
Answer: C
Explanation:
The solution that provides an easy way to ingest Marketing Cloud subscriber profile attributes into
Data Cloud on a daily basis is the Marketing Cloud Data extension Data Stream. The Marketing Cloud
Data extension Data Stream is a feature that allows customers to stream data from Marketing Cloud
data extensions to Data Cloud data spaces. Customers can select which data extensions they want to
stream, and Data Cloud will automatically create and update the corresponding data model objects
(DMOs) in the data space. Customers can also map the data extension fields to the DMO attributes
using a user interface or an API. The Marketing Cloud Data extension Data Stream can help
customers ingest subscriber profile attributes and other data from Marketing Cloud into Data Cloud
without writing any code or setting up any complex integrations.
The other options are not solutions that provide an easy way to ingest Marketing Cloud subscriber
profile attributes into Data Cloud on a daily basis. Automation Studio and Profile file API are tools
that can be used to export data from Marketing Cloud to external systems, but they require
customers to write scripts, configure file transfers, and schedule automations. Marketing Cloud
Connect API is an API that can be used to access data from Marketing Cloud in other Salesforce
solutions, such as Sales Cloud or Service Cloud, but it does not support streaming data to Data Cloud.
Email Studio Starter Data Bundle is a data kit that contains sample data and segments for Email
Studio, but it does not contain subscriber profile attributes or stream data to Data Cloud.
Reference:
Marketing Cloud Data Extension Data Stream
Data Cloud Data Ingestion
[Marketing Cloud Data Extension Data Stream API]
Question: 71
A customer has a requirement to be able to view the last time each segment was published within
their Data Cloud org.
Which two features should the consultant recommend to best address this requirement?
Choose 2 answers
A. Profile Explorer
B. Calculated insight
C. Dashboard
D. Report
Answer: C, D
Explanation:
A customer who wants to view the last time each segment was published within their Data Cloud
org can use the dashboard and report features to achieve this requirement. A dashboard is a visual
representation of data that can show key metrics, trends, and comparisons. A report is a tabular or
matrix view of data that can show details, summaries, and calculations. Both dashboard and report
features allow the user to create, customize, and share data views based on their needs and
preferences. To view the last time each segment was published, the user can create a dashboard or a
report that shows the segment name, the publish date, and the publish status fields from the
segment object. The user can also filter, sort, group, or chart the data by these fields to get more
insights and analysis. The user can also schedule, refresh, or export the dashboard or report data as
needed. Reference: Dashboards, Reports
Question: 72
A. An audit log showing the user who activated the segment and when it was activated
B. The activated data payload
C. The metadata regarding the segment definition
D. The manifest of origin sources within Data Cloud
Answer: B
Explanation:
When activating to Amazon S3, the information that is provided in a .csv file is the activated data
payload. The activated data payload is the data that is sent from Data Cloud to the activation target,
which in this case is an Amazon S3 bucket1. The activated data payload contains the attributes and
values of the individuals or entities that are included in the segment that is being activated2. The
activated data payload can be used for various purposes, such as marketing, sales, service, or
analytics3. The other options are incorrect because they are not provided in a .csv file when
activating to Amazon S3. Option A is incorrect because an audit log is not provided in a .csv file, but it
can be viewed in the Data Cloud UI under the Activation History tab4. Option C is incorrect because
the metadata regarding the segment definition is not provided in a .csv file, but it can be viewed in
the Data Cloud UI under the Segmentation tab5. Option D is incorrect because the manifest of origin
sources within Data Cloud is not provided in a .csv file, but it can be viewed in the Data Cloud UI
under the Data Sources tab. Reference: Data Activation Overview, Create and Activate Segments in
Data Cloud, Data Activation Use Cases, View Activation History, Segmentation Overview, [Data
Sources Overview]
Question: 73
Which operator should a consultant use to create a segment for a birthday campaign that is
evaluated daily?
A. Is Today
B. Is Birthday
C. Is Between
D. Is Anniversary Of
Answer: D
Explanation:
To create a segment for a birthday campaign that is evaluated daily, the consultant should use the Is
Anniversary Of operator. This operator compares a date field with the current date and returns true if
the month and day are the same, regardless of the year. For example, if the date field is 1990-01-01
and the current date is 2023-01-01, the operator returns true. This way, the consultant can create a
segment that includes all the customers who have their birthday on the same day as the current
date, and the segment will be updated daily with the new birthdays. The other options are not the
best operators to use for this purpose because:
A) The Is Today operator compares a date field with the current date and returns true if the date is
the same, including the year. For example, if the date field is 1990-01-01 and the current date is
2023-01-01, the operator returns false. This operator is not suitable for a birthday campaign, as it will
only include the customers who were born on the same day and year as the current date, which is
very unlikely.
B) The Is Birthday operator is not a valid operator in Data Cloud. There is no such operator available
in the segment canvas or the calculated insight editor.
C) The Is Between operator compares a date field with a range of dates and returns true if the date is
within the range, including the endpoints. For example, if the date field is 1990-01-01 and the range
is 2022-12-25 to 2023-01-05, the operator returns true. This operator is not suitable for a birthday
campaign, as it will only include the customers who have their birthday within a fixed range of dates,
and the segment will not be updated daily with the new birthdays.
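A minimal Python sketch of the comparison each operator performs (the operator names come from the question; the functions below are illustrative only, not Data Cloud's implementation):

from datetime import date

def is_anniversary_of(field: date, today: date) -> bool:
    # True when month and day match, regardless of year (birthday logic)
    return (field.month, field.day) == (today.month, today.day)

def is_today(field: date, today: date) -> bool:
    # True only when the full date matches, including the year
    return field == today

birth_date = date(1990, 1, 1)
run_date = date(2023, 1, 1)

print(is_anniversary_of(birth_date, run_date))  # True  -> included in the birthday segment
print(is_today(birth_date, run_date))           # False -> the year would also have to match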
Question: 74
Luxury Retailers created a segment targeting high value customers that it activates through
Marketing Cloud for email communication. The company notices that the activated count is smaller
than the segment count.
What is a likely reason for this?
A. Data Cloud enforces the presence of Contact Point for Marketing Cloud activations. If the
individual does not have a related Contact Point, it will not be activated.
B. Marketing Cloud activations automatically suppress individuals who are unengaged and have not
opened or clicked on an email in the last six months.
C. Marketing Cloud activations only activate those individuals that already exist in Marketing Cloud.
They do not allow activation of new records.
D. Marketing Cloud activations apply a frequency cap and limit the number of records that can be
sent in an activation.
Answer: A
Explanation:
The reason for the activated count being smaller than the segment count is A. Data Cloud enforces
the presence of Contact Point for Marketing Cloud activations. If the individual does not have a
related Contact Point, it will not be activated. A Contact Point is a data model object that represents
a channel or method of communication with an individual, such as email, phone, or social media. For
Marketing Cloud activations, Data Cloud requires that the individual has a related Contact Point of
type Email, which contains a valid email address. If the individual does not have such a Contact Point,
or if the Contact Point is missing or invalid, the individual will not be activated and will not receive
the email communication. Therefore, the activated count may be lower than the segment count,
depending on how many individuals in the segment have a valid email Contact
Point. Reference: Salesforce Data Cloud Consultant Exam Guide, Contact Point, Marketing Cloud
Activation
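As a rough illustration of why the activated count can be lower than the segment count, the sketch below (hypothetical data, plain Python rather than any Data Cloud API) drops segment members that lack a usable email Contact Point before activation:

segment_members = [
    {"id": "001", "email_contact_point": "pat@example.com"},
    {"id": "002", "email_contact_point": None},            # no related Contact Point of type Email
    {"id": "003", "email_contact_point": "kim@example.com"},
]

# Only individuals with a valid email contact point are handed to Marketing Cloud
activated = [m for m in segment_members if m["email_contact_point"]]

print(len(segment_members), "in segment,", len(activated), "activated")  # 3 in segment, 2 activated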
Question: 75
A Data Cloud consultant recently added a new data source and mapped some of the data to a new
custom data model object (DMO)
that they want to use for creating segments. However, they cannot view the newly created DMO
when trying to create a new segment.
What is the cause of this issue?
Answer: B
Explanation:
The cause of this issue is that the new custom data model object (DMO) is not of category Profile. A
category is a property of a DMO that defines its purpose and functionality in Data Cloud. There are
three categories of DMOs: Profile, Event, and Other. Profile DMOs are used to store attributes of
individuals or entities, such as name, email, address, etc. Event DMOs are used to store actions or
interactions of individuals or entities, such as purchases, clicks, visits, etc. Other DMOs are used to
store any other type of data that does not fit into the Profile or Event categories, such as products,
locations, categories, etc. Only Profile DMOs can be used for creating segments in Data Cloud, as
segments are based on the attributes of individuals or entities. Therefore, if the new custom DMO is
not of category Profile, it will not appear in the segmentation canvas. The other options are not
correct because they are not the cause of this issue. Data ingestion is not a prerequisite for creating
segments, as segments can be created based on the data model schema without actual data. The
new DMO does not need to have a relationship to the individual DMO, as segments can be created
based on any Profile DMO, regardless of its relationship to other DMOs. Segmentation is not only
supported for the Individual and Unified Individual DMOs, as segments can be created based on any
Profile DMO, including custom ones. Reference: Create a Custom Data Model Object from an Existing
Data Model Object, Create a Segment in Data Cloud, Data Model Object Category
Question: 76
Cumulus Financial wants to segregate Salesforce CRM Account data based on Country for its Data
Cloud users.
What should the consultant do to accomplish this?
A. Use streaming transforms to filter out Account data based on Country and map to separate data
model objects accordingly.
B. Use the data spaces feature and apply filtering on the Account data lake object based on
Country.
C. Use Salesforce sharing rules on the Account object to filter and segregate records based on
Country.
D. Use formula fields based on the account Country field to filter incoming records.
Answer: B
Explanation:
Data spaces are a feature that allows Data Cloud users to create subsets of data based on filters and
permissions. Data spaces can be used to segregate data based on different criteria, such as
geography, business unit, or product line. In this case, the consultant can use the data spaces feature
and apply filtering on the Account data lake object based on Country. This way, the Data Cloud users
can access only the Account data that belongs to their respective countries. Reference: Data
Spaces, Create a Data Space
Question: 77
How does Data Cloud handle an individual's Right to be Forgotten request?
A. Deletes the records from all data source objects, and any downstream data model objects are
updated at the next scheduled ingestion
B. Deletes the specified Individual record and its Unified Individual Link record.
C. Deletes the specified Individual and records from any data source object mapped to the Individual
data model object.
D. Deletes the specified Individual and records from any data model object/data lake object related
to the Individual.
Answer: D
Explanation:
Data Cloud handles an individual’s Right to be Forgotten by deleting the specified Individual and
records from any data model object/data lake object related to the Individual. This means that Data
Cloud removes all the data associated with the individual from the data space, including the data
from the source objects, the unified individual profile, and any related objects. Data Cloud also
deletes the Unified Individual Link record that links the individual to the source records. Data Cloud
uses the Consent API to process the Right to be Forgotten requests, which are reprocessed at 30, 60,
and 90 days to ensure a full deletion.
The other options are not correct descriptions of how Data Cloud handles an individual’s Right to be
Forgotten. Data Cloud does not delete the records from all data source objects, as this would affect
the data integrity and availability of the source systems. Data Cloud also does not delete only the
specified Individual record and its Unified Individual Link record, as this would leave the source
records and the related records intact. Data Cloud also does not delete only the specified Individual
and records from any data source object mapped to the Individual data model object, as this would
leave the related records intact.
Reference:
Requesting Data Deletion or Right to Be Forgotten
Data Deletion for Data Cloud
Use the Consent API with Data Cloud
Data and Identity in Data Cloud
Question: 78
A healthcare client wants to make use of identity resolution, but does not want to risk unifying
profiles that may share certain
personally identifying information (PII).
Which matching rule criteria should a consultant recommend for the most accurate matching
results?
Answer: A
Explanation:
Identity resolution is the process of linking data from different sources into a unified profile of a
customer or an individual. Identity resolution uses matching rules to compare the attributes of
different records and determine if they belong to the same person. Matching rules can be based on
exact or fuzzy matching of various attributes, such as name, email, phone, address, or custom
identifiers. A healthcare client who wants to use identity resolution, but does not want to risk
unifying profiles that may share certain personally identifying information (PII), such as name or
email, should use a matching rule criteria that is based on a unique and reliable identifier that is
specific to the healthcare domain. One such identifier is the patient ID, which is a unique number
assigned to each patient by a healthcare provider or system. By using the party identification on
patient ID as a matching rule criteria, the healthcare client can ensure that only records that have the
same patient ID are matched and unified, and avoid false positives or false negatives that may occur
due to common or similar names or emails. The party identification on patient ID is also a secure and
compliant way of handling sensitive healthcare data, as it does not expose or share any PII that may
be subject to data protection regulations or standards. Reference: Configure Identity Resolution
Rulesets, A framework of identity resolution: evaluating identity attributes and methods
Question: 79
A user is not seeing suggested values from newly-modeled data when building a segment.
What is causing this issue?
Answer: A
Explanation:
Value suggestion is a feature that allows users to see suggested values for data model object (DMO)
fields when creating segment filters. However, this feature can take up to 24 hours to process and
display the values for newly-modeled data. Therefore, if a user is not seeing suggested values from
newly-modeled data, it is likely that the value suggestion is still processing and will be available soon.
The other options are incorrect because value suggestion does not require any specific permissions,
can work on both direct and related attributes, and can return more than 50 values for a specific
attribute, depending on the data type and frequency of the values. Reference: Use Value Suggestions
in Segmentation, Data Cloud Limits and Guidelines
Question: 80
A consultant is building a segment to announce a new product launch for customers that have
previously purchased black pants.
How should the consultant place attributes for product color and product type from the Order
Product object to meet this criteria?
A. Place the attribute for product color in one container and the attribute for product type in another
container.
B. Place an attribute for the "black" calculated insight to dynamically apply
C. Place the attributes for product color and product type as direct attributes.
D. Place the attributes for product color and product type in a single container.
Answer: D
Explanation:
To create a segment based on the product color and product type from the Order Product object,
the consultant should place the attributes for product color and product type in a single container.
This way, the segment will include only the customers who have purchased black pants, and not
those who have purchased black shirts or blue pants. A container is a grouping of attributes that
must all be satisfied by the same related record, evaluated with logical AND. Placing the attributes in
separate containers would result in a segment that includes customers who have purchased any black
product and any pants product, but not necessarily the same item (for example, a black shirt and blue
pants), which is not the desired criteria; the sketch after the references below illustrates the
difference. Placing an attribute for the "black" calculated insight would not work, because calculated
insights are based on aggregated data and not individual-level data. Placing the attributes as direct
attributes would not work, because direct attributes are used to filter individuals based on their
profile data, not their order data. Reference:
Create a Segment in Data Cloud
Learn About Segmentation Tools
Salesforce Launches: Data Cloud Consultant Certification
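A hedged sketch of that difference (hypothetical order data, plain Python; field names are illustrative): attributes in one container must be met by the same Order Product record, while separate containers can be met by different records.

order_products = [
    {"customer": "A", "color": "black", "type": "shirt"},
    {"customer": "A", "color": "blue",  "type": "pants"},
    {"customer": "B", "color": "black", "type": "pants"},
]

# Single container: the same purchased item must be both black AND pants
single_container = {p["customer"] for p in order_products
                    if p["color"] == "black" and p["type"] == "pants"}

# Separate containers: one purchase can satisfy "black" and a different one can satisfy "pants"
bought_black = {p["customer"] for p in order_products if p["color"] == "black"}
bought_pants = {p["customer"] for p in order_products if p["type"] == "pants"}
separate_containers = bought_black & bought_pants

print(sorted(single_container))     # ['B']      -> only the true black-pants buyer
print(sorted(separate_containers))  # ['A', 'B'] -> A bought a black shirt and blue pants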
Question: 81
Cumulus Financial wants to be able to track the daily transaction volume of each of its customers in
real time and send out a notification as soon as it detects volume outside a customer's normal range.
What should a consultant do to accommodate this request?
Answer: C
Explanation:
A streaming insight is a type of insight that analyzes streaming data in real time and triggers actions
based on predefined conditions. A data action is a type of action that executes a flow, a data action
target, or a data action script when an insight is triggered. By using a streaming insight paired with a
data action, a consultant can accommodate Cumulus Financial’s request to track the daily transaction
volume of each customer and send out a notification when the volume is outside the normal range.
A calculated insight is a type of insight that performs calculations on data in a data space and stores
the results in a data extension. A streaming data transform is a type of data transform that applies
transformations to streaming data in real time and stores the results in a data extension. A flow is a
type of automation that executes a series of actions when triggered by an event, a schedule, or
another flow. None of these options can achieve the same functionality as a streaming insight paired
with a data action. Reference: Use Insights in Data Cloud Unit, Streaming Insights and Data Actions
Use Cases, Streaming Insights and Data Actions Limits and Behaviors
Question: 82
Cumulus Financial uses calculated insights to compute the total banking value per branch for its high
net worth customers. In the
calculated insight, "banking value" is a metric, "branch" is a dimension, and "high net worth" is a
filter.
Which of these fields can be included as an attribute in the activation?
Answer: D
Explanation:
According to the Salesforce Data Cloud documentation, an attribute is a dimension or a measure that
can be used in activation. A dimension is a categorical variable that can be used to group or filter
data, such as branch, region, or product. A measure is a numerical variable that can be used to
calculate metrics, such as revenue, profit, or count. A filter is a condition that can be applied to limit
the data that is used in a calculated insight, such as high net worth, age range, or gender. In this
question, the calculated insight uses “banking value” as a metric, which is a measure, and “branch”
as a dimension. Therefore, only “branch” can be included as an attribute in activation, since it is a
dimension. The other options are either measures or filters, which are not
attributes. Reference: Data Cloud Permission Sets, Salesforce Data Cloud Exam Questions
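In Data Cloud the insight itself is defined with SQL, but as a purely conceptual illustration of the metric/dimension/filter split (hypothetical account data, plain Python rather than the product's SQL):

from collections import defaultdict

accounts = [
    {"branch": "Downtown", "banking_value": 250_000, "high_net_worth": True},
    {"branch": "Downtown", "banking_value": 5_000,   "high_net_worth": False},
    {"branch": "Uptown",   "banking_value": 400_000, "high_net_worth": True},
]

# Filter: restrict the rows that feed the insight (high net worth customers only)
rows = [a for a in accounts if a["high_net_worth"]]

# Dimension: the grouping key (branch); metric/measure: the aggregation (total banking value)
total_by_branch = defaultdict(int)
for a in rows:
    total_by_branch[a["branch"]] += a["banking_value"]

print(dict(total_by_branch))  # {'Downtown': 250000, 'Uptown': 400000}

Only the dimension (branch) would be available as an attribute in activation; the metric and the filter are not attributes.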
Question: 83
Cloud Kicks wants to be able to build a segment of customers who have visited its website within the
previous 7 days.
Which filter operator on the Engagement Date field fits this use case?
A. Is Between
B. Greater than Last Number of
C. Next Number of Days
D. Last Number of Days
Answer: D
Explanation:
The filter operator Last Number of Days allows you to filter on date fields using a relative date range
that specifies the number of days before today. For example, you can use this operator to filter on
customers who have visited your website in the last 7 days, or the last 30 days, or any number of
days you want. This operator is useful for creating dynamic segments that update automatically
based on the current date12. Reference:
Relative Date Filter Reference
Create Filtered Segments
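A small sketch of the relative-date check this operator performs (illustrative Python only; Data Cloud evaluates the filter server-side each time the segment runs):

from datetime import datetime, timedelta, timezone

def within_last_n_days(engagement_date: datetime, n: int, now: datetime) -> bool:
    # True when the engagement happened between n days ago and now
    return now - timedelta(days=n) <= engagement_date <= now

now = datetime(2024, 6, 10, tzinfo=timezone.utc)
visit = datetime(2024, 6, 5, tzinfo=timezone.utc)

print(within_last_n_days(visit, 7, now))  # True  -> visited within the last 7 days
print(within_last_n_days(visit, 3, now))  # False -> outside a 3-day window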
Question: 84
The Salesforce CRM Connector is configured and the Case object data stream is set up. Subsequently,
a new custom field named Business Priority is created on the Case object in Salesforce CRM.
However, the new field is not available when trying to add it to the data stream.
A. The Salesforce Integration User is missing Read permissions on the newly created field.
B. The Salesforce Data Loader application should be used to perform a bulk upload from a desktop.
C. Custom fields on the Case object are not supported for ingesting into Data Cloud.
D. After 24 hours when the data stream refreshes it will automatically include any new fields that
were added to the Salesforce CRM.
Answer: A
Explanation:
The Salesforce CRM Connector uses the Salesforce Integration User to access the data from the
Salesforce CRM org. The Integration User must have the Read permission on the fields that are
included in the data stream. If the Integration User does not have the Read permission on the newly
created field, the field will not be available for selection in the data stream configuration. To resolve
this issue, the administrator should assign the Read permission on the new field to the Integration
User profile or permission set. Reference: Create a Salesforce CRM Data Stream, Edit a Data
Stream, Salesforce Data Cloud Full Refresh for CRM, SFMC, or Ingestion API Data Streams
Question: 85
Which three features can a consultant use to validate the data on a unified profile?
Choose 3 answers
A. Identity Resolution
B. Query API
C. Data Explorer
D. Profile Explorer
E. Data Actions
Answer: A, C, D
Explanation:
To validate the data on a unified profile, the consultant can use the following features:
Identity Resolution: This feature allows the consultant to view and edit the identity resolution
rulesets that determine how individuals are unified from different data sources1.
Data Explorer: This feature allows the consultant to browse and filter the unified profiles and view
their attributes, segments, and activities2.
Profile Explorer: This feature allows the consultant to drill down into a specific unified profile and
view its details, such as source records, identity graph, calculated insights, and data
actions3. Reference:
1: Identity Resolution in Data Cloud
2: Data Explorer in Data Cloud
3: Profile Explorer in Data Cloud
Question: 86
A Data Cloud consultant recently discovered that their identity resolution process is matching
individuals that share email addresses or phone numbers, but are not actually the same individual.
What should the consultant do to address this issue?
A. Modify the existing ruleset with stricter matching criteria, run the ruleset and review the updated
results, then adjust as needed until the individuals are matching correctly.
B. Create and run a new ruleset with fewer matching rules, compare the two rulesets to review and verify
the results, and then migrate to the new ruleset once approved.
C. Create and run a new ruleset with stricter matching criteria, compare the two rulesets to review
and verify the results, and then migrate to the new ruleset once approved.
D. Modify the existing ruleset with stricter matching criteria, compare the two rulesets to review and
verify the results, and then migrate to the new ruleset once approved.
Answer: C
Explanation:
Identity resolution is the process of linking source profiles from different data sources into unified
individual profiles based on match and reconciliation rules. If the identity resolution process is
matching individuals that share email addresses or phone numbers, but are not actually the same
individual, it means that the match rules are too loose and need to be refined. The best way to
address this issue is to create and run a new ruleset with stricter matching criteria, such as adding
more attributes or increasing the match score threshold. Then, the consultant can compare the two
rulesets to review and verify the results, and see if the new ruleset reduces the false positives and
improves the accuracy of the identity resolution. Once the new ruleset is approved, the consultant
can migrate to the new ruleset and delete the old one. The other options are incorrect because
modifying the existing ruleset can affect the existing unified profiles and cause data loss or
inconsistency. Creating and running a new ruleset with fewer matching rules can increase the false
negatives and reduce the coverage of the identity resolution. Reference: Create Unified Individual
Profiles, AI-based Identity Resolution: Linking Diverse Customer Data, Data Cloud Identity Resolution.
Question: 87
A retail customer wants to bring customer data from multiple sources into Data Cloud and use identity
resolution for segmentation.
Which entity should the customer segment on?
A. Subscriber
B. Unified Individual
C. Unified Contact
D. Individual
Answer: B
Explanation:
The correct answer is B, Unified Individual. A Unified Individual is a record that represents a
customer across different data sources, created by applying identity resolution rulesets. Identity
resolution rulesets are sets of match and reconciliation rules that define how to link and merge data
from different sources based on common attributes. Data Cloud uses identity resolution rulesets to
resolve data across multiple data sources and helps you create one record for each customer,
regardless of where the data came from1. A retail customer who wants to bring customer data from
different sources and use identity resolution for segmentation should segment on the Unified
Individual entity, which contains the resolved and consolidated customer data. The other options are
incorrect because they do not represent the resolved customer data across different sources. A
Subscriber is a record that represents a customer who has opted in to receive marketing
communications. A Unified Contact is a record that represents a customer who has a relationship
with a specific business unit. An Individual is a record that represents a customer’s profile data from
a single data source. Reference:
Identity Resolution Ruleset Processing Results
Consider Data Implications for Segmentation
Prepare for your Salesforce Data Cloud Consultant Credential
AI-based Identity Resolution: Linking Diverse Customer Data
Question: 88
A consultant is reviewing a recent activation using engagement-based related attributes but is not
seeing any related attributes in their payload for the majority of their segment members.
Which two areas should the consultant review to help troubleshoot this issue?
Choose 2 answers
Answer: A, C
Explanation:
Engagement-based related attributes are attributes that describe the interactions of a person with
an email message, such as opens, clicks, unsubscribes, etc. These attributes are stored in the
Engagement data model object (DMO) and can be added to an activation to send more personalized
communications. However, there are some considerations and limitations when using engagement-
based related attributes, such as:
For engagement data, activation supports a 90-day lookback window. This means that only the
attributes from the engagement events that occurred within the last 90 days are considered for
activation. Any records outside of this window are not included in the activation payload. Therefore,
the consultant should review the event time of the related engagement events and make sure they
are within the lookback window.
The correct path to the related attributes must be selected for the activation. A path is a sequence of
DMOs that are connected by relationships in the data model. For example, the path from Individual
to Engagement is Individual -> Email -> Engagement. The path determines which related attributes
are available for activation and how they are filtered. Therefore, the consultant should review the
path selection and make sure it matches the desired related attributes and filters.
The other two options are not relevant for this issue. The activations can reference segments that
segment on profile data rather than engagement data, as long as the activation target supports
related attributes. The activated profiles do not need to have a Unified Contact Point, which is a
unique identifier for a person across different data sources, to activate engagement-based related
attributes. Reference: Add Related Attributes to an Activation, Related Attributes in Data Cloud
activation have no values, Explore the Engagement Data Model Object
Question: 89
How does identity resolution select attributes for unified individuals when there Is conflicting
information in the data model?
Answer: B
Explanation:
Identity resolution is the process of creating unified profiles of individuals by matching and merging
data from different sources. When there is conflicting information in the data model, such as
different names, addresses, or phone numbers for the same person, identity resolution leverages
reconciliation rules to select the most accurate and complete attributes for the unified profile.
Reconciliation rules are configurable rules that define how to resolve conflicts based on criteria such
as recency, frequency, source priority, or completeness. For example, a reconciliation rule can specify
that the most recent name or the most frequent phone number should be selected for the unified
profile. Reconciliation rules can be applied at the attribute level or the contact point
level. Reference: Identity Resolution, Reconciliation Rules, Salesforce Data Cloud Exam Questions
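As a conceptual sketch of reconciliation (hypothetical source records, plain Python, not the product's engine), a "last updated" rule and a "most frequent" rule over conflicting phone values could be expressed like this:

from collections import Counter

source_records = [
    {"phone": "+14155550101", "last_modified": "2023-01-10"},
    {"phone": "+14155550101", "last_modified": "2023-02-01"},
    {"phone": "+14155550199", "last_modified": "2023-03-15"},
]

# "Last updated" rule: keep the value from the most recently modified source record
last_updated = max(source_records, key=lambda r: r["last_modified"])["phone"]

# "Most frequent" rule: keep the value that occurs most often across source records
most_frequent = Counter(r["phone"] for r in source_records).most_common(1)[0][0]

print(last_updated)   # +14155550199 -> the newest record wins
print(most_frequent)  # +14155550101 -> the majority value wins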
Question: 90
Which field type should a consultant choose to ensure that leading zeros in a purchase order number
are preserved?
A. Text
B. Number
C. Decimal
D. Serial
Answer: A
Explanation:
The field type Text should be chosen to ensure that leading zeros in the purchase order number are
preserved. This is because text fields store alphanumeric characters as strings, and do not remove
any leading or trailing characters. On the other hand, number, decimal, and serial fields store
numeric values as numbers, and automatically remove any leading zeros when displaying or
exporting the data123. Therefore, text fields are more suitable for storing data that needs to retain its
original format, such as purchase order numbers, zip codes, phone numbers, etc. Reference:
Zeros at the start of a field appear to be omitted in Data Exports
Keep First ‘0’ When Importing a CSV File
Import and export address fields that begin with a zero or contain a plus symbol
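A quick illustration of why a numeric field type loses the zeros while a Text field keeps them (plain Python, not Data Cloud itself):

po_number = "000123"        # purchase order number as it appears in the source file

as_number = int(po_number)  # numeric field types store it as a number
as_text = po_number         # a Text field keeps the raw string

print(as_number)  # 123    -> leading zeros are gone
print(as_text)    # 000123 -> original format preserved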
Question: 91
Which statement about Data Cloud's Web and Mobile Application Connector is true?
A. A standard schema containing event, profile, and transaction data is created at the time the
connector is configured.
B. The Tenant Specific Endpoint is auto-generated in Data Cloud when setting the connector.
C. Any data streams associated with the connector will be automatically deleted upon deleting the
app from Data Cloud Setup.
D. The connector schema can be updated to delete an existing field.
Answer: B
Explanation:
The Web and Mobile Application Connector allows you to ingest data from your websites and mobile
apps into Data Cloud. To use this connector, you need to set up a Tenant Specific Endpoint (TSE) in
Data Cloud, which is a unique URL that identifies your Data Cloud org. The TSE is auto-generated
when you create a connector app in Data Cloud Setup. You can then use the TSE to configure the
SDKs for your websites and mobile apps, which will send data to Data Cloud through the
TSE. Reference: Web and Mobile Application Connector, Connect Your Websites and Mobile
Apps, Create a Web or Mobile App Data Stream
Question: 92
Which two components should a consultant include in a data kit to package Data Cloud components
from one organization to another?
Choose 2 answers
Answer: A, D
Explanation:
To package Data Cloud components from one organization to another, the consultant should include
the following components in a data kit:
Data model objects: These are the custom objects that define the data model for Data Cloud, such as
Individual, Segment, Activity, etc. They store the data ingested from various sources and enable the
creation of unified profiles and segments1.
Identity resolution rulesets: These are the rules that determine how data from different sources are
matched and merged to create unified profiles. They specify the criteria, logic, and priority for
identity resolution2. Reference:
1: Data Model Objects in Data Cloud
2: Identity Resolution Rulesets in Data Cloud
Question: 93
Answer: B
Explanation:
A calculated insight is a multidimensional metric that is defined and calculated from data using SQL
expressions. A calculated insight can include dimensions and measures. Dimensions are the fields
that are used to group or filter the data, such as customer ID, product category, or region. Measures
are the fields that are used to perform calculations or aggregations, such as revenue, quantity, or
average order value. A calculated insight can be modified by editing the SQL expression or changing
the data space. However, the consultant needs to be aware of the following limitations and
considerations when modifying a calculated insight12:
Existing dimensions cannot be removed. If a dimension is removed from the SQL expression, the
calculated insight will fail to run and display an error message. This is because the dimension is used
to create the primary key for the calculated insight object, and removing it will cause a conflict with
the existing data. Therefore, the correct answer is B.
New dimensions can be added. If a dimension is added to the SQL expression, the calculated insight
will run and create a new field for the dimension in the calculated insight object. However, the
consultant should be careful not to add too many dimensions, as this can affect the performance and
usability of the calculated insight.
Existing measures can be removed. If a measure is removed from the SQL expression, the calculated
insight will run and delete the field for the measure from the calculated insight object. However, the
consultant should be aware that removing a measure can affect the existing segments or activations
that use the calculated insight.
New measures can be added. If a measure is added to the SQL expression, the calculated insight will
run and create a new field for the measure in the calculated insight object. However, the consultant
should be careful not to add too many measures, as this can affect the performance and usability of
the calculated insight. Reference: Calculated Insights, Calculated Insights in a Data Space.
Question: 94
A user has built a segment in Data Cloud and is in the process of creating an activation. When
selecting related attributes, they cannot find a specific set of attributes they know to be related to the
individual.
Which statement explains why these attributes are not available?
Answer: C
Explanation:
The correct answer is C, the desired attributes reside on different related paths. When creating an
activation in Data Cloud, you can select related attributes from data model objects that are linked to
the segment entity. However, not all related attributes are available for every activation. The
availability of related attributes depends on the container path, which is the sequence of data model
objects that connects the segment entity to the related entity. For example, if you segment on the
Unified Individual entity, you can select related attributes from the Order Product entity, but only if
the container path is Unified Individual > Order > Order Product. If the container path is Unified
Individual > Order Line Item > Order Product, then the related attributes from Order Product are not
available for activation. This is because Data Cloud only supports one-to-many relationships for
related attributes, and Order Line Item is a many-to-many junction object between Order and Order
Product. Therefore, you need to ensure that the desired attributes reside on the same related path as
the segment entity, and that the path does not include any many-to-many junction objects. The
other options are incorrect because they do not explain why the related attributes are not available.
The segment entity can be any data model object, not just profile data. The attributes are not
restricted by being used in another activation. Activations can include one-to-many attributes, not
just one-to-one attributes. Reference:
Related Attributes in Activation
Considerations for Selecting Related Attributes
Salesforce Launches: Data Cloud Consultant Certification
Create a Segment in Data Cloud
Question: 95
Which three types of criteria can a consultant use when building a segment in Data Cloud?
Choose 3 answers
A. Direct attributes
B. Data stream attributes
C. Calculated Insights
D. Related attributes
E. Streaming insights
Answer: A, C, D
Explanation:
A segment is a subset of individuals who meet certain criteria based on their attributes and
behaviors. A consultant can use different types of criteria when building a segment in Data Cloud,
such as:
Direct attributes: These are attributes that describe the characteristics of an individual, such as
name, email, gender, age, etc. These attributes are stored in the Profile data model object (DMO)
and can be used to filter individuals based on their profile data.
Calculated Insights: These are insights that perform calculations on data in a data space and store the
results in a data extension. These insights can be used to segment individuals based on metrics or
scores derived from their data, such as customer lifetime value, churn risk, loyalty tier, etc.
Related attributes: These are attributes that describe the relationships of an individual with other
DMOs, such as Email, Engagement, Order, Product, etc. These attributes can be used to segment
individuals based on their interactions or transactions with different entities, such as email opens,
clicks, purchases, etc.
The other two options are not valid criteria for building a segment in Data Cloud. Data stream
attributes are attributes that describe the streaming data that is ingested into Data Cloud from
various sources, such as Marketing Cloud, Commerce Cloud, Service Cloud, etc. These attributes are
not directly available for segmentation, but they can be transformed and stored in data extensions
using streaming data transforms. Streaming insights are insights that analyze streaming data in real
time and trigger actions based on predefined conditions. These insights are not used for
segmentation, but for activation and personalization. Reference: Create a Segment in Data
Cloud, Use Insights in Data Cloud, Data Cloud Data Model
Question: 96
A consultant is planning the ingestion of a data stream that has profile information including a
mobile phone number.
To ensure that the phone number can be used for future SMS campaigns, they need to confirm the
phone number field is in the
proper E164 Phone Number format. However, the phone numbers in the file appear to be in varying
formats.
What is the most efficient way to guarantee that the various phone number formats are
standardized?
Answer: C
Explanation:
The most efficient way to guarantee that the various phone number formats are standardized is to
assign the PhoneNumber field type when creating the data stream. The PhoneNumber field type is a
special field type that automatically converts phone numbers into the E164 format, which is the
international standard for phone numbers. The E164 format consists of a plus sign (+), the country
code, and the national number. For example, +1-202-555-1234 is the E164 format for a US phone
number. By using the PhoneNumber field type, the consultant can ensure that the phone numbers
are consistent and can be used for future SMS campaigns. The other options are either more time-
consuming, require manual intervention, or do not address the formatting issue. Reference: Data
Stream Field Types, E164 Phone Number Format, Salesforce Data Cloud Exam Questions
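The PhoneNumber field type handles this conversion during ingestion. Purely as an illustration of the target format, a naive normalizer (simplified, and it assumes a US country code for bare 10-digit numbers) might look like:

import re

def to_e164(raw: str, default_country_code: str = "1") -> str:
    digits = re.sub(r"\D", "", raw)        # strip spaces, dashes, parentheses, dots
    if raw.strip().startswith("+"):
        return "+" + digits                # already carries a country code
    if len(digits) == 10:                  # assume a national (US) number
        return "+" + default_country_code + digits
    return "+" + digits                    # fall back to whatever digits remain

print(to_e164("(202) 555-1234"))    # +12025551234
print(to_e164("+44 20 7946 0958"))  # +442079460958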
Question: 97
A user is not seeing suggested values from newly-modeled data when building a segment.
What is causing this issue?
A. Value suggestion will only return results for the first 50 values of a specific attribute,
B. Value suggestion can only work on direct attributes and not related attributes.
C. Value suggestion requires Data Aware Specialist permissions at a minimum.
D. Value suggestion is still processing and takes up to 24 hours to be available.
Answer: D
Explanation:
The most likely cause of this issue is that value suggestion is still processing and takes up to 24 hours
to be available. Value suggestion is a feature that enables you to see suggested values for data model
object (DMO) fields when creating segment filters. However, this feature needs to be enabled for
each DMO field, and it can take up to 24 hours for the suggested values to appear after enabling the
feature1. Therefore, if a user is not seeing suggested values from newly-modeled data, it could be
that the data has not been processed yet by the value suggestion feature. Reference:
Use Value Suggestions in Segmentation
Question: 98
A customer has outlined requirements to trigger a journey for an abandoned browse behavior. Based
on the requirements, the consultant determines they will use streaming insights to trigger a data
action to Journey Builder every hour.
How should the consultant configure the solution to ensure the data action is triggered at the
cadence required?
Answer: D
Explanation:
Streaming insights are computed from real-time engagement events and can be used to trigger data
actions based on pre-set rules. Data actions are workflows that send data from Data Cloud to other
systems, such as Journey Builder. To ensure that the data action is triggered every hour, the
consultant should set the insights aggregation time window to 1 hour. This means that the streaming
insight will evaluate the events that occurred within the last hour and execute the data action if the
conditions are met. The other options are not relevant for streaming insights and data
actions. Reference: Streaming Insights and Data Actions Limits and Behaviors, Streaming
Insights, Streaming Insights and Data Actions Use Cases, Use Insights in Data Cloud, 6 Ways the
Latest Marketing Cloud Release Can Boost Your Campaigns
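Conceptually (illustrative Python only; in the product the streaming insight is defined with SQL and the notification to Journey Builder is a data action), a 1-hour aggregation window means events are grouped into hourly buckets and the rule is evaluated per bucket:

from collections import defaultdict
from datetime import datetime

browse_events = [
    {"individual": "cust-1", "time": datetime(2024, 6, 1, 9, 15)},
    {"individual": "cust-1", "time": datetime(2024, 6, 1, 9, 40)},
    {"individual": "cust-2", "time": datetime(2024, 6, 1, 10, 5)},
]

# Group events into 1-hour windows per individual
windows = defaultdict(list)
for e in browse_events:
    bucket = e["time"].replace(minute=0, second=0, microsecond=0)
    windows[(e["individual"], bucket)].append(e)

# At the close of each window, fire the data action (here: just print) if the rule matches
for (individual, bucket), events in windows.items():
    if len(events) >= 2:  # e.g., an "abandoned browse" rule
        print(f"Trigger Journey Builder for {individual} (window starting {bucket})")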
Question: 99
A consultant is helping a beauty company ingest its profile data into Data Cloud. The company’s
source data includes several fields, such as eye color, skin type, and hair color, that are not fields in
the standard Individual data model object (DMO).
What should the consultant recommend to map this data to be used for both segmentation and
identity resolution?
A. Create a custom DMO from scratch that has all fields that are needed.
B. Create a custom DMO with only the additional fields and map it to the standard Individual DMO.
C. Create custom fields on the standard Individual DMO.
D. Duplicate the standard Individual DMO and add the additional fields.
Answer: C
Explanation:
The best option to map the data to be used for both segmentation and identity resolution is to create
custom fields on the standard Individual DMO. This way, the consultant can leverage the existing
fields and functionality of the Individual DMO, such as identity resolution rulesets, calculated
insights, and data actions, while adding the additional fields that are specific to the beauty
company’s data1. Creating a custom DMO from scratch or duplicating the standard Individual DMO
would require more effort and maintenance, and might not be compatible with the existing features
of Data Cloud. Creating a custom DMO with only the additional fields and mapping it to the standard
Individual DMO would create unnecessary complexity and redundancy, and might not allow the use
of the custom fields for identity resolution. Reference:
1: Data Model Objects in Data Cloud
Question: 100
The recruiting team at Cumulus Financial wants to identify which candidates have browsed the jobs
page on its website at least twice within the last 24 hours. They want the information about these
candidates to be available for segmentation in Data Cloud and the candidates added to their
recruiting system.
Which feature should a consultant recommend to achieve this goal?
Answer: B
Explanation:
A streaming insight is a feature that allows users to create and monitor real-time metrics from
streaming data sources, such as web and mobile events. A streaming insight can also trigger data
actions, such as sending notifications, creating records, or updating fields, based on the metric values
and conditions. Therefore, a streaming insight is the best feature to achieve the goal of identifying
candidates who have browsed the jobs page on the website at least twice within the last 24 hours,
and adding them to the recruiting system. The other options are incorrect because:
A streaming data transform is a feature that allows users to transform and enrich streaming data
using SQL expressions, such as filtering, joining, aggregating, or calculating values. However, a
streaming data transform does not provide the ability to monitor metrics or trigger data actions
based on conditions.
A calculated insight is a feature that allows users to define and calculate multidimensional metrics
from data using SQL expressions, such as LTV, CSAT, or average order value. However, a calculated
insight is not suitable for real-time data analysis, as it runs on a scheduled basis and does not support
data actions.
A batch data transform is a feature that allows users to create and schedule complex data
transformations using a visual editor, such as joining, aggregating, filtering, or appending data.
However, a batch data transform is not suitable for real-time data analysis, as it runs on a scheduled
basis and does not support data actions. Reference: Streaming Insights, Create a Streaming
Insight, Use Insights in Data Cloud, Learn About Data Cloud Insights, Data Cloud Insights Using
SQL, Streaming Data Transforms, Get Started with Batch Data Transforms in Data
Cloud, Transformations for Batch Data Transforms, Batch Data Transforms in Data Cloud: Quick
Look, Salesforce Data Cloud: AI CDP.
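As a rough sketch of the metric such a streaming insight maintains (illustrative Python; in Data Cloud this is defined with SQL, and pushing candidates to the recruiting system is handled by a data action):

from datetime import datetime, timedelta

page_views = [
    {"candidate": "cand-1", "page": "/jobs", "time": datetime(2024, 6, 1, 8, 0)},
    {"candidate": "cand-1", "page": "/jobs", "time": datetime(2024, 6, 1, 20, 30)},
    {"candidate": "cand-2", "page": "/jobs", "time": datetime(2024, 5, 29, 9, 0)},
]

now = datetime(2024, 6, 2, 7, 0)
window_start = now - timedelta(hours=24)

# Count jobs-page views per candidate within the last 24 hours
counts = {}
for v in page_views:
    if v["page"] == "/jobs" and window_start <= v["time"] <= now:
        counts[v["candidate"]] = counts.get(v["candidate"], 0) + 1

# Candidates meeting the threshold would be added to the recruiting system via a data action
qualified = [c for c, n in counts.items() if n >= 2]
print(qualified)  # ['cand-1']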
Question: 101
A customer has multiple team members who create segment audiences and who work in different time
zones. One team member works at the home office in the Pacific time zone, which matches the org
Time Zone setting. Another team member works remotely in the Eastern time zone.
Which user will see their home time zone in the segment and activation schedule areas?
Answer: D
Explanation:
The correct answer is D, both team members; Data Cloud adjusts the segment and activation
schedules to the time zone of the logged-in user. Data Cloud uses the time zone settings of the
logged-in user to display the segment and activation schedules. This means that each user will see
the schedules in their own home time zone, regardless of the org time zone setting or the location of
other team members. This feature helps users to avoid confusion and errors when scheduling
segments and activations across different time zones. The other options are incorrect because they
do not reflect how Data Cloud handles time zones. The team member in the Pacific time zone will not
see the same time zone as the org time zone setting, unless their personal time zone setting matches
the org time zone setting. The team member in the Eastern time zone will not see the schedules in
the org time zone setting, unless their personal time zone setting matches the org time zone setting.
Data Cloud does not show all schedules in GMT, but rather in the user’s local time zone. Reference:
Data Cloud Time Zones
Change default time zones for Users and the organization
Change your time zone settings in Salesforce, Google & Outlook
DateTime field and Time Zone Settings in Salesforce
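A tiny sketch of that display behavior (standard-library Python using zoneinfo; the stored schedule is a single instant, only the rendering differs per user):

from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# One stored activation schedule time (a single instant, held in UTC here)
scheduled = datetime(2024, 6, 1, 17, 0, tzinfo=timezone.utc)

pacific_user = scheduled.astimezone(ZoneInfo("America/Los_Angeles"))
eastern_user = scheduled.astimezone(ZoneInfo("America/New_York"))

print(pacific_user.strftime("%H:%M %Z"))  # 10:00 PDT -> what the home-office user sees
print(eastern_user.strftime("%H:%M %Z"))  # 13:00 EDT -> what the remote user sees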
Question: 102
Cumulus Financial wants its service agents to view a display of all cases associated with a Unified
Individual on a contact record.
Which two features should a consultant consider for this use case?
Choose 2 answers
A. Data Action
B. Profile API
C. Lightning Web Components
D. Query API
Answer: B, C
Explanation:
A Unified Individual is a profile that combines data from multiple sources using identity resolution
rules in Data Cloud. A Unified Individual can have multiple contact points, such as email, phone, or
address, that link to different systems and records. A consultant can use the following features to
display all cases associated with a Unified Individual on a contact record:
Profile API: This is a REST API that allows you to retrieve and update Unified Individual profiles and
related attributes in Data Cloud. You can use the Profile API to query the cases that are related to a
Unified Individual by using the contact point ID or the unified ID as a filter. You can also use the
Profile API to update the Unified Individual profile with new or modified case information from other
systems.
Lightning Web Components: These are custom HTML elements that you can use to create reusable UI
components for your Salesforce apps. You can use Lightning Web Components to create a custom
component that displays the cases related to a Unified Individual on a contact record. You can use
the Profile API to fetch the data from Data Cloud and display it in a table, list, or chart format. You can
also use Lightning Web Components to enable actions, such as creating, editing, or deleting cases,
from the contact record.
The other two options are not relevant for this use case. A Data Action is a type of action that
executes a flow, a data action target, or a data action script when an insight is triggered. A Data
Action is used for activation and personalization, not for displaying data on a contact record. The
Query API is a query language that allows you to access and manipulate data in Data Cloud; it is
used for data exploration and analysis, not for displaying data on a contact record. Reference: Profile
API Developer Guide, Lightning Web Components Developer Guide, Create Unified Individual
Profiles Unit
Question: 103
A Data Cloud consultant is in the process of setting up data streams for a new service-based data
source.
When ingesting Case data, which field is recommended to be associated with the Event Time field?
Answer: A
Explanation:
The Event Time field is a special field type that captures the timestamp of an event in a data stream.
It is used to track the chronological order of events and to enable time-based segmentation and
activation. When ingesting Case data, the recommended field to be associated with the Event Time
field is the Last Modified Date field. This field reflects the most recent update to the case and can be
used to measure the case duration, resolution time, and customer satisfaction. The other fields, such
as Resolution Date, Escalation Date, or Creation Date, are not as suitable for the Event Time field, as
they may not capture the latest status of the case or may not be applicable for all
cases. Reference: Data Stream Field Types, Salesforce Data Cloud Exam Questions
Question: 104
Northern Trail Outfitters uses B2C Commerce and is exploring implementing Data Cloud to get a
unified view of its customers and all their order transactions.
What should the consultant keep in mind with regard to historical data when ingesting order data using the
B2C Commerce Order Bundle?
Answer: C
Explanation:
The B2C Commerce Order Bundle is a data bundle that creates a data stream to flow order data from
a B2C Commerce instance to Data Cloud. However, this data bundle does not ingest any historical
data and only ingests new orders from the time the data stream is created. Therefore, if a consultant
wants to ingest historical order data, they need to use a different method, such as exporting the data
from B2C Commerce and importing it to Data Cloud using a CSV file12. Reference:
Create a B2C Commerce Data Bundle
Data Access and Export for B2C Commerce and Commerce Marketplace
https://www.pass4success.com/DATA-CLOUD-CONSULTANT.html