Data Cloud Certification 1 of 3
Modify the existing ruleset to use fewer matching rules, run the
ruleset and review the updated results, then adjust as needed
until the individuals are matching correctly.
Correct answer
Create and run a new ruleset with stricter matching criteria,
compare the two rulesets to review and verify the results, and
then migrate to the new ruleset once approved.
Create and run a new ruleset with fewer matching rules,
compare the two rulesets to review and verify the results, and
then migrate to the new ruleset once approved.
Modify the existing ruleset with stricter matching criteria, run
the ruleset and review the updated results, then adjust as
needed until the individuals are matching correctly.
Overall explanation
1. Creating and running a new ruleset with stricter matching
criteria allows the consultant to test a more refined approach
without immediately affecting the current system or data. This
approach enables the consultant to ensure that the new rules will
effectively reduce false positives, such as individuals sharing email
addresses or phone numbers but not actually being the same
person.
2. After running the new ruleset, the consultant can compare the
results of the new and old rulesets to verify that the stricter
criteria produce the desired outcomes (i.e., fewer false matches).
3. Once the new ruleset is verified and approved, it can be migrated
to the system, ensuring that the identity resolution process is
more accurate moving forward.
Question 7
Data Cloud receives a nightly file of all ecommerce transactions
from the previous day. Several segments and activations
depend upon calculated insights from the updated data in order
to maintain accuracy in the customer's scheduled campaign
messages.
What should the consultant do to ensure the ecommerce data is
ready for use for each of the scheduled activations?
Ensure the activations are set to Incremental Activation and
automatically publish every hour.
Correct answer
Use Flow to trigger a change data event on the ecommerce data
to refresh calculated insights and segments before the
activations are scheduled to run.
Set a refresh schedule for the calculated insights to occur every
hour.
Ensure the segments are set to Rapid Publish and set to refresh
every hour.
Overall explanation
The best way for the consultant to ensure the ecommerce data is ready
for use for each of the scheduled activations is to use Flow to trigger a
change data event on the ecommerce data to refresh calculated insights
and segments before the activations are scheduled to run. This option
allows the consultant to use the Flow feature of Data
Cloud, which enables automation and orchestration of data processing
tasks based on events or schedules. Flow can be used to trigger a
change data event on the ecommerce data, which is a type of event that
indicates that the data has been updated or changed. This event can
then trigger the refresh of the calculated insights and segments that
depend on the ecommerce data, ensuring that they reflect the latest
data. The refresh of the calculated insights and segments can be
completed before the activations are scheduled to run, ensuring that the
customer's scheduled campaign messages are accurate and relevant.
Question 8
A client wants to bring in loyalty data from a custom object in
Salesforce CRM that contains a point balance for accrued hotel
points and airline points within the same record. The client
wants to split these point systems into two separate records for
better tracking and processing.
What should a consultant recommend in this scenario?
Correct answer
Use batch transforms to create a second data lake object.
Create a junction object in Salesforce CRM and modify the
ingestion strategy.
Clone the data source object.
Create a data kit from the data lake object and deploy it to the
same Data Cloud org.
Overall explanation
1. Batch transforms allow you to process and manipulate data as it
is ingested into Data Cloud. In this case, the client has a custom
object that contains both hotel and airline points in the same
record. To separate these point systems into two distinct records
for better tracking and processing, a batch transform can be
used to split the data into two separate records.
2. This approach involves creating a second data lake object,
where the original record is split into two separate entries: one for
hotel points and one for airline points. These transformations can
be applied on the data at the time of ingestion to meet the client's
requirements.
Question 9
Which operator should a consultant use to create a segment for
a birthday campaign that is evaluated daily?
Is Today
Is Birthday
Is Between
Correct answer
Is Anniversary Of
Overall explanation
To create a segment for a birthday campaign that is evaluated daily, the
consultant should use the Is Anniversary Of operator. This operator
compares a date field with the current date and returns true if the
month and day are the same, regardless of the year. For example, if the
date field is 1990-01-01 and the current date is 2023-01-01, the operator
returns true. This way, the consultant can create a segment that
includes all the customers who have their birthday on the same day as
the current date, and the segment will be updated daily with the new
birthdays. The other options are not the best operators to use for this
purpose because:
* A. The Is Today operator compares a date field with the current date
and returns true if the date is the same, including the year. For example,
if the date field is 1990-01-01 and the current date is
2023-01-01, the operator returns false. This operator is not suitable for a
birthday campaign, as it will only include the customers who were born
on the same day and year as the current date, which is very unlikely.
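The distinction between the two operators can be mirrored in a short Python sketch (purely illustrative; the operators themselves are configured in the segmentation canvas, and the function names here are hypothetical):

```python
from datetime import date

def is_anniversary_of(field: date, today: date) -> bool:
    # True when month and day match, regardless of year.
    return (field.month, field.day) == (today.month, today.day)

def is_today(field: date, today: date) -> bool:
    # True only on an exact match, including the year.
    return field == today

# Example from the explanation above:
birthday = date(1990, 1, 1)
print(is_anniversary_of(birthday, date(2023, 1, 1)))  # True
print(is_today(birthday, date(2023, 1, 1)))           # False
```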
Question 10
A new user of Data Cloud only needs to be able to review
individual rows of ingested data and validate that it has been
modeled successfully to its linked data model object. The user
will also need to make changes if required.
What is the minimum permission set needed to accommodate
this use case?
Data Cloud for Marketing Specialist
Data Cloud Admin
Correct answer
Data Cloud for Marketing Data Aware Specialist
Data Cloud User
Overall explanation
This permission set is specifically designed for users who need to
interact with data in a more detailed manner than a general user. It
includes the capabilities to review, validate, and modify data, matching
the described needs without the broader administrative permissions.
Question 11
A customer is trying to activate data from Data Cloud to an
Amazon S3 Cloud File Storage Bucket.
Which authentication type should the consultant recommend to
connect to the S3 bucket from Data Cloud?
Use a JWT Token generated on S3.
Use an S3 Private Key Certificate.
Use an S3 Encrypted Username and Password.
Correct answer
Use an S3 Access Key and Secret Key.
Overall explanation
To use the Amazon S3 Storage Connector in Data Cloud, the consultant
needs to provide the S3 bucket name, region, and access key and secret
key for authentication. The access key and secret key are generated by
AWS and can be managed in the IAM console. The other options are not
supported by the S3 Storage Connector or by Data Cloud. References:
Amazon S3 Storage Connector - Salesforce, How to Use the Amazon S3
Storage Connector in Data Cloud
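As a minimal sketch of this authentication pattern, the same access key and secret key generated in the IAM console can be tested against the bucket with boto3 before being entered in the connector setup (bucket name, region, and key values below are placeholders):

```python
import boto3

# Placeholder credentials; real values are generated in the AWS IAM console
# and entered in the Data Cloud S3 connector configuration.
s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIAEXAMPLE",
    aws_secret_access_key="secretExample",
    region_name="us-east-1",
)

# Confirms the key pair can list the bucket Data Cloud will read from.
response = s3.list_objects_v2(Bucket="my-data-cloud-bucket", MaxKeys=5)
for obj in response.get("Contents", []):
    print(obj["Key"])
```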
Question 12
A consultant is discussing the benefits of Data Cloud with a
customer that has multiple disjointed data sources.
Which two benefits of Data Cloud should the consultant
highlight?
Choose 2 answers
Correct selection
Unified Profiles
Correct selection
Data Harmonization
Master Data Management
Data Marketplace
Overall explanation
1. A. Unified Profiles
Unified Profiles refer to the process of consolidating all
customer data (from multiple disjointed sources) into a
single, comprehensive view. Data Cloud's ability to
create unified profiles ensures that data from different
sources is merged to form a complete, accurate customer
profile. This helps in managing customer data effectively and
is especially important when dealing with multiple data
sources.
2. B. Data Harmonization
Data Harmonization involves transforming, standardizing,
and integrating data from different sources so that it can be
used seamlessly together. In cases where data comes from
disjointed sources, data harmonization ensures that the
data is aligned in format and meaning, making it easier to
analyze and act upon. This is critical for creating a reliable,
consistent dataset from disparate sources.
Question 13
What does it mean to build a trust-based, first-party data asset?
Ensure opt-in consents are collected for all email marketing as
required by law
Correct answer
To provide transparency and security for data gathered from
individuals who provide consent for its use and receive value in
exchange
To obtain competitive data from reliable sources through
interviews, surveys, and polls
To provide trusted, first-party data in the Data Cloud
Marketplace that follows all compliance regulations
Overall explanation
Building a trust-based, first-party data asset means collecting,
managing, and activating data from your own customers and prospects
in a way that respects their privacy and preferences. It also means
providing them with clear and honest information about how you use
their data, what benefits they can expect from sharing their data, and
how they can control their data. By doing so, you can create a mutually
beneficial relationship with your customers, where they trust you to use
their data responsibly and ethically, and you can deliver more relevant
and personalized experiences to them. A trust-based, first-party data
asset can help you improve customer loyalty, retention, and growth, as
well as comply with data protection regulations and standards.
References: Use first-party data for a powerful digital experience, Why
first-party data is the key to data privacy, Build a first-party data
strategy
Question 14
Northern Trail Outfitters is using the Marketing Cloud Starter
Data Bundles to bring Marketing Cloud data into Data Cloud.
Which two datasets are available when using the Marketing
Cloud Starter Data Bundles?
Choose 2 answers
Personalization
Correct selection
MobileConnect
Loyalty Management
Correct selection
MobilePush
Overall explanation
The Marketing Cloud Starter Data Bundles are predefined data bundles
that allow you to easily ingest data from Marketing Cloud into Data
Cloud. The available datasets in Marketing Cloud Starter Data Bundles
are Email, MobileConnect, and MobilePush. These datasets contain
engagement events and metrics from different Marketing Cloud
channels, such as email, SMS, and push notifications. By using these
datasets, you can enrich your Data Cloud data model with Marketing
Cloud data and create segments and activations based on your
marketing campaigns and journeys. The other options are incorrect
because they are not available datasets in Marketing Cloud Starter Data
Bundles. Option A is incorrect because Personalization is not a dataset,
but a feature of Marketing Cloud that allows you to tailor your content
and messages to your audience. Option C is incorrect because Loyalty
Management is not a dataset, but a product of Marketing Cloud that
allows you to create and manage loyalty programs for your customers.
References: Marketing Cloud Starter Data Bundles in Data Cloud,
Connect Your Data Sources, Personalization in Marketing Cloud, Loyalty
Management in Marketing Cloud
Question 15
Northern Trail Outfitters unifies individuals in its Data Cloud
instance.
Which three features can be used to validate the data in the
unified profile object?
Choose 3 answers
Identity Resolution
Correct selection
Query API
Correct selection
Data Explorer
Correct selection
Profile Explorer
Data Actions
Overall explanation
These three features can be used to validate the data in the unified
profile object. Data Explorer allows you to view the ingested data from
different sources and how it is mapped to the unified profile object.
Query API allows you to query the unified profile object using SOQL or
SQL queries. Profile Explorer allows you to view the unified profile
records and their attributes. References:
https://help.salesforce.com/s/articleView?id=sf.c360_a_data_explorer.htm&type=5
https://help.salesforce.com/s/articleView?id=sf.c360_a_query_api.htm&type=5
https://help.salesforce.com/s/articleView?id=sf.c360_a_profile_explorer.htm&type=5
Question 16
A consultant is integrating an Amazon S3 activated campaign
with the customer's destination system.
Which file on the Amazon S3 will contain the metadata about
the segment for processing?
Correct answer
The .json file
The .txt file
The .zip file
The .csv file
Overall explanation
The file on the Amazon S3 that will contain the metadata about the
segment for processing is the .json file.
The .json file is a metadata file that is generated along with the .csv
file when a segment is activated to Amazon S3.
The .json file contains information such as the segment name, the
segment ID, the segment size, the segment attributes, the segment
filters, and the segment schedule.
The destination system can use this file to identify the segment and its
properties, and to match the segment data with the corresponding fields
in the destination system.
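For illustration, a destination system might read that manifest as in the sketch below; the key names are assumptions based on the properties listed above, not a documented schema:

```python
import json

# Hypothetical manifest delivered alongside the segment's .csv payload.
with open("segment_manifest.json") as f:
    manifest = json.load(f)

# Assumed keys mirroring the properties described above.
print(manifest.get("segmentName"))
print(manifest.get("segmentId"))
print(manifest.get("segmentSize"))
```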
Question 17
Cumulus Financial uses Data Cloud to segment banking
customers and activate them for direct mail via a Cloud File
Storage activation. The company also wants to analyze
individuals who have been in the segment within the last 2
years.
Which Data Cloud component allows for this analysis?
Nested segments
Segment exclusion
Calculated insights
Correct answer
Segment membership data model object
Overall explanation
The segment membership data model object is a Data Cloud component
that allows for analyzing individuals who have been in a segment within
a certain time period. The segment membership data model object is a
table that stores the information about which individuals belong to which
segments and when they were added or removed from the segments.
This object can be used to create calculated insights, such as segment
size, segment duration, segment overlap, or segment retention, that can
help measure the effectiveness of segmentation and activation
strategies. The segment membership data model object can also be
used to create nested segments or segment exclusions based on the
segment membership criteria, such as segment name, segment type, or
segment date range. The other options are not correct because they are
not Data Cloud components that allow for analyzing individuals who
have been in a segment within the last 2 years. Nested segments and
segment exclusions are features that allow for creating more complex
segments based on existing segments, but they do not provide the
historical data about segment membership. Calculated insights are
custom metrics or measures that are derived from data model objects or
data lake objects, but they do not store the segment membership
information by themselves. References: Segment Membership Data
Model Object, Create a Calculated Insight, Create a Nested Segment
Question 18
What should an organization use to stream inventory levels from
an inventory management system into Data Cloud in a fast and
scalable, near-real-time way?
Cloud Storage Connector
Commerce Cloud Connector
Correct answer
Ingestion API
Marketing Cloud Personalization Connector
Overall explanation
The Ingestion API is a RESTful API that allows you to stream data from
any source into Data Cloud in a fast and scalable way. You can use the
Ingestion API to send data from your inventory management system into
Data Cloud as JSON objects, and then use Data Cloud to create data
models, segments, and insights based on your inventory data. The
Ingestion API supports both batch and streaming modes, and can handle
up to 100,000 records per second. The Ingestion API also provides
features such as data validation, encryption, compression, and retry
mechanisms to ensure data quality and security. References: Ingestion
API Developer Guide, Ingest Data into Data Cloud
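A minimal streaming call could look like the sketch below; the tenant endpoint, connector name (InventorySystem), object name, and token handling are all placeholder assumptions rather than the documented call sequence:

```python
import requests

# Placeholders: the tenant endpoint and source/object names come from the
# Ingestion API connector configured in Data Cloud setup.
TENANT = "https://example-tenant.c360a.salesforce.com"
URL = f"{TENANT}/api/v1/ingest/sources/InventorySystem/InventoryLevel"

payload = {
    "data": [
        {"sku": "SKU-1001", "warehouse": "DFW", "quantity": 42},
        {"sku": "SKU-1002", "warehouse": "DFW", "quantity": 7},
    ]
}

# OAuth token acquisition is omitted for brevity.
resp = requests.post(
    URL,
    json=payload,
    headers={"Authorization": "Bearer ACCESS_TOKEN"},
    timeout=30,
)
resp.raise_for_status()
```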
Question 19
When creating a segment on an individual, what is the result of
using two separate containers linked by an AND as shown
below?
Question 20
Northern Trail Outfitters wants to implement Data Cloud and has
several use cases in mind.
Which two use cases are considered a good fit for Data Cloud?
Choose 2 answers
Correct selection
To ingest and unify data from various sources to reconcile
customer identity
To create and orchestrate cross-channel marketing message
Correct selection
To use harmonized data to more accurately understand the
customer and business impact
To eliminate the need for separate business intelligence and IT
data management tools
Overall explanation
Data Cloud is a data platform that can help customers connect, prepare,
harmonize, unify, query, analyze, and act on their data across various
Salesforce and external sources. Use cases that are considered a good
fit for Data Cloud include ingesting and unifying data from various
sources to reconcile customer identity, and using harmonized data to
more accurately understand the customer and business impact.
The other two options are not use cases that are considered a good fit
for Data Cloud. Data Cloud does not provide features to create and
orchestrate cross-channel marketing messages, as this is typically
handled by other Salesforce solutions such as Marketing Cloud. Data
Cloud also does not eliminate the need for separate business
intelligence and IT data management tools, as it is designed to work with
them and complement their capabilities.
Question 21
A Data Cloud consultant recently discovered that their identity
resolution process is matching individuals that share email
addresses or phone numbers, but are not actually the same
individual.
What should the consultant do to address this issue?
Question 24
A customer has a Master Customer table from their CRM to
ingest into Data Cloud. The table contains a name and primary
email address, along with other personally identifiable
information (PII).
How should the fields be mapped to support identity resolution?
Create a new custom object with fields that directly match the
incoming table.
Map all fields to the Customer object.
Correct answer
Map name to the Individual object and email address to the
Contact Phone Email object.
Map all fields to the Individual object, adding a custom field for
the email address.
Overall explanation
To support identity resolution in Data Cloud, the fields from the Master
Customer table should be mapped to the standard data model objects
that are designed for this purpose. The Individual object is used to store
the name and other personally identifiable information (PII) of a
customer, while the Contact Phone Email object is used to store the
primary email address and other contact information of a customer.
These objects are linked by a relationship field that indicates the contact
information belongs to the individual. By mapping the fields to these
objects, Data Cloud can use the identity resolution rules to match and
reconcile the profiles from different sources based on the name and
email address fields. The other options are not recommended because
they either create a new custom object that is not part of the standard
data model, or map all fields to the Customer object that is not intended
for identity resolution, or map all fields to the Individual object that does
not have a standard email address field. References: Data Modeling
Requirements for Identity Resolution, Create Unified Individual Profiles
Question 25
A customer needs to integrate in real time with Salesforce CRM.
Which feature should the consultant recommend?
Correct answer
Streaming transforms.
Data model triggers.
Sales and Service bundle.
Data actions and Lightning web components
Overall explanation
Streaming transforms are a feature of Data Cloud that allows real-time
data integration with Salesforce CRM. Streaming transforms use the
Data Cloud Streaming API to synchronize micro-batches of updates
between the CRM data source and Data Cloud in near-real time.
Streaming transforms enable Data Cloud to have the most current and
accurate CRM data for segmentation and activation.
Question 26
Northern Trail Outfitters (NTO) wants to send a promotional
campaign for customers that have purchased within the past 6
months. The consultant created a segment to meet this
requirement. NTO now also wants to remove customers who
made a purchase within the last week.
What should the consultant use to remove these recent customers?
Batch transforms
Correct answer
Segmentation exclude rules
Related attributes
Streaming insight
Overall explanation
The consultant should use segmentation exclude rules to remove the
recent customers. Segmentation exclude rules are filters that can be
applied to a segment to exclude records that meet certain criteria. The
consultant can use segmentation exclude rules to exclude customers
who have made purchases within the last week from the segment that
contains customers who have purchased within the past 6 months. This
way, the segment will only include customers who are eligible for the
promotional campaign.
The other options are not correct. Option A is incorrect because batch
transforms are data processing tasks that can be applied to data
streams or data lake objects to modify or enrich the data. Batch
transforms are not used for segmentation or activation. Option C is
incorrect because related attributes are attributes that are derived from
the relationships between data model objects. Related attributes are not
used for excluding records from a segment. Option D is incorrect
because streaming insights are derived attributes that are calculated at
the time of data ingestion. Streaming insights are not used for excluding
records from a segment. References: Salesforce Data Cloud Consultant
Exam Guide, Segmentation, Segmentation Exclude Rules
Question 27
A customer is concerned that the consolidation rate displayed in
the identity resolution is quite low compared to their initial
estimations.
What should the consultant do to increase the consolidation rate?
By increasing the number of match rules, you can increase the chances
of finding matches between source profiles and thus increase the
consolidation rate. On the other hand, changing reconciliation rules,
including additional attributes, or reducing the number of match rules
can decrease the consolidation rate, as they can either reduce the
number of matches or increase the number of unified profiles.
References: Identity Resolution Calculated Insight: Consolidation Rates
for Unified Profiles, Identity Resolution Ruleset Processing Results,
Configure Identity Resolution Rulesets
Question 28
A customer has multiple team members who create segment
audiences that work in different time zones. One team member
works at the home office in the Pacific time zone, which matches
the org Time Zone setting.
Which user will see their home time zone in the segment and
activation schedule areas?
Which two considerations apply to data deletion requests
submitted to Data Cloud?
Choose 2 answers
Correct selection
Data deletion requests are submitted for Individual profiles.
Data deletion requests submitted to Data Cloud are passed to
all connected Salesforce clouds.
Correct selection
Data deletion requests are reprocessed at 30, 60, and 90 days.
Data deletion requests are processed within 1 hour.
Overall explanation
A. Data deletion requests are submitted for Individual profiles.
This is correct because the Consent API in Salesforce Data Cloud
allows for handling data deletion requests specifically for Individual
profiles, ensuring compliance with privacy regulations like the right
to be forgotten.
Which feature should the consultant use for this use
case?
Dashboard
Flow
Correct answer
Activation alert
Report
Overall explanation
The feature that the consultant should use for this use case is an
activation alert. Activation alerts are notifications that are sent to
users when an activation fails or succeeds for a segment. Activation
alerts can be configured in the Activation Settings page, where the
consultant can specify the recipients, the frequency, and the conditions
for sending the alerts. Activation alerts can help the customer to monitor
the status of their activations and troubleshoot any issues that may
arise. References: Salesforce Data Cloud Consultant Exam Guide,
Activation Alerts
Question 33
Cumulus Financial created a segment called High Investment
Balance Customers. This is a foundational segment that includes
several segmentation criteria the marketing team should
consistently use.
Question 34
An administrator wants to be able to create a multi-dimensional
metric to identify unified individual lifetime value (LTV). Which
sequence of DMO joins is necessary within the Calculated
Insight to enable this calculation?
Unified Individual > Individual > Sales Order
Correct answer
Unified Individual > Unified Link Individual > Sales Order
Sales Order > Unified Individual
Sales Order > Individual > Unified Individual
Overall explanation
To create a multi-dimensional metric to identify unified individual
lifetime value (LTV), the administrator needs to join the following data
model objects (DMOs) in the Calculated Insight: Unified Individual: This
DMO represents the unified profile of an individual, which contains
attributes from multiple sources. Unified Link Individual: This DMO
represents the link between an Individual DMO and a Unified Individual
DMO. Sales Order: This DMO represents a transaction or purchase made
by an individual. The sequence of joins should start from the Unified
Individual DMO, then join the Unified Link Individual DMO using the
UnifiedIndividualId field, and then join the Sales Order DMO using the
IndividualId field. This way, the administrator can access the sales order
data for each unified individual and calculate their lifetime value.
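Calculated insights are defined in SQL over the DMOs, so the join path above could be sketched roughly as follows; the DMO and field API names are simplified placeholders (actual org names typically carry prefixes such as ssot__), so treat this as an assumption-level illustration:

```python
# Illustrative SQL for the join path
# Unified Individual > Unified Link Individual > Sales Order.
LTV_QUERY = """
SELECT
    ui.Id                    AS UnifiedIndividualId,
    SUM(so.GrandTotalAmount) AS LifetimeValue
FROM UnifiedIndividual ui
JOIN UnifiedLinkIndividual uli
    ON uli.UnifiedIndividualId = ui.Id     -- unified profile -> link
JOIN SalesOrder so
    ON so.IndividualId = uli.IndividualId  -- link -> source transactions
GROUP BY ui.Id
"""
print(LTV_QUERY)
```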
Question 35
Which permission setting should a consultant check if the
custom Salesforce CRM object is not available in New Data
Stream configuration?
Confirm the Create object permission is enabled in the Data
Cloud org.
Correct answer
Confirm the View All object permission is enabled in the source
Salesforce CRM org.
Confirm the Ingest Object permission is enabled in the
Salesforce CRM org.
Confirm that the Modify Object permission is enabled in the
Data Cloud org.
Overall explanation
To create a new data stream from a custom Salesforce CRM object, the
consultant needs to confirm that the View All object permission is
enabled in the source Salesforce CRM org. This permission allows the
user to view all records associated with the object, regardless of sharing
settings. Without this permission, the custom object will not be
available in the New Data Stream configuration.
Question 36
Which configuration supports separate Amazon S3 buckets for
data ingestion and activation?
Multiple S3 connectors in Data Cloud setup
Dedicated S3 data sources in Data Cloud setup
Dedicated S3 data sources in activation setup
Correct answer
Separate user credentials for data stream and activation target
Overall explanation
To support separate Amazon S3 buckets for data
ingestion and activation in Salesforce Data Cloud,
configuring separate user credentials for the data stream
(ingestion) and the activation target (output) is essential. By using
distinct credentials, you can assign access permissions for different
buckets, ensuring that:
Data Ingestion uses credentials with access to the source S3
bucket.
Data Activation uses different credentials to access the target S3
bucket.
Question 37
Luxury Retailers created a segment targeting high value
customers that it activates through Marketing Cloud for email
communication. The company notices that the activated count is
smaller than the segment count.
What is a likely reason for this?
Correct answer
Data Cloud enforces the presence of Contact Point for Marketing
Cloud activations. If the individual does not have a related
Contact Point, it will not be activated.
Marketing Cloud activations automatically suppress individuals
who are unengaged and have not opened or clicked on an email
in the last six months.
Marketing Cloud activations only activate those individuals that
already exist in Marketing Cloud.
Question 38
A formula field is updated on a data stream of type upsert that
ingests from Amazon S3.
What is the expected behavior in Data Cloud?
Correct answer
Data Cloud will initiate a full refresh of data from S3 and will
update the formula on all records.
Data Cloud will only update the formula on a go-forward basis
for new records.
Data Cloud does not support formula field updates for data
streams of type upsert.
Data Cloud will update the formula for all records at the next
incremental upsert refresh.
Overall explanation
In the context of S3 Connector and upsert mode, if a formula field is
updated in Data Cloud, it will indeed initiate a full refresh of
data from the S3 bucket and apply the updated formula to all records,
not just new ones.
This is because changing a formula can affect how existing records are
classified or processed, and in order to ensure that the updated formula
is consistently applied across all records, Data Cloud needs to perform a
full refresh and reprocess all the data.
Question 39
Cumulus Financial created a segment called Multiple
Investments that contains individuals who have invested in two
or more mutual funds.
* D. Including Fund Name and Fund Type by default for post processing
in the target system is not a valid option. The consultant needs to add
the related attributes and filters during the activation configuration in
Data Cloud, not after the data is sent to the target system. References:
Add Related Attributes to an Activation - Salesforce, Related Attributes in
Activation - Salesforce, Prepare for Your Salesforce Data Cloud
Consultant Credential
Question 40
Which information is provided in a .csv file when activating to
Amazon S3?
The manifest of origin sources within Data Cloud
The metadata regarding the segment definition
Correct answer
The activated data payload
An audit log showing the user who activated the segment and
when it was activated
Overall explanation
When activating data to Amazon S3, a .csv file typically contains the
actual data payload (e.g., customer IDs, segments, or other attributes)
that is being sent to the destination for further processing or storage.
Question 41
Which two common use cases can be addressed with Data
Cloud?
Choose 2 answers
Govern enterprise data lifecycle through a centralized set of
policies and processes.
Correct selection
Understand and act upon customer data to drive more relevant
experiences.
Safeguard critical business data by serving as a centralized
system for backup and disaster recovery.
Correct selection
Harmonize data from multiple sources with a standardized and
extendable data model.
Overall explanation
Data Cloud is a data platform that can help customers connect, prepare,
harmonize, unify, query, analyze, and act on their data across various
Salesforce and external sources. Common use cases that can be
addressed with Data Cloud include understanding and acting upon
customer data to drive more relevant experiences, and harmonizing
data from multiple sources with a standardized and extendable data
model.
Question 42
Northern Trail Outfitters (NTO) creates a calculated insight to
compute recency, frequency, monetary (RFM) scores on its
unified individuals. NTO then creates a segment based on these
scores that it activates to a Marketing Cloud activation target.
Which two steps should the consultant take when configuring
the activation?
Choose 2 answers
Correct selection
Select contact points.
Add additional attributes.
Correct selection
Choose a segment.
Add the calculated insight in the activation.
Overall explanation
A. Select contact points.
When configuring an activation in Marketing Cloud, selecting the
appropriate contact points (such as email, phone number, or
other identifiers) is necessary to ensure the segment
members can be targeted correctly.
This step ensures that the segment activation matches contacts
with the correct communication channel.
C. Choose a segment.
Question 43
After data streams have been ingested, which process is
required before accurate segmentation and activation can be
performed on unified profiles?
Data Mapping
Correct answer
Identity Resolution
Data Activation
Calculated Insights
Overall explanation
After ingesting data streams, Identity Resolution is required to unify
and link records across different data sources. This process ensures that
all data related to the same individual is consolidated into a single
unified profile. Only after identity resolution can accurate segmentation
and actions be performed on the data.
A. Data Mapping: This happens earlier during data ingestion to
ensure data aligns with the system's schema.
C. Data Activation: This occurs later, after segmentation, when
the data is sent to activation targets.
D. Calculated Insights: These are optional and used to derive
additional metrics or scores from the unified data.
Question 44
Which data model subject area should be used for any
Organization, Individual, or Member in the Customer 360 data
model?
Individual
Global Account
Correct answer
Party
Membership
Overall explanation
The party subject area should be used for any organization, individual, or
member in the Customer 360 data model. It includes information such as
name, address, email, phone, and loyalty membership.
References: https://help.salesforce.com/s/articleView?id=sf.c360_a_data_cloud_party.htm&type=5
Question 45
The Salesforce CRM Connector is configured and the Case object
data stream is set up. Subsequently, a new custom field named
Business Priority is created on the Case object in Salesforce
CRM. However, the new field is not available when trying to add
it to the data stream.
What is the most likely cause of this issue?
Correct answer
The Salesforce Integration User is missing Read permissions on
the newly created field.
The Salesforce Data Loader application should be used to
perform a bulk upload from a desktop.
Custom fields on the Case object are not supported for ingesting
into Data Cloud.
After 24 hours, when the data stream refreshes, it will
automatically include any new fields that were added to the
Salesforce CRM.
Overall explanation
The Salesforce CRM Connector uses the Salesforce Integration User to
access the data from the Salesforce CRM org. The Integration User must
have the Read permission on the fields that are included in the data
stream. If the Integration User does not have the Read permission on the
newly created field, the field will not be available for selection in the
data stream configuration. To resolve this issue, the administrator
should assign the Read permission on the new field to the Integration
User profile or permission set. References: Create a Salesforce CRM Data
Stream, Edit a Data Stream, Salesforce Data Cloud Full Refresh for CRM,
SFMC, or Ingestion API Data Streams
Question 46
Cumulus Financial uses Service Cloud as its CRM and stores
mobile phone, home phone, and work phone as three separate
fields for its customers on the Contact record. The company
plans to use Data Cloud and ingest the Contact object via the
CRM Connector.
What is the most efficient approach that a consultant should
take when ingesting this data to ensure all the different phone
numbers are properly mapped and available for use in
activation?
Ingest the Contact object and map the Work Phone, Mobile
Phone, and Home Phone to the Contact Point Phone data map
object from the Contact data stream.
Correct answer
Ingest the Contact object and use streaming transforms to
normalize the phone numbers from the Contact data stream into
a separate Phone data lake object (DLO) that contains three
rows, and then map this new DLO to the Contact Point Phone
data map object.
Ingest the Contact object and then create a calculated insight to
normalize the phone numbers, and then map to the Contact
Point Phone data map object.
Ingest the Contact object and create formula fields in the
Contact data stream on the phone numbers, and then map to
the Contact Point Phone data map object.
Overall explanation
The most efficient approach that a consultant should take when
ingesting this data, to ensure all the different phone numbers are
properly mapped and available for use in activation, is to ingest the
Contact object and use streaming transforms to normalize the phone
numbers from the Contact data stream into a separate Phone data lake
object (DLO) that contains three rows, and then map this new DLO to
the Contact Point Phone data map object. This approach allows the
consultant to use the streaming transforms feature of Data Cloud, which
enables data manipulation and transformation at the time of ingestion,
without requiring any additional processing or storage. Streaming
transforms can be used to normalize the phone numbers from the
Contact data stream, such as removing spaces, dashes, or parentheses,
and adding country codes if needed. The normalized phone numbers can
then be stored in a separate Phone DLO, which can have one row for
each phone number type (work, home, mobile). The Phone DLO can then
be mapped to the Contact Point Phone data map object, which is a
standard object that represents a phone number associated with a
contact point.
This way, the consultant can ensure that all the phone numbers are
available for activation, such as sending SMS messages or making calls
to the customers.
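Streaming transforms are configured declaratively in Data Cloud, but the row-splitting and normalization logic they would apply can be sketched in Python as below (the field names and output shape are assumptions):

```python
import re

def normalize(raw: str) -> str:
    # Strip spaces, dashes, and parentheses; keep digits and a leading +.
    return re.sub(r"[^\d+]", "", raw or "")

def split_contact(contact: dict) -> list[dict]:
    # One Contact row with three phone fields becomes three Phone rows,
    # mirroring the transform that feeds the hypothetical Phone DLO.
    rows = []
    for field, phone_type in [("MobilePhone", "Mobile"),
                              ("HomePhone", "Home"),
                              ("Phone", "Work")]:
        value = contact.get(field)
        if value:
            rows.append({"ContactId": contact["Id"],
                         "PhoneType": phone_type,
                         "PhoneNumber": normalize(value)})
    return rows

print(split_contact({"Id": "003XX", "MobilePhone": "(555) 123-4567",
                     "HomePhone": "555 987 6543", "Phone": "555-111-2222"}))
```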
Question 47
What does the Ignore Empty Value option do in Identity
Resolution?
Ignores Individual object records with empty fields when
running Identity Resolution rules
Ignores empty fields when running any custom match rules
Correct answer
Ignores empty fields when running reconciliation rules
Ignores empty fields when running the standard match rules
Overall explanation
The Ignore Empty Value option is a setting for reconciliation rules, which
determine the logic for data selection in a unified profile. If this option
is enabled, empty fields are ignored when applying the reconciliation
rule. For example, if the rule is to select the most frequent value, and
one of the values is empty, it will not be counted as a frequency.
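The effect on a most-frequent reconciliation rule can be sketched as follows (pure illustration; the rule itself is configured in identity resolution, not written in code):

```python
from collections import Counter

def most_frequent(values: list[str], ignore_empty: bool) -> str | None:
    # With ignore_empty enabled, blanks never count toward frequency.
    candidates = [v for v in values if v] if ignore_empty else values
    if not candidates:
        return None
    return Counter(candidates).most_common(1)[0][0]

sources = ["", "", "Dallas", "Austin"]
print(repr(most_frequent(sources, ignore_empty=False)))  # '' - blanks dominate
print(repr(most_frequent(sources, ignore_empty=True)))   # 'Dallas'
```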
Question 48
What does the Source Sequence reconciliation rule do in identity
resolution?
Identifies which data sources should be used in the process of
reconciliation by prioritizing the most recently updated data
source
Includes data from sources where the data is most frequently
occurring
Correct answer
Sets the priority of specific data sources when building
attributes in a unified profile, such as a first or last name
Identifies which individual records should be merged into a
unified profile by setting a priority for specific data sources
Overall explanation
The Source Sequence reconciliation rule in Identity Resolution
determines the priority order of data sources when building the unified
profile. It specifies which source should take precedence for attributes
(like first name, last name, or email) when multiple sources provide
conflicting or overlapping data. This ensures consistency and accuracy in
the unified profile based on predefined priorities.
Question 49
Cumulus Financial wants to be able to track the daily
transaction volume of each of its customers in real time and
send out a notification as soon as it detects volume outside a
customer's normal range.
For example, you can use the Sales Order Line Item DMO to associate
each product in an order with its product family, and then use the Sales
Order Revenue DMO to calculate the total revenue or quantity for each
product family in an opportunity. References: Sales Order Subject Area,
Sales Order Revenue DMO Reference
Question 51
Which method should a consultant use when performing
aggregations in windows of 15 minutes on data collected via the
Interaction SDK or Mobile SDK?
Batch transform
Calculated insight
Correct answer
Streaming insight
Formula fields
Overall explanation
Streaming insight is a method that allows you to perform aggregations
in windows of 15 minutes on data collected via the Interaction SDK or
Mobile SDK. Streaming insight is a feature that enables you to create
real-time metrics and insights based on streaming data from various
sources, such as web, mobile, or IoT devices. Streaming insight allows
you to define aggregation rules, such as count, sum, average, min, max,
or percentile, and apply them to streaming data in time windows of 15
minutes. For example, you can use streaming insight to calculate the
number of visitors, the average session duration, or the conversion rate
for your website or app in 15-minute intervals. Streaming insight also
allows you to visualize and explore the aggregated data in dashboards,
charts, or tables. References: Streaming Insight, Create Streaming
Insights
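Conceptually, the windowing behaves like the sketch below, which floors event timestamps to 15-minute buckets and counts them; this illustrates the aggregation semantics only, not Data Cloud's implementation:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def window_start(ts: datetime) -> datetime:
    # Floor a timestamp to the start of its 15-minute window.
    return ts - timedelta(minutes=ts.minute % 15,
                          seconds=ts.second,
                          microseconds=ts.microsecond)

events = [datetime(2024, 5, 1, 9, 2),
          datetime(2024, 5, 1, 9, 14),
          datetime(2024, 5, 1, 9, 16)]

counts = defaultdict(int)
for ts in events:
    counts[window_start(ts)] += 1

for start, n in sorted(counts.items()):
    print(start, "->", n)  # 09:00 window has 2 events, 09:15 window has 1
```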
Question 52
Northern Trail Outfitters wants to use some of its Marketing
Cloud data in Data Cloud.
Which engagement channel data will require custom
integration?
SMS
Correct answer
CloudPage
Mobile push
Email
Overall explanation
CloudPage is a web page that can be personalized and hosted by
Marketing Cloud. It is not one of the standard engagement channels that
Data Cloud supports out of the box. To use CloudPage data in Data
Cloud, a custom integration is required. The other engagement channels
(SMS, email, and mobile push) are supported by Data Cloud and can be
integrated using the Marketing Cloud Connector or the Marketing Cloud
API. References: Data Cloud Overview, Marketing Cloud Connector,
Marketing Cloud API
Question 54
How does Data Cloud handle an individual's Right to be
Forgotten?
Deletes the specified Individual and records from any data
source object mapped to the Individual data model object.
Correct answer
Deletes the specified Individual and records from any data
model object/data lake object related to the Individual.
Deletes the records from all data source objects, and any
downstream data model objects are updated at the next
scheduled ingestion.
Deletes the specified Individual record and its Unified Individual
Link record.
Overall explanation
Data Cloud handles an individual's Right to be Forgotten by deleting the
specified Individual and records from any data model object/data lake
object related to the Individual. This means that Data Cloud removes all
the data associated with the individual from the data space, including
the data from the source objects, the unified individual profile, and any
related objects. Data Cloud also deletes the Unified Individual Link
record that links the individual to the source records. Data Cloud uses
the Consent API to process the Right to be Forgotten requests, which are
reprocessed at 30, 60, and 90 days to ensure a full deletion.
The other options are not correct descriptions of how Data Cloud handles
an individual's Right to be Forgotten. Data Cloud does not delete the
records from all data source objects, as this would affect the data
integrity and availability of the source systems. Data Cloud also does not
delete only the specified Individual record and its Unified Individual Link
record, as this would leave the source records and the related records
intact. Data Cloud also does not delete only the specified Individual and
records from any data source object mapped to the Individual data
model object, as this would leave the related records intact.
Question 55
Which two actions can a consultant take during the project planning
phase to ensure client stakeholder goals are met? (Choose two.)
Create scheduled dashboard to be sent weekly to all
stakeholders.
Correct selection
Establish a stakeholder committee and meeting schedule.
Ensure the project key performance indicators are profitable.
Correct selection
Acquire the client stakeholder's key performance indicators.
Overall explanation
1. B. Establish a stakeholder committee and meeting
schedule: Regular communication with stakeholders ensures
alignment, timely updates, and the ability to address concerns or
changes in goals during the project. A well-defined meeting
schedule keeps everyone informed and engaged.
2. D. Acquire the client stakeholder's key performance
indicators (KPIs): Understanding the KPIs is critical to aligning
the project outcomes with the client’s goals. These metrics provide
a clear measure of success and help prioritize project activities to
meet stakeholder expectations.
Question 56
Cumulus Financial uses calculated insights to compute the total
banking value per branch for its high net worth customers. In
the calculated insight, "banking value" is a metric, "branch" is a
dimension, and "high net worth" is a filter.
Question 57
A customer requests that their personal data be deleted.
Question 58
Which data stream category should be assigned to use the data
for time-based operations in segmentation and calculated
insights?
Sales Order
Individual
Transaction
Correct answer
Engagement
Overall explanation
The Engagement data stream category is specifically designed for
time-based operations such as segmentation and calculated insights. It
includes data related to interactions, events, or activities that occur over
time (e.g., clicks, views, or other user engagement metrics). This allows
for time-series analysis, trend tracking, and using temporal data in
segmentation logic or insights.
Question 59
Northern Trail Outfitters asks its consultant to extract the
runner profiles and activity logs from its Track My Run mobile
app and load them into Data Cloud. The marketing department
also indicates that they need the last 90 days of historical data
and want all new and updated data as it becomes available on a
go-forward basis.
As best practice, which sequence of actions should the
consultant use to implement this request?
Use bulk ingestion to first load the last 90 days of data, and also
subsequently use bulk ingestion to synchronize the future data
as it becomes available.
Use streaming ingestion to first load the last 90 days of data,
and also subsequently use streaming ingestion to synchronize
future data as it becomes available.
Use streaming ingestion to first load the last 90 days of data,
and then use bulk ingestion to synchronize future data as it
becomes available.
Correct answer
Use bulk ingestion to first load the last 90 days of data, and
then use streaming ingestion to synchronize future data as it
becomes available.
Overall explanation
1. Bulk ingestion is best suited for loading large amounts of
historical data efficiently. Since Northern Trail Outfitters needs 90
days of historical data, this should be ingested using a bulk
ingestion process.
2. Streaming ingestion is ideal for capturing real-time
updates and ensuring new and updated data is ingested as it
becomes available. Once the historical data is loaded, streaming
ingestion should be implemented to synchronize and
continuously ingest future records.
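Using the Ingestion API, the two-phase pattern could look roughly like this sketch; the endpoint shapes, source and object names, and compressed bulk-job flow are assumptions, not a verified call sequence:

```python
import requests

BASE = "https://example-tenant.c360a.salesforce.com/api/v1/ingest"
HEADERS = {"Authorization": "Bearer ACCESS_TOKEN"}

# Phase 1 (backfill): a bulk job loads the last 90 days of history.
job = requests.post(f"{BASE}/jobs",
                    json={"object": "RunActivity",
                          "sourceName": "TrackMyRun",
                          "operation": "upsert"},
                    headers=HEADERS, timeout=30).json()

with open("runs_last_90_days.csv", "rb") as f:
    requests.put(f"{BASE}/jobs/{job['id']}/batches", data=f,
                 headers={**HEADERS, "Content-Type": "text/csv"},
                 timeout=300)

requests.patch(f"{BASE}/jobs/{job['id']}",
               json={"state": "UploadComplete"},
               headers=HEADERS, timeout=30)

# Phase 2 (go-forward): stream each new or updated record as it arrives.
record = {"data": [{"runnerId": "R-1", "distanceKm": 5.2,
                    "loggedAt": "2024-05-01T09:00:00Z"}]}
requests.post(f"{BASE}/sources/TrackMyRun/RunActivity",
              json=record, headers=HEADERS, timeout=30)
```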
Question 60
A consultant is building a segment to announce a new product
launch for customers that have previously purchased black
pants.
How should the consultant place the attributes for product color
and product type from the Order Product object to meet these
criteria?
Place an attribute for the "black pants" calculated insight to
dynamically apply.
Place the attribute for product color in one container and the
attribute for product type in another container.
Place the attributes for product color and product type as direct
attributes.
Correct answer
Place the attributes for product color and product type in a
single container.
Overall explanation
To meet the criteria of targeting customers who have previously
purchased black pants, placing both the product color and product
type attributes in a single container is the most effective approach.
This ensures that the segment logic can check both attributes together
to identify customers who meet the combined criteria (e.g., product
color = black and product type = pants).
Question 61
A consultant wants to ensure that every segment managed by
multiple brand teams adheres to the same set of exclusion
criteria, that are updated on a monthly basis.
Question 62
What should a user do to pause a segment activation with the
intent of using that segment again?
Skip the activation
Correct answer
Stop the publish schedule.
Deactivate the segment.
Delete the segment.
Overall explanation
If a user wants to pause a segment activation but intends to use the
segment again, the best approach is to stop the publish schedule. This
halts the segment's activation process without deleting or deactivating
the segment, allowing the user to resume the activation later or use the
segment as needed.
Question 63
A consultant is ingesting a list of employees from their human
resources database that they want to segment on.
Which data stream category should the consultant choose when
ingesting this data?
Contact Data
Other Data
Engagement Data
Correct answer
Profile Data
Overall explanation
When ingesting a list of employees from the human resources database
for segmentation, the appropriate data stream category is Profile Data.
This category is used to store and manage information about individuals,
such as employees in this case. Profile data typically includes personal
attributes (name, job title, department, etc.) that can be used for
segmentation.
A. Contact Data: This is generally used for storing contact-
related information (like email addresses or phone numbers), not
detailed personal or employee attributes.
B. Other Data: This category is typically used for non-standard
data that doesn't fit neatly into other categories.
C. Engagement Data: This refers to data related to user
interactions or engagements, like website visits or email opens,
which wouldn't be relevant for employee information.
Thus, Profile Data is the most appropriate category for ingesting and
segmenting employee information.
Question 64
A consultant wants to build a new audience in Data Cloud.
Which three criteria can the consultant use when building the
segment?
Choose 3 answers
Correct selection
Related attributes
Correct selection
Direct attributes
Streaming insights
Correct selection
Calculated insights
Data stream attributes
Overall explanation
A. Related attributes: These are attributes from related objects or
entities, such as data linked through relationships (e.g., customer
demographic data or transaction details related to an individual).
B. Direct attributes: These are attributes directly associated with the
primary object being segmented (e.g., fields on the unified individual
record like age, gender, or location).
C. Calculated insights: These are metrics computed from the data,
such as lifetime value or RFM scores, that can also be used as
segmentation criteria.
Question 65
Which two dependencies prevent a data stream from being
deleted? (Choose two.)
Correct selection
The underlying data lake object is mapped to a data model
object.
The underlying data lake object is used in segmentation.
Correct selection
The underlying data lake object is used in a data transform.
The underlying data lake object is used in activation.
Overall explanation
To delete a data stream in Data Cloud, the underlying data lake object
(DLO) must not have any dependencies or references to other objects or
processes. The following two dependencies prevent a data stream from
being deleted:
Data transform: This is a process that transforms the ingested
data into a standardized format and structure for the data model.
A data transform can use one or more DLOs as input or output. If a
DLO is used in a data transform, it cannot be deleted until the data
transform is removed or modified.