Salesforce Data-Cloud-Consultant Exam (page: 4)
Salesforce Certified Data Cloud Consultant
Updated on: 25-Dec-2025

Northern Trail Outfitters (NTO), an outdoor lifestyle clothing brand, recently started a new line of business. The new business specializes in gourmet camping food. For business reasons as well as security reasons, it's important to NTO to keep all Data Cloud data separated by brand.
Which capability best supports NTO's desire to separate its data by brand?

  A. Data streams for each brand
  B. Data model objects for each brand
  C. Data spaces for each brand
  D. Data sources for each brand

Answer(s): C

Explanation:

Data spaces are logical containers that allow you to separate and organize your data by different criteria, such as brand, region, product, or business unit. Data spaces help you manage data access, security, and governance, and they enable cross-cloud data integration and activation. For NTO, data spaces support the requirement to separate data by brand, so that the outdoor lifestyle clothing and gourmet camping food businesses can each have their own data models, rules, and insights. Data spaces can also help NTO comply with any data privacy and security regulations that apply to its different brands.

The other options are incorrect because they do not provide the same level of data separation and organization as data spaces. Data streams are used to ingest data from different sources into Data Cloud, but they do not separate the data by brand. Data model objects are used to define the structure and attributes of the data, but they do not isolate the data by brand. Data sources are used to identify the origin and type of the data, but they do not partition the data by brand.
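To make the idea of brand-level isolation concrete, the Python sketch below models a data space as a simple per-brand container whose records are never visible to the other brand. The class and brand names (DataSpace, NTO_Outdoor, NTO_Gourmet) are assumptions for this example, not Data Cloud objects or APIs.

```python
# Illustrative sketch only: each brand's records live in their own logical
# container, and any lookup is scoped to a single space.
from dataclasses import dataclass, field


@dataclass
class DataSpace:
    name: str
    records: list = field(default_factory=list)

    def ingest(self, record):
        # Records added here are only visible within this space.
        self.records.append(record)


spaces = {
    "NTO_Outdoor": DataSpace("NTO_Outdoor"),   # outdoor lifestyle clothing
    "NTO_Gourmet": DataSpace("NTO_Gourmet"),   # gourmet camping food
}

spaces["NTO_Outdoor"].ingest({"email": "a@example.com", "product": "rain jacket"})
spaces["NTO_Gourmet"].ingest({"email": "b@example.com", "product": "trail risotto"})

# Segmentation or activation scoped to one space never sees the other brand's rows.
print([r["product"] for r in spaces["NTO_Gourmet"].records])  # ['trail risotto']
```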


Reference:

Data Spaces Overview, Create Data Spaces, Data Privacy and Security in Data Cloud, Data Streams Overview, Data Model Objects Overview, Data Sources Overview



Cumulus Financial created a segment called High Investment Balance Customers. This is a foundational segment that includes several segmentation criteria the marketing team should consistently use.
Which feature should the consultant suggest the marketing team use to ensure this consistency when creating future, more refined segments?

  A. Create new segments using nested segments.
  B. Create a High Investment Balance calculated insight.
  C. Package High Investment Balance Customers in a data kit.
  D. Create new segments by cloning High Investment Balance Customers.

Answer(s): A

Explanation:

Nested segments are segments that include or exclude one or more existing segments. They allow the marketing team to reuse filters and maintain consistency in their data by using an existing segment to build a new one. For example, the marketing team can create a nested segment that includes High Investment Balance Customers and excludes customers who have opted out of email marketing. This way, they can leverage the foundational segment and apply additional criteria without duplicating the rules. The other options are not the best features to ensure consistency because:
B. A calculated insight is a data object that performs calculations on data lake objects or CRM data and returns a result. It is not a segment and cannot be used for activation or personalization.
C. A data kit is a bundle of packageable metadata that can be exported and imported across Data Cloud orgs. It is not a feature for creating segments, but rather for sharing components.
D. Cloning a segment creates a copy of the segment with the same rules and filters. It does not allow the marketing team to add or remove criteria from the original segment, and it may create confusion and redundancy.
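As an illustration of the include/exclude behavior, the Python sketch below treats segments as sets of customer IDs: the foundational criteria are defined once, and the nested segment reuses them. The field names, thresholds, and data are invented for this example and do not reflect how Data Cloud evaluates segments internally.

```python
# Illustrative sketch of nested-segment logic as set operations.
customers = [
    {"id": 1, "investment_balance": 750_000, "email_opt_out": False},
    {"id": 2, "investment_balance": 820_000, "email_opt_out": True},
    {"id": 3, "investment_balance": 40_000,  "email_opt_out": False},
]

# Foundational segment: the qualifying criteria are defined once here...
high_balance = {c["id"] for c in customers if c["investment_balance"] >= 500_000}

# ...and the nested segment reuses it, layering on an exclusion instead of
# restating the balance rules.
opted_out = {c["id"] for c in customers if c["email_opt_out"]}
high_balance_emailable = high_balance - opted_out

print(high_balance_emailable)  # {1}
```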


Reference:

Create a Nested Segment - Salesforce, Save Time with Nested Segments (Generally Available) - Salesforce, Calculated Insights - Salesforce, Create and Publish a Data Kit Unit | Salesforce Trailhead, Create a Segment in Data Cloud - Salesforce



Cumulus Financial uses Service Cloud as its CRM and stores mobile phone, home phone, and work phone as three separate fields for its customers on the Contact record. The company plans to use Data Cloud and ingest the Contact object via the CRM Connector.
What is the most efficient approach that a consultant should take when ingesting this data to ensure all the different phone numbers are properly mapped and available for use in activation?

  A. Ingest the Contact object and map the Work Phone, Mobile Phone, and Home Phone to the Contact Point Phone data map object from the Contact data stream.
  B. Ingest the Contact object and use streaming transforms to normalize the phone numbers from the Contact data stream into a separate Phone data lake object (DLO) that contains three rows, and then map this new DLO to the Contact Point Phone data map object.
  C. Ingest the Contact object and then create a calculated insight to normalize the phone numbers, and then map to the Contact Point Phone data map object.
  D. Ingest the Contact object and create formula fields in the Contact data stream on the phone numbers, and then map to the Contact Point Phone data map object.

Answer(s): B

Explanation:

The most efficient approach is to ingest the Contact object and use streaming transforms to normalize the phone numbers from the Contact data stream into a separate Phone data lake object (DLO), and then map that DLO to the Contact Point Phone data map object. Streaming transforms enable data manipulation and transformation at the time of ingestion, without requiring any additional processing or storage. They can be used to normalize the phone numbers from the Contact data stream, such as removing spaces, dashes, or parentheses, and adding country codes if needed. The normalized phone numbers can then be stored in a separate Phone DLO, with one row for each phone number type (work, home, mobile). The Phone DLO can then be mapped to the Contact Point Phone data map object, a standard object that represents a phone number associated with a contact point. This ensures that all the phone numbers are available for activation, such as sending SMS messages or making calls to customers.

The other options are not as efficient. Option A does not normalize the phone numbers, which may cause issues with activation or identity resolution. Option C requires creating a calculated insight, an additional step that consumes more resources and time than streaming transforms. Option D requires creating formula fields in the Contact data stream, which may not be supported by the CRM Connector or may conflict with the existing fields on the Contact object.
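The transform itself is configured in Data Cloud, but the reshaping it performs is roughly equivalent to the Python sketch below: unpivot the three phone fields on each Contact row into one normalized row per phone type. The field and object names are assumptions for illustration only.

```python
# Rough, illustrative equivalent of the unpivot-and-normalize step.
import re


def normalize(raw):
    """Strip spaces, dashes, and parentheses, keeping only digits and '+'."""
    if not raw:
        return None
    cleaned = re.sub(r"[^\d+]", "", raw)
    return cleaned or None


def unpivot_phones(contact):
    """Turn one Contact row into one row per populated phone field."""
    rows = []
    for source_field, phone_type in [("Phone", "Work"),
                                     ("MobilePhone", "Mobile"),
                                     ("HomePhone", "Home")]:
        number = normalize(contact.get(source_field))
        if number:
            rows.append({"ContactId": contact["Id"],
                         "PhoneType": phone_type,
                         "PhoneNumber": number})
    return rows


contact = {"Id": "003XXXXXXXXXXXX", "Phone": "(415) 555-0100",
           "MobilePhone": "415-555-0101", "HomePhone": None}
print(unpivot_phones(contact))
# Two rows come back (Work and Mobile); the empty Home field is skipped.
```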


Reference:

Salesforce Data Cloud Consultant Exam Guide, Data Ingestion and Modeling, Streaming Transforms, Contact Point Phone



A customer has a Master Customer table from their CRM to ingest into Data Cloud. The table contains a name and primary email address, along with other personally identifiable information (PII).
How should the fields be mapped to support identity resolution?

  A. Create a new custom object with fields that directly match the incoming table.
  B. Map all fields to the Customer object.
  C. Map name to the Individual object and email address to the Contact Point Email object.
  D. Map all fields to the Individual object, adding a custom field for the email address.

Answer(s): C

Explanation:

To support identity resolution in Data Cloud, the fields from the Master Customer table should be mapped to the standard data model objects designed for this purpose. The Individual object stores the name and other personally identifiable information (PII) of a customer, while the Contact Point Email object stores the primary email address and other contact information. These objects are linked by a relationship field indicating that the contact information belongs to the individual. By mapping the fields to these objects, Data Cloud can use identity resolution rules to match and reconcile profiles from different sources based on the name and email address fields.

The other options are not recommended: they either create a new custom object that is not part of the standard data model, map all fields to the Customer object, which is not intended for identity resolution, or map all fields to the Individual object, which does not have a standard email address field.
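As a rough illustration of this mapping (not an actual Data Cloud API), the Python sketch below splits one source row into an Individual-style record and a Contact Point Email-style record joined by the same party identifier. All field names are assumed for the example and are not guaranteed to match the Data Cloud data model exactly.

```python
# Illustrative sketch only: one Master Customer row becomes two target records.
def map_master_customer(row):
    individual = {
        "IndividualId": row["CustomerId"],
        "FirstName": row["FirstName"],
        "LastName": row["LastName"],
    }
    contact_point_email = {
        "PartyId": row["CustomerId"],  # relationship back to the Individual
        "EmailAddress": row["PrimaryEmail"].strip().lower(),
    }
    return individual, contact_point_email


individual, contact_point_email = map_master_customer({
    "CustomerId": "MC-001",
    "FirstName": "Avery",
    "LastName": "Nguyen",
    "PrimaryEmail": " Avery.Nguyen@example.com ",
})
print(individual)
print(contact_point_email)
```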


Reference:

Data Modeling Requirements for Identity Resolution, Create Unified Individual Profiles



Cloud Kicks received a Request to be Forgotten by a customer. In which two ways should a consultant use Data Cloud to honor this request? Choose 2 answers

  A. Delete the data from the incoming data stream and perform a full refresh.
  B. Add the Individual ID to a headerless file and use the delete from file functionality.
  C. Use Data Explorer to locate and manually remove the Individual.
  D. Use the Consent API to suppress processing and delete the Individual and related records from source data streams.

Answer(s): B,D

Explanation:

To honor a Request to be Forgotten by a customer, a consultant should use Data Cloud in two ways:
Add the Individual ID to a headerless file and use the delete from file functionality. This option allows the consultant to delete multiple Individuals from Data Cloud by uploading a CSV file with their IDs. The deletion process is asynchronous and can take up to 24 hours to complete.

Use the Consent API to suppress processing and delete the Individual and related records from source data streams. This option allows the consultant to submit a Data Deletion request for an Individual profile in Data Cloud using the Consent API. A Data Deletion request deletes the specified Individual entity and any entities where a relationship has been defined between that entity's identifying attribute and the Individual ID attribute. The deletion process is reprocessed at 30, 60, and 90 days to ensure a full deletion.

The other options are not correct because deleting the data from the incoming data stream and performing a full refresh will not delete the existing data in Data Cloud, only the new data from the source system, and using Data Explorer to locate and manually remove the Individual will not delete the related records from the source data streams, only the Individual entity in Data Cloud.
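As a minimal sketch of the first option, the Python snippet below prepares a headerless CSV with one Individual ID per line, matching the "headerless file" described above. The IDs and file name are placeholders, and the upload itself is performed through the delete from file functionality in Data Cloud.

```python
# Minimal sketch: write Individual IDs only, with no header row.
import csv

individual_ids = ["0PKxx0000001ABC", "0PKxx0000001DEF"]  # placeholder IDs

with open("individuals_to_delete.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for individual_id in individual_ids:
        writer.writerow([individual_id])  # one ID per line; no header is written
```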


Reference:

Delete Individuals from Data Cloud
Requesting Data Deletion or Right to Be Forgotten
Data Refresh for Data Cloud
Data Explorer



Viewing Page 4 of 35



Share your comments for Salesforce Data-Cloud-Consultant exam with other users:

LOL what a joke 9/10/2023 9:09:00 AM

some of the answers are incorrect, i would be wary of using this until an admin goes back and reviews all the answers
UNITED STATES


Muhammad Rawish Siddiqui 12/9/2023 7:40:00 AM

question # 267: federated operating model is also correct.
SAUDI ARABIA


Mayar 9/22/2023 4:58:00 AM

it helps a lot.
Anonymous


Sandeep 7/25/2022 11:58:00 PM

the questions from this braindump are the same as in the real exam. my passing mark was 84%.
INDIA


Eman Sawalha 6/10/2023 6:09:00 AM

it is an exam that measures your understanding of cloud computing resources provided by aws. these resources are aligned under 6 categories: storage, compute, database, infrastructure, pricing and network, with all of the services and types of services under each category.
GREECE


Mars 11/16/2023 1:53:00 AM

good and very useful
TAIWAN PROVINCE OF CHINA


ronaldo7 10/24/2023 5:34:00 AM

i cleared the az-104 exam by scoring 930/1000 on the exam. it was all possible due to this platform as it provides premium quality service. thank you!
UNITED STATES


Palash Ghosh 9/11/2023 8:30:00 AM

easy questions
Anonymous


Noor 10/2/2023 7:48:00 AM

could you please upload ad0-127 dumps
INDIA


Kotesh 7/27/2023 2:30:00 AM

good content
Anonymous


Biswa 11/20/2023 9:07:00 AM

understanding about joins
Anonymous


Jimmy Lopez 8/25/2023 10:19:00 AM

please upload oracle cloud infrastructure 2023 foundations associate exam braindumps. thank you.
Anonymous


Lily 4/24/2023 10:50:00 PM

questions made studying easy and enjoyable, passed on the first try!
UNITED STATES


John 8/7/2023 12:12:00 AM

has anyone recently attended safe 6.0 exam? did you see any questions from here?
Anonymous


Big Dog 6/24/2023 4:47:00 PM

question 13 should be dhcp option 43, right?
UNITED STATES


B.Khan 4/19/2022 9:43:00 PM

the buy 1 get 1 is a great deal. so far i have only gone over the exam. it looks promising. i'll report back once i write my exam.
INDIA


Ganesh 12/24/2023 11:56:00 PM

is this dump good
Anonymous


Albin 10/13/2023 12:37:00 AM

good ................
EUROPEAN UNION


Passed 1/16/2022 9:40:00 AM

passed
GERMANY


Harsh 6/12/2023 1:43:00 PM

yes going good
Anonymous


Salesforce consultant 1/2/2024 1:32:00 PM

good questions for practice
FRANCE


Ridima 9/12/2023 4:18:00 AM

need dump and sap notes for c_s4cpr_2308 - sap certified application associate - sap s/4hana cloud, public edition - sourcing and procurement
Anonymous


Tanvi Rajput 10/6/2023 6:50:00 AM

question 11: d. i personally feel some answers are wrong.
UNITED KINGDOM


Anil 7/18/2023 9:38:00 AM

nice questions
Anonymous


Chris 8/26/2023 1:10:00 AM

looking for c1000-158: ibm cloud technical advocate v4 questions
Anonymous


sachin 6/27/2023 1:22:00 PM

can you share the pdf
Anonymous


Blessious Phiri 8/13/2023 10:26:00 AM

admin ii is real technical stuff
Anonymous


Luis Manuel 7/13/2023 9:30:00 PM

could you post the link
UNITED STATES


vijendra 8/18/2023 7:54:00 AM

hello send me dumps
Anonymous


Simeneh 7/9/2023 8:46:00 AM

it is very nice
Anonymous


john 11/16/2023 5:13:00 PM

i gave the amazon dva-c02 tests today and passed. very helpful.
Anonymous


Tao 11/20/2023 8:53:00 AM

there are incorrect words in the problem statements. for example, in question 1 there is the word "speci c", which should be "specific". in another question there is the word "noti cation", which should be "notification". these mistakes make this site difficult for me to use.
Anonymous


patricks 10/24/2023 6:02:00 AM

passed my az-120 certification exam today with 90% marks. studied using the dumps highly recommended to all.
Anonymous


Ananya 9/14/2023 5:17:00 AM

i need it, plz make it available
UNITED STATES