Free Salesforce-Data-Cloud Exam Braindumps (page: 5)


Cloud Kicks has received a Request to be Forgotten from a customer. In which two ways can Data Cloud honor this request?

  A. Use Data Explorer to locate and manually remove the Individual
  B. Use the Consent API to suppress processing and delete the individual and related records from source data streams
  C. Delete the data from the incoming data stream and perform a full refresh
  D. Add the Individual Id to a headerless file and use the delete from file functionality

Answer(s): B,D

Explanation:

These two approaches allow Data Cloud to honor a customer's Request to be Forgotten. The Consent API lets you set a consent flag for an individual that suppresses further processing of their data and deletes their records from source data streams. The delete-from-file functionality lets you upload a headerless file of Individual IDs whose records are then deleted from Data Cloud.
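For illustration, here is a minimal Python sketch of both approaches. The Consent API endpoint, host, payload shape, and Individual IDs shown here are assumptions made for the example, not the documented contract; consult the Consent API article linked below for the actual specification. The second step only writes the headerless CSV of Individual IDs that the delete-from-file feature expects.

    import csv
    import requests

    # Hypothetical Consent API call to suppress processing and request deletion.
    # The host, path, and payload below are illustrative assumptions.
    ACCESS_TOKEN = "00D..."  # OAuth token with Data Cloud scopes
    TENANT_HOST = "example.c360a.salesforce.com"  # placeholder tenant host
    resp = requests.post(
        f"https://{TENANT_HOST}/api/v1/consent",  # assumed endpoint
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"action": "delete", "individualIds": ["0PK000000000001"]},
    )
    resp.raise_for_status()

    # Delete from file: a headerless file containing only Individual IDs,
    # one per line, ready for the delete-from-file functionality.
    with open("individuals_to_delete.csv", "w", newline="") as f:
        writer = csv.writer(f)
        for individual_id in ["0PK000000000001", "0PK000000000002"]:
            writer.writerow([individual_id])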


Reference:

https://help.salesforce.com/s/articleView?id=sf.c360_a_data_cloud_consent_api.htm&type=5
https://help.salesforce.com/s/articleView?id=sf.c360_a_data_cloud_delete_from_file.htm&type=5



A customer wants to use the transactional data from their data warehouse in Data Cloud. They are only able to export the data via an SFTP site.
What are two recommended ways to bring this data into Data Cloud?

  A. Manually import the file using the Data Import Wizard
  B. Utilize Salesforce's Dataloader application to perform a bulk upload from a desktop
  C. Import the file into Google Cloud Storage and ingest with the Cloud Storage Connector
  D. Import the file into Amazon S3 and ingest with the Cloud Storage Connector

Answer(s): C,D

Explanation:

These two options are the recommended ways to bring transactional data that is only available on an SFTP site into Data Cloud. Because the Cloud Storage Connector ingests files from Google Cloud Storage or Amazon S3 buckets, the exported files should be moved from the SFTP site into one of those bucket types and ingested from there.
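As an illustration of the staging step, the sketch below uploads a file already pulled from the SFTP site into an Amazon S3 bucket using boto3; the bucket name, key, and local file path are hypothetical. The Cloud Storage Connector would then be configured in Data Cloud to ingest from that bucket.

    import boto3

    # Upload the warehouse export (already downloaded from the SFTP site)
    # into the S3 bucket that the Cloud Storage Connector reads from.
    s3 = boto3.client("s3")
    s3.upload_file(
        Filename="/exports/transactions_2024-01-15.csv",  # hypothetical local path
        Bucket="my-data-cloud-staging",                    # hypothetical bucket
        Key="transactions/transactions_2024-01-15.csv",
    )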


Reference:

https://help.salesforce.com/s/articleView?id=sf.c360_a_data_cloud_google_cloud_storage.htm&type=5
https://help.salesforce.com/s/articleView?id=sf.c360_a_data_cloud_amazon_s3.htm&type=5



A segment fails to refresh with the error "Segment references too many Data Lake Objects (DLOs)".
What are two remedies for this issue?

  A. Space out the segment schedules to reduce Data Lake Object load
  B. Refine segmentation criteria to limit up to 5 custom DMOs
  C. Split the segment into smaller segments
  D. Use Calculated Insights in order to reduce the complexity of the segmentation query

Answer(s): A,C

Explanation:

These two remedies can help resolve the error "Segment references too many Data Lake Objects (DLOs)". Spacing out the segment schedules can reduce the concurrent load on the Data Lake Objects and improve performance. Splitting the segment into smaller segments can reduce the number of Data Lake Objects that are referenced by each segment.
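To make the first remedy concrete, here is a small Python sketch that staggers refresh start times so segments sharing Data Lake Objects are not evaluated concurrently. This is purely illustrative arithmetic; Data Cloud segment schedules are configured in the application, and the segment names and window below are hypothetical.

    from datetime import datetime, timedelta

    # Stagger refresh start times at 30-minute intervals so segments
    # that reference the same Data Lake Objects do not run at once.
    segments = ["birthday_campaign", "lapsed_buyers", "high_value"]
    base = datetime(2024, 1, 15, 2, 0)  # 2:00 AM maintenance window
    schedule = {
        name: base + timedelta(minutes=30 * i)
        for i, name in enumerate(segments)
    }
    for name, start in schedule.items():
        print(f"{name}: refresh at {start:%H:%M}")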


Reference:

https://help.salesforce.com/s/articleView?



Which operator can be used to create a segment for a birthday campaign that is evaluated daily?

  A. Is This Year
  B. Is Anniversary Of
  C. Is Between
  D. Is Birthday

Answer(s): B

Explanation:

The Is Anniversary Of operator can be used to create a segment for a birthday campaign that is evaluated daily. It compares a date attribute to the current date and returns true if they share the same month and day, regardless of the year.
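The semantics of that comparison can be sketched in a few lines of Python; this is an illustration of the operator's behavior, not Data Cloud code, and the function name is invented for the example.

    from datetime import date

    def is_anniversary_of(attribute: date, today: date) -> bool:
        # True when month and day match, ignoring the year,
        # which is the comparison Is Anniversary Of performs daily.
        return (attribute.month, attribute.day) == (today.month, today.day)

    # A customer born on 1990-05-14 matches every May 14.
    print(is_anniversary_of(date(1990, 5, 14), date(2024, 5, 14)))  # True
    print(is_anniversary_of(date(1990, 5, 14), date(2024, 5, 15)))  # False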


Reference:

https://help.salesforce.com/s/articleView?id=sf.c360_a_data_cloud_segmentation_operators.htm&type=5






Post your comments and discuss the Salesforce-Data-Cloud exam with other community members:

Sanjna P commented on September 11, 2024
Q29: the correct answer is C, D, and E. Here is the article: https://help.salesforce.com/s/articleView?id=sf.c360_a_edit_segment.htm&type=5