A Data Cloud customer wants to adjust their identity resolution rules to increase the accuracy of matches. Rather than matching on email address, they want to review a rule that joins their CRM Contacts with their Marketing Contacts, where both use the CRM ID as their primary key.
Which two steps should the consultant take to address this new use case?
Choose 2 answers
Answer(s): A,D
To address this new use case, the consultant should map the primary key from the two systems to Party Identification, using CRM ID as the identification name for both, and create a matching rule based on party identification that matches on CRM ID as the party identification name. This ensures that CRM Contacts and Marketing Contacts are matched on their CRM ID, which is a unique identifier for each individual.
By using Party Identification, the consultant can also leverage the benefits of this attribute, such as matching across different entities and sources and handling multiple values for the same individual. The other options are incorrect because they either do not use the CRM ID as the primary key or do not use Party Identification as the attribute type.
Configure Identity Resolution Rulesets, Identity Resolution Match Rules, Data Cloud Identity Resolution Ruleset, Data Cloud Identity Resolution Config Input
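The matching behavior described above can be illustrated with a small sketch: index one set of contacts by its Party Identification value and probe it with the other set. The record shape and function name here are hypothetical illustrations, not the actual Data Cloud matching engine.

```python
# Hypothetical sketch of matching CRM and Marketing contacts on a shared
# Party Identification value (CRM ID). Not the actual Data Cloud engine.

def match_on_party_identification(crm_contacts, marketing_contacts,
                                  id_name="CRM ID"):
    """Return pairs of records whose Party Identification values match."""
    # Index CRM contacts by their party identification value.
    index = {c["party_id"][id_name]: c for c in crm_contacts
             if id_name in c.get("party_id", {})}
    matches = []
    for m in marketing_contacts:
        key = m.get("party_id", {}).get(id_name)
        if key in index:
            matches.append((index[key], m))
    return matches

crm = [{"name": "Ada", "party_id": {"CRM ID": "003A1"}}]
mkt = [{"email": "ada@example.com", "party_id": {"CRM ID": "003A1"}}]
print(len(match_on_party_identification(crm, mkt)))  # 1 matched pair
```

Because both sources carry the same CRM ID under one identification name, the join succeeds even though the two records share no other attributes.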
Which consideration related to the way Data Cloud ingests CRM data is true?
Answer(s): D
The correct answer is D: the CRM Connector allows standard fields to stream into Data Cloud in real time. Any change to a standard field in the CRM data source is reflected in Data Cloud almost instantly, without waiting for the next scheduled synchronization, so Data Cloud has the most up-to-date and accurate CRM data for segmentation and activation [1].
The other options are incorrect for the following reasons:
A. CRM data can be manually refreshed at any time by clicking the Refresh button on the data stream detail page [2], so this option is false.
B. The CRM Connector's synchronization times can be customized to intervals of up to 60 minutes, not 15 minutes [3], so this option is false.
C. Formula fields are not refreshed at regular sync intervals, but only at the next full refresh [4]. A full refresh is a complete data ingestion process that occurs once every 24 hours or when manually triggered. This option is false.
1: Connect and Ingest Data in Data Cloud article on Salesforce Help. 2: Data Sources in Data Cloud unit on Trailhead. 3: Data Cloud for Admins module on Trailhead. 4: [Formula Fields in Data Cloud] unit on Trailhead. 5: [Data Streams in Data Cloud] unit on Trailhead.
What does the Source Sequence reconciliation rule do in identity resolution?
The Source Sequence reconciliation rule sets the priority of specific data sources when building attributes in a unified profile, such as a first or last name. This rule allows you to define which data source should be used as the primary source of truth for each attribute, and which data sources should be used as fallbacks in case the primary source is missing or invalid. For example, you can set the Source Sequence rule to use data from Salesforce CRM as the first priority, data from Marketing Cloud as the second priority, and data from Google Analytics as the third priority for the first name attribute. This way, the unified profile will use the first name value from Salesforce CRM if it exists, otherwise it will use the value from Marketing Cloud, and so on. This rule helps you to ensure the accuracy and consistency of the unified profile attributes across different data sources.
Salesforce Data Cloud Consultant Exam Guide, Identity Resolution, Reconciliation Rules
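The priority-with-fallback behavior described above can be sketched as a simple first-non-empty lookup. The source names mirror the example in the explanation; the record shape and function are illustrative assumptions, not Data Cloud internals.

```python
# Illustrative sketch of a Source Sequence reconciliation rule: take the
# attribute value from the highest-priority source that has one, falling
# back to lower-priority sources. Record shape is an assumption.

def reconcile(attribute, records_by_source, priority):
    """Return the attribute value from the first source in priority order
    that has a non-empty value, or None if no source has one."""
    for source in priority:
        value = records_by_source.get(source, {}).get(attribute)
        if value:  # empty or missing -> fall back to the next source
            return value
    return None

records = {
    "Salesforce CRM": {"first_name": ""},        # missing here...
    "Marketing Cloud": {"first_name": "Grace"},  # ...so this source wins
    "Google Analytics": {"first_name": "G."},
}
priority = ["Salesforce CRM", "Marketing Cloud", "Google Analytics"]
print(reconcile("first_name", records, priority))  # Grace
```

With a value present in Salesforce CRM, that source would win instead, matching the "primary source of truth with fallbacks" behavior the rule defines.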
Which two dependencies prevent a data stream from being deleted?
Choose 2 answers
Answer(s): B,C
To delete a data stream in Data Cloud, the underlying data lake object (DLO) must not have any dependencies or references to other objects or processes. The following two dependencies prevent a data stream from being deleted [1]:
Data transform: a process that transforms the ingested data into a standardized format and structure for the data model. A data transform can use one or more DLOs as input or output. If a DLO is used in a data transform, it cannot be deleted until the data transform is removed or modified [2].
Data model object: an object that represents a type of entity or relationship in the data model. A data model object can be mapped to one or more DLOs to define its attributes and values. If a DLO is mapped to a data model object, it cannot be deleted until the mapping is removed or changed [3].
1: Delete a Data Stream article on Salesforce Help. 2: [Data Transforms in Data Cloud] unit on Trailhead. 3: [Data Model in Data Cloud] unit on Trailhead.
What should a user do to pause a segment activation with the intent of using that segment again?
Answer(s): A
The correct answer is A: deactivate the segment. If a segment is no longer needed, it can be deactivated through Data Cloud, and the deactivation applies to all chosen targets. A deactivated segment no longer publishes, but it can be reactivated at any time [1], which lets the user pause a segment activation with the intent of using that segment again.
The other options are incorrect for the following reasons:
B. Delete the segment. This permanently removes the segment from Data Cloud and cannot be undone [2], so the segment cannot be used again.
C. Skip the activation. This skips the current activation cycle but does not affect future activation cycles [3], so it does not pause the activation indefinitely.
D. Stop the publish schedule. This stops the segment from publishing to the chosen targets but does not deactivate the segment [4], so it does not pause the activation completely.
1: Deactivated Segment article on Salesforce Help. 2: Delete a Segment article on Salesforce Help. 3: Skip an Activation article on Salesforce Help. 4: Stop a Publish Schedule article on Salesforce Help.
When creating a segment on an individual, what is the result of using two separate containers linked by an AND as shown below?
Container 1 (GoodsProduct):
Count | At Least | 1
Color | Is Equal To | red
AND
Container 2 (GoodsProduct):
Count | At Least | 1
PrimaryProductCategory | Is Equal To | shoes
When creating a segment on an individual, using two separate containers linked by an AND means that the individual must satisfy the conditions in both containers. In this case, the individual must have purchased at least one product with the color attribute equal to 'red' and at least one product with the primary product category attribute equal to 'shoes'. The products do not have to be the same product or purchased in the same transaction. Therefore, the correct answer is A.
The other options are incorrect because they imply different logical operators or conditions:
Option B implies that the individual must have purchased a single product that has both the color attribute equal to 'red' and the primary product category attribute equal to 'shoes'.
Option C implies that the individual must have purchased only one product that has both the color attribute equal to 'red' and the primary product category attribute equal to 'shoes', and no other products.
Option D implies that the individual must have purchased either one product with the color attribute equal to 'red' or one product with the primary product category attribute equal to 'shoes', or both, which is equivalent to using an OR operator instead of an AND operator.
Create a Container for Segmentation, Create a Segment in Data Cloud, Navigate Data Cloud Segmentation
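The container logic above can be sketched as two independent existence checks over an individual's purchased products, joined by a logical AND. The field names mirror the question; the evaluation function is an illustration, not Data Cloud's segmentation engine.

```python
# Sketch of two segmentation containers joined by AND: each container is an
# independent "at least 1 product" check over the individual's purchases.

def in_segment(products):
    """True if the individual has at least one red product AND at least
    one product in the shoes category (not necessarily the same product)."""
    has_red = any(p["Color"] == "red" for p in products)
    has_shoes = any(p["PrimaryProductCategory"] == "shoes" for p in products)
    return has_red and has_shoes  # AND across the two containers

# A red hat plus black shoes qualifies: the two container conditions may be
# satisfied by different products.
purchases = [
    {"Color": "red", "PrimaryProductCategory": "hats"},
    {"Color": "black", "PrimaryProductCategory": "shoes"},
]
print(in_segment(purchases))  # True
```

An individual with only the red hat would not qualify, since the shoes container would find no matching product.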
What should an organization use to stream inventory levels from an inventory management system into Data Cloud in a fast and scalable, near-real-time way?
Answer(s): C
The Ingestion API is a RESTful API that allows you to stream data from any source into Data Cloud in a fast and scalable way. You can use the Ingestion API to send data from your inventory management system into Data Cloud as JSON objects, and then use Data Cloud to create data models, segments, and insights based on your inventory data. The Ingestion API supports both batch and streaming modes, and can handle up to 100,000 records per second. It also provides features such as data validation, encryption, compression, and retry mechanisms to ensure data quality and security.
Ingestion API Developer Guide, Ingest Data into Data Cloud
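A streaming ingest call can be sketched as below. The URL path follows the documented streaming pattern (tenant endpoint, source connector API name, object name), but the tenant domain, connector name, and object name here are hypothetical placeholders; consult the Ingestion API Developer Guide and your org's connector setup for the real values, and send the request with an authenticated POST (Bearer token).

```python
# Sketch of preparing a streaming call to the Data Cloud Ingestion API.
# Tenant, connector, and object names below are hypothetical placeholders.
import json

def build_ingest_request(tenant, source_api_name, object_name, records):
    """Build the URL and JSON body for a streaming ingest call."""
    url = (f"https://{tenant}.c360a.salesforce.com"
           f"/api/v1/ingest/sources/{source_api_name}/{object_name}")
    body = json.dumps({"data": records})  # records stream as JSON objects
    return url, body

url, body = build_ingest_request(
    "MY_TENANT", "inventory_connector", "InventoryLevel",
    [{"sku": "TENT-01", "on_hand": 42, "warehouse": "DEN"}],
)
print(url)
```

The `{"data": [...]}` envelope carries one or more records per call, which is what makes the streaming mode suitable for frequent, small inventory-level updates.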
Northern Trail Outfitters (NTO), an outdoor lifestyle clothing brand, recently started a new line of business. The new business specializes in gourmet camping food. For business reasons as well as security reasons, it's important to NTO to keep all Data Cloud data separated by brand.
Which capability best supports NTO's desire to separate its data by brand?
Data spaces are logical containers that allow you to separate and organize your data by different criteria, such as brand, region, product, or business unit [1]. Data spaces can help you manage data access, security, and governance, as well as enable cross-cloud data integration and activation [2]. For NTO, data spaces support the desire to separate data by brand, so the outdoor lifestyle clothing and gourmet camping food businesses can have different data models, rules, and insights. Data spaces can also help NTO comply with any data privacy and security regulations that may apply to its different brands [3].
The other options are incorrect because they do not provide the same level of data separation and organization as data spaces:
Data streams are used to ingest data from different sources into Data Cloud, but they do not separate the data by brand [4].
Data model objects are used to define the structure and attributes of the data, but they do not isolate the data by brand [5].
Data sources are used to identify the origin and type of the data, but they do not partition the data by brand [6].
Data Spaces Overview, Create Data Spaces, Data Privacy and Security in Data Cloud, Data Streams Overview, Data Model Objects Overview, [Data Sources Overview]