Cumulus Financial created a segment called High Investment Balance Customers. This is a foundational segment that includes several segmentation criteria the marketing team should consistently use.
Which feature should the consultant suggest the marketing team use to ensure this consistency when creating future, more refined segments?
Answer(s): A
Nested segments are segments that include or exclude one or more existing segments. They allow the marketing team to reuse filters and maintain consistency by using an existing segment to build a new one. For example, the marketing team can create a nested segment that includes High Investment Balance Customers and excludes customers who have opted out of email marketing. This way, they can leverage the foundational segment and apply additional criteria without duplicating the rules.

The other options are not the best features to ensure consistency:
B. A calculated insight is a data object that performs calculations on data lake objects or CRM data and returns a result. It is a metric, not a reusable set of segmentation criteria.
C. A data kit is a bundle of packageable metadata that can be exported and imported across Data Cloud orgs. It is a feature for sharing components, not for creating segments.
D. Cloning a segment creates a copy with the same rules and filters, but the copy is disconnected from the original: later changes to the foundational segment are not reflected in clones, which invites drift and redundancy.
Create a Nested Segment - Salesforce, Save Time with Nested Segments (Generally Available) - Salesforce, Calculated Insights - Salesforce, Create and Publish a Data Kit Unit | Salesforce Trailhead, Create a Segment in Data Cloud - Salesforce
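The include/exclude behavior of a nested segment can be pictured as set operations over segment membership. The sketch below is purely illustrative (the IDs and variable names are invented; Data Cloud evaluates segment filters itself, server-side):

```python
# Foundational segment and an exclusion list, modeled as sets of Individual IDs.
high_balance_customers = {"IND-001", "IND-002", "IND-003", "IND-004"}
email_opt_outs = {"IND-002"}  # customers who opted out of email marketing

# A nested segment includes the foundational segment and excludes the opt-outs,
# reusing the base criteria instead of duplicating its filter rules.
nested_segment = high_balance_customers - email_opt_outs
```

The point of the analogy: the refined segment is defined *in terms of* the foundational one, so updating the foundational criteria automatically flows into every nested segment built on top of it.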
Cumulus Financial uses Service Cloud as its CRM and stores mobile phone, home phone, and work phone as three separate fields for its customers on the Contact record. The company plans to use Data Cloud and ingest the Contact object via the CRM Connector.
What is the most efficient approach that a consultant should take when ingesting this data to ensure all the different phone numbers are properly mapped and available for use in activation?
Answer(s): B
The most efficient approach is B: ingest the Contact object and use streaming transforms to normalize the phone numbers from the Contact data stream into a separate Phone data lake object (DLO) that contains three rows, and then map this new DLO to the Contact Point Phone data map object.

This approach uses the streaming transforms feature of Data Cloud, which enables data manipulation and transformation at the time of ingestion, without requiring any additional processing or storage. Streaming transforms can normalize the phone numbers from the Contact data stream, such as removing spaces, dashes, or parentheses, and adding country codes if needed. The normalized phone numbers can then be stored in a separate Phone DLO with one row for each phone number type (work, home, mobile). The Phone DLO can then be mapped to the Contact Point Phone data map object, a standard object that represents a phone number associated with a contact point. This ensures all the phone numbers are available for activation, such as sending SMS messages or making calls to customers.

The other options are less efficient:
A. Does not normalize the phone numbers, which may cause issues with activation or identity resolution.
C. Requires creating a calculated insight, an additional step that consumes more resources and time than streaming transforms.
D. Requires creating formula fields in the Contact data stream, which may not be supported by the CRM Connector or may conflict with the existing fields on the Contact object.
Salesforce Data Cloud Consultant Exam Guide, Data Ingestion and Modeling, Streaming Transforms, Contact Point Phone
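To make the transform concrete, here is a minimal Python sketch of the kind of normalization and one-row-per-phone fan-out described above. The Contact field names mirror standard Salesforce fields, but the function and output row shape are assumptions for illustration only; actual streaming transforms are defined declaratively in Data Cloud, not in Python:

```python
import re

def normalize_phone(raw: str, default_country_code: str = "1") -> str:
    """Strip spaces, dashes, and parentheses; prefix a country code if missing.
    Mirrors the kind of cleanup a streaming transform might apply at ingest."""
    digits = re.sub(r"[^\d+]", "", raw)  # keep digits and any leading '+'
    if digits.startswith("+"):
        return digits
    return f"+{default_country_code}{digits}"

# One Contact row fans out to three Phone DLO rows (mobile, home, work).
contact = {"Id": "003XX0001", "MobilePhone": "(415) 555-0100",
           "HomePhone": "415-555-0101", "Phone": "415 555 0102"}
phone_rows = [
    {"ContactId": contact["Id"], "Type": t, "Number": normalize_phone(contact[f])}
    for t, f in [("Mobile", "MobilePhone"), ("Home", "HomePhone"), ("Work", "Phone")]
]
```

Each resulting row would then map cleanly onto the Contact Point Phone data map object, with the phone type preserved as an attribute.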
A customer has a Master Customer table from their CRM to ingest into Data Cloud. The table contains a name and primary email address, along with other personally identifiable information (PII).
How should the fields be mapped to support identity resolution?
Answer(s): C
To support identity resolution in Data Cloud, the fields from the Master Customer table should be mapped to the standard data model objects designed for this purpose. The Individual object stores the name and other personally identifiable information (PII) of a customer, while the Contact Phone Email object stores the primary email address and other contact information. These objects are linked by a relationship field indicating that the contact information belongs to the individual. By mapping the fields to these objects, Data Cloud can use the identity resolution rules to match and reconcile profiles from different sources based on the name and email address fields.

The other options are not recommended because they either create a new custom object that is not part of the standard data model, map all fields to the Customer object, which is not intended for identity resolution, or map all fields to the Individual object, which does not have a standard email address field.
Data Modeling Requirements for Identity Resolution, Create Unified Individual Profiles
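A field mapping of this shape can be sketched as a simple lookup from source column to target object and field. The source column names below are hypothetical, and real mappings are configured in the Data Cloud UI rather than in code; this only illustrates the split between the Individual object and the contact-point object:

```python
# Hypothetical Master Customer columns (left) mapped to standard data model
# object/field targets (right). Names are illustrative, not an API.
field_mapping = {
    "CustomerName": ("Individual", "Name"),
    "BirthDate":    ("Individual", "Birth Date"),
    "PrimaryEmail": ("Contact Point Email", "Email Address"),
}

def target_objects(mapping):
    """Return the distinct data model objects a mapping writes to."""
    return sorted({obj for obj, _ in mapping.values()})
```

The key takeaway the sketch encodes: PII attributes land on Individual, while the email lands on a separate contact-point object linked back to it, which is what lets identity resolution match on email across sources.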
Cloud Kicks received a Request to be Forgotten by a customer.
In which two ways should a consultant use Data Cloud to honor this request?
Choose 2 answers
Answer(s): B,D
To honor a Request to be Forgotten, a consultant should use Data Cloud in two ways:
- Add the Individual ID to a headerless file and use the delete from file functionality. This allows the consultant to delete multiple Individuals from Data Cloud by uploading a CSV file with their IDs. The deletion process is asynchronous and can take up to 24 hours to complete.
- Use the Consent API to suppress processing and delete the Individual and related records from source data streams. A Data Deletion request submitted through the Consent API deletes the specified Individual entity and any entities where a relationship has been defined between that entity's identifying attribute and the Individual ID attribute. The deletion is reprocessed at 30, 60, and 90 days to ensure a full deletion.

The other options are not correct:
- Deleting the data from the incoming data stream and performing a full refresh does not delete the existing data in Data Cloud, only the new data from the source system.
- Using Data Explorer to locate and manually remove the Individual does not delete the related records from the source data streams, only the Individual entity in Data Cloud.
Delete Individuals from Data Cloud, Requesting Data Deletion or Right to Be Forgotten, Data Refresh for Data Cloud, Data Explorer
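The delete-from-file option expects a headerless CSV: one Individual ID per row, no header line. A minimal sketch of producing such a file in memory (the IDs are placeholders, and the real file would be uploaded through the Data Cloud UI):

```python
import csv
import io

# Placeholder Individual IDs collected from Request-to-be-Forgotten tickets.
individual_ids = ["a1b2c3d4", "e5f6g7h8"]

buf = io.StringIO()
writer = csv.writer(buf)
for ind_id in individual_ids:
    writer.writerow([ind_id])  # one ID per row; deliberately no header row

csv_content = buf.getvalue()
```

The absence of a header row matters: the feature treats every line as an ID, so a header would be submitted as a (failing) deletion request.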
What is Data Cloud's primary value to customers?
Data Cloud is a platform that enables you to activate all your customer data across Salesforce applications and other systems. Data Cloud allows you to create a unified profile of each customer by ingesting, transforming, and linking data from various sources, such as CRM, marketing, commerce, service, and external data providers. Data Cloud also provides insights and analytics on customer behavior, preferences, and needs, as well as tools to segment, target, and personalize customer interactions. Data Cloud's primary value to customers is to provide a unified view of a customer and their related data, which can help you deliver better customer experiences, increase loyalty, and drive growth.
Salesforce Data Cloud, When Data Creates Competitive Advantage
During an implementation project, a consultant completed ingestion of all data streams for their customer.
Prior to segmenting and acting on that data, which additional configuration is required?
Answer(s): D
After ingesting data from different sources into Data Cloud, the additional configuration required before segmenting and acting on that data is Identity Resolution. Identity Resolution is the process of matching and reconciling source profiles from different data sources and creating unified profiles that represent a single individual or entity. It enables a 360-degree view of your customers and prospects, so they can be segmented and activated based on their attributes and behaviors. To configure Identity Resolution, you create and deploy a ruleset that defines the match rules and reconciliation rules for your data.

The other options are incorrect because they are not required before segmenting and acting on the data. Data Activation is the process of sending data from Data Cloud to other Salesforce clouds or external destinations for marketing, sales, or service purposes. Calculated Insights are derived attributes computed from source or unified data, such as lifetime value, churn risk, or product affinity. Data Mapping is the process of mapping source attributes to unified attributes in the data model. These configurations can be done after segmenting and acting on the data, or in parallel with Identity Resolution, but they are not prerequisites for it.
Identity Resolution Overview, Segment and Activate Data in Data Cloud, Configure Identity Resolution Rulesets, Data Activation Overview, Calculated Insights Overview, Data Mapping Overview
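What a match rule does can be illustrated with a toy example: two source profiles that differ cosmetically collapse into one unified profile when matched on a normalized email. The data and the matching logic below are invented for illustration; real rulesets are configured declaratively in Data Cloud, not written in code:

```python
# Two source profiles for the same person, from different systems.
profiles = [
    {"source": "CRM",       "name": "Pat Smith", "email": "Pat.Smith@example.com"},
    {"source": "Marketing", "name": "P. Smith",  "email": "pat.smith@example.com "},
]

def match_key(profile):
    """Toy match rule: case-insensitive, whitespace-trimmed email."""
    return profile["email"].strip().lower()

# Group source profiles by match key; each group becomes one unified profile.
unified = {}
for p in profiles:
    unified.setdefault(match_key(p), []).append(p)
```

Reconciliation rules would then decide, per attribute, which source value the unified profile keeps (for example, most recently updated wins).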
Northern Trail Outfitters (NTO) wants to connect their B2C Commerce data with Data Cloud and bring two years of transactional history into Data Cloud.
What should NTO use to achieve this?
The B2C Commerce Starter Bundles are predefined data streams that ingest order and product data from B2C Commerce into Data Cloud, but they only bring in the last 90 days of data by default. To bring in two years of transactional history, NTO needs a custom extract from B2C Commerce that includes the historical data, with the data stream configured to use that extract as its source.

The other options are not sufficient:
A. B2C Commerce Starter Bundles alone only ingest the last 90 days of data by default.
B. Direct Sales Order entity ingestion is not a supported method for connecting B2C Commerce data with Data Cloud; Data Cloud does not provide a direct-access connection for B2C Commerce data, only data ingestion.
C. Direct Sales Product entity ingestion is likewise unsupported, for the same reason.
Create a B2C Commerce Data Bundle - Salesforce, B2C Commerce Connector - Salesforce, Salesforce B2C Commerce Pricing Plans & Costs
A customer has a requirement to receive a notification whenever an activation fails for a particular segment.
Which feature should the consultant use to solve this use case?
The feature the consultant should use is C, activation alerts. Activation alerts are notifications sent to users when an activation fails or succeeds for a segment. They can be configured on the Activation Settings page, where the consultant specifies the recipients, the frequency, and the conditions for sending the alerts. Activation alerts help the customer monitor the status of their activations and troubleshoot any issues that arise.
Salesforce Data Cloud Consultant Exam Guide, Activation Alerts
Post your Comments and Discuss Salesforce Data-Con-101 exam dumps with other Community members: