Free Certified Data Architect Exam Braindumps (page: 21)


As part of addressing General Data Protection Regulation (GDPR) requirements, UC plans to implement a data classification policy for all of its internal systems that store customer information, including Salesforce.

What should a data architect recommend so that UC can easily classify customer information maintained in Salesforce in both standard and custom objects?

  1. Use AppExchange products to classify fields based on policy.
  2. Use the data classification metadata fields available in the field definition.
  3. Create a custom picklist field to capture the classification of customer information.
  4. Build reports for customer information and validate.

Answer(s): B
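Field definitions in Salesforce carry built-in classification metadata (attributes such as ComplianceGroup and SecurityClassification), so no custom fields or third-party products are needed. As a minimal sketch, assuming a hypothetical export of field-definition records, classified fields could be filtered like this:

```python
# Sketch: filtering exported field-definition classification metadata.
# The attribute names mirror Salesforce's FieldDefinition object
# (ComplianceGroup, SecurityClassification); the sample records below
# are hypothetical illustration data, not a real org export.

FIELDS = [
    {"object": "Contact", "field": "Email",
     "ComplianceGroup": "GDPR", "SecurityClassification": "Confidential"},
    {"object": "Case", "field": "Description",
     "ComplianceGroup": "GDPR;PII", "SecurityClassification": "Restricted"},
    {"object": "Account", "field": "Industry",
     "ComplianceGroup": None, "SecurityClassification": "Public"},
]

def gdpr_fields(fields):
    """Return the fields whose compliance group includes GDPR."""
    return [f for f in fields
            if f["ComplianceGroup"]
            and "GDPR" in f["ComplianceGroup"].split(";")]

print([f["field"] for f in gdpr_fields(FIELDS)])  # ['Email', 'Description']
```

Because the classification lives on the field definition itself, it applies uniformly to standard and custom objects and can be reported on centrally.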



The architect is planning a large data migration for Universal Containers from their legacy CRM system to Salesforce.
What three things should the architect consider to optimize performance of the data migration? Choose 3 answers

  1. Review the time zones of the User loading the data.
  2. Remove custom indexes on the data being loaded.
  3. Determine if the legacy system is still in use.
  4. Defer sharing calculations of the Salesforce Org.
  5. Deactivate approval processes and workflow rules.

Answer(s): B,D,E



Company S was recently acquired by Company T. As part of the acquisition, all of the data in Company S's Salesforce instance (source) must be migrated into Company T's Salesforce instance (target). Company S has 6 million Case records.

An Architect has been tasked with optimizing the data load time.

What should the Architect consider to achieve this goal?

  1. Pre-process the data, then use Data Loader with SOAP API to upsert with zip compression enabled.
  2. Directly leverage Salesforce-to-Salesforce functionality to load Case data.
  3. Load the data in multiple sets using Bulk API parallel processes.
  4. Utilize the Salesforce Org Migration Tool from the Setup Data Management menu.

Answer(s): C
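For a multi-million-record load, the Bulk API is designed to process independent batches in parallel, which is far faster than row-by-row SOAP upserts. A minimal sketch of the client-side chunking step, assuming hypothetical record IDs (the actual submission would go through the Bulk API endpoints):

```python
# Sketch: splitting a large record set into Bulk API-sized batches.
# The 10,000-record batch size matches the Bulk API per-batch limit;
# the record IDs are hypothetical placeholders for real Case rows.

BATCH_SIZE = 10_000

def make_batches(records, batch_size=BATCH_SIZE):
    """Yield successive slices of `records`, each at most `batch_size` long."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

records = [f"case-{i}" for i in range(25_000)]  # stand-in for 6M Case rows
batches = list(make_batches(records))
print(len(batches), [len(b) for b in batches])  # 3 [10000, 10000, 5000]
```

In practice, batches that touch the same parent record can hit row-lock contention, so sorting the extract by parent ID before chunking is a common refinement when loading in parallel.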



Universal Containers (UC) requires 2 years of customer-related cases to be available in Salesforce for operational reporting. Any cases older than 2 years and up to 7 years need to be available on demand to the Service agents. UC creates 5 million cases per year.

Which 2 data archiving strategies should a data architect recommend? Choose 2 options:

  1. Use custom objects for cases older than 2 years and use a nightly batch to move them.
  2. Sync cases older than 2 years to an external database, and give Service agents access to the database.
  3. Use Big Objects for cases older than 2 years, and use a nightly batch to move them.
  4. Use Heroku and external objects to display cases older than 2 years, and use the Bulk API to hard delete them from Salesforce.

Answer(s): C,D
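The nightly archiving batch needs a rule for where each case belongs: the 2-year and 7-year cutoffs come from UC's stated retention requirements. A minimal sketch of that age-based routing, assuming hypothetical tier labels:

```python
# Sketch: deciding where a Case lives based on its age.
# The 2-year and 7-year cutoffs come from UC's retention policy;
# the tier names are hypothetical labels for this illustration.

from datetime import date

def archive_tier(closed_on, today):
    """Classify a case by age: keep in Salesforce, archive, or purge."""
    age_years = (today - closed_on).days / 365.25
    if age_years <= 2:
        return "salesforce"   # operational reporting window
    if age_years <= 7:
        return "archive"      # on-demand access for Service agents
    return "purge"            # beyond the 7-year retention horizon

today = date(2024, 1, 1)
print(archive_tier(date(2023, 6, 1), today))   # salesforce
print(archive_tier(date(2019, 6, 1), today))   # archive
print(archive_tier(date(2015, 6, 1), today))   # purge
```

At 5 million cases per year, the "archive" tier accumulates roughly 25 million records over the 5-year window, which is the scale Big Objects (or a Heroku-backed external object) are intended to absorb.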





