Free Certified Data Architect Exam Braindumps (page: 24)


NTO has multiple systems across its enterprise landscape, including Salesforce, with disparate versions of the customer records.

In Salesforce, the customer is represented by the Contact object.

NTO utilizes an MDM solution with these attributes:

1. The MDM solution keeps track of customer master with a master key.
2. The master key maps to the record IDs from each external system in which customer data is stored.
3. The MDM solution provides de-duplication features, so it acts as the single source of truth.

How should a data architect implement the storage of the master key within Salesforce?

  A. Store the master key in Heroku Postgres and use Heroku Connect for synchronization.
  B. Create a custom object to store the master key with a lookup field to Contact.
  C. Create an external object to store the master key with a lookup field to Contact.
  D. Store the master key on the Contact object as an external ID (field for referential imports).

Answer(s): D
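Why an external ID is the right fit: it lets the MDM system upsert Contact records by master key without ever knowing Salesforce record IDs. The sketch below models that upsert-by-external-ID matching in plain Python; the field name `MDM_Master_Key__c` is a hypothetical example, and this is an illustration of the semantics, not the Salesforce API.

```python
# Illustrative sketch (not the Salesforce API): how an external ID field,
# here a hypothetical MDM_Master_Key__c, enables upsert-style matching so
# the MDM hub can push updates without knowing Salesforce record IDs.

def upsert_by_external_id(contacts, external_id_field, incoming):
    """Update the contact whose external ID matches, or insert a new one."""
    index = {c[external_id_field]: c for c in contacts if external_id_field in c}
    key = incoming[external_id_field]
    if key in index:
        index[key].update(incoming)    # matched on master key -> update in place
        return "updated"
    contacts.append(dict(incoming))    # no match -> create a new contact
    return "created"

contacts = [{"MDM_Master_Key__c": "MK-001", "LastName": "Ng"}]
print(upsert_by_external_id(contacts, "MDM_Master_Key__c",
                            {"MDM_Master_Key__c": "MK-001", "Email": "ng@nto.example"}))
print(upsert_by_external_id(contacts, "MDM_Master_Key__c",
                            {"MDM_Master_Key__c": "MK-002", "LastName": "Ruiz"}))
```

This is why options B and C are weaker: a separate object holding the key forces an extra join on every integration call, while an external ID keeps the match on the Contact record itself.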



NTO uses Salesforce to manage relationships and track sales opportunities. It has 10 million customers and 100 million opportunities. The CEO has been complaining that the dashboard takes 10 minutes to run and sometimes fails to load, throwing a timeout error.

Which three options should help improve the dashboard performance?

Choose 3 answers:

  A. Use selective queries to reduce the amount of data being returned.
  B. De-normalize the data by reducing the number of joins.
  C. Remove widgets from the dashboard to reduce the number of graphics loaded.
  D. Run the dashboard for the CEO and send it via email.
  E. Reduce the amount of data queried by archiving unused opportunity records.

Answer(s): A,B,E
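The intuition behind answers A and E can be sketched in plain Python: a selective predicate returns a small fraction of rows, and archived records are never scanned at all. The field names and the ten-year spread of close dates below are made-up illustration data, not a real org's schema.

```python
# Sketch of why selectivity (A) and archiving (E) shrink dashboard queries:
# restrictive filters touch few rows; archived rows are never scanned.
# StageName/CloseDate values here are synthetic illustration data.

import datetime

opportunities = [
    {"Id": i,
     "StageName": "Closed Won" if i % 10 == 0 else "Prospecting",
     "CloseDate": datetime.date(2015 + (i % 10), 1, 1)}
    for i in range(1000)
]

# Non-selective: every row is examined (what a timing-out dashboard does).
all_rows = [o for o in opportunities]

# Selective: a restrictive, indexable predicate returns only matching rows.
closed_won = [o for o in opportunities if o["StageName"] == "Closed Won"]

# Archiving: records older than the cutoff are moved out of the org,
# so queries only ever see the active subset.
cutoff = datetime.date(2020, 1, 1)
active = [o for o in opportunities if o["CloseDate"] >= cutoff]

print(len(all_rows), len(closed_won), len(active))
```

In a real org the same idea applies at query-plan level: a filter on an indexed field below Salesforce's selectivity thresholds avoids a full object scan on those 100 million opportunity rows.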



UC is planning a massive Salesforce implementation with large volumes of data. As part of the org's implementation, several roles, territories, groups, and sharing rules have been configured. The data architect has been tasked with loading all of the required data, including user data, in a timely manner.

What should a data architect do to minimize data load times due to system calculations?

  A. Enable defer sharing calculations, and suspend sharing rule calculations.
  B. Load the data through Data Loader, and turn on parallel processing.
  C. Leverage the Bulk API and concurrent processing with multiple batches.
  D. Enable granular locking to avoid the "UNABLE_TO_LOCK_ROW" error.

Answer(s): A
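The winning sequence is: suspend sharing-rule recalculation, load all batches, then resume and recalculate once at the end, so the org does not recompute sharing after every batch. A minimal sketch of that flow, with the context manager standing in for the Setup option (the helper names are assumptions, not a real Salesforce API):

```python
# Hedged sketch of the deferred-sharing load sequence. deferred_sharing()
# is a placeholder for Setup -> "Defer sharing calculations"; load_batch
# stands in for whatever loader (e.g. Bulk API batches) actually runs.

from contextlib import contextmanager

@contextmanager
def deferred_sharing():
    # Before the load: suspend sharing-rule recalculation.
    yield
    # After the load: resume sharing and run one full recalculation.

def load_in_batches(records, load_batch, batch_size=10_000):
    """Load records batch by batch while sharing recalculation is deferred."""
    loaded = 0
    with deferred_sharing():
        for start in range(0, len(records), batch_size):
            loaded += load_batch(records[start:start + batch_size])
    return loaded

records = [{"Id": i} for i in range(25_000)]
batch_sizes = []
total = load_in_batches(records, lambda b: batch_sizes.append(len(b)) or len(b))
print(total, batch_sizes)
```

Options B and C speed up row throughput but do nothing about the sharing recalculation triggered by each batch, which is the dominant cost in an org with many roles, territories, and sharing rules.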



Universal Containers (UC) has implemented Salesforce. UC is running out of storage and needs an archiving solution. UC would like to maintain two years of data in Salesforce and archive older data out of Salesforce.

Which solution should a data architect recommend as an archiving solution?

  A. Use a third-party backup solution to back up all data off platform.
  B. Build a batch job to move all records off platform, and delete all records from Salesforce.
  C. Build a batch job to move records older than two years off platform, and delete those records from Salesforce.
  D. Build a batch job to move all records off platform, and delete old records from Salesforce.

Answer(s): A
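The batch-job mechanic that options B through D describe, with option C's two-year cutoff matching UC's retention requirement, can be sketched as below. The list `archive_store` is only a stand-in for an external database; nothing here is a real Salesforce or storage API.

```python
# Hedged sketch of an archiving batch job: records whose close date is
# older than two years are copied off platform, then removed from the
# in-org list. The archive store is a plain list standing in for an
# external database.

import datetime

def archive_old_records(records, archive_store, today, years=2):
    """Move records older than `years` to archive_store; keep the rest."""
    cutoff = today.replace(year=today.year - years)
    keep, moved = [], 0
    for rec in records:
        if rec["CloseDate"] < cutoff:
            archive_store.append(rec)   # move off platform first...
            moved += 1
        else:
            keep.append(rec)            # ...so only recent data stays in the org
    records[:] = keep
    return moved

today = datetime.date(2024, 6, 1)
records = [{"Id": 1, "CloseDate": datetime.date(2021, 3, 1)},
           {"Id": 2, "CloseDate": datetime.date(2023, 9, 1)}]
archive = []
print(archive_old_records(records, archive, today), len(records), len(archive))
```

In a real org this loop would run as an Apex batch job or a scheduled ETL extract, and the delete step should only run after the off-platform copy is confirmed.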





