Free Certified Data Architect Exam Braindumps (page: 15)


Universal Containers (UC) is implementing a Salesforce project with large volumes of data and daily transactions. The solution includes both real-time web service integrations and Visualforce mash-ups with back-end systems. The Salesforce Full sandbox used by the project integrates with full-scale back-end testing systems.

What two types of performance testing are appropriate for this project?

Choose 2 answers

  A. Pre-go-live automated page-load testing against the Salesforce Full sandbox.
  B. Post-go-live automated page-load testing against the Salesforce Production org.
  C. Pre-go-live unit testing in the Salesforce Full sandbox.
  D. Stress testing against the web services hosted by the integration middleware.

Answer(s): A,D
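As background for option D, stress testing middleware-hosted web services typically means firing many concurrent requests and measuring latency and error rates. Below is a minimal, hedged Python sketch of such a harness; `call_middleware_service` is a hypothetical stand-in (a real test would issue HTTP requests to the actual middleware endpoint):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def call_middleware_service(payload):
    """Hypothetical stand-in for a real web service call to the
    integration middleware; replace with an actual HTTP request."""
    time.sleep(0.01)  # simulate network latency
    return {"status": 200, "payload": payload}

def stress_test(request_count=50, concurrency=10):
    """Fire request_count calls across `concurrency` parallel workers
    and collect per-call latencies in seconds."""
    latencies = []

    def timed_call(i):
        start = time.perf_counter()
        resp = call_middleware_service({"id": i})
        latencies.append(time.perf_counter() - start)
        return resp["status"]

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        statuses = list(pool.map(timed_call, range(request_count)))
    return statuses, latencies

statuses, latencies = stress_test()
print(f"{len(statuses)} calls, max latency {max(latencies):.3f}s")
```

A real run would ramp concurrency up in stages and watch for timeouts and non-200 responses, not just average latency.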



Universal Containers (UC) is launching an RFP to acquire a new accounting product available on AppExchange. UC is expecting to issue 5 million invoices per year, with each invoice containing an average of 10 line items.

What should UC's Data Architect recommend to ensure scalability?

  A. Ensure invoice line items simply reference existing Opportunity line items.
  B. Ensure the accounting product vendor includes Wave Analytics in their offering.
  C. Ensure the accounting product vendor provides a sound data archiving strategy.
  D. Ensure the accounting product runs 100% natively on the Salesforce platform.

Answer(s): C
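The scale behind answer C is easy to quantify: Salesforce counts most records as 2 KB against data storage, so 5 million invoices plus their line items add up quickly. A back-of-envelope calculation:

```python
invoices_per_year = 5_000_000
line_items_per_invoice = 10
kb_per_record = 2  # Salesforce counts most records as 2 KB of data storage

# Each invoice record plus its 10 line-item records
records_per_year = invoices_per_year * (1 + line_items_per_invoice)
gb_per_year = records_per_year * kb_per_record / 1_000_000  # KB -> GB (decimal)

print(records_per_year)  # 55,000,000 records per year
print(gb_per_year)       # ~110 GB of data storage per year
```

At roughly 110 GB of new data storage per year, an archiving strategy is not optional, which is why it outweighs the other options.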



An architect is planning on having different batches to load one million Opportunities into Salesforce using the Bulk API in parallel mode.
What should be considered when loading the Opportunity records?

  A. Create indexes on Opportunity object text fields.
  B. Group batches by the AccountId field.
  C. Sort batches by Name field values.
  D. Order batches by Auto-number field.

Answer(s): D
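For background on why batch organization matters here: when child records (Opportunities) load in parallel, batches that touch the same parent Account can contend for the parent-record lock, so Salesforce's data-load guidance recommends ordering records by parent ID so rows sharing a parent land in the same batch. A minimal Python sketch of that preparation step (plain dicts standing in for CSV rows; field names illustrative):

```python
from itertools import islice

def batches_by_parent(records, batch_size=200):
    """Sort records by parent AccountId so rows sharing a parent fall
    into the same batch, reducing parent-record lock contention when
    batches are processed in parallel."""
    ordered = sorted(records, key=lambda r: r["AccountId"])
    it = iter(ordered)
    while batch := list(islice(it, batch_size)):
        yield batch

opps = [
    {"Name": "Opp 1", "AccountId": "001B"},
    {"Name": "Opp 2", "AccountId": "001A"},
    {"Name": "Opp 3", "AccountId": "001B"},
    {"Name": "Opp 4", "AccountId": "001A"},
]
for batch in batches_by_parent(opps, batch_size=2):
    print([o["AccountId"] for o in batch])
# Rows for the same account end up adjacent, within one batch
```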



Every year, Ursa Major Solar has more than 1 million orders. Each order contains an average of 10 line items. The Chief Executive Officer (CEO) needs the Sales Reps to see how much money each customer generates year-over-year.
However, data storage is running low in Salesforce.

Which approach for data archiving is appropriate for this scenario?

  A. Annually export and delete order line items, then store them in a zip file in case the data is needed later.
  B. Annually aggregate order amount data into a custom object, then delete those orders and order line items.
  C. Annually export and delete orders and order line items, then store them in a zip file in case the data is needed later.
  D. Annually delete orders and order line items, ensuring the customer has order information in another system.

Answer(s): B
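The aggregate-then-delete pattern in answer B keeps the year-over-year revenue visible to Sales Reps while reclaiming storage. A hedged Python sketch of the roll-up step (field names `Id`, `CustomerId`, `Year`, `Amount` are hypothetical; in Salesforce this would feed a custom summary object before the delete job runs):

```python
from collections import defaultdict

def aggregate_orders(orders):
    """Roll up order amounts per (customer, year) into summary rows,
    and return the order Ids that become safe to delete afterward."""
    totals = defaultdict(float)
    for o in orders:
        totals[(o["CustomerId"], o["Year"])] += o["Amount"]
    summaries = [
        {"CustomerId": cust, "Year": year, "TotalAmount": amt}
        for (cust, year), amt in sorted(totals.items())
    ]
    deletable_ids = [o["Id"] for o in orders]
    return summaries, deletable_ids

orders = [
    {"Id": "801A", "CustomerId": "C1", "Year": 2023, "Amount": 100.0},
    {"Id": "801B", "CustomerId": "C1", "Year": 2023, "Amount": 50.0},
    {"Id": "801C", "CustomerId": "C2", "Year": 2023, "Amount": 75.0},
]
summaries, ids = aggregate_orders(orders)
print(summaries[0])  # {'CustomerId': 'C1', 'Year': 2023, 'TotalAmount': 150.0}
```

The key design point: the summary rows are tiny compared with millions of order and line-item records, so the CEO's reporting need survives the archive.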





