Free HEROKU-ARCHITECT Exam Braindumps

Universal Containers (UC) uses Apache Kafka on Heroku to stream shipment inventory data in real time throughout the world. A Kafka topic is used to send messages with updates on the shipping containers' GPS coordinates while they are in transit. UC is using a Heroku Kafka basic-0 plan. The topic was provisioned with 8 partitions, 1 week of retention, and no compaction. The keys for the events are assigned by Heroku Kafka, which means they are distributed randomly across the partitions.
UC has a single-dyno consumer application that persists the data to their Enterprise Data Warehouse (EDW). Recently, they've been noticing data loss in the EDW.

What should an Architect with Kafka experience recommend?

  1. Enable compaction on the topic to drop older messages, which will drop older messages with the same key.
  2. Upgrade to a larger Apache Kafka on Heroku plan, which has greater data capacity.
  3. Use Heroku Redis to store message receipt information to account for "at-least-once" delivery, which will guarantee that messages are never processed more than once. Scale up the consumer dynos to match the number of partitions so that there is one process for each partition.

Answer(s): C
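The reasoning behind answer C can be illustrated: with at-least-once delivery, Kafka may redeliver a message after a consumer crash or restart, so the consumer should record a receipt for each message and skip duplicates before writing to the EDW. Below is a minimal sketch of that idempotent-consumer pattern; an in-memory set stands in for Heroku Redis, and all names are illustrative, not part of any real API.

```python
# Sketch of idempotent consumption under at-least-once delivery.
# A plain set stands in for Heroku Redis; in production the receipt
# store must live outside the dyno so duplicates are still detected
# after a restart (e.g. via Redis SADD / SISMEMBER).

processed_ids = set()  # stand-in for a Redis set of message receipts
edw_rows = []          # stand-in for the Enterprise Data Warehouse

def handle_message(message_id, payload):
    """Persist payload to the EDW at most once per message_id."""
    if message_id in processed_ids:
        return False  # duplicate redelivery: skip
    edw_rows.append(payload)
    processed_ids.add(message_id)
    return True

# At-least-once delivery means the same message can arrive twice:
handle_message("gps-001", {"lat": 37.77, "lon": -122.42})
handle_message("gps-001", {"lat": 37.77, "lon": -122.42})  # duplicate
handle_message("gps-002", {"lat": 51.51, "lon": -0.13})
```

Scaling the consumer dynos to one per partition, as the answer recommends, lets each process own a partition and preserves per-partition ordering while increasing throughput.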



A client's Heroku application syncs data between a Heroku Postgres database and a Salesforce org using the Salesforce Bulk API. The client has determined the application currently uses 90% of the client's daily Salesforce Bulk API limit.

To overcome this issue, which feature should an Architect recommend to replace the Bulk API implementation in this scenario?

  1. Custom Apex callouts
  2. Heroku Connect
  3. Salesforce SOAP API
  4. Salesforce Connect

Answer(s): B





A client wants to use Heroku Connect to sync data from a Heroku Postgres table to a Salesforce org.

The client only needs to sync a specific subset of the rows in the table.

How should this be performed?

  1. Add a mapping filter to the table when setting up the sync, and select appropriate criteria from the list.
  2. Filter the data in the database, and provide an alternative table or view for use in the sync.
  3. Use the Heroku Connect Mapping Query Editor, and add filters to the query.
  4. Place Sharing Rules on the records, and restrict visibility to only those rows that are needed.

Answer(s): B

Explanation:

https://devcenter.heroku.com/articles/heroku-connect-faq#can-i-use-sharing-rules-to-restrict-record-visibility
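The view-based filtering described in option B can be sketched as follows. SQLite is used here purely as a stand-in for Heroku Postgres, and the table and column names are illustrative; the idea is to point the Heroku Connect mapping at a view that exposes only the rows that should sync to Salesforce.

```python
import sqlite3

# sqlite3 stands in for Heroku Postgres; the schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shipments (id INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO shipments VALUES (?, ?)",
    [(1, "in_transit"), (2, "delivered"), (3, "in_transit")],
)

# Expose only the subset of rows that should be synced; the sync
# reads from the view instead of the base table.
conn.execute(
    "CREATE VIEW shipments_to_sync AS "
    "SELECT id, status FROM shipments WHERE status = 'in_transit'"
)

rows = conn.execute("SELECT id FROM shipments_to_sync ORDER BY id").fetchall()
```

The base table keeps every row; the view narrows what the sync sees without duplicating data, which is why filtering in the database is the recommended approach here.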



A client requires that their web application's logs are accessible only from within the same isolated network as the application itself.

Which solution should an Architect recommend in this scenario?

  1. Deploy the application to a Private Space. Provide the Private Space's stable outbound IPs to Heroku's Logplex router to block all logs originating from the Private Space.
  2. Deploy the application to a Shield Private Space with Private Space Logging enabled. Forward logs to a destination within the Shield Private Space.
  3. Deploy the application to a Private Space. Connect the Private Space to an on-premise logging system using VPN and specify it as a log drain.
  4. Deploy the application to a Private Space. Enable Internal Routing to prevent the application's logs from being forwarded outside of the Private Space.

Answer(s): B





