Free SAP-C01 Exam Braindumps (page: 60)

Page 60 of 134

A company is developing a gene reporting device that will collect genomic information to assist researchers with collecting large samples of data from a diverse population. The device will push 8 KB of genomic data every second to a data platform that will need to process and analyze the data and provide information back to researchers. The data platform must meet the following requirements:

•Provide near-real-time analytics of the inbound genomic data
•Ensure the data is flexible, parallel, and durable
•Deliver results of processing to a data warehouse

Which strategy should a solutions architect use to meet these requirements?

  1. Use Amazon Kinesis Data Firehose to collect the inbound sensor data, analyze the data with Kinesis clients, and save the results to an Amazon RDS instance.
  2. Use Amazon Kinesis Data Streams to collect the inbound sensor data, analyze the data with Kinesis clients, and save the results to an Amazon Redshift cluster using Amazon EMR.
  3. Use Amazon S3 to collect the inbound device data, analyze the data from Amazon SQS with Kinesis, and save the results to an Amazon Redshift cluster.
  4. Use an Amazon API Gateway to put requests into an Amazon SQS queue, analyze the data with an AWS Lambda function, and save the results to an Amazon Redshift cluster using Amazon EMR.

Answer(s): B

Explanation:

B) Using Amazon Kinesis Data Streams to collect inbound genomic data provides a flexible, scalable, and durable stream for real-time analytics. Kinesis clients can be used to analyze the data in parallel, ensuring near-real-time processing. The results can be stored in an Amazon Redshift cluster using Amazon EMR for further analysis, which meets the requirement of saving processed data in a data warehouse. This strategy offers low-latency analytics and scalable data processing.
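
To make the ingestion side concrete, here is a minimal producer sketch in Python with boto3 (the stream name genomic-data-stream and the device ID are hypothetical placeholders, not part of the question); a device-side process could push each ~8 KB genomic record to the stream, and Kinesis client applications would then read the shards in parallel before loading results into Redshift:

```python
import json

import boto3  # AWS SDK for Python

# Hypothetical stream name; each device pushes roughly 8 KB of genomic data per second.
STREAM_NAME = "genomic-data-stream"

kinesis = boto3.client("kinesis")

def send_reading(device_id, payload):
    """Put one genomic record onto the Kinesis data stream."""
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=payload,            # Kinesis accepts up to 1 MB per record, well above 8 KB
        PartitionKey=device_id,  # records from one device stay ordered on one shard
    )

if __name__ == "__main__":
    sample = json.dumps({"device": "dev-001", "sequence": "ACGT" * 2000}).encode()
    send_reading("dev-001", sample)
```

Using the device ID as the partition key keeps each device's records ordered on a single shard while still spreading many devices across shards for parallel consumption.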



A solutions architect needs to define a reference architecture for a solution for three-tier applications with web, application, and NoSQL data layers. The reference architecture must meet the following requirements:

•High availability within an AWS Region
•Able to fail over in 1 minute to another AWS Region for disaster recovery
•Provide the most efficient solution while minimizing the impact on the user experience

Which combination of steps will meet these requirements? (Choose three.)

  1. Use an Amazon Route 53 weighted routing policy set to 100/0 across the two selected Regions. Set Time to Live (TTL) to 1 hour.
  2. Use an Amazon Route 53 failover routing policy for failover from the primary Region to the disaster recovery Region. Set Time to Live (TTL) to 30 seconds.
  3. Use a global table within Amazon DynamoDB so data can be accessed in the two selected Regions.
  4. Back up data from an Amazon DynamoDB table in the primary Region every 60 minutes and then write the data to Amazon S3. Use S3 cross-Region replication to copy the data from the primary Region to the disaster recovery Region. Have a script import the data into DynamoDB in a disaster recovery scenario.
  5. Implement a hot standby model using Auto Scaling groups for the web and application layers across multiple Availability Zones in the Regions. Use zonal Reserved Instances for the minimum number of servers and On-Demand Instances for any additional resources.
  6. Use Auto Scaling groups for the web and application layers across multiple Availability Zones in the Regions. Use Spot Instances for the required resources.

Answer(s): B, C, E

Explanation:

B) Using an Amazon Route 53 failover routing policy with a TTL of 30 seconds allows for quick failover from the primary Region to the disaster recovery Region, meeting the requirement for 1-minute failover.
C) Implementing a DynamoDB global table ensures that data is available in both Regions simultaneously, providing high availability and disaster recovery without the need for manual data replication.
E) A hot standby model with Auto Scaling groups ensures high availability across multiple Availability Zones and Regions, while using zonal Reserved Instances for base capacity and On-Demand Instances for scaling helps balance cost and performance.
This combination meets the requirements for high availability, disaster recovery, and minimizing the impact on the user experience.
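
For illustration, a minimal sketch of the Route 53 failover record pair from option B, written in Python with boto3 (the hosted zone ID, health check ID, domain name, and IP addresses are hypothetical placeholders):

```python
import boto3  # AWS SDK for Python

route53 = boto3.client("route53")

# Hypothetical hosted zone ID, health check ID, domain, and endpoint IPs.
HOSTED_ZONE_ID = "Z0000000000000EXAMPLE"
PRIMARY_HEALTH_CHECK_ID = "11111111-2222-3333-4444-555555555555"

def failover_record(role, ip_address, health_check_id=None):
    """Build one half of the failover pair, using the 30-second TTL from option B."""
    rrset = {
        "Name": "app.example.com.",
        "Type": "A",
        "SetIdentifier": role.lower(),
        "Failover": role,  # "PRIMARY" or "SECONDARY"
        "TTL": 30,
        "ResourceRecords": [{"Value": ip_address}],
    }
    if health_check_id:
        rrset["HealthCheckId"] = health_check_id
    return {"Action": "UPSERT", "ResourceRecordSet": rrset}

route53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={
        "Changes": [
            failover_record("PRIMARY", "203.0.113.10", PRIMARY_HEALTH_CHECK_ID),
            failover_record("SECONDARY", "198.51.100.10"),
        ]
    },
)
```

Keeping the TTL at 30 seconds means resolvers re-query Route 53 quickly, so traffic shifts to the disaster recovery Region well inside the 1-minute objective once the primary health check fails.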



A company manufactures smart vehicles. The company uses a custom application to collect vehicle data. The vehicles use the MQTT protocol to connect to the application. The company processes the data in 5-minute intervals. The company then copies vehicle telematics data to on-premises storage. Custom applications analyze this data to detect anomalies.

The number of vehicles that send data grows constantly. Newer vehicles generate high volumes of data. The on-premises storage solution is not able to scale for peak traffic, which results in data loss. The company must modernize the solution and migrate it to AWS to resolve the scaling challenges.

Which solution will meet these requirements with the LEAST operational overhead?

  1. Use AWS IoT Greengrass to send the vehicle data to Amazon Managed Streaming for Apache Kafka (Amazon MSK). Create an Apache Kafka application to store the data in Amazon S3. Use a pretrained model in Amazon SageMaker to detect anomalies.
  2. Use AWS IoT Core to receive the vehicle data. Configure rules to route data to an Amazon Kinesis Data Firehose delivery stream that stores the data in Amazon S3. Create an Amazon Kinesis Data Analytics application that reads from the delivery stream to detect anomalies.
  3. Use AWS IoT FleetWise to collect the vehicle data. Send the data to an Amazon Kinesis data stream. Use an Amazon Kinesis Data Firehose delivery stream to store the data in Amazon S3. Use the built-in machine learning transforms in AWS Glue to detect anomalies.
  4. Use Amazon MQ for RabbitMQ to collect the vehicle data. Send the data to an Amazon Kinesis Data Firehose delivery stream to store the data in Amazon S3. Use Amazon Lookout for Metrics to detect anomalies.

Answer(s): B

Explanation:

B) Using AWS IoT Core to receive vehicle data and routing it to Amazon Kinesis Data Firehose for storage in Amazon S3 provides a scalable and serverless solution with minimal operational overhead. Kinesis Data Analytics can process data in near real-time to detect anomalies, ensuring efficient processing and analysis of large volumes of data. This solution meets the scaling challenges and operational needs without requiring complex infrastructure management.
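
As a rough sketch of the ingestion path in option B, the IoT Core rule that forwards MQTT telemetry to the Firehose delivery stream could be created as follows (Python with boto3; the rule name, topic filter, role ARN, and delivery stream name are hypothetical):

```python
import boto3  # AWS SDK for Python

iot = boto3.client("iot")

# Hypothetical names and ARNs; the rule forwards every MQTT telemetry message to
# Kinesis Data Firehose, which batches and stores the records in Amazon S3.
iot.create_topic_rule(
    ruleName="vehicle_telemetry_to_firehose",
    topicRulePayload={
        "sql": "SELECT * FROM 'vehicles/+/telemetry'",
        "awsIotSqlVersion": "2016-03-23",
        "ruleDisabled": False,
        "actions": [
            {
                "firehose": {
                    "roleArn": "arn:aws:iam::123456789012:role/iot-to-firehose-role",
                    "deliveryStreamName": "vehicle-telemetry-delivery-stream",
                    "separator": "\n",
                }
            }
        ],
    },
)
```

Because both the rule and the delivery stream are fully managed, the ingestion path scales with the growing vehicle fleet without servers to operate.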



During an audit, a security team discovered that a development team was putting IAM user secret access keys in their code and then committing it to an AWS CodeCommit repository. The security team wants to automatically find and remediate instances of this security vulnerability.

Which solution will ensure that the credentials are appropriately secured automatically?

  1. Run a script nightly using AWS Systems Manager Run Command to search for credentials on the development instances. If found, use AWS Secrets Manager to rotate the credentials
  2. Use a scheduled AWS Lambda function to download and scan the application code from CodeCommit. If credentials are found, generate new credentials and store them in AWS KMS.
  3. Configure Amazon Macie to scan for credentials in CodeCommit repositories. If credentials are found, trigger an AWS Lambda function to disable the credentials and notify the user.
  4. Configure a CodeCommit trigger to invoke an AWS Lambda function to scan new code submissions for credentials. If credentials are found, disable them in AWS IAM and notify the user.

Answer(s): D

Explanation:

D) Configuring a CodeCommit trigger to invoke an AWS Lambda function ensures that new code submissions are automatically scanned for credentials. If credentials are found, the Lambda function can immediately disable them in AWS IAM and notify the user, providing a proactive and automated approach to securing the credentials. This solution effectively addresses the security vulnerability while minimizing operational overhead.
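
A minimal sketch of the Lambda handler behind option D, in Python with boto3 (the key-ID regex, the owner-lookup helper, and the notification step are illustrative assumptions, not a production scanner):

```python
import re

import boto3  # AWS SDK for Python

codecommit = boto3.client("codecommit")
iam = boto3.client("iam")

# Matches the AKIA... pattern used by long-term IAM user access key IDs.
ACCESS_KEY_PATTERN = re.compile(r"AKIA[0-9A-Z]{16}")

def _owner_of(access_key_id):
    """Walk IAM users to find who owns a leaked key (sketch only; paginate in practice)."""
    for user in iam.list_users()["Users"]:
        keys = iam.list_access_keys(UserName=user["UserName"])["AccessKeyMetadata"]
        if any(k["AccessKeyId"] == access_key_id for k in keys):
            return user["UserName"]
    return None

def handler(event, context):
    """Invoked by a CodeCommit trigger; scans the pushed commit for access key IDs."""
    for record in event["Records"]:
        repo = record["eventSourceARN"].split(":")[5]
        commit_id = record["codecommit"]["references"][0]["commit"]
        diffs = codecommit.get_differences(
            repositoryName=repo, afterCommitSpecifier=commit_id
        )["differences"]
        for diff in diffs:
            blob_id = diff.get("afterBlob", {}).get("blobId")
            if not blob_id:
                continue
            content = codecommit.get_blob(repositoryName=repo, blobId=blob_id)["content"]
            for key_id in ACCESS_KEY_PATTERN.findall(content.decode("utf-8", "ignore")):
                owner = _owner_of(key_id)
                if owner:
                    iam.update_access_key(
                        UserName=owner, AccessKeyId=key_id, Status="Inactive"
                    )
                # A notification (for example via Amazon SNS) would be sent here.
```

In practice the scan would also cover secret access keys and other credential formats, and the user notification could be delivered through Amazon SNS or email.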






Post your Comments and Discuss Amazon SAP-C01 exam with other Community members:

Mike commented on October 08, 2024
Not bad at all
CANADA
upvote

Petro UA commented on October 01, 2024
hate DNS questions. So need to practice more
UNITED STATES
upvote

Gilbert commented on September 14, 2024
Can't wait to pass mine
Anonymous
upvote

Paresh commented on April 19, 2023
There were only 3 new questions that I did not see in this exam dump. The rest of the questions were all word for word from this dump.
UNITED STATES
upvote

Matthew commented on October 18, 2022
An extremely helpful study package. I highly recommend.
UNITED STATES
upvote

Peter commented on June 23, 2022
I thought these were practice exam questions but they turned out to be real questions from the actual exam.
NETHERLANDS
upvote

Henry commented on September 29, 2021
I do not have the words to thank you guys. Passing this exam was creating many scary thoughts. I am glad I used your braindumps and passed. I can get a beer and relax now.
AUSTRALIA
upvote

Nik commented on April 12, 2021
I would not be able to pass my exam without your help. You guys rock!
SINGAPORE
upvote

Rohit commented on January 09, 2021
Thank you for the 50% sale. I really appreciate this price cut during this extraordinary time when everyone is having financial problems.
INDIA
upvote

Roger-That commented on December 23, 2020
The 20% holiday discount is a sweet deal. Thank you for the discount code.
UNITED STATES
upvote

Duke commented on October 23, 2020
It is helpful. Questions are real. Purchase is easy, but the only problem is there is no option to pay in Euro, only USD.
GERMANY
upvote

Tan Jin commented on September 09, 2020
The questions from this exam dump are valid. I got 88% in my exam today.
SINGAPORE
upvote

Dave commented on November 05, 2019
Useful practice questions to get a feel of the actual exam. Some of the answers are not correct so please exercise caution.
EUROPEAN UNION
upvote

Je commented on October 02, 2018
Great
UNITED STATES
upvote

Invisible Angel commented on January 11, 2018
Have yet to try. But most recommend it
NEW ZEALAND
upvote

Mic commented on December 26, 2017
Nice dumps, site is secure and checkout process is a breeze.
UNITED STATES
upvote