Free AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL Exam Braindumps (page: 60)


A company is developing a gene reporting device that will collect genomic information to assist researchers with collecting large samples of data from a diverse population. The device will push 8 KB of genomic data every second to a data platform that will need to process and analyze the data and provide information back to researchers. The data platform must meet the following requirements:

• Provide near-real-time analytics of the inbound genomic data
• Ensure the data is flexible, parallel, and durable
• Deliver results of processing to a data warehouse

Which strategy should a solutions architect use to meet these requirements?

  A. Use Amazon Kinesis Data Firehose to collect the inbound sensor data, analyze the data with Kinesis clients, and save the results to an Amazon RDS instance.
  B. Use Amazon Kinesis Data Streams to collect the inbound sensor data, analyze the data with Kinesis clients, and save the results to an Amazon Redshift cluster using Amazon EMR.
  C. Use Amazon S3 to collect the inbound device data, analyze the data from Amazon SQS with Kinesis, and save the results to an Amazon Redshift cluster.
  D. Use an Amazon API Gateway to put requests into an Amazon SQS queue, analyze the data with an AWS Lambda function, and save the results to an Amazon Redshift cluster using Amazon EMR.

Answer(s): B

Explanation:

B) Using Amazon Kinesis Data Streams to collect inbound genomic data provides a flexible, scalable, and durable stream for real-time analytics. Kinesis clients can be used to analyze the data in parallel, ensuring near-real-time processing. The results can be stored in an Amazon Redshift cluster using Amazon EMR for further analysis, which meets the requirement of saving processed data in a data warehouse. This strategy offers low-latency analytics and scalable data processing.
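For readers who want to see the producer side of this design, here is a minimal boto3 sketch of a device pushing its 8 KB readings into a Kinesis data stream. The stream name, partition key, and payload shape are illustrative assumptions, not values from the question.

import json
import boto3

kinesis = boto3.client("kinesis")

def send_reading(device_id: str, payload: dict) -> None:
    # Each call ships one ~8 KB genomic reading; using the device ID as the
    # partition key keeps a single device's records ordered within a shard.
    kinesis.put_record(
        StreamName="genomic-data",  # hypothetical stream name
        Data=json.dumps(payload).encode("utf-8"),
        PartitionKey=device_id,
    )

Downstream, Kinesis Client Library consumers can read the shards in parallel and write aggregated results toward Redshift, as option B describes.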



A solutions architect needs to define a reference architecture for three-tier applications with web, application, and NoSQL data layers. The reference architecture must meet the following requirements:

• High availability within an AWS Region
• Able to fail over in 1 minute to another AWS Region for disaster recovery
• Provide the most efficient solution while minimizing the impact on the user experience

Which combination of steps will meet these requirements? (Choose three.)

  A. Use an Amazon Route 53 weighted routing policy set to 100/0 across the two selected Regions. Set Time to Live (TTL) to 1 hour.
  B. Use an Amazon Route 53 failover routing policy for failover from the primary Region to the disaster recovery Region. Set Time to Live (TTL) to 30 seconds.
  C. Use a global table within Amazon DynamoDB so data can be accessed in the two selected Regions.
  D. Back up data from an Amazon DynamoDB table in the primary Region every 60 minutes and then write the data to Amazon S3. Use S3 cross-Region replication to copy the data from the primary Region to the disaster recovery Region. Have a script import the data into DynamoDB in a disaster recovery scenario.
  E. Implement a hot standby model using Auto Scaling groups for the web and application layers across multiple Availability Zones in the Regions. Use zonal Reserved Instances for the minimum number of servers and On-Demand Instances for any additional resources.
  F. Use Auto Scaling groups for the web and application layers across multiple Availability Zones in the Regions. Use Spot Instances for the required resources.

Answer(s): B,C,E

Explanation:

B) Using an Amazon Route 53 failover routing policy with a TTL of 30 seconds allows for quick failover from the primary Region to the disaster recovery Region, meeting the requirement for 1-minute failover.
C) Implementing a DynamoDB global table ensures that data is available in both Regions simultaneously, providing high availability and disaster recovery without the need for manual data replication.
E) A hot standby model with Auto Scaling groups ensures high availability across multiple Availability Zones and Regions, while using zonal Reserved Instances for base capacity and On-Demand Instances for scaling helps balance cost and performance.
This combination meets the requirements for high availability, disaster recovery, and minimizing the impact on the user experience.
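As an illustration of option B, the sketch below creates the PRIMARY and SECONDARY failover record sets with a 30-second TTL using boto3. The hosted zone ID, record name, IP addresses, and health check ID are placeholders, not values from the question.

import boto3

route53 = boto3.client("route53")

def upsert_failover(zone_id, name, ip, role, health_check_id=None):
    record = {
        "Name": name,
        "Type": "A",
        "SetIdentifier": role.lower(),
        "Failover": role,            # "PRIMARY" or "SECONDARY"
        "TTL": 30,                   # low TTL so resolvers re-query quickly
        "ResourceRecords": [{"Value": ip}],
    }
    if health_check_id:
        # The primary record needs a health check so Route 53 can detect
        # failure and start answering with the secondary record.
        record["HealthCheckId"] = health_check_id
    route53.change_resource_record_sets(
        HostedZoneId=zone_id,
        ChangeBatch={"Changes": [{"Action": "UPSERT",
                                  "ResourceRecordSet": record}]},
    )

# Hypothetical usage: one record per Region.
# upsert_failover("Z123EXAMPLE", "app.example.com", "203.0.113.10",
#                 "PRIMARY", health_check_id="abcd1234-example")
# upsert_failover("Z123EXAMPLE", "app.example.com", "198.51.100.20",
#                 "SECONDARY")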



A company manufactures smart vehicles. The company uses a custom application to collect vehicle data. The vehicles use the MQTT protocol to connect to the application. The company processes the data in 5-minute intervals. The company then copies vehicle telematics data to on-premises storage. Custom applications analyze this data to detect anomalies.

The number of vehicles that send data grows constantly. Newer vehicles generate high volumes of data. The on-premises storage solution is not able to scale for peak traffic, which results in data loss. The company must modernize the solution and migrate the solution to AWS to resolve the scaling challenges.

Which solution will meet these requirements with the LEAST operational overhead?

  A. Use AWS IoT Greengrass to send the vehicle data to Amazon Managed Streaming for Apache Kafka (Amazon MSK). Create an Apache Kafka application to store the data in Amazon S3. Use a pretrained model in Amazon SageMaker to detect anomalies.
  B. Use AWS IoT Core to receive the vehicle data. Configure rules to route data to an Amazon Kinesis Data Firehose delivery stream that stores the data in Amazon S3. Create an Amazon Kinesis Data Analytics application that reads from the delivery stream to detect anomalies.
  C. Use AWS IoT FleetWise to collect the vehicle data. Send the data to an Amazon Kinesis data stream. Use an Amazon Kinesis Data Firehose delivery stream to store the data in Amazon S3. Use the built-in machine learning transforms in AWS Glue to detect anomalies.
  D. Use Amazon MQ for RabbitMQ to collect the vehicle data. Send the data to an Amazon Kinesis Data Firehose delivery stream to store the data in Amazon S3. Use Amazon Lookout for Metrics to detect anomalies.

Answer(s): B

Explanation:

B) Using AWS IoT Core to receive vehicle data and routing it to Amazon Kinesis Data Firehose for storage in Amazon S3 provides a scalable and serverless solution with minimal operational overhead. Kinesis Data Analytics can process data in near real-time to detect anomalies, ensuring efficient processing and analysis of large volumes of data. This solution meets the scaling challenges and operational needs without requiring complex infrastructure management.
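To make option B concrete, here is a minimal boto3 sketch of the IoT Core topic rule that forwards MQTT telemetry into a Firehose delivery stream. The topic filter, rule name, role ARN, and stream name are assumptions for illustration only.

import boto3

iot = boto3.client("iot")

iot.create_topic_rule(
    ruleName="vehicle_telemetry_to_firehose",
    topicRulePayload={
        # Match every vehicle's telemetry topic, e.g. vehicles/<vin>/telemetry
        "sql": "SELECT * FROM 'vehicles/+/telemetry'",
        "ruleDisabled": False,
        "actions": [{
            "firehose": {
                "roleArn": "arn:aws:iam::123456789012:role/iot-firehose-role",
                "deliveryStreamName": "vehicle-telemetry",
                "separator": "\n",  # newline-delimit records for S3 analytics
            }
        }],
    },
)

Firehose then buffers and writes the records to S3 with no servers to manage, which is what keeps the operational overhead low.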



During an audit, a security team discovered that a development team was putting IAM user secret access keys in their code and then committing it to an AWS CodeCommit repository. The security team wants to automatically find and remediate instances of this security vulnerability.

Which solution will ensure that the credentials are appropriately secured automatically?

  A. Run a script nightly using AWS Systems Manager Run Command to search for credentials on the development instances. If found, use AWS Secrets Manager to rotate the credentials.
  B. Use a scheduled AWS Lambda function to download and scan the application code from CodeCommit. If credentials are found, generate new credentials and store them in AWS KMS.
  C. Configure Amazon Macie to scan for credentials in CodeCommit repositories. If credentials are found, trigger an AWS Lambda function to disable the credentials and notify the user.
  D. Configure a CodeCommit trigger to invoke an AWS Lambda function to scan new code submissions for credentials. If credentials are found, disable them in AWS IAM and notify the user.

Answer(s): D

Explanation:

D) Configuring a CodeCommit trigger to invoke an AWS Lambda function ensures that new code submissions are automatically scanned for credentials. If credentials are found, the Lambda function can immediately disable them in AWS IAM and notify the user, providing a proactive and automated approach to securing the credentials. This solution effectively addresses the security vulnerability while minimizing operational overhead.
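A minimal sketch of the trigger-invoked Lambda function for option D follows. The regex, diff walking, and key-disabling flow are illustrative; a production version would paginate get_differences, also scan for secret key material, and notify the committer (for example via Amazon SNS).

import re
import boto3

codecommit = boto3.client("codecommit")
iam = boto3.client("iam")

# IAM access key IDs have a well-known AKIA prefix; matching them lets us
# look up and disable the leaked credential.
ACCESS_KEY_RE = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def handler(event, context):
    for record in event["Records"]:
        # The repository name is the resource portion of the event source ARN.
        repo = record["eventSourceARN"].split(":")[5]
        for ref in record["codecommit"]["references"]:
            scan_commit(repo, ref["commit"])

def scan_commit(repo, commit_id):
    diffs = codecommit.get_differences(
        repositoryName=repo, afterCommitSpecifier=commit_id
    )["differences"]
    for diff in diffs:
        blob_id = diff.get("afterBlob", {}).get("blobId")
        if not blob_id:
            continue
        content = codecommit.get_blob(
            repositoryName=repo, blobId=blob_id
        )["content"].decode("utf-8", errors="ignore")
        for key_id in ACCESS_KEY_RE.findall(content):
            disable_key(key_id)

def disable_key(key_id):
    # Map the leaked access key ID back to its IAM user, then deactivate it.
    user = iam.get_access_key_last_used(AccessKeyId=key_id)["UserName"]
    iam.update_access_key(UserName=user, AccessKeyId=key_id,
                          Status="Inactive")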






Post your comments and discuss the Amazon AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL exam with other community members:

Zak commented on June 28, 2024
@AppleKid, I managed to pass this exam after failing once. Do not sit for your exam without memorizing these questions. These are what you will see in the real exam.
Anonymous
upvote

Apple Kid commented on June 26, 2024
Did anyone take the exam recently? Can you tell if these are good?
Anonymous
upvote

Captain commented on June 26, 2024
This is so helpful
Anonymous
upvote

udaya commented on April 25, 2024
Still learning, and the questions seem to be helpful.
Anonymous
upvote

Jerry commented on February 18, 2024
Very good for the exam!!!!
HONG KONG
upvote

AWS-Guy commented on February 16, 2024
Precise and to the point. I aced this exam and am now going for the next exam. Very grateful to this site and its wonderful content.
CANADA
upvote

Jerry commented on February 12, 2024
very good exam stuff
HONG KONG
upvote

travis head commented on November 16, 2023
I took the Amazon SAP-C02 test and prepared from this site, as it has the latest mock tests available, which helped me evaluate my performance and score 919/1000.
Anonymous
upvote

Weed Flipper commented on October 07, 2020
This is good stuff man.
CANADA
upvote

IT-Guy commented on September 29, 2020
Xengine software is good and free. Too bad it is only in English and has no support for French.
FRANCE
upvote

pema commented on August 30, 2019
Can I have the latest version of this exam?
GERMANY
upvote

MrSimha commented on February 23, 2019
Thank you
Anonymous
upvote

Phil C. commented on November 12, 2018
Too soon to tell, but I will be back to post a review after my exam.
Anonymous
upvote

MD EJAZ ALI TANWIR commented on August 20, 2017
This is a valid dump in the US. Thank you guys for providing this.
UNITED STATES
upvote

flypig commented on June 02, 2017
The braindumps will shorten my prep time for this exam!
CHINA
upvote