Free AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL Exam Braindumps (page: 25)

Page 25 of 134

A company has introduced a new policy that allows employees to work remotely from their homes if they connect by using a VPN. The company is hosting internal applications with VPCs in multiple AWS accounts. Currently, the applications are accessible from the company's on-premises office network through an AWS Site-to-Site VPN connection. The VPC in the company's main AWS account has peering connections established with VPCs in other AWS accounts.

A solutions architect must design a scalable AWS Client VPN solution for employees to use while they work from home.

What is the MOST cost-effective solution that meets these requirements?

  1. Create a Client VPN endpoint in each AWS account. Configure required routing that allows access to internal applications.
  2. Create a Client VPN endpoint in the main AWS account. Configure required routing that allows access to internal applications.
  3. Create a Client VPN endpoint in the main AWS account. Provision a transit gateway that is connected to each AWS account. Configure required routing that allows access to internal applications.
  4. Create a Client VPN endpoint in the main AWS account. Establish connectivity between the Client VPN endpoint and the AWS Site-to-Site VPN.

Answer(s): B

Explanation:

B) Create a Client VPN endpoint in the main AWS account. Configure required routing that allows access to internal applications.

Creating a Client VPN endpoint in the main AWS account is the most cost-effective solution because it centralizes the VPN management while allowing remote employees to access internal applications hosted across multiple VPCs. By configuring the necessary routing, employees can connect securely to the applications without needing multiple VPN endpoints in each account, reducing both complexity and cost. This setup leverages the existing infrastructure efficiently while ensuring scalability as more users connect remotely.
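The routing described above can be sketched in code. This is a minimal, hypothetical boto3 sketch (the endpoint ID, subnet ID, and CIDR blocks are placeholders, not values from the question): with a single Client VPN endpoint in the main account, one route per peered-VPC CIDR is added, targeting the subnet the endpoint is associated with.

```python
# Sketch: route configuration for a single Client VPN endpoint in the
# main account so remote clients can reach applications in peered VPCs.
# All IDs and CIDRs below are placeholders.

def peered_route_specs(subnet_id, cidrs):
    """Build one route spec per peered-VPC CIDR, targeting the subnet
    that the Client VPN endpoint is associated with."""
    return [
        {"DestinationCidrBlock": cidr, "TargetVpcSubnetId": subnet_id}
        for cidr in cidrs
    ]

def add_peered_routes(endpoint_id, subnet_id, cidrs):
    import boto3  # imported here so the helper above stays dependency-free
    ec2 = boto3.client("ec2")
    for spec in peered_route_specs(subnet_id, cidrs):
        ec2.create_client_vpn_route(
            ClientVpnEndpointId=endpoint_id,
            Description="Route to peered VPC",
            **spec,
        )
```

A caller in the main account would invoke `add_peered_routes("cvpn-endpoint-...", "subnet-...", ["10.1.0.0/16", "10.2.0.0/16"])` once per peered VPC range; no per-account endpoints or transit gateway are needed for this scale.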



A company is running an application in the AWS Cloud. Recent application metrics show inconsistent response times and a significant increase in error rates. Calls to third-party services are causing the delays. Currently, the application calls third-party services synchronously by directly invoking an AWS Lambda function.

A solutions architect needs to decouple the third-party service calls and ensure that all the calls are eventually completed.

Which solution will meet these requirements?

  1. Use an Amazon Simple Queue Service (Amazon SQS) queue to store events and invoke the Lambda function.
  2. Use an AWS Step Functions state machine to pass events to the Lambda function.
  3. Use an Amazon EventBridge rule to pass events to the Lambda function.
  4. Use an Amazon Simple Notification Service (Amazon SNS) topic to store events and invoke the Lambda function.
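This placeholder anchor is unused.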

Answer(s): A

Explanation:

A) Use an Amazon Simple Queue Service (Amazon SQS) queue to store events and invoke the Lambda function.

Using Amazon SQS allows the application to decouple the calls to third-party services, which helps in managing inconsistent response times and reducing error rates. By placing the service calls in an SQS queue, the application can continue processing other tasks while the Lambda function retrieves and processes the messages asynchronously. This design ensures that all calls to third-party services are eventually completed, even if there are temporary issues, thus enhancing the overall reliability and performance of the application.
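The decoupling pattern above can be sketched as follows. This is a hedged illustration, not the company's actual code: `invoke_third_party` is a hypothetical stand-in for the real service call, and the producer/consumer split shows how SQS retry semantics guarantee eventual completion.

```python
import json

def invoke_third_party(payload):
    """Placeholder for the real third-party call; assumed to raise on failure."""
    raise NotImplementedError("wire up the real third-party client here")

def extract_calls(sqs_event):
    """Pull third-party call payloads out of an SQS-triggered Lambda event."""
    return [json.loads(record["body"]) for record in sqs_event.get("Records", [])]

def handler(event, context):
    # Lambda entry point: SQS invokes this with a batch of messages.
    # An unhandled exception returns the message to the queue for retry,
    # so every call is eventually completed (or parked in a dead-letter
    # queue after the configured number of attempts).
    for call in extract_calls(event):
        invoke_third_party(call)

def enqueue_call(queue_url, payload):
    """Producer side: enqueue the call instead of invoking Lambda directly."""
    import boto3  # imported here so the pure helpers above stay dependency-free
    boto3.client("sqs").send_message(
        QueueUrl=queue_url, MessageBody=json.dumps(payload)
    )
```

The application calls `enqueue_call` and moves on; the queue absorbs third-party slowness instead of propagating it back as increased response times and errors.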



A company is running applications on AWS in a multi-account environment. The company's sales team and marketing team use separate AWS accounts in AWS Organizations.

The sales team stores petabytes of data in an Amazon S3 bucket. The marketing team uses Amazon QuickSight for data visualizations. The marketing team needs access to data that the sales team stores in the S3 bucket. The company has encrypted the S3 bucket with an AWS Key Management Service (AWS KMS) key. The marketing team has already created the IAM service role for QuickSight to provide QuickSight access in the marketing AWS account. The company needs a solution that will provide secure access to the data in the S3 bucket across AWS accounts.

Which solution will meet these requirements with the LEAST operational overhead?

  1. Create a new S3 bucket in the marketing account. Create an S3 replication rule in the sales account to copy the objects to the new S3 bucket in the marketing account. Update the QuickSight permissions in the marketing account to grant access to the new S3 bucket.
  2. Create an SCP to grant access to the S3 bucket to the marketing account. Use AWS Resource Access Manager (AWS RAM) to share the KMS key from the sales account with the marketing account. Update the QuickSight permissions in the marketing account to grant access to the S3 bucket.
  3. Update the S3 bucket policy in the marketing account to grant access to the QuickSight role. Create a KMS grant for the encryption key that is used in the S3 bucket. Grant decrypt access to the QuickSight role. Update the QuickSight permissions in the marketing account to grant access to the S3 bucket.
  4. Create an IAM role in the sales account and grant access to the S3 bucket. From the marketing account, assume the IAM role in the sales account to access the S3 bucket. Update the QuickSight role to create a trust relationship with the new IAM role in the sales account.

Answer(s): D

Explanation:

D) Create an IAM role in the sales account and grant access to the S3 bucket. From the marketing account, assume the IAM role in the sales account to access the S3 bucket. Update the QuickSight role to create a trust relationship with the new IAM role in the sales account.

This solution allows the marketing team to access the S3 bucket in the sales account securely without duplicating data or creating additional complexities. By creating an IAM role in the sales account that has permissions to access the S3 bucket and allowing the marketing account to assume that role, the company maintains a single source of truth for the data and leverages IAM roles for secure cross-account access. Additionally, this method requires minimal operational overhead since it does not involve data replication or the need for complex configurations.
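The cross-account flow above has two halves: a trust policy on the sales-account role naming the marketing QuickSight role as a principal, and an `sts:AssumeRole` call from the marketing side. A minimal sketch, assuming hypothetical role names and ARNs:

```python
def trust_policy(marketing_role_arn):
    """Trust policy attached to the sales-account role so the marketing
    account's QuickSight role is allowed to assume it."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": marketing_role_arn},
            "Action": "sts:AssumeRole",
        }],
    }

def s3_client_via_sales_role(role_arn):
    """Assume the sales-account role and return an S3 client that uses
    its temporary credentials to read the sales bucket."""
    import boto3  # imported here so trust_policy stays dependency-free
    creds = boto3.client("sts").assume_role(
        RoleArn=role_arn, RoleSessionName="marketing-quicksight"
    )["Credentials"]
    return boto3.client(
        "s3",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
```

Because access rides on temporary credentials scoped to the sales-account role, no data is copied and no keys are shared across accounts.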



A company is planning to migrate its business-critical applications from an on-premises data center to AWS. The company has an on-premises installation of a Microsoft SQL Server Always On cluster. The company wants to migrate to an AWS managed database service. A solutions architect must design a heterogeneous database migration on AWS.

Which solution will meet these requirements?

  1. Migrate the SQL Server databases to Amazon RDS for MySQL by using backup and restore utilities.
  2. Use an AWS Snowball Edge Storage Optimized device to transfer data to Amazon S3. Set up Amazon RDS for MySQL. Use S3 integration with SQL Server features, such as BULK INSERT.
  3. Use the AWS Schema Conversion Tool to translate the database schema to Amazon RDS for MySQL. Then use AWS Database Migration Service (AWS DMS) to migrate the data from on-premises databases to Amazon RDS.
  4. Use AWS DataSync to migrate data over the network between on-premises storage and Amazon S3. Set up Amazon RDS for MySQL. Use S3 integration with SQL Server features, such as BULK INSERT.

Answer(s): C

Explanation:

C) Use the AWS Schema Conversion Tool to translate the database schema to Amazon RDS for MySQL. Then use AWS Database Migration Service (AWS DMS) to migrate the data from on-premises databases to Amazon RDS.

This solution effectively addresses the requirement for a heterogeneous database migration from Microsoft SQL Server to Amazon RDS for MySQL. The AWS Schema Conversion Tool helps to convert the database schema from SQL Server to the MySQL format, ensuring compatibility. AWS DMS then facilitates the data migration while minimizing downtime, as it can replicate ongoing changes from the source database to the target. This approach is suitable for business-critical applications, providing a seamless transition with the necessary tools to manage schema changes and data migration efficiently.
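After the schema has been converted with SCT, the DMS side of the migration boils down to a replication task configured for full load plus change data capture (CDC). A hedged boto3 sketch (the task identifier, table-mapping rule, and ARNs are illustrative assumptions, not values from the question):

```python
import json

def replication_task_config(source_arn, target_arn, instance_arn):
    """Arguments for a DMS replication task that runs a full load and
    then replicates ongoing changes (CDC), keeping cutover downtime low."""
    return {
        "ReplicationTaskIdentifier": "sqlserver-to-mysql",
        "SourceEndpointArn": source_arn,
        "TargetEndpointArn": target_arn,
        "ReplicationInstanceArn": instance_arn,
        "MigrationType": "full-load-and-cdc",
        "TableMappings": json.dumps({
            "rules": [{
                "rule-type": "selection",
                "rule-id": "1",
                "rule-name": "include-all",
                "object-locator": {"schema-name": "%", "table-name": "%"},
                "rule-action": "include",
            }]
        }),
    }

def start_migration(source_arn, target_arn, instance_arn):
    import boto3  # imported here so the config helper stays dependency-free
    dms = boto3.client("dms")
    dms.create_replication_task(
        **replication_task_config(source_arn, target_arn, instance_arn)
    )
```

The `full-load-and-cdc` migration type is what lets the business-critical application keep writing to SQL Server while DMS catches the target up, so the final cutover window is short.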






Post your Comments and Discuss Amazon AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL exam with other Community members:

Zak commented on June 28, 2024
@AppleKid, I managed to pass this exam after failing once. Do not sit for your exam without memorizing these questions. These are what you will see in the real exam.
Anonymous

Apple Kid commented on June 26, 2024
Did anyone take the exam recently? Can you tell if these are good?
Anonymous

Captain commented on June 26, 2024
This is so helpful
Anonymous

udaya commented on April 25, 2024
Still learning, and the questions seem to be helpful.
Anonymous

Jerry commented on February 18, 2024
very good for exam !!!!
HONG KONG

AWS-Guy commented on February 16, 2024
Precise and to the point. I aced this exam and am now going for the next one. Very grateful to this site and its wonderful content.
CANADA

Jerry commented on February 12, 2024
very good exam stuff
HONG KONG

travis head commented on November 16, 2023
I took the Amazon SAP-C02 test and prepared from this site, as it has the latest mock tests available, which helped me evaluate my performance and score 919/1000.
Anonymous

Weed Flipper commented on October 07, 2020
This is good stuff man.
CANADA

IT-Guy commented on September 29, 2020
Xengine software is good and free. Too bad it is only in English and no support for French.
FRANCE

pema commented on August 30, 2019
Can I have the latest version of this exam?
GERMANY

MrSimha commented on February 23, 2019
Thank you
Anonymous

Phil C. commented on November 12, 2018
Too soon to tell, but I will be back to post a review after my exam.
Anonymous

MD EJAZ ALI TANWIR commented on August 20, 2017
This is valid dump in US. Thank you guys for providing this.
UNITED STATES

flypig commented on June 02, 2017
The braindumps will shorten my preparation time for this exam!
CHINA