Free AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL Exam Braindumps (page: 33)

Page 33 of 134

A company uses AWS Organizations for a multi-account setup in the AWS Cloud. The company uses AWS Control Tower for governance and uses AWS Transit Gateway for VPC connectivity across accounts.

In an AWS application account, the company’s application team has deployed a web application that uses AWS Lambda and Amazon RDS. The company's database administrators have a separate DBA account and use the account to centrally manage all the databases across the organization. The database administrators use an Amazon EC2 instance that is deployed in the DBA account to access an RDS database that is deployed in the application account.

The application team has stored the database credentials as secrets in AWS Secrets Manager in the application account. The application team is manually sharing the secrets with the database administrators. The secrets are encrypted by the default AWS managed key for Secrets Manager in the application account. A solutions architect needs to implement a solution that gives the database administrators access to the database and eliminates the need to manually share the secrets.

Which solution will meet these requirements?

  1. Use AWS Resource Access Manager (AWS RAM) to share the secrets from the application account with the DBA account. In the DBA account, create an IAM role that is named DBA-Admin. Grant the role the required permissions to access the shared secrets. Attach the DBA-Admin role to the EC2 instance for access to the cross-account secrets.
  2. In the application account, create an IAM role that is named DBA-Secret. Grant the role the required permissions to access the secrets. In the DBA account, create an IAM role that is named DBA-Admin. Grant the DBA-Admin role the required permissions to assume the DBA-Secret role in the application account. Attach the DBA-Admin role to the EC2 instance for access to the cross-account secrets.
  3. In the DBA account, create an IAM role that is named DBA-Admin. Grant the role the required permissions to access the secrets and the default AWS managed key in the application account. In the application account, attach resource-based policies to the key to allow access from the DBA account. Attach the DBA-Admin role to the EC2 instance for access to the cross-account secrets.
  4. In the DBA account, create an IAM role that is named DBA-Admin. Grant the role the required permissions to access the secrets in the application account. Attach an SCP to the application account to allow access to the secrets from the DBA account. Attach the DBA-Admin role to the EC2 instance for access to the cross-account secrets.

Answer(s): B

Explanation:

To provide the database administrators in the DBA account with access to the database credentials stored in AWS Secrets Manager in the application account while eliminating the need for manual sharing, the best solution is:
B) In the application account, create an IAM role that is named DBA-Secret. Grant the role the required permissions to access the secrets. In the DBA account, create an IAM role that is named DBA-Admin. Grant the DBA-Admin role the required permissions to assume the DBA-Secret role in the application account. Attach the DBA-Admin role to the EC2 instance for access to the cross-account secrets.
Why this is the best option:
- Cross-account role assumption: the DBA-Admin role in the DBA account assumes the DBA-Secret role in the application account, which grants the necessary permissions to access the secrets without manual intervention.
- Least-privilege access: dedicated roles ensure that access is controlled and granted only to the required resources, adhering to the principle of least privilege.
- Automation and security: automating access through role assumption reduces operational overhead and improves security by avoiding hardcoded credentials and manual sharing.
This approach effectively meets the requirement of seamless access to the secrets while maintaining security and governance standards across AWS accounts.
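The role-assumption pattern in option B boils down to two IAM policy documents, sketched below as Python dicts. The account IDs, Region, and secret name are placeholders for illustration, not values from the question.

```python
import json

# Hypothetical account IDs for illustration only.
APP_ACCOUNT = "111111111111"   # application account
DBA_ACCOUNT = "222222222222"   # DBA account

# Trust policy on the DBA-Secret role (application account):
# allows the DBA-Admin role in the DBA account to assume it.
dba_secret_trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{DBA_ACCOUNT}:role/DBA-Admin"},
        "Action": "sts:AssumeRole",
    }],
}

# Permissions policy on DBA-Secret: read the database secret.
# The secret name pattern is a hypothetical example.
dba_secret_permissions = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["secretsmanager:GetSecretValue",
                   "secretsmanager:DescribeSecret"],
        "Resource": f"arn:aws:secretsmanager:us-east-1:{APP_ACCOUNT}:secret:rds-db-credentials-*",
    }],
}

print(json.dumps(dba_secret_trust_policy, indent=2))
```

Note that because DBA-Admin ends up operating as a role inside the application account after `sts:AssumeRole`, the secret's default AWS managed KMS key still decrypts it; AWS managed keys cannot be granted to other accounts, which is why the options that try to share the secret or the key cross-account fall short.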



A company manages multiple AWS accounts by using AWS Organizations. Under the root OU, the company has two OUs: Research and DataOps.

Because of regulatory requirements, all resources that the company deploys in the organization must reside in the ap-northeast-1 Region. Additionally, EC2 instances that the company deploys in the DataOps OU must use a predefined list of instance types.

A solutions architect must implement a solution that applies these restrictions. The solution must maximize operational efficiency and must minimize ongoing maintenance.

Which combination of steps will meet these requirements? (Choose two.)

  1. Create an IAM role in one account under the DataOps OU. Use the ec2:InstanceType condition key in an inline policy on the role to restrict access to specific instance types.
  2. Create an IAM user in all accounts under the root OU. Use the aws:RequestedRegion condition key in an inline policy on each user to restrict access to all AWS Regions except ap-northeast-1.
  3. Create an SCP. Use the aws:RequestedRegion condition key to restrict access to all AWS Regions except ap-northeast-1. Apply the SCP to the root OU.
  4. Create an SCP. Use the ec2:Region condition key to restrict access to all AWS Regions except ap-northeast-1. Apply the SCP to the root OU, the DataOps OU, and the Research OU.
  5. Create an SCP. Use the ec2:InstanceType condition key to restrict access to specific instance types. Apply the SCP to the DataOps OU.

Answer(s): C,E

Explanation:

To ensure that all resources deployed in the organization reside in the ap-northeast-1 Region and that EC2 instances in the DataOps OU only use a predefined list of instance types, the best approach is:
C) Create an SCP. Use the aws:RequestedRegion condition key to restrict access to all AWS Regions except ap-northeast-1. Apply the SCP to the root OU.
E) Create an SCP. Use the ec2:InstanceType condition key to restrict access to specific instance types. Apply the SCP to the DataOps OU.
Why these options are effective:
- Option C establishes a broad restriction on resource deployment across all accounts in the organization by limiting the available Regions to ap-northeast-1 only. This ensures regulatory compliance across the entire AWS Organization.
- Option E applies a targeted restriction to the DataOps OU, ensuring that only approved EC2 instance types can be launched, in line with company policy and compliance requirements.
Using Service Control Policies (SCPs) is a highly effective way to manage permissions at the organizational level, as it centralizes governance and minimizes ongoing maintenance. This approach eliminates the need for IAM roles or users with restrictive inline policies across individual accounts, promoting operational efficiency.
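The two SCPs can be sketched as deny-based policy documents, built here as Python dicts. The global-service exemption list and the approved instance types are illustrative assumptions, not part of the question.

```python
import json

# SCP 1 - deny actions outside ap-northeast-1; attached to the root OU.
# Global services (IAM, Organizations, STS, etc.) are typically exempted
# via NotAction; the short list here is an illustrative assumption.
region_scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyOutsideApNortheast1",
        "Effect": "Deny",
        "NotAction": ["iam:*", "organizations:*", "sts:*", "support:*"],
        "Resource": "*",
        "Condition": {
            "StringNotEquals": {"aws:RequestedRegion": "ap-northeast-1"}
        },
    }],
}

# SCP 2 - deny EC2 launches with unapproved instance types; attached to
# the DataOps OU. The allow list below is a hypothetical example.
instance_type_scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyUnapprovedInstanceTypes",
        "Effect": "Deny",
        "Action": "ec2:RunInstances",
        "Resource": "arn:aws:ec2:*:*:instance/*",
        "Condition": {
            "StringNotEquals": {
                "ec2:InstanceType": ["t3.micro", "t3.small", "m5.large"]
            }
        },
    }],
}

print(json.dumps(region_scp, indent=2))
print(json.dumps(instance_type_scp, indent=2))
```

Because SCPs are deny lists evaluated for every principal in the attached OU, no per-account IAM users or inline policies are needed, which is what keeps ongoing maintenance minimal.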



A company runs a serverless application in a single AWS Region. The application accesses external URLs and extracts metadata from those sites. The company uses an Amazon Simple Notification Service (Amazon SNS) topic to publish URLs to an Amazon Simple Queue Service (Amazon SQS) queue. An AWS Lambda function uses the queue as an event source and processes the URLs from the queue. Results are saved to an Amazon S3 bucket.

The company wants to process each URL in other Regions to compare possible differences in site localization. URLs must be published from the existing Region. Results must be written to the existing S3 bucket in the current Region.

Which combination of changes will produce multi-Region deployment that meets these requirements? (Choose two.)

  1. Deploy the SQS queue with the Lambda function to other Regions.
  2. Subscribe the SNS topic in each Region to the SQS queue.
  3. Subscribe the SQS queue in each Region to the SNS topic.
  4. Configure the SQS queue to publish URLs to SNS topics in each Region.
  5. Deploy the SNS topic and the Lambda function to other Regions.

Answer(s): A,C

Explanation:

A) Deploy the SQS queue with the Lambda function to other Regions. By deploying the SQS queue and the associated Lambda function in other Regions, the application can process URLs independently in those Regions. This allows for parallel processing and comparison of site localization without affecting the existing infrastructure.
C) Subscribe the SQS queue in each Region to the SNS topic. This setup enables the existing SNS topic in the original Region to notify the SQS queues in the other Regions. When a URL is published to the SNS topic, it is delivered to the SQS queues in all subscribed Regions. The Lambda functions in those Regions can then process the URLs, ensuring that results are collected for comparison while keeping the source of URLs centralized.
By implementing these two changes, the company can effectively create a multi-Region deployment for processing URLs while maintaining the original Region for URL publishing and result storage.
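The cross-Region fan-out amounts to a queue policy in each new Region plus a subscription on the existing topic. A minimal sketch follows; the topic and queue ARNs are hypothetical.

```python
import json

# Hypothetical ARNs for illustration only.
SNS_TOPIC_ARN = "arn:aws:sns:us-east-1:111111111111:url-topic"   # existing Region
SQS_QUEUE_ARN = "arn:aws:sqs:eu-west-1:111111111111:url-queue"   # new Region

# Access policy on the queue in the new Region: allow the SNS topic
# in the original Region to deliver messages to it.
queue_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "sns.amazonaws.com"},
        "Action": "sqs:SendMessage",
        "Resource": SQS_QUEUE_ARN,
        "Condition": {"ArnEquals": {"aws:SourceArn": SNS_TOPIC_ARN}},
    }],
}

# SNS supports subscribing an SQS queue in another Region; with boto3
# (shown as comments, since the call needs live AWS credentials):
#   sns = boto3.client("sns", region_name="us-east-1")
#   sns.subscribe(TopicArn=SNS_TOPIC_ARN, Protocol="sqs",
#                 Endpoint=SQS_QUEUE_ARN)

print(json.dumps(queue_policy, indent=2))
```

Each remote Lambda function writes its results back to the original S3 bucket; S3 bucket names are global, so only the bucket policy needs to permit the remote functions' execution roles.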



A company runs a proprietary stateless ETL application on Amazon EC2 Linux instances. The application is a Linux binary, and the source code cannot be modified. The application is single-threaded, uses 2 GB of RAM, and is highly CPU intensive. The application is scheduled to run every 4 hours and runs for up to 20 minutes. A solutions architect wants to revise the architecture for the solution.

Which strategy should the solutions architect use?

  1. Use AWS Lambda to run the application. Use Amazon CloudWatch Logs to invoke the Lambda function every 4 hours.
  2. Use AWS Batch to run the application. Use an AWS Step Functions state machine to invoke the AWS Batch job every 4 hours.
  3. Use AWS Fargate to run the application. Use Amazon EventBridge (Amazon CloudWatch Events) to invoke the Fargate task every 4 hours.
  4. Use Amazon EC2 Spot Instances to run the application. Use AWS CodeDeploy to deploy and run the application every 4 hours.

Answer(s): C

Explanation:

C) Use AWS Fargate to run the application. Use Amazon EventBridge (Amazon CloudWatch Events) to invoke the Fargate task every 4 hours. This approach is ideal for a stateless application that can run in containers. AWS Fargate lets you run containers without managing the underlying EC2 instances, simplifying operations and scaling. Given that the application is highly CPU intensive and single-threaded, Fargate can efficiently handle the resource allocation based on the defined requirements (2 GB of RAM and CPU needs). Using EventBridge to schedule the task every 4 hours aligns with the requirement to run the ETL job periodically.
This solution reduces operational overhead, leverages the serverless architecture of Fargate, and ensures that the application can be executed with the necessary resources without direct management of infrastructure.
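The scheduling piece can be sketched as an EventBridge rule with an ECS/Fargate target, shown here as configuration dicts. The cluster, task definition, role, and subnet identifiers are hypothetical placeholders.

```python
# Schedule: invoke the task every 4 hours, matching the question.
schedule_expression = "rate(4 hours)"

# Hypothetical EventBridge target pointing at a Fargate task.
ecs_target = {
    "Arn": "arn:aws:ecs:us-east-1:111111111111:cluster/etl-cluster",
    "RoleArn": "arn:aws:iam::111111111111:role/eventbridge-ecs-invoke",
    "EcsParameters": {
        "TaskDefinitionArn": "arn:aws:ecs:us-east-1:111111111111:task-definition/etl-task",
        "LaunchType": "FARGATE",
        "NetworkConfiguration": {
            "awsvpcConfiguration": {
                "Subnets": ["subnet-0example"],
                "AssignPublicIp": "ENABLED",
            }
        },
    },
}

# Fargate task sizing for a single-threaded, CPU-bound, 2 GB workload:
# 1 vCPU / 2 GB is a valid Fargate CPU-memory combination.
task_size = {"cpu": "1024", "memory": "2048"}

# With boto3 (shown as comments, since the calls need live credentials):
#   events = boto3.client("events")
#   events.put_rule(Name="etl-every-4h",
#                   ScheduleExpression=schedule_expression)
#   events.put_targets(Rule="etl-every-4h",
#                      Targets=[{"Id": "etl", **ecs_target}])
```

The Linux binary is packaged unmodified into a container image, which is why Fargate works even though the source code cannot be changed; Lambda, by contrast, is ruled out by its 15-minute execution limit against the up-to-20-minute runtime.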





