Free SAP-C01 Exam Braindumps (page: 33)

Page 33 of 134

A company uses AWS Organizations for a multi-account setup in the AWS Cloud. The company uses AWS Control Tower for governance and uses AWS Transit Gateway for VPC connectivity across accounts.

In an AWS application account, the company’s application team has deployed a web application that uses AWS Lambda and Amazon RDS. The company's database administrators have a separate DBA account and use the account to centrally manage all the databases across the organization. The database administrators use an Amazon EC2 instance that is deployed in the DBA account to access an RDS database that is deployed in the application account.

The application team has stored the database credentials as secrets in AWS Secrets Manager in the application account. The application team is manually sharing the secrets with the database administrators. The secrets are encrypted by the default AWS managed key for Secrets Manager in the application account. A solutions architect needs to implement a solution that gives the database administrators access to the database and eliminates the need to manually share the secrets.

Which solution will meet these requirements?

  1. Use AWS Resource Access Manager (AWS RAM) to share the secrets from the application account with the DBA account. In the DBA account, create an IAM role that is named DBA-Admin. Grant the role the required permissions to access the shared secrets. Attach the DBA-Admin role to the EC2 instance for access to the cross-account secrets.
  2. In the application account, create an IAM role that is named DBA-Secret. Grant the role the required permissions to access the secrets. In the DBA account, create an IAM role that is named DBA-Admin. Grant the DBA-Admin role the required permissions to assume the DBA-Secret role in the application account. Attach the DBA-Admin role to the EC2 instance for access to the cross-account secrets.
  3. In the DBA account, create an IAM role that is named DBA-Admin. Grant the role the required permissions to access the secrets and the default AWS managed key in the application account. In the application account, attach resource-based policies to the key to allow access from the DBA account. Attach the DBA-Admin role to the EC2 instance for access to the cross-account secrets.
  4. In the DBA account, create an IAM role that is named DBA-Admin. Grant the role the required permissions to access the secrets in the application account. Attach an SCP to the application account to allow access to the secrets from the DBA account. Attach the DBA-Admin role to the EC2 instance for access to the cross-account secrets.

Answer(s): B

Explanation:

To provide the database administrators in the DBA account with access to the database credentials stored in AWS Secrets Manager in the application account while eliminating the need for manual sharing, the best solution is:
B) In the application account, create an IAM role that is named DBA-Secret. Grant the role the required permissions to access the secrets. In the DBA account, create an IAM role that is named DBA-Admin. Grant the DBA-Admin role the required permissions to assume the DBA-Secret role in the application account. Attach the DBA-Admin role to the EC2 instance for access to the cross-account secrets.
Why this is the best option:
- Cross-account role assumption: The DBA-Admin role in the DBA account assumes the DBA-Secret role in the application account, which grants the necessary permissions to access the secrets without any manual sharing.
- Works with the default AWS managed key: Because the assumed DBA-Secret role operates within the application account, Secrets Manager can still decrypt the secrets with the default AWS managed key. The policy of an AWS managed key cannot be modified to grant cross-account access, which is why sharing the secrets or the key directly (options A and C) does not work; SCPs only restrict permissions and cannot grant them, which rules out option D.
- Least privilege: Dedicated, narrowly scoped roles ensure that access is granted only to the required resources.
- Automation and security: Role assumption removes the manual sharing step, reduces operational overhead, and avoids hardcoded or copied credentials.
This approach effectively meets the requirement of seamless access to the secrets while maintaining security and governance standards across AWS accounts.
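The role chain described above can be sketched as three policy documents. This is a minimal illustration, not the exam's reference material: the account IDs, secret name, and Region in the ARNs are hypothetical placeholders.

```python
import json

# Hypothetical account IDs for illustration only.
APP_ACCOUNT = "111111111111"   # application account (owns the secret)
DBA_ACCOUNT = "222222222222"   # DBA account (owns the EC2 instance)

# Trust policy on the DBA-Secret role in the application account:
# allows the DBA-Admin role in the DBA account to assume it.
dba_secret_trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{DBA_ACCOUNT}:role/DBA-Admin"},
        "Action": "sts:AssumeRole",
    }],
}

# Permissions policy on DBA-Secret: read the database secret
# (secret name "db-credentials" is a made-up example).
dba_secret_permissions = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["secretsmanager:GetSecretValue", "secretsmanager:DescribeSecret"],
        "Resource": f"arn:aws:secretsmanager:*:{APP_ACCOUNT}:secret:db-credentials-*",
    }],
}

# Permissions policy on DBA-Admin in the DBA account: assume DBA-Secret.
dba_admin_permissions = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "sts:AssumeRole",
        "Resource": f"arn:aws:iam::{APP_ACCOUNT}:role/DBA-Secret",
    }],
}

print(json.dumps(dba_secret_trust_policy, indent=2))
```

Because decryption of the secret happens under the assumed DBA-Secret role inside the application account, the default AWS managed key continues to work with no key-policy changes.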



A company manages multiple AWS accounts by using AWS Organizations. Under the root OU, the company has two OUs: Research and DataOps.

Because of regulatory requirements, all resources that the company deploys in the organization must reside in the ap-northeast-1 Region. Additionally, EC2 instances that the company deploys in the DataOps OU must use a predefined list of instance types.

A solutions architect must implement a solution that applies these restrictions. The solution must maximize operational efficiency and must minimize ongoing maintenance.

Which combination of steps will meet these requirements? (Choose two.)

  1. Create an IAM role in one account under the DataOps OU. Use the ec2:InstanceType condition key in an inline policy on the role to restrict access to specific instance type.
  2. Create an IAM user in all accounts under the root OU. Use the aws:RequestedRegion condition key in an inline policy on each user to restrict access to all AWS Regions except ap-northeast-1.
  3. Create an SCP. Use the aws:RequestedRegion condition key to restrict access to all AWS Regions except ap-northeast-1. Apply the SCP to the root OU.
  4. Create an SCP. Use the ec2:Region condition key to restrict access to all AWS Regions except ap-northeast-1. Apply the SCP to the root OU, the DataOps OU, and the Research OU.
  5. Create an SCP. Use the ec2:InstanceType condition key to restrict access to specific instance types. Apply the SCP to the DataOps OU.

Answer(s): C,E

Explanation:

To ensure that all resources deployed in the organization reside in the ap-northeast-1 Region and that EC2 instances in the DataOps OU only use a predefined list of instance types, the best approach is:
C) Create an SCP. Use the aws:RequestedRegion condition key to restrict access to all AWS Regions except ap-northeast-1. Apply the SCP to the root OU.
E) Create an SCP. Use the ec2:InstanceType condition key to restrict access to specific instance types. Apply the SCP to the DataOps OU.
Why these options are effective:
- Option C establishes a clear and broad restriction on resource deployment across all accounts in the organization by limiting the available Regions to ap-northeast-1 only. This ensures regulatory compliance across the entire AWS Organization.
- Option E applies a targeted restriction to the DataOps OU, ensuring that only approved EC2 instance types can be launched, in line with company policy and compliance requirements.
Using Service Control Policies (SCPs) is a highly effective way to manage permissions at the organizational level, as it centralizes governance and minimizes ongoing maintenance. This approach eliminates the need for IAM roles or users with restrictive inline policies across individual accounts, promoting operational efficiency.
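The two SCPs can be sketched as follows. This is an illustrative outline only: the approved instance-type list is hypothetical, and a production Region SCP would normally exempt global services (IAM, CloudFront, Route 53, and so on) via NotAction, which is omitted here for brevity.

```python
import json

# SCP 1: deny all actions outside ap-northeast-1; attach to the root OU.
region_scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyOutsideApNortheast1",
        "Effect": "Deny",
        "Action": "*",
        "Resource": "*",
        "Condition": {
            "StringNotEquals": {"aws:RequestedRegion": "ap-northeast-1"}
        },
    }],
}

# SCP 2: deny launching any EC2 instance type not on the approved list;
# attach to the DataOps OU. The instance types shown are placeholders.
instance_type_scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyUnapprovedInstanceTypes",
        "Effect": "Deny",
        "Action": "ec2:RunInstances",
        "Resource": "arn:aws:ec2:*:*:instance/*",
        "Condition": {
            "StringNotEquals": {"ec2:InstanceType": ["t3.micro", "m5.large"]}
        },
    }],
}

print(json.dumps(region_scp, indent=2))
```

Both policies are deny-based, so they apply to every principal in the attached OUs with no per-account IAM maintenance.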



A company runs a serverless application in a single AWS Region. The application accesses external URLs and extracts metadata from those sites. The company uses an Amazon Simple Notification Service (Amazon SNS) topic to publish URLs to an Amazon Simple Queue Service (Amazon SQS) queue. An AWS Lambda function uses the queue as an event source and processes the URLs from the queue. Results are saved to an Amazon S3 bucket.

The company wants to process each URL in other Regions to compare possible differences in site localization. URLs must be published from the existing Region. Results must be written to the existing S3 bucket in the current Region.

Which combination of changes will produce multi-Region deployment that meets these requirements? (Choose two.)

  1. Deploy the SQS queue with the Lambda function to other Regions.
  2. Subscribe the SNS topic in each Region to the SQS queue.
  3. Subscribe the SQS queue in each Region to the SNS topic.
  4. Configure the SQS queue to publish URLs to SNS topics in each Region.
  5. Deploy the SNS topic and the Lambda function to other Regions.

Answer(s): A,C

Explanation:

A) Deploy the SQS queue with the Lambda function to other Regions. By deploying the SQS queue and the associated Lambda function in other Regions, the application can process URLs independently in those Regions. This allows for parallel processing and comparison of site localization without affecting the existing infrastructure.
C) Subscribe the SQS queue in each Region to the SNS topic. This setup enables the existing SNS topic in the original Region to notify the SQS queues in the other Regions. When a URL is published to the SNS topic, it is delivered to the SQS queues in all subscribed Regions. The Lambda functions in those Regions then process the URLs, ensuring that results are collected for comparison while keeping the source of URLs centralized.
By implementing these two changes, the company can effectively create a multi-Region deployment for processing URLs while maintaining the original Region for URL publishing and result storage.
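Per additional Region, the required pieces are a queue policy that lets the home-Region topic deliver messages and a cross-Region SQS subscription. The sketch below is illustrative only: the Region names, account ID, and resource names are hypothetical.

```python
# Hypothetical identifiers for illustration only.
HOME_REGION = "us-east-1"      # Region that publishes URLs via SNS
TARGET_REGION = "eu-west-1"    # a Region added for localization checks
ACCOUNT = "111111111111"

topic_arn = f"arn:aws:sns:{HOME_REGION}:{ACCOUNT}:url-topic"
queue_arn = f"arn:aws:sqs:{TARGET_REGION}:{ACCOUNT}:url-queue"

# Access policy on the queue in the target Region: allow the home-Region
# SNS topic (and only that topic) to deliver messages.
queue_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "sns.amazonaws.com"},
        "Action": "sqs:SendMessage",
        "Resource": queue_arn,
        "Condition": {"ArnEquals": {"aws:SourceArn": topic_arn}},
    }],
}

# Cross-Region subscription: the home-Region topic fans out to the queue in
# the target Region; the Lambda function there uses the queue as its event
# source and writes results back to the S3 bucket in the original Region.
subscription = {"TopicArn": topic_arn, "Protocol": "sqs", "Endpoint": queue_arn}
```

SNS supports subscribing an SQS queue in a different Region, so one topic in the publishing Region can fan out to every processing Region.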



A company runs a proprietary stateless ETL application on Amazon EC2 Linux instances. The application is a Linux binary, and the source code cannot be modified. The application is single-threaded, uses 2 GB of RAM, and is highly CPU intensive. The application is scheduled to run every 4 hours and runs for up to 20 minutes. A solutions architect wants to revise the architecture for the solution.

Which strategy should the solutions architect use?

  1. Use AWS Lambda to run the application. Use Amazon CloudWatch Logs to invoke the Lambda function every 4 hours.
  2. Use AWS Batch to run the application. Use an AWS Step Functions state machine to invoke the AWS Batch job every 4 hours.
  3. Use AWS Fargate to run the application. Use Amazon EventBridge (Amazon CloudWatch Events) to invoke the Fargate task every 4 hours.
  4. Use Amazon EC2 Spot Instances to run the application. Use AWS CodeDeploy to deploy and run the application every 4 hours.

Answer(s): C

Explanation:

C) Use AWS Fargate to run the application. Use Amazon EventBridge (Amazon CloudWatch Events) to invoke the Fargate task every 4 hours. This approach is ideal for a stateless application that can run in containers. AWS Fargate allows you to run containers without managing the underlying EC2 instances, simplifying operations and scaling. Given that the application is highly CPU intensive and single-threaded, Fargate can efficiently handle the resource allocation based on the defined requirements (2 GB of RAM and CPU needs). Using EventBridge to schedule the task every 4 hours aligns with the requirement to run the ETL job periodically.
This solution reduces operational overhead, leverages the serverless architecture of Fargate, and ensures that the application can be executed with the necessary resources without direct management of infrastructure.
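The schedule and task sizing can be outlined as below. The task size follows from the question (single-threaded and CPU-bound with 2 GB of RAM, so 1 vCPU with 2048 MB is a reasonable fit, and it is a valid Fargate combination); the cluster, role, task-definition, and subnet identifiers are hypothetical placeholders.

```python
# EventBridge schedule expression: run every 4 hours.
schedule_expression = "rate(4 hours)"

# Fargate task sizing: 1 vCPU (1024 CPU units), 2048 MB of memory.
task_size = {"cpu": "1024", "memory": "2048"}

# Sketch of an EventBridge rule target that launches the Fargate task on
# the schedule. All ARNs and the subnet ID are made-up examples.
ecs_target = {
    "Arn": "arn:aws:ecs:us-east-1:111111111111:cluster/etl-cluster",
    "RoleArn": "arn:aws:iam::111111111111:role/EventBridgeEcsRole",
    "EcsParameters": {
        "TaskDefinitionArn": "arn:aws:ecs:us-east-1:111111111111:task-definition/etl-task",
        "LaunchType": "FARGATE",
        "NetworkConfiguration": {
            "awsvpcConfiguration": {"Subnets": ["subnet-0abc1234"]}
        },
    },
}
```

Note that Lambda (option A) is ruled out regardless of trigger choice: its maximum execution time is 15 minutes, and the job can run for up to 20.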






Post your Comments and Discuss Amazon SAP-C01 exam with other Community members:

Mike commented on October 08, 2024
Not bad at all
CANADA
upvote

Petro UA commented on October 01, 2024
hate DNS questions. So need to practice more
UNITED STATES
upvote

Gilbert commented on September 14, 2024
Cant wait to pass mine
Anonymous
upvote

Paresh commented on April 19, 2023
There were only 3 new questions that I did not see in this exam dump. The rest of the questions were all word for word from this dump.
UNITED STATES
upvote

Matthew commented on October 18, 2022
An extremely helpful study package. I highly recommend.
UNITED STATES
upvote

Peter commented on June 23, 2022
I thought these were practice exam questions but they turned out to be real questions from the actual exam.
NETHERLANDS
upvote

Henry commented on September 29, 2021
I do not have the words to thank you guys. Passing this exam was creating many scary thoughts. I am glad I used your braindumps and passed. I can get a beer and relax now.
AUSTRALIA
upvote

Nik commented on April 12, 2021
I would not be able to pass my exam without your help. You guys rock!
SINGAPORE
upvote

Rohit commented on January 09, 2021
Thank you for the 50% sale. I really appreciate this price cut during this extraordinary time when everyone is having financial problems.
INDIA
upvote

Roger-That commented on December 23, 2020
The 20% holiday discount is a sweet deal. Thank you for the discount code.
UNITED STATES
upvote

Duke commented on October 23, 2020
It is helpful. Questions are real. Purchase is easy, but the only problem is there is no option to pay in Euro. Only USD.
GERMANY
upvote

Tan Jin commented on September 09, 2020
The questions from this exam dump are valid. I got 88% in my exam today.
SINGAPORE
upvote

Dave commented on November 05, 2019
Useful practice questions to get a feel of the actual exam. Some of the answers are not correct so please exercise caution.
EUROPEAN UNION
upvote

Je commented on October 02, 2018
Great
UNITED STATES
upvote

Invisible Angel commented on January 11, 2018
Have yet to try. But most recommend it
NEW ZEALAND
upvote

Mic commented on December 26, 2017
Nice dumps, site is secure and checkout process is a breeze.
UNITED STATES
upvote