Free SAP-C01 Exam Braindumps (page: 15)


A company runs a new application as a static website in Amazon S3. The company has deployed the application to a production AWS account and uses Amazon CloudFront to deliver the website. The website calls an Amazon API Gateway REST API. An AWS Lambda function backs each API method.

The company wants to create a CSV report every 2 weeks to show each API Lambda function’s recommended configured memory, recommended cost, and the price difference between current configurations and the recommendations. The company will store the reports in an S3 bucket.

Which solution will meet these requirements with the LEAST development time?

  1. Create a Lambda function that extracts metrics data for each API Lambda function from Amazon CloudWatch Logs for the 2-week period. Collate the data into tabular format. Store the data as a .csv file in an S3 bucket. Create an Amazon EventBridge rule to schedule the Lambda function to run every 2 weeks.
  2. Opt in to AWS Compute Optimizer. Create a Lambda function that calls the ExportLambdaFunctionRecommendations operation. Export the .csv file to an S3 bucket. Create an Amazon EventBridge rule to schedule the Lambda function to run every 2 weeks.
  3. Opt in to AWS Compute Optimizer. Set up enhanced infrastructure metrics. Within the Compute Optimizer console, schedule a job to export the Lambda recommendations to a .csv file. Store the file in an S3 bucket every 2 weeks.
  4. Purchase the AWS Business Support plan for the production account. Opt in to AWS Compute Optimizer for AWS Trusted Advisor checks. In the Trusted Advisor console, schedule a job to export the cost optimization checks to a .csv file. Store the file in an S3 bucket every 2 weeks.

Answer(s): B

Explanation:

B) Opt in to AWS Compute Optimizer. Create a Lambda function that calls the ExportLambdaFunctionRecommendations operation. Export the .csv file to an S3 bucket. Create an Amazon EventBridge rule to schedule the Lambda function to run every 2 weeks is the correct solution.

AWS Compute Optimizer provides recommendations for optimizing the memory and performance of Lambda functions, including cost estimates. By using the ExportLambdaFunctionRecommendations operation, you can easily extract recommendations into a .csv file and store it in an S3 bucket. The solution involves minimal development effort because Compute Optimizer automatically provides the required data, and scheduling the Lambda function with Amazon EventBridge ensures the task runs every two weeks without manual intervention.

This solution meets the requirements for creating a report with recommended configurations and cost differences while minimizing development time.
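As a rough sketch of what the scheduled Lambda function might look like (the bucket name, key prefix, and the exact set of exported fields are assumptions, not part of the question), the Compute Optimizer export can be started with a single boto3 call:

```python
import boto3

# Compute Optimizer client; the account must already be opted in.
optimizer = boto3.client("compute-optimizer")

def handler(event, context):
    # Start an asynchronous export job. Compute Optimizer writes the .csv
    # report (plus a metadata file) directly to the destination S3 bucket.
    response = optimizer.export_lambda_function_recommendations(
        s3DestinationConfig={
            "bucket": "example-recommendation-reports",   # assumed bucket name
            "keyPrefix": "lambda-recommendations/",       # assumed key prefix
        },
        fileFormat="Csv",
        fieldsToExport=[                                  # assumed subset of fields
            "FunctionArn",
            "CurrentConfigurationMemorySize",
            "RecommendationOptionsConfigurationMemorySize",
            "RecommendationOptionsCostLow",
            "RecommendationOptionsCostHigh",
        ],
    )
    return response["jobId"]
```

An EventBridge rule with a schedule expression such as rate(14 days) can then invoke this function so the report is produced every two weeks without manual intervention.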



A company’s factory and automation applications are running in a single VPC. More than 20 applications run on a combination of Amazon EC2, Amazon Elastic Container Service (Amazon ECS), and Amazon RDS.

The company has software engineers spread across three teams. Each application is owned by one of the three teams, and each team is responsible for the cost and performance of all of its applications. Team resources have tags that represent their application and team. The teams use IAM access for daily activities.

The company needs to determine which costs on the monthly AWS bill are attributable to each application or team. The company also must be able to create reports to compare costs from the last 12 months and to help forecast costs for the next 12 months. A solutions architect must recommend an AWS Billing and Cost Management solution that provides these cost reports.

Which combination of actions will meet these requirements? (Choose three.)

  1. Activate the user-defined cost allocation tags that represent the application and the team.
  2. Activate the AWS-generated cost allocation tags that represent the application and the team.
  3. Create a cost category for each application in Billing and Cost Management.
  4. Activate IAM access to Billing and Cost Management.
  5. Create a cost budget.
  6. Enable Cost Explorer.

Answer(s): A,C,F

Explanation:

The correct answers are:

A) Activate the user-defined cost allocation tags that represent the application and the team: Activating user-defined cost allocation tags allows you to track costs based on specific tags associated with your resources, such as the application and team tags. This helps in attributing costs to the appropriate teams and applications.

C) Create a cost category for each application in Billing and Cost Management: Creating cost categories allows you to group your AWS costs by specific criteria (such as applications or teams) for better analysis and reporting. This is crucial for organizing and tracking costs across the teams.

F) Enable Cost Explorer: Cost Explorer helps visualize and analyze costs, providing historical data and forecasts for the next 12 months. It can generate detailed reports based on activated tags and cost categories, allowing the company to track costs and make informed budgeting decisions.

These actions allow the company to track costs by team and application, analyze trends over time, and create forecasts based on historical data.
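For illustration only (the tag key, date ranges, and metric are assumptions), once Cost Explorer is enabled and the cost allocation tags are activated, the same historical and forecast data can also be pulled programmatically through the Cost Explorer API:

```python
import boto3

ce = boto3.client("ce")  # Cost Explorer API

# Last 12 months of unblended cost, grouped by the activated "team" tag.
history = ce.get_cost_and_usage(
    TimePeriod={"Start": "2023-01-01", "End": "2024-01-01"},  # assumed dates
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "TAG", "Key": "team"}],                 # assumed tag key
)

# 12-month cost forecast for the account.
forecast = ce.get_cost_forecast(
    TimePeriod={"Start": "2024-01-01", "End": "2025-01-01"},  # assumed dates
    Metric="UNBLENDED_COST",
    Granularity="MONTHLY",
)
```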



An AWS customer has a web application that runs on premises. The web application fetches data from a third-party API that is behind a firewall. The third party accepts only one public CIDR block in each client’s allow list.

The customer wants to migrate their web application to the AWS Cloud. The application will be hosted on a set of Amazon EC2 instances behind an Application Load Balancer (ALB) in a VPC. The ALB is located in public subnets. The EC2 instances are located in private subnets. NAT gateways provide internet access to the private subnets.

How should a solutions architect ensure that the web application can continue to call the third-party API after the migration?

  1. Associate a block of customer-owned public IP addresses to the VPC. Enable public IP addressing for public subnets in the VPC.
  2. Register a block of customer-owned public IP addresses in the AWS account. Create Elastic IP addresses from the address block and assign them to the NAT gateways in the VPC.
  3. Create Elastic IP addresses from the block of customer-owned IP addresses. Assign the static Elastic IP addresses to the ALB.
  4. Register a block of customer-owned public IP addresses in the AWS account. Set up AWS Global Accelerator to use Elastic IP addresses from the address block. Set the ALB as the accelerator endpoint.

Answer(s): B

Explanation:

B) Register a block of customer-owned public IP addresses in the AWS account. Create Elastic IP addresses from the address block and assign them to the NAT gateways in the VPC is the correct answer.

By registering customer-owned public IP addresses with AWS (Bring Your Own IP), you ensure that the third-party API sees traffic originating from the trusted IP range that the third party has already added to its allow list. Assigning Elastic IP addresses from this block to the NAT gateways ensures that all traffic from the EC2 instances in the private subnets (which go through the NAT gateways for external access) uses the customer-owned IP addresses. This setup allows the web application to continue calling the third-party API as it did when running on premises.

This solution ensures that the migration to AWS does not break the integration with the third-party API while maintaining security and compliance with the API's firewall rules.
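As a hedged sketch (the BYOIP pool ID and subnet ID are placeholders), allocating an Elastic IP address from the customer-owned address pool and attaching it to a NAT gateway might look like this with boto3:

```python
import boto3

ec2 = boto3.client("ec2")

# Allocate an Elastic IP from the customer-owned (BYOIP) address pool
# that the third party already has on its allow list.
allocation = ec2.allocate_address(
    Domain="vpc",
    PublicIpv4Pool="ipv4pool-ec2-0123456789abcdef0",  # placeholder BYOIP pool ID
)

# Create a NAT gateway in a public subnet using that Elastic IP, so all
# outbound traffic from the private subnets egresses from the allow-listed range.
nat_gateway = ec2.create_nat_gateway(
    SubnetId="subnet-0123456789abcdef0",              # placeholder public subnet ID
    AllocationId=allocation["AllocationId"],
)
```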



A company with several AWS accounts is using AWS Organizations and service control policies (SCPs). An administrator created the following SCP and has attached it to an organizational unit (OU) that contains AWS account 1111-1111-1111:


Developers working in account 1111-1111-1111 complain that they cannot create Amazon S3 buckets. How should the administrator address this problem?

  1. Add s3:CreateBucket with “Allow” effect to the SCP.
  2. Remove the account from the OU, and attach the SCP directly to account 1111-1111-1111.
  3. Instruct the developers to add Amazon S3 permissions to their IAM entities.
  4. Remove the SCP from account 1111-1111-1111.

Answer(s): C

Explanation:

C) Instruct the developers to add Amazon S3 permissions to their IAM entities is the correct answer.

An SCP (Service Control Policy) defines what services and actions can be used within an AWS account, but it does not grant permissions on its own. It acts as a boundary. In this case, the SCP does not explicitly deny the ability to create Amazon S3 buckets, so the issue is that the developers' IAM roles or users are not assigned the appropriate permissions to create S3 buckets.

To resolve this, the developers need to have the necessary S3 permissions (e.g., s3:CreateBucket) in their IAM roles or policies. Once the appropriate permissions are added, they will be able to create S3 buckets, as the SCP does not restrict that action.
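A minimal sketch of the kind of identity-based policy the developers would need (the role name, policy name, and resource scope are assumptions) could be attached with boto3:

```python
import json
import boto3

iam = boto3.client("iam")

# Identity-based policy granting bucket creation. The SCP only sets the
# permission boundary; an explicit IAM allow is still required.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:CreateBucket"],
            "Resource": "*",
        }
    ],
}

iam.put_role_policy(
    RoleName="developer-role",                 # assumed role name
    PolicyName="AllowS3CreateBucket",          # assumed policy name
    PolicyDocument=json.dumps(policy_document),
)
```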





