Free AWS-SOLUTIONS-ARCHITECT-PROFESSIONAL Exam Braindumps (page: 70)

Page 70 of 134

A company uses AWS CloudFormation to deploy applications within multiple VPCs that are all attached to a transit gateway. Each VPC that sends traffic to the public internet must send the traffic through a shared services VPC. Each subnet within a VPC uses the default VPC route table, and the traffic is routed to the transit gateway. The transit gateway uses its default route table for any VPC attachment.

A security audit reveals that an Amazon EC2 instance that is deployed within a VPC can communicate with an EC2 instance that is deployed in any of the company's other VPCs. A solutions architect needs to limit the traffic between the VPCs. Each VPC must be able to communicate only with a predefined, limited set of authorized VPCs.

What should the solutions architect do to meet these requirements?

  1. Update the network ACL of each subnet within a VPC to allow outbound traffic only to the authorized VPCs. Remove all deny rules except the default deny rule.
  2. Update all the security groups that are used within a VPC to deny outbound traffic to security groups that are used within the unauthorized VPCs.
  3. Create a dedicated transit gateway route table for each VPC attachment. Route traffic only to the authorized VPCs.
  4. Update the main route table of each VPC to route traffic only to the authorized VPCs through the transit gateway.

Answer(s): C

Explanation:

C) Creating a dedicated transit gateway route table for each VPC attachment and routing traffic only to the authorized VPCs ensures that each VPC can communicate only with its predefined set of authorized VPCs. Because traffic is filtered at the transit gateway level, no changes to subnets or instances are needed: traffic can flow only along routes the architect explicitly adds to each attachment's route table, which meets the requirement to limit traffic between VPCs.
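A minimal AWS CLI sketch of this approach, assuming hypothetical transit gateway, attachment, route table, and CIDR values (all IDs below are placeholders):

```shell
# Create a dedicated route table for one VPC attachment
aws ec2 create-transit-gateway-route-table \
    --transit-gateway-id tgw-0123456789abcdef0

# Replace the attachment's default association with the dedicated table
aws ec2 disassociate-transit-gateway-route-table \
    --transit-gateway-route-table-id tgw-rtb-0default000000000 \
    --transit-gateway-attachment-id tgw-attach-0aaa1111bbb2222cc
aws ec2 associate-transit-gateway-route-table \
    --transit-gateway-route-table-id tgw-rtb-0ddd3333eee4444ff \
    --transit-gateway-attachment-id tgw-attach-0aaa1111bbb2222cc

# Add static routes only toward the authorized VPCs' CIDR ranges
aws ec2 create-transit-gateway-route \
    --destination-cidr-block 10.20.0.0/16 \
    --transit-gateway-route-table-id tgw-rtb-0ddd3333eee4444ff \
    --transit-gateway-attachment-id tgw-attach-0bbb5555ccc6666dd
```

Repeating this per attachment replaces the single shared default route table with per-VPC routing policies, so a VPC with no route toward another VPC simply cannot reach it.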



A company has a Windows-based desktop application that is packaged and deployed to the users' Windows machines. The company recently acquired another company that has employees who primarily use machines with a Linux operating system. The acquiring company has decided to migrate and rehost the Windows-based desktop application to AWS.

All employees must be authenticated before they use the application. The acquiring company uses Active Directory on premises but wants a simplified way to manage access to the application on AWS for all the employees.

Which solution will rehost the application on AWS with the LEAST development effort?

  1. Set up and provision an Amazon Workspaces virtual desktop for every employee. Implement authentication by using Amazon Cognito identity pools. Instruct employees to run the application from their provisioned Workspaces virtual desktops.
  2. Create an Auto Scaling group of Windows-based Amazon EC2 instances. Join each EC2 instance to the company’s Active Directory domain. Implement authentication by using the Active Directory that is running on premises. Instruct employees to run the application by using a Windows remote desktop.
  3. Use an Amazon AppStream 2.0 image builder to create an image that includes the application and the required configurations. Provision an AppStream 2.0 On-Demand fleet with dynamic Fleet Auto Scaling policies for running the image. Implement authentication by using AppStream 2.0 user pools. Instruct the employees to access the application by starting browser-based AppStream 2.0 streaming sessions.
  4. Refactor and containerize the application to run as a web-based application. Run the application in Amazon Elastic Container Service (Amazon ECS) on AWS Fargate with step scaling policies. Implement authentication by using Amazon Cognito user pools. Instruct the employees to run the application from their browsers.

Answer(s): C

Explanation:

C) Using Amazon AppStream 2.0 provides a quick way to rehost the Windows desktop application without significant redevelopment effort. AppStream 2.0 runs the application on AWS and streams it to users through a browser, so it works equally well from Windows and Linux machines. By combining an AppStream 2.0 On-Demand fleet with Fleet Auto Scaling, authentication through AppStream 2.0 user pools, and browser-based streaming sessions, the company can give all employees access without major changes to the existing application. This approach also simplifies authentication and access management, meeting the company's requirements.
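A hedged AWS CLI sketch of the fleet, stack, and user-pool steps described above; fleet, stack, image, and user names are hypothetical, and the image is assumed to have been created with an image builder beforehand:

```shell
# Create an On-Demand fleet from the image built with the image builder
aws appstream create-fleet \
    --name desktop-app-fleet \
    --image-name desktop-app-image \
    --instance-type stream.standard.medium \
    --fleet-type ON_DEMAND \
    --compute-capacity DesiredInstances=2

# Create a stack and associate the fleet with it
aws appstream create-stack --name desktop-app-stack
aws appstream associate-fleet \
    --fleet-name desktop-app-fleet \
    --stack-name desktop-app-stack

# Add an employee to the AppStream 2.0 user pool and grant stack access
aws appstream create-user \
    --user-name employee@example.com \
    --authentication-type USERPOOL
aws appstream batch-associate-user-stack \
    --user-stack-associations StackName=desktop-app-stack,UserName=employee@example.com,AuthenticationType=USERPOOL
```

The user then receives a sign-in link and streams the Windows application from any modern browser, regardless of the local operating system.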



A company is collecting a large amount of data from a fleet of IoT devices. Data is stored as Optimized Row Columnar (ORC) files in the Hadoop Distributed File System (HDFS) on a persistent Amazon EMR cluster. The company's data analytics team queries the data by using SQL in Apache Presto deployed on the same EMR cluster. Queries scan large amounts of data, always run for less than 15 minutes, and run only between 5 PM and 10 PM.

The company is concerned about the high cost associated with the current solution. A solutions architect must propose the most cost-effective solution that will allow SQL data queries.

Which solution will meet these requirements?

  1. Store data in Amazon S3. Use Amazon Redshift Spectrum to query data.
  2. Store data in Amazon S3. Use the AWS Glue Data Catalog and Amazon Athena to query data.
  3. Store data in EMR File System (EMRFS). Use Presto in Amazon EMR to query data.
  4. Store data in Amazon Redshift. Use Amazon Redshift to query data.

Answer(s): B

Explanation:

B) Storing data in Amazon S3 and using the AWS Glue Data Catalog along with Amazon Athena is the most cost-effective solution. Amazon Athena allows serverless SQL querying of data stored in S3, which is more cost-efficient than maintaining a persistent Amazon EMR cluster. Athena supports querying large datasets, such as the ORC files from the IoT devices, and since queries run only during a five-hour evening window, Athena's pay-per-query pricing is ideal. This setup eliminates the persistent EMR cluster, thus reducing costs.
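A sketch of the querying side with the AWS CLI, assuming hypothetical database, table, column, and bucket names, and assuming the ORC files have already been moved from HDFS to S3:

```shell
# Register the ORC data as a Glue Data Catalog table via an Athena DDL query
aws athena start-query-execution \
    --query-string "CREATE EXTERNAL TABLE iot_readings (device_id string, reading double) STORED AS ORC LOCATION 's3://example-iot-data/readings/'" \
    --query-execution-context Database=iot_db \
    --result-configuration OutputLocation=s3://example-query-results/

# Analysts then run ad hoc SQL, paying per query rather than per cluster-hour
aws athena start-query-execution \
    --query-string "SELECT device_id, avg(reading) AS avg_reading FROM iot_readings GROUP BY device_id" \
    --query-execution-context Database=iot_db \
    --result-configuration OutputLocation=s3://example-query-results/
```

Because Athena reads ORC natively and bills per data scanned, the nightly query window incurs cost only while queries actually run.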



A large company recently experienced an unexpected increase in Amazon RDS and Amazon DynamoDB costs. The company needs to increase visibility into details of AWS Billing and Cost Management. There are various accounts associated with AWS Organizations, including many development and production accounts. There is no consistent tagging strategy across the organization, but there are guidelines in place that require all infrastructure to be deployed using AWS CloudFormation with consistent tagging. Management requires cost center numbers and project ID numbers for all existing and future DynamoDB tables and RDS instances.

Which strategy should the solutions architect provide to meet these requirements?

  1. Use Tag Editor to tag existing resources. Create cost allocation tags to define the cost center and project ID and allow 24 hours for tags to propagate to existing resources.
  2. Use an AWS Config rule to alert the finance team of untagged resources. Create a centralized AWS Lambda-based solution to tag untagged RDS databases and DynamoDB resources every hour by using a cross-account role.
  3. Use Tag Editor to tag existing resources. Create cost allocation tags to define the cost center and project ID. Use SCPs to restrict the creation of resources that do not have the cost center and project ID tags.
  4. Create cost allocation tags to define the cost center and project ID and allow 24 hours for tags to propagate to existing resources. Update existing federated roles to restrict privileges to provision resources that do not include the cost center and project ID on the resource.

Answer(s): C

Explanation:

C) Using Tag Editor to tag existing resources and creating cost allocation tags for cost center and project ID ensures that all current resources are properly tagged. Implementing Service Control Policies (SCPs) to restrict the creation of new resources that do not include the required tags ensures compliance with the company’s guidelines moving forward. This strategy provides a way to apply consistent tagging across the organization, which aligns with the requirements for tagging all infrastructure deployed using AWS CloudFormation.
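A hedged sketch of the enforcement half with the AWS CLI; the policy name, description, and tag keys (CostCenter, ProjectID) are hypothetical, and the `Null` condition on `aws:RequestTag` denies any create request that omits the corresponding tag:

```shell
# Attach an SCP that denies DynamoDB table and RDS instance creation
# when either required cost allocation tag is missing from the request
aws organizations create-policy \
    --name deny-untagged-creates \
    --type SERVICE_CONTROL_POLICY \
    --description "Require CostCenter and ProjectID tags on create" \
    --content '{
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "DenyCreateWithoutCostCenter",
          "Effect": "Deny",
          "Action": ["dynamodb:CreateTable", "rds:CreateDBInstance"],
          "Resource": "*",
          "Condition": { "Null": { "aws:RequestTag/CostCenter": "true" } }
        },
        {
          "Sid": "DenyCreateWithoutProjectID",
          "Effect": "Deny",
          "Action": ["dynamodb:CreateTable", "rds:CreateDBInstance"],
          "Resource": "*",
          "Condition": { "Null": { "aws:RequestTag/ProjectID": "true" } }
        }
      ]
    }'
```

Two statements are used because condition keys within a single statement are ANDed: separate statements ensure the create is denied if either tag is missing, not only when both are.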





