Free DAS-C01 Exam Braindumps (page: 6)


Three teams of data analysts use Apache Hive on an Amazon EMR cluster with the EMR File System (EMRFS) to query data stored in each team's Amazon S3 bucket. The EMR cluster has Kerberos enabled and is configured to authenticate users from the corporate Active Directory. The data is highly sensitive, so access must be limited to the members of each team.
Which steps will satisfy the security requirements?

  1. For the EMR cluster Amazon EC2 instances, create a service role that grants no access to Amazon S3. Create three additional IAM roles, each granting access to each team's specific bucket. Add the additional IAM roles to the cluster's EMR role for the EC2 trust policy. Create a security configuration mapping for the additional IAM roles to Active Directory user groups for each team.
  2. For the EMR cluster Amazon EC2 instances, create a service role that grants no access to Amazon S3. Create three additional IAM roles, each granting access to each team's specific bucket. Add the service role for the EMR cluster EC2 instances to the trust policies for the additional IAM roles. Create a security configuration mapping for the additional IAM roles to Active Directory user groups for each team.
  3. For the EMR cluster Amazon EC2 instances, create a service role that grants full access to Amazon S3. Create three additional IAM roles, each granting access to each team's specific bucket. Add the service role for the EMR cluster EC2 instances to the trust policies for the additional IAM roles. Create a security configuration mapping for the additional IAM roles to Active Directory user groups for each team.
  4. For the EMR cluster Amazon EC2 instances, create a service role that grants full access to Amazon S3. Create three additional IAM roles, each granting access to each team's specific bucket. Add the service role for the EMR cluster EC2 instances to the trust policies for the base IAM roles. Create a security configuration mapping for the additional IAM roles to Active Directory user groups for each team.

Answer(s): B
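To make the correct option concrete, here is a minimal boto3 sketch of an EMR security configuration that maps per-team IAM roles to Active Directory groups for EMRFS requests. The role ARNs, group names, and configuration name are hypothetical placeholders, not values from the question.

```python
import json

import boto3

# Hypothetical role ARNs and Active Directory group names, one pair per team.
ROLE_MAPPINGS = [
    ("arn:aws:iam::111122223333:role/team-a-s3-access", "team-a-analysts"),
    ("arn:aws:iam::111122223333:role/team-b-s3-access", "team-b-analysts"),
    ("arn:aws:iam::111122223333:role/team-c-s3-access", "team-c-analysts"),
]

# EMRFS role mappings: when a user in the AD group issues an S3 request through
# EMRFS, the cluster assumes the mapped role instead of the instance profile.
security_configuration = {
    "AuthorizationConfiguration": {
        "EmrFsConfiguration": {
            "RoleMappings": [
                {
                    "Role": role_arn,
                    "IdentifierType": "Group",
                    "Identifiers": [ad_group],
                }
                for role_arn, ad_group in ROLE_MAPPINGS
            ]
        }
    }
}

emr = boto3.client("emr")
emr.create_security_configuration(
    Name="per-team-emrfs-access",
    SecurityConfiguration=json.dumps(security_configuration),
)
```

For this mapping to work, each additional role's trust policy must allow the EMR cluster's EC2 service role (which itself grants no S3 access) to assume it, which is the detail that distinguishes option B from option A.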



A company is planning to create a data lake in Amazon S3. The company wants to create tiered storage based on access patterns and cost objectives. The solution must include support for JDBC connections from legacy clients, metadata management that allows federation for access control, and batch-based ETL using PySpark and Scala. Operational management should be limited. Which combination of components can meet these requirements? (Choose three.)

  1. AWS Glue Data Catalog for metadata management
  2. Amazon EMR with Apache Spark for ETL
  3. AWS Glue for Scala-based ETL
  4. Amazon EMR with Apache Hive for JDBC clients
  5. Amazon Athena for querying data in Amazon S3 using JDBC drivers
  6. Amazon EMR with Apache Hive, using a MySQL-compatible Amazon RDS database as the backing metastore

Answer(s): A,C,E


Reference:

https://d1.awsstatic.com/whitepapers/Storage/data-lake-on-aws.pdf
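The Athena component of the answer can be exercised programmatically as well as over JDBC. Below is a hedged boto3 sketch that runs a query against a table registered in the AWS Glue Data Catalog; the database, table, and results bucket names are hypothetical. Legacy clients would instead connect through the Athena JDBC driver, but the execution path through the Data Catalog is the same.

```python
import time

import boto3

athena = boto3.client("athena")

# Hypothetical Glue Data Catalog database/table and query results location.
query = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS events FROM sales GROUP BY event_date",
    QueryExecutionContext={"Catalog": "AwsDataCatalog", "Database": "datalake_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = query["QueryExecutionId"]

# Poll until the query reaches a terminal state, then print the result rows.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```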



A company wants to optimize the cost of its data and analytics platform. The company is ingesting a number of .csv and JSON files in Amazon S3 from various data sources. Incoming data is expected to be 50 GB each day. The company is using Amazon Athena to query the raw data in Amazon S3 directly. Most queries aggregate data from the past 12 months, and data that is older than 5 years is infrequently queried. The typical query scans about 500 MB of data and is expected to return results in less than 1 minute. The raw data must be retained indefinitely for compliance requirements.
Which solution meets the company's requirements?

  1. Use an AWS Glue ETL job to compress, partition, and convert the data into a columnar data format. Use Athena to query the processed dataset. Configure a lifecycle policy to move the processed data into the Amazon S3 Standard-Infrequent Access (S3 Standard-IA) storage class 5 years after object creation. Configure a second lifecycle policy to move the raw data into Amazon S3 Glacier for long-term archival 7 days after object creation.
  2. Use an AWS Glue ETL job to partition and convert the data into a row-based data format. Use Athena to query the processed dataset. Configure a lifecycle policy to move the data into the Amazon S3 Standard-Infrequent Access (S3 Standard-IA) storage class 5 years after object creation. Configure a second lifecycle policy to move the raw data into Amazon S3 Glacier for long-term archival 7 days after object creation.
  3. Use an AWS Glue ETL job to compress, partition, and convert the data into a columnar data format. Use Athena to query the processed dataset. Configure a lifecycle policy to move the processed data into the Amazon S3 Standard-Infrequent Access (S3 Standard-IA) storage class 5 years after the object was last accessed. Configure a second lifecycle policy to move the raw data into Amazon S3 Glacier for long-term archival 7 days after the last date the object was accessed.
  4. Use an AWS Glue ETL job to partition and convert the data into a row-based data format. Use Athena to query the processed dataset. Configure a lifecycle policy to move the data into the Amazon S3 Standard-Infrequent Access (S3 Standard-IA) storage class 5 years after the object was last accessed. Configure a second lifecycle policy to move the raw data into Amazon S3 Glacier for long-term archival 7 days after the last date the object was accessed.

Answer(s): A
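The two moving parts of option A can be sketched briefly. First, a minimal AWS Glue PySpark job that converts the raw .csv/JSON data into compressed, partitioned Parquet; the catalog database, table, output path, and partition columns are hypothetical.

```python
# A Glue job script; the awsglue modules are available on the Glue Spark runtime.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Hypothetical catalog database/table registered over the raw .csv and JSON data.
raw = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="events"
)

# Write compressed, partitioned Parquet so Athena scans far less than the raw data.
glue_context.write_dynamic_frame.from_options(
    frame=raw,
    connection_type="s3",
    connection_options={
        "path": "s3://example-datalake/processed/events/",
        "partitionKeys": ["year", "month"],
    },
    format="parquet",
    format_options={"compression": "snappy"},
)

job.commit()
```

Second, a hedged boto3 sketch of the two lifecycle rules. S3 lifecycle transitions are evaluated against days since object creation, which is why the creation-based rules in option A work while the last-accessed-based rules in options C and D do not; the bucket name and prefixes are again placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and prefixes; "Days" counts from object creation.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-datalake",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-raw-after-7-days",
                "Filter": {"Prefix": "raw/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 7, "StorageClass": "GLACIER"}],
            },
            {
                "ID": "tier-processed-after-5-years",
                "Filter": {"Prefix": "processed/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 1825, "StorageClass": "STANDARD_IA"}],
            },
        ]
    },
)
```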



An energy company collects voltage data in real time from sensors that are attached to buildings. The company wants to receive notifications when a sequence of two voltage drops is detected within 10 minutes of a sudden voltage increase at the same building. All notifications must be delivered as quickly as possible. The system must be highly available. The company needs a solution that will automatically scale when this monitoring feature is implemented in other cities. The notification system is subscribed to an Amazon Simple Notification Service (Amazon SNS) topic for remediation.
Which solution will meet these requirements?

  1. Create an Amazon Managed Streaming for Apache Kafka cluster to ingest the data. Use Apache Spark Streaming with the Apache Kafka consumer API in an automatically scaled Amazon EMR cluster to process the incoming data. Use the Spark Streaming application to detect the known event sequence and send the SNS message.
  2. Create a REST-based web service by using Amazon API Gateway in front of an AWS Lambda function. Create an Amazon RDS for PostgreSQL database with sufficient Provisioned IOPS to meet current demand. Configure the Lambda function to store incoming events in the RDS for PostgreSQL database, query the latest data to detect the known event sequence, and send the SNS message.
  3. Create an Amazon Kinesis Data Firehose delivery stream to capture the incoming sensor data. Use an AWS Lambda transformation function to detect the known event sequence and send the SNS message.
  4. Create an Amazon Kinesis data stream to capture the incoming sensor data. Create another stream for notifications. Set up AWS Application Auto Scaling on both streams. Create an Amazon Kinesis Data Analytics for Java application to detect the known event sequence, and add a message to the message stream. Configure an AWS Lambda function to poll the message stream and publish to the SNS topic.

Answer(s): D


Reference:

https://aws.amazon.com/kinesis/data-streams/faqs/
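The detection here is stateful across records and time: per building, a sudden increase must be followed by two drops within 10 minutes. A Kinesis Data Firehose Lambda transformation processes buffered batches without durable per-building state, which is why the Kinesis data stream plus Kinesis Data Analytics for Java (Apache Flink) option is the better fit. The sketch below shows the detection logic in plain Python against a stream of readings; it is a conceptual illustration only, with hypothetical thresholds, topic ARN, and function names, not the Flink application itself.

```python
import time
from collections import defaultdict

import boto3

SNS_TOPIC_ARN = "arn:aws:sns:us-east-1:111122223333:voltage-alerts"  # hypothetical
SPIKE_THRESHOLD = 15.0   # hypothetical voltage delta treated as a sudden increase
DROP_THRESHOLD = -10.0   # hypothetical voltage delta treated as a drop
WINDOW_SECONDS = 600     # two drops must follow the spike within 10 minutes

sns = boto3.client("sns")
last_reading = {}                    # building_id -> last voltage seen
spike_at = {}                        # building_id -> timestamp of last sudden increase
drops_since_spike = defaultdict(int)

def process_reading(building_id, voltage, timestamp=None):
    """Update per-building state and publish an alert when the sequence completes."""
    timestamp = timestamp or time.time()
    previous = last_reading.get(building_id)
    last_reading[building_id] = voltage
    if previous is None:
        return

    delta = voltage - previous
    if delta >= SPIKE_THRESHOLD:
        # New spike resets the 10-minute window and the drop counter.
        spike_at[building_id] = timestamp
        drops_since_spike[building_id] = 0
    elif delta <= DROP_THRESHOLD and building_id in spike_at:
        if timestamp - spike_at[building_id] <= WINDOW_SECONDS:
            drops_since_spike[building_id] += 1
            if drops_since_spike[building_id] == 2:
                sns.publish(
                    TopicArn=SNS_TOPIC_ARN,
                    Message=f"Voltage spike followed by two drops at building {building_id}",
                )
        else:
            # Spike is older than the window; forget it until a new one is observed.
            spike_at.pop(building_id, None)
            drops_since_spike[building_id] = 0
```

In the chosen architecture, this per-building state machine would live inside the Kinesis Data Analytics for Java application reading the sensor stream, with the Lambda function forwarding detections from the notification stream to the SNS topic.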


