Free DAS-C01 Exam Braindumps (page: 18)


A company uses the Amazon Kinesis SDK to write data to Kinesis Data Streams. Compliance requirements state that the data must be encrypted at rest using a key that can be rotated. The company wants to meet this encryption requirement with minimal coding effort.
How can these requirements be met?

  1. Create a customer master key (CMK) in AWS KMS. Assign the CMK an alias. Use the AWS Encryption SDK, providing it with the key alias to encrypt and decrypt the data.
  2. Create a customer master key (CMK) in AWS KMS. Assign the CMK an alias. Enable server-side encryption on the Kinesis data stream using the CMK alias as the KMS master key.
  3. Create a customer master key (CMK) in AWS KMS. Create an AWS Lambda function to encrypt and decrypt the data. Set the KMS key ID in the function's environment variables.
  4. Enable server-side encryption on the Kinesis data stream using the default KMS key for Kinesis Data Streams.

Answer(s): B


Reference:

https://aws.amazon.com/kinesis/data-streams/faqs/
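
Server-side encryption can be enabled on an existing stream with a single API call, which is why option B requires minimal coding. Below is a minimal sketch using boto3; the stream name ("transactions") and CMK alias ("alias/kinesis-cmk") are illustrative assumptions, not values from the question.

```python
import boto3

kinesis = boto3.client("kinesis")

# Enable server-side encryption on the stream using the CMK alias.
# Because the alias is used as the KeyId, the underlying CMK can be
# rotated in AWS KMS without changing this call or any producer code.
kinesis.start_stream_encryption(
    StreamName="transactions",   # assumed stream name
    EncryptionType="KMS",
    KeyId="alias/kinesis-cmk",   # assumed alias of the CMK created in KMS
)
```

After this call, Kinesis transparently encrypts records at rest and decrypts them for authorized consumers, so no encryption logic is needed in the application itself.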



A company wants to enrich application logs in near-real-time and use the enriched dataset for further analysis. The application is running on Amazon EC2 instances across multiple Availability Zones and storing its logs using Amazon CloudWatch Logs. The enrichment source is stored in an Amazon DynamoDB table.
Which solution meets the requirements for the event collection and enrichment?

  1. Use a CloudWatch Logs subscription to send the data to Amazon Kinesis Data Firehose. Use AWS Lambda to transform the data in the Kinesis Data Firehose delivery stream and enrich it with the data in the DynamoDB table. Configure Amazon S3 as the Kinesis Data Firehose delivery destination.
  2. Export the raw logs to Amazon S3 on an hourly basis using the AWS CLI. Use AWS Glue crawlers to catalog the logs. Set up an AWS Glue connection for the DynamoDB table and set up an AWS Glue ETL job to enrich the data. Store the enriched data in Amazon S3.
  3. Configure the application to write the logs locally and use Amazon Kinesis Agent to send the data to Amazon Kinesis Data Streams. Configure a Kinesis Data Analytics SQL application with the Kinesis data stream as the source. Join the SQL application input stream with DynamoDB records, and then store the enriched output stream in Amazon S3 using Amazon Kinesis Data Firehose.
  4. Export the raw logs to Amazon S3 on an hourly basis using the AWS CLI. Use Apache Spark SQL on Amazon EMR to read the logs from Amazon S3 and enrich the records with the data from DynamoDB. Store the enriched data in Amazon S3.

Answer(s): A
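
To make the chosen flow concrete, here is a hedged sketch of the Lambda function that Kinesis Data Firehose would invoke to transform and enrich each record. The recordId/result/data contract is the standard Firehose transformation interface; the DynamoDB table name ("log-enrichment") and key attribute ("app_id") are illustrative assumptions.

```python
import base64
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("log-enrichment")  # assumed enrichment table


def handler(event, context):
    output = []
    for record in event["records"]:
        # Firehose delivers each record base64-encoded.
        payload = json.loads(base64.b64decode(record["data"]))

        # Look up the enrichment attributes for this log event
        # (the "app_id" key attribute is an assumption).
        item = table.get_item(Key={"app_id": payload.get("app_id")})
        payload["enrichment"] = item.get("Item", {})

        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(
                json.dumps(payload, default=str).encode()
            ).decode(),
        })
    return {"records": output}
```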



A company uses Amazon Redshift as its data warehouse. A new table has columns that contain sensitive data. The data in the table will eventually be referenced by several existing queries that run many times a day. A data analyst needs to load 100 billion rows of data into the new table. Before doing so, the data analyst must ensure that only members of the auditing group can read the columns containing sensitive data.
How can the data analyst meet these requirements with the lowest maintenance overhead?

  1. Load all the data into the new table and grant the auditing group permission to read from the table. Load all the data except for the columns containing sensitive data into a second table. Grant the appropriate users read-only permissions to the second table.
  2. Load all the data into the new table and grant the auditing group permission to read from the table. Use the GRANT SQL command to allow read-only access to a subset of columns to the appropriate users.
  3. Load all the data into the new table and grant all users read-only permissions to non-sensitive columns. Attach an IAM policy to the auditing group with explicit ALLOW access to the sensitive data columns.
  4. Load all the data into the new table and grant the auditing group permission to read from the table. Create a view of the new table that contains all the columns, except for those considered sensitive, and grant the appropriate users read-only permissions to the table.

Answer(s): B
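
Option B has the lowest maintenance overhead because Amazon Redshift supports column-level GRANTs natively, so no second table or view has to be kept in sync with the data. A minimal sketch follows; it issues the statements through the Redshift Data API, though the same SQL can be run from any client. The cluster, database, table, column, and group names are all illustrative assumptions.

```python
import boto3

client = boto3.client("redshift-data")

# Auditors may read the whole table; everyone else is limited to the
# non-sensitive columns via a column-level GRANT.
statements = [
    "GRANT SELECT ON new_table TO GROUP auditing;",
    "GRANT SELECT (customer_id, txn_date, amount) ON new_table TO GROUP analysts;",
]

for sql in statements:
    client.execute_statement(
        ClusterIdentifier="analytics-cluster",  # assumed cluster name
        Database="dev",                         # assumed database
        DbUser="admin",                         # assumed admin user
        Sql=sql,
    )
```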



A banking company wants to collect large volumes of transactional data using Amazon Kinesis Data Streams for real-time analytics. The company uses PutRecord to send data to Amazon Kinesis, and has observed network outages during certain times of the day. The company wants to obtain exactly-once semantics for the entire processing pipeline.
What should the company do to obtain these characteristics?

  1. Design the application so it can remove duplicates during processing by embedding a unique ID in each record.
  2. Rely on the processing semantics of Amazon Kinesis Data Analytics to avoid duplicate processing of events.
  3. Design the data producer so events are not ingested into Kinesis Data Streams multiple times.
  4. Rely on the exactly-once processing semantics of Apache Flink and Apache Spark Streaming included in Amazon EMR.

Answer(s): A


Reference:

https://docs.aws.amazon.com/streams/latest/dev/kinesis-record-processor-duplicates.html
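
The referenced guide explains that producer retries (for example, after the network outages described here) can write the same record to the stream more than once, so exactly-once behavior has to be built into the consumer. A minimal sketch of option A follows, assuming each producer embeds a unique "event_id" field in the record payload; the in-memory set stands in for what would be a durable store (such as DynamoDB) in a production pipeline.

```python
import json

seen_ids = set()  # illustrative; use a durable store in production


def process_records(records):
    """Deduplicate records by the unique ID the producer embedded."""
    for record in records:
        payload = json.loads(record["Data"])
        event_id = payload["event_id"]  # assumed producer-embedded ID
        if event_id in seen_ids:
            continue  # duplicate caused by a PutRecord retry; skip it
        seen_ids.add(event_id)
        handle(payload)


def handle(payload):
    print("processing", payload)
```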






Post your comments and discuss the Amazon DAS-C01 exam with other community members:

raba commented on September 26, 2024
@Khoshal, can I use this alone to pass the exam?
Anonymous
upvote

raba commented on September 26, 2024
Some of the questions are straightforward.
Anonymous
upvote

Judwa commented on September 26, 2024
This exam is super hard. I was overwhelmed. After using this exam dump, I went into the exam feeling a bit better. I passed my test. :-)
INDIA
upvote

Jubran commented on September 26, 2024
Clear explanations and well-structured content made it so much easier to prepare and pass.
UNITED STATES
upvote

KXK commented on September 26, 2024
The study guide was concise yet comprehensive. It helped me focus on the key topics and feel more prepared than ever!
INDIA
upvote

Chandra commented on September 26, 2024
I passed my exam with ease, thanks to the targeted material in this guide. It made a huge difference in how I prepared.
CANADA
upvote

raba commented on September 26, 2024
I was thinking the answer to question 16 should be legacy systems.
Anonymous
upvote

Bubba commented on September 26, 2024
Good work, guys. The layout is user-friendly, and the content is spot on.
Hong Kong
upvote

rabihu commented on September 26, 2024
These are really challenging questions. I love it!
Anonymous
upvote

Murad commented on September 26, 2024
This guide gave me the exact focus I needed to pass my exam on the first try. Highly effective and reliable.
Turkey
upvote

raba commented on September 26, 2024
These are really good questions.
Anonymous
upvote

Kg commented on September 26, 2024
Hi @Phil, thank you for the response. Basically, I must just check whether the answers are correct.
Anonymous
upvote

Alhassan commented on September 26, 2024
These are really good questions.
Anonymous
upvote

Jose commented on September 26, 2024
These are really good questions.
Anonymous
upvote

David commented on September 26, 2024
Good questions.
Anonymous
upvote

Mohammed commented on September 26, 2024
Absolutely grateful for these exam dumps. Passed on the first attempt.
France
upvote

Phil commented on September 26, 2024
Hi @kg, I feel you. Based on my experience, the questions are valid, but some of the answers were not accurate. So I managed to study and figure out these answers myself. For me, the accuracy of the questions was more important, and I saw most of them in the exam.
Anonymous
upvote

Madhan commented on September 26, 2024
Useful questions
INDIA
upvote

Owol Sentmi commented on September 26, 2024
Great questions.
Anonymous
upvote

Noha commented on September 26, 2024
Feeling very confident now. I went over the free questions here, then decided to buy the full PDF and test engine at the sale price, and now I'm ready to write my test. Will share my experience next week after I go for my exam. Wish me luck, guys.
UNITED STATES
upvote

Baylis commented on September 26, 2024
I am certified now. Thank you team.
UNITED STATES
upvote

Harper commented on September 26, 2024
If you have access to the full version of these exam dumps, then you are good to go and will pass your exam.
EUROPEAN UNION
upvote

Suil commented on September 26, 2024
Very good practice questions.
CHINA
upvote

lala commented on September 26, 2024
Really helpful.
Anonymous
upvote

Champ commented on September 26, 2024
Good to see that something is still free. I truly appreciate this service.
Mexico
upvote

kg commented on September 26, 2024
Anyone who sees this comment, please respond to my question: can the answers on freedumps be trusted? I'm also using different materials from ExamTopics, and the answers don't look the same.
Anonymous
upvote

Shams commented on September 25, 2024
This exam is valid in UAE. I passed.
UNITED ARAB EMIRATES
upvote

rb commented on September 25, 2024
These are really good questions.
Anonymous
upvote

Muhammad Saleem commented on September 25, 2024
In which Service Studio layer can Entities be found? I think the answer should be Data, but it's listed as Interface.
UNITED ARAB EMIRATES
upvote

Khoshal commented on September 25, 2024
@Emily, I have taken this exam and yes, it is hard. But I managed to pass with some study and using the questions from these exam dumps. I would say roughly 80% of these questions are in the exam.
INDIA
upvote

Emily commented on September 25, 2024
I understand that most users reported that this exam is very hard. But how many of these questions were present in the exam, for anyone who has taken it? Please share.
Hong Kong
upvote

john commented on September 25, 2024
These are really good questions.
Anonymous
upvote

Catho commented on September 25, 2024
I blindly trusted this site and purchased the full version. Well, I am happy I did. Now I have passed my exam and acquired my certificate.
EUROPEAN UNION
upvote

Gorbender commented on September 25, 2024
There are some new questions in the exam that are not present in this exam dump, but about 75 to 80% of the questions are there. That was enough for me to pass.
INDIA
upvote