Free DAS-C01 Exam Braindumps (page: 5)

Page 5 of 42

A team of data scientists plans to analyze market trend data for their company's new investment strategy. The trend data comes from five different data sources in large volumes. The team wants to utilize Amazon Kinesis to support their use case. The team uses SQL-like queries to analyze trends and wants to send notifications based on certain significant patterns in the trends. Additionally, the data scientists want to save the data to Amazon S3 for archival and historical re-processing, and use AWS managed services wherever possible. The team wants to implement the lowest-cost solution.
Which solution meets these requirements?

  1. Publish data to one Kinesis data stream. Deploy a custom application using the Kinesis Client Library (KCL) for analyzing trends, and send notifications using Amazon SNS. Configure Kinesis Data Firehose on the Kinesis data stream to persist data to an S3 bucket.
  2. Publish data to one Kinesis data stream. Deploy Kinesis Data Analytics to the stream for analyzing trends, and configure an AWS Lambda function as an output to send notifications using Amazon SNS. Configure Kinesis Data Firehose on the Kinesis data stream to persist data to an S3 bucket.
  3. Publish data to two Kinesis data streams. Deploy Kinesis Data Analytics to the first stream for analyzing trends, and configure an AWS Lambda function as an output to send notifications using Amazon SNS. Configure Kinesis Data Firehose on the second Kinesis data stream to persist data to an S3 bucket.
  4. Publish data to two Kinesis data streams. Deploy a custom application using the Kinesis Client Library (KCL) to the first stream for analyzing trends, and send notifications using Amazon SNS. Configure Kinesis Data Firehose on the second Kinesis data stream to persist data to an S3 bucket.

Answer(s): B
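With Kinesis Data Analytics, the notification path is an AWS Lambda function configured as the application's output: Data Analytics invokes it with base64-encoded result records, and the function publishes them (for example, via Amazon SNS) and returns a per-record delivery status. Below is a minimal sketch of that handler logic, with the SNS publish call abstracted into a `publish` callable (a hypothetical wrapper around `sns.publish`) so the decoding and status-reporting logic stands on its own:

```python
import base64
import json

def handle_analytics_output(event, publish):
    """Decode each Kinesis Data Analytics output record, send it as a
    notification via the supplied publish(message) callable, and return
    the per-record delivery status Data Analytics expects back."""
    results = []
    for record in event["records"]:
        # Record payloads arrive base64-encoded; decode to the SQL result row.
        payload = json.loads(base64.b64decode(record["data"]))
        publish(json.dumps(payload))
        results.append({"recordId": record["recordId"], "result": "Ok"})
    return {"records": results}
```

In a real deployment, `publish` would call `sns.publish(TopicArn=..., Message=...)` on a boto3 SNS client; returning `"DeliveryFailed"` instead of `"Ok"` tells Data Analytics to retry that record.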



A company currently uses Amazon Athena to query its global datasets. The regional data is stored in Amazon S3 in the us-east-1 and us-west-2 Regions. The data is not encrypted. To simplify the query process and manage it centrally, the company wants to use Athena in us-west-2 to query data from Amazon S3 in both Regions. The solution should be as low-cost as possible.
What should the company do to achieve this goal?

  1. Use AWS DMS to migrate the AWS Glue Data Catalog from us-east-1 to us-west-2. Run Athena queries in us-west-2.
  2. Run the AWS Glue crawler in us-west-2 to catalog datasets in all Regions. Once the data is crawled, run Athena queries in us-west-2.
  3. Enable cross-Region replication for the S3 buckets in us-east-1 to replicate data in us-west-2. Once the data is replicated in us-west-2, run the AWS Glue crawler there to update the AWS Glue Data Catalog in us-west-2 and run Athena queries.
  4. Update AWS Glue resource policies to provide us-east-1 AWS Glue Data Catalog access to us-west-2. Once the catalog in us-west-2 has access to the catalog in us-east-1, run Athena queries in us-west-2.

Answer(s): B
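This works because S3 bucket names are global: an AWS Glue crawler running in us-west-2 can target paths in buckets that physically live in us-east-1, so a single catalog in us-west-2 can cover both Regions without replicating any data. A sketch of assembling the `glue.create_crawler` request (boto3), using hypothetical bucket, role, and database names:

```python
def build_crawler_request(name, role_arn, database, bucket_paths):
    """Assemble the kwargs for glue.create_crawler. Because S3 bucket
    names are global, a crawler in us-west-2 can include S3 paths whose
    buckets reside in us-east-1."""
    return {
        "Name": name,
        "Role": role_arn,
        "DatabaseName": database,
        "Targets": {"S3Targets": [{"Path": p} for p in bucket_paths]},
    }
```

The resulting dict would be passed as `glue_client.create_crawler(**request)`, after which running the crawler once populates the us-west-2 Data Catalog for Athena to query.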



A large company receives files from external parties in Amazon EC2 throughout the day. At the end of the day, the files are combined into a single file, compressed into a gzip file, and uploaded to Amazon S3. The total size of all the files is close to 100 GB daily. Once the files are uploaded to Amazon S3, an AWS Batch program executes a COPY command to load the files into an Amazon Redshift cluster.
Which program modification will accelerate the COPY process?

  1. Upload the individual files to Amazon S3 and run the COPY command as soon as the files become available.
  2. Split the number of files so they are equal to a multiple of the number of slices in the Amazon Redshift cluster. Gzip and upload the files to Amazon S3. Run the COPY command on the files.
  3. Split the number of files so they are equal to a multiple of the number of compute nodes in the Amazon Redshift cluster. Gzip and upload the files to Amazon S3. Run the COPY command on the files.
  4. Apply sharding by breaking up the files so the distkey columns with the same values go to the same file. Gzip and upload the sharded files to Amazon S3. Run the COPY command on the files.

Answer(s): B


Reference:

https://docs.aws.amazon.com/redshift/latest/dg/t_splitting-data-files.html
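The reasoning behind option B: COPY loads files in parallel, one per slice, so a file count that is a multiple of the slice count keeps every slice busy. A small sketch of the arithmetic, assuming a hypothetical 128 MB target size per compressed file (the name `split_plan` and the target size are illustrative, not from AWS documentation):

```python
import math

def split_plan(total_bytes, num_slices, target_file_mb=128):
    """Pick a file count that is a multiple of the cluster's slice count,
    aiming near a target size per file, so COPY loads all slices in
    parallel. Returns (file_count, approx_bytes_per_file)."""
    target = target_file_mb * 1024 * 1024
    chunks = max(1, math.ceil(total_bytes / target))
    # Round up to the nearest multiple of the slice count.
    files = math.ceil(chunks / num_slices) * num_slices
    return files, math.ceil(total_bytes / files)
```

For the 100 GB daily load on, say, a 16-slice cluster, this yields 800 files of roughly 128 MB each, so each slice processes an equal share instead of one slice decompressing a single 100 GB gzip file serially.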



A large ride-sharing company has thousands of drivers globally serving millions of unique customers every day. The company has decided to migrate an existing data mart to Amazon Redshift. The existing schema includes the following tables.
A trips fact table for information on completed rides.
A drivers dimension table for driver profiles.
A customers fact table holding customer profile information.
The company analyzes trip details by date and destination to examine profitability by region. The drivers data rarely changes. The customers data frequently changes.
What table design provides optimal query performance?

  1. Use DISTSTYLE KEY (destination) for the trips table and sort by date. Use DISTSTYLE ALL for the drivers and customers tables.
  2. Use DISTSTYLE EVEN for the trips table and sort by date. Use DISTSTYLE ALL for the drivers table. Use DISTSTYLE EVEN for the customers table.
  3. Use DISTSTYLE KEY (destination) for the trips table and sort by date. Use DISTSTYLE ALL for the drivers table. Use DISTSTYLE EVEN for the customers table.
  4. Use DISTSTYLE EVEN for the drivers table and sort by date. Use DISTSTYLE ALL for both fact tables.

Answer(s): C
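Option C maps each table's access pattern to a distribution style: trips is distributed on the join/filter column and sorted by date, the small, rarely-changing drivers table is replicated to every node with ALL, and the frequently-changing customers table uses EVEN to avoid costly redistribution on updates. A sketch of a helper that renders the corresponding simplified Redshift DDL, with hypothetical column names and types:

```python
def table_ddl(name, columns, diststyle, distkey=None, sortkey=None):
    """Render a simplified Redshift CREATE TABLE statement with the
    chosen distribution style, optional DISTKEY, and optional SORTKEY."""
    cols = ", ".join(f"{c} {t}" for c, t in columns)
    ddl = f"CREATE TABLE {name} ({cols}) DISTSTYLE {diststyle}"
    if distkey:
        ddl += f" DISTKEY ({distkey})"
    if sortkey:
        ddl += f" SORTKEY ({sortkey})"
    return ddl + ";"
```

Used for the three tables in option C, the calls would look like `table_ddl("trips", ..., "KEY", distkey="destination", sortkey="trip_date")`, `table_ddl("drivers", ..., "ALL")`, and `table_ddl("customers", ..., "EVEN")`.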





