Free CCA175 Exam Braindumps (page: 13)


Problem Scenario 46: You have been given the below list in Scala (name, sex, cost) for each piece of work done.
List( ("Deeapak" , "male", 4000), ("Deepak" , "male", 2000), ("Deepika" , "female", 2000), ("Deepak" , "female", 2000), ("Deepak" , "male", 1000) , ("Neeta" , "female", 2000))
Now write a Spark program to load this list as an RDD and compute the sum of cost for each combination of name and sex (used as the key).

  1. See the explanation for Step by Step Solution and configuration.

Answer(s): A

Explanation:

Solution :
Step 1: Create an RDD out of this list
val rdd = sc.parallelize(List(("Deeapak", "male", 4000), ("Deepak", "male", 2000), ("Deepika", "female", 2000), ("Deepak", "female", 2000), ("Deepak", "male", 1000), ("Neeta", "female", 2000)))
Step 2: Convert this RDD in pair RDD
val byKey = rdd.map({case (name, sex, cost) => (name, sex)->cost})
Step 3: Now group by Key
val byKeyGrouped = byKey.groupByKey
Step 4: Now sum the cost for each group
val result = byKeyGrouped.map{case ((id1, id2), values) => (id1, id2, values.sum)}
Step 5: Save the results
result.repartition(1).saveAsTextFile("spark12/result.txt")
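For reference, the same totals can be computed without groupByKey. The following is a minimal alternative sketch (not part of the original answer) using reduceByKey, which combines the costs per key during the shuffle instead of collecting all values for a key first:
// Alternative sketch: sum the cost per (name, sex) key with reduceByKey.
val byNameAndSex = rdd.map { case (name, sex, cost) => ((name, sex), cost) }
val summed = byNameAndSex.reduceByKey(_ + _)
val result2 = summed.map { case ((name, sex), total) => (name, sex, total) }
result2.collect.foreach(println)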



Problem Scenario 6: You have been given the following MySQL database details as well as other info.
user=retail_dba
password=cloudera
database=retail_db
jdbc URL = jdbc:mysql://quickstart:3306/retail_db
Compression Codec : org.apache.hadoop.io.compress.SnappyCodec
Please accomplish the following.

1. Import the entire database so that it can be used as Hive tables; they must be created in the
default schema.
2. Also make sure each table's data is split into 3 files, e.g. part-00000, part-00001, part-00002.
3. Store all the generated Java files in a directory called java_output for further evaluation.

  1. See the explanation for Step by Step Solution and configuration.

Answer(s): A

Explanation:

Solution :
Step 1: Before implementing the solution, drop all the tables we created in previous problems.
Log in to Hive and execute the following commands.
show tables;
drop table categories;
drop table customers;
drop table departments;
drop table employee;
drop table order_items;
drop table orders;
drop table products;
show tables;
Check the warehouse directory:
hdfs dfs -ls /user/hive/warehouse
Step 2: Now that we have a clean database, import the entire retail_db with all the parameters the problem statement asks for.
sqoop import-all-tables \
-m 3 \
--connect jdbc:mysql://quickstart:3306/retail_db \
--username=retail_dba \
--password=cloudera \
--hive-import \
--hive-overwrite \
--create-hive-table \
--compress \
--compression-codec org.apache.hadoop.io.compress.SnappyCodec \
--outdir java_output
Step 3: Verify that the work is accomplished.
a) Go to Hive and check all the tables:
hive
show tables;
select count(1) from customers;
b) Check the warehouse directory and the number of part files:
hdfs dfs -ls /user/hive/warehouse
hdfs dfs -ls /user/hive/warehouse/categories
c) Check the Java output directory:
ls -ltr java_output/
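Optionally, the imported tables can also be checked from the Spark shell. This is a minimal sketch, assuming the quickstart VM's Spark 1.x shell where sqlContext is Hive-enabled (an assumption, not part of the original solution):
// Query the Hive tables created by the sqoop import above.
sqlContext.sql("show tables").show()
sqlContext.sql("select count(1) from customers").show()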



Problem Scenario 38: You have been given an RDD as below:
val rdd: RDD[Array[Byte]]
Now you have to save this RDD as a SequenceFile. Below is the code snippet:
import org.apache.hadoop.io.compress.GzipCodec
rdd.map(bytesArray => (A.get(), new B(bytesArray))).saveAsSequenceFile("/output/path", classOf[GzipCodec])
What would be the correct replacements for A and B in the above snippet?

  1. See the explanation for Step by Step Solution and configuration.

Answer(s): A

Explanation:

Solution :

A). NullWritable
B). BytesWritable
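Putting the answer into the snippet gives the sketch below. Note that saveAsSequenceFile takes the codec as an Option, so classOf[GzipCodec] is wrapped in Some here; "/output/path" is a placeholder path.
import org.apache.hadoop.io.{BytesWritable, NullWritable}
import org.apache.hadoop.io.compress.GzipCodec
// A = NullWritable (empty key), B = BytesWritable (wraps the raw byte array).
val pairs = rdd.map(bytesArray => (NullWritable.get(), new BytesWritable(bytesArray)))
pairs.saveAsSequenceFile("/output/path", Some(classOf[GzipCodec]))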



Problem Scenario 57: You have been given the below code snippet.
val a = sc.parallelize(1 to 9, 3)
operation1
Write a correct code snippet for operation1 which will produce the desired output shown below.
Array[(String, Seq[Int])] = Array((even, ArrayBuffer(2, 4, 6, 8)), (odd, ArrayBuffer(1, 3, 5, 7, 9)))

  1. See the explanation for Step by Step Solution and configuration.

Answer(s): A

Explanation:

Solution :

a.groupBy(x => {if (x % 2 == 0) "even" else "odd" }).collect
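For comparison, a hedged alternative sketch (not from the original answer) that produces the same grouping with keyBy and groupByKey:
// keyBy builds the (key, value) pairs explicitly; groupByKey then collects the values per key.
val a = sc.parallelize(1 to 9, 3)
a.keyBy(x => if (x % 2 == 0) "even" else "odd").groupByKey.collect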





