Free Oracle 1Z0-449 Exam Braindumps (page: 5)

QUESTION: 7
Your customer has an older starter rack Big Data Appliance (BDA) that was purchased in 2013.
The customer would like to know the options for growing the storage footprint of the
appliance.
Which two options are valid for expanding the customer's BDA footprint? (Choose two.)

A. Elastically expand the footprint by adding additional high capacity nodes.
B. Elastically expand the footprint by adding additional Big Data Oracle Database Servers.
C. Elastically expand the footprint by adding additional Big Data Storage Servers.
D. Racks manufactured before 2014 are no longer eligible for expansion.
E. Upgrade to a full 18-node Big Data Appliance.

Answer(s): D, E
QUESTION: 8
How does the Oracle SQL Connector for HDFS access HDFS data from the Oracle database?

A. NoSQL tables
B. Data Pump files
C. external tables
D. Apache Sqoop files
E. non-partitioned tables

Answer(s): B
Explanation:
Using Oracle SQL Connector for HDFS, you can use Oracle Database to access and analyze
data residing in Hadoop in these formats: Data Pump files in HDFS, delimited text files in
HDFS, and Hive tables.
Reference:
https://docs.oracle.com/cd/E37231_01/doc.20/e36961/sqlch.htm#BDCUG126
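For context on answer B: Oracle SQL Connector for HDFS (OSCH) publishes Hadoop data to the
database as an external table whose preprocessor streams file content out of HDFS at query
time. Below is a minimal sketch of the kind of DDL the connector's ExternalTable tool
generates for delimited text files; the table, column, directory, and location-file names
here are hypothetical:

    -- Sketch only; OSCH normally generates this DDL for you.
    CREATE TABLE sales_hdfs_ext (
      sale_id NUMBER,
      amount  NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY sales_ext_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        PREPROCESSOR "OSCH_BIN_PATH":hdfs_stream
        FIELDS TERMINATED BY ','
      )
      LOCATION ('osch-loc-1', 'osch-loc-2')
    )
    REJECT LIMIT UNLIMITED;

The LOCATION entries are location files that OSCH writes to point at the actual HDFS paths,
and hdfs_stream is the connector's preprocessor script that pipes HDFS content into the
ORACLE_LOADER driver. For Data Pump files, the generated access parameters use an EXTERNAL
VARIABLE DATA clause in place of the record and field definitions.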
QUESTION: 9
Your customer needs data generated from social media sites such as Facebook and Twitter, as
well as from the customer's website, to be consumed and sent to an HDFS directory for
analysis by the marketing team.
Identify the architecture that you should configure.

A. multiple Flume agents with collectors that output to a logger that writes to the Oracle
Loader for Hadoop agent
B. multiple Flume agents with sinks that write to a consolidated source with a sink to the
customer's HDFS directory
C. a single Flume agent that collects data from the customer's website, which is connected to
both Facebook and Twitter, and writes via the collector to the customer's HDFS directory
D. multiple HDFS agents that write to a consolidated HDFS directory
E. a single HDFS agent that collects data from the customer's website, which is connected to
both Facebook and Twitter, and writes via Hive to the customer's HDFS directory

Answer(s): B
Explanation:
Apache Flume - Fetching Twitter Data. Flume in this case is responsible for capturing the
streaming events from each source (Twitter, Facebook, and the website) and delivering them,
through a consolidating agent, to the customer's HDFS directory.
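The fan-in topology described in answer B maps directly onto Flume's properties-file
configuration: each source-side agent forwards events over Avro to a consolidating agent
whose HDFS sink writes to the marketing team's directory. A minimal sketch follows; the
agent names, hostname, port, and paths are hypothetical, and a real deployment would run
one source-side agent per feed (Twitter, Facebook, web logs):

    # Source-side agent (one per feed; all names here are hypothetical)
    web1.sources = weblog
    web1.channels = memch
    web1.sinks = fwd

    web1.sources.weblog.type = exec
    web1.sources.weblog.command = tail -F /var/log/httpd/access_log
    web1.sources.weblog.channels = memch

    web1.channels.memch.type = memory
    web1.channels.memch.capacity = 10000

    web1.sinks.fwd.type = avro
    web1.sinks.fwd.hostname = collector.example.com
    web1.sinks.fwd.port = 4141
    web1.sinks.fwd.channel = memch

    # Consolidating agent: one Avro source fans in, one HDFS sink writes out
    coll.sources = fanin
    coll.channels = memch
    coll.sinks = hdfsout

    coll.sources.fanin.type = avro
    coll.sources.fanin.bind = 0.0.0.0
    coll.sources.fanin.port = 4141
    coll.sources.fanin.channels = memch

    coll.channels.memch.type = memory
    coll.channels.memch.capacity = 10000

    coll.sinks.hdfsout.type = hdfs
    coll.sinks.hdfsout.hdfs.path = hdfs://namenode:8020/user/marketing/social
    coll.sinks.hdfsout.channel = memch

Each additional feed only needs another source-side agent pointed at the same Avro port,
which is what makes this layout preferable to the single-agent designs in answers C and E.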
