Oracle Big Data 2017 Implementation Essentials Exam

Study Guide
Getting Started
The Oracle Big Data 2017 Implementation Essentials Exam study guide is designed to help you prepare for the Oracle
Big Data 2017 Implementation Essentials Exam (1Z0-449) and become an OPN Certified Specialist.

Target Audience
The Oracle Big Data 2017 Implementation Essentials exam targets individuals with a specific level of education and
expertise, the type of participants who are likely to pass the exam:

Job Role:

Technical Implementers

Level of Competency:

Candidates should have completed 1-2 Hadoop, NoSQL, or Oracle Big Data Appliance implementations.
Participants should possess good knowledge of key technologies such as Hadoop, NoSQL, Oracle Big Data
Connectors, and Oracle R.
Deep experience in at least one other Oracle product family is highly recommended.

Exam Topics
The Oracle Big Data 2017 Implementation Essentials exam covers 11 topics:

Big Data Technical Overview
Core Hadoop
Oracle NoSQL Database
Cloudera Enterprise Hadoop Distribution
Programming with R
Oracle Loader for Hadoop
Oracle SQL Connector for HDFS
ODI and Hadoop
Big Data SQL
XQuery for Hadoop Connector
Securing Hadoop

Levels of Knowledge
Each exam topic contains objectives and each objective is categorized as either a learner or practitioner level of
knowledge.

Learner items test foundational grasp and require product comprehension (not recognition or memorization).

Example: When setting up price list modifiers in Advanced Pricing, which three steps must be completed in order to
successfully activate surcharge and price break features?

Practitioner items present on-the-job scenarios and require the ability to integrate and apply knowledge in new
contexts, analyze and troubleshoot complex issues, and solve problems.

Examples:
1) You are creating price list modifiers in Advanced Pricing. Your customer has three requirements: X, Y, Z. Identify
the two steps that must be completed in order to meet those requirements.
2) You are running a two-instance database with six redo logs defined. You decide to add a third thread to support a
third database instance, on the third node of the cluster. Using command line administration, which two commands
will you execute to achieve this?

Training Options
Throughout the study guide each exam topic recommends one or several training/documentation titles:

Recommended Training
Online Training - recorded or live virtual training sessions
OPN Boot Camps - a combination of classroom lectures, hands-on lab exercises, and case studies
Oracle University Training - instructor-led in-class training, live virtual class, on-demand training
Recommended Documentation
Oracle Documentation - product manuals in online format
Product tutorials - online information on how to use the product
Datasheets and white papers - documents that summarize the performance and other technical characteristics of a
product, machine, or component
Books - product information published in printed or electronic form

While the Oracle PartnerNetwork facilitates free access to online training, in-class training often requires a fee.

Recommended Training
Online Training
Oracle Big Data 2016 Implementation Specialist
OPN Boot Camp
Oracle Big Data 2016 Implementation Boot Camp
Oracle University Training
Oracle Big Data Fundamentals Ed 1
Oracle NoSQL Database for Administrators Ed 1

Recommended Documentation
Oracle Documentation
Oracle Big Data Documentation
Oracle NoSQL Documentation

Product tutorials
Big Data Learning Library
Datasheets and white papers
Oracle Big Data Resources and Whitepapers
Oracle NoSQL Enterprise Edition
Apache Flume User Guide

Exam Details per Topic
This section covers the details for each exam topic: an overview, objectives, levels of knowledge, recommended
training, and sample questions. Specialization exams cover all product functionality, not only the most frequently
used features.

Topic 1: Big Data Technical Overview

Objective Level
Describe the architectural components of MapReduce Learner

Describe how Big Data Appliance integrates with Exadata and Exalytics Learner

Identify and architect the services that run on each node in the Big Data Practitioner
Appliance, as it expands from single to multiple nodes

Describe the Big Data Discovery and Big Data Spatial and Graph solutions Learner

Explain the business drivers behind Big Data and NoSQL versus Hadoop Learner

Sample Questions
Select the three engineered systems where Big Data SQL can run.
A. Oracle Private Cloud Appliance
B. Oracle Exadata
C. Oracle Exalytics
D. Oracle Big Data Appliance

Your customer needs to provide real-time data to big data analytic systems so
that they can get insight into their business in a timely manner. What solution
would you propose?
A. Oracle Coherence
B. Big Data Discovery
C. Oracle In-Database Analytics
D. Oracle RDBMS
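The first objective above asks you to describe the architectural components of MapReduce: a map phase that emits key/value pairs, a shuffle/sort phase that groups values by key, and a reduce phase that aggregates each group. The flow can be sketched in plain Python; this is a conceptual illustration only, not Hadoop's actual Java API:

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit (key, value) pairs -- here, (word, 1) for a word count
    for line in records:
        for word in line.split():
            yield (word, 1)

def shuffle_phase(pairs):
    # Shuffle/sort: group all values by key, as the framework does
    # between the map and reduce phases
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate the grouped values for each key
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle_phase(map_phase(["big data", "big deal"])))
# counts == {"big": 2, "data": 1, "deal": 1}
```

In real Hadoop, the map and reduce functions run in parallel across the cluster and the framework performs the shuffle; the division of labor, however, is exactly this.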

Topic 2: Core Hadoop

Objective Level
Explain the Hadoop Ecosystem Learner

Implement the Hadoop Distributed File System Practitioner

Identify the benefits of the Hadoop Distributed File System (HDFS) Learner

Describe the architectural components of MapReduce Learner

Describe the differences between MapReduce and YARN Learner

Describe Hadoop High Availability Learner

Describe the importance of Namenode, Datanode, JobTracker, and TaskTracker in Practitioner
Hadoop

Use Flume in the Hadoop Distributed File System Practitioner

Sample Questions
Which command would you use to view the contents of a file in an HDFS
directory, /user/oracle/data1.txt?
A. hadoop fs ls data1.txt
B. hadoop fs cat /user/oracle/data1.txt
C. cat /user/oracle/data1.txt
D. hive> select * from /user/oracle/data1.txt

How can you configure HDFS for High Availability?
A. Create a standby namenode
B. Ensure the namenodes are using shared storage
C. Ensure the clients connecting to the cluster can handle failover of the
namenode
D. Install Oracle HDFS Data Guard
E. Configure HDFS for Oracle Database Active Data Guard

When setting up Apache Flume for a customer, what does the Flume source
do?
A. Sets up a passive store that keeps events until they are consumed by the sink
B. Consumes events delivered to it by an external source
C. Removes the event from the channel and puts it into HDFS
D. Allows a user to build multi-hop flows
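The Flume question turns on the roles of the three components: the source consumes events delivered by an external client, the channel is a passive store that holds each event until a sink takes it, and the sink removes events from the channel and writes them out (for example to HDFS). A minimal sketch of that pipeline in plain Python, purely illustrative and unrelated to Flume's real Java API:

```python
from collections import deque

channel = deque()   # passive store: holds events until the sink takes them
hdfs = []           # stands in for the sink's destination (e.g. HDFS)

def source_receive(event):
    # Source: consumes an event delivered by an external client
    # and puts it on the channel
    channel.append(event)

def sink_drain():
    # Sink: removes events from the channel and writes them out
    while channel:
        hdfs.append(channel.popleft())

source_receive("log line 1")
source_receive("log line 2")
sink_drain()
# hdfs == ["log line 1", "log line 2"]; the channel is now empty
```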

Topic 3: Oracle NoSQL Database

Objective Level
Use an Oracle NoSQL database Practitioner

Describe the architectural components (Shard, Replica, Master) of the Oracle Learner
NoSQL database

Set up the KVStore Practitioner

Use KVLite to test NoSQL applications Practitioner

Integrate an Oracle NoSQL database with an Oracle database and Hadoop Practitioner

Sample Questions
In a NoSQL architecture, how will your customer be notified of node failure?
A. NoSQL Database File Logs
B. Shard State Table
C. KVStoreConfig Utility
D. Shared Status Update Table

What is the result of the following command, executed using the Tables API in NoSQL?
add-field type INTEGER name count
A. A new integer based table named count is added
B. A new column is added with the name count and type integer
C. A new datatype is created that is mapped to the integer type
D. This is an invalid command
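The architectural objective above (shard, replica, master) comes down to this: the key space is partitioned across shards, and each shard is a replication group with one master node (which takes writes) and replica nodes (which serve reads and provide failover). A toy sketch of key-to-shard routing in Python, purely illustrative; the node names and hashing scheme are invented and unrelated to the real KVStore API:

```python
import hashlib

# Each shard is a replication group: one master node plus replicas
shards = [
    {"master": "rg1-rn1", "replicas": ["rg1-rn2", "rg1-rn3"]},
    {"master": "rg2-rn1", "replicas": ["rg2-rn2", "rg2-rn3"]},
]

def shard_for(key):
    # Hash the key to pick the shard that owns it
    digest = hashlib.md5(key.encode()).hexdigest()
    return shards[int(digest, 16) % len(shards)]

def write_node(key):
    # Writes always go to the owning shard's master
    return shard_for(key)["master"]

# The same key always routes to the same shard's master
assert write_node("user42") == write_node("user42")
```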

Topic 4: Cloudera Enterprise Hadoop Distribution

Objective Level
Describe the Hive architecture Learner

Set up Hive with formatters and SerDe Practitioner

Implement the Oracle Table Access for a Hadoop Connector Practitioner

Describe the Impala real-time query and explain how it differs from Hive Learner

Create a database and table from a Hadoop Distributed File System file in Hive Practitioner

Use Pig Latin to query data in HDFS Practitioner

Execute a Hive query Practitioner

Move data from a database to a Hadoop Distributed File System using Sqoop Practitioner

Sample Questions
Which option would you use to access large volumes of data in Hadoop and
identify users with issues?
A. Hive
B. NoSQL
C. SerDe
D. a custom Java program
E. Flume

Your customer has a requirement to handle multiple requests to query data in
Hive tables. Which server would you use?
A. Thrift Server
B. Thrive Server
C. SQL Server
D. Hive Server 2
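A SerDe (serializer/deserializer) is what lets Hive parse raw file bytes into table columns. The deserialization half for a delimited file can be sketched in Python; this mimics the idea only, not Hive's Java SerDe interface, and the schema and sample row are invented:

```python
def deserialize(line, schema, delimiter=","):
    # Split one raw line into typed columns according to the table
    # schema -- essentially what a delimited-text SerDe does for Hive
    fields = line.rstrip("\n").split(delimiter)
    row = {}
    for (name, caster), raw in zip(schema, fields):
        row[name] = caster(raw)
    return row

schema = [("id", int), ("name", str), ("score", float)]
row = deserialize("7,alice,99.5", schema)
# row == {"id": 7, "name": "alice", "score": 99.5}
```

Formatters handle the reverse direction, turning rows back into the stored representation.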

Topic 5: Programming with R

Objective Level
Describe the Oracle R Advanced Analytics for a Hadoop connector Learner

Use Oracle R Advanced Analytics for a Hadoop connector Practitioner

Describe the architectural components of Oracle R Advanced Analytics for Hadoop Learner

Implement an Oracle Database connection with Oracle R Enterprise Learner

Sample Questions
What do the ore functions in Oracle R Advanced Analytics for Hadoop do?
A. They create and manage objects in Oracle NoSQL.
B. They create and manage objects in Apache Pig.
C. They create and manage objects in a Hive database.
D. They create and manage files in HDFS.

Your customer needs to perform statistical analysis on database tables, views,
and other data objects by using the R language. What product can the customer
use for this?
A. HiveRL Statistical Engine
B. Oracle R Advanced Analytics for Hadoop
C. Oracle R Enterprise
D. R for MongoDB

Topic 6: Oracle Loader for Hadoop

Objective Level
Explain the Oracle Loader for Hadoop Learner

Configure the online and offline options for the Oracle Loader for Hadoop Practitioner

Load Hadoop Distributed File System Data into an Oracle database Practitioner

Sample Questions
Identify three items in the Oracle Loader for Hadoop XML configuration file.
A. FTP Output Format Class
B. Input Directory for Hadoop
C. Output Directory for Export/Import
D. Loader Map File
E. Output Directory for Hadoop
F. WebDav Output Format Class

How are passwords stored in the Oracle Loader for Hadoop configuration file?
A. Encrypted using Transparent Data Encryption
B. Passwords are not stored; they are entered by the user during job execution
C. Passwords are stored in the Oracle wallet, and the location of the wallet is
configured
D. Passwords are stored in an Access Control List, and the location of the ACL is
configured

Topic 7: Oracle SQL Connector for HDFS

Objective Level
Explain the Oracle SQL Connector for HDFS Practitioner

Configure the Oracle SQL Connector for HDFS Practitioner

Load Hadoop Distributed File System Data into an Oracle database Learner

Sample Questions
What is the preprocessor script for the Oracle SQL Connector for HDFS?
A. hdfs_preprocess
B. hdfs_stream
C. external_osch_pre
D. external_osch_stream

How are external tables used with the Oracle SQL Connector for HDFS and
the Oracle Database?
A. HDFS data is imported into external tables using export/import
B. HDFS data is imported into external tables using bulk loader
C. They provide access to data stored outside the database in HDFS
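The key point behind the second question is that an external table imports nothing: the data stays in HDFS, and the database reads it in place at query time (via the preprocessor). The access pattern can be sketched in Python; the class, file contents, and delimiter here are invented for illustration:

```python
# An "external table" keeps only metadata; the data stays in the
# external files and is read on demand at query time
class ExternalTable:
    def __init__(self, files):
        self.files = files  # data locations outside the "database"

    def scan(self):
        # Each query streams the current file contents; nothing is copied in
        for f in self.files:
            for line in f:
                yield line.split(",")

datafile = ["1,a", "2,b"]     # stands in for a file in HDFS
t = ExternalTable([datafile])
rows = list(t.scan())         # [["1", "a"], ["2", "b"]]

datafile.append("3,c")        # the file changes after "table" creation...
rows2 = list(t.scan())        # ...and the next query sees the new row
```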

Topic 8: Oracle Data Integrator and Hadoop

Objective Level
Use ODI to transform data from Hive to Hive Practitioner

Use ODI to move data from Hive to Oracle Practitioner

Use ODI to move data from an Oracle database to a Hadoop Distributed File Practitioner
System using sqoop

Sample Questions
What connector must be configured to use ODI to move data from Hive to
Oracle?
A. Oracle Loader for Hadoop
B. Oracle XQuery for Hadoop
C. Oracle Sentry for Hadoop
D. Moving data using ODI is not supported

What knowledge module would you use to move data from Oracle to HDFS
when the customer has not purchased the Oracle Big Data Connectors?
A. IKM SQL to Hive Module
B. IKM SQL to HDFS File (Sqoop) Module
C. IKM SQL to Text Module
D. Moving data from Oracle to HDFS is not supported

Topic 9: Big Data SQL

Objective Level
Explain how Big Data SQL is used in a Big Data Appliance/Exadata architecture Learner

Set up and configure Oracle Big Data SQL Practitioner

Demonstrate Big Data SQL syntax used in create table statements Practitioner

Access NoSQL and Hadoop data using a Big Data SQL query Practitioner

Explain the Hadoop Ecosystem Learner

Sample Questions
What would you do to modify the property values for Oracle Big Data SQL's
Memory Soft Limit on the Big Data Appliance?
A. Use the mammoth utility to update the settings.
B. Use the Yarn Resource Manager Interface to update the settings.
C. Modify the configuration settings in Cloudera Manager on the BDA.
D. Modify the configuration settings on the Enterprise Manager Agent on the
Exadata Database Machine
E. Use the DFS Health Utility to update the settings

Which Big Data SQL configuration process enables Oracle Exadata to query
the data in Hadoop on the Oracle Big Data Appliance?
A. installing Hadoop on the Exadata storage nodes to allow predicate filtering
of the Hadoop data
B. installing the Exadata storage server software on each node of the BDA
to enable SmartScan on the local data
C. connecting the Oracle Big Data Appliance to the Exadata machines to
offload all predicate processing to Exadata
D. adding the new Big Data SQL cluster to the Big Data Appliance by using
the Big Data Cluster Config process

Topic 10: XQuery for Hadoop Connector

Objective Level
Set up the Oracle XQuery for Hadoop connector Practitioner

Perform a simple XQuery using Oracle XQuery for Hadoop Practitioner

Use Oracle XQuery for Hadoop with Hive to map an XML file into a Hive table Practitioner

Sample Questions
Your customer is trying to read from Oracle NoSQL by using the OXH
connector but is getting an error during the read. What could be causing the
error?
A. Access to Oracle NoSQL is through Apache Hbase. It cannot be accessed
directly.
B. Oracle NoSQL is not a supported input source for the Oracle XQuery for
Hadoop connector.
C. Oracle NoSQL is not installed
D. Oracle NoSQL jars were not added to the oxhloader.jar file

What does a put function do in the Oracle XQuery for Hadoop connector?
A. adds a single item to a data set stored in an Oracle database only
B. adds a single item to a data set stored in an Oracle Database, an Oracle
NoSQL Database, or a Hadoop file
C. reads data from Hadoop files or Oracle NoSQL Database as a collection of
items
D. adds a single item to a data set stored in a Hadoop file only
E. reads data from Hadoop files only as a collection of items
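The last question hinges on OXH's two function families: collection functions read input (from Hadoop files or Oracle NoSQL Database) as a collection of items, while each call to a put function adds a single item to an output data set (an Oracle database, Oracle NoSQL Database, or a Hadoop file). That division of labor can be sketched in Python; the names collection and put mirror OXH, but nothing else here is its real API:

```python
output = []  # stands in for an output data set (e.g. a Hadoop file)

def collection(files):
    # Collection function: reads input as a collection of items
    for f in files:
        for item in f:
            yield item

def put(item):
    # Put function: adds a single item to the output data set
    output.append(item)

# A query typically iterates a collection and puts transformed items
for item in collection([["a", "b"], ["c"]]):
    put(item.upper())
# output == ["A", "B", "C"]
```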

Topic 11: Securing Hadoop

Objective Level
Describe Oracle Big Data Appliance security and encryption features Learner

Set up Kerberos security in Hadoop Practitioner

Set up the Hadoop Distributed File System to use Access Control Lists Practitioner

Set up Hive and Impala access security using Apache Sentry Practitioner

Use LDAP and the Active directory for Hadoop access control Practitioner

Sample Questions
What are the names of the main OS users and groups for the Big Data
Appliance?
A. flume
B. mapred
C. hbase
D. hdfs
E. cdh5

What product would you suggest to your customer who wants to gain deep
control over the data stored in Hadoop?
A. Sudoers file
B. Transparent Hadoop Data Encryption
C. Hadoop Database Firewall
D. Apache Sentry
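HDFS ACLs (one of the objectives above) extend the basic owner/group/other permissions with named-user and named-group entries. A toy sketch of how an entry list might resolve a read request, invented for illustration and much simpler than the real HDFS permission checker:

```python
def can_read(acl, user, groups):
    # Named-user entries take precedence, then named-group entries,
    # then the "other" entry -- a simplification of how HDFS
    # evaluates ACL entries
    for kind, name, perms in acl:
        if kind == "user" and name == user:
            return "r" in perms
    for kind, name, perms in acl:
        if kind == "group" and name in groups:
            return "r" in perms
    for kind, name, perms in acl:
        if kind == "other":
            return "r" in perms
    return False

acl = [("user", "oracle", "rw"), ("group", "analysts", "r"), ("other", None, "")]
assert can_read(acl, "oracle", [])           # named-user entry matches
assert can_read(acl, "bob", ["analysts"])    # named-group entry matches
assert not can_read(acl, "eve", ["guests"])  # falls through to other: no read
```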

Exam Registration Details
Full exam preparation details are available on the exam page for the Oracle Big Data 2017 Implementation Essentials
Exam (1Z0-449), including learning objectives, number of questions, time allowance, pricing, and available languages.

Appointments for OPN Certified Specialist exams are available worldwide at Pearson VUE Testing Centers.
Reservations can be made via phone or online.

Candidates must have an Oracle Web Account to access CertView and check their exam results. In order to have their
certifications reflected on the OPN Competency Center, both the CertView and Pearson VUE accounts must be updated
with the current OPN Company ID. Your Company ID can be obtained by contacting your local Oracle Partner Business
Center or by signing in to your OPN account.

Additional Resources
Oracle Big Data Knowledge Zone
Oracle Big Data 2017 Implementation Specialist Guided Learning Path
OPN Guided Learning Paths & Assessments
OPN Certified Specialist Exam Study Guides
