Support Questions

Find answers, ask questions, and share your expertise

Query Oracle Table from Apache Hadoop (Hive)

New Contributor

Hi Guys,

I am new to Apache Hadoop. I want to access an Oracle table through Hive. I followed the document below, but I am not able to query the data. The document shows that I need to create an external table in Hive pointing to Oracle. All of my tables already exist in Oracle, so my assumption is that I only need to create the external table (or link):

http://oraclegis.com/blog/2016/09/27/oracle-datasource-for-apache-hadoop-od4h-introduction/

CREATE EXTERNAL TABLE oracle_table_HIVE
STORED BY 'oracle.hcat.osh.OracleStorageHandler'
TBLPROPERTIES (
  'mapreduce.jdbc.url' = 'jdbc:oracle:thin:@//hostname:1521/ORCL',
  'mapreduce.jdbc.username' = 'oracle_user',
  'mapreduce.jdbc.password' = 'password',
  'mapreduce.jdbc.input.table.name' = 'oracle_table',
  'oracle.hcat.osh.useMonitor' = 'true',
  'oracle.hcat.osh.fetchSize' = '10000',
  'oracle.hcat.osh.useOracleParallelism' = 'true'
);

I am getting this error:

FAILED: SemanticException Cannot find class 'oracle.hcat.osh.OracleStorageHandler'

3 REPLIES

Super Mentor

@Tariq Hasan

Error:

FAILED: SemanticException Cannot find class 'oracle.hcat.osh.OracleStorageHandler'


The error indicates that you have not placed the required JARs, which contain the "oracle.hcat.osh.OracleStorageHandler" class and its dependent classes, inside the Hive lib directory. Please refer to the section "9.2.4 List of jars in the OTA4H package" in the link below for the full list of JARs that need to be included in the Hive lib:


https://docs.oracle.com/cd/E65728_01/doc.43/e65665/GUID-C2A509A4-34CB-4B58-AC55-6CCCE51163A8.htm#BIG...


Some example JARs are listed below. You might not need all of them, but some, such as osh.jar, ojdbc7.jar, and ucp.jar, will likely need to be present in both the Hive lib and the Hive client lib:

osh.jar          => Contains the OracleStorageHandler implementation
ojdbc7.jar       => An OTA4H-specific JDBC driver (optimized with internal calls), used by Spark or Hadoop tasks to connect to the database
ucp.jar          => For creating connection pools in OracleStorageHandler
oraclepki103.jar, osdt_core.jar, osdt_cert.jar, osdt_jce.jar => For Oracle Wallet authentication
orai18n.jar      => Oracle Globalization Support
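The copy step can be sketched as follows. This is only a sketch: the OD4H package location and the Hive lib path below are assumptions, so adjust both to your own install.

```shell
# Sketch only: copy the OD4H jars into Hive's lib directory.
# Both paths are assumptions -- adjust to where you extracted the
# OTA4H package and to your actual Hive lib directory.
OD4H_JLIB="${OD4H_JLIB:-/opt/ota4h/jlib}"
HIVE_LIB="${HIVE_LIB:-/usr/lib/hive/lib}"

copied=""
missing=""
for jar in osh.jar ojdbc7.jar ucp.jar oraclepki103.jar \
           osdt_core.jar osdt_cert.jar osdt_jce.jar orai18n.jar; do
  if [ -f "$OD4H_JLIB/$jar" ]; then
    # -p preserves permissions/timestamps of the original jar
    cp -p "$OD4H_JLIB/$jar" "$HIVE_LIB/" && copied="$copied $jar"
  else
    missing="$missing $jar"
  fi
done
echo "copied:$copied"
echo "missing:$missing"
```

After copying, restart HiveServer2 (or start a fresh Hive CLI session) so the new jars are picked up.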


Another, simpler option is to import the Oracle DB table into Hive using Sqoop, as follows:

# cp -p -f /PATH/TO/ojdbc7.jar  /usr/hdp/current/sqoop-client/lib/

# sqoop import --connect "jdbc:oracle:thin:@hostname:1521/ORCL" --username oracle_user -P --table oracle_table --hive-import --create-hive-table -m1 --delete-target-dir --hive-database default --hive-table SPECIAL_CHARS_TEST_HIVE --null-string '\\N' --null-non-string '\\N' --hive-delims-replacement ";"
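Before running the full import, it can help to confirm that Sqoop can reach the database at all. A minimal sanity check (a sketch; the hostname, service name, and username are the placeholders from the command above):

```shell
# Sketch: sanity-check the JDBC connection before the full import.
# hostname / ORCL / oracle_user are placeholders -- substitute your own.
if command -v sqoop >/dev/null 2>&1; then
  # list-tables only connects and enumerates tables; it imports nothing
  sqoop list-tables \
    --connect "jdbc:oracle:thin:@hostname:1521/ORCL" \
    --username oracle_user -P \
    && status="connected" \
    || status="connection failed"
else
  status="sqoop not found on PATH"
fi
echo "$status"
```

If `list-tables` fails, fix the connect string or credentials first; the full `sqoop import` will fail for the same reason.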


New Contributor

Thanks, Jay, for your response. I added all the JAR files to this Hive location:

"/usr/lib/hive/lib". Now I am getting the following error:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org/apache/hadoop/hive/serde2/SerD.

Also, the text you pasted mentions the "hive client lib". Is that different from the hive lib?

"Some example JARs are listed below. You might not need all of them, but some, such as osh.jar, ojdbc7.jar, and ucp.jar, will likely need to be present in both the Hive lib and the Hive client lib."
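A generic way to debug missing-class failures like the ones in this thread is to scan the jars on Hive's classpath for the class in question. This is a sketch, assuming `unzip` is available; the lib directory is an assumption, and the class file shown is the one from the first error, so substitute your own:

```shell
# Sketch: find which jar (if any) in a lib directory provides a given class.
# LIB_DIR and CLASS_FILE are examples -- substitute your own values.
LIB_DIR="${LIB_DIR:-/usr/lib/hive/lib}"
CLASS_FILE="oracle/hcat/osh/OracleStorageHandler.class"

found=""
for j in "$LIB_DIR"/*.jar; do
  [ -f "$j" ] || continue                      # glob may match nothing
  # list the jar's contents and look for the class path inside it
  if unzip -l "$j" 2>/dev/null | grep -q "$CLASS_FILE"; then
    found="$found $j"
  fi
done
if [ -n "$found" ]; then
  echo "class found in:$found"
else
  echo "class not found under $LIB_DIR"
fi
```

If the class is in none of the jars, the jar providing it still needs to be copied in; if it is present but the error persists, the service likely needs a restart to pick up the new classpath.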

New Contributor

@Jay Kumar SenSharma

Thanks for your response. Now I am getting another error.