
Cloudera Hive ODBC driver error

New Contributor

I am working on a proof-of-concept project to see how we can query/transfer data between Oracle and our corporate data lake.

 

I installed the Cloudera ODBC Hive and Impala drivers. When I tried to test the connection with isql, it failed with the following error:

[S1000][unixODBC][Cloudera][Hardy] (34) Error from server: connect() failed: Connection refused.

[ISQL]ERROR: Could not SQLConnect


Here are the detailed steps I followed; any input would be appreciated.

 

Linda

 

Environment: Red Hat Linux 6.9 with unixODBC in place

Oracle DB: 12.1.0.2 with SID lvdma01 and a schema named hivetest created

 

  1. Downloaded and installed ClouderaHiveODBC-2.5.19.1004-1.el6.x86_64.rpm as root
  2. Downloaded and installed ClouderaImpalaODBC-2.5.33.1004-1.el6.x86_64.rpm
  3. Configured the Cloudera Hive ODBC driver after installation:

 

cd /opt/cloudera/hiveodbc/Setup

mkdir -p /usr/local/odbc

cp /opt/cloudera/hiveodbc/Setup/odbc.ini /opt/home/oracle/    # Note: the oracle home dir is /opt/home/oracle

cp /opt/cloudera/hiveodbc/lib/64/cloudera.hiveodbc.ini /etc/

  4. Set the Cloudera environment variables:


export ORACLE_SID=lvdma01

export ORACLE_BASE=/u01/app/oracle

export ORACLE_HOME=/u01/app/oracle/product/12.1.0.2

export ODBCINI=~/odbc.ini

export ODBCSYSINI=/usr/local/odbc

export CLOUDERAHIVEINI=/etc/cloudera.hiveodbc.ini

export LD_LIBRARY_PATH=$ORACLE_HOME/lib

export PATH=/usr/bin:/usr/ucb:/usr/sbin:/etc:$ORACLE_HOME/bin:$ORACLE_HOME/OPatch:/bin:/usr/ccs/bin:/usr/local/bin:.


  5. Edited the odbc.ini file in /opt/home/oracle/odbc.ini (changed only the HOST, Schema, PORT, and UID under [Cloudera ODBC Driver for Apache Hive (64-bit) DSN] and renamed the data source to HiveDSN, as follows):

[HiveDSN]

HOST=uslv-sdbx-ora02

PORT=10000

# The name of the database schema to use when a schema is not explicitly specified in a query.

Schema=hivetest

# Set the UID with the user name to use to access Hive when using AuthMech 2 to 8.

UID=cloudera
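A quick way to confirm the edited file actually defines the DSN before running isql is to parse it. A minimal sketch using Python's standard-library configparser; the path and DSN name are the ones from the steps above:

```python
# Sanity-check that an odbc.ini file defines the expected DSN section.
import configparser

def dsn_defined(ini_path, dsn_name):
    """Return True if ini_path contains a [dsn_name] section."""
    parser = configparser.ConfigParser(strict=False)
    parser.read(ini_path)
    return parser.has_section(dsn_name)

if __name__ == "__main__":
    # Path and DSN name from the steps above.
    print(dsn_defined("/opt/home/oracle/odbc.ini", "HiveDSN"))
```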

 

  6. Sourced the environment file.
  7. Tested the connection, which failed as shown below:

uslv-sdbx-ora02.amgen.com:oracle[lvdma01]$ isql -v hivedsn

[S1000][unixODBC][Cloudera][Hardy] (34) Error from server: connect() failed: Connection refused.

[ISQL]ERROR: Could not SQLConnect

 

5 REPLIES

Champion
The Cloudera ODBC (Hive and Impala) drivers are made to let you connect to those services and run queries. They are not meant to transfer data between an RDBMS and Hadoop/Hive. For that you will want to use Sqoop.

https://sqoop.apache.org/docs/1.4.6/
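A Sqoop import from Oracle into Hive looks roughly like this. This is a sketch only; the JDBC URL, credentials, and table names below are placeholders, not values from this thread:

```shell
# Hypothetical Sqoop import from Oracle into Hive (all values are placeholders).
sqoop import \
  --connect jdbc:oracle:thin:@//oracle-host:1521/ORCLSID \
  --username scott -P \
  --table HIVETEST.SOME_TABLE \
  --hive-import \
  --hive-database hivetest \
  --hive-table some_table
```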

The error itself just states that the service at uslv-sdbx-ora02 on port 10000 (the default HiveServer2 port) refused the connection. That can mean anything from Hive not running on that host (or running on a different port), to a firewall blocking access, to a problem with HS2 itself that prevents clients from connecting.

Please verify that the HiveServer2 process is running on that host and listening on that port.
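A quick reachability check from the client machine can rule out network and firewall issues before digging into HS2 itself. A minimal sketch; the host and port are the values from the post:

```python
# Quick TCP reachability check for HiveServer2 from the ODBC client machine.
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Host and port from the DSN in the post.
    print(port_open("uslv-sdbx-ora02", 10000))
```

If this returns False, the problem is below the ODBC layer (service down, wrong port, or firewall), not the driver configuration.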

New Contributor

Thanks for the quick reply.

 

I am new to Hive. I will do a bit more research and let you know what the issue might be.

 

Linda

New Contributor

I am able to connect to Hive using Cloudera's ODBC driver for Hive now.

 

Now I am configuring the Cloudera Impala ODBC driver on the same database server. When I test the connection, I run into the following error:

 

$ isql -v impldsn
[S1000][unixODBC][Cloudera][ODBC] (11560) Unable to locate SQLGetPrivateProfileString function.
[ISQL]ERROR: Could not SQLConnect

 

What does this error indicate?

 

Linda

 

Here is my IMPLDSN entry in my odbc.ini file:

 

[IMPLDSN]
# Description: DSN Description.
# This key is not necessary and is only to give a description of the data source.
Description=Cloudera ODBC Driver for Impala (64-bit) DSN

# Driver: The location where the ODBC driver is installed to.
Driver=/opt/cloudera/impalaodbc/lib/64/libclouderaimpalaodbc64.so

# The DriverUnicodeEncoding setting is only used for SimbaDM
# When set to 1, SimbaDM runs in UTF-16 mode.
# When set to 2, SimbaDM runs in UTF-8 mode.
#DriverUnicodeEncoding=2

# Values for HOST, PORT, KrbFQDN, and KrbServiceName should be set here.
# They can also be specified on the connection string.
HOST=i-d9f3d710.aws.###.com
PORT=21050
Database=default

# The authentication mechanism.
# 0 - no authentication.
# 1 - Kerberos authentication
# 2 - Username authentication.
# 3 - Username/password authentication.
# 4 - Username/password authentication with SSL.
AuthMech=1

# Kerberos related settings.
KrbFQDN=i-d9f3d710.aws.###com
KrbRealm=HADOOP.###.COM
KrbServiceName=impala

# Username/password authentication with SSL settings.
UID=
PWD=
CAIssuedCertNamesMismatch=1
TrustedCerts=/opt/cloudera/impalaodbc/lib/64/cacerts.pem

# General settings
TSaslTransportBufSize=1000
RowsFetchedPerBlock=10000
SocketTimeout=0
StringColumnLength=32767
UseNativeQuery=0
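For reference, error 11560 generally means the driver could not load the driver manager's installer library, which is what exports SQLGetPrivateProfileString. With unixODBC this is commonly addressed in the driver's own configuration file. A sketch only; the library path below is an assumption and depends on where unixODBC is installed on your system:

```ini
# Sketch of the [Driver] section in /etc/cloudera.impalaodbc.ini.
[Driver]
# Point the driver at unixODBC's installer library, which provides
# SQLGetPrivateProfileString. Adjust the path to your unixODBC install.
ODBCInstLib=/usr/lib64/libodbcinst.so
# unixODBC is typically built with UTF-16 wide characters.
DriverManagerEncoding=UTF-16
```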

New Contributor

Can you please guide me on how to connect to Hive using the Cloudera ODBC driver? My cluster is Kerberized.
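In case it helps, a Kerberized Hive DSN generally combines AuthMech=1 with the Kerberos fields, and you need a valid ticket (kinit) before connecting. A sketch only; every hostname, realm, and path below is a placeholder, not a value from this thread:

```ini
# Hypothetical Hive DSN for a Kerberized cluster (all values are placeholders).
[HiveKrbDSN]
Driver=/opt/cloudera/hiveodbc/lib/64/libclouderahiveodbc64.so
HOST=hs2-host.example.com
PORT=10000
Schema=default
# 1 = Kerberos authentication
AuthMech=1
KrbHostFQDN=hs2-host.example.com
KrbServiceName=hive
KrbRealm=EXAMPLE.COM
```

Obtain a ticket first (e.g. `kinit user@EXAMPLE.COM`), then test with `isql -v HiveKrbDSN`.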

New Contributor

I don't think it's true that the Cloudera ODBC driver doesn't support inserts. By defining the table as a transactional table, you can insert data.

 

CREATE TABLE insert_test(
column1 string,
column2 string)
clustered by (column1)
into 3 buckets
stored as orcfile
TBLPROPERTIES ('transactional'='true');

 

insert into table efvci_lnd_edw_dev.insert_test values('1', 'One');
insert into table efvci_lnd_edw_dev.insert_test values('2', 'Two');

 

Thanks,

Chirag Patel