Support Questions
Find answers, ask questions, and share your expertise

Cloudera Hive ODBC driver error

New Contributor

I am working on a 'proof of concept' project to see how we can query/transfer data between Oracle and our corporate data lake.


I installed the Cloudera ODBC Hive and Impala drivers. When I tried to test an isql connection, it failed with the following error:

[S1000][unixODBC][Cloudera][Hardy] (34) Error from server: connect() failed: Connection refused.

[ISQL]ERROR: Could not SQLConnect




Here are the detailed steps I followed; any input would be appreciated.




Environment: Red Hat Linux 6.9 with unixODBC already in place

Oracle DB: 12.1.02 with SID lvdma01 and a hivetest schema created


  1. Downloaded and installed ClouderaHiveODBC- as root
  2. Downloaded and installed ClouderaImpalaODBC-
  3. Configured the Cloudera Hive ODBC driver after installation:


cd /opt/cloudera/hiveodbc/Setup

mkdir -p /usr/local/odbc

cp /opt/cloudera/hiveodbc/Setup/odbc.ini /opt/home/oracle/    # Note: oracle home dir is /opt/home/oracle

cp /opt/cloudera/hiveodbc/lib/64/cloudera.hiveodbc.ini /etc/

  4. Set the Cloudera environment variables:



export ORACLE_SID=lvdma01

export ORACLE_BASE=/u01/app/oracle

export ORACLE_HOME=/u01/app/oracle/product/

export ODBCINI=~/odbc.ini

export ODBCSYSINI=/usr/local/odbc

export CLOUDERAHIVEINI=/etc/cloudera.hiveodbc.ini


export PATH=/usr/bin:/usr/ucb:/usr/sbin:/etc:$ORACLE_HOME/bin:$ORACLE_HOME/OPatch:/bin:/usr/ccs/bin:/usr/local/bin:.



  5. Edited the odbc.ini file in /opt/home/oracle/odbc.ini (changed only the HOST, Schema, PORT, and UID under [Cloudera ODBC Driver for Apache Hive (64-bit) DSN] and renamed the data source to HIVEDSN as follows):




# The name of the database schema to use when a schema is not explicitly specified in a query.


# Set the UID with the user name to use to access Hive when using AuthMech 2 to 8.


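For reference, a minimal HIVEDSN entry might look like the following sketch. The host name, port, schema, and user shown here are placeholders (not values from this thread), and AuthMech=2 assumes plain user-name authentication; the Driver path assumes the default 64-bit install location.

```ini
[ODBC Data Sources]
HIVEDSN=Cloudera ODBC Driver for Apache Hive (64-bit)

[HIVEDSN]
# Default 64-bit install path; verify against your installation.
Driver=/opt/cloudera/hiveodbc/lib/64/libclouderahiveodbc64.so
HiveServerType=2
HOST=your-hiveserver2-host
PORT=10000
Schema=default
# 2 = User Name authentication; see the driver install guide for other mechanisms.
AuthMech=2
UID=your-user
```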

  6. Sourced the environment file.
  7. Tested the connection, which failed as shown below:

[lvdma01]$ isql -v hivedsn

[S1000][unixODBC][Cloudera][Hardy] (34) Error from server: connect() failed: Connection refused.

[ISQL]ERROR: Could not SQLConnect



The Cloudera ODBC (Hive or Impala) drivers are designed to let you connect to those services and run queries. They are not meant to transfer data between an RDBMS and Hadoop/Hive; for that you will want to use Sqoop.

The error itself just states that the service at uslv-sdbx-ora02 on port 10000 (the default HiveServer2 port) refused the connection. That can mean anything from Hive not running on that host (or listening on a different port), to a firewall blocking access, to a problem with HS2 itself that prevents clients from connecting.

Please verify that the HiveServer2 process is running on that host and listening on that port.
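A quick way to check reachability from the client box is to probe the HiveServer2 port directly. This is a sketch using bash's built-in /dev/tcp pseudo-device (no extra tools needed); the host name is the one from the error above, so substitute your own:

```shell
# Return success if a TCP connection to $1:$2 can be opened within 2 seconds.
port_open() {
  timeout 2 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

# Probe the default HiveServer2 port on the host from the error message.
if port_open uslv-sdbx-ora02 10000; then
  echo "HiveServer2 port is reachable"
else
  echo "connection refused or timed out"
fi
```

If the probe fails, check on the Hive host which port HS2 is actually listening on (e.g. with `netstat -ltn`) before suspecting the driver configuration.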

New Contributor

Thanks for the quick reply.


I am new to Hive; I will do a bit more research and let you know what the issue turns out to be.



New Contributor

I am able to connect to Hive using Cloudera's ODBC driver for Hive now.


Now I am configuring the Cloudera Impala ODBC driver on the same database server. When I test the connection, I run into the following error. What does this indicate?


$ isql -v impldsn
[S1000][unixODBC][Cloudera][ODBC] (11560) Unable to locate SQLGetPrivateProfileString function.
[ISQL]ERROR: Could not SQLConnect






Here is the IMPLDSN entry in my odbc.ini file:


# Description: DSN Description.
# This key is not necessary and is only to give a description of the data source.
Description=Cloudera ODBC Driver for Impala (64-bit) DSN

# Driver: The location where the ODBC driver is installed to.

# The DriverUnicodeEncoding setting is only used for SimbaDM
# When set to 1, SimbaDM runs in UTF-16 mode.
# When set to 2, SimbaDM runs in UTF-8 mode.

# Values for HOST, PORT, KrbFQDN, and KrbServiceName should be set here.
# They can also be specified on the connection string.

# The authentication mechanism.
# 0 - no authentication.
# 1 - Kerberos authentication
# 2 - Username authentication.
# 3 - Username/password authentication.
# 4 - Username/password authentication with SSL.

# Kerberos related settings.

# Username/password authentication with SSL settings.

# General settings
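On the (11560) error above: with these Simba-based Cloudera drivers, "Unable to locate SQLGetPrivateProfileString function" usually means the driver could not load the driver-manager library named by ODBCInstLib in its own configuration file (cloudera.impalaodbc.ini), or could not find that file at all. A sketch of the usual fix, assuming a default install and unixODBC; the library path shown is an assumption, so adjust it to wherever your libodbcinst.so actually lives:

```ini
# /opt/cloudera/impalaodbc/lib/64/cloudera.impalaodbc.ini
[Driver]
# Point the driver at unixODBC's installer library.
ODBCInstLib=/usr/lib64/libodbcinst.so.2
```

Also make sure the driver can find this file, mirroring the CLOUDERAHIVEINI step earlier in the thread (e.g. export CLOUDERAIMPALAINI pointing at cloudera.impalaodbc.ini).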

New Contributor

Can you please guide me on how to connect to Hive using the Cloudera ODBC driver? My cluster is Kerberized.

New Contributor

I don't think it is true that the Cloudera ODBC driver doesn't support inserts. By defining the table as a transactional table, you can insert data:


CREATE TABLE insert_test(
column1 string,
column2 string)
clustered by (column1)
into 3 buckets
stored as orcfile
TBLPROPERTIES ('transactional'='true');


insert into table efvci_lnd_edw_dev.insert_test values('1', 'One');
insert into table efvci_lnd_edw_dev.insert_test values('2', 'Two');



Chirag Patel