Support Questions


Cannot Create Hive Connection Pool.

Super Collaborator

Hi,

I am getting errors while trying to connect to Hive in our Kerberized Hadoop cluster. Here is what I am doing.

NiFi config properties (nifi.properties):

# kerberos #

nifi.kerberos.krb5.file=/etc/krb5.conf

# kerberos service principal #

nifi.kerberos.service.principal=nifi/ourserver@ourdomain

nifi.kerberos.service.keytab.location=/etc/security/keytabs/nifi.keytab

Configure Controller Service

9072-hive.png

As soon as I enable it, I see some warnings:

9073-hive2.png

and I get these errors when I try to use the connection in a SelectHiveQL processor:

9074-hive3.png

16:33:08 UTC ERROR 7c2b4a17-f772-1ea7-54c9-99cc4d8dea09
HiveConnectionPool[id=7c2b4a17-f772-1ea7-54c9-99cc4d8dea09] Error getting Hive connection
16:33:08 UTC ERROR d3b62ee6-0157-1000-b66f-364970fcfa98
SelectHiveQL[id=d3b62ee6-0157-1000-b66f-364970fcfa98] Unable to execute HiveQL select query show tables due to org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Could not open client transport with JDBC Uri: jdbc:hive2://myserver:10000/hdf_moat;principal=hive/myserver@mydomain: GSS initiate failed). No FlowFile to route to failure: org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Could not open client transport with JDBC Uri: jdbc:hive2://myserver:10000/hdf_moat;principal=hive/mydomain: GSS initiate failed)

Regards,

1 ACCEPTED SOLUTION

Master Guru

The Hive processors share some code with the Hadoop processors (in terms of Kerberos, etc.); they expect "hadoop.security.authentication" to be set to "kerberos" in your config file(s) (e.g., core-site.xml, hive-site.xml).


5 REPLIES

Master Guru

The Hive processors share some code with the Hadoop processors (in terms of Kerberos, etc.); they expect "hadoop.security.authentication" to be set to "kerberos" in your config file(s) (e.g., core-site.xml, hive-site.xml).
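For reference, this is a standard Hadoop snippet (not specific to any one cluster) showing how that property appears in core-site.xml:

```xml
<!-- core-site.xml: switches Hadoop client-side authentication to Kerberos -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
```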

Super Collaborator

@Matt Burgess

In both of those files it's set to Kerberos. Here are some properties from hive-site.xml. Are there any other files I need to check?

9076-hivedb.png

Master Guru

In your snippet above, the property set to "KERBEROS" is "hive.server2.authentication", not "hadoop.security.authentication". If "hadoop.security.authentication" is set to "kerberos" in your core-site.xml, ensure the path to your core-site.xml is in the Hive Configuration Resources property. That property accepts a comma-separated list of files, so you can include your hive-site.xml (as you've done in your above screenshot) as well as the core-site.xml file (which has the aforementioned property set).
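As a sketch (the paths are hypothetical and depend on where your client configs live), the Hive Configuration Resources property would then hold a comma-separated list such as:

```
/etc/hive/conf/hive-site.xml,/etc/hadoop/conf/core-site.xml
```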

Super Collaborator

@Matt Burgess

Adding core-site.xml eliminated the warning, but I also had to change the Kerberos principal to the fully qualified name; then it started working. Thanks for the help.
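To illustrate that second fix (all hostnames here are hypothetical), the principal in the JDBC URL has to carry the fully qualified hostname of the HiveServer2 host, not the short name:

```shell
# Hypothetical names for illustration only.
# Short-name principal of the kind that failed with "GSS initiate failed":
short_principal="hive/myserver@MYDOMAIN.COM"

# Fully qualified principal; on a real host, derive the name with `hostname -f`:
fqdn="myserver.mydomain.com"
full_principal="hive/${fqdn}@MYDOMAIN.COM"

# Resulting JDBC URL with the fully qualified principal:
echo "jdbc:hive2://${fqdn}:10000/hdf_moat;principal=${full_principal}"
```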

9081-hivedb.png

New Contributor

Hi All,

I am still facing the issue after adding the property below to both hive-site.xml and core-site.xml.

<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>

I am getting the error below:

org.apache.commons.dbcp.SQLNestedException: Cannot create JDBC driver of class 'org.apache.hive.jdbc.HiveDriver' for connect URL
jdbc:hive2://ux329tas101.ux.hostname.net:10000/default;principal=<principal name>;ssl=true

 

Could you please help me with this?
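One thing worth checking with ssl=true: the Hive JDBC driver also needs to trust HiveServer2's certificate, and the connection URL supports truststore parameters for that. A sketch, with a hypothetical truststore path and password:

```
jdbc:hive2://ux329tas101.ux.hostname.net:10000/default;principal=<principal name>;ssl=true;sslTrustStore=/path/to/truststore.jks;trustStorePassword=changeit
```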

 

Regards,

Swadesh Mondal