
Knox Hive HA Configuration Does Not Work in HDP-2.3.4.0


Contributor

Hi,

We are using HDP-2.3.4.0, which ships with Knox 0.6.0.

I have followed the steps described in: https://community.hortonworks.com/articles/72431/setup-knox-over-highly-available-hiveserver2-insta....

However, the Knox Hive HA configuration does not work.

I get the following error:

beeline> !connect jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive
Connecting to jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive
Enter username for jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive: test
Enter password for jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive: *********************
18/05/11 00:29:35 [main]: WARN jdbc.Utils: ***** JDBC param deprecation *****
18/05/11 00:29:35 [main]: WARN jdbc.Utils: The use of hive.server2.transport.mode is deprecated.
18/05/11 00:29:35 [main]: WARN jdbc.Utils: Please use transportMode like so: jdbc:hive2://<host>:<port>/dbName;transportMode=<transport_mode_value>
18/05/11 00:29:35 [main]: WARN jdbc.Utils: ***** JDBC param deprecation *****
18/05/11 00:29:35 [main]: WARN jdbc.Utils: The use of hive.server2.thrift.http.path is deprecated.
18/05/11 00:29:35 [main]: WARN jdbc.Utils: Please use httpPath like so: jdbc:hive2://<host>:<port>/dbName;httpPath=<http_path_value>
Error: Could not open client transport with JDBC Uri: jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive: Could not create http connection to jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive. HTTP Response code: 500 (state=08S01,code=0)
0: jdbc:hive2://hadmgrndcc03-1.test.org:84 (closed)>
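(For reference, the deprecation warnings above suggest moving the transport settings into the semicolon-separated part of the URL as transportMode and httpPath. A sketch of the equivalent connection string, assuming the same host, truststore, and gateway path as above:

```
beeline> !connect jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test;transportMode=http;httpPath=gateway/default/hive
```

This only silences the deprecation warnings; it does not by itself fix the HTTP 500 error.)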

In gateway.log, I am getting following error:

2018-05-11 00:29:37,471 INFO  hadoop.gateway (AclsAuthorizationFilter.java:doFilter(85)) - Access Granted: true
2018-05-11 00:29:37,645 WARN  hadoop.gateway (DefaultDispatch.java:executeOutboundRequest(129)) - Connection exception dispatching request: HIVE?user.name=testit org.apache.http.client.ClientProtocolException
org.apache.http.client.ClientProtocolException
        at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:186)
        at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
        at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.http.ProtocolException: Target host is not specified
        at org.apache.http.impl.conn.DefaultRoutePlanner.determineRoute(DefaultRoutePlanner.java:69)
        at org.apache.http.impl.client.InternalHttpClient.determineRoute(InternalHttpClient.java:124)
        at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:183)
        ... 84 more
2018-05-11 00:29:37,657 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(63)) - Failed to execute filter: java.io.IOException: Service connectivity error.
2018-05-11 00:29:37,657 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(63)) - Failed to execute filter: java.io.IOException: Service connectivity error.
2018-05-11 00:29:37,658 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(66)) - Failed to execute filter: javax.servlet.ServletException: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: Service connectivity error.

I have also tried the configuration described in: https://community.hortonworks.com/content/supportkb/150624/beeline-via-knox-fails-with-target-host-i...

But I still get the same error.

Is Hive HA supported in HDP-2.3.4.0?

How can I resolve this? Please suggest.

Thanks in advance.

1 ACCEPTED SOLUTION

Accepted Solutions

Re: Knox Hive HA Configuration Does Not Work in HDP-2.3.4.0

Contributor

@Bhushan Kandalkar

This has been fixed in HDP 2.5+

10 REPLIES

Re: Knox Hive HA Configuration Does Not Work in HDP-2.3.4.0

Contributor
@Bhushan Kandalkar

This issue may occur when there is unnecessary whitespace or a line feed in the Knox topology, like below:

<param>
<name>HIVE</name>
<value>maxFailoverAttempts=3;failoverSleep=1000;enabled=true;zookeeperEnsemble=zk1:2181,zk2:2181,zk3:2181;
zookeeperNamespace=hiveserver2</value>
</param>


Change it as shown below, then restart Knox:
<param>
<name>HIVE</name>
<value>maxFailoverAttempts=3;failoverSleep=1000;enabled=true;zookeeperEnsemble=zk1:2181,zk2:2181,zk3:2181;zookeeperNamespace=hiveserver2</value>
</param>

If that does not help, please share your Knox topology.
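As a quick way to spot this mistake, the topology file can be scanned for embedded whitespace inside param values. A minimal sketch (not part of Knox; the `find_whitespace_params` helper is hypothetical):

```python
import xml.etree.ElementTree as ET

def find_whitespace_params(topology_xml: str):
    """Return the names of <param> entries whose <value> contains an
    embedded line feed or leading/trailing whitespace."""
    root = ET.fromstring(topology_xml)
    bad = []
    for param in root.iter("param"):
        name = param.findtext("name", default="")
        value = param.findtext("value", default="") or ""
        if "\n" in value or value != value.strip():
            bad.append(name)
    return bad

# A broken topology fragment with a line feed inside the HIVE value.
sample = """<topology><gateway><provider><param>
<name>HIVE</name>
<value>maxFailoverAttempts=3;failoverSleep=1000;enabled=true;
zookeeperNamespace=hiveserver2</value>
</param></provider></gateway></topology>"""

print(find_whitespace_params(sample))  # ['HIVE']
```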


Re: Knox Hive HA Configuration Does Not Work in HDP-2.3.4.0

Contributor

@Rishi, I tried that, but I am still getting the same error. Please suggest.

Here is my knox topology:

 <topology>
            <gateway>
                <provider>
                    <role>authentication</role>
                    <name>ShiroProvider</name>
                    <enabled>true</enabled>
                    <param>
                        <name>sessionTimeout</name>
                        <value>30</value>
                    </param>
                    <param>
                        <name>main.ldapRealm</name>
                        <value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
                    </param>
                    <param>
                        <name>main.ldapRealm.userDnTemplate</name>
                        <value>uid={0},ou=people,dc=hadoop,dc=apache,dc=org</value>
                    </param>
                    <param>
                        <name>main.ldapRealm.contextFactory.url</name>
                        <value>ldap://{{knox_host_name}}:33389</value>
                    </param>
                    <param>
                        <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
                        <value>simple</value>
                    </param>
                    <param>
                        <name>urls./**</name>
                        <value>authcBasic</value>
                    </param>
                </provider>


                <provider>
                    <role>identity-assertion</role>
                    <name>Default</name>
                    <enabled>true</enabled>
                </provider>

                <provider>
                    <role>authorization</role>
                    <name>AclsAuthz</name>
                    <enabled>true</enabled>
                </provider>
             
                <provider>
                    <role>ha</role>
                    <name>HaProvider</name>
                    <enabled>true</enabled>
                    <param>
                        <name>WEBHDFS</name>
                        <value>maxFailoverAttempts=3;failoverSleep=1000;maxRetryAttempts=300;retrySleep=1000;enabled=true</value>
                    </param>
                </provider>


                <provider>
                    <role>ha</role>
                    <name>HaProvider</name>
                    <enabled>true</enabled>
                    <param>
                        <name>HIVE</name>
                        <value>maxFailoverAttempts=3;failoverSleep=1000;enabled=true;zookeeperEnsemble=hadmgrndcc03-1.test.org:2181,hadmgrndcc03-2.test.org:2181,hadmgrndcc03-3.test.org:2181;zookeeperNamespace=hiveserver2</value>
                    </param>
                </provider>
            </gateway>

            <service>
                <role>NAMENODE</role>
                <url>hdfs://C03</url>
            </service>


            <service>
                <role>JOBTRACKER</role>
                <url>rpc://{{rm_host}}:{{jt_rpc_port}}</url>
            </service>


            <service>
                <role>WEBHDFS</role>
                <url>http://hadmgrndcc03-1.test.org:50070/webhdfs</url>
                <url>http://hadmgrndcc03-2.test.org:50070/webhdfs</url>
            </service>


            <service>
                <role>WEBHCAT</role>
                <url>http://{{webhcat_server_host}}:{{templeton_port}}/templeton</url>
            </service>


            <service>
                <role>OOZIE</role>
                <url>http://{{oozie_server_host}}:{{oozie_server_port}}/oozie</url>
            </service>


            <service>
                <role>WEBHBASE</role>
                <url>http://{{hbase_master_host}}:{{hbase_master_port}}</url>
            </service>


            <service>
                <role>HIVE</role>
             </service>


            <service>
                <role>RESOURCEMANAGER</role>
                <url>http://{{rm_host}}:{{rm_port}}/ws</url>
            </service>
        </topology>


Re: Knox Hive HA Configuration Does Not Work in HDP-2.3.4.0

Contributor

@Bhushan Kandalkar

Please share the output of the following command:

 #/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server hadmgrndcc03-3.test.org:2181 ls /hiveserver2 | tail

Re: Knox Hive HA Configuration Does Not Work in HDP-2.3.4.0

Contributor

@Rishi, here is the required output:

root@hadrndcc03-3:~# /usr/hdp/current/zookeeper-client/bin/zkCli.sh -server hadmgrndcc03-3.test.org:2181 ls /hiveserver2 | tail
2018-05-11 03:48:42,234 - INFO  [main:Environment@100] - Client environment:user.dir=/root
2018-05-11 03:48:42,235 - INFO  [main:ZooKeeper@438] - Initiating client connection, connectString=hadmgrndcc03-3.test.org:2181 sessionTimeout=30000 watcher=org.apache.zookeeper.ZooKeeperMain$MyWatcher@7e32c033
2018-05-11 03:48:42,259 - INFO  [main-SendThread(hadmgrndcc03-3.test.org:2181):ClientCnxn$SendThread@1019] - Opening socket connection to server hadmgrndcc03-3.test.org/172.17.20.33:2181. Will not attempt to authenticate using SASL (unknown error)
2018-05-11 03:48:42,371 - INFO  [main-SendThread(hadmgrndcc03-3.test.org:2181):ClientCnxn$SendThread@864] - Socket connection established to hadmgrndcc03-3.test.org/172.17.20.33:2181, initiating session
2018-05-11 03:48:42,382 - INFO  [main-SendThread(hadmgrndcc03-3.test.org:2181):ClientCnxn$SendThread@1279] - Session establishment complete on server hadmgrndcc03-3.test.org/172.17.20.33:2181, sessionid = 0x3632acb1c59002b, negotiated timeout = 30000


WATCHER::


WatchedEvent state:SyncConnected type:None path:null
[serverUri=hadmgrndcc03-3.test.org:10001;version=1.2.1.2.3.4.0-3485;sequence=0000000113, serverUri=hadmgrndcc03-2.test.org:10001;version=1.2.1.2.3.4.0-3485;sequence=0000000114]
root@hadmgrndcc03-3:~#
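For what it's worth, those znode names encode the registered HiveServer2 instances, and the HA dispatch reads the serverUri field out of them. A small illustrative parser (`parse_hs2_znodes` is a hypothetical helper, not Knox code):

```python
def parse_hs2_znodes(entries):
    """Extract the serverUri host:port from HiveServer2 znode names of
    the form serverUri=<host:port>;version=<v>;sequence=<n>."""
    uris = []
    for entry in entries:
        fields = dict(field.split("=", 1) for field in entry.split(";"))
        uris.append(fields["serverUri"])
    return uris

# The two entries reported by zkCli above.
znodes = [
    "serverUri=hadmgrndcc03-3.test.org:10001;version=1.2.1.2.3.4.0-3485;sequence=0000000113",
    "serverUri=hadmgrndcc03-2.test.org:10001;version=1.2.1.2.3.4.0-3485;sequence=0000000114",
]
print(parse_hs2_znodes(znodes))
# ['hadmgrndcc03-3.test.org:10001', 'hadmgrndcc03-2.test.org:10001']
```

So both HiveServer2 instances are registered in ZooKeeper, which rules out the registration side.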

Re: Knox Hive HA Configuration Does Not Work in HDP-2.3.4.0

Contributor

@Bhushan Kandalkar

Please try adding the HiveServer2 URLs explicitly to the HIVE service, then retry:

<service>
    <role>HIVE</role>
    <url>hive1</url>
    <url>hive2</url>
</service>
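Filled in with the hosts reported under /hiveserver2 earlier in this thread, that could look like the fragment below. The 10001 port comes from the znode listing, while the /cliservice path is an assumption based on the usual HTTP-mode default; adjust both to match your hive-site.xml:

```
<service>
    <role>HIVE</role>
    <url>http://hadmgrndcc03-2.test.org:10001/cliservice</url>
    <url>http://hadmgrndcc03-3.test.org:10001/cliservice</url>
</service>
```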

Re: Knox Hive HA Configuration Does Not Work in HDP-2.3.4.0

Contributor

@Bhushan Kandalkar

You are using an old version of HDP: Knox HA support for Hive is only available from Knox 0.7 onward. For now, you have to add the URLs in the service section. See:

https://issues-test.apache.org/jira/browse/KNOX-570

Re: Knox Hive HA Configuration Does Not Work in HDP-2.3.4.0

Contributor

@Rishi, in which HDP version is it fixed?

Re: Knox Hive HA Configuration Does Not Work in HDP-2.3.4.0

Contributor

@Bhushan Kandalkar

This has been fixed in HDP 2.5+