Member since: 08-10-2017 | Posts: 108 | Kudos Received: 2 | Solutions: 7
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2904 | 01-28-2019 08:41 AM
 | 4822 | 01-28-2019 08:35 AM
 | 2659 | 12-18-2018 05:42 AM
 | 7813 | 08-16-2018 12:12 PM
 | 3008 | 07-24-2018 06:55 AM
05-11-2018 08:59 AM
@Rishi, here is the required output:
root@hadmgrndcc03-3:~# /usr/hdp/current/zookeeper-client/bin/zkCli.sh -server hadmgrndcc03-3.test.org:2181 ls /hiveserver2 | tail
2018-05-11 03:48:42,234 - INFO [main:Environment@100] - Client environment:user.dir=/root
2018-05-11 03:48:42,235 - INFO [main:ZooKeeper@438] - Initiating client connection, connectString=hadmgrndcc03-3.test.org:2181 sessionTimeout=30000 watcher=org.apache.zookeeper.ZooKeeperMain$MyWatcher@7e32c033
2018-05-11 03:48:42,259 - INFO [main-SendThread(hadmgrndcc03-3.test.org:2181):ClientCnxn$SendThread@1019] - Opening socket connection to server hadmgrndcc03-3.test.org/172.17.20.33:2181. Will not attempt to authenticate using SASL (unknown error)
2018-05-11 03:48:42,371 - INFO [main-SendThread(hadmgrndcc03-3.test.org:2181):ClientCnxn$SendThread@864] - Socket connection established to hadmgrndcc03-3.test.org/172.17.20.33:2181, initiating session
2018-05-11 03:48:42,382 - INFO [main-SendThread(hadmgrndcc03-3.test.org:2181):ClientCnxn$SendThread@1279] - Session establishment complete on server hadmgrndcc03-3.test.org/172.17.20.33:2181, sessionid = 0x3632acb1c59002b, negotiated timeout = 30000
WATCHER::
WatchedEvent state:SyncConnected type:None path:null
[serverUri=hadmgrndcc03-3.test.org:10001;version=1.2.1.2.3.4.0-3485;sequence=0000000113, serverUri=hadmgrndcc03-2.test.org:10001;version=1.2.1.2.3.4.0-3485;sequence=0000000114]
root@hadmgrndcc03-3:~#
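A minimal follow-up check, assuming the znode names from the listing above: dump one registered HiveServer2 entry so the serverUri that the Knox HIVE HaProvider will resolve can be confirmed directly.
# Sketch only; the znode path is taken from the ls output above and quoted because it contains ';' and '='.
/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server hadmgrndcc03-3.test.org:2181 \
  get "/hiveserver2/serverUri=hadmgrndcc03-3.test.org:10001;version=1.2.1.2.3.4.0-3485;sequence=0000000113"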
05-11-2018 08:26 AM
@Rishi, I tried that but am still getting the same error. Please suggest. Here is my Knox topology:
<topology>
<gateway>
<provider>
<role>authentication</role>
<name>ShiroProvider</name>
<enabled>true</enabled>
<param>
<name>sessionTimeout</name>
<value>30</value>
</param>
<param>
<name>main.ldapRealm</name>
<value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
</param>
<param>
<name>main.ldapRealm.userDnTemplate</name>
<value>uid={0},ou=people,dc=hadoop,dc=apache,dc=org</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.url</name>
<value>ldap://{{knox_host_name}}:33389</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.authenticationMechanism</name>
<value>simple</value>
</param>
<param>
<name>urls./**</name>
<value>authcBasic</value>
</param>
</provider>
<provider>
<role>identity-assertion</role>
<name>Default</name>
<enabled>true</enabled>
</provider>
<provider>
<role>authorization</role>
<name>AclsAuthz</name>
<enabled>true</enabled>
</provider>
<provider>
<role>ha</role>
<name>HaProvider</name>
<enabled>true</enabled>
<param>
<name>WEBHDFS</name>
<value>maxFailoverAttempts=3;failoverSleep=1000;maxRetryAttempts=300;retrySleep=1000;enabled=true</value>
</param>
</provider>
<provider>
<role>ha</role>
<name>HaProvider</name>
<enabled>true</enabled>
<param>
<name>HIVE</name>
<value>maxFailoverAttempts=3;failoverSleep=1000;enabled=true;zookeeperEnsemble=hadmgrndcc03-1.test.org:2181,hadmgrndcc03-2.test.org:2181,hadmgrndcc03-3.test.org:2181;zookeeperNamespace=hiveserver2</value>
</param>
</provider>
</gateway>
<service>
<role>NAMENODE</role>
<url>hdfs://C03</url>
</service>
<service>
<role>JOBTRACKER</role>
<url>rpc://{{rm_host}}:{{jt_rpc_port}}</url>
</service>
<service>
<role>WEBHDFS</role>
<url>http://hadmgrndcc03-1.test.org:50070/webhdfs</url>
<url>http://hadmgrndcc03-2.test.org:50070/webhdfs</url>
</service>
<service>
<role>WEBHCAT</role>
<url>http://{{webhcat_server_host}}:{{templeton_port}}/templeton</url>
</service>
<service>
<role>OOZIE</role>
<url>http://{{oozie_server_host}}:{{oozie_server_port}}/oozie</url>
</service>
<service>
<role>WEBHBASE</role>
<url>http://{{hbase_master_host}}:{{hbase_master_port}}</url>
</service>
<service>
<role>HIVE</role>
</service>
<service>
<role>RESOURCEMANAGER</role>
<url>http://{{rm_host}}:{{rm_port}}/ws</url>
</service>
</topology>
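A minimal smoke test of the topology above, assuming the same Knox host and topology name (default) used elsewhere in this thread and placeholder credentials; it only checks that Knox accepts the request and attempts the dispatch, not that Hive itself works.
# Sketch only: an HTTP status back from the gateway (401 without valid credentials, or an error
# status returned after dispatch) shows Knox routed the call; a connection failure points at Knox itself.
curl -k -s -o /dev/null -w '%{http_code}\n' -u test:password \
  "https://hadmgrndcc03-1.test.org:8443/gateway/default/hive"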
05-11-2018 05:52 AM
Hi, we are using HDP-2.3.4.0 with Knox 0.6.0. I followed the steps described in https://community.hortonworks.com/articles/72431/setup-knox-over-highly-available-hiveserver2-insta.html, but the Knox Hive HA configuration is not working. I am getting the following error:
beeline> !connect jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive
Connecting to jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive
Enter username for jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive: test
Enter password for jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive: *********************
18/05/11 00:29:35 [main]: WARN jdbc.Utils: ***** JDBC param deprecation *****
18/05/11 00:29:35 [main]: WARN jdbc.Utils: The use of hive.server2.transport.mode is deprecated.
18/05/11 00:29:35 [main]: WARN jdbc.Utils: Please use transportMode like so: jdbc:hive2://<host>:<port>/dbName;transportMode=<transport_mode_value>
18/05/11 00:29:35 [main]: WARN jdbc.Utils: ***** JDBC param deprecation *****
18/05/11 00:29:35 [main]: WARN jdbc.Utils: The use of hive.server2.thrift.http.path is deprecated.
18/05/11 00:29:35 [main]: WARN jdbc.Utils: Please use httpPath like so: jdbc:hive2://<host>:<port>/dbName;httpPath=<http_path_value>
Error: Could not open client transport with JDBC Uri: jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive: Could not create http connection to jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive. HTTP Response code: 500 (state=08S01,code=0)
0: jdbc:hive2://hadmgrndcc03-1.test.org:84 (closed)>
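As the warnings above suggest, the deprecated hive.server2.* query parameters can be written with transportMode/httpPath instead; a minimal sketch of the same connection in that form, assuming the same host, truststore, and gateway path, with placeholder credentials:
# Sketch only; functionally equivalent JDBC URL using the non-deprecated parameter names.
beeline -u "jdbc:hive2://hadmgrndcc03-1.test.org:8443/default;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test;transportMode=http;httpPath=gateway/default/hive" -n test -p password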
In gateway.log, I am getting the following error:
2018-05-11 00:29:37,471 INFO hadoop.gateway (AclsAuthorizationFilter.java:doFilter(85)) - Access Granted: true
2018-05-11 00:29:37,645 WARN hadoop.gateway (DefaultDispatch.java:executeOutboundRequest(129)) - Connection exception dispatching request: HIVE?user.name=testit org.apache.http.client.ClientProtocolException
org.apache.http.client.ClientProtocolException
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:186)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.http.ProtocolException: Target host is not specified
at org.apache.http.impl.conn.DefaultRoutePlanner.determineRoute(DefaultRoutePlanner.java:69)
at org.apache.http.impl.client.InternalHttpClient.determineRoute(InternalHttpClient.java:124)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:183)
... 84 more
2018-05-11 00:29:37,657 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(63)) - Failed to execute filter: java.io.IOException: Service connectivity error.
2018-05-11 00:29:37,657 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(63)) - Failed to execute filter: java.io.IOException: Service connectivity error.
2018-05-11 00:29:37,658 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(66)) - Failed to execute filter: javax.servlet.ServletException: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: Service connectivity error.
I have also tried the configuration described in https://community.hortonworks.com/content/supportkb/150624/beeline-via-knox-fails-with-target-host-is-not-spe.html, but I am still getting the same error. Is Hive HA supported in HDP-2.3.4.0? How can I resolve this? Please suggest. Thanks in advance.
05-09-2018 08:59 AM
@Vipin Rathor @Ancil McBarnett @Kevin Minder, any suggestions please?
05-09-2018 05:43 AM
@Felix Albani, I am not getting any errors in the HiveServer2 logs:
2018-05-08 11:26:42,453 INFO [main-SendThread(hadmgrndcc03-3.lifeway.org:2181)]: zookeeper.ClientCnxn (ClientCnxn.java:onConnected(1279)) - Session establishment complete on server hadmgrndcc03-3.lifeway.org/172.17.20.33:2181, sessionid = 0x3632acb1c590011, negotiated timeout = 40000
2018-05-08 11:26:42,461 INFO [main-EventThread]: state.ConnectionStateManager (ConnectionStateManager.java:postState(228)) - State change: CONNECTED
2018-05-08 11:26:42,527 INFO [main]: server.HiveServer2 (HiveServer2.java:addServerInstanceToZooKeeper(234)) - Created a znode on ZooKeeper for HiveServer2 uri: hadmgrndcc03-3.lifeway.org:10001
2018-05-08 11:26:42,701 INFO [Thread-9]: server.Server (Server.java:doStart(252)) - jetty-7.6.0.v20120127
2018-05-08 11:26:42,753 INFO [Thread-9]: handler.ContextHandler (ContextHandler.java:startContext(737)) - started o.e.j.s.ServletContextHandler{/,null}
2018-05-08 11:26:43,080 INFO [Thread-9]: ssl.SslContextFactory (SslContextFactory.java:doStart(297)) - Enabled Protocols [SSLv2Hello, TLSv1, TLSv1.1, TLSv1.2] of [SSLv2Hello, SSLv3, TLSv1, TLSv1.1, TLSv1.2]
2018-05-08 11:26:43,120 INFO [Thread-9]: server.AbstractConnector (AbstractConnector.java:doStart(333)) - Started SslSelectChannelConnector@0.0.0.0:10001
2018-05-08 11:26:43,120 INFO [Thread-9]: thrift.ThriftCLIService (ThriftHttpCLIService.java:run(141)) - Started ThriftHttpCLIService in https mode on port 10001 path=/cliservice/* with 5...500 worker threads
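The last two lines above show HiveServer2 listening in https mode on port 10001; a quick way to confirm the TLS endpoint and see the certificate Knox has to trust, a sketch only, using the hostname from the log above:
# Sketch only: prints the subject, issuer and validity dates of the certificate served on 10001.
echo -n | openssl s_client -connect hadmgrndcc03-3.lifeway.org:10001 2>/dev/null | openssl x509 -noout -subject -issuer -dates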
05-08-2018 04:00 PM
@Felix Albani, @Alex Miller, we have a HiveServer2 HA setup in our environment. I have done the following things:
Imported both HiveServer2 certificates into the Knox gateway.jks file:
echo -n | openssl s_client -connect hadmgrndcc03-2.test.org:10001 | sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > hadmgrndcc03-2.test.org.pem
echo -n | openssl s_client -connect hadmgrndcc03-3.test.org:10001 | sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > hadmgrndcc03-3.test.org.pem
keytool -import -alias hadmgrndcc03-3.test.org -file hadmgrndcc03-3.test.org.pem -keystore /var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks
keytool -import -alias hadmgrndcc03-2.test.org -file hadmgrndcc03-2.test.org.pem -keystore /var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks
Imported both HiveServer2 certificates into the cacerts file on the Knox machine:
keytool -import -alias hadmgrndcc03-3.test.org -file hadmgrndcc03-3.test.org.pem -keystore /usr/lib/jvm/java-8-oracle/jre/lib/security/cacerts
keytool -import -alias hadmgrndcc03-2.test.org -file hadmgrndcc03-2.test.org.pem -keystore /usr/lib/jvm/java-8-oracle/jre/lib/security/cacerts
On both HiveServer2 hosts, imported the Knox certificate into cacerts:
echo -n | openssl s_client -connect hadmgrndcc03-1.test.org:8443 | sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > knox.pem
keytool -import -alias knox -file knox.pem -keystore /usr/lib/jvm/java-8-oracle/jre/lib/security/cacerts
I am still getting the same error. How can I resolve it? Please suggest.
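A minimal verification sketch, assuming the keystore paths above and the default cacerts password ('changeit'): list both stores and confirm the HiveServer2 aliases are present.
# Sketch only; keytool prompts for the gateway.jks password, and the default 'changeit' is assumed for cacerts.
keytool -list -keystore /var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks | grep -i hadmgrndcc03
keytool -list -keystore /usr/lib/jvm/java-8-oracle/jre/lib/security/cacerts -storepass changeit | grep -i hadmgrndcc03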
05-08-2018 01:40 PM
Hi, before enabling SSL on Hive I was able to access Hive through Knox; after enabling SSL, I can no longer access Hive through Knox. I am getting the following error in beeline:
Beeline version 1.2.1.2.3.4.0-3485 by Apache Hive
beeline> !connect 'jdbc:hive2://hadmgrndcc03-1.test.org:8443/default/;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test123;transportMode=http;httpPath=gateway/default/hive'
Connecting to jdbc:hive2://hadmgrndcc03-1.test.org:8443/default/;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test123;transportMode=http;httpPath=gateway/default/hive
Enter username for jdbc:hive2://hadmgrndcc03-1.test.org:8443/default/;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test123;transportMode=http;httpPath=gateway/default/hive: guest
Enter password for jdbc:hive2://hadmgrndcc03-1.test.org:8443/default/;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test123;transportMode=http;httpPath=gateway/default/hive: **************
Error: Could not open client transport with JDBC Uri: jdbc:hive2://hadmgrndcc03-1.test.org:8443/default/;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test123;transportMode=http;httpPath=gateway/default/hive: Could not create http connection to jdbc:hive2://hadmgrndcc03-1.test.org:8443/default/;ssl=true;sslTrustStore=/var/lib/knox/data-2.3.4.0-3485/security/keystores/gateway.jks;trustStorePassword=test123;transportMode=http;httpPath=gateway/default/hive. HTTP Response code: 500 (state=08S01,code=0)
0: jdbc:hive2://hadmgrndcc03-1.test.org:84 (closed)>
Also, I am getting the following error in the Knox gateway log:
2018-05-08 08:32:12,279 INFO hadoop.gateway (AclsAuthorizationFilter.java:doFilter(85)) - Access Granted: true
2018-05-08 08:32:12,737 WARN hadoop.gateway (DefaultDispatch.java:executeOutboundRequest(129)) - Connection exception dispatching request: http://hadmgrndcc03-3.test.org:10001/cliservice?user.name=guest org.apache.http.NoHttpResponseException: hadmgrndcc03-3.test.org:10001 failed to respond
org.apache.http.NoHttpResponseException: hadmgrndcc03-3.test.org:10001 failed to respond
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:143)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:57)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:260)
at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:161)
at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:153)
2018-05-08 08:32:12,767 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(63)) - Failed to execute filter: java.io.IOException: Service connectivity error.
2018-05-08 08:32:12,767 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(63)) - Failed to execute filter: java.io.IOException: Service connectivity error.
2018-05-08 08:32:12,768 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(66)) - Failed to execute filter: javax.servlet.ServletException: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: Service connectivity error.
2018-05-08 08:32:12,776 ERROR hadoop.gateway (AbstractGatewayFilter.java:doFilter(66)) - Failed to execute filter: javax.servlet.ServletException: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: Service connectivity error.
2018-05-08 08:32:12,776 ERROR hadoop.gateway (GatewayFilter.java:doFilter(135)) - Gateway processing failed: javax.servlet.ServletException: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: Service connectivity error.
javax.servlet.ServletException: org.apache.shiro.subject.ExecutionException: java.security.PrivilegedActionException: java.io.IOException: Service connectivity error.
at org.apache.shiro.web.servlet.AdviceFilter.cleanup(AdviceFilter.java:196)
How can I resolve this? Please suggest. Thanks in advance.
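A minimal check of the symptom in the log above, assuming the same HiveServer2 host: the dispatch URL is plain http, so the same /cliservice path can be probed over both schemes to confirm that only the TLS endpoint answers once SSL is enabled.
# Sketch only: expect a TLS handshake plus some HTTP status over https, and no usable
# response (curl reports 000) over plain http after SSL is enabled on HiveServer2.
curl -k -s -o /dev/null -w 'https: %{http_code}\n' https://hadmgrndcc03-3.test.org:10001/cliservice
curl -s -o /dev/null -w 'http:  %{http_code}\n' http://hadmgrndcc03-3.test.org:10001/cliservice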
Labels:
- Apache Hive
- Apache Knox