Member since: 09-30-2018
Posts: 10
Kudos Received: 0
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
| | 2863 | 10-29-2020 10:08 AM |
11-09-2020 10:20 PM
Thanks, I'm able to access the Hadoop CLI after commenting out the line.
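For anyone landing here later: the post doesn't quote which line was commented out, but based on the linked KB article ("Certain Java versions cannot read credentials cache") it is most likely the `default_ccache_name` setting in /etc/krb5.conf (an assumption, since the exact line isn't shown):

```ini
# /etc/krb5.conf -- [libdefaults] section (sketch; the rest of the file will differ)
[libdefaults]
  default_realm = FBSPL.COM
  # Commenting this out makes kinit fall back to a FILE: ticket cache, which
  # all JDKs can read; some Java versions cannot read the kernel KEYRING cache:
  # default_ccache_name = KEYRING:persistent:%{uid}
```

After the change, run kinit again so a fresh ticket lands in the FILE-based cache.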
11-03-2020 06:04 AM
I didn't upgrade Java. Anyway, I reinstalled the JCE jars, but the issue remained the same. No luck.
11-02-2020 11:41 PM
@ChethanYM I'm unable to access the link you shared. I first get an access-denied error, and then it opens a 404 page.
11-02-2020 11:14 PM
Hi all, I've followed this tutorial: CDH Hadoop Kerberos. The NameNode and DataNodes start properly, and I can see all the DataNodes listed in the WebUI (0.0.0.0:50070), but I'm unable to access the Hadoop CLI. I've also followed Certain Java versions cannot read credentials cache, and I still can't use the Hadoop CLI.
[root@local9 hduser]# hadoop fs -ls /
20/11/03 12:24:32 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
20/11/03 12:24:32 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
20/11/03 12:24:32 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "local9/192.168.2.9"; destination host is: "local9":8020;
[root@local9 hduser]# klist
Ticket cache: KEYRING:persistent:0:krb_ccache_hVEAjWz
Default principal: hdfs/local9@FBSPL.COM
Valid starting Expires Service principal
11/03/2020 12:22:42 11/04/2020 12:22:42 krbtgt/FBSPL.COM@FBSPL.COM
renew until 11/10/2020 12:22:12
[root@local9 hduser]# kinit -R
[root@local9 hduser]# klist
Ticket cache: KEYRING:persistent:0:krb_ccache_hVEAjWz
Default principal: hdfs/local9@FBSPL.COM
Valid starting Expires Service principal
11/03/2020 12:24:50 11/04/2020 12:24:50 krbtgt/FBSPL.COM@FBSPL.COM
renew until 11/10/2020 12:22:12
[root@local9 hduser]# hadoop fs -ls /
20/11/03 12:25:04 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
20/11/03 12:25:04 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
20/11/03 12:25:04 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "local9/192.168.2.9"; destination host is: "local9":8020;
Any help would be greatly appreciated.
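One detail worth testing here: `kinit -R` renews the ticket inside the kernel KEYRING cache shown by klist, but certain Java versions cannot read a KEYRING: credentials cache at all, which matches the "Failed to find any Kerberos tgt" symptom. A quick way to check that theory (a sketch; the cache path is arbitrary, and the kinit/hadoop steps are left as comments because they need a live KDC):

```shell
# Switch this shell to a FILE-based Kerberos ticket cache; some JDKs cannot
# read the kernel KEYRING cache that klist reports above.
export KRB5CCNAME=FILE:/tmp/krb5cc_hdfs_test
echo "$KRB5CCNAME"
# With the cache switched over, re-obtain a ticket and retry (needs a live KDC):
#   kinit hdfs/local9@FBSPL.COM
#   hadoop fs -ls /
```

If the command works with a FILE: cache, the permanent fix is to stop defaulting to the keyring cache in /etc/krb5.conf.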
Labels:
- Apache Hadoop
- Kerberos
10-29-2020 10:08 AM
Thanks for the reply. I found the issue: the Kerberos setup was fine; the only thing missing was providing the Kerberos principal and keytab path in IMPALA_CATALOG_ARGS. The CDH documentation I followed (CDH Impala Kerberos, step 7: "Add Kerberos options to the Impala defaults file, /etc/default/impala. Add the options for both the impalad and statestored daemons, using the IMPALA_SERVER_ARGS and IMPALA_STATE_STORE_ARGS variables") only mentions updating IMPALA_STATE_STORE_ARGS and IMPALA_SERVER_ARGS, which is why the catalog server was not authenticating with Kerberos. After adding the Kerberos principal and keytab path there as well, I was able to start the catalog server without any issues.
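A sketch of the stanza that fixed it, with flag values mirroring the IMPALA_SERVER_ARGS quoted elsewhere in this thread (the principal and keytab path are specific to this environment; adjust for yours):

```shell
# /etc/default/impala -- the stanza that was missing (sketch; flag values
# mirror the IMPALA_SERVER_ARGS used in this setup)
IMPALA_CATALOG_ARGS=" -log_dir=${IMPALA_LOG_DIR} \
    -kerberos_reinit_interval=60 \
    -principal=impala/local9@FBSPL.COM \
    -keytab_file=/etc/impala/conf/impala-http.keytab"
```

With this in place, catalogd picks up the same credentials as impalad and statestored, and the "Internal communication is not authenticated" log lines should disappear after a restart.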
10-29-2020 07:36 AM
Hi, above I've already listed all the principals present in impala.keytab.
klist -e -t -k /etc/impala/conf/impala-http.keytab
Keytab name: FILE:/etc/impala/conf/impala-http.keytab
KVNO Timestamp Principal
---- ------------------- ------------------------------------------------------
2 10/28/2020 19:21:47 impala/local9@FBSPL.COM (aes256-cts-hmac-sha1-96)
2 10/28/2020 19:21:47 impala/local9@FBSPL.COM (aes128-cts-hmac-sha1-96)
2 10/28/2020 19:21:47 HTTP/local9@FBSPL.COM (aes256-cts-hmac-sha1-96)
2 10/28/2020 19:21:47 HTTP/local9@FBSPL.COM (aes128-cts-hmac-sha1-96)
Output from catalog.INFO:
Log file created at: 2020/10/29 17:29:11
Running on machine: local9
Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
I1029 17:29:11.247231 30770 logging.cc:120] stdout will be logged to this file.
E1029 17:29:11.247392 30770 logging.cc:121] stderr will be logged to this file.
I1029 17:29:11.247609 30770 minidump.cc:231] Setting minidump size limit to 20971520.
I1029 17:29:11.249019 30770 authentication.cc:1093] Internal communication is not authenticated
I1029 17:29:11.249027 30770 authentication.cc:1114] External communication is not authenticated
I1029 17:29:11.249331 30770 init.cc:224] catalogd version 2.11.0-cdh5.14.2 RELEASE (build ed85dce709da9557aeb28be89e8044947708876c)
Built on Tue Mar 27 13:39:48 PDT 2018
I1029 17:29:11.249336 30770 init.cc:225] Using hostname: local9
I1029 17:29:11.249737 30770 logging.cc:156] Flags (see also /varz are on debug webserver):
Is this okay?
10-29-2020 05:41 AM
Hello all, I've recently decided to enable Kerberos on all the services used in my company. I've successfully enabled Kerberos on ZooKeeper, Kafka, Hadoop, and HBase, but when I try to enable Kerberos on the Hive metastore and Impala, I get the following errors. I've followed these guides: CDH Impala Kerberos, CDH Hiveserver2 Security, CDH Hive Metastore Security.
hive-metastore.log:
ERROR [pool-4-thread-3]: server.TThreadPoolServer (TThreadPoolServer.java:run(297)) - Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:794)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:791)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1900)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:791)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
... 10 more
catalogd.ERROR:
E1029 17:31:06.843065 30770 TSaslTransport.java:296] SASL negotiation failure
Java exception follows:
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:464)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:244)
at sun.reflect.GeneratedConstructorAccessor8.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1560)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:73)
at org.apache.impala.catalog.MetaStoreClientPool$MetaStoreClient.<init>(MetaStoreClientPool.java:93)
at org.apache.impala.catalog.MetaStoreClientPool$MetaStoreClient.<init>(MetaStoreClientPool.java:72)
at org.apache.impala.catalog.MetaStoreClientPool.initClients(MetaStoreClientPool.java:168)
at org.apache.impala.catalog.Catalog.<init>(Catalog.java:103)
at org.apache.impala.catalog.CatalogServiceCatalog.<init>(CatalogServiceCatalog.java:163)
at org.apache.impala.service.JniCatalog.<init>(JniCatalog.java:106)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:162)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:189)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
... 24 more
W1029 17:31:06.843554 30770 HiveMetaStoreClient.java:474] Failed to connect to the MetaStore Server...
W1029 17:31:07.844949 30770 MetaStoreClientPool.java:101] Failed to connect to Hive MetaStore. Retrying.
hive-site.xml:
<property>
<name>hive.server2.authentication</name>
<value>KERBEROS</value>
</property>
<property>
<name>hive.server2.authentication.kerberos.principal</name>
<value>hive/local9@FBSPL.COM</value>
</property>
<property>
<name>hive.server2.authentication.kerberos.keytab</name>
<value>/etc/hive/conf/hive.keytab</value>
</property>
<property>
<name>hive.metastore.sasl.enabled</name>
<value>true</value>
</property>
<property>
<name>hive.metastore.kerberos.keytab.file</name>
<value>/etc/hive/conf/hive.keytab</value>
</property>
<property>
<name>hive.metastore.kerberos.principal</name>
<value>hive/local9@FBSPL.COM</value>
</property>
<property>
<name>hive.server2.enable.impersonation</name>
<description>Enable user impersonation for HiveServer2</description>
<value>true</value>
</property>
/etc/default/impala:
IMPALA_STATE_STORE_ARGS=" -log_dir=${IMPALA_LOG_DIR} \
-kerberos_reinit_interval=60 \
-principal=impala/local9@FBSPL.COM \
-keytab_file=/etc/impala/conf/impala-http.keytab \
-state_store_port=${IMPALA_STATE_STORE_PORT}"
IMPALA_SERVER_ARGS=" \
-log_dir=${IMPALA_LOG_DIR} \
-catalog_service_host=${IMPALA_CATALOG_SERVICE_HOST} \
-state_store_port=${IMPALA_STATE_STORE_PORT} \
-use_statestore \
-state_store_host=${IMPALA_STATE_STORE_HOST} \
-kerberos_reinit_interval=60 \
-principal=impala/local9@FBSPL.COM \
-keytab_file=/etc/impala/conf/impala-http.keytab \
-be_port=${IMPALA_BACKEND_PORT}"
Keytab file permission and ownership:
-r--r-----. 1 hive hadoop 146 Oct 29 12:36 /etc/hive/conf/hive.keytab
-r--------. 1 impala impala 294 Oct 28 19:22 /etc/impala/conf/impala-http.keytab
Keytab principals:
klist -e -t -k /etc/hive/conf/hive.keytab
Keytab name: FILE:/etc/hive/conf/hive.keytab
KVNO Timestamp Principal
---- ------------------- ------------------------------------------------------
1 10/29/2020 12:36:48 hive/local9@FBSPL.COM (aes256-cts-hmac-sha1-96)
1 10/29/2020 12:36:48 hive/local9@FBSPL.COM (aes128-cts-hmac-sha1-96)
klist -e -t -k /etc/impala/conf/impala-http.keytab
Keytab name: FILE:/etc/impala/conf/impala-http.keytab
KVNO Timestamp Principal
---- ------------------- ------------------------------------------------------
2 10/28/2020 19:21:47 impala/local9@FBSPL.COM (aes256-cts-hmac-sha1-96)
2 10/28/2020 19:21:47 impala/local9@FBSPL.COM (aes128-cts-hmac-sha1-96)
2 10/28/2020 19:21:47 HTTP/local9@FBSPL.COM (aes256-cts-hmac-sha1-96)
2 10/28/2020 19:21:47 HTTP/local9@FBSPL.COM (aes128-cts-hmac-sha1-96)
Any help would be greatly appreciated.
Labels: