Member since: 09-10-2016
Posts: 82
Kudos Received: 6
Solutions: 9
My Accepted Solutions
Title | Views | Posted
---|---|---
| 6443 | 08-28-2019 11:07 AM
| 5952 | 12-21-2018 05:59 PM
| 3088 | 12-10-2018 05:16 PM
| 2579 | 12-10-2018 01:03 PM
| 1690 | 12-07-2018 08:04 AM
02-11-2019
03:20 PM
It's just a workaround, @Geoffrey Shelton Okot. Thanks.
02-11-2019
06:17 AM
Hi @Geoffrey Shelton Okot, Thanks for your time. I have set the below two properties in core-site.xml from Ambari, and now the NN, RM, and History Server UIs are working fine.
hadoop.http.authentication.simple.anonymous.allowed=true
hadoop.http.authentication.type=simple
Regards, Sampath
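For anyone hitting the same thing, a quick way to confirm the values were actually picked up on a node after the Ambari restart is to read them back with the standard getconf tool (a sketch; the output lines show the expected values, not a capture from this cluster):
$hdfs getconf -confKey hadoop.http.authentication.type                      # should print: simple
$hdfs getconf -confKey hadoop.http.authentication.simple.anonymous.allowed  # should print: true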
02-09-2019
03:01 PM
Hi @Geoffrey Shelton Okot, Thanks for the response. I have updated krb5.conf with the below properties:
# grep "enctypes" /etc/krb5.conf
default_tgs_enctypes = des3-cbc-sha1 aes256-cts-hmac-sha1-96 arcfour-hmac aes128-cts-hmac-sha1-96 des-cbc-md5
default_tkt_enctypes = des3-cbc-sha1 aes256-cts-hmac-sha1-96 arcfour-hmac aes128-cts-hmac-sha1-96 des-cbc-md5
# klist -aef
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: HTTP/hostname_fqdn@realm
Valid starting Expires Service principal
02/09/2019 14:44:22 02/10/2019 00:44:22 krbtgt/realm@realm
renew until 02/16/2019 14:44:22, Flags: FRIA
Etype (skey, tkt): aes256-cts-hmac-sha1-96, aes256-cts-hmac-sha1-96
Addresses: (none)
I don't have access to check the encryption types mapped on the AD server. Is there any way I can check this from my Linux host? Thank you.
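One indirect check from the Linux side is to request a service ticket for the HTTP principal and look at which encryption type the AD KDC actually issues for it (a sketch using only the MIT Kerberos client tools; it assumes the host's FQDN matches the principal in the spnego keytab shown above):
$kinit -kt /etc/security/keytabs/spnego.service.keytab HTTP/$(hostname -f)
$kvno HTTP/$(hostname -f)    # asks the KDC for a service ticket and prints the current key version number
$klist -e                    # the "Etype (skey, tkt)" line shows the enctype AD used for that service key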
02-09-2019
01:20 PM
Hi, [Ambari 2.7.3, HDP 3.1] In an Active Directory Kerberized environment, I'm getting the below error when I try to access the NameNode UI, RM UI, and Job History UI from Ambari.
Error:
HTTP ERROR 403
problem accessing /index.html. Reason:
GSSException: Failure unspecified at GSS-API level (Mechanism level: Invalid argument (400) - Cannot find key of appropriate type to decrypt AP REP - AES256 CTS mode with HMAC SHA1-96)
krb5.conf:
max_life = 30d
default_tgs_enctypes = aes128-cts arcfour-hmac-md5 des-cbc-crc des-cbc-md5 des-hmac-sha1 aes256-cts
default_tkt_enctypes = aes128-cts arcfour-hmac-md5 des-cbc-crc des-cbc-md5 des-hmac-sha1 aes256-cts
permitted_enctypes = aes256-cts-hmac-sha1-96 des3-cbc-sha1 arcfour-hmac-md5 des-cbc-crc des-cbc-md5 des-cbc-md4
allow_weak_crypto = yes
klist:
$ls -lrt /etc/security/keytabs/spnego.service.keytab
-r--r-----. 1 root hadoop 433 Feb 9 11:59 /etc/security/keytabs/spnego.service.keytab
$klist -ket /etc/security/keytabs/spnego.service.keytab
Keytab name: FILE:/etc/security/keytabs/spnego.service.keytab
KVNO Timestamp Principal
---- ------------------- ------------------------------------------------------
0 02/09/2019 07:40:04 HTTP/hostname_fqdn@realm (arcfour-hmac)
0 02/09/2019 07:40:04 HTTP/hostname_fqdn@realm (des-cbc-md5)
0 02/09/2019 07:40:04 HTTP/hostname_fqdn@realm (aes256-cts-hmac-sha1-96)
0 02/09/2019 07:40:04 HTTP/hostname_fqdn@realm (des3-cbc-sha1)
0 02/09/2019 07:40:04 HTTP/hostname_fqdn@realm (aes128-cts-hmac-sha1-96)
kinit:
$kinit -kt /etc/security/keytabs/spnego.service.keytab $(klist -kt /etc/security/keytabs/spnego.service.keytab|sed -n "4p"|cut -d" " -f7)
# klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: HTTP/hostname_fqdn@realm
Valid starting Expires Service principal
02/09/2019 12:53:05 02/09/2019 22:53:05 krbtgt/realm@realm
renew until 02/16/2019 12:53:05
I have regenerated the spnego keytab on all the hosts from the Ambari UI, but it did not help. Could you please help with this? Thank you.
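One more data point that can help narrow this down is testing SPNEGO directly with curl after kinit-ing with the spnego keytab (a sketch; run it on the UI host, and adjust the port to your configuration, e.g. 50070 or 9870 for the NameNode):
$kinit -kt /etc/security/keytabs/spnego.service.keytab HTTP/$(hostname -f)
$curl --negotiate -u : -s -o /dev/null -w "%{http_code}\n" "http://$(hostname -f):50070/index.html"   # 200 means the negotiation works; 403 points back at a key version / enctype mismatch between the keytab and what AD issues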
Labels:
- Apache Hadoop
- Cloudera Manager
02-07-2019
06:44 AM
Hi, I'm getting the below issue while starting Timeline Service V2.0 in HDP 3.1:
org.apache.zookeeper.KeeperException$NoAuthException: KeeperErrorCode = NoAuth for /atsv2-hbase-secure1/tokenauth/keys
at org.apache.zookeeper.KeeperException.create(KeeperException.java:113)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
at org.apache.zookeeper.ZooKeeper.create(ZooKeeper.java:783)
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.createNonSequential(RecoverableZooKeeper.java:549)
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.create(RecoverableZooKeeper.java:528)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.createWithParents(ZKUtil.java:1199)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.createWithParents(ZKUtil.java:1177)
2019-02-07 06:10:29,463 INFO [main] client.RpcRetryingCallerImpl: Call exception, tries=6, retries=36, started=6318 ms ago, cancelled=false, msg=org.apache.hadoop.hbase.ipc.ServerNotRunningYetException: Server sl73caeapd044.visa.com,17020,1549478310629 is not running yet
at org.apache.hadoop.hbase.regionserver.RSRpcServices.checkOpen(RSRpcServices.java:1487)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2443)
at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:131)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
, details=row 'prod.timelineservice.entity' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=sl73caeapd044.visa.com,17020,1549462163537, seqNum=-1
2019-02-07 06:10:33,487 INFO [main] client.RpcRetryingCallerImpl: Call exception, tries=7, retries=36, started=10342 ms ago, cancelled=false, msg=org.apache.hadoop.hbase.ipc.ServerNotRunningYetException: Server sl73caeapd044.visa.com,17020,1549478310629 is not running yet
at org.apache.hadoop.hbase.regionserver.RSRpcServices.checkOpen(RSRpcServices.java:1487)
Please help with this. Thank you. Regards, Sampath
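For anyone debugging the NoAuth part, a quick check is to look at the ACL actually set on the znode, since NoAuth usually means the connecting identity is not the one the ACL grants create/write to (a sketch; the client path assumes a standard HDP layout, and in a Kerberized cluster you may also need to kinit as the ATS/HBase service principal and have a client JAAS config for the SASL login):
$/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server zk_hostname:2181
getAcl /atsv2-hbase-secure1
getAcl /atsv2-hbase-secure1/tokenauth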
Labels:
- Apache YARN
02-04-2019
05:16 PM
Hi, To identify the ZooKeeper leader, I can use the below command in the CLI:
$echo stat | nc zk_hostname 2181 | grep Mode
Mode: follower
Is it possible to achieve this using REST API calls? Thank you. Regards, Sampath
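Newer ZooKeeper releases (3.5 and later) ship an embedded AdminServer (HTTP, default port 8080) whose /commands/stat output includes the server state, which is the closest thing to a REST call. In the meantime, the same four-letter-word check above can simply be scripted across the ensemble to find the leader (a sketch; zk1/zk2/zk3 are placeholder hostnames):
$for zk in zk1 zk2 zk3; do echo -n "$zk "; echo stat | nc "$zk" 2181 | grep Mode; done   # the host that prints "Mode: leader" is the leader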
Labels:
- Apache Zookeeper
12-26-2018
08:11 AM
Hi @Rajeswaran Govindan, Since Ambari is running as a non-privileged user, it is possible that the chown of the keytab file failed due to permission issues. Make sure that the sudoers file is set up properly; please refer to the documentation below. http://docs.hortonworks.com/HDPDocuments/Ambari-2.4.2.0/bk_ambari-security/content/sudoer_configuration_server.html Hope this helps!
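To quickly verify both sides of this (a sketch; it assumes the Ambari agent runs as a user named ambari and that keytabs live under the default /etc/security/keytabs):
$sudo -l -U ambari              # list the sudo rules granted to the non-root agent user
$ls -l /etc/security/keytabs    # confirm the keytabs ended up owned by the expected service users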
12-26-2018
07:37 AM
Hi Vinay, From HDP 3.0 onwards, to work with Hive databases from Spark you should use the HiveWarehouseConnector library. Please refer to the documentation below. https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.0.0/integrating-hive/content/hive_configure_a_spark_hive_connection.html Hope this helps!
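For illustration only, launching a shell with the connector typically looks something like this (a sketch; the jar path follows the usual HDP layout and the JDBC URL is a placeholder, so check the linked doc for the exact properties your version requires):
$spark-shell --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-*.jar \
  --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://hiveserver2_host:10000/"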
12-21-2018
05:59 PM
Hi @IMRAN KHAN, Could you please check that the repos are correct on the host where the installation is failing:
# grep 'baseurl' /etc/yum.repos.d/* | grep -i HDP
Try cleaning the yum cache by running:
# yum clean all
Please also check whether multiple "ambari-hdp-<repoid>.repo" files are present inside "/etc/yum.repos.d/". If so, move the unwanted files to a backup folder. Then try installing the "hdp-select" package from the failing host:
# yum install hdp-select -y
Hope this helps!
12-20-2018
01:51 PM
Hi @Daniel Hernández, Would you please try the below command:
# /bin/hbase hbck -fixAssignments "hbase:acl"
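If you want to see what is wrong before changing anything, hbck can be run in plain check mode first (a sketch; without any -fix option it only reports inconsistencies and does not modify the table):
# /bin/hbase hbck "hbase:acl"   # read-only check; look for inconsistency / unassigned-region messages before running -fixAssignments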