Created 12-28-2016 03:29 AM
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/application_timeline_server.py", line 155, in <module>
    ApplicationTimelineServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/application_timeline_server.py", line 44, in start
    self.configure(env) # FOR SECURITY
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/application_timeline_server.py", line 55, in configure
    yarn(name='apptimelineserver')
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/yarn.py", line 337, in yarn
    mode=0755
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 459, in action_create_on_execute
    self.action_delayed("create")
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 456, in action_delayed
    self.get_hdfs_resource_executor().action_delayed(action_name, self)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 247, in action_delayed
    self._assert_valid()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 231, in _assert_valid
    self.target_status = self._get_file_status(target)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 292, in _get_file_status
    list_status = self.util.run_command(target, 'GETFILESTATUS', method='GET', ignore_status_codes=['404'], assertable_result=False)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 192, in run_command
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X GET --negotiate -u : 'http://bigdata013.example.com:50070/webhdfs/v1/ats/done?op=GETFILESTATUS&user.name=hdfs'' returned status_code=403.

HTTP ERROR 403
Problem accessing /webhdfs/v1/ats/done. Reason:
GSSException: Failure unspecified at GSS-API level (Mechanism level: Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled)
Created 12-28-2016 06:17 AM
Are you using the Oracle (Sun) JDK? If so, you will have to install the JCE unlimited-strength policy files for encryption.
Please check the link below, which says: "Before enabling Kerberos in the cluster, you must deploy the Java Cryptography Extension (JCE) security policy files on the Ambari Server and on all hosts in the cluster."
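For reference, a minimal sketch of deploying the JCE policy files, assuming an Oracle JDK 8 under a hypothetical `/usr/jdk64/jdk1.8.0_77` and the `jce_policy-8.zip` archive already downloaded from Oracle (adjust JAVA_HOME to your environment, and repeat on the Ambari Server and every cluster host):

```shell
# Hypothetical JDK path -- set this to the JDK your cluster actually uses.
export JAVA_HOME=/usr/jdk64/jdk1.8.0_77

# -j drops the folder inside the zip so the jars land directly in lib/security;
# this overwrites local_policy.jar and US_export_policy.jar with the
# unlimited-strength versions.
unzip -o -j -q jce_policy-8.zip -d "$JAVA_HOME/jre/lib/security/"

# Sanity check: both policy jars should now be present.
ls "$JAVA_HOME/jre/lib/security/"*policy*.jar
```

After deploying the jars, restart the affected services so the JVMs pick up the new policy files.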
Created 12-28-2016 09:14 AM
Yes, JCE is installed, and this issue still occurs.
Created 12-28-2016 09:25 AM
Also, can you please share the output of the following command, "klist -e -k /etc/security/keytabs/hdfs.headless.keytab", to see the encryption types used by the Kerberos tickets?
# klist -e -k /etc/security/keytabs/hdfs.headless.keytab
Keytab name: FILE:/etc/security/keytabs/hdfs.headless.keytab
KVNO Principal
---- --------------------------------------------------------------------------
   4 hdfs-JoyCluster@EXAMPLE.COM (des3-cbc-sha1)
   4 hdfs-JoyCluster@EXAMPLE.COM (arcfour-hmac)
   4 hdfs-JoyCluster@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   4 hdfs-JoyCluster@EXAMPLE.COM (des-cbc-md5)
   4 hdfs-JoyCluster@EXAMPLE.COM (aes256-cts-hmac-sha1-96)
- We should see (aes256-cts-hmac-sha1-96) listed among the keytab's encryption types.
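Since the keytab already carries an AES-256 key, it is also worth confirming that the JVM itself permits AES-256 (i.e. that the JCE policy files are actually active). One hedged way to check, assuming a JDK 8-era Java where `jrunscript` is still shipped:

```shell
# Prints the maximum allowed AES key length for this JVM.
# With the unlimited-strength JCE policies installed this prints
# 2147483647; with the default restricted policies it prints 128.
jrunscript -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"))'
```

If this reports 128 on any host, that host's JDK is still running with the restricted default policies and WebHDFS SPNEGO will fail exactly as shown in the traceback above.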
Created 12-28-2016 09:36 AM
OK, wait a moment. Thank you.
Created 12-28-2016 09:44 AM
Sorry, I am reinstalling Kerberos, so it will take some time, but I have encountered an issue as follows:
2016-12-28 17:41:38,853 - Failed to create principal, hdpcluster-122816@EXAMPLE.COM - Failed to create service principal for hdpcluster-122816@EXAMPLE.COM
STDOUT: Authenticating as principal admin/admin@EXAMPLE.COM with password.
Password for admin/admin@EXAMPLE.COM:
Enter password for principal "hdpcluster-122816@EXAMPLE.COM":
Re-enter password for principal "hdpcluster-122816@EXAMPLE.COM":
STDERR: WARNING: no policy specified for hdpcluster-122816@EXAMPLE.COM; defaulting to no policy
add_principal: Operation requires ``add'' privilege while creating "hdpcluster-122816@EXAMPLE.COM".
Created 12-28-2016 09:58 AM
After creating the admin principal as follows:
kadmin.local -q "addprinc admin/admin"
- Have you added the wildcard entry to the "/var/kerberos/krb5kdc/kadm5.acl" file, as follows?
*/admin@EXAMPLE.COM *
- Then restart "kadmin":
/etc/rc.d/init.d/kadmin restart
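Putting the steps above together, a sketch for an MIT KDC on RHEL/CentOS 6; the paths match that layout, and the test principal name is just an illustration:

```shell
# Create the admin principal on the KDC host (prompts for a password).
kadmin.local -q "addprinc admin/admin"

# Grant all */admin principals full kadmin privileges.
# NOTE: check for an existing entry first to avoid duplicates.
echo '*/admin@EXAMPLE.COM *' >> /var/kerberos/krb5kdc/kadm5.acl

# Restart kadmind so the ACL change takes effect (SysV init on RHEL/CentOS 6).
/etc/rc.d/init.d/kadmin restart

# Verify the admin principal can now add principals
# ("testprinc" is a hypothetical name used only for this check).
kadmin -p admin/admin@EXAMPLE.COM -q "addprinc -randkey testprinc"
```

If the last command succeeds without the "Operation requires ``add'' privilege" error, Ambari should be able to create its service principals.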
Created 07-21-2017 06:30 AM
Hey, were you able to resolve this issue?
Created 09-21-2018 11:44 AM
I faced this problem too. The Timeline API cannot be accessed after enabling Kerberos. How can it be solved? Thank you.
Created 09-21-2018 05:51 PM
Hi @nur majid,
It's always better to create a new thread, along with a screenshot of your issue and some logs. I see this thread is very old and inactive.