
HDP 3.1 Cluster with kerberos. HTTP principal missing. HDFS and YARN fail due to 'Failure unspecified at GSS-API level (Mechanism level: Checksum failed)'



I deployed HDP 3.1 with Ambari and connected it to an Active Directory. Now I want to kerberize the cluster. The first problem is that the HTTP principal is not created automatically, so I created the HTTP principal manually. But now the services do not start, because there is a problem with the spnego keytab file.
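For reference, creating the HTTP/SPNEGO principal and keytab by hand looks roughly like the sketch below. This assumes an MIT-style kadmin; with AD as the KDC the principal is created on the directory side instead (e.g. via ktpass on a domain controller), and the keytab location/permissions follow the usual HDP layout:

  # Create the per-host HTTP service principal with a random key
  kadmin -q "addprinc -randkey HTTP/master01.hadoop.know-center.at@HADOOP.KNOW-CENTER.AT"
  # Export its key into the spnego service keytab
  kadmin -q "ktadd -k /etc/security/keytabs/spnego.service.keytab HTTP/master01.hadoop.know-center.at@HADOOP.KNOW-CENTER.AT"
  # Standard HDP ownership/permissions for the spnego keytab
  chown root:hadoop /etc/security/keytabs/spnego.service.keytab
  chmod 440 /etc/security/keytabs/spnego.service.keytab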



2019-07-11 16:14:38,962 - The NameNode is still in Safemode. Please be careful with commands that need Safemode OFF.
Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 287, in _run_command
    result_dict = json.loads(out)
  File "/usr/lib/ambari-agent/lib/ambari_simplejson/__init__.py", line 307, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/ambari-agent/lib/ambari_simplejson/decoder.py", line 335, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/ambari-agent/lib/ambari_simplejson/decoder.py", line 353, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
The above exception was the cause of the following exception: 
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HDFS/package/scripts/namenode.py", line 408, in <module>
    NameNode().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute

  File "/usr/lib/ambari-agent/lib/resource_management/libraries/providers/hdfs_resource.py", line 295, in _run_command
    raise WebHDFSCallException(err_msg, result_dict)
resource_management.libraries.providers.hdfs_resource.WebHDFSCallException: Execution of 'curl -sS -L -w '%{http_code}' -X GET -d '' -H 'Content-Length: 0' --negotiate -u : 'http://master01.hadoop.know-center.at:50070/webhdfs/v1/tmp?op=GETFILESTATUS'' returned status_code=403.
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 403 GSSException: Failure unspecified at GSS-API level (Mechanism level: Checksum failed)</title>
</head>
<body><h2>HTTP ERROR 403</h2>
<p>Problem accessing /webhdfs/v1/tmp. Reason:
<pre>    GSSException: Failure unspecified at GSS-API level (Mechanism level: Checksum failed)</pre></p>
</body>
</html>


2019-07-11 16:14:25,642 - Retrying after 10 seconds. Reason: Execution of '/usr/hdp/current/hadoop-hdfs-namenode/bin/hdfs dfsadmin -fs hdfs://master01.hadoop.know-center.at:8020 -safemode get | grep 'Safe mode is OFF'' returned 1.
2019-07-11 16:14:38,962 - The NameNode is still in Safemode. Please be careful with commands that need Safemode OFF.
2019-07-11 16:14:38,965 - HdfsResource['/tmp'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/3.1.0.0-78/hadoop/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://master01.hadoop.know-center.at:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs-kchadoop@HADOOP.KNOW-CENTER.AT', 'user': 'hdfs', 'owner': 'hdfs', 'nameservices': None, 'hadoop_conf_dir': '/usr/hdp/3.1.0.0-78/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0777}
2019-07-11 16:14:38,971 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-kchadoop@HADOOP.KNOW-CENTER.AT'] {'user': 'hdfs'}
2019-07-11 16:14:39,091 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' --negotiate -u : '"'"'http://master01.hadoop.know-center.at:50070/webhdfs/v1/tmp?op=GETFILESTATUS'"'"' 1>/tmp/tmpuF0nCL 2>/tmp/tmpnp5ug2''] {'logoutput': None, 'quiet': False}
2019-07-11 16:14:41,481 - call returned (0, '')
2019-07-11 16:14:41,482 - get_user_call_output returned (0, u'<html>\n<head>\n<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>\n<title>Error 403 GSSException: Failure unspecified at GSS-API level (Mechanism level: Checksum failed)</title>\n</head>\n<body><h2>HTTP ERROR 403</h2>\n<p>Problem accessing /webhdfs/v1/tmp. Reason:\n<pre>    GSSException: Failure unspecified at GSS-API level (Mechanism level: Checksum failed)</pre></p>\n</body>\n</html>\n403', u'')
Command failed after 1 tries
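From what I understand, a GSS-API "Checksum failed" error usually means the ticket the client presents cannot be decrypted with the key stored in the spnego keytab, e.g. because the key version number or encryption types got out of sync when the principal was recreated manually. A minimal way to check this, assuming the standard HDP keytab path and the principal/realm from my cluster:

  # Show key version numbers and enctypes stored in the keytab...
  klist -kte /etc/security/keytabs/spnego.service.keytab
  # ...and compare against the kvno the KDC/AD currently issues (needs a valid TGT first)
  kvno HTTP/master01.hadoop.know-center.at@HADOOP.KNOW-CENTER.AT
  # Try to authenticate directly with the keytab
  kinit -kt /etc/security/keytabs/spnego.service.keytab HTTP/master01.hadoop.know-center.at@HADOOP.KNOW-CENTER.AT
  # Repeat the failing WebHDFS call by hand
  curl -sS -L --negotiate -u : 'http://master01.hadoop.know-center.at:50070/webhdfs/v1/tmp?op=GETFILESTATUS'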



Additionally, I followed the steps in the links below (the settings they boil down to are sketched after the links):


https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.3/bk_security/content/ch_enable_spnego_auth_f...

https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.1.0/bk_ambari-security/content/set_up_kerberos_...
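As far as I can tell, those docs come down to the SPNEGO-related properties in core-site.xml. A quick way to confirm what the cluster actually has, assuming the standard Hadoop property names (the expected values are what the docs describe, using my realm; not verified on my cluster):

  hdfs getconf -confKey hadoop.http.authentication.type                 # should be: kerberos
  hdfs getconf -confKey hadoop.http.authentication.kerberos.principal   # should be: HTTP/_HOST@HADOOP.KNOW-CENTER.AT
  hdfs getconf -confKey hadoop.http.authentication.kerberos.keytab      # should be: /etc/security/keytabs/spnego.service.keytab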
