
Warning while installing Apache Spark 2 on HDP 2.5 - "error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>


Explorer

Hi,

I am trying to install the Spark 2 service on my cluster, but the service check on one of the nodes fails with a warning. Please see the logs below:

stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/service_check.py", line 193, in <module>
    HiveServiceCheck().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/service_check.py", line 99, in service_check
    webhcat_service_check()
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_service_check.py", line 125, in webhcat_service_check
    logoutput=True)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 293, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/var/lib/ambari-agent/tmp/templetonSmoke.sh clusternode2.novalocal ambari-qa 50111 idtest.ambari-qa.1500449371.28.pig no_keytab false kinit no_principal /var/lib/ambari-agent/tmp' returned 1. Templeton Smoke Test (pig cmd): Failed. : {"error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>

stdout:

2017-07-19 07:29:10,602 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-07-19 07:29:10,620 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2017-07-19 07:29:10,641 - call returned (0, 'hive-server2 - 2.5.3.0-37')
2017-07-19 07:29:10,642 - Stack Feature Version Info: stack_version=2.5, version=None, current_cluster_version=2.5.3.0-37 -> 2.5
2017-07-19 07:29:10,658 - Running Hive Server checks
2017-07-19 07:29:10,658 - --------------------------
2017-07-19 07:29:10,659 - Server Address List : [u'clusternode2.novalocal'], Port : 10000, SSL KeyStore : None
2017-07-19 07:29:10,659 - Waiting for the Hive Server to start...
2017-07-19 07:29:10,659 - Execute['! beeline -u 'jdbc:hive2://clusternode2.novalocal:10000/;transportMode=binary' -e '' 2>&1| awk '{print}'|grep -i -e 'Connection refused' -e 'Invalid URL''] {'path': ['/bin/', '/usr/bin/', '/usr/lib/hive/bin/', '/usr/sbin/'], 'user': 'ambari-qa', 'timeout': 30}
2017-07-19 07:29:13,300 - Successfully connected to clusternode2.novalocal on port 10000
2017-07-19 07:29:13,301 - Successfully stayed connected to 'Hive Server' on host: clusternode3.novalocal and port 10000 after 2.64190602303 seconds
2017-07-19 07:29:13,301 - Running HCAT checks
2017-07-19 07:29:13,301 - -------------------
2017-07-19 07:29:13,302 - checked_call['hostid'] {}
2017-07-19 07:29:13,305 - checked_call returned (0, 'a8c07201')
2017-07-19 07:29:13,306 - File['/var/lib/ambari-agent/tmp/hcatSmoke.sh'] {'content': StaticFile('hcatSmoke.sh'), 'mode': 0755}
2017-07-19 07:29:13,307 - Writing File['/var/lib/ambari-agent/tmp/hcatSmoke.sh'] because it doesn't exist
2017-07-19 07:29:13,307 - Changing permission for /var/lib/ambari-agent/tmp/hcatSmoke.sh from 644 to 755
2017-07-19 07:29:13,307 - Execute['env JAVA_HOME=/usr/jdk64/jdk1.8.0_77 /var/lib/ambari-agent/tmp/hcatSmoke.sh hcatsmokeida8c07201_date291917 prepare true'] {'logoutput': True, 'path': ['/usr/sbin', '/usr/local/bin', '/bin', '/usr/bin', u'/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin'], 'tries': 3, 'user': 'ambari-qa', 'try_sleep': 5}
OK
Time taken: 1.585 seconds
OK
Time taken: 1.261 seconds
OK
Time taken: 1.691 seconds
2017-07-19 07:29:25,460 - ExecuteHadoop['fs -test -e /apps/hive/warehouse/hcatsmokeida8c07201_date291917'] {'logoutput': True, 'bin_dir': '/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir': '/usr/hdp/current/hadoop-client/conf'}
2017-07-19 07:29:25,461 - Execute['hadoop --config /usr/hdp/current/hadoop-client/conf fs -test -e /apps/hive/warehouse/hcatsmokeida8c07201_date291917'] {'logoutput': True, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path': [u'/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin']}
2017-07-19 07:29:27,285 - Execute[' /var/lib/ambari-agent/tmp/hcatSmoke.sh hcatsmokeida8c07201_date291917 cleanup true'] {'logoutput': True, 'path': ['/usr/sbin', '/usr/local/bin', '/bin', '/usr/bin', u'/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin'], 'tries': 3, 'user': 'ambari-qa', 'try_sleep': 5}
OK
Time taken: 1.397 seconds
2017-07-19 07:29:31,280 - Running WEBHCAT checks
2017-07-19 07:29:31,280 - ---------------------
2017-07-19 07:29:31,281 - File['/var/lib/ambari-agent/tmp/templetonSmoke.sh'] {'content': StaticFile('templetonSmoke.sh'), 'mode': 0755}
2017-07-19 07:29:31,282 - Writing File['/var/lib/ambari-agent/tmp/templetonSmoke.sh'] because it doesn't exist
2017-07-19 07:29:31,282 - Changing permission for /var/lib/ambari-agent/tmp/templetonSmoke.sh from 644 to 755
2017-07-19 07:29:31,288 - File['/var/lib/ambari-agent/tmp/idtest.ambari-qa.1500449371.28.pig'] {'owner': 'hdfs', 'content': Template('templeton_smoke.pig.j2')}
2017-07-19 07:29:31,289 - Writing File['/var/lib/ambari-agent/tmp/idtest.ambari-qa.1500449371.28.pig'] because it doesn't exist
2017-07-19 07:29:31,289 - Changing owner for /var/lib/ambari-agent/tmp/idtest.ambari-qa.1500449371.28.pig from 0 to hdfs
2017-07-19 07:29:31,290 - HdfsResource['/tmp/idtest.ambari-qa.1500449371.28.pig'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/var/lib/ambari-agent/tmp/idtest.ambari-qa.1500449371.28.pig', 'dfs_type': '', 'default_fs': 'hdfs://clusternode1.novalocal:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'ambari-qa', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
2017-07-19 07:29:31,293 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://clusternode1.novalocal:50070/webhdfs/v1/tmp/idtest.ambari-qa.1500449371.28.pig?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpDykg27 2>/tmp/tmpCNcotQ''] {'logoutput': None, 'quiet': False}
2017-07-19 07:29:31,346 - call returned (0, '')
2017-07-19 07:29:31,346 - Creating new file /tmp/idtest.ambari-qa.1500449371.28.pig in DFS
2017-07-19 07:29:31,347 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT --data-binary @/var/lib/ambari-agent/tmp/idtest.ambari-qa.1500449371.28.pig -H '"'"'Content-Type: application/octet-stream'"'"' '"'"'http://clusternode1.novalocal:50070/webhdfs/v1/tmp/idtest.ambari-qa.1500449371.28.pig?op=CREATE&user.name=hdfs&overwrite=True'"'"' 1>/tmp/tmpOoKCx9 2>/tmp/tmp0_Gjm3''] {'logoutput': None, 'quiet': False}
2017-07-19 07:29:31,421 - call returned (0, '')
2017-07-19 07:29:31,422 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://clusternode1.novalocal:50070/webhdfs/v1/tmp/idtest.ambari-qa.1500449371.28.pig?op=SETOWNER&user.name=hdfs&owner=ambari-qa&group='"'"' 1>/tmp/tmpqw5LXY 2>/tmp/tmpBAOQp8''] {'logoutput': None, 'quiet': False}
2017-07-19 07:29:31,470 - call returned (0, '')
2017-07-19 07:29:31,471 - HdfsResource['/tmp/idtest.ambari-qa.1500449371.28.in'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/etc/passwd', 'dfs_type': '', 'default_fs': 'hdfs://clusternode1.novalocal:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'ambari-qa', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
2017-07-19 07:29:31,472 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://clusternode1.novalocal:50070/webhdfs/v1/tmp/idtest.ambari-qa.1500449371.28.in?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpJDR6me 2>/tmp/tmpwPw2ST''] {'logoutput': None, 'quiet': False}
2017-07-19 07:29:31,515 - call returned (0, '')
2017-07-19 07:29:31,515 - Creating new file /tmp/idtest.ambari-qa.1500449371.28.in in DFS
2017-07-19 07:29:31,516 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT --data-binary @/etc/passwd -H '"'"'Content-Type: application/octet-stream'"'"' '"'"'http://clusternode1.novalocal:50070/webhdfs/v1/tmp/idtest.ambari-qa.1500449371.28.in?op=CREATE&user.name=hdfs&overwrite=True'"'"' 1>/tmp/tmp7twz7Y 2>/tmp/tmpvQt5Z3''] {'logoutput': None, 'quiet': False}
2017-07-19 07:29:31,586 - call returned (0, '')
2017-07-19 07:29:31,587 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://clusternode1.novalocal:50070/webhdfs/v1/tmp/idtest.ambari-qa.1500449371.28.in?op=SETOWNER&user.name=hdfs&owner=ambari-qa&group='"'"' 1>/tmp/tmpllAwXI 2>/tmp/tmphfG8Eg''] {'logoutput': None, 'quiet': False}
2017-07-19 07:29:31,639 - call returned (0, '')
2017-07-19 07:29:31,640 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://clusternode1.novalocal:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
2017-07-19 07:29:31,641 - Execute['/var/lib/ambari-agent/tmp/templetonSmoke.sh clusternode2.novalocal ambari-qa 50111 idtest.ambari-qa.1500449371.28.pig no_keytab false kinit no_principal /var/lib/ambari-agent/tmp'] {'logoutput': True, 'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 3, 'try_sleep': 5}
Templeton Smoke Test (pig cmd): Failed. : {"error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>
2017-07-19 07:29:36,974 - Retrying after 5 seconds. Reason: Execution of '/var/lib/ambari-agent/tmp/templetonSmoke.sh clusternode2.novalocal ambari-qa 50111 idtest.ambari-qa.1500449371.28.pig no_keytab false kinit no_principal /var/lib/ambari-agent/tmp' returned 1. Templeton Smoke Test (pig cmd): Failed. : {"error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>
Templeton Smoke Test (pig cmd): Failed. : {"error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>
2017-07-19 07:29:46,275 - Retrying after 5 seconds. Reason: Execution of '/var/lib/ambari-agent/tmp/templetonSmoke.sh clusternode2.novalocal ambari-qa 50111 idtest.ambari-qa.1500449371.28.pig no_keytab false kinit no_principal /var/lib/ambari-agent/tmp' returned 1. Templeton Smoke Test (pig cmd): Failed. : {"error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>
Templeton Smoke Test (pig cmd): Failed. : {"error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>
Command failed after 1 tries
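From what I understand, this error comes from Hadoop's proxy-user (impersonation) checks: WebHCat runs as the hcat user and tries to submit the smoke-test job on behalf of ambari-qa, which the NameNode rejects unless impersonation is explicitly allowed. A core-site.xml stanza that would permit this (a sketch, assuming hcat is the WebHCat run-as user; the * values are permissive placeholders and could be restricted to specific hosts and groups) looks like:

```xml
<!-- core-site.xml: allow the hcat service user to impersonate other users.
     "*" is permissive; restrict to specific hosts/groups where appropriate. -->
<property>
  <name>hadoop.proxyuser.hcat.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hcat.groups</name>
  <value>*</value>
</property>
```

On an Ambari-managed cluster these properties would be edited under the HDFS configs (custom core-site) rather than by hand, and the affected services restarted afterwards, but I have not confirmed whether this resolves the failure.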

Any ideas would be much appreciated.

Thank you very much.

1 REPLY

Re: Warning while installing Apache Spark 2 on HDP 2.5 - "error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>

Expert Contributor