
Hive Installation - Check Hive Failed

New Contributor

While installing Hive on Hortonworks HDP 2.6.3, I am getting the following error:

stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/service_check.py", line 194, in <module>
HiveServiceCheck().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/service_check.py", line 99, in service_check
webhcat_service_check()
File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
return fn(*args, **kwargs)
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_service_check.py", line 125, in webhcat_service_check
logoutput=True)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/var/lib/ambari-agent/tmp/templetonSmoke.sh ip-172-31-88-254.ec2.internal ambari-qa 50111 idtest.ambari-qa.1511991838.69.pig no_keytab false /usr/bin/kinit no_principal /var/lib/ambari-agent/tmp' returned 1. Templeton Smoke Test (pig cmd): Failed. : {"error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>
stdout:
2017-11-29 21:43:21,055 - MariaDB RedHat Support: false
2017-11-29 21:43:21,061 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2017-11-29 21:43:21,077 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2017-11-29 21:43:21,101 - call returned (0, 'hive-server2 - 2.6.3.0-235')
2017-11-29 21:43:21,102 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.3.0-235 -> 2.6.3.0-235
2017-11-29 21:43:21,166 - Running Hive Server checks
2017-11-29 21:43:21,166 - --------------------------

2017-11-29 21:43:21,168 - Server Address List : [u'ip-172-31-88-254.ec2.internal'], Port : 10000, SSL KeyStore : None
2017-11-29 21:43:21,168 - Waiting for the Hive Server to start...
2017-11-29 21:43:21,168 - Execute['! beeline -u 'jdbc:hive2://ip-172-31-88-254.ec2.internal:10000/;transportMode=binary' -e '' 2>&1| awk '{print}'|grep -i -e 'Connection refused' -e 'Invalid URL''] {'path': ['/bin/', '/usr/bin/', '/usr/lib/hive/bin/', '/usr/sbin/'], 'timeout_kill_strategy': 2, 'timeout': 30, 'user': 'ambari-qa'}
2017-11-29 21:43:23,766 - Successfully connected to ip-172-31-88-254.ec2.internal on port 10000
2017-11-29 21:43:23,766 - Successfully stayed connected to 'Hive Server' on host: ip-172-31-88-254.ec2.internal and port 10000 after 2.59843206406 seconds
2017-11-29 21:43:23,766 - Running HCAT checks
2017-11-29 21:43:23,766 - -------------------

2017-11-29 21:43:23,768 - checked_call['hostid'] {}
2017-11-29 21:43:23,772 - checked_call returned (0, '1facfe58')
2017-11-29 21:43:23,772 - File['/var/lib/ambari-agent/tmp/hcatSmoke.sh'] {'content': StaticFile('hcatSmoke.sh'), 'mode': 0755}
2017-11-29 21:43:23,773 - Writing File['/var/lib/ambari-agent/tmp/hcatSmoke.sh'] because it doesn't exist
2017-11-29 21:43:23,773 - Changing permission for /var/lib/ambari-agent/tmp/hcatSmoke.sh from 644 to 755
2017-11-29 21:43:23,774 - Execute['env JAVA_HOME=/usr/java/default /var/lib/ambari-agent/tmp/hcatSmoke.sh hcatsmokeid1facfe58_date432917 prepare true'] {'logoutput': True, 'path': ['/usr/sbin', '/usr/local/bin', '/bin', '/usr/bin', u'/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/2.6.3.0-235/hadoop/bin:/usr/hdp/2.6.3.0-235/hive/bin'], 'tries': 3, 'user': 'ambari-qa', 'try_sleep': 5}
OK
Time taken: 3.112 seconds
OK
Time taken: 2.768 seconds
OK
Time taken: 2.898 seconds
2017-11-29 21:43:47,543 - ExecuteHadoop['fs -test -e /apps/hive/warehouse/hcatsmokeid1facfe58_date432917'] {'logoutput': True, 'bin_dir': '/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/2.6.3.0-235/hadoop/bin', 'user': 'hdfs', 'conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf'}
2017-11-29 21:43:47,544 - Execute['hadoop --config /usr/hdp/2.6.3.0-235/hadoop/conf fs -test -e /apps/hive/warehouse/hcatsmokeid1facfe58_date432917'] {'logoutput': True, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path': [u'/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/2.6.3.0-235/hadoop/bin']}
2017-11-29 21:43:51,123 - Execute[' /var/lib/ambari-agent/tmp/hcatSmoke.sh hcatsmokeid1facfe58_date432917 cleanup true'] {'logoutput': True, 'path': ['/usr/sbin', '/usr/local/bin', '/bin', '/usr/bin', u'/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/2.6.3.0-235/hadoop/bin:/usr/hdp/2.6.3.0-235/hive/bin'], 'tries': 3, 'user': 'ambari-qa', 'try_sleep': 5}
OK
Time taken: 2.916 seconds
2017-11-29 21:43:58,685 - Running WEBHCAT checks
2017-11-29 21:43:58,685 - ---------------------

2017-11-29 21:43:58,686 - File['/var/lib/ambari-agent/tmp/templetonSmoke.sh'] {'content': StaticFile('templetonSmoke.sh'), 'mode': 0755}
2017-11-29 21:43:58,687 - Writing File['/var/lib/ambari-agent/tmp/templetonSmoke.sh'] because it doesn't exist
2017-11-29 21:43:58,687 - Changing permission for /var/lib/ambari-agent/tmp/templetonSmoke.sh from 644 to 755
2017-11-29 21:43:58,693 - File['/var/lib/ambari-agent/tmp/idtest.ambari-qa.1511991838.69.pig'] {'owner': 'hdfs', 'content': Template('templeton_smoke.pig.j2')}
2017-11-29 21:43:58,693 - Writing File['/var/lib/ambari-agent/tmp/idtest.ambari-qa.1511991838.69.pig'] because it doesn't exist
2017-11-29 21:43:58,694 - Changing owner for /var/lib/ambari-agent/tmp/idtest.ambari-qa.1511991838.69.pig from 0 to hdfs
2017-11-29 21:43:58,694 - HdfsResource['/tmp/idtest.ambari-qa.1511991838.69.pig'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'source': '/var/lib/ambari-agent/tmp/idtest.ambari-qa.1511991838.69.pig', 'dfs_type': '', 'default_fs': 'hdfs://ip-172-31-81-35.ec2.internal:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'ambari-qa', 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
2017-11-29 21:43:58,697 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://ip-172-31-81-35.ec2.internal:50070/webhdfs/v1/tmp/idtest.ambari-qa.1511991838.69.pig?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmprsf5YP 2>/tmp/tmpxuGDFv''] {'logoutput': None, 'quiet': False}
2017-11-29 21:43:58,747 - call returned (0, '')
2017-11-29 21:43:58,748 - Creating new file /tmp/idtest.ambari-qa.1511991838.69.pig in DFS
2017-11-29 21:43:58,749 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT --data-binary @/var/lib/ambari-agent/tmp/idtest.ambari-qa.1511991838.69.pig -H '"'"'Content-Type: application/octet-stream'"'"' '"'"'http://ip-172-31-81-35.ec2.internal:50070/webhdfs/v1/tmp/idtest.ambari-qa.1511991838.69.pig?op=CREATE&user.name=hdfs&overwrite=True'"'"' 1>/tmp/tmpfKel5J 2>/tmp/tmpjAHPKb''] {'logoutput': None, 'quiet': False}
2017-11-29 21:43:58,833 - call returned (0, '')
2017-11-29 21:43:58,835 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://ip-172-31-81-35.ec2.internal:50070/webhdfs/v1/tmp/idtest.ambari-qa.1511991838.69.pig?op=SETOWNER&owner=ambari-qa&group=&user.name=hdfs'"'"' 1>/tmp/tmpQGRWx8 2>/tmp/tmpI6jhgd''] {'logoutput': None, 'quiet': False}
2017-11-29 21:43:58,893 - call returned (0, '')
2017-11-29 21:43:58,895 - HdfsResource['/tmp/idtest.ambari-qa.1511991838.69.in'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'source': '/etc/passwd', 'dfs_type': '', 'default_fs': 'hdfs://ip-172-31-81-35.ec2.internal:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'ambari-qa', 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
2017-11-29 21:43:58,896 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://ip-172-31-81-35.ec2.internal:50070/webhdfs/v1/tmp/idtest.ambari-qa.1511991838.69.in?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpJCu1jK 2>/tmp/tmphQhU2j''] {'logoutput': None, 'quiet': False}
2017-11-29 21:43:58,945 - call returned (0, '')
2017-11-29 21:43:58,946 - Creating new file /tmp/idtest.ambari-qa.1511991838.69.in in DFS
2017-11-29 21:43:58,947 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT --data-binary @/etc/passwd -H '"'"'Content-Type: application/octet-stream'"'"' '"'"'http://ip-172-31-81-35.ec2.internal:50070/webhdfs/v1/tmp/idtest.ambari-qa.1511991838.69.in?op=CREATE&user.name=hdfs&overwrite=True'"'"' 1>/tmp/tmpadB7_c 2>/tmp/tmp7vGRwp''] {'logoutput': None, 'quiet': False}
2017-11-29 21:43:59,019 - call returned (0, '')
2017-11-29 21:43:59,020 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://ip-172-31-81-35.ec2.internal:50070/webhdfs/v1/tmp/idtest.ambari-qa.1511991838.69.in?op=SETOWNER&owner=ambari-qa&group=&user.name=hdfs'"'"' 1>/tmp/tmpkr6Zh7 2>/tmp/tmp4XEXHG''] {'logoutput': None, 'quiet': False}
2017-11-29 21:43:59,071 - call returned (0, '')
2017-11-29 21:43:59,072 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://ip-172-31-81-35.ec2.internal:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
2017-11-29 21:43:59,073 - Execute['/var/lib/ambari-agent/tmp/templetonSmoke.sh ip-172-31-88-254.ec2.internal ambari-qa 50111 idtest.ambari-qa.1511991838.69.pig no_keytab false /usr/bin/kinit no_principal /var/lib/ambari-agent/tmp'] {'logoutput': True, 'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 3, 'try_sleep': 5}
Templeton Smoke Test (pig cmd): Failed. : {"error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>
2017-11-29 21:44:09,316 - Retrying after 5 seconds. Reason: Execution of '/var/lib/ambari-agent/tmp/templetonSmoke.sh ip-172-31-88-254.ec2.internal ambari-qa 50111 idtest.ambari-qa.1511991838.69.pig no_keytab false /usr/bin/kinit no_principal /var/lib/ambari-agent/tmp' returned 1. Templeton Smoke Test (pig cmd): Failed. : {"error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>
Templeton Smoke Test (pig cmd): Failed. : {"error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>
2017-11-29 21:44:22,556 - Retrying after 5 seconds. Reason: Execution of '/var/lib/ambari-agent/tmp/templetonSmoke.sh ip-172-31-88-254.ec2.internal ambari-qa 50111 idtest.ambari-qa.1511991838.69.pig no_keytab false /usr/bin/kinit no_principal /var/lib/ambari-agent/tmp' returned 1. Templeton Smoke Test (pig cmd): Failed. : {"error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>
Templeton Smoke Test (pig cmd): Failed. : {"error":"User: hcat is not allowed to impersonate ambari-qa"}http_code <500>

Command failed after 1 tries


Re: Hive Installation - Check Hive Failed

Expert Contributor

@Rajesh K

First, move forward by ignoring the error and clicking Next. Once you are at the Ambari Dashboard, add the following properties to the HDFS custom core-site configuration (Services -> HDFS -> Configs -> Custom core-site):

  1. hadoop.proxyuser.hcat.groups=*
  2. hadoop.proxyuser.hcat.hosts=*
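
These are standard Hadoop proxy-user (impersonation) settings: WebHCat runs as the hcat service user and must be allowed to submit work on behalf of other users such as ambari-qa, which is exactly what the "hcat is not allowed to impersonate ambari-qa" error is complaining about. For reference, the equivalent entries in core-site.xml would look like the sketch below (on an Ambari-managed cluster, prefer the UI, since Ambari owns this file):

```xml
<!-- Sketch of the equivalent core-site.xml entries.
     The wildcard values let the hcat service user impersonate any user
     from any host; on a secured cluster, consider restricting these to
     specific groups and hostnames instead of "*". -->
<property>
  <name>hadoop.proxyuser.hcat.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hcat.hosts</name>
  <value>*</value>
</property>
```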

Save your changes and restart all impacted services, then rerun the service check. It should pass this time.

PS: If this solution works for you, please accept my answer as the best answer.
