
HiveServer2 Interactive Start Failed on HDP3

New Contributor
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_server_interactive.py", line 543, in <module>
    HiveServerInteractive().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_server_interactive.py", line 103, in start
    status = self._llap_start(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_server_interactive.py", line 292, in _llap_start
    status = self.check_llap_app_status(params.llap_app_name, params.num_retries_for_checking_llap_status)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_server_interactive.py", line 469, in check_llap_app_status
    llap_app_info = self._get_llap_app_status_info(percent_desired_instances_to_be_up/100.0, total_timeout, refresh_rate)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_server_interactive.py", line 384, in _get_llap_app_status_info
    env = { 'HIVE_CONF_DIR': params.hive_server_interactive_conf_dir } )
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/hdp/current/hive-server2/bin/hive --service llapstatus -w -r 0.8 -i 2 -t 100' returned 10. SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
WARN conf.HiveConf: HiveConf of name hive.stats.fetch.partition.stats does not exist
WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist
WARN conf.HiveConf: HiveConf of name hive.druid.select.distribute does not exist

LLAPSTATUS WatchMode with timeout=100 s
--------------------------------------------------------------------------------
LLAP status unknown
--------------------------------------------------------------------------------
LLAP status unknown
--------------------------------------------------------------------------------




{
  "state" : "UNKNOWN",
  "runningThresholdAchieved" : false
}
2021-11-03 10:14:43,131 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2021-11-03 10:14:43,141 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2021-11-03 10:14:43,281 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2021-11-03 10:14:43,285 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2021-11-03 10:14:43,285 - Group['livy'] {}
2021-11-03 10:14:43,287 - Group['spark'] {}
2021-11-03 10:14:43,287 - Group['ranger'] {}
2021-11-03 10:14:43,287 - Group['hdfs'] {}
2021-11-03 10:14:43,287 - Group['zeppelin'] {}
2021-11-03 10:14:43,288 - Group['hadoop'] {}
2021-11-03 10:14:43,288 - Group['users'] {}
2021-11-03 10:14:43,288 - Group['knox'] {}
2021-11-03 10:14:43,288 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,289 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,289 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,290 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,290 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,291 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,291 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-11-03 10:14:43,292 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,293 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2021-11-03 10:14:43,293 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-11-03 10:14:43,294 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2021-11-03 10:14:43,294 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2021-11-03 10:14:43,295 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,295 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2021-11-03 10:14:43,296 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2021-11-03 10:14:43,296 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,297 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2021-11-03 10:14:43,297 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,298 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,298 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,299 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-11-03 10:14:43,299 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2021-11-03 10:14:43,300 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-11-03 10:14:43,330 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2021-11-03 10:14:43,334 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2021-11-03 10:14:43,334 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2021-11-03 10:14:43,335 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-11-03 10:14:43,336 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2021-11-03 10:14:43,337 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2021-11-03 10:14:43,341 - call returned (0, '1015')
2021-11-03 10:14:43,342 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2021-11-03 10:14:43,349 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] due to not_if
2021-11-03 10:14:43,349 - Group['hdfs'] {}
2021-11-03 10:14:43,349 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2021-11-03 10:14:43,350 - FS Type: HDFS
2021-11-03 10:14:43,350 - Directory['/etc/hadoop'] {'mode': 0755}
2021-11-03 10:14:43,358 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2021-11-03 10:14:43,359 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2021-11-03 10:14:43,368 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2021-11-03 10:14:43,372 - Skipping Execute[('setenforce', '0')] due to not_if
2021-11-03 10:14:43,372 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2021-11-03 10:14:43,374 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2021-11-03 10:14:43,374 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'cd_access': 'a'}
2021-11-03 10:14:43,374 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2021-11-03 10:14:43,377 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2021-11-03 10:14:43,378 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2021-11-03 10:14:43,394 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2021-11-03 10:14:43,400 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2021-11-03 10:14:43,400 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2021-11-03 10:14:43,403 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2021-11-03 10:14:43,406 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2021-11-03 10:14:43,425 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2021-11-03 10:14:43,431 - Skipping unlimited key JCE policy check and setup since the Java VM is not managed by Ambari
2021-11-03 10:14:43,633 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2021-11-03 10:14:43,639 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2021-11-03 10:14:43,652 - call returned (0, 'hive-server2 - 3.0.1.0-187')
2021-11-03 10:14:43,653 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2021-11-03 10:14:43,664 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://sandbox-hdp.hortonworks.com:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2021-11-03 10:14:43,666 - Not downloading the file from http://sandbox-hdp.hortonworks.com:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2021-11-03 10:14:44,131 - HdfsResource['/warehouse/tablespace/managed/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0700}
2021-11-03 10:14:44,133 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/warehouse/tablespace/managed/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpSZOhYc 2>/tmp/tmpYrqBVE''] {'logoutput': None, 'quiet': False}
2021-11-03 10:14:44,292 - call returned (0, '')
2021-11-03 10:14:44,293 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":3,"fileId":17625,"group":"hadoop","length":0,"modificationTime":1635827186599,"owner":"hive","pathSuffix":"","permission":"777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2021-11-03 10:14:44,293 - Skipping the operation for not managed DFS directory /warehouse/tablespace/managed/hive since immutable_paths contains it.
2021-11-03 10:14:44,293 - HdfsResource['/user/hive/.yarn'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0755}
2021-11-03 10:14:44,294 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/user/hive/.yarn?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpAYj9gB 2>/tmp/tmpmOd3iQ''] {'logoutput': None, 'quiet': False}
2021-11-03 10:14:44,346 - call returned (0, '')
2021-11-03 10:14:44,346 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":2,"fileId":98164,"group":"hadoop","length":0,"modificationTime":1635824031913,"owner":"hive","pathSuffix":"","permission":"755","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2021-11-03 10:14:44,347 - HdfsResource['/user/hive/.yarn/package'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0755}
2021-11-03 10:14:44,348 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/user/hive/.yarn/package?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpUa1nJz 2>/tmp/tmp1xr_X8''] {'logoutput': None, 'quiet': False}
2021-11-03 10:14:44,393 - call returned (0, '')
2021-11-03 10:14:44,393 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":1,"fileId":98165,"group":"hadoop","length":0,"modificationTime":1635823747141,"owner":"hive","pathSuffix":"","permission":"755","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2021-11-03 10:14:44,393 - HdfsResource['/user/hive/.yarn/package/LLAP'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0755}
2021-11-03 10:14:44,394 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/user/hive/.yarn/package/LLAP?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpATxCWE 2>/tmp/tmpPPM1CF''] {'logoutput': None, 'quiet': False}
2021-11-03 10:14:44,466 - call returned (0, '')
2021-11-03 10:14:44,467 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":2,"fileId":98166,"group":"hadoop","length":0,"modificationTime":1635932462106,"owner":"hive","pathSuffix":"","permission":"755","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2021-11-03 10:14:44,467 - HdfsResource['/warehouse/tablespace/external/hive/sys.db/'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 01755}
2021-11-03 10:14:44,468 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/warehouse/tablespace/external/hive/sys.db/?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpN6IwYe 2>/tmp/tmpwfX0FR''] {'logoutput': None, 'quiet': False}
2021-11-03 10:14:44,517 - call returned (0, '')
2021-11-03 10:14:44,518 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":45,"fileId":17503,"group":"hadoop","length":0,"modificationTime":1543514171363,"owner":"hive","pathSuffix":"","permission":"1755","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2021-11-03 10:14:44,518 - HdfsResource['/warehouse/tablespace/managed/hive/sys.db/query_data/'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 01777}
2021-11-03 10:14:44,519 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/warehouse/tablespace/managed/hive/sys.db/query_data/?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpCbe_yT 2>/tmp/tmp7KqYqV''] {'logoutput': None, 'quiet': False}
2021-11-03 10:14:44,564 - call returned (0, '')
2021-11-03 10:14:44,565 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":2,"fileId":18755,"group":"hadoop","length":0,"modificationTime":1635912987947,"owner":"hive","pathSuffix":"","permission":"1777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2021-11-03 10:14:44,566 - HdfsResource['/warehouse/tablespace/external/hive/sys.db/dag_meta'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 01777}
2021-11-03 10:14:44,567 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/warehouse/tablespace/external/hive/sys.db/dag_meta?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpUAT3IR 2>/tmp/tmpXx8lyg''] {'logoutput': None, 'quiet': False}
2021-11-03 10:14:44,612 - call returned (0, '')
2021-11-03 10:14:44,612 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":0,"fileId":17581,"group":"hadoop","length":0,"modificationTime":1543514170596,"owner":"hive","pathSuffix":"","permission":"1777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2021-11-03 10:14:44,613 - HdfsResource['/warehouse/tablespace/external/hive/sys.db/dag_data'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 01777}
2021-11-03 10:14:44,614 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/warehouse/tablespace/external/hive/sys.db/dag_data?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpnTmORI 2>/tmp/tmpRASLUR''] {'logoutput': None, 'quiet': False}
2021-11-03 10:14:44,667 - call returned (0, '')
2021-11-03 10:14:44,667 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":0,"fileId":17582,"group":"hadoop","length":0,"modificationTime":1543514170991,"owner":"hive","pathSuffix":"","permission":"1777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2021-11-03 10:14:44,667 - HdfsResource['/warehouse/tablespace/external/hive/sys.db/app_data'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 01777}
2021-11-03 10:14:44,668 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/warehouse/tablespace/external/hive/sys.db/app_data?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpRWfKjc 2>/tmp/tmpcHc_QB''] {'logoutput': None, 'quiet': False}
2021-11-03 10:14:44,714 - call returned (0, '')
2021-11-03 10:14:44,714 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":0,"fileId":17583,"group":"hadoop","length":0,"modificationTime":1543514171363,"owner":"hive","pathSuffix":"","permission":"1777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2021-11-03 10:14:44,715 - HdfsResource['/user/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0755}
2021-11-03 10:14:44,716 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/user/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpRbmyet 2>/tmp/tmp3O2rQa''] {'logoutput': None, 'quiet': False}
2021-11-03 10:14:44,761 - call returned (0, '')
2021-11-03 10:14:44,761 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":4,"fileId":17465,"group":"hdfs","length":0,"modificationTime":1635823746748,"owner":"hive","pathSuffix":"","permission":"755","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2021-11-03 10:14:44,762 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://sandbox-hdp.hortonworks.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2021-11-03 10:14:44,762 - Directories to fill with configs: [u'/usr/hdp/current/hive-server2/conf', u'/usr/hdp/current/hive-server2/conf_llap/']
2021-11-03 10:14:44,762 - Directory['/etc/hive/3.0.1.0-187/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
2021-11-03 10:14:44,763 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/3.0.1.0-187/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2021-11-03 10:14:44,775 - Generating config: /etc/hive/3.0.1.0-187/0/mapred-site.xml
2021-11-03 10:14:44,775 - File['/etc/hive/3.0.1.0-187/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2021-11-03 10:14:44,804 - File['/etc/hive/3.0.1.0-187/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2021-11-03 10:14:44,805 - File['/etc/hive/3.0.1.0-187/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2021-11-03 10:14:44,807 - File['/etc/hive/3.0.1.0-187/0/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2021-11-03 10:14:44,814 - File['/etc/hive/3.0.1.0-187/0/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2021-11-03 10:14:44,816 - File['/etc/hive/3.0.1.0-187/0/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2021-11-03 10:14:44,823 - File['/etc/hive/3.0.1.0-187/0/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2021-11-03 10:14:44,826 - File['/etc/hive/3.0.1.0-187/0/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2021-11-03 10:14:44,831 - XmlConfig['beeline-site.xml'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'conf_dir': '/etc/hive/3.0.1.0-187/0', 'configurations': {'beeline.hs2.jdbc.url.container': u'jdbc:hive2://sandbox-hdp.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2', 'beeline.hs2.jdbc.url.llap': u'jdbc:hive2://sandbox-hdp.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-hive2', 'beeline.hs2.jdbc.url.default': 'container'}}
2021-11-03 10:14:44,836 - Generating config: /etc/hive/3.0.1.0-187/0/beeline-site.xml
2021-11-03 10:14:44,836 - File['/etc/hive/3.0.1.0-187/0/beeline-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2021-11-03 10:14:44,858 - File['/etc/hive/3.0.1.0-187/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2021-11-03 10:14:44,859 - Directory['/etc/hive_llap/conf'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0700}
2021-11-03 10:14:44,859 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive_llap/conf', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2021-11-03 10:14:44,864 - Generating config: /etc/hive_llap/conf/mapred-site.xml
2021-11-03 10:14:44,864 - File['/etc/hive_llap/conf/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2021-11-03 10:14:44,893 - File['/etc/hive_llap/conf/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:44,922 - File['/etc/hive_llap/conf/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2021-11-03 10:14:44,924 - File['/etc/hive_llap/conf/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:44,925 - File['/etc/hive_llap/conf/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:44,948 - File['/etc/hive_llap/conf/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:44,951 - File['/etc/hive_llap/conf/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:44,955 - File['/etc/hive_llap/conf/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:44,956 - XmlConfig['beeline-site.xml'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600, 'conf_dir': '/etc/hive_llap/conf', 'configurations': {'beeline.hs2.jdbc.url.container': u'jdbc:hive2://sandbox-hdp.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2', 'beeline.hs2.jdbc.url.llap': u'jdbc:hive2://sandbox-hdp.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-hive2', 'beeline.hs2.jdbc.url.default': 'container'}}
2021-11-03 10:14:44,961 - Generating config: /etc/hive_llap/conf/beeline-site.xml
2021-11-03 10:14:44,961 - File['/etc/hive_llap/conf/beeline-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2021-11-03 10:14:44,973 - File['/etc/hive_llap/conf/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:45,003 - Converted 'hive.llap.io.memory.size' value from '588 MB' to '616562688 Bytes' before writing it to config file.
2021-11-03 10:14:45,003 - Setup for Atlas Hive2 Hook started.
2021-11-03 10:14:45,003 - Generating Atlas Hook config file /usr/hdp/current/hive-server2/conf_llap/atlas-application.properties
2021-11-03 10:14:45,003 - PropertiesFile['/usr/hdp/current/hive-server2/conf_llap/atlas-application.properties'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'properties': ...}
2021-11-03 10:14:45,013 - Generating properties file: /usr/hdp/current/hive-server2/conf_llap/atlas-application.properties
2021-11-03 10:14:45,013 - File['/usr/hdp/current/hive-server2/conf_llap/atlas-application.properties'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2021-11-03 10:14:45,029 - Writing File['/usr/hdp/current/hive-server2/conf_llap/atlas-application.properties'] because contents don't match
2021-11-03 10:14:45,030 - Setup for Atlas Hive2 Hook done.
2021-11-03 10:14:45,030 - Retrieved 'tez/tez-site' for merging with 'tez_hive2/tez-interactive-site'.
2021-11-03 10:14:45,030 - XmlConfig['tez-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/tez_llap/conf', 'mode': 0664, 'configuration_attributes': {u'final': {u'tez.runtime.shuffle.ssl.enable': u'true'}}, 'owner': 'tez', 'configurations': ...}
2021-11-03 10:14:45,035 - Generating config: /etc/tez_llap/conf/tez-site.xml
2021-11-03 10:14:45,035 - File['/etc/tez_llap/conf/tez-site.xml'] {'owner': 'tez', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0664, 'encoding': 'UTF-8'}
2021-11-03 10:14:45,086 - Retrieved 'hiveserver2-site' for merging with 'hiveserver2-interactive-site'.
2021-11-03 10:14:45,086 - File['/usr/hdp/current/hive-server2/conf_llap/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_server_interactive/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2021-11-03 10:14:45,087 - Writing File['/usr/hdp/current/hive-server2/conf_llap/hive-site.jceks'] because contents don't match
2021-11-03 10:14:45,087 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf_llap/', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2021-11-03 10:14:45,092 - Generating config: /usr/hdp/current/hive-server2/conf_llap/hive-site.xml
2021-11-03 10:14:45,093 - File['/usr/hdp/current/hive-server2/conf_llap/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2021-11-03 10:14:45,214 - Writing File['/usr/hdp/current/hive-server2/conf_llap/hive-site.xml'] because contents don't match
2021-11-03 10:14:45,215 - XmlConfig['hiveserver2-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf_llap/', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2021-11-03 10:14:45,220 - Generating config: /usr/hdp/current/hive-server2/conf_llap/hiveserver2-site.xml
2021-11-03 10:14:45,220 - File['/usr/hdp/current/hive-server2/conf_llap/hiveserver2-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2021-11-03 10:14:45,227 - File['/usr/hdp/current/hive-server2/conf_llap//hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2021-11-03 10:14:45,241 - File['/usr/hdp/current/hive-server2/conf_llap//llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:45,242 - File['/usr/hdp/current/hive-server2/conf_llap//llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:45,245 - File['/usr/hdp/current/hive-server2/conf_llap//hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:45,246 - File['/usr/hdp/current/hive-server2/conf_llap//hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:45,247 - File['/usr/hdp/current/hive-server2/conf_llap//beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:45,248 - XmlConfig['beeline-site.xml'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600, 'conf_dir': '/usr/hdp/current/hive-server2/conf_llap/', 'configurations': {'beeline.hs2.jdbc.url.container': u'jdbc:hive2://sandbox-hdp.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2', 'beeline.hs2.jdbc.url.llap': u'jdbc:hive2://sandbox-hdp.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-hive2', 'beeline.hs2.jdbc.url.default': 'container'}}
2021-11-03 10:14:45,253 - Generating config: /usr/hdp/current/hive-server2/conf_llap/beeline-site.xml
2021-11-03 10:14:45,253 - File['/usr/hdp/current/hive-server2/conf_llap/beeline-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2021-11-03 10:14:45,275 - File['/usr/hdp/current/hive-server2/conf_llap/hadoop-metrics2-hiveserver2.properties'] {'content': Template('hadoop-metrics2-hiveserver2.properties.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:45,285 - File['/usr/hdp/current/hive-server2/conf_llap//hadoop-metrics2-llapdaemon.properties'] {'content': Template('hadoop-metrics2-llapdaemon.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:45,289 - File['/usr/hdp/current/hive-server2/conf_llap//hadoop-metrics2-llaptaskscheduler.properties'] {'content': Template('hadoop-metrics2-llaptaskscheduler.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2021-11-03 10:14:45,290 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2021-11-03 10:14:45,292 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2021-11-03 10:14:45,292 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://sandbox-hdp.hortonworks.com:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2021-11-03 10:14:45,293 - Not downloading the file from http://sandbox-hdp.hortonworks.com:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2021-11-03 10:14:45,318 - File['/var/lib/ambari-agent/tmp/start_hiveserver2_interactive_script'] {'content': Template('startHiveserver2Interactive.sh.j2'), 'mode': 0755}
2021-11-03 10:14:45,319 - Directory['/var/run/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2021-11-03 10:14:45,319 - Directory['/var/log/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2021-11-03 10:14:45,320 - Directory['/var/lib/hive2'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2021-11-03 10:14:45,329 - Directory['/usr/lib/ambari-logsearch-logfeeder/conf'] {'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2021-11-03 10:14:45,334 - Generate Log Feeder config file: /usr/lib/ambari-logsearch-logfeeder/conf/input.config-hive.json
2021-11-03 10:14:45,334 - File['/usr/lib/ambari-logsearch-logfeeder/conf/input.config-hive.json'] {'content': Template('input.config-hive.json.j2'), 'mode': 0644}
2021-11-03 10:14:45,342 - Determining previous run 'LLAP package' folder(s) to be deleted ....
2021-11-03 10:14:45,365 - Previous run 'LLAP package' folder(s) to be deleted = ['llap-yarn-service_2021-11-03_08-39-00']
2021-11-03 10:14:45,366 - Directory['/var/lib/ambari-agent/tmp/llap-yarn-service_2021-11-03_08-39-00'] {'action': ['delete'], 'ignore_failures': True}
2021-11-03 10:14:45,366 - Removing directory Directory['/var/lib/ambari-agent/tmp/llap-yarn-service_2021-11-03_08-39-00'] and all its content
2021-11-03 10:14:45,460 - Starting LLAP
2021-11-03 10:14:45,460 - Setting slider_placement : 0, as llap_daemon_container_size : 5888 > 0.5 * YARN NodeManager Memory(6912)
2021-11-03 10:14:45,462 - LLAP start command: /usr/hdp/current/hive-server2/bin/hive --service llap --size 5888m --startImmediately --name llap0 --cache 588m --xmx 4710m --loglevel INFO --output /var/lib/ambari-agent/tmp/llap-yarn-service_2021-11-03_10-14-45 --user hive --service-placement 0 --skiphadoopversion --skiphbasecp --instances 1 --logger query-routing --args " -XX:+AlwaysPreTouch -XX:+UseG1GC -XX:TLABSize=8m -XX:+ResizeTLAB -XX:+UseNUMA -XX:+AggressiveOpts -XX:InitiatingHeapOccupancyPercent=70 -XX:+UnlockExperimentalVMOptions -XX:G1MaxNewSizePercent=40 -XX:G1ReservePercent=20 -XX:MaxGCPauseMillis=200 -XX:MetaspaceSize=1024m"
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
WARN conf.HiveConf: HiveConf of name hive.stats.fetch.partition.stats does not exist
WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist
WARN conf.HiveConf: HiveConf of name hive.druid.select.distribute does not exist
WARN cli.LlapServiceDriver: Ignoring unknown llap server parameter: [hive.aux.jars.path]
WARN cli.LlapServiceDriver: Java versions might not match : JAVA_HOME=[/usr/lib/jvm/java],process jre=[/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.191.b12-0.el7_5.x86_64/jre]
WARN conf.HiveConf: HiveConf of name hive.stats.fetch.partition.stats does not exist
WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist
WARN conf.HiveConf: HiveConf of name hive.druid.select.distribute does not exist
WARN conf.HiveConf: HiveConf of name hive.stats.fetch.partition.stats does not exist
WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist
WARN conf.HiveConf: HiveConf of name hive.druid.select.distribute does not exist
WARN metastore.ObjectStore: datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
Wed Nov 03 10:15:53 UTC 2021 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
(the same MySQL SSL connection warning repeated 19 more times at 10:15:55 UTC)
10:15:59 Running as a child of LlapServiceDriver
10:15:59 Prepared the files
10:16:20 Packaged the files
WARN curator.CuratorZookeeperClient: session timeout [10000] is less than connection timeout [15000]
Dependency libs are already uploaded to /hdp/apps/3.0.1.0-187/yarn/service-dep.tar.gz.
2021-11-03 10:16:39,976 - 





2021-11-03 10:16:39,977 - LLAP status command : /usr/hdp/current/hive-server2/bin/hive --service llapstatus -w -r 0.8 -i 2 -t 100
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
WARN conf.HiveConf: HiveConf of name hive.stats.fetch.partition.stats does not exist
WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist
WARN conf.HiveConf: HiveConf of name hive.druid.select.distribute does not exist

LLAPSTATUS WatchMode with timeout=100 s
--------------------------------------------------------------------------------
LLAP status unknown
--------------------------------------------------------------------------------
LLAP status unknown
--------------------------------------------------------------------------------




{
  "state" : "UNKNOWN",
  "runningThresholdAchieved" : false
}

Command failed after 1 tries

On HDP 3.0, I am getting this error when Ambari tries to start the LLAP service. I have checked the YARN UI and verified that LLAP is running. Could you give me any suggestions?
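For reference, the exact check that fails above can be re-run by hand as the hive user. This is only a sketch: the command and flags are taken from the log, and raising the watch timeout (-t, in seconds; Ambari uses 100) is an assumption you may want to tune for your cluster:

      su - hive
      /usr/hdp/current/hive-server2/bin/hive --service llapstatus -w -r 0.8 -i 2 -t 400

If the state still stays UNKNOWN with a longer watch window, the llap0 YARN application is most likely not reporting a running state, which is what the replies below try to address.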

3 REPLIES

Super Collaborator

Hi,

Can you perform these steps and see if they help you restart LLAP?

1. Run ps -ef | grep -i llapdaemon on each NodeManager and kill the process if the queue utilization is full (see the sketch after these steps).

       Check on all the NodeManagers whether any stale LLAP daemon process is running:

       > ps -ef | grep llap
2. As the hive user (su - hive), check the llap0 YARN application and then destroy it:

      # yarn app -status llap0

      # yarn app -destroy llap0
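If step 1 does turn up a stale daemon, a minimal sketch for clearing it (the grep -v grep filter and kill -9 are assumptions; the PID is the second column of the ps output):

      ps -ef | grep -i llapdaemon | grep -v grep      # run on the affected NodeManager host
      kill -9 <pid_from_the_output_above>             # only if a stale LlapDaemon process shows up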

 

Regards,

Chethan YM

 

 

New Contributor

Hi @ChethanYM 

Thanks for your suggestion.

I ran the command "ps -ef | grep -i llapdaemon" but did not find any stale LLAP daemon process.

I also ran "yarn app -status llap0" as the hive user and was able to retrieve the llap0 application info.

Super Collaborator

Hi,

As per my previous comment, can you destroy it and restart LLAP to see if this works?

Note: llap0 is the default application that runs when LLAP is installed; it will be recreated even if you destroy it and restart the service.

 

# yarn app -destroy llap0
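After the restart, a quick sanity check that llap0 was recreated (a sketch; yarn app -list is the standard Hadoop 3 command, and the grep filter is just for convenience):

# yarn app -list | grep -i llap

# yarn app -status llap0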

 

Regards,

Chethan YM