Support Questions

unable to enable HiveServer2 Interactive

Contributor

I have followed the guidelines below:

https://community.hortonworks.com/questions/78013/hive-server-12-to-hive-server-20.html
https://hortonworks.com/tutorial/interactive-sql-on-hadoop-with-hive-llap/

This is on HDP 2.6.2, but enabling HiveServer2 Interactive (LLAP) fails with the errors attached below:

hiveserver2-interactive-process.txt

llap-application.txt (a manual run of kinit -kt /etc/security/keytabs/hive.service.keytab hive/en1-dev1-tbdp.trendy-global.com@ABC.COM succeeds)
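For reference, the manual keytab check mentioned above can be scripted roughly like this (a minimal sketch; the keytab path and principal are the ones from my cluster, so adjust them for yours):

```shell
# Minimal sketch of the manual keytab check referenced above.
# KEYTAB and PRINCIPAL are the values from this cluster; adjust for yours.
KEYTAB=/etc/security/keytabs/hive.service.keytab
PRINCIPAL=hive/en1-dev1-tbdp.trendy-global.com@ABC.COM

if ! command -v kinit >/dev/null 2>&1; then
  result="SKIPPED"          # Kerberos client tools not installed on this host
elif kinit -kt "$KEYTAB" "$PRINCIPAL" 2>/dev/null; then
  result="OK"               # ticket obtained; 'klist' would show it
else
  result="FAILED"           # bad keytab, wrong principal, or KDC unreachable
fi
echo "keytab check: $result"
```

On my node this prints "keytab check: OK", which is why I believe the hive service keytab itself is fine.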

Any ideas?

5 REPLIES

Re: unable to enable HiveServer2 Interactive

Rising Star

@forest lin

Can you also provide the contents of the corresponding "stdout" for the hive-application.txt (which has the stderr), taken from the Task Log in Ambari?

Re: unable to enable HiveServer2 Interactive

Contributor

stderr:

2017-10-31 11:18:19,943 - LLAP app 'llap0' deployment unsuccessful.
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 616, in <module>
    HiveServerInteractive().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 123, in start
    raise Fail("Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.")
resource_management.core.exceptions.Fail: Skipping START of Hive Server Interactive since LLAP app couldn't be STARTED.

Re: unable to enable HiveServer2 Interactive

Contributor

Stdout:

2017-10-31 11:17:31,412 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-10-31 11:17:31,596 - Stack Feature Version Info: stack_version=2.6, version=2.6.1.0-129, current_cluster_version=2.6.1.0-129 -> 2.6.1.0-129
2017-10-31 11:17:31,601 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-10-31 11:17:31,602 - Group['kms'] {}
2017-10-31 11:17:31,603 - Group['livy'] {}
2017-10-31 11:17:31,603 - Group['spark'] {}
2017-10-31 11:17:31,603 - Group['ranger'] {}
2017-10-31 11:17:31,603 - Group['zeppelin'] {}
2017-10-31 11:17:31,604 - Group['hadoop'] {}
2017-10-31 11:17:31,604 - Group['users'] {}
2017-10-31 11:17:31,604 - Group['knox'] {}
2017-10-31 11:17:31,604 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,605 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,605 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,606 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-10-31 11:17:31,606 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,606 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,607 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'ranger']}
2017-10-31 11:17:31,607 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-10-31 11:17:31,608 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop']}
2017-10-31 11:17:31,608 - User['kms'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,608 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,609 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,609 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,610 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-10-31 11:17:31,610 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,610 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,611 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,611 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,612 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,612 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,612 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,613 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-10-31 11:17:31,613 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-31 11:17:31,614 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-10-31 11:17:31,619 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-10-31 11:17:31,619 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-10-31 11:17:31,620 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-31 11:17:31,620 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-10-31 11:17:31,624 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-10-31 11:17:31,624 - Group['hdfs'] {}
2017-10-31 11:17:31,624 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2017-10-31 11:17:31,625 - FS Type: 
2017-10-31 11:17:31,625 - Directory['/etc/hadoop'] {'mode': 0755}
2017-10-31 11:17:31,637 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
2017-10-31 11:17:31,638 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-10-31 11:17:31,654 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2017-10-31 11:17:31,660 - Skipping Execute[('setenforce', '0')] due to not_if
2017-10-31 11:17:31,660 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2017-10-31 11:17:31,662 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2017-10-31 11:17:31,662 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2017-10-31 11:17:31,665 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'root'}
2017-10-31 11:17:31,667 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'root'}
2017-10-31 11:17:31,672 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:31,679 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-10-31 11:17:31,679 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2017-10-31 11:17:31,680 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2017-10-31 11:17:31,683 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2017-10-31 11:17:31,686 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2017-10-31 11:17:31,936 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-10-31 11:17:31,944 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2017-10-31 11:17:31,965 - call returned (0, 'hive-server2 - 2.6.1.0-129')
2017-10-31 11:17:31,965 - Stack Feature Version Info: stack_version=2.6, version=2.6.1.0-129, current_cluster_version=2.6.1.0-129 -> 2.6.1.0-129
2017-10-31 11:17:31,986 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://en1-dev1-tbdp.abc.com:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2017-10-31 11:17:31,987 - Not downloading the file from http://en1-dev1-tbdp.abc.com:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2017-10-31 11:17:31,987 - checked_call[('/usr/java/jdk1.8.0_131/bin/java', '-cp', u'/var/lib/ambari-agent/cred/lib/*', 'org.apache.ambari.server.credentialapi.CredentialUtil', 'get', 'javax.jdo.option.ConnectionPassword', '-provider', u'jceks://file/var/lib/ambari-agent/cred/conf/hive/hive-site.jceks')] {}
2017-10-31 11:17:32,585 - checked_call returned (0, 'SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".\nSLF4J: Defaulting to no-operation (NOP) logger implementation\nSLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.\nOct 31, 2017 11:17:32 AM org.apache.hadoop.util.NativeCodeLoader <clinit>\nWARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\nhive')
2017-10-31 11:17:32,596 - HdfsResource['/user/hive'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'dfs_type': '', 'default_fs': 'hdfs://nn1-dev1-tbdp.abc.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs-trendydakelake@ABC.COM', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0755}
2017-10-31 11:17:32,597 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-trendydakelake@ABC.COM'] {'user': 'hdfs'}
2017-10-31 11:17:32,643 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET --negotiate -u : '"'"'http://nn1-dev1-tbdp.abc.com:50070/webhdfs/v1/user/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpot1jSx 2>/tmp/tmpu1CO_2''] {'logoutput': None, 'quiet': False}
2017-10-31 11:17:32,670 - call returned (0, '')
2017-10-31 11:17:32,673 - Called copy_to_hdfs tarball: tez_hive2
2017-10-31 11:17:32,673 - Default version is 2.6.1.0-129
2017-10-31 11:17:32,674 - Source file: /usr/hdp/2.6.1.0-129/tez_hive2/lib/tez.tar.gz , Dest file in HDFS: /hdp/apps/2.6.1.0-129/tez_hive2/tez.tar.gz
2017-10-31 11:17:32,674 - HdfsResource['/hdp/apps/2.6.1.0-129/tez_hive2'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'dfs_type': '', 'default_fs': 'hdfs://nn1-dev1-tbdp.abc.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs-trendydakelake@ABC.COM', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0555}
2017-10-31 11:17:32,674 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-trendydakelake@ABC.COM'] {'user': 'hdfs'}
2017-10-31 11:17:32,703 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET --negotiate -u : '"'"'http://nn1-dev1-tbdp.abc.com:50070/webhdfs/v1/hdp/apps/2.6.1.0-129/tez_hive2?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpNf2UaA 2>/tmp/tmpFtWZA3''] {'logoutput': None, 'quiet': False}
2017-10-31 11:17:32,728 - call returned (0, '')
2017-10-31 11:17:32,728 - HdfsResource['/hdp/apps/2.6.1.0-129/tez_hive2/tez.tar.gz'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'source': '/usr/hdp/2.6.1.0-129/tez_hive2/lib/tez.tar.gz', 'dfs_type': '', 'default_fs': 'hdfs://nn1-dev1-tbdp.abc.com:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs-trendydakelake@ABC.COM', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0444}
2017-10-31 11:17:32,729 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-trendydakelake@ABC.COM'] {'user': 'hdfs'}
2017-10-31 11:17:32,746 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET --negotiate -u : '"'"'http://nn1-dev1-tbdp.abc.com:50070/webhdfs/v1/hdp/apps/2.6.1.0-129/tez_hive2/tez.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp8PdFrr 2>/tmp/tmpXE9ZyC''] {'logoutput': None, 'quiet': False}
2017-10-31 11:17:32,771 - call returned (0, '')
2017-10-31 11:17:32,771 - DFS file /hdp/apps/2.6.1.0-129/tez_hive2/tez.tar.gz is identical to /usr/hdp/2.6.1.0-129/tez_hive2/lib/tez.tar.gz, skipping the copying
2017-10-31 11:17:32,772 - Will attempt to copy tez_hive2 tarball from /usr/hdp/2.6.1.0-129/tez_hive2/lib/tez.tar.gz to DFS at /hdp/apps/2.6.1.0-129/tez_hive2/tez.tar.gz.
2017-10-31 11:17:32,772 - HdfsResource[None] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'dfs_type': '', 'default_fs': 'hdfs://nn1-dev1-tbdp.abc.com:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs-trendydakelake@ABC.COM', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
2017-10-31 11:17:32,772 - Directory['/etc/hive2'] {'mode': 0755}
2017-10-31 11:17:32,773 - Directories to fill with configs: [u'/usr/hdp/current/hive-server2-hive2/conf', u'/usr/hdp/current/hive-server2-hive2/conf/conf.server']
2017-10-31 11:17:32,773 - Directory['/etc/hive2/2.6.1.0-129/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
2017-10-31 11:17:32,773 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive2/2.6.1.0-129/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2017-10-31 11:17:32,782 - Generating config: /etc/hive2/2.6.1.0-129/0/mapred-site.xml
2017-10-31 11:17:32,783 - File['/etc/hive2/2.6.1.0-129/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2017-10-31 11:17:32,822 - File['/etc/hive2/2.6.1.0-129/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:32,823 - File['/etc/hive2/2.6.1.0-129/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:32,823 - Directory['/etc/hive2/2.6.1.0-129/0/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0700}
2017-10-31 11:17:32,823 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive2/2.6.1.0-129/0/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2017-10-31 11:17:32,829 - Generating config: /etc/hive2/2.6.1.0-129/0/conf.server/mapred-site.xml
2017-10-31 11:17:32,830 - File['/etc/hive2/2.6.1.0-129/0/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2017-10-31 11:17:32,868 - File['/etc/hive2/2.6.1.0-129/0/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2017-10-31 11:17:32,868 - File['/etc/hive2/2.6.1.0-129/0/conf.server/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2017-10-31 11:17:32,869 - Converted 'hive.llap.io.memory.size' value from '2048 MB' to '2147483648 Bytes' before writing it to config file.
2017-10-31 11:17:32,869 - Setup for Atlas Hive2 Hook started.
2017-10-31 11:17:32,869 - Generating Atlas Hook config file /usr/hdp/current/hive-server2-hive2/conf/conf.server/atlas-application.properties
2017-10-31 11:17:32,869 - PropertiesFile['/usr/hdp/current/hive-server2-hive2/conf/conf.server/atlas-application.properties'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'properties': ...}
2017-10-31 11:17:32,872 - Generating properties file: /usr/hdp/current/hive-server2-hive2/conf/conf.server/atlas-application.properties
2017-10-31 11:17:32,873 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/atlas-application.properties'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:32,891 - Writing File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/atlas-application.properties'] because contents don't match
2017-10-31 11:17:32,892 - Setup for Atlas Hive2 Hook done.
2017-10-31 11:17:32,892 - Retrieved 'tez/tez-site' for merging with 'tez_hive2/tez-interactive-site'.
2017-10-31 11:17:32,892 - XmlConfig['tez-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/tez_hive2/conf', 'mode': 0664, 'configuration_attributes': {}, 'owner': 'tez', 'configurations': ...}
2017-10-31 11:17:32,898 - Generating config: /etc/tez_hive2/conf/tez-site.xml
2017-10-31 11:17:32,898 - File['/etc/tez_hive2/conf/tez-site.xml'] {'owner': 'tez', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0664, 'encoding': 'UTF-8'}
2017-10-31 11:17:32,956 - Retrieved 'hiveserver2-site' for merging with 'hiveserver2-interactive-site'.
2017-10-31 11:17:32,956 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2-hive2/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2017-10-31 11:17:32,962 - Generating config: /usr/hdp/current/hive-server2-hive2/conf/hive-site.xml
2017-10-31 11:17:32,963 - File['/usr/hdp/current/hive-server2-hive2/conf/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2017-10-31 11:17:33,119 - Writing File['/usr/hdp/current/hive-server2-hive2/conf/hive-site.xml'] because contents don't match
2017-10-31 11:17:33,119 - XmlConfig['hiveserver2-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2-hive2/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2017-10-31 11:17:33,125 - Generating config: /usr/hdp/current/hive-server2-hive2/conf/hiveserver2-site.xml
2017-10-31 11:17:33,126 - File['/usr/hdp/current/hive-server2-hive2/conf/hiveserver2-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2017-10-31 11:17:33,133 - File['/usr/hdp/current/hive-server2-hive2/conf/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:33,137 - File['/usr/hdp/current/hive-server2-hive2/conf/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:33,139 - File['/usr/hdp/current/hive-server2-hive2/conf/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:33,141 - File['/usr/hdp/current/hive-server2-hive2/conf/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:33,143 - File['/usr/hdp/current/hive-server2-hive2/conf/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:33,145 - File['/usr/hdp/current/hive-server2-hive2/conf/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:33,149 - File['/usr/hdp/current/hive-server2-hive2/conf/hadoop-metrics2-hiveserver2.properties'] {'content': Template('hadoop-metrics2-hiveserver2.properties.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:33,152 - File['/usr/hdp/current/hive-server2-hive2/conf/hadoop-metrics2-llapdaemon.properties'] {'content': Template('hadoop-metrics2-llapdaemon.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:33,155 - File['/usr/hdp/current/hive-server2-hive2/conf/hadoop-metrics2-llaptaskscheduler.properties'] {'content': Template('hadoop-metrics2-llaptaskscheduler.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2017-10-31 11:17:33,155 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2017-10-31 11:17:33,156 - Writing File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.jceks'] because contents don't match
2017-10-31 11:17:33,156 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2-hive2/conf/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2017-10-31 11:17:33,162 - Generating config: /usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.xml
2017-10-31 11:17:33,163 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2017-10-31 11:17:33,309 - Writing File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-site.xml'] because contents don't match
2017-10-31 11:17:33,310 - XmlConfig['hiveserver2-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2-hive2/conf/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2017-10-31 11:17:33,316 - Generating config: /usr/hdp/current/hive-server2-hive2/conf/conf.server/hiveserver2-site.xml
2017-10-31 11:17:33,316 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hiveserver2-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2017-10-31 11:17:33,324 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2017-10-31 11:17:33,327 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2017-10-31 11:17:33,330 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2017-10-31 11:17:33,332 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2017-10-31 11:17:33,334 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2017-10-31 11:17:33,336 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2017-10-31 11:17:33,339 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hadoop-metrics2-hiveserver2.properties'] {'content': Template('hadoop-metrics2-hiveserver2.properties.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2017-10-31 11:17:33,342 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hadoop-metrics2-llapdaemon.properties'] {'content': Template('hadoop-metrics2-llapdaemon.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2017-10-31 11:17:33,345 - File['/usr/hdp/current/hive-server2-hive2/conf/conf.server/hadoop-metrics2-llaptaskscheduler.properties'] {'content': Template('hadoop-metrics2-llaptaskscheduler.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2017-10-31 11:17:33,346 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2017-10-31 11:17:33,347 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2017-10-31 11:17:33,348 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://en1-dev1-tbdp.abc.com:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2017-10-31 11:17:33,348 - Not downloading the file from http://en1-dev1-tbdp.abc.com:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2017-10-31 11:17:33,349 - File['/var/lib/ambari-agent/tmp/start_hiveserver2_interactive_script'] {'content': Template('startHiveserver2Interactive.sh.j2'), 'mode': 0755}
2017-10-31 11:17:33,350 - Directory['/var/run/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2017-10-31 11:17:33,351 - Directory['/var/log/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2017-10-31 11:17:33,352 - Directory['/var/lib/hive2'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2017-10-31 11:17:33,352 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hive.service.keytab hive/en1-dev1-tbdp.abc.com@ABC.COM; '] {'user': 'hive'}
2017-10-31 11:17:33,441 - Execute['slider install-keytab --keytab /etc/security/keytabs/hive.service.keytab --folder hive --overwrite'] {'user': 'hive'}
2017-10-31 11:17:36,573 - Determining previous run 'LLAP package' folder(s) to be deleted ....
2017-10-31 11:17:36,574 - Previous run 'LLAP package' folder(s) to be deleted = ['llap-slider2017-10-18_10-55-54']
2017-10-31 11:17:36,574 - Directory['/var/lib/ambari-agent/tmp/llap-slider2017-10-18_10-55-54'] {'action': ['delete'], 'ignore_failures': True}
2017-10-31 11:17:36,574 - Removing directory Directory['/var/lib/ambari-agent/tmp/llap-slider2017-10-18_10-55-54'] and all its content
2017-10-31 11:17:36,596 - Starting LLAP
2017-10-31 11:17:36,597 - Setting slider_placement : 0, as llap_daemon_container_size : 11264 > 0.5 * YARN NodeManager Memory(12288)
2017-10-31 11:17:36,599 - LLAP start command: /usr/hdp/current/hive-server2-hive2/bin/hive --service llap --slider-am-container-mb 1024 --size 11264m --cache 2048m --xmx 7372m --loglevel INFO  --output /var/lib/ambari-agent/tmp/llap-slider2017-10-31_03-17-36 --slider-placement 0 --skiphadoopversion --skiphbasecp --instances 2 --logger query-routing --slider-keytab-dir .slider/keytabs/hive/ --slider-keytab hive.service.keytab --slider-principal hive/en1-dev1-tbdp.abc.com@ABC.COM --args " -XX:+AlwaysPreTouch -Xss512k -XX:+UseG1GC -XX:TLABSize=8m -XX:+ResizeTLAB -XX:+UseNUMA -XX:+AggressiveOpts -XX:InitiatingHeapOccupancyPercent=40 -XX:G1ReservePercent=20 -XX:MaxGCPauseMillis=200 -XX:MetaspaceSize=1024m"
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.1.0-129/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.1.0-129/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN cli.LlapServiceDriver: Ignoring unknown llap server parameter: [hive.aux.jars.path]
WARN cli.LlapServiceDriver: Java versions might not match : JAVA_HOME=[/usr/java/jdk1.8.0_131],process jre=[/usr/java/jdk1.8.0_131/jre]
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
Prepared /var/lib/ambari-agent/tmp/llap-slider2017-10-31_03-17-36/run.sh for running LLAP on Slider
2017-10-31 11:17:52,145 - Run file path: /var/lib/ambari-agent/tmp/llap-slider2017-10-31_03-17-36/run.sh
2017-10-31 11:17:52,145 - Execute['/var/lib/ambari-agent/tmp/llap-slider2017-10-31_03-17-36/run.sh'] {'logoutput': True, 'user': 'hive'}
2017-10-31 11:17:53,143 [main] INFO  tools.SliderUtils - JVM initialized into secure mode with kerberos realm ABC.COM
2017-10-31 11:17:53,818 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2017-10-31 11:17:53,828 [main] INFO  client.RMProxy - Connecting to ResourceManager at en1-dev1-tbdp.abc.com/10.3.11.10:8050
2017-10-31 11:17:53,899 [main] INFO  client.AHSProxy - Connecting to Application History server at en1-dev1-tbdp.abc.com/10.3.11.10:10200
2017-10-31 11:17:54,158 [main] INFO  client.SliderClient - Cluster llap0 is in a terminated state FINISHED
2017-10-31 11:17:54,160 [main] INFO  util.ExitUtil - Exiting with status 0
2017-10-31 11:17:56,497 [main] INFO  tools.SliderUtils - JVM initialized into secure mode with kerberos realm ABC.COM
2017-10-31 11:17:57,260 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2017-10-31 11:17:57,270 [main] INFO  client.RMProxy - Connecting to ResourceManager at en1-dev1-tbdp.abc.com/10.3.11.10:8050
2017-10-31 11:17:57,349 [main] INFO  client.AHSProxy - Connecting to Application History server at en1-dev1-tbdp.abc.com/10.3.11.10:10200
2017-10-31 11:17:57,617 [main] INFO  zk.ZKIntegration - Binding ZK client to en1-dev1-tbdp.abc.com:2181,dn1-dev1-tbdp.abc.com:2181,dn2-dev1-tbdp.abc.com:2181
2017-10-31 11:17:57,632 [main] INFO  zk.BlockingZKWatcher - waiting for ZK event
2017-10-31 11:17:57,659 [main-EventThread] INFO  zk.BlockingZKWatcher - ZK binding callback received
2017-10-31 11:17:57,688 [main] INFO  zk.RegistrySecurity - Enabling ZK sasl client: jaasClientEntry = Client, principal = null, keytab = null
2017-10-31 11:17:57,712 [main] INFO  imps.CuratorFrameworkImpl - Starting
2017-10-31 11:17:57,718 [main-SendThread(dn1-dev1-tbdp.abc.com:2181)] WARN  zookeeper.ClientCnxn - SASL configuration failed: javax.security.auth.login.LoginException: No key to store Will continue connection to Zookeeper server without SASL authentication, if Zookeeper server allows it.
2017-10-31 11:17:57,722 [main-EventThread] ERROR curator.ConnectionState - Authentication failed
2017-10-31 11:17:57,724 [main-EventThread] INFO  state.ConnectionStateManager - State change: CONNECTED
2017-10-31 11:17:57,744 [main] INFO  client.SliderClient - Destroyed cluster llap0
2017-10-31 11:17:57,746 [main] INFO  util.ExitUtil - Exiting with status 0
2017-10-31 11:17:59,206 [main] INFO  tools.SliderUtils - JVM initialized into secure mode with kerberos realm ABC.COM
2017-10-31 11:17:59,848 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2017-10-31 11:17:59,857 [main] INFO  client.RMProxy - Connecting to ResourceManager at en1-dev1-tbdp.abc.com/10.3.11.10:8050
2017-10-31 11:17:59,931 [main] INFO  client.AHSProxy - Connecting to Application History server at en1-dev1-tbdp.abc.com/10.3.11.10:10200
2017-10-31 11:17:59,935 [main] WARN  client.SliderClient - The install-package option has been deprecated. Please use 'package --install'.
2017-10-31 11:18:00,123 [main] INFO  client.SliderClient - Installing package file:/var/lib/ambari-agent/tmp/llap-slider2017-10-31_03-17-36/llap-31Oct2017.zip at hdfs://nn1-dev1-tbdp.abc.com:8020/user/hive/.slider/package/LLAP/llap-31Oct2017.zip and overwrite is true.
2017-10-31 11:18:01,129 [main] INFO  util.ExitUtil - Exiting with status 0
2017-10-31 11:18:02,296 [main] INFO  tools.SliderUtils - JVM initialized into secure mode with kerberos realm ABC.COM
2017-10-31 11:18:02,906 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2017-10-31 11:18:02,915 [main] INFO  client.RMProxy - Connecting to ResourceManager at en1-dev1-tbdp.abc.com/10.3.11.10:8050
2017-10-31 11:18:02,986 [main] INFO  client.AHSProxy - Connecting to Application History server at en1-dev1-tbdp.abc.com/10.3.11.10:10200
2017-10-31 11:18:03,654 [main] INFO  agent.AgentClientProvider - Validating app definition .slider/package/LLAP/llap-31Oct2017.zip
2017-10-31 11:18:03,656 [main] INFO  agent.AgentUtils - Reading metainfo at .slider/package/LLAP/llap-31Oct2017.zip
2017-10-31 11:18:04,544 [main] INFO  tools.SliderUtils - Reading metainfo.xml of size 1998
2017-10-31 11:18:04,693 [main] INFO  client.SliderClient - No credentials requested
2017-10-31 11:18:04,759 [main] INFO  agent.AgentUtils - Reading metainfo at .slider/package/LLAP/llap-31Oct2017.zip
2017-10-31 11:18:06,158 [main] INFO  tools.SliderUtils - Reading metainfo.xml of size 1998
2017-10-31 11:18:06,311 [main] INFO  launch.AbstractLauncher - Setting yarn.resourcemanager.am.retry-count-window-ms to 300000
2017-10-31 11:18:06,312 [main] INFO  launch.AbstractLauncher - Log include patterns: .*\.done
2017-10-31 11:18:06,312 [main] INFO  launch.AbstractLauncher - Log exclude patterns: 
2017-10-31 11:18:06,313 [main] INFO  launch.AbstractLauncher - Modified log include patterns: .*\.done
2017-10-31 11:18:06,313 [main] INFO  launch.AbstractLauncher - Modified log exclude patterns: 
2017-10-31 11:18:06,520 [main] INFO  slideram.SliderAMClientProvider - Loading all dependencies for AM.
2017-10-31 11:18:06,521 [main] INFO  tools.CoreFileSystem - Loading all dependencies from /hdp/apps/2.6.1.0-129/slider/slider.tar.gz
2017-10-31 11:18:06,525 [main] INFO  agent.AgentClientProvider - Automatically uploading the agent tarball at hdfs://nn1-dev1-tbdp.abc.com:8020/user/hive/.slider/cluster/llap0/tmp/application_1508472575762_0011/agent
2017-10-31 11:18:06,557 [main] INFO  agent.AgentClientProvider - Validating app definition .slider/package/LLAP/llap-31Oct2017.zip
2017-10-31 11:18:06,566 [main] INFO  client.SliderClient - Using queue llap for the application instance.
2017-10-31 11:18:06,567 [main] INFO  client.SliderClient - Submitting application application_1508472575762_0011
2017-10-31 11:18:06,592 [main] INFO  launch.AppMasterLauncher - Submitting application to Resource Manager
2017-10-31 11:18:06,850 [main] INFO  impl.TimelineClientImpl - Timeline service address: http://en1-dev1-tbdp.abc.com:8188/ws/v1/timeline/
2017-10-31 11:18:07,148 [main] INFO  impl.YarnClientImpl - Submitted application application_1508472575762_0011
2017-10-31 11:18:07,150 [main] INFO  util.ExitUtil - Exiting with status 0
2017-10-31 11:18:08,382 - Submitted LLAP app name : llap0
2017-10-31 11:18:08,383 - 
2017-10-31 11:18:08,383 - LLAP status command : /usr/hdp/current/hive-server2-hive2/bin/hive --service llapstatus -w -r 0.8 -i 2 -t 400
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.1.0-129/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.1.0-129/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
WARN conf.HiveConf: HiveConf hive.llap.daemon.vcpus.per.instance expects INT type value
LLAPSTATUS WatchMode with timeout=400 s
--------------------------------------------------------------------------------
LLAP Starting up with AppId=application_1508472575762_0011.
--------------------------------------------------------------------------------
ERROR cli.LlapStatusServiceDriver: FAILED: Failed to get container diagnostics from slider
org.apache.hadoop.hive.llap.cli.LlapStatusServiceDriver$LlapStatusCliException: Failed to get container diagnostics from slider
at org.apache.hadoop.hive.llap.cli.LlapStatusServiceDriver.populateAppStatusFromSliderDiagnostics(LlapStatusServiceDriver.java:543) [hive-llap-server-2.1.0.2.6.1.0-129.jar:2.1.0.2.6.1.0-129]
at org.apache.hadoop.hive.llap.cli.LlapStatusServiceDriver.run(LlapStatusServiceDriver.java:274) [hive-llap-server-2.1.0.2.6.1.0-129.jar:2.1.0.2.6.1.0-129]
at org.apache.hadoop.hive.llap.cli.LlapStatusServiceDriver.main(LlapStatusServiceDriver.java:914) [hive-llap-server-2.1.0.2.6.1.0-129.jar:2.1.0.2.6.1.0-129]
Caused by: org.apache.slider.core.exceptions.BadClusterStateException: Application not running: application_1508472575762_0011 state=FINISHED 
at org.apache.slider.server.appmaster.rpc.RpcBinder.getProxy(RpcBinder.java:225) ~[slider-core-0.92.0.2.6.1.0-129.jar:?]
at org.apache.slider.client.SliderClient.connect(SliderClient.java:3148) ~[slider-core-0.92.0.2.6.1.0-129.jar:?]
at org.apache.slider.client.SliderClient.bondToCluster(SliderClient.java:3526) ~[slider-core-0.92.0.2.6.1.0-129.jar:?]
at org.apache.slider.client.SliderClient.createClusterOperations(SliderClient.java:3539) ~[slider-core-0.92.0.2.6.1.0-129.jar:?]
at org.apache.slider.client.SliderClient.getApplicationDiagnostics(SliderClient.java:3831) ~[slider-core-0.92.0.2.6.1.0-129.jar:?]
at org.apache.slider.client.SliderClient.actionDiagnosticContainers(SliderClient.java:3816) ~[slider-core-0.92.0.2.6.1.0-129.jar:?]
at org.apache.hadoop.hive.llap.cli.LlapStatusServiceDriver.populateAppStatusFromSliderDiagnostics(LlapStatusServiceDriver.java:540) ~[hive-llap-server-2.1.0.2.6.1.0-129.jar:2.1.0.2.6.1.0-129]
... 2 more
FAILED: Failed to get container diagnostics from slider
WARN cli.LlapStatusServiceDriver: Watch mode enabled and got slider client error. Retrying..
WARN cli.LlapStatusServiceDriver: Application stopped while launching. COMPLETE state reached while waiting for RUNNING state. Failing fast..
LLAP Application already complete. ApplicationId=application_1508472575762_0011
Invalid resource request, requested memory < 0, or requested memory > max configured, requestedMemory=11264, maxMemory=4096
--------------------------------------------------------------------------------
{
  "amInfo" : {
    "appName" : "llap0",
    "appType" : "org-apache-slider",
    "appId" : "application_1508472575762_0011"
  },
  "state" : "COMPLETE",
  "diagnostics" : "Invalid resource request, requested memory < 0, or requested memory > max configured, requestedMemory=11264, maxMemory=4096",
  "appStartTime" : 1509419886914,
  "appFinishTime" : 1509419897498,
  "runningThresholdAchieved" : false
}
2017-10-31 11:18:19,943 - LLAP app 'llap0' current state is COMPLETE.
2017-10-31 11:18:19,943 - LLAP app 'llap0' current state is COMPLETE.
2017-10-31 11:18:19,943 - LLAP app 'llap0' deployment unsuccessful.
Command failed after 1 tries
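The actual root cause is buried in the llapstatus diagnostics above: the LLAP daemon container requested 11264 MB, but YARN's maximum container allocation (`yarn.scheduler.maximum-allocation-mb`) on this cluster is only 4096 MB, so YARN rejects the resource request and the Slider application finishes immediately. A minimal sketch (Python, using the JSON copied from the log above) that pulls the two numbers out of the diagnostics string to make the mismatch explicit:

```python
import json
import re

# llapstatus JSON output copied verbatim from the log above
status = json.loads("""
{
  "amInfo" : {
    "appName" : "llap0",
    "appType" : "org-apache-slider",
    "appId" : "application_1508472575762_0011"
  },
  "state" : "COMPLETE",
  "diagnostics" : "Invalid resource request, requested memory < 0, or requested memory > max configured, requestedMemory=11264, maxMemory=4096",
  "appStartTime" : 1509419886914,
  "appFinishTime" : 1509419897498,
  "runningThresholdAchieved" : false
}
""")

# Extract requestedMemory and maxMemory from the diagnostics message
requested, maximum = (int(m) for m in
                      re.search(r"requestedMemory=(\d+), maxMemory=(\d+)",
                                status["diagnostics"]).groups())
print(requested, maximum)   # 11264 4096
print(requested > maximum)  # True -> YARN rejects the container request
```

So either the LLAP daemon container size must come down, or `yarn.scheduler.maximum-allocation-mb` must go up, before the app can start.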

Re: unable to enable HiveServer2 Interactive

Contributor

Fixed after adjusting the container memory and cache memory settings.

Re: unable to enable HiveServer2 Interactive

Contributor

Any magic numbers or best practices for the memory settings below for LLAP?

  • LLAP Daemon Container Max Headroom (MB)
  • LLAP Daemon Heap Size (MB)
  • Memory per Daemon
  • In-Memory Cache per Daemon
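There is no single magic number, but the settings have to nest: heap plus cache plus headroom must fit inside the daemon container ("Memory per Daemon"), and the daemon container must itself fit inside a single YARN container (`yarn.scheduler.maximum-allocation-mb`), which is exactly the constraint violated in the log above (requestedMemory=11264 > maxMemory=4096). A rough sanity-check sketch, assuming that usual LLAP layout; the numbers below are purely illustrative, not recommendations for any particular cluster:

```python
# Illustrative values in MB -- substitute your own cluster's settings.
yarn_max_allocation = 12288   # yarn.scheduler.maximum-allocation-mb
memory_per_daemon   = 11264   # "Memory per Daemon" (LLAP daemon container size)
heap_size           = 6144    # "LLAP Daemon Heap Size (MB)"
cache_size          = 4096    # "In-Memory Cache per Daemon"
headroom            = 1024    # "LLAP Daemon Container Max Headroom (MB)"

# Heap, cache, and headroom must all fit inside the daemon container...
assert heap_size + cache_size + headroom <= memory_per_daemon

# ...and the daemon container must fit inside one YARN container,
# otherwise YARN rejects the request as in the log above.
assert memory_per_daemon <= yarn_max_allocation

print("sizing is consistent")
```

If the second assertion fails for your values, either shrink the daemon (smaller heap/cache) or raise `yarn.scheduler.maximum-allocation-mb` in the YARN configuration.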