
Failed to start HiveServer2 using Ambari 2.7 and HDP 3.0

Hello everyone,

I have tried a lot of solutions for this issue, but so far in vain.

Changes I tried that did not help:

  1. Set the Java home directory (JAVA_HOME).
  2. Changed the warehouse root directory.
  3. Created the znode for HiveServer2 manually (but it returns an empty set; see the sketch right after this list).
  4. Reinstalled ZooKeeper and Hive.
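
For reference, this is roughly how I checked and created the znode in step 3 (a minimal sketch; the ZooKeeper quorum addresses are the ones from my cluster):

/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181

# inside the zkCli shell:
create /hiveserver2 ""
ls /hiveserver2                # returns [] - no serverUri= entries
ls /hiveserver2-interactive    # Node does not exist (same as the retries in the stdout below)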

Please check the outputs below:

* stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_server.py", line 143, in <module>
    HiveServer().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
    method(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 1003, in restart
    self.start(env, upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_server.py", line 53, in start
    hive_service('hiveserver2', action = 'start', upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_service.py", line 101, in hive_service
    wait_for_znode()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/decorator.py", line 54, in wrapper
    return function(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_service.py", line 184, in wait_for_znode
    raise Exception(format("HiveServer2 is no longer running, check the logs at {hive_log_dir}"))
Exception: HiveServer2 is no longer running, check the logs at /var/log/hive
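
Since the exception points at /var/log/hive, this is how I pulled the log excerpt shown at the bottom of this post (the .err file name comes from the start script arguments in the stdout below):

tail -n 200 /var/log/hive/hiveserver2.log /var/log/hive/hive-server2.err
grep -iE 'error|exception' /var/log/hive/hiveserver2.log | tail -n 20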

* stdout:

2018-10-29 11:05:32,091 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-10-29 11:05:32,112 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2018-10-29 11:05:32,448 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-10-29 11:05:32,455 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2018-10-29 11:05:32,456 - Group['hdfs'] {}
2018-10-29 11:05:32,482 - Group['hadoop'] {}
2018-10-29 11:05:32,482 - Group['users'] {}
2018-10-29 11:05:32,483 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-29 11:05:32,483 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-29 11:05:32,484 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-29 11:05:32,485 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-10-29 11:05:32,486 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-10-29 11:05:32,487 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2018-10-29 11:05:32,488 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-29 11:05:32,488 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-29 11:05:32,489 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-10-29 11:05:32,491 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-10-29 11:05:32,547 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-10-29 11:05:32,547 - Group['hdfs'] {}
2018-10-29 11:05:32,548 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2018-10-29 11:05:32,548 - FS Type: HDFS
2018-10-29 11:05:32,549 - Directory['/etc/hadoop'] {'mode': 0755}
2018-10-29 11:05:32,576 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-10-29 11:05:32,577 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-10-29 11:05:32,600 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2018-10-29 11:05:32,609 - Skipping Execute[('setenforce', '0')] due to not_if
2018-10-29 11:05:32,610 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2018-10-29 11:05:32,612 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2018-10-29 11:05:32,613 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'cd_access': 'a'}
2018-10-29 11:05:32,613 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2018-10-29 11:05:32,618 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2018-10-29 11:05:32,620 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2018-10-29 11:05:32,628 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:32,665 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-10-29 11:05:32,666 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2018-10-29 11:05:32,673 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2018-10-29 11:05:32,677 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:32,682 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2018-10-29 11:05:32,686 - Skipping unlimited key JCE policy check and setup since it is not required
2018-10-29 11:05:33,116 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2018-10-29 11:05:33,128 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2018-10-29 11:05:33,157 - call returned (0, 'hive-server2 - 3.0.1.0-187')
2018-10-29 11:05:33,158 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-10-29 11:05:33,183 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://manager.cluster.dev:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2018-10-29 11:05:33,185 - Not downloading the file from http://manager.cluster.dev:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2018-10-29 11:05:34,615 - call['ambari-sudo.sh su hive -l -s /bin/bash -c 'cat /var/run/hive/hive-server.pid 1>/tmp/tmp5_Py1q 2>/tmp/tmpUiH4Lt''] {'quiet': False}
2018-10-29 11:05:34,767 - call returned (0, '')
2018-10-29 11:05:34,768 - get_user_call_output returned (0, u'9460', u'')
2018-10-29 11:05:34,768 - Execute['ambari-sudo.sh kill 9460'] {'not_if': '! (ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps -p 9460 >/dev/null 2>&1)'}
2018-10-29 11:05:34,784 - Skipping Execute['ambari-sudo.sh kill 9460'] due to not_if
2018-10-29 11:05:34,785 - Execute['ambari-sudo.sh kill -9 9460'] {'not_if': '! (ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps -p 9460 >/dev/null 2>&1) || ( sleep 5 && ! (ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps -p 9460 >/dev/null 2>&1) )', 'ignore_failures': True}
2018-10-29 11:05:34,801 - Skipping Execute['ambari-sudo.sh kill -9 9460'] due to not_if
2018-10-29 11:05:34,802 - Execute['! (ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps -p 9460 >/dev/null 2>&1)'] {'tries': 20, 'try_sleep': 3}
2018-10-29 11:05:34,817 - File['/var/run/hive/hive-server.pid'] {'action': ['delete']}
2018-10-29 11:05:34,818 - Deleting File['/var/run/hive/hive-server.pid']
2018-10-29 11:05:34,818 - Pid file /var/run/hive/hive-server.pid is empty or does not exist
2018-10-29 11:05:34,823 - Directories to fill with configs: [u'/usr/hdp/current/hive-server2/conf', u'/usr/hdp/current/hive-server2/conf/']
2018-10-29 11:05:34,824 - Directory['/etc/hive/3.0.1.0-187/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
2018-10-29 11:05:34,824 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/3.0.1.0-187/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-10-29 11:05:34,839 - Generating config: /etc/hive/3.0.1.0-187/0/mapred-site.xml
2018-10-29 11:05:34,839 - File['/etc/hive/3.0.1.0-187/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-10-29 11:05:34,892 - File['/etc/hive/3.0.1.0-187/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:34,892 - File['/etc/hive/3.0.1.0-187/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-10-29 11:05:34,896 - File['/etc/hive/3.0.1.0-187/0/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:34,900 - File['/etc/hive/3.0.1.0-187/0/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:34,903 - File['/etc/hive/3.0.1.0-187/0/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:34,906 - File['/etc/hive/3.0.1.0-187/0/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:34,908 - File['/etc/hive/3.0.1.0-187/0/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:34,909 - XmlConfig['beeline-site.xml'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'conf_dir': '/etc/hive/3.0.1.0-187/0', 'configurations': {'beeline.hs2.jdbc.url.container': u'jdbc:hive2://node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-interactive', 'beeline.hs2.jdbc.url.default': 'container'}}
2018-10-29 11:05:34,918 - Generating config: /etc/hive/3.0.1.0-187/0/beeline-site.xml
2018-10-29 11:05:34,918 - File['/etc/hive/3.0.1.0-187/0/beeline-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-10-29 11:05:34,922 - File['/etc/hive/3.0.1.0-187/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:34,923 - Directory['/etc/hive/3.0.1.0-187/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
2018-10-29 11:05:34,923 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/3.0.1.0-187/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-10-29 11:05:34,933 - Generating config: /etc/hive/3.0.1.0-187/0/mapred-site.xml
2018-10-29 11:05:34,933 - File['/etc/hive/3.0.1.0-187/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-10-29 11:05:34,987 - File['/etc/hive/3.0.1.0-187/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:34,987 - File['/etc/hive/3.0.1.0-187/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-10-29 11:05:34,991 - File['/etc/hive/3.0.1.0-187/0/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:34,994 - File['/etc/hive/3.0.1.0-187/0/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:34,997 - File['/etc/hive/3.0.1.0-187/0/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:35,000 - File['/etc/hive/3.0.1.0-187/0/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:35,003 - File['/etc/hive/3.0.1.0-187/0/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:35,004 - XmlConfig['beeline-site.xml'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'conf_dir': '/etc/hive/3.0.1.0-187/0', 'configurations': {'beeline.hs2.jdbc.url.container': u'jdbc:hive2://node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-interactive', 'beeline.hs2.jdbc.url.default': 'container'}}
2018-10-29 11:05:35,014 - Generating config: /etc/hive/3.0.1.0-187/0/beeline-site.xml
2018-10-29 11:05:35,014 - File['/etc/hive/3.0.1.0-187/0/beeline-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-10-29 11:05:35,017 - File['/etc/hive/3.0.1.0-187/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-10-29 11:05:35,018 - File['/usr/hdp/current/hive-server2/conf/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_server/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2018-10-29 11:05:35,019 - Writing File['/usr/hdp/current/hive-server2/conf/hive-site.jceks'] because contents don't match
2018-10-29 11:05:35,020 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/', 'mode': 0644, 'configuration_attributes': {u'hidden': {u'javax.jdo.option.ConnectionPassword': u'HIVE_CLIENT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}
2018-10-29 11:05:35,030 - Generating config: /usr/hdp/current/hive-server2/conf/hive-site.xml
2018-10-29 11:05:35,030 - File['/usr/hdp/current/hive-server2/conf/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-10-29 11:05:35,207 - Writing File['/usr/hdp/current/hive-server2/conf/hive-site.xml'] because contents don't match
2018-10-29 11:05:35,213 - File['/usr/hdp/current/hive-server2/conf//hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-10-29 11:05:35,213 - Writing File['/usr/hdp/current/hive-server2/conf//hive-env.sh'] because contents don't match
2018-10-29 11:05:35,214 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2018-10-29 11:05:35,217 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2018-10-29 11:05:35,218 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://manager.cluster.dev:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2018-10-29 11:05:35,218 - Not downloading the file from http://manager.cluster.dev:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2018-10-29 11:05:35,219 - Directory['/var/run/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2018-10-29 11:05:35,220 - Directory['/var/log/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2018-10-29 11:05:35,220 - Directory['/var/lib/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2018-10-29 11:05:35,222 - File['/var/lib/ambari-agent/tmp/start_hiveserver2_script'] {'content': Template('startHiveserver2.sh.j2'), 'mode': 0755}
2018-10-29 11:05:35,228 - File['/usr/hdp/current/hive-server2/conf/hadoop-metrics2-hiveserver2.properties'] {'content': Template('hadoop-metrics2-hiveserver2.properties.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2018-10-29 11:05:35,228 - XmlConfig['hiveserver2-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-server2/conf/', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': {u'hive.metastore.metrics.enabled': u'true', u'hive.security.authorization.enabled': u'false', u'hive.service.metrics.reporter': u'HADOOP2', u'hive.service.metrics.hadoop2.component': u'hiveserver2', u'hive.server2.metrics.enabled': u'true'}}
2018-10-29 11:05:35,239 - Generating config: /usr/hdp/current/hive-server2/conf/hiveserver2-site.xml
2018-10-29 11:05:35,239 - File['/usr/hdp/current/hive-server2/conf/hiveserver2-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2018-10-29 11:05:35,248 - Called copy_to_hdfs tarball: mapreduce
2018-10-29 11:05:35,248 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-10-29 11:05:35,248 - Tarball version was calcuated as 3.0.1.0-187. Use Command Version: True
2018-10-29 11:05:35,249 - Source file: /usr/hdp/3.0.1.0-187/hadoop/mapreduce.tar.gz , Dest file in HDFS: /hdp/apps/3.0.1.0-187/mapreduce/mapreduce.tar.gz
2018-10-29 11:05:35,249 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-10-29 11:05:35,249 - Tarball version was calcuated as 3.0.1.0-187. Use Command Version: True
2018-10-29 11:05:35,249 - HdfsResource['/hdp/apps/3.0.1.0-187/mapreduce'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0555}
2018-10-29 11:05:35,252 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/hdp/apps/3.0.1.0-187/mapreduce?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp0x53wy 2>/tmp/tmpjJlgwL''] {'logoutput': None, 'quiet': False}
2018-10-29 11:05:35,454 - call returned (0, '')
2018-10-29 11:05:35,455 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":2,"fileId":16402,"group":"hdfs","length":0,"modificationTime":1540300267808,"owner":"hdfs","pathSuffix":"","permission":"555","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-10-29 11:05:35,456 - HdfsResource['/hdp/apps/3.0.1.0-187/mapreduce/mapreduce.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/3.0.1.0-187/hadoop/mapreduce.tar.gz', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0444}
2018-10-29 11:05:35,457 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/hdp/apps/3.0.1.0-187/mapreduce/mapreduce.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpRBst3y 2>/tmp/tmpHUitUb''] {'logoutput': None, 'quiet': False}
2018-10-29 11:05:35,571 - call returned (0, '')
2018-10-29 11:05:35,571 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":1540299973992,"blockSize":134217728,"childrenNum":0,"fileId":16403,"group":"hadoop","length":307712858,"modificationTime":1540299979031,"owner":"hdfs","pathSuffix":"","permission":"444","replication":3,"storagePolicy":0,"type":"FILE"}}200', u'')
2018-10-29 11:05:35,572 - DFS file /hdp/apps/3.0.1.0-187/mapreduce/mapreduce.tar.gz is identical to /usr/hdp/3.0.1.0-187/hadoop/mapreduce.tar.gz, skipping the copying
2018-10-29 11:05:35,572 - Will attempt to copy mapreduce tarball from /usr/hdp/3.0.1.0-187/hadoop/mapreduce.tar.gz to DFS at /hdp/apps/3.0.1.0-187/mapreduce/mapreduce.tar.gz.
2018-10-29 11:05:35,572 - Called copy_to_hdfs tarball: tez
2018-10-29 11:05:35,573 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-10-29 11:05:35,573 - Tarball version was calcuated as 3.0.1.0-187. Use Command Version: True
2018-10-29 11:05:35,573 - Source file: /usr/hdp/3.0.1.0-187/tez/lib/tez.tar.gz , Dest file in HDFS: /hdp/apps/3.0.1.0-187/tez/tez.tar.gz
2018-10-29 11:05:35,573 - Preparing the Tez tarball...
2018-10-29 11:05:35,573 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-10-29 11:05:35,573 - Tarball version was calcuated as 3.0.1.0-187. Use Command Version: True
2018-10-29 11:05:35,573 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-10-29 11:05:35,574 - Tarball version was calcuated as 3.0.1.0-187. Use Command Version: True
2018-10-29 11:05:35,574 - Extracting /usr/hdp/3.0.1.0-187/hadoop/mapreduce.tar.gz to /var/lib/ambari-agent/tmp/mapreduce-tarball-7p9vaL
2018-10-29 11:05:35,574 - Execute[('tar', '-xf', u'/usr/hdp/3.0.1.0-187/hadoop/mapreduce.tar.gz', '-C', '/var/lib/ambari-agent/tmp/mapreduce-tarball-7p9vaL/')] {'tries': 3, 'sudo': True, 'try_sleep': 1}
2018-10-29 11:05:40,795 - Extracting /usr/hdp/3.0.1.0-187/tez/lib/tez.tar.gz to /var/lib/ambari-agent/tmp/tez-tarball-BAJMNr
2018-10-29 11:05:40,796 - Execute[('tar', '-xf', u'/usr/hdp/3.0.1.0-187/tez/lib/tez.tar.gz', '-C', '/var/lib/ambari-agent/tmp/tez-tarball-BAJMNr/')] {'tries': 3, 'sudo': True, 'try_sleep': 1}
2018-10-29 11:05:44,716 - Execute[('cp', '-a', '/var/lib/ambari-agent/tmp/mapreduce-tarball-7p9vaL/hadoop/lib/native', '/var/lib/ambari-agent/tmp/tez-tarball-BAJMNr/lib')] {'sudo': True}
2018-10-29 11:05:44,767 - Directory['/var/lib/ambari-agent/tmp/tez-tarball-BAJMNr/lib'] {'recursive_ownership': True, 'mode': 0755, 'cd_access': 'a'}
2018-10-29 11:05:44,768 - Creating a new Tez tarball at /var/lib/ambari-agent/tmp/tez-native-tarball-staging/tez-native.tar.gz
2018-10-29 11:05:44,768 - Execute[('tar', '-zchf', '/tmp/tmpt1l8Sg', '-C', '/var/lib/ambari-agent/tmp/tez-tarball-BAJMNr', '.')] {'tries': 3, 'sudo': True, 'try_sleep': 1}
2018-10-29 11:06:03,131 - Execute[('mv', '/tmp/tmpt1l8Sg', '/var/lib/ambari-agent/tmp/tez-native-tarball-staging/tez-native.tar.gz')] {}
2018-10-29 11:06:04,038 - HdfsResource['/hdp/apps/3.0.1.0-187/tez'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0555}
2018-10-29 11:06:04,039 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/hdp/apps/3.0.1.0-187/tez?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpwqi8fm 2>/tmp/tmpO_BlSx''] {'logoutput': None, 'quiet': False}
2018-10-29 11:06:04,164 - call returned (0, '')
2018-10-29 11:06:04,165 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":1,"fileId":16435,"group":"hdfs","length":0,"modificationTime":1540300020020,"owner":"hdfs","pathSuffix":"","permission":"555","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-10-29 11:06:04,167 - HdfsResource['/hdp/apps/3.0.1.0-187/tez/tez.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'source': '/var/lib/ambari-agent/tmp/tez-native-tarball-staging/tez-native.tar.gz', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0444}
2018-10-29 11:06:04,170 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/hdp/apps/3.0.1.0-187/tez/tez.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpUJeuh2 2>/tmp/tmpFQiHSQ''] {'logoutput': None, 'quiet': False}
2018-10-29 11:06:04,318 - call returned (0, '')
2018-10-29 11:06:04,319 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":1540300020020,"blockSize":134217728,"childrenNum":0,"fileId":16436,"group":"hadoop","length":254014338,"modificationTime":1540300027730,"owner":"hdfs","pathSuffix":"","permission":"444","replication":3,"storagePolicy":0,"type":"FILE"}}200', u'')
2018-10-29 11:06:04,321 - Not replacing existing DFS file /hdp/apps/3.0.1.0-187/tez/tez.tar.gz which is different from /var/lib/ambari-agent/tmp/tez-native-tarball-staging/tez-native.tar.gz, due to replace_existing_files=False
2018-10-29 11:06:04,321 - Will attempt to copy tez tarball from /var/lib/ambari-agent/tmp/tez-native-tarball-staging/tez-native.tar.gz to DFS at /hdp/apps/3.0.1.0-187/tez/tez.tar.gz.
2018-10-29 11:06:04,321 - Called copy_to_hdfs tarball: pig
2018-10-29 11:06:04,322 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-10-29 11:06:04,322 - Tarball version was calcuated as 3.0.1.0-187. Use Command Version: True
2018-10-29 11:06:04,322 - pig-env is not present on the cluster. Skip copying /usr/hdp/3.0.1.0-187/pig/pig.tar.gz
2018-10-29 11:06:04,323 - Called copy_to_hdfs tarball: hive
2018-10-29 11:06:04,323 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-10-29 11:06:04,323 - Tarball version was calcuated as 3.0.1.0-187. Use Command Version: True
2018-10-29 11:06:04,323 - Source file: /usr/hdp/3.0.1.0-187/hive/hive.tar.gz , Dest file in HDFS: /hdp/apps/3.0.1.0-187/hive/hive.tar.gz
2018-10-29 11:06:04,324 - HdfsResource['/hdp/apps/3.0.1.0-187/hive'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0555}
2018-10-29 11:06:04,327 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/hdp/apps/3.0.1.0-187/hive?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmphCCNCQ 2>/tmp/tmpv4DKcT''] {'logoutput': None, 'quiet': False}
2018-10-29 11:06:04,472 - call returned (0, '')
2018-10-29 11:06:04,473 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":1,"fileId":17283,"group":"hdfs","length":0,"modificationTime":1540300257703,"owner":"hdfs","pathSuffix":"","permission":"555","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-10-29 11:06:04,475 - HdfsResource['/hdp/apps/3.0.1.0-187/hive/hive.tar.gz'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/3.0.1.0-187/hive/hive.tar.gz', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0444}
2018-10-29 11:06:04,477 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/hdp/apps/3.0.1.0-187/hive/hive.tar.gz?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpd67XKd 2>/tmp/tmpXiQbDY''] {'logoutput': None, 'quiet': False}
2018-10-29 11:06:04,623 - call returned (0, '')
2018-10-29 11:06:04,624 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":1540300257703,"blockSize":134217728,"childrenNum":0,"fileId":17284,"group":"hadoop","length":381339333,"modificationTime":1540300266991,"owner":"hdfs","pathSuffix":"","permission":"444","replication":3,"storagePolicy":0,"type":"FILE"}}200', u'')
2018-10-29 11:06:04,626 - DFS file /hdp/apps/3.0.1.0-187/hive/hive.tar.gz is identical to /usr/hdp/3.0.1.0-187/hive/hive.tar.gz, skipping the copying
2018-10-29 11:06:04,627 - Will attempt to copy hive tarball from /usr/hdp/3.0.1.0-187/hive/hive.tar.gz to DFS at /hdp/apps/3.0.1.0-187/hive/hive.tar.gz.
2018-10-29 11:06:04,627 - Called copy_to_hdfs tarball: sqoop
2018-10-29 11:06:04,627 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-10-29 11:06:04,628 - Tarball version was calcuated as 3.0.1.0-187. Use Command Version: True
2018-10-29 11:06:04,628 - sqoop-env is not present on the cluster. Skip copying /usr/hdp/3.0.1.0-187/sqoop/sqoop.tar.gz
2018-10-29 11:06:04,628 - Called copy_to_hdfs tarball: hadoop_streaming
2018-10-29 11:06:04,629 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2018-10-29 11:06:04,629 - Tarball version was calcuated as 3.0.1.0-187. Use Command Version: True
2018-10-29 11:06:04,629 - Source file: /usr/hdp/3.0.1.0-187/hadoop-mapreduce/hadoop-streaming.jar , Dest file in HDFS: /hdp/apps/3.0.1.0-187/mapreduce/hadoop-streaming.jar
2018-10-29 11:06:04,630 - HdfsResource['/hdp/apps/3.0.1.0-187/mapreduce'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0555}
2018-10-29 11:06:04,632 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/hdp/apps/3.0.1.0-187/mapreduce?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpJtyWbZ 2>/tmp/tmpn_Vzgy''] {'logoutput': None, 'quiet': False}
2018-10-29 11:06:04,777 - call returned (0, '')
2018-10-29 11:06:04,777 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":2,"fileId":16402,"group":"hdfs","length":0,"modificationTime":1540300267808,"owner":"hdfs","pathSuffix":"","permission":"555","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-10-29 11:06:04,780 - HdfsResource['/hdp/apps/3.0.1.0-187/mapreduce/hadoop-streaming.jar'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'source': '/usr/hdp/3.0.1.0-187/hadoop-mapreduce/hadoop-streaming.jar', 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'replace_existing_files': False, 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'file', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 0444}
2018-10-29 11:06:04,782 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/hdp/apps/3.0.1.0-187/mapreduce/hadoop-streaming.jar?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpJJqlsp 2>/tmp/tmpxGY5QC''] {'logoutput': None, 'quiet': False}
2018-10-29 11:06:04,924 - call returned (0, '')
2018-10-29 11:06:04,924 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":1540300267808,"blockSize":134217728,"childrenNum":0,"fileId":17285,"group":"hadoop","length":176345,"modificationTime":1540300268016,"owner":"hdfs","pathSuffix":"","permission":"444","replication":3,"storagePolicy":0,"type":"FILE"}}200', u'')
2018-10-29 11:06:04,926 - DFS file /hdp/apps/3.0.1.0-187/mapreduce/hadoop-streaming.jar is identical to /usr/hdp/3.0.1.0-187/hadoop-mapreduce/hadoop-streaming.jar, skipping the copying
2018-10-29 11:06:04,927 - Will attempt to copy hadoop_streaming tarball from /usr/hdp/3.0.1.0-187/hadoop-mapreduce/hadoop-streaming.jar to DFS at /hdp/apps/3.0.1.0-187/mapreduce/hadoop-streaming.jar.
2018-10-29 11:06:04,928 - HdfsResource['/warehouse/tablespace/external/hive/sys.db/'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 01755}
2018-10-29 11:06:04,930 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/warehouse/tablespace/external/hive/sys.db/?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpktQRl1 2>/tmp/tmpqc16Lr''] {'logoutput': None, 'quiet': False}
2018-10-29 11:06:05,072 - call returned (0, '')
2018-10-29 11:06:05,073 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":4,"fileId":17286,"group":"hadoop","length":0,"modificationTime":1540300270926,"owner":"hive","pathSuffix":"","permission":"1755","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-10-29 11:06:05,075 - HdfsResource['/warehouse/tablespace/external/hive/sys.db/query_data/'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 01777}
2018-10-29 11:06:05,078 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/warehouse/tablespace/external/hive/sys.db/query_data/?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmprJOPt0 2>/tmp/tmpuLrIAP''] {'logoutput': None, 'quiet': False}
2018-10-29 11:06:05,224 - call returned (0, '')
2018-10-29 11:06:05,224 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":0,"fileId":17287,"group":"hadoop","length":0,"modificationTime":1540300269219,"owner":"hive","pathSuffix":"","permission":"1777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-10-29 11:06:05,227 - HdfsResource['/warehouse/tablespace/external/hive/sys.db/dag_meta'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 01777}
2018-10-29 11:06:05,229 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/warehouse/tablespace/external/hive/sys.db/dag_meta?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpj0bjfQ 2>/tmp/tmp8VslLM''] {'logoutput': None, 'quiet': False}
2018-10-29 11:06:05,374 - call returned (0, '')
2018-10-29 11:06:05,374 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":1,"fileId":17288,"group":"hadoop","length":0,"modificationTime":1540300308838,"owner":"hive","pathSuffix":"","permission":"1777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-10-29 11:06:05,377 - HdfsResource['/warehouse/tablespace/external/hive/sys.db/dag_data'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 01777}
2018-10-29 11:06:05,379 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/warehouse/tablespace/external/hive/sys.db/dag_data?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp9_F0wY 2>/tmp/tmpqkv9v3''] {'logoutput': None, 'quiet': False}
2018-10-29 11:06:05,524 - call returned (0, '')
2018-10-29 11:06:05,525 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":1,"fileId":17289,"group":"hadoop","length":0,"modificationTime":1540300297502,"owner":"hive","pathSuffix":"","permission":"1777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-10-29 11:06:05,527 - HdfsResource['/warehouse/tablespace/external/hive/sys.db/app_data'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp'], 'mode': 01777}
2018-10-29 11:06:05,530 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://node1.cluster.dev:50070/webhdfs/v1/warehouse/tablespace/external/hive/sys.db/app_data?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpMaGPu4 2>/tmp/tmpcYYOTo''] {'logoutput': None, 'quiet': False}
2018-10-29 11:06:05,674 - call returned (0, '')
2018-10-29 11:06:05,675 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"aclBit":true,"blockSize":0,"childrenNum":1,"fileId":17290,"group":"hadoop","length":0,"modificationTime":1540300293606,"owner":"hive","pathSuffix":"","permission":"1777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2018-10-29 11:06:05,677 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.0.1.0-187/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://node1.cluster.dev:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/3.0.1.0-187/hadoop/conf', 'immutable_paths': [u'/mr-history/done', u'/warehouse/tablespace/managed/hive', u'/warehouse/tablespace/external/hive', u'/app-logs', u'/tmp']}
2018-10-29 11:06:05,686 - Directory['/usr/lib/ambari-logsearch-logfeeder/conf'] {'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2018-10-29 11:06:05,687 - Generate Log Feeder config file: /usr/lib/ambari-logsearch-logfeeder/conf/input.config-hive.json
2018-10-29 11:06:05,687 - File['/usr/lib/ambari-logsearch-logfeeder/conf/input.config-hive.json'] {'content': Template('input.config-hive.json.j2'), 'mode': 0644}
2018-10-29 11:06:05,689 - Ranger Hive plugin is not enabled
2018-10-29 11:06:05,690 - call['ambari-sudo.sh su hive -l -s /bin/bash -c 'cat /var/run/hive/hive-server.pid 1>/tmp/tmptTvOXl 2>/tmp/tmposOzBX''] {'quiet': False}
2018-10-29 11:06:05,813 - call returned (1, '')
2018-10-29 11:06:05,814 - Execution of 'cat /var/run/hive/hive-server.pid 1>/tmp/tmptTvOXl 2>/tmp/tmposOzBX' returned 1. cat: /var/run/hive/hive-server.pid: No such file or directory

2018-10-29 11:06:05,814 - get_user_call_output returned (1, u'', u'cat: /var/run/hive/hive-server.pid: No such file or directory')
2018-10-29 11:06:05,816 - call['ambari-sudo.sh su hive -l -s /bin/bash -c 'hive --config /usr/hdp/current/hive-server2/conf/ --service metatool -listFSRoot' 2>/dev/null | grep hdfs:// | cut -f1,2,3 -d '/' | grep -v 'hdfs://node1.cluster.dev:8020' | head -1'] {}
2018-10-29 11:06:25,339 - call returned (0, '')
2018-10-29 11:06:25,340 - Execute['/var/lib/ambari-agent/tmp/start_hiveserver2_script /var/log/hive/hive-server2.out /var/log/hive/hive-server2.err /var/run/hive/hive-server.pid /usr/hdp/current/hive-server2/conf/ /etc/tez/conf'] {'environment': {'HIVE_BIN': 'hive', 'JAVA_HOME': u'/usr/jdk64/jdk1.8.0_112', 'HADOOP_HOME': u'/usr/hdp/current/hadoop-client'}, 'not_if': 'ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps -p  >/dev/null 2>&1', 'user': 'hive', 'path': [u'/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/var/lib/ambari-agent:/usr/hdp/current/hive-server2/bin:/usr/hdp/3.0.1.0-187/hadoop/bin']}
2018-10-29 11:06:25,483 - Execute['/usr/jdk64/jdk1.8.0_112/bin/java -cp /usr/lib/ambari-agent/DBConnectionVerification.jar:/usr/hdp/current/hive-server2/lib/mysql-connector-java.jar org.apache.ambari.server.DBConnectionVerification 'jdbc:mysql://manager.cluster.dev/hivedb' hivedbuser [PROTECTED] com.mysql.jdbc.Driver'] {'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 5, 'try_sleep': 10}
2018-10-29 11:06:26,154 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:06:27,311 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:06:27,312 - Will retry 29 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:06:37,331 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:06:38,388 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:06:38,390 - Will retry 28 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:06:48,398 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:06:49,344 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:06:49,346 - Will retry 27 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:06:59,358 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:07:00,778 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:07:00,780 - Will retry 26 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:07:10,790 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:07:11,856 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:07:11,857 - Will retry 25 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:07:21,868 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:07:22,791 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:07:22,792 - Will retry 24 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:07:32,795 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:07:33,822 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:07:33,823 - Will retry 23 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:07:43,830 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:07:44,758 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:07:44,760 - Will retry 22 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:07:54,768 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:07:56,003 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:07:56,005 - Will retry 21 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:08:06,016 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:08:06,978 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:08:06,979 - Will retry 20 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:08:16,986 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:08:17,870 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:08:17,871 - Will retry 19 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:08:27,882 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:08:28,902 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:08:28,904 - Will retry 18 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:08:38,914 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:08:39,861 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:08:39,863 - Will retry 17 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:08:49,865 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:08:50,803 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:08:50,804 - Will retry 16 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:09:00,816 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:09:01,711 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:09:01,712 - Will retry 15 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:09:11,723 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:09:12,687 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:09:12,688 - Will retry 14 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
2018-10-29 11:09:22,695 - call['/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 ls /hiveserver2-interactive | grep 'serverUri=''] {}
2018-10-29 11:09:23,632 - call returned (1, 'Node does not exist: /hiveserver2-interactive')
2018-10-29 11:09:23,634 - Will retry 13 time(s), caught exception: ZooKeeper node /hiveserver2-interactive is not ready yet. Sleeping for 10 sec(s)
(... the identical zkCli call and "Node does not exist: /hiveserver2-interactive" response repeat every 10 seconds while the retry counter counts down through 6 ...)
2018-10-29 11:10:50,620 - Process with pid 21322 is not running. Stale pid file at /var/run/hive/hive-server.pid
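
For what it's worth, the znode check Ambari loops on above can be reproduced by hand. A minimal sketch, using the same zkCli.sh path and ZooKeeper quorum shown in the log (the second command, listing the root, is just my way of checking which HiveServer2 znodes exist at all):

# Same check the retry loop runs
/usr/hdp/current/zookeeper-client/bin/zkCli.sh \
  -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 \
  ls /hiveserver2-interactive

# List the root to see which znodes were ever registered
/usr/hdp/current/zookeeper-client/bin/zkCli.sh \
  -server node1.cluster.dev:2181,node2.cluster.dev:2181,node3.cluster.dev:2181 \
  ls /

Since HiveServer2 registers this znode itself on a successful start, "Node does not exist" here looks like a symptom rather than the cause: the server dies before it ever registers, so the real error should be in hiveserver2.log below.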

* hiveserver2.log:

2018-10-29T11:52:53,598 INFO  [main]: conf.HiveConf (HiveConf.java:findConfigFile(187)) - Found configuration file file:/etc/hive/3.0.1.0-187/0/hive-site.xml
2018-10-29T11:52:54,875 WARN  [main]: conf.HiveConf (HiveConf.java:initialize(5260)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2018-10-29T11:52:54,876 WARN  [main]: conf.HiveConf (HiveConf.java:initialize(5260)) - HiveConf of name hive.heapsize does not exist
2018-10-29T11:52:55,092 INFO  [main]: server.HiveServer2 (HiveStringUtils.java:startupShutdownMessage(767)) - STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting HiveServer2
STARTUP_MSG:   host = node2.cluster.dev/*********
STARTUP_MSG:   args = [--hiveconf, hive.aux.jars.path=file:///usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar]
STARTUP_MSG:   version = 3.1.0.3.0.1.0-187
STARTUP_MSG:   classpath = /etc/tez/conf:/usr/hdp/current/hive-server2/conf/:/usr/hdp/3.0.1.0-187/hive/lib/... (classpath truncated for readability; it lists the stock HDP 3.0.1.0-187 jars under hive/lib, hive-hcatalog, hbase/lib, tez, hadoop, hadoop-hdfs, hadoop-mapreduce and hadoop-yarn)
STARTUP_MSG:   build = git://ctr-e138-1518143905142-366319-01-000006.hwx.site/grid/0/jenkins/workspace/HDP-parallel-centos7/SOURCES/hive -r 0adf8d7356dd3c294e0d2960b025ab88041fc056; compiled by 'jenkins' on Wed Sep 19 10:08:54 UTC 2018
************************************************************/
2018-10-29T11:52:55,135 INFO  [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1056)) - Starting HiveServer2
2018-10-29T11:52:55,226 WARN  [main]: conf.HiveConf (HiveConf.java:initialize(5260)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2018-10-29T11:52:55,229 WARN  [main]: conf.HiveConf (HiveConf.java:initialize(5260)) - HiveConf of name hive.heapsize does not exist
2018-10-29T11:52:55,816 INFO  [main]: metrics2.JsonFileMetricsReporter (:()) - Reporting metrics to /tmp/report.json
2018-10-29T11:52:56,026 INFO  [main]: impl.MetricsConfig (:()) - Loaded properties from hadoop-metrics2-hiveserver2.properties
2018-10-29T11:52:56,074 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics3794783352040067126json to /tmp/report.json
2018-10-29T11:52:56,075 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics3794783352040067126json -> /tmp/report.json: Operation not permitted
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
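
The rename failure above appears to come from a stale /tmp/report.json created earlier by a different user: /tmp normally has the sticky bit set, so the hive user cannot replace someone else's file there. A minimal cleanup sketch, assuming the stale file can simply be deleted (the path is the default for hive.service.metrics.file.location):

# See who owns the stale metrics file the reporter cannot replace
ls -l /tmp/report.json

# Remove it as root so the hive user can write a fresh one
rm -f /tmp/report.json

On its own this error is usually non-fatal, though; the shutdown at 11:52:57 below comes from the scratch-directory failure, not from metrics.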
2018-10-29T11:52:56,120 INFO  [main]: impl.MetricsSystemImpl (MetricsSystemImpl.java:startTimer(374)) - Scheduled Metric snapshot period at 10 second(s).
2018-10-29T11:52:56,121 INFO  [main]: impl.MetricsSystemImpl (MetricsSystemImpl.java:start(191)) - hiveserver2 metrics system started
2018-10-29T11:52:56,286 INFO  [main]: SessionState (:()) - Hive Session ID = 0ad8dcf9-4709-423d-b8a5-3574a599a69a
2018-10-29T11:52:57,644 INFO  [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/0ad8dcf9-4709-423d-b8a5-3574a599a69a
2018-10-29T11:52:57,648 INFO  [main]: server.HiveServer2 (HiveServer2.java:stop(913)) - Shutting down HiveServer2
2018-10-29T11:52:57,697 INFO  [main]: server.HiveServer2 (HiveServer2.java:stopOrDisconnectTezSessions(890)) - Stopping/Disconnecting tez sessions.
2018-10-29T11:52:57,698 WARN  [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1100)) - Error starting HiveServer2 on attempt 1, will retry in 60000ms
java.lang.RuntimeException: Error applying authorization policy on hive configuration: java.io.IOException: Failed to create directory /tmp/hive/0ad8dcf9-4709-423d-b8a5-3574a599a69a on fs file:///
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:121) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.CompositeService.init(CompositeService.java:59) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:228) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
Caused by: java.lang.RuntimeException: java.io.IOException: Failed to create directory /tmp/hive/0ad8dcf9-4709-423d-b8a5-3574a599a69a on fs file:///
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:648) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:583) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:133) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:118) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 12 more
Caused by: java.io.IOException: Failed to create directory /tmp/hive/0ad8dcf9-4709-423d-b8a5-3574a599a69a on fs file:///
	at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:784) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:732) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:624) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:583) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:133) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:118) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 12 more
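
This looks like the error that actually kills the server: the HDFS-side session directory is created fine at 11:52:57 above, but the matching directory on "fs file:///" -- the local scratch directory -- cannot be created, which usually points to a local /tmp/hive left behind by another user with the wrong ownership. A minimal check-and-fix sketch, assuming the local directory is the culprit (run as root; the path is taken from the trace):

# Inspect the local scratch dir the trace points at
ls -ld /tmp/hive

# Hand it back to the hive user, or simply remove it and let
# HiveServer2 recreate it on the next start
chown -R hive:hadoop /tmp/hive
chmod 1777 /tmp/hive

The HDFS scratch root (hive.exec.scratchdir) is worth the same ownership check, but the "Created HDFS directory" line above suggests that side is already healthy.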
2018-10-29T11:53:57,700 INFO  [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1056)) - Starting HiveServer2
2018-10-29T11:53:57,852 WARN  [main]: conf.HiveConf (HiveConf.java:initialize(5260)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2018-10-29T11:53:57,853 WARN  [main]: conf.HiveConf (HiveConf.java:initialize(5260)) - HiveConf of name hive.heapsize does not exist
2018-10-29T11:53:57,857 INFO  [main]: metrics2.JsonFileMetricsReporter (:()) - Reporting metrics to /tmp/report.json
2018-10-29T11:53:57,862 WARN  [main]: impl.MetricsSystemImpl (MetricsSystemImpl.java:init(151)) - hiveserver2 metrics system already initialized!
2018-10-29T11:53:57,864 WARN  [main]: server.HiveServer2 (HiveServer2.java:init(209)) - Could not initiate the HiveServer2 Metrics system.  Metrics may not be reported.
java.lang.reflect.InvocationTargetException: null
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:42) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:206) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
Caused by: java.lang.IllegalArgumentException: java.lang.reflect.InvocationTargetException
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:437) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 16 more
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 16 more
Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:152) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:125) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:206) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:62) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter$Builder.build(HadoopMetrics2Reporter.java:162) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter.<init>(Metrics2Reporter.java:45) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 16 more
2018-10-29T11:53:57,863 ERROR [main]: metrics2.CodahaleMetrics (:()) - Unable to instantiate using constructor(MetricRegistry, HiveConf) for reporter org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter from conf HIVE_CODAHALE_METRICS_REPORTER_CLASSES
java.lang.reflect.InvocationTargetException: null
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:42) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:206) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:318) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:232) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:152) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:125) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:206) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:62) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter$Builder.build(HadoopMetrics2Reporter.java:162) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter.<init>(Metrics2Reporter.java:45) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 23 more
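Note on the repeated "Metrics source hiveserver2 already exists!" errors above: as far as I can tell these are a side effect of the retries, not the root cause. HiveServer2 retries startup inside the same JVM, and the hiveserver2 source registered with Hadoop's process-wide DefaultMetricsSystem on the first attempt is never unregistered, so every later attempt fails to register it again (the "hiveserver2 metrics system already initialized!" warning further down is the same thing). The fatal error is the session-directory failure below.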
2018-10-29T11:53:57,867 INFO  [main]: SessionState (:()) - Hive Session ID = b77ec194-ff04-4dff-9671-feded4b16e27
2018-10-29T11:53:57,905 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics4043023699238930789json to /tmp/report.json
2018-10-29T11:53:57,905 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics4043023699238930789json -> /tmp/report.json: Operation not permitted
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
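Note on the "Operation not permitted" renames: the JSON metrics reporter writes to hive.service.metrics.file.location, which defaults to /tmp/report.json. /tmp normally has the sticky bit set (mode 1777), and on a sticky directory rename() may only replace a file the caller owns, so a stale /tmp/report.json left behind by a different user (say, an earlier run as root) makes every rename fail with EPERM. A minimal sketch of the check, assuming the default path and running it as the same user HiveServer2 runs as:

#!/usr/bin/env python
# Hedged diagnostic for the rename failure above. Assumes the reporter target
# is the default /tmp/report.json (hive.service.metrics.file.location).
import os
import pwd
import stat

target = "/tmp/report.json"

# On a sticky-bit directory (/tmp is normally mode 1777), rename() can only
# replace a file the caller owns, even though the directory is world-writable.
print("sticky bit on /tmp: %s" % bool(os.stat("/tmp").st_mode & stat.S_ISVTX))

if os.path.exists(target):
    owner = pwd.getpwuid(os.stat(target).st_uid).pw_name
    print("%s is owned by %s" % (target, owner))
    # If the owner is not the service user (usually 'hive'), removing the
    # stale file as root lets the reporter recreate it on the next cycle.
else:
    print("%s does not exist yet" % target)

If the file turns out to be owned by another user, deleting it as root should stop this particular error; it is noisy but separate from the startup failure below.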
2018-10-29T11:53:57,928 INFO  [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/b77ec194-ff04-4dff-9671-feded4b16e27
2018-10-29T11:53:57,931 INFO  [main]: server.HiveServer2 (HiveServer2.java:stop(913)) - Shutting down HiveServer2
2018-10-29T11:53:57,932 INFO  [main]: server.HiveServer2 (HiveServer2.java:stopOrDisconnectTezSessions(890)) - Stopping/Disconnecting tez sessions.
2018-10-29T11:53:57,932 WARN  [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1100)) - Error starting HiveServer2 on attempt 2, will retry in 60000ms
java.lang.RuntimeException: Error applying authorization policy on hive configuration: java.io.IOException: Failed to create directory /tmp/hive/b77ec194-ff04-4dff-9671-feded4b16e27 on fs file:///
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:121) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.CompositeService.init(CompositeService.java:59) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:228) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
Caused by: java.lang.RuntimeException: java.io.IOException: Failed to create directory /tmp/hive/b77ec194-ff04-4dff-9671-feded4b16e27 on fs file:///
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:648) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:583) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:133) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:118) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 12 more
Caused by: java.io.IOException: Failed to create directory /tmp/hive/b77ec194-ff04-4dff-9671-feded4b16e27 on fs file:///
	at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:784) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:732) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:624) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:583) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:133) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:118) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 12 more
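Note the "on fs file:///" part of the failure above: the HDFS side is fine (the "Created HDFS directory" line just before it succeeded), but the local session directory could not be created. That path comes from hive.exec.local.scratchdir, which on this layout appears to be /tmp/hive on local disk, so a /tmp/hive that is missing, full, or owned by a user other than hive would fail exactly like this. A small sketch of the check, assuming that path:

#!/usr/bin/env python
# Hedged check for "Failed to create directory /tmp/hive/<uuid> on fs file:///".
# file:/// means the local filesystem: the session could not create its
# directory under hive.exec.local.scratchdir (assumed to be /tmp/hive here).
# Run this as the user HiveServer2 runs as (usually 'hive').
import os

local_scratch = "/tmp/hive"

if os.path.isdir(local_scratch):
    st = os.stat(local_scratch)
    print("mode %o, uid %d, gid %d" % (st.st_mode & 0o7777, st.st_uid, st.st_gid))
    print("writable and searchable: %s" % os.access(local_scratch, os.W_OK | os.X_OK))
else:
    print("%s does not exist; HiveServer2 will try to create it" % local_scratch)

If the directory exists but belongs to another user with restrictive permissions, chown-ing it to hive:hadoop (or removing it and letting HiveServer2 recreate it) is the usual remedy.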
[... the identical JsonFileMetricsReporter ERROR and FileSystemException stack trace repeat every ~5 seconds (11:54:02 through 11:54:52), each time for a fresh /tmp/hmetrics*json temp file; duplicates trimmed here, and the same trace keeps recurring through the rest of the log ...]
2018-10-29T11:54:57,934 INFO  [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1056)) - Starting HiveServer2
2018-10-29T11:54:58,160 WARN  [main]: conf.HiveConf (HiveConf.java:initialize(5260)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2018-10-29T11:54:58,171 WARN  [main]: conf.HiveConf (HiveConf.java:initialize(5260)) - HiveConf of name hive.heapsize does not exist
2018-10-29T11:54:58,173 INFO  [main]: metrics2.JsonFileMetricsReporter (:()) - Reporting metrics to /tmp/report.json
2018-10-29T11:54:58,191 WARN  [main]: impl.MetricsSystemImpl (MetricsSystemImpl.java:init(151)) - hiveserver2 metrics system already initialized!
2018-10-29T11:54:58,192 WARN  [main]: server.HiveServer2 (HiveServer2.java:init(209)) - Could not initiate the HiveServer2 Metrics system.  Metrics may not be reported.
[... same two stack traces as on the previous attempt ("Could not initiate the HiveServer2 Metrics system" and "Unable to instantiate using constructor(MetricRegistry, HiveConf)"), both rooted in "Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!"; duplicates trimmed ...]
2018-10-29T11:54:58,195 INFO  [main]: SessionState (:()) - Hive Session ID = ae0c4279-98fc-4a03-afc5-3a68f7e9bab3
2018-10-29T11:54:58,252 INFO  [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/ae0c4279-98fc-4a03-afc5-3a68f7e9bab3
2018-10-29T11:54:58,254 INFO  [main]: server.HiveServer2 (HiveServer2.java:stop(913)) - Shutting down HiveServer2
2018-10-29T11:54:58,254 INFO  [main]: server.HiveServer2 (HiveServer2.java:stopOrDisconnectTezSessions(890)) - Stopping/Disconnecting tez sessions.
2018-10-29T11:54:58,255 WARN  [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1100)) - Error starting HiveServer2 on attempt 3, will retry in 60000ms
java.lang.RuntimeException: Error applying authorization policy on hive configuration: java.io.IOException: Failed to create directory /tmp/hive/ae0c4279-98fc-4a03-afc5-3a68f7e9bab3 on fs file:///
[... same stack trace as the attempt-2 failure above, down to "Caused by: java.io.IOException: Failed to create directory /tmp/hive/ae0c4279-98fc-4a03-afc5-3a68f7e9bab3 on fs file:///" ...]
[... the JsonFileMetricsReporter rename ERROR continues every ~5 seconds; the log ends in the middle of one of these traces ...]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:23,030 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics2934439921990588676json to /tmp/report.json
2018-10-29T11:55:23,030 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics2934439921990588676json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:23,260 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics5420018442276267524json to /tmp/report.json
2018-10-29T11:55:23,260 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics5420018442276267524json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:28,034 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics5499514961954967213json to /tmp/report.json
2018-10-29T11:55:28,034 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics5499514961954967213json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:28,268 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics6241499128033921859json to /tmp/report.json
2018-10-29T11:55:28,268 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics6241499128033921859json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:33,038 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics7958009217707666748json to /tmp/report.json
2018-10-29T11:55:33,038 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics7958009217707666748json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:33,274 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics8026571601861904654json to /tmp/report.json
2018-10-29T11:55:33,274 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics8026571601861904654json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:38,042 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics4380013417647362572json to /tmp/report.json
2018-10-29T11:55:38,042 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics4380013417647362572json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:38,288 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics3322279608750193417json to /tmp/report.json
2018-10-29T11:55:38,288 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics3322279608750193417json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:43,049 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics4670386284513264208json to /tmp/report.json
2018-10-29T11:55:43,049 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics4670386284513264208json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:43,292 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics2773219272093492938json to /tmp/report.json
2018-10-29T11:55:43,292 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics2773219272093492938json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:48,054 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics789938555931937158json to /tmp/report.json
2018-10-29T11:55:48,054 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics789938555931937158json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:48,296 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics2494340341116729175json to /tmp/report.json
2018-10-29T11:55:48,296 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics2494340341116729175json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:53,062 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics121661853913851263json to /tmp/report.json
2018-10-29T11:55:53,062 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics121661853913851263json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:53,301 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics4939010971108265859json to /tmp/report.json
2018-10-29T11:55:53,301 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics4939010971108265859json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:55:58,069 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics4411049568685783352json to /tmp/report.json
2018-10-29T11:55:58,069 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics4411049568685783352json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
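A note on the rename failures above: /tmp is normally mounted with the sticky bit set (mode 1777), so renaming a temp file onto an existing /tmp/report.json that is owned by a different user fails with EPERM ("Operation not permitted"). A stale report.json left behind by another account (e.g. root) would produce exactly this loop. Below is a minimal ownership check, assuming the metrics file is still at its default location (governed by hive.service.metrics.file.location, which defaults to /tmp/report.json):

# Minimal sketch: check who owns the metrics report file.
# Run as the user that launches HiveServer2 (normally 'hive').
import os
import pwd

report = "/tmp/report.json"  # default hive.service.metrics.file.location

if os.path.exists(report):
    owner = pwd.getpwuid(os.stat(report).st_uid).pw_name
    me = pwd.getpwuid(os.getuid()).pw_name
    print(f"{report} owned by {owner!r}; this process runs as {me!r}")
    if owner != me:
        print("rename() into sticky /tmp will fail with EPERM -> delete the "
              "stale file or point hive.service.metrics.file.location at a "
              "directory the hive user owns")
else:
    print(f"{report} absent; the reporter's rename should succeed")

These errors are noisy but probably not what kills the process; the fatal error appears further down.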
2018-10-29T11:55:58,256 INFO  [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1056)) - Starting HiveServer2
2018-10-29T11:55:58,344 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics8355759151697642750json to /tmp/report.json [same rename failure and stack trace as above, omitted]
2018-10-29T11:55:58,380 WARN  [main]: conf.HiveConf (HiveConf.java:initialize(5260)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2018-10-29T11:55:58,381 WARN  [main]: conf.HiveConf (HiveConf.java:initialize(5260)) - HiveConf of name hive.heapsize does not exist
2018-10-29T11:55:58,386 INFO  [main]: metrics2.JsonFileMetricsReporter (:()) - Reporting metrics to /tmp/report.json
2018-10-29T11:55:58,394 WARN  [main]: impl.MetricsSystemImpl (MetricsSystemImpl.java:init(151)) - hiveserver2 metrics system already initialized!
2018-10-29T11:55:58,394 WARN  [main]: server.HiveServer2 (HiveServer2.java:init(209)) - Could not initiate the HiveServer2 Metrics system.  Metrics may not be reported.
java.lang.reflect.InvocationTargetException: null
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:42) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:206) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
Caused by: java.lang.IllegalArgumentException: java.lang.reflect.InvocationTargetException
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:437) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 16 more
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 16 more
Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:152) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:125) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:206) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:62) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter$Builder.build(HadoopMetrics2Reporter.java:162) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter.<init>(Metrics2Reporter.java:45) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 16 more
2018-10-29T11:55:58,394 ERROR [main]: metrics2.CodahaleMetrics (:()) - Unable to instantiate using constructor(MetricRegistry, HiveConf) for reporter org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter from conf HIVE_CODAHALE_METRICS_REPORTER_CLASSES
java.lang.reflect.InvocationTargetException: null
[stack trace omitted; it ends in the same root cause as above: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!]
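The "Metrics source hiveserver2 already exists!" failure looks like a secondary symptom of the restart loop rather than a root cause: HiveServer2 retries startup inside the same JVM, and Hadoop's DefaultMetricsSystem remembers the source name registered by the previous attempt. A toy Python model of that behaviour (illustration only, not the Hadoop API):

# Toy model: a registry that refuses duplicate source names for the
# lifetime of the process, like DefaultMetricsSystem does per JVM.
class MetricsSystem:
    def __init__(self):
        self._sources = set()

    def register(self, name):
        if name in self._sources:
            raise RuntimeError(f"Metrics source {name} already exists!")
        self._sources.add(name)

ms = MetricsSystem()
ms.register("hiveserver2")      # first start attempt succeeds
try:
    ms.register("hiveserver2")  # the in-JVM retry trips over the old entry
except RuntimeError as e:
    print(e)

So once the first attempt fails for some other reason, every later attempt also logs this metrics error.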
2018-10-29T11:55:58,397 INFO  [main]: SessionState (:()) - Hive Session ID = 66ff52f6-7496-4f82-8028-cb4a09f0ae41
2018-10-29T11:55:58,423 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics6607045242995036237json to /tmp/report.json [same rename failure and stack trace as above, omitted]
2018-10-29T11:55:58,462 INFO  [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/66ff52f6-7496-4f82-8028-cb4a09f0ae41
2018-10-29T11:55:58,465 INFO  [main]: server.HiveServer2 (HiveServer2.java:stop(913)) - Shutting down HiveServer2
2018-10-29T11:55:58,466 INFO  [main]: server.HiveServer2 (HiveServer2.java:stopOrDisconnectTezSessions(890)) - Stopping/Disconnecting tez sessions.
2018-10-29T11:55:58,466 WARN  [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1100)) - Error starting HiveServer2 on attempt 4, will retry in 60000ms
java.lang.RuntimeException: Error applying authorization policy on hive configuration: java.io.IOException: Failed to create directory /tmp/hive/66ff52f6-7496-4f82-8028-cb4a09f0ae41 on fs file:///
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:121) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.CompositeService.init(CompositeService.java:59) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:228) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
Caused by: java.lang.RuntimeException: java.io.IOException: Failed to create directory /tmp/hive/66ff52f6-7496-4f82-8028-cb4a09f0ae41 on fs file:///
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:648) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:583) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:133) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:118) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 12 more
Caused by: java.io.IOException: Failed to create directory /tmp/hive/66ff52f6-7496-4f82-8028-cb4a09f0ae41 on fs file:///
	at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:784) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:732) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:624) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:583) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:133) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:118) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 12 more
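This trace is most likely the real reason the server keeps dying. Note the "on fs file:///": the session directory /tmp/hive/<session-id> is being created on the local filesystem of the HiveServer2 host, and that create fails. Typically a local /tmp/hive directory already exists but is owned by another user (often root), so the hive user cannot write into it. A minimal local check, assuming the default scratch-dir layout (hive.exec.scratchdir = /tmp/hive, with the local side governed by hive.exec.local.scratchdir):

# Minimal sketch: verify the 'hive' user can create session dirs
# under the local scratch directory on the HiveServer2 host.
import os
import pwd
import stat

scratch = "/tmp/hive"  # parent dir seen in the trace above

if not os.path.isdir(scratch):
    print(f"{scratch} does not exist; Hive should be able to create it")
else:
    st = os.stat(scratch)
    print("owner:", pwd.getpwuid(st.st_uid).pw_name,
          "mode:", oct(stat.S_IMODE(st.st_mode)))
    # creating a sub-directory needs write + execute on the parent
    print("can create session dirs:", os.access(scratch, os.W_OK | os.X_OK))

If this reports the directory as not writable, a commonly suggested fix is to chown it to the hive service user (for example, chown -R hive:hadoop /tmp/hive as root) on this host and restart the component.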
	2018-10-29T11:56:03,075 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics4892397726998270361json to /tmp/report.json
2018-10-29T11:56:03,075 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics4892397726998270361json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:56:03,359 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics6777890749600790596json to /tmp/report.json
2018-10-29T11:56:03,359 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics6777890749600790596json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:56:03,432 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics1796982033276158508json to /tmp/report.json
2018-10-29T11:56:03,432 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics1796982033276158508json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:56:08,079 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics4367859486448529278json to /tmp/report.json
2018-10-29T11:56:08,079 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics4367859486448529278json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
	2018-10-29T11:56:08,362 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics5193729862840417002json to /tmp/report.json
2018-10-29T11:56:08,363 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics5193729862840417002json -> /tmp/report.json: Operation not permitted
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
[... the same "Unable to rename temp file /tmp/hmetrics*json to /tmp/report.json: Operation not permitted" error, with an identical stack trace, repeats roughly every five seconds, each time with a different temp file name ...]
2018-10-29T11:56:58,467 INFO  [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1056)) - Starting HiveServer2
[... one more identical rename error at 2018-10-29T11:56:58,480 ...]
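A note on the rename errors above, in case it helps anyone reading: on most Linux systems /tmp is mode 1777 (sticky bit set), so a process may only replace /tmp/report.json if it owns that file. If an earlier HiveServer2 run left /tmp/report.json behind under a different user (root, for example), the hive user would get exactly this "Operation not permitted" from the Files.move call in the trace. A quick way to check; this is only an illustrative sketch, and the path is simply the default of hive.service.metrics.file.location:

```python
#!/usr/bin/env python
# Illustrative check, assuming the default hive.service.metrics.file.location
# of /tmp/report.json: the rename can fail with "Operation not permitted" when
# /tmp has the sticky bit and the target file is owned by a different user.
import os
import pwd
import stat

REPORT = "/tmp/report.json"

tmp_st = os.stat("/tmp")
print("sticky bit on /tmp:", bool(tmp_st.st_mode & stat.S_ISVTX))

if os.path.exists(REPORT):
    st = os.stat(REPORT)
    print("%s is owned by %s" % (REPORT, pwd.getpwuid(st.st_uid).pw_name))
else:
    print("%s does not exist" % REPORT)
```

If the file does belong to another user, removing the stale /tmp/report.json (or pointing hive.service.metrics.file.location at a directory the hive user owns) should stop this particular error. These metrics messages may be noise, though, and not the reason HiveServer2 exits.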
2018-10-29T11:56:58,595 WARN  [main]: conf.HiveConf (HiveConf.java:initialize(5260)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2018-10-29T11:56:58,596 WARN  [main]: conf.HiveConf (HiveConf.java:initialize(5260)) - HiveConf of name hive.heapsize does not exist
2018-10-29T11:56:58,604 WARN  [main]: impl.MetricsSystemImpl (MetricsSystemImpl.java:init(151)) - hiveserver2 metrics system already initialized!
2018-10-29T11:56:58,605 WARN  [main]: server.HiveServer2 (HiveServer2.java:init(209)) - Could not initiate the HiveServer2 Metrics system.  Metrics may not be reported.
java.lang.reflect.InvocationTargetException: null
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:42) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:206) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
Caused by: java.lang.IllegalArgumentException: java.lang.reflect.InvocationTargetException
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:437) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 16 more
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 16 more
Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:152) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:125) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:206) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:62) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter$Builder.build(HadoopMetrics2Reporter.java:162) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter.<init>(Metrics2Reporter.java:45) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 16 more
2018-10-29T11:56:58,599 INFO  [main]: metrics2.JsonFileMetricsReporter (:()) - Reporting metrics to /tmp/report.json
2018-10-29T11:56:58,605 ERROR [main]: metrics2.CodahaleMetrics (:()) - Unable to instantiate using constructor(MetricRegistry, HiveConf) for reporter org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter from conf HIVE_CODAHALE_METRICS_REPORTER_CLASSES
java.lang.reflect.InvocationTargetException: null
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:42) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:206) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:318) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:232) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:152) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:125) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:206) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:62) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter$Builder.build(HadoopMetrics2Reporter.java:162) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
	at org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter.<init>(Metrics2Reporter.java:45) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 23 more
2018-10-29T11:56:58,607 INFO  [main]: SessionState (:()) - Hive Session ID = d3e6ac6a-ba09-418f-adfc-50d13209c8cc
2018-10-29T11:56:58,674 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics8169487051384093996json to /tmp/report.json
2018-10-29T11:56:58,674 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics8169487051384093996json -> /tmp/report.json: Operation not permitted
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
2018-10-29T11:56:58,684 INFO  [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/d3e6ac6a-ba09-418f-adfc-50d13209c8cc
2018-10-29T11:56:58,687 INFO  [main]: server.HiveServer2 (HiveServer2.java:stop(913)) - Shutting down HiveServer2
2018-10-29T11:56:58,687 INFO  [main]: server.HiveServer2 (HiveServer2.java:stopOrDisconnectTezSessions(890)) - Stopping/Disconnecting tez sessions.
2018-10-29T11:56:58,688 ERROR [main]: server.HiveServer2 (HiveServer2.java:execute(1343)) - Error starting HiveServer2
java.lang.Error: Max start attempts 5 exhausted
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1098) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
	at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
	at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.0.1.0-187.jar:?]
Caused by: java.lang.RuntimeException: Error applying authorization policy on hive configuration: java.io.IOException: Failed to create directory /tmp/hive/d3e6ac6a-ba09-418f-adfc-50d13209c8cc on fs file:///
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:121) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.CompositeService.init(CompositeService.java:59) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:228) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 9 more
Caused by: java.lang.RuntimeException: java.io.IOException: Failed to create directory /tmp/hive/d3e6ac6a-ba09-418f-adfc-50d13209c8cc on fs file:///
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:648) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:583) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:133) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:118) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.CompositeService.init(CompositeService.java:59) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:228) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 9 more
Caused by: java.io.IOException: Failed to create directory /tmp/hive/d3e6ac6a-ba09-418f-adfc-50d13209c8cc on fs file:///
	at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:784) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:732) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:624) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:583) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:133) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:118) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.CompositeService.init(CompositeService.java:59) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:228) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
	... 9 more
2018-10-29T11:56:58,867 INFO  [shutdown-hook-0]: server.HiveServer2 (HiveStringUtils.java:run(785)) - SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down HiveServer2 at node2.cluster.dev/***********
************************************************************/
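
From the log above, two filesystem problems stand out alongside the "Metrics source hiveserver2 already exists!" error: the JSON reporter cannot replace /tmp/report.json ("Operation not permitted", typically because the file was created earlier by a different user and the sticky bit on /tmp blocks the rename), and HiveServer2 cannot create its local scratch directory under /tmp/hive (the "on fs file:///" failure). A rough way to check and clean up both, assuming the service user is hive in group hadoop as in the agent output, run as root. This is a sketch, not a guaranteed fix:

# Who owns the stale metrics file and its temp files?
ls -l /tmp/report.json /tmp/hmetrics* 2>/dev/null

# Who owns Hive's local scratch directory?
ls -ld /tmp/hive

# If another user owns them, remove the stale metrics files so the
# hive user can recreate them, and hand the scratch directory back.
rm -f /tmp/report.json /tmp/hmetrics*
chown -R hive:hadoop /tmp/hive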


4 REPLIES

Re: Failed to start HiveServer2 using Ambari 2.7 and HDP 3.0

New Contributor

I'm facing the same problem here.

Have you found any solution?



Re: Failed to start HiveServer2 using Ambari 2.7 and HDP 3.0

New Contributor

@Hesham Eldib and @Ahmed Hassaan

Were you able to resolve this issue? Can you please provide the solution?


Re: Failed to start HiveServer2 using Ambari 2.7 and HDP 3.0

Explorer

I have exactly the same problem; any help would be appreciated. Did you manage to get it to work?


Re: Failed to start HiveServer2 using Ambari 2.7 and HDP 3.0

Contributor

@hesham_eldib 

 

Could you please disable the properties hive.metastore.metrics.enabled and hive.server2.metrics.enabled under Advanced hiveserver2-site?

 

Then start HiveServer2.
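
Disabling these reporters stops HiveServer2 from re-registering the hiveserver2 metrics source and from writing /tmp/report.json on every start attempt, which is what keeps failing in the log above. If you manage configuration from the command line rather than the Ambari UI, here is a sketch using the configs.py helper bundled with Ambari; the credentials, host, and cluster name below are placeholders, and the script path can vary by Ambari version:

cd /var/lib/ambari-server/resources/scripts

# Placeholders: admin/admin = Ambari login, ambari-host = Ambari server
# host, mycluster = your cluster name. Adjust for your environment.
python configs.py -u admin -p admin -l ambari-host -t 8080 -n mycluster \
  -a set -c hiveserver2-site -k hive.metastore.metrics.enabled -v false
python configs.py -u admin -p admin -l ambari-host -t 8080 -n mycluster \
  -a set -c hiveserver2-site -k hive.server2.metrics.enabled -v false

# Restart HiveServer2 from Ambari afterwards so the change takes effect.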

 
