2019-04-17T01:08:23,398 INFO [main]: conf.HiveConf (HiveConf.java:findConfigFile(187)) - Found configuration file file:/etc/hive/3.1.0.0-78/0/hive-site.xml
2019-04-17T01:08:24,088 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2019-04-17T01:08:24,089 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.heapsize does not exist
2019-04-17T01:08:24,170 INFO [main]: server.HiveServer2 (HiveStringUtils.java:startupShutdownMessage(767)) - STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting HiveServer2
STARTUP_MSG:   host = ip-172-31-18-160.ec2.internal/172.31.18.160
STARTUP_MSG:   args = [--hiveconf, hive.aux.jars.path=file:///usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar]
STARTUP_MSG:   version = 3.1.0.3.1.0.0-78
STARTUP_MSG:   classpath = /etc/tez/conf:/usr/hdp/current/hive-server2/conf/:/usr/hdp/3.1.0.0-78/hive/lib/HikariCP-2.6.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/ST4-4.0.4.jar:/usr/hdp/3.1.0.0-78/hive/lib/accumulo-core-1.7.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/accumulo-fate-1.7.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/accumulo-start-1.7.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/accumulo-trace-1.7.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/aircompressor-0.10.jar:/usr/hdp/3.1.0.0-78/hive/lib/ant-1.9.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/ant-launcher-1.9.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/antlr-runtime-3.5.2.jar:/usr/hdp/3.1.0.0-78/hive/lib/antlr4-runtime-4.5.jar:/usr/hdp/3.1.0.0-78/hive/lib/aopalliance-repackaged-2.5.0-b32.jar:/usr/hdp/3.1.0.0-78/hive/lib/apache-jsp-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/apache-jstl-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/arrow-format-0.8.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/arrow-memory-0.8.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/arrow-vector-0.8.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/asm-6.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/asm-commons-6.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/asm-tree-6.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/audience-annotations-0.5.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/avatica-1.10.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/avro-1.7.7.jar:/usr/hdp/3.1.0.0-78/hive/lib/bonecp-0.8.0.RELEASE.jar:/usr/hdp/3.1.0.0-78/hive/lib/calcite-core-1.16.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/calcite-druid-1.16.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/calcite-linq4j-1.16.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-cli-1.2.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-codec-1.7.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-compiler-2.7.6.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-compress-1.9.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-crypto-1.0.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-dbcp-1.4.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-io-2.4.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-lang-2.6.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-lang3-3.2.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-logging-1.0.4.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-math-2.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-math3-3.6.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-pool-1.5.4.jar:/usr/hdp/3.1.0.0-78/hive/lib/commons-vfs2-2.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/curator-framework-2.12.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/curator-recipes-2.12.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/datanucleus-api-jdo-4.2.4.jar:/usr/hdp/3.1.0.0-78/hive/lib/datanucleus-core-4.1.17.jar:/usr/hdp/3.1.0.0-78/hive/lib/datanucleus-rdbms-4.1.19.jar:/usr/hdp/3.1.0.0-78/hive/lib/derby-10.14.1.0.jar:/usr
/hdp/3.1.0.0-78/hive/lib/disruptor-3.3.6.jar:/usr/hdp/3.1.0.0-78/hive/lib/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:/usr/hdp/3.1.0.0-78/hive/lib/druid-bloom-filter-0.12.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/druid-hdfs-storage-0.12.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/ecj-4.4.2.jar:/usr/hdp/3.1.0.0-78/hive/lib/esri-geometry-api-2.0.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/findbugs-annotations-1.3.9-1.jar:/usr/hdp/3.1.0.0-78/hive/lib/flatbuffers-1.2.0-3f79e055.jar:/usr/hdp/3.1.0.0-78/hive/lib/groovy-all-2.4.11.jar:/usr/hdp/3.1.0.0-78/hive/lib/gson-2.2.4.jar:/usr/hdp/3.1.0.0-78/hive/lib/guava-19.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-client-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-common-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-hadoop-compat-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-hadoop2-compat-2.0.2.3.1.0.0-78-tests.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-hadoop2-compat-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-http-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-mapreduce-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-metrics-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-metrics-api-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-procedure-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-protocol-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-protocol-shaded-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-replication-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-server-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-shaded-miscellaneous-2.1.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-shaded-netty-2.1.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-shaded-protobuf-2.1.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/hbase-zookeeper-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-accumulo-handler-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-accumulo-handler.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-beeline-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-beeline.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-classification-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-classification.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-cli-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-cli.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-common-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-common.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-contrib-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-contrib.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-druid-handler-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-druid-handler.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-exec-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-exec.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-hbase-handler-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-hbase-handler.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-hcatalog-core-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-hcatalog-core.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-hcatalog-server-extensions-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-hcatalog-server-extensions.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-hplsql-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-hplsql.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-jdbc-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-jdbc-handler-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-jdbc-handler.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-jdbc.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-kryo-registrator-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-kryo-registrat
or.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-llap-client-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-llap-client.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-llap-common-3.1.0.3.1.0.0-78-tests.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-llap-common-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-llap-common.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-llap-ext-client-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-llap-ext-client.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-llap-server-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-llap-server.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-llap-tez-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-llap-tez.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-metastore-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-metastore.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-pre-upgrade-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-pre-upgrade.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-serde-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-serde.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-service-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-service-rpc-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-service-rpc.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-service.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-shims-0.23-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-shims-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-shims-common-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-shims-common.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-shims-scheduler-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-shims-scheduler.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-shims.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-standalone-metastore-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-standalone-metastore.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-storage-api-2.3.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-storage-api.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-streaming-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-streaming.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-testutils-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-testutils.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-vector-code-gen-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/hive-vector-code-gen.jar:/usr/hdp/3.1.0.0-78/hive/lib/hk2-api-2.5.0-b32.jar:/usr/hdp/3.1.0.0-78/hive/lib/hk2-locator-2.5.0-b32.jar:/usr/hdp/3.1.0.0-78/hive/lib/hk2-utils-2.5.0-b32.jar:/usr/hdp/3.1.0.0-78/hive/lib/hppc-0.7.2.jar:/usr/hdp/3.1.0.0-78/hive/lib/htrace-core-3.2.0-incubating.jar:/usr/hdp/3.1.0.0-78/hive/lib/htrace-core4-4.2.0-incubating.jar:/usr/hdp/3.1.0.0-78/hive/lib/httpclient-4.5.2.jar:/usr/hdp/3.1.0.0-78/hive/lib/httpcore-4.4.4.jar:/usr/hdp/3.1.0.0-78/hive/lib/ivy-2.4.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/jackson-annotations-2.9.5.jar:/usr/hdp/3.1.0.0-78/hive/lib/jackson-core-2.9.5.jar:/usr/hdp/3.1.0.0-78/hive/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/3.1.0.0-78/hive/lib/jackson-databind-2.9.5.jar:/usr/hdp/3.1.0.0-78/hive/lib/jackson-dataformat-smile-2.9.5.jar:/usr/hdp/3.1.0.0-78/hive/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/3.1.0.0-78/hive/lib/jamon-runtime-2.3.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/janino-2.7.6.jar:/usr/hdp/3.1.0.0-78/hive/lib/javassist-3.20.0-GA.jar:/usr/hdp/3.1.0.0-78/hive/lib/javax.annotation-api-1.2.jar:/usr/hdp/3.1.0.0-78/hive/lib/javax.inject-2.5.0-b32.jar:/usr/hdp/3.1.0.0-78/hive/lib/javax.jdo-3.2.0-m3.jar:/usr/hdp/3.1.0.0-78/hive/lib/javax.servlet-api-3.1.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/javax.servlet.jsp-2.3.2.jar:/usr/hdp/3.1.0.0-78/hive/lib/javax.servlet.jsp-api-2.3.1.jar:
/usr/hdp/3.1.0.0-78/hive/lib/javax.ws.rs-api-2.0.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/javolution-5.5.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/jcodings-1.0.18.jar:/usr/hdp/3.1.0.0-78/hive/lib/jcommander-1.32.jar:/usr/hdp/3.1.0.0-78/hive/lib/jdo-api-3.0.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/jersey-client-2.25.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/jersey-common-2.25.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/jersey-container-servlet-core-2.25.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/jersey-guava-2.25.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/jersey-media-jaxb-2.25.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/jersey-server-2.25.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/jettison-1.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-annotations-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-client-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-http-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-io-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-jaas-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-jndi-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-plus-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-rewrite-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-runner-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-schemas-3.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-security-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-server-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-servlet-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-util-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-webapp-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jetty-xml-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/jline-2.12.jar:/usr/hdp/3.1.0.0-78/hive/lib/joda-time-2.9.9.jar:/usr/hdp/3.1.0.0-78/hive/lib/joni-2.1.11.jar:/usr/hdp/3.1.0.0-78/hive/lib/jpam-1.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/json-1.8.jar:/usr/hdp/3.1.0.0-78/hive/lib/jsr305-3.0.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/jta-1.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/kafka-clients-2.0.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/kafka-handler-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/libfb303-0.9.3.jar:/usr/hdp/3.1.0.0-78/hive/lib/libthrift-0.9.3.jar:/usr/hdp/3.1.0.0-78/hive/lib/lz4-java-1.4.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/maven-scm-api-1.4.jar:/usr/hdp/3.1.0.0-78/hive/lib/maven-scm-provider-svn-commons-1.4.jar:/usr/hdp/3.1.0.0-78/hive/lib/maven-scm-provider-svnexe-1.4.jar:/usr/hdp/3.1.0.0-78/hive/lib/memory-0.9.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/metrics-core-3.1.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/metrics-json-3.1.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/metrics-jvm-3.1.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/mysql-connector-java.jar:/usr/hdp/3.1.0.0-78/hive/lib/mysql-metadata-storage-0.12.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/netty-3.10.5.Final.jar:/usr/hdp/3.1.0.0-78/hive/lib/netty-all-4.1.17.Final.jar:/usr/hdp/3.1.0.0-78/hive/lib/netty-buffer-4.1.17.Final.jar:/usr/hdp/3.1.0.0-78/hive/lib/netty-common-4.1.17.Final.jar:/usr/hdp/3.1.0.0-78/hive/lib/opencsv-2.3.jar:/usr/hdp/3.1.0.0-78/hive/lib/orc-core-1.5.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/orc-shims-1.5.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/org.abego.treelayout.core-1.0.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/oro-2.0.8.jar:/usr/hdp/3.1.0.0-78/hive/lib/osgi-resource-locator-1.0.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/paranamer-2.3.jar:/usr/hdp/3.1.0.0-78/hive/lib/parquet-hadoop-bundle-1.10.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/plexus-utils-1.5.6.jar:/usr/hdp/3.1.0.0-78/hive/lib/postgresql-9.4.1208.jre7.jar:/usr/hdp/3.1.0.0-78/hive/lib/postgresql-metada
ta-storage-0.12.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/protobuf-java-2.5.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-shim-1.2.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/ranger-plugin-classloader-1.2.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/regexp-1.3.jar:/usr/hdp/3.1.0.0-78/hive/lib/sketches-core-0.9.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/snappy-java-1.1.4.jar:/usr/hdp/3.1.0.0-78/hive/lib/sqlline-1.3.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/stax-api-1.0.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/super-csv-2.2.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/taglibs-standard-impl-1.2.5.jar:/usr/hdp/3.1.0.0-78/hive/lib/taglibs-standard-spec-1.2.5.jar:/usr/hdp/3.1.0.0-78/hive/lib/tempus-fugit-1.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/transaction-api-1.1.jar:/usr/hdp/3.1.0.0-78/hive/lib/validation-api-1.1.0.Final.jar:/usr/hdp/3.1.0.0-78/hive/lib/velocity-1.5.jar:/usr/hdp/3.1.0.0-78/hive/lib/websocket-api-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/websocket-client-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/websocket-common-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/websocket-server-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/websocket-servlet-9.3.25.v20180904.jar:/usr/hdp/3.1.0.0-78/hive/lib/zookeeper.jar:/usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar:/usr/hdp/3.1.0.0-78/atlas/hook/hive/atlas-hive-plugin-impl:/usr/hdp/3.1.0.0-78/atlas/hook/hive/atlas-plugin-classloader-1.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/atlas/hook/hive/hive-bridge-shim-1.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive-hcatalog/share/hcatalog/hive-hcatalog-core-3.1.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive-hcatalog/share/hcatalog/hive-hcatalog-server-extensions-3.1.0.3.1.0.0-78.jar:/usr/hdp/current/hadoop-client/share/hadoop/tools/lib/hadoop-distcp-*.jar:/etc/hbase/conf:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-shaded-protobuf-2.1.0.jar:/usr/hdp/3.1.0.0-78/hbase/lib/htrace-core4-4.2.0-incubating.jar:/usr/hdp/3.1.0.0-78/hbase/lib/jackson-databind-2.9.5.jar:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-protocol-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-hadoop2-compat-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-mapreduce-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hbase/lib/commons-lang3-3.6.jar:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-client-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-hadoop-compat-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hbase/lib/jackson-annotations-2.9.5.jar:/usr/hdp/3.1.0.0-78/hbase/lib/jackson-core-2.9.5.jar:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-shaded-netty-2.1.0.jar:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-shaded-miscellaneous-2.1.0.jar:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-common-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hbase/lib/metrics-core-3.2.1.jar:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-protocol-shaded-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-server-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-metrics-api-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hbase/lib/hbase-metrics-2.0.2.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hive/lib/log4j-1.2-api-2.10.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/log4j-api-2.10.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/log4j-core-2.10.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/log4j-slf4j-impl-2.10.0.jar:/usr/hdp/3.1.0.0-78/hive/lib/log4j-web-2.10.0.jar:/usr/hdp/3.1.0.0-78/tez/conf_llap:/usr/hdp/3.1.0.0-78/tez/hadoop-shim-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/hadoop-shim-2.8-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib:/usr/hdp/3.1.0.0-78/tez/man:/usr/hdp/3.1.0.0-78/tez/tez-api-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-com
mon-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-dag-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-examples-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-history-parser-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-javadoc-tools-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-job-analyzer-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-mapreduce-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-protobuf-history-plugin-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-runtime-internals-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-runtime-library-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-tests-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-yarn-timeline-cache-plugin-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-yarn-timeline-history-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-yarn-timeline-history-with-acls-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-yarn-timeline-history-with-fs-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/ui:/usr/hdp/3.1.0.0-78/tez/lib/RoaringBitmap-0.4.9.jar:/usr/hdp/3.1.0.0-78/tez/lib/async-http-client-1.9.40.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-cli-1.2.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-codec-1.4.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-collections4-4.1.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-io-2.4.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-lang-2.6.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/3.1.0.0-78/tez/lib/gcs-connector-1.9.10.3.1.0.0-78-shaded.jar:/usr/hdp/3.1.0.0-78/tez/lib/guava-11.0.2.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-aws-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-azure-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-azure-datalake-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-mapreduce-client-common-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-mapreduce-client-core-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-yarn-server-timeline-pluginstorage-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/jersey-client-1.19.jar:/usr/hdp/3.1.0.0-78/tez/lib/jersey-json-1.19.jar:/usr/hdp/3.1.0.0-78/tez/lib/jettison-1.3.4.jar:/usr/hdp/3.1.0.0-78/tez/lib/jetty-server-9.3.22.v20171030.jar:/usr/hdp/3.1.0.0-78/tez/lib/jetty-util-9.3.22.v20171030.jar:/usr/hdp/3.1.0.0-78/tez/lib/jsr305-3.0.0.jar:/usr/hdp/3.1.0.0-78/tez/lib/metrics-core-3.1.0.jar:/usr/hdp/3.1.0.0-78/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/3.1.0.0-78/tez/lib/servlet-api-2.5.jar:/usr/hdp/3.1.0.0-78/tez/lib/slf4j-api-1.7.10.jar:/usr/hdp/3.1.0.0-78/tez/lib/tez.tar.gz:/usr/hdp/3.1.0.0-78/hadoop/conf:/usr/hdp/3.1.0.0-78/hadoop/lib/jetty-http-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jetty-security-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/commons-configuration2-2.1.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/snappy-java-1.0.5.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jetty-server-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/stax2-api-3.1.4.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerb-core-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/guava-11.0.2.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/woodstox-core-5.0.3.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/ranger-hdfs-plugin-shim-1.2.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/token-provider-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib
/jersey-core-1.19.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/httpclient-4.5.2.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/re2j-1.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jackson-databind-2.9.5.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerby-asn1-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jersey-servlet-1.19.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/htrace-core4-4.1.0-incubating.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerby-config-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jsr311-api-1.1.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/slf4j-log4j12-1.7.25.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/commons-net-3.6.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/commons-codec-1.11.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/httpcore-4.4.4.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/json-smart-2.3.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/curator-framework-2.12.0.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/xz-1.0.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jul-to-slf4j-1.7.25.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/commons-io-2.5.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerb-server-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/accessors-smart-1.2.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/javax.servlet-api-3.1.0.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerb-crypto-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jersey-server-1.19.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/netty-3.10.5.Final.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/paranamer-2.3.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerb-simplekdc-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jetty-io-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerby-util-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerby-pkix-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/commons-collections-3.2.2.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/metrics-core-3.2.4.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jsr305-3.0.0.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/nimbus-jose-jwt-4.41.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jetty-webapp-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jcip-annotations-1.0-1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jetty-xml-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jetty-servlet-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jsch-0.1.54.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/ranger-yarn-plugin-shim-1.2.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/curator-recipes-2.12.0.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/curator-client-2.12.0.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/gson-2.2.4.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jackson-core-2.9.5.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/avro-1.7.7.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerb-admin-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/slf4j-api-1.7.25.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/commons-lang3-3.4.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerb-common-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jersey-json-1.19.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerb-identity-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerb-client-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/asm-5.0.4.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jetty-util-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jaxb-api-2.2.11.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/commons-beanutils-1.9.3.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/ranger-plugin-classloader-1.2.0.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoo
p/lib/jackson-annotations-2.9.5.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/jettison-1.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerby-xdr-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/kerb-util-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop/lib/zookeeper-3.4.6.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop/.//azure-data-lake-store-sdk-2.2.7.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-nfs.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-common.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-nfs-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-common-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop/.//azure-storage-7.0.0.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-azure.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-kms.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-common-3.1.1.3.1.0.0-78-tests.jar:/usr/hdp/3.1.0.0-78/hadoop/.//azure-keyvault-core-1.0.0.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-azure-datalake-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-auth-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-annotations-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-common-tests.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-azure-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-auth.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-kms-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-azure-datalake.jar:/usr/hdp/3.1.0.0-78/hadoop/.//hadoop-annotations.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/./:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jetty-http-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jetty-security-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/commons-configuration2-2.1.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/snappy-java-1.0.5.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jetty-server-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/stax2-api-3.1.4.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerb-core-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/woodstox-core-5.0.3.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/token-provider-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jersey-core-1.19.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/httpclient-4.5.2.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/re2j-1.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jackson-databind-2.9.5.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerby-asn1-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jersey-servlet-1.19.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/htrace-core4-4.1.0-incubating.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerby-config-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/okio-1.6.0.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jsr311-api-1.1.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jetty-util-ajax-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/commons-net-3.6.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/commons-codec-1.11.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/httpcore-4.4.4.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/json-smart-2.3.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/curator-framework-2.12.0.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/xz-1.0.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/commons-io-2.5.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/commons-math3-3.1.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jackson-xc-1.9.13.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerb-server-1.0.1.jar:/usr/hdp/3.1.0.0-
78/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/accessors-smart-1.2.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/javax.servlet-api-3.1.0.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerb-crypto-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jersey-server-1.19.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/netty-3.10.5.Final.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/netty-all-4.0.52.Final.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/paranamer-2.3.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerb-simplekdc-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jetty-io-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerby-util-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerby-pkix-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/commons-collections-3.2.2.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/nimbus-jose-jwt-4.41.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jetty-webapp-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jcip-annotations-1.0-1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jetty-xml-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/commons-compress-1.4.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jetty-servlet-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jsch-0.1.54.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/okhttp-2.7.5.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/curator-recipes-2.12.0.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/curator-client-2.12.0.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/gson-2.2.4.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jackson-core-2.9.5.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/avro-1.7.7.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerb-admin-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/commons-lang3-3.4.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerb-common-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jersey-json-1.19.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerb-identity-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerb-client-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/asm-5.0.4.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jetty-util-9.3.24.v20180605.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jaxb-api-2.2.11.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/commons-beanutils-1.9.3.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jackson-annotations-2.9.5.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/jettison-1.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerby-xdr-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/kerb-util-1.0.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/json-simple-1.1.1.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/lib/zookeeper-3.4.6.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-client.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-nfs-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.3.1.0.0-78-tests.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-native-client.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-native-client-tests.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.3.1.0.0-78-tests.jar:/usr/hd
p/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-httpfs-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-rbf-3.1.1.3.1.0.0-78-tests.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-httpfs.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-native-client-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-rbf.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-3.1.1.3.1.0.0-78-tests.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-client-tests.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/3.1.0.0-78/hadoop-hdfs/.//hadoop-hdfs-rbf-tests.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/lib/*:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//azure-data-lake-store-sdk-2.2.7.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-gridmix-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//netty-resolver-4.1.17.Final.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//netty-codec-http-4.1.17.Final.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-streaming-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-kafka-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-sls-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//azure-storage-7.0.0.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//aws-java-sdk-bundle-1.11.271.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-distcp-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-resourceestimator-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//jdom-1.1.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//google-extensions-0.3.1.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-aws-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-azure.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-datajoin-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-examples-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-aws.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-archive-logs.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//wildfly-openssl-1.0.4.Final.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//netty-common-4.1.17.Final.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-archive-logs-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-fs2img-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//netty-transport-4.1.17.Final.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//flogger-log4j-backend-0.3.1.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-kafka.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//azure-keyvault-core-1.0.0.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-azure-datalake-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-aliyun-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//aliyun-sdk-oss-2.8.3.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//netty-codec-4.1.17.Final.jar:/usr
/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//kafka-clients-0.8.2.1.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-extras-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-common-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//flogger-0.3.1.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-3.1.1.3.1.0.0-78-tests.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-archives-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-openstack-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//netty-handler-4.1.17.Final.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-fs2img.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-azure-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-rumen-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//ojalgo-43.0.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-nativetask-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//flogger-system-backend-0.3.1.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//lz4-1.2.0.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-uploader.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-azure-datalake.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-resourceestimator.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-core-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//gcs-connector-1.9.10.3.1.0.0-78-shaded.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-mapreduce-client-app-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//netty-buffer-4.1.17.Final.jar:/usr/hdp/3.1.0.0-78/hadoop-mapreduce/.//hadoop-aliyun.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/./:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/snakeyaml-1.16.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/jersey-guice-1.19.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/ehcache-3.3.1.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/HikariCP-java7-2.4.12.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/json-io-2.5.1.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/jersey-client-1.19.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/java-util-1.9.0.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/mssql-jdbc-6.2.1.jre7.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/objenesis-1.0.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/guice-4.0.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/3.1.0.0-7
8/hadoop-yarn/lib/dnsjava-2.1.7.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/metrics-core-3.2.4.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/guice-servlet-4.0.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/jackson-module-jaxb-annotations-2.9.5.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/jackson-jaxrs-base-2.9.5.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/fst-2.50.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/swagger-annotations-1.5.4.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/lib/jackson-jaxrs-json-provider-2.9.5.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-web-proxy-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-services-core.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-timeline-pluginstorage.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-services-api.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-router-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-registry-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-api-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-common-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-router.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-nodemanager-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-tests-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-common-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-services-core-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-client-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/hadoop-yarn/.//hadoop-yarn-services-api-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-examples-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-mapreduce-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/hadoop-shim-2.8-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-yarn-timeline-history-with-fs-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-runtime-internals-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-history-parser-0.9.1.3.
1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-yarn-timeline-cache-plugin-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-yarn-timeline-history-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-tests-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-job-analyzer-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-protobuf-history-plugin-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-yarn-timeline-history-with-acls-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-runtime-library-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/hadoop-shim-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-api-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-common-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-javadoc-tools-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/tez-dag-0.9.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/3.1.0.0-78/tez/lib/jetty-util-9.3.22.v20171030.jar:/usr/hdp/3.1.0.0-78/tez/lib/guava-11.0.2.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-yarn-server-timeline-pluginstorage-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/jersey-client-1.19.jar:/usr/hdp/3.1.0.0-78/tez/lib/jetty-server-9.3.22.v20171030.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-aws-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/RoaringBitmap-0.4.9.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-cli-1.2.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-azure-datalake-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/3.1.0.0-78/tez/lib/async-http-client-1.9.40.jar:/usr/hdp/3.1.0.0-78/tez/lib/slf4j-api-1.7.10.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-mapreduce-client-common-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-collections4-4.1.jar:/usr/hdp/3.1.0.0-78/tez/lib/jsr305-3.0.0.jar:/usr/hdp/3.1.0.0-78/tez/lib/jettison-1.3.4.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-codec-1.4.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-azure-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/metrics-core-3.1.0.jar:/usr/hdp/3.1.0.0-78/tez/lib/servlet-api-2.5.jar:/usr/hdp/3.1.0.0-78/tez/lib/jersey-json-1.19.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-lang-2.6.jar:/usr/hdp/3.1.0.0-78/tez/lib/hadoop-mapreduce-client-core-3.1.1.3.1.0.0-78.jar:/usr/hdp/3.1.0.0-78/tez/lib/commons-io-2.4.jar:/usr/hdp/3.1.0.0-78/tez/lib/gcs-connector-1.9.10.3.1.0.0-78-shaded.jar:/usr/hdp/3.1.0.0-78/tez/conf
STARTUP_MSG:   build = git://ctr-e138-1518143905142-512280-01-000004.hwx.site/grid/0/jenkins/workspace/HDP-parallel-ubuntu18/SOURCES/hive -r 56673b027117d8cb3400675b1680a4d992360808; compiled by 'jenkins' on Thu Dec 6 12:32:31 UTC 2018
************************************************************/
2019-04-17T01:08:24,198 INFO [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1056)) - Starting HiveServer2
2019-04-17T01:08:24,256 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2019-04-17T01:08:24,257 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.heapsize does not exist
2019-04-17T01:08:24,457 INFO [main]: metrics2.JsonFileMetricsReporter (:()) - Reporting metrics to /tmp/report.json
2019-04-17T01:08:24,504 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics991698410664457131json to /tmp/report.json
2019-04-17T01:08:24,504 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics991698410664457131json -> /tmp/report.json: Operation not permitted
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
	at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
	at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
	at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
	at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
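The two ERROR entries and this stack trace come from HiveServer2's JSON metrics reporter: on each reporting cycle it writes metrics to a temp file under /tmp and then renames it over /tmp/report.json (the default for hive.service.metrics.file.location). Because /tmp carries the sticky bit (mode 1777), rename(2) returns EPERM when the target file already exists and is owned by a different user, so the likely culprit is a stale /tmp/report.json left behind by an earlier run under another account; deleting that file or repointing the property should clear the error. A minimal sketch of the failing pattern, reconstructed from the Files.move call in the trace (not Hive's actual source; paths and content are illustrative):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    // Write-temp-then-rename pattern used by JsonFileMetricsReporter.
    public class MetricsRenameSketch {
        public static void main(String[] args) throws IOException {
            // Default hive.service.metrics.file.location.
            Path target = Paths.get("/tmp/report.json");
            // Matches the temp-file name in the log: prefix "hmetrics", suffix "json".
            Path tmp = Files.createTempFile(Paths.get("/tmp"), "hmetrics", "json");
            Files.write(tmp, "{}".getBytes());
            // On a sticky-bit directory, replacing an existing target owned by
            // another user fails with FileSystemException: Operation not permitted.
            Files.move(tmp, target, StandardCopyOption.REPLACE_EXISTING);
        }
    }

Comparing the owner of /tmp/report.json against the user running HiveServer2 (ls -l /tmp/report.json) should confirm this. The error is not fatal to startup; the reporter simply fails again on each scheduled run until the stale file is removed.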
2019-04-17T01:08:24,526 INFO [main]: impl.MetricsConfig (:()) - Loaded properties from hadoop-metrics2-hiveserver2.properties
2019-04-17T01:08:24,859 INFO [main]: timeline.HadoopTimelineMetricsSink (:()) - Initializing Timeline metrics sink.
2019-04-17T01:08:24,860 INFO [main]: timeline.HadoopTimelineMetricsSink (:()) - Identified hostname = ip-172-31-18-160.ec2.internal, serviceName = hiveserver2
2019-04-17T01:08:24,900 INFO [main]: availability.MetricSinkWriteShardHostnameHashingStrategy (:()) - Calculated collector shard ip-172-31-18-160.ec2.internal based on hostname: ip-172-31-18-160.ec2.internal
2019-04-17T01:08:24,900 INFO [main]: timeline.HadoopTimelineMetricsSink (:()) - Collector Uri: http://ip-172-31-18-160.ec2.internal:6188/ws/v1/timeline/metrics
2019-04-17T01:08:24,900 INFO [main]: timeline.HadoopTimelineMetricsSink (:()) - Container Metrics Uri: http://ip-172-31-18-160.ec2.internal:6188/ws/v1/timeline/containermetrics
2019-04-17T01:08:24,909 INFO [main]: impl.MetricsSinkAdapter (:()) - Sink timeline started
2019-04-17T01:08:24,925 INFO [main]: impl.MetricsSystemImpl (MetricsSystemImpl.java:startTimer(374)) - Scheduled Metric snapshot period at 10 second(s).
2019-04-17T01:08:24,926 INFO [main]: impl.MetricsSystemImpl (MetricsSystemImpl.java:start(191)) - hiveserver2 metrics system started
2019-04-17T01:08:24,970 INFO [main]: SessionState (:()) - Hive Session ID = 38639d06-e47d-4cb2-8ef7-9a18d953b897
2019-04-17T01:08:25,700 INFO [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/38639d06-e47d-4cb2-8ef7-9a18d953b897
2019-04-17T01:08:25,718 INFO [main]: session.SessionState (:()) - Created local directory: /tmp/hive/38639d06-e47d-4cb2-8ef7-9a18d953b897
2019-04-17T01:08:25,720 INFO [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/38639d06-e47d-4cb2-8ef7-9a18d953b897/_tmp_space.db
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/commons-configuration2-2.1.1.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/httpclient-4.5.3.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/stax2-api-3.1.4.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/woodstox-core-5.0.3.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/noggit-0.6.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/htrace-core4-4.1.0-incubating.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/httpmime-4.5.3.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/jersey-client-1.19.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/ranger-plugins-audit-1.2.0.3.1.0.0-78.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/httpcore-4.4.6.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/jersey-core-1.19.3.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/ranger-plugins-common-1.2.0.3.1.0.0-78.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/ranger-hive-plugin-1.2.0.3.1.0.0-78.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/javax.persistence-2.1.0.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/eclipselink-2.5.2.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/ranger-plugins-cred-1.2.0.3.1.0.0-78.jar
2019-04-17T01:08:25,731 INFO [main]: classloader.RangerPluginClassLoaderUtil (:()) - getFilesInDirectory('/usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl'): adding /usr/hdp/3.1.0.0-78/hive/lib/ranger-hive-plugin-impl/solr-solrj-6.6.1.jar
2019-04-17T01:08:25,750 WARN [main]: authorizer.RangerHiveAuthorizerBase (:()) - RangerHiveAuthorizerBase.RangerHiveAuthorizerBase(): hiveAuthenticator.getUserName() returned null/empty
2019-04-17T01:08:25,795 INFO [main]: config.RangerConfiguration (:()) - addResourceIfReadable(ranger-hive-audit.xml): resource file is file:/etc/hive/3.1.0.0-78/0/ranger-hive-audit.xml
2019-04-17T01:08:25,795 INFO [main]: config.RangerConfiguration (:()) - addResourceIfReadable(ranger-hive-security.xml): resource file is file:/etc/hive/3.1.0.0-78/0/ranger-hive-security.xml
2019-04-17T01:08:25,797 INFO [main]: provider.AuditProviderFactory (:()) - AuditProviderFactory: creating..
2019-04-17T01:08:25,797 INFO [main]: provider.AuditProviderFactory (:()) - AuditProviderFactory: initializing..
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: ranger.plugin.hive.policy.source.impl=org.apache.ranger.admin.client.RangerAdminRESTClient
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: ranger.plugin.hive.policy.pollIntervalMs=30000
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: xasecure.audit.destination.solr=true
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: xasecure.hive.update.xapolicies.on.grant.revoke=true
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: ranger.plugin.hive.policy.rest.url=http://ip-172-31-18-160.ec2.internal:6080
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: ranger.plugin.hive.urlauth.filesystem.schemes=hdfs:,file:,wasb:,adl:
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: ranger.plugin.hive.ambari.cluster.name=RxProfiler
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: xasecure.audit.destination.hdfs.batch.filespool.dir=/var/log/hive/audit/hdfs/spool
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: xasecure.audit.destination.solr.batch.filespool.dir=/var/log/hive/audit/solr/spool
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: xasecure.audit.is.enabled=true
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: xasecure.audit.destination.hdfs=true
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: ranger.plugin.hive.policy.cache.dir=/etc/ranger/RxProfiler_hive/policycache
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: xasecure.audit.provider.summary.enabled=false
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: xasecure.audit.destination.hdfs.dir=hdfs://ip-172-31-18-160.ec2.internal:8020/ranger/audit
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: ranger.plugin.hive.service.name=RxProfiler_hive
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: xasecure.audit.destination.solr.zookeepers=ip-172-31-18-160.ec2.internal:2181/infra-solr
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - AUDIT PROPERTY: ranger.plugin.hive.policy.rest.ssl.config.file=/usr/hdp/current/hive-server2/conf/ranger-policymgr-ssl.xml
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - Audit destination xasecure.audit.destination.solr is set to true
2019-04-17T01:08:25,826 INFO [main]: provider.AuditProviderFactory (:()) - Audit destination xasecure.audit.destination.hdfs is set to true
2019-04-17T01:08:25,832 INFO [main]: destination.AuditDestination (:()) - AuditDestination() enter
2019-04-17T01:08:25,833 INFO [main]: destination.SolrAuditDestination (:()) - init() called
2019-04-17T01:08:25,833 INFO [main]: provider.BaseAuditHandler (:()) - BaseAuditProvider.init()
2019-04-17T01:08:25,833 INFO [main]: provider.BaseAuditHandler (:()) - propPrefix=xasecure.audit.destination.solr
2019-04-17T01:08:25,833 INFO [main]: provider.BaseAuditHandler (:()) - Using providerName from property prefix. providerName=solr
2019-04-17T01:08:25,833 INFO [main]: provider.BaseAuditHandler (:()) - providerName=solr
2019-04-17T01:08:25,833 INFO [main]: destination.SolrAuditDestination (:()) - ==>SolrAuditDestination.init()
2019-04-17T01:08:25,833 INFO [main]: destination.SolrAuditDestination (:()) - In solrAuditDestination.init() : JAAS Configuration set as [null]
2019-04-17T01:08:25,833 WARN [main]: destination.SolrAuditDestination (:()) - No Client JAAS config present in solr audit config. Ranger Audit to Kerberized Solr will fail...
2019-04-17T01:08:25,833 INFO [main]: destination.SolrAuditDestination (:()) - Loading SolrClient JAAS config from Ranger audit config if present...
2019-04-17T01:08:25,839 INFO [main]: destination.SolrAuditDestination (:()) - In solrAuditDestination.init() (finally) : JAAS Configuration set as [null]
2019-04-17T01:08:25,839 INFO [main]: destination.SolrAuditDestination (:()) - <==SolrAuditDestination.init()
2019-04-17T01:08:25,839 INFO [main]: destination.SolrAuditDestination (:()) - Solr zkHosts=ip-172-31-18-160.ec2.internal:2181/infra-solr, solrURLs=null, collectionName=ranger_audits
2019-04-17T01:08:25,839 INFO [main]: destination.SolrAuditDestination (:()) - Connecting to solr cloud using zkHosts=ip-172-31-18-160.ec2.internal:2181/infra-solr
2019-04-17T01:08:25,864 WARN [main]: impl.Krb5HttpClientConfigurer (:()) - org.apache.solr.client.solrj.impl.Krb5HttpClientConfigurer is configured without specifying system property 'java.security.auth.login.config'
2019-04-17T01:08:25,868 INFO [main]: provider.AuditProviderFactory (:()) - xasecure.audit.destination.solr.queue is not set. Setting queue to batch for solr
2019-04-17T01:08:25,868 INFO [main]: provider.AuditProviderFactory (:()) - queue for solr is batch
2019-04-17T01:08:25,869 INFO [main]: queue.AuditQueue (:()) - BaseAuditProvider.init()
2019-04-17T01:08:25,869 INFO [main]: provider.BaseAuditHandler (:()) - BaseAuditProvider.init()
2019-04-17T01:08:25,869 INFO [main]: provider.BaseAuditHandler (:()) - propPrefix=xasecure.audit.destination.solr.batch
2019-04-17T01:08:25,869 INFO [main]: provider.BaseAuditHandler (:()) - providerName=batch
2019-04-17T01:08:25,869 INFO [main]: queue.AuditQueue (:()) - File spool is enabled for batch, logFolderProp=/var/log/hive/audit/solr/spool, xasecure.audit.destination.solr.batch.filespool.dir=false
2019-04-17T01:08:25,871 INFO [main]: queue.AuditFileSpool (:()) - retryDestinationMS=30000, queueName=batch
2019-04-17T01:08:25,871 INFO [main]: queue.AuditFileSpool (:()) - fileRolloverSec=86400, queueName=batch
2019-04-17T01:08:25,871 INFO [main]: queue.AuditFileSpool (:()) - maxArchiveFiles=100, queueName=batch
2019-04-17T01:08:25,871 INFO [main]: queue.AuditFileSpool (:()) - logFolder=/var/log/hive/audit/solr/spool, queueName=batch
2019-04-17T01:08:25,871 INFO [main]: queue.AuditFileSpool (:()) - logFileNameFormat=spool_%app-type%_%time:yyyyMMdd-HHmm.ss%.log, queueName=batch
2019-04-17T01:08:25,871 INFO [main]: queue.AuditFileSpool (:()) - archiveFolder=/var/log/hive/audit/solr/spool/archive, queueName=batch
2019-04-17T01:08:25,871 INFO [main]: queue.AuditFileSpool (:()) - indexFile=/var/log/hive/audit/solr/spool/index_batch_batch.solr_hiveServer2.json, queueName=batch
2019-04-17T01:08:25,871 INFO [main]: queue.AuditFileSpool (:()) - indexDoneFile=/var/log/hive/audit/solr/spool/index_batch_batch.solr_hiveServer2_closed.json, queueName=batch
2019-04-17T01:08:25,871 INFO [main]: queue.AuditFileSpool (:()) - Loading index file. fileName=/var/log/hive/audit/solr/spool/index_batch_batch.solr_hiveServer2.json
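The SolrAuditDestination WARN above ("No Client JAAS config present in solr audit config") only matters on a Kerberized cluster; this server is running with auth:SIMPLE (visible in the RetryingMetaStoreClient lines further down), so Solr auditing proceeds without it. On a Kerberized Solr, the JVM would need java.security.auth.login.config set before the audit destination initializes. A minimal sketch of that, assuming a hypothetical JAAS file path not taken from this log:

// Sketch only: sets the system property named in the Krb5HttpClientConfigurer
// warning above. The file path is an assumption for illustration.
public class JaasBootstrap {
    public static void main(String[] args) {
        System.setProperty("java.security.auth.login.config",
                "/etc/hive/conf/hive_jaas.conf"); // hypothetical client JAAS file
        // ...then start whatever component builds the Kerberized SolrClient.
    }
}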
2019-04-17T01:08:25,871 INFO [main]: queue.AuditFileSpool (:()) - INDEX printIndex() ==== START
2019-04-17T01:08:25,871 INFO [main]: queue.AuditFileSpool (:()) - INDEX printIndex() ==== END
2019-04-17T01:08:25,872 INFO [main]: destination.AuditDestination (:()) - AuditDestination() enter
2019-04-17T01:08:25,872 INFO [main]: provider.BaseAuditHandler (:()) - BaseAuditProvider.init()
2019-04-17T01:08:25,872 INFO [main]: provider.BaseAuditHandler (:()) - propPrefix=xasecure.audit.destination.hdfs
2019-04-17T01:08:25,872 INFO [main]: provider.BaseAuditHandler (:()) - Using providerName from property prefix. providerName=hdfs
2019-04-17T01:08:25,872 INFO [main]: provider.BaseAuditHandler (:()) - providerName=hdfs
2019-04-17T01:08:25,873 INFO [main]: destination.HDFSAuditDestination (:()) - logFolder=hdfs://ip-172-31-18-160.ec2.internal:8020/ranger/audit/%app-type%/%time:yyyyMMdd%, destName=hdfs
2019-04-17T01:08:25,873 INFO [main]: destination.HDFSAuditDestination (:()) - logFileNameFormat=%app-type%_ranger_audit_%hostname%.log, destName=hdfs
2019-04-17T01:08:25,873 INFO [main]: destination.HDFSAuditDestination (:()) - config={}
2019-04-17T01:08:25,873 INFO [main]: provider.AuditProviderFactory (:()) - xasecure.audit.destination.hdfs.queue is not set. Setting queue to batch for hdfs
2019-04-17T01:08:25,873 INFO [main]: provider.AuditProviderFactory (:()) - queue for hdfs is batch
2019-04-17T01:08:25,873 INFO [main]: queue.AuditQueue (:()) - BaseAuditProvider.init()
2019-04-17T01:08:25,873 INFO [main]: provider.BaseAuditHandler (:()) - BaseAuditProvider.init()
2019-04-17T01:08:25,873 INFO [main]: provider.BaseAuditHandler (:()) - propPrefix=xasecure.audit.destination.hdfs.batch
2019-04-17T01:08:25,873 INFO [main]: provider.BaseAuditHandler (:()) - providerName=batch
2019-04-17T01:08:25,874 INFO [main]: queue.AuditQueue (:()) - File spool is enabled for batch, logFolderProp=/var/log/hive/audit/hdfs/spool, xasecure.audit.destination.hdfs.batch.filespool.dir=false
2019-04-17T01:08:25,874 INFO [main]: queue.AuditFileSpool (:()) - retryDestinationMS=30000, queueName=batch
2019-04-17T01:08:25,874 INFO [main]: queue.AuditFileSpool (:()) - fileRolloverSec=86400, queueName=batch
2019-04-17T01:08:25,874 INFO [main]: queue.AuditFileSpool (:()) - maxArchiveFiles=100, queueName=batch
2019-04-17T01:08:25,874 INFO [main]: queue.AuditFileSpool (:()) - logFolder=/var/log/hive/audit/hdfs/spool, queueName=batch
2019-04-17T01:08:25,874 INFO [main]: queue.AuditFileSpool (:()) - logFileNameFormat=spool_%app-type%_%time:yyyyMMdd-HHmm.ss%.log, queueName=batch
2019-04-17T01:08:25,874 INFO [main]: queue.AuditFileSpool (:()) - archiveFolder=/var/log/hive/audit/hdfs/spool/archive, queueName=batch
2019-04-17T01:08:25,874 INFO [main]: queue.AuditFileSpool (:()) - indexFile=/var/log/hive/audit/hdfs/spool/index_batch_batch.hdfs_hiveServer2.json, queueName=batch
2019-04-17T01:08:25,874 INFO [main]: queue.AuditFileSpool (:()) - indexDoneFile=/var/log/hive/audit/hdfs/spool/index_batch_batch.hdfs_hiveServer2_closed.json, queueName=batch
2019-04-17T01:08:25,874 INFO [main]: queue.AuditFileSpool (:()) - Loading index file. fileName=/var/log/hive/audit/hdfs/spool/index_batch_batch.hdfs_hiveServer2.json
2019-04-17T01:08:25,874 INFO [main]: queue.AuditFileSpool (:()) - INDEX printIndex() ==== START
2019-04-17T01:08:25,874 INFO [main]: queue.AuditFileSpool (:()) - INDEX printIndex() ==== END
2019-04-17T01:08:25,874 INFO [main]: provider.AuditProviderFactory (:()) - Using v3 audit configuration
2019-04-17T01:08:25,874 INFO [main]: provider.AuditProviderFactory (:()) - MultiDestAuditProvider is used. Destination count=2
2019-04-17T01:08:25,875 INFO [main]: provider.MultiDestAuditProvider (:()) - MultiDestAuditProvider: creating..
2019-04-17T01:08:25,875 INFO [main]: provider.MultiDestAuditProvider (:()) - MultiDestAuditProvider.init()
2019-04-17T01:08:25,875 INFO [main]: provider.BaseAuditHandler (:()) - BaseAuditProvider.init()
2019-04-17T01:08:25,875 INFO [main]: provider.BaseAuditHandler (:()) - propPrefix=xasecure.audit.provider
2019-04-17T01:08:25,875 INFO [main]: provider.BaseAuditHandler (:()) - providerName=multi_dest
2019-04-17T01:08:25,875 INFO [main]: provider.MultiDestAuditProvider (:()) - Adding batch as consumer to MultiDestination multi_dest
2019-04-17T01:08:25,875 INFO [main]: provider.MultiDestAuditProvider (:()) - MultiDestAuditProvider.addAuditProvider(providerType=org.apache.ranger.audit.queue.AuditBatchQueue)
2019-04-17T01:08:25,875 INFO [main]: provider.MultiDestAuditProvider (:()) - Adding batch as consumer to MultiDestination multi_dest
2019-04-17T01:08:25,875 INFO [main]: provider.MultiDestAuditProvider (:()) - MultiDestAuditProvider.addAuditProvider(providerType=org.apache.ranger.audit.queue.AuditBatchQueue)
2019-04-17T01:08:25,875 INFO [main]: provider.AuditProviderFactory (:()) - AuditSummaryQueue is disabled
2019-04-17T01:08:25,876 INFO [main]: queue.AuditQueue (:()) - BaseAuditProvider.init()
2019-04-17T01:08:25,876 INFO [main]: provider.BaseAuditHandler (:()) - BaseAuditProvider.init()
2019-04-17T01:08:25,876 INFO [main]: provider.BaseAuditHandler (:()) - propPrefix=xasecure.audit.provider.async
2019-04-17T01:08:25,876 INFO [main]: provider.BaseAuditHandler (:()) - providerName=async
2019-04-17T01:08:25,876 INFO [main]: queue.AuditQueue (:()) - File spool is disabled for async
2019-04-17T01:08:25,876 INFO [main]: provider.AuditProviderFactory (:()) - Starting audit queue hiveServer2.async
2019-04-17T01:08:25,876 INFO [main]: queue.AuditBatchQueue (:()) - Creating ArrayBlockingQueue with maxSize=1048576
2019-04-17T01:08:25,878 INFO [main]: queue.AuditFileSpool (:()) - Starting writerThread, queueName=hiveServer2.async.multi_dest.batch, consumer=hiveServer2.async.multi_dest.batch.solr
2019-04-17T01:08:25,878 INFO [main]: queue.AuditBatchQueue (:()) - Creating ArrayBlockingQueue with maxSize=1048576
2019-04-17T01:08:25,880 INFO [main]: queue.AuditFileSpool (:()) - Starting writerThread, queueName=hiveServer2.async.multi_dest.batch, consumer=hiveServer2.async.multi_dest.batch.hdfs
2019-04-17T01:08:25,883 INFO [Ranger async Audit cleanup]: provider.AuditProviderFactory (:()) - RangerAsyncAuditCleanup: Waiting to audit cleanup start signal
2019-04-17T01:08:25,896 INFO [main]: service.RangerBasePlugin (:()) - PolicyEngineOptions: { evaluatorType: auto, cacheAuditResult: false, disableContextEnrichers: false, disableCustomConditions: false, disableTrieLookupPrefilter: false, optimizeTrieForRetrieval: false }
2019-04-17T01:08:26,418 INFO [main]: util.PolicyRefresher (:()) - PolicyRefresher(serviceName=RxProfiler_hive): found updated version. lastKnownVersion=-1; newVersion=6
2019-04-17T01:08:26,468 INFO [main]: policyengine.RangerPolicyRepository (:()) - This policy engine contains 5 policy evaluators
2019-04-17T01:08:26,469 INFO [main]: util.RangerResourceTrie (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,469 INFO [main]: resourcetrie.init (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,470 INFO [main]: util.RangerResourceTrie (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,470 INFO [main]: resourcetrie.init (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,470 INFO [main]: util.RangerResourceTrie (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,470 INFO [main]: resourcetrie.init (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,470 INFO [main]: util.RangerResourceTrie (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,470 INFO [main]: resourcetrie.init (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,471 INFO [main]: util.RangerResourceTrie (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,471 INFO [main]: resourcetrie.init (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,471 INFO [main]: util.RangerResourceTrie (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,471 INFO [main]: resourcetrie.init (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,471 INFO [main]: util.RangerResourceTrie (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,471 INFO [main]: resourcetrie.init (:()) - builderThreadCount is set to [1]
2019-04-17T01:08:26,471 INFO [main]: service.RangerBasePlugin (:()) - Policies will NOT be reordered based on number of evaluations
2019-04-17T01:08:26,474 WARN [main]: session.SessionState (:()) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
2019-04-17T01:08:26,839 INFO [main]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083
2019-04-17T01:08:26,859 INFO [main]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 1
2019-04-17T01:08:26,869 INFO [main]: metastore.HiveMetaStoreClient (:()) - Connected to metastore.
2019-04-17T01:08:26,869 INFO [main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0
2019-04-17T01:08:26,990 INFO [main]: service.CompositeService (:()) - Operation log root directory is created: /tmp/hive/operation_logs
2019-04-17T01:08:26,991 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread pool size: 100
2019-04-17T01:08:26,991 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread wait queue size: 100
2019-04-17T01:08:26,991 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread keepalive time: 10 seconds
2019-04-17T01:08:26,994 INFO [main]: service.CompositeService (:()) - Connections limit are user: 0 ipaddress: 0 user-ipaddress: 0
2019-04-17T01:08:27,000 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:OperationManager is inited.
2019-04-17T01:08:27,000 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:SessionManager is inited.
2019-04-17T01:08:27,000 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:CLIService is inited.
2019-04-17T01:08:27,001 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:ThriftBinaryCLIService is inited.
2019-04-17T01:08:27,001 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:HiveServer2 is inited.
2019-04-17T01:08:27,066 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2019-04-17T01:08:27,067 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.heapsize does not exist
2019-04-17T01:08:27,068 INFO [main]: metastore.HiveMetaStoreClient (:()) - Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
2019-04-17T01:08:27,073 INFO [main]: metastore.HiveMetaStoreClient (:()) - Closed a connection to metastore, current connections: 0
2019-04-17T01:08:27,074 INFO [HiveMaterializedViewsRegistry-0]: SessionState (:()) - Hive Session ID = b9a27790-42ad-4e0b-9a76-ac178b5716e7
2019-04-17T01:08:27,081 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/b9a27790-42ad-4e0b-9a76-ac178b5716e7
2019-04-17T01:08:27,083 INFO [main]: results.QueryResultsCache (:()) - Initializing query results cache at /tmp/hive/_resultscache_
2019-04-17T01:08:27,083 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created local directory: /tmp/hive/b9a27790-42ad-4e0b-9a76-ac178b5716e7
2019-04-17T01:08:27,085 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/b9a27790-42ad-4e0b-9a76-ac178b5716e7/_tmp_space.db
2019-04-17T01:08:27,087 INFO [main]: results.QueryResultsCache (:()) - Query results cache: cacheDirectory /tmp/hive/_resultscache_/results-c31e2b41-204c-495f-a99b-d150a117b350, maxCacheSize 2147483648, maxEntrySize 10485760, maxEntryLifetime 3600000
2019-04-17T01:08:27,121 INFO [main]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083
2019-04-17T01:08:27,121 INFO [main]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 1
2019-04-17T01:08:27,123 INFO [main]: metastore.HiveMetaStoreClient (:()) - Connected to metastore.
2019-04-17T01:08:27,123 INFO [main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0
2019-04-17T01:08:27,136 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - User of session id b9a27790-42ad-4e0b-9a76-ac178b5716e7 is hive
2019-04-17T01:08:27,143 INFO [main]: events.NotificationEventPoll (:()) - Initializing lastCheckedEventId to 89
2019-04-17T01:08:27,144 INFO [main]: server.HiveServer2 (HiveServer2.java:init(315)) - Starting Web UI on port 10002
2019-04-17T01:08:27,153 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Localizing resource because it does not exist: file:/usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar to dest: hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/b9a27790-42ad-4e0b-9a76-ac178b5716e7-resources/hive-hcatalog-core.jar
2019-04-17T01:08:27,154 INFO [main]: server.HiveServer2 (HiveServer2.java:init(370)) - CORS enabled - allowed-origins: * allowed-methods: GET,POST,DELETE,HEAD allowed-headers: X-Requested-With,Content-Type,Accept,Origin,X-Requested-By,x-requested-by
2019-04-17T01:08:27,178 INFO [main]: util.log (:()) - Logging initialized @5771ms
2019-04-17T01:08:27,441 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Resource modification time: 1555463307387 for hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/b9a27790-42ad-4e0b-9a76-ac178b5716e7-resources/hive-hcatalog-core.jar
2019-04-17T01:08:27,450 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:OperationManager is started.
2019-04-17T01:08:27,450 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:SessionManager is started.
2019-04-17T01:08:27,454 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:CLIService is started.
2019-04-17T01:08:27,454 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:ThriftBinaryCLIService is started.
2019-04-17T01:08:27,536 ERROR [main]: service.CompositeService (CompositeService.java:start(74)) - Error starting services HiveServer2
java.lang.RuntimeException: Failed to init thrift server
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:162) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:706) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1073) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
Caused by: org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:2181.
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:109) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 13 more
Caused by: java.net.BindException: Address already in use (Bind failed)
    at java.net.PlainSocketImpl.socketBind(Native Method) ~[?:1.8.0_112]
    at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:387) ~[?:1.8.0_112]
    at java.net.ServerSocket.bind(ServerSocket.java:375) ~[?:1.8.0_112]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:106) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 13 more
2019-04-17T01:08:27,537 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:ThriftBinaryCLIService is stopped.
2019-04-17T01:08:27,537 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:OperationManager is stopped.
2019-04-17T01:08:27,537 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:SessionManager is stopped.
2019-04-17T01:08:27,538 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:CLIService is stopped.
2019-04-17T01:08:27,538 INFO [main]: metastore.HiveMetaStoreClient (:()) - Closed a connection to metastore, current connections: 0
2019-04-17T01:08:27,538 INFO [main]: server.HiveServer2 (HiveServer2.java:stop(913)) - Shutting down HiveServer2
2019-04-17T01:08:27,538 INFO [main]: server.HiveServer2 (HiveServer2.java:stop(925)) - Web UI has stopped
2019-04-17T01:08:27,539 ERROR [main]: server.HiveServer2 (HiveServer2.java:stop(944)) - Error removing znode for this HiveServer2 instance from ZooKeeper.
java.lang.NullPointerException: null
    at org.apache.hive.service.server.HiveServer2.removeServerInstanceFromZooKeeper(HiveServer2.java:677) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.stop(HiveServer2.java:942) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1090) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
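The root cause is visible in the chain above: the binary Thrift service is trying to bind 0.0.0.0:2181, which is ZooKeeper's default client port, and the same host:2181 already serves the Solr/ZooKeeper quorum earlier in this log, so the bind can never succeed. HiveServer2's Thrift port defaults to 10000 (hive.server2.thrift.port); a value of 2181 here strongly suggests a misconfiguration, and the fix is to point that property back at a free port. A minimal stand-alone reproduction of the failure mode (illustration only, not Hive code; run it on a machine where port 2181 is otherwise free):

import java.net.BindException;
import java.net.ServerSocket;

// Two binds to the same port: the second fails exactly like the Thrift
// ServerSocket in the stack trace above.
public class PortConflictDemo {
    public static void main(String[] args) throws Exception {
        try (ServerSocket zooKeeperStandIn = new ServerSocket(2181)) {
            // Stands in for HiveServer2's Thrift socket; throws BindException.
            new ServerSocket(2181);
        } catch (BindException e) {
            System.out.println("Second bind failed: " + e.getMessage()); // "Address already in use"
        }
    }
}

The NullPointerException that follows is secondary: the shutdown path tries to remove a ZooKeeper znode that was never registered because startup failed before service discovery was set up.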
2019-04-17T01:08:27,540 INFO [main]: server.HiveServer2 (HiveServer2.java:stopOrDisconnectTezSessions(890)) - Stopping/Disconnecting tez sessions.
2019-04-17T01:08:27,540 WARN [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1100)) - Error starting HiveServer2 on attempt 1, will retry in 60000ms
org.apache.hive.service.ServiceException: Failed to Start HiveServer2
    at org.apache.hive.service.CompositeService.start(CompositeService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:706) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1073) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
Caused by: java.lang.RuntimeException: Failed to init thrift server
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:162) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 11 more
Caused by: org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:2181.
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:109) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 11 more
Caused by: java.net.BindException: Address already in use (Bind failed)
    at java.net.PlainSocketImpl.socketBind(Native Method) ~[?:1.8.0_112]
    at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:387) ~[?:1.8.0_112]
    at java.net.ServerSocket.bind(ServerSocket.java:375) ~[?:1.8.0_112]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:106) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 11 more
2019-04-17T01:08:27,551 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - Created new resources: null
2019-04-17T01:08:27,553 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Jar dir is null / directory doesn't exist. Choosing HIVE_INSTALL_DIR - /user/hive/.hiveJars
2019-04-17T01:08:27,872 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - Computed sha: e352011a83e63f76afe26d6047287402e4b48656f467ef20f44dc74ea57c960d for file: file:/usr/hdp/3.1.0.0-78/hive/lib/hive-exec-3.1.0.3.1.0.0-78.jar of length: 39.59MB in 313 ms
2019-04-17T01:08:27,874 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Resource modification time: 1554360664415 for hdfs://ip-172-31-18-160.ec2.internal:8020/user/hive/.hiveJars/hive-exec-3.1.0.3.1.0.0-78-e352011a83e63f76afe26d6047287402e4b48656f467ef20f44dc74ea57c960d.jar
2019-04-17T01:08:27,923 WARN [HiveMaterializedViewsRegistry-0]: authorizer.RangerHiveAuthorizerBase (:()) - RangerHiveAuthorizerBase.RangerHiveAuthorizerBase(): hiveAuthenticator.getUserName() returned null/empty
2019-04-17T01:08:27,923 WARN [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
2019-04-17T01:08:27,925 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083
2019-04-17T01:08:27,925 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 1
2019-04-17T01:08:27,927 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Connected to metastore.
2019-04-17T01:08:27,927 INFO [HiveMaterializedViewsRegistry-0]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0
2019-04-17T01:08:28,016 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Tez Client Version: [ component=tez-api, version=0.9.1.3.1.0.0-78, revision=346318364f71536cb051ee88e9ee84e55b7e3e13, SCM-URL=scm:git:https://git-wip-us.apache.org/repos/asf/tez.git, buildTime=2018-12-06T12:19:14Z ]
2019-04-17T01:08:28,016 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - Opening new Tez Session (id: b9a27790-42ad-4e0b-9a76-ac178b5716e7, scratch dir: hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/b9a27790-42ad-4e0b-9a76-ac178b5716e7)
2019-04-17T01:08:28,078 INFO [HiveMaterializedViewsRegistry-0]: client.RMProxy (:()) - Connecting to ResourceManager at ip-172-31-18-160.ec2.internal/172.31.18.160:8050
2019-04-17T01:08:28,317 INFO [HiveMaterializedViewsRegistry-0]: client.AHSProxy (:()) - Connecting to Application History server at ip-172-31-18-160.ec2.internal/172.31.18.160:10200
2019-04-17T01:08:28,324 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Session mode. Starting session.
2019-04-17T01:08:28,360 INFO [HiveMaterializedViewsRegistry-0]: client.TezClientUtils (:()) - Using tez.lib.uris value from configuration: /hdp/apps/3.1.0.0-78/tez/tez.tar.gz
2019-04-17T01:08:28,360 INFO [HiveMaterializedViewsRegistry-0]: client.TezClientUtils (:()) - Using tez.lib.uris.classpath value from configuration: null
2019-04-17T01:08:28,377 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Tez system stage directory hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/b9a27790-42ad-4e0b-9a76-ac178b5716e7/.tez/application_1555392406623_0036 doesn't exist and is created
2019-04-17T01:08:28,454 INFO [HiveMaterializedViewsRegistry-0]: conf.Configuration (Configuration.java:getConfResourceAsInputStream(2756)) - found resource resource-types.xml at file:/etc/hadoop/3.1.0.0-78/0/resource-types.xml
2019-04-17T01:08:28,492 INFO [HiveMaterializedViewsRegistry-0]: Configuration.deprecation (Configuration.java:logDeprecation(1395)) - yarn.resourcemanager.zk-timeout-ms is deprecated. Instead, use hadoop.zk.timeout-ms
2019-04-17T01:08:28,493 INFO [HiveMaterializedViewsRegistry-0]: Configuration.deprecation (Configuration.java:logDeprecation(1395)) - yarn.resourcemanager.zk-retry-interval-ms is deprecated. Instead, use hadoop.zk.retry-interval-ms
2019-04-17T01:08:28,493 INFO [HiveMaterializedViewsRegistry-0]: Configuration.deprecation (Configuration.java:logDeprecation(1395)) - yarn.resourcemanager.display.per-user-apps is deprecated. Instead, use yarn.webapp.filter-entity-list-by-user
2019-04-17T01:08:28,494 INFO [HiveMaterializedViewsRegistry-0]: Configuration.deprecation (Configuration.java:logDeprecation(1395)) - yarn.resourcemanager.zk-num-retries is deprecated. Instead, use hadoop.zk.num-retries
2019-04-17T01:08:28,494 INFO [HiveMaterializedViewsRegistry-0]: Configuration.deprecation (Configuration.java:logDeprecation(1395)) - yarn.resourcemanager.zk-address is deprecated. Instead, use hadoop.zk.address
2019-04-17T01:08:28,495 INFO [HiveMaterializedViewsRegistry-0]: Configuration.deprecation (Configuration.java:logDeprecation(1395)) - yarn.resourcemanager.zk-acl is deprecated. Instead, use hadoop.zk.acl
2019-04-17T01:08:28,496 INFO [HiveMaterializedViewsRegistry-0]: Configuration.deprecation (Configuration.java:logDeprecation(1395)) - yarn.resourcemanager.system-metrics-publisher.enabled is deprecated. Instead, use yarn.system-metrics-publisher.enabled
2019-04-17T01:08:28,627 INFO [HiveMaterializedViewsRegistry-0]: impl.YarnClientImpl (:()) - Submitted application application_1555392406623_0036
2019-04-17T01:08:28,629 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - The url to track the Tez Session: http://ip-172-31-18-160.ec2.internal:8088/proxy/application_1555392406623_0036/
2019-04-17T01:08:31,226 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083
2019-04-17T01:08:31,226 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 2
2019-04-17T01:08:31,227 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Connected to metastore.
2019-04-17T01:08:31,227 INFO [HiveMaterializedViewsRegistry-0]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0
2019-04-17T01:08:31,262 INFO [HiveMaterializedViewsRegistry-0]: metadata.HiveMaterializedViewsRegistry (:()) - Materialized views registry has been initialized
2019-04-17T01:09:27,190 WARN [NotificationEventPoll 0]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2019-04-17T01:09:27,190 WARN [NotificationEventPoll 0]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.heapsize does not exist
2019-04-17T01:09:27,193 INFO [NotificationEventPoll 0]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083
2019-04-17T01:09:27,193 INFO [NotificationEventPoll 0]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 3
2019-04-17T01:09:27,194 INFO [NotificationEventPoll 0]: metastore.HiveMetaStoreClient (:()) - Connected to metastore.
2019-04-17T01:09:27,194 INFO [NotificationEventPoll 0]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0
2019-04-17T01:09:27,541 INFO [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1056)) - Starting HiveServer2
2019-04-17T01:09:27,591 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2019-04-17T01:09:27,592 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.heapsize does not exist
2019-04-17T01:09:27,593 INFO [main]: metrics2.JsonFileMetricsReporter (:()) - Reporting metrics to /tmp/report.json
2019-04-17T01:09:27,595 WARN [main]: impl.MetricsSystemImpl (MetricsSystemImpl.java:init(151)) - hiveserver2 metrics system already initialized!
2019-04-17T01:09:27,595 WARN [main]: server.HiveServer2 (HiveServer2.java:init(209)) - Could not initiate the HiveServer2 Metrics system. Metrics may not be reported.
java.lang.reflect.InvocationTargetException: null
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:42) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:206) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
Caused by: java.lang.IllegalArgumentException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:437) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 16 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 16 more
Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:152) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:125) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:206) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:62) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter$Builder.build(HadoopMetrics2Reporter.java:162) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter.<init>(Metrics2Reporter.java:45) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 16 more
2019-04-17T01:09:27,595 ERROR [main]: metrics2.CodahaleMetrics (:()) - Unable to instantiate using constructor(MetricRegistry, HiveConf) for reporter org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter from conf HIVE_CODAHALE_METRICS_REPORTER_CLASSES
java.lang.reflect.InvocationTargetException: null
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:42) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:206) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:152) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:125) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:206) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:62) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter$Builder.build(HadoopMetrics2Reporter.java:162) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter.<init>(Metrics2Reporter.java:45) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 23 more
2019-04-17T01:09:27,597 INFO [main]: SessionState (:()) - Hive Session ID = f79a8754-2ff4-47bf-a469-fb068c1596c2
2019-04-17T01:09:27,602 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics2928432000641551637json to /tmp/report.json
2019-04-17T01:09:27,602 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics2928432000641551637json -> /tmp/report.json: Operation not permitted
    at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
    at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
    at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
    at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
    at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
2019-04-17T01:09:27,607 INFO [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/f79a8754-2ff4-47bf-a469-fb068c1596c2
2019-04-17T01:09:27,608 INFO [main]: session.SessionState (:()) - Created local directory: /tmp/hive/f79a8754-2ff4-47bf-a469-fb068c1596c2
2019-04-17T01:09:27,609 INFO [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/f79a8754-2ff4-47bf-a469-fb068c1596c2/_tmp_space.db
2019-04-17T01:09:27,610 WARN [main]: authorizer.RangerHiveAuthorizerBase (:()) - RangerHiveAuthorizerBase.RangerHiveAuthorizerBase(): hiveAuthenticator.getUserName() returned null/empty
2019-04-17T01:09:27,610 WARN [main]: session.SessionState (:()) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
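The metrics errors above are a side effect of the retry loop, not an independent fault: attempt 1 registered a Hadoop metrics2 source named "hiveserver2" and never unregistered it when startup failed, so when attempt 2 re-initializes CodahaleMetrics in the same JVM, DefaultMetricsSystem rejects the duplicate name, and every further retry will log the same "already exists" error. The gatekeeping pattern is a process-wide name registry; a minimal stand-alone mimic of that behavior (plain Java, deliberately not the Hadoop classes themselves):

import java.util.HashMap;
import java.util.Map;

// Mimics the DefaultMetricsSystem.newSourceName() check seen in the stack
// trace above: a static registry that refuses a second registration under
// the same name. Illustration only.
public class SourceNameRegistryDemo {
    private static final Map<String, Object> SOURCES = new HashMap<>();

    static synchronized void register(String name, Object source) {
        if (SOURCES.containsKey(name)) {
            throw new IllegalStateException("Metrics source " + name + " already exists!");
        }
        SOURCES.put(name, source);
    }

    public static void main(String[] args) {
        register("hiveserver2", new Object()); // startup attempt 1: succeeds
        register("hiveserver2", new Object()); // attempt 2, same JVM: throws
    }
}

The /tmp/report.json rename failure just above is separate and typically indicates that /tmp/report.json already exists and is owned by a different user: with the sticky bit on /tmp, Files.move with REPLACE_EXISTING cannot unlink another user's file, which surfaces as "Operation not permitted".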
2019-04-17T01:09:27,611 INFO [main]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083
2019-04-17T01:09:27,611 INFO [main]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 4
2019-04-17T01:09:27,613 INFO [main]: metastore.HiveMetaStoreClient (:()) - Connected to metastore.
2019-04-17T01:09:27,613 INFO [main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0
2019-04-17T01:09:27,613 INFO [main]: service.CompositeService (:()) - Operation log root directory is created: /tmp/hive/operation_logs
2019-04-17T01:09:27,613 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread pool size: 100
2019-04-17T01:09:27,613 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread wait queue size: 100
2019-04-17T01:09:27,613 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread keepalive time: 10 seconds
2019-04-17T01:09:27,613 INFO [main]: service.CompositeService (:()) - Connections limit are user: 0 ipaddress: 0 user-ipaddress: 0
2019-04-17T01:09:27,614 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:OperationManager is inited.
2019-04-17T01:09:27,614 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:SessionManager is inited.
2019-04-17T01:09:27,614 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:CLIService is inited.
2019-04-17T01:09:27,615 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:ThriftBinaryCLIService is inited.
2019-04-17T01:09:27,615 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:HiveServer2 is inited.
2019-04-17T01:09:27,661 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2019-04-17T01:09:27,662 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.heapsize does not exist
2019-04-17T01:09:27,663 INFO [main]: metastore.HiveMetaStoreClient (:()) - Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
2019-04-17T01:09:27,663 INFO [main]: metastore.HiveMetaStoreClient (:()) - Closed a connection to metastore, current connections: 3
2019-04-17T01:09:27,663 INFO [main]: server.HiveServer2 (HiveServer2.java:init(315)) - Starting Web UI on port 10002
2019-04-17T01:09:27,664 INFO [HiveMaterializedViewsRegistry-0]: SessionState (:()) - Hive Session ID = 5977f7e2-879c-456a-8d8a-fb58efa400d8
2019-04-17T01:09:27,666 INFO [main]: server.HiveServer2 (HiveServer2.java:init(370)) - CORS enabled - allowed-origins: * allowed-methods: GET,POST,DELETE,HEAD allowed-headers: X-Requested-With,Content-Type,Accept,Origin,X-Requested-By,x-requested-by
2019-04-17T01:09:27,669 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/5977f7e2-879c-456a-8d8a-fb58efa400d8
2019-04-17T01:09:27,669 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created local directory: /tmp/hive/5977f7e2-879c-456a-8d8a-fb58efa400d8
2019-04-17T01:09:27,671 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/5977f7e2-879c-456a-8d8a-fb58efa400d8/_tmp_space.db
2019-04-17T01:09:27,671 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - User of session id 5977f7e2-879c-456a-8d8a-fb58efa400d8 is hive
2019-04-17T01:09:27,677 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Localizing resource because it does not exist: file:/usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar to dest: hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/5977f7e2-879c-456a-8d8a-fb58efa400d8-resources/hive-hcatalog-core.jar
2019-04-17T01:09:27,679 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:OperationManager is started.
2019-04-17T01:09:27,679 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:SessionManager is started.
2019-04-17T01:09:27,679 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:CLIService is started.
2019-04-17T01:09:27,680 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:ThriftBinaryCLIService is started.
2019-04-17T01:09:27,680 ERROR [main]: service.CompositeService (CompositeService.java:start(74)) - Error starting services HiveServer2 java.lang.RuntimeException: Failed to init thrift server at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:162) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.CompositeService.start(CompositeService.java:70) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:706) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1073) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?] Caused by: org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:2181. at org.apache.thrift.transport.TServerSocket.(TServerSocket.java:109) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.thrift.transport.TServerSocket.(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.thrift.transport.TServerSocket.(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] ... 
13 more Caused by: java.net.BindException: Address already in use (Bind failed) at java.net.PlainSocketImpl.socketBind(Native Method) ~[?:1.8.0_112] at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:387) ~[?:1.8.0_112] at java.net.ServerSocket.bind(ServerSocket.java:375) ~[?:1.8.0_112] at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:106) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] ... 13 more 2019-04-17T01:09:27,681 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:ThriftBinaryCLIService is stopped. 2019-04-17T01:09:27,681 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:OperationManager is stopped. 2019-04-17T01:09:27,681 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:SessionManager is stopped. 2019-04-17T01:09:27,681 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:CLIService is stopped. 2019-04-17T01:09:27,681 INFO [main]: server.HiveServer2 (HiveServer2.java:stop(913)) - Shutting down HiveServer2 2019-04-17T01:09:27,682 INFO [main]: server.HiveServer2 (HiveServer2.java:stop(925)) - Web UI has stopped 2019-04-17T01:09:27,682 ERROR [main]: server.HiveServer2 (HiveServer2.java:stop(944)) - Error removing znode for this HiveServer2 instance from ZooKeeper. java.lang.NullPointerException: null at org.apache.hive.service.server.HiveServer2.removeServerInstanceFromZooKeeper(HiveServer2.java:677) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.stop(HiveServer2.java:942) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1090) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?] 2019-04-17T01:09:27,682 INFO [main]: server.HiveServer2 (HiveServer2.java:stopOrDisconnectTezSessions(890)) - Stopping/Disconnecting tez sessions.
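The root failure above is the Thrift binary CLI service trying to bind 0.0.0.0:2181, which is ZooKeeper's default client port; HiveServer2's binary transport normally listens on 10000, so hive.server2.thrift.port appears to have been set (or overridden) to 2181 on this host, where ZooKeeper already owns the socket. A minimal Java sketch of the same bind check, useful for confirming which port is actually free before restarting (the class name is illustrative):

```java
import java.net.InetSocketAddress;
import java.net.ServerSocket;

// Probe whether a TCP port can be bound, reproducing the java.net.BindException
// path seen above when another process (here, presumably ZooKeeper on 2181)
// already owns the socket.
public class PortProbe {
    public static void main(String[] args) throws Exception {
        int port = args.length > 0 ? Integer.parseInt(args[0]) : 2181;
        try (ServerSocket ss = new ServerSocket()) {
            ss.bind(new InetSocketAddress("0.0.0.0", port));
            System.out.println("Port " + port + " is free");
        } catch (java.net.BindException e) {
            System.out.println("Port " + port + " is already in use: " + e.getMessage());
        }
    }
}
```

On this host, `java PortProbe 2181` should report the port in use, while `java PortProbe 10000` would confirm the stock HiveServer2 port is available.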
2019-04-17T01:09:27,682 WARN [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1100)) - Error starting HiveServer2 on attempt 2, will retry in 60000ms org.apache.hive.service.ServiceException: Failed to Start HiveServer2 at org.apache.hive.service.CompositeService.start(CompositeService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:706) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1073) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?] Caused by: java.lang.RuntimeException: Failed to init thrift server at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:162) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] ... 11 more Caused by: org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:2181. at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:109) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] ...
11 more Caused by: java.net.BindException: Address already in use (Bind failed) at java.net.PlainSocketImpl.socketBind(Native Method) ~[?:1.8.0_112] at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:387) ~[?:1.8.0_112] at java.net.ServerSocket.bind(ServerSocket.java:375) ~[?:1.8.0_112] at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:106) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] ... 11 more 2019-04-17T01:09:27,688 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Resource modification time: 1555463367687 for hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/5977f7e2-879c-456a-8d8a-fb58efa400d8-resources/hive-hcatalog-core.jar 2019-04-17T01:09:27,688 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - Created new resources: null 2019-04-17T01:09:27,690 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Jar dir is null / directory doesn't exist. Choosing HIVE_INSTALL_DIR - /user/hive/.hiveJars 2019-04-17T01:09:27,691 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Resource modification time: 1554360664415 for hdfs://ip-172-31-18-160.ec2.internal:8020/user/hive/.hiveJars/hive-exec-3.1.0.3.1.0.0-78-e352011a83e63f76afe26d6047287402e4b48656f467ef20f44dc74ea57c960d.jar 2019-04-17T01:09:27,705 WARN [HiveMaterializedViewsRegistry-0]: authorizer.RangerHiveAuthorizerBase (:()) - RangerHiveAuthorizerBase.RangerHiveAuthorizerBase(): hiveAuthenticator.getUserName() returned null/empty 2019-04-17T01:09:27,705 WARN [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory. 2019-04-17T01:09:27,706 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083 2019-04-17T01:09:27,706 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 4 2019-04-17T01:09:27,707 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Connected to metastore.
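The "will retry in 60000ms" warnings show HiveServer2's outer start loop: each failed attempt tears the services down and tries again after a fixed 60-second pause, up to a bounded number of attempts (reportedly governed by hive.server2.max.start.attempts; treat the exact property name as an assumption here). A sketch of that pattern, with illustrative names:

```java
// Minimal sketch of a bounded, fixed-backoff start loop like the one logged
// above ("Error starting HiveServer2 on attempt N, will retry in 60000ms").
// startOnce() stands in for HiveServer2.start(); names are illustrative.
public class RetryLoop {
    interface Startable { void startOnce() throws Exception; }

    static void startWithRetries(Startable s, int maxAttempts, long sleepMs)
            throws InterruptedException {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                s.startOnce();
                return; // started cleanly
            } catch (Exception e) {
                System.err.printf("Error starting on attempt %d, will retry in %dms%n",
                        attempt, sleepMs);
                Thread.sleep(sleepMs); // fixed pause, no exponential backoff
            }
        }
        System.err.println("giving up after " + maxAttempts + " attempts");
    }

    public static void main(String[] args) throws InterruptedException {
        startWithRetries(() -> { throw new Exception("bind failed"); }, 3, 1000);
    }
}
```

Note that a retry loop cannot fix a configuration problem: as the rest of this log shows, every attempt dies on the same port.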
2019-04-17T01:09:27,707 INFO [HiveMaterializedViewsRegistry-0]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0 2019-04-17T01:09:27,708 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Tez Client Version: [ component=tez-api, version=0.9.1.3.1.0.0-78, revision=346318364f71536cb051ee88e9ee84e55b7e3e13, SCM-URL=scm:git:https://git-wip-us.apache.org/repos/asf/tez.git, buildTime=2018-12-06T12:19:14Z ] 2019-04-17T01:09:27,708 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - Opening new Tez Session (id: 5977f7e2-879c-456a-8d8a-fb58efa400d8, scratch dir: hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/5977f7e2-879c-456a-8d8a-fb58efa400d8) 2019-04-17T01:09:27,720 INFO [HiveMaterializedViewsRegistry-0]: client.RMProxy (:()) - Connecting to ResourceManager at ip-172-31-18-160.ec2.internal/172.31.18.160:8050 2019-04-17T01:09:27,720 INFO [HiveMaterializedViewsRegistry-0]: client.AHSProxy (:()) - Connecting to Application History server at ip-172-31-18-160.ec2.internal/172.31.18.160:10200 2019-04-17T01:09:27,720 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Session mode. Starting session. 2019-04-17T01:09:27,723 INFO [HiveMaterializedViewsRegistry-0]: client.TezClientUtils (:()) - Using tez.lib.uris value from configuration: /hdp/apps/3.1.0.0-78/tez/tez.tar.gz 2019-04-17T01:09:27,723 INFO [HiveMaterializedViewsRegistry-0]: client.TezClientUtils (:()) - Using tez.lib.uris.classpath value from configuration: null 2019-04-17T01:09:27,732 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Tez system stage directory hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/5977f7e2-879c-456a-8d8a-fb58efa400d8/.tez/application_1555392406623_0037 doesn't exist and is created 2019-04-17T01:09:27,977 INFO [HiveMaterializedViewsRegistry-0]: impl.YarnClientImpl (:()) - Submitted application application_1555392406623_0037 2019-04-17T01:09:27,978 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - The url to track the Tez Session: http://ip-172-31-18-160.ec2.internal:8088/proxy/application_1555392406623_0037/ 2019-04-17T01:09:32,606 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics9092559219500176712json to /tmp/report.json 2019-04-17T01:09:32,606 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename java.nio.file.FileSystemException: /tmp/hmetrics9092559219500176712json -> /tmp/report.json: Operation not permitted at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112] at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112] at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112] at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112] at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112] at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112] [The same pair of ERROR entries — "Unable to rename temp file /tmp/hmetrics…json to /tmp/report.json" followed by the identical FileSystemException stack trace, each time with a fresh random temp-file name — repeats every five seconds: 01:09:37,608; 01:09:42,610; 01:09:47,613; 01:09:52,615; 01:09:57,618; 01:10:02,620; 01:10:07,622; 01:10:12,625; 01:10:17,627; 01:10:22,629; 01:10:27,631.] 2019-04-17T01:10:27,683 INFO [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1056)) - Starting HiveServer2 2019-04-17T01:10:27,732 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.stats.fetch.partition.stats does not exist 2019-04-17T01:10:27,732 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.heapsize does not exist 2019-04-17T01:10:27,733 INFO [main]: metrics2.JsonFileMetricsReporter (:()) - Reporting metrics to /tmp/report.json 2019-04-17T01:10:27,735 WARN [main]: impl.MetricsSystemImpl (MetricsSystemImpl.java:init(151)) - hiveserver2 metrics system already initialized! 2019-04-17T01:10:27,735 WARN [main]: server.HiveServer2 (HiveServer2.java:init(209)) - Could not initiate the HiveServer2 Metrics system. Metrics may not be reported.
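The repeating rename failure is independent of the port problem. JsonFileMetricsReporter publishes metrics by writing a temp file next to hive.service.metrics.file.location (here /tmp/report.json) and renaming it over the target. On a sticky-bit /tmp, that rename fails with EPERM ("Operation not permitted") whenever /tmp/report.json already exists and belongs to a different user, because replacing it means unlinking another user's file in a restricted-deletion directory; a stale report written by another account is the likely culprit here. A minimal sketch of the same write-then-rename pattern, treating the exact Hive internals as an assumption:

```java
import java.nio.file.*;

// Write-temp-then-atomic-rename, the pattern the reporter appears to use.
// On a sticky-bit /tmp the move fails with EPERM when the target exists and
// is owned by another user, matching the FileSystemException in the log.
public class AtomicPublish {
    public static void main(String[] args) throws Exception {
        Path target = Paths.get("/tmp/report.json");
        Path tmp = Files.createTempFile(target.getParent(), "hmetrics", "json");
        Files.write(tmp, "{}".getBytes());
        try {
            Files.move(tmp, target, StandardCopyOption.REPLACE_EXISTING);
            System.out.println("published " + target);
        } catch (FileSystemException e) {
            System.err.println("rename failed: " + e.getMessage());
            Files.deleteIfExists(tmp); // clean up the orphaned temp file
        }
    }
}
```

The usual remedies are removing the stale /tmp/report.json or pointing hive.service.metrics.file.location at a directory owned by the hive user.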
java.lang.reflect.InvocationTargetException: null at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112] at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112] at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:42) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:206) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?] Caused by: java.lang.IllegalArgumentException: java.lang.reflect.InvocationTargetException at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:437) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] ... 16 more Caused by: java.lang.reflect.InvocationTargetException at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112] at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112] at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] ... 16 more Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:152) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:125) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:206) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?] at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:62) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?] at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter$Builder.build(HadoopMetrics2Reporter.java:162) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?] at org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter.<init>(Metrics2Reporter.java:45) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112] at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112] at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] ...
16 more 2019-04-17T01:10:27,735 ERROR [main]: metrics2.CodahaleMetrics (:()) - Unable to instantiate using constructor(MetricRegistry, HiveConf) for reporter org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter from conf HIVE_CODAHALE_METRICS_REPORTER_CLASSES java.lang.reflect.InvocationTargetException: null at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112] at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112] at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112] at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112] at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:42) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:206) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.util.RunJar.run(RunJar.java:318) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.util.RunJar.main(RunJar.java:232) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists! at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:152) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:125) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:206) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?] at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:62) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?] at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter$Builder.build(HadoopMetrics2Reporter.java:162) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?] at org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter.<init>(Metrics2Reporter.java:45) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] ... 23 more 2019-04-17T01:10:27,737 INFO [main]: SessionState (:()) - Hive Session ID = 05e2adf2-b3b5-4f1c-8ec6-98c3a66bfa4c 2019-04-17T01:10:27,738 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics3637887828713841646json to /tmp/report.json 2019-04-17T01:10:27,738 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename java.nio.file.FileSystemException: /tmp/hmetrics3637887828713841646json -> /tmp/report.json: Operation not permitted [stack trace identical to the 01:09:32,606 entry above] 2019-04-17T01:10:27,742 INFO [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/05e2adf2-b3b5-4f1c-8ec6-98c3a66bfa4c 2019-04-17T01:10:27,743 INFO [main]: session.SessionState (:()) - Created local directory: /tmp/hive/05e2adf2-b3b5-4f1c-8ec6-98c3a66bfa4c 2019-04-17T01:10:27,744 INFO [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/05e2adf2-b3b5-4f1c-8ec6-98c3a66bfa4c/_tmp_space.db 2019-04-17T01:10:27,744 WARN [main]: authorizer.RangerHiveAuthorizerBase (:()) - RangerHiveAuthorizerBase.RangerHiveAuthorizerBase(): hiveAuthenticator.getUserName() returned null/empty 2019-04-17T01:10:27,745 WARN [main]: session.SessionState (:()) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
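"Metrics source hiveserver2 already exists!" is fallout from the retry loop rather than an independent bug: the previous attempt registered the hiveserver2 source with Hadoop's process-wide DefaultMetricsSystem, that registration survives the failed attempt, and so the next attempt's Codahale-to-metrics2 bridge cannot register again. A tiny sketch of the registry behavior that produces this (names illustrative, not the Hadoop API):

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// Illustrates why re-initializing metrics inside a retry loop fails: the
// source-name registry is a JVM-wide singleton, so a name registered by a
// failed attempt is still taken on the next attempt unless it is unregistered.
public class SourceRegistry {
    private static final Set<String> NAMES = ConcurrentHashMap.newKeySet();

    static void register(String name) {
        if (!NAMES.add(name)) {
            throw new IllegalStateException("Metrics source " + name + " already exists!");
        }
    }

    public static void main(String[] args) {
        register("hiveserver2");      // first attempt: succeeds
        try {
            register("hiveserver2");  // next attempt: fails, as in the log
        } catch (IllegalStateException e) {
            System.err.println(e.getMessage());
        }
    }
}
```

Relative to the bind failure this warning is cosmetic: metrics reporting degrades, but it is the port conflict that keeps the server down.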
2019-04-17T01:10:27,746 INFO [main]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083 2019-04-17T01:10:27,747 INFO [main]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 5 2019-04-17T01:10:27,748 INFO [main]: metastore.HiveMetaStoreClient (:()) - Connected to metastore. 2019-04-17T01:10:27,748 INFO [main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0 2019-04-17T01:10:27,748 INFO [main]: service.CompositeService (:()) - Operation log root directory is created: /tmp/hive/operation_logs 2019-04-17T01:10:27,748 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread pool size: 100 2019-04-17T01:10:27,749 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread wait queue size: 100 2019-04-17T01:10:27,749 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread keepalive time: 10 seconds 2019-04-17T01:10:27,749 INFO [main]: service.CompositeService (:()) - Connection limits are user: 0 ipaddress: 0 user-ipaddress: 0 2019-04-17T01:10:27,749 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:OperationManager is inited. 2019-04-17T01:10:27,750 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:SessionManager is inited. 2019-04-17T01:10:27,750 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:CLIService is inited. 2019-04-17T01:10:27,750 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:ThriftBinaryCLIService is inited. 2019-04-17T01:10:27,750 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:HiveServer2 is inited.
2019-04-17T01:10:27,800 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.stats.fetch.partition.stats does not exist 2019-04-17T01:10:27,801 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.heapsize does not exist 2019-04-17T01:10:27,801 INFO [main]: metastore.HiveMetaStoreClient (:()) - Metastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl 2019-04-17T01:10:27,802 INFO [main]: metastore.HiveMetaStoreClient (:()) - Closed a connection to metastore, current connections: 4 2019-04-17T01:10:27,802 INFO [main]: server.HiveServer2 (HiveServer2.java:init(315)) - Starting Web UI on port 10002 2019-04-17T01:10:27,802 INFO [HiveMaterializedViewsRegistry-0]: SessionState (:()) - Hive Session ID = fe9fea19-44ff-41f7-9c43-3af830e31653 2019-04-17T01:10:27,804 INFO [main]: server.HiveServer2 (HiveServer2.java:init(370)) - CORS enabled - allowed-origins: * allowed-methods: GET,POST,DELETE,HEAD allowed-headers: X-Requested-With,Content-Type,Accept,Origin,X-Requested-By,x-requested-by 2019-04-17T01:10:27,806 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/fe9fea19-44ff-41f7-9c43-3af830e31653 2019-04-17T01:10:27,807 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created local directory: /tmp/hive/fe9fea19-44ff-41f7-9c43-3af830e31653 2019-04-17T01:10:27,809 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/fe9fea19-44ff-41f7-9c43-3af830e31653/_tmp_space.db 2019-04-17T01:10:27,809 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - User of session id fe9fea19-44ff-41f7-9c43-3af830e31653 is hive 2019-04-17T01:10:27,812 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Localizing resource because it does not exist: file:/usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar to dest: hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/fe9fea19-44ff-41f7-9c43-3af830e31653-resources/hive-hcatalog-core.jar 2019-04-17T01:10:27,816 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:OperationManager is started. 2019-04-17T01:10:27,816 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:SessionManager is started. 2019-04-17T01:10:27,819 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:CLIService is started. 2019-04-17T01:10:27,819 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:ThriftBinaryCLIService is started.
2019-04-17T01:10:27,819 ERROR [main]: service.CompositeService (CompositeService.java:start(74)) - Error starting services HiveServer2 java.lang.RuntimeException: Failed to init thrift server [exception chain and stack trace identical to the 01:09:27,680 entry above: TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:2181, caused by java.net.BindException: Address already in use (Bind failed)] 2019-04-17T01:10:27,821 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:ThriftBinaryCLIService is stopped. 2019-04-17T01:10:27,821 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:OperationManager is stopped. 2019-04-17T01:10:27,821 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:SessionManager is stopped. 2019-04-17T01:10:27,821 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:CLIService is stopped. 2019-04-17T01:10:27,822 INFO [main]: server.HiveServer2 (HiveServer2.java:stop(913)) - Shutting down HiveServer2 2019-04-17T01:10:27,822 INFO [main]: server.HiveServer2 (HiveServer2.java:stop(925)) - Web UI has stopped 2019-04-17T01:10:27,822 ERROR [main]: server.HiveServer2 (HiveServer2.java:stop(944)) - Error removing znode for this HiveServer2 instance from ZooKeeper. java.lang.NullPointerException: null [stack trace identical to the 01:09:27,682 entry above] 2019-04-17T01:10:27,822 INFO [main]: server.HiveServer2 (HiveServer2.java:stopOrDisconnectTezSessions(890)) - Stopping/Disconnecting tez sessions.
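The NullPointerException in removeServerInstanceFromZooKeeper is secondary damage: start() aborted before the instance ever registered with ZooKeeper, yet stop() still runs the deregistration path, which presumably dereferences a client handle that was never created. A hypothetical guard, with field and method names that are illustrative, not Hive's:

```java
// Guard the ZooKeeper cleanup so an aborted startup cannot NPE during stop().
public class ZkCleanup {
    private Object zkClient;   // stands in for the CuratorFramework handle
    private String znodePath;  // stands in for the registered znode path

    public void removeServerInstanceFromZooKeeper() {
        if (zkClient == null || znodePath == null) {
            return; // startup never registered with ZooKeeper; nothing to remove
        }
        // ... delete znodePath via zkClient ...
    }

    public static void main(String[] args) {
        new ZkCleanup().removeServerInstanceFromZooKeeper(); // no NPE on a cold instance
    }
}
```

The NPE is harmless here (the server is shutting down anyway), but it buries the real error further up the log.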
2019-04-17T01:10:27,822 WARN [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1100)) - Error starting HiveServer2 on attempt 3, will retry in 60000ms
org.apache.hive.service.ServiceException: Failed to Start HiveServer2
    at org.apache.hive.service.CompositeService.start(CompositeService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:706) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1073) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
Caused by: java.lang.RuntimeException: Failed to init thrift server
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:162) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 11 more
Caused by: org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:2181.
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:109) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 11 more
Caused by: java.net.BindException: Address already in use (Bind failed)
    at java.net.PlainSocketImpl.socketBind(Native Method) ~[?:1.8.0_112]
    at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:387) ~[?:1.8.0_112]
    at java.net.ServerSocket.bind(ServerSocket.java:375) ~[?:1.8.0_112]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:106) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 11 more
2019-04-17T01:10:27,825 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Resource modification time: 1555463427824 for hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/fe9fea19-44ff-41f7-9c43-3af830e31653-resources/hive-hcatalog-core.jar
2019-04-17T01:10:27,826 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - Created new resources: null
2019-04-17T01:10:27,826 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Jar dir is null / directory doesn't exist. Choosing HIVE_INSTALL_DIR - /user/hive/.hiveJars
2019-04-17T01:10:27,828 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Resource modification time: 1554360664415 for hdfs://ip-172-31-18-160.ec2.internal:8020/user/hive/.hiveJars/hive-exec-3.1.0.3.1.0.0-78-e352011a83e63f76afe26d6047287402e4b48656f467ef20f44dc74ea57c960d.jar
2019-04-17T01:10:27,842 WARN [HiveMaterializedViewsRegistry-0]: authorizer.RangerHiveAuthorizerBase (:()) - RangerHiveAuthorizerBase.RangerHiveAuthorizerBase(): hiveAuthenticator.getUserName() returned null/empty
2019-04-17T01:10:27,842 WARN [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
2019-04-17T01:10:27,843 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083
2019-04-17T01:10:27,843 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 5
2019-04-17T01:10:27,844 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Connected to metastore.
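The decisive detail in the chain above is the listen address: ThriftBinaryCLIService is trying to bind 0.0.0.0:2181, which is ZooKeeper's default client port rather than HiveServer2's usual 10000. That strongly suggests hive.server2.thrift.port has been set to 2181 on this host, so every retry collides with the running ZooKeeper and fails identically; note that the next attempt fires exactly 60000ms later, at 01:11:27. A quick JVM-side probe to confirm the conflict, assuming nothing beyond a stock JDK on the host (the class and its argument handling are mine, not part of Hive):

    import java.net.BindException;
    import java.net.ServerSocket;

    public class PortProbe {
        public static void main(String[] args) throws Exception {
            int port = args.length > 0 ? Integer.parseInt(args[0]) : 2181;
            // Try to take the port the same way a server would.
            try (ServerSocket socket = new ServerSocket(port)) {
                System.out.println("port " + port + " is free (bound " + socket.getLocalSocketAddress() + ")");
            } catch (BindException e) {
                System.out.println("port " + port + " is already in use: " + e.getMessage());
            }
        }
    }

Running it with 2181 on this box should report the port as taken; ss -tlnp or netstat -tlnp will then show which process (almost certainly the ZooKeeper server) owns the socket.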
2019-04-17T01:10:27,844 INFO [HiveMaterializedViewsRegistry-0]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0
2019-04-17T01:10:27,845 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Tez Client Version: [ component=tez-api, version=0.9.1.3.1.0.0-78, revision=346318364f71536cb051ee88e9ee84e55b7e3e13, SCM-URL=scm:git:https://git-wip-us.apache.org/repos/asf/tez.git, buildTime=2018-12-06T12:19:14Z ]
2019-04-17T01:10:27,845 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - Opening new Tez Session (id: fe9fea19-44ff-41f7-9c43-3af830e31653, scratch dir: hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/fe9fea19-44ff-41f7-9c43-3af830e31653)
2019-04-17T01:10:27,856 INFO [HiveMaterializedViewsRegistry-0]: client.RMProxy (:()) - Connecting to ResourceManager at ip-172-31-18-160.ec2.internal/172.31.18.160:8050
2019-04-17T01:10:27,856 INFO [HiveMaterializedViewsRegistry-0]: client.AHSProxy (:()) - Connecting to Application History server at ip-172-31-18-160.ec2.internal/172.31.18.160:10200
2019-04-17T01:10:27,856 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Session mode. Starting session.
2019-04-17T01:10:27,858 INFO [HiveMaterializedViewsRegistry-0]: client.TezClientUtils (:()) - Using tez.lib.uris value from configuration: /hdp/apps/3.1.0.0-78/tez/tez.tar.gz
2019-04-17T01:10:27,858 INFO [HiveMaterializedViewsRegistry-0]: client.TezClientUtils (:()) - Using tez.lib.uris.classpath value from configuration: null
2019-04-17T01:10:27,864 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Tez system stage directory hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/fe9fea19-44ff-41f7-9c43-3af830e31653/.tez/application_1555392406623_0038 doesn't exist and is created
2019-04-17T01:10:28,492 INFO [HiveMaterializedViewsRegistry-0]: impl.YarnClientImpl (:()) - Submitted application application_1555392406623_0038
2019-04-17T01:10:28,493 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - The url to track the Tez Session: http://ip-172-31-18-160.ec2.internal:8088/proxy/application_1555392406623_0038/
2019-04-17T01:10:32,633 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics7617681716708514783json to /tmp/report.json
2019-04-17T01:10:32,633 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics7617681716708514783json -> /tmp/report.json: Operation not permitted
    at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
    at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
    at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
    at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
    at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
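This reporter failure is independent of the Thrift problem. JsonFileMetricsReporter writes metrics to a fresh temp file and then renames it over /tmp/report.json (see the Files.move frame above). On a sticky-bit /tmp, rename may only replace a file its caller owns, so a stale /tmp/report.json left behind by another account makes every dump fail with EPERM, surfaced here as "Operation not permitted". A small reproduction of the same rename-over-existing pattern, assuming a Unix host; the class and path names are mine:

    import java.nio.file.*;

    public class RenameOverExisting {
        public static void main(String[] args) throws Exception {
            Path dir = Paths.get("/tmp");
            Path scratch = Files.createTempFile(dir, "hmetrics", "json"); // writer-owned temp file
            Files.write(scratch, "{}".getBytes());
            Path target = dir.resolve("report.json"); // may be owned by another user
            try {
                // Atomic replace, as the stack trace above shows the reporter doing.
                Files.move(scratch, target, StandardCopyOption.REPLACE_EXISTING);
                System.out.println("renamed onto " + target);
            } catch (FileSystemException e) {
                // EPERM on a sticky-bit /tmp when the target belongs to someone else.
                System.out.println("rename failed: " + e.getMessage());
            }
        }
    }

The practical remediation is to remove the stale /tmp/report.json (as its owner or root), or to point hive.service.metrics.file.location at a directory the hive service account owns.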
[The same pair of "Unable to rename temp file" / "Exception during rename" ERROR records, each with a stack trace identical to the 01:10:32,633 occurrence above and a freshly generated /tmp/hmetrics<random>json temp file, repeats on a roughly five-second cycle throughout the 60-second retry wait: 01:10:32,740; 01:10:37,635; 01:10:37,741; 01:10:42,636; 01:10:42,743; 01:10:47,638; 01:10:47,745; 01:10:52,640; 01:10:52,747; 01:10:57,642; 01:10:57,748; 01:11:02,644; 01:11:02,750; 01:11:07,646; 01:11:07,752; 01:11:12,648; 01:11:12,754; 01:11:17,649; 01:11:17,755; 01:11:22,651; 01:11:22,757; 01:11:27,653; 01:11:27,759. The duplicate records are elided here.]
2019-04-17T01:11:27,823 INFO [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1056)) - Starting HiveServer2
2019-04-17T01:11:27,869 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2019-04-17T01:11:27,869 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.heapsize does not exist
2019-04-17T01:11:27,870 INFO [main]: metrics2.JsonFileMetricsReporter (:()) - Reporting metrics to /tmp/report.json
2019-04-17T01:11:27,872 WARN [main]: impl.MetricsSystemImpl (MetricsSystemImpl.java:init(151)) - hiveserver2 metrics system already initialized!
2019-04-17T01:11:27,873 ERROR [main]: metrics2.CodahaleMetrics (:()) - Unable to instantiate using constructor(MetricRegistry, HiveConf) for reporter org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter from conf HIVE_CODAHALE_METRICS_REPORTER_CLASSES
java.lang.reflect.InvocationTargetException: null
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:42) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:206) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:152) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:125) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:206) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:62) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter$Builder.build(HadoopMetrics2Reporter.java:162) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter.<init>(Metrics2Reporter.java:45) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 23 more
2019-04-17T01:11:27,873 WARN [main]: server.HiveServer2 (HiveServer2.java:init(209)) - Could not initiate the HiveServer2 Metrics system. Metrics may not be reported.
java.lang.reflect.InvocationTargetException: null
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:42) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:206) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
Caused by: java.lang.IllegalArgumentException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:437) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 16 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 16 more
Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:152) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:125) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:206) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:62) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter$Builder.build(HadoopMetrics2Reporter.java:162) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter.<init>(Metrics2Reporter.java:45) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 16 more
2019-04-17T01:11:27,874 INFO [main]: SessionState (:()) - Hive Session ID = 5495d6e2-e631-4101-858b-80fcd79e5db5
2019-04-17T01:11:27,875 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics3864621721685338845json to /tmp/report.json
2019-04-17T01:11:27,875 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics3864621721685338845json -> /tmp/report.json: Operation not permitted
    [stack trace identical to the 01:10:32,633 occurrence above; elided]
2019-04-17T01:11:27,882 INFO [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/5495d6e2-e631-4101-858b-80fcd79e5db5
2019-04-17T01:11:27,883 INFO [main]: session.SessionState (:()) - Created local directory: /tmp/hive/5495d6e2-e631-4101-858b-80fcd79e5db5
2019-04-17T01:11:27,884 INFO [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/5495d6e2-e631-4101-858b-80fcd79e5db5/_tmp_space.db
2019-04-17T01:11:27,884 WARN [main]: authorizer.RangerHiveAuthorizerBase (:()) - RangerHiveAuthorizerBase.RangerHiveAuthorizerBase(): hiveAuthenticator.getUserName() returned null/empty
2019-04-17T01:11:27,884 WARN [main]: session.SessionState (:()) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
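"Metrics source hiveserver2 already exists!" is another artifact of retrying inside a single JVM rather than an independent fault. Per the frames above, Hadoop's DefaultMetricsSystem keeps a process-wide registry of source names; the first start attempt registered "hiveserver2" and nothing unregistered it, so rebuilding the Metrics2Reporter on each later attempt trips the duplicate check and HiveServer2 falls back to running without that reporter ("Metrics may not be reported"). A toy model of that process-wide unique-name behaviour, using only the JDK; the class is mine, and the method name merely mirrors the newSourceName frame above:

    import java.util.HashSet;
    import java.util.Set;

    public class MetricsNameRegistry {
        // Process-wide registry, analogous to DefaultMetricsSystem's static state.
        private static final Set<String> NAMES = new HashSet<>();

        static String newSourceName(String name) {
            if (!NAMES.add(name)) {
                throw new IllegalStateException("Metrics source " + name + " already exists!");
            }
            return name;
        }

        public static void main(String[] args) {
            System.out.println(newSourceName("hiveserver2")); // first start attempt: ok
            System.out.println(newSourceName("hiveserver2")); // in-JVM retry: throws
        }
    }

The condition clears only when the server comes up in a fresh JVM; within this retry loop it will recur on every attempt.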
since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory. 2019-04-17T01:11:27,885 INFO [main]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083 2019-04-17T01:11:27,886 INFO [main]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 6 2019-04-17T01:11:27,887 INFO [main]: metastore.HiveMetaStoreClient (:()) - Connected to metastore. 2019-04-17T01:11:27,887 INFO [main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0 2019-04-17T01:11:27,887 INFO [main]: service.CompositeService (:()) - Operation log root directory is created: /tmp/hive/operation_logs 2019-04-17T01:11:27,887 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread pool size: 100 2019-04-17T01:11:27,887 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread wait queue size: 100 2019-04-17T01:11:27,887 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread keepalive time: 10 seconds 2019-04-17T01:11:27,887 INFO [main]: service.CompositeService (:()) - Connections limit are user: 0 ipaddress: 0 user-ipaddress: 0 2019-04-17T01:11:27,887 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:OperationManager is inited. 2019-04-17T01:11:27,888 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:SessionManager is inited. 2019-04-17T01:11:27,888 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:CLIService is inited. 2019-04-17T01:11:27,888 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:ThriftBinaryCLIService is inited. 2019-04-17T01:11:27,888 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:HiveServer2 is inited. 
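The json-metric-reporter errors above are a separate problem from the startup failure: the reporter writes each metrics snapshot to a /tmp/hmetrics*json temp file and then renames it over /tmp/report.json, which on this release is the default value of hive.service.metrics.file.location. /tmp normally carries the sticky bit, and rename(2) over a file owned by a different user in a sticky directory fails with EPERM, so a /tmp/report.json left behind by an earlier run under another account blocks every report. A minimal sketch of the failing pattern, assuming the reporter uses a plain Files.move with REPLACE_EXISTING (the paths are from the log; verify the details against JsonFileMetricsReporter.java on this build):

    import java.io.IOException;
    import java.nio.file.FileSystemException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class MetricsRenameRepro {
        public static void main(String[] args) throws IOException {
            // In the failing case /tmp/report.json already exists and belongs to another user.
            Path target = Paths.get("/tmp/report.json");
            Path temp = Files.createTempFile(Paths.get("/tmp"), "hmetrics", "json");
            Files.write(temp, "{}".getBytes());
            try {
                // Files.move maps to rename(2); replacing another user's file in a
                // sticky-bit directory raises EPERM ("Operation not permitted").
                Files.move(temp, target, StandardCopyOption.REPLACE_EXISTING);
            } catch (FileSystemException e) {
                System.err.println("rename failed: " + e.getMessage());
                Files.deleteIfExists(temp);
            }
        }
    }

Removing the stale /tmp/report.json, or pointing hive.service.metrics.file.location at a directory owned by the hive user, should silence these errors; both the property name and the REPLACE_EXISTING detail are assumptions worth confirming against the hive-common source for this release.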
2019-04-17T01:11:27,936 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2019-04-17T01:11:27,936 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.heapsize does not exist
2019-04-17T01:11:27,937 INFO [main]: metastore.HiveMetaStoreClient (:()) - Metastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
2019-04-17T01:11:27,937 INFO [main]: metastore.HiveMetaStoreClient (:()) - Closed a connection to metastore, current connections: 5
2019-04-17T01:11:27,937 INFO [main]: server.HiveServer2 (HiveServer2.java:init(315)) - Starting Web UI on port 10002
2019-04-17T01:11:27,938 INFO [HiveMaterializedViewsRegistry-0]: SessionState (:()) - Hive Session ID = fc4902a2-3646-42f0-99d4-4f20129ebc86
2019-04-17T01:11:27,939 INFO [main]: server.HiveServer2 (HiveServer2.java:init(370)) - CORS enabled - allowed-origins: * allowed-methods: GET,POST,DELETE,HEAD allowed-headers: X-Requested-With,Content-Type,Accept,Origin,X-Requested-By,x-requested-by
2019-04-17T01:11:27,942 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/fc4902a2-3646-42f0-99d4-4f20129ebc86
2019-04-17T01:11:27,942 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created local directory: /tmp/hive/fc4902a2-3646-42f0-99d4-4f20129ebc86
2019-04-17T01:11:27,944 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/fc4902a2-3646-42f0-99d4-4f20129ebc86/_tmp_space.db
2019-04-17T01:11:27,944 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - User of session id fc4902a2-3646-42f0-99d4-4f20129ebc86 is hive
2019-04-17T01:11:27,947 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Localizing resource because it does not exist: file:/usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar to dest: hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/fc4902a2-3646-42f0-99d4-4f20129ebc86-resources/hive-hcatalog-core.jar
2019-04-17T01:11:27,952 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:OperationManager is started.
2019-04-17T01:11:27,952 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:SessionManager is started.
2019-04-17T01:11:27,952 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:CLIService is started.
2019-04-17T01:11:27,952 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:ThriftBinaryCLIService is started.
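The "Localizing resource" record above is the materialized-views session copying the jar named in hive.aux.jars.path from the local filesystem into its Tez session directory on HDFS, so YARN containers can localize it later. The mechanism is an ordinary HDFS upload; a rough equivalent using the stock Hadoop FileSystem API (the paths are copied from the log, and this only illustrates the mechanism, not Hive's actual DagUtils code):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class LocalizeAuxJar {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration(); // picks up core-site.xml/hdfs-site.xml
            Path src = new Path("file:///usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar");
            Path dst = new Path("hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/"
                    + "fc4902a2-3646-42f0-99d4-4f20129ebc86-resources/hive-hcatalog-core.jar");
            FileSystem fs = dst.getFileSystem(conf);
            if (!fs.exists(dst)) { // the log shows the copy happens only when the destination is missing
                fs.copyFromLocalFile(false, true, src, dst); // keep the source, overwrite if present
            }
            System.out.println("modification time: " + fs.getFileStatus(dst).getModificationTime());
        }
    }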
2019-04-17T01:11:27,952 ERROR [main]: service.CompositeService (CompositeService.java:start(74)) - Error starting services HiveServer2
java.lang.RuntimeException: Failed to init thrift server
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:162) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:706) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1073) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
Caused by: org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:2181.
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:109) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 13 more
Caused by: java.net.BindException: Address already in use (Bind failed)
    at java.net.PlainSocketImpl.socketBind(Native Method) ~[?:1.8.0_112]
    at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:387) ~[?:1.8.0_112]
    at java.net.ServerSocket.bind(ServerSocket.java:375) ~[?:1.8.0_112]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:106) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 13 more
2019-04-17T01:11:27,953 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:ThriftBinaryCLIService is stopped.
2019-04-17T01:11:27,954 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:OperationManager is stopped.
2019-04-17T01:11:27,954 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:SessionManager is stopped.
2019-04-17T01:11:27,955 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:CLIService is stopped.
2019-04-17T01:11:27,955 INFO [main]: server.HiveServer2 (HiveServer2.java:stop(913)) - Shutting down HiveServer2
2019-04-17T01:11:27,955 INFO [main]: server.HiveServer2 (HiveServer2.java:stop(925)) - Web UI has stopped
2019-04-17T01:11:27,955 ERROR [main]: server.HiveServer2 (HiveServer2.java:stop(944)) - Error removing znode for this HiveServer2 instance from ZooKeeper.
java.lang.NullPointerException: null
    at org.apache.hive.service.server.HiveServer2.removeServerInstanceFromZooKeeper(HiveServer2.java:677) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.stop(HiveServer2.java:942) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1090) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
2019-04-17T01:11:27,956 INFO [main]: server.HiveServer2 (HiveServer2.java:stopOrDisconnectTezSessions(890)) - Stopping/Disconnecting tez sessions.
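The nested BindException is the actual root cause of the shutdown: the Thrift binary service tried to listen on 0.0.0.0:2181, and 2181 is ZooKeeper's default client port, so on a node that also runs a ZooKeeper server the bind can never succeed. hive.server2.thrift.port defaults to 10000; a value of 2181 suggests a misconfigured hive-site.xml or Ambari override rather than a transient port clash. A small probe that performs the same kind of bind the server attempts (a hypothetical helper, not Hive code):

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.net.ServerSocket;

    public class PortProbe {
        public static void main(String[] args) throws IOException {
            int port = args.length > 0 ? Integer.parseInt(args[0]) : 2181;
            try (ServerSocket socket = new ServerSocket()) {
                // A wildcard bind, like the one HiveAuthUtils.getServerSocket ends up doing.
                socket.bind(new InetSocketAddress("0.0.0.0", port));
                System.out.println("port " + port + " is free");
            } catch (IOException e) {
                System.out.println("port " + port + " is taken: " + e.getMessage());
            }
        }
    }

Run against 2181 on this host, it should report the port as taken; setting hive.server2.thrift.port back to a free port (10000 by default) and restarting ends the retry loop. The NullPointerException from removeServerInstanceFromZooKeeper just above appears to be secondary fallout: stop() runs before the instance was ever registered in ZooKeeper, so there is no znode handle to clean up.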
2019-04-17T01:11:27,956 WARN [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1100)) - Error starting HiveServer2 on attempt 4, will retry in 60000ms
org.apache.hive.service.ServiceException: Failed to Start HiveServer2
    at org.apache.hive.service.CompositeService.start(CompositeService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:706) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1073) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
Caused by: java.lang.RuntimeException: Failed to init thrift server
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:162) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 11 more
Caused by: org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:2181.
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:109) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 11 more
Caused by: java.net.BindException: Address already in use (Bind failed)
    at java.net.PlainSocketImpl.socketBind(Native Method) ~[?:1.8.0_112]
    at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:387) ~[?:1.8.0_112]
    at java.net.ServerSocket.bind(ServerSocket.java:375) ~[?:1.8.0_112]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:106) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 11 more
2019-04-17T01:11:27,958 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Resource modification time: 1555463487957 for hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/fc4902a2-3646-42f0-99d4-4f20129ebc86-resources/hive-hcatalog-core.jar
2019-04-17T01:11:27,959 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - Created new resources: null
2019-04-17T01:11:27,960 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Jar dir is null / directory doesn't exist. Choosing HIVE_INSTALL_DIR - /user/hive/.hiveJars
2019-04-17T01:11:27,961 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Resource modification time: 1554360664415 for hdfs://ip-172-31-18-160.ec2.internal:8020/user/hive/.hiveJars/hive-exec-3.1.0.3.1.0.0-78-e352011a83e63f76afe26d6047287402e4b48656f467ef20f44dc74ea57c960d.jar
2019-04-17T01:11:27,974 WARN [HiveMaterializedViewsRegistry-0]: authorizer.RangerHiveAuthorizerBase (:()) - RangerHiveAuthorizerBase.RangerHiveAuthorizerBase(): hiveAuthenticator.getUserName() returned null/empty
2019-04-17T01:11:27,974 WARN [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
2019-04-17T01:11:27,975 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083
2019-04-17T01:11:27,975 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 6
2019-04-17T01:11:27,976 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Connected to metastore.
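Attempt 4 with a 60000 ms backoff means HiveServer2 has been retrying startup inside the same JVM for several minutes, and that retry loop also explains the "Metrics source hiveserver2 already exists!" failure at the top of this excerpt: each attempt constructs a fresh CodahaleMetrics, whose metrics2 bridge registers a source named hiveserver2 with Hadoop's DefaultMetricsSystem, and the metrics system rejects duplicate source names within one JVM. The behavior reproduces with the metrics2 API directly (a standalone sketch assuming hadoop-common on the classpath, not Hive code):

    import org.apache.hadoop.metrics2.MetricsCollector;
    import org.apache.hadoop.metrics2.MetricsSource;
    import org.apache.hadoop.metrics2.lib.DefaultMetricsSystem;

    public class DuplicateSourceRepro {
        // A no-op source; any object implementing MetricsSource can be registered.
        static class NoopSource implements MetricsSource {
            @Override
            public void getMetrics(MetricsCollector collector, boolean all) { }
        }

        public static void main(String[] args) {
            DefaultMetricsSystem.initialize("hiveserver2");
            DefaultMetricsSystem.instance().register("hiveserver2", "first", new NoopSource());
            // The second registration under the same name is expected to throw
            // org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!
            DefaultMetricsSystem.instance().register("hiveserver2", "second", new NoopSource());
        }
    }

Because the duplicate registration only breaks retries within one process, a full process restart (after fixing the port) clears it.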
2019-04-17T01:11:27,976 INFO [HiveMaterializedViewsRegistry-0]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0
2019-04-17T01:11:27,976 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Tez Client Version: [ component=tez-api, version=0.9.1.3.1.0.0-78, revision=346318364f71536cb051ee88e9ee84e55b7e3e13, SCM-URL=scm:git:https://git-wip-us.apache.org/repos/asf/tez.git, buildTime=2018-12-06T12:19:14Z ]
2019-04-17T01:11:27,976 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - Opening new Tez Session (id: fc4902a2-3646-42f0-99d4-4f20129ebc86, scratch dir: hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/fc4902a2-3646-42f0-99d4-4f20129ebc86)
2019-04-17T01:11:27,987 INFO [HiveMaterializedViewsRegistry-0]: client.RMProxy (:()) - Connecting to ResourceManager at ip-172-31-18-160.ec2.internal/172.31.18.160:8050
2019-04-17T01:11:27,987 INFO [HiveMaterializedViewsRegistry-0]: client.AHSProxy (:()) - Connecting to Application History server at ip-172-31-18-160.ec2.internal/172.31.18.160:10200
2019-04-17T01:11:27,987 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Session mode. Starting session.
2019-04-17T01:11:27,989 INFO [HiveMaterializedViewsRegistry-0]: client.TezClientUtils (:()) - Using tez.lib.uris value from configuration: /hdp/apps/3.1.0.0-78/tez/tez.tar.gz
2019-04-17T01:11:27,989 INFO [HiveMaterializedViewsRegistry-0]: client.TezClientUtils (:()) - Using tez.lib.uris.classpath value from configuration: null
2019-04-17T01:11:27,995 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Tez system stage directory hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/fc4902a2-3646-42f0-99d4-4f20129ebc86/.tez/application_1555392406623_0039 doesn't exist and is created
2019-04-17T01:11:28,220 INFO [HiveMaterializedViewsRegistry-0]: impl.YarnClientImpl (:()) - Submitted application application_1555392406623_0039
2019-04-17T01:11:28,220 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - The url to track the Tez Session: http://ip-172-31-18-160.ec2.internal:8088/proxy/application_1555392406623_0039/
2019-04-17T01:11:32,654 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics6146502396215297414json to /tmp/report.json
2019-04-17T01:11:32,654 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics6146502396215297414json -> /tmp/report.json: Operation not permitted
    at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112]
    at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112]
    at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112]
    at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112]
    at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
[The json-metric-reporter ERROR pair ("Unable to rename temp file /tmp/hmetrics<random>json to /tmp/report.json", then "Exception during rename" with the identical java.nio.file.FileSystemException stack trace shown above) repeats in groups of three, roughly 110 ms apart, every 5 seconds from 2019-04-17T01:11:32,761 through 2019-04-17T01:12:17,776, where this excerpt is truncated mid-record.]
Exception during rename java.nio.file.FileSystemException: /tmp/hmetrics1430457667663370736json -> /tmp/report.json: Operation not permitted at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112] at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112] at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112] at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112] at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112] 2019-04-17T01:12:17,892 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics7754382687076133691json to /tmp/report.json 2019-04-17T01:12:17,892 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename java.nio.file.FileSystemException: /tmp/hmetrics7754382687076133691json -> /tmp/report.json: Operation not permitted at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112] at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112] at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112] at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112] at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112] 2019-04-17T01:12:22,672 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics101661494968237589json to /tmp/report.json 2019-04-17T01:12:22,672 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename java.nio.file.FileSystemException: /tmp/hmetrics101661494968237589json -> /tmp/report.json: Operation not permitted at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) 
~[?:1.8.0_112] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112] at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112] at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112] at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112] at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112] 2019-04-17T01:12:22,777 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics7696015773206560344json to /tmp/report.json 2019-04-17T01:12:22,777 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename java.nio.file.FileSystemException: /tmp/hmetrics7696015773206560344json -> /tmp/report.json: Operation not permitted at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112] at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112] at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112] at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112] at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112] 2019-04-17T01:12:22,893 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics8797118696939297027json to /tmp/report.json 2019-04-17T01:12:22,893 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename java.nio.file.FileSystemException: /tmp/hmetrics8797118696939297027json -> /tmp/report.json: Operation not permitted at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112] at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112] at 
sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112] at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112] at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112] 2019-04-17T01:12:27,673 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics405067905911138249json to /tmp/report.json 2019-04-17T01:12:27,673 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename java.nio.file.FileSystemException: /tmp/hmetrics405067905911138249json -> /tmp/report.json: Operation not permitted at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112] at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112] at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112] at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112] at org.apache.hadoop.hive.common.metrics.metrics2.JsonFileMetricsReporter.run(JsonFileMetricsReporter.java:175) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_112] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112] 2019-04-17T01:12:27,779 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics935406733891801711json to /tmp/report.json 2019-04-17T01:12:27,779 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename java.nio.file.FileSystemException: /tmp/hmetrics935406733891801711json -> /tmp/report.json: Operation not permitted at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[?:1.8.0_112] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_112] at sun.nio.fs.UnixCopyFile.move(UnixCopyFile.java:396) ~[?:1.8.0_112] at sun.nio.fs.UnixFileSystemProvider.move(UnixFileSystemProvider.java:262) ~[?:1.8.0_112] at java.nio.file.Files.move(Files.java:1395) ~[?:1.8.0_112] at 
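A note on the rename failures above: JsonFileMetricsReporter writes each metrics snapshot to a fresh temp file and then renames it over /tmp/report.json. On a sticky-bit /tmp, that rename fails with "Operation not permitted" whenever /tmp/report.json is owned by a different user (for example, a file left behind by another Hive service or an earlier run under another account). Removing the stale /tmp/report.json, or pointing the reporter at a location the hive user owns, clears the error. A hive-site.xml sketch; the path shown is an example, not a required value:

<property>
  <name>hive.service.metrics.file.location</name>
  <!-- example path: any file in a directory writable by the hive user; the default is /tmp/report.json -->
  <value>/var/log/hive/hiveserver2-report.json</value>
</property>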
2019-04-17T01:12:27,956 INFO [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(1056)) - Starting HiveServer2
2019-04-17T01:12:28,001 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2019-04-17T01:12:28,001 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.heapsize does not exist
2019-04-17T01:12:28,002 INFO [main]: metrics2.JsonFileMetricsReporter (:()) - Reporting metrics to /tmp/report.json
2019-04-17T01:12:28,004 WARN [main]: impl.MetricsSystemImpl (MetricsSystemImpl.java:init(151)) - hiveserver2 metrics system already initialized!
2019-04-17T01:12:28,004 WARN [main]: server.HiveServer2 (HiveServer2.java:init(209)) - Could not initiate the HiveServer2 Metrics system. Metrics may not be reported.
java.lang.reflect.InvocationTargetException: null
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.common.MetricsFactory.init(MetricsFactory.java:42) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:206) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1072) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
Caused by: java.lang.IllegalArgumentException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:437) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 16 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 16 more
Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:152) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:125) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:206) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter.<init>(HadoopMetrics2Reporter.java:62) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at com.github.joshelser.dropwizard.metrics.hadoop.HadoopMetrics2Reporter$Builder.build(HadoopMetrics2Reporter.java:162) ~[dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:?]
    at org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter.<init>(Metrics2Reporter.java:45) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initCodahaleMetricsReporterClasses(CodahaleMetrics.java:429) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.initReporting(CodahaleMetrics.java:396) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.metrics.metrics2.CodahaleMetrics.<init>(CodahaleMetrics.java:196) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 16 more
2019-04-17T01:12:28,004 ERROR [main]: metrics2.CodahaleMetrics (:()) - Unable to instantiate using constructor(MetricRegistry, HiveConf) for reporter org.apache.hadoop.hive.common.metrics.metrics2.Metrics2Reporter from conf HIVE_CODAHALE_METRICS_REPORTER_CLASSES
java.lang.reflect.InvocationTargetException: null
[... reflection and HiveServer2 startup frames, ending in the same root cause as the warning above: "Caused by: org.apache.hadoop.metrics2.MetricsException: Metrics source hiveserver2 already exists!"; duplicate frames elided ...]
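The "Metrics source hiveserver2 already exists!" root cause is a side effect of the restart loop rather than an independent fault: the Hadoop metrics2 system is a JVM-wide singleton, so when HiveServer2 retries startup inside the same JVM, the hiveserver2 source registered by the previous attempt is still present and re-registration fails. It goes away once the underlying startup failure (the port conflict below) is fixed. If the metrics2 bridge is not wanted at all, HADOOP2 can be dropped from the reporter list; a hive-site.xml sketch, assuming the current list is the common HDP value of JSON_FILE, JMX, HADOOP2:

<property>
  <name>hive.service.metrics.reporter</name>
  <!-- dropping HADOOP2 disables the metrics2 bridge that performs the failing registration -->
  <value>JSON_FILE, JMX</value>
</property>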
2019-04-17T01:12:28,006 INFO [main]: SessionState (:()) - Hive Session ID = c6a8af92-2543-47be-8bfb-9756ca8a2ba6
2019-04-17T01:12:28,008 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Unable to rename temp file /tmp/hmetrics8382453847149919053json to /tmp/report.json
2019-04-17T01:12:28,008 ERROR [json-metric-reporter]: metrics2.JsonFileMetricsReporter (:()) - Exception during rename
java.nio.file.FileSystemException: /tmp/hmetrics8382453847149919053json -> /tmp/report.json: Operation not permitted
[... stack trace identical to the rename failures above; elided ...]
2019-04-17T01:12:28,011 INFO [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/c6a8af92-2543-47be-8bfb-9756ca8a2ba6
2019-04-17T01:12:28,012 INFO [main]: session.SessionState (:()) - Created local directory: /tmp/hive/c6a8af92-2543-47be-8bfb-9756ca8a2ba6
2019-04-17T01:12:28,013 INFO [main]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/c6a8af92-2543-47be-8bfb-9756ca8a2ba6/_tmp_space.db
2019-04-17T01:12:28,014 WARN [main]: authorizer.RangerHiveAuthorizerBase (:()) - RangerHiveAuthorizerBase.RangerHiveAuthorizerBase(): hiveAuthenticator.getUserName() returned null/empty
2019-04-17T01:12:28,014 WARN [main]: session.SessionState (:()) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
2019-04-17T01:12:28,015 INFO [main]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083
2019-04-17T01:12:28,016 INFO [main]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 7
2019-04-17T01:12:28,019 INFO [main]: metastore.HiveMetaStoreClient (:()) - Connected to metastore.
2019-04-17T01:12:28,019 INFO [main]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0
2019-04-17T01:12:28,020 INFO [main]: service.CompositeService (:()) - Operation log root directory is created: /tmp/hive/operation_logs
2019-04-17T01:12:28,020 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread pool size: 100
2019-04-17T01:12:28,020 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread wait queue size: 100
2019-04-17T01:12:28,020 INFO [main]: service.CompositeService (:()) - HiveServer2: Background operation thread keepalive time: 10 seconds
2019-04-17T01:12:28,020 INFO [main]: service.CompositeService (:()) - Connections limit are user: 0 ipaddress: 0 user-ipaddress: 0
2019-04-17T01:12:28,021 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:OperationManager is inited.
2019-04-17T01:12:28,021 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:SessionManager is inited.
2019-04-17T01:12:28,021 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:CLIService is inited.
2019-04-17T01:12:28,022 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:ThriftBinaryCLIService is inited.
2019-04-17T01:12:28,022 INFO [main]: service.AbstractService (AbstractService.java:init(90)) - Service:HiveServer2 is inited.
2019-04-17T01:12:28,081 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.stats.fetch.partition.stats does not exist
2019-04-17T01:12:28,081 WARN [main]: conf.HiveConf (HiveConf.java:initialize(5310)) - HiveConf of name hive.heapsize does not exist
2019-04-17T01:12:28,082 INFO [main]: metastore.HiveMetaStoreClient (:()) - Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
2019-04-17T01:12:28,082 INFO [main]: metastore.HiveMetaStoreClient (:()) - Closed a connection to metastore, current connections: 6
2019-04-17T01:12:28,083 INFO [main]: server.HiveServer2 (HiveServer2.java:init(315)) - Starting Web UI on port 10002
2019-04-17T01:12:28,083 INFO [HiveMaterializedViewsRegistry-0]: SessionState (:()) - Hive Session ID = d2727e2d-92f1-4dda-b59f-35e4dcc4c6d9
2019-04-17T01:12:28,085 INFO [main]: server.HiveServer2 (HiveServer2.java:init(370)) - CORS enabled - allowed-origins: * allowed-methods: GET,POST,DELETE,HEAD allowed-headers: X-Requested-With,Content-Type,Accept,Origin,X-Requested-By,x-requested-by
2019-04-17T01:12:28,087 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/d2727e2d-92f1-4dda-b59f-35e4dcc4c6d9
2019-04-17T01:12:28,088 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created local directory: /tmp/hive/d2727e2d-92f1-4dda-b59f-35e4dcc4c6d9
2019-04-17T01:12:28,089 INFO [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - Created HDFS directory: /tmp/hive/hive/d2727e2d-92f1-4dda-b59f-35e4dcc4c6d9/_tmp_space.db
2019-04-17T01:12:28,089 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - User of session id d2727e2d-92f1-4dda-b59f-35e4dcc4c6d9 is hive
2019-04-17T01:12:28,092 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Localizing resource because it does not exist: file:/usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar to dest: hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/d2727e2d-92f1-4dda-b59f-35e4dcc4c6d9-resources/hive-hcatalog-core.jar
2019-04-17T01:12:28,102 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:OperationManager is started.
2019-04-17T01:12:28,102 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:SessionManager is started.
2019-04-17T01:12:28,103 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:CLIService is started.
2019-04-17T01:12:28,103 INFO [main]: service.AbstractService (AbstractService.java:start(109)) - Service:ThriftBinaryCLIService is started.
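The two "HiveConf of name ... does not exist" warnings repeated at each (re)start are benign leftovers from a pre-3.x configuration: hive.stats.fetch.partition.stats was removed in Hive 3, and hive.heapsize is set through the environment scripts rather than HiveConf. If the noise is unwanted, the stale entries can simply be deleted from hive-site.xml; they look roughly like this (the values shown are placeholders for whatever the old config carried):

<!-- obsolete in Hive 3.x; safe to remove from hive-site.xml -->
<property>
  <name>hive.stats.fetch.partition.stats</name>
  <value>true</value>
</property>
<property>
  <name>hive.heapsize</name>
  <value>1024</value>
</property>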
2019-04-17T01:12:28,103 ERROR [main]: service.CompositeService (CompositeService.java:start(74)) - Error starting services HiveServer2
java.lang.RuntimeException: Failed to init thrift server
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:162) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:706) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1073) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
Caused by: org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:2181.
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:109) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 13 more
Caused by: java.net.BindException: Address already in use (Bind failed)
    at java.net.PlainSocketImpl.socketBind(Native Method) ~[?:1.8.0_112]
    at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:387) ~[?:1.8.0_112]
    at java.net.ServerSocket.bind(ServerSocket.java:375) ~[?:1.8.0_112]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:106) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hadoop.hive.common.auth.HiveAuthUtils.getServerSocket(HiveAuthUtils.java:87) ~[hive-common-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 13 more
2019-04-17T01:12:28,104 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:ThriftBinaryCLIService is stopped.
2019-04-17T01:12:28,104 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:OperationManager is stopped.
2019-04-17T01:12:28,104 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:SessionManager is stopped.
2019-04-17T01:12:28,104 INFO [main]: service.AbstractService (AbstractService.java:stop(130)) - Service:CLIService is stopped.
2019-04-17T01:12:28,104 INFO [main]: server.HiveServer2 (HiveServer2.java:stop(913)) - Shutting down HiveServer2
2019-04-17T01:12:28,105 INFO [main]: server.HiveServer2 (HiveServer2.java:stop(925)) - Web UI has stopped
2019-04-17T01:12:28,105 ERROR [main]: server.HiveServer2 (HiveServer2.java:stop(944)) - Error removing znode for this HiveServer2 instance from ZooKeeper.
java.lang.NullPointerException: null
    at org.apache.hive.service.server.HiveServer2.removeServerInstanceFromZooKeeper(HiveServer2.java:677) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.stop(HiveServer2.java:942) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1090) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
2019-04-17T01:12:28,105 INFO [main]: server.HiveServer2 (HiveServer2.java:stopOrDisconnectTezSessions(890)) - Stopping/Disconnecting tez sessions.
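This bind failure is the root cause of the whole restart loop. Port 2181 is the standard ZooKeeper client port, and "Address already in use" means another process (almost certainly ZooKeeper itself) already holds it: HiveServer2 is trying to open its binary Thrift listener on the wrong port. The stock value of hive.server2.thrift.port is 10000, so the likeliest explanation is that it was set to 2181, perhaps confused with the ZooKeeper client port that HiveServer2 merely connects to for service discovery. A hive-site.xml sketch restoring the stock values (adjust to the site's actual port layout):

<!-- the port HiveServer2 listens on for binary Thrift clients (stock default: 10000) -->
<property>
  <name>hive.server2.thrift.port</name>
  <value>10000</value>
</property>
<!-- the ZooKeeper port HiveServer2 connects to for service discovery; not a listen port -->
<property>
  <name>hive.zookeeper.client.port</name>
  <value>2181</value>
</property>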
2019-04-17T01:12:28,105 ERROR [main]: server.HiveServer2 (HiveServer2.java:execute(1343)) - Error starting HiveServer2
java.lang.Error: Max start attempts 5 exhausted
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1098) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.access$1700(HiveServer2.java:135) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:1341) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:1185) [hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.0.0-78.jar:?]
Caused by: org.apache.hive.service.ServiceException: Failed to Start HiveServer2
    at org.apache.hive.service.CompositeService.start(CompositeService.java:80) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:706) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1073) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 9 more
Caused by: java.lang.RuntimeException: Failed to init thrift server
    at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.initServer(ThriftBinaryCLIService.java:162) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.cli.thrift.ThriftCLIService.start(ThriftCLIService.java:216) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.CompositeService.start(CompositeService.java:70) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:706) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:1073) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    ... 9 more
Caused by: org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:2181.
    [... frames identical to the bind failure above; elided ...]
    ... 9 more
Caused by: java.net.BindException: Address already in use (Bind failed)
    [... frames identical to the bind failure above; elided ...]
    ... 9 more
2019-04-17T01:12:28,107 INFO [shutdown-hook-0]: provider.AuditProviderFactory (:()) - ==> JVMShutdownHook.run()
2019-04-17T01:12:28,107 INFO [shutdown-hook-0]: provider.AuditProviderFactory (:()) - JVMShutdownHook: Signalling async audit cleanup to start.
2019-04-17T01:12:28,107 INFO [Ranger async Audit cleanup]: provider.AuditProviderFactory (:()) - RangerAsyncAuditCleanup: Starting cleanup
2019-04-17T01:12:28,107 INFO [shutdown-hook-0]: provider.AuditProviderFactory (:()) - JVMShutdownHook: Waiting up to 30 seconds for audit cleanup to finish.
2019-04-17T01:12:28,107 INFO [Ranger async Audit cleanup]: destination.HDFSAuditDestination (:()) - Flush called. name=hiveServer2.async.multi_dest.batch.hdfs
2019-04-17T01:12:28,107 INFO [Ranger async Audit cleanup]: queue.AuditAsyncQueue (:()) - Stop called. name=hiveServer2.async
2019-04-17T01:12:28,108 INFO [Ranger async Audit cleanup]: queue.AuditAsyncQueue (:()) - Interrupting consumerThread. name=hiveServer2.async, consumer=hiveServer2.async.multi_dest
2019-04-17T01:12:28,108 INFO [Ranger async Audit cleanup]: provider.AuditProviderFactory (:()) - RangerAsyncAuditCleanup: Done cleanup
2019-04-17T01:12:28,108 INFO [Ranger async Audit cleanup]: provider.AuditProviderFactory (:()) - RangerAsyncAuditCleanup: Waiting to audit cleanup start signal
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditAsyncQueue0]: queue.AuditAsyncQueue (:()) - Caught exception in consumer thread. Shutdown might be in progress
2019-04-17T01:12:28,108 INFO [shutdown-hook-0]: provider.AuditProviderFactory (:()) - JVMShutdownHook: Audit cleanup finished after 1 milli seconds
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditAsyncQueue0]: queue.AuditAsyncQueue (:()) - Exiting polling loop. name=hiveServer2.async
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditAsyncQueue0]: queue.AuditAsyncQueue (:()) - Calling to stop consumer. name=hiveServer2.async, consumer.name=hiveServer2.async.multi_dest
2019-04-17T01:12:28,108 INFO [shutdown-hook-0]: provider.AuditProviderFactory (:()) - JVMShutdownHook: Interrupting ranger async audit cleanup thread
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditAsyncQueue0]: queue.AuditBatchQueue (:()) - Stop called. name=hiveServer2.async.multi_dest.batch
2019-04-17T01:12:28,108 INFO [shutdown-hook-0]: provider.AuditProviderFactory (:()) - <== JVMShutdownHook.run()
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditAsyncQueue0]: queue.AuditBatchQueue (:()) - Interrupting consumerThread. name=hiveServer2.async.multi_dest.batch, consumer=hiveServer2.async.multi_dest.batch.solr
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditAsyncQueue0]: queue.AuditBatchQueue (:()) - Stop called. name=hiveServer2.async.multi_dest.batch
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditAsyncQueue0]: destination.HDFSAuditDestination (:()) - Flush called. name=hiveServer2.async.multi_dest.batch.hdfs
2019-04-17T01:12:28,108 INFO [Ranger async Audit cleanup]: provider.AuditProviderFactory (:()) - RangerAsyncAuditCleanup: Interrupted while waiting for audit startCleanup signal! Exiting the thread...
java.lang.InterruptedException: null
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998) ~[?:1.8.0_112]
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304) ~[?:1.8.0_112]
    at java.util.concurrent.Semaphore.acquire(Semaphore.java:312) ~[?:1.8.0_112]
    at org.apache.ranger.audit.provider.AuditProviderFactory$RangerAsyncAuditCleanup.run(AuditProviderFactory.java:495) ~[?:?]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditAsyncQueue0]: queue.AuditBatchQueue (:()) - Interrupting consumerThread. name=hiveServer2.async.multi_dest.batch, consumer=hiveServer2.async.multi_dest.batch.hdfs
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditAsyncQueue0]: queue.AuditAsyncQueue (:()) - Exiting consumerThread.run() method. name=hiveServer2.async
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditBatchQueue0]: queue.AuditBatchQueue (:()) - Caught exception in consumer thread. Shutdown might be in progress
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditBatchQueue1]: queue.AuditBatchQueue (:()) - Caught exception in consumer thread. Shutdown might be in progress
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditBatchQueue0]: queue.AuditBatchQueue (:()) - Exiting consumerThread. Queue=hiveServer2.async.multi_dest.batch, dest=hiveServer2.async.multi_dest.batch.solr
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditBatchQueue1]: queue.AuditBatchQueue (:()) - Exiting consumerThread. Queue=hiveServer2.async.multi_dest.batch, dest=hiveServer2.async.multi_dest.batch.hdfs
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditBatchQueue0]: queue.AuditBatchQueue (:()) - Calling to stop consumer. name=hiveServer2.async.multi_dest.batch, consumer.name=hiveServer2.async.multi_dest.batch.solr
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditBatchQueue1]: queue.AuditBatchQueue (:()) - Calling to stop consumer. name=hiveServer2.async.multi_dest.batch, consumer.name=hiveServer2.async.multi_dest.batch.hdfs
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditBatchQueue0]: queue.AuditFileSpool (:()) - Stop called, queueName=hiveServer2.async.multi_dest.batch, consumer=hiveServer2.async.multi_dest.batch.solr
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditBatchQueue1]: queue.AuditFileSpool (:()) - Stop called, queueName=hiveServer2.async.multi_dest.batch, consumer=hiveServer2.async.multi_dest.batch.hdfs
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditBatchQueue0]: queue.AuditBatchQueue (:()) - Exiting consumerThread.run() method. name=hiveServer2.async.multi_dest.batch
2019-04-17T01:12:28,108 INFO [org.apache.ranger.audit.queue.AuditBatchQueue1]: queue.AuditBatchQueue (:()) - Exiting consumerThread.run() method. name=hiveServer2.async.multi_dest.batch
2019-04-17T01:12:28,108 INFO [hiveServer2.async.multi_dest.batch_hiveServer2.async.multi_dest.batch.solr_destWriter]: queue.AuditFileSpool (:()) - Caught exception in consumer thread. Shutdown might be in progress
2019-04-17T01:12:28,108 INFO [hiveServer2.async.multi_dest.batch_hiveServer2.async.multi_dest.batch.hdfs_destWriter]: queue.AuditFileSpool (:()) - Caught exception in consumer thread. Shutdown might be in progress
2019-04-17T01:12:28,505 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Resource modification time: 1555463548503 for hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/d2727e2d-92f1-4dda-b59f-35e4dcc4c6d9-resources/hive-hcatalog-core.jar
2019-04-17T01:12:28,505 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - Created new resources: null
2019-04-17T01:12:28,523 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Jar dir is null / directory doesn't exist. Choosing HIVE_INSTALL_DIR - /user/hive/.hiveJars
2019-04-17T01:12:28,524 INFO [HiveMaterializedViewsRegistry-0]: tez.DagUtils (:()) - Resource modification time: 1554360664415 for hdfs://ip-172-31-18-160.ec2.internal:8020/user/hive/.hiveJars/hive-exec-3.1.0.3.1.0.0-78-e352011a83e63f76afe26d6047287402e4b48656f467ef20f44dc74ea57c960d.jar
2019-04-17T01:12:28,535 INFO [shutdown-hook-0]: server.HiveServer2 (HiveServer2.java:stop(913)) - Shutting down HiveServer2
2019-04-17T01:12:28,536 INFO [shutdown-hook-0]: server.HiveServer2 (HiveServer2.java:stop(925)) - Web UI has stopped
2019-04-17T01:12:28,536 ERROR [shutdown-hook-0]: server.HiveServer2 (HiveServer2.java:stop(944)) - Error removing znode for this HiveServer2 instance from ZooKeeper.
java.lang.NullPointerException: null
    at org.apache.hive.service.server.HiveServer2.removeServerInstanceFromZooKeeper(HiveServer2.java:677) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.stop(HiveServer2.java:942) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at org.apache.hive.service.server.HiveServer2.lambda$init$0(HiveServer2.java:408) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_112]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_112]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_112]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_112]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
2019-04-17T01:12:28,536 INFO [shutdown-hook-0]: server.HiveServer2 (HiveServer2.java:stopOrDisconnectTezSessions(890)) - Stopping/Disconnecting tez sessions.
[... the same shutdown sequence (Shutting down HiveServer2 / Web UI has stopped / Error removing znode with an identical NullPointerException / Stopping/Disconnecting tez sessions) repeats four more times between 2019-04-17T01:12:28,537 and 2019-04-17T01:12:28,540, once for each exhausted start attempt; duplicates elided ...]
java.lang.NullPointerException: null at org.apache.hive.service.server.HiveServer2.removeServerInstanceFromZooKeeper(HiveServer2.java:677) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.stop(HiveServer2.java:942) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.server.HiveServer2.lambda$init$0(HiveServer2.java:408) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_112] at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112] 2019-04-17T01:12:28,540 INFO [shutdown-hook-0]: server.HiveServer2 (HiveServer2.java:stopOrDisconnectTezSessions(890)) - Stopping/Disconnecting tez sessions. 2019-04-17T01:12:28,540 INFO [shutdown-hook-0]: server.HiveServer2 (HiveStringUtils.java:run(785)) - SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down HiveServer2 at ip-172-31-18-160.ec2.internal/172.31.18.160 ************************************************************/ 2019-04-17T01:12:28,545 WARN [HiveMaterializedViewsRegistry-0]: authorizer.RangerHiveAuthorizerBase (:()) - RangerHiveAuthorizerBase.RangerHiveAuthorizerBase(): hiveAuthenticator.getUserName() returned null/empty 2019-04-17T01:12:28,545 WARN [HiveMaterializedViewsRegistry-0]: session.SessionState (:()) - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory. 2019-04-17T01:12:28,546 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Trying to connect to metastore with URI thrift://ip-172-31-18-160.ec2.internal:9083 2019-04-17T01:12:28,546 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Opened a connection to metastore, current connections: 7 2019-04-17T01:12:28,547 INFO [HiveMaterializedViewsRegistry-0]: metastore.HiveMetaStoreClient (:()) - Connected to metastore. 2019-04-17T01:12:28,547 INFO [HiveMaterializedViewsRegistry-0]: metastore.RetryingMetaStoreClient (:()) - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=24 delay=5 lifetime=0 2019-04-17T01:12:28,548 INFO [HiveMaterializedViewsRegistry-0]: client.TezClient (:()) - Tez Client Version: [ component=tez-api, version=0.9.1.3.1.0.0-78, revision=346318364f71536cb051ee88e9ee84e55b7e3e13, SCM-URL=scm:git:https://git-wip-us.apache.org/repos/asf/tez.git, buildTime=2018-12-06T12:19:14Z ] 2019-04-17T01:12:28,548 INFO [HiveMaterializedViewsRegistry-0]: tez.TezSessionState (:()) - Opening new Tez Session (id: d2727e2d-92f1-4dda-b59f-35e4dcc4c6d9, scratch dir: hdfs://ip-172-31-18-160.ec2.internal:8020/tmp/hive/hive/_tez_session_dir/d2727e2d-92f1-4dda-b59f-35e4dcc4c6d9)