LogType:stderr
Log Upload Time:Sat Mar 18 23:31:47 +0530 2017
LogLength:0
Log Contents:
End of LogType:stderr

LogType:stdout
Log Upload Time:Sat Mar 18 23:31:47 +0530 2017
LogLength:0
Log Contents:
End of LogType:stdout

LogType:syslog
Log Upload Time:Sat Mar 18 23:31:47 +0530 2017
LogLength:44902
Log Contents:
2017-03-18 22:43:13,558 WARN [main] org.apache.hadoop.metrics2.impl.MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-maptask.properties,hadoop-metrics2.properties
2017-03-18 22:43:13,622 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2017-03-18 22:43:13,623 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system started
2017-03-18 22:43:13,630 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:
2017-03-18 22:43:13,630 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1489855879213_0001, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier@5524cca1)
2017-03-18 22:43:13,787 INFO [main] org.apache.hadoop.mapred.YarnChild: Sleeping for 0ms before retrying again. Got null now.
2017-03-18 22:43:14,001 INFO [main] org.apache.hadoop.mapred.YarnChild: mapreduce.cluster.local.dir for child: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001
2017-03-18 22:43:14,205 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2017-03-18 22:43:14,574 INFO [main] org.apache.hadoop.mapred.Task: Using ResourceCalculatorProcessTree : [ ]
2017-03-18 22:43:14,601 WARN [main] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168
2017-03-18 22:43:14,725 INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: org.apache.hive.hcatalog.templeton.tool.NullSplit@d78795
2017-03-18 22:43:14,731 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: user.name is deprecated. Instead, use mapreduce.job.user.name
2017-03-18 22:43:14,733 INFO [main] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: Launch time = 1489857182857
2017-03-18 22:43:15,054 INFO [main] org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl: Timeline service address: http://pc-1.thenet.edu:8188/ws/v1/timeline/
2017-03-18 22:43:15,062 INFO [main] org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at pc-1.thenet.edu/127.0.1.1:8050
2017-03-18 22:43:15,146 INFO [main] org.apache.hadoop.yarn.client.AHSProxy: Connecting to Application History server at pc-1.thenet.edu/127.0.1.1:10200
2017-03-18 22:43:15,196 INFO [main] org.apache.hadoop.mapred.WebHCatJTShim23: Looking for jobs to kill...
2017-03-18 22:43:15,196 INFO [main] org.apache.hadoop.mapred.WebHCatJTShim23: Querying RM for tag = job_1489855879213_0001, starting with ts = 1489857182857
2017-03-18 22:43:15,229 INFO [main] org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at pc-1.thenet.edu/127.0.1.1:8050
2017-03-18 22:43:15,247 INFO [main] org.apache.hadoop.mapred.WebHCatJTShim23: No jobs found from
2017-03-18 22:43:15,250 INFO [main] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: templeton: copy hdfs://pc-1.thenet.edu:8020/user/subrata/pig/jobs/riskfactorpig_18-03-2017-22-43-01/script.pig => script.pig
2017-03-18 22:43:15,260 INFO [main] org.apache.hive.hcatalog.templeton.tool.TrivialExecService: run(cmd, removeEnv, environmentVariables)
2017-03-18 22:43:15,260 INFO [main] org.apache.hive.hcatalog.templeton.tool.TrivialExecService: Starting cmd: [pig.tar.gz/pig/bin/pig, -Dmapreduce.job.credentials.binary=/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/container_tokens, -Dmapreduce.job.tags=job_1489855879213_0001, -useHCatalog, -x, tez, -file, script.pig]
2017-03-18 22:43:15,261 INFO [main] org.apache.hive.hcatalog.templeton.tool.TrivialExecService: Removing env var: HADOOP_ROOT_LOGGER=INFO,console
2017-03-18 22:43:15,262 INFO [main] org.apache.hive.hcatalog.templeton.tool.TrivialExecService: START========Starting process with env:========:
CLASSPATH= /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/mr-framework/hadoop/share/hadoop/mapreduce/*: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/mr-framework/hadoop/share/hadoop/mapreduce/lib/*: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/mr-framework/hadoop/share/hadoop/common/*: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/mr-framework/hadoop/share/hadoop/common/lib/*: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/mr-framework/hadoop/share/hadoop/yarn/*: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/mr-framework/hadoop/share/hadoop/yarn/lib/*: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/mr-framework/hadoop/share/hadoop/hdfs/*: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/mr-framework/hadoop/share/hadoop/hdfs/lib/*: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/mr-framework/hadoop/share/hadoop/tools/lib/*: /usr/hdp/2.5.3.0-37/hadoop/lib/hadoop-lzo-0.6.0.2.5.3.0-37.jar: /etc/hadoop/conf/secure: job.jar/job.jar: job.jar/classes/: job.jar/lib/*: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/*:
CONTAINER_ID=container_e24_1489855879213_0001_01_000002
HADOOP_CLASSPATH= /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002: job.jar/job.jar: job.jar/classes/: job.jar/lib/*:
/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/*: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000001: job.jar/job.jar: job.jar/classes/: job.jar/lib/*: /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000001/*: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib/hive-webhcat-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr//lib: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//asm-3.1.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//commons-exec-1.1.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//hive-webhcat-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//jackson-core-asl-1.9.2.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//jackson-jaxrs-1.9.2.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//jackson-xc-1.9.2.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//jaxb-api-2.2.2.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//jaxb-impl-2.2.3-1.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//jersey-core-1.14.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//jersey-json-1.14.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//jersey-server-1.14.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//jersey-servlet-1.14.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//jettison-1.1.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//jul-to-slf4j-1.7.5.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//stax-api-1.0-2.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//wadl-resourcedoc-doclet-1.4.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//xercesImpl-2.9.1.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../share/webhcat/svr/lib//xml-apis-1.3.04.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/sbin/../etc/webhcat: /usr/hdp/2.5.3.0-37/atlas/hook/hive/*: /usr/hdp/2.5.3.0-37/hive/lib/accumulo-core-1.7.0.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/accumulo-fate-1.7.0.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/accumulo-start-1.7.0.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/accumulo-trace-1.7.0.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/activation-1.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/ant-1.9.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/ant-launcher-1.9.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/antlr-2.7.7.jar: /usr/hdp/2.5.3.0-37/hive/lib/antlr-runtime-3.4.jar: /usr/hdp/2.5.3.0-37/hive/lib/apache-log4j-extras-1.2.17.jar: /usr/hdp/2.5.3.0-37/hive/lib/asm-commons-3.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/asm-tree-3.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/avatica-1.8.0.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/avatica-metrics-1.8.0.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/avro-1.7.5.jar: /usr/hdp/2.5.3.0-37/hive/lib/bonecp-0.8.0.RELEASE.jar: /usr/hdp/2.5.3.0-37/hive/lib/calcite-core-1.2.0.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/calcite-linq4j-1.2.0.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/commons-cli-1.2.jar: /usr/hdp/2.5.3.0-37/hive/lib/commons-codec-1.4.jar: /usr/hdp/2.5.3.0-37/hive/lib/commons-collections-3.2.2.jar: /usr/hdp/2.5.3.0-37/hive/lib/commons-compiler-2.7.6.jar: 
/usr/hdp/2.5.3.0-37/hive/lib/commons-compress-1.4.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/commons-dbcp-1.4.jar: /usr/hdp/2.5.3.0-37/hive/lib/commons-httpclient-3.0.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/commons-io-2.4.jar: /usr/hdp/2.5.3.0-37/hive/lib/commons-lang-2.6.jar: /usr/hdp/2.5.3.0-37/hive/lib/commons-logging-1.1.3.jar: /usr/hdp/2.5.3.0-37/hive/lib/commons-math-2.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/commons-pool-1.5.4.jar: /usr/hdp/2.5.3.0-37/hive/lib/commons-vfs2-2.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/curator-client-2.6.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/curator-framework-2.6.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/curator-recipes-2.6.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/datanucleus-api-jdo-4.2.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/datanucleus-core-4.1.6.jar: /usr/hdp/2.5.3.0-37/hive/lib/datanucleus-rdbms-4.1.7.jar: /usr/hdp/2.5.3.0-37/hive/lib/derby-10.10.2.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar: /usr/hdp/2.5.3.0-37/hive/lib/eigenbase-properties-1.1.5.jar: /usr/hdp/2.5.3.0-37/hive/lib/geronimo-annotation_1.0_spec-1.1.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/geronimo-jaspic_1.0_spec-1.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/geronimo-jta_1.1_spec-1.1.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/groovy-all-2.4.4.jar: /usr/hdp/2.5.3.0-37/hive/lib/guava-14.0.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/HikariCP-1.3.9.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-accumulo-handler-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-accumulo-handler.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-ant-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-ant.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-beeline-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-beeline.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-cli-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-cli.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-common-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-common.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-contrib-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-contrib.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-exec-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-exec.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-hbase-handler-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-hbase-handler.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-hwi-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-hwi.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-jdbc-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-jdbc-1.2.1000.2.5.3.0-37-standalone.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-jdbc.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-metastore-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-metastore.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-serde-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-serde.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-service-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-service.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-shims-0.20S-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-shims-0.23-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-shims-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-shims-common-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-shims-common.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-shims.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-shims-scheduler-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/hive-shims-scheduler.jar: /usr/hdp/2.5.3.0-37/hive/lib/htrace-core-3.1.0-incubating.jar: /usr/hdp/2.5.3.0-37/hive/lib/httpclient-4.4.jar: 
/usr/hdp/2.5.3.0-37/hive/lib/httpcore-4.4.jar: /usr/hdp/2.5.3.0-37/hive/lib/ivy-2.4.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/jackson-annotations-2.4.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/jackson-core-2.4.2.jar: /usr/hdp/2.5.3.0-37/hive/lib/jackson-databind-2.4.2.jar: /usr/hdp/2.5.3.0-37/hive/lib/janino-2.7.6.jar: /usr/hdp/2.5.3.0-37/hive/lib/javassist-3.18.1-GA.jar: /usr/hdp/2.5.3.0-37/hive/lib/javax.jdo-3.2.0-m3.jar: /usr/hdp/2.5.3.0-37/hive/lib/jcommander-1.32.jar: /usr/hdp/2.5.3.0-37/hive/lib/jdo-api-3.0.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/jetty-all-7.6.0.v20120127.jar: /usr/hdp/2.5.3.0-37/hive/lib/jetty-all-server-7.6.0.v20120127.jar: /usr/hdp/2.5.3.0-37/hive/lib/jline-2.12.jar: /usr/hdp/2.5.3.0-37/hive/lib/joda-time-2.8.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/jpam-1.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/json-20090211.jar: /usr/hdp/2.5.3.0-37/hive/lib/jsr305-3.0.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/jta-1.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/libfb303-0.9.3.jar: /usr/hdp/2.5.3.0-37/hive/lib/libthrift-0.9.3.jar: /usr/hdp/2.5.3.0-37/hive/lib/log4j-1.2.16.jar: /usr/hdp/2.5.3.0-37/hive/lib/mail-1.4.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/maven-scm-api-1.4.jar: /usr/hdp/2.5.3.0-37/hive/lib/maven-scm-provider-svn-commons-1.4.jar: /usr/hdp/2.5.3.0-37/hive/lib/maven-scm-provider-svnexe-1.4.jar: /usr/hdp/2.5.3.0-37/hive/lib/metrics-core-3.1.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/metrics-json-3.1.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/metrics-jvm-3.1.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/mysql-connector-java.jar: /usr/hdp/2.5.3.0-37/hive/lib/netty-3.7.0.Final.jar: /usr/hdp/2.5.3.0-37/hive/lib/ojdbc6.jar: /usr/hdp/2.5.3.0-37/hive/lib/opencsv-2.3.jar: /usr/hdp/2.5.3.0-37/hive/lib/oro-2.0.8.jar: /usr/hdp/2.5.3.0-37/hive/lib/paranamer-2.3.jar: /usr/hdp/2.5.3.0-37/hive/lib/parquet-hadoop-bundle-1.8.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar: /usr/hdp/2.5.3.0-37/hive/lib/plexus-utils-1.5.6.jar: /usr/hdp/2.5.3.0-37/hive/lib/protobuf-java-2.5.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/ranger-hive-plugin-shim-0.6.0.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/ranger-plugin-classloader-0.6.0.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/lib/regexp-1.3.jar: /usr/hdp/2.5.3.0-37/hive/lib/servlet-api-2.5.jar: /usr/hdp/2.5.3.0-37/hive/lib/snappy-java-1.0.5.jar: /usr/hdp/2.5.3.0-37/hive/lib/ST4-4.0.4.jar: /usr/hdp/2.5.3.0-37/hive/lib/stax-api-1.0.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/stringtemplate-3.2.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/super-csv-2.2.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/transaction-api-1.1.jar: /usr/hdp/2.5.3.0-37/hive/lib/velocity-1.5.jar: /usr/hdp/2.5.3.0-37/hive/lib/xz-1.0.jar: /usr/hdp/2.5.3.0-37/hive/lib/zookeeper.jar: /usr/hdp/2.5.3.0-37/hive-hcatalog/libexec/../share/hcatalog/hive-hcatalog-core-1.2.1000.2.5.3.0-37.jar: /usr/hdp/2.5.3.0-37/hive/conf: jdbc-mysql.jar: mysql-connector-java-5.1.37-bin.jar: mysql-connector-java.jar: /usr/hdp/2.5.3.0-37/tez/*: /usr/hdp/2.5.3.0-37/tez/lib/*: /usr/hdp/2.5.3.0-37/tez/conf: HADOOP_CLIENT_OPTS=-Xmx1024m -Xmx1024m HADOOP_CONF_DIR=/usr/hdp/current/hadoop-client/conf HADOOP_DATANODE_OPTS=-server -XX:ParallelGCThreads=4 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/yarn/hs_err_pid%p.log -XX:NewSize=200m -XX:MaxNewSize=200m -Xloggc:/var/log/hadoop/yarn/gc.log-201703182224 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Xms1024m -Xmx1024m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -server -XX:ParallelGCThreads=4 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/yarn/hs_err_pid%p.log 
-XX:NewSize=200m -XX:MaxNewSize=200m -Xloggc:/var/log/hadoop/yarn/gc.log-201703182224 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Xms1024m -Xmx1024m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT HADOOP_HEAPSIZE=1024 HADOOP_HOME=/usr/hdp/2.5.3.0-37/hadoop HADOOP_HOME_WARN_SUPPRESS=1 HADOOP_IDENT_STRING=yarn HADOOP_LIBEXEC_DIR=/usr/hdp/current/hadoop-client/libexec HADOOP_LOG_DIR=/var/log/hadoop/yarn HADOOP_MAPRED_HOME=/usr/hdp/2.5.3.0-37/hadoop-mapreduce HADOOP_MAPRED_LOG_DIR=/var/log/hadoop-mapreduce/yarn HADOOP_MAPRED_PID_DIR=/var/run/hadoop-mapreduce/yarn HADOOP_NAMENODE_INIT_HEAPSIZE=-Xms1024m HADOOP_NAMENODE_OPTS=-server -XX:ParallelGCThreads=8 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/yarn/hs_err_pid%p.log -XX:NewSize=128m -XX:MaxNewSize=128m -Xloggc:/var/log/hadoop/yarn/gc.log-201703182224 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -XX:CMSInitiatingOccupancyFraction=70 -XX:+UseCMSInitiatingOccupancyOnly -Xms1024m -Xmx1024m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -Dorg.mortbay.jetty.Request.maxFormContentSize=-1 -server -XX:ParallelGCThreads=8 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/yarn/hs_err_pid%p.log -XX:NewSize=128m -XX:MaxNewSize=128m -Xloggc:/var/log/hadoop/yarn/gc.log-201703182224 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -XX:CMSInitiatingOccupancyFraction=70 -XX:+UseCMSInitiatingOccupancyOnly -Xms1024m -Xmx1024m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -Dorg.mortbay.jetty.Request.maxFormContentSize=-1 HADOOP_OPTS=-Dhdp.version=2.5.3.0-37 -Djava.net.preferIPv4Stack=true -Dhdp.version= -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/var/log/hadoop/yarn -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.3.0-37/hadoop -Dhadoop.id.str=yarn -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/2.5.3.0-37/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.3.0-37/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhdp.version=2.5.3.0-37 -Dhadoop.log.dir=/var/log/hadoop/yarn -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.3.0-37/hadoop -Dhadoop.id.str=yarn -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/2.5.3.0-37/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.3.0-37/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir:/usr/hdp/2.5.3.0-37/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.3.0-37/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true HADOOP_PID_DIR=/var/run/hadoop/yarn HADOOP_PREFIX=/usr/hdp/2.5.3.0-37/hadoop HADOOP_SECONDARYNAMENODE_OPTS=-server -XX:ParallelGCThreads=8 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/yarn/hs_err_pid%p.log -XX:NewSize=128m -XX:MaxNewSize=128m -Xloggc:/var/log/hadoop/yarn/gc.log-201703182224 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -XX:CMSInitiatingOccupancyFraction=70 -XX:+UseCMSInitiatingOccupancyOnly -Xms1024m -Xmx1024m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-secondarynamenode/bin/kill-secondary-name-node" -server -XX:ParallelGCThreads=8 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/yarn/hs_err_pid%p.log 
-XX:NewSize=128m -XX:MaxNewSize=128m -Xloggc:/var/log/hadoop/yarn/gc.log-201703182224 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -XX:CMSInitiatingOccupancyFraction=70 -XX:+UseCMSInitiatingOccupancyOnly -Xms1024m -Xmx1024m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-secondarynamenode/bin/kill-secondary-name-node" HADOOP_SECURE_DN_LOG_DIR=/var/log/hadoop/ HADOOP_SECURE_DN_PID_DIR=/var/run/hadoop/ HADOOP_SECURE_DN_USER= HADOOP_SSH_OPTS=-o ConnectTimeout=5 -o SendEnv=HADOOP_CONF_DIR HADOOP_TOKEN_FILE_LOCATION=/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/container_tokens HADOOP_USER_NAME=subrata HADOOP_YARN_HOME=/usr/hdp/current/hadoop-yarn-nodemanager HADOOP_YARN_USER=yarn HCAT_HOME=/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/hive.tar.gz/hive/hcatalog HDP_VERSION=2.5.3.0-37 HIVE_HOME=/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/hive.tar.gz/hive HOME=/home/ JAVA_HOME=/usr/jdk64/jdk1.8.0_77 JAVA_LIBRARY_PATH= /usr/hdp/2.5.3.0-37/hadoop/lib/native/Linux-amd64-64: /usr/hdp/2.5.3.0-37/hadoop/lib/native: /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir: /usr/hdp/2.5.3.0-37/hadoop/lib/native/Linux-amd64-64: /usr/hdp/2.5.3.0-37/hadoop/lib/native: /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir: JSVC_HOME=/usr/lib/bigtop-utils JVM_PID=10780 LANG=en_IN LANGUAGE=en_IN:en LD_LIBRARY_PATH= /hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002: /usr/hdp/2.5.3.0-37/hadoop/lib/native: /usr/hdp/2.5.3.0-37/hadoop/lib/native/Linux-amd64-64: LOCAL_DIRS=/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001 LOCAL_USER_DIRS=/hadoop/yarn/local/usercache/subrata/ LOGNAME=subrata LOG_DIRS=/hadoop/yarn/log/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002 MAIL=/var/mail/yarn MALLOC_ARENA_MAX=4 NLSPATH=/usr/dt/lib/nls/msg/%L/%N.cat NM_AUX_SERVICE_mapreduce_shuffle=AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA= NM_AUX_SERVICE_spark2_shuffle= NM_AUX_SERVICE_spark_shuffle= NM_HOST=pc-1.thenet.edu NM_HTTP_PORT=8042 NM_PORT=45454 PATH= /usr/sbin: /sbin: /usr/lib/ambari-server/*: /usr/local/sbin: /usr/local/bin: /usr/sbin: /usr/bin: /sbin: /bin: /usr/games: /usr/local/games: /var/lib/ambari-agent: PIG_OPTS=-Dhive.metastore.local=false -Dhive.metastore.uris=thrift://pc-1.thenet.edu:9083 -Dhive.metastore.sasl.enabled=false -Dhive.metastore.execute.setugi=true -Dhive.execution.engine=tez PWD=/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002 QT_QPA_PLATFORMTHEME=appmenu-qt5 SHELL=/bin/bash SHLVL=4 STDERR_LOGFILE_ENV=/hadoop/yarn/log/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/stderr STDOUT_LOGFILE_ENV=/hadoop/yarn/log/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/stdout USER=subrata XDG_SEAT=seat0 XDG_SESSION_ID=c7 XDG_VTNR=7 XFILESEARCHPATH=/usr/dt/app-defaults/%L/Dt YARN_CONF_DIR=/usr/hdp/current/hadoop-client/conf YARN_IDENT_STRING=yarn YARN_LOGFILE=yarn-yarn-nodemanager-pc-1.log YARN_LOG_DIR=/var/log/hadoop-yarn/yarn YARN_NICENESS=0 YARN_NODEMANAGER_HEAPSIZE=1024 YARN_NODEMANAGER_OPTS= -Dnm.audit.logger=INFO,NMAUDIT -Dnm.audit.logger=INFO,NMAUDIT YARN_OPTS= 
-Dhdp.version=2.5.3.0-37 -Dhadoop.log.dir=/var/log/hadoop-yarn/yarn -Dyarn.log.dir=/var/log/hadoop-yarn/yarn -Dhadoop.log.file=yarn-yarn-nodemanager-pc-1.log -Dyarn.log.file=yarn-yarn-nodemanager-pc-1.log -Dyarn.home.dir= -Dyarn.id.str=yarn -Dhadoop.root.logger=INFO,EWMA,RFA -Dyarn.root.logger=INFO,EWMA,RFA -Djava.library.path=:/usr/hdp/2.5.3.0-37/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.3.0-37/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir:/usr/hdp/2.5.3.0-37/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.3.0-37/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir -Dyarn.policy.file=hadoop-policy.xml -Djava.io.tmpdir=/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir -server -Dnm.audit.logger=INFO,NMAUDIT -Dnm.audit.logger=INFO,NMAUDIT -Dhadoop.log.dir=/var/log/hadoop-yarn/yarn -Dyarn.log.dir=/var/log/hadoop-yarn/yarn -Dhadoop.log.file=yarn-yarn-nodemanager-pc-1.log -Dyarn.log.file=yarn-yarn-nodemanager-pc-1.log -Dyarn.home.dir=/usr/hdp/current/hadoop-yarn-nodemanager -Dhadoop.home.dir=/usr/hdp/2.5.3.0-37/hadoop -Dhadoop.root.logger=INFO,EWMA,RFA -Dyarn.root.logger=INFO,EWMA,RFA -Djava.library.path=:/usr/hdp/2.5.3.0-37/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.3.0-37/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir:/usr/hdp/2.5.3.0-37/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.3.0-37/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir YARN_PID_DIR=/var/run/hadoop-yarn/yarn YARN_RESOURCEMANAGER_HEAPSIZE=1024 YARN_RESOURCEMANAGER_OPTS=-Dyarn.server.resourcemanager.appsummary.logger=INFO,RMSUMMARY -Drm.audit.logger=INFO,RMAUDIT YARN_ROOT_LOGGER=INFO,EWMA,RFA YARN_TIMELINESERVER_HEAPSIZE=1024 _=/usr/jdk64/jdk1.8.0_77/bin/java END========Starting process with env:======== 2017-03-18 22:43:15,263 INFO [main] org.apache.hive.hcatalog.templeton.tool.TrivialExecService: Files in '.' dir:/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/. 
File: .script.pig.crc
File: script.pig
File: .job.xml.crc
--Files in 'pig.tar.gz' dir:/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/pig.tar.gz
----Files in 'pig' dir:/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/pig
----Files in 'build' dir:/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/build
File: default_container_executor_session.sh
File: hive-shims-common-1.2.1000.2.5.3.0-37.jar
File: container_tokens
--Files in 'tmp' dir:/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/tmp
File: default_container_executor.sh
File: hive-common.jar
--Files in 'mr-framework' dir:/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/mr-framework
----Files in 'hadoop' dir:/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/hadoop
File: .launch_container.sh.crc
File: .container_tokens.crc
File: launch_container.sh
File: job.xml
File: zookeeper.jar
File: hive-shims-0.23-1.2.1000.2.5.3.0-37.jar
File: .default_container_executor_session.sh.crc
--Files in 'job.jar' dir:/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/job.jar
--File: job.jar
--Files in 'hive.tar.gz' dir:/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/hive.tar.gz
----Files in 'hive' dir:/hadoop/yarn/local/usercache/subrata/appcache/application_1489855879213_0001/container_e24_1489855879213_0001_01_000002/hive
File: .default_container_executor.sh.crc
2017-03-18 22:43:15,316 INFO [main] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: templeton: Writing status to /user/subrata/pig/jobs/riskfactorpig_18-03-2017-22-43-01/stdout
2017-03-18 22:43:15,328 INFO [main] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: templeton: Writing status to /user/subrata/pig/jobs/riskfactorpig_18-03-2017-22-43-01/stderr
2017-03-18 22:43:15,330 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat.
2017-03-18 22:43:20,667 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168
2017-03-18 22:44:15,330 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat..
2017-03-18 22:44:20,772 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168
2017-03-18 22:45:15,330 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat...
2017-03-18 22:45:20,825 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168
2017-03-18 22:46:15,330 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat....
2017-03-18 22:46:20,879 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:47:15,331 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat..... 2017-03-18 22:47:20,927 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:48:15,331 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat...... 2017-03-18 22:48:20,974 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:49:15,331 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat....... 2017-03-18 22:49:21,029 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:50:15,331 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat........ 2017-03-18 22:50:21,082 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:51:15,332 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat......... 2017-03-18 22:51:21,128 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:52:15,332 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat.......... 2017-03-18 22:52:21,173 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:53:15,332 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat........... 2017-03-18 22:53:21,218 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:54:15,332 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat............ 2017-03-18 22:54:21,269 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:55:15,333 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat............. 2017-03-18 22:55:21,323 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:56:15,333 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat.............. 2017-03-18 22:56:18,372 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:57:15,333 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat............... 
2017-03-18 22:57:18,424 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:58:15,333 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat................ 2017-03-18 22:58:18,487 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 22:59:15,334 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat................. 2017-03-18 22:59:18,537 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:00:15,334 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat.................. 2017-03-18 23:00:18,585 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:01:15,334 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat................... 2017-03-18 23:01:18,654 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:02:15,334 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat.................... 2017-03-18 23:02:18,711 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:03:15,335 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat..................... 2017-03-18 23:03:18,771 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:04:15,335 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat...................... 2017-03-18 23:04:18,813 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:05:15,335 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat....................... 2017-03-18 23:05:18,854 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:06:15,335 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat........................ 2017-03-18 23:06:18,898 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:07:15,336 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat......................... 
2017-03-18 23:07:18,941 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:08:15,336 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat.......................... 2017-03-18 23:08:18,982 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:09:15,336 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat........................... 2017-03-18 23:09:19,025 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:10:15,336 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat............................ 2017-03-18 23:10:19,068 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:11:15,337 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat............................. 2017-03-18 23:11:19,109 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:12:15,337 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat.............................. 2017-03-18 23:12:19,151 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:13:15,337 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat............................... 2017-03-18 23:13:19,194 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:14:15,337 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat................................ 2017-03-18 23:14:19,236 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:15:15,338 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat................................. 2017-03-18 23:15:19,278 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:16:15,338 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat.................................. 2017-03-18 23:16:19,317 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:17:15,338 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat................................... 
2017-03-18 23:17:19,358 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:18:15,338 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat.................................... 2017-03-18 23:18:19,399 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:19:15,339 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat..................................... 2017-03-18 23:19:19,443 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:20:15,339 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat...................................... 2017-03-18 23:20:19,486 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:21:15,339 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat....................................... 2017-03-18 23:21:19,533 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:22:15,339 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat........................................ 2017-03-18 23:22:19,576 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:23:15,340 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat......................................... 2017-03-18 23:23:19,619 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:24:15,340 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat.......................................... 2017-03-18 23:24:19,660 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:25:15,340 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat........................................... 2017-03-18 23:25:19,701 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:26:15,340 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat............................................ 2017-03-18 23:26:19,742 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:27:15,341 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat............................................. 
2017-03-18 23:27:19,792 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:28:15,341 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat.............................................. 2017-03-18 23:28:19,833 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:29:15,341 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat............................................... 2017-03-18 23:29:19,874 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 2017-03-18 23:30:15,341 INFO [pool-6-thread-3] org.apache.hive.hcatalog.templeton.tool.LaunchMapper: KeepAlive Heart beat................................................ 2017-03-18 23:30:19,908 WARN [communication thread] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: Unexpected: procfs stat file is not in the expected format for process with pid 4168 End of LogType:syslog