
Unable to Start DataNode in kerberos cluster

Master Collaborator

Hi Guys,

I'm unable to start the DataNode after enabling Kerberos in my cluster. I've tried all the solutions suggested in the community and elsewhere on the Internet, without any success.

All the other services started, and my cluster nodes are able to authenticate against Active Directory.

Here are the relevant HDFS configs:

dfs.datanode.http.address = 1006
dfs.datanode.address = 1004
hadoop.security.authentication = kerberos
hadoop.security.authorization = true
hadoop.rpc.protection = authentication
Enable Kerberos Authentication for HTTP Web-Consoles = true

and here is the log:

STARTUP_MSG:   java = 1.8.0_101
************************************************************/
2017-10-23 06:56:02,698 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2017-10-23 06:56:03,449 INFO org.apache.hadoop.security.UserGroupInformation: Login successful for user hdfs/aopr-dhc001.lpdomain.com@LPDOMAIN.COM using keytab file hdfs.keytab
2017-10-23 06:56:03,812 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2017-10-23 06:56:03,891 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2017-10-23 06:56:03,891 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2017-10-23 06:56:03,899 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2017-10-23 06:56:03,900 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is enabled.
2017-10-23 06:56:03,903 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is aopr-dhc001.lpdomain.com
2017-10-23 06:56:03,908 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.lang.RuntimeException: Cannot start secure DataNode without configuring either privileged resources or SASL RPC data transfer protection and SSL for HTTP. Using privileged resources in combination with SASL RPC data transfer protection is not supported.
	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkSecureConfig(DataNode.java:1371)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1271)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:464)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2583)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2470)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2517)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2699)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2723)
2017-10-23 06:56:03,919 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2017-10-23 06:56:03,921 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at aopr-dhc001.lpdomain.com/10.16.144.131
************************************************************/
2017-10-23 06:56:08,422 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = aopr-dhc001.lpdomain.com/10.16.144.131
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.6.0-cdh5.13.0
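For reference, checkSecureConfig (the method at the top of that stack) passes in exactly one of two setups. The sketch below summarizes both, plus a couple of commands to see what the DataNode actually resolved. The port values are the ones from my config above, and the process directory path comes from the stderr trace further down in this thread; the property names are standard Hadoop, so treat this as a sketch of the two supported modes rather than a confirmed fix for this cluster.

# Option 1 - privileged resources: both DataNode ports below 1024 and the
# process started as root via jsvc (which then drops to the DataNode user).
#   dfs.datanode.address      = 0.0.0.0:1004
#   dfs.datanode.http.address = 0.0.0.0:1006
#   dfs.data.transfer.protection must stay UNSET in this mode.
#
# Option 2 - SASL on the data transfer protocol plus TLS for HTTP; no root/jsvc.
#   dfs.data.transfer.protection = authentication   # or integrity / privacy
#   dfs.http.policy              = HTTPS_ONLY
#   dfs.datanode.address         = 0.0.0.0:50010    # non-privileged port
#
# Check what the running role actually resolved from its runtime config dir:
CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
hdfs --config "$CONF_DIR" getconf -confKey dfs.datanode.address
hdfs --config "$CONF_DIR" getconf -confKey dfs.data.transfer.protection || true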

34 Replies

Master Mentor

@Fawze AbuJaber

Don't give up just yet; just imagine if this were a production cluster 🙂 Do you have documentation that you followed? I could compare it with my MIT Kerberos HDP / AD integration setups and maybe find the discrepancy.

Please revert.


New Contributor

How did you resolve this?

Community Manager

Hi @Priya09, as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post.



Regards,

Vidya Sargur,
Community Manager


Was your question answered? Make sure to mark the answer as the accepted solution.
If you find a reply useful, say thanks by clicking on the thumbs up button.

Master Collaborator

Can anyone have a look at these logs and let me know if there is an error that I'm missing? This is the DataNode role's stderr, and the same startup sequence repeats for each retry attempt:

Tue Oct 31 18:04:01 EDT 2017
JAVA_HOME=/liveperson/jdk8
using /liveperson/jdk8 as JAVA_HOME
using 5 as CDH_VERSION
using /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE as CONF_DIR
using cloudera-scm as SECURE_USER
using cloudera-scm as SECURE_GROUP
CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
CMF_CONF_DIR=/etc/cloudera-scm-agent
1048576
using cloudera-scm as HADOOP_SECURE_DN_USER
using /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils as JSVC_HOME
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
>>> KdcAccessibility: reset
>>> KdcAccessibility: reset
>>> KeyTabInputStream, readName(): LPDOMAIN.COM
>>> KeyTabInputStream, readName(): HTTP
>>> KeyTabInputStream, readName(): aopr-dhc001.lpdomain.com
>>> KeyTab: load() entry length: 77; type: 23
>>> KeyTabInputStream, readName(): LPDOMAIN.COM
>>> KeyTabInputStream, readName(): hdfs
>>> KeyTabInputStream, readName(): aopr-dhc001.lpdomain.com
>>> KeyTab: load() entry length: 77; type: 23
Looking for keys for: hdfs/aopr-dhc001.lpdomain.com@LPDOMAIN.COM
Added key: 23version: 1
Looking for keys for: hdfs/aopr-dhc001.lpdomain.com@LPDOMAIN.COM
Added key: 23version: 1
default etypes for default_tkt_enctypes: 23.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=ropr-mng01.lpdomain.com TCP:88, timeout=5000, number of retries =3, #bytes=161
>>> KDCCommunication: kdc=ropr-mng01.lpdomain.com TCP:88, timeout=5000,Attempt =1, #bytes=161
>>>DEBUG: TCPClient reading 623 bytes
>>> KrbKdcReq send: #bytes read=623
>>> KdcAccessibility: remove ropr-mng01.lpdomain.com
Looking for keys for: hdfs/aopr-dhc001.lpdomain.com@LPDOMAIN.COM
Added key: 23version: 1
>>> EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
>>> KrbAsRep cons in KrbAsReq.getReply hdfs/aopr-dhc001.lpdomain.com
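One thing visible in the debug output above: every keytab entry loads with etype 23, which is arcfour-hmac (RC4). To confirm what the merged keytab actually contains on this host, something like the following should work (the path is the process directory from this same trace, so it changes per role instance):

klist -kte /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE/hdfs.keytab
# -k reads a keytab, -t shows timestamps, -e shows encryption types.
# If only arcfour-hmac entries appear, AES enctypes were not issued for these
# principals; that is worth verifying against AD, though not by itself a
# proven cause of the failure here.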
[The Kerberos debug sequence above repeats verbatim for the retries at 18:04:04, 18:04:09, and 18:04:14; the duplicates are omitted here. The launcher's shell trace for the first attempt follows.]

Tue Oct 31 18:04:01 EDT 2017
+ source_parcel_environment
+ '[' '!' -z /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/meta/cdh_env.sh:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/meta/gplextras_env.sh:/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/meta/spark2_env.sh ']'
+ OLD_IFS=' 	
'
+ IFS=:
+ SCRIPT_ARRAY=($SCM_DEFINES_SCRIPTS)
+ DIRNAME_ARRAY=($PARCEL_DIRNAMES)
+ IFS=' 	
'
+ COUNT=3
++ seq 1 3
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/meta/cdh_env.sh
+ PARCEL_DIRNAME=CDH-5.13.0-1.cdh5.13.0.p0.29
+ . /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/meta/cdh_env.sh
++ CDH_DIRNAME=CDH-5.13.0-1.cdh5.13.0.p0.29
++ export CDH_HADOOP_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ CDH_HADOOP_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export CDH_MR1_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-0.20-mapreduce
++ CDH_MR1_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-0.20-mapreduce
++ export CDH_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ CDH_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ export CDH_HTTPFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-httpfs
++ CDH_HTTPFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-httpfs
++ export CDH_MR2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ CDH_MR2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ export CDH_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ CDH_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ export CDH_HBASE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase
++ CDH_HBASE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase
++ export CDH_ZOOKEEPER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/zookeeper
++ CDH_ZOOKEEPER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/zookeeper
++ export CDH_HIVE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive
++ CDH_HIVE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive
++ export CDH_HUE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hue
++ CDH_HUE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hue
++ export CDH_OOZIE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/oozie
++ CDH_OOZIE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/oozie
++ export CDH_HUE_PLUGINS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ CDH_HUE_PLUGINS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export CDH_FLUME_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/flume-ng
++ CDH_FLUME_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/flume-ng
++ export CDH_PIG_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/pig
++ CDH_PIG_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/pig
++ export CDH_HCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive-hcatalog
++ CDH_HCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive-hcatalog
++ export CDH_SQOOP2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sqoop2
++ CDH_SQOOP2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sqoop2
++ export CDH_LLAMA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/llama
++ CDH_LLAMA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/llama
++ export CDH_SENTRY_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sentry
++ CDH_SENTRY_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sentry
++ export TOMCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-tomcat
++ TOMCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-tomcat
++ export JSVC_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils
++ JSVC_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils
++ export CDH_HADOOP_BIN=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/bin/hadoop
++ CDH_HADOOP_BIN=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/bin/hadoop
++ export CDH_IMPALA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/impala
++ CDH_IMPALA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/impala
++ export CDH_SOLR_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/solr
++ CDH_SOLR_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/solr
++ export CDH_HBASE_INDEXER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase-solr
++ CDH_HBASE_INDEXER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase-solr
++ export SEARCH_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/search
++ SEARCH_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/search
++ export CDH_SPARK_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/spark
++ CDH_SPARK_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/spark
++ export WEBHCAT_DEFAULT_XML=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ WEBHCAT_DEFAULT_XML=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ export CDH_KMS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-kms
++ CDH_KMS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-kms
++ export CDH_PARQUET_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/parquet
++ CDH_PARQUET_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/parquet
++ export CDH_AVRO_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/avro
++ CDH_AVRO_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/avro
++ export CDH_KUDU_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/kudu
++ CDH_KUDU_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/kudu
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/meta/gplextras_env.sh
+ PARCEL_DIRNAME=GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29
+ . /liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/meta/gplextras_env.sh
++ GPLEXTRAS_DIRNAME=GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29
++ '[' -n '' ']'
++ export 'HADOOP_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ HADOOP_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export 'MR2_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ MR2_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export 'HBASE_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ HBASE_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export 'FLUME_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ FLUME_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export JAVA_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
++ JAVA_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
++ '[' -n '' ']'
++ export LD_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/impala/lib
++ LD_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/impala/lib
++ '[' -n '' ']'
++ export 'CDH_SPARK_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/spark-netlib/lib/*'
++ CDH_SPARK_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/spark-netlib/lib/*'
++ '[' -n '' ']'
++ export SPARK_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
++ SPARK_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/meta/spark2_env.sh
+ PARCEL_DIRNAME=SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904
+ . /liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/meta/spark2_env.sh
++ CDH_DIRNAME=SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904
++ export CDH_SPARK2_HOME=/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
++ CDH_SPARK2_HOME=/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
+ locate_cdh_java_home
+ '[' -z /liveperson/jdk8 ']'
+ verify_java_home
+ '[' -z /liveperson/jdk8 ']'
+ echo JAVA_HOME=/liveperson/jdk8
+ . /usr/lib64/cmf/service/common/cdh-default-hadoop
++ [[ -z 5 ]]
++ '[' 5 = 3 ']'
++ '[' 5 = -3 ']'
++ '[' 5 -ge 4 ']'
++ export HADOOP_HOME_WARN_SUPPRESS=true
++ HADOOP_HOME_WARN_SUPPRESS=true
++ export HADOOP_PREFIX=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ HADOOP_PREFIX=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export HADOOP_LIBEXEC_DIR=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec
++ HADOOP_LIBEXEC_DIR=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec
++ export HADOOP_CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
++ HADOOP_CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
++ export HADOOP_COMMON_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ HADOOP_COMMON_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export HADOOP_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ HADOOP_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ export HADOOP_MAPRED_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ HADOOP_MAPRED_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ '[' 5 = 4 ']'
++ '[' 5 = 5 ']'
++ export HADOOP_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ HADOOP_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ replace_pid
++ echo
++ sed 's#{{PID}}#25559#g'
+ export HADOOP_NAMENODE_OPTS=
+ HADOOP_NAMENODE_OPTS=
++ replace_pid -Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh
++ echo -Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh
++ sed 's#{{PID}}#25559#g'
+ export 'HADOOP_DATANODE_OPTS=-Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh'
+ HADOOP_DATANODE_OPTS='-Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh'
++ replace_pid
++ echo
++ sed 's#{{PID}}#25559#g'
+ export HADOOP_SECONDARYNAMENODE_OPTS=
+ HADOOP_SECONDARYNAMENODE_OPTS=
++ replace_pid
++ echo
++ sed 's#{{PID}}#25559#g'
+ export HADOOP_NFS3_OPTS=
+ HADOOP_NFS3_OPTS=
++ replace_pid
++ echo
++ sed 's#{{PID}}#25559#g'
+ export HADOOP_JOURNALNODE_OPTS=
+ HADOOP_JOURNALNODE_OPTS=
+ '[' 5 -ge 4 ']'
+ HDFS_BIN=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs/bin/hdfs
+ export 'HADOOP_OPTS=-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ HADOOP_OPTS='-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ echo 'using /liveperson/jdk8 as JAVA_HOME'
+ echo 'using 5 as CDH_VERSION'
+ echo 'using /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE as CONF_DIR'
+ echo 'using cloudera-scm as SECURE_USER'
+ echo 'using cloudera-scm as SECURE_GROUP'
+ set_hadoop_classpath
+ set_classpath_in_var HADOOP_CLASSPATH
+ '[' -z HADOOP_CLASSPATH ']'
+ [[ -n /usr/share/cmf ]]
++ find /usr/share/cmf/lib/plugins -maxdepth 1 -name '*.jar'
++ tr '\n' :
+ ADD_TO_CP=/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:
+ [[ -n navigator/cdh57 ]]
+ for DIR in '$CM_ADD_TO_CP_DIRS'
++ find /usr/share/cmf/lib/plugins/navigator/cdh57 -maxdepth 1 -name '*.jar'
++ tr '\n' :
+ PLUGIN=/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:
+ ADD_TO_CP=/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:
+ eval 'OLD_VALUE=$HADOOP_CLASSPATH'
++ OLD_VALUE='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ NEW_VALUE='/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ export 'HADOOP_CLASSPATH=/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ HADOOP_CLASSPATH='/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ set -x
+ replace_conf_dir
+ echo CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
+ echo CMF_CONF_DIR=/etc/cloudera-scm-agent
+ EXCLUDE_CMF_FILES=('cloudera-config.sh' 'httpfs.sh' 'hue.sh' 'impala.sh' 'sqoop.sh' 'supervisor.conf' 'config.zip' 'proc.json' '*.log' '*.keytab' '*jceks')
++ printf '! -name %s ' cloudera-config.sh httpfs.sh hue.sh impala.sh sqoop.sh supervisor.conf config.zip proc.json '*.log' hdfs.keytab '*jceks'
+ find /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE -type f '!' -path '/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE/logs/*' '!' -name cloudera-config.sh '!' -name httpfs.sh '!' -name hue.sh '!' -name impala.sh '!' -name sqoop.sh '!' -name supervisor.conf '!' -name config.zip '!' -name proc.json '!' -name '*.log' '!' -name hdfs.keytab '!' -name '*jceks' -exec perl -pi -e 's#{{CMF_CONF_DIR}}#/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE#g' '{}' ';'
+ make_scripts_executable
+ find /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE -regex '.*\.\(py\|sh\)$' -exec chmod u+x '{}' ';'
+ '[' DATANODE_MAX_LOCKED_MEMORY '!=' '' ']'
+ ulimit -l
+ export HADOOP_IDENT_STRING=hdfs
+ HADOOP_IDENT_STRING=hdfs
+ '[' -n true ']'
+ '[' 5 -ge 4 ']'
+ export HADOOP_SECURE_DN_USER=cloudera-scm
+ HADOOP_SECURE_DN_USER=cloudera-scm
+ echo 'using cloudera-scm as HADOOP_SECURE_DN_USER'
+ set_jsvc_home
+ [[ ! -e /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils/jsvc ]]
+ echo 'using /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils as JSVC_HOME'
+ chown -R cloudera-scm:cloudera-scm /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
+ '[' mkdir '!=' datanode ']'
+ acquire_kerberos_tgt hdfs.keytab
+ '[' -z hdfs.keytab ']'
+ '[' -n '' ']'
+ '[' validate-writable-empty-dirs = datanode ']'
+ '[' file-operation = datanode ']'
+ '[' bootstrap = datanode ']'
+ '[' failover = datanode ']'
+ '[' transition-to-active = datanode ']'
+ '[' initializeSharedEdits = datanode ']'
+ '[' initialize-znode = datanode ']'
+ '[' format-namenode = datanode ']'
+ '[' monitor-decommission = datanode ']'
+ '[' jnSyncWait = datanode ']'
+ '[' nnRpcWait = datanode ']'
+ '[' -safemode = '' -a get = '' ']'
+ '[' monitor-upgrade = datanode ']'
+ '[' finalize-upgrade = datanode ']'
+ '[' rolling-upgrade-prepare = datanode ']'
+ '[' rolling-upgrade-finalize = datanode ']'
+ '[' nnDnLiveWait = datanode ']'
+ '[' monitor-offline = datanode ']'
+ '[' refresh-datanode = datanode ']'
+ '[' mkdir = datanode ']'
+ '[' nfs3 = datanode ']'
+ '[' namenode = datanode -o secondarynamenode = datanode -o datanode = datanode ']'
+ HADOOP_OPTS='-Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ export 'HADOOP_OPTS=-Dhdfs.audit.logger=INFO,RFAAUDIT -Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ HADOOP_OPTS='-Dhdfs.audit.logger=INFO,RFAAUDIT -Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ '[' namenode = datanode -a rollingUpgrade = '' ']'
+ exec /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs/bin/hdfs --config /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE datanode
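One detail in this trace worth flagging (as a reading, not a confirmed diagnosis): SECURE_USER and HADOOP_SECURE_DN_USER are both cloudera-scm rather than root/hdfs. With the privileged ports 1004/1006 from the question, jsvc must start as root and then drop privileges; an agent running in single-user mode as cloudera-scm cannot bind those ports, which would trigger exactly the checkSecureConfig failure quoted at the top of the thread. A quick check:

ps -o user= -C cloudera-scm-agent   # prints root unless the agent runs in single-user mode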
Tue Oct 31 18:04:04 EDT 2017
+ source_parcel_environment
+ '[' '!' -z /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/meta/cdh_env.sh:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/meta/gplextras_env.sh:/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/meta/spark2_env.sh ']'
+ OLD_IFS=' 	
'
+ IFS=:
+ SCRIPT_ARRAY=($SCM_DEFINES_SCRIPTS)
+ DIRNAME_ARRAY=($PARCEL_DIRNAMES)
+ IFS=' 	
'
+ COUNT=3
++ seq 1 3
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/meta/cdh_env.sh
+ PARCEL_DIRNAME=CDH-5.13.0-1.cdh5.13.0.p0.29
+ . /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/meta/cdh_env.sh
++ CDH_DIRNAME=CDH-5.13.0-1.cdh5.13.0.p0.29
++ export CDH_HADOOP_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ CDH_HADOOP_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export CDH_MR1_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-0.20-mapreduce
++ CDH_MR1_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-0.20-mapreduce
++ export CDH_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ CDH_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ export CDH_HTTPFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-httpfs
++ CDH_HTTPFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-httpfs
++ export CDH_MR2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ CDH_MR2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ export CDH_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ CDH_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ export CDH_HBASE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase
++ CDH_HBASE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase
++ export CDH_ZOOKEEPER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/zookeeper
++ CDH_ZOOKEEPER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/zookeeper
++ export CDH_HIVE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive
++ CDH_HIVE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive
++ export CDH_HUE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hue
++ CDH_HUE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hue
++ export CDH_OOZIE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/oozie
++ CDH_OOZIE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/oozie
++ export CDH_HUE_PLUGINS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ CDH_HUE_PLUGINS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export CDH_FLUME_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/flume-ng
++ CDH_FLUME_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/flume-ng
++ export CDH_PIG_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/pig
++ CDH_PIG_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/pig
++ export CDH_HCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive-hcatalog
++ CDH_HCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive-hcatalog
++ export CDH_SQOOP2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sqoop2
++ CDH_SQOOP2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sqoop2
++ export CDH_LLAMA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/llama
++ CDH_LLAMA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/llama
++ export CDH_SENTRY_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sentry
++ CDH_SENTRY_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sentry
++ export TOMCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-tomcat
++ TOMCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-tomcat
++ export JSVC_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils
++ JSVC_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils
++ export CDH_HADOOP_BIN=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/bin/hadoop
++ CDH_HADOOP_BIN=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/bin/hadoop
++ export CDH_IMPALA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/impala
++ CDH_IMPALA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/impala
++ export CDH_SOLR_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/solr
++ CDH_SOLR_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/solr
++ export CDH_HBASE_INDEXER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase-solr
++ CDH_HBASE_INDEXER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase-solr
++ export SEARCH_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/search
++ SEARCH_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/search
++ export CDH_SPARK_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/spark
++ CDH_SPARK_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/spark
++ export WEBHCAT_DEFAULT_XML=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ WEBHCAT_DEFAULT_XML=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ export CDH_KMS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-kms
++ CDH_KMS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-kms
++ export CDH_PARQUET_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/parquet
++ CDH_PARQUET_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/parquet
++ export CDH_AVRO_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/avro
++ CDH_AVRO_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/avro
++ export CDH_KUDU_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/kudu
++ CDH_KUDU_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/kudu
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/meta/gplextras_env.sh
+ PARCEL_DIRNAME=GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29
+ . /liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/meta/gplextras_env.sh
++ GPLEXTRAS_DIRNAME=GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29
++ '[' -n '' ']'
++ export 'HADOOP_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ HADOOP_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export 'MR2_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ MR2_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export 'HBASE_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ HBASE_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export 'FLUME_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ FLUME_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export JAVA_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
++ JAVA_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
++ '[' -n '' ']'
++ export LD_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/impala/lib
++ LD_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/impala/lib
++ '[' -n '' ']'
++ export 'CDH_SPARK_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/spark-netlib/lib/*'
++ CDH_SPARK_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/spark-netlib/lib/*'
++ '[' -n '' ']'
++ export SPARK_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
++ SPARK_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/meta/spark2_env.sh
+ PARCEL_DIRNAME=SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904
+ . /liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/meta/spark2_env.sh
++ CDH_DIRNAME=SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904
++ export CDH_SPARK2_HOME=/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
++ CDH_SPARK2_HOME=/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
+ locate_cdh_java_home
+ '[' -z /liveperson/jdk8 ']'
+ verify_java_home
+ '[' -z /liveperson/jdk8 ']'
+ echo JAVA_HOME=/liveperson/jdk8
+ . /usr/lib64/cmf/service/common/cdh-default-hadoop
++ [[ -z 5 ]]
++ '[' 5 = 3 ']'
++ '[' 5 = -3 ']'
++ '[' 5 -ge 4 ']'
++ export HADOOP_HOME_WARN_SUPPRESS=true
++ HADOOP_HOME_WARN_SUPPRESS=true
++ export HADOOP_PREFIX=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ HADOOP_PREFIX=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export HADOOP_LIBEXEC_DIR=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec
++ HADOOP_LIBEXEC_DIR=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec
++ export HADOOP_CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
++ HADOOP_CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
++ export HADOOP_COMMON_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ HADOOP_COMMON_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export HADOOP_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ HADOOP_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ export HADOOP_MAPRED_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ HADOOP_MAPRED_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ '[' 5 = 4 ']'
++ '[' 5 = 5 ']'
++ export HADOOP_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ HADOOP_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ replace_pid
++ echo
++ sed 's#{{PID}}#25751#g'
+ export HADOOP_NAMENODE_OPTS=
+ HADOOP_NAMENODE_OPTS=
++ replace_pid -Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh
++ echo -Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh
++ sed 's#{{PID}}#25751#g'
+ export 'HADOOP_DATANODE_OPTS=-Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh'
+ HADOOP_DATANODE_OPTS='-Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh'
++ replace_pid
++ echo
++ sed 's#{{PID}}#25751#g'
+ export HADOOP_SECONDARYNAMENODE_OPTS=
+ HADOOP_SECONDARYNAMENODE_OPTS=
++ replace_pid
++ echo
++ sed 's#{{PID}}#25751#g'
+ export HADOOP_NFS3_OPTS=
+ HADOOP_NFS3_OPTS=
++ replace_pid
++ echo
++ sed 's#{{PID}}#25751#g'
+ export HADOOP_JOURNALNODE_OPTS=
+ HADOOP_JOURNALNODE_OPTS=
+ '[' 5 -ge 4 ']'
+ HDFS_BIN=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs/bin/hdfs
+ export 'HADOOP_OPTS=-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ HADOOP_OPTS='-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ echo 'using /liveperson/jdk8 as JAVA_HOME'
+ echo 'using 5 as CDH_VERSION'
+ echo 'using /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE as CONF_DIR'
+ echo 'using cloudera-scm as SECURE_USER'
+ echo 'using cloudera-scm as SECURE_GROUP'
+ set_hadoop_classpath
+ set_classpath_in_var HADOOP_CLASSPATH
+ '[' -z HADOOP_CLASSPATH ']'
+ [[ -n /usr/share/cmf ]]
++ find /usr/share/cmf/lib/plugins -maxdepth 1 -name '*.jar'
++ tr '\n' :
+ ADD_TO_CP=/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:
+ [[ -n navigator/cdh57 ]]
+ for DIR in '$CM_ADD_TO_CP_DIRS'
++ find /usr/share/cmf/lib/plugins/navigator/cdh57 -maxdepth 1 -name '*.jar'
++ tr '\n' :
+ PLUGIN=/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:
+ ADD_TO_CP=/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:
+ eval 'OLD_VALUE=$HADOOP_CLASSPATH'
++ OLD_VALUE='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ NEW_VALUE='/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ export 'HADOOP_CLASSPATH=/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ HADOOP_CLASSPATH='/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ set -x
+ replace_conf_dir
+ echo CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
+ echo CMF_CONF_DIR=/etc/cloudera-scm-agent
+ EXCLUDE_CMF_FILES=('cloudera-config.sh' 'httpfs.sh' 'hue.sh' 'impala.sh' 'sqoop.sh' 'supervisor.conf' 'config.zip' 'proc.json' '*.log' '*.keytab' '*jceks')
++ printf '! -name %s ' cloudera-config.sh httpfs.sh hue.sh impala.sh sqoop.sh supervisor.conf config.zip proc.json '*.log' hdfs.keytab '*jceks'
+ find /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE -type f '!' -path '/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE/logs/*' '!' -name cloudera-config.sh '!' -name httpfs.sh '!' -name hue.sh '!' -name impala.sh '!' -name sqoop.sh '!' -name supervisor.conf '!' -name config.zip '!' -name proc.json '!' -name '*.log' '!' -name hdfs.keytab '!' -name '*jceks' -exec perl -pi -e 's#{{CMF_CONF_DIR}}#/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE#g' '{}' ';'
+ make_scripts_executable
+ find /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE -regex '.*\.\(py\|sh\)$' -exec chmod u+x '{}' ';'
+ '[' DATANODE_MAX_LOCKED_MEMORY '!=' '' ']'
+ ulimit -l
+ export HADOOP_IDENT_STRING=hdfs
+ HADOOP_IDENT_STRING=hdfs
+ '[' -n true ']'
+ '[' 5 -ge 4 ']'
+ export HADOOP_SECURE_DN_USER=cloudera-scm
+ HADOOP_SECURE_DN_USER=cloudera-scm
+ echo 'using cloudera-scm as HADOOP_SECURE_DN_USER'
+ set_jsvc_home
+ [[ ! -e /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils/jsvc ]]
+ echo 'using /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils as JSVC_HOME'
+ chown -R cloudera-scm:cloudera-scm /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
+ '[' mkdir '!=' datanode ']'
+ acquire_kerberos_tgt hdfs.keytab
+ '[' -z hdfs.keytab ']'
+ '[' -n '' ']'
+ '[' validate-writable-empty-dirs = datanode ']'
+ '[' file-operation = datanode ']'
+ '[' bootstrap = datanode ']'
+ '[' failover = datanode ']'
+ '[' transition-to-active = datanode ']'
+ '[' initializeSharedEdits = datanode ']'
+ '[' initialize-znode = datanode ']'
+ '[' format-namenode = datanode ']'
+ '[' monitor-decommission = datanode ']'
+ '[' jnSyncWait = datanode ']'
+ '[' nnRpcWait = datanode ']'
+ '[' -safemode = '' -a get = '' ']'
+ '[' monitor-upgrade = datanode ']'
+ '[' finalize-upgrade = datanode ']'
+ '[' rolling-upgrade-prepare = datanode ']'
+ '[' rolling-upgrade-finalize = datanode ']'
+ '[' nnDnLiveWait = datanode ']'
+ '[' monitor-offline = datanode ']'
+ '[' refresh-datanode = datanode ']'
+ '[' mkdir = datanode ']'
+ '[' nfs3 = datanode ']'
+ '[' namenode = datanode -o secondarynamenode = datanode -o datanode = datanode ']'
+ HADOOP_OPTS='-Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ export 'HADOOP_OPTS=-Dhdfs.audit.logger=INFO,RFAAUDIT -Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ HADOOP_OPTS='-Dhdfs.audit.logger=INFO,RFAAUDIT -Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ '[' namenode = datanode -a rollingUpgrade = '' ']'
+ exec /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs/bin/hdfs --config /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE datanode
Tue Oct 31 18:04:09 EDT 2017
+ source_parcel_environment
+ '[' '!' -z /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/meta/cdh_env.sh:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/meta/gplextras_env.sh:/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/meta/spark2_env.sh ']'
+ OLD_IFS=' 	
'
+ IFS=:
+ SCRIPT_ARRAY=($SCM_DEFINES_SCRIPTS)
+ DIRNAME_ARRAY=($PARCEL_DIRNAMES)
+ IFS=' 	
'
+ COUNT=3
++ seq 1 3
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/meta/cdh_env.sh
+ PARCEL_DIRNAME=CDH-5.13.0-1.cdh5.13.0.p0.29
+ . /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/meta/cdh_env.sh
++ CDH_DIRNAME=CDH-5.13.0-1.cdh5.13.0.p0.29
++ export CDH_HADOOP_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ CDH_HADOOP_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export CDH_MR1_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-0.20-mapreduce
++ CDH_MR1_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-0.20-mapreduce
++ export CDH_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ CDH_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ export CDH_HTTPFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-httpfs
++ CDH_HTTPFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-httpfs
++ export CDH_MR2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ CDH_MR2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ export CDH_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ CDH_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ export CDH_HBASE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase
++ CDH_HBASE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase
++ export CDH_ZOOKEEPER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/zookeeper
++ CDH_ZOOKEEPER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/zookeeper
++ export CDH_HIVE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive
++ CDH_HIVE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive
++ export CDH_HUE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hue
++ CDH_HUE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hue
++ export CDH_OOZIE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/oozie
++ CDH_OOZIE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/oozie
++ export CDH_HUE_PLUGINS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ CDH_HUE_PLUGINS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export CDH_FLUME_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/flume-ng
++ CDH_FLUME_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/flume-ng
++ export CDH_PIG_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/pig
++ CDH_PIG_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/pig
++ export CDH_HCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive-hcatalog
++ CDH_HCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive-hcatalog
++ export CDH_SQOOP2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sqoop2
++ CDH_SQOOP2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sqoop2
++ export CDH_LLAMA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/llama
++ CDH_LLAMA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/llama
++ export CDH_SENTRY_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sentry
++ CDH_SENTRY_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sentry
++ export TOMCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-tomcat
++ TOMCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-tomcat
++ export JSVC_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils
++ JSVC_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils
++ export CDH_HADOOP_BIN=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/bin/hadoop
++ CDH_HADOOP_BIN=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/bin/hadoop
++ export CDH_IMPALA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/impala
++ CDH_IMPALA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/impala
++ export CDH_SOLR_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/solr
++ CDH_SOLR_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/solr
++ export CDH_HBASE_INDEXER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase-solr
++ CDH_HBASE_INDEXER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase-solr
++ export SEARCH_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/search
++ SEARCH_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/search
++ export CDH_SPARK_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/spark
++ CDH_SPARK_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/spark
++ export WEBHCAT_DEFAULT_XML=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ WEBHCAT_DEFAULT_XML=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ export CDH_KMS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-kms
++ CDH_KMS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-kms
++ export CDH_PARQUET_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/parquet
++ CDH_PARQUET_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/parquet
++ export CDH_AVRO_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/avro
++ CDH_AVRO_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/avro
++ export CDH_KUDU_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/kudu
++ CDH_KUDU_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/kudu
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/meta/gplextras_env.sh
+ PARCEL_DIRNAME=GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29
+ . /liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/meta/gplextras_env.sh
++ GPLEXTRAS_DIRNAME=GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29
++ '[' -n '' ']'
++ export 'HADOOP_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ HADOOP_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export 'MR2_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ MR2_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export 'HBASE_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ HBASE_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export 'FLUME_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ FLUME_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export JAVA_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
++ JAVA_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
++ '[' -n '' ']'
++ export LD_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/impala/lib
++ LD_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/impala/lib
++ '[' -n '' ']'
++ export 'CDH_SPARK_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/spark-netlib/lib/*'
++ CDH_SPARK_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/spark-netlib/lib/*'
++ '[' -n '' ']'
++ export SPARK_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
++ SPARK_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/meta/spark2_env.sh
+ PARCEL_DIRNAME=SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904
+ . /liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/meta/spark2_env.sh
++ CDH_DIRNAME=SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904
++ export CDH_SPARK2_HOME=/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
++ CDH_SPARK2_HOME=/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
+ locate_cdh_java_home
+ '[' -z /liveperson/jdk8 ']'
+ verify_java_home
+ '[' -z /liveperson/jdk8 ']'
+ echo JAVA_HOME=/liveperson/jdk8
+ . /usr/lib64/cmf/service/common/cdh-default-hadoop
++ [[ -z 5 ]]
++ '[' 5 = 3 ']'
++ '[' 5 = -3 ']'
++ '[' 5 -ge 4 ']'
++ export HADOOP_HOME_WARN_SUPPRESS=true
++ HADOOP_HOME_WARN_SUPPRESS=true
++ export HADOOP_PREFIX=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ HADOOP_PREFIX=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export HADOOP_LIBEXEC_DIR=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec
++ HADOOP_LIBEXEC_DIR=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec
++ export HADOOP_CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
++ HADOOP_CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
++ export HADOOP_COMMON_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ HADOOP_COMMON_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export HADOOP_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ HADOOP_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ export HADOOP_MAPRED_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ HADOOP_MAPRED_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ '[' 5 = 4 ']'
++ '[' 5 = 5 ']'
++ export HADOOP_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ HADOOP_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ replace_pid
++ echo
++ sed 's#{{PID}}#25899#g'
+ export HADOOP_NAMENODE_OPTS=
+ HADOOP_NAMENODE_OPTS=
++ replace_pid -Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh
++ echo -Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh
++ sed 's#{{PID}}#25899#g'
+ export 'HADOOP_DATANODE_OPTS=-Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh'
+ HADOOP_DATANODE_OPTS='-Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh'
++ replace_pid
++ echo
++ sed 's#{{PID}}#25899#g'
+ export HADOOP_SECONDARYNAMENODE_OPTS=
+ HADOOP_SECONDARYNAMENODE_OPTS=
++ replace_pid
++ echo
++ sed 's#{{PID}}#25899#g'
+ export HADOOP_NFS3_OPTS=
+ HADOOP_NFS3_OPTS=
++ replace_pid
++ echo
++ sed 's#{{PID}}#25899#g'
+ export HADOOP_JOURNALNODE_OPTS=
+ HADOOP_JOURNALNODE_OPTS=
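The replace_pid calls above let JVM option strings carry a {{PID}} token (handy for per-process heap-dump or log paths); the wrapper substitutes the launcher's PID (25899 here) with sed before exporting each *_OPTS variable. A minimal sketch, assuming the function simply echoes its arguments through sed:

# replace_pid (sketch): substitute this shell's PID for any {{PID}} token.
replace_pid() {
  echo "$@" | sed "s#{{PID}}#$$#g"
}
export HADOOP_DATANODE_OPTS="$(replace_pid -Xms4294967296 -Xmx4294967296)"

None of the option strings in this run actually contain {{PID}}, so the sed is a pass-through.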
+ '[' 5 -ge 4 ']'
+ HDFS_BIN=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs/bin/hdfs
+ export 'HADOOP_OPTS=-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ HADOOP_OPTS='-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ echo 'using /liveperson/jdk8 as JAVA_HOME'
+ echo 'using 5 as CDH_VERSION'
+ echo 'using /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE as CONF_DIR'
+ echo 'using cloudera-scm as SECURE_USER'
+ echo 'using cloudera-scm as SECURE_GROUP'
+ set_hadoop_classpath
+ set_classpath_in_var HADOOP_CLASSPATH
+ '[' -z HADOOP_CLASSPATH ']'
+ [[ -n /usr/share/cmf ]]
++ find /usr/share/cmf/lib/plugins -maxdepth 1 -name '*.jar'
++ tr '\n' :
+ ADD_TO_CP=/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:
+ [[ -n navigator/cdh57 ]]
+ for DIR in '$CM_ADD_TO_CP_DIRS'
++ find /usr/share/cmf/lib/plugins/navigator/cdh57 -maxdepth 1 -name '*.jar'
++ tr '\n' :
+ PLUGIN=/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:
+ ADD_TO_CP=/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:
+ eval 'OLD_VALUE=$HADOOP_CLASSPATH'
++ OLD_VALUE='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ NEW_VALUE='/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ export 'HADOOP_CLASSPATH=/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ HADOOP_CLASSPATH='/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
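set_classpath_in_var, expanded above, prepends the Cloudera Manager plugin jars (event publisher, TT instrumentation, Navigator audit plugin) to whatever HADOOP_CLASSPATH the parcel scripts assembled; the find | tr pipeline joins the jar paths with colons. The equivalent standalone commands:

# Prepend the CM plugin jars to the existing classpath (sketch).
ADD_TO_CP=$(find /usr/share/cmf/lib/plugins -maxdepth 1 -name '*.jar' | tr '\n' :)
export HADOOP_CLASSPATH="${ADD_TO_CP}${HADOOP_CLASSPATH}"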
+ set -x
+ replace_conf_dir
+ echo CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
+ echo CMF_CONF_DIR=/etc/cloudera-scm-agent
+ EXCLUDE_CMF_FILES=('cloudera-config.sh' 'httpfs.sh' 'hue.sh' 'impala.sh' 'sqoop.sh' 'supervisor.conf' 'config.zip' 'proc.json' '*.log' '*.keytab' '*jceks')
++ printf '! -name %s ' cloudera-config.sh httpfs.sh hue.sh impala.sh sqoop.sh supervisor.conf config.zip proc.json '*.log' hdfs.keytab '*jceks'
+ find /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE -type f '!' -path '/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE/logs/*' '!' -name cloudera-config.sh '!' -name httpfs.sh '!' -name hue.sh '!' -name impala.sh '!' -name sqoop.sh '!' -name supervisor.conf '!' -name config.zip '!' -name proc.json '!' -name '*.log' '!' -name hdfs.keytab '!' -name '*jceks' -exec perl -pi -e 's#{{CMF_CONF_DIR}}#/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE#g' '{}' ';'
+ make_scripts_executable
+ find /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE -regex '.*\.\(py\|sh\)$' -exec chmod u+x '{}' ';'
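The two find commands above are housekeeping on the per-process config directory: the first rewrites every {{CMF_CONF_DIR}} placeholder in the generated configs to the process directory itself (skipping keytabs, logs, jceks stores and the agent's own scripts), the second marks generated .py/.sh files executable. Reduced to standalone form (exclusion list abbreviated relative to the real script):

CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
# Rewrite the template placeholder in every eligible config file.
find "$CONF_DIR" -type f ! -name '*.keytab' ! -name '*.log' ! -name '*jceks' \
  -exec perl -pi -e "s#{{CMF_CONF_DIR}}#$CONF_DIR#g" {} \;
# Make generated helper scripts executable.
find "$CONF_DIR" -regex '.*\.\(py\|sh\)$' -exec chmod u+x {} \;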
+ '[' DATANODE_MAX_LOCKED_MEMORY '!=' '' ']'
+ ulimit -l
+ export HADOOP_IDENT_STRING=hdfs
+ HADOOP_IDENT_STRING=hdfs
+ '[' -n true ']'
+ '[' 5 -ge 4 ']'
+ export HADOOP_SECURE_DN_USER=cloudera-scm
+ HADOOP_SECURE_DN_USER=cloudera-scm
+ echo 'using cloudera-scm as HADOOP_SECURE_DN_USER'
+ set_jsvc_home
+ [[ ! -e /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils/jsvc ]]
+ echo 'using /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils as JSVC_HOME'
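These lines are the heart of the secure-mode launch: HADOOP_SECURE_DN_USER plus JSVC_HOME tell bin/hdfs to start the DataNode through jsvc as root, bind the privileged ports, then drop to the named user. Note that this trace drops to cloudera-scm rather than the more usual hdfs. On a hand-rolled (non-CM) install the same knobs would live in hadoop-env.sh; a sketch with illustrative values:

# hadoop-env.sh (sketch): secure DataNode via jsvc and privileged ports.
export HADOOP_SECURE_DN_USER=hdfs         # user jsvc drops to after binding
export JSVC_HOME=/usr/lib/bigtop-utils    # directory containing the jsvc binary
export HADOOP_SECURE_DN_PID_DIR=/var/run/hadoop-hdfs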
+ chown -R cloudera-scm:cloudera-scm /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
+ '[' mkdir '!=' datanode ']'
+ acquire_kerberos_tgt hdfs.keytab
+ '[' -z hdfs.keytab ']'
+ '[' -n '' ']'
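acquire_kerberos_tgt turns out to be a no-op here: the keytab argument is non-empty, but the second test ('[ -n "" ]') shows no principal was handed to the wrapper, so it never runs kinit; the daemon authenticates itself from hdfs.keytab at startup instead. A sketch of the guard, with SCM_KERBEROS_PRINCIPAL as a hypothetical name for whatever variable the real script checks:

# acquire_kerberos_tgt (sketch): kinit only when a principal was supplied.
acquire_kerberos_tgt() {
  local keytab="$1"
  [ -z "$keytab" ] && return 1
  if [ -n "$SCM_KERBEROS_PRINCIPAL" ]; then   # empty in this trace, so skipped
    kinit -kt "$CONF_DIR/$keytab" "$SCM_KERBEROS_PRINCIPAL"
  fi
}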
+ '[' validate-writable-empty-dirs = datanode ']'
+ '[' file-operation = datanode ']'
+ '[' bootstrap = datanode ']'
+ '[' failover = datanode ']'
+ '[' transition-to-active = datanode ']'
+ '[' initializeSharedEdits = datanode ']'
+ '[' initialize-znode = datanode ']'
+ '[' format-namenode = datanode ']'
+ '[' monitor-decommission = datanode ']'
+ '[' jnSyncWait = datanode ']'
+ '[' nnRpcWait = datanode ']'
+ '[' -safemode = '' -a get = '' ']'
+ '[' monitor-upgrade = datanode ']'
+ '[' finalize-upgrade = datanode ']'
+ '[' rolling-upgrade-prepare = datanode ']'
+ '[' rolling-upgrade-finalize = datanode ']'
+ '[' nnDnLiveWait = datanode ']'
+ '[' monitor-offline = datanode ']'
+ '[' refresh-datanode = datanode ']'
+ '[' mkdir = datanode ']'
+ '[' nfs3 = datanode ']'
+ '[' namenode = datanode -o secondarynamenode = datanode -o datanode = datanode ']'
+ HADOOP_OPTS='-Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ export 'HADOOP_OPTS=-Dhdfs.audit.logger=INFO,RFAAUDIT -Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ HADOOP_OPTS='-Dhdfs.audit.logger=INFO,RFAAUDIT -Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ '[' namenode = datanode -a rollingUpgrade = '' ']'
+ exec /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs/bin/hdfs --config /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE datanode
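The exec above is the hand-off to bin/hdfs, and the secure-config check runs inside that JVM. A secure DataNode must run in exactly one of two modes: privileged ports plus a root jsvc launch (the mode this trace sets up via jsvc and HADOOP_SECURE_DN_USER), or SASL data-transfer protection plus HTTPS on non-privileged ports; mixing the two is rejected. For comparison, the SASL variant uses the standard properties below (values illustrative, and HTTPS_ONLY additionally requires TLS keystores to be configured):

dfs.data.transfer.protection authentication
dfs.http.policy HTTPS_ONLY
dfs.datanode.address 0.0.0.0:50010
dfs.datanode.http.address 0.0.0.0:50075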
Tue Oct 31 18:04:14 EDT 2017
(The stderr log then repeats the identical startup trace for a second launch attempt; only the timestamp above and the substituted PID, 26047 instead of 25899, differ from the run already shown. It again ends in the same exec of bin/hdfs datanode.)