12-05-2017 07:37 AM
Hi guys,
I'm trying to get all finished applications (that is, anything not RUNNING: killed, succeeded, etc.) through the Cloudera Manager API, but with no success. Can anyone tell me what I'm missing in the command below? I'm collecting the metrics every 5 minutes.
STARTDATE=`date -d " -5 minute" "+%FT%T"`
result=`curl -u 'admin' : 'admin' 'http://cloudera_manager_host:7180/api/v11/clusters/cluster/services/yarn/yarnApplications?from='$STARTDATE'&limit=1000&filters(state!=RUNNING)'`
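For reference, a minimal corrected sketch. Two things stand out: with -u 'admin' : 'admin' the spaces make ':' a separate argument, so curl never receives the credentials as one user:password token, and the endpoint expects a filter= query parameter rather than filters(...). The filter attribute name and grammar ("state != RUNNING") are an assumption based on the CM API docs; check what your version supports.
# Hedged sketch: single 'user:password' token, and -G with --data-urlencode so
# the filter expression (spaces, '!=') is URL-encoded before being appended to
# the query string.
STARTDATE=$(date -d '-5 minute' '+%FT%T')
result=$(curl -u 'admin:admin' -G \
  --data-urlencode "from=${STARTDATE}" \
  --data-urlencode 'limit=1000' \
  --data-urlencode 'filter=state != RUNNING' \
  'http://cloudera_manager_host:7180/api/v11/clusters/cluster/services/yarn/yarnApplications')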
Labels:
- Apache YARN
- Cloudera Manager
11-05-2017 06:52 AM
@mbigelow Hi mbigelow, I tried to use LIKE in the CM API with no success. I have one like this:
curl -u 'xxxx':'xxxx' 'http://CM_server.domain.com:7180/api/v11/clusters/cluster/services/impala/impalaQueries?from=2017-10-10T00:00:00&to2017-10-11T00:00:00&limit=1000&filter=statement RLIKE ".*fawzea.*"' >> f.json
Can you help?
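One likely culprit, as a sketch: the to parameter above is missing its '=' (to2017-10-11...), so the time window is wrong before the filter even applies, and the RLIKE expression contains spaces and quotes that should be URL-encoded. Assuming the same endpoint and attributes as the original command:
# Hedged sketch: note 'to=' (the original had 'to2017-...'), and -G with
# --data-urlencode so curl encodes the RLIKE filter. Whether this CM version
# accepts RLIKE in impalaQueries filters is carried over from the post, not verified.
curl -u 'xxxx:xxxx' -G \
  --data-urlencode 'from=2017-10-10T00:00:00' \
  --data-urlencode 'to=2017-10-11T00:00:00' \
  --data-urlencode 'limit=1000' \
  --data-urlencode 'filter=statement RLIKE ".*fawzea.*"' \
  'http://CM_server.domain.com:7180/api/v11/clusters/cluster/services/impala/impalaQueries' >> f.json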
10-31-2017 10:19 PM
@Geoffrey Shelton Okot
[root@aopr-dhc001 ~]# kinit -V -J-Dsun.security.krb5.debug=true -J-Djava.security.debug=true -k -t cloudera-scm@LPDOMAIN.COM.ktab cloudera-scm@LPDOMAIN.COM.ktab_Principal
kinit: invalid option -- 'J'
kinit: invalid option -- '-'
kinit: invalid option -- 'D'
Bad start time value un.security.krb5.debug=true
kinit: invalid option -- 'J'
kinit: invalid option -- '-'
kinit: invalid option -- 'D'
kinit: invalid option -- 'j'
kinit: invalid option -- '.'
Bad start time value ecurity.debug=true
Usage: kinit [-V] [-l lifetime] [-s start_time]
[-r renewable_life] [-f | -F] [-p | -P] -n [-a | -A] [-C]
[-E]
[-v] [-R] [-k [-t keytab_file]] [-c cachename]
[-S service_name] [-T ticket_armor_cache]
[-X <attribute>[=<value>]] [principal]
options: -V verbose
-l lifetime
-s start time
-r renewable lifetime
-f forwardable
-F not forwardable
-p proxiable
-P not proxiable
-n anonymous
-a include addresses
-A do not include addresses
-v validate
-R renew
-C canonicalize
-E client is enterprise principal name
-k use keytab
-t filename of keytab to use
-c Kerberos 5 cache name
-S service
-T armor credential cache
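The errors above are expected: -J-D... flags belong to Java's kinit tool, and the MIT krb5 kinit invoked here rejects them character by character. A hedged MIT equivalent (the keytab path and principal are guesses based on the command above; substitute the real ones):
# MIT kinit has no -J option; Kerberos tracing is enabled via KRB5_TRACE instead.
# /path/to/cloudera-scm.keytab is a placeholder, not the actual file name.
KRB5_TRACE=/dev/stdout kinit -V -k -t /path/to/cloudera-scm.keytab cloudera-scm@LPDOMAIN.COM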
10-31-2017 10:15 PM
Can anyone have a look at these logs and let me know if there is an error that I'm missing?
Tue Oct 31 18:04:01 EDT 2017
JAVA_HOME=/liveperson/jdk8
using /liveperson/jdk8 as JAVA_HOME
using 5 as CDH_VERSION
using /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE as CONF_DIR
using cloudera-scm as SECURE_USER
using cloudera-scm as SECURE_GROUP
CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
CMF_CONF_DIR=/etc/cloudera-scm-agent
1048576
using cloudera-scm as HADOOP_SECURE_DN_USER
using /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils as JSVC_HOME
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
>>> KdcAccessibility: reset
>>> KdcAccessibility: reset
>>> KeyTabInputStream, readName(): LPDOMAIN.COM
>>> KeyTabInputStream, readName(): HTTP
>>> KeyTabInputStream, readName(): aopr-dhc001.lpdomain.com
>>> KeyTab: load() entry length: 77; type: 23
>>> KeyTabInputStream, readName(): LPDOMAIN.COM
>>> KeyTabInputStream, readName(): hdfs
>>> KeyTabInputStream, readName(): aopr-dhc001.lpdomain.com
>>> KeyTab: load() entry length: 77; type: 23
Looking for keys for: hdfs/aopr-dhc001.lpdomain.com@LPDOMAIN.COM
Added key: 23version: 1
Looking for keys for: hdfs/aopr-dhc001.lpdomain.com@LPDOMAIN.COM
Added key: 23version: 1
default etypes for default_tkt_enctypes: 23.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=ropr-mng01.lpdomain.com TCP:88, timeout=5000, number of retries =3, #bytes=161
>>> KDCCommunication: kdc=ropr-mng01.lpdomain.com TCP:88, timeout=5000,Attempt =1, #bytes=161
>>>DEBUG: TCPClient reading 623 bytes
>>> KrbKdcReq send: #bytes read=623
>>> KdcAccessibility: remove ropr-mng01.lpdomain.com
Looking for keys for: hdfs/aopr-dhc001.lpdomain.com@LPDOMAIN.COM
Added key: 23version: 1
>>> EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
>>> KrbAsRep cons in KrbAsReq.getReply hdfs/aopr-dhc001.lpdomain.com
[The same Kerberos debug sequence repeats verbatim at 18:04:04, 18:04:09, and 18:04:14.]
Tue Oct 31 18:04:01 EDT 2017
+ source_parcel_environment
+ '[' '!' -z /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/meta/cdh_env.sh:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/meta/gplextras_env.sh:/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/meta/spark2_env.sh ']'
+ OLD_IFS='
'
+ IFS=:
+ SCRIPT_ARRAY=($SCM_DEFINES_SCRIPTS)
+ DIRNAME_ARRAY=($PARCEL_DIRNAMES)
+ IFS='
'
+ COUNT=3
++ seq 1 3
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/meta/cdh_env.sh
+ PARCEL_DIRNAME=CDH-5.13.0-1.cdh5.13.0.p0.29
+ . /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/meta/cdh_env.sh
++ CDH_DIRNAME=CDH-5.13.0-1.cdh5.13.0.p0.29
++ export CDH_HADOOP_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ CDH_HADOOP_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export CDH_MR1_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-0.20-mapreduce
++ CDH_MR1_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-0.20-mapreduce
++ export CDH_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ CDH_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ export CDH_HTTPFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-httpfs
++ CDH_HTTPFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-httpfs
++ export CDH_MR2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ CDH_MR2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ export CDH_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ CDH_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ export CDH_HBASE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase
++ CDH_HBASE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase
++ export CDH_ZOOKEEPER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/zookeeper
++ CDH_ZOOKEEPER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/zookeeper
++ export CDH_HIVE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive
++ CDH_HIVE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive
++ export CDH_HUE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hue
++ CDH_HUE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hue
++ export CDH_OOZIE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/oozie
++ CDH_OOZIE_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/oozie
++ export CDH_HUE_PLUGINS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ CDH_HUE_PLUGINS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export CDH_FLUME_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/flume-ng
++ CDH_FLUME_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/flume-ng
++ export CDH_PIG_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/pig
++ CDH_PIG_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/pig
++ export CDH_HCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive-hcatalog
++ CDH_HCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hive-hcatalog
++ export CDH_SQOOP2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sqoop2
++ CDH_SQOOP2_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sqoop2
++ export CDH_LLAMA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/llama
++ CDH_LLAMA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/llama
++ export CDH_SENTRY_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sentry
++ CDH_SENTRY_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/sentry
++ export TOMCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-tomcat
++ TOMCAT_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-tomcat
++ export JSVC_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils
++ JSVC_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils
++ export CDH_HADOOP_BIN=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/bin/hadoop
++ CDH_HADOOP_BIN=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/bin/hadoop
++ export CDH_IMPALA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/impala
++ CDH_IMPALA_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/impala
++ export CDH_SOLR_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/solr
++ CDH_SOLR_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/solr
++ export CDH_HBASE_INDEXER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase-solr
++ CDH_HBASE_INDEXER_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hbase-solr
++ export SEARCH_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/search
++ SEARCH_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/search
++ export CDH_SPARK_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/spark
++ CDH_SPARK_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/spark
++ export WEBHCAT_DEFAULT_XML=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ WEBHCAT_DEFAULT_XML=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ export CDH_KMS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-kms
++ CDH_KMS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-kms
++ export CDH_PARQUET_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/parquet
++ CDH_PARQUET_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/parquet
++ export CDH_AVRO_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/avro
++ CDH_AVRO_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/avro
++ export CDH_KUDU_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/kudu
++ CDH_KUDU_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/kudu
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/meta/gplextras_env.sh
+ PARCEL_DIRNAME=GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29
+ . /liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/meta/gplextras_env.sh
++ GPLEXTRAS_DIRNAME=GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29
++ '[' -n '' ']'
++ export 'HADOOP_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ HADOOP_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export 'MR2_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ MR2_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export 'HBASE_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ HBASE_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export 'FLUME_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ FLUME_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
++ '[' -n '' ']'
++ export JAVA_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
++ JAVA_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
++ '[' -n '' ']'
++ export LD_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/impala/lib
++ LD_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/impala/lib
++ '[' -n '' ']'
++ export 'CDH_SPARK_CLASSPATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/spark-netlib/lib/*'
++ CDH_SPARK_CLASSPATH='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/spark-netlib/lib/*'
++ '[' -n '' ']'
++ export SPARK_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
++ SPARK_LIBRARY_PATH=/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/native
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/meta/spark2_env.sh
+ PARCEL_DIRNAME=SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904
+ . /liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/meta/spark2_env.sh
++ CDH_DIRNAME=SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904
++ export CDH_SPARK2_HOME=/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
++ CDH_SPARK2_HOME=/liveperson/hadoop/parcels/SPARK2-2.1.0.cloudera1-1.cdh5.7.0.p0.120904/lib/spark2
+ locate_cdh_java_home
+ '[' -z /liveperson/jdk8 ']'
+ verify_java_home
+ '[' -z /liveperson/jdk8 ']'
+ echo JAVA_HOME=/liveperson/jdk8
+ . /usr/lib64/cmf/service/common/cdh-default-hadoop
++ [[ -z 5 ]]
++ '[' 5 = 3 ']'
++ '[' 5 = -3 ']'
++ '[' 5 -ge 4 ']'
++ export HADOOP_HOME_WARN_SUPPRESS=true
++ HADOOP_HOME_WARN_SUPPRESS=true
++ export HADOOP_PREFIX=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ HADOOP_PREFIX=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export HADOOP_LIBEXEC_DIR=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec
++ HADOOP_LIBEXEC_DIR=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec
++ export HADOOP_CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
++ HADOOP_CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
++ export HADOOP_COMMON_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ HADOOP_COMMON_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop
++ export HADOOP_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ HADOOP_HDFS_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs
++ export HADOOP_MAPRED_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ HADOOP_MAPRED_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-mapreduce
++ '[' 5 = 4 ']'
++ '[' 5 = 5 ']'
++ export HADOOP_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ HADOOP_YARN_HOME=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-yarn
++ replace_pid
++ echo
++ sed 's#{{PID}}#25559#g'
+ export HADOOP_NAMENODE_OPTS=
+ HADOOP_NAMENODE_OPTS=
++ replace_pid -Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh
++ echo -Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh
++ sed 's#{{PID}}#25559#g'
+ export 'HADOOP_DATANODE_OPTS=-Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh'
+ HADOOP_DATANODE_OPTS='-Xms4294967296 -Xmx4294967296 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSParallelRemarkEnabled -XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh'
++ replace_pid
++ echo
++ sed 's#{{PID}}#25559#g'
+ export HADOOP_SECONDARYNAMENODE_OPTS=
+ HADOOP_SECONDARYNAMENODE_OPTS=
++ replace_pid
++ echo
++ sed 's#{{PID}}#25559#g'
+ export HADOOP_NFS3_OPTS=
+ HADOOP_NFS3_OPTS=
++ replace_pid
++ echo
++ sed 's#{{PID}}#25559#g'
+ export HADOOP_JOURNALNODE_OPTS=
+ HADOOP_JOURNALNODE_OPTS=
+ '[' 5 -ge 4 ']'
+ HDFS_BIN=/liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs/bin/hdfs
+ export 'HADOOP_OPTS=-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ HADOOP_OPTS='-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ echo 'using /liveperson/jdk8 as JAVA_HOME'
+ echo 'using 5 as CDH_VERSION'
+ echo 'using /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE as CONF_DIR'
+ echo 'using cloudera-scm as SECURE_USER'
+ echo 'using cloudera-scm as SECURE_GROUP'
+ set_hadoop_classpath
+ set_classpath_in_var HADOOP_CLASSPATH
+ '[' -z HADOOP_CLASSPATH ']'
+ [[ -n /usr/share/cmf ]]
++ find /usr/share/cmf/lib/plugins -maxdepth 1 -name '*.jar'
++ tr '\n' :
+ ADD_TO_CP=/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:
+ [[ -n navigator/cdh57 ]]
+ for DIR in '$CM_ADD_TO_CP_DIRS'
++ find /usr/share/cmf/lib/plugins/navigator/cdh57 -maxdepth 1 -name '*.jar'
++ tr '\n' :
+ PLUGIN=/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:
+ ADD_TO_CP=/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:
+ eval 'OLD_VALUE=$HADOOP_CLASSPATH'
++ OLD_VALUE='/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ NEW_VALUE='/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ export 'HADOOP_CLASSPATH=/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ HADOOP_CLASSPATH='/usr/share/cmf/lib/plugins/event-publish-5.13.0-shaded.jar:/usr/share/cmf/lib/plugins/tt-instrumentation-5.13.0.jar:/usr/share/cmf/lib/plugins/navigator/cdh57/audit-plugin-cdh57-2.12.0-shaded.jar:/liveperson/hadoop/parcels/GPLEXTRAS-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/lib/*'
+ set -x
+ replace_conf_dir
+ echo CONF_DIR=/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
+ echo CMF_CONF_DIR=/etc/cloudera-scm-agent
+ EXCLUDE_CMF_FILES=('cloudera-config.sh' 'httpfs.sh' 'hue.sh' 'impala.sh' 'sqoop.sh' 'supervisor.conf' 'config.zip' 'proc.json' '*.log' '*.keytab' '*jceks')
++ printf '! -name %s ' cloudera-config.sh httpfs.sh hue.sh impala.sh sqoop.sh supervisor.conf config.zip proc.json '*.log' hdfs.keytab '*jceks'
+ find /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE -type f '!' -path '/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE/logs/*' '!' -name cloudera-config.sh '!' -name httpfs.sh '!' -name hue.sh '!' -name impala.sh '!' -name sqoop.sh '!' -name supervisor.conf '!' -name config.zip '!' -name proc.json '!' -name '*.log' '!' -name hdfs.keytab '!' -name '*jceks' -exec perl -pi -e 's#{{CMF_CONF_DIR}}#/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE#g' '{}' ';'
+ make_scripts_executable
+ find /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE -regex '.*\.\(py\|sh\)$' -exec chmod u+x '{}' ';'
+ '[' DATANODE_MAX_LOCKED_MEMORY '!=' '' ']'
+ ulimit -l
+ export HADOOP_IDENT_STRING=hdfs
+ HADOOP_IDENT_STRING=hdfs
+ '[' -n true ']'
+ '[' 5 -ge 4 ']'
+ export HADOOP_SECURE_DN_USER=cloudera-scm
+ HADOOP_SECURE_DN_USER=cloudera-scm
+ echo 'using cloudera-scm as HADOOP_SECURE_DN_USER'
+ set_jsvc_home
+ [[ ! -e /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils/jsvc ]]
+ echo 'using /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils as JSVC_HOME'
+ chown -R cloudera-scm:cloudera-scm /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
+ '[' mkdir '!=' datanode ']'
+ acquire_kerberos_tgt hdfs.keytab
+ '[' -z hdfs.keytab ']'
+ '[' -n '' ']'
+ '[' validate-writable-empty-dirs = datanode ']'
+ '[' file-operation = datanode ']'
+ '[' bootstrap = datanode ']'
+ '[' failover = datanode ']'
+ '[' transition-to-active = datanode ']'
+ '[' initializeSharedEdits = datanode ']'
+ '[' initialize-znode = datanode ']'
+ '[' format-namenode = datanode ']'
+ '[' monitor-decommission = datanode ']'
+ '[' jnSyncWait = datanode ']'
+ '[' nnRpcWait = datanode ']'
+ '[' -safemode = '' -a get = '' ']'
+ '[' monitor-upgrade = datanode ']'
+ '[' finalize-upgrade = datanode ']'
+ '[' rolling-upgrade-prepare = datanode ']'
+ '[' rolling-upgrade-finalize = datanode ']'
+ '[' nnDnLiveWait = datanode ']'
+ '[' monitor-offline = datanode ']'
+ '[' refresh-datanode = datanode ']'
+ '[' mkdir = datanode ']'
+ '[' nfs3 = datanode ']'
+ '[' namenode = datanode -o secondarynamenode = datanode -o datanode = datanode ']'
+ HADOOP_OPTS='-Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ export 'HADOOP_OPTS=-Dhdfs.audit.logger=INFO,RFAAUDIT -Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ HADOOP_OPTS='-Dhdfs.audit.logger=INFO,RFAAUDIT -Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ '[' namenode = datanode -a rollingUpgrade = '' ']'
+ exec /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs/bin/hdfs --config /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE datanode
[The same startup trace repeats verbatim at 18:04:04 (new PID 25751) and again at 18:04:09 (PID 25899), where the paste is cut off. The process is re-launched every few seconds, which suggests the DataNode exits right after the exec and is restarted by the supervisor.]
+ find /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE -type f '!' -path '/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE/logs/*' '!' -name cloudera-config.sh '!' -name httpfs.sh '!' -name hue.sh '!' -name impala.sh '!' -name sqoop.sh '!' -name supervisor.conf '!' -name config.zip '!' -name proc.json '!' -name '*.log' '!' -name hdfs.keytab '!' -name '*jceks' -exec perl -pi -e 's#{{CMF_CONF_DIR}}#/var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE#g' '{}' ';'
+ make_scripts_executable
+ find /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE -regex '.*\.\(py\|sh\)$' -exec chmod u+x '{}' ';'
+ '[' DATANODE_MAX_LOCKED_MEMORY '!=' '' ']'
+ ulimit -l
+ export HADOOP_IDENT_STRING=hdfs
+ HADOOP_IDENT_STRING=hdfs
+ '[' -n true ']'
+ '[' 5 -ge 4 ']'
+ export HADOOP_SECURE_DN_USER=cloudera-scm
+ HADOOP_SECURE_DN_USER=cloudera-scm
+ echo 'using cloudera-scm as HADOOP_SECURE_DN_USER'
+ set_jsvc_home
+ [[ ! -e /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils/jsvc ]]
+ echo 'using /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/bigtop-utils as JSVC_HOME'
+ chown -R cloudera-scm:cloudera-scm /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE
+ '[' mkdir '!=' datanode ']'
+ acquire_kerberos_tgt hdfs.keytab
+ '[' -z hdfs.keytab ']'
+ '[' -n '' ']'
+ '[' validate-writable-empty-dirs = datanode ']'
+ '[' file-operation = datanode ']'
+ '[' bootstrap = datanode ']'
+ '[' failover = datanode ']'
+ '[' transition-to-active = datanode ']'
+ '[' initializeSharedEdits = datanode ']'
+ '[' initialize-znode = datanode ']'
+ '[' format-namenode = datanode ']'
+ '[' monitor-decommission = datanode ']'
+ '[' jnSyncWait = datanode ']'
+ '[' nnRpcWait = datanode ']'
+ '[' -safemode = '' -a get = '' ']'
+ '[' monitor-upgrade = datanode ']'
+ '[' finalize-upgrade = datanode ']'
+ '[' rolling-upgrade-prepare = datanode ']'
+ '[' rolling-upgrade-finalize = datanode ']'
+ '[' nnDnLiveWait = datanode ']'
+ '[' monitor-offline = datanode ']'
+ '[' refresh-datanode = datanode ']'
+ '[' mkdir = datanode ']'
+ '[' nfs3 = datanode ']'
+ '[' namenode = datanode -o secondarynamenode = datanode -o datanode = datanode ']'
+ HADOOP_OPTS='-Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ export 'HADOOP_OPTS=-Dhdfs.audit.logger=INFO,RFAAUDIT -Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ HADOOP_OPTS='-Dhdfs.audit.logger=INFO,RFAAUDIT -Dsecurity.audit.logger=INFO,RFAS -Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true'
+ '[' namenode = datanode -a rollingUpgrade = '' ']'
+ exec /liveperson/hadoop/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop-hdfs/bin/hdfs --config /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE datanode
Tue Oct 31 18:04:14 EDT 2017
(The trace then repeats verbatim for a second DataNode start attempt, identical to the run above except for the new PID 26047, and again ends with the exec of the hdfs datanode command.)
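For reference, since both runs get through acquire_kerberos_tgt hdfs.keytab before the process dies, one sanity check is to inspect the generated keytab and try a manual kinit with it (a minimal sketch; the principal below is a placeholder for the DataNode's actual principal):

klist -kt /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE/hdfs.keytab
kinit -kt /var/run/cloudera-scm-agent/process/3195-hdfs-DATANODE/hdfs.keytab hdfs/<datanode-fqdn>@<REALM>

If the manual kinit fails (for example with an encryption-type or preauthentication error), the problem is in the credentials themselves rather than in the DataNode startup.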
10-29-2017
05:55 PM
@Geoffrey Shelton Okot https://www.cloudera.com/documentation/enterprise/5-6-x/topics/cm_sg_s3_cm_principal.html
10-29-2017
05:28 PM
Hi guys, I really appreciate your quick responses and readiness to help. I've been on this issue for almost a week without success, even after trying all the documentation I could find online. Starting to give up... 😞
10-29-2017
03:06 PM
@Jay SenSharma I ran the command and got the same output as shown, with udp_preference_limit = 1 under [libdefaults].
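For reference, a minimal sketch of where that setting sits in /etc/krb5.conf (the realm below is a placeholder; udp_preference_limit = 1 forces the Kerberos libraries to use TCP instead of UDP, which avoids large Active Directory replies being truncated):

[libdefaults]
  default_realm = EXAMPLE.COM
  udp_preference_limit = 1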
10-29-2017
11:29 AM
@Geoffrey Shelton Okot When you say restart the KDC server, do you mean restarting the Active Directory server? How can I avoid that?
10-29-2017
04:06 AM
The problem was the limit on subdirectories under a specific directory: checking the container folder, I saw 32,000 directories, which is the limit. Now I'm looking into why retention is not deleting these files, given the following configuration:
Log Aggregation Retention Period: 7 days
Job History Files Cleaner Interval: 1 day
Log Retain Duration: 3 hours
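A quick way to confirm the cap (a minimal sketch; the path is a placeholder for the parent directory in question — on ext3 a single directory can hold at most roughly 32,000 subdirectories):

find /path/to/parent-dir -maxdepth 1 -type d | wc -l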