
error: [Errno 104] Connection reset by peer in Cloudera agent log

Rising Star

Hello All,

I noticed an "agent status bad" notification for a running 3-node cluster in the Cloudera Manager (CDH 5.4) console.

When I tried to open the agent log link, it gave me an error.

I checked the logs on the corresponding server at /var/log/cloudera-scm-agent and found the error below.

TP/1.1" 200 2120 "" "Java/1.7.0_67"
[08/Aug/2017 05:22:47 +0000] 8060 MainThread agent INFO Process with same id has changed: 2940-host-inspector.
[08/Aug/2017 05:22:47 +0000] 8060 MainThread agent INFO Deactivating process 2940-host-inspector
[08/Aug/2017 05:22:48 +0000] 8060 Metadata-Plugin navigator_plugin INFO stopping Metadata Plugin for host-inspector with pipelines []
[08/Aug/2017 05:22:48 +0000] 8060 Metadata-Plugin navigator_plugin_pipeline INFO Stopping Navigator Plugin Pipeline '' for host-inspector (log dir: None)
[08/Aug/2017 05:22:49 +0000] 8060 Audit-Plugin navigator_plugin INFO stopping Audit Plugin for host-inspector with pipelines []
[08/Aug/2017 05:22:49 +0000] 8060 Audit-Plugin navigator_plugin_pipeline INFO Stopping Navigator Plugin Pipeline '' for host-inspector (log dir: None)
[08/Aug/2017 05:26:00 +0000] 8060 MonitorDaemon-Reporter throttling_logger ERROR (1 skipped) Error sending messages to firehose: mgmt-SERVICEMONITOR-0fc5431c91f288535e98f1ed6d2d7836
Traceback (most recent call last):
File "/usr/lib64/cmf/agent/src/cmf/monitor/firehose.py", line 74, in _send
self._requestor.request('sendAgentMessages', dict(messages=messages))
File "/usr/lib64/cmf/agent/build/env/lib/python2.6/site-packages/avro-1.6.3-py2.6.egg/avro/ipc.py", line 139, in request
return self.issue_request(call_request, message_name, request_datum)
File "/usr/lib64/cmf/agent/build/env/lib/python2.6/site-packages/avro-1.6.3-py2.6.egg/avro/ipc.py", line 249, in issue_request
call_response = self.transceiver.transceive(call_request)
File "/usr/lib64/cmf/agent/build/env/lib/python2.6/site-packages/avro-1.6.3-py2.6.egg/avro/ipc.py", line 478, in transceive
result = self.read_framed_message()
File "/usr/lib64/cmf/agent/build/env/lib/python2.6/site-packages/avro-1.6.3-py2.6.egg/avro/ipc.py", line 482, in read_framed_message
response = self.conn.getresponse()
File "/usr/lib64/python2.6/httplib.py", line 990, in getresponse
response.begin()
File "/usr/lib64/python2.6/httplib.py", line 391, in begin
version, status, reason = self._read_status()
File "/usr/lib64/python2.6/httplib.py", line 349, in _read_status
line = self.fp.readline()
File "/usr/lib64/python2.6/socket.py", line 433, in readline
data = recv(1)
error: [Errno 104] Connection reset by peer

Below are the outputs that I checked.

[root@LinuxUL cloudera-scm-agent]# cat /etc/hosts
127.0.0.1 localhost

10.68.200.34 LinuxUL.ad.infosys.com LinuxUL
10.68.200.152 linux152.ad.infosys.com linux152
10.68.200.170 linux170.ad.infosys.com linux170
172.21.5.224 nfrsat01.ad.infosys.com nfrsat01
10.67.200.77 blrsat06.ad.infosys.com blrsat06

 

[root@LinuxUL ~]# hostname -f
LinuxUL.ad.infosys.com
[root@LinuxUL ~]# python -c 'import socket; print socket.getfqdn(), socket.gethostbyname(socket.getfqdn())'
LinuxUL.ad.infosys.com 10.68.200.34

 

Also, SELinux and iptables are disabled.

 

Please suggest.

 

Thanks,

Priya

8 REPLIES

Contributor

Hi cdhhadoop,

 

Is the Cloudera agent completely down? Does it happen on more servers?

Can you provide the output of /var/log/cloudera-scm-agent/cloudera-scm-agent.out?

Can you also provide the output of the following commands?

 

 

$ netstat -ltnp | grep :9000
$ source /etc/cloudera-scm-agent/config.ini &>/dev/null
$ ping -w1 $server_host
$ telnet $server_host $server_port
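
For reference, to see whether the agent process itself is up on that host, something like this can be run first (the service name and log path are the CDH 5.x defaults already used elsewhere in this thread):

# Check whether the agent process is running and tail its log.
sudo service cloudera-scm-agent status
tail -n 50 /var/log/cloudera-scm-agent/cloudera-scm-agent.log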

Regards,

Marc Casajus

Rising Star
Hi marccasajus,

It seems the agent is completely down. It is happening on only one server as of now.
The command netstat -ltnp | grep :9000 gave no output.

[root@LinuxUL cloudera-scm-agent]# ping -w1 linux170.ad.infosys.com
PING linux170.ad.infosys.com (10.68.200.170) 56(84) bytes of data.
64 bytes from linux170.ad.infosys.com (10.68.200.170): icmp_seq=1 ttl=64 time=0.595 ms

--- linux170.ad.infosys.com ping statistics ---
2 packets transmitted, 1 received, 50% packet loss, time 1000ms
rtt min/avg/max/mdev = 0.595/0.595/0.595/0.000 ms

[root@LinuxUL cloudera-scm-agent]# source /etc/cloudera-scm-agent/config.ini & > /dev/null
[1] 8911
[root@LinuxUL cloudera-scm-agent]# -bash: [General]: command not found
-bash: [Security]: command not found
-bash: [Hadoop]: command not found
-bash: [Cloudera]: command not found
-bash: [JDBC]: command not found

[1]+ Exit 127 source /etc/cloudera-scm-agent/config.ini
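
(config.ini is an INI file rather than a shell script, which is why sourcing it prints "command not found" for the section headers. The values can instead be read out directly, for example, assuming the server_host / server_port keys sit under [General] as in a default config.ini:)

# Read server_host / server_port directly from the INI file.
server_host=$(awk -F= '/^[[:space:]]*server_host[[:space:]]*=/ {gsub(/[[:space:]]/,"",$2); print $2}' /etc/cloudera-scm-agent/config.ini)
server_port=$(awk -F= '/^[[:space:]]*server_port[[:space:]]*=/ {gsub(/[[:space:]]/,"",$2); print $2}' /etc/cloudera-scm-agent/config.ini)
echo "$server_host $server_port"
ping -c 1 -w 1 "$server_host"
telnet "$server_host" "$server_port"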
Please find the /var/log/cloudera-scm-agent/cloudera-scm-agent.out output below.


/usr/lib64/cmf/agent/src/cmf/parcel.py:17: DeprecationWarning: the sets module is deprecated
from sets import Set
[24/Mar/2017 20:12:05 +0000] 8060 MainThread agent INFO SCM Agent Version: 5.4.9
[24/Mar/2017 20:12:05 +0000] 8060 MainThread agent INFO Adding env vars that start with CMF_AGENT_
[24/Mar/2017 20:12:05 +0000] 8060 MainThread agent INFO Logging to /var/log/cloudera-scm-agent/cloudera-scm-agent.log
/usr/lib64/cmf/agent/src/cmf/agent.py:786: DeprecationWarning: psutil.cached_phymem is deprecated; use psutil.virtual_memory().cached instead
cached = psutil.cached_phymem()
/usr/lib64/cmf/agent/src/cmf/agent.py:787: DeprecationWarning: psutil.phymem_buffers is deprecated; use psutil.virtual_memory().buffers instead
buffers = psutil.phymem_buffers()
/usr/lib64/cmf/agent/src/cmf/agent.py:788: DeprecationWarning: psutil.used_phymem is deprecated; use psutil.phymem_usage().used instead
really_used = psutil.used_phymem() - cached - buffers
/usr/lib64/cmf/agent/build/env/lib/python2.6/site-packages/psutil-2.1.3-py2.6-linux-x86_64.egg/psutil/__init__.py:1863: DeprecationWarning: psutil.phymem_usage is deprecated; use psutil.virtual_memory() instead
return phymem_usage().used
/usr/lib64/cmf/agent/src/cmf/monitor/host/__init__.py:190: DeprecationWarning: psutil.phymem_usage is deprecated; use psutil.virtual_memory() instead
really_used = psutil.phymem_usage().used - cached - buffers
2711-zookeeper-server: added process group
2711-zookeeper-server: stopped
2711-zookeeper-server: removed process group
2716-zookeeper-server: added process group
2720-hdfs-JOURNALNODE: added process group
2718-hdfs-DATANODE: added process group
2730-hbase-REGIONSERVER: added process group
2733-hue-HUE_SERVER: added process group
+ source_parcel_environment
+ '[' '!' -z /opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/meta/cdh_env.sh ']'
+ OLD_IFS='
'
+ IFS=:
+ SCRIPT_ARRAY=($SCM_DEFINES_SCRIPTS)
+ DIRNAME_ARRAY=($PARCEL_DIRNAMES)
+ IFS='
'
+ COUNT=1
++ seq 1 1
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/meta/cdh_env.sh
+ PARCEL_DIRNAME=CDH-5.4.0-1.cdh5.4.0.p0.27
+ . /opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/meta/cdh_env.sh
++ CDH_DIRNAME=CDH-5.4.0-1.cdh5.4.0.p0.27
++ export CDH_HADOOP_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ CDH_HADOOP_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ export CDH_MR1_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-0.20-mapreduce
++ CDH_MR1_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-0.20-mapreduce
++ export CDH_HDFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-hdfs
++ CDH_HDFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-hdfs
++ export CDH_HTTPFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-httpfs
++ CDH_HTTPFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-httpfs
++ export CDH_MR2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-mapreduce
++ CDH_MR2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-mapreduce
++ export CDH_YARN_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-yarn
++ CDH_YARN_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-yarn
++ export CDH_HBASE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase
++ CDH_HBASE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase
++ export CDH_ZOOKEEPER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/zookeeper
++ CDH_ZOOKEEPER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/zookeeper
++ export CDH_HIVE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive
++ CDH_HIVE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive
++ export CDH_HUE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hue
++ CDH_HUE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hue
++ export CDH_OOZIE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/oozie
++ CDH_OOZIE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/oozie
++ export CDH_HUE_PLUGINS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ CDH_HUE_PLUGINS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ export CDH_FLUME_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/flume-ng
++ CDH_FLUME_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/flume-ng
++ export CDH_PIG_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/pig
++ CDH_PIG_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/pig
++ export CDH_HCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive-hcatalog
++ CDH_HCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive-hcatalog
++ export CDH_SQOOP2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sqoop2
++ CDH_SQOOP2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sqoop2
++ export CDH_LLAMA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/llama
++ CDH_LLAMA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/llama
++ export CDH_SENTRY_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sentry
++ CDH_SENTRY_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sentry
++ export TOMCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-tomcat
++ TOMCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-tomcat
++ export JSVC_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-utils
++ JSVC_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-utils
++ export CDH_HADOOP_BIN=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop/bin/hadoop
++ CDH_HADOOP_BIN=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop/bin/hadoop
++ export CDH_IMPALA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/impala
++ CDH_IMPALA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/impala
++ export CDH_SOLR_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/solr
++ CDH_SOLR_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/solr
++ export CDH_HBASE_INDEXER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase-solr
++ CDH_HBASE_INDEXER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase-solr
++ export SEARCH_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/search
++ SEARCH_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/search
++ export CDH_SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/spark
++ CDH_SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/spark
++ export WEBHCAT_DEFAULT_XML=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ WEBHCAT_DEFAULT_XML=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ export CDH_KMS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-kms
++ CDH_KMS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-kms
+ env
2736-yarn-NODEMANAGER: added process group
2739-collect-host-statistics: added process group
2742-host-inspector: added process group
2746-solr-SOLR_SERVER: added process group
2733-hue-HUE_SERVER: stopped
2733-hue-HUE_SERVER: removed process group
2747-hue-HUE_SERVER: added process group
2739-collect-host-statistics: stopped
2739-collect-host-statistics: removed process group
2742-host-inspector: stopped
2742-host-inspector: removed process group
2749-collect-host-statistics: added process group
2752-host-inspector: added process group
+ source_parcel_environment
+ '[' '!' -z /opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/meta/cdh_env.sh ']'
+ OLD_IFS='
'
+ IFS=:
+ SCRIPT_ARRAY=($SCM_DEFINES_SCRIPTS)
+ DIRNAME_ARRAY=($PARCEL_DIRNAMES)
+ IFS='
'
+ COUNT=1
++ seq 1 1
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/meta/cdh_env.sh
+ PARCEL_DIRNAME=CDH-5.4.0-1.cdh5.4.0.p0.27
+ . /opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/meta/cdh_env.sh
++ CDH_DIRNAME=CDH-5.4.0-1.cdh5.4.0.p0.27
++ export CDH_HADOOP_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ CDH_HADOOP_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ export CDH_MR1_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-0.20-mapreduce
++ CDH_MR1_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-0.20-mapreduce
++ export CDH_HDFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-hdfs
++ CDH_HDFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-hdfs
++ export CDH_HTTPFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-httpfs
++ CDH_HTTPFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-httpfs
++ export CDH_MR2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-mapreduce
++ CDH_MR2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-mapreduce
++ export CDH_YARN_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-yarn
++ CDH_YARN_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-yarn
++ export CDH_HBASE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase
++ CDH_HBASE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase
++ export CDH_ZOOKEEPER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/zookeeper
++ CDH_ZOOKEEPER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/zookeeper
++ export CDH_HIVE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive
++ CDH_HIVE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive
++ export CDH_HUE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hue
++ CDH_HUE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hue
++ export CDH_OOZIE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/oozie
++ CDH_OOZIE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/oozie
++ export CDH_HUE_PLUGINS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ CDH_HUE_PLUGINS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ export CDH_FLUME_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/flume-ng
++ CDH_FLUME_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/flume-ng
++ export CDH_PIG_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/pig
++ CDH_PIG_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/pig
++ export CDH_HCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive-hcatalog
++ CDH_HCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive-hcatalog
++ export CDH_SQOOP2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sqoop2
++ CDH_SQOOP2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sqoop2
++ export CDH_LLAMA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/llama
++ CDH_LLAMA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/llama
++ export CDH_SENTRY_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sentry
++ CDH_SENTRY_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sentry
++ export TOMCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-tomcat
++ TOMCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-tomcat
++ export JSVC_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-utils
++ JSVC_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-utils
++ export CDH_HADOOP_BIN=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop/bin/hadoop
++ CDH_HADOOP_BIN=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop/bin/hadoop
++ export CDH_IMPALA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/impala
++ CDH_IMPALA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/impala
++ export CDH_SOLR_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/solr
++ CDH_SOLR_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/solr
++ export CDH_HBASE_INDEXER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase-solr
++ CDH_HBASE_INDEXER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase-solr
++ export SEARCH_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/search
++ SEARCH_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/search
++ export CDH_SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/spark
++ CDH_SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/spark
++ export WEBHCAT_DEFAULT_XML=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ WEBHCAT_DEFAULT_XML=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ export CDH_KMS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-kms
++ CDH_KMS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-kms
+ env
2756-mapreduce-TASKTRACKER: added process group
2752-host-inspector: stopped
2752-host-inspector: removed process group
2749-collect-host-statistics: stopped
2749-collect-host-statistics: removed process group
2761-collect-host-statistics: added process group
2764-host-inspector: added process group
+ source_parcel_environment
+ '[' '!' -z /opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/meta/cdh_env.sh ']'
+ OLD_IFS='
'
+ IFS=:
+ SCRIPT_ARRAY=($SCM_DEFINES_SCRIPTS)
+ DIRNAME_ARRAY=($PARCEL_DIRNAMES)
+ IFS='
'
+ COUNT=1
++ seq 1 1
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/meta/cdh_env.sh
+ PARCEL_DIRNAME=CDH-5.4.0-1.cdh5.4.0.p0.27
+ . /opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/meta/cdh_env.sh
++ CDH_DIRNAME=CDH-5.4.0-1.cdh5.4.0.p0.27
++ export CDH_HADOOP_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ CDH_HADOOP_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ export CDH_MR1_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-0.20-mapreduce
++ CDH_MR1_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-0.20-mapreduce
++ export CDH_HDFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-hdfs
++ CDH_HDFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-hdfs
++ export CDH_HTTPFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-httpfs
++ CDH_HTTPFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-httpfs
++ export CDH_MR2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-mapreduce
++ CDH_MR2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-mapreduce
++ export CDH_YARN_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-yarn
++ CDH_YARN_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-yarn
++ export CDH_HBASE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase
++ CDH_HBASE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase
++ export CDH_ZOOKEEPER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/zookeeper
++ CDH_ZOOKEEPER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/zookeeper
++ export CDH_HIVE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive
++ CDH_HIVE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive
++ export CDH_HUE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hue
++ CDH_HUE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hue
++ export CDH_OOZIE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/oozie
++ CDH_OOZIE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/oozie
++ export CDH_HUE_PLUGINS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ CDH_HUE_PLUGINS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ export CDH_FLUME_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/flume-ng
++ CDH_FLUME_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/flume-ng
++ export CDH_PIG_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/pig
++ CDH_PIG_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/pig
++ export CDH_HCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive-hcatalog
++ CDH_HCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive-hcatalog
++ export CDH_SQOOP2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sqoop2
++ CDH_SQOOP2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sqoop2
++ export CDH_LLAMA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/llama
++ CDH_LLAMA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/llama
++ export CDH_SENTRY_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sentry
++ CDH_SENTRY_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sentry
++ export TOMCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-tomcat
++ TOMCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-tomcat
++ export JSVC_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-utils
++ JSVC_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-utils
++ export CDH_HADOOP_BIN=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop/bin/hadoop
++ CDH_HADOOP_BIN=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop/bin/hadoop
++ export CDH_IMPALA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/impala
++ CDH_IMPALA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/impala
++ export CDH_SOLR_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/solr
++ CDH_SOLR_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/solr
++ export CDH_HBASE_INDEXER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase-solr
++ CDH_HBASE_INDEXER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase-solr
++ export SEARCH_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/search
++ SEARCH_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/search
++ export CDH_SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/spark
++ CDH_SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/spark
++ export WEBHCAT_DEFAULT_XML=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ WEBHCAT_DEFAULT_XML=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ export CDH_KMS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-kms
++ CDH_KMS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-kms
+ env
2736-yarn-NODEMANAGER: stopped
2736-yarn-NODEMANAGER: removed process group
2769-yarn-NODEMANAGER: added process group
2761-collect-host-statistics: stopped
2761-collect-host-statistics: removed process group
2764-host-inspector: stopped
2764-host-inspector: removed process group
2772-collect-host-statistics: added process group
2775-host-inspector: added process group
2772-collect-host-statistics: stopped
2772-collect-host-statistics: removed process group
2775-host-inspector: stopped
2775-host-inspector: removed process group
2778-kafka-KAFKA_BROKER: added process group
2779-collect-host-statistics: added process group
2782-host-inspector: added process group
2779-collect-host-statistics: stopped
2779-collect-host-statistics: removed process group
2782-host-inspector: stopped
2782-host-inspector: removed process group
2716-zookeeper-server: stopped
2716-zookeeper-server: removed process group
2785-zookeeper-server: added process group
2785-zookeeper-server: stopped
2785-zookeeper-server: removed process group
2788-zookeeper-server: added process group
2788-zookeeper-server: stopped
2788-zookeeper-server: removed process group
2793-zookeeper-server: added process group
2793-zookeeper-server: stopped
2793-zookeeper-server: removed process group
2796-zookeeper-server: added process group
2797-collect-host-statistics: added process group
2800-host-inspector: added process group
2797-collect-host-statistics: stopped
2797-collect-host-statistics: removed process group
2800-host-inspector: stopped
2800-host-inspector: removed process group
2803-collect-host-statistics: added process group
2806-host-inspector: added process group
2803-collect-host-statistics: stopped
2803-collect-host-statistics: removed process group
2806-host-inspector: stopped
2806-host-inspector: removed process group
2809-collect-host-statistics: added process group
2812-host-inspector: added process group
2809-collect-host-statistics: stopped
2809-collect-host-statistics: removed process group
2812-host-inspector: stopped
2812-host-inspector: removed process group
2796-zookeeper-server: stopped
2796-zookeeper-server: removed process group
2817-zookeeper-server: added process group
2817-zookeeper-server: stopped
2817-zookeeper-server: removed process group
2818-zookeeper-server: added process group
2818-zookeeper-server: stopped
2818-zookeeper-server: removed process group
2823-zookeeper-server: added process group
2823-zookeeper-server: stopped
2823-zookeeper-server: removed process group
2828-zookeeper-server: added process group
2829-collect-host-statistics: added process group
2832-host-inspector: added process group
2829-collect-host-statistics: stopped
2829-collect-host-statistics: removed process group
2832-host-inspector: stopped
2832-host-inspector: removed process group
2835-collect-host-statistics: added process group
2838-host-inspector: added process group
2835-collect-host-statistics: stopped
2835-collect-host-statistics: removed process group
2838-host-inspector: stopped
2838-host-inspector: removed process group
2842-collect-host-statistics: added process group
2845-host-inspector: added process group
2842-collect-host-statistics: stopped
2842-collect-host-statistics: removed process group
2845-host-inspector: stopped
2845-host-inspector: removed process group
2848-collect-host-statistics: added process group
2851-host-inspector: added process group
2848-collect-host-statistics: stopped
2848-collect-host-statistics: removed process group
2851-host-inspector: stopped
2851-host-inspector: removed process group
2730-hbase-REGIONSERVER: stopped
2730-hbase-REGIONSERVER: removed process group
2861-hbase-REGIONSERVER: added process group
2861-hbase-REGIONSERVER: stopped
2861-hbase-REGIONSERVER: removed process group
2867-hbase-REGIONSERVER: added process group
2878-collect-host-statistics: added process group
2881-host-inspector: added process group
2881-host-inspector: stopped
2881-host-inspector: removed process group
2878-collect-host-statistics: stopped
2878-collect-host-statistics: removed process group
2884-collect-host-statistics: added process group
2887-host-inspector: added process group
2887-host-inspector: stopped
2887-host-inspector: removed process group
2884-collect-host-statistics: stopped
2884-collect-host-statistics: removed process group
2890-collect-host-statistics: added process group
2893-host-inspector: added process group
2890-collect-host-statistics: stopped
2890-collect-host-statistics: removed process group
2893-host-inspector: stopped
2893-host-inspector: removed process group
2896-collect-host-statistics: added process group
2899-host-inspector: added process group
2896-collect-host-statistics: stopped
2896-collect-host-statistics: removed process group
2899-host-inspector: stopped
2899-host-inspector: removed process group
+ source_parcel_environment
+ '[' '!' -z /opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/meta/cdh_env.sh ']'
+ OLD_IFS='
'
+ IFS=:
+ SCRIPT_ARRAY=($SCM_DEFINES_SCRIPTS)
+ DIRNAME_ARRAY=($PARCEL_DIRNAMES)
+ IFS='
'
+ COUNT=1
++ seq 1 1
+ for i in '`seq 1 $COUNT`'
+ SCRIPT=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/meta/cdh_env.sh
+ PARCEL_DIRNAME=CDH-5.4.0-1.cdh5.4.0.p0.27
+ . /opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/meta/cdh_env.sh
++ CDH_DIRNAME=CDH-5.4.0-1.cdh5.4.0.p0.27
++ export CDH_HADOOP_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ CDH_HADOOP_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ export CDH_MR1_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-0.20-mapreduce
++ CDH_MR1_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-0.20-mapreduce
++ export CDH_HDFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-hdfs
++ CDH_HDFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-hdfs
++ export CDH_HTTPFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-httpfs
++ CDH_HTTPFS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-httpfs
++ export CDH_MR2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-mapreduce
++ CDH_MR2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-mapreduce
++ export CDH_YARN_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-yarn
++ CDH_YARN_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-yarn
++ export CDH_HBASE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase
++ CDH_HBASE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase
++ export CDH_ZOOKEEPER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/zookeeper
++ CDH_ZOOKEEPER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/zookeeper
++ export CDH_HIVE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive
++ CDH_HIVE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive
++ export CDH_HUE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hue
++ CDH_HUE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hue
++ export CDH_OOZIE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/oozie
++ CDH_OOZIE_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/oozie
++ export CDH_HUE_PLUGINS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ CDH_HUE_PLUGINS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop
++ export CDH_FLUME_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/flume-ng
++ CDH_FLUME_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/flume-ng
++ export CDH_PIG_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/pig
++ CDH_PIG_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/pig
++ export CDH_HCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive-hcatalog
++ CDH_HCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hive-hcatalog
++ export CDH_SQOOP2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sqoop2
++ CDH_SQOOP2_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sqoop2
++ export CDH_LLAMA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/llama
++ CDH_LLAMA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/llama
++ export CDH_SENTRY_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sentry
++ CDH_SENTRY_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/sentry
++ export TOMCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-tomcat
++ TOMCAT_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-tomcat
++ export JSVC_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-utils
++ JSVC_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/bigtop-utils
++ export CDH_HADOOP_BIN=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop/bin/hadoop
++ CDH_HADOOP_BIN=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop/bin/hadoop
++ export CDH_IMPALA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/impala
++ CDH_IMPALA_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/impala
++ export CDH_SOLR_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/solr
++ CDH_SOLR_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/solr
++ export CDH_HBASE_INDEXER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase-solr
++ CDH_HBASE_INDEXER_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hbase-solr
++ export SEARCH_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/search
++ SEARCH_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/search
++ export CDH_SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/spark
++ CDH_SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/spark
++ export WEBHCAT_DEFAULT_XML=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ WEBHCAT_DEFAULT_XML=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/etc/hive-webhcat/conf.dist/webhcat-default.xml
++ export CDH_KMS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-kms
++ CDH_KMS_HOME=/opt/cloudera/parcels/CDH-5.4.0-1.cdh5.4.0.p0.27/lib/hadoop-kms
+ env
2769-yarn-NODEMANAGER: stopped
2769-yarn-NODEMANAGER: removed process group
2905-yarn-NODEMANAGER: added process group
2908-collect-host-statistics: added process group
2911-host-inspector: added process group
2908-collect-host-statistics: stopped
2908-collect-host-statistics: removed process group
2911-host-inspector: stopped
2911-host-inspector: removed process group
2914-collect-host-statistics: added process group
2917-host-inspector: added process group
2914-collect-host-statistics: stopped
2914-collect-host-statistics: removed process group
2917-host-inspector: stopped
2917-host-inspector: removed process group
2746-solr-SOLR_SERVER: stopped
2746-solr-SOLR_SERVER: removed process group
2921-solr-SOLR_SERVER: added process group
2921-solr-SOLR_SERVER: stopped
2921-solr-SOLR_SERVER: removed process group
2923-solr-SOLR_SERVER: added process group
2923-solr-SOLR_SERVER: stopped
2923-solr-SOLR_SERVER: removed process group
2925-solr-SOLR_SERVER: added process group
2926-collect-host-statistics: added process group
2929-host-inspector: added process group
2747-hue-HUE_SERVER: stopped
2747-hue-HUE_SERVER: removed process group
2932-hue-HUE_SERVER: added process group
2926-collect-host-statistics: stopped
2926-collect-host-statistics: removed process group
2929-host-inspector: stopped
2929-host-inspector: removed process group
2937-collect-host-statistics: added process group
2940-host-inspector: added process group
Segmentation fault (core dumped)

Contributor

Please run:

 

$ source /etc/cloudera-scm-agent/config.ini &>/dev/null
$ telnet $server_host $server_port

Note: write &> with no space between & and >.

 

Regards, 

Marc.

Rising Star
Hi marccasajus,

The command source /etc/cloudera-scm-agent/config.ini &>/dev/null gave no output.

Also, please tell me which port I should use for the telnet command.

Thanks,
Priya

Contributor
This command loads environment variables that you will use in the next command:

telnet $server_host $server_port

You need to check whether the problem is a network issue or an application issue.
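
If the variables from config.ini come up empty, the defaults can be tried directly. On a stock installation the agent heartbeats to the Cloudera Manager server on port 7182 (the web UI is 7180), so, assuming default ports, the check would be:

# Replace <cm-server-host> with the server_host value from config.ini;
# 7182 is assumed to be the default agent-to-server heartbeat port.
telnet <cm-server-host> 7182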

Regards,
Marc.

Rising Star
Hi marccasajus,

I deleted the pid file and started the agent using the sudo service cloudera-scm-agent start command, and the agent is now up and running. It seems the pid file was left over from an older session.
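
The steps were roughly as follows (the pid file path shown is the assumed default; verify the actual file under /var/run before removing anything):

# Remove the stale pid file left by the old session, then restart the agent.
sudo service cloudera-scm-agent status        # reports "dead but pid file exists"
sudo rm /var/run/cloudera-scm-agent/cloudera-scm-agent.pid    # assumed default location
sudo service cloudera-scm-agent start
sudo service cloudera-scm-agent status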

Thanks for all the help.

Thanks,
priya

Rising Star
Hi marccasajus,

Today I saw the same error for the agent again, as below.
sudo service cloudera-scm-agent status
cloudera-scm-agent dead but pid file exists

In agent.out I can see:
2937-collect-host-statistics: stopped
2937-collect-host-statistics: removed process group
2940-host-inspector: stopped
2940-host-inspector: removed process group

And in agent.log:
[25/Jun/2017 13:31:29 +0000] 8060 Monitor-SolrServerMonitor throttling_logger ERROR (52 skipped) Error fetching Solr core status at 'http://LinuxUL.ad.infosys.com:8983/solr//admin/cores?wt=json&action=STATUS'
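
To see whether that Solr endpoint is reachable at all, the same URL from the log line can be fetched by hand (assuming curl is available on the host):

# Fetch the core-status URL the Service Monitor is polling.
curl -v 'http://LinuxUL.ad.infosys.com:8983/solr//admin/cores?wt=json&action=STATUS'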


Please help.

Thanks,
Priya

New Contributor

Hi Team,

Can you please help with the error below?

It is a "connection reset by peer" error.

 

Warning: Master yarn-client is deprecated since 2.0. Please use master "yarn" with specified deploy mode instead.
19/01/31 18:43:18 INFO spark.SparkContext: Running Spark version 2.2.0.cloudera1
19/01/31 18:43:19 WARN spark.SparkConf: In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
19/01/31 18:43:19 INFO spark.SparkContext: Submitted application: Prime_CEP_BGFR_1309_Process Rates/Other Errors
19/01/31 18:43:19 INFO spark.SecurityManager: Changing view acls to: ggbmgphdpngrp
19/01/31 18:43:19 INFO spark.SecurityManager: Changing modify acls to: ggbmgphdpngrp
19/01/31 18:43:19 INFO spark.SecurityManager: Changing view acls groups to:
19/01/31 18:43:19 INFO spark.SecurityManager: Changing modify acls groups to:
19/01/31 18:43:19 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ggbmgphdpngrp); groups with view permissions: Set(); users with modify permissions: Set(ggbmgphdpngrp); groups with modify permissions: Set()
19/01/31 18:43:19 INFO util.Utils: Successfully started service 'sparkDriver' on port 50000.
19/01/31 18:43:19 INFO spark.SparkEnv: Registering MapOutputTracker
19/01/31 18:43:19 INFO spark.SparkEnv: Registering BlockManagerMaster
19/01/31 18:43:19 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/01/31 18:43:19 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/01/31 18:43:19 INFO storage.DiskBlockManager: Created local directory at /PBMG/users/ggbmgphdpngrp/prime/cep/tmp/blockmgr-88cc1ce5-d255-4009-9864-25e5f567879e
19/01/31 18:43:19 INFO memory.MemoryStore: MemoryStore started with capacity 6.2 GB
19/01/31 18:43:20 INFO spark.SparkEnv: Registering OutputCommitCoordinator
19/01/31 18:43:20 INFO util.log: Logging initialized @2402ms
19/01/31 18:43:20 INFO server.Server: jetty-9.3.z-SNAPSHOT
19/01/31 18:43:20 INFO server.Server: Started @2475ms
19/01/31 18:43:20 INFO server.AbstractConnector: Started ServerConnector@3ad394e6{HTTP/1.1,[http/1.1]}{0.0.0.0:52000}
19/01/31 18:43:20 INFO util.Utils: Successfully started service 'SparkUI' on port 52000.
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26f143ed{/jobs,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61a5b4ae{/jobs/json,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5b69fd74{/jobs/job,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@77b325b3{/jobs/job/json,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7e8e8651{/stages,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@271f18d3{/stages/json,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61e3a1fd{/stages/stage,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@315df4bb{/stages/stage/json,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5cad8b7d{/stages/pool,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@25243bc1{/stages/pool/json,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2e6ee0bc{/storage,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@467f77a5{/storage/json,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@420bc288{/storage/rdd,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@308a6984{/storage/rdd/json,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7a34b7b8{/environment,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3be8821f{/environment/json,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3b65e559{/executors,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@74a9c4b0{/executors/json,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c05a54d{/executors/threadDump,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5fd9b663{/executors/threadDump/json,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@10567255{/static,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@60b85ba1{/,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@117632cf{/api,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@159e366{/jobs/job/kill,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@24528a25{/stages/stage/kill,null,AVAILABLE,@Spark}
19/01/31 18:43:20 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.6.209.22:52000
19/01/31 18:43:20 INFO spark.SparkContext: Added JAR file:/PBMG/users/ggbmgphdpngrp/prime/cep/prime-cep.jar at spark://10.6.209.22:50000/jars/prime-cep.jar with timestamp 1548956600301
19/01/31 18:43:20 INFO util.Utils: Using initial executors = 15, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
19/01/31 18:43:24 INFO yarn.Client: Requesting a new application from cluster with 8 NodeManagers
19/01/31 18:43:25 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (61440 MB per container)
19/01/31 18:43:25 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
19/01/31 18:43:25 INFO yarn.Client: Setting up container launch context for our AM
19/01/31 18:43:25 INFO yarn.Client: Setting up the launch environment for our AM container
19/01/31 18:43:25 INFO yarn.Client: Preparing resources for our AM container
19/01/31 18:43:25 INFO security.HadoopFSCredentialProvider: getting token for: hdfs://nameservice-np/user/ggbmgphdpngrp
19/01/31 18:43:25 INFO hdfs.DFSClient: Created token for ggbmgphdpngrp: HDFS_DELEGATION_TOKEN owner=ggbmgphdpngrp@BMEDIA.BAGINT.COM, renewer=yarn, realUser=, issueDate=1548956605075, maxDate=1549561405075, sequenceNumber=1281621, masterKeyId=1013 on ha-hdfs:nameservice-np
19/01/31 18:43:26 INFO hive.metastore: Trying to connect to metastore with URI thrift://gtunxlnu00853.server.arvato-systems.de:9083
19/01/31 18:43:26 INFO hive.metastore: Opened a connection to metastore, current connections: 1
19/01/31 18:43:26 INFO hive.metastore: Connected to metastore.
19/01/31 18:43:27 INFO metadata.Hive: Registering function dateconversion com.infosys.bmg.analytics.Date_Convert
19/01/31 18:43:27 INFO metadata.Hive: Registering function calc_week com.bmg.main.CalcWeek
19/01/31 18:43:27 INFO metadata.Hive: Registering function prev_week com.infosys.bmg.analytics.HiveUdfPrevWeek
19/01/31 18:43:27 INFO metadata.Hive: Registering function prev_week com.infosys.bmg.analytics.HiveUdfPrevWeek
19/01/31 18:43:27 INFO metadata.Hive: Registering function dateconversion com.infosys.bmg.analytics.Date_Convert
19/01/31 18:43:27 INFO metadata.Hive: Registering function date_convert com.infosys.bmg.analytics.Date_Convert
19/01/31 18:43:27 INFO metadata.Hive: Registering function calc_week com.infosys.bmg.analytics.HiveUdfCalcWeek
19/01/31 18:43:27 INFO metadata.Hive: Registering function day_of_week com.infosys.bmg.analytics.HiveUdfDayOfWeek
19/01/31 18:43:27 INFO metadata.Hive: Registering function beginning_of_fin_week_func com.infosys.bmg.date.Begining_Of_Financial_Week
19/01/31 18:43:27 INFO metadata.Hive: Registering function end_of_fin_week_func com.infosys.bmg.date.End_Of_Financial_Week
19/01/31 18:43:27 INFO metadata.Hive: Registering function dateconversion com.infosys.bmg.analytics.DateConvertFlash
19/01/31 18:43:27 INFO metadata.Hive: Registering function beginning_of_fin_week_func_ada com.infosys.bmg.date.BeginingOfFinancialWeekADA
19/01/31 18:43:27 INFO metadata.Hive: Registering function end_of_fin_week_func_ada com.infosys.bmg.date.EndOfFinancialWeekADA
19/01/31 18:43:27 INFO metadata.Hive: Registering function first_financial_day_func_ada com.infosys.bmg.date.FirstFinancialDayOfYearADA
19/01/31 18:43:27 INFO metadata.Hive: Registering function titleconversionudf com.infosys.bmg.Pr
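
Regarding the "Master yarn-client is deprecated" warning at the top of this log, the replacement it suggests looks like this on the spark-submit side (the jar path and main class below are placeholders, not taken from this job):

# Use --master yarn with an explicit deploy mode instead of yarn-client.
spark-submit --master yarn --deploy-mode client \
  --class com.example.Main /path/to/app.jar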