
HDFS Service Check failed with (AttributeError: 'module' object has no attribute 'journalnode_port')

Contributor

Hi all,

Environment:

  • Cluster details (all nodes are VMs in an internal data centre):
    DH01 - Active NameNode
    DH02 - Standby NameNode
    DataNodes: DH03, DH04, DH05, DH07, DH08, DH09

We had a disk failure on DH03 yesterday (it shared a partition with all the other DataNodes), and we have had cluster issues since then. By the end of the day we were able to restore the cluster by restarting services and the servers themselves.

But today we have an issue with DH04, one of the DataNodes. There is no error shown in the Ambari dashboard for this node.

Error while running the HDFS Service Check in Ambari:

stderr: 
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/service_check.py", line 146, in <module>
    HdfsServiceCheck().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/service_check.py", line 85, in service_check
    journalnode_port = params.journalnode_port
AttributeError: 'module' object has no attribute 'journalnode_port'
 stdout:
2017-08-31 12:16:58,770 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-08-31 12:16:58,773 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-08-31 12:16:58,786 - ExecuteHadoop['dfsadmin -fs hdfs://belongcluster1 -safemode get | grep OFF'] {'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'logoutput': True, 'try_sleep': 3, 'tries': 20, 'user': 'hdfs'}
2017-08-31 12:16:58,816 - Execute['hadoop --config /usr/hdp/current/hadoop-client/conf dfsadmin -fs hdfs://belongcluster1 -safemode get | grep OFF'] {'logoutput': True, 'try_sleep': 3, 'environment': {}, 'tries': 20, 'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.


Safe mode is OFF in dh01.int.belong.com.au/58.162.144.211:8020
Safe mode is OFF in dh02.int.belong.com.au/58.162.144.163:8020
2017-08-31 12:17:01,324 - HdfsResource['/tmp'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'default_fs': 'hdfs://belongcluster1', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': None, 'user': 'hdfs', 'action': ['create_on_execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'mode': 0777}
2017-08-31 12:17:01,328 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -s '"'"'http://dh01.int.belong.com.au:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpiDhVPF 2>/tmp/tmpZ6ub1k''] {'quiet': False}
2017-08-31 12:17:01,364 - call returned (0, '')
2017-08-31 12:17:01,366 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -s '"'"'http://dh02.int.belong.com.au:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpInP_PS 2>/tmp/tmpLTWE4J''] {'quiet': False}
2017-08-31 12:17:01,401 - call returned (0, '')
2017-08-31 12:17:01,401 - NameNode HA states: active_namenodes = [('nn1', 'dh01.int.belong.com.au:50070')], standby_namenodes = [('nn2', 'dh02.int.belong.com.au:50070')], unknown_namenodes = []
2017-08-31 12:17:01,403 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -s '"'"'http://dh01.int.belong.com.au:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmponZBis 2>/tmp/tmplQEMFQ''] {'quiet': False}
2017-08-31 12:17:01,437 - call returned (0, '')
2017-08-31 12:17:01,439 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -s '"'"'http://dh02.int.belong.com.au:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpcIIQH0 2>/tmp/tmpCV7JdS''] {'quiet': False}
2017-08-31 12:17:01,475 - call returned (0, '')
2017-08-31 12:17:01,476 - NameNode HA states: active_namenodes = [('nn1', 'dh01.int.belong.com.au:50070')], standby_namenodes = [('nn2', 'dh02.int.belong.com.au:50070')], unknown_namenodes = []
2017-08-31 12:17:01,479 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://dh01.int.belong.com.au:50070/webhdfs/v1/tmp?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpr3Qr8R 2>/tmp/tmpOC6vO1''] {'logoutput': None, 'quiet': False}
2017-08-31 12:17:01,521 - call returned (0, '')
2017-08-31 12:17:01,523 - HdfsResource['/tmp/ida23a1791_date163117'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'default_fs': 'hdfs://belongcluster1', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': None, 'user': 'hdfs', 'action': ['delete_on_execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file'}
2017-08-31 12:17:01,524 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -s '"'"'http://dh01.int.belong.com.au:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpfXuEJY 2>/tmp/tmp1S6ZvL''] {'quiet': False}
2017-08-31 12:17:01,561 - call returned (0, '')
2017-08-31 12:17:01,563 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -s '"'"'http://dh02.int.belong.com.au:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmp7IJkiJ 2>/tmp/tmpfBFxn0''] {'quiet': False}
2017-08-31 12:17:01,600 - call returned (0, '')
2017-08-31 12:17:01,601 - NameNode HA states: active_namenodes = [('nn1', 'dh01.int.belong.com.au:50070')], standby_namenodes = [('nn2', 'dh02.int.belong.com.au:50070')], unknown_namenodes = []
2017-08-31 12:17:01,602 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -s '"'"'http://dh01.int.belong.com.au:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmp2Eq8T1 2>/tmp/tmpHQxjDx''] {'quiet': False}
2017-08-31 12:17:01,636 - call returned (0, '')
2017-08-31 12:17:01,638 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -s '"'"'http://dh02.int.belong.com.au:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpf1URL3 2>/tmp/tmpvhaZVZ''] {'quiet': False}
2017-08-31 12:17:01,676 - call returned (0, '')
2017-08-31 12:17:01,677 - NameNode HA states: active_namenodes = [('nn1', 'dh01.int.belong.com.au:50070')], standby_namenodes = [('nn2', 'dh02.int.belong.com.au:50070')], unknown_namenodes = []
2017-08-31 12:17:01,679 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://dh01.int.belong.com.au:50070/webhdfs/v1/tmp/ida23a1791_date163117?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpr_fu1H 2>/tmp/tmpgoPH6t''] {'logoutput': None, 'quiet': False}
2017-08-31 12:17:01,721 - call returned (0, '')
2017-08-31 12:17:01,723 - HdfsResource['/tmp/ida23a1791_date163117'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'source': '/etc/passwd', 'default_fs': 'hdfs://belongcluster1', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': None, 'user': 'hdfs', 'action': ['create_on_execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'file'}
2017-08-31 12:17:01,724 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -s '"'"'http://dh01.int.belong.com.au:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmp3lwQ6c 2>/tmp/tmptAtVx4''] {'quiet': False}
2017-08-31 12:17:01,762 - call returned (0, '')
2017-08-31 12:17:01,764 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -s '"'"'http://dh02.int.belong.com.au:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmp0LqJrZ 2>/tmp/tmp2UX9mD''] {'quiet': False}
2017-08-31 12:17:01,802 - call returned (0, '')
2017-08-31 12:17:01,804 - NameNode HA states: active_namenodes = [('nn1', 'dh01.int.belong.com.au:50070')], standby_namenodes = [('nn2', 'dh02.int.belong.com.au:50070')], unknown_namenodes = []
2017-08-31 12:17:01,805 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -s '"'"'http://dh01.int.belong.com.au:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpVSsujH 2>/tmp/tmpvH1da9''] {'quiet': False}
2017-08-31 12:17:01,840 - call returned (0, '')
2017-08-31 12:17:01,842 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -s '"'"'http://dh02.int.belong.com.au:50070/jmx?qry=Hadoop:service=NameNode,name=FSNamesystem'"'"' 1>/tmp/tmpSS2wIv 2>/tmp/tmpjscftE''] {'quiet': False}
2017-08-31 12:17:01,880 - call returned (0, '')
2017-08-31 12:17:01,882 - NameNode HA states: active_namenodes = [('nn1', 'dh01.int.belong.com.au:50070')], standby_namenodes = [('nn2', 'dh02.int.belong.com.au:50070')], unknown_namenodes = []
2017-08-31 12:17:01,884 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://dh01.int.belong.com.au:50070/webhdfs/v1/tmp/ida23a1791_date163117?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp6LBGYJ 2>/tmp/tmpSryJ0C''] {'logoutput': None, 'quiet': False}
2017-08-31 12:17:01,921 - call returned (0, '')
2017-08-31 12:17:01,922 - Creating new file /tmp/ida23a1791_date163117 in DFS
2017-08-31 12:17:01,924 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT -T /etc/passwd '"'"'http://dh01.int.belong.com.au:50070/webhdfs/v1/tmp/ida23a1791_date163117?op=CREATE&user.name=hdfs&overwrite=True'"'"' 1>/tmp/tmpnWZTOm 2>/tmp/tmpd0EkYg''] {'logoutput': None, 'quiet': False}
2017-08-31 12:17:02,577 - call returned (0, '')
2017-08-31 12:17:02,580 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'default_fs': 'hdfs://belongcluster1', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': None, 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf'}

Datanode Log: cat hadoop-hdfs-datanode-dh04.log | tail -500
************************************************************/
2017-08-31 12:23:49,718 INFO  datanode.DataNode (LogAdapter.java:info(45)) - STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = dh04.int.belong.com.au/XX.XXX.XXX.XX
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.7.1.2.4.0.0-169
STARTUP_MSG:   classpath = /usr/hdp/current/hadoop-client/conf:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/ranger-plugin-classloader-0.5.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/curator-client-2.7.1.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/zookeeper-3.4.6.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/xz-1.0.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/spark-yarn-shuffle.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/asm-3.2.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/azure-storage-2.2.0.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/curator-framework-2.7.1.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/junit-4.11.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/ranger-yarn-plugin-shim-0.5.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/ranger-hdfs-plugin-shim-0.5.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/api-
util-1.0.0-M20.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/activation-1.1.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/ojdbc6.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.4.0.0-169/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-common-2.7.1.2.4.0.0-169-tests.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-azure-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-annotations-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-auth-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-azure.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-auth.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-common.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-nfs-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-aws.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-common-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-aws-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/./:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/okhttp-2.4.0.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/okio-1.4.0.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.4.0.0-169-tests.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.4.0.0-169/hadoop-hdfs/.//hadoop-hdfs-nfs-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jets
3t-0.9.0.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/curator-client-2.7.1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/zookeeper-3.4.6.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/zookeeper-3.4.6.2.4.0.0-169-tests.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/curator-framework-2.7.1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/fst-2.24.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/objenesis-2.1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.4.0.0-169/ha
doop-yarn/lib/javassist-3.18.1-GA.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-common-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-common-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-api-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-registry-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-tests-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-client-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/
lib/asm-3.2.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-streaming-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-collections-3.2.2.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.4.0.0-169-tests.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//curator-client-2.7.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-datajoin-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//zookeeper-3.4.6.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-archives-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-auth-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//curator-recipes-2.7.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//httpcore-4.2.5.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jsr305-3.0.0.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jersey-json-1.9.j
ar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-ant-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//curator-framework-2.7.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-openstack-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-extras-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-gridmix-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-distcp-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-sls-2.7.1.2.4.0.0-169.jar:
/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//joda-time-2.9.2.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-rumen-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//commons-digester-1.8.jar::mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:/usr/hdp/2.4.0.0-169/tez/tez-dag-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-common-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-tests-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-runtime-internals-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-yarn-timeline-history-with-acls-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-examples-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-yarn-timeline-history-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-api-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-history-parser-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-yarn-timeline-history-with-fs-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-runtime-library-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-mapreduce-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-yarn-timeline-cache-plugin-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-yarn-server-web-proxy-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-azure-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-annotations-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.4.0.0-169/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.4.0.0-169/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.4.0.0-169/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.0.0-169/tez/lib/guava-11.0.2.jar:/usr/hdp/2.4.0.0-169/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.4.0.0-169/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.4.0.0-169/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.4.0.0-169/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-aws-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-io-2.4.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/conf:mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:/usr/hdp/2.4.0.0-169/tez/tez-dag-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-common-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-tests-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-runtime-internals-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-yarn-timeline-history-with-acls-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-examples-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-yarn-timeline-history-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-api-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-history-parser-0.7.0.2.4.0.
0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-yarn-timeline-history-with-fs-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-runtime-library-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-mapreduce-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/tez-yarn-timeline-cache-plugin-0.7.0.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-yarn-server-web-proxy-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-azure-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-annotations-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.4.0.0-169/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.4.0.0-169/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.4.0.0-169/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.0.0-169/tez/lib/guava-11.0.2.jar:/usr/hdp/2.4.0.0-169/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.4.0.0-169/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.4.0.0-169/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.4.0.0-169/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-aws-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-io-2.4.jar:/usr/hdp/2.4.0.0-169/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.4.0.0-169/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/tez/conf
STARTUP_MSG:   build = git@github.com:hortonworks/hadoop.git -r 26104d8ac833884c8776473823007f176854f2eb; compiled by 'jenkins' on 2016-02-10T06:18Z
STARTUP_MSG:   java = 1.8.0_60
************************************************************/
2017-08-31 12:23:49,742 INFO  datanode.DataNode (LogAdapter.java:info(45)) - registered UNIX signal handlers for [TERM, HUP, INT]
2017-08-31 12:23:50,446 INFO  impl.MetricsConfig (MetricsConfig.java:loadFirst(112)) - loaded properties from hadoop-metrics2.properties
2017-08-31 12:23:50,627 INFO  timeline.HadoopTimelineMetricsSink (HadoopTimelineMetricsSink.java:init(61)) - Initializing Timeline metrics sink.
2017-08-31 12:23:50,628 INFO  timeline.HadoopTimelineMetricsSink (HadoopTimelineMetricsSink.java:init(79)) - Identified hostname = dh04.int.belong.com.au, serviceName = datanode
2017-08-31 12:23:50,633 INFO  timeline.HadoopTimelineMetricsSink (HadoopTimelineMetricsSink.java:init(91)) - Collector Uri: http://dh08.int.belong.com.au:6188/ws/v1/timeline/metrics
2017-08-31 12:23:50,646 INFO  impl.MetricsSinkAdapter (MetricsSinkAdapter.java:start(206)) - Sink timeline started
2017-08-31 12:23:50,751 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:startTimer(377)) - Scheduled snapshot period at 60 second(s).
2017-08-31 12:23:50,751 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:start(192)) - DataNode metrics system started
2017-08-31 12:23:50,760 INFO  datanode.BlockScanner (BlockScanner.java:<init>(172)) - Initialized block scanner with targetBytesPerSec 1048576
2017-08-31 12:23:50,762 INFO  datanode.DataNode (DataNode.java:<init>(418)) - File descriptor passing is enabled.
2017-08-31 12:23:50,762 INFO  datanode.DataNode (DataNode.java:<init>(429)) - Configured hostname is dh04.int.belong.com.au
2017-08-31 12:23:50,772 INFO  datanode.DataNode (DataNode.java:startDataNode(1127)) - Starting DataNode with maxLockedMemory = 0
2017-08-31 12:23:50,801 INFO  datanode.DataNode (DataNode.java:initDataXceiver(921)) - Opened streaming server at /0.0.0.0:50010
2017-08-31 12:23:50,803 INFO  datanode.DataNode (DataXceiverServer.java:<init>(76)) - Balancing bandwith is 6250000 bytes/s
2017-08-31 12:23:50,803 INFO  datanode.DataNode (DataXceiverServer.java:<init>(77)) - Number threads for balancing is 5
2017-08-31 12:23:50,807 INFO  datanode.DataNode (DataXceiverServer.java:<init>(76)) - Balancing bandwith is 6250000 bytes/s
2017-08-31 12:23:50,807 INFO  datanode.DataNode (DataXceiverServer.java:<init>(77)) - Number threads for balancing is 5
2017-08-31 12:23:50,808 INFO  datanode.DataNode (DataNode.java:initDataXceiver(936)) - Listening on UNIX domain socket: /var/lib/hadoop-hdfs/dn_socket
2017-08-31 12:23:50,908 INFO  mortbay.log (Slf4jLog.java:info(67)) - Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2017-08-31 12:23:50,920 INFO  server.AuthenticationFilter (AuthenticationFilter.java:constructSecretProvider(294)) - Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2017-08-31 12:23:50,927 INFO  http.HttpRequestLog (HttpRequestLog.java:getRequestLog(80)) - Http request log for http.requests.datanode is not defined
2017-08-31 12:23:50,934 INFO  http.HttpServer2 (HttpServer2.java:addGlobalFilter(710)) - Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2017-08-31 12:23:50,937 INFO  http.HttpServer2 (HttpServer2.java:addFilter(685)) - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
2017-08-31 12:23:50,937 INFO  http.HttpServer2 (HttpServer2.java:addFilter(693)) - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2017-08-31 12:23:50,937 INFO  http.HttpServer2 (HttpServer2.java:addFilter(693)) - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2017-08-31 12:23:50,955 INFO  http.HttpServer2 (HttpServer2.java:openListeners(915)) - Jetty bound to port 34808
2017-08-31 12:23:50,955 INFO  mortbay.log (Slf4jLog.java:info(67)) - jetty-6.1.26.hwx
2017-08-31 12:23:51,179 INFO  mortbay.log (Slf4jLog.java:info(67)) - Started HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:34808
2017-08-31 12:23:51,394 INFO  web.DatanodeHttpServer (DatanodeHttpServer.java:start(201)) - Listening HTTP traffic on /0.0.0.0:50075
2017-08-31 12:23:51,585 INFO  datanode.DataNode (DataNode.java:startDataNode(1144)) - dnUserName = hdfs
2017-08-31 12:23:51,585 INFO  datanode.DataNode (DataNode.java:startDataNode(1145)) - supergroup = hdfs
2017-08-31 12:23:51,634 INFO  ipc.CallQueueManager (CallQueueManager.java:<init>(56)) - Using callQueue class java.util.concurrent.LinkedBlockingQueue
2017-08-31 12:23:51,654 INFO  ipc.Server (Server.java:run(676)) - Starting Socket Reader #1 for port 8010
2017-08-31 12:23:51,687 INFO  datanode.DataNode (DataNode.java:initIpcServer(837)) - Opened IPC server at /0.0.0.0:8010
2017-08-31 12:23:51,701 INFO  datanode.DataNode (BlockPoolManager.java:refreshNamenodes(152)) - Refresh request received for nameservices: belongcluster1
2017-08-31 12:23:51,731 INFO  datanode.DataNode (BlockPoolManager.java:doRefreshNamenodes(197)) - Starting BPOfferServices for nameservices: belongcluster1
2017-08-31 12:23:51,746 INFO  datanode.DataNode (BPServiceActor.java:run(814)) - Block pool <registering> (Datanode Uuid unassigned) service to dh01.int.belong.com.au/58.162.144.211:8020 starting to offer service
2017-08-31 12:23:51,746 INFO  datanode.DataNode (BPServiceActor.java:run(814)) - Block pool <registering> (Datanode Uuid unassigned) service to dh02.int.belong.com.au/58.162.144.163:8020 starting to offer service
2017-08-31 12:23:51,754 INFO  ipc.Server (Server.java:run(906)) - IPC Server Responder: starting
2017-08-31 12:23:51,754 INFO  ipc.Server (Server.java:run(746)) - IPC Server listener on 8010: starting
2017-08-31 12:23:52,015 INFO  common.Storage (Storage.java:tryLock(715)) - Lock on /data/hadoop/hdfs/data/in_use.lock acquired by nodename 15550@dh04.int.belong.com.au
2017-08-31 12:23:52,063 INFO  common.Storage (BlockPoolSliceStorage.java:recoverTransitionRead(241)) - Analyzing storage directories for bpid BP-1930018148-58.162.144.211-1462411884867
2017-08-31 12:23:52,063 INFO  common.Storage (Storage.java:lock(675)) - Locking is disabled for /data/hadoop/hdfs/data/current/BP-1930018148-58.162.144.211-1462411884867
2017-08-31 12:23:52,066 INFO  datanode.DataNode (DataNode.java:initStorage(1402)) - Setting up storage: nsid=1515412344;bpid=BP-1930018148-58.162.144.211-1462411884867;lv=-56;nsInfo=lv=-63;cid=CID-0019b609-89c6-421f-b98b-21607b8a21c6;nsid=1515412344;c=0;bpid=BP-1930018148-58.162.144.211-1462411884867;dnuuid=fcb7fe98-6504-40aa-be27-4a4f29e2dde9
2017-08-31 12:23:52,114 INFO  impl.FsDatasetImpl (FsVolumeList.java:addVolume(304)) - Added new volume: DS-0abd3f8e-d495-4740-9fdf-bf528bec435a
2017-08-31 12:23:52,114 INFO  impl.FsDatasetImpl (FsDatasetImpl.java:addVolume(391)) - Added volume - /data/hadoop/hdfs/data/current, StorageType: DISK
2017-08-31 12:23:52,152 INFO  impl.FsDatasetImpl (FsDatasetImpl.java:registerMBean(2055)) - Registered FSDatasetState MBean
2017-08-31 12:23:52,159 INFO  impl.FsDatasetImpl (FsDatasetImpl.java:addBlockPool(2501)) - Adding block pool BP-1930018148-58.162.144.211-1462411884867
2017-08-31 12:23:52,161 INFO  impl.FsDatasetImpl (FsVolumeList.java:run(403)) - Scanning block pool BP-1930018148-58.162.144.211-1462411884867 on volume /data/hadoop/hdfs/data/current...
2017-08-31 12:24:22,319 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43133 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:25:22,303 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43149 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:26:22,307 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43155 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:27:22,304 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43165 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:28:22,308 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43177 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:29:22,303 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43187 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:30:22,313 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43199 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:31:22,310 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43213 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:32:22,306 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43224 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:33:22,301 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43234 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:34:22,305 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43246 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:35:22,303 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43254 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:36:22,312 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43268 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:37:22,303 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43276 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)
2017-08-31 12:38:22,286 INFO  impl.FsDatasetImpl (FsVolumeList.java:run(408)) - Time taken to scan block pool BP-1930018148-58.162.144.211-1462411884867 on /data/hadoop/hdfs/data/current: 870125ms
2017-08-31 12:38:22,286 INFO  impl.FsDatasetImpl (FsVolumeList.java:addBlockPool(434)) - Total time to scan all replicas for block pool BP-1930018148-58.162.144.211-1462411884867: 870126ms
2017-08-31 12:38:22,287 INFO  impl.FsDatasetImpl (FsVolumeList.java:run(190)) - Adding replicas to map for block pool BP-1930018148-58.162.144.211-1462411884867 on volume /data/hadoop/hdfs/data/current...
2017-08-31 12:38:22,288 INFO  impl.BlockPoolSlice (BlockPoolSlice.java:readReplicasFromCache(710)) - Replica Cache file: /data/hadoop/hdfs/data/current/BP-1930018148-58.162.144.211-1462411884867/current/replicas doesn't exist
2017-08-31 12:38:22,300 ERROR datanode.DataNode (DataXceiver.java:run(278)) - dh04.int.belong.com.au:50010:DataXceiver error processing unknown operation  src: /127.0.0.1:43290 dst: /127.0.0.1:50010
java.io.EOFException
        at java.io.DataInputStream.readShort(DataInputStream.java:315)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
        at java.lang.Thread.run(Thread.java:745)



Any input will be helpful.

4 Replies

Re: HDFS Service Check failed with (AttributeError: 'module' object has no attribute 'journalnode_port')

Contributor

@Suhel

Please see the following HCC article, which describes similar symptoms. The solution is to upgrade your Ambari version.

https://community.hortonworks.com/content/supportkb/49602/dataxceiver-error-processing-unknown-opera...

Re: HDFS Service Check failed with (AttributeError: 'module' object has no attribute 'journalnode_port')

Super Mentor

@Suhel

I suspect the following failure:

  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/service_check.py", line 85, in service_check
    journalnode_port = params.journalnode_port
AttributeError: 'module' object has no attribute 'journalnode_port'   
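
For illustration, here is a minimal sketch (an assumption about the logic, not the exact Ambari source) of how the service check's params module derives journalnode_port from hdfs-site. If dfs.journalnode.http-address is absent from the configuration, the attribute is never defined, and the lookup above fails:

# Sketch with hypothetical helper names, assuming the port is parsed
# out of the dfs.journalnode.http-address property in hdfs-site.
def get_port_from_url(address):
    # '0.0.0.0:8480' -> '8480'
    return address.split(':')[-1]

hdfs_site = {}  # imagine dfs.journalnode.http-address is missing here

# the value only gets defined when the property is present in hdfs-site
if 'dfs.journalnode.http-address' in hdfs_site:
    journalnode_port = get_port_from_url(hdfs_site['dfs.journalnode.http-address'])

# any later access to journalnode_port now fails; in service_check.py the
# missing name lives on the params module, so it surfaces as AttributeError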


Can you please let us know whether, by any chance, you installed or upgraded any of the Ambari binaries on that host, or changed the Python/HDP version?

Please also check the output of the following command and whether it points to Ambari 2.4 (check the exact versions on all the hosts):

# rpm -qa | grep ambari

- Also, please try clearing the Ambari agent cache and then restarting the agent once (on restart, the agent re-downloads the cached scripts from the server):

# ambari-agent stop
# mv /var/lib/ambari-agent/cache /var/lib/ambari-agent/cache_OLD
# ambari-agent start



Re: HDFS Service Check failed with (AttributeError: 'module' object has no attribute 'journalnode_port')

Contributor

Hi @Rajesh, @Jay SenSharma, thanks for getting back on this as quickly as you did, and apologies for the late response; restoring business continuity and processing the backlog took a while.

We did not update any packages or any core component of the HDP suite. The following package check returns the same component versions on all nodes.

[dataops@dh07 ~]$ rpm -qa | grep ambari
ambari-metrics-hadoop-sink-2.2.1.0-161.x86_64
ambari-agent-2.2.1.0-161.x86_64
ambari-metrics-monitor-2.2.1.0-161.x86_64

[dataops@dh08 ~]$ rpm -qa | grep ambari
ambari-metrics-collector-2.2.1.0-161.x86_64
ambari-metrics-monitor-2.2.1.0-161.x86_64
ambari-agent-2.2.1.0-161.x86_64
ambari-metrics-hadoop-sink-2.2.1.0-161.x86_64

[dataops@dh01 ~]$ rpm -qa | grep ambari
ambari-metrics-monitor-2.2.1.0-161.x86_64
ambari-server-2.2.1.0-161.x86_64
ambari-agent-2.2.1.0-161.x86_64
ambari-metrics-hadoop-sink-2.2.1.0-161.x86_64
Even now, with all processes running fine, we still have the HDFS Service Check issue, but the service check now errors out on DH01 (standby NameNode).

PS: By the way, we had to take DH03 and DH04 out of the cluster to restore sanity, and we were able to stabilize the cluster from then onwards. We are now rebalancing HDFS and will then reformat DH04 to bring it back as a fresh DataNode.

Re: HDFS Service Check failed with (AttributeError: 'module' object has no attribute 'journalnode_port')

New Contributor

I am also getting the same error after enabling NameNode HA.

Check the parameter:

dfs.journalnode.http-address = 0.0.0.0:8480

It was not set in my case; after I added it as a custom property, the HDFS service check is fine.
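
For reference, a minimal sketch of the equivalent hdfs-site entry (0.0.0.0:8480 is the usual default JournalNode HTTP address; adjust if your cluster uses a different port). In Ambari this is added as a custom property under hdfs-site rather than by editing the file by hand:

<property>
  <!-- HTTP address the JournalNode web UI binds to; the Ambari HDFS
       service check parses journalnode_port out of this value -->
  <name>dfs.journalnode.http-address</name>
  <value>0.0.0.0:8480</value>
</property>

You can confirm the value the client configuration resolves with:

hdfs getconf -confKey dfs.journalnode.http-address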
