Support Questions

Find answers, ask questions, and share your expertise

Not able to start HDFS service from ambari after installing HDP 2.4 in single node vm

Explorer

Hi,

I am a beginner with HDP and Ambari. I recently installed HDP 2.4 on my CentOS 7 VM (a single node with 5 GB of RAM). The installation went fine, but when I tried to start the HDFS service from the UI (the automated start had failed), I got the error below:

################

resource_management.core.exceptions.Fail: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start namenode'' returned 1. /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh: line 76: [: centos-node2.com: integer expression expected starting namenode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-namenode-192.168.2.15 Error: Could not find or load main class centos-node2.com

#####################

I am not sure how to fix this issue. Can anyone please help?

Some more info:

################

[root@192 sbin]# hadoop classpath /usr/hdp/2.4.0.0-169/hadoop/conf:/usr/hdp/2.4.0.0-169/hadoop/lib/*:/usr/hdp/2.4.0.0-169/hadoop/.//*:/usr/hdp/2.4.0.0-169/hadoop-hdfs/./:/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/*:/usr/hdp/2.4.0.0-169/hadoop-hdfs/.//*:/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/*:/usr/hdp/2.4.0.0-169/hadoop-yarn/.//*:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/*:/usr/hdp/2.4.0.0-169/hadoop-mapreduce/.//*:/bin:/usr/java/jdk1.8.0_51/bin:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/usr/java/jdk1.8.0_51/bin:/root/bin:mysql-connector-java.jar:postgresql-jdbc2ee.jar:postgresql-jdbc2.jar:postgresql-jdbc3.jar:postgresql-jdbc.jar:/usr/hdp/2.4.0.0-169/tez/*:/usr/hdp/2.4.0.0-169/tez/lib/*:/usr/hdp/2.4.0.0-169/tez/conf

##################################

/var/lib/ambari-agent/data/errors-257.txt :

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 401, in <module>
    NameNode().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 102, in start
    namenode(action="start", hdfs_binary=hdfs_binary, upgrade_type=upgrade_type, env=env)
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_namenode.py", line 146, in namenode
    create_log_dir=True
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/utils.py", line 267, in service
    Execute(daemon_cmd, not_if=process_id_exists_command, environment=hadoop_env_exports)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start namenode'' returned 1. /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh: line 76: [: centos-node2.com: integer expression expected starting namenode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-namenode-192.168.2.15 Error: Could not find or load main class centos-node2.com
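For context on the key line: `[: centos-node2.com: integer expression expected` is the shell reporting that a numeric test in hadoop-daemon.sh received a hostname where it expected a number, which suggests the hostname (or a value derived from it) leaked into one of the script's numeric variables; the same stray value then lands on the JVM command line, producing `Could not find or load main class centos-node2.com`. A minimal sketch of the failing pattern (the variable name here is hypothetical, not the one hadoop-daemon.sh actually uses):

```shell
#!/bin/sh
# Reproduce the kind of error seen at hadoop-daemon.sh line 76: a numeric
# comparison applied to a value that is a hostname, not a number.
HEAPSIZE="centos-node2.com"   # hypothetical bad value leaked from the environment

# Without the 2>/dev/null, this prints the familiar
# "[: centos-node2.com: integer expression expected" on stderr.
if [ "$HEAPSIZE" -gt 0 ] 2>/dev/null; then
  echo "numeric value: $HEAPSIZE"
else
  echo "not a number: $HEAPSIZE"
fi
```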

Thanks,

Subhadeep

1 ACCEPTED SOLUTION


@subhadeep dutta gupta

There is no need to uninstall OpenJDK; you can switch the Java version with the alternatives command. Perform the following steps to set your Java path:

alternatives --install /usr/bin/java java /usr/java/jdk1.8.0_91/bin/java 2

alternatives --install /usr/bin/javac javac /usr/java/jdk1.8.0_91/bin/javac 2

alternatives --config java

alternatives --config javac

java -version

Here my Java path is /usr/java/jdk1.8.0_91/bin/java; replace this with your Java path.
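As a side note, what `alternatives --install` does under the hood is maintain a symlink so that /usr/bin/java points at whichever JDK you select with `--config` (the trailing `2` in the commands above is the registration priority). A rough simulation of that symlink switching, using throwaway paths under /tmp so it can run without root (the JDK directory names are just stand-ins):

```shell
#!/bin/sh
# Simulate the symlink switching that `alternatives` performs for java.
# On a real node the chain is /usr/bin/java -> /etc/alternatives/java -> <JDK>/bin/java.
mkdir -p /tmp/altdemo/jdk1.8.0_91/bin /tmp/altdemo/openjdk/bin
printf '#!/bin/sh\necho "jdk1.8.0_91"\n' > /tmp/altdemo/jdk1.8.0_91/bin/java
printf '#!/bin/sh\necho "openjdk"\n'     > /tmp/altdemo/openjdk/bin/java
chmod +x /tmp/altdemo/jdk1.8.0_91/bin/java /tmp/altdemo/openjdk/bin/java

# `alternatives --config java` effectively repoints this link:
ln -sf /tmp/altdemo/jdk1.8.0_91/bin/java /tmp/altdemo/java
/tmp/altdemo/java   # now runs the selected JDK's binary
```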


5 REPLIES


Can you please confirm that the hostname is correctly set up on the node? Please validate it against your core-site.xml and hdfs-site.xml files as well.
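One way to do that check from the shell is to compare the node's FQDN with the host embedded in fs.defaultFS. A sample core-site.xml is inlined below for illustration; on a real HDP node you would point the `sed` at /etc/hadoop/conf/core-site.xml instead:

```shell
#!/bin/sh
# Compare the node's hostname with the NameNode host in fs.defaultFS.
# The sample config is a stand-in for /etc/hadoop/conf/core-site.xml.
cat > /tmp/core-site-sample.xml <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://centos-node2.com:8020</value>
  </property>
</configuration>
EOF

# Pull the host part out of the hdfs:// URI.
fs_host=$(sed -n 's|.*<value>hdfs://\([^:<]*\).*|\1|p' /tmp/core-site-sample.xml)
echo "fs.defaultFS host: $fs_host"
echo "node hostname:     $(hostname -f 2>/dev/null || hostname)"
# The two values must match exactly, or the NameNode start scripts misbehave.
```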

Master Mentor

@milind pandit

Watch out for blank spaces in the hostname. The easiest solution is to rename the host.
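A quick way to check for that, sketched with a hypothetical bad value; on CentOS 7 the actual rename is done with `hostnamectl set-hostname <new-name>` as root:

```shell
#!/bin/sh
# Detect whitespace in a hostname value. The space here is a hypothetical
# stand-in for the kind of bad /etc/hostname content that breaks the scripts.
h="centos node2.com"
case "$h" in
  *[[:space:]]*) result="hostname contains whitespace: rename the host" ;;
  *)             result="hostname looks OK" ;;
esac
echo "$result"
# On a real CentOS 7 node (as root): hostnamectl set-hostname centos-node2.com
```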

Explorer

Hi... Thanks for the reply. I found an issue with the hostname set in /etc/hostname. I also found that both OpenJDK and Java 1.8 are installed on the VM: when I run the java -version command, it picks up the OpenJDK binary, whereas JAVA_HOME is set to the Java 1.8 path. I will remove OpenJDK and try to start the services again.
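To see exactly which binary the `java` command resolves to, and how that can differ from JAVA_HOME, something like the following helps; the paths printed will of course vary per machine:

```shell
#!/bin/sh
# Show which java actually runs vs what JAVA_HOME points to.
java_on_path=$(command -v java || echo "none")
echo "java on PATH: $java_on_path"
if [ "$java_on_path" != "none" ]; then
  # readlink -f follows the alternatives symlink chain to the real binary.
  echo "resolves to:  $(readlink -f "$java_on_path")"
fi
echo "JAVA_HOME:    ${JAVA_HOME:-unset}"
```

If the resolved path is an OpenJDK location while JAVA_HOME points at the Oracle JDK, PATH and JAVA_HOME disagree, which matches the symptom described above.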


Explorer

Many thanks, @Ashnee Sharma!