Created on 03-01-2016 03:03 AM - edited 08-18-2019 04:25 AM
Hi,
I'm trying to run the HDFS service check from the Ambari UI, and I can run 'java -version' as any user in the Linux shell.
The HDFS check fails with the following error:
stderr: /var/lib/ambari-agent/data/errors-2113.txt
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/service_check.py", line 146, in <module>
    HdfsServiceCheck().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/service_check.py", line 52, in service_check
    bin_dir=params.hadoop_bin_dir
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/execute_hadoop.py", line 55, in action_run
    environment = self.resource.environment,
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'hadoop --config /usr/hdp/current/hadoop-client/conf dfsadmin -fs hdfs://bigdata -safemode get | grep OFF' returned 1.
DEPRECATED: Use of this script to execute hdfs command is deprecated. Instead use the hdfs command for it.
/usr/hdp/2.3.2.0-2950//hadoop-hdfs/bin/hdfs.distro: line 308: /usr/java/default/bin/java: No such file or directory
/usr/hdp/2.3.2.0-2950//hadoop-hdfs/bin/hdfs.distro: line 308: exec: /usr/java/default/bin/java: cannot execute: No such file or directory
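For reference, this is the exact command the service check runs, extracted from the traceback above; rerunning it by hand on the affected host reproduces the failure:

hadoop --config /usr/hdp/current/hadoop-client/conf dfsadmin -fs hdfs://bigdata -safemode get | grep OFF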
Created 03-01-2016 03:05 AM
From the error it seems Java is not on the path.
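A quick way to verify is to compare the Java that the shell actually resolves with the path the HDFS script tries to exec (the latter taken from the error above):

readlink -f "$(which java)"        # the java found via the shell PATH
ls -l /usr/java/default/bin/java   # the java the hdfs script execs, per the error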
Created 03-01-2016 03:08 AM
I can run 'java -version' as any user in the Linux shell, so Java is on the PATH.
Created 03-01-2016 05:52 AM
Is this the correct JAVA_HOME path => /usr/java/default/bin/java?
Check /etc/ambari-server/conf/ambari.properties and make sure the configured Java paths are correct.
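For example, a quick way to see what Ambari has configured (java.home is the key property):

grep -i java /etc/ambari-server/conf/ambari.properties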
You can also check the health of your HDFS by running
hdfs fsck /
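On a healthy cluster the fsck report ends with a line like:

The filesystem under path '/' is HEALTHY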
Created on 03-01-2016 06:20 AM - edited 08-18-2019 04:25 AM
Hi,
The Ambari server is installed on a separate host, and that host runs only the Ambari server.
In /etc/ambari-server/conf/ambari.properties:
hdfs fsck /
Created 03-01-2016 11:12 AM
The following errors:
/usr/hdp/2.3.2.0-2950//hadoop-hdfs/bin/hdfs.distro: line 308: /usr/java/default/bin/java: No such file or directory
/usr/hdp/2.3.2.0-2950//hadoop-hdfs/bin/hdfs.distro: line 308: exec: /usr/java/default/bin/java: cannot execute: No such file or directory
clearly point out that /usr/java/default/bin/java is not accessible. Does 'java' exist in this folder, and does it have execute permission enabled?
If /usr/java/default/bin/java is a symbolic link to a given version of Java - does that location exist as well?
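For example, the following resolves the full symlink chain and shows the permissions of every path component (namei is part of util-linux):

readlink -f /usr/java/default/bin/java   # print the final target of the symlink chain
namei -l /usr/java/default/bin/java      # list each component with owner and permissions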
Created 03-02-2016 02:38 AM
Hi, I found that a disk on one of the cluster nodes was full. After I cleaned up the disk, the HDFS service check ran successfully.
Thank you.
Created 05-19-2016 02:07 PM
@suizhe007 It looks like HDFS is in "Safe Mode".
You can run the following command to check whether HDFS is in safe mode:
hdfs dfsadmin -safemode get
If it returns "Safe mode is ON", you have to wait until HDFS comes out of safe mode. Check the NameNode logs to find out why it entered safe mode.
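For example, to look for safe-mode messages on the NameNode host (the log path below is the usual HDP default; adjust it to your installation):

grep -i "safe mode" /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.log | tail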
To force HDFS to leave safe mode, run the following command (this is not a good way to turn off safe mode in a production environment):
hdfs dfsadmin -safemode leave
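Note that leaving safe mode requires HDFS superuser privileges, so on most installs you would run it as the hdfs user:

sudo -u hdfs hdfs dfsadmin -safemode leave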
Created 05-19-2016 02:29 PM
Well spotted @Pradeep. This is so easy to miss. Look carefully at the bottom of the screenshot and also near the bottom of the logfile. The important line is
Execution of 'hadoop --config /usr/hdp/current/hadoop-client/conf dfsadmin -fs hdfs://bigdata -safemode get | grep OFF' returned 1.
(emphasis added for clarity). There is some further text after that about the 'hadoop' command being deprecated, which may have caused you to visually skip over the all-important "returned 1".
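Since the pipeline's exit status is grep's, "returned 1" simply means grep found no line containing "OFF", i.e. safe mode was not off. The same behavior is easy to reproduce by hand:

hdfs dfsadmin -safemode get | grep OFF
echo $?   # 0 when the output is "Safe mode is OFF", 1 when safe mode is ON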