
Service 'hdfs' check failed-From Ambari

Solved

Re: Service 'hdfs' check failed-From Ambari

Contributor

@Geoffrey Shelton Okot

Here is a description of my cluster:

One node hosts the Ambari Server.
The cluster has 4 nodes.

I have the "nameserver" entries in place and the names resolve.

Yes, I have the correct entries in /etc/hosts.
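For reference, the /etc/hosts layout I mean is along these lines (the IPs and hostnames here are placeholders, not my real values), with the same file on every host:

192.168.1.10 ambari01.example.com ambari01
192.168.1.11 node01.example.com node01
192.168.1.12 node02.example.com node02
192.168.1.13 node03.example.com node03
192.168.1.14 node04.example.com node04

Each line maps the IP to the fully qualified name first and the short name second.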

Regarding the firewall, here is the output:

# service ip6tables status
ip6tables: Firewall is not running.

# service iptables status
iptables: Firewall is not running.

When I run the following command, this is the output:

# hdfs dfs -ls /user
bash: hdfs: command not found
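(As a side note, "command not found" on the Ambari server host usually just means the Hadoop client packages are not installed there or the hdfs binary is not on the PATH. A quick check, assuming a standard HDP layout under /usr/hdp, would be:

# which hdfs
# ls /usr/hdp/current/ | grep hdfs

If nothing shows up, the HDFS client component is probably not installed on that host and can be added to it through Ambari.)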

One thing to add: during the cluster setup I made changes to some of the directory settings below. I hope this won't have any impact. Please let me know if you need more info.

NameNode

/opt/mount1/hdp/hadoop/hdfs/namenode,/opt/mount1/hdp/home/hadoop/hdfs/namenode,/opt/hadoop/hdfs/namenode,/opt/Tidal/hadoop/hdfs/namenode,/opt/mount1/hdp/hadoop/hdfs/namenode,/opt/mount2/hadoop/hdfs/namenode,/opt/mount1/hdp/usr/hadoop/hdfs/namenode,/var/hadoop/hdfs/namenode,/var/crash/hadoop/hdfs/namenode

DataNode

/opt/mount1/hdp/hadoop/hdfs/data,/opt/mount1/hdp/home/hadoop/hdfs/data,/opt/mount1/hdp/Tidal/hadoop/hdfs/data,/opt/mount1/hadoop/hdfs/data,/opt/mount2/hadoop/hdfs/data,/opt/mount1/hdp/usr/hadoop/hdfs/data,/var/hadoop/hdfs/data,/var/crash/hadoop/hdfs/data

Re: Service 'hdfs' check failed-From Ambari

Mentor

@Kishore Kumar

Stop HDFS and change the parameters below.

NameNode

/var/hadoop/hdfs/namenode 

DataNode

/opt/mount1/hdp/hadoop/hdfs/data,/opt/mount2/hadoop/hdfs/data 

Restart HDFS
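If those directories do not already exist on disk, a rough sketch for creating them first (the hdfs user and hadoop group are the usual HDP defaults; adjust if your cluster uses different ones):

On the NameNode host
# mkdir -p /var/hadoop/hdfs/namenode
# chown -R hdfs:hadoop /var/hadoop/hdfs

On each DataNode host
# mkdir -p /opt/mount1/hdp/hadoop/hdfs/data /opt/mount2/hadoop/hdfs/data
# chown -R hdfs:hadoop /opt/mount1/hdp/hadoop/hdfs /opt/mount2/hadoop/hdfs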

To see the user directory while logged in as root:

# su - hdfs 
$ hdfs dfs -ls /user 

You should be able to see /user/hdp; the above command should work!

Let me know

Re: Service 'hdfs' check failed-From Ambari

Contributor

@Geoffrey Shelton Okot

That user doesn't exist on the Ambari server node. It's there on the rest of the cluster hosts. Here is the output:

drwxrwx--- - ambari-qa hdfs 0 2017-08-18 04:21 /user/ambari-qa

drwxr-xr-x - hcat hdfs 0 2017-08-18 04:18 /user/hcat

drwxr-xr-x - hive hdfs 0 2017-08-18 04:19 /user/hive

The error now is:

Failed to transition to undefined

Server status: 500

Server Message: User: hdp is not allowed to impersonate admin

Re: Service 'hdfs' check failed-From Ambari

Mentor

@Kishore Kumar

Add these two property settings in core-site.xml. You can find that in the Ambari HDFS config section.

hadoop.proxyuser.hdp.hosts=*
hadoop.proxyuser.hdp.groups=* 
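Once HDFS has been restarted with the new settings, they can be double-checked from any host that has the HDFS client (optional, just for verification):

$ hdfs getconf -confKey hadoop.proxyuser.hdp.hosts
*
$ hdfs getconf -confKey hadoop.proxyuser.hdp.groups
*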

As the root user:

# su - hdfs 

Create the hdp user directory in HDFS:

$ hdfs dfs -mkdir /user/hdp 

Set the ownership on the hdp user directory in HDFS:

$ hdfs dfs -chown hdp:hdfs /user/hdp 

For your information, HDFS is a distributed file system, so needless to say once the directory is created it is accessible from all the cluster hosts using the hdfs user!
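For example, from any other node in the cluster the new directory should be visible straight away:

# su - hdfs -c "hdfs dfs -ls /user/hdp"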

Re: Service 'hdfs' check failed-From Ambari

Contributor

@Geoffrey Shelton Okot

Now all those errors are fixed with this and I can access almost all the Views.

For the SmartSense view I am getting the following error:

20 Aug 2017 11:07:02,068 ERROR [ambari-client-thread-28] ViewRegistry:930 - Could not find the cluster identified by 2.
20 Aug 2017 11:07:02,070 ERROR [ambari-client-thread-28] ContainerResponse:419 - The RuntimeException could not be mapped to a response, re-throwing to the HTTP container
org.apache.ambari.server.view.IllegalClusterException: Failed to get cluster information associated with this view instance
    at org.apache.ambari.server.view.ViewRegistry.getCluster(ViewRegistry.java:931)
    at org.apache.ambari.server.view.ViewContextImpl.getCluster(ViewContextImpl.java:370)
    at org.apache.ambari.server.view.ViewContextImpl.getPropertyValues(ViewContextImpl.java:437)
    at org.apache.ambari.server.view.ViewContextImpl.getProperties(ViewContextImpl.java:171)
    at com.hortonworks.support.tools.view.ServerProxy.buildHeaders(ServerProxy.java:243)
    at com.hortonworks.support.tools.view.ServerProxy.execute(ServerProxy.java:133)
    at com.hortonworks.support.tools.view.ServerProxy.execute(ServerProxy.java:112)
    at com.hortonworks.support.tools.view.ServerProxy.execute(ServerProxy.java:94)

Re: Service 'hdfs' check failed-From Ambari

Contributor

@Geoffrey Shelton Okot

Now it all works, after deleting the old view.

Thanks a lot @Geoffrey Shelton Okot for helping and answering the questions very promptly.


Re: Service 'hdfs' check failed-From Ambari

Mentor

@Kishore Kumar

I am happy you can smile and progress with your project! Could you accept my answer? It is advisable to open a new thread for the SmartSense view.

If you are using admin as the login for the SmartSense view, make sure you have done the following:

Add these two property settings in core-site.xml. You can find that in the Ambari HDFS config section.

hadoop.proxyuser.admin.hosts=*
hadoop.proxyuser.admin.groups=*

As the root user:

# su - hdfs 

Create the admin user directory in HDFS:

$ hdfs dfs -mkdir /user/admin 

Set the ownership on the admin user directory in HDFS:

$ hdfs dfs -chown admin:hdfs /user/admin 

Please revert

Re: Service 'hdfs' check failed-From Ambari

Mentor

@Kishore Kumar

Good to know that your Service 'hdfs' check failed issue was resolved. Could you then accept my answer and open a new thread for the SmartSense view issue?

This ensures that a thread doesn't span many pages; HCC members usually ignore very old threads. Rewarding answers also encourages members to respond and resolve issues.

Re: Service 'hdfs' check failed-From Ambari

New Contributor

Did you use the "hdfs" keyword in your cluster name or hostnames?

If you did, please change it!
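A quick way to check each host, for example:

# hostname -f | grep -i hdfs

If that prints anything, the hostname contains "hdfs" and should be changed. The cluster name is shown at the top of the Ambari UI.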