
Ambari Files View - Service hdfs check failed

Contributor

Ambari Files View indicates that service hdfs check failed

(screenshot attached: screen-shot-2018-03-11-at-53312-pm.png)

I can't bring up Files View.

I go to Dashboard/HDFS and run a service check, and nothing wrong is found. All blocks are fine: none corrupt, none missing. HDFS is not in safe mode.

I restarted all HDFS services and restarted Ambari, but the problem does not go away.

I can access HDFS using the CLI and everything is fine. The issue is that I have a few users who use that view.

Any suggestion?

1 ACCEPTED SOLUTION

Super Guru

@Jane Becker

I believe that your views got corrupted. This could have happened due to metadata corruption. You may want to check recent events that may have led to this situation.

To delete and recreate new instances of Ambari Views, go to "Manage Ambari". Let us know if that addressed your issue.


6 REPLIES

Master Mentor

@Jane Becker

In general, the Files View HDFS service check requires the logged-in user's home directory to already exist in HDFS.

For example, if the user logged in to Files View is named "testuser1", you will need to create the directory in HDFS as follows:

# su - hdfs -c "hdfs dfs -mkdir /user/testuser1"
# su - hdfs -c "hdfs dfs -chown -R testuser1:hadoop /user/testuser1"
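For several users, the same pair of commands can be scripted. A minimal sketch, using the same "hadoop" group as above; the usernames are hypothetical, and the loop only prints the commands so you can review them before piping the output to a shell as the hdfs user on the cluster:

```shell
# Print (not run) the home-directory setup commands for a list of Files View users.
# "testuser1"/"testuser2" are placeholder names; review the output, then pipe to `sh`.
for u in testuser1 testuser2; do
  echo "hdfs dfs -mkdir -p /user/$u"
  echo "hdfs dfs -chown -R $u:hadoop /user/$u"
done
```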


Ambari can also auto-create users' home directories with the help of the "ambari.post.user.creation.hook" property, as described in the following link: https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.1.5/bk_ambari-administration/content/create_use...
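If I recall correctly, that hook is configured in /etc/ambari-server/conf/ambari.properties. The exact property names and script path below are my recollection of the Ambari defaults, so please verify them against the linked documentation for your version:

```properties
# Enable the post-user-creation hook (verify the property names for your Ambari version)
ambari.post.user.creation.hook.enabled=true
# Script run after an Ambari user is created; the default script creates /user/<name> in HDFS
ambari.post.user.creation.hook=/var/lib/ambari-server/resources/scripts/post-user-creation-hook.sh
```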

Master Mentor

@Jane Becker @Constantin Stanca

If it is happening only with a few specific users, it is worth checking whether their usernames contain any uppercase/mixed-case letters.
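A quick way to spot such accounts is to compare each username with its lowercase form. A small sketch (the usernames here are hypothetical):

```shell
# Flag usernames that contain uppercase letters, which can cause a mismatch
# if the HDFS home directory was created with a lowercase name.
for user in TestUser1 janedoe; do
  lower=$(printf '%s' "$user" | tr '[:upper:]' '[:lower:]')
  if [ "$user" != "$lower" ]; then
    echo "mixed-case username: $user (HDFS home may be /user/$lower)"
  fi
done
```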

Also, we can enable DEBUG logging on Files View to collect more detailed information (by default the logging level is INFO):

# grep 'INFO' /var/lib/ambari-server/resources/views/work/FILES\{1.0.0\}/view.log4j.properties
log4j.logger.org.apache.ambari.view.filebrowser=INFO,filesView

Change INFO to DEBUG in that line so it reads:

log4j.logger.org.apache.ambari.view.filebrowser=DEBUG,filesView


After the change, we will need to restart the Ambari server; additional logging will then appear in

 /var/log/ambari-server/files-view/files-view.log


Contributor

@Jay Kumar SenSharma

After more investigation, I realized that other views were broken too, and they seemed to have some issues with the cluster configuration. I noticed that the IP address of one of the nodes was wrong. I checked the hosts file and the IP addresses seemed fine. I restarted the Ambari agent for that node, and Ambari reported the correct IP this time. That did not resolve the problem.

Good advice on setting the logging level. I'll change that and continue the troubleshooting.

Super Guru

@Jane Becker

I believe that your views got corrupted. This could have happened due to metadata corruption. You may want to check recent events that may have led to this situation.

To delete and recreate new instances of Ambari Views, go to "Manage Ambari". Let us know if that addressed your issue.

Contributor

@Constantin Stanca

I just tried it and it worked!!! I guess the metadata was corrupted. That is something that I will investigate later, but I am back in business before my Monday morning crowd gets cranky.

Thank you @Constantin Stanca and @Jay Kumar SenSharma for helping me on a Sunday!

Super Guru

@Jane Becker

Happy it worked out. Enjoy the rest of the weekend!