Member since 12-09-2015
- 97 Posts
- 51 Kudos Received
- 3 Solutions

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1513 | 05-04-2016 06:00 AM
 | 3257 | 04-11-2016 09:57 AM
 | 1010 | 04-08-2016 11:30 AM
03-07-2016
11:36 AM
3 Kudos
I am posting this question after searching the internet for a good explanation. Currently the total physical hard disk space (4 nodes) is 720 GB. The dashboard shows that only 119 GB is configured for DFS. I want to increase this space to at least 300 GB. I didn't find anything straightforward on the Ambari dashboard to do this. The only information I found on the internet is to modify the core-site.xml file to have a hadoop.tmp.dir property that points to another directory. I do not want to do this blindly, without understanding what it means to expand HDFS capacity and how to do it through the Ambari dashboard.
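For reference, the property that normally governs DFS capacity is dfs.datanode.data.dir in hdfs-site.xml (the comma-separated list of directories each DataNode stores blocks in), not hadoop.tmp.dir; hadoop.tmp.dir only matters as a default parent when dfs.datanode.data.dir is unset. A sketch of what adding a second, larger data directory might look like (the /grid/1/... path is a hypothetical example, and in Ambari this value is edited under HDFS -> Configs rather than in the file directly):

```xml
<!-- hdfs-site.xml: comma-separated list of DataNode block storage
     directories. The second path below is a hypothetical example of a
     mount with more free space; capacity grows by the new volume's size. -->
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/hadoop/hdfs/data,/grid/1/hadoop/hdfs/data</value>
</property>
```

The DataNodes have to be restarted after the change before the extra capacity shows up on the dashboard.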
Labels:
- Apache Ambari
- Apache Hadoop
03-07-2016
11:18 AM
I think I have now understood the calculation displayed on the Ambari Dashboard and the NN Web UI. What confused me was the information displayed under the 'Hosts' link, which shows all nodes with their hard disk consumption. The figure shown for each node (on hovering over the bar) does not match the DFS consumption. The hdfs dfsadmin -report command displays the consumption of the node on which I executed the command, which is fine. I understood this calculation after you posted the information from your cluster, so thanks again!
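The distinction can be sketched with made-up numbers (not taken from this cluster): the Hosts page bar reflects whole-disk usage on each node (DFS blocks plus everything else on the filesystem), while the dashboard's "DFS Used" sums only HDFS block storage across the DataNodes.

```python
# Hypothetical per-node figures in GB; illustrative only.
nodes = {
    "host1": {"disk_used": 28.0, "dfs_used": 9.5},
    "host2": {"disk_used": 27.0, "dfs_used": 9.0},
}

# Hosts page: whole-disk usage per node (DFS blocks + non-DFS files + OS).
for name, n in nodes.items():
    print(f"{name}: {n['disk_used']} GB on disk, {n['dfs_used']} GB in DFS")

# Dashboard "DFS Used": only the block storage, summed over all DataNodes.
cluster_dfs_used = sum(n["dfs_used"] for n in nodes.values())
print(f"cluster DFS used: {cluster_dfs_used} GB")
```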
03-07-2016
11:00 AM
@Neeraj Sabharwal I have put all the details into an image file and attached it. information.png
03-07-2016
10:32 AM
@Neeraj Sabharwal Thanks for the link. The information provided by 'hdfs dfsadmin -report' confuses me further. For example, the Dashboard currently says DFS Used 118.7 GB, but on running the 'hdfs dfsadmin -report' command I see DFS Used as 38.64 GB. Why are these figures so different?
03-07-2016
09:14 AM
1 Kudo
@Artem Ervits The documents didn't help me much with my issue. They provide only general information and don't explain how the calculation is made. Currently my main dashboard shows "100%" DFS consumption and I am seeing red alerts, but if I look at the disk consumption of the individual hosts, I do not see full consumption. I need some help with this so that I can make full use of the hard disk space.
03-02-2016
02:39 PM
2 Kudos
I have a four-node Ambari cluster. When I look at the DFS usage on the Ambari main dashboard and at the disk space on individual hosts, I get different figures. These are the figures on the main dashboard under HDFS Disk Usage:

- DFS used: 49.7 GB (25.84%)
- Non-DFS: 66 GB (34.64%)
- Remaining: 76.1 GB (39.52%)

If I look at the disk space for the individual hosts, I see the following:

- host1: 28.07 GB / 138.87 GB (20.21% used)
- host2: 27.45 GB / 138.87 GB (19.77% used)
- host3: 31.45 GB / 221.63 GB (14.19% used)
- host4: 77.38 GB / 221.38 GB (34% used)

As you can see, the total disk space in GB across the hosts is more than the total shown under HDFS Disk Usage, and I am not able to relate these two figures. Also, I would like to know whether there is any documentation that explains the different sections of the Ambari Dashboard.
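One way to see why the two sets of figures differ (a sketch, not an authoritative explanation): the dashboard percentages are relative to the capacity HDFS is configured to use, which can be much smaller than the raw disk across the hosts. A quick arithmetic check with the numbers above:

```python
# Figures copied from the dashboard and hosts list above, in GB.
dfs_used, non_dfs, remaining = 49.7, 66.0, 76.1
host_capacity = [138.87, 138.87, 221.63, 221.38]

# The three dashboard slices sum to the capacity HDFS is working with...
dfs_capacity = dfs_used + non_dfs + remaining
print(f"DFS-configured capacity: {dfs_capacity:.1f} GB")   # ~191.8 GB

# ...which is far below the raw disk across all four hosts.
raw_disk = sum(host_capacity)
print(f"Raw disk across hosts:   {raw_disk:.2f} GB")       # ~720.75 GB
```

The gap suggests only part of each host's disk (the directories configured for HDFS) counts toward the dashboard's HDFS Disk Usage, while the per-host bars measure the whole disk.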
Labels:
- Apache Ambari
02-17-2016
03:44 PM
1 Kudo
@Artem Ervits I was looking under the "Advanced core-site" section in Ambari, which was my mistake; the properties go under the "Custom core-site" section. Since Neeraj posted the answer first, I have accepted his answer. Thanks anyway 🙂
02-17-2016
03:41 PM
1 Kudo
Thanks Neeraj. It was my mistake that I didn't read the section name carefully. I have added the two new properties, restarted the components suggested by Ambari, and I am able to access the Files view without any error!
02-17-2016
03:08 PM
Okay, but I cannot find the "Add Property.." button to add a new property. Where is this button? I know I am overlooking something. :(
02-17-2016
02:46 PM
1 Kudo
I have created a "Files" view successfully. But when I click on the file view, I see the following error at the top:

500 User: root is not allowed to impersonate admin

I know this has something to do with impersonation. I read this post in the Hortonworks community to understand the cause of and fix for this issue: https://community.hortonworks.com/questions/153/impersonation-error-while-trying-to-access-ambari.html As suggested in that thread, I am trying to set the hadoop.proxyuser.root.groups and hadoop.proxyuser.root.hosts properties under the "HDFS -> Configs -> Advanced" tab. But I am not able to find these two properties in the "Advanced core-site" section. I only see a ProxyUserGroups property under the "Advanced hadoop-env" section, and I am not sure whether that is the property I need to change. Does anybody have an idea about this issue and how to fix it?
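For reference, the two properties from the linked thread would look something like this in core-site (added via "Custom core-site" in Ambari). The wildcard values are the common permissive choice for a test cluster, not necessarily what you want in production:

```xml
<!-- core-site.xml: allow the root user to impersonate other users.
     "*" is fully permissive; restrict to specific hosts and groups
     on a production cluster. -->
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
```

HDFS (and the other components Ambari flags) need a restart before the change takes effect.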
Labels:
- Apache Ambari