Member since: 08-08-2017
Posts: 1652
Kudos Received: 30
Solutions: 11
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1970 | 06-15-2020 05:23 AM |
| | 16047 | 01-30-2020 08:04 PM |
| | 2108 | 07-07-2019 09:06 PM |
| | 8245 | 01-27-2018 10:17 PM |
| | 4671 | 12-31-2017 10:12 PM |
11-29-2017
06:55 PM
Is it possible to print the full application list from the CLI, so I can capture the application IDs with grep and then remove them with hdfs dfs -rm -R /spark2-history/{app-id}?
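A minimal sketch of that approach, assuming the Spark2 event logs live under /spark2-history and the user running the commands has permission to delete there (neither is confirmed in this thread):

```bash
# List what the Spark2 history server has written to HDFS
hdfs dfs -ls /spark2-history

# Loop over the event-log paths (last field of the listing) and delete each one;
# -skipTrash frees the space immediately instead of moving files to .Trash
for f in $(hdfs dfs -ls /spark2-history | awk '/application_/ {print $NF}'); do
  hdfs dfs -rm -R -skipTrash "$f"
done

# Or clear the whole directory contents in one call:
# hdfs dfs -rm -R -skipTrash '/spark2-history/*'
```

The history server rescans the event-log directory periodically, so deleted applications should disappear from the page on their own; restarting the Spark2 History Server also forces a refresh.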
11-29-2017
06:48 PM
OK, so if I want to remove them from the Ambari GUI, how do I do it? (I ask because I do not see any delete option on that page.)
11-29-2017
06:44 PM
@Aditya thank you - but how do I delete all the applications that are using HDFS space? On the page I see a lot of applications, around 1000, so I can't delete them one by one.
11-29-2017
06:36 PM
But I want to delete all the applications now, not wait for the retention to clean them up.
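For reference, the retention mentioned here is normally the Spark history cleaner. A sketch of the relevant settings (spark2-defaults in Ambari; the values below are illustrative, not taken from this cluster):

```
# Enable the background cleaner on the history server
spark.history.fs.cleaner.enabled true
# How often the cleaner checks for old event logs
spark.history.fs.cleaner.interval 1d
# Event logs older than this are deleted on the next cleaner run
spark.history.fs.cleaner.maxAge 7d
```

With the cleaner enabled, old applications age out automatically; the manual hdfs dfs -rm approach sketched above is only needed to reclaim space right away.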
11-29-2017
06:13 PM
From hdfs dfs -du -h / we see that the Spark history takes a lot of space in HDFS. From the Ambari GUI I choose Spark, then Quick Links, and I get the history server page with all the applications. I want to delete all the applications from that page - how do I do it, since I do not see a delete button? Second, is it possible to delete the applications that use HDFS via the API or CLI?
Labels:
- Apache Ambari
- Apache Spark
11-26-2017
01:14 PM
@Jay, first, thanks a lot for the great support. We actually solved it by re-configuring the worker with its previous IP and then restarting the worker host. After the host came back up, the DataNode shows as alive on all workers and the worker is part of the cluster again.
11-26-2017
01:10 PM
The problem was solved. We found a wrong configuration in the hosts file /etc/hosts (wrong host IP address); by editing the hosts file we also fixed the DNS configuration, and this resolved the issue.
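A minimal sketch of how such an entry can be checked and corrected (the IP address shown is a placeholder, not the real one from this cluster):

```bash
# What does the hosts file currently map the worker to?
grep worker06 /etc/hosts

# What does DNS resolve for the same name?
getent hosts worker06.sys58.com
nslookup worker06.sys58.com

# Example of a corrected /etc/hosts line (placeholder address):
# 192.168.1.106   worker06.sys58.com   worker06
```

Once the hosts file and DNS agree on the worker's real IP, the Ambari agent heartbeat and the NodeManager health check come back, which matches what was reported in this thread.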
11-26-2017
01:08 PM
From the article "How to identify what is consuming space in HDFS" (link: https://community.hortonworks.com/articles/16846/how-to-identify-what-is-consuming-space-in-hdfs.html), by running the script from the article we can see what takes the most space. In our case spark-history took the most space, and we deleted the logs/files from the Ambari GUI.
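A rough equivalent of that check with plain HDFS commands, for anyone who does not want to run the full script (just a sketch):

```bash
# Space used by each top-level HDFS directory, in bytes, smallest to largest
hdfs dfs -du / | sort -n

# Human-readable breakdown of the directory that stands out
hdfs dfs -du -h /spark2-history
```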
11-26-2017
12:08 PM
From the Ambari dashboard we can see that HDFS is at 100%. I started with:
hadoop fs -du -h / | grep spark2-history
60.9 G /spark2-history
Another way to see what takes space is https://community.hortonworks.com/articles/16846/how-to-identify-what-is-consuming-space-in-hdfs.html. Does this mean that spark-history takes 60.9 G of HDFS? If not, how do we know what needs to be cleaned from HDFS? We have 5 worker machines and each worker has the following:
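To confirm whether that 60.9 G really is what spark2-history occupies, a few commands can be cross-checked (a sketch; keep in mind that the physical space used on the DataNodes is roughly the logical size multiplied by the replication factor, typically 3):

```bash
# Overall HDFS capacity and usage
hdfs dfs -df -h /

# Total logical size of the Spark2 event-log directory
hdfs dfs -du -s -h /spark2-history

# How many files/directories it holds, i.e. how many old applications are kept
hdfs dfs -count /spark2-history
```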
Labels:
- Apache Ambari
- Apache Hadoop
11-26-2017
08:37 AM
We recently added worker06 to the Ambari cluster. After an ambari-agent restart we see that the worker machine has heartbeat loss. Before the ambari-agent restart the worker's heartbeat was OK, so what could be the reason for that? From the ambari-agent log we can see the following:

ERROR 2017-11-26 08:27:09,659 script_alert.py:123 - [Alert][yarn_nodemanager_health] Failed with result CRITICAL: ['Connection failed to http://worker06.sys58.com:8042/ws/v1/node/info (Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/alerts/alert_nodemanager_health.py", line 171, in execute
    url_response = urllib2.urlopen(query, timeout=connection_timeout)
  File "/usr/lib64/python2.7/urllib2.py", line 154, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib64/python2.7/urllib2.py", line 431, in open
    response = self._open(req, data)
  File "/usr/lib64/python2.7/urllib2.py", line 449, in _open
    '_open', req)
  File "/usr/lib64/python2.7/urllib2.py", line 409, in _call_chain
    result = func(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 1244, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib64/python2.7/urllib2.py", line 1214, in do_open
    raise URLError(err)
URLError: <urlopen error [Errno 111] Connection refused>)']
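The alert itself is just urllib2 getting "Connection refused" on the NodeManager web port (8042), i.e. nothing is listening at the address the health check calls. A few checks that can narrow it down, run on worker06 (a sketch):

```bash
# Is the NodeManager process running?
ps -ef | grep -i nodemanager | grep -v grep

# Is anything listening on the NodeManager web UI port?
netstat -tlnp | grep 8042

# Does the hostname used by the alert resolve to this machine's real IP?
getent hosts worker06.sys58.com

# Can the alert's endpoint be reached at all?
curl -s http://worker06.sys58.com:8042/ws/v1/node/info | head
```

In this thread the root cause turned out to be a wrong entry in /etc/hosts, which fits the "Connection refused" pattern: the health check was hitting an address where no NodeManager was listening.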
Labels:
- Apache Ambari
- Apache Hadoop
- Apache YARN