
How to delete all application logs from the Spark History Server (not by rotation)?


From the output of

hdfs dfs -du -h /

we can see that the Spark history directory takes up a lot of space in HDFS.
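For example, to narrow the check down to the history directories themselves (a sketch, assuming the default HDP paths /spark-history and /spark2-history):

# Show the size of each Spark history directory
hdfs dfs -du -h /spark-history /spark2-history

# List the largest application logs first
hdfs dfs -du /spark2-history | sort -n -r | head -20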

From the Ambari GUI I choose Spark, then Quick Links, and then I get the History Server page with all applications.


I want to delete all applications from that page. How can I do it? I do not see a delete button.

Second: is it possible to delete the applications' data in HDFS via an API or the CLI?

Michael-Bronson
1 ACCEPTED SOLUTION

Super Guru

@Michael Bronson,

If you want to delete applications in Spark2:

hdfs dfs -rm -R /spark2-history/{app-id}

If you want to delete applications in Spark1:

hdfs dfs -rm -R /spark-history/{app-id}

Restart the history servers after running the commands.
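If the goal is to clear every application at once, a minimal sketch (assuming the default /spark2-history and /spark-history paths, and that none of the logs need to be kept) would be:

# Remove all Spark2 application logs; -skipTrash frees the HDFS space immediately
hdfs dfs -rm -R -skipTrash '/spark2-history/*'

# Same for Spark1, if that directory is in use
hdfs dfs -rm -R -skipTrash '/spark-history/*'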

Thanks,

Aditya


14 REPLIES


I tried this but got errors. What is wrong in my syntax? (master02 is the name of the Spark History Server host.)


curl -sH "X-Requested-By: ambari" -u admin:admin -i curl http://master02:8080/api/v1/applications


HTTP/1.1 404 Not Found

X-Frame-Options: DENY

X-XSS-Protection: 1; mode=block

X-Content-Type-Options: nosniff

Michael-Bronson

Super Guru

@Michael Bronson,

I see that you are using port 8080.

For the Spark History Server the port is 18080 by default.

For the Spark2 History Server the port is 18081 by default. You can check the port in the UI where you saw the applications.
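For example, a quick check against the Spark2 History Server REST API (assuming the default port 18081 on master02) looks like this:

# List all applications known to the Spark2 History Server (returns JSON)
curl -s http://master02:18081/api/v1/applications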


OK, now I am using this:


curl -sH "X-Requested-By: ambari" -u "$API_USER"":""$API_PASSWORD" -i curl http://master02:18081/api/v1/applications


but the command produces no output at all.

What is wrong?

Michael-Bronson

Super Guru
@Michael Bronson

Run this command as is. There is no need to append headers and a password. In the command above you were using curl twice.

curl http://master02:18081/api/v1/applications | grep "\"id\"" > a.txt
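Once you have the IDs, one possible follow-up (a sketch only: it assumes Spark2, the default /spark2-history path, and that the event logs are named after the application IDs, as in the accepted answer) is to feed them straight into the HDFS delete command:

# Extract the application IDs from the JSON and delete each one's event log
curl -s http://master02:18081/api/v1/applications \
  | grep -o '"id"[[:space:]]*:[[:space:]]*"[^"]*"' \
  | sed 's/.*"\([^"]*\)"/\1/' \
  | while read -r app_id; do
      hdfs dfs -rm -R "/spark2-history/${app_id}"
    done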

Cloudera Employee
Hi, if a large number of files are present in the Spark History Server location, the filesystem will not work as expected. In that case, we may need to write a script that deletes files older than 7 days (or per your requirement) from the Spark History Server location.

Thanks,
Arun
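One possible shape for such a cleanup script (a sketch only: it assumes the default /spark2-history path, GNU date on the node that runs it, and a 7-day retention that you can adjust):

# Delete Spark history files older than RETENTION_DAYS from HDFS
RETENTION_DAYS=7
CUTOFF=$(date -d "-${RETENTION_DAYS} days" +%s)

hdfs dfs -ls /spark2-history | awk 'NR>1 {print $6, $7, $8}' | while read -r day time path; do
  FILE_TS=$(date -d "$day $time" +%s)   # modification time of the entry
  if [ "$FILE_TS" -lt "$CUTOFF" ]; then
    hdfs dfs -rm -R "$path"             # older than the retention window
  fi
done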