Member since: 07-07-2015 · 6 Posts · 2 Kudos Received · 0 Solutions
10-19-2018
02:10 AM
1 Kudo
You can try running the following in your browser: https://<query_coordinator_server_name>:25000/close_session?session_id=<session_id>
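For a scriptable equivalent, the same URL can be hit with curl. This is a hedged sketch, not a confirmed procedure: the host and session id values are the placeholders from the URL above, and the -k flag is only needed if the coordinator's debug web server uses a self-signed certificate.

```shell
#!/bin/sh
# Build the close-session URL from the post above. COORD and SESSION
# are placeholders: substitute your coordinator host and the session id
# shown on the coordinator's debug web pages.
COORD="query_coordinator_server_name"
SESSION="session_id"
URL="https://${COORD}:25000/close_session?session_id=${SESSION}"
echo "$URL"
# curl -k "$URL"   # -k skips cert verification for self-signed certs
```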
10-24-2017
12:12 AM
There is a simple method to remove those.
1. List those directories in a text file, like below: hadoop fs -ls /path > test
2. cat -t test will show you the positions of the duplicates with the junk characters.
3. Open another shell and comment entries out with # to identify the exact ones.
4. Run cat -t on the file again to confirm you commented the culprits.
5. Remove the original (good) folder from the list file.
6. for i in `cat list`; do hadoop fs -rmr $i; done
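The steps above can be sketched as one script. This is a hedged sketch, not the poster's exact workflow: the hadoop commands are shown but commented out so the filtering logic can run anywhere, /path, listing, and list are placeholder names, and steps 3-5 are automated here with grep/awk instead of the manual commenting described in the post. Note also that hadoop fs -rmr is deprecated in newer releases; hadoop fs -rm -r is the current form.

```shell
#!/bin/sh
# 1. Dump the directory listing to a local file (uncomment on a cluster
#    node; /path is a placeholder).
# hadoop fs -ls /path > listing
# Stand-in listing for illustration: one clean path, one with a
# control character (\001) embedded in the name.
printf 'drwxr-xr-x   - hdfs hdfs 0 2017-10-24 00:00 /data/clean\n' > listing
printf 'drwxr-xr-x   - hdfs hdfs 0 2017-10-24 00:00 /data/junk\001dir\n' >> listing

# 2. Make non-printing characters visible (cat -t shows \001 as ^A).
cat -t listing

# 3-5. Keep only entries whose path contains a non-printable character;
#      the last whitespace-separated field of each ls line is the path.
LC_ALL=C grep '[^[:print:]]' listing | awk '{print $NF}' > list

# 6. Remove each flagged directory (uncomment on a cluster node).
# while read -r i; do hadoop fs -rm -r "$i"; done < list
cat list
```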
03-03-2016
09:01 AM
Thank you for following up as always, Srini!