I am attempting to copy multiple files from an FTP location into HDFS via "hadoop fs -get <ftp url> <hdfs location>" and am seeing errors like:
-get: Fatal internal error
org.apache.hadoop.fs.ftp.FTPException: Failed to get home directory
Caused by: org.apache.commons.net.ftp.FTPConnectionClosedException: Connection closed without indication.
Looking at this answer (https://stackoverflow.com/a/34506003/8236733), I assume that when calling "hadoop fs -get <ftp url> <hdfs location>" multiple times, the connection is being left open and goes stale after some time, which causes the error (do let me know if it's actually something else, though I will note I was not getting this error until I increased the concurrency past a certain point).
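For reference, the behavior I'm hoping to get is roughly what the following Python sketch does with ftplib (host, credentials, and paths are placeholders, not my actual setup): each file is fetched over its own short-lived connection that is closed immediately after the transfer, so no connection is ever left open long enough to go stale.

```python
# Hypothetical sketch: one fresh FTP connection per file transfer,
# so nothing is left open between transfers to go stale.
from ftplib import FTP

def fetch_one(host, user, password, remote_path, local_path):
    """Download a single file, opening and closing the connection around it."""
    ftp = FTP(host)  # fresh connection for this file only
    try:
        ftp.login(user, password)
        with open(local_path, "wb") as f:
            ftp.retrbinary("RETR " + remote_path, f.write)
    finally:
        ftp.quit()  # always close, even on error, so no stale connection remains
```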
When adding the "disconnect=true" option to the ftp url as