Hi, can someone please tell me what I did wrong? I have attached a screenshot.
Thanks in advance 🙂
The directory "/forPig" (or a file inside it) most likely already exists on your local filesystem, so the `hdfs dfs -get` command cannot replace the existing local content with the HDFS file "/testing/pigData/drivers.csv".
If you want to overwrite the content inside the "/forPig" directory, use the "-f" (force) option.
# hdfs dfs -get -f /testing/pigData/drivers.csv /forPig/drivers.csv
(OR)
# hdfs dfs -get -f /testing/pigData/drivers.csv /forPig/
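To see why the `-f` flag matters, here is a minimal local sketch (plain files under /tmp as stand-ins for HDFS; the paths are illustrative, no Hadoop cluster needed) of the "refuse to clobber without force, overwrite with force" behaviour:

```shell
# Simulate the overwrite behaviour locally:
mkdir -p /tmp/forPig_demo
echo "old" > /tmp/forPig_demo/drivers.csv   # pre-existing local copy
echo "new" > /tmp/source.csv                # stands in for the HDFS file

# Without force: refuse to clobber an existing destination
# (this mimics the error 'hdfs dfs -get' reports)
if [ -e /tmp/forPig_demo/drivers.csv ]; then
  echo "get: /tmp/forPig_demo/drivers.csv: File exists" >&2
else
  cp /tmp/source.csv /tmp/forPig_demo/drivers.csv
fi

# With force (-f equivalent): overwrite unconditionally
cp -f /tmp/source.csv /tmp/forPig_demo/drivers.csv
cat /tmp/forPig_demo/drivers.csv
```

After the forced copy, the destination holds the new content, which is exactly what `-get -f` does against the local target directory.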
Also, in your case "/forData" seems to be a directory, so you can either specify the target filename or just the directory, as shown above.
Instead of doing "# cd forPig" and then checking the listing, try specifying the fully qualified path of the directory, because there may be multiple "forPig" directories (one at a relative path and another at an absolute path):
# pwd
# ls -lart /forPig
If you still face the issue, enable DEBUG logging to see whether anything else is going wrong:
# export HADOOP_ROOT_LOGGER=DEBUG,console
# hdfs dfs -get -f /testing/pigData/drivers.csv /forPig/
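One small tip: exporting `HADOOP_ROOT_LOGGER` makes every subsequent `hdfs` command in that shell verbose. You can instead set the variable for a single command by prefixing it (standard shell behaviour, not Hadoop-specific); the `DEMO_VAR` lines below are just a local demonstration of that scoping:

```shell
# One-shot form (the env var applies only to this one command):
#   HADOOP_ROOT_LOGGER=DEBUG,console hdfs dfs -get -f /testing/pigData/drivers.csv /forPig/

# Demonstration of the scoping with a plain shell command:
one_shot=$(DEMO_VAR=hello sh -c 'printf "%s" "$DEMO_VAR"')
after=${DEMO_VAR:-unset}
echo "$one_shot"   # the child command saw the variable
echo "$after"      # the current shell did not keep it
```

This way your normal `hdfs` commands stay quiet, and only the command you are debugging produces DEBUG output.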