Created on 03-12-2020 10:00 PM - last edited on 03-12-2020 11:22 PM by VidyaSargur
Hi, can someone please tell me what I did wrong? I have attached a screenshot.
Thanks in advance 🙂
Created on 03-12-2020 10:12 PM - edited 03-12-2020 10:18 PM
The hdfs get command fails because the directory "/forPig" (or its contents) already exists in your local file system, so it cannot replace the local directory/file "/forPig" with the content of the HDFS file "/testing/pigData/drivers.csv".
If you want to overwrite the content inside "/forPig", use the "-f" (force) option:
# hdfs dfs -get -f /testing/pigData/drivers.csv /forPig/drivers.csv
(OR)
# hdfs dfs -get -f /testing/pigData/drivers.csv /forPig/
Also, in your case "/forData" seems to be a directory, so you can either specify the target filename or just the directory, as in the examples above.
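For example, a minimal sketch assuming "/forData" is the local destination directory mentioned above (adjust the path to match what is in your screenshot):
# hdfs dfs -get -f /testing/pigData/drivers.csv /forData/drivers.csv
# hdfs dfs -get -f /testing/pigData/drivers.csv /forData/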
Created 03-12-2020 10:18 PM
@jsensharma Thanks for your reply, but if there is already a file in that directory, why can't I see it when I use the ls command?
Created on 03-12-2020 10:22 PM - edited 03-12-2020 10:23 PM
Instead of doing "# cd forPig" and then checking the listing, can you try specifying the fully qualified path of the directory, as below? (There may be multiple "forPig" directories, for example one on a relative path and another on an absolute path.)
# pwd
# ls -lart /forPig
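As an extra sanity check (not something visible in the screenshot), you can also confirm the source file actually exists on HDFS:
# hdfs dfs -ls /testing/pigData/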
If you still face any issue, please enable DEBUG logging to see if there is anything wrong:
# export HADOOP_ROOT_LOGGER=DEBUG,console
# hdfs dfs -get -f /testing/pigData/drivers.csv /forPig/
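Once you are done debugging, you can set the logger back to its default level in the same shell session (this only affects the current shell):
# export HADOOP_ROOT_LOGGER=INFO,console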