Support Questions

My file is copied to local but I can't find it in HDFS

New Contributor

Hi, Can someone please tell me what I did wrong? I have attached the screenshot.

Thanks in advance 🙂 (attached: del.PNG)

3 REPLIES

Super Mentor

@Aman075 

The directory "/forPig" (or its contents) already exists on your local file system, so the "hdfs dfs -get" command cannot replace the local dir/file "/forPig" contents with the HDFS file "/testing/pigData/drivers.csv".

 

So if you want to overwrite the content inside the directory "/forPig", you can use the "-f" (force) option:

 

# hdfs dfs -get -f /testing/pigData/drivers.csv /forPig/drivers.csv
(OR)
# hdfs dfs -get -f /testing/pigData/drivers.csv /forPig/


Also, in your case "/forData" seems to be a directory, so you can either specify the target filename there or just the directory, as shown above.
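The same destination rule applies to a plain local "cp", which can illustrate the behavior without a cluster. This is just a sketch with hypothetical paths under /tmp — the destination may be either an explicit filename or an existing directory, and "-f" forces the overwrite:

```shell
# "cp", like "hdfs dfs -get", accepts either an explicit target filename
# or an existing directory as the destination (hypothetical /tmp paths).
mkdir -p /tmp/pigDemo/forPig
echo "driverId,name" > /tmp/pigDemo/drivers.csv

cp /tmp/pigDemo/drivers.csv /tmp/pigDemo/forPig/drivers.csv  # explicit filename
cp -f /tmp/pigDemo/drivers.csv /tmp/pigDemo/forPig/          # into the directory, overwriting

ls /tmp/pigDemo/forPig
```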

 

New Contributor

@jsensharma Thanks for your reply, but if there is already a file in that directory, why can't I see it when I use the ls command?

Super Mentor

@Aman075 

Instead of doing "# cd forPig" and then checking the listing, can you try specifying the fully qualified path of the directory, as below? (There may be multiple "forPig" directories, e.g. one on a relative path and another on an absolute path.)

 

# pwd
# ls -lart /forPig

 

If you still face any issue, then please try DEBUG logging to see if there is anything wrong:

 

# export HADOOP_ROOT_LOGGER=DEBUG,console
# hdfs dfs -get -f /testing/pigData/drivers.csv /forPig/

 
