Solved
How to use path globs to copy files from local to HDFS?
Labels:
- Apache Hadoop
Expert Contributor
Created ‎01-05-2016 10:08 AM
I have a number of files whose names follow one of two formats:
1) filename+date.fileformat
2) filename.fileformat
I need to copy only the files that have a number immediately before the dot (i.e., before the file extension).
1 ACCEPTED SOLUTION
Guru
Created ‎01-05-2016 05:03 PM
You can do something like this:
hdfs dfs -put *filename*[0-9].txt /tmp
For example:
$ touch ${RANDOM}filename-$(date +"%F").txt ${RANDOM}filename.txt
$ ls *filename*.txt
17558filename-2016-01-05.txt  27880filename.txt
$ hdfs dfs -put *filename*[0-9].txt /tmp
$ hdfs dfs -ls /tmp
-rw-r--r--   3 hdfs hdfs          0 2016-01-05 16:39 /tmp/17558filename-2016-01-05.txt
If the pattern doesn't expand, your shell may have globbing turned off; re-enable it by adding this to the beginning of your script:
set +f
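To see why `set +f` matters, here is a minimal local sketch (no HDFS needed, using made-up file names) showing that with globbing disabled the pattern is passed through literally, and with it re-enabled the pattern matches only the file with a digit before the dot:

```shell
#!/bin/sh
# Work in a scratch directory so the glob only sees our test files.
dir=$(mktemp -d)
cd "$dir"
touch 17558filename-2016-01-05.txt 27880filename.txt

set -f                           # globbing off: pattern stays literal
echo *filename*[0-9].txt         # prints: *filename*[0-9].txt

set +f                           # globbing back on
echo *filename*[0-9].txt         # prints: 17558filename-2016-01-05.txt
```

The same expansion happens before `hdfs dfs -put` ever runs, which is why the fix belongs in the shell, not in the HDFS command.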