How do I use path globs to copy files from local to HDFS?

Rising Star

I have a number of files in two naming formats:

1) filename+date.fileformat

2) filename.fileformat

Now I need to copy only the files that have a number before the dot (i.e., format 1).

1 ACCEPTED SOLUTION

You can do something like this:

hdfs dfs -put *filename*[0-9].txt /tmp

For example:

$ touch ${RANDOM}filename-$(date +"%F").txt ${RANDOM}filename.txt
$ ls *filename*.txt
17558filename-2016-01-05.txt  27880filename.txt
$ hdfs dfs -put *filename*[0-9].txt /tmp
$ hdfs dfs -ls /tmp
-rw-r--r--   3 hdfs      hdfs          0 2016-01-05 16:39 /tmp/17558filename-2016-01-05.txt

If the glob doesn't expand, run this first to re-enable shell pathname expansion (it may have been disabled earlier with set -f):

set +f
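To see why the pattern works: [0-9] matches exactly one digit, so the glob only selects names whose last character before .txt is numeric. Here is a minimal local sketch (no HDFS involved; the /tmp/glob-demo directory and file names are illustrative):

```shell
# Sketch: demonstrate the glob locally (directory and file names are examples).
mkdir -p /tmp/glob-demo
cd /tmp/glob-demo
touch filename-2016-01-05.txt filename.txt

# Ensure pathname expansion is enabled (a prior `set -f` would disable it).
set +f

# [0-9] matches a single digit, so only names ending in a digit
# before .txt are listed:
ls *filename*[0-9].txt    # -> filename-2016-01-05.txt
```

The same glob, passed to hdfs dfs -put, is expanded by the local shell before the command runs, so only the matching local files are uploaded.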

