Transfer files to S3 based on file timestamp

Contributor

I have a use case where I am reading files that carry a timestamp in the filename, and these files have to be transferred to S3 into folders named after the respective dates. For example, the file names are abcd.out.gz.20200303 and abcd.out.gz.20200302,

and the file abcd.out.gz.20200303 needs to end up in S3 under /data/20200303,

and the file abcd.out.gz.20200302 under /data/20200302.

 

How can I achieve this in NiFi?

1 ACCEPTED SOLUTION

Contributor

I got a solution for this. I had to use Expression Language in the Object Key to extract the date from the filename, and it worked. Below is the expression: ${filename:substringAfter('.gz.')}/${filename}
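
For the sample filenames from the question, the expression evaluates roughly as follows (a minimal sketch; the /data prefix is assumed to come from the bucket path or a key prefix configured separately):

PutS3Object (sketch):
    Object Key = ${filename:substringAfter('.gz.')}/${filename}

    For filename = abcd.out.gz.20200303:
        ${filename:substringAfter('.gz.')}   evaluates to   20200303
        Object Key                           becomes        20200303/abcd.out.gz.20200303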


4 REPLIES

Explorer

1. ListS3: list all object paths in the specified bucket.

2. RouteOnAttribute: filter out unwanted files (optional).

3. FetchS3Object: fetch the file content.

4. UpdateAttribute: rewrite the filename (the object path in the bucket) to the target path, e.g. as sketched after this reply.

5. PutS3Object: put the file into the target bucket.

 

PS: Once FetchS3Object executes, the files are loaded into memory, so it's better to limit the Back Pressure Object Threshold or Size Threshold on the connection after FetchS3Object.
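
For step 4, here is a minimal UpdateAttribute sketch, assuming the date-suffixed filenames from the question (the dynamic property and expression shown are illustrative, not part of the original reply):

UpdateAttribute (sketch):
    Add a dynamic property named   filename
    with the value                 ${filename:substringAfter('.gz.')}/${filename}

    For filename = abcd.out.gz.20200303 this rewrites the attribute to
    20200303/abcd.out.gz.20200303, which PutS3Object can then use as the object key.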

 

 

Contributor

@AustinLiu: But I need to transfer the file abcd_20200303 to the S3 folder 20200303, and so on for each date. Every day when the files arrive, my processor should identify each file based on its date and push it to the respective date folder in S3.

Contributor

@AustinLiu Just to clarify, I am transferring the files from a Linux box to S3.
