
Read Database tables and convert into CSV format

Contributor

Hi Team,

I'm trying to read data from a MySQL database and place it in S3. I was able to read the data from MySQL and put it in an S3 bucket, but the data landing in S3 is JSON, not CSV. Below is the flow I have right now. I have a total of 6 tables to read from the database, and each table has a different schema.

[Attachment: 64537-mainflow.png]

I want to place CSV data in the S3 bucket. I got some suggestions from experts for that, but I'm a bit puzzled about whether what I'm doing is correct. Any other possible solutions/suggestions for placing CSV data in the S3 bucket would be helpful.

Suggestions I got:

https://stackoverflow.com/questions/49145832/convert-json-to-csv-in-nifi

https://community.hortonworks.com/questions/176254/convert-json-to-csv-in-nifi.html?childToView=1770...

[Attachments: 64541-inferavroschema.png, 64540-evaluate.png, 64539-convertrecord.png, 64538-plan.png]

1 ACCEPTED SOLUTION

Master Guru
@User 805

Since you are using NiFi 1.5, you can use AvroReader as the Record Reader and CSVRecordSetWriter as the Record Writer in the ConvertRecord processor.

ConvertRecord processor:
Record Reader --> AvroReader // reads the incoming Avro-format flowfile content
Record Writer --> CSVRecordSetWriter // writes the output in CSV format

Flow:-

  1. ListDatabaseTables
  2. GenerateTableFetch
  3. UpdateAttribute(are you changing the filename to UUID?)
  4. ConvertRecord
  5. PutS3Object
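If the UpdateAttribute step is where the filename gets set, a common pattern is to build a CSV filename from the table name with NiFi Expression Language. The attribute name below is an assumption on my side (ListDatabaseTables writes `db.table.name`); adjust it to whatever attributes your flow actually carries:

```
# UpdateAttribute property (assumed attribute names; verify against your flowfiles)
filename = ${db.table.name}-${UUID()}.csv
```

This keeps the six tables' outputs distinguishable in the S3 bucket and avoids overwriting objects when a table is fetched more than once.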

Please refer to the link below on how to configure AvroReader:
https://community.hortonworks.com/questions/175208/how-to-store-the-output-of-a-query-to-one-text-fi...

If you are still facing issues, then share a sample of 10 records in CSV (or) JSON format with us so we can recreate your scenario on our side.
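Conceptually, ConvertRecord does a record-by-record translation between formats: the reader parses each incoming record, and the writer serializes it back out in the target format. As a rough stand-alone sketch of that translation (using JSON records as a stand-in for the Avro input, since plain-stdlib Python can't read Avro; the function name is mine, not a NiFi API):

```python
import csv
import io
import json

def records_to_csv(records_json: str) -> str:
    """Translate a JSON array of flat records into CSV text --
    roughly what ConvertRecord does with a CSVRecordSetWriter."""
    records = json.loads(records_json)
    out = io.StringIO()
    # Derive the header from the first record's fields.
    # (ConvertRecord takes this from the reader's schema instead.)
    writer = csv.DictWriter(out, fieldnames=list(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
    return out.getvalue()

sample = '[{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]'
print(records_to_csv(sample))
```

Inside NiFi you don't write any of this yourself; the point is only that the reader/writer pair determines the output format, which is why swapping the writer to CSVRecordSetWriter is all that's needed.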


4 REPLIES


Contributor

Hi @Shu,

I made the changes you suggested and it worked. I added one extra processor (ExecuteSQL). I'm posting the flow here so it might help someone in the future who has the same requirement. Really appreciate your help. Thanks a lot.

[Attachment: 64544-screen-shot-2018-03-08-at-101317-pm.png]

Master Guru
@User 805

Awesome, thank you very much for sharing the flow with us 🙂

Explorer

Hi @Shu_ashu, @ramesh_ganginen,

[Attachments: Screenshot (27).png, Screenshot (28).png]

I am encountering the same issue. I attempted the method you suggested, but it didn't work. I'm trying to copy a table from my MySQL database to my local machine using Apache NiFi. I've configured the ConvertRecord processor and added the CSVRecordSetWriter, but I'm still receiving files in JSON format. Can you please guide me if I'm making any mistakes?