Support Questions


Getting AvroRuntimeException: Invalid sync while loading data from MySQL to HDFS in Avro format with the same filename


Hi Guys,

I have made the following NiFi flow to capture data changes from MySQL and load them into HDFS, writing into only one file named ddMMyyyy each day:

QueryDatabaseTable -> UpdateAttribute (Attribute name: filename, Value: ${now():format("ddMMyyyy")}) -> PutHDFS.
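For reference, NiFi's ${now():format(...)} accepts java.text.SimpleDateFormat patterns, so "ddMMyyyy" yields a day-month-year string with no separators, i.e. one filename per calendar day. A quick Python equivalent (an illustration only; in the flow itself the expression is evaluated inside NiFi):

```python
from datetime import datetime

# Equivalent of NiFi's ${now():format("ddMMyyyy")}: two-digit day, two-digit
# month, four-digit year, no separators -- one filename per calendar day.
filename = datetime.now().strftime("%d%m%Y")
print(filename)
```

Because the name only changes once a day, every flowfile generated within the same day targets the same HDFS path.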

The above flow works fine when loading data from MySQL for the first time, but after a new insert or update, opening the file with hdfs dfs -text gives the following error:

org.apache.avro.AvroRuntimeException: Invalid sync!
        at org.apache.avro.file.DataFileStream.hasNext(...)
        at$Cat.printToStdout(...)
        at$Cat.processPath(...)
        ...
        at org.apache.hadoop.fs.FsShell.main(...)
Caused by: Invalid sync!
        at org.apache.avro.file.DataFileStream.nextRawBlock(...)
        at org.apache.avro.file.DataFileStream.hasNext(...)
        ... 18 more

However, if I insert a ConvertAvroToJSON processor in between, it works fine and gives output in JSON format, but the above error still occurs for Avro format. Can anyone suggest a solution to resolve this error?
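A likely root cause (an inference, not confirmed in the thread): if PutHDFS is configured to append to the existing daily file, each new flowfile's bytes are appended verbatim, but every Avro object-container file carries its own header and its own random 16-byte sync marker, so two concatenated files are not one valid file. A minimal pure-Python sketch of the container layout, hand-built per the Avro spec with no external libraries, shows why the reader's sync check must fail:

```python
import io
import json
import os

def zigzag_varint(n):
    """Encode a non-negative int as an Avro long (zig-zag + base-128 varint)."""
    n = (n << 1) ^ (n >> 63)
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)
        else:
            out.append(b)
            return bytes(out)

def avro_container(payload, nrecords, sync):
    """Hand-build a minimal Avro object-container file: magic, metadata map,
    16-byte sync marker, then one data block terminated by the same marker."""
    schema = json.dumps("bytes").encode()         # trivial schema for the demo
    buf = io.BytesIO()
    buf.write(b"Obj\x01")                         # file magic
    buf.write(zigzag_varint(1))                   # metadata map: one entry
    key = b"avro.schema"
    buf.write(zigzag_varint(len(key)) + key)
    buf.write(zigzag_varint(len(schema)) + schema)
    buf.write(zigzag_varint(0))                   # end of metadata map
    buf.write(sync)                               # header sync marker
    buf.write(zigzag_varint(nrecords))            # block: record count
    buf.write(zigzag_varint(len(payload)))        # block: serialized size
    buf.write(payload)
    buf.write(sync)                               # block terminator
    return buf.getvalue()

# Two independently written files get two different random sync markers.
sync_a, sync_b = os.urandom(16), os.urandom(16)
file_a = avro_container(zigzag_varint(1) + b"a", 1, sync_a)
file_b = avro_container(zigzag_varint(1) + b"b", 1, sync_b)

# Byte-appending file_b onto file_a is what writing to the same filename
# with an append strategy produces on disk.
combined = file_a + file_b

# A reader remembers sync_a from the header and verifies that every data
# block ends with sync_a; once it reaches file_b's data it finds sync_b
# (or file_b's stray header bytes) instead -> "Invalid sync!".
print(sync_a != sync_b)
```

Under this assumption, appending raw Avro flowfiles to one HDFS file can never produce a readable Avro file; the blocks would have to be merged under a single header (for example via Avro-aware merging) rather than byte-appended.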


Master Guru
@Parth Karkhanis

Could you try introducing a SplitAvro processor in your flow after the QueryDatabaseTable processor, and configure it to create small chunks of flowfiles instead of one big Avro file, then run your commands again?


Hi Shu,

I got the same error after adding SplitAvro between QueryDatabaseTable and UpdateAttribute. Is it not possible with a single Avro file? I know it works fine with individual small flowfiles.