PutHiveStreaming processor - Avro record error


Hi @Shu

I have seen your article below and tried everything that was suggested there. You have helped me a lot, and thanks for that.

I'm still stuck with the Avro record error. The flow uses:


  1. QueryDatabaseTable
  2. PutHiveStreaming

The target tables are created with the same column names as the source table, and are partitioned, bucketed, and stored as ORC.

Also, could you please help us understand how the flow reads the data and how it should be configured? We are trying to connect from DB2 to Hive. Do we really need to convert in between from DB2 to Avro, or anything of that sort?


Super Guru
@Raj ji

First, make sure the Avro schema field names match (case-sensitively) the value specified in the Partition Columns property.

If the field names in your Avro data file are in capital letters, then you need to change the property value to match the Avro schema field names.
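To illustrate the case-sensitive match, here is a minimal sketch (not part of NiFi itself) that checks the field names of a hypothetical Avro schema, such as one produced by QueryDatabaseTable, against a hypothetical Partition Columns value. The schema fields and the `sale_date` column name are invented for the example:

```python
import json

# Hypothetical Avro schema; note the upper-case field names,
# which source databases often return for unquoted identifiers.
avro_schema = json.loads("""
{
  "type": "record",
  "name": "source_table",
  "fields": [
    {"name": "ID", "type": ["null", "int"]},
    {"name": "SALE_DATE", "type": ["null", "string"]},
    {"name": "AMOUNT", "type": ["null", "double"]}
  ]
}
""")

# Hypothetical value of the PutHiveStreaming "Partition Columns" property.
partition_columns = ["sale_date"]

schema_fields = [f["name"] for f in avro_schema["fields"]]

# The comparison is case-sensitive, so "sale_date" does NOT
# match "SALE_DATE" and the record fails to stream into Hive.
missing = [c for c in partition_columns if c not in schema_fields]
print(missing)  # ['sale_date'] -> mismatch; set the property to SALE_DATE
```

In this sketch the fix is to change the property value to `SALE_DATE`, after which `missing` comes back empty.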

Please refer to this and this; these links explain the Hive Streaming API.
