Support Questions

Error: PutDatabaseRecord + Hive connection pool

Contributor

Hello,

I'm using the PutDatabaseRecord processor with a Hive connection pool to insert data into my table, but I get the following error:

Error (screenshot): 77415-error.png

My log file (screenshot): 77416-fichierlog.png

I used the same connection pool with PutHiveQL and everything works fine.

Can someone help me, please?

Thank you

1 ACCEPTED SOLUTION

Master Guru

What version of the Hive driver are you using? I'm not sure there is a version of the Hive driver available that supports all the JDBC API calls made by PutDatabaseRecord, such as executeBatch(). Also since the Hive JDBC driver auto-commits after each operation, PutDatabaseRecord + Hive would not be any more performant than using PutHiveQL. In an upcoming version of NiFi/HDF (for Hive 3.0), you should be able to use PutHive3Streaming to do what you want.
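For reference, PutDatabaseRecord inserts records through the standard JDBC batch API: it calls `PreparedStatement.addBatch()` per record and then `executeBatch()` to flush. The Hive 2.x JDBC driver is known to throw `SQLFeatureNotSupportedException` for these methods. Below is a minimal, self-contained sketch of that failure mode; the `java.lang.reflect.Proxy` stub stands in for the real hive-jdbc driver (so no HiveServer2 connection is needed), and the exception message is illustrative:

```java
import java.lang.reflect.Proxy;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;

public class HiveBatchDemo {

    // Stub mimicking the Hive 2.x JDBC driver, whose PreparedStatement
    // throws SQLFeatureNotSupportedException for the batch methods.
    static PreparedStatement hiveLikeStatement() {
        return (PreparedStatement) Proxy.newProxyInstance(
                PreparedStatement.class.getClassLoader(),
                new Class<?>[]{PreparedStatement.class},
                (proxy, method, args) -> {
                    String name = method.getName();
                    if (name.equals("addBatch") || name.equals("executeBatch")) {
                        throw new SQLFeatureNotSupportedException("Method not supported: " + name);
                    }
                    return null; // other calls are irrelevant to this demo
                });
    }

    // The call sequence PutDatabaseRecord relies on for each batch of records.
    static String tryBatch(PreparedStatement stmt) {
        try {
            stmt.addBatch();     // queue one record's INSERT
            stmt.executeBatch(); // flush the whole batch to the database
            return "batch ok";
        } catch (SQLException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryBatch(hiveLikeStatement()));
    }
}
```

The real driver's `PreparedStatement` behaves the same way for these calls, which is why the processor fails even though the connection pool itself is valid.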


5 REPLIES


Contributor

@Matt Burgess Thank you for your response.

I'm using hive-jdbc-2.1.0.jar.

So the PutDatabaseRecord processor does not support inserts with a Hive connection pool? If that's the case, why is this connection pool offered in the processor? And can we run UPDATE and DELETE queries with PutDatabaseRecord + Hive?

Master Guru

The HiveConnectionPool is a special type of DBCPConnectionPool, so instances of it are listed alongside all the others. It is not the connection pool that lacks support for these operations, but the driver itself.

What format is the input data in? You should be able to use ConvertRecord with a FreeFormTextWriter to generate SQL from your input data (don't forget the semicolon at the end of the line), then you can send that to PutHiveQL.
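As an illustration of that approach, assuming a hypothetical table `my_table` and CSV columns `id`, `name`, and `amount` (the table name, columns, and quoting below are assumptions to adapt to your schema), the writer's text template could look something like:

```sql
INSERT INTO my_table (id, name, amount) VALUES (${id}, '${name}', ${amount});
```

Each incoming record then produces one semicolon-terminated statement, which PutHiveQL can execute.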

Contributor

@Matt Burgess

I have a CSV file. I will try this approach.

Thank you! I accept this answer 🙂

New Contributor

Hi @mburgess,

 

I was looking for a processor that would let me pull a query out of a field of the incoming flow file, instead of turning the entire flow file into a query. PutDatabaseRecord allowed me to do that (which is how I discovered that the Hive connection does not support an explicit call to conn.commit()).

 

I want to keep as much of the flow file intact as possible, is there a way to do that?

 

Thank you.