We are trying to load data from a Hive external table into a Hive managed table and are running into Hive permission issues. NiFi connects to Hive as the user `nifi`, and that is where the load fails.
Note: the Avro files are created after pulling the data from the source Hive external table.
Please refer to the attached screenshot.
Since your SelectHiveQL processor has Max Rows Per Flow File set to 0, if you are dealing with a large volume of data it is better to use a SplitAvro processor to split the output into 1000 records per flowfile (because Hive Streaming's configured Records Per Transaction is 1000) and then feed those flowfiles to the PutHiveStreaming processor.
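For reference, the relevant processor settings described above might look like the following sketch (property names as they appear in the standard NiFi processors; the exact values, other than the 1000-record alignment, are illustrative):

```
SelectHiveQL
  Max Rows Per Flow File  : 0        # all rows emitted as one Avro flowfile

SplitAvro
  Split Strategy          : Record Split
  Output Size             : 1000     # match Records Per Transaction below

PutHiveStreaming
  Records per Transaction : 1000
```

Keeping the SplitAvro output size aligned with Records Per Transaction means each flowfile maps cleanly onto one Hive Streaming transaction.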
--> Try a short table name instead of the long table name you currently have in PutHiveStreaming.
--> If you have HA enabled for the Hive metastore, list all the metastore URIs comma-separated, so that the Hive Streaming processor can try another metastore when the configured one is busy.
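A sketch of the comma-separated URI list for the PutHiveStreaming "Hive Metastore URI" property (hostnames and port are placeholders for your environment):

```
thrift://metastore1.example.com:9083,thrift://metastore2.example.com:9083
```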
--> If the error still occurs, try using the String data type for all columns (not optimal, but useful for debugging this issue) and insert the data into the table again. If that works, narrow down which column's data type is causing the problem.
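A debugging sketch of such an all-string staging table (table, column, and partition names here are hypothetical; note that Hive Streaming targets must also be bucketed, stored as ORC, and transactional):

```sql
-- Hypothetical debug table: every data column declared STRING to rule out
-- data-type mismatches. Hive Streaming requires a bucketed, ORC-backed,
-- transactional table, so those properties are included as well.
CREATE TABLE debug_tbl (
  col1 STRING,
  col2 STRING,
  col3 STRING
)
PARTITIONED BY (dt STRING)
CLUSTERED BY (col1) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');
```

If inserts into this table succeed, reintroduce the real column types one at a time to find the offending column.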
Refer to this link for more details regarding Hive Streaming data types.
Still the same error. I have done the following as per your direction.
Could this be a permission issue with NiFi creating the partition directories, since loading into a non-partitioned table works fine? The only issue we face after loading into the non-partitioned table is that we are unable to see the data from the command line, even though the data is visible in the HDFS Hive directory; we will raise that issue in a separate thread.
As far as this thread is concerned, NiFi --> PutHiveStreaming is unable to load into a Hive partitioned table.
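To test the partition-directory permission theory, one option is to check the warehouse path ownership and try creating a partition directory as the `nifi` user (all paths and the partition value below are placeholders for your environment):

```shell
# Inspect ownership and permissions on the target table directory
# (placeholder path; adjust to your warehouse location)
hdfs dfs -ls /apps/hive/warehouse/mydb.db/mytable

# Attempt to create a partition directory as the nifi user; if this
# fails with a permission error, PutHiveStreaming will fail the same way
sudo -u nifi hdfs dfs -mkdir /apps/hive/warehouse/mydb.db/mytable/dt=2018-01-01
```

If the `mkdir` fails, granting the `nifi` user write access on the table directory (via ownership, POSIX permissions, or HDFS ACLs) would be the next thing to try.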
Please advise on what to check next; your help here is really appreciated, @Shu.