Hi,
Greetings!
I'm looking for assistance with a NiFi-to-Hive process.
I am pushing data to HDFS (version 2.7.3) using the NiFi PutHDFS processor (version 1.2.0), so that I can access the HDFS data in Hive through an external table.
I am getting the following error:
Failed to write to HDFS due to org.apache.nifi.processor.exception.ProcessException: IOException thrown from PutHDFS[id=6a3e3aa1-ed4a-19b3-bfd8-75796c673298]: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.FSLimitException$MaxDirectoryItemsExceededException): The directory item limit of /apps/hive/warehouse/hdf_stg_table_nifi is exceeded: limit=1048576 items=1048576
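Before changing any limits, it may help to confirm which directory is actually at the limit. `hdfs dfs -count <path>` prints DIR_COUNT, FILE_COUNT, CONTENT_SIZE and PATHNAME; a minimal Python sketch for flagging a directory from that output (the sample line and its counts below are made up for illustration):

```python
MAX_DIR_ITEMS = 1048576  # HDFS default for dfs.namenode.fs-limits.max-directory-items

def items_in_dir(count_line: str) -> tuple[str, int]:
    """Parse one line of `hdfs dfs -count` output:
    DIR_COUNT FILE_COUNT CONTENT_SIZE PATHNAME.
    The counts are recursive, so DIR_COUNT + FILE_COUNT is only an upper
    bound on the direct children the per-directory limit applies to."""
    dir_count, file_count, _size, path = count_line.split()
    return path, int(dir_count) + int(file_count)

# Made-up sample line resembling the table's staging directory:
line = "      1      1048576    73014444032 /apps/hive/warehouse/hdf_stg_table_nifi"
path, items = items_in_dir(line)
print(path, items, "AT LIMIT" if items >= MAX_DIR_ITEMS else "ok")
```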
I checked a few topics on the same error and made the following changes, but they have not resolved the issue so far.
- Changes made in the Hive configuration:
hive.tez.dynamic.partition.pruning.max.event.size: changed from 1048576 to 2097152
hive.vectorized.groupby.checkinterval: changed from 4096 to 8192
HiveServer Interactive Heap Size: changed from 512 to 1024
hive.tez.dynamic.partition.pruning.max.data.size: changed from 104857600 to 209715200
- Added the following property to hdfs-site.xml:
<property>
  <name>dfs.namenode.fs-limits.max-directory-items</name>
  <value>4194304</value>
</property>
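From what I have read, raising dfs.namenode.fs-limits.max-directory-items only postpones the problem: it is a NameNode-side limit, so the property has to be set in the NameNode's hdfs-site.xml and the NameNode restarted before it takes effect, and the documented maximum is 6400000. An alternative I am considering, assuming the data can be date-partitioned: point the PutHDFS Directory property at a dated subdirectory, e.g. /apps/hive/warehouse/hdf_stg_table_nifi/dt=${now():format('yyyy-MM-dd')}, and make the external table partitioned so no single directory accumulates a million files. A sketch of the matching Hive DDL (table and column names are placeholders):

```
-- Hypothetical partitioned external table; each dt=... subdirectory
-- receives one day's files, staying far below the per-directory limit.
CREATE EXTERNAL TABLE hdf_stg_table_nifi (
  col1 STRING,
  col2 STRING
)
PARTITIONED BY (dt STRING)
LOCATION '/apps/hive/warehouse/hdf_stg_table_nifi';

-- New partitions must be registered before Hive can see them:
ALTER TABLE hdf_stg_table_nifi ADD IF NOT EXISTS PARTITION (dt='2017-09-01');
-- or, to pick up all unregistered partitions at once:
MSCK REPAIR TABLE hdf_stg_table_nifi;
```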