After enabling HDFS HA, NiFi PutHiveStreaming processor stopped working

Rising Star

After we enabled HDFS HA, the PutHiveStreaming processor in our NiFi stopped working and began generating the following errors:

2016-10-14 21:50:53,840 WARN [Timer-Driven Process Thread-6] o.a.n.processors.hive.PutHiveStreaming PutHiveStreaming[id=01571000-c4de-1bfd-0f09-5c439230e84e] Processor Administratively Yielded for 1 sec due to processing failure
2016-10-14 21:50:53,840 WARN [Timer-Driven Process Thread-6] o.a.n.c.t.ContinuallyRunProcessorTask Administratively Yielding PutHiveStreaming[id=01571000-c4de-1bfd-0f09-5c439230e84e] due to uncaught Exception: java.lang.IllegalArgumentException: java.net.UnknownHostException: hdpCROC
2016-10-14 21:50:53,847 WARN [Timer-Driven Process Thread-6] o.a.n.c.t.ContinuallyRunProcessorTask
java.lang.IllegalArgumentException: java.net.UnknownHostException: hdpCROC
    at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:411) ~[na:na]
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:311) ~[na:na]
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176) ~[na:na]
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:688) ~[na:na]
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:629) ~[na:na]
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:159) ~[na:na]
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2761) ~[na:na]
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99) ~[na:na]
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2795) ~[na:na]
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2777) ~[na:na]
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:386) ~[na:na]
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295) ~[na:na]
    at org.apache.hadoop.hive.ql.io.orc.OrcRecordUpdater.<init>(OrcRecordUpdater.java:234) ~[na:na]
    at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat.getRecordUpdater(OrcOutputFormat.java:289) ~[na:na]
    at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.createRecordUpdater(AbstractRecordWriter.java:253) ~[na:na]
    at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.createRecordUpdaters(AbstractRecordWriter.java:245) ~[na:na]
    at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.newBatch(AbstractRecordWriter.java:189) ~[na:na]
    at org.apache.hive.hcatalog.streaming.StrictJsonWriter.newBatch(StrictJsonWriter.java:41) ~[na:na]
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.<init>(HiveEndPoint.java:607) ~[na:na]
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.<init>(HiveEndPoint.java:555) ~[na:na]
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.fetchTransactionBatchImpl(HiveEndPoint.java:441) ~[na:na]
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.fetchTransactionBatch(HiveEndPoint.java:421) ~[na:na]
    at org.apache.nifi.util.hive.HiveWriter.lambda$nextTxnBatch$7(HiveWriter.java:250) ~[na:na]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[na:1.8.0_77]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_77]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_77]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_77]
Caused by: java.net.UnknownHostException: hdpCROC

hdpCROC is the name of our HDP cluster and the value of the dfs.nameservices property. All configuration files (hive-site.xml, hdfs-site.xml, core-site.xml) are up to date. What could be causing this issue?
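For readers hitting the same trace: the failure pattern can be reproduced outside NiFi. Below is a minimal, hypothetical Java sketch (the class name is illustrative; hdpCROC comes from the question above) showing what happens when a Hadoop client never sees the HA entries from hdfs-site.xml. An HA nameservice is a logical name, not a hostname, so a client without dfs.nameservices and the related HA properties falls back to a plain DNS lookup, which fails exactly as in the trace.

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class HaResolutionDemo {
        public static void main(String[] args) throws Exception {
            // With no hdfs-site.xml (or one lacking the HA entries) on the
            // classpath, the client never learns that "hdpCROC" is a logical
            // nameservice defined by dfs.nameservices.
            Configuration conf = new Configuration();

            // The HDFS client then builds a non-HA NameNode proxy (note the
            // createNonHAProxy frame in the trace) and attempts a DNS lookup
            // of "hdpCROC", which throws:
            // java.lang.IllegalArgumentException: java.net.UnknownHostException: hdpCROC
            FileSystem fs = FileSystem.get(URI.create("hdfs://hdpCROC"), conf);
            System.out.println(fs.getUri());
        }
    }

With dfs.nameservices, dfs.ha.namenodes.hdpCROC, dfs.namenode.rpc-address.hdpCROC.*, and dfs.client.failover.proxy.provider.hdpCROC present in the loaded configuration, the same call resolves the nameservice and fails over between NameNodes instead of attempting DNS.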

1 ACCEPTED SOLUTION

Master Guru

You are running into NIFI-2873; this will be fixed in an upcoming version.

3 REPLIES

Rising Star

Is there any workaround for this, or some hotfix?

Rising Star

Resolution/Workaround:

- Clear any value assigned to the Hive Configuration Resources property in the PutHiveStreaming processor. (With no site.xml files provided, NiFi will use the site.xml files that are loaded on its classpath.)
- To load the site.xml files (core-site.xml, hdfs-site.xml, and hive-site.xml) on NiFi's classpath, place them in NiFi's conf directory (for Ambari-based installs, that is /etc/nifi/conf). A quick way to verify they are being picked up is sketched below.
- Restart NiFi.
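To confirm the copied files are actually visible on the classpath, here is a small hedged Java check (the class name is illustrative; the expected hdpCROC value is from the question above). Run it with NiFi's conf directory on the classpath:

    import org.apache.hadoop.conf.Configuration;

    public class ClasspathConfigCheck {
        public static void main(String[] args) {
            // new Configuration() loads core-site.xml from the classpath;
            // hdfs-site.xml must be added explicitly in this standalone check.
            Configuration conf = new Configuration();
            conf.addResource("hdfs-site.xml");

            // If /etc/nifi/conf is on the classpath and the copied
            // hdfs-site.xml defines the HA nameservice, this prints
            // "hdpCROC"; null means the file is not being found.
            System.out.println(conf.get("dfs.nameservices"));
        }
    }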