
PutHiveStreaming error with HDF-3.1.1.0 and HDP-2.6.2.0: "Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient"

Solved


I have a NiFi dataflow where I get data from a source, build an Avro file, and am ready to stream it to Hive.

I set up Hive as described at https://community.hortonworks.com/questions/59411/how-to-use-puthivestreaming.html. I also copied the entire hive-client and hadoop-client directories to the NiFi node and reference that hive-site.xml in the PutHiveStreaming processor.
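
For reference, the target table and hive-site.xml follow the requirements from that post; the database, table, and column names below are just illustrative placeholders, not my real ones:

CREATE TABLE my_db.my_stream_table (
  id INT,
  name STRING
)
CLUSTERED BY (id) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional' = 'true');

and in hive-site.xml:

hive.txn.manager = org.apache.hadoop.hive.ql.lockmgr.DbTxnManager
hive.support.concurrency = true
hive.compactor.initiator.on = true
hive.compactor.worker.threads = 1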

But I'm getting the following error on this processor:

2018-06-28 11:29:21,053 ERROR [Timer-Driven Process Thread-7] o.a.n.processors.hive.PutHiveStreaming PutHiveStreaming[id=46762ec7-0164-1000-0000-00001008870a] Failed to process session due to org.apache.nifi.processor.exception.ProcessException: Error writing [org.apache.nifi.processors.hive.PutHiveStreaming$HiveStreamingRecord@733df43a] to Hive Streaming transaction due to com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient: {}
org.apache.nifi.processor.exception.ProcessException: Error writing [org.apache.nifi.processors.hive.PutHiveStreaming$HiveStreamingRecord@733df43a] to Hive Streaming transaction due to com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
    at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onHiveRecordsError$1(PutHiveStreaming.java:555)
    at org.apache.nifi.processor.util.pattern.ExceptionHandler$OnError.lambda$andThen$0(ExceptionHandler.java:54)
    at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onHiveRecordError$2(PutHiveStreaming.java:562)
    at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:148)
    at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$12(PutHiveStreaming.java:740)
    at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2175)
    at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2145)
    at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:694)
    at org.apache.nifi.processors.hive.PutHiveStreaming.lambda$onTrigger$4(PutHiveStreaming.java:572)
    at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
    at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
    at org.apache.nifi.processors.hive.PutHiveStreaming.onTrigger(PutHiveStreaming.java:572)
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2203)
    at com.google.common.cache.LocalCache.get(LocalCache.java:3937)
    at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4739)
    at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:291)
    at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:266)
    at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.getMetaStoreClient(HiveEndPoint.java:544)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:312)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:192)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:122)
    at org.apache.nifi.util.hive.HiveWriter.lambda$newConnection$1(HiveWriter.java:246)
    at org.apache.nifi.util.hive.HiveWriter.lambda$callWithTimeout$4(HiveWriter.java:365)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    ... 3 common frames omitted
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1566)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:92)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:138)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:124)
    at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:295)
    at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:291)
    at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4742)
    at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319)
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197)
    ... 17 common frames omitted
Caused by: java.lang.reflect.InvocationTargetException: null
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1564)
    ... 27 common frames omitted
Caused by: java.lang.NullPointerException: null
    at org.apache.thrift.transport.TSocket.open(TSocket.java:209)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:487)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:282)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:188)
    at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:406)
    ... 32 common frames omitted

I thought that by copying and referencing the hive-client directory I'd have all the jars needed. Is this a problem with the CLASSPATH, with the processor configuration, or something else?
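
For completeness, the PutHiveStreaming processor is configured along these lines (the host name, path, and database/table names below are placeholders, not my actual values):

Hive Metastore URI: thrift://<metastore-host>:9083
Hive Configuration Resources: /path/to/hive-site.xml
Database Name: my_db
Table Name: my_stream_table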

1 ACCEPTED SOLUTION


Re: PutHiveStreaming error with HDF-3.1.1.0 and HDP-2.6.2.0: "Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient"

Super Guru

HDF 3.1.1.0 was built with the Hive libraries from HDP 2.6.4, which are not compatible with the Hive in HDP 2.6.2 (there were evidently some changes to the Thrift interface that were not backwards compatible). The HDF 3.0.x.y series should be compatible with HDP 2.6.2's Hive; I think the latest as of this writing is HDF 3.0.2.6.
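
If you want to double-check which Hive client your NiFi is actually running, one quick way (the path below is the usual HDF install location and may differ on your system) is to look at the Hive jars bundled inside the unpacked Hive NAR under NiFi's work directory:

find /usr/hdf/current/nifi/work/nar -name 'hive-*.jar'

The HDP build number embedded in those jar file names tells you which HDP Hive release the processors were compiled against.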

4 REPLIES


Re: PutHiveStreaming error with HDF-3.1.1.0 and HDP-2.6.2.0: "Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient"

Thanks Matt. I installed the "aligned" HDP and HDF versions and it worked.

Re: PutHiveStreaming error with HDF-3.1.1.0 and HDP-2.6.2.0: "Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient"

Explorer

I am also facing this issue with HDF 3.1.2.0 and HDP 2.6.5.0. Is there any listing where we can find out which version of Hive is supported by a given HDF version?


Re: PutHiveStreaming error with HDF-3.1.1.0 and HDP-2.6.2.0: "Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient"

Hello ashok,

This webpage should be what you are looking for: https://supportmatrix.hortonworks.com/
