Support Questions


PutHiveStreaming fails

Expert Contributor

I want to put JSON data into Hive via InvokeHttp -> SplitJson -> ConvertJsonToAvro -> PutHiveStreaming.

The split JSON is as below:

{"id":"9018133883","name":"meiji"}

In ConvertJsonToAvro, the record schema is as below:

{
  "name": "dtu",
  "type": "record",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "name", "type": "string" }
  ]
}

In PutHiveStreaming, the Hive metastore URI is thrift://hive1.wdp:9083, but I got this error:

2016-08-19 15:32:49,067 ERROR [Timer-Driven Process Thread-7] o.a.n.processors.hive.PutHiveStreaming PutHiveStreaming[id=a17a3678-0156-1000-6037-0cbc710e7027] PutHiveStreaming[id=a17a3678-0156-1000-6037-0cbc710e7027] failed to process session due to com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
2016-08-19 15:32:49,067 ERROR [Timer-Driven Process Thread-7] o.a.n.processors.hive.PutHiveStreaming
com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2256) ~[na:na]
    at com.google.common.cache.LocalCache.get(LocalCache.java:3985) ~[na:na]
    at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4788) ~[na:na]
    at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:227) ~[na:na]
    at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:202) ~[na:na]
    at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558) ~[na:na]
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.getMetaStoreClient(HiveEndPoint.java:448) ~[na:na]
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:274) ~[na:na]
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:243) ~[na:na]
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:180) ~[na:na]
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:157) ~[na:na]
    at org.apache.nifi.util.hive.HiveWriter.lambda$newConnection$0(HiveWriter.java:237) ~[na:na]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[na:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_101]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_101]
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523) ~[na:na]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) ~[na:na]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) ~[na:na]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:118) ~[na:na]
    at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:230) ~[na:na]
    at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:227) ~[na:na]
    at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4791) ~[na:na]
    at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3584) ~[na:na]
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2372) ~[na:na]
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2335) ~[na:na]
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2250) ~[na:na]
    ... 15 common frames omitted
Caused by: java.lang.reflect.InvocationTargetException: null
    at sun.reflect.GeneratedConstructorAccessor80.newInstance(Unknown Source) ~[na:na]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_101]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_101]
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521) ~[na:na]
    ... 25 common frames omitted
Caused by: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.NullPointerException
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2256) ~[na:na]
    at com.google.common.cache.LocalCache.get(LocalCache.java:3985) ~[na:na]
    at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3989) ~[na:na]
    at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4873) ~[na:na]
    at org.apache.hadoop.security.Groups.getGroups(Groups.java:173) ~[na:na]
    at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1516) ~[na:na]
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:436) ~[na:na]
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236) ~[na:na]
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:181) ~[na:na]
    at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:330) ~[na:na]
    ... 29 common frames omitted
Caused by: java.lang.NullPointerException: null
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012) ~[na:1.8.0_101]
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:482) ~[na:na]
    at org.apache.hadoop.util.Shell.run(Shell.java:455) ~[na:na]
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715) ~[na:na]
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:808) ~[na:na]
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:791) ~[na:na]
    at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:84) ~[na:na]
    at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:52) ~[na:na]
    at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:51) ~[na:na]
    at org.apache.hadoop.security.Groups$GroupCacheLoader.fetchGroupList(Groups.java:231) ~[na:na]
    at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:211) ~[na:na]
    at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:199) ~[na:na]
    at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3584) ~[na:na]
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2372) ~[na:na]
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2335) ~[na:na]
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2250) ~[na:na]
    ... 38 common frames omitted

Thanks in advance.

1 ACCEPTED SOLUTION

Master Guru

Are you running NiFi on Windows? If so, you'll need winutils.exe and to add it to the HADOOP_HOME environment variable as described here: http://stackoverflow.com/questions/33048363/issue-with-hivetopology-from-storm-hive
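As a quick sanity check of the winutils precondition, here is a small Python sketch (the function name is mine, and the %HADOOP_HOME%\bin\winutils.exe layout is the conventional one, not confirmed by this thread):

```python
import os
import platform

def check_winutils():
    """On Windows, Hadoop client code shells out to %HADOOP_HOME%\\bin\\winutils.exe;
    if it is missing, the shell-based group lookup fails with the
    NullPointerException at ProcessBuilder.start seen in the question."""
    if platform.system() != "Windows":
        return "not Windows: winutils.exe is not needed"
    hadoop_home = os.environ.get("HADOOP_HOME")
    if not hadoop_home:
        return "HADOOP_HOME is not set"
    winutils = os.path.join(hadoop_home, "bin", "winutils.exe")
    if not os.path.isfile(winutils):
        return "winutils.exe not found at " + winutils
    return "ok"

print(check_winutils())
```

Run this on the machine hosting NiFi; anything other than "ok" (on Windows) points at the missing-winutils cause described in the linked answer.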


9 REPLIES

Expert Contributor

The output of ConvertJsonToAvro is:

Objavro.schemaÎ{"type":"record","name":"dtu","fields":[{"name":"id","type":"string"},{"name":"name","type":"string"}]}avro.codec snappyÊk¢Î Wõ™ðw]«ç€.@9018133883 meijiáàø6Êk¢Î Wõ™ðw]«ç€
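That output looks correct, not corrupted: an Avro object container file starts with the magic bytes Obj followed by 0x01, then the embedded schema and codec metadata, which is why the content above begins with "Objavro.schema". A minimal sketch of checking the magic header (the helper name and sample bytes are illustrative):

```python
# Avro object container files begin with the 4-byte magic b"Obj\x01",
# followed by file metadata such as avro.schema and avro.codec.
AVRO_MAGIC = b"Obj\x01"

def looks_like_avro_container(data: bytes) -> bool:
    """Cheap check that a byte string starts like an Avro container file."""
    return data[:4] == AVRO_MAGIC

sample = b"Obj\x01\x04\x16avro.schema"  # truncated illustration of real output
print(looks_like_avro_container(sample))  # True
print(looks_like_avro_container(b'{"id": 1}'))  # False (plain JSON, not Avro)
```

So the ConvertJsonToAvro step itself appears to be producing a valid container; the failure is downstream in PutHiveStreaming.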

Expert Contributor

Thanks for your help, it works for me.

Now I get another issue:

failed to create hivewriter for endpoint: metaStoreUri='thrift://hive1.wdp:9083', database='yl', table='ddd', partitionVals=[]

Is the metaStoreUri wrong?

Expert Contributor

(screenshot attached: 6849-qq20160822-1.png)

I got these errors in NiFi PutHiveStreaming.

Master Guru

Are there firewall issues? Can you connect to Hive from other apps on that machine? That doesn't look like a valid Thrift port. Is that port open? Is hive1.wdp the real hostname? Is Thrift running?
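To work through the reachability questions above, a minimal TCP connectivity check is often enough as a first step (the hostname and port below come from this thread; substitute your own, and note this only proves the port accepts connections, not that the metastore is healthy):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds --
    a first sanity check that the Hive metastore Thrift port is reachable."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# For the endpoint in this thread (run from the NiFi host):
# print(can_connect("hive1.wdp", 9083))
```

If this returns False from the NiFi machine, the problem is DNS, firewall, or the metastore service itself, not the processor configuration.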

Contributor

I also got these errors in NiFi PutHiveStreaming. Did you solve it?

Expert Contributor

I use the ConvertJsonToAvro processor.

JSON:

{
  "name": "张三",
  "num": "2",
  "score": "3.4",
  "newtime": "2016-03-01 10:10:10"
}

Avro schema:

{
  "name": "newsInfo",
  "type": "record",
  "fields": [
    { "name": "name", "type": "string" },
    { "name": "num", "type": "int" },
    { "name": "score", "type": "double" },
    { "name": "newtime", "type": "long", "logicalType": "timestamp" }
  ]
}

but I got this error:

Failed to convert 1/1 records from JSON to Avro
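One plausible cause, which is my assumption rather than anything confirmed in the thread, is a type mismatch: every value in the JSON is a string ("2", "3.4"), while the schema declares int, double, and long, and the date string "2016-03-01 10:10:10" is not a long at all. A Python sketch of coercing the record to the declared types before conversion (field names come from the post; the epoch-millis choice assumes a timestamp-millis style logical type):

```python
import json
from datetime import datetime, timezone

# The posted JSON carries every value as a string, but the schema
# expects int/double/long -- a plausible reason the conversion fails.
record = json.loads('{"name": "\u5f20\u4e09", "num": "2", "score": "3.4", '
                    '"newtime": "2016-03-01 10:10:10"}')

coerced = {
    "name": record["name"],                 # string, matches the schema
    "num": int(record["num"]),              # "2" -> 2
    "score": float(record["score"]),        # "3.4" -> 3.4
    # epoch milliseconds, as a timestamp-millis logical type would expect
    "newtime": int(datetime.strptime(record["newtime"], "%Y-%m-%d %H:%M:%S")
                   .replace(tzinfo=timezone.utc).timestamp() * 1000),
}
print(coerced["num"], coerced["score"])  # 2 3.4
```

In a NiFi flow the equivalent fix would be to emit numeric JSON values upstream, or to relax the schema to strings; also note that placing "logicalType" as a sibling of "type" (rather than inside a type object) is not standard Avro.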


Hi Boyer, I have come across the same issue with NiFi 1.2.0 installed on HDP 2.6.0 (as a standalone version).

Getting the same error 'Error connecting to Hive EndPoint:{metaStoreUri='thrift://sandbox.hortonworks.com:9083''

Just asking whether this issue has been resolved for you or not?

If resolved, could you please guide me through the steps you took to solve the problem?

Thanks in advance.


In my experience, the connection error goes away if you remove "thrift://" from the URI.