<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: puthivestreaming fail in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111363#M74211</link>
    <description>&lt;P&gt;Output of ConvertJsonToAvro is : &lt;/P&gt;&lt;P&gt;Obj&amp;#1;&amp;#4;&amp;#22;avro.schemaÎ&amp;#1;{"type":"record","name":"dtu","fields":[{"name":"id","type":"string"},{"name":"name","type":"string"}]}&amp;#20;avro.codec
snappyÊ&amp;#2;k¢Î
Wõ™ðw]«ç&amp;#7;€&amp;#2;.&amp;#17;@&amp;#20;9018133883
meijiáàø6Ê&amp;#2;k¢Î
Wõ™ðw]«ç&amp;#7;€&lt;/P&gt;</description>
    <pubDate>Fri, 19 Aug 2016 16:36:57 GMT</pubDate>
    <dc:creator>121285904</dc:creator>
    <dc:date>2016-08-19T16:36:57Z</dc:date>
    <item>
      <title>puthivestreaming fail</title>
      <link>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111362#M74210</link>
      <description>&lt;P&gt;I want to put JSON data into Hive via InvokeHttp -&amp;gt; SplitJson -&amp;gt; ConvertJsonToAvro -&amp;gt; PutHiveStreaming.&lt;/P&gt;&lt;P&gt;The split JSON is as below:
{"id":"9018133883","name":"meiji"}
In ConvertJsonToAvro, the record schema is as below:
{ "name": "dtu", "type": "record", "fields":[ { "name":"id","type": "string" }, { "name":"name","type": "string" } ] }
In PutHiveStreaming, the Hive metastore URI is thrift://hive1.wdp:9083,
but I got this error:
2016-08-19 15:32:49,067 ERROR [Timer-Driven Process Thread-7] o.a.n.processors.hive.PutHiveStreaming PutHiveStreaming[id=a17a3678-0156-1000-6037-0cbc710e7027] PutHiveStreaming[id=a17a3678-0156-1000-6037-0cbc710e7027] failed to process session due to com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient 2016-08-19 15:32:49,067 ERROR [Timer-Driven Process Thread-7] o.a.n.processors.hive.PutHiveStreaming com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2256) ~[na:na] at com.google.common.cache.LocalCache.get(LocalCache.java:3985) ~[na:na] at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4788) ~[na:na] at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:227) ~[na:na] at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:202) ~[na:na] at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558) ~[na:na] at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.getMetaStoreClient(HiveEndPoint.java:448) ~[na:na] at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.&amp;lt;init&amp;gt;(HiveEndPoint.java:274) ~[na:na] at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.&amp;lt;init&amp;gt;(HiveEndPoint.java:243) ~[na:na] at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:180) ~[na:na] at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:157) 
~[na:na] at org.apache.nifi.util.hive.HiveWriter.lambda$newConnection$0(HiveWriter.java:237) ~[na:na] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[na:1.8.0_101] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_101] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_101] at java.lang.Thread.run(Thread.java:745) [na:1.8.0_101] Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523) ~[na:na] at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&amp;lt;init&amp;gt;(RetryingMetaStoreClient.java:86) ~[na:na] at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) ~[na:na] at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:118) ~[na:na] at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:230) ~[na:na] at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:227) ~[na:na] at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4791) ~[na:na] at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3584) ~[na:na] at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2372) ~[na:na] at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2335) ~[na:na] at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2250) ~[na:na] ... 
15 common frames omitted Caused by: java.lang.reflect.InvocationTargetException: null at sun.reflect.GeneratedConstructorAccessor80.newInstance(Unknown Source) ~[na:na] at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_101] at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_101] at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521) ~[na:na] ... 25 common frames omitted Caused by: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.NullPointerException at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2256) ~[na:na] at com.google.common.cache.LocalCache.get(LocalCache.java:3985) ~[na:na] at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3989) ~[na:na] at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4873) ~[na:na] at org.apache.hadoop.security.Groups.getGroups(Groups.java:173) ~[na:na] at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1516) ~[na:na] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:436) ~[na:na] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:236) ~[na:na] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:181) ~[na:na] at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveClientCache.java:330) ~[na:na] ... 
29 common frames omitted Caused by: java.lang.NullPointerException: null at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012) ~[na:1.8.0_101] at org.apache.hadoop.util.Shell.runCommand(Shell.java:482) ~[na:na] at org.apache.hadoop.util.Shell.run(Shell.java:455) ~[na:na] at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715) ~[na:na] at org.apache.hadoop.util.Shell.execCommand(Shell.java:808) ~[na:na] at org.apache.hadoop.util.Shell.execCommand(Shell.java:791) ~[na:na] at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:84) ~[na:na] at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:52) ~[na:na] at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:51) ~[na:na] at org.apache.hadoop.security.Groups$GroupCacheLoader.fetchGroupList(Groups.java:231) ~[na:na] at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:211) ~[na:na] at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:199) ~[na:na] at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3584) ~[na:na] at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2372) ~[na:na] at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2335) ~[na:na] at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2250) ~[na:na] ... 38 common frames omitted
Thanks in advance.&lt;/P&gt;</description>
      <pubDate>Fri, 19 Aug 2016 15:30:28 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111362#M74210</guid>
      <dc:creator>121285904</dc:creator>
      <dc:date>2016-08-19T15:30:28Z</dc:date>
    </item>
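Editor's note: long nested Java traces like the one in the question above read best bottom-up; the deepest "Caused by:" line (here a NullPointerException inside ProcessBuilder.start, the group-lookup shell call) is the actual fault. A small, hypothetical helper to pull it out:

```python
# Editor's sketch: extract the deepest "Caused by:" from a Java stack trace,
# since that is normally the real fault in a long nested trace.
def root_cause(trace: str) -> str:
    causes = [ln.strip() for ln in trace.splitlines()
              if ln.strip().startswith("Caused by:")]
    return causes[-1] if causes else trace.strip().splitlines()[0]

# Abbreviated from the trace in the question above:
trace = """com.google.common.util.concurrent.UncheckedExecutionException: ...
Caused by: java.lang.RuntimeException: Unable to instantiate CacheableHiveMetaStoreClient
Caused by: java.lang.NullPointerException: null at java.lang.ProcessBuilder.start"""
print(root_cause(trace))  # prints the NullPointerException line
```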
    <item>
      <title>Re: puthivestreaming fail</title>
      <link>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111363#M74211</link>
      <description>&lt;P&gt;Output of ConvertJsonToAvro is : &lt;/P&gt;&lt;P&gt;Obj&amp;#1;&amp;#4;&amp;#22;avro.schemaÎ&amp;#1;{"type":"record","name":"dtu","fields":[{"name":"id","type":"string"},{"name":"name","type":"string"}]}&amp;#20;avro.codec
snappyÊ&amp;#2;k¢Î
Wõ™ðw]«ç&amp;#7;€&amp;#2;.&amp;#17;@&amp;#20;9018133883
meijiáàø6Ê&amp;#2;k¢Î
Wõ™ðw]«ç&amp;#7;€&lt;/P&gt;</description>
      <pubDate>Fri, 19 Aug 2016 16:36:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111363#M74211</guid>
      <dc:creator>121285904</dc:creator>
      <dc:date>2016-08-19T16:36:57Z</dc:date>
    </item>
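Editor's note: the "garbled" text in the reply above is not mojibake but a binary Avro container file rendered as text. It starts with the four-byte magic Obj plus 0x01, followed by file metadata (avro.schema, avro.codec = snappy) and compressed data blocks, so the output looks correct. A quick stdlib sketch of that magic check:

```python
# Editor's sketch: an Avro object container file begins with b"Obj\x01".
def looks_like_avro_container(data: bytes) -> bool:
    return data[:4] == b"Obj\x01"

header = b"Obj\x01\x04\x16avro.schema"  # first bytes, as seen in the reply
print(looks_like_avro_container(header))  # True
```

The embedded schema in the header is expected; downstream Avro readers take the writer schema from there rather than from configuration.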
    <item>
      <title>Re: puthivestreaming fail</title>
      <link>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111364#M74212</link>
      <description>&lt;P&gt;Are you running NiFi on Windows? If so, you'll need winutils.exe, with the HADOOP_HOME environment variable pointing at its install location, as described here: &lt;A href="http://stackoverflow.com/questions/33048363/issue-with-hivetopology-from-storm-hive" target="_blank"&gt;http://stackoverflow.com/questions/33048363/issue-with-hivetopology-from-storm-hive&lt;/A&gt; &lt;/P&gt;</description>
      <pubDate>Fri, 19 Aug 2016 19:37:39 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111364#M74212</guid>
      <dc:creator>mburgess</dc:creator>
      <dc:date>2016-08-19T19:37:39Z</dc:date>
    </item>
    <item>
      <title>Re: puthivestreaming fail</title>
      <link>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111365#M74213</link>
      <description>&lt;P&gt;Thanks for your help; that works for me.&lt;/P&gt;&lt;P&gt;Now I have another issue: &lt;/P&gt;&lt;P&gt;failed to create hivewriter for endpoint: metaStoreUri='thrift://hive1.wdp:9083', database='yl', table='ddd', partitionVals=[]&lt;/P&gt;&lt;P&gt;Is the metaStoreUri wrong?&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 14:21:32 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111365#M74213</guid>
      <dc:creator>121285904</dc:creator>
      <dc:date>2016-08-22T14:21:32Z</dc:date>
    </item>
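Editor's note: before suspecting the URI format, it is worth confirming that the metastore is reachable at all, which a later reply in this thread also asks about. A minimal stdlib connectivity sketch; hive1.wdp:9083 is the endpoint quoted in the thread (9083 is the stock Hive metastore port):

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused connections, timeouts, DNS failures
        return False

# Host and port taken from the thread; result depends on your network.
print("metastore reachable:", can_reach("hive1.wdp", 9083))
```

This only proves TCP connectivity, not that the Thrift metastore service itself is healthy, but it separates network problems from client configuration problems.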
    <item>
      <title>Re: puthivestreaming fail</title>
      <link>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111366#M74214</link>
      <description>&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="6849-qq20160822-1.png" style="width: 718px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/23264iFD72369BEEFF8834/image-size/medium?v=v2&amp;amp;px=400" role="button" title="6849-qq20160822-1.png" alt="6849-qq20160822-1.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;I got these errors in NiFi PutHiveStreaming.&lt;/P&gt;</description>
      <pubDate>Mon, 19 Aug 2019 11:20:58 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111366#M74214</guid>
      <dc:creator>121285904</dc:creator>
      <dc:date>2019-08-19T11:20:58Z</dc:date>
    </item>
    <item>
      <title>Re: puthivestreaming fail</title>
      <link>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111367#M74215</link>
      <description>&lt;P&gt;Are there firewall issues? Can you connect to Hive from other apps on that machine? That doesn't look like a valid Thrift port. Is that port open? Is hive1.wdp the real hostname? Is Thrift running?&lt;/P&gt;</description>
      <pubDate>Mon, 29 Aug 2016 02:20:31 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111367#M74215</guid>
      <dc:creator>TimothySpann</dc:creator>
      <dc:date>2016-08-29T02:20:31Z</dc:date>
    </item>
    <item>
      <title>Re: puthivestreaming fail</title>
      <link>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111368#M74216</link>
      <description>&lt;P&gt;I use the ConvertJsonToAvro processor:&lt;/P&gt;&lt;P&gt;JSON: &lt;/P&gt;&lt;P&gt;{
  "name": "张三",
  "num": "2",
  "score": "3.4",
  "newtime": "2016-03-01 10:10:10"
}&lt;/P&gt;&lt;P&gt;Avro schema: &lt;/P&gt;&lt;P&gt;{ 
"name" : "newsInfo", 
"type" : "record", 
"fields" :  [{"name" : "name", "type" : "string"},
{"name" : "num", "type" : "int"}, 
{"name" : "score", "type" : "double"}, 
{"name" : "newtime", "type" : "long", "logicalType" : "timestamp"}]
}&lt;/P&gt;&lt;P&gt;but got this error:&lt;/P&gt;&lt;P&gt;Failed to convert 1/1 records from JSON to Avro&lt;/P&gt;</description>
      <pubDate>Sun, 09 Oct 2016 17:38:54 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111368#M74216</guid>
      <dc:creator>121285904</dc:creator>
      <dc:date>2016-10-09T17:38:54Z</dc:date>
    </item>
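Editor's sketch (not from the thread): in the post above every JSON value is a string ("2", "3.4"), while the Avro schema declares int, double and long, so a strict JSON-to-Avro conversion rejects the record. Coercing the fields first, with field names taken from the post, sidesteps that:

```python
import json

def coerce(record: dict) -> dict:
    # Convert the string-typed JSON values to the types the Avro schema
    # declares (int for num, double for score).
    return {
        "name": record["name"],
        "num": int(record["num"]),
        "score": float(record["score"]),
        # A timestamp-millis logical type would additionally need this
        # parsed into epoch milliseconds; kept as a plain string here.
        "newtime": record["newtime"],
    }

raw = json.loads('{"name": "Zhang San", "num": "2", "score": "3.4", '
                 '"newtime": "2016-03-01 10:10:10"}')
print(coerce(raw))
```

Also note that "timestamp" is not a standard Avro logical type name; the spec defines timestamp-millis and timestamp-micros on a long base type.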
    <item>
      <title>Re: puthivestreaming fail</title>
      <link>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111369#M74217</link>
      <description>&lt;P&gt;I also got these errors in NiFi PutHiveStreaming. Did you solve it?&lt;/P&gt;</description>
      <pubDate>Mon, 17 Oct 2016 14:40:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111369#M74217</guid>
      <dc:creator>565109424</dc:creator>
      <dc:date>2016-10-17T14:40:57Z</dc:date>
    </item>
    <item>
      <title>Re: puthivestreaming fail</title>
      <link>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111370#M74218</link>
      <description>&lt;P&gt;Hi Boyer, I have come across the same issue with NiFi 1.2.0 installed on HDP 2.6.0 (as a standalone version).&lt;/P&gt;&lt;P&gt;I am getting the same error: 'Error connecting to Hive EndPoint:{metaStoreUri='thrift://sandbox.hortonworks.com:9083''&lt;/P&gt;&lt;P&gt;May I ask whether this issue is resolved for you?&lt;/P&gt;&lt;P&gt;If so, could you please share the steps you took to solve it?&lt;/P&gt;&lt;P&gt;Thanks in advance.&lt;/P&gt;</description>
      <pubDate>Wed, 07 Jun 2017 19:11:53 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111370#M74218</guid>
      <dc:creator>santhosh_sagi</dc:creator>
      <dc:date>2017-06-07T19:11:53Z</dc:date>
    </item>
    <item>
      <title>Re: puthivestreaming fail</title>
      <link>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111371#M74219</link>
      <description>&lt;P&gt;In my experience, the connection error goes away if you remove "thrift://" from the URI.&lt;/P&gt;</description>
      <pubDate>Thu, 28 Jun 2018 22:49:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/puthivestreaming-fail/m-p/111371#M74219</guid>
      <dc:creator>dougspadotto_h</dc:creator>
      <dc:date>2018-06-28T22:49:18Z</dc:date>
    </item>
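Editor's note on the reply above: whether the scheme belongs in the value appears to vary by NiFi version, so a hypothetical helper (not NiFi code) that yields the same host and port either way can make the two forms easier to compare:

```python
from urllib.parse import urlparse

def metastore_host_port(uri: str):
    # Accept the URI with or without a thrift:// scheme; urlparse needs a
    # scheme to populate hostname/port, so prefix one when it is missing.
    parsed = urlparse(uri if "://" in uri else "thrift://" + uri)
    return parsed.hostname, parsed.port

print(metastore_host_port("thrift://hive1.wdp:9083"))  # ('hive1.wdp', 9083)
print(metastore_host_port("hive1.wdp:9083"))           # ('hive1.wdp', 9083)
```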
  </channel>
</rss>

