Member since: 02-22-2017
Posts: 33
Kudos Received: 6
Solutions: 1

My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 2837 | 10-28-2016 09:38 AM |
11-26-2019
03:43 AM
Check the IP address mapping in /etc/hosts. If you are using a floating IP to map public and private addresses, update /etc/hosts with only the private IP address. The IP in /etc/hosts should be the one shown by the ifconfig command.
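As a sketch, the resulting /etc/hosts entry should carry the private address reported by ifconfig (the hostname and IP below are hypothetical placeholders):

```
# /etc/hosts -- list the private IP shown by ifconfig, not the floating/public IP
10.0.0.12   node1.example.com   node1
```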
10-28-2016
09:38 AM
I solved the issue myself. The solution was:

1) Under the folder containing workflow.xml, create a lib folder and copy into it all the Hive jar files from the Oozie share lib dir (/user/oozie/share/lib/lib_20160928171540/hive).

2) Create hive-site.xml with the following contents:

<configuration>
<property>
<name>ambari.hive.db.schema.name</name>
<value>hive</value>
</property>
<property>
<name>hive.metastore.uris</name>
<value>thrift://xxxxx:9083</value>
</property>
<property>
<name>hive.zookeeper.quorum</name>
<value>xxxx:2181,yyyyy:2181,zzzzzz:2181</value>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/smartdata/hive/</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>org.postgresql.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:postgresql://xxxxx:5432/hive</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
</property>
</configuration>
and upload it to HDFS, for example to /tmp/hive-site.xml.

3) Add the following line to workflow.xml: <file>/tmp/hive-site.xml</file>

This solved my issue.
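For context, here is a minimal sketch of where that <file> element sits inside a Hive action in workflow.xml (the action name, script name, and property placeholders are hypothetical; only the <file> line comes from the steps above):

```xml
<action name="hive-node">
  <hive xmlns="uri:oozie:hive:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <script>script.q</script>
    <!-- ship the hive-site.xml that was uploaded to HDFS -->
    <file>/tmp/hive-site.xml</file>
  </hive>
  <ok to="end"/>
  <error to="fail"/>
</action>
```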
05-09-2017
09:17 PM
Resolution/Workaround:
- Clear any value assigned to the Hive Configuration Resources property of the PutHiveStreaming processor. (With no site.xml files provided, NiFi falls back to the site.xml files found on its classpath.)
- To put the site.xml files (core-site.xml, hdfs-site.xml, and hive-site.xml) on NiFi's classpath, place them in NiFi's conf directory (for Ambari-based installs, that is /etc/nifi/conf).
- Restart NiFi.
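On an Ambari-based install, the last two steps can be sketched as follows (the client config locations under /etc/hadoop/conf and /etc/hive/conf, and the nifi.sh path, are assumptions; adjust them to your layout):

```
# Copy the client site files onto NiFi's classpath (its conf directory)
cp /etc/hadoop/conf/core-site.xml /etc/hadoop/conf/hdfs-site.xml /etc/nifi/conf/
cp /etc/hive/conf/hive-site.xml /etc/nifi/conf/

# Restart NiFi (via Ambari, or directly)
/usr/hdf/current/nifi/bin/nifi.sh restart
```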
10-07-2016
04:50 PM
For other users/readers who do not know, HDF 2.0 includes the following as part of the release:
- GetKafka and PutKafka --> support Kafka 0.8
- ConsumeKafka and PublishKafka --> support Kafka 0.9
- ConsumeKafka_0_10 and PublishKafka_0_10 --> support Kafka 0.10
Thanks, Matt
12-14-2018
03:53 PM
I am getting the same error in HDP 3.