Member since: 12-09-2014
Posts: 30
Kudos Received: 4
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 12266 | 04-08-2016 10:28 PM
 | 2865 | 12-18-2014 09:00 PM
10-12-2019
10:49 PM
This (the NULL issue) can be caused by a data schema mismatch.
06-13-2016
12:01 PM
This is my testing result at the Hadoop master node, which is used for the NameNode and HiveServer2. When I executed beeline to load a local file into a table, I hit the same error. It was a permission issue: the file was not readable by the hive user, which owns the HiveServer2 process. It was solved when I granted read permission on the file, including every directory along the path. Please check that the file is accessible like this: sudo -u hive cat ~/test.txt
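The single cat check above can be sketched as a small script that also reports which path component blocks access. This is only an illustration, not part of the original post: run it as the user in question (e.g. via sudo -u hive), and test.txt is just the example file name.

```python
import os

def readable_by_current_user(path):
    """Return (ok, blocker): ok is True if the current user can read the
    file, otherwise blocker names the first path component denying access."""
    path = os.path.abspath(path)
    parent = os.path.dirname(path)
    # Every ancestor directory needs the execute (search) bit to be traversed.
    while True:
        if not os.access(parent, os.X_OK):
            return False, parent
        nxt = os.path.dirname(parent)
        if nxt == parent:  # reached filesystem root
            break
        parent = nxt
    # Finally, the file itself needs the read bit.
    if not os.access(path, os.R_OK):
        return False, path
    return True, None
```

Note that directories need the execute (search) bit, not the read bit, to be traversed, which is why the sketch checks X_OK on every ancestor and R_OK only on the file itself.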
04-08-2016
10:28 PM
I wonder if the meta-tool can help? https://cwiki.apache.org/confluence/display/Hive/Hive+MetaTool
11-20-2015
05:13 AM
Thank you! According to the Cloudera JDBC documentation this is the right format: jdbc:hive2://zk=hadoopcluster01:2181,hadoopcluster02:2181/hiveserver2. Of course I tried it both ways... unfortunately it did not work. So I changed to the original Hive JDBC driver:

<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-jdbc</artifactId>
  <version>1.2.1</version>
</dependency>

and it looks good now! 😉
12-18-2014
09:00 PM
1 Kudo
This behavior is like any system with a float representation: the float bit representation does not allow exact representation of most decimal values. There is a lot of literature on the subject, for example http://docs.oracle.com/cd/E19957-01/806-3568/ncg_goldberg.html If you want exact values, it's better to go with decimal, where you can specify scale and precision. Thanks, Szehon
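The same point can be illustrated outside Hive. A minimal Python sketch, where the standard-library Decimal plays the role of Hive's DECIMAL(precision, scale) type:

```python
from decimal import Decimal

# Binary floating point cannot represent most decimal fractions exactly.
print(0.1 + 0.2)         # 0.30000000000000004, not 0.3
print(0.1 + 0.2 == 0.3)  # False

# A decimal type computes in base 10, so money-like values stay exact.
print(Decimal("0.1") + Decimal("0.2"))                    # 0.3
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```

Note that the Decimal values are built from strings: constructing Decimal(0.1) from a float would faithfully preserve the float's inexact binary value.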
12-10-2014
11:41 AM
Yeah, I agree, it is most likely a permission error; I would check the Hive logs to see exactly which file/dir path is giving the error. Thanks, Szehon
12-09-2014
02:25 PM
Looks like you need to increase the memory of the HS2 process itself. The flag you mentioned only affects the MR jobs that are spawned by Hive, but the stack trace indicates that it didn't make it past the compiler. Hope that helps, Szehon
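As a sketch of the distinction: on a plain Apache Hive install, the HiveServer2 JVM heap is raised in hive-env.sh rather than through per-job MapReduce settings. Exact variable names vary by Hive version and distribution, and the 4096 value here is only an example:

```shell
# hive-env.sh (sketch; knobs differ between distributions)
# Heap of JVMs launched by the hive scripts, including HiveServer2,
# as opposed to the heap of MR task JVMs spawned for queries:
export HADOOP_HEAPSIZE=4096   # in MB
```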