SemanticException Error retrieving udf
Labels: Apache Hive
Created on 06-06-2017 06:54 AM - edited 09-16-2022 04:42 AM
Hey,
We created the UDF function in Hive as follows:
1) hdfs dfs -put /tmp/changeFormatDate-0.0.1.jar /data/changeFormatDate-0.0.1.jar
2) With Sentry enabled, grant privileges on the JAR's HDFS URI:
create role role_udfs;
GRANT ALL ON URI 'hdfs:///data/changeFormatDate-0.0.1.jar' TO ROLE role_udfs;
grant role role_udfs to group hive;
Contents of changeFormatDate-0.0.1.jar:
META-INF/
META-INF/MANIFEST.MF
com/
com/ag2rlm/
com/ag2rlm/udfs/
com/ag2rlm/udfs/ChangeFormatDate.class
META-INF/maven/
META-INF/maven/com.ag2rlm/
META-INF/maven/com.ag2rlm/changeFormatDate/
META-INF/maven/com.ag2rlm/changeFormatDate/pom.xml
META-INF/maven/com.ag2rlm/changeFormatDate/pom.properties
Class header:
package com.ag2rlm.udfs;
.....
public class ChangeFormatDate extends UDF
.....
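The class body is elided above, so for context only, a date-reformatting UDF of this kind usually looks roughly like the sketch below. The evaluate() signature and its logic are assumed for illustration, not taken from the actual JAR:
package com.ag2rlm.udfs;

import java.text.ParseException;
import java.text.SimpleDateFormat;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical sketch: converts a date string from one pattern to another.
public class ChangeFormatDate extends UDF {

    // Hive calls evaluate() once per row.
    public Text evaluate(Text input, Text inFormat, Text outFormat) {
        if (input == null || inFormat == null || outFormat == null) {
            return null;
        }
        try {
            SimpleDateFormat in = new SimpleDateFormat(inFormat.toString());
            SimpleDateFormat out = new SimpleDateFormat(outFormat.toString());
            return new Text(out.format(in.parse(input.toString())));
        } catch (ParseException e) {
            return null; // unparsable dates yield NULL
        }
    }
}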
3) Create the function from the HDFS JAR:
CREATE FUNCTION CHANGE_FORMAT_DATE AS 'com.ag2rlm.udfs.ChangeFormatDate';
Output error:
Error while compiling statement: FAILED: SemanticException Error retrieving udf class:com.ag2rlm.udfs.ChangeFormatDate
Can you help me with this problem?
Best regards,
Created 06-06-2017 11:03 AM
Are /data (in HDFS) and /tmp (on the local filesystem) included in the Hive Aux or Reloadable Aux paths? If yes, did you restart HiveServer2 (if it is in the Aux path) or run the RELOAD command in Beeline (if it is in the Reloadable Aux path)?
You ran the grant statement on the HDFS path but not on the local path. Refer to the UDF doc, which states that you must do both.
The CREATE FUNCTION statement is also missing the USING JAR clause; you need to specify the JAR path in it.
https://www.cloudera.com/documentation/enterprise/5-8-x/topics/cm_mc_hive_udf.html
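Concretely, putting those two points together would look roughly like this (a sketch reusing the role and paths from the original post; adjust names and paths to your environment):
-- Grant the URI privilege on BOTH the HDFS copy and the local copy of the JAR
GRANT ALL ON URI 'hdfs:///data/changeFormatDate-0.0.1.jar' TO ROLE role_udfs;
GRANT ALL ON URI 'file:///tmp/changeFormatDate-0.0.1.jar' TO ROLE role_udfs;

-- Register the function with an explicit USING JAR clause
CREATE FUNCTION change_format_date
  AS 'com.ag2rlm.udfs.ChangeFormatDate'
  USING JAR 'hdfs:///data/changeFormatDate-0.0.1.jar';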
Created on 01-26-2018 08:05 AM - edited 01-26-2018 08:08 AM
I have the same issue. I am following both the documentation at
https://www.bmc.com/blogs/how-to-write-a-hive-user-defined-function-udf-in-java/
and the link mentioned in the previous post:
https://www.cloudera.com/documentation/enterprise/5-8-x/topics/cm_mc_hive_udf.html
These are the steps I have taken:
1) The goal is to create a temporary user-defined function FNV. I have put the following code in /src/main/java/com/company/hive/udf/FNV.java:
package com.company.hive.udf;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.io.Text;
import java.math.BigInteger;
public final class FNV extends UDF {
<...all the java code...>
}
2) I added the two required JARs for the imports to the CLASSPATH, compiled, and built a JAR from this: /src/main/java/com/company/hive/udf/FNV.jar. It is present on the host where the Hive metastore and HiveServer2 are running.
I checked with jar tvf FNV.jar and see that my class src/main/java/com/company/hive/udf/FNV.class is present.
3) I put the FNV.jar file on HDFS and did a chown hive:hive and a chmod with full 777 permissions.
4) I changed the 'Hive Auxiliary JARs Directory' configuration in Hive to the path of the JAR: /src/main/java/com/company/hive/udf/
5) I redeployed the client configuration and restarted Hive. Here I noticed that the second HiveServer2 instance (on a different node, not the one where the JAR is located) had trouble restarting. The host with the Hive metastore, HiveServer2, and the JAR is up and running.
6) I granted access to the HDFS location and the file on the local host to a role called 'hive_jar'. This is done by logging into Beeline:
!connect jdbc:hive2://node009.cluster.local:10000/default
GRANT ALL ON URI 'file:///src/main/java/com/company/hive/udf/FNV.jar' TO ROLE HIVE_JAR;
GRANT ALL ON URI 'hdfs:///user/name/FNV.jar' TO ROLE HIVE_JAR;
I do notice that SHOW CURRENT ROLES in Beeline for the hive user does show the HIVE_JAR role as wanted.
7) I start Hive and add the JAR using the local host's path: add jar /src/main/java/com/company/hive/udf/FNV.jar; and I check with list jars that the JAR is present.
8) In the same session I try to create the temporary function:
create temporary function FNV as 'com.company.hive.udf.FNV';
I keep getting the error:
FAILED: Class com.company.hive.udf.FNV not found
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.FunctionTask
Any clue what I am missing??
Thanks for the feedback!
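For comparison, the permanent-function route described in the earlier reply would look roughly like this for FNV.jar (a sketch only, reusing the HDFS path from step 6 and the grants already issued there; not a confirmed fix for this error):
-- Sketch: register FNV as a permanent function from the HDFS JAR,
-- using the explicit USING JAR clause recommended in the earlier reply.
CREATE FUNCTION fnv
  AS 'com.company.hive.udf.FNV'
  USING JAR 'hdfs:///user/name/FNV.jar';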
