New Contributor
Posts: 1
Registered: 06-06-2017
Accepted Solution

SemanticException Error retrieving udf

We created the UDF in Hive as follows:

1) hdfs dfs -put /tmp/changeFormatDate-0.0.1.jar /data/changeFormatDate-0.0.1.jar

2) With Sentry enabled, grant privilege to the JAR on HDFS:

create role role_udfs;
GRANT ALL ON URI 'hdfs:///data/changeFormatDate-0.0.1.jar' TO ROLE role_udfs;
grant role role_udfs to group hive;

Relevant details from changeFormatDate-0.0.1.jar:


Class heading:
package com.ag2rlm.udfs;
public class ChangeFormatDate extends UDF
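For context, here is a minimal sketch of what a date-reformatting UDF's core might look like. This is an assumption, not the poster's actual code: the class name, method signature, and patterns are illustrative, and the `org.apache.hadoop.hive.ql.exec.UDF` superclass is omitted so the sketch compiles standalone (the real class extends UDF, as the heading above shows).

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;

// Hypothetical core of a ChangeFormatDate-style UDF.
// In the real class, this logic would live in evaluate() of a
// class extending org.apache.hadoop.hive.ql.exec.UDF.
public class ChangeFormatDateSketch {

    // Re-renders a date string from one pattern to another,
    // e.g. "2017-06-06" (yyyy-MM-dd) -> "06/06/2017" (dd/MM/yyyy).
    public static String evaluate(String date, String inPattern, String outPattern) {
        if (date == null) {
            return null; // Hive UDFs conventionally pass NULL through
        }
        try {
            SimpleDateFormat in = new SimpleDateFormat(inPattern);
            SimpleDateFormat out = new SimpleDateFormat(outPattern);
            return out.format(in.parse(date));
        } catch (ParseException e) {
            return null; // unparseable input -> NULL rather than failing the query
        }
    }
}
```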


3) Create the function from the jar on HDFS:
CREATE FUNCTION CHANGE_FORMAT_DATE AS 'com.ag2rlm.udfs.ChangeFormatDate';

The output is this error:

Error while compiling statement: FAILED: SemanticException Error retrieving udf class:com.ag2rlm.udfs.ChangeFormatDate

Can you help me with this problem?

Best regards,

Posts: 642
Topics: 3
Kudos: 105
Solutions: 67
Registered: 08-16-2016

Re: SemanticException Error retrieving udf

Can you run 'SHOW CURRENT ROLES;' and 'SHOW GRANT ROLE <role_name>;' and provide the output for the user creating the function?

Is /data, in HDFS, and /tmp, in the local filesystem, in the Hive Aux or Reloadable Aux paths? If yes, did you restart HS2 (if it is in the Aux path) or run the reload command in beeline (if it is in the Reloadable Aux path)?
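For reference, the reload for the Reloadable Aux path is a one-liner in beeline (assuming the reloadable aux jars feature is configured on the cluster):

```sql
RELOAD;
```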

You ran the grant statement on the HDFS path but not the local path. Refer to the UDF doc as it states that you must do both.

The create function statement is also missing the USING JAR portion. You need to specify the jar path in it.
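Putting these points together, the end-to-end statements would look roughly like this. Paths and the role name are taken from the original post; `file:///tmp/...` assumes the local copy is still where the poster staged it before the hdfs put, and the exact doc wording may differ:

```sql
-- Grant the URI privilege on BOTH the HDFS copy and the local copy of the jar
GRANT ALL ON URI 'hdfs:///data/changeFormatDate-0.0.1.jar' TO ROLE role_udfs;
GRANT ALL ON URI 'file:///tmp/changeFormatDate-0.0.1.jar' TO ROLE role_udfs;

-- CREATE FUNCTION must name the jar explicitly with USING JAR
CREATE FUNCTION change_format_date
  AS 'com.ag2rlm.udfs.ChangeFormatDate'
  USING JAR 'hdfs:///data/changeFormatDate-0.0.1.jar';
```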
Posts: 21
Registered: 10-18-2017

Re: SemanticException Error retrieving udf


I have the same issue. I am following both the documentation and the link mentioned in the previous post.

These are the steps I have taken:


1) The goal is to create a temporary user-defined function. In the directory /src/main/java/com/company/hive/udf/ I have put the following code:
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.hive.ql.exec.Description;
import java.math.BigInteger;
public final class FNV extends UDF{
<...all the java code...>
2) I added the two JARs required for the imports to the CLASSPATH, compiled, and built a jar from this: /src/main/java/com/company/hive/udf/FNV.jar. It is present on the host where the Hive Metastore and HiveServer2 are running.
I checked with jar tvf FNV.jar and saw that my class src/main/java/com/company/hive/udf/FNV.class is present.
3) I put the FNV.jar file on HDFS and did a chown hive:hive and a chmod with full 777 rights.
4) I changed the 'Hive Auxiliary JARs Directory' configuration in Hive to the path of the jar: /src/main/java/com/company/hive/udf/
5) I redeployed the client config and restarted Hive. Here I noticed that the second HiveServer2 (on a different node, not where the JAR is located) had trouble restarting. The host with the Hive Metastore, HiveServer2, and the jar is up and running.
6) I granted access to the HDFS location and the file on the local host to a role called 'hive_jar'. This was done by logging into beeline:
!connect jdbc:hive2://node009.cluster.local:10000/default
GRANT ALL ON URI 'file:///src/main/java/com/company/hive/udf/FNV.jar' TO ROLE HIVE_JAR;
GRANT ALL ON URI 'hdfs:///user/name/FNV.jar' TO ROLE HIVE_JAR;

I do notice that SHOW CURRENT ROLES in beeline for the hive user does list the HIVE_JAR role, as wanted.

7) I started Hive and added the jar using the local host's path: add jar /src/main/java/com/company/hive/udf/FNV.jar; I checked with list jars that the jar is present.
8) In the same session I tried to create the temporary function:
create temporary function FNV as '';
I keep getting this error:
FAILED: Class not found
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.FunctionTask
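Since the java code in step 1 is elided above, here is a minimal standalone sketch of what an FNV-style UDF's core might compute — an assumption based only on the class name and the BigInteger-style hashing it suggests, with the Hive UDF superclass stripped so the sketch compiles on its own:

```java
// Hypothetical standalone core of an FNV-style UDF.
// The poster's real class extends org.apache.hadoop.hive.ql.exec.UDF;
// that dependency is omitted here so the sketch compiles by itself.
public final class FNVSketch {

    private static final long FNV64_OFFSET = 0xcbf29ce484222325L;
    private static final long FNV64_PRIME  = 0x100000001b3L;

    // Classic FNV-1a 64-bit hash over the UTF-8 bytes of the input.
    public static long fnv1a64(String s) {
        long hash = FNV64_OFFSET;
        for (byte b : s.getBytes(java.nio.charset.StandardCharsets.UTF_8)) {
            hash ^= (b & 0xffL); // fold in one byte
            hash *= FNV64_PRIME; // multiply by the FNV prime (mod 2^64)
        }
        return hash;
    }
}
```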



Any clue what I am missing?

Thanks for feedback!