How to update the jar of a UDF

Expert Contributor

I created my own (generic) UDF, which works very well when added in Hive:

CREATE FUNCTION myfunc  AS '' USING JAR 'hdfs:///myudfs/myfunc.jar';

After a while I wanted to update my UDF, so I created a new jar with the same name and put it in HDFS, overwriting the old jar. Lo and behold, I cannot use my function any more! It makes no difference if I first do a:

drop function if exists myfunc;
CREATE FUNCTION myfunc AS '' USING JAR 'hdfs:///myudfs/myfunc.jar';

From beeline, I get one of these error messages:

Previous writer likely failed to write hdfs:// Failing because I am unlikely to write too.


org.apache.hadoop.hive.ql.metadata.HiveException: Default queue should always be returned.Hence we should not be here.

Looking at the logs, it seems Hive is localising the jar file (good), but because sessions are reused, if the new jar does not match the jar already present in the localised directory, Hive complains and apparently waits indefinitely.

If my understanding is correct, is there a way to tell Tez to not reuse any of the current sessions?
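(For context: when `hive.server2.tez.initialize.default.sessions` is enabled, HiveServer2 keeps a pool of pre-warmed Tez sessions and hands queries to them, which is what reuses the localised resources. A sketch of disabling that pool in hive-site.xml, assuming the pooled sessions are indeed the culprit; this trades away the warm-start latency benefit:)

```xml
<!-- hive-site.xml: sketch, assuming HiveServer2's Tez session pool is what
     reuses the stale localised jar. With the pool disabled, each session
     starts its own Tez AM instead of picking up a pooled one. -->
<property>
  <name>hive.server2.tez.initialize.default.sessions</name>
  <value>false</value>
</property>
```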

If my understanding is not correct, is there a way to do what I want?

Context: HDP, no LLAP, on AWS.



Rising Star

@Guillaume Roger

This error says you already have an existing jar with the same name in the classpath. Can you delete the old jar from the classpath before adding the new one? Please refer to Hive Resources for the DELETE JAR command.
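(For reference, the session-level resource commands documented under Hive Resources look like this, using the jar path from this thread:)

```sql
ADD JAR hdfs:///myudfs/myfunc.jar;     -- register and localise the jar for this session
LIST JARS;                             -- show the jars currently added to the session
DELETE JAR hdfs:///myudfs/myfunc.jar;  -- remove it from the session's resource list
```

Note these only affect the current session's resources, not the function metadata stored in the metastore.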

Expert Contributor

Thanks for your answer, but I believe that's not the issue. I tried many variations of the `DELETE JAR` command, to no avail:

delete jar hdfs:///myudfs/myfunc.jar;
list jar; -- gives a localised jar
delete jar $localised_jar;
CREATE FUNCTION myfunc AS '' USING JAR 'hdfs:///myudfs/myfunc.jar';

And I end up having the same error again.
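(Not an answer from this thread, but a workaround sometimes used for exactly this localisation clash is to never overwrite a jar in place: upload each build under a new name, so the localised copy in a reused session can never mismatch the one in HDFS. A sketch; the versioned filename `myfunc-v2.jar` is my own invention, and the UDF class name is elided as in the original post:)

```sql
-- Sketch: upload the new build as a differently named jar first, e.g.
--   hdfs dfs -put myfunc.jar /myudfs/myfunc-v2.jar
-- then repoint the function at the new path instead of overwriting the old jar.
drop function if exists myfunc;
CREATE FUNCTION myfunc AS '' USING JAR 'hdfs:///myudfs/myfunc-v2.jar';
```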
