How to update the jar of a udf


Expert Contributor

I created my own (generic) UDF, which works very well when added in Hive:

CREATE FUNCTION myfunc  AS 'io.company.hive.udf.myfunc' USING JAR 'hdfs:///myudfs/myfunc.jar';

After a while I wanted to update my UDF, so I created a new jar with the same name and put it in HDFS, overwriting the old jar. Lo and behold, I cannot use my function anymore! It does not matter if I first do a:

drop function if exists myfunc;
CREATE FUNCTION myfunc AS 'io.company.hive.udf.myfunc' USING JAR 'hdfs:///myudfs/myfunc.jar';

From beeline, I get one of these error messages:

java.io.IOException: Previous writer likely failed to write hdfs://ip-10-0-10-xxx.eu-west-1.compute.internal:8020/tmp/hive/hive/_tez_session_dir/0de6055d-190d-41ee-9acb-c6b402969940/hmyfunc.jar Failing because I am unlikely to write too.

or

org.apache.hadoop.hive.ql.metadata.HiveException: Default queue should always be returned.Hence we should not be here.

Looking at the logs, it seems Hive is localising the jar file (good), but because the session is reused, if the new jar does not match the jar already present in the localised directory, Hive complains and apparently waits indefinitely.

If my understanding is correct, is there a way to tell Tez not to reuse any of the current sessions?

If my understanding is not correct, is there a way to do what I want?

Context: HDP 2.6.0.3, no LLAP, on AWS.
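For reference, these are the session-pool settings I have been looking at (whether they actually control the reuse I am seeing is exactly what I am unsure about). They are server-side properties, normally set in hive-site.xml; I show them as beeline `set` commands only for readability:

```sql
-- Assumption on my part: disabling the pooled Tez sessions should force
-- HiveServer2 to start a fresh Tez AM per connection instead of reusing
-- one that already localised the old jar.
set hive.server2.tez.initialize.default.sessions=false;

-- Pool size when the default sessions are enabled.
set hive.server2.tez.sessions.per.default.queue=1;
```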

Thanks,


Re: How to update the jar of a udf

Contributor

@Guillaume Roger

This error says you already have an existing jar with the same name in the classpath. Can you delete the old jar from the classpath before adding the new one? Please refer to HiveResources for the delete jar commands.

Re: How to update the jar of a udf

Expert Contributor
@rtrivedi

Thanks for your answer, but I believe that's not the issue. I tried many variations of the `delete jar` command, to no avail:

delete jar hdfs:///myudfs/myfunc.jar;
list jar; -- gives the localised jar
delete jar $localised_jar;
CREATE FUNCTION myfunc AS 'io.company.hive.udf.myfunc' USING JAR 'hdfs:///myudfs/myfunc.jar';

And I end up having the same error again.
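The workaround I am falling back to, sketched below, is to never overwrite the jar in place: each build goes to a new HDFS path (the `-v2` suffix is just my own naming convention, not anything from the docs), so a reused session never sees a file that differs from the one it already localised:

```sql
-- Upload the new build under a new name instead of overwriting, e.g.:
--   hdfs dfs -put myfunc.jar /myudfs/myfunc-v2.jar
-- then re-register the function against the new path:
DROP FUNCTION IF EXISTS myfunc;
CREATE FUNCTION myfunc AS 'io.company.hive.udf.myfunc'
  USING JAR 'hdfs:///myudfs/myfunc-v2.jar';
```

This trades a little HDFS housekeeping (old jar versions pile up) for avoiding the localisation clash entirely.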
