How to call UDF via JDBC

I know how to use a UDF via Hive CLI.

And I know how to use JDBC to query Hive.

But I'm struggling to use JDBC to use a UDF in a query.

I think the issue comes down to permissions. My call to register the UDF fails.

I put the UDF jar in HDFS at "/user/myuser", and I'm guessing that the Hive server behind the JDBC connection doesn't have permission to access it there.

Is there a best practice for where to store jars for this type of scenario?

I'm guessing I could get this to work if I created a folder at /user/hdfs/udf, opened up its permissions, and put the jar there.

Is there a better way?
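For reference, here's roughly the flow I'm attempting from a JDBC client. The host, jar file name, function name, UDF class, and table below are placeholders, not my real ones:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveUdfViaJdbc {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC driver (hive-jdbc on the classpath).
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String url = "jdbc:hive2://hiveserver2.example.com:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "myuser", "");
             Statement stmt = conn.createStatement()) {

            // Register the jar for this session. Recent Hive versions accept an
            // hdfs:// path here; the path must be readable by the user the query
            // actually runs as (HiveServer2 or the impersonated user).
            stmt.execute("ADD JAR hdfs:///user/myuser/my-udf.jar");

            // Session-scoped function backed by the UDF class.
            stmt.execute("CREATE TEMPORARY FUNCTION my_upper AS 'com.example.udf.MyUpperUDF'");

            // Use the UDF like any built-in function.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT my_upper(name) FROM my_table LIMIT 10")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }
}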

1 REPLY

Re: How to call UDF via JDBC

@Zack Riesland

Add the UDF jar to HDFS and give it 755 permissions. Make sure "hdfs dfs -ls <udf>" works from all datanodes.
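If it helps to script that step instead of using the CLI, here is a rough equivalent with the Hadoop FileSystem API; the local and HDFS paths below are just examples:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class UploadUdfJar {
    public static void main(String[] args) throws Exception {
        // Assumes core-site.xml / hdfs-site.xml are on the classpath so
        // fs.defaultFS points at the cluster; otherwise set it on conf.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path target = new Path("/apps/hive/udfs/my-udf.jar");  // example location
        fs.mkdirs(target.getParent());
        fs.copyFromLocalFile(new Path("/tmp/my-udf.jar"), target);

        // 755 so HiveServer2 (and the user queries run as) can read the jar.
        fs.setPermission(target.getParent(), new FsPermission((short) 0755));
        fs.setPermission(target, new FsPermission((short) 0755));

        System.out.println(fs.getFileStatus(target));
        fs.close();
    }
}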

You can also make the jar available to Hive by setting the property below in the Hive configuration:

<property>
  <name>hive.aux.jars.path</name>
  <value>file:///path/to/JAR</value>
</property>
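Once the jar is listed in hive.aux.jars.path (HiveServer2 usually has to be restarted to pick the change up), a JDBC client no longer needs the ADD JAR step; something like this should be enough, where the function, class, and table names are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CallUdfWithAuxJar {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://hiveserver2.example.com:10000/default", "myuser", "");
             Statement stmt = conn.createStatement()) {

            // No ADD JAR: the class is already on HiveServer2's classpath.
            stmt.execute("CREATE TEMPORARY FUNCTION my_upper AS 'com.example.udf.MyUpperUDF'");

            try (ResultSet rs = stmt.executeQuery(
                    "SELECT my_upper(name) FROM my_table LIMIT 5")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }
}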

Thanks and Regards,

Sindhu
