
Permanently added UDF doesn't work with ODBC connection

Expert Contributor

I'm using the Hortonworks Hive ODBC driver in my application.

I did: CREATE FUNCTION MyFunc as 'com.my.udf.class' USING JAR 'hdfs:///user/location/to/my.jar';

That worked. When I close my Hive CLI session and open it back up, I can immediately run SELECT myfunc(data) FROM tbl; and it loads the class and functions correctly. However, it doesn't work inside of Hue or over the ODBC connection in my app.
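One way to see whether the function is registered from HiveServer2's point of view (the service both Hue and ODBC talk to, unlike the Hive CLI) is to query it through beeline. A minimal sketch; the hostname, port, and user are assumptions, adjust them for your cluster:

```shell
# Connect to HiveServer2 directly -- Hue and ODBC go through this same
# service, while the Hive CLI does not (host/port/user are assumptions)
beeline -u jdbc:hive2://localhost:10000 -n hive \
  -e "SHOW FUNCTIONS;" | grep -i myfunc
```

If the function shows up here but not in the CLI (or vice versa), that points at the two paths keeping separate function registries rather than at the UDF itself.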

1 ACCEPTED SOLUTION

Contributor

Both Hue and ODBC make use of HiveServer2. You might want to check the HiveServer2 logs to see if there are any errors when running these queries. Is the HDFS location of your JAR accessible to the "hive" user (or whichever user is running HiveServer2)?
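A quick way to check both suggestions from the command line. This reuses the JAR path from the question; the log location is a typical HDP default and is an assumption for your install:

```shell
# Check that the JAR exists and is readable by the user running
# HiveServer2 ('hive' by default on HDP)
hdfs dfs -ls /user/location/to/my.jar

# Make the JAR world-readable as the simplest permissions check
hdfs dfs -chmod 644 /user/location/to/my.jar

# Watch the HiveServer2 log while re-running the failing query
# (default HDP log path -- an assumption; adjust for your cluster)
tail -f /var/log/hive/hiveserver2.log
```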


13 REPLIES

Expert Contributor

I'll check this.

I am using HDP 2.3.2 (sandbox) which I believe comes with Hive 1.2.1 so that defect *shouldn't* be the problem.

Expert Contributor

I don't even see hive.server2.enable.doAs. Would it be under the Hive configuration settings?
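For reference, the same setting lives in hive-site.xml (Ambari surfaces it in the Hive config summary). A sketch of the property as it would appear there:

```xml
<!-- hive-site.xml: when true, queries run as the connected end user
     rather than as the hive service user -->
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>
```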


@Kevin Vasko It is in the Summary section: "Run as End User instead of Hive user".

Expert Contributor

Thanks, I found it. It was already set to true, so that still wasn't the issue.

I went into Hue and ran the CREATE FUNCTION command (the same command as in the Hive CLI), and it worked; I was then able to run the function within Hue. This looks to me like some kind of context issue, where the persistent function added in the CLI isn't visible in the other contexts (ODBC and Hue). I have no idea how to solve that.
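If the problem is that the CLI and HiveServer2 sessions each keep their own view of registered functions, one workaround sketch is to issue the registration through HiveServer2 itself, so that every HS2 client (Hue, ODBC) sees it. The connection URL and user are assumptions; the CREATE FUNCTION statement is the one from the question:

```shell
# Register the permanent UDF through HiveServer2, so Hue and ODBC
# sessions pick it up (URL and user are assumptions)
beeline -u jdbc:hive2://localhost:10000 -n hive -e "
  CREATE FUNCTION MyFunc AS 'com.my.udf.class'
  USING JAR 'hdfs:///user/location/to/my.jar';
"

# If the function was created elsewhere (e.g. in the CLI), RELOAD FUNCTION
# (available from Hive 1.2) refreshes the session's function registry
beeline -u jdbc:hive2://localhost:10000 -n hive -e "RELOAD FUNCTION;"
```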