I've been stuck for a while trying to add a custom UDF to Hive LLAP. I followed this link: https://community.cloudera.com/t5/Community-Articles/Creating-custom-udf-and-adding-udf-jar-to-Hive...
But so far without any success...
To summarize what I did:
1- uploaded my jar file to the server where HSI (HiveServer2 Interactive) is running.
2- modified my conf in Ambari (hive-interactive-env template attribute) and added this line:
3- modified my conf in Ambari (Auxillary JAR list attribute) and added this line:
4- restarted LLAP.
5- connected to HSI via beeline and typed:
CREATE FUNCTION my_function_name as 'blabla.MyFunction';
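To illustrate steps 2 and 3 (with a placeholder path, not my real one), these are the kinds of lines I mean:

```
# hive-interactive-env template (placeholder jar path, for illustration only):
export HIVE_AUX_JARS_PATH=/path/to/my-udf.jar

# "Auxillary JAR list" attribute (same placeholder path):
/path/to/my-udf.jar
```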
Below are the tests I ran and their results:
show functions ;
==> I can see my function "my_function_name". So far, everything is OK.
Now, when I try to use my function in a query like this:
select my_function_name(my_field) from my_table ;
(For information: my function encodes a string, i.e. it takes a string as input and returns an encoded string as output.)
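To give an idea of the shape of the class (the real encoding logic and class name aren't shown here; Base64 is a hypothetical stand-in, and `blabla.MyFunction` would expose this through an `evaluate()` method after extending Hive's `org.apache.hadoop.hive.ql.exec.UDF`):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Hypothetical stand-in for blabla.MyFunction. The real class extends
// org.apache.hadoop.hive.ql.exec.UDF and wraps this string-to-string
// logic in an evaluate(Text) method; Base64 is illustration only.
public class MyFunction {
    public static String encode(String input) {
        if (input == null) {
            return null; // Hive UDFs conventionally return NULL for NULL input
        }
        return Base64.getEncoder()
                     .encodeToString(input.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        System.out.println(encode("blablabla"));
    }
}
```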
I get this error:
Error: Error while compiling statement: FAILED: RuntimeException Cannot run all parts of query in llap. Failing since hive.llap.execution.mode is set to only (state=42000,code=40000)
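For reference, this is the setting the error mentions; as a session-level test I could check it and relax it (assuming "auto" lets the parts that cannot run in LLAP fall back to regular containers):

```sql
-- check the current value
set hive.llap.execution.mode;
-- session-level test: allow fallback instead of failing with "only"
set hive.llap.execution.mode=auto;
select my_function_name(my_field) from my_table;
```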
When I just call my function on a literal, like this:
select my_function_name("blablabla") ;
It works; I get the expected result.
My guess is that my permanent UDF is only deployed on the HSI server, but not on the LLAP daemons.
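I also wonder whether I should have registered the function with the jar on HDFS, so that the LLAP daemons can localize it themselves; something like this (hypothetical HDFS path):

```sql
CREATE FUNCTION my_function_name AS 'blabla.MyFunction'
  USING JAR 'hdfs:///path/to/my-udf.jar';
```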
I also tried activating this parameter:
set hive.llap.allow.permanent.fns=true ;
Then I get a security error:
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.exec.UDFArgumentException: Unable to instantiate UDF implementation class <my_function>: java.lang.SecurityException
I hope you will have some ideas or solutions.
Thank you in advance for helping me.
I'm using HDP 2.6.