
Running Phoenix UDFs from a Java service

New Contributor

I am using Phoenix UDFs, and they work fine from sqlline.py. I am trying to run the same query from my Java application, but it is throwing the exception shown below. Can someone let me know what I am missing?

// enable UDF resolution for this Phoenix connection
DriverManager.setLoginTimeout(500);
Properties props = new Properties();
props.setProperty("phoenix.functions.allowUserDefinedFunctions", "true");
Connection conn = DriverManager.getConnection("jdbc:phoenix:<host>:2181:/hbase-unsecure", props);

I am running the following query, where EpochPastDays is my custom UDF:

  • select * from "table" WHERE "PK">'v1:' || EpochPastDays(7) AND "PK"<'v2' limit 10
The above statement is throwing:
java.sql.SQLException: java.lang.reflect.InvocationTargetException
at org.apache.phoenix.parse.FunctionParseNode.create(FunctionParseNode.java:265)
at org.apache.phoenix.compile.ExpressionCompiler.visitLeave(ExpressionCompiler.java:330)
at org.apache.phoenix.compile.ExpressionCompiler.visitLeave(ExpressionCompiler.java:141)
....
....
....
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.phoenix.parse.FunctionParseNode.create(FunctionParseNode.java:253)
... 31 more
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: co.<name>.phoenix.customudfs.EpochPastDays
at org.apache.phoenix.expression.function.UDFExpression.constructUDFFunction(UDFExpression.java:170)
at org.apache.phoenix.expression.function.UDFExpression.<init>(UDFExpression.java:72)
... 36 more
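
For reference, here is a minimal, self-contained sketch of the client code above (class and variable names are illustrative; the host placeholder, table, and UDF are the ones from this post, and it assumes the Phoenix client jar is on the application classpath):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class PhoenixUdfClient {
    public static void main(String[] args) throws Exception {
        DriverManager.setLoginTimeout(500);
        Properties props = new Properties();
        // UDFs must be explicitly enabled on the client side
        props.setProperty("phoenix.functions.allowUserDefinedFunctions", "true");
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:phoenix:<host>:2181:/hbase-unsecure", props);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "select * from \"table\" WHERE \"PK\" > 'v1:' || EpochPastDays(7)"
                     + " AND \"PK\" < 'v2' limit 10")) {
            while (rs.next()) {
                System.out.println(rs.getString(1)); // print the first column of each row
            }
        }
    }
}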
4 REPLIES

When you use a UDF in a WHERE condition, it is pushed down to the region servers, so you need to add the jar to HDFS so that the UDF can be loaded automatically on the server side.

You can set up the required configuration as described here:

http://phoenix.apache.org/udf.html#Configuration
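
The properties involved are roughly the following (the values shown are the usual defaults; see the page above for the full client-side vs. server-side breakdown):

phoenix.functions.allowUserDefinedFunctions: true (client side, as in the Java snippet above)
hbase.dynamic.jars.dir: ${hbase.rootdir}/lib (HDFS directory the UDF jars are loaded from)
hbase.local.dir: ${hbase.tmp.dir}/local (local directory where each RegionServer caches the jars)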

Then upload the jar from sqlline using the following statement and try again.

ADD JARS '<local jar path>'
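
For example, from the sqlline prompt (the local path here is only illustrative; the jar name is the one that appears later in this thread):

0: jdbc:phoenix:<host>> ADD JARS '/tmp/phoenix-custom-udfs-1.0-SNAPSHOT.jar';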

New Contributor

I have already put that jar in the 'hbase.dynamic.jars.dir' as mentioned in the link you shared.

Can you check whether the jar was copied to the local path on the RegionServer?

${hbase.local.dir}/jars

New Contributor
[abhishekk@hdps01 jars]$ ls -l /tmp/hbase-hbase/local/jars/
total 76724
-rw-r--r-- 1 hbase hadoop 78562046 Mar 29 06:47 phoenix-custom-udfs-1.0-SNAPSHOT.jar

hbase-site.xml:
hbase.local.dir: ${hbase.tmp.dir}/local

I just checked my other two region servers; they don't have the jar in the /tmp/hbase-hbase/local/jars folder. I copied the jar to the other region servers as well, but I still get the same error:

java.lang.ClassNotFoundException: co.flock.phoenix.customudfs.EpochPastDays.

The same statement runs fine from sqlline.py:

0: jdbc:phoenix:<host>> select EpochPastDays(7) from "events" limit 1; 
+------------------+ 
| '1490276764970'  | 
+------------------+ 
| 1490276764970    | 
+------------------+
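
One thing worth double-checking is that the jar really is present in the HDFS directory that hbase.dynamic.jars.dir points to, since that (rather than the RegionServers' local jar cache) is where the servers pull the UDF class from. A rough way to verify and, if needed, upload it (the HDFS path is only an example; use your cluster's configured value):

hdfs dfs -ls /apps/hbase/data/lib
hdfs dfs -put phoenix-custom-udfs-1.0-SNAPSHOT.jar /apps/hbase/data/lib/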