Error executing a Hive UDF through JDBC
Labels:
- Apache Hadoop
- Apache Hive
- Apache Knox
Created ‎12-21-2015 10:22 AM
I have created a UDF and added it to Hive as a function as shown below:
create function HVDB_Analysis_Archived.UDF_ForLogDatatoXML as 'UDF_ForLogDatatoXML'
using jar 'hdfs://InnovationLab/tmp/UDF_ForLogDatatoXML.jar';
I am able to use this function through the Hive CLI, but when calling it from a Java program over a JDBC connection (through Knox), I get the error below.
Exception in thread "main" org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException Line 0:-1 Invalid function 'HVDB_Analysis_Archived.UDF_ForLogDatatoXML'
    at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:255)
    at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:241)
    at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:247)
    at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:378)
    at find_data_from_hive_uc3.main(find_data_from_hive_uc3.java:60)
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException Line 0:-1 Invalid function 'HVDB_XDLogFileAnalysis_Archived.UDF_ForLogDatatoXML'
    at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:315)
    at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:112)
    at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:181)
    at org.apache.hive.service.cli.operation.Operation.run(Operation.java:257)
    at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:388)
    at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:375)
    at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:274)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:486)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.thrift.server.TServlet.doPost(TServlet.java:83)
    at org.apache.hive.service.cli.thrift.ThriftHttpServlet.doPost(ThriftHttpServlet.java:171)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:565)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:479)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:225)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1031)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:406)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:186)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:965)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:117)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:111)
    at org.eclipse.jetty.server.Server.handle(Server.java:349)
    at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:449)
    at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:925)
    at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:952)
    at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
    at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:76)
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:609)
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:45)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
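For context, the JDBC call that hits this error looks roughly like the sketch below. It is a minimal reconstruction, not the actual program: the Knox gateway host, port, httpPath, truststore, credentials, and the table/column names are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class find_data_from_hive_uc3 {
    public static void main(String[] args) throws Exception {
        // HiveServer2 in HTTP transport mode behind the Knox gateway, over SSL.
        // Host, port, httpPath, truststore and credentials are placeholders.
        String url = "jdbc:hive2://knox-gateway-host:8443/;ssl=true"
                + ";sslTrustStore=/path/to/gateway.jks;trustStorePassword=changeit"
                + ";transportMode=http;httpPath=gateway/default/hive";

        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection con = DriverManager.getConnection(url, "user", "password");
             Statement stmt = con.createStatement();
             // Hypothetical table and column; the qualified UDF name is from the question.
             ResultSet rs = stmt.executeQuery(
                     "SELECT HVDB_Analysis_Archived.UDF_ForLogDatatoXML(log_line) FROM log_table")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}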
Created ‎12-21-2015 01:24 PM
When connecting remotely via JDBC, the function library must be accessible to that environment; it is not the same as the Hive CLI.
Which tier are you getting this error in? Are you using HiveServer2? You would need to ensure those middle tiers have your custom jar as well.
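One way to follow this advice without touching the HiveServer2 hosts would be to register the jar and a temporary function inside the same JDBC session before calling it. This is a hedged sketch, not something confirmed in this thread; 'con' is the Knox-backed JDBC connection from the question and the HDFS path is the one used in the CREATE FUNCTION statement.

// Sketch: make the UDF available to this JDBC session only.
try (Statement stmt = con.createStatement()) {
    stmt.execute("ADD JAR hdfs://InnovationLab/tmp/UDF_ForLogDatatoXML.jar");
    stmt.execute("CREATE TEMPORARY FUNCTION UDF_ForLogDatatoXML AS 'UDF_ForLogDatatoXML'");
    // Subsequent queries in this session can then call UDF_ForLogDatatoXML(...) unqualified.
}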
Created ‎12-21-2015 05:52 PM
I have added the custom jar on my remote server too. Yes, I am using HiveServer2. The code was working in HDP 2.1, but I am getting this issue after upgrading to 2.3.
Created ‎12-22-2015 02:50 PM
Try this:
1- Create the directory below on all HiveServer2 hosts:
mkdir /usr/hdp/current/hive-server2/auxlib
2- Copy your jar to the above folder on all HiveServer2 hosts.
3- Restart all HiveServer2 services.
4- Create your UDF without the 'using jar' clause (run this just once; a quick check follows the steps):
create function HVDB_Analysis_Archived.UDF_ForLogDatatoXML as 'UDF_ForLogDatatoXML'
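After the restart, one quick way to confirm the permanent function is now visible to HiveServer2 is to describe it over the same JDBC connection. A sketch, reusing the connection 'con' from the question:

// Sketch: verify the permanent function is registered with HiveServer2.
try (Statement stmt = con.createStatement();
     ResultSet rs = stmt.executeQuery(
             "DESCRIBE FUNCTION HVDB_Analysis_Archived.UDF_ForLogDatatoXML")) {
    while (rs.next()) {
        System.out.println(rs.getString(1)); // prints the function description
    }
}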
Created ‎02-11-2016 11:05 AM
I have tested this and it works. Accepting this as the best answer.
Created ‎12-22-2015 06:46 PM
Make sure the jar file referenced by the function has adequate permissions on HDFS. Is it the same user accessing it from the Hive CLI and from HiveServer2? Depending on the hive.server2.enable.doAs setting, your HiveServer2 user could be hive or the connecting user from JDBC.
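If it helps, the ownership and permissions of the jar can also be checked programmatically with the Hadoop FileSystem API. A sketch (the class name is hypothetical; the path is the one from the question, and this is equivalent to running hdfs dfs -ls on it):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckUdfJarPermissions {
    public static void main(String[] args) throws Exception {
        // Picks up core-site.xml/hdfs-site.xml from the classpath to locate the cluster.
        Configuration conf = new Configuration();
        // Jar path from the CREATE FUNCTION statement in the question.
        Path jar = new Path("hdfs://InnovationLab/tmp/UDF_ForLogDatatoXML.jar");
        FileSystem fs = jar.getFileSystem(conf);
        FileStatus status = fs.getFileStatus(jar);
        // Owner, group and mode of the jar as seen by HDFS.
        System.out.println(status.getOwner() + " " + status.getGroup() + " " + status.getPermission());
    }
}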
Created ‎02-03-2016 02:33 AM
@pooja khandelwal has this been resolved? Can you accept the best answer or provide your own solution?
