Explorer
Posts: 9
Registered: 12-22-2016

Load Impala UDF without deploying in HDFS

Hi all,

 

To load an Impala UDF, we have to deploy its shared library to the Hadoop file system (HDFS).
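
For context, the usual HDFS-based flow looks roughly like this (the library name, HDFS path, and symbol are placeholders):

-- copy the compiled shared library to HDFS first, for example:
--   hdfs dfs -mkdir -p /user/rajesh/udfs
--   hdfs dfs -put libmyudf.so /user/rajesh/udfs/
-- then register it from impala-shell:
CREATE FUNCTION my_lower(STRING) RETURNS STRING
  LOCATION '/user/rajesh/udfs/libmyudf.so'
  SYMBOL='MyLower';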

 

Is there any way to load an Impala UDF (shared library) from the local Linux file system?

 

Thanks,

Rajesh 

Posts: 642
Topics: 3
Kudos: 121
Solutions: 67
Registered: 08-16-2016

Re: Load Impala UDF without deploying in HDFS

There is a way with Hive, but I don't think there is with Impala. I suspect the UDF is loaded by each Impala daemon, therefore the library needs to be accessible to all of them, and HDFS is the easiest way to do that.
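
For example, Hive will accept a jar path that is local to the HiveServer2 host; the path and class name below are just placeholders:

-- Hive: this jar path is on the HiveServer2 host's local filesystem, not HDFS
ADD JAR /opt/hive/udfs/my-udf.jar;
CREATE TEMPORARY FUNCTION my_lower AS 'com.example.udf.MyLower';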
Cloudera Employee
Posts: 432
Registered: 07-29-2015

Re: Load Impala UDF without deploying in HDFS

It's also because of the client-server model Impala uses: queries always execute server-side, so they don't have access to the client's local filesystem.



I believe Hive also requires UDFs to be copied to HDFS when it's running in client-server mode over the HS2 protocol.
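
A sketch of the HDFS-backed form in Hive, which works the same whether the client is local or remote (path and class name are placeholders):

-- Hive permanent function: the jar lives in HDFS, so any HiveServer2 or
-- execution node can fetch it
CREATE FUNCTION my_lower AS 'com.example.udf.MyLower'
  USING JAR 'hdfs:///user/hive/udfs/my-udf.jar';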
Posts: 642
Topics: 3
Kudos: 121
Solutions: 67
Registered: 08-16-2016

Re: Load Impala UDF without deploying in HDFS

In Hive you can load them via the aux path, which can point to HDFS or the local FS. I have only added libs stored in HDFS with the ADD JAR command while using HS2. It is possible that local paths do not work, since Hue users would need local FS access on the Hue server.

So the local FS only works in Hive when using the aux path, which lives on the HS2 server, and that is probably not what you were trying to achieve in Impala.
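
For completeness, the aux path is configured on the HiveServer2 host rather than through SQL; the directory and class name below are hypothetical:

-- on the HS2 host, e.g. in hive-env.sh (local directory, hypothetical):
--   export HIVE_AUX_JARS_PATH=/opt/hive/auxlib
-- after restarting HiveServer2 the classes are already on the classpath,
-- so only the function registration is needed:
CREATE TEMPORARY FUNCTION my_lower AS 'com.example.udf.MyLower';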