Support Questions
Find answers, ask questions, and share your expertise

Spark SQL User Defined Function
In a spark shell I did the following:

spark-shell --jars /home/spark/esri/esri-geometry-api.jar,/home/spark/esri/spatial-sdk-hive-1.1.1-SNAPSHOT.jar

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc);

sqlContext.sql("""create function ST_Point as 'com.esri.hadoop.hive.ST_Point' using jar 'hdfs:///user/esri/spatial-sdk-hive-1.1.1-SNAPSHOT.jar'""")

To my understanding, the last command lets me create a UDF using the ESRI jar. Since I did not use the TEMPORARY keyword, does this UDF remain permanent?
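For comparison, here is a sketch of both forms of the statement, using the same class and jar as above. The SQL strings are built as plain values so the difference is visible; the `sqlContext.sql(...)` calls (shown as comments) assume the HiveContext created earlier in the shell. With the TEMPORARY keyword the function only lives for the current session; without it, the function is registered in the Hive metastore.

```scala
// Session-scoped: dropped automatically when the spark-shell exits.
val tempFnSql =
  "CREATE TEMPORARY FUNCTION ST_Point AS 'com.esri.hadoop.hive.ST_Point'"

// Metastore-backed (no TEMPORARY keyword): persisted across sessions,
// as in the statement from the question.
val permFnSql =
  "CREATE FUNCTION ST_Point AS 'com.esri.hadoop.hive.ST_Point' " +
  "USING JAR 'hdfs:///user/esri/spatial-sdk-hive-1.1.1-SNAPSHOT.jar'"

// In the spark-shell from the question:
// sqlContext.sql(permFnSql)
```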

If so, how can I list all my permanent functions in a spark shell?
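One likely way to check, assuming the HiveContext in this Spark version passes standard HiveQL through to the metastore, is the `SHOW FUNCTIONS` statement, with `DESCRIBE FUNCTION` for the details of a single UDF. A sketch (the `sqlContext.sql(...)` calls, shown as comments, assume the shell session from the question):

```scala
// SHOW FUNCTIONS is standard HiveQL; through a HiveContext it queries the
// metastore, so permanent UDFs should appear in its output.
val listFnSql = "SHOW FUNCTIONS"

// Details of a single function, including the implementing class.
val describeFnSql = "DESCRIBE FUNCTION EXTENDED ST_Point"

// In the spark-shell from the question:
// sqlContext.sql(listFnSql).collect().foreach(println)
// sqlContext.sql(describeFnSql).collect().foreach(println)
```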
