I placed the dependency jars in /home/spark/esri, but you can store them in HDFS or on the local filesystem, provided you grant the proper privileges to your Spark user.
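The jars also need to be on the shell's classpath. A minimal sketch, assuming the standard GIS Tools for Hadoop artifact names (substitute the exact file names and versions you downloaded):

spark-shell --jars /home/spark/esri/esri-geometry-api.jar,/home/spark/esri/spatial-sdk-hive.jar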
2. Instantiate sqlContext:
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc);
3. From spark-shell, define the temporary functions:
sqlContext.sql("""create temporary function st_point as 'com.esri.hadoop.hive.ST_Point'""");
sqlContext.sql("""create temporary function st_x as 'com.esri.hadoop.hive.ST_X'""");