Member since 07-28-2016 · 13 Posts · 1 Kudos Received · 0 Solutions
11-13-2017
09:55 AM
I'm not sure it is possible to execute Python files stored in HDFS; hence the error that only local files are supported. (If you know how to make it work with HDFS, let me know!) To get this working I had to manually upload my Python files to a directory on the Livy server itself. You also have to make sure that the directory containing the Python files is listed in the livy.file.local-dir-whitelist property in livy.conf on the Livy server. You might also have to restart the Livy server afterwards, but I'm not certain about that, as I wasn't the server admin. After doing all this you can invoke POST /batches, giving the path to your Python file in the 'file' argument of the request. Make sure you use the "file:" scheme in the path's value, with only one forward slash after the colon; example value: "file:/data/pi.py"
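As a minimal sketch of what that request looks like, the snippet below builds the JSON body for POST /batches with the "file:" path described above. The Livy host/port and the "name" field are assumptions for illustration; only the 'file' key with the single-slash "file:" scheme comes from the post.

```python
import json

# Hypothetical Livy endpoint; replace with your Livy server's host and port.
LIVY_URL = "http://livy-server:8998/batches"

# POST /batches body: 'file' must use the "file:" scheme with a single slash,
# and the path must sit inside a livy.file.local-dir-whitelist directory.
payload = {
    "file": "file:/data/pi.py",   # local path ON the Livy server, not HDFS
    "name": "pi-example",         # hypothetical job name
}

body = json.dumps(payload)

# To actually submit (requires a reachable Livy server), something like:
#   import urllib.request
#   req = urllib.request.Request(
#       LIVY_URL, data=body.encode("utf-8"),
#       headers={"Content-Type": "application/json"})
#   urllib.request.urlopen(req)
print(body)
```

The submission call itself is left commented out since it needs a live Livy server; the point is the shape of the 'file' value.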
07-28-2016
05:42 PM
5 Kudos
@Srinivas Santhanam Add a semicolon at the end, and make sure that your Ambari user is mapped to an OS user that has access to the path:
add jar /tmp/udfs/esri-geometry-api.jar;
add jar /tmp/udfs/spatial-sdk-hadoop.jar;
My suggestion is to place these libraries in HDFS with your Ambari user that has HDFS privileges. That way you can access the libraries from any node with a Hive client. Example:
add jar hdfs://whateverhostname:8020/tmp/esri/esri-geometry-api.jar;
If the response addresses your problem, don't forget to vote and accept the best answer. It is a stimulus for the effort. Thank you.
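Putting the advice above together, a Hive session that registers both jars from HDFS might look like this. The hostname and the second jar's HDFS path are assumptions (the post only shows the HDFS form for the first jar); adjust them to your cluster.

```sql
-- Hypothetical hostname/paths; jars in HDFS are reachable from any node
-- with a Hive client, so each statement ends with a semicolon:
ADD JAR hdfs://whateverhostname:8020/tmp/esri/esri-geometry-api.jar;
ADD JAR hdfs://whateverhostname:8020/tmp/esri/spatial-sdk-hadoop.jar;
```

Uploading once to HDFS avoids having to copy the jars to the local filesystem of every node.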
10-12-2016
06:37 PM
@Srinivas Santhanam I guess you will have figured this out by now. For others: the problem with the above query is the quotes around the HDFS path. Try it without quotes, like below:
add jar hdfs:///tmp/udfs/hive/esri-geometry-api.jar
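To make the contrast explicit, a sketch of the failing versus working form (same jar path as in the post):

```sql
-- Fails: the quotes are treated as part of the path Hive tries to resolve.
-- ADD JAR "hdfs:///tmp/udfs/hive/esri-geometry-api.jar";

-- Works: unquoted HDFS path.
ADD JAR hdfs:///tmp/udfs/hive/esri-geometry-api.jar;
```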