Created 02-23-2018 08:43 AM
I want to implement a Spark/Livy use case where multiple users share one and the same Spark context and thus have access to the same pre-cached RDDs. Let me explain this with the following scenario:
The only way I have found so far is through Livy's Java/Scala API: https://livy.incubator.apache.org/examples/
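For reference, this is roughly what that looks like (a minimal sketch in Scala based on the linked examples page; the Livy URL, JAR path and job class are placeholders for my setup):

```scala
import java.io.File
import java.net.URI
import org.apache.livy.{Job, JobContext, LivyClient, LivyClientBuilder}

// Runs inside the Livy session's shared Spark context. A second job submitted
// to the same client would still see the RDD cached here.
class CacheAndCountJob extends Job[java.lang.Long] {
  override def call(ctx: JobContext): java.lang.Long = {
    val rdd = ctx.sc().sc.parallelize(1 to 1000000).cache()
    rdd.count()
  }
}

object SharedContextExample {
  def main(args: Array[String]): Unit = {
    // Placeholder values for my setup.
    val livyUrl = "http://livy-host:8998"
    val appJar  = "/path/to/my-livy-jobs.jar"

    val client: LivyClient = new LivyClientBuilder().setURI(new URI(livyUrl)).build()
    try {
      // The JAR containing CacheAndCountJob has to be shipped to the session first.
      client.uploadJar(new File(appJar)).get()
      val count = client.submit(new CacheAndCountJob).get()
      println(s"Cached RDD contains $count elements")
    } finally {
      client.stop(true)
    }
  }
}
```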
The other way, using the REST API, only supports either
As I'm coming from an R application, I need to start these Spark applications (packaged as a JAR) somehow via a POST request to Livy's /sessions URL, because I need the session context for sharing the RDDs.
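To illustrate what I mean by the REST side: as far as I can tell, the documented route to a shared context is to create an interactive session and then send code snippets as statements, roughly like the sketch below (shown in Scala for concreteness, though I would do the HTTP calls from R; host, JAR location and session id are placeholders). This only lets me send code snippets, not submit a job from my JAR the way the Java/Scala client does.

```scala
import java.net.{HttpURLConnection, URL}
import java.nio.charset.StandardCharsets
import scala.io.Source

object LivyRestSketch {
  val livyUrl = "http://livy-host:8998"  // placeholder host

  // POST a JSON body to a Livy endpoint and return the raw JSON response.
  def postJson(path: String, json: String): String = {
    val conn = new URL(livyUrl + path).openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("POST")
    conn.setRequestProperty("Content-Type", "application/json")
    conn.setDoOutput(true)
    conn.getOutputStream.write(json.getBytes(StandardCharsets.UTF_8))
    val response = Source.fromInputStream(conn.getInputStream, "UTF-8").mkString
    conn.disconnect()
    response
  }

  def main(args: Array[String]): Unit = {
    // 1. Create an interactive session; its Spark context stays alive between requests,
    //    and "jars" makes my application JAR available inside that context.
    println(postJson("/sessions",
      """{"kind": "spark", "jars": ["hdfs:///user/me/my-app.jar"]}"""))

    // 2. Once the session is "idle", run code snippets against the shared context,
    //    e.g. build and cache an RDD. The session id comes from the response above.
    val sessionId = 0  // placeholder
    println(postJson(s"/sessions/$sessionId/statements",
      """{"code": "val rdd = sc.parallelize(1 to 1000000).cache(); rdd.count()"}"""))
  }
}
```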
I also read something about a /sessions/&lt;id&gt;/submit-job URL, but I don't know how to use it, as it is not documented anywhere.
Can someone help?