Does Spark allow multiple contexts to run in the same JVM?
Labels:
- Apache Hive
- Apache Spark
Created 04-07-2017 09:38 AM
I am working on an application where I am using a StreamingContext as well as a Spark SQLContext. Now, in the same application, I am creating a HiveContext as well, but it throws an error:

org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:

Although I have set spark.driver.allowMultipleContexts = true in my SparkConf, it has not helped.
Could you please tell me how to proceed on this?
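
For reference, the exception complains about a second SparkContext being created in the same JVM; the Streaming, SQL, and Hive contexts can all wrap one shared SparkContext instead. A minimal PySpark sketch of that pattern on the Spark 1.x API (the app name and batch interval are illustrative):

```python
from pyspark import SparkConf, SparkContext
from pyspark.streaming import StreamingContext
from pyspark.sql import SQLContext, HiveContext

# One SparkContext per driver JVM/process.
conf = SparkConf().setAppName("shared-context-demo")  # illustrative name
sc = SparkContext(conf=conf)

# All three derived contexts wrap the same SparkContext,
# so no second SparkContext is ever created.
ssc = StreamingContext(sc, 10)   # 10-second batch interval
sqlContext = SQLContext(sc)
hiveContext = HiveContext(sc)    # reuses sc rather than building its own
```

The same pattern applies in Scala: pass the existing SparkContext into each constructor rather than letting each component create one of its own.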
Created 04-07-2017 02:06 PM
Apache Spark cannot do that out of the box.
But what you might be looking for is some middleware that can interface with Apache Spark and submit, run, and manage jobs for you:
- Livy - a REST server with extensive language support (Python, R, Scala), the ability to maintain interactive sessions, and object sharing.
- spark-jobserver - a simple Spark-as-a-Service which supports object sharing via so-called named objects. JVM only.
- Mist - a service for exposing Spark analytical jobs and machine learning models as real-time, batch, or reactive web services.
- Apache Toree - IPython-protocol-based middleware for interactive applications.
Hortonworks recommends Livy.
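
For illustration, with Livy the Spark contexts live behind a REST server rather than inside your client JVM, and each submission becomes its own Spark application. A minimal sketch of submitting a batch job to Livy's /batches endpoint (the host, jar path, and class name here are hypothetical):

```python
import requests  # third-party HTTP library, assumed installed

LIVY_URL = "http://livy-host:8998"  # hypothetical Livy server address

payload = {
    "file": "hdfs:///jobs/my-streaming-app.jar",  # hypothetical application jar
    "className": "com.example.MyStreamingApp",    # hypothetical main class
}

# POST /batches asks Livy to launch the jar as a new Spark application.
resp = requests.post(LIVY_URL + "/batches", json=payload)
print(resp.status_code, resp.json())  # Livy returns the batch id and state
```

Because each batch runs as a separate Spark application, several jobs can run side by side without ever competing for a SparkContext in one JVM.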
Also, read the last comment at: https://issues.apache.org/jira/browse/SPARK-2243
allowMultipleContexts is intended only for very limited testing use and can itself lead to the error you are seeing.
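
For reference only, the setting the error message mentions is applied on the SparkConf before the context is created; a sketch (illustrative app name), with the caveat above that this merely disables the safety check rather than making multiple contexts supported:

```python
from pyspark import SparkConf, SparkContext

# Not recommended outside Spark's own tests: this only disables the
# multiple-context safety check in the driver JVM.
conf = (SparkConf()
        .setAppName("multi-context-demo")  # illustrative name
        .set("spark.driver.allowMultipleContexts", "true"))
sc = SparkContext(conf=conf)
```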
