Created 02-14-2018 08:45 AM
Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set `spark.driver.allowMultipleContexts = true`.
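For context, this error is typically raised when a second SparkContext is constructed in the same JVM. A minimal sketch that reproduces it (hypothetical code, assuming it is pasted into `spark-shell`, where a SparkContext already exists):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// spark-shell has already created a SparkContext (`sc`), so constructing
// another one in the same JVM fails with the error quoted above.
val conf = new SparkConf().setAppName("second-context").setMaster("local[*]")
val sc2  = new SparkContext(conf)   // throws: Only one SparkContext may be running in this JVM
```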
Created 02-14-2018 04:20 PM
You are trying to create another SparkContext. Please use the existing one.
In `spark-shell`, `sc` is the SparkContext that Spark has already created for you, so you can use it directly.
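A minimal sketch of reusing the existing context (assuming it is run inside `spark-shell`, where `sc` and, in Spark 2.x, `spark` are already defined):

```scala
import org.apache.spark.sql.SparkSession

// Use the SparkContext that spark-shell already created instead of building a new one.
val rdd = sc.parallelize(1 to 5)
println(rdd.sum())   // 15.0

// In application code, SparkSession.builder().getOrCreate() returns the existing
// session (and its underlying SparkContext) if one is already running, rather than failing.
val existing = SparkSession.builder().appName("reuse-existing").getOrCreate()
val sameSc   = existing.sparkContext
```

Outside the shell, `getOrCreate()` is the usual pattern for avoiding a second context: it hands back whatever session already exists in the JVM instead of constructing a new one.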