Earlier I upgraded my development cluster from CDH 5.13.0 to 5.14.2.
Everything is working now apart from Spark2.
When we try to run spark2-shell, the following error occurs:
ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Exception when registering SparkListener
    at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2364)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:553)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2486)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:930)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:921)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:103)
Caused by: java.io.FileNotFoundException: Lineage directory /var/log/spark2/lineage doesn't exist or is not writable.
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
Anyone able to help?
A bit trivial, but:
1/ Do you have the folder /var/log/spark2/lineage present on the Spark2 Gateway instance?
2/ Is spark:spark listed as the owner of said folder, if it exists?
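The two checks above can be scripted. This is a minimal sketch; it assumes the path from the error message, and the LINEAGE_DIR variable is just a convenience I introduced so you can point it elsewhere:

```shell
# Check that the lineage directory exists and is writable by the current user.
# Run this as the user that launches spark2-shell on the Gateway host.
LINEAGE_DIR="${LINEAGE_DIR:-/var/log/spark2/lineage}"
if [ -d "$LINEAGE_DIR" ] && [ -w "$LINEAGE_DIR" ]; then
  STATUS="ok"
else
  STATUS="missing-or-unwritable"
fi
echo "$LINEAGE_DIR: $STATUS"
# To inspect ownership (should show spark:spark):
ls -ld "$LINEAGE_DIR" 2>/dev/null || true
```

If it reports missing-or-unwritable, create the folder and chown it to spark:spark as described further down the thread.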
I have a similar issue (CDH 5.14.1 / Spark 2.3 - the issue appeared after Spark 2.3 was enabled) through YARN and Workbench, for which a case is open.
In the meantime, I disabled Navigator Lineage from Cloudera Manager (Spark2 Configuration / config.navigator.lineage_enabled), which allowed my colleagues to keep working.
Where exactly did you disable Navigator Lineage in Cloudera Manager? I can't find that property under the configs tab in Spark. I'm using Cloudera Express 5.14.1 with a custom Spark install (version 2.3).
Lineage directory /var/log/spark2/lineage doesn't exist or is not writable.
As the message says, you can fix that by creating the aforementioned folder and then setting its ownership (as root):
# mkdir -p /var/log/spark2/lineage
# chown spark:spark /var/log/spark2/lineage
Even easier, disable the listeners that generate the lineage files by setting the following Spark properties:
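The post above stops before listing the properties, so as a hedged guess (verify against your parcel's spark-defaults.conf): on CDH, the lineage listeners are typically registered through the standard Spark properties spark.extraListeners and spark.sql.queryExecutionListeners, and clearing them disables lineage collection:

```
spark.extraListeners=
spark.sql.queryExecutionListeners=
```

If other listeners are configured on your cluster, remove only the com.cloudera.spark.lineage.* class entries from those values rather than blanking them entirely; the exact class names in your deployment are an assumption on my part, so check your spark-defaults.conf first.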