CDSW job "No SPARK_HOME found"
Created ‎07-29-2020 03:33 AM
After installing CDSW 1.6 on Cloudera 7.1, every job I start fails with this message:
FileNotFoundError: [Errno 2] No such file or directory: '{{SPARK_HOME}}/./bin/spark-submit': '{{SPARK_HOME}}/./bin/spark-submit'
Re-installing the Spark gateway and adding Hive and HDFS gateway instances on the CDSW host changed nothing.
spark-shell works fine.
After some investigation, I edited the file /var/lib/cdsw/client-config/spark-conf/spark-defaults.conf, replacing

export SPARK_HOME={{SPARK_HOME}}

with the actual value of SPARK_HOME. That made the exception go away.
But this feels like a hack (or a bug), so I would like to know: is there a better way to solve this?
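For reference, the edit described above looks roughly like this. The parcel path shown is only an assumption for illustration; substitute the directory where Spark is actually installed on your host:

```shell
# /var/lib/cdsw/client-config/spark-conf/spark-defaults.conf

# Before: the placeholder was never substituted, so spark-submit
# was looked up under a literal '{{SPARK_HOME}}' directory:
#   export SPARK_HOME={{SPARK_HOME}}

# After (hypothetical parcel path -- adjust to your installation):
export SPARK_HOME=/opt/cloudera/parcels/CDH/lib/spark
```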
Created on ‎08-25-2020 02:48 PM - edited ‎08-25-2020 02:51 PM
@ipshubin Below should help.
1. Stop CDSW.
2. Remove the Spark gateway role from the master node.
3. Recreate the Spark gateway role.
4. Start CDSW.
Also check whether any symlink on the hosts points to the wrong directory.
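The symlink check above can be sketched like this. The /tmp paths are stand-ins for illustration only; on a real host you would inspect the Spark links themselves, e.g. under /opt/cloudera/parcels (assumption: default parcel layout):

```shell
#!/bin/sh
# Demonstrate telling a healthy Spark symlink from a dangling one.
set -eu

# Set up a fake layout under /tmp (illustrative, not real cluster paths).
mkdir -p /tmp/spark-demo/real/bin
touch /tmp/spark-demo/real/bin/spark-submit
ln -sfn /tmp/spark-demo/real    /tmp/spark-demo/spark  # healthy link
ln -sfn /tmp/spark-demo/missing /tmp/spark-demo/bad    # dangling link

# A link is only usable if its resolved target contains bin/spark-submit.
for link in /tmp/spark-demo/spark /tmp/spark-demo/bad; do
  target=$(readlink -f "$link")
  if [ -f "$target/bin/spark-submit" ]; then
    echo "$link -> $target : OK"
  else
    echo "$link -> $target : BROKEN (no spark-submit)"
  fi
done
```

On a real CDSW host you would point the same check at the actual Spark link and confirm the resolved directory contains bin/spark-submit before restarting CDSW.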
Cheers!