Java Spark driver and executor logs in cluster mode
Labels: Apache Spark
Created 04-27-2021 10:56 AM
--driver-java-options %%JAVA_OPTIONS%% --driver-memory 6g \
--master dse://?workpool=default \
--deploy-mode cluster \
--supervise \
--conf spark.executor.extraJavaOptions=%%EXECUTOR_JAVA_OPTIONS%% --total-executor-cores 6 --executor-memory 6g --num-executors 6 \
--conf spark.driver.extraClassPath=${CLASSPATH} --conf spark.executor.extraClassPath=${CLASSPATH} \
%%ADDITIONAL_SPARK_SUBMIT_CONFIGS%% \
I am submitting my Spark application using the above configuration. I am new to Spark applications and am trying to consolidate the driver and worker node logs into the console or a single file.
Can you please advise?
Created 05-03-2021 11:56 PM
Hi @Sugumar
When you run an application in client mode, you can see the driver logs in your console (where you submitted the application) and the executor logs in their respective containers.
But in cluster mode, the Spark driver is launched inside one of the containers, so you will not see the driver logs in the console.
To retrieve the logs from YARN:
yarn logs -applicationId <Application_ID> > application_id.log
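Note that the yarn logs command only returns container logs when log aggregation is enabled on the cluster. A minimal sketch of the relevant yarn-site.xml properties (the property names are standard Hadoop settings; the retention value shown is purely illustrative):

```xml
<!-- yarn-site.xml: aggregate container logs (driver + executors) into HDFS
     so they can be fetched with `yarn logs -applicationId <id>` -->
<property>
  <name>yarn.log-aggregation-enable</name>
  <value>true</value>
</property>
<property>
  <!-- keep aggregated logs for 7 days (illustrative value, in seconds) -->
  <name>yarn.log-aggregation.retain-seconds</name>
  <value>604800</value>
</property>
```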
Created 05-04-2021 12:17 AM
I am running the Spark job using DSE, not YARN client/cluster mode.
Here is the spark-submit command for your reference:
spark-submit \
--driver-java-options %%JAVA_OPTIONS%% --driver-memory 6g \
--master dse://?workpool=default \
--deploy-mode cluster \
--supervise \
--conf spark.executor.extraJavaOptions=%%EXECUTOR_JAVA_OPTIONS%% --total-executor-cores 6 --executor-memory 6g --num-executors 6 \
--conf spark.driver.extraClassPath=${CLASSPATH} --conf spark.executor.extraClassPath=${CLASSPATH} \
%%ADDITIONAL_SPARK_SUBMIT_CONFIGS%% \
--class com.test.start.RptProcessor file:%%JAR_PATH%%
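Since the goal is to consolidate logs into a single file, one common approach (not DSE-specific) is to point every JVM at a custom log4j configuration via the %%JAVA_OPTIONS%% and %%EXECUTOR_JAVA_OPTIONS%% placeholders, e.g. -Dlog4j.configuration=file:/path/to/log4j.properties. A minimal sketch, assuming Spark's default log4j 1.x logging; the file path and sizes below are illustrative, and each JVM (driver or executor) writes its own copy, so on a multi-node cluster the per-node files still need to be collected afterwards:

```properties
# log4j.properties: route all Spark logging to a single rolling file per JVM
log4j.rootCategory=INFO, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
# illustrative path; must be writable on every node
log4j.appender.file.File=/var/log/spark/rpt-processor.log
log4j.appender.file.MaxFileSize=100MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```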
Created 05-04-2021 01:48 AM
Hi @Sugumar
I am not very familiar with DSE clusters. Please check the following link; it may help:
https://docs.datastax.com/en/dse/6.7/dse-admin/datastax_enterprise/spark/sparkLogging.html
