Member since: 04-27-2021
Posts: 3
Kudos Received: 0
Solutions: 0
05-04-2021
12:17 AM
I am running the Spark job using DSE, not YARN client/cluster mode. Here is the spark-submit command for reference:

spark-submit \
--driver-java-options %%JAVA_OPTIONS%% --driver-memory 6g \
--master dse://?workpool=default \
--deploy-mode cluster \
--supervise \
--conf spark.executor.extraJavaOptions=%%EXECUTOR_JAVA_OPTIONS%% --total-executor-cores 6 --executor-memory 6g --num-executors 6 \
--conf spark.driver.extraClassPath=${CLASSPATH} --conf spark.executor.extraClassPath=${CLASSPATH} \
%%ADDITIONAL_SPARK_SUBMIT_CONFIGS%% \
--class com.test.start.RptProcessor file:%%JAR_PATH%%
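For context, the %%...%% tokens are substituted by our deployment tooling before submission. The expansion below is purely illustrative (the paths and flags are my assumptions, not the real values), showing how the driver and executor Java options could point at a custom log4j configuration:

# Hypothetical expansion of the template tokens; actual values are environment-specific.
# -Dlog4j.configuration targets Spark 2.x / log4j 1.x; the properties file path is an assumption.
JAVA_OPTIONS="-Dlog4j.configuration=file:/tmp/custom-log4j.properties"
EXECUTOR_JAVA_OPTIONS="-Dlog4j.configuration=file:custom-log4j.properties"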
05-03-2021
11:19 PM
Hello Team, I am running a Spark job in cluster mode using Java. The jobs run without issue, but I cannot see the driver or executor logs in the console; the executor logs are only visible in the Spark UI. I want to see both the driver and executor logs in the console while the job is running. Could someone please assist with this issue as soon as possible?
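For reference, the only workaround I am aware of is submitting in client deploy mode, where the driver runs on the submitting machine and its log output streams to the terminal. This is an assumption on my part rather than a confirmed fix for DSE, and the class and jar values below are placeholders from my template:

# Assumed workaround (not verified for my DSE setup): client mode keeps the driver local,
# so its logs print directly to the submitting console.
spark-submit \
--master dse://?workpool=default \
--deploy-mode client \
--class com.test.start.RptProcessor file:%%JAR_PATH%%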
Labels:
- Apache Spark
04-27-2021
10:56 AM
--driver-java-options %%JAVA_OPTIONS%% --driver-memory 6g \
--master dse://?workpool=default \
--deploy-mode cluster \
--supervise \
--conf spark.executor.extraJavaOptions=%%EXECUTOR_JAVA_OPTIONS%% --total-executor-cores 6 --executor-memory 6g --num-executors 6 \
--conf spark.driver.extraClassPath=${CLASSPATH} --conf spark.executor.extraClassPath=${CLASSPATH} \
%%ADDITIONAL_SPARK_SUBMIT_CONFIGS%% \

I am using the above configuration. I am trying to consolidate the driver and worker node logs into the console or a single file, and I am new to Spark applications. Can you please advise?
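For clarity, this is the kind of log4j setup I have been experimenting with to route output to both the console and a single file. Everything below (file names, paths, appender settings, and the way the file is shipped to executors) is my own assumption for a Spark 2.x / log4j 1.x environment, not a configuration confirmed to work with DSE:

# Sketch only: a log4j 1.x properties file that logs to the console and to a file.
cat > /tmp/custom-log4j.properties <<'EOF'
log4j.rootCategory=INFO, console, file
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=/tmp/spark-app.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
EOF

# Assumed wiring in spark-submit: ship the file with --files so executors can find it
# in their working directory, and point both JVMs at it via the extra Java options.
#   --files /tmp/custom-log4j.properties \
#   --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/tmp/custom-log4j.properties \
#   --conf spark.executor.extraJavaOptions=-Dlog4j.configuration=file:custom-log4j.properties \

Note that in cluster mode the driver also runs on a worker node, so the FileAppender above would write on that node rather than on the machine I submit from; that is part of what I am trying to understand.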
Labels:
- Apache Spark