Created 12-15-2022 04:20 AM
Hi,
we are using HDP 2.6.5 and have been facing an issue for some days now.
Yesterday we realized that when we issue a spark-submit command, the application is always sent to the same YARN queue, even when we specify the --queue parameter.
Has anybody ever run into this issue?
Spark version is 2.3.0.
Regards
Created 12-15-2022 09:13 PM
Hi @Samie
Please attach the Spark application and event logs so we can check the queue name. The easiest way to test is by running the Spark Pi example:
spark-submit \
--class org.apache.spark.examples.SparkPi \
--queue <queue_name> \
--master yarn \
--deploy-mode cluster \
--num-executors 1 \
--driver-memory 512m \
--executor-memory 512m \
--executor-cores 1 \
/usr/hdp/current/spark2-client/examples/jars/spark-examples_*.jar 10
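Once the example is submitted, you can confirm which queue YARN actually placed it in. This is a minimal check, assuming the YARN CLI is available on your client node and <application_id> is the ID printed by spark-submit:
# List running applications; the output includes a Queue column
yarn application -list -appStates RUNNING
# Or check a single application; the report includes a "Queue" field
yarn application -status <application_id>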
For reference, from the spark-submit help output (Spark on YARN only):
--queue QUEUE_NAME    The YARN queue to submit to (Default: "default").
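If the --queue flag still appears to be ignored, an alternative worth trying is passing the queue through the Spark configuration instead; this is only a sketch, assuming <queue_name> is a queue defined in your Capacity Scheduler:
# Equivalent to --queue; the value also shows up as spark.yarn.queue
# on the Environment tab of the Spark UI
spark-submit \
--class org.apache.spark.examples.SparkPi \
--master yarn \
--deploy-mode cluster \
--conf spark.yarn.queue=<queue_name> \
/usr/hdp/current/spark2-client/examples/jars/spark-examples_*.jar 10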
Created 12-20-2022 10:18 PM
Hi @Samie
Is there any update on your testing?
Created 01-02-2023 12:05 AM
@Samie, has the reply above helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
Regards,
Vidya Sargur,