I can submit the job through the terminal, but when I use the same command and submit the job through the YARN REST API, it is not submitted. One more thing: even though I specified the queue as 'a', the application is taken into the default queue. As I am just a beginner I am not able to pinpoint the problem, can anyone help me get out of this please. THANKS IN ADVANCE. This is the request body I am sending:
{
    'application-id': 'application_1675072653900_0123',
    'application-name': None,
    'am-container-spec': {
        'commands': {
            'command': '/home/kirankathe/Desktop/spark-3.3.1-bin-hadoop3/bin/spark-submit --master yarn --deploy-mode client --packages org.mongodb.spark:mongo-spark-connector_2.12:3.0.1 --driver-memory 1g --executor-memory 1g --num-executors 1 --executor-cores 2 --queue a --class org.apache.spark.examples.SparkPi hdfs://localhost:9000/user/kirankathe/word_count.py'
        },
        'local-resources': {
            'entry': [{
                'key': 'word_count.py',
                'value': {
                    'resource': 'hdfs://localhost:9000/user/kirankathe/word_count.py',
                    'type': 'FILE',
                    'visibility': 'APPLICATION',
                    'timestamp': 1675259542629
                }
            }]
        }
    },
    'application-type': 'SPARK',
    'priority': 0,
    'max-app-attempts': 1
}
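For reference, the YARN ResourceManager REST API (`POST /ws/v1/cluster/apps`) accepts a top-level `queue` field in the application submission context; the `--queue a` flag buried inside the `command` string is only ever seen by `spark-submit` itself, not by the REST submission path. Below is a minimal sketch of building and posting such a payload with the queue set explicitly at the top level. It assumes the ResourceManager web address is `localhost:8088` (the default), and the helper names (`build_payload`, `submit`) are my own, not part of any API:

```python
import json
from urllib import request

# Assumption: default ResourceManager web address.
RM = "http://localhost:8088"

SPARK_SUBMIT_CMD = (
    "/home/kirankathe/Desktop/spark-3.3.1-bin-hadoop3/bin/spark-submit "
    "--master yarn --deploy-mode client "
    "--packages org.mongodb.spark:mongo-spark-connector_2.12:3.0.1 "
    "--driver-memory 1g --executor-memory 1g --num-executors 1 "
    "--executor-cores 2 --queue a "
    "hdfs://localhost:9000/user/kirankathe/word_count.py"
)

def build_payload(app_id: str) -> dict:
    """Application submission context for POST /ws/v1/cluster/apps."""
    return {
        "application-id": app_id,
        "application-name": "word_count",
        # Top-level "queue" field: this is what the scheduler reads
        # when the app is submitted over the REST API.
        "queue": "a",
        "am-container-spec": {
            "commands": {"command": SPARK_SUBMIT_CMD},
        },
        "application-type": "SPARK",
        "priority": 0,
        "max-app-attempts": 1,
    }

def submit(app_id: str) -> int:
    """POST the submission context as JSON; YARN replies 202 Accepted."""
    body = json.dumps(build_payload(app_id)).encode("utf-8")
    req = request.Request(
        f"{RM}/ws/v1/cluster/apps",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status
```

Note that the body must go over the wire as JSON (so `None` becomes `null`, double quotes, `Content-Type: application/json`); posting a Python dict's `repr` would be rejected. Also, `--class org.apache.spark.examples.SparkPi` makes no sense together with a `.py` script (`--class` is for JVM jobs), so I dropped it from the sketch above.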