Support Questions


Unknown option --job=sample during spark submit


YEAR=2018
STARTDAY=${YEAR}1115
ENDDAY=${YEAR}1115
EBDSJOB=sample
EXECUTORS_NUM=5
EXECUTORS_MEM=4G

# submit the below commands AS IS after modifying the job parameters
LOGNAME=${EBDSJOB}_${STARTDAY}-${ENDDAY}_`date +"%Y%m%d_%H_%M_%S"`
EBDSLOG=/home/svc_hortonworks/$LOGNAME.log

nohup /usr/hdp/current/spark2-client/bin/spark-submit --verbose \
  --master yarn \
  --deploy-mode client \
  --queue=default \
  --num-executors $EXECUTORS_NUM \
  --executor-memory $EXECUTORS_MEM \
  --executor-cores 1 \
  --driver-memory 8G \
  --class com.loves.spark.core.Driver \
  --jars $SPARKLE_JARS2 \
  /home/svc_hortonworks/SparkCourse-1.0-SNAPSHOT.jar \
  --job=$EBDSJOB >> $EBDSLOG 2>&1 &

Error: Unknown option --job=sample

5 REPLIES


@Nikil Katturi There is no --job option for spark-submit in Spark 2. I imagine this option is defined by the Spark application you created, right? And you are supposed to pass it as an argument to that application.

I see you have set --verbose on the spark-submit command, so I was expecting to see more output before the ERROR. Since you are redirecting the output to a log file, perhaps you can attach it to the post?


@Nikil Katturi

So you are using scopt to parse the arguments. The key is to pass them after the application jar, which I think you are doing, but perhaps there is a problem with the $SPARKLE_JARS2 variable that comes right before it. You can review this for more information:

https://stackoverflow.com/questions/40535304/how-to-pass-parameters-properties-to-spark-jobs-with-sp...

I tested with the SparkPi example like this:

/usr/hdp/current/spark2-client/bin/spark-submit --master local --class org.apache.spark.examples.SparkPi --driver-memory 512m --executor-memory 512m --num-executors 2 --executor-cores 1 examples/jars/spark-examples*.jar --job="Hello"

And I got an error from the application indicating that the argument had been passed through to the main function as expected, which means spark-submit understood it was an argument for the application and did not try to parse it itself.

So in your case I suggest you check the order of the parameters to spark-submit, follow the example in the link above, and make sure the application jar comes last, just before your application arguments. Then, in the application itself, try printing the arguments to stdout to troubleshoot, in case the error is coming from your custom code.
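For example, a minimal sketch of that kind of troubleshooting (the object name here is an assumption for illustration, not your actual com.loves.spark.core.Driver):

object ArgEcho {
  def main(args: Array[String]): Unit = {
    // Everything placed after the application jar on the spark-submit line
    // should arrive here verbatim, e.g. args(0) == "--job=sample".
    args.zipWithIndex.foreach { case (arg, i) => println(s"args($i) = $arg") }
  }
}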

HTH


@Nikil Katturi any luck with this one? Please keep me posted.


No Felix, it is still not working. Actually, I was using the same statement in the past and it did work, but now I am facing issues with it. I am trying alternatives, but your support would help me figure out the solution.

Let me know if you need my code; I can send it over to you.


Thanks for your reply. I am trying to pass the job name from the command line to my class below, as the parameter --job=$EBDSJOB.

Below is the class that reads the parameters from spark-submit. Please let me know if you have a better solution.

val cmdLineParser = new scopt.OptionParser[JobConfiguration]("DataQuality") {
  head("DataQuality")

  opt[String]("job")
    .text("Job Name")
    .action((x, config) => config.copy(job = x))

  opt[String]("appName")
    .text("Application Name")
    .action((x, config) => config.copy(appName = x))

  opt[String]("sparkMaster")
    .text("Spark Master")
    .action((x, config) => config.copy(sparkMaster = x))

  opt[String]("days")
    .text("Date Range")
    .action((x, config) => config.copy(dates = x))

  opt[String]("config")
    .text("Config Path")
    .action((x, config) => config.copy(configPath = x))

  help("help").text("prints this help text")

  checkConfig(cmdLine => if (cmdLine.dates == null) failure("dates must be specified") else success)
}
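For context, this is a minimal sketch of how I drive that parser from main; the JobConfiguration defaults and the object name are illustrative, not the exact code:

object DataQualityApp {
  // Assumed case class matching the fields used in the parser above;
  // the null defaults are for illustration only.
  case class JobConfiguration(job: String = null,
                              appName: String = null,
                              sparkMaster: String = null,
                              dates: String = null,
                              configPath: String = null)

  // ... the cmdLineParser definition from above goes here ...

  def main(args: Array[String]): Unit = {
    // scopt 3.x returns Some(config) on success and None on a parse error,
    // after printing its own "Error: ..." message and the usage text.
    cmdLineParser.parse(args, JobConfiguration()) match {
      case Some(config) => println(s"Running job: ${config.job}")
      case None         => sys.exit(1)
    }
  }
}

One thing worth noting: scopt itself reports unrecognized flags as "Error: Unknown option ...", so a message in exactly that format suggests the argument did reach the application, and that the jar deployed on the cluster may have been built from a version of the code that does not define the "job" option.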