Created 09-27-2016 07:23 AM
I am trying to launch a Spark job using spark-submit, but I am getting the error "Cannot load main class from JAR file".
spark-submit \
--verbose --master local[4] \
--class com.training.bigdata.SparkPhoenixHbase sparkPhoenixHbase-1.0-SNAPSHOT-job.jar
Created 09-27-2016 07:31 AM
Seems to be related to Jira SPARK-4298. Please try the spark-submit command as below:
spark-submit \
--verbose --master local[4] \
--class SparkPhoenixHbase com/training/bigdata/sparkPhoenixHbase-1.0-SNAPSHOT-job.jar
Created 09-27-2016 07:44 AM
Hi @Sindhu,
I tried it as you suggested, but I am still getting the error below:
java.lang.ClassNotFoundException: SparkPhoenixHbase
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.spark.util.Utils$.classForName(Utils.scala:173)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:652)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Created 09-27-2016 07:54 AM
Please share the output of the command jar tvf sparkPhoenixHbase-1.0-SNAPSHOT-job.jar.
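For context, the jar listing is what reveals whether the name passed to --class actually matches an entry in the jar: the class file path uses slashes, while --class expects the dotted, fully qualified name. A minimal Python sketch of that mapping (it builds a toy jar in memory, so it assumes nothing about your actual jar; the class name is taken from this thread):

```python
import zipfile, io

# Build a toy "jar" (a jar is just a zip archive) containing a class file
# at the path matching the package com.training.bigdata.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as jar:
    jar.writestr("com/training/bigdata/SparkPhoenixHbase.class", b"")

# What `jar tvf` would show: entry paths with slashes instead of dots.
with zipfile.ZipFile(buf) as jar:
    entries = jar.namelist()
print(entries)  # ['com/training/bigdata/SparkPhoenixHbase.class']

# The name to pass to --class is the entry path with '/' replaced by '.'
# and without the .class suffix:
main_class = entries[0][:-len(".class")].replace("/", ".")
print(main_class)  # com.training.bigdata.SparkPhoenixHbase
```

If grepping the real jar tvf output for SparkPhoenixHbase shows no such entry, the class never made it into the jar, which points at the build rather than at spark-submit.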
Created 09-27-2016 08:01 AM
Created 09-27-2016 08:18 AM
Hi @Sindhu, I think I have found the solution. I was missing the "package" phase while building the jar with Maven.
Earlier I was using this command:
--> mvn clean compile assembly:single
Now I have changed the command to:
--> mvn clean package assembly:single
After building the jar with the above command, the Spark job runs fine.
Thanks a lot @Sindhu for your help. I got to the solution because of your pointers.
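For anyone hitting the same build issue: the assembly plugin can also be bound to the package phase in the pom, so that a plain mvn clean package produces the job jar with the main class recorded in the manifest. An illustrative configuration (the descriptor, class name, and plugin version are assumptions based on this thread, not taken from the actual pom):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- bundle project classes together with dependencies -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <mainClass>com.training.bigdata.SparkPhoenixHbase</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <!-- run assembly:single automatically during "mvn package" -->
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this binding in place there is no need to invoke assembly:single by hand, which avoids the compile-vs-package mistake entirely.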
Created 09-29-2016 07:09 AM
Thank you for the update. Please mark the helpful answer to close the discussion.
Created 08-23-2018 12:15 PM
I too was facing the same issue. I had copied the command from a Windows editor (Word) into Notepad++ and was running it on an EMR Linux machine, and it showed the same error. The problem was resolved only when I typed the command myself in Notepad++ and changed the file's line endings (Edit > EOL Conversion > UNIX); after that it ran successfully without the error. Please try this if you face the same issue.
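The reason the EOL conversion matters: on Linux a backslash only continues a line when it is immediately followed by a newline, so a Windows CRLF ending (backslash + carriage return + newline) breaks a multi-line spark-submit command apart. A minimal sketch of the check and the fix (the command text is illustrative):

```python
# A multi-line command as saved by a Windows editor: lines end in \r\n,
# so the continuation backslash is followed by \r, not by the newline.
script = b"spark-submit \\\r\n  --class com.training.bigdata.SparkPhoenixHbase app.jar\r\n"

# Detect Windows line endings before running the script on Linux.
has_crlf = b"\r\n" in script
print(has_crlf)  # True

# Notepad++'s "EOL Conversion > UNIX" is equivalent to stripping the \r:
fixed = script.replace(b"\r\n", b"\n")
print(b"\r" in fixed)  # False
```

The same check can be done from a terminal with cat -v, where stray carriage returns show up as ^M at the end of each line.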