Support Questions

Find answers, ask questions, and share your expertise

Cannot load main class from JAR file

Contributor

I am trying to launch a Spark job using spark-submit, but I am getting the error "Cannot load main class from JAR file":

spark-submit \
  --verbose --master local[4] \
  --class com.training.bigdata.SparkPhoenixHbase sparkPhoenixHbase-1.0-SNAPSHOT-job.jar
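When spark-submit reports this error, a first sanity check is to confirm the class entry actually exists inside the jar (the same thing `jar tvf` shows). A jar is an ordinary zip archive, so the check can be sketched in Python; the jar name and class below are taken from this thread, and the stand-in jar is built here only so the example runs end to end:

```python
import zipfile

def has_class(jar_path, fqcn):
    """Return True if the fully-qualified class name has a .class entry in the jar."""
    entry = fqcn.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()

# Build a tiny stand-in jar just for illustration (normally Maven produces it).
with zipfile.ZipFile("sparkPhoenixHbase-1.0-SNAPSHOT-job.jar", "w") as jar:
    jar.writestr("com/training/bigdata/SparkPhoenixHbase.class", b"\xca\xfe\xba\xbe")

print(has_class("sparkPhoenixHbase-1.0-SNAPSHOT-job.jar",
                "com.training.bigdata.SparkPhoenixHbase"))  # → True
```

If the class entry is missing, the jar was not packaged correctly and no spark-submit option will fix it.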

1 ACCEPTED SOLUTION

Contributor

Hi @Sindhu, I think I have found the solution. I was missing the "package" phase when building the jar with Maven.

I was using this command earlier:

mvn clean compile assembly:single

I have now changed it to:

mvn clean package assembly:single

After building the jar with this command, the Spark job runs fine.
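For reference, binding the assembly plugin to the package phase in pom.xml means a plain `mvn clean package` produces the job jar, with the main class recorded in the manifest. A minimal sketch of such a configuration (illustrative only; the main class is taken from this thread, everything else is a common default, and your pom.xml may differ):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <mainClass>com.training.bigdata.SparkPhoenixHbase</mainClass>
      </manifest>
    </archive>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```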

Thanks a lot @Sindhu for your help.


7 REPLIES

@anjul tiwari

This seems to be related to Jira SPARK-4298. Please try the spark-submit command below:

spark-submit \
  --verbose --master local[4] \
  --class SparkPhoenixHbase com/training/bigdata/sparkPhoenixHbase-1.0-SNAPSHOT-job.jar

Contributor

Hi @Sindhu,

I tried as you suggested, but I am still getting the error below:

java.lang.ClassNotFoundException: SparkPhoenixHbase
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:173)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:652)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

@anjul tiwari

Please share the output of the command jar tvf sparkPhoenixHbase-1.0-SNAPSHOT-job.jar.

Contributor
@Sindhu

Here is the output of jar tvf sparkPhoenixHbase-1.0-SNAPSHOT-job.jar.

output.txt



@anjul tiwari

Thank you for the update. Please mark the helpful answer to close the discussion.

New Contributor

I faced the same issue. I had copied the command from a Windows editor (Word) into Notepad++ and was running it on an EMR Linux machine, and it showed the same error. The problem was resolved only when I typed the command myself in Notepad++ and converted the file's line endings (Edit > EOL Conversion > UNIX). Please try this if you face the same issue.
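The likely cause in this case is that Windows line endings leave a carriage return (`\r`) at the end of every line, so the shell receives arguments like `local[4]\r` or a jar name ending in `\r` that does not exist on disk. A small sketch of detecting and stripping CRLF endings, the same fix that Notepad++'s EOL conversion performs (the sample command string below is a stand-in for this thread's command):

```python
# A command pasted from a Windows editor: each line ends in CRLF ("\r\n").
CRLF_CMD = ("spark-submit \\\r\n"
            "  --verbose --master local[4] \\\r\n"
            "  --class com.training.bigdata.SparkPhoenixHbase job.jar\r\n")

def to_unix(text):
    """Replace Windows CRLF line endings with plain LF."""
    return text.replace("\r\n", "\n")

fixed = to_unix(CRLF_CMD)
print("\r" in CRLF_CMD)  # → True: stray carriage returns confuse the shell
print("\r" in fixed)     # → False: safe to run on Linux
```

On the command line, `cat -A script.sh` makes the stray `^M` characters visible, and tools such as dos2unix perform the same conversion.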