
How to run a Scala app JAR in a Spark environment

Explorer

Hi Owen, how do I run the JAR of a Scala app?

 

When I use "java -jar sparkalsapp-build.jar" , it look like this:

 

[root@hadoop140 SparkALSApp]# java -jar sparkalsapp-build.jar
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Seq
        at org.hisense.intelligence.MovieLensALS.main(MovieLensALS.scala)
Caused by: java.lang.ClassNotFoundException: scala.collection.Seq
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        ... 1 more
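
I guess this means the Scala runtime library is not on the classpath when the app is launched with plain "java -jar". One thing I can try is adding scala-library.jar to the classpath explicitly (the path below is only a guess; it depends on where Scala is installed):

# guessed location of scala-library.jar; adjust to your Scala installation
java -cp "sparkalsapp-build.jar:$SCALA_HOME/lib/scala-library.jar" \
     org.hisense.intelligence.MovieLensALS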

**************************************************************************************************************************************

 

When I use "scala -classpath sparkalsapp-build.jar org.hisense.intelligence.MovieLensALS", it look like this:

 

[root@hadoop140 SparkALSApp]# scala -classpath sparkalsapp-build.jar org.hisense.intelligence.MovieLensALS
Spark ALS for movie recommendation begin...
java.lang.ClassNotFoundException: org.apache.spark.SparkConf
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        at org.hisense.intelligence.MovieLensALS$.main(MovieLensALS.scala:37)
        at org.hisense.intelligence.MovieLensALS.main(MovieLensALS.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at scala.tools.nsc.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:71)
        at scala.tools.nsc.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
        at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:139)
        at scala.tools.nsc.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:71)
        at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:139)
        at scala.tools.nsc.CommonRunner$class.run(ObjectRunner.scala:28)
        at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:45)
        at scala.tools.nsc.CommonRunner$class.runAndCatch(ObjectRunner.scala:35)
        at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:45)
        at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:74)
        at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:96)
        at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:105)
        at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
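
So the scala launcher supplies the Scala runtime, but now the Spark classes themselves are missing. I guess the Spark jars also have to be on the classpath (the assembly directory below is only an example; the location differs per installation):

# "lib/*" is the Java classpath wildcard for every jar in that directory (guessed path)
java -cp "sparkalsapp-build.jar:$SCALA_HOME/lib/scala-library.jar:/usr/lib/spark/assembly/lib/*" \
     org.hisense.intelligence.MovieLensALS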

 

****************************************************************************************************************

But my code runs fine in spark-shell.

Please tell me how to run the JAR of a Scala app.

Thanks.

Xuesong
1 ACCEPTED SOLUTION

Explorer

Thanks to Mr. Owen, who told me earlier to use the Cloudera repo.

My Spark ALS app is now running successfully on CDH 4.6.

I changed my pom.xml to use the Cloudera repos:

<repositories>
        <repository>
            <id>cloudera.repo</id>
            <url>https://repository.cloudera.com/artifactory/cloudera-repos</url>
            <name>Cloudera Repositories</name>
            <releases>
                <enabled>true</enabled>
            </releases>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
        </repository>
    </repositories>
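
With this repository in place, Maven can fetch the CDH-specific Spark artifacts. If an earlier failed lookup was cached, forcing an update check helps (a standard Maven invocation, nothing CDH-specific):

mvn -U clean package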

 

And I changed the dependencies in my pom.xml:

 

<dependencies>
        <!--<dependency>-->
            <!--<groupId>org.apache.spark</groupId>-->
            <!--<artifactId>spark-core_2.10</artifactId>-->
            <!--<version>0.9.0-incubating</version>-->
        <!--</dependency>-->

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.2.0</version>
        </dependency>

        <!--<dependency>-->
            <!--<groupId>org.apache.spark</groupId>-->
            <!--<artifactId>spark-streaming_2.10</artifactId>-->
            <!--<version>0.9.0-incubating</version>-->
        <!--</dependency>-->

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.10</artifactId>
            <version>0.9.0-incubating</version>
        </dependency>

        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.10.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>0.9.0-cdh4.6.0</version>
        </dependency>
    </dependencies>
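
The run step then looks roughly like this (the jar name is the one from this thread, and the Spark and Scala paths are guesses, so adjust them to your installation; newer Spark releases also ship a spark-submit script that assembles this classpath for you):

# app jar plus the Spark and Scala runtime jars on one classpath
java -cp "target/sparkalsapp-build.jar:/usr/lib/spark/assembly/lib/*:$SCALA_HOME/lib/scala-library.jar" \
     org.hisense.intelligence.MovieLensALS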

Xuesong
