
SPARK SUBMIT - java.lang.NoSuchMethodError

Expert Contributor

I am trying to submit a Spark job that is packaged as a jar file under target/main/scala.

I submitted it from the spark_home/bin directory with the following command:

./spark-submit --class "SimpleApp" --master local[4] /proj/target/main/scala/SimpleProj.jar

Everything goes well until it fails with the following error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
	at com.databricks.spark.csv.util.CompressionCodecs$.<init>(CompressionCodecs.scala:29)
	at com.databricks.spark.csv.util.CompressionCodecs$.<clinit>(CompressionCodecs.scala)
	at com.databricks.spark.csv.DefaultSource.createRelation(DefaultSource.scala:189)
	at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:222)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:148)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:139)
	at projupq$.main(projupq.scala:35)
	at projupq.main(projupq.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I have been unable to resolve this error; any help would be appreciated.

Thanks

Sridhar

3 REPLIES

Super Collaborator

I think your Scala version is clashing with the Databricks spark-csv package. You are most likely running Scala 2.10, so try updating the artifact used to build your application to:

groupId: com.databricks

artifactId: spark-csv_2.10

version: 1.4.0
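
For example, if the application is built with sbt, the dependency would look roughly like this (a sketch; the exact Scala patch version 2.10.6 is an assumption):

scalaVersion := "2.10.6"

// %% appends the Scala binary version, so this resolves to spark-csv_2.10
libraryDependencies += "com.databricks" %% "spark-csv" % "1.4.0"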

Rising Star

You should run that command with the parameter "--jars spark-csv_2.10-1.4.0.jar",

and check that your Spark and Scala versions are compatible.
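
For example (a sketch based on the original command; the jar path is a placeholder, and spark-csv itself depends on commons-csv, which may also need to be added to --jars):

./spark-submit --class "SimpleApp" --master local[4] --jars /path/to/spark-csv_2.10-1.4.0.jar /proj/target/main/scala/SimpleProj.jar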

Super Collaborator

Hi @Sridhar Babu,

Apparently there is a compatibility issue with the Scala 2.11 builds of the library (spark-csv_2.11:1.3.0 and spark-csv_2.11:1.4.0);

please use version com.databricks:spark-csv_2.10:1.4.0 instead.
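
For example, those coordinates can be passed straight to spark-submit, which downloads the artifact and its dependencies from Maven Central (a sketch based on the original command):

./spark-submit --class "SimpleApp" --master local[4] --packages com.databricks:spark-csv_2.10:1.4.0 /proj/target/main/scala/SimpleProj.jar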