Created 01-12-2018 10:39 AM
I am running my Spark application with spark-submit in yarn-cluster mode, but it exits with the following exception (full log retrieved with yarn logs).
What could be the problem?
yarn logs -applicationId application_1507132932520_2087

Container: container_e131_1507132932520_2087_01_000001 on b14-bigdata.polito.it_8041
======================================================================================
LogType:stderr
Log Upload Time:Fri Jan 12 11:29:24 +0100 2018
LogLength:1835
Log Contents:
Exception in thread "main" java.lang.UnsupportedClassVersionError: it/polito/bigdata/spark/example/SparkDriver : Unsupported major.minor version 52.0
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:546)
	at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:335)
	at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:197)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:680)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:69)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:68)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
	at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:678)
	at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)

LogType:stdout
Log Upload Time:Fri Jan 12 11:29:24 +0100 2018
LogLength:0
Log Contents:
Thanks in advance.
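For reference, the class-file version packaged in the JAR can be checked directly with javap (major version 52 corresponds to Java 8); the JAR name below is only a placeholder for the actual artifact passed to spark-submit:

# Placeholder JAR name; use the artifact actually submitted to the cluster.
javap -classpath app.jar -verbose it.polito.bigdata.spark.example.SparkDriver | grep "major version"
# A class built for Java 8 prints:  major version: 52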
Created 01-12-2018 10:42 AM
The following error indicates that the JVM running your application is older than the JDK used to compile it (major.minor version 52.0 corresponds to Java 8):
Exception in thread "main" java.lang.UnsupportedClassVersionError: it/polito/bigdata/spark/example/SparkDriver : Unsupported major.minor version 52.0
Please check which JDK your JAVA_HOME variable is pointing to.
It is better to set JAVA_HOME explicitly to JDK 1.8, as JDK 1.7 reached end of life quite a while ago.
Note that from HDP 2.6.3 onwards, JDK 1.8 is mandatory: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.3/bk_support-matrices/content/ch_matrices-hdp...
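A quick way to verify (and, if needed, override) the JVM used on the cluster side; the JDK path below is only an example and depends on your installation:

# On the node(s) running the YARN containers:
java -version      # should report 1.8.x if JDK 8 is the default
echo $JAVA_HOME    # confirm it points to a JDK 8 installation

# Example path only -- adjust to where JDK 8 is actually installed:
export JAVA_HOME=/usr/jdk64/jdk1.8.0_112
export PATH=$JAVA_HOME/bin:$PATH

# Spark on YARN can also be pointed at a specific JDK per application
# (app.jar is a placeholder for your artifact):
spark-submit --master yarn --deploy-mode cluster \
  --conf spark.yarn.appMasterEnv.JAVA_HOME=/usr/jdk64/jdk1.8.0_112 \
  --conf spark.executorEnv.JAVA_HOME=/usr/jdk64/jdk1.8.0_112 \
  --class it.polito.bigdata.spark.example.SparkDriver app.jar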
Created on 01-12-2018 10:51 AM - edited 08-18-2019 01:06 AM
I checked the Java version on the server, and then changed the Java version used to generate the JAR in my IDE accordingly.
I launched the new JAR without any errors.
It's ok now, thank you!
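For anyone building outside an IDE, the equivalent fix is to compile for the cluster's Java version; a minimal sketch, assuming the cluster runs Java 7 and using placeholder source/output paths:

# Target Java 7 (class version 51) so the class loads on a JDK 7 runtime.
# Source file and output directory are placeholders.
javac -source 1.7 -target 1.7 -d classes src/it/polito/bigdata/spark/example/SparkDriver.java

In Maven builds, the same effect is typically achieved with the maven.compiler.source and maven.compiler.target properties (or the maven-compiler-plugin configuration).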