
I'm running a Spark2 job but get a java.lang.NoClassDefFoundError: javax/servlet/FilterRegistration


I keep getting a NoClassDefFoundError using the Java API for creating a SparkSession

My Java code just creates a simple SparkSession:

spark = SparkSession.builder()
    .master("localhost")
    .config("SPARK_MAJOR_VERSION", "2")
    .config("SPARK_HOME", "/usr/hdp/current/spark2-client")
    .appName("Spark E2D")
    .getOrCreate();

Here is the output:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/02/01 13:53:44 INFO SparkContext: Running Spark version 2.0.0
17/02/01 13:53:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/02/01 13:53:44 INFO SecurityManager: Changing view acls to: hdfs
17/02/01 13:53:44 INFO SecurityManager: Changing modify acls to: hdfs
17/02/01 13:53:44 INFO SecurityManager: Changing view acls groups to:
17/02/01 13:53:44 INFO SecurityManager: Changing modify acls groups to:
17/02/01 13:53:44 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hdfs); groups with view permissions: Set(); users with modify permissions: Set(hdfs); groups with modify permissions: Set()
17/02/01 13:53:44 INFO Utils: Successfully started service 'sparkDriver' on port 37105.
17/02/01 13:53:44 INFO SparkEnv: Registering MapOutputTracker
17/02/01 13:53:44 INFO SparkEnv: Registering BlockManagerMaster
17/02/01 13:53:44 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-0b01321d-cafe-4de1-b3f8-f232ce42754b
17/02/01 13:53:44 INFO MemoryStore: MemoryStore started with capacity 863.4 MB
17/02/01 13:53:44 INFO SparkEnv: Registering OutputCommitCoordinator

Exception in thread "main" java.lang.NoClassDefFoundError: javax/servlet/FilterRegistration
    at org.spark_project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:142)
    at org.spark_project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:135)
    at org.spark_project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
    at org.spark_project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:99)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:128)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:115)
    at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:80)
    at org.apache.spark.ui.WebUI$anonfun$attachTab$1.apply(WebUI.scala:64)
    at org.apache.spark.ui.WebUI$anonfun$attachTab$1.apply(WebUI.scala:64)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:64)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:68)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:81)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:215)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:157)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:443)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
    at org.apache.spark.sql.SparkSession$Builder$anonfun$8.apply(SparkSession.scala:831)
    at org.apache.spark.sql.SparkSession$Builder$anonfun$8.apply(SparkSession.scala:823)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)

My POM dependencies:

<dependencies>
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.2.4</version>
  </dependency>
  <dependency>
    <groupId>org.json</groupId>
    <artifactId>json</artifactId>
    <version>20090211</version>
  </dependency>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.3</version>
    <exclusions>
      <exclusion>
        <groupId>javax.servlet</groupId>
        <artifactId>servlet-api</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
  </dependency>
  <dependency>
    <groupId>joda-time</groupId>
    <artifactId>joda-time</artifactId>
  </dependency>
  <dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-csv</artifactId>
  </dependency>
  <dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
  </dependency>

  <!-- Spark -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.0</version>
  </dependency>
</dependencies>



Looks to me like there are conflicting versions of the servlet API on your classpath: even though Spark's bundled Jetty is shaded, the servlet API classes it loads are not. I can see that you've already excluded the one in hadoop-common, so I'm not sure where else it could be coming from, except maybe HBase. Alternatively, no servlet API is being pulled in at all, and you need to get one on the classpath.
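
If it turns out nothing is providing the Servlet 3.x API at all, a minimal sketch of declaring it explicitly in your POM might look like this (the javax.servlet-api 3.1.0 coordinates are an assumption on my part; use whatever version matches your Spark/Hadoop stack):

<!-- Hypothetical addition: javax.servlet.FilterRegistration ships with the Servlet 3.0+ API -->
<dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>javax.servlet-api</artifactId>
  <version>3.1.0</version>
</dependency>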

Try running

mvn dependency:tree -Dverbose > target/dependencies.txt

and then examine the dependencies.txt file to see where it's being pulled in.
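
If the tree does show an older Servlet 2.x API coming in transitively, one option is to exclude it on the dependency that drags it in. As a sketch only (assuming hbase-client turns out to be the source, which is just a guess):

<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-client</artifactId>
  <version>1.2.4</version>
  <exclusions>
    <!-- Exclude the old Servlet 2.x API so it can't shadow the 3.x classes Spark's UI needs -->
    <exclusion>
      <groupId>javax.servlet</groupId>
      <artifactId>servlet-api</artifactId>
    </exclusion>
  </exclusions>
</dependency>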


The only dependencies I see for servlet are:

javax.servlet.jsp:jsp-api:jar:2.1:runtime from hadoop-common

javax.servlet:javax.servlet-api:jar:3.1.0:provided (version managed from 3.0.1; scope managed from compile) from hadoop-client

org.glassfish.grizzly:grizzly-http-servlet:jar:2.1.2:compile from hadoop-client

org.glassfish:javax.servlet:jar:3.1:compile from hadoop-client

org.eclipse.jetty.orbit:javax.servlet:jar:3.0.0.v201112011016:compile from hadoop-client

Which of these should I remove? I have uploaded the full dependency tree (dependencies.txt) as an attachment.