Spark 2 - Elasticsearch java.lang.NoSuchFieldError: INSTANCE

New Contributor

Hello,

My project tries to use Elasticsearch with Spark 2 to create an index and then populate it.

I use these dependencies:

// https://mvnrepository.com/artifact/org.elasticsearch/elasticsearch-spark-20
val elasticVersion = "6.6.2"
libraryDependencies += "org.elasticsearch" %% "elasticsearch-spark-20" % elasticVersion
// https://mvnrepository.com/artifact/org.elasticsearch.client/elasticsearch-rest-high-level-client
libraryDependencies += "org.elasticsearch.client" % "elasticsearch-rest-high-level-client" % elasticVersion

and Spark 2:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0" % Provided
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0" % Provided
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.0" % Provided

When I run it from my local installation of Spark 2.4.x I have no problem, but when I launch it on my Cloudera Manager platform (CDH 5.16.1) with Spark 2, I get this exception:

19/10/24 16:20:00 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.NoSuchFieldError: INSTANCE
java.lang.NoSuchFieldError: INSTANCE
	at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager$InternalAddressResolver.<init>(PoolingNHttpClientConnectionManager.java:591)
	at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.<init>(PoolingNHttpClientConnectionManager.java:163)
	at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.<init>(PoolingNHttpClientConnectionManager.java:147)
	at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.<init>(PoolingNHttpClientConnectionManager.java:119)
	at org.apache.http.impl.nio.client.HttpAsyncClientBuilder.build(HttpAsyncClientBuilder.java:668)
	at org.elasticsearch.client.RestClientBuilder$2.run(RestClientBuilder.java:240)
	at org.elasticsearch.client.RestClientBuilder$2.run(RestClientBuilder.java:237)
	at java.security.AccessController.doPrivileged(Native Method)
	at org.elasticsearch.client.RestClientBuilder.createHttpClient(RestClientBuilder.java:237)
	at org.elasticsearch.client.RestClientBuilder.access$000(RestClientBuilder.java:42)
	at org.elasticsearch.client.RestClientBuilder$1.run(RestClientBuilder.java:208)
	at org.elasticsearch.client.RestClientBuilder$1.run(RestClientBuilder.java:205)
	at java.security.AccessController.doPrivileged(Native Method)
	at org.elasticsearch.client.RestClientBuilder.build(RestClientBuilder.java:205)
	at org.elasticsearch.client.RestHighLevelClient.<init>(RestHighLevelClient.java:267)
	at org.elasticsearch.client.RestHighLevelClient.<init>(RestHighLevelClient.java:259)

I have no idea why...

Thank you for your answers

4 REPLIES

New Contributor

Hi, 

I have the same error in my stack trace.

Did you solve the problem?

Thanks

New Contributor

Hello @Mase,

It's a very old issue...

You can try to exclude "org.apache.spark" from elasticsearch-spark-20_$scalaVersion:

val sparkVersion = "2.4.0.cloudera2"
val elasticVersion = "7.10.0"

dependencies {
    implementation("org.scala-lang:scala-library:$scalaVersion.12")
    implementation("com.typesafe.scala-logging:scala-logging_$scalaVersion:3.9.3")
    compileOnly("org.apache.spark:spark-core_$scalaVersion:$sparkVersion")
    compileOnly("org.apache.spark:spark-sql_$scalaVersion:$sparkVersion")
    compileOnly("org.apache.spark:spark-hive_$scalaVersion:$sparkVersion")
    implementation("org.mongodb.spark:mongo-spark-connector_$scalaVersion:2.4.1") {
        exclude("org.mongodb")
    }
}

New Contributor

@Mase,

You can also try to shade the library org.apache.http:

    relocate("org.apache.http", "org.shaded.apache.http")
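
That relocate line is Gradle Shadow plugin syntax. A minimal sketch of where it would sit in a build.gradle.kts, assuming the com.github.johnrengelman.shadow plugin (the plugin version here is only an example):

plugins {
    id("com.github.johnrengelman.shadow") version "6.1.0"
}

tasks.shadowJar {
    // Rename the httpcomponents packages inside the fat jar so they cannot
    // clash with the older httpclient/httpcore versions on the CDH classpath.
    relocate("org.apache.http", "org.shaded.apache.http")
}

Then submit the jar produced by the shadowJar task instead of the plain jar.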

 

New Contributor

Hello @Flof,

I know it's an old issue but unfortunately I'm working on it now and I haven't found any solution for it. 😞

I use Maven in my project, but in the code you wrote I don't find the exclusion from elasticsearch-spark.

In any case the problem seems to be linked to some conflict between the httpcomponents libraries.

It seems like the problem is in the library that contains them:

elasticsearch-rest-high-level-client

I'm trying different solutions but the error persists.
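
In Maven, the shading suggested above would look something like this with the maven-shade-plugin (the plugin version is just an example):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <relocation>
                        <!-- rename the httpcomponents packages inside the fat jar -->
                        <pattern>org.apache.http</pattern>
                        <shadedPattern>org.shaded.apache.http</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>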

 

If you can remember something let me know.

Thanks a lot