Member since: 10-24-2019 · 3 Posts · 1 Kudos Received · 0 Solutions
01-14-2022 05:32 AM
@Mase , you can also try to shade the library org.apache.http: relocate("org.apache.http", "org.shaded.apache.http")
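If you build with Gradle, that relocation can be wired up with the Shadow plugin. A minimal sketch, assuming the com.github.johnrengelman.shadow plugin (the plugin version and the shaded package name are illustrative):

```kotlin
// build.gradle.kts sketch (illustrative): shade the bundled httpclient/httpcore
// packages so they cannot clash with the older versions on the CDH classpath.
plugins {
    id("com.github.johnrengelman.shadow") version "7.1.2"
    // ... your existing plugins
}

tasks.shadowJar {
    // Rewrite the package prefix inside the fat jar; all references are updated.
    relocate("org.apache.http", "org.shaded.apache.http")
}
```

You would then submit the resulting shadow jar with spark-submit, so the relocated classes are the ones your job actually loads.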
01-14-2022 05:28 AM
Hello @Mase , it's a very old issue... You can try to exclude "org.apache.spark" on elasticsearch-spark-20_$scalaVersion. For example:
val sparkVersion = "2.4.0.cloudera2"
val elasticVersion = "7.10.0"
dependencies {
    implementation("org.scala-lang:scala-library:$scalaVersion.12")
    implementation("com.typesafe.scala-logging:scala-logging_$scalaVersion:3.9.3")
    compileOnly("org.apache.spark:spark-core_$scalaVersion:$sparkVersion")
    compileOnly("org.apache.spark:spark-sql_$scalaVersion:$sparkVersion")
    compileOnly("org.apache.spark:spark-hive_$scalaVersion:$sparkVersion")
    implementation("org.mongodb.spark:mongo-spark-connector_$scalaVersion:2.4.1") {
        exclude("org.mongodb")
    }
}
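Applying the same exclude pattern to the Elasticsearch connector this thread is about might look like this (a sketch only; the coordinates follow the versions quoted above, and excluding the group "org.apache.spark" is the suggestion, not a verified fix):

```kotlin
// Hypothetical build.gradle.kts fragment: pull in the Elasticsearch Spark
// connector but drop its transitive org.apache.spark dependencies, since
// Spark itself is already provided by the cluster.
dependencies {
    implementation("org.elasticsearch:elasticsearch-spark-20_$scalaVersion:$elasticVersion") {
        exclude(group = "org.apache.spark")
    }
}
```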
10-24-2019 07:59 AM
1 Kudo
Hello,
My project uses Elasticsearch with Spark 2 to create an index and then populate it.
I use these dependencies:
// https://mvnrepository.com/artifact/org.elasticsearch/elasticsearch-spark-20
val elasticVersion = "6.6.2"
libraryDependencies += "org.elasticsearch" %% "elasticsearch-spark-20" % elasticVersion
// https://mvnrepository.com/artifact/org.elasticsearch.client/elasticsearch-rest-high-level-client
libraryDependencies += "org.elasticsearch.client" % "elasticsearch-rest-high-level-client" % elasticVersion
and Spark 2:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0" % Provided
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0" % Provided
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.0" % Provided
When I run it on my local installation of Spark 2.4.x I have no problem, but when I launch it on my Cloudera Manager platform (CDH 5.16.1) with Spark 2, I get this exception:
19/10/24 16:20:00 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.NoSuchFieldError: INSTANCE
java.lang.NoSuchFieldError: INSTANCE
at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager$InternalAddressResolver.<init>(PoolingNHttpClientConnectionManager.java:591)
at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.<init>(PoolingNHttpClientConnectionManager.java:163)
at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.<init>(PoolingNHttpClientConnectionManager.java:147)
at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.<init>(PoolingNHttpClientConnectionManager.java:119)
at org.apache.http.impl.nio.client.HttpAsyncClientBuilder.build(HttpAsyncClientBuilder.java:668)
at org.elasticsearch.client.RestClientBuilder$2.run(RestClientBuilder.java:240)
at org.elasticsearch.client.RestClientBuilder$2.run(RestClientBuilder.java:237)
at java.security.AccessController.doPrivileged(Native Method)
at org.elasticsearch.client.RestClientBuilder.createHttpClient(RestClientBuilder.java:237)
at org.elasticsearch.client.RestClientBuilder.access$000(RestClientBuilder.java:42)
at org.elasticsearch.client.RestClientBuilder$1.run(RestClientBuilder.java:208)
at org.elasticsearch.client.RestClientBuilder$1.run(RestClientBuilder.java:205)
at java.security.AccessController.doPrivileged(Native Method)
at org.elasticsearch.client.RestClientBuilder.build(RestClientBuilder.java:205)
at org.elasticsearch.client.RestHighLevelClient.<init>(RestHighLevelClient.java:267)
at org.elasticsearch.client.RestHighLevelClient.<init>(RestHighLevelClient.java:259)
I have no idea why...
Thank you for your answers.
Labels:
- Apache Spark