05-10-2017
09:59 AM
Hi,

We are using CDH 5.11 with YARN and Spark 2. Our application depends on Guava 19, which we shaded into our jar. But when we submit the job, we see:

java.lang.NoSuchMethodError: com.google.common.base.Splitter.splitToList(Ljava/lang/CharSequence;)Ljava/util/List;

When I check the environment for the job, we see /opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/lib/hadoop/../../../CDH-5.11.0-1.cdh5.11.0.p0.34/jars/guava-11.0.2.jar on the classpath.

What options do we have to use a different version of Guava? I tried spark.executor.userClassPathFirst=true and spark.driver.userClassPathFirst=true, but ran into issues starting the job itself. What worked for us when running a job in local mode was setting the Spark distribution classpath in Cloudera Manager:

SPARK_DIST_CLASSPATH=/home/dep/test-lib/guava-19.0.jar:$SPARK_DIST_CLASSPATH
export SPARK_DIST_CLASSPATH

When we run on YARN with the above Spark setting, it still fails. I also tried setting yarn.application.classpath with /home/dep/test-lib/* as the first entry, but it still failed.

Can you suggest the recommended approach for resolving this conflict?
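One thing worth checking: since the NoSuchMethodError still references the original com.google.common package, it may be that Guava 19 was bundled into the jar but its classes were not relocated, so the cluster's guava-11.0.2.jar is still resolved first at runtime. If the jar is built with the Maven Shade Plugin, a relocation rule moves the shaded Guava classes into a private namespace so they cannot collide with CDH's copy. A minimal sketch (the shadedPattern prefix com.mycompany.shaded.guava is an arbitrary example name, not from the original post):

```xml
<!-- pom.xml fragment: relocate Guava inside the shaded jar so the
     application's Guava 19 cannot clash with the cluster's Guava 11 -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- rewrite com.google.common.* references in our classes -->
            <pattern>com.google.common</pattern>
            <shadedPattern>com.mycompany.shaded.guava</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With relocation in place, the application bytecode calls com.mycompany.shaded.guava.base.Splitter instead of com.google.common.base.Splitter, so neither userClassPathFirst nor classpath reordering is needed for the Guava conflict.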
Labels:
- Apache Spark
- Apache YARN