Guava library conflict

New Contributor

Hi

 

We are using CDH 5.11 with YARN and Spark 2.

In our application we use the Guava 19 dependency, and we shaded it into our jar.

But when we submit the job, we see:

java.lang.NoSuchMethodError: com.google.common.base.Splitter.splitToList(Ljava/lang/CharSequence;)Ljava/util/List;

 

When I check the environment for the job, I see:

/opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/lib/hadoop/../../../CDH-5.11.0-1.cdh5.11.0.p0.34/jars/guava-11.0.2.jar
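
For context, Splitter.splitToList was only added in Guava 15, so it does not exist in the bundled guava-11.0.2 above. Below is a minimal Scala sketch of the kind of call that triggers the error; the object name and values are made up purely for illustration.

import com.google.common.base.Splitter
import scala.collection.JavaConverters._

object SplitterCheck {  // hypothetical name, only the Splitter call matters
  def main(args: Array[String]): Unit = {
    // Splitter.splitToList(CharSequence) exists in Guava 15+ but not in Guava 11.0.2,
    // so this line throws NoSuchMethodError when the cluster's Guava wins on the classpath.
    val parts = Splitter.on(',').splitToList("a,b,c").asScala
    println(parts.mkString(" | "))
  }
}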

 

What options do we have to use a different version of Guava?

 

I tried spark.executor.userClassPathFirst=true and spark.driver.userClassPathFirst=true, but then ran into issues starting the job at all.

 

What worked for us when running a job in local mode is setting the Spark distribution classpath in Cloudera Manager:

SPARK_DIST_CLASSPATH=/home/dep/test-lib/guava-19.0.jar:$SPARK_DIST_CLASSPATH
export SPARK_DIST_CLASSPATH

 

When we run on YARN with the above Spark setting, it still fails.

We also tried setting yarn.application.classpath with /home/dep/test-lib/* as the first entry, but it still failed.

 

Can you suggest what the recommended approach is for this conflict?

 

 

2 REPLIES

Have you resolved this? I had a similar issue, but did not find an answer for it.

Super Collaborator

You will need to shade the Guava that you use in your application. There is no way to replace the Guava that is part of CDH with a later release; it would break a number of things.

From the previous message, it looks like the Guava dependency was not shaded correctly.
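
For reference, here is a minimal sketch of what correct shading looks like with sbt-assembly, assuming the job is built with sbt; the relocated package name is made up, and with Maven the equivalent is the maven-shade-plugin's relocation feature.

// build.sbt
// Relocate Guava so the application's bytecode references the renamed package
// and never resolves against the cluster's guava-11.0.2.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "myshaded.guava.@1").inAll
)

After building, the assembled jar should contain the Guava classes only under the relocated package, and the application's own classes should reference that package rather than com.google.common.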

 

Wilfred