02-25-2016 09:32 AM
I have a similar issue to the one andreF mentioned in http://community.cloudera.com/t5/Batch-Processing-and-Workflow/Spark-distributed-classpath/m-p/31515...: we have several different Guava versions in /etc/spark/conf/classpath.txt. Do you know how to fix this?
Our app needs guava-16.0.1.jar, so I added it to /opt/cloudera/parcels/CDH-5.5.0-1.cdh5.5.0.p0.8/jars/ and added "/opt/cloudera/parcels/CDH-5.5.0-1.cdh5.5.0.p0.8/jars/guava-16.0.1.jar" to /etc/spark/conf/classpath.txt.
However, it doesn't work: the Spark action in Oozie still cannot find guava-16.0.1.jar and fails with "java.lang.NoSuchMethodError: com.google.common.reflect.TypeToken.isPrimitive()Z". How does classpath.txt work? Do you know how to manage or modify classpath.txt manually? Thanks!
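For context, here is a minimal reproduction of the kind of conflict I mean. The paths below are illustrative, not my actual file; the point is that classpath.txt can list more than one Guava jar, and whichever the JVM sees first wins:

```shell
# Illustrative only: fabricate a classpath.txt like the one the CDH parcel
# generates, with two Guava versions listed (paths are examples).
cat > /tmp/classpath.txt <<'EOF'
/opt/cloudera/parcels/CDH/jars/guava-11.0.2.jar
/opt/cloudera/parcels/CDH/jars/guava-16.0.1.jar
EOF

# List the distinct Guava jars on the classpath; more than one line means
# TypeToken may be resolved from the older jar first.
grep -o '[^/]*guava[^/]*\.jar' /tmp/classpath.txt | sort -u
```

If the older jar is listed first, TypeToken.isPrimitive() (added in Guava 15) is missing at runtime even though guava-16.0.1.jar is also on the classpath.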
02-25-2016 10:51 AM
It sounds like you have the Guava 16 jar in the classpath, then. If you look at the stdout from the Launcher Job, you should see that it's listed there and that it's being passed to Spark.
| 1. Built spark-assembly-1.5.3-hadoop2.6.0.jar with Guava 16.0.1 myself
| 2. Renamed it to spark-assembly-1.5.0-cdh5.5.0-hadoop2.6.0-cdh5.5.0.jar under
That's going to result in all kinds of problems. You must use CDH Spark with CDH.
I can't speak to /etc/spark/conf/classpath.txt (you'll have to ask on the Spark forum), though when run from Oozie, I don't think Spark uses that file.
Keep in mind that what you're trying to do here (replacing jars we're shipping) isn't really supported or tested, so it might not even be possible to do what you want.
02-25-2016 11:07 AM - edited 02-25-2016 11:53 AM
I reverted Spark back to CDH Spark, but the issue remains: why does spark-submit succeed with the following command on the CLI, while Oozie cannot run the same command in CDH?
spark-submit --master local --class TestCassandra --jars /tmp/zlp1/cassandra-driver-core-2.2.0-rc3.jar,/tmp/zlp1/spark-cassandra-connector_2.10-1.5.0-M2.jar,/tmp/zlp1/jsr166e-1.1.0.jar --driver-class-path /opt/cloudera/parcels/CDH/jars/guava-16.0.1.jar sparktest.jar s3n://gridx-output/sparktest/ 10 3 2
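For comparison, an Oozie `<spark>` action for the same invocation might look roughly like this. This is a hypothetical sketch: the action name and the `${jobTracker}`/`${nameNode}` properties are assumed, and the jar list, class, and arguments are copied from the spark-submit line above.

```xml
<!-- Hypothetical workflow fragment mirroring the CLI spark-submit above -->
<action name="spark-test">
    <spark xmlns="uri:oozie:spark-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <master>local</master>
        <name>TestCassandra</name>
        <class>TestCassandra</class>
        <jar>sparktest.jar</jar>
        <spark-opts>--jars /tmp/zlp1/cassandra-driver-core-2.2.0-rc3.jar,/tmp/zlp1/spark-cassandra-connector_2.10-1.5.0-M2.jar,/tmp/zlp1/jsr166e-1.1.0.jar --driver-class-path /opt/cloudera/parcels/CDH/jars/guava-16.0.1.jar</spark-opts>
        <arg>s3n://gridx-output/sparktest/</arg>
        <arg>10</arg>
        <arg>3</arg>
        <arg>2</arg>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
</action>
```

Whether options like --driver-class-path survive the trip through the Launcher Job is exactly what is in question in this thread.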
02-25-2016 11:55 AM
As I said, you should look at the stdout of the Launcher Job. Can you post the stdout, stderr, and syslogs from the Launcher Job somewhere and link to them here?
02-25-2016 11:59 AM - edited 02-25-2016 12:03 PM
The issue I posted in this topic is still not resolved. It still reports the following NoSuchMethodError:
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, com.google.common.reflect.TypeToken.isPrimitive()Z
java.lang.NoSuchMethodError: com.google.common.reflect.TypeToken.isPrimitive()Z
    at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:142)
    at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:1
The classpath of the Oozie Spark action definitely does not pick up the right guava-16.0.1.jar when the Spark action runs under Oozie, even after I tried the several approaches above.
I'd appreciate it if someone could help. Thanks!
02-25-2016 12:08 PM
As I said, you should look at the stdout of the Launcher Job. Can you post the stdout, stderr, and syslogs from the Launcher Job somewhere and link to them here? It contains a lot of useful information and might help narrow down your classpath problem.
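If there's nowhere convenient to post them, you can usually pull the Launcher Job's aggregated logs from YARN and search them locally. A sketch; the application id below is a placeholder, so substitute the real one from the Oozie web console or `oozie job -info <workflow-id>`:

```shell
# Placeholder id: substitute the Launcher Job's real YARN application id.
yarn logs -applicationId application_0000000000000_0000 > launcher-logs.txt

# Check which Guava jars actually made it onto the launcher's classpath.
grep -i 'guava' launcher-logs.txt
```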
02-25-2016 12:41 PM
Thanks for your response. Since I couldn't find a good place to paste such a long stdout log from the Oozie Spark action, I posted it to you on the original oozie user/dev mailing list.
You can check it there.
02-25-2016 12:53 PM
I'm not sure why that's not working. I can see that Guava 16 is being passed to Spark and Guava 14 isn't there (FYI: you also replaced Hadoop's Guava 11 with 16, which may cause problems for Hadoop).
Can you try yarn-client or yarn-cluster mode instead of local? My understanding is that local mode doesn't always work right.
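In case it helps, a sketch of what that retry might look like, reusing the jars and arguments from the earlier command. spark.driver.extraClassPath and spark.executor.extraClassPath are real Spark 1.x settings that prepend entries to the respective classpaths, but whether they resolve the Guava conflict on your cluster is untested:

```shell
# Sketch: same job in yarn-client mode, putting the app's Guava 16 ahead of
# the cluster's copy on both the driver and executor classpaths.
spark-submit \
  --master yarn-client \
  --class TestCassandra \
  --jars /tmp/zlp1/cassandra-driver-core-2.2.0-rc3.jar,/tmp/zlp1/spark-cassandra-connector_2.10-1.5.0-M2.jar,/tmp/zlp1/jsr166e-1.1.0.jar \
  --conf spark.driver.extraClassPath=/opt/cloudera/parcels/CDH/jars/guava-16.0.1.jar \
  --conf spark.executor.extraClassPath=/opt/cloudera/parcels/CDH/jars/guava-16.0.1.jar \
  sparktest.jar s3n://gridx-output/sparktest/ 10 3 2
```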