Spark on YARN: program dependencies cannot be found! I have put the jars in "/usr/hdp/3.0.0.0-1634/spark2/jars", but it does not work!

New Contributor

When I write data into HBase from my Spark job, I get a ClassNotFoundException:

19/01/16 13:23:41 WARN RpcControllerFactory: Cannot load configured "hbase.rpc.controllerfactory.class" (org.apache.hadoop.hbase.ipc.RpcControllerFactory) from hbase-site.xml, falling back to use default RpcControllerFactory
java.io.IOException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:221)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:114)
    at common.utils.hbase.HBaseUtils.getCon(HBaseUtils.java:38)
    at com.dfwy.online.sparkstreamingtask.vcloude.task.hbase.HBaseOperater.data2HBase(HBaseOperater.java:50)
    at com.dfwy.online.sparkstreamingtask.vcloude.main.JavaDirectKafkaAnalyze.lambda$null$d469caef$1(JavaDirectKafkaAnalyze.java:92)
    at org.apache.spark.api.java.JavaRDDLike$anonfun$foreach$1.apply(JavaRDDLike.scala:351)
    at org.apache.spark.api.java.JavaRDDLike$anonfun$foreach$1.apply(JavaRDDLike.scala:351)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at org.apache.spark.rdd.RDD$anonfun$foreach$1$anonfun$apply$28.apply(RDD.scala:921)
    at org.apache.spark.rdd.RDD$anonfun$foreach$1$anonfun$apply$28.apply(RDD.scala:921)
    at org.apache.spark.SparkContext$anonfun$runJob$5.apply(SparkContext.scala:2074)
    at org.apache.spark.SparkContext$anonfun$runJob$5.apply(SparkContext.scala:2074)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
    at org.apache.spark.scheduler.Task.run(Task.scala:109)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:219)
    ... 18 more
Caused by: java.lang.UnsupportedOperationException: Unable to find org.apache.hadoop.hbase.client.backoff.ClientBackoffPolicyFactory$NoBackoffPolicy
    at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:47)
    at org.apache.hadoop.hbase.client.backoff.ClientBackoffPolicyFactory.create(ClientBackoffPolicyFactory.java:42)
    at org.apache.hadoop.hbase.client.ConnectionImplementation.<init>(ConnectionImplementation.java:264)
    ... 23 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.client.backoff.ClientBackoffPolicyFactory$NoBackoffPolicy
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:43)
    ... 25 more
19/01/16 13:23:41 INFO Executor: Finished task 0.0 in stage 27.0 (TID 27). 794 bytes result sent to driver
19/01/16 13:23:41 INFO BlockManager: Removing RDD 53
19/01/16 13:23:41 INFO BlockManager: Removing RDD 52
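
The deepest "Caused by" is a ClassNotFoundException for org.apache.hadoop.hbase.client.backoff.ClientBackoffPolicyFactory$NoBackoffPolicy, a class that normally ships in the hbase-client jar. To check which jar on a node actually contains it, a loop like the following can be run (a sketch only; the /usr/hdp/3.0.0.0-1634/hbase/lib path is an assumption based on the usual HDP layout):

    # Search every HBase jar on this node for the missing class
    # (lib path assumed from the standard HDP 3.0 install)
    for j in /usr/hdp/3.0.0.0-1634/hbase/lib/*.jar; do
        unzip -l "$j" 2>/dev/null | grep -q 'ClientBackoffPolicyFactory' \
            && echo "found in: $j"
    done

If the class is present in a jar there but the executors still cannot load it, the jar is simply not on the executor classpath at runtime.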

Thanks!

2 REPLIES

Re: Spark on YARN: program dependencies cannot be found! I have put the jars in "/usr/hdp/3.0.0.0-1634/spark2/jars", but it does not work!

New Contributor

$ /usr/hdp/3.0.0.0-1634/spark2/bin/spark-submit \
    --driver-memory 1g \
    --executor-memory 2g \
    --executor-cores 2 \
    --conf spark.yarn.am.memory=1024m \
    --class com.dfwy.online.sparkstreamingtask.vcloude.main.JavaDirectKafkaAnalyze \
    --master yarn \
    --deploy-mode cluster \
    --conf spark.io.compression.codec=snappy \
    /usr/hdp/3.0.0.0-1634/spark2/examples/jars/wybigdata.jar

This is how I submit my program.
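
Note that this command never passes the HBase client jars, so in cluster mode the executors only see what YARN localizes for the application. One commonly suggested fix is to ship the HBase jars explicitly with --jars; a minimal sketch of the same submission, assuming the HDP HBase libraries live under /usr/hdp/3.0.0.0-1634/hbase/lib:

    $ /usr/hdp/3.0.0.0-1634/spark2/bin/spark-submit \
        --class com.dfwy.online.sparkstreamingtask.vcloude.main.JavaDirectKafkaAnalyze \
        --master yarn \
        --deploy-mode cluster \
        --jars $(echo /usr/hdp/3.0.0.0-1634/hbase/lib/hbase-*.jar | tr ' ' ',') \
        /usr/hdp/3.0.0.0-1634/spark2/examples/jars/wybigdata.jar

The $(echo ... | tr ' ' ',') turns the shell glob into the comma-separated list that --jars expects.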

Re: Spark on YARN: program dependencies cannot be found! I have put the jars in "/usr/hdp/3.0.0.0-1634/spark2/jars", but it does not work!

New Contributor

I also tried submitting it this way:

$ ./bin/spark-submit --class my.main.Class \
    --master yarn \
    --deploy-mode cluster \
    --jars my-other-jar.jar,my-other-other-jar.jar \
    my-main-jar.jar \
    app_arg1 app_arg2

but it does not work either!
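
If listing individual jars is awkward, another option is to put the HBase lib directory on both the driver and executor classpaths through configuration. A sketch under the same assumption about the HDP path (spark.driver.extraClassPath and spark.executor.extraClassPath are standard Spark options, but the directory must exist on every NodeManager host):

    $ ./bin/spark-submit --class my.main.Class \
        --master yarn \
        --deploy-mode cluster \
        --conf "spark.driver.extraClassPath=/usr/hdp/3.0.0.0-1634/hbase/lib/*" \
        --conf "spark.executor.extraClassPath=/usr/hdp/3.0.0.0-1634/hbase/lib/*" \
        my-main-jar.jar \
        app_arg1 app_arg2

Unlike --jars, extraClassPath does not upload anything; it only prepends the given entries to the JVM classpath, which is why the path has to be valid on every node.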