
java.lang.ClassNotFoundException: com.yammer.metrics.Metrics

New Contributor

Hello,

I keep getting the error 'java.lang.ClassNotFoundException: com.yammer.metrics.Metrics' when running the Spark Streaming examples that come with HDP 2.6.1.

I have tried including the assembly jar 'spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar' with no success, and I have also tried including the metrics-core package, again with no success.

I am relatively new to Spark, so I need all the help I can get 🙂

See the trace below:

[spark@namenode streaming]$ run-example --jars /tmp/metrics-core-2.2.0.jar,/tmp/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar streaming.JavaDirectKafkaWordCount broker1:6667 test1g

17/07/21 13:28:45 INFO SparkContext: Running Spark version 2.1.1.2.6.1.0-129
17/07/21 13:28:45 INFO SecurityManager: Changing view acls to: spark
17/07/21 13:28:45 INFO SecurityManager: Changing modify acls to: spark
17/07/21 13:28:45 INFO SecurityManager: Changing view acls groups to:
17/07/21 13:28:45 INFO SecurityManager: Changing modify acls groups to:
17/07/21 13:28:45 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); groups with view permissions: Set(); users with modify permissions: Set(spark); groups with modify permissions: Set()
17/07/21 13:28:46 INFO Utils: Successfully started service 'sparkDriver' on port 54545.
17/07/21 13:28:46 INFO SparkEnv: Registering MapOutputTracker
17/07/21 13:28:46 INFO SparkEnv: Registering BlockManagerMaster
17/07/21 13:28:46 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/07/21 13:28:46 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/07/21 13:28:46 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-dd9f4e5f-8f2c-40b1-be60-d0e7c9dfd0f4
17/07/21 13:28:46 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
17/07/21 13:28:46 INFO SparkEnv: Registering OutputCommitCoordinator
17/07/21 13:28:46 INFO log: Logging initialized @1576ms
17/07/21 13:28:46 INFO Server: jetty-9.2.z-SNAPSHOT
17/07/21 13:28:46 INFO Server: Started @1646ms
17/07/21 13:28:46 WARN AbstractLifeCycle: FAILED ServerConnector@21694e53{HTTP/1.1}{0.0.0.0:4040}: java.net.BindException: Address already in use
java.net.BindException: Address already in use
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:321)
    at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
    at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:236)
    at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
    at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$newConnector$1(JettyUtils.scala:321)
    at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$httpConnect$1(JettyUtils.scala:353)
    at org.apache.spark.ui.JettyUtils$anonfun$7.apply(JettyUtils.scala:356)
    at org.apache.spark.ui.JettyUtils$anonfun$7.apply(JettyUtils.scala:356)
    at org.apache.spark.util.Utils$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2220)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2212)
    at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:356)
    at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
    at org.apache.spark.SparkContext$anonfun$10.apply(SparkContext.scala:460)
    at org.apache.spark.SparkContext$anonfun$10.apply(SparkContext.scala:460)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:836)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:138)
    at org.apache.spark.examples.streaming.JavaDirectKafkaWordCount.main(JavaDirectKafkaWordCount.java:68)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:750)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/07/21 13:28:46 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
17/07/21 13:28:46 INFO ServerConnector: Started ServerConnector@2b12a5c3{HTTP/1.1}{0.0.0.0:4041}
17/07/21 13:28:46 INFO Utils: Successfully started service 'SparkUI' on port 4041.
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7e70bd39{/jobs,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2e77b8cf{/jobs/json,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@67ef029{/jobs/job,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2755d705{/jobs/job/json,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@740abb5{/stages,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5fe8b721{/stages/json,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@578524c3{/stages/stage,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4cc547a{/stages/stage/json,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4152d38d{/stages/pool,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5398edd0{/stages/pool/json,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5cc5b667{/storage,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@758f4f03{/storage/json,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6928f576{/storage/rdd,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@69f63d95{/storage/rdd/json,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@27e0f2f5{/environment,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6db66836{/environment/json,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2de366bb{/executors,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@61a002b1{/executors/json,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@780ec4a5{/executors/threadDump,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6f70f32f{/executors/threadDump/json,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5aabbb29{/static,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@78461bc4{/,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@64f857e7{/api,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@56102e1c{/jobs/job/kill,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7927bd9f{/stages/stage/kill,null,AVAILABLE,@Spark}
17/07/21 13:28:46 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.102.16.74:4041
17/07/21 13:28:46 INFO SparkContext: Added JAR file:/tmp/metrics-core-2.2.0.jar at spark://10.102.16.74:54545/jars/metrics-core-2.2.0.jar with timestamp 1500632926492
17/07/21 13:28:46 INFO SparkContext: Added JAR file:/tmp/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar at spark://10.102.16.74:54545/jars/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar with timestamp 1500632926492
17/07/21 13:28:46 INFO SparkContext: Added JAR file:/usr/hdp/current/spark2-client/examples/jars/scopt_2.11-3.3.0.jar at spark://10.102.16.74:54545/jars/scopt_2.11-3.3.0.jar with timestamp 1500632926493
17/07/21 13:28:46 INFO SparkContext: Added JAR file:/usr/hdp/current/spark2-client/examples/jars/spark-examples_2.11-2.1.1.2.6.1.0-129.jar at spark://10.102.16.74:54545/jars/spark-examples_2.11-2.1.1.2.6.1.0-129.jar with timestamp 1500632926493
17/07/21 13:28:46 INFO Executor: Starting executor ID driver on host localhost
17/07/21 13:28:46 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40501.
17/07/21 13:28:46 INFO NettyBlockTransferService: Server created on 10.102.16.74:40501
17/07/21 13:28:46 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/07/21 13:28:46 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.102.16.74, 40501, None)
17/07/21 13:28:46 INFO BlockManagerMasterEndpoint: Registering block manager 10.102.16.74:40501 with 366.3 MB RAM, BlockManagerId(driver, 10.102.16.74, 40501, None)
17/07/21 13:28:46 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.102.16.74, 40501, None)
17/07/21 13:28:46 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.102.16.74, 40501, None)
17/07/21 13:28:46 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@26a529dc{/metrics/json,null,AVAILABLE,@Spark}
17/07/21 13:28:47 INFO EventLoggingListener: Logging events to hdfs:///spark2-history/local-1500632926523
17/07/21 13:28:47 INFO VerifiableProperties: Verifying properties
17/07/21 13:28:47 INFO VerifiableProperties: Property group.id is overridden to
17/07/21 13:28:47 INFO VerifiableProperties: Property zookeeper.connect is overridden to
Exception in thread "main" java.lang.NoClassDefFoundError: com/yammer/metrics/Metrics
    at kafka.metrics.KafkaMetricsGroup$class.newTimer(KafkaMetricsGroup.scala:89)
    at kafka.consumer.FetchRequestAndResponseMetrics.newTimer(FetchRequestAndResponseStats.scala:26)
    at kafka.consumer.FetchRequestAndResponseMetrics.<init>(FetchRequestAndResponseStats.scala:35)
    at kafka.consumer.FetchRequestAndResponseStats.<init>(FetchRequestAndResponseStats.scala:47)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$anonfun$2.apply(FetchRequestAndResponseStats.scala:60)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$anonfun$2.apply(FetchRequestAndResponseStats.scala:60)
    at kafka.utils.Pool.getAndMaybePut(Pool.scala:59)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.getFetchRequestAndResponseStats(FetchRequestAndResponseStats.scala:64)
    at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:59)
    at org.apache.spark.streaming.kafka.KafkaCluster.connect(KafkaCluster.scala:62)
    at org.apache.spark.streaming.kafka.KafkaCluster$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$withBrokers$1.apply(KafkaCluster.scala:368)
    at org.apache.spark.streaming.kafka.KafkaCluster$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$withBrokers$1.apply(KafkaCluster.scala:365)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:35)
    at org.apache.spark.streaming.kafka.KafkaCluster.org$apache$spark$streaming$kafka$KafkaCluster$withBrokers(KafkaCluster.scala:365)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitionMetadata(KafkaCluster.scala:136)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitions(KafkaCluster.scala:123)
    at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:212)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:485)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:608)
    at org.apache.spark.streaming.kafka.KafkaUtils.createDirectStream(KafkaUtils.scala)
    at org.apache.spark.examples.streaming.JavaDirectKafkaWordCount.main(JavaDirectKafkaWordCount.java:78)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:750)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.yammer.metrics.Metrics
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 31 more
17/07/21 13:28:47 INFO SparkContext: Invoking stop() from shutdown hook
17/07/21 13:28:47 INFO ServerConnector: Stopped Spark@2b12a5c3{HTTP/1.1}{0.0.0.0:4041}
17/07/21 13:28:47 INFO SparkUI: Stopped Spark web UI at http://10.102.16.74:4041
17/07/21 13:28:47 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/07/21 13:28:47 INFO MemoryStore: MemoryStore cleared
17/07/21 13:28:47 INFO BlockManager: BlockManager stopped
17/07/21 13:28:47 INFO BlockManagerMaster: BlockManagerMaster stopped
17/07/21 13:28:47 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/07/21 13:28:47 INFO SparkContext: Successfully stopped SparkContext
17/07/21 13:28:47 INFO ShutdownHookManager: Shutdown hook called
17/07/21 13:28:47 INFO ShutdownHookManager: Deleting directory /tmp/spark-49ed96b9-c982-41dc-baca-4aa892da3db6

4 Replies

Contributor

Can you try running the examples with the spark-submit command?

Reference: spark-submit command
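
For example, the general shape would be something like this (a sketch only; it reuses the jar paths and arguments from your post, including the examples jar location that appears in your log, so adjust as needed):

# submit the example class directly instead of going through run-example
spark-submit \
  --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount \
  --master local[4] \
  --jars /tmp/metrics-core-2.2.0.jar,/tmp/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar \
  /usr/hdp/current/spark2-client/examples/jars/spark-examples_2.11-2.1.1.2.6.1.0-129.jar \
  broker1:6667 test1g    # <brokers> <topic>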

New Contributor

Hi @kalai selvan

I am still getting the same output using spark-submit:

[hdfs@namenode jars]$ spark-submit --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount --master local[4] --jars /tmp/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar spark-examples_2.11-2.1.1.2.6.1.0-129.jar broker1:6667 test1g

17/07/22 17:37:24 INFO SparkContext: Running Spark version 2.1.1.2.6.1.0-129
17/07/22 17:37:25 INFO SecurityManager: Changing view acls to: hdfs
17/07/22 17:37:25 INFO SecurityManager: Changing modify acls to: hdfs
17/07/22 17:37:25 INFO SecurityManager: Changing view acls groups to:
17/07/22 17:37:25 INFO SecurityManager: Changing modify acls groups to:
17/07/22 17:37:25 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hdfs); groups with view permissions: Set(); users with modify permissions: Set(hdfs); groups with modify permissions: Set()
17/07/22 17:37:25 INFO Utils: Successfully started service 'sparkDriver' on port 53996.
17/07/22 17:37:25 INFO SparkEnv: Registering MapOutputTracker
17/07/22 17:37:25 INFO SparkEnv: Registering BlockManagerMaster
17/07/22 17:37:25 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/07/22 17:37:25 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/07/22 17:37:25 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-7e1af9d1-727d-4acf-b9c1-e964fba9c3c5
17/07/22 17:37:25 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
17/07/22 17:37:25 INFO SparkEnv: Registering OutputCommitCoordinator
17/07/22 17:37:25 INFO log: Logging initialized @1526ms
17/07/22 17:37:25 INFO Server: jetty-9.2.z-SNAPSHOT
17/07/22 17:37:25 INFO Server: Started @1596ms
17/07/22 17:37:25 WARN AbstractLifeCycle: FAILED ServerConnector@2c715e84{HTTP/1.1}{0.0.0.0:4040}: java.net.BindException: Address already in use
java.net.BindException: Address already in use
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:321)
    at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
    at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:236)
    at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
    at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$newConnector$1(JettyUtils.scala:321)
    at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$httpConnect$1(JettyUtils.scala:353)
    at org.apache.spark.ui.JettyUtils$anonfun$7.apply(JettyUtils.scala:356)
    at org.apache.spark.ui.JettyUtils$anonfun$7.apply(JettyUtils.scala:356)
    at org.apache.spark.util.Utils$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2220)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2212)
    at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:356)
    at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
    at org.apache.spark.SparkContext$anonfun$10.apply(SparkContext.scala:460)
    at org.apache.spark.SparkContext$anonfun$10.apply(SparkContext.scala:460)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:836)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:138)
    at org.apache.spark.examples.streaming.JavaDirectKafkaWordCount.main(JavaDirectKafkaWordCount.java:68)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:750)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/07/22 17:37:25 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
17/07/22 17:37:25 INFO ServerConnector: Started ServerConnector@1133ec6e{HTTP/1.1}{0.0.0.0:4041}
17/07/22 17:37:25 INFO Utils: Successfully started service 'SparkUI' on port 4041.
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7eb01b12{/jobs,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5c1bd44c{/jobs/json,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@18cc679e{/jobs/job,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@67ef029{/jobs/job/json,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6e57e95e{/stages,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@56db847e{/stages/json,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@560cbf1a{/stages/stage,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@64c2b546{/stages/stage/json,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7a11c4c7{/stages/pool,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7555b920{/stages/pool/json,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3591009c{/storage,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@b5cc23a{/storage/json,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@61edc883{/storage/rdd,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@182f1e9a{/storage/rdd/json,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@660e9100{/environment,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@9cd25ff{/environment/json,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3574e198{/executors,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@db44aa2{/executors/json,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3f093abe{/executors/threadDump,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4eeea57d{/executors/threadDump/json,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@e24ddd0{/static,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@70ab80e3{/,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@67427b69{/api,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3d6300e8{/jobs/job/kill,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@24a1c17f{/stages/stage/kill,null,AVAILABLE,@Spark}
17/07/22 17:37:25 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.102.16.74:4041
17/07/22 17:37:25 INFO SparkContext: Added JAR file:/tmp/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar at spark://10.102.16.74:53996/jars/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar with timestamp 1500734245790
17/07/22 17:37:25 INFO SparkContext: Added JAR file:/usr/hdp/2.6.1.0-129/spark2/examples/jars/spark-examples_2.11-2.1.1.2.6.1.0-129.jar at spark://10.102.16.74:53996/jars/spark-examples_2.11-2.1.1.2.6.1.0-129.jar with timestamp 1500734245790
17/07/22 17:37:25 INFO Executor: Starting executor ID driver on host localhost
17/07/22 17:37:25 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 52964.
17/07/22 17:37:25 INFO NettyBlockTransferService: Server created on 10.102.16.74:52964
17/07/22 17:37:25 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/07/22 17:37:25 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.102.16.74, 52964, None)
17/07/22 17:37:25 INFO BlockManagerMasterEndpoint: Registering block manager 10.102.16.74:52964 with 366.3 MB RAM, BlockManagerId(driver, 10.102.16.74, 52964, None)
17/07/22 17:37:25 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.102.16.74, 52964, None)
17/07/22 17:37:25 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.102.16.74, 52964, None)
17/07/22 17:37:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@323659f8{/metrics/json,null,AVAILABLE,@Spark}
17/07/22 17:37:26 INFO EventLoggingListener: Logging events to hdfs:///spark2-history/local-1500734245821
17/07/22 17:37:26 INFO VerifiableProperties: Verifying properties
17/07/22 17:37:26 INFO VerifiableProperties: Property group.id is overridden to
17/07/22 17:37:26 INFO VerifiableProperties: Property zookeeper.connect is overridden to
Exception in thread "main" java.lang.NoClassDefFoundError: com/yammer/metrics/Metrics
    at kafka.metrics.KafkaMetricsGroup$class.newTimer(KafkaMetricsGroup.scala:89)
    at kafka.consumer.FetchRequestAndResponseMetrics.newTimer(FetchRequestAndResponseStats.scala:26)
    at kafka.consumer.FetchRequestAndResponseMetrics.<init>(FetchRequestAndResponseStats.scala:35)
    at kafka.consumer.FetchRequestAndResponseStats.<init>(FetchRequestAndResponseStats.scala:47)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$anonfun$2.apply(FetchRequestAndResponseStats.scala:60)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$anonfun$2.apply(FetchRequestAndResponseStats.scala:60)
    at kafka.utils.Pool.getAndMaybePut(Pool.scala:59)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.getFetchRequestAndResponseStats(FetchRequestAndResponseStats.scala:64)
    at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:59)
    at org.apache.spark.streaming.kafka.KafkaCluster.connect(KafkaCluster.scala:62)
    at org.apache.spark.streaming.kafka.KafkaCluster$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$withBrokers$1.apply(KafkaCluster.scala:368)
    at org.apache.spark.streaming.kafka.KafkaCluster$anonfun$org$apache$spark$streaming$kafka$KafkaCluster$withBrokers$1.apply(KafkaCluster.scala:365)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:35)
    at org.apache.spark.streaming.kafka.KafkaCluster.org$apache$spark$streaming$kafka$KafkaCluster$withBrokers(KafkaCluster.scala:365)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitionMetadata(KafkaCluster.scala:136)
    at org.apache.spark.streaming.kafka.KafkaCluster.getPartitions(KafkaCluster.scala:123)
    at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:212)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:485)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:608)
    at org.apache.spark.streaming.kafka.KafkaUtils.createDirectStream(KafkaUtils.scala)
    at org.apache.spark.examples.streaming.JavaDirectKafkaWordCount.main(JavaDirectKafkaWordCount.java:78)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:750)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.yammer.metrics.Metrics
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 31 more
17/07/22 17:37:26 INFO SparkContext: Invoking stop() from shutdown hook
17/07/22 17:37:26 INFO ServerConnector: Stopped Spark@1133ec6e{HTTP/1.1}{0.0.0.0:4041}
17/07/22 17:37:26 INFO SparkUI: Stopped Spark web UI at http://10.102.16.74:4041
17/07/22 17:37:27 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/07/22 17:37:27 INFO MemoryStore: MemoryStore cleared
17/07/22 17:37:27 INFO BlockManager: BlockManager stopped
17/07/22 17:37:27 INFO BlockManagerMaster: BlockManagerMaster stopped
17/07/22 17:37:27 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/07/22 17:37:27 INFO SparkContext: Successfully stopped SparkContext
17/07/22 17:37:27 INFO ShutdownHookManager: Shutdown hook called
17/07/22 17:37:27 INFO ShutdownHookManager: Deleting directory /tmp/spark-887e00e2-8a18-4525-8fee-0bb0fc8997d5

Contributor

Try adding the jars to the driver classpath with the --driver-class-path option.
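
Something like this, for example (a sketch; the jar path is the one from your earlier --jars attempt):

# also put the Kafka assembly on the driver JVM's classpath, not just --jars
spark-submit \
  --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount \
  --master local[4] \
  --driver-class-path /tmp/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar \
  --jars /tmp/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar \
  spark-examples_2.11-2.1.1.2.6.1.0-129.jar broker1:6667 test1g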

New Contributor

Hi @kalai selvan

Using the --driver-class-path option worked, thanks!

I still had to include both it and the --jars option:

[hdfs@namenode jars]$ spark-submit --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount --master local[4] --driver-class-path /tmp/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar --jars /tmp/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar spark-examples_2.11-2.1.1.2.6.1.0-129.jar broker1:6667 test2

This does not work in cluster mode, though, even after copying the jars to the same directory on all the nodes. Why would that be? Also, why and when do we need to use --driver-class-path?
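
In case it helps anyone else, my current (unverified) understanding is that --jars ships the jars to executors and adds them to their classpaths, while --driver-class-path is what gets them onto the driver JVM's own classpath at launch time, which is apparently needed here because the Kafka client classes are loaded while the driver sets up the direct stream. For cluster mode, the next thing I plan to try is the documented configuration-property equivalents, spark.driver.extraClassPath and spark.executor.extraClassPath (a sketch only, assuming a YARN cluster and that the jar exists at the same path on every node; not verified as a fix for the cluster-mode failure above):

# cluster mode: the driver runs on a cluster node, so /tmp/...jar must exist there
spark-submit \
  --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.driver.extraClassPath=/tmp/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar \
  --conf spark.executor.extraClassPath=/tmp/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar \
  --jars /tmp/spark-streaming-kafka-0-8-assembly_2.11-2.1.1.2.6.1.0-129.jar \
  spark-examples_2.11-2.1.1.2.6.1.0-129.jar broker1:6667 test2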