
How to submit spark Application when the cluster is setup using Ambari?


Hi All,
I have set up a cluster using Ambari. Now I'm trying to run a Spark application in standalone mode, but it's not working. I used the command below. Am I giving the wrong port? Which port should I be using? Please help.

spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://111.33.22.111:50070 \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /usr/hdp/2.6.4.0-91/spark2/examples/jars/spark-examples_2.11-2.2.0.2.6.4.0-91.jar \
  1000

Error:

18/05/02 16:48:48 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /138.85.56.240:50070 is closed
18/05/02 16:49:08 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /138.85.56.240:50070 is closed
18/05/02 16:49:28 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /138.85.56.240:50070 is closed
18/05/02 16:49:48 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
18/05/02 16:49:48 ERROR SparkContext: Error initializing SparkContext.
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
    at scala.Predef$.require(Predef.scala:224)
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2516)
    at org.apache.spark.sql.SparkSession$Builder$anonfun$7.apply(SparkSession.scala:923)
    at org.apache.spark.sql.SparkSession$Builder$anonfun$7.apply(SparkSession.scala:915)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:915)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:782)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

4 Replies


I used --master spark://111.33.22.111:7077 as well. That didn't work either.
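
Worth noting: port 7077 only answers if a standalone Spark master process is actually running on that host, and on an Ambari-managed HDP cluster Spark normally runs on YARN, with no standalone master started by default. A quick check, assuming shell access to the intended master host (the start-master.sh path below follows the stock HDP 2.6.4 Spark 2 layout and may differ on your install):

# Check whether a standalone master process is running at all
ps -ef | grep '[o]rg.apache.spark.deploy.master.Master'

# If it is not, start one manually (path is an assumption based on the HDP layout)
/usr/hdp/2.6.4.0-91/spark2/sbin/start-master.sh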

Expert Contributor
@Satya P

The error:

StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.

together with the master URL --master spark://111.33.22.111:50070

Is there a specific reason you are using the NameNode (NN) port 50070 instead of the Spark-related ports?
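
If a standalone master really is running on that host, its default RPC port is 7077, and its web UI (port 8080 by default) shows the exact spark:// URL to use. Assuming defaults, the submit would look like:

spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://111.33.22.111:7077 \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /usr/hdp/2.6.4.0-91/spark2/examples/jars/spark-examples_2.11-2.2.0.2.6.4.0-91.jar \
  1000

On an Ambari/HDP cluster it is usually simpler to submit to YARN, which Ambari already manages. Note that --total-executor-cores applies only to standalone and Mesos modes; on YARN you size the job with --num-executors and --executor-cores instead (the numbers below are placeholders, not recommendations):

spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn \
  --deploy-mode cluster \
  --executor-memory 20G \
  --num-executors 10 \
  --executor-cores 10 \
  /usr/hdp/2.6.4.0-91/spark2/examples/jars/spark-examples_2.11-2.2.0.2.6.4.0-91.jar \
  1000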

Thanks

Venkat


@Venkata Sudheer Kumar M It is unresponsive even if I use --master spark://111.33.22.111:8020.

18/05/03 09:04:51 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /111.33.22.111:8020 is closed
18/05/03 09:05:11 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /111.33.22.111:8020 is closed
18/05/03 09:05:31 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /111.33.22.111:8020 is closed
18/05/03 09:05:51 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
18/05/03 09:05:52 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem


In fact, I tried several ports. I'm not able to figure out which port the NameNode is listening on.
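
For reference, 50070 (NameNode web UI) and 8020 (NameNode RPC) are HDFS ports, so a Spark master will never answer on them, and the NameNode port is not what spark-submit needs in any case. To see which ports a host is actually listening on, assuming shell access (run as root to see the owning process names):

# List listening TCP sockets with their owning processes
netstat -tlnp
# or, on newer systems where netstat is absent:
ss -tlnp

A running standalone master also logs its own URL at startup, e.g. "Starting Spark master at spark://<host>:7077".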