Support Questions
Find answers, ask questions, and share your expertise

Spark2-shell stuck



I ran a spark2-submit command from the command prompt on CentOS. It ran successfully; after some time I terminated it with CTRL+C.

e.g. spark2-submit --class org.apache.spark.SparkProgram.simpleapp --master yarn --deploy-mode cluster /x/xx/xxx/sbt/project/simpleapp/target/scala-2.11/simpleapp_2.11-1.0.jar


After that, when I run spark2-shell, it hangs and the Spark shell never opens.



$ spark2-shell
WARNING: User-defined SPARK_HOME (/data/opt/cloudera/parcels/SPARK2-2.3.0.cloudera4-1.cdh5.13.3.p0.611179/lib/spark2) overrides detected (/opt/cloudera/parcels/SPARK2-2.3.0.cloudera4-1.cdh5.13.3.p0.611179/lib/spark2).
WARNING: Running spark-class from user-defined location.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).


I also tried to submit a job using spark2-submit, but this time it did not work either.


Please suggest.


Re: Spark2-shell stuck




Did you check your job in Cloudera Manager under Clusters -> YARN Applications?
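You can also check from the command line whether the terminated job is still registered with YARN and holding resources. A minimal sketch using the standard YARN CLI (the application ID shown is a hypothetical example; use the IDs printed by the list command):

```shell
# List applications still running or waiting on the cluster;
# a shell stuck at startup often means an app is pending in ACCEPTED state
yarn application -list -appStates RUNNING,ACCEPTED

# If the old job is still listed, kill it by its application ID
# (ID below is a hypothetical example)
yarn application -kill application_1234567890123_0001
```

Killing a lingering application frees its containers, which may be all spark2-shell is waiting for.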


    By default spark-shell runs in the root.default queue on YARN; your fair/capacity scheduler may be configured such that there are no free resources for this queue. Try adding the parameter --queue <queue-name> to the spark-shell command, pointing to a queue with free resources.
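As a sketch, launching the shell on a different queue looks like this (the queue name "analytics" is an assumption; substitute a queue from your own scheduler configuration):

```shell
# Inspect the configured queues and their current usage
mapred queue -list

# Start the Spark shell on a queue that has free resources
# ("analytics" is a hypothetical queue name)
spark2-shell --queue analytics
```

The same --queue flag works with spark2-submit if your batch jobs are hitting the same resource shortage.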