
can't access pyspark shell


Explorer

Hi Team,

When I try to access the pyspark shell, it fails with the error below.

 

py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:124)
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:64)
	at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:151)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:538)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:238)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
	at py4j.GatewayConnection.run(GatewayConnection.java:238)
	at java.lang.Thread.run(Thread.java:745)


Re: can't access pyspark shell

Explorer

Hi @Harish19,

Can you tell us which versions of Java, Spark, and CDH you're running?


Otherwise, try copying spark-assembly-XXX.jar to HDFS (from the local filesystem to HDFS), then add this parameter:

spark.yarn.jars hdfs://IP/spark/spark-assembly-XXX.jar

to the spark-defaults.conf file, and don't forget to restart YARN.

I hope this works for you.
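As a rough sketch of the steps above (the jar name keeps the thread's XXX placeholder, and the local path and HDFS directory are illustrative examples, not taken from the thread):

```shell
# Copy the Spark assembly jar from the local filesystem to HDFS.
# The local path below is an example; locate the actual jar on your node first.
hdfs dfs -mkdir -p /spark
hdfs dfs -put /opt/cloudera/parcels/CDH/lib/spark/lib/spark-assembly-XXX.jar /spark/

# Then append to spark-defaults.conf (namenode address is a placeholder):
#   spark.yarn.jars hdfs://<namenode>/spark/spark-assembly-XXX.jar
# and restart YARN so the setting takes effect.
```

This points the YARN application master at a jar already in HDFS instead of uploading it on every launch.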




Re: can't access pyspark shell

Explorer

Thanks @Bildervic for your reply. I can do what you mentioned, but the thing is, we are facing this issue only from one edge node and not from the other edge node, which has the same clients installed.

 

To your questions:

Java version: Oracle 1.8

Spark: 2.3

CDH: 5.14

 

Thanks.

 

 

Re: can't access pyspark shell

Cloudera Employee

Hi,

 

Could you please share the entire error log from the console for analysis, and also share the pyspark command you are submitting?
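One common way to capture what's being asked for above (the launch flags and the application ID are illustrative placeholders, not taken from the thread):

```shell
# Capture the full console output when launching the shell:
pyspark --master yarn 2>&1 | tee pyspark-launch.log

# If the output mentions a YARN application ID, fetch its logs as well:
yarn logs -applicationId <application_id> > yarn-app.log
```

The YARN application log usually contains the application master's failure reason, which the client-side stack trace above does not show.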

 

 

Thanks

AK