Failed to initialize pyspark2.2 in CDH5.12

New Contributor
Environment: Python 3.6.1, JDK 1.8, CDH 5.12, Spark 2.2. I followed the official tutorial to set up Spark 2 with the CSD and parcels, and everything looks fine in Cloudera Manager. However, pyspark2 fails to initialize in the shell, and I haven't found a way to fix it. Can anyone help? I've been stuck on this for several days and my boss wants it resolved today. Thank you very much! Here's the log:

[hdfs@Master /data/soft/spark2.2]$ pyspark2
Python 3.6.1 (default, Jul 27 2017, 11:07:01)
[GCC 4.4.6 20110731 (Red Hat 4.4.6-4)] on linux
Type "help", "copyright", "credits" or "license" for more information.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/07/27 12:02:09 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
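The driver-side log above only says the YARN Application Master died; the actual failure reason is in the AM's own logs on the cluster. A minimal diagnostic sketch (the application ID and interpreter path below are hypothetical placeholders, not values from this cluster):

```shell
# The error means the YARN Application Master exited before the Spark
# driver could connect. Fetch its logs to see the real cause; find your
# application ID in the ResourceManager UI or with `yarn application -list`.
yarn logs -applicationId application_1501123456789_0001

# A common cause when running PySpark with Python 3 on CDH: the executor
# nodes launch a different (or missing) Python. Pointing Spark at the same
# interpreter on every node before launching pyspark2 may help
# (path is an assumption; use wherever Python 3.6 is installed cluster-wide):
export PYSPARK_PYTHON=/usr/local/bin/python3.6
```

If the AM log shows a Python-related traceback, the PYSPARK_PYTHON export is the likely fix; if it shows a container or memory error instead, the problem is in the YARN configuration rather than Python.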