Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

sqlContext not started after spark-shell command issued at console of CDH5.5

Contributor

Hi All,

 

My CDH 5.5 was running fine, but now when I type the spark-shell command I see the sqlContext issue below. Can anyone suggest what I should do to fix it? spark-shell was working perfectly until this started.

 

Please note that I restarted Cloudera Manager from the admin console.

 

16/08/04 13:38:51 ERROR Utils: Uncaught exception in thread main
java.lang.NullPointerException
at org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
16/08/04 13:38:51 INFO SparkContext: Successfully stopped SparkContext
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create directory /user/cloudera/.sparkStaging/application_1470339377450_0002. Name node is in safe mode.
The reported blocks 919 needs additional 2 blocks to reach the threshold 0.9990 of total blocks 921.
The number of live datanodes 1 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1416)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNames
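The arithmetic in the safe-mode message can be checked directly. A quick sketch of that calculation in plain shell/awk, using the block counts from the log above (this mimics the threshold check; it is not HDFS's exact internal rounding code):

```shell
# Safe-mode threshold check, using the counts reported in the log above.
total=921       # total blocks
reported=919    # blocks reported by datanodes so far
threshold=0.999 # dfs.namenode.safemode.threshold-pct (0.9990 in the log)

# The NameNode stays in safe mode until reported >= ceil(threshold * total).
needed=$(awk -v t="$total" -v th="$threshold" \
  'BEGIN { n = t * th; print (n == int(n)) ? n : int(n) + 1 }')

echo "need $needed blocks reported, have $reported, short by $((needed - reported))"
# → need 921 blocks reported, have 919, short by 2
```

This matches the log's "needs additional 2 blocks": with only one datanode reporting, the NameNode never reaches the 0.999 threshold and stays in safe mode, so the sparkStaging directory cannot be created.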


<console>:10: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:10: error: not found: value sqlContext
import sqlContext.sql

 

1 ACCEPTED SOLUTION

Contributor

It got solved by the command below:

 

sudo -u hdfs hdfs dfsadmin -safemode leave
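For anyone hitting this later, it can help to check the NameNode's safe-mode status before and after forcing it off. A sketch using the standard hdfs dfsadmin and fsck subcommands (run as the hdfs superuser on a cluster node):

```shell
# Check whether the NameNode is currently in safe mode.
sudo -u hdfs hdfs dfsadmin -safemode get

# Force the NameNode out of safe mode (the fix above).
sudo -u hdfs hdfs dfsadmin -safemode leave

# Confirm it actually left.
sudo -u hdfs hdfs dfsadmin -safemode get

# Optional: look for missing or corrupt blocks, since safe mode here was
# caused by under-reported blocks.
sudo -u hdfs hdfs fsck /
```

Note that leaving safe mode manually skips the block-report wait; if blocks are genuinely missing rather than just slow to report, fsck will show them.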

 


3 REPLIES


Visitor

I am using the Cloudera VM 5.7. When I type spark-shell, I get the following error:

<console>:16: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:16: error: not found: value sqlContext
import sqlContext.sql

Master Collaborator

5.5 or 5.7? The title and the text disagree. CDH 5.5 shipped Spark 1.4, and I am not sure whether the shell exposed SQLContext as sqlContext by default in that version. It should in Spark 1.6 (= CDH 5.6+).
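If the shell really does not predefine sqlContext (an older Spark, or a startup failure like the safe-mode error above), one can be created by hand once the shell is up. A hedged sketch driving spark-shell from the command line; it assumes sc itself started successfully:

```shell
# Feed a small script to spark-shell; the heredoc body is Scala.
spark-shell <<'EOF'
// sqlContext is not predefined here, so build one from the SparkContext.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._
println(sqlContext.getClass.getName)
EOF
```

If sc is also missing, the shell's startup failed entirely (as in the safe-mode case in this thread), and the underlying HDFS/YARN problem needs fixing first.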