Created 10-11-2018 05:00 AM
I am using the Spark shell on CentOS 6. When I try to import spark.implicits._, I get the error below:
<console>:30: error: not found: value spark
       import spark.implicits._
              ^
Created 10-11-2018 07:15 AM
I also get the error below:
scala> val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
<console>:30: error: not found: value SparkSession
       val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
                   ^
Created 10-16-2018 11:27 AM
I am able to import spark.implicits._ now. Earlier I was using Spark 1; launching the Spark 2 shell solved the problem.
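For reference, a minimal sketch of why the Spark 2 shell works out of the box (assuming a stock Spark 2.x spark-shell or the CDH spark2-shell, which pre-creates a SparkSession named spark):
// The Spark 2.x shell pre-creates a SparkSession called `spark`,
// so its implicits can be imported directly at the prompt:
import spark.implicits._

// With the implicits in scope, local collections convert to Datasets:
val ds = Seq(1, 2, 3).toDS()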
Created 07-13-2020 09:41 PM
Hi,
spark.implicits is not a standalone package that you can import from a library; the spark prefix must be an existing SparkSession (or, in Spark 1.x, a SQLContext) value in scope.
Spark 1.x:
If you are using Spark 1.x, you create a SQLContext, and the implicits are a member of that SQLContext.
Example:
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(conf)      // conf is an existing SparkConf
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._        // brings the implicit conversions into scope
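For instance (a usage sketch with made-up data and column names), the Spark 1.x implicits enable toDF on an RDD of tuples:
// Illustrative only: convert an RDD of tuples to a DataFrame
// once sqlContext.implicits._ is in scope.
val rdd = sc.parallelize(Seq(("a", 1), ("b", 2)))
val df = rdd.toDF("key", "value")
df.show()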
Spark 2.x:
If you are using Spark 2.x, you create a SparkSession, and the implicits are a member of that session.
Example:
import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder.appName(appName).config("spark.master", "local[*]").getOrCreate()
import spark.implicits._
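For instance, once spark.implicits._ is in scope, a local collection converts to a DataFrame (an illustrative sketch; the data and column names are made up):
// Illustrative only: build a DataFrame from a local Seq of tuples.
val people = Seq(("alice", 30), ("bob", 25)).toDF("name", "age")
people.show()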
Note: If you created the session object under a different name, you need to import the implicits through that reference.
For example,
val rangaSpark: SparkSession = SparkSession.builder.appName(appName).config("spark.master", "local[*]").getOrCreate()
import rangaSpark.implicits._