Posted 08-21-2018 04:32 AM
@rupertlssmith How you initialize sc depends on how you execute your code. If you are using the spark-shell command line, you don't need to initialize sc: it is created for you automatically when the shell starts. But if you are developing and running the code from another third-party tool, you have to initialize it yourself. Add the following lines before you call rddFromParquetHdfsFile:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

val conf = new SparkConf().setAppName("your app name").setMaster("yarn-client")
val sc = new SparkContext(conf)
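To show where those lines fit, here is a minimal standalone sketch of a Spark application that initializes sc and then uses it. The object name, the HDFS path, and the textFile call are hypothetical placeholders; in your case you would call rddFromParquetHdfsFile instead, and "yarn-client" assumes a YARN cluster is available:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object SparkContextExample {
  def main(args: Array[String]): Unit = {
    // Build the configuration; replace the app name and master as needed
    val conf = new SparkConf().setAppName("your app name").setMaster("yarn-client")
    val sc = new SparkContext(conf)

    // sc is now initialized and can be passed to any code that expects it,
    // e.g. your rddFromParquetHdfsFile call
    val lines = sc.textFile("hdfs:///tmp/example.txt") // hypothetical path
    println(lines.count())

    // Release cluster resources when done
    sc.stop()
  }
}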