Support Questions


How to pass HiveContext as an argument to another function using spark Scala

Below is my scenario:

	import org.apache.spark.SparkConf
	import org.apache.spark.SparkContext

	object Sample {
		def main(args: Array[String]) {
			val fileName = "SampleFile.txt"
			val conf = new SparkConf().setMaster("local").setAppName("LoadToHivePart")
			val sc = new SparkContext(conf)
			val sqlContext = new org.apache.spark.sql.SQLContext(sc)
			val hc = new org.apache.spark.sql.hive.HiveContext(sc)
			test(hc, fileName)
		}

		def test(hc: String, fileName: String) {

As per the above code, I am unable to pass the HiveContext variable "hc" from main to the test function; the compiler reports an error.

Kindly help me with this. Thanks.


Super Collaborator

Hi @Chaitanya D,

In the test function definition you must declare hc as "org.apache.spark.sql.hive.HiveContext" (or simply HiveContext if you import it), instead of String as it is currently declared.

Everything else can stay as you have described.

Hope that helps!
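A minimal sketch of the corrected code, assuming Spark 1.x (where HiveContext still exists; in Spark 2.x you would use SparkSession with enableHiveSupport instead). The body of test is illustrative only:

	import org.apache.spark.{SparkConf, SparkContext}
	import org.apache.spark.sql.hive.HiveContext

	object Sample {
		def main(args: Array[String]): Unit = {
			val fileName = "SampleFile.txt"
			val conf = new SparkConf().setMaster("local").setAppName("LoadToHivePart")
			val sc = new SparkContext(conf)
			val hc = new HiveContext(sc)
			// hc now matches the parameter type declared below
			test(hc, fileName)
		}

		// hc is typed as HiveContext, not String
		def test(hc: HiveContext, fileName: String): Unit = {
			// example use of the passed-in context; hc.sql(...) would also work
			val df = hc.read.text(fileName)
			df.show()
		}
	}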
