Support Questions
Find answers, ask questions, and share your expertise

How to pass HiveContext as an argument to another function using spark Scala

Below is my scenario:

import org.apache.spark.{SparkConf, SparkContext}

object Sample {
	def main(args: Array[String]) {
		val fileName = "SampleFile.txt"
		val conf = new SparkConf().setMaster("local").setAppName("LoadToHivePart")
		conf.set("spark.ui.port", "4041")
		val sc = new SparkContext(conf)
		val sqlContext = new org.apache.spark.sql.SQLContext(sc)
		val hc = new org.apache.spark.sql.hive.HiveContext(sc)
		hc.setConf("hive.metastore.uris", "thrift://127.0.0.1:9083")
		test(hc, fileName)
		sc.stop()
	}
	def test(hc: String, fileName: String) {
		//code.....
	}
}

With the above code I am unable to pass the HiveContext variable "hc" from main to the other function; it shows a compile error.

Kindly help me on the same. Thanks.

1 REPLY

Super Collaborator

Hi @Chaitanya D,

In the definition of the test function, you must declare hc as "org.apache.spark.sql.hive.HiveContext" (or just HiveContext if you import it) instead of String, which is its currently declared type.

Everything else can stay the same as you have described.
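For reference, here is a minimal sketch of the corrected code. It assumes Spark 1.x, where HiveContext still exists; in Spark 2.x you would pass a SparkSession built with enableHiveSupport() instead. The body of test() is just an illustrative placeholder:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object Sample {
  def main(args: Array[String]): Unit = {
    val fileName = "SampleFile.txt"
    val conf = new SparkConf().setMaster("local").setAppName("LoadToHivePart")
    conf.set("spark.ui.port", "4041")
    val sc = new SparkContext(conf)
    val hc = new HiveContext(sc)
    hc.setConf("hive.metastore.uris", "thrift://127.0.0.1:9083")

    // hc can now be passed around like any other value
    test(hc, fileName)
    sc.stop()
  }

  // The parameter is typed as HiveContext (not String),
  // so Hive operations are available inside this function.
  def test(hc: HiveContext, fileName: String): Unit = {
    // e.g. hc.sql("SHOW TABLES").show()
  }
}
```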

Hope that helps !!
