
Spark submit in local mode against a kerberized cluster

New Contributor

Hello Cloudera Team,

I have a requirement to read an HDFS file from a Kerberized cluster with Spark in local mode. When I try this, I get "Client cannot authenticate via:[TOKEN, KERBEROS]". This is the code I implemented:

 

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class Main2 {

    public static void main(String[] args) {
        // Kerberos- and HDFS-related settings passed through the SparkConf
        SparkConf conf = new SparkConf().setAppName("spark-ml").setMaster("local[1]")
                .set("hadoop.security.authentication", "KERBEROS")
                .set("spark.hadoop.fs.defaultFS", "hdfs://server35:8020")
                .set("hadoop.rpc.protection", "privacy")
                .set("hadoop.security.authorization", "true")
                .set("spark.history.kerberos.enabled", "true")
                .set("spark.kerberos.keytab", "D:\\resources\\ruleuser.keytab")
                .set("spark.kerberos.principal", "ruleuser/server35@HADOOP.COM")
                .set("principal", "ruleuser/server35@HADOOP.COM")
                .set("keytab", "D:\\resources\\ruleuser.keytab")
                .set("spark.files", "D:\\resources\\core-site.xml,D:\\resources\\hdfs-site.xml")
                .set("class", "Main2");

        // Distribute the local resources directory (core-site.xml, hdfs-site.xml, keytab)
        SparkContext context = new SparkContext(conf);
        context.addFile("D:\\resources\\", true);

        SparkSession session = SparkSession.builder().sparkContext(context).getOrCreate();

        // Read the CSV file from the Kerberized HDFS and show the first rows
        Dataset<Row> df = session.read().format("csv").option("header", true)
                .load("hdfs://server35:8020/user/ruleuser/seahoarse/file123.csv");
        df.show(5);
    }
}

 

Will it be possible in local mode without running the kinit command before spark-submit?

 

Thanks,

Ajay Babu Maguluri.

1 REPLY

Master Collaborator

Hi @ajaybabum, yes, you can run Spark in local mode against a Kerberized cluster. For a quick test, can you open spark-shell directly, try reading the CSV file from the HDFS location, and show its contents? That will verify whether the issue is in the cluster / Spark configuration or more in your application code.
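For example, something along these lines in spark-shell (using the HDFS path from your code; this assumes the machine you run it on already has the cluster client configuration and a valid Kerberos ticket):

spark-shell --master local[1]

val df = spark.read.option("header", "true").csv("hdfs://server35:8020/user/ruleuser/seahoarse/file123.csv")
df.show(5)

If that works, the cluster / Spark configuration is likely fine and the issue is more on the application side.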

> Will it be possible in local mode without running the kinit command before spark-submit?

-- By passing the --keytab and --principal options in your spark-submit command, you don't need to run kinit before spark-submit.
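As a rough sketch (the application jar name and keytab path here are placeholders; the principal and class come from your code), the submit command would look something like:

spark-submit --master local[1] --class Main2 \
  --principal ruleuser/server35@HADOOP.COM \
  --keytab /path/to/ruleuser.keytab \
  your-application.jar

Thanks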