
How to pass configuration file that hosted in HDFS to Spark Application?


I'm working with Spark Structured Streaming, and I want to pass a configuration file to my Spark application. The configuration file is hosted in HDFS. For example:

spark_job.conf (HOCON)

spark {
  appName: "",
  master: "",
  shuffle.size: 4
  etc.. 
}

kafkaSource {
  servers: "",
  topic: "",
  etc.. 
}

redisSink {
  host: "",
  port: 999,
  timeout: 2000,
  checkpointLocation: "hdfs location",
  etc.. 
}

How can I pass it to the Spark application, and how can I read this file (hosted in HDFS) from within Spark?
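To clarify what I'm after, here's a rough sketch of what I imagine, assuming the Typesafe Config library for parsing HOCON (the object name and the HDFS path are placeholders, not my real setup):

```scala
import java.io.InputStreamReader

import com.typesafe.config.{Config, ConfigFactory}
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.sql.SparkSession

object ConfigFromHdfs {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().getOrCreate()

    // Open the HOCON file directly from HDFS using the Hadoop FileSystem API.
    val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)
    val in = fs.open(new Path("hdfs:///apps/conf/spark_job.conf")) // placeholder path

    // Parse the stream with Typesafe Config.
    val config: Config = ConfigFactory.parseReader(new InputStreamReader(in))
    in.close()

    // Read values by their HOCON paths.
    val kafkaServers = config.getString("kafkaSource.servers")
    val redisPort    = config.getInt("redisSink.port")
    // ... build the streaming query from these values
  }
}
```

Is something like this the right approach, or is it better to ship the file with `spark-submit --files hdfs:///.../spark_job.conf` and parse it from the executor's working directory?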