

How to pass a configuration file hosted in HDFS to a Spark application?


I'm working with Spark Structured Streaming, and I want to pass a config file to my Spark application. The configuration file is hosted in HDFS. For example:

spark_job.conf (HOCON):

spark {
  appName: ""
  master: ""
  shuffle.size: 4
  # ...
}

kafkaSource {
  servers: ""
  topic: ""
  # ...
}

redisSink {
  host: ""
  port: 999
  timeout: 2000
  checkpointLocation: "hdfs location"
  # ...
}

How can I pass it to my Spark application, and how can I read this file (hosted in HDFS) from Spark?
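One common pattern (a sketch, not a definitive answer) is to pass the HDFS path of the file as a program argument, open it with the Hadoop FileSystem API that Spark already carries, and parse it with the Typesafe Config library, which understands HOCON. The path `hdfs:///apps/myapp/spark_job.conf`, the object name, and the key lookups below are assumptions based on the example config; adjust them to your layout.

```scala
import java.io.InputStreamReader

import com.typesafe.config.{Config, ConfigFactory}
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.sql.SparkSession

object ConfiguredStreamingApp {
  def main(args: Array[String]): Unit = {
    // Hypothetical HDFS location, passed as the first argument, e.g.
    //   spark-submit ... myapp.jar hdfs:///apps/myapp/spark_job.conf
    val confPath = new Path(args(0))

    // Bootstrap a session mainly to obtain the Hadoop configuration;
    // settings from the file can be applied to later readers/writers.
    val spark = SparkSession.builder().getOrCreate()
    val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)

    // Read the HOCON file straight out of HDFS and parse it.
    val reader = new InputStreamReader(fs.open(confPath))
    val config: Config =
      try ConfigFactory.parseReader(reader)
      finally reader.close()

    // Key names follow the kafkaSource section of the example file.
    val kafkaServers = config.getString("kafkaSource.servers")
    val topic        = config.getString("kafkaSource.topic")

    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", kafkaServers)
      .option("subscribe", topic)
      .load()
    // ... wire up the sink using the redisSink section similarly ...
  }
}
```

Alternatively, `spark-submit --files hdfs:///path/to/spark_job.conf` ships the file into each container's working directory, where it can be opened as a plain local file named `spark_job.conf`; the HDFS-read approach above avoids that step and keeps one authoritative copy in HDFS.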