
File not found exception while loading a properties file on a Scala SBT project

I am trying to learn Scala-Spark JDBC programming on IntelliJ IDEA. To do that, I created a Scala SBT project with the following structure:

[screenshot of the project structure]
Before writing the JDBC connection parameters in the class, I first thought of loading a properties file that contains all my connection properties, and of printing the values to verify that they load properly, as below:



package com.yearpartition.obj

import java.io.FileInputStream
import java.util.Properties

import org.apache.spark.sql.SparkSession
import org.apache.log4j.{Level, LogManager, Logger}
import org.apache.spark.SparkConf

object PartitionRetrieval {

  var conf = new SparkConf().setAppName("Spark-JDBC")
  val properties = new Properties()
  properties.load(new FileInputStream(""))
  val connectionUrl = properties.getProperty("gpDevUrl")
  val devUserName = properties.getProperty("devUserName")
  val devPassword = properties.getProperty("devPassword")
  val gpDriverClass = properties.getProperty("gpDriverClass")

  println("connectionUrl: " + connectionUrl)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().enableHiveSupport().config(conf).master("local[2]").getOrCreate()
    println("connectionUrl: " + connectionUrl)
  }
}
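For reference, here is a minimal standalone sketch of the alternative I am considering: loading the properties from the classpath (a file under src/main/resources, so it gets packaged into the jar) instead of from a filesystem path. The file name connection.properties here is only a placeholder, not the actual name in my project:

```scala
import java.io.InputStream
import java.util.Properties

object PropsFromClasspath {
  // Loads a Properties object from any InputStream and closes the stream.
  def loadProps(in: InputStream): Properties = {
    require(in != null, "properties file not found on the classpath")
    val props = new Properties()
    try props.load(in) finally in.close()
    props
  }

  def main(args: Array[String]): Unit = {
    // getResourceAsStream returns null if the resource is missing;
    // "connection.properties" is a placeholder name.
    val props = loadProps(getClass.getResourceAsStream("/connection.properties"))
    println("connectionUrl: " + props.getProperty("gpDevUrl"))
  }
}
```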

Content of build.sbt:

name := "YearPartition"

version := "0.1"

scalaVersion := "2.11.8"

libraryDependencies ++= {
  val sparkCoreVer = "2.2.0"
  val sparkSqlVer = "2.2.0"
  Seq(
    "org.apache.spark" %% "spark-core" % sparkCoreVer % "provided" withSources(),
    "org.apache.spark" %% "spark-sql" % sparkSqlVer % "provided" withSources(),
    "org.json4s" %% "json4s-jackson" % "3.2.11" % "provided",
    "org.apache.httpcomponents" % "httpclient" % "4.5.3"
  )
}

Since I am not writing or saving data to any file, and am only trying to display the values from the properties file, I executed the code as follows:

SPARK_MAJOR_VERSION=2 spark-submit --class com.yearpartition.obj.PartitionRetrieval yearpartition_2.11-0.1.jar

But I am getting a file-not-found exception, as below:

Caused by: java.io.FileNotFoundException: (No such file or directory)
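While debugging, a quick sketch like this can show which directory a relative path passed to FileInputStream actually resolves against (this is only a diagnostic aid, not part of my program):

```scala
import java.io.File

object Cwd {
  // A relative path given to FileInputStream resolves against the JVM's
  // working directory, which is where spark-submit was launched from.
  def workingDir: String = new File(".").getCanonicalPath

  def main(args: Array[String]): Unit =
    println("working directory: " + workingDir)
}
```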

I tried to fix it, in vain. Could anyone let me know what mistake I am making here and how I can correct it?