
Spark JDBC Connection closing clarification


Hi, I have a Spark Scala application that connects to Teradata over JDBC using the syntax below.

val df = spark.read
  .format("jdbc")
  .options(Map(
    "url"      -> connectionString,
    "user"     -> userName,
    "password" -> passWord,
    "dbtable"  -> sourceQuery,
    "driver"   -> "com.teradata.jdbc.TeraDriver"))
  .load()

I perform processing on the above DataFrame in the same application, and at the end of the application I call spark.stop(). Is the JDBC connection also closed when the Spark application completes successfully and executes spark.stop(), or do I need to add an explicit statement to close the connection? Please advise. I have seen Java examples where a connection is created first and then closed explicitly at the end, but mine is a Spark Scala application.
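For context, the Java-style pattern I am referring to looks roughly like this when sketched in Scala (a minimal illustration only; the query is a placeholder, not from my actual application):

import java.sql.DriverManager

// Minimal sketch of manual JDBC handling: the caller opens the
// connection and is responsible for closing it explicitly.
val conn = DriverManager.getConnection(connectionString, userName, passWord)
try {
  val stmt = conn.createStatement()
  val rs = stmt.executeQuery("SELECT 1") // placeholder query
  while (rs.next()) println(rs.getInt(1))
} finally {
  conn.close() // explicit close; the DataFrame code above exposes no such call
}

In the DataFrame-based code above there is no connection object for me to close, which is why I am asking whether spark.stop() takes care of it.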

Apache Spark Version: 2.2.1