New Contributor
Posts: 2
Registered: ‎08-21-2018

error writing data from spark streaming to postgresql ?


Below is my code. I am reading JSON data from Kafka and want to store it in PostgreSQL. I have created the database and the table with its schema in PostgreSQL, but it does not allow streaming data ingestion.



import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.types._

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("KafkaToPostgres")
  .getOrCreate()

// schema of the JSON payload in the Kafka topic (fields omitted here)
val schema = new StructType()

import spark.implicits._

val df = spark
  .readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "topic1")
  .option("startingOffsets", "earliest")
  .load()

val data = df
  .select($"value" cast "string" as "json")
  .select(from_json($"json", schema) as "data")

val pgdata = data.writeStream
  .format("jdbc")
  .option("url", "jdbc:postgresql://localhost:5432/spark_db")
  .option("dbtable", "spark_data")
  .option("user", "username")
  .option("password", "password")
  .start()


error - Data source jdbc does not support streamed writing

Posts: 1,903
Kudos: 435
Solutions: 305
Registered: ‎07-31-2013

Re: error writing data from spark streaming to postgresql ?

As the error notes, writing from a stream directly to a JDBC sink is not yet supported in Spark:

Take a look at this past thread where an alternative, more direct approach is discussed:
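Not necessarily the exact approach from that thread, but a common workaround (if you are on Spark 2.4 or later) is foreachBatch, which hands each micro-batch to the regular batch JDBC writer. A minimal sketch, reusing the connection options from your code above:

```scala
import org.apache.spark.sql.{DataFrame, SaveMode}

// Write each micro-batch with the (supported) batch JDBC data source.
val query = data.writeStream
  .foreachBatch { (batchDF: DataFrame, batchId: Long) =>
    batchDF.write
      .format("jdbc")
      .option("url", "jdbc:postgresql://localhost:5432/spark_db")
      .option("dbtable", "spark_data")
      .option("user", "username")
      .option("password", "password")
      .mode(SaveMode.Append)  // append rows; table must already exist with a matching schema
      .save()
  }
  .start()

query.awaitTermination()
```

On Spark versions before 2.4, the equivalent is a custom ForeachWriter that opens a JDBC connection per partition, which is more code for the same effect.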