Support Questions


Getting FileNotFoundException and LeaseExpiredException while writing a DataFrame to an HDFS path

New Contributor
I am using Spark 1.6 to read messages from Tibco. I take each RDD in the DStream, perform some transformations, convert it to a DataFrame, and finally write it to an HDFS path. The job runs for a few batches, then starts erroring out saying the _temporary file is not found, and eventually the job is aborted. Below is the link to the detailed Stack Overflow question.

P.S. I have already set spark.speculation=false.
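For reference, the pipeline described above might look roughly like the following sketch (Spark 1.6 Scala API). The input stream, record type, and output path are placeholders, since the actual Tibco receiver and schema are not shown in the question:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.sql.SQLContext

// Hypothetical record type standing in for the real message schema.
case class Record(value: String)

object TibcoToHdfs {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("TibcoToHdfs")
    val ssc = new StreamingContext(conf, Seconds(30))

    // Placeholder source: the real job would use a Tibco receiver here.
    val messages = ssc.socketTextStream("localhost", 9999)

    messages.foreachRDD { rdd =>
      if (!rdd.isEmpty()) {
        val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
        import sqlContext.implicits._
        // Transform each batch RDD and convert it to a DataFrame.
        val df = rdd.map(line => Record(line)).toDF()
        // Each batch's write stages files under a _temporary directory
        // beneath the output path; the reported FileNotFoundException
        // occurs when that directory disappears mid-write.
        df.write.mode("append").parquet("hdfs:///tmp/output") // placeholder path
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```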

Cloudera Employee
Could you please try submitting the Spark job with automatic RDD unpersisting disabled:
--conf spark.streaming.unpersist=false
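For example, assuming the job is launched with spark-submit (the class name, master, and jar name below are placeholders), the setting can be passed like this:

```shell
# spark.streaming.unpersist=false stops Spark Streaming from eagerly
# unpersisting batch RDDs after each batch completes.
# Class, master, and jar names are placeholders for the actual job.
spark-submit \
  --class com.example.TibcoToHdfs \
  --master yarn \
  --conf spark.speculation=false \
  --conf spark.streaming.unpersist=false \
  tibco-to-hdfs.jar
```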