
How to avoid zero byte file or blank file while writing into spark

I'm trying to write data into Hive through spark-sql. There are a few filters in the SQL, and even when the query returns no data, Spark still creates an empty file. In Scala or PySpark I can avoid this by checking for empty results in the program, but in spark-sql I have no such option. Is there a set parameter I can use to avoid writing a blank file when the query returns no records?