
DataFrame join with OR condition

I have the following join, which makes my Spark application hang and never produce a result. Are OR conditions supported in Spark DataFrame joins?

DataFrame DFJoin = DF1.join(
    DF2,
    DF1.col("device").equalTo(DF2.col("id"))
        .or(DF1.col("device").equalTo(DF2.col("new_id"))),
    "inner");
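For context on why this can hang: a join condition that ORs two equality tests cannot be planned as a hash or sort-merge equi-join, so Spark falls back to a nested-loop style comparison of row pairs, which on large tables can run for an extremely long time. A common workaround (a sketch only, assuming the same DF1/DF2 DataFrames and the Spark 1.x Java DataFrame API used above) is to run the two equi-joins separately and union the results:

```java
// Workaround sketch: split the OR condition into two separate equi-joins,
// each of which Spark can execute as an efficient hash/sort-merge join.
DataFrame joinOnId =
    DF1.join(DF2, DF1.col("device").equalTo(DF2.col("id")), "inner");
DataFrame joinOnNewId =
    DF1.join(DF2, DF1.col("device").equalTo(DF2.col("new_id")), "inner");

// unionAll keeps duplicates; if a row can match on both id and new_id,
// de-duplicate afterwards with dropDuplicates().
DataFrame DFJoin = joinOnId.unionAll(joinOnNewId).dropDuplicates();
```

Whether this is equivalent to the OR join depends on your data: if the same row pair can satisfy both equalities, the dropDuplicates() call is needed to avoid double-counting.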

2 REPLIES

Mentor

You can use a filter with an OR condition, for example:

df2 = df1.filter($"Status" === 2 || $"Status" === 3)
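The snippet above is Scala; since the asker's application is in Java, a rough Java equivalent of the same OR filter (an illustration of the syntax, assuming the Spark 1.x DataFrame API) would be:

```java
// Java equivalent of the Scala filter above: Column.equalTo(...).or(...)
DataFrame df2 = df1.filter(
    df1.col("Status").equalTo(2).or(df1.col("Status").equalTo(3)));
```

Note this answers how to express an OR condition in Java, not why the OR join itself hangs.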

Thanks. The application is in Java, though, so what is going wrong in the join statement above?