Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.

Is the CASE statement supported in Spark SQL queries (version 2.0 and above)? It previously worked fine, but now it gives me a syntax error.

New Member
 
1 ACCEPTED SOLUTION


Works for me

print(s"Spark ${spark.version}")
val df = spark.createDataFrame(Seq((2, 9), (1, 5), (1, 1), (1, 2), (2, 8)))
              .toDF("y", "x")
df.createOrReplaceTempView("test")

spark.sql("select CASE WHEN y = 2 THEN 'A' ELSE 'B' END AS flag, x from test").show

Returns

Spark 2.0.0
df: org.apache.spark.sql.DataFrame = [y: int, x: int]
+----+---+
|flag|  x|
+----+---+
|   A|  9|
|   B|  5|
|   B|  1|
|   B|  2|
|   A|  8|
+----+---+
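If you prefer to stay in the DataFrame API rather than SQL, the same CASE logic can be expressed with `when`/`otherwise` from `org.apache.spark.sql.functions`. A minimal sketch, assuming the same `df` defined above and an active Spark session:

```scala
import org.apache.spark.sql.functions.{col, when}

// Equivalent of: CASE WHEN y = 2 THEN 'A' ELSE 'B' END AS flag
df.select(
    when(col("y") === 2, "A").otherwise("B").as("flag"),
    col("x")
  ).show
```

Both forms produce the same output table, so this is mainly a style choice; the DataFrame version is checked at compile time for column expression syntax, while the SQL string is only parsed at runtime.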

