Spark SQL - Update Command

Expert Contributor

I am trying to update the value of a record using Spark SQL in the spark-shell.

I executed the command UPDATE tablename SET age=20 WHERE name=justin, and I am getting the following error:

scala> val teenagers = sqlContext.sql("UPDATE people SET age=20 WHERE name=Justin")

java.lang.RuntimeException: [1.1] failure: ``with'' expected but identifier UPDATE found

UPDATE people SET age=20 WHERE name=Justin
^

at scala.sys.package$.error(package.scala:27)
at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:36)
...

Thanks

Sridhar

1 ACCEPTED SOLUTION


@Sridhar Babu M It depends on the data source you are updating; not all sources can be updated. What backend are you using for the DataFrame people?

For example, for Hive, it is possible to update data stored in the ORC format: https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-sql-hive-orc-example.html
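As a workaround in plain Spark SQL (which has no UPDATE statement), you can express the change as a DataFrame transformation and re-register the result. Here is a minimal sketch, assuming people is a temporary table with name and age columns as in the question; the when/otherwise rewrite is a common pattern, not something spelled out in this thread:

// Spark 1.x: rewrite the UPDATE as a column transformation.
import org.apache.spark.sql.functions.{col, lit, when}

val people = sqlContext.table("people")

// Set age = 20 where name = 'Justin'; leave all other rows unchanged.
val updated = people.withColumn(
  "age",
  when(col("name") === "Justin", lit(20)).otherwise(col("age"))
)

// Replace the old temporary table with the updated one.
updated.registerTempTable("people")

For a Hive-backed table you would instead write the updated DataFrame back to the table (for example in ORC format, as in the link above) rather than re-registering a temp table.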


10 REPLIES

Guru

@Sridhar Babu M

Glad it worked out. Would you mind accepting this answer and the one from the other thread?

https://community.hortonworks.com/questions/24518/spark-sql-query-to-modify-values.html