Created 03-21-2016 04:16 AM
I am trying to update the value of a record using Spark SQL in the spark-shell.
I executed the command UPDATE tablename SET age=20 WHERE name=justin, and I am getting the following error:
scala> val teenagers = sqlContext.sql("UPDATE people SET age=20 WHERE name=Justin")
java.lang.RuntimeException: [1.1] failure: ``with'' expected but identifier UPDATE found

UPDATE people SET age=20 WHERE name=Justin
^
        at scala.sys.package$.error(package.scala:27)
        at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:36)
        ......
Thanks
Sridhar
Created 03-21-2016 05:21 PM
@Sridhar Babu M it depends on the data source you are updating; not all sources can be updated. What backend are you using for the People DataFrame?
For example, for Hive it is possible to update data stored in the ORC format: https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-sql-hive-orc-example.html
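
If the backend does not support in-place updates, a common workaround is to apply the change to the DataFrame and write the result back out, since Spark SQL itself has no UPDATE statement here. A minimal sketch, assuming the people table is registered with the SQLContext and has name and age columns (the people_updated target table name is only an illustration):

import org.apache.spark.sql.functions.{col, lit, when}

// Load the existing table into a DataFrame.
val people = sqlContext.table("people")

// Replace age only for rows where name = 'Justin';
// all other rows keep their original age.
val updated = people.withColumn(
  "age",
  when(col("name") === "Justin", lit(20)).otherwise(col("age"))
)

// Persist the result. Overwriting the same table you are reading from
// can fail, so save to a new table (or path) and swap afterwards.
updated.write.format("orc").mode("overwrite").saveAsTable("people_updated")

DataFrames are immutable, so "updating" really means producing a new dataset with the changed values; for true in-place updates the underlying store (e.g. a transactional Hive ORC table) has to support them.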
Created 03-31-2016 11:45 AM
Glad it worked out. Would you mind accepting this answer and the one from the other thread?
https://community.hortonworks.com/questions/24518/spark-sql-query-to-modify-values.html