Spark SQL - Update Command
Labels: Apache Spark
Created ‎03-21-2016 04:16 AM
I am trying to update the value of a record using Spark SQL in spark-shell.
I executed the command UPDATE people SET age=20 WHERE name=Justin and I am getting the following error:

scala> val teenagers = sqlContext.sql("UPDATE people SET age=20 WHERE name=Justin")
java.lang.RuntimeException: [1.1] failure: ``with'' expected but identifier UPDATE found
UPDATE people SET age=20 WHERE name=Justin
^
at scala.sys.package$.error(package.scala:27)
at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:36)
......
Thanks
Sridhar
Created ‎03-21-2016 05:21 PM
@Sridhar Babu M it depends on the data source you are updating; not all sources support updates. What backend are you using for the DataFrame "people"?
For example, with Hive it is possible to update data stored in ORC format: https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-sql-hive-orc-example.html
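A common workaround, since the Spark SQL parser in this version has no UPDATE statement, is to rewrite the column with a DataFrame transformation and save the result back to the data source. This is a minimal sketch, not code from the thread; it assumes a registered "people" table with "name" and "age" columns, and the output table name "people_updated" is hypothetical:

```scala
import org.apache.spark.sql.functions.{when, col, lit}

// Load the existing table as a DataFrame.
val people = sqlContext.table("people")

// Replace age with 20 only where name is "Justin"; all other rows keep their age.
val updated = people.withColumn(
  "age",
  when(col("name") === "Justin", lit(20)).otherwise(col("age"))
)

// Persist the result, e.g. to an ORC-backed table (overwrites the target).
updated.write.mode("overwrite").format("orc").saveAsTable("people_updated")
```

The key idea is that Spark treats the source data as immutable: instead of mutating rows in place, you derive a new DataFrame with the changed values and overwrite (or write alongside) the original storage.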
Created ‎03-31-2016 11:45 AM
Glad it worked out. Would you mind accepting this answer and the one from the other thread?
https://community.hortonworks.com/questions/24518/spark-sql-query-to-modify-values.html
