The CDH distribution of Hive does not support transactions (HIVE-5317). Currently, transaction support in Hive is an experimental feature that only works with the ORC file format. Cloudera recommends using the Parquet file format, which works across many tools. Merge updates in Hive tables using existing functionality, including statements such as INSERT, INSERT OVERWRITE, and CREATE TABLE AS SELECT.
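Since the doc suggests merging with plain INSERT OVERWRITE, here is a minimal sketch of that pattern, assuming hypothetical tables `base` and `updates` that share a key column `id` and a value column `val` (rows in `updates` replace matching rows in `base`):

```sql
-- Rewrite the whole table: take the updated row where one exists,
-- otherwise keep the original row. New rows in `updates` are also kept.
INSERT OVERWRITE TABLE base
SELECT COALESCE(u.id,  b.id),
       COALESCE(u.val, b.val)
FROM base b
FULL OUTER JOIN updates u
  ON b.id = u.id;
```

This rewrites the entire table rather than touching individual rows, which is why it works without transaction support.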
If you require these features, please inquire about Apache Kudu.
Kudu is storage for fast analytics on fast data—providing a combination of fast inserts and updates alongside efficient columnar scans to enable multiple real-time analytic workloads across a single storage layer.
OK. Could you please let me know which file format you are using for the Hive table (testTableNew)?
Hive supports DELETE and UPDATE only on the ORC format, starting from version 0.14.
Try creating a table with the ORC format. If you want more flexibility, try Apache Kudu, but it has its own merits and demerits. Hope this helps.
-- Transactional tables must be bucketed, stored as ORC, and marked transactional.
-- The columns here are illustrative; the original post did not list them.
CREATE TABLE Sample (
  id   INT,
  name STRING
)
CLUSTERED BY (id) INTO 2 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional' = 'true');
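With a transactional ORC table like the one above in place (and the transaction properties enabled on the server), UPDATE and DELETE can then be issued directly. A sketch with illustrative values:

```sql
-- Row-level mutations, supported on transactional ORC tables from Hive 0.14
UPDATE Sample SET name = 'updated' WHERE id = 1;
DELETE FROM Sample WHERE id = 2;
```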
@syamsri Since you are using Cloudera Manager: are you using the safety valve to add the properties that need to go into HS2, or did you manually edit hive-site.xml? It looks like your default session configuration is being used, and it is not picking up those transaction properties.
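For reference, these are the standard properties Hive documents for enabling ACID transactions; in Cloudera Manager they would go into the HiveServer2 safety valve for hive-site.xml (the worker-thread count shown is just a minimal example value):

```xml
<property>
  <name>hive.support.concurrency</name>
  <value>true</value>
</property>
<property>
  <name>hive.txn.manager</name>
  <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
</property>
<property>
  <name>hive.compactor.initiator.on</name>
  <value>true</value>
</property>
<property>
  <name>hive.compactor.worker.threads</name>
  <value>1</value>
</property>
```

If these are set only in a client session rather than on HS2, new sessions will fall back to the defaults, which matches the symptom described above.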
Sorry for the late response. No, I could not find an exact solution for those errors. I did follow all the steps mentioned in this post, but that did not work. In the end I uninstalled Hive and re-installed a different Hive version, which works for me. I spent many days on this issue trying to find the exact solution but could not.