Unable to update records in an ORC Hive table from either Hive or Spark



I am using Hive 1.1.0-cdh5.10.0. I created a Hive table in ORC format, bucketed it, set the table property transactional=true, and set the following in the Hive session: hive.txn.manager = org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
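For completeness, as far as I understand from the Hive transactions documentation, an ACID update needs all of the following, not just the transaction manager. This is a sketch of what I believe the required setup looks like (table and column names are placeholders, not my real schema):

```sql
-- Session settings the Hive docs list for ACID operations on Hive 1.x
SET hive.support.concurrency = true;
SET hive.enforce.bucketing = true;
SET hive.exec.dynamic.partition.mode = nonstrict;
SET hive.txn.manager = org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
-- The compactor settings below normally belong in hive-site.xml
-- on the metastore host, not in the client session:
-- hive.compactor.initiator.on = true
-- hive.compactor.worker.threads = 1

-- The table must be stored as ORC, bucketed, and marked transactional
CREATE TABLE my_table (id INT, col STRING)
CLUSTERED BY (id) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional' = 'true');
```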

I am trying to update one table record from the Hive shell (update <TABLE_NAME> set <COLUMN_NAME> = <SOME_VALUE> where id = 1;) but I get the error below:

FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.

I tried the same from Spark with the steps below:

  1. Loaded all table records into an RDD and converted the RDD to a DataFrame using the toDF method
  2. Registered the DataFrame as a temp table using DF.registerTempTable
  3. Tried to update the records in the DataFrame via a SQL UPDATE statement
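The steps above look roughly like this PySpark sketch (table and column names are placeholders; registerTempTable and HiveContext are the Spark 1.x API, and this needs a Hive-enabled Spark deployment to run):

```python
from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext(appName="hive-update-attempt")
sqlContext = HiveContext(sc)

# 1. Load all table records into an RDD and convert back to a DataFrame
rdd = sqlContext.sql("SELECT * FROM my_table").rdd
df = rdd.toDF()

# 2. Register the DataFrame as a temporary table
df.registerTempTable("temp_table")

# 3. Attempt the update -- this is the statement that fails to parse,
#    since UPDATE is not in Spark SQL's grammar at this version
sqlContext.sql("UPDATE temp_table SET col = 'SOME_VALUE' WHERE id = 1")
```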

and got the error below:

mismatched input 'update' expecting {'(', 'SELECT', 'FROM', 'ADD', 'DESC', 'WITH', 'VALUES', 'CREATE', 'TABLE', 'INSERT', 'DELETE', 'DESCRIBE', 'EXPLAIN', 'SHOW', 'USE', 'DROP', 'ALTER', 'MAP', 'SET', 'RESET', 'START', 'COMMIT', 'ROLLBACK', 'REDUCE', 'REFRESH', 'CLEAR', 'CACHE', 'UNCACHE', 'DFS', 'TRUNCATE', 'ANALYZE', 'LIST', 'REVOKE', 'GRANT', 'LOCK', 'UNLOCK', 'MSCK', 'EXPORT', 'IMPORT', 'LOAD'}(line 1, pos 0)

Does anyone have any idea how to update table records in Hive and Spark? Please suggest.