How to take incremental backup in Hive for RDBMS and HDFS

How do I take an incremental backup in Hive when the data source is an RDBMS or HDFS?
Kindly share all the steps.

What do you mean by incremental backup in HDFS?

Not a backup of HDFS itself; I mean with the data source being HDFS or an RDBMS.
I just want to know the incremental backup steps in Hive.

Can you elaborate what you mean by incremental backup with example?

Like when using sqoop, we use the query option --last-value.
An incremental backup copies only the new or last-modified data; it does not copy the whole data set again.
Example given below:

sqoop import --connect jdbc:teradata://{host name}/Database=retail \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --username dbc --password dbc --table SOURCE_TBL --target-dir /user/incremental_table -m 1 \
  --check-column modified_date --incremental lastmodified --last-value {last_import_date}
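The filtering that --incremental lastmodified with --check-column and --last-value performs can be sketched in plain Python (the table rows and column name here are hypothetical; this only illustrates the bookkeeping, not sqoop's actual implementation):

```python
from datetime import datetime

def incremental_import(source_rows, last_value):
    """Copy only rows whose modified_date is newer than last_value,
    mimicking --incremental lastmodified with --check-column modified_date."""
    new_rows = [r for r in source_rows if r["modified_date"] > last_value]
    # The next run resumes from the newest timestamp seen so far.
    new_last_value = max((r["modified_date"] for r in new_rows), default=last_value)
    return new_rows, new_last_value

# Example: only the row modified after the previous import is copied.
rows = [
    {"id": 1, "modified_date": datetime(2023, 1, 1)},
    {"id": 2, "modified_date": datetime(2023, 2, 1)},
]
copied, last = incremental_import(rows, datetime(2023, 1, 15))
print(len(copied), last)  # one row copied; last_value advances to 2023-02-01
```

On the next run you would pass `last` back in as the new --last-value, so unchanged rows are never re-copied.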

Example in MySQL:
mysqlbackup --defaults-file=/home/pekka/my.cnf --incremental

Ok, do you want to take data back up of Hive?

Yes, I hope you got my question. So, do you know how to take an incremental backup?


  1. Have you tried the sqoop options --incremental and --last-value
    while importing from an RDBMS to Hive?
  2. While importing from HDFS to Hive, every new file loaded into the Hive table is stored as a new file in the Hive warehouse. Is this fine for you?
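For option 1, a saved sqoop job keeps track of the last-value between runs, so the import stays incremental without tracking it by hand. A sketch, with a placeholder connection string, table, and key column; append mode with a monotonically increasing id is assumed, since it combines cleanly with --hive-import:

```shell
# Create a reusable job; sqoop's metastore remembers the last imported id.
sqoop job --create hive_incr_import -- import \
  --connect jdbc:mysql://dbhost/retail \
  --username dbc --password-file /user/secure/db.password \
  --table SOURCE_TBL \
  --check-column id \
  --incremental append \
  --hive-import --hive-table retail.source_tbl

# Each execution imports only rows added since the previous run.
sqoop job --exec hive_incr_import
```

Scheduling the `--exec` line (e.g. from cron or Oozie) gives you a recurring incremental backup of the RDBMS table into Hive.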