Sqoop Import - Hive Table using target-dir


#1

I just tried a simple Sqoop import to create a Hive table using --target-dir.
My observation:
Sqoop import did not create the Hive table in the --target-dir path I gave in my command. Instead, it created it in the default path /user/hive/warehouse.

Example: I have retail_ods.db in the path /user/hive/warehouse. I want to import the departments table from MySQL into Hive using Sqoop. When I use --target-dir /user/hive/warehouse/retail_ods.db/departments, Sqoop does not create the departments table under that target directory; it creates it under the default path /user/hive/warehouse.

Does --target-dir not work for creating a Hive table in a specified directory? The purpose of this question is to understand the --target-dir option for creating Hive tables during a Sqoop import.


#2

Can you provide the complete sqoop import command?


#3

I used the following command:
sqoop import --connect jdbc:mysql://quickstart.cloudera:3306/retail_db --username retail_dba --password cloudera --hive-import --create-hive-table --table departments --target-dir /user/hive/warehouse/retail_ods.db/departments --outdir java_files
Before running this command, I deleted the departments table from the default path /user/hive/warehouse and also made sure that no table called departments exists in retail_ods.db.


#4

OK, you should not specify --target-dir. Instead, you need to use --hive-database and give the database name, which is retail_ods in this case.
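For example, reusing the connection details from the command in #3, a minimal sketch could look like this (the retail_ods database must already exist in Hive):

# Hive import into the retail_ods database; Sqoop manages the table location itself
sqoop import \
  --connect jdbc:mysql://quickstart.cloudera:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --hive-import \
  --create-hive-table \
  --hive-database retail_ods \
  --outdir java_files

Equivalently, --hive-table retail_ods.departments can be used instead of --hive-database.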


#5

When using a Hive import, --target-dir or --warehouse-dir is only used to place the data temporarily on HDFS; the data is then moved to the Hive table's location.
These options cannot be used to set a Hive table's data location.

If the table does not exist, it will be created and the data will always be moved to the Hive warehouse directory, /user/hive/warehouse/db_name/table_name.

If you have an already existing Hive external table, the data will be moved to its external location rather than into the Hive warehouse directory. Make sure to remove the --create-hive-table argument from the import command in that case.
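As a hedged sketch of the external-table case: the HDFS paths below and the column definitions (assumed from the standard retail_db sample schema) are hypothetical; the pattern is simply to create the external table first, then import without --create-hive-table.

# Create the external table first, pointing at a custom HDFS location (hypothetical path and columns)
hive -e "CREATE EXTERNAL TABLE IF NOT EXISTS retail_ods.departments (
  department_id INT,
  department_name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/cloudera/external/departments';"

# Import into the existing table; Sqoop stages files under --target-dir,
# then loads them into the table, so they end up at the external location
sqoop import \
  --connect jdbc:mysql://quickstart.cloudera:3306/retail_db \
  --username retail_dba \
  --password cloudera \
  --table departments \
  --fields-terminated-by ',' \
  --target-dir /user/cloudera/staging/departments \
  --hive-import \
  --hive-table retail_ods.departments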


#6

Hi Durga,

I'm trying to import a table from an Oracle DB into a Hive database.table. Please find below the query; however, it is not working.

Please help.

sqoop import --connect jdbc:oracle:thin:@hostname --username root --P --query 'select * from S_SRV_REQ_XM' --hive-import --hive-table scrm_db_stg_zone1.S_SRV_REQ_XM
Warning: /opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/bin/…/lib/sqoop/…/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/08/01 21:31:53 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.11.0
Enter password:
Must specify destination with --target-dir.
Try --help for usage instructions.


#7

You must specify a destination directory. Refer to the docs below.
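As a hedged sketch of a working version of your command (the staging path /user/cloudera/staging/S_SRV_REQ_XM is hypothetical; the connection details are kept from your post): a free-form --query import requires --target-dir, the literal $CONDITIONS token in the WHERE clause, and either --split-by or -m 1.

# Data is staged under --target-dir, then moved into the Hive table scrm_db_stg_zone1.S_SRV_REQ_XM
sqoop import \
  --connect jdbc:oracle:thin:@hostname \
  --username root -P \
  --query 'select * from S_SRV_REQ_XM where $CONDITIONS' \
  --target-dir /user/cloudera/staging/S_SRV_REQ_XM \
  -m 1 \
  --hive-import \
  --hive-table scrm_db_stg_zone1.S_SRV_REQ_XM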