Unable to create the Hive table

sqoop

#1

Hi Team,

I have created the folder /user/hive/warehouse/sqoop1_import.db under the Hive warehouse.

I want to try the example where I import data from MySQL directly into a Hive table.

I followed the instructions and wrote the import command, but it is failing.

Below is the command:

sqoop import-all-tables \
  --num-mappers 1 \
  --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" \
  --username=retail_dba \
  --password=itversity \
  --hive-import \
  --hive-overwrite \
  --create-hive-table \
  --compress \
  --compress-codec org.apache.hadoop.io.compress.SnappyCodec \
  --outdir java_files

I know I am making a mistake somewhere, perhaps in not specifying the destination folder. Can you please help me with this?

It is giving me the following error:

17/12/26 21:44:30 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
17/12/26 21:44:30 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/12/26 21:44:30 ERROR tool.BaseSqoopTool: Error parsing arguments for import-all-tables:
17/12/26 21:44:30 ERROR tool.BaseSqoopTool: Unrecognized argument: --compress-codec
17/12/26 21:44:30 ERROR tool.BaseSqoopTool: Unrecognized argument: org.apache.hadoop.io.compress.SnappyCodec
17/12/26 21:44:30 ERROR tool.BaseSqoopTool: Unrecognized argument: --outdir
17/12/26 21:44:30 ERROR tool.BaseSqoopTool: Unrecognized argument: java_files


#2

@vibhoroffice:
I tried this on labs.itversity.com.

  1. You have to use the hostname ms.itversity.com.
  2. Use the username retail_user (I am not sure whether the retail_dba user is available on gw01.itversity.com).
  3. An important point: when you use --create-hive-table, your Sqoop job will fail if the table already exists in Hive.
    Reason: since you did not mention any Hive database, Sqoop imports all the available tables into the default Hive database. Of course, many users in this lab have already done this import, so all these tables already exist. (See the sketch right after this list for creating your own database.)
  4. The problem is not --compress; the flag --compress-codec does not exist. It should be --compression-codec org.apache.hadoop.io.compress.SnappyCodec. Note that once Sqoop hits an unrecognized argument it reports everything after it as unrecognized too, which is why --outdir also appears in your error even though it is a valid option.
  5. I ran sqoop import-all-tables into my own Hive database (vanampudi_sqoop_import_to_hive) by adding --hive-database.
  6. I recommend using --autoreset-to-one-mapper with import-all-tables, so Sqoop decides the number of mappers per table: tables with a primary key are split in parallel, and tables without one fall back to a single mapper (see the variant after the command below).
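
For point 3, a minimal sketch of creating your own Hive database first (the name your_username_sqoop_import is just a placeholder; pick your own):

hive -e "CREATE DATABASE IF NOT EXISTS your_username_sqoop_import;"

Then point Sqoop at it with --hive-database, as in the command below.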

Hope this info solves your error.

Below you can see the tables imported into my database in Hive.
#####################################################
sqoop import-all-tables \
  --num-mappers 1 \
  --connect "jdbc:mysql://ms.itversity.com:3306/retail_db" \
  --username=retail_user \
  --password=itversity \
  --hive-import \
  --hive-overwrite \
  --create-hive-table \
  --hive-database vanampudi_sqoop_import_to_hive \
  --compress \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --outdir java_files
#####################################################
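
As mentioned in point 6, here is a variant of the same command (a sketch, not something I re-ran here) that lets Sqoop pick the mapper count per table instead of forcing one. Drop --create-hive-table on a rerun, since the tables will already exist:

sqoop import-all-tables \
  --autoreset-to-one-mapper \
  --connect "jdbc:mysql://ms.itversity.com:3306/retail_db" \
  --username=retail_user \
  --password=itversity \
  --hive-import \
  --hive-overwrite \
  --hive-database vanampudi_sqoop_import_to_hive \
  --compress \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --outdir java_files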

hive (vanampudi_sqoop_import_to_hive)> show tables;
OK
categories
customers
departments
order_items
order_items_nopk
orders
products
Time taken: 0.036 seconds, Fetched: 7 row(s)
hive (vanampudi_sqoop_import_to_hive)>
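
If you want to confirm the Snappy compression took effect, list the files behind one of the imported tables. Assuming the warehouse sits at /user/hive/warehouse as in the original post, something like:

hdfs dfs -ls /user/hive/warehouse/vanampudi_sqoop_import_to_hive.db/orders

should show part files ending in .snappy.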

Thanks
Venkat


#3

@avr8082,

Thanks for the valuable input, Venkat. It worked for me.

Regards,
Vibhor.