Sqoop import to Hive

#1

When I try to do a Sqoop import into Hive tables in bigdata-labs, the data gets imported into the /user/{username} location instead of the Hive database location.

I am using the code below, where sqoop_import_man is the database in the Hive location:

sqoop import-all-tables --num-mappers 1 --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username retail_dba --password itversity --hive-import --hive-overwrite --create-hive-table --hive-database sqoop_import_man
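For reference, you can compare where the data actually landed versus the warehouse location (the warehouse path below assumes the lab default /apps/hive/warehouse):

hadoop fs -ls /user/$(whoami)
hadoop fs -ls /apps/hive/warehouse/sqoop_import_man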

Please clarify.

0 Likes

#2

@mangleeswaran Use --hive-home to copy the data to the required location. By default, data will be copied to /user/{username}.

0 Likes

#3

Sorry for asking this as a reply to this question; I am just trying to figure out how to post a new question. I know how to reply to a question, but I am not seeing anything like 'ask a question' or 'post a question'. I just want to know how to post a new question…

0 Likes

#4

@jeevakrishnarg: Go to the home page and click the New Topic button in the right corner. Thanks!

1 Like

#5

@SrikanthGanapavarapu: It is still trying to store the data in /user/{username}. Please find the code below.

sqoop import-all-tables --num-mappers 1 --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username retail_dba --password itversity --hive-import --hive-overwrite --create-hive-table --hive-home=/apps/hive/warehouse/sqoop_import_man

Am I missing anything here? Thanks.

0 Likes

#6

Thanks, got it…

0 Likes

#7

@SrikanthGanapavarapu - The query looks correct.

I just reformatted your query as below; by default it points to /apps/hive/warehouse/. It went through fine, created the Hive tables under sqoop_import_man, and the files are also present in /apps/hive/warehouse/sqoop_import_man. You may be referring to files that were copied earlier? You can delete those tables and files and retry.

sqoop import-all-tables --num-mappers 1 \
  --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" \
  --username retail_dba \
  --password itversity \
  --hive-import \
  --hive-overwrite \
  --create-hive-table \
  --hive-database sqoop_import_man
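A quick way to confirm the files landed in the warehouse is to list the database directory mentioned above:

hadoop fs -ls /apps/hive/warehouse/sqoop_import_man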

0 Likes

#8

@gnanaprakasam

Hi, I deleted the database sqoop_import_man, recreated it, and used your query.

sqoop import-all-tables --num-mappers 1 \
  --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" \
  --username retail_dba \
  --password itversity \
  --hive-import \
  --hive-overwrite \
  --create-hive-table \
  --hive-database sqoop_import_man

And I got the issue below: "Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://nn01.itversity.com:8020/user/mangleeswaran/categories already exists"

0 Likes

#9

@mangleeswaran - Does adding --hive-home=/apps/hive/warehouse/sqoop_import_man help you point to the correct HDFS path?

0 Likes

#10

You need to drop the files in your home directory.

When you do a Hive import, this is what it does:

  • copies the data to your home directory
  • then moves the data from your home directory to the HDFS location pointed to by the Hive table

As you already have data in directories that conflict with the Sqoop Hive import, it fails. Just delete those directories and rerun the command.
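For example, based on the error reported above (username taken from that error; repeat the delete for any other leftover table directories before rerunning the import):

hadoop fs -ls /user/mangleeswaran
hadoop fs -rm -R /user/mangleeswaran/categories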

3 Likes

#11

@itversity - Thanks Durga.

0 Likes

#12

@itversity: Thanks Durga. It worked.

1 Like

#13

Thanks a lot. It worked!!

0 Likes

#14

That's the reason why we have to use --target-dir <hdfs_path> while importing to Hive. Correct me if I am wrong.

0 Likes

#15

As far as I know, --target-dir is not used as part of a Hive import. But I am not 100% sure.
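As far as I can tell from the Sqoop docs, import-all-tables does not accept --target-dir at all (that is a single-table import option); the closest equivalent there is --warehouse-dir, which sets the parent HDFS directory for the per-table directories. A sketch, with the staging path being just an example:

sqoop import-all-tables --num-mappers 1 \
  --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" \
  --username retail_dba \
  --password itversity \
  --warehouse-dir /user/mangleeswaran/sqoop_staging \
  --hive-import \
  --hive-overwrite \
  --create-hive-table \
  --hive-database sqoop_import_man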

0 Likes

#16

@mangleeswaran How did you delete the directory? I deleted all the files and then tried deleting the directory, and I am getting the following error:

rm: `/user/cloudera/categories': Is a directory

0 Likes

#17

@rahulabvp Use hadoop fs -rm -R /user/cloudera/categories

It deletes the directory along with its files.
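If trash quota is a concern on the lab, -skipTrash (a standard option of hadoop fs -rm) deletes the files permanently instead of moving them to .Trash:

hadoop fs -rm -R -skipTrash /user/cloudera/categories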

0 Likes

#18

@mangleeswaran Thanks, buddy. It worked…

1 Like