Loading data into Hive table using Sqoop hangs

#1

I am trying to load data into a Hive table using Sqoop.

sqoop import-all-tables \
  --num-mappers 1 \
  --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
  --username=retail_dba \
  --password=cloudera \
  --hive-import \
  --hive-overwrite \
  --create-hive-table \
  --compress \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --outdir /user/java_files

It runs perfectly until the very end, where it says "Logging initialized using configuration in jar:file:/usr/lib/hive/lib/hive-common-1.1.0-cdh5.8.0.jar!/hive-log4j.properties", and then it hangs forever and never finishes the job. What could be the issue?

Output log:
16/12/20 23:13:34 INFO mapreduce.ImportJobBase: Transferred 60 bytes in 20.2111 seconds (2.9687 bytes/sec)
16/12/20 23:13:34 INFO mapreduce.ImportJobBase: Retrieved 6 records.
16/12/20 23:13:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM departments AS t LIMIT 1
16/12/20 23:13:34 INFO hive.HiveImport: Loading uploaded data into Hive

Logging initialized using configuration in jar:file:/usr/lib/hive/lib/hive-common-1.1.0-cdh5.8.0.jar!/hive-log4j.properties

#2

These are common issues with virtual machines, and they can be very tough to troubleshoot.

#3

@Robbiekant

Please start the services in the below manner on the Cloudera QuickStart VM (don't start all the services if you don't have enough memory).

For Sqoop to work with Hive, start the services in this order:
1. ZooKeeper
2. HDFS
3. YARN
4. Hive
5. Sqoop

1. Make sure you have enough memory when using the Cloudera QuickStart VM.
2. If you don't have enough memory, start only the services you need, as listed above; see the command sketch below.
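
A minimal sketch of the corresponding service commands, assuming a CDH 5.x QuickStart VM; the init-script names below are assumptions that can vary between CDH versions, and Sqoop 1 itself is a client tool with no daemon to start:

# Start services in dependency order (service names assume CDH 5.x; verify with: ls /etc/init.d/)
sudo service zookeeper-server start
sudo service hadoop-hdfs-namenode start
sudo service hadoop-hdfs-datanode start
sudo service hadoop-yarn-resourcemanager start
sudo service hadoop-yarn-nodemanager start
sudo service hive-metastore start
sudo service hive-server2 start

# Quick sanity check that ZooKeeper is up; it should reply "imok"
echo ruok | nc localhost 2181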

#4

If it is just for practice purposes, use "import" with a particular table instead of "import-all-tables", for example:
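
A minimal sketch based on the command in the original post, importing only the departments table (a table name visible in the log above):

sqoop import \
  --num-mappers 1 \
  --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
  --username=retail_dba \
  --password=cloudera \
  --table departments \
  --hive-import \
  --hive-overwrite \
  --create-hive-table \
  --compress \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec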

#5

Thanks, it worked. It looks like ZooKeeper was not running.

#6

Thanks all for the replies. Restarting ZooKeeper resolved the issue.
