Sqoop import also failing, with the details mentioned below

sqoop

#1

18/10/09 00:43:10 INFO mapreduce.ImportJobBase: Transferred 2.861 MB in 25.117 seconds (116.6394 KB/sec)
18/10/09 00:43:10 INFO mapreduce.ImportJobBase: Retrieved 68883 records.
Exception in thread "Thread-3" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at org.apache.phoenix.query.ConfigurationFactory$ConfigurationFactoryImpl$1.call(ConfigurationFactory.java:49)
at org.apache.phoenix.query.ConfigurationFactory$ConfigurationFactoryImpl$1.call(ConfigurationFactory.java:46)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
at org.apache.phoenix.util.PhoenixContextExecutor.callWithoutPropagation(PhoenixContextExecutor.java:93)
at org.apache.phoenix.query.ConfigurationFactory$ConfigurationFactoryImpl.getConfiguration(ConfigurationFactory.java:46)
at org.apache.phoenix.jdbc.PhoenixDriver$1.run(PhoenixDriver.java:88)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
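
One likely cause of this error is that the HBase jars are not on Sqoop's classpath; note the import itself completed (68883 records retrieved), and the exception comes from Phoenix driver initialization afterwards. A minimal sketch of a workaround, assuming HBase is installed on the gateway host and the hbase command is on the PATH (adjust to your environment):

# put the HBase jars on the classpath Sqoop's Hadoop processes use
export HADOOP_CLASSPATH="$(hbase classpath):$HADOOP_CLASSPATH"
# then re-run the same sqoop import command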


#2

Yes, I too have failing commands at my command prompt. The labs are not fully up.


#3

Can you check whether the data was copied to your desired location?

Please also paste the command you are using.
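
For example, you can check the target location on HDFS like this (the path below is a placeholder; replace it with the --target-dir you actually used):

hdfs dfs -ls /user/<your_username>/<target_dir>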


#4

The labs are working fine. We are looking into some minor issues; you can start using them, and if you come across any issues, post them on discuss.itversity.com.


#5

Facing an issue with Sqoop execution, with the following error as mentioned below:
WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18/10/09 02:20:22 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 19.9475 seconds (0 bytes/sec)
18/10/09 02:20:22 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
18/10/09 02:20:22 INFO mapreduce.ImportJobBase: Retrieved 0 records.
18/10/09 02:20:22 ERROR tool.ImportTool: Error during import: Import job failed!
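
When the import fails like this with no stack trace in the console output, the map task logs usually contain the real cause. One way to pull them, assuming YARN log aggregation is enabled (the application id is a placeholder; take it from the job output or from yarn application -list):

yarn application -list
yarn logs -applicationId <application_id> | less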


#6

Please paste the sqoop command to troubleshoot.


#7

sqoop import \
  --connect "jdbc:mysql://ms.itversity.com:3306/retail_db" \
  --username retail_user \
  --password itversity \
  --table orders \
  --compress \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --target-dir /home/cmahendranbigdata/problem1/orders \
  --as-avrodatafile


#8

The target directory path you have given is a local filesystem path. Give an HDFS path instead, under /user/username/.
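
For example, the corrected import would look something like this (the /user/cmahendranbigdata prefix is an assumption based on the username in the original local path; substitute your own HDFS home directory):

sqoop import \
  --connect "jdbc:mysql://ms.itversity.com:3306/retail_db" \
  --username retail_user \
  --password itversity \
  --table orders \
  --compress \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --target-dir /user/cmahendranbigdata/problem1/orders \
  --as-avrodatafile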