Sqoop import failed due to class not found

#1

Hi,

I am trying to perform an import operation with Sqoop, but I am getting the problem below. I believe some JAR is missing, but I don't know how to identify it, where to download it, or where to place it. Please help.

java.lang.Exception: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class order_items not found
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:489)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:549)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class order_items not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2216)
at org.apache.sqoop.mapreduce.db.DBConfiguration.getInputClass(DBConfiguration.java:403)
at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.createDBRecordReader(DataDrivenDBInputFormat.java:270)
at org.apache.sqoop.mapreduce.db.DBInputFormat.createRecordReader(DBInputFormat.java:266)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:515)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:758)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:270)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: Class order_items not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2122)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2214)
… 12 more
18/05/16 18:55:41 INFO mapreduce.Job: Job job_local715673690_0001 running in uber mode : false
18/05/16 18:55:41 INFO mapreduce.Job: map 0% reduce 0%
18/05/16 18:55:41 INFO mapreduce.Job: Job job_local715673690_0001 failed with state FAILED due to: NA
18/05/16 18:55:41 INFO mapreduce.Job: Counters: 0
18/05/16 18:55:41 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18/05/16 18:55:41 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 7.4781 seconds (0 bytes/sec)
18/05/16 18:55:41 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
18/05/16 18:55:41 INFO mapreduce.ImportJobBase: Retrieved 0 records.
18/05/16 18:55:41 ERROR tool.ImportTool: Import failed: Import job failed!

Thanks




#2

@himanshuporwal Can you please share the command you tried for the sqoop import?

Thanks & Regards,


#3

sqoop import --connect jdbc:mysql://localhost:3306/retail_db --username retail_dba --password hadoop --target-dir /user/training/hipor/retail_db/order_items --num-mappers 2 --query "select o.*, sum(oi.order_item_subtotal) order_revenue from orders o join order_items oi on o.order_id = oi.order_item_order_id and $CONDITIONS group by o.order_id, o.order_customer_id,o.order_status" --split-by order_id


#4

@himanshuporwal I tried the command in the labs. Try the command below and verify that it produces your expected output.

sqoop import --connect jdbc:mysql://ms.itversity.com:3306/retail_db --username retail_user --password ****** --target-dir /user/annapurnachinta/retail_db/order_items --num-mappers 2 --query "select o.*, sum(oi.order_item_subtotal) order_revenue from orders o join order_items oi on o.order_id = oi.order_item_order_id where \$CONDITIONS group by o.order_id, o.order_customer_id,o.order_status" --split-by order_id
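
Once it runs, one way to verify the output (assuming the target directory above and Sqoop's default part-file naming) is:

hdfs dfs -ls /user/annapurnachinta/retail_db/order_items
hdfs dfs -cat /user/annapurnachinta/retail_db/order_items/part-m-00000 | head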

#5

@himanshuporwal

You need to prefix "$CONDITIONS" with a backslash, i.e. it should be \$CONDITIONS, because your query is enclosed in double quotes.

OR

You can enclose your query in single quotes and leave the rest of the command as is.

This should fix your error. Hope this helps!
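
A quick way to see the difference at the shell level (a minimal sketch using echo; Sqoop replaces the literal $CONDITIONS placeholder at runtime, so the shell must pass it through unexpanded):

echo "where $CONDITIONS"    # double quotes: the shell expands the undefined variable and prints "where "
echo "where \$CONDITIONS"   # escaped: prints "where $CONDITIONS"
echo 'where $CONDITIONS'    # single quotes: prints "where $CONDITIONS"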