Why does my Sqoop export fail?


#1

Here is what I tried; I would appreciate any help sorting it out. Thank you in advance.

[paslechoix@gw03 ~]$ hdfs dfs -ls paslechoix/products0310
Found 2 items
-rw-r--r-- 3 paslechoix hdfs 0 2018-03-10 08:54 paslechoix/products0310/_SUCCESS
-rw-r--r-- 3 paslechoix hdfs 58300 2018-03-10 08:54 paslechoix/products0310/part-m-00000

  1. Create the table in MySQL:
    create table product0310
    (product_id int, product_category_id int, product_name varchar(20), product_description varchar(255), product_price float, product_image varchar(255));

  2. Export:
    sqoop export -m 1 \
      --connect jdbc:mysql://ms.itversity.com/retail_export \
      --username=retail_user \
      --password=itversity \
      --table=product0310 \
      --export-dir paslechoix/products0310

Error:

18/03/10 09:02:56 INFO mapreduce.Job: Job job_1520592249193_0788 running in uber mode : false
18/03/10 09:02:56 INFO mapreduce.Job: map 0% reduce 0%
18/03/10 09:03:03 INFO mapreduce.Job: map 100% reduce 0%
18/03/10 09:03:04 INFO mapreduce.Job: Job job_1520592249193_0788 failed with state FAILED due to: Task failed task_1520592249193_0788_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

18/03/10 09:03:04 INFO mapreduce.Job: Counters: 8
Job Counters
Failed map tasks=1
Launched map tasks=1
Rack-local map tasks=1
Total time spent by all maps in occupied slots (ms)=10302
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=5151
Total vcore-milliseconds taken by all map tasks=5151
Total megabyte-milliseconds taken by all map tasks=10549248
18/03/10 09:03:04 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18/03/10 09:03:04 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 47.662 seconds (0 bytes/sec)
18/03/10 09:03:04 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
18/03/10 09:03:04 INFO mapreduce.ExportJobBase: Exported 0 records.
18/03/10 09:03:04 ERROR mapreduce.ExportJobBase: Export job failed!
18/03/10 09:03:04 ERROR tool.ExportTool: Error during export: Export job failed!


#2

Check the delimiter in the export file and pass --input-fields-terminated-by "," (if the delimiter is ",").
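One quick local way to confirm the delimiter is to count the fields a candidate separator produces on a sample record; if the count matches the table's column count (six here), that separator is likely correct. A minimal sketch, using a hypothetical sample line in place of the real part-m-00000 content:

```shell
# Hypothetical sample record from the exported file (not the actual data).
line='1,2,Quest Q64,Slant Leg Instant Up Canopy,59.98,http://images.example.com/q64.jpg'

# Split on "," and count the fields; 6 matches the product0310 DDL.
printf '%s\n' "$line" | awk -F',' '{ print NF }'   # prints 6
```

If the count is 1, the file is not comma-delimited and a different --input-fields-terminated-by value is needed.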


#3

Post the error log from the Cloudera job history:

http://quickstart.cloudera:19888/jobhistory/logs/

Also post the content of the file:

hadoop fs -cat paslechoix/products0310/part-m-00000 | head

Check by adding --columns.
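Adding --columns makes the mapping from file fields to table columns explicit. A sketch of the command from post #1 with the column list taken from the product0310 DDL (this assumes the file fields appear in that same order, and it is not runnable outside the itversity cluster):

```shell
# Same export as in post #1, with the column order spelled out so Sqoop
# maps file fields to table columns explicitly rather than by default order.
sqoop export -m 1 \
  --connect jdbc:mysql://ms.itversity.com/retail_export \
  --username=retail_user \
  --password=itversity \
  --table=product0310 \
  --columns "product_id,product_category_id,product_name,product_description,product_price,product_image" \
  --export-dir paslechoix/products0310
```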


#4

Thank you Vinayak and Vani,

I finally sorted it out. The root cause was that the imported data didn't match the new MySQL table structure: one field was defined as varchar(20) while the raw data needed varchar(45).

Thanks.
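A quick way to catch this kind of mismatch before exporting is to measure the widest value in the suspect column of the exported file. A minimal sketch, using hypothetical sample rows as a stand-in for the real part-m-00000:

```shell
# Create a small comma-delimited sample (stand-in for part-m-00000).
cat > sample.csv <<'EOF'
1,2,Quest Q64 10 FT. x 10 FT. Slant Leg Instant U,,59.98,http://images.example.com/q64.jpg
2,2,Under Armour Men's Highlight MC Football Clea,,129.99,http://images.example.com/ua.jpg
EOF

# Print the maximum character width of column 3 (product_name);
# if it exceeds the DDL's varchar(20), the export will fail.
awk -F',' 'length($3) > max { max = length($3) } END { print max }' sample.csv   # prints 45
```

Here the widest product_name is 45 characters, so the column needs varchar(45) (or wider) rather than varchar(20).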