Export job failed


#1

I am exporting data from Hive to MySQL:

sqoop export \
--connect jdbc:mysql://ms.itversity.com:3306/retail_export \
--username retail_user \
--password itversity \
--export-dir /apps/hive/warehouse/meetgauravjain_sqoop_import.db/daily_revenue \
--table daily_revenue \
--input-fields-terminated-by "\001"
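(For context: \001, i.e. Ctrl-A, is Hive's default field delimiter, which is why it is passed to --input-fields-terminated-by. A quick local sanity check that data splits on that byte, using a made-up sample row rather than the real export data:)

```shell
# \001 (Ctrl-A) is Hive's default field delimiter.
# Simulate one daily_revenue row (date \001 revenue) and split it with awk.
printf '2013-07-31 00:00:00.0\0011000.0\n' | awk -F '\001' '{print $2}'
# prints: 1000.0
```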

But the job is failing. Below is the log:

17/12/05 12:21:24 INFO mapreduce.Job: Job job_1507687444776_20320 failed with state FAILED due to: Task failed task_1507687444776_20320_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
17/12/05 12:21:24 INFO mapreduce.Job: Counters: 9
Job Counters
Failed map tasks=3
Killed map tasks=1
Launched map tasks=4
Data-local map tasks=4
Total time spent by all maps in occupied slots (ms)=48594
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=24297
Total vcore-milliseconds taken by all map tasks=24297
Total megabyte-milliseconds taken by all map tasks=49760256
17/12/05 12:21:24 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
17/12/05 12:21:24 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 26.7251 seconds (0 bytes/sec)
17/12/05 12:21:24 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
17/12/05 12:21:24 INFO mapreduce.ExportJobBase: Exported 0 records.
17/12/05 12:21:24 ERROR mapreduce.ExportJobBase: Export job failed!
17/12/05 12:21:24 ERROR tool.ExportTool: Error during export: Export job failed!

The job URL is:

http://rm01.itversity.com:8088/proxy/application_1507687444776_20320/
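(The generic "Export job failed" message from Sqoop rarely shows the real cause; the underlying exception is in the failed map task's log. A sketch of pulling it with the YARN CLI, assuming you can run yarn from a gateway node:)

```shell
# Pull the application id out of the tracking URL, then fetch the task logs.
url="http://rm01.itversity.com:8088/proxy/application_1507687444776_20320/"
app_id=$(echo "$url" | grep -o 'application_[0-9_]*')
echo "$app_id"
# prints: application_1507687444776_20320

# Then (on the cluster) the real exception is usually visible with:
# yarn logs -applicationId "$app_id" | grep -B2 -A10 'Caused by'
```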

Could you please help?


#2

Hi Gaurav,

The issue is with the data: there is a primary key on order_date in MySQL, and you are trying to insert a duplicate value for order_date ('2013-07-31 00:00:00.0') into the daily_revenue table.
Consider creating a new target table, or truncating the existing table before re-running the export.
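(Besides truncating, Sqoop itself supports an upsert-style export. A sketch using Sqoop's --update-key/--update-mode options, assuming the MySQL connector on this cluster supports upsert mode:)

```shell
# Re-run the export in upsert mode: rows whose order_date already exists
# in MySQL are updated, and new rows are inserted (--update-mode allowinsert).
sqoop export \
--connect jdbc:mysql://ms.itversity.com:3306/retail_export \
--username retail_user \
--password itversity \
--export-dir /apps/hive/warehouse/meetgauravjain_sqoop_import.db/daily_revenue \
--table daily_revenue \
--input-fields-terminated-by "\001" \
--update-key order_date \
--update-mode allowinsert
```

With plain --update-key (update-only mode), rows not yet present in MySQL would be skipped; allowinsert covers both cases.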

Hope this helps; let me know if it works.

Thanks,
Dinakar
