Sqoop Export failure - Runtime Exception

I have data in textfiles (comma as delimiter) that I am trying to export to mysql using sqoop.
Sqoop command - sqoop export --connect jdbc:mysql://nn01.itversity.com:3306/retail_export --username retail_dba --password itversity --table neha_result --export-dir /user/nehaluthra09/cloudera/ordertext --input-fields-terminated-by '\001' --columns "order_date,order_status,total_amount,total_orders"

MySQL table -
+--------------+-------------+------+-----+---------+-------+
| Field        | Type        | Null | Key | Default | Extra |
+--------------+-------------+------+-----+---------+-------+
| order_date   | varchar(10) | YES  |     | NULL    |       |
| order_status | varchar(30) | YES  |     | NULL    |       |
| total_amount | double      | YES  |     | NULL    |       |
| total_orders | double      | YES  |     | NULL    |       |
+--------------+-------------+------+-----+---------+-------+

But the job is failing with the following exception:
2017-07-13 00:27:02,135 INFO [Thread-13] org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
2017-07-13 00:27:02,137 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:122)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.RuntimeException: Can't parse input data: '2013-10-10,ON_HOLD,5559.400000000001,4689.660000000001'
at neha_result.__loadFromFields(neha_result.java:365)
at neha_result.parse(neha_result.java:298)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:89)
... 10 more
Caused by: java.util.NoSuchElementException
at java.util.ArrayList$Itr.next(ArrayList.java:854)
at neha_result.__loadFromFields(neha_result.java:350)
... 12 more

Can you please provide the failed map task logs, so we can get a better idea of the issue?

@Neha Is the data file you are exporting a plain text file? Your data seems to be in this format:

'2013-10-10,ON_HOLD,5559.400000000001,4689.660000000001'

If so, try --fields-terminated-by ',' --input-lines-terminated-by '\n' and see if that helps!
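For what it's worth, the mismatch is visible in the exception itself: the record is comma-delimited, but the command declared '\001' (Ctrl-A) as the field terminator, so the generated parser sees the whole line as one field and runs out of values. A minimal Python sketch of that mismatch (the sample line is taken from the stack trace above):

```python
# Record from the "Can't parse input data" message in the stack trace.
line = "2013-10-10,ON_HOLD,5559.400000000001,4689.660000000001"

# Splitting on the declared delimiter '\001' (Ctrl-A) leaves the whole
# line as a single field, so the generated neha_result.__loadFromFields()
# runs out of values -- hence the NoSuchElementException.
fields_declared = line.split("\u0001")
print(len(fields_declared))   # 1

# Splitting on the actual delimiter (a comma) yields the expected 4 columns.
fields_actual = line.split(",")
print(len(fields_actual))     # 4
```

This is just an illustration of the parsing behavior, not Sqoop's actual generated code; the fix is simply to tell the export command the delimiter the file really uses.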

It worked. Thank you!