Sqoop Export error



I am having trouble exporting data from HDFS to a MySQL table.

Data in HDFS:
location: /user/sbhupathiraju86/departments
Data format:
7,Fan Shop
7,Fan Shop
7,Fan Shop
7,Fan Shop

Sqoop command to export:
sqoop export --connect jdbc:mysql://nn01.itversity.com:3306/retail_export --username retail_dba --password itversity --export-dir=/user/sbhupathiraju86/departments --table test_dept_exp --input-fields-terminated-by ','

Error log:

Warning: /usr/hdp/ does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/08/01 22:55:30 INFO sqoop.Sqoop: Running Sqoop version:
18/08/01 22:55:30 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/08/01 22:55:30 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/08/01 22:55:30 INFO tool.CodeGenTool: Beginning code generation
18/08/01 22:55:30 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM test_dept_exp AS t LIMIT 1
18/08/01 22:55:30 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM test_dept_exp AS t LIMIT 1
18/08/01 22:55:30 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/
Note: /tmp/sqoop-sbhupathiraju86/compile/ad76448c0001b4d78d32a850c83b7d15/test_dept_exp.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/08/01 22:55:32 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-sbhupathiraju86/compile/ad76448c0001b4d78d32a850c83b7d15/test_dept_exp.jar
18/08/01 22:55:32 INFO mapreduce.ExportJobBase: Beginning export of test_dept_exp
18/08/01 22:55:34 INFO impl.TimelineClientImpl: Timeline service address: http://rm01.itversity.com:8188/ws/v1/timeline/
18/08/01 22:55:34 INFO client.RMProxy: Connecting to ResourceManager at rm01.itversity.com/
18/08/01 22:55:34 INFO client.AHSProxy: Connecting to Application History server at rm01.itversity.com/
18/08/01 22:55:43 INFO input.FileInputFormat: Total input paths to process : 11
18/08/01 22:55:43 INFO input.FileInputFormat: Total input paths to process : 11
18/08/01 22:55:44 INFO mapreduce.JobSubmitter: number of splits:4
18/08/01 22:55:45 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1528589352821_35699
18/08/01 22:55:45 INFO impl.YarnClientImpl: Submitted application application_1528589352821_35699
18/08/01 22:55:45 INFO mapreduce.Job: The url to track the job: http://rm01.itversity.com:19288/proxy/application_1528589352821_35699/
18/08/01 22:55:45 INFO mapreduce.Job: Running job: job_1528589352821_35699
18/08/01 22:55:52 INFO mapreduce.Job: Job job_1528589352821_35699 running in uber mode : false
18/08/01 22:55:52 INFO mapreduce.Job: map 0% reduce 0%
18/08/01 22:55:59 INFO mapreduce.Job: map 100% reduce 0%
18/08/01 22:56:01 INFO mapreduce.Job: Job job_1528589352821_35699 failed with state FAILED due to: Task failed task_1528589352821_35699_m_000001
Job failed as tasks failed. failedMaps:1 failedReduces:0

18/08/01 22:56:01 INFO mapreduce.Job: Counters: 13
Job Counters
Failed map tasks=1
Killed map tasks=3
Launched map tasks=4
Other local map tasks=1
Data-local map tasks=3
Total time spent by all maps in occupied slots (ms)=31384
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=15692
Total vcore-milliseconds taken by all map tasks=15692
Total megabyte-milliseconds taken by all map tasks=32137216
Map-Reduce Framework
CPU time spent (ms)=0
Physical memory (bytes) snapshot=0
Virtual memory (bytes) snapshot=0
18/08/01 22:56:01 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18/08/01 22:56:01 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 27.6609 seconds (0 bytes/sec)
18/08/01 22:56:01 INFO mapreduce.ExportJobBase: Exported 0 records.
18/08/01 22:56:01 ERROR mapreduce.ExportJobBase: Export job failed!
18/08/01 22:56:01 ERROR tool.ExportTool: Error during export: Export job failed!

Please advise how to fix this error.


Could you please provide the table structure? As per the log, the dept_name data in the /user/sbhupathiraju86/departments files is longer than the dept_name column of the target test_dept_exp table allows.

2018-08-01 22:55:57,472 ERROR [Thread-12] org.apache.sqoop.mapreduce.AsyncSqlOutputFormat: Got exception in update thread: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'dept_name' at row 1
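
To confirm, you can inspect the current column definition from the mysql client (this assumes you can connect to the retail_export database with the same credentials used in the Sqoop command):

```sql
-- Show the column names, types, and lengths of the target table
DESCRIBE test_dept_exp;

-- Or, for the full definition including character set:
SHOW CREATE TABLE test_dept_exp;
```

Compare the reported length of dept_name against the longest value in your export files.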


You are right, the error was due to data truncation.
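
For anyone hitting the same issue: one possible fix is to widen the column before re-running the export. The type and length below are only an example; pick a size at least as large as the longest dept_name value in your data:

```sql
-- Widen dept_name so exported values are no longer truncated
-- (VARCHAR(100) is an example size, not taken from the original table)
ALTER TABLE test_dept_exp MODIFY dept_name VARCHAR(100);
```

After altering the table, the same sqoop export command should complete without the MysqlDataTruncation error.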