Sqoop export error while loading data into a MySQL table from HDFS


#1

I am getting an error while loading data into an empty table dept_exp in MySQL from data in HDFS at /user/sbhupathiraju86/departments_raw.

Command used to create the table in MySQL:
sqoop eval --connect jdbc:mysql://nn01.itversity.com:3306/retail_export --username retail_dba --password itversity --query "create table dept_exp (id integer, name char)"

Command used to load data into the MySQL table from HDFS:

sqoop export --connect jdbc:mysql://nn01.itversity.com:3306/retail_export --username retail_dba --password itversity --table dept_exp --export-dir /user/sbhupathiraju86/departments_raw

Output:
Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/07/19 23:45:18 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
18/07/19 23:45:18 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/07/19 23:45:18 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/07/19 23:45:18 INFO tool.CodeGenTool: Beginning code generation
18/07/19 23:45:19 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM dept_exp AS t LIMIT 1
18/07/19 23:45:19 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM dept_exp AS t LIMIT 1
18/07/19 23:45:19 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce
Note: /tmp/sqoop-sbhupathiraju86/compile/336d416a3a6ca415505581a8a52ae08c/dept_exp.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/07/19 23:45:20 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-sbhupathiraju86/compile/336d416a3a6ca415505581a8a52ae08c/dept_exp.jar
18/07/19 23:45:20 INFO mapreduce.ExportJobBase: Beginning export of dept_exp
18/07/19 23:45:24 INFO impl.TimelineClientImpl: Timeline service address: http://rm01.itversity.com:8188/ws/v1/timeline/
18/07/19 23:45:24 INFO client.RMProxy: Connecting to ResourceManager at rm01.itversity.com/172.16.1.106:8050
18/07/19 23:45:25 INFO client.AHSProxy: Connecting to Application History server at rm01.itversity.com/172.16.1.106:10200
18/07/19 23:45:32 INFO input.FileInputFormat: Total input paths to process : 5
18/07/19 23:45:32 INFO input.FileInputFormat: Total input paths to process : 5
18/07/19 23:45:32 INFO mapreduce.JobSubmitter: number of splits:4
18/07/19 23:45:32 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1528589352821_28690
18/07/19 23:45:33 INFO impl.YarnClientImpl: Submitted application application_1528589352821_28690
18/07/19 23:45:33 INFO mapreduce.Job: The url to track the job: http://rm01.itversity.com:19288/proxy/application_1528589352821_28690/
18/07/19 23:45:33 INFO mapreduce.Job: Running job: job_1528589352821_28690
18/07/19 23:45:39 INFO mapreduce.Job: Job job_1528589352821_28690 running in uber mode : false
18/07/19 23:45:39 INFO mapreduce.Job: map 0% reduce 0%
18/07/19 23:45:44 INFO mapreduce.Job: map 100% reduce 0%
18/07/19 23:45:45 INFO mapreduce.Job: Job job_1528589352821_28690 failed with state FAILED due to: Task failed task_1528589352821_28690_m_000002
Job failed as tasks failed. failedMaps:1 failedReduces:0

18/07/19 23:45:45 INFO mapreduce.Job: Counters: 13
	Job Counters
		Failed map tasks=1
		Killed map tasks=3
		Launched map tasks=4
		Other local map tasks=1
		Data-local map tasks=3
		Total time spent by all maps in occupied slots (ms)=18856
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=9428
		Total vcore-milliseconds taken by all map tasks=9428
		Total megabyte-milliseconds taken by all map tasks=19308544
	Map-Reduce Framework
		CPU time spent (ms)=0
		Physical memory (bytes) snapshot=0
		Virtual memory (bytes) snapshot=0
18/07/19 23:45:45 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18/07/19 23:45:45 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 21.9065 seconds (0 bytes/sec)
18/07/19 23:45:45 INFO mapreduce.ExportJobBase: Exported 0 records.
18/07/19 23:45:45 ERROR mapreduce.ExportJobBase: Export job failed!
18/07/19 23:45:45 ERROR tool.ExportTool: Error during export: Export job failed!

=========================================================================
Please advise on the root cause of this error and how to fix it.


#2

@sbhupathiraju
Create the table in MySQL as shown below. Note that char with no length defaults to CHAR(1) in MySQL, which cannot hold a full department name, so use varchar instead:

sqoop eval \
  --connect jdbc:mysql://ms.itversity.com:3306/retail_export \
  --username retail_user \
  --password itversity \
  --query "create table dept_exp(id int, name varchar(20))"
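
To double-check the column definitions after creating the table, a quick describe through sqoop eval should work (describe is standard MySQL; adjust the connection details to your own environment):

sqoop eval \
  --connect jdbc:mysql://ms.itversity.com:3306/retail_export \
  --username retail_user \
  --password itversity \
  --query "describe dept_exp"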

Try the command below to load the data from HDFS into MySQL:

sqoop export \
  --connect jdbc:mysql://ms.itversity.com:3306/retail_export \
  --username retail_user \
  --password itversity \
  --table dept_exp \
  --export-dir /user/sbhupathiraju86/departments_raw/
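
After the export finishes, a count query through sqoop eval is a simple way to verify that the rows landed (this verification step is just a suggestion, not part of the original steps):

sqoop eval \
  --connect jdbc:mysql://ms.itversity.com:3306/retail_export \
  --username retail_user \
  --password itversity \
  --query "select count(*) from dept_exp"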


#3

Check the error logs here:
http://rm01.itversity.com:19888/jobhistory/attempts/job_1528589352821_28690/m/FAILED
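
If the job history UI is not reachable, the same task logs can usually be pulled from the gateway node instead (this assumes YARN log aggregation is enabled on the cluster):

yarn logs -applicationId application_1528589352821_28690 | less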

It looks like you are trying to insert text into a non-text field.
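
To confirm what the mappers were actually trying to insert, it helps to peek at the first few input records (this assumes the usual part-file naming left behind by an earlier sqoop import):

hdfs dfs -cat /user/sbhupathiraju86/departments_raw/part* | head -5

With id defined as integer and name as char, any record whose fields do not fit those types will fail the map task.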