Sqoop export fails

Hello

Sqoop export failed after following the Udemy lecture on the same topic.
I need help troubleshooting.

Command and error below:
[purbitabiswas0608@gw03 ~]$ sqoop export --connect jdbc:mysql://ms.itversity.com:3306/retail_export --username retail_user --password itversity --table daily_revenue --export-dir /apps/hive/warehouse/sse_sqoop_imports.db/daily_revenue --input-fields-terminated-by '\001'

Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/03/15 02:41:29 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
18/03/15 02:41:29 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/03/15 02:41:30 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/03/15 02:41:30 INFO tool.CodeGenTool: Beginning code generation
18/03/15 02:41:30 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM daily_revenue AS t LIMIT 1
18/03/15 02:41:30 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM daily_revenue AS t LIMIT 1
18/03/15 02:41:30 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce
Note: /tmp/sqoop-purbitabiswas0608/compile/bf879ffdf8ffc20d8b3697e7b0239754/daily_revenue.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/03/15 02:41:32 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-purbitabiswas0608/compile/bf879ffdf8ffc20d8b3697e7b0239754/daily_revenue.jar
18/03/15 02:41:32 INFO mapreduce.ExportJobBase: Beginning export of daily_revenue
18/03/15 02:41:34 INFO impl.TimelineClientImpl: Timeline service address: http://rm01.itversity.com:8188/ws/v1/timeline/
18/03/15 02:41:34 INFO client.RMProxy: Connecting to ResourceManager at rm01.itversity.com/172.16.1.106:8050
18/03/15 02:41:34 INFO client.AHSProxy: Connecting to Application History server at rm01.itversity.com/172.16.1.106:10200
18/03/15 02:42:00 INFO input.FileInputFormat: Total input paths to process : 1
18/03/15 02:42:00 INFO input.FileInputFormat: Total input paths to process : 1
18/03/15 02:42:01 INFO mapreduce.JobSubmitter: number of splits:4
18/03/15 02:42:01 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1520592249193_10216
18/03/15 02:42:02 INFO impl.YarnClientImpl: Submitted application application_1520592249193_10216
18/03/15 02:42:02 INFO mapreduce.Job: The url to track the job: http://rm01.itversity.com:8088/proxy/application_1520592249193_10216/
18/03/15 02:42:02 INFO mapreduce.Job: Running job: job_1520592249193_10216
18/03/15 02:42:09 INFO mapreduce.Job: Job job_1520592249193_10216 running in uber mode : false
18/03/15 02:42:09 INFO mapreduce.Job: map 0% reduce 0%
18/03/15 02:42:15 INFO mapreduce.Job: map 100% reduce 0%
18/03/15 02:42:16 INFO mapreduce.Job: Job job_1520592249193_10216 failed with state FAILED due to: Task failed task_1520592249193_10216_m_000003
Job failed as tasks failed. failedMaps:1 failedReduces:0

18/03/15 02:42:16 INFO mapreduce.Job: Counters: 12
Job Counters
Failed map tasks=1
Killed map tasks=3
Launched map tasks=4
Data-local map tasks=4
Total time spent by all maps in occupied slots (ms)=33726
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=16863
Total vcore-milliseconds taken by all map tasks=16863
Total megabyte-milliseconds taken by all map tasks=34535424
Map-Reduce Framework
CPU time spent (ms)=0
Physical memory (bytes) snapshot=0
Virtual memory (bytes) snapshot=0
18/03/15 02:42:16 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18/03/15 02:42:16 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 42.7494 seconds (0 bytes/sec)
18/03/15 02:42:16 INFO mapreduce.ExportJobBase: Exported 0 records.
18/03/15 02:42:16 ERROR mapreduce.ExportJobBase: Export job failed!
18/03/15 02:42:16 ERROR tool.ExportTool: Error during export: Export job failed!

If you go to the tracking URL and open the logs of the failed task, it says this:

java.lang.RuntimeException: Can't parse input data: '2013-07-31 00:00:00.0'
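The same log can also be pulled from the command line after the job finishes, using the application id printed in the output above (a quick sketch, assuming shell access on the gateway node and that log aggregation is enabled):

# Fetch the aggregated logs for the failed application and look for the parse error
yarn logs -applicationId application_1520592249193_10216 | grep -B 2 -A 10 "Can't parse input data"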

This parse error means that the column order in the target table does not match that of the data set being exported, or that there is a data type mismatch.
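A quick way to confirm which one it is: compare the two schemas side by side, and, if the column orders differ, list the target columns explicitly on the export. The database and table names below are taken from the commands above; the column names in the last line are only placeholders, not from this thread:

-- In Hive: column order and types of the source table
describe sse_sqoop_imports.daily_revenue;

-- In MySQL: column order and types of the target table
describe retail_export.daily_revenue;

-- If the orders differ, sqoop export can map the target columns explicitly, e.g.
-- sqoop export ... --table daily_revenue --columns "order_date,revenue" ...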


Yes, found that and fixed it! Thanks!
But is the data read bottom-up? In the date field, '2013-07-31 00:00:00.0' is the last record of the table. My Hive table looks like this:

hive (sse_sqoop_imports)> select * from daily_revenue;
OK
2013-07-25 00:00:00.0 68153.82999999997
2013-07-26 00:00:00.0 136520.1700000003
2013-07-27 00:00:00.0 101074.34000000014
2013-07-28 00:00:00.0 87123.08000000013
2013-07-29 00:00:00.0 137287.09000000032
2013-07-30 00:00:00.0 102745.62000000011
2013-07-31 00:00:00.0 131878.06000000006
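On the read-order question: the job output above shows number of splits:4, so four mappers each read their own slice of the export directory in parallel rather than scanning it bottom-up; the value in the exception is just a record from the failed mapper's split, not evidence of a reversed read. The underlying files can be listed like this (path taken from the export command):

# List the data files under the export directory; Sqoop divides these into splits for the mappers
hdfs dfs -ls /apps/hive/warehouse/sse_sqoop_imports.db/daily_revenue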

Awesome, my issue is resolved!
In my case varchar(20) did not fit the data, so I used varchar(25) (see the example below).
Thanks for the answer.
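For anyone hitting the same width problem, the target column in MySQL can be widened before re-running the export; a value such as '2013-07-31 00:00:00.0' is 21 characters, so varchar(20) is one short. A minimal sketch, assuming the date column is called order_date (a placeholder name, not taken from this thread):

-- Widen the date column on the MySQL target table so the Hive timestamp strings fit
ALTER TABLE daily_revenue MODIFY order_date VARCHAR(25);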