Sqoop export of Avro and SequenceFile - ERROR

I'm facing an issue with a Sqoop export of a table (EMP_PHOTO) containing a BLOB column from a DB2 database.
The import works fine in both SequenceFile and Avro file formats, but exporting to a table (a new, empty table created in the DB2 database) results in an error.
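For reference, EMP_PHOTO is the standard table from the DB2 SAMPLE database (EMPNO, PHOTO_FORMAT and a PICTURE BLOB column). The statements below are a sketch of how the empty target table can be set up with identical column types from the DB2 CLP (standard DB2 syntax, shown for completeness; my exact session may have differed):

# Verify the source table's column types, then create the empty target
# table with the same definition (CREATE TABLE ... LIKE is standard DB2 SQL):
db2 connect to SAMPLE user MLING
db2 "DESCRIBE TABLE EMP_PHOTO"
db2 "CREATE TABLE EMP_PHOTO_HDFS LIKE EMP_PHOTO"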

Command I used for import:
[hduser@master ~]$ sqoop import --connect jdbc:db2://192.168.1.6:50000/SAMPLE --username MLING -P --table EMP_PHOTO -m 1 --target-dir EMP_PHOTO --delete-target-dir --as-avrodatafile;

18/09/18 21:22:07 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
18/09/18 21:22:08 INFO manager.SqlManager: Using default fetchSize of 1000
18/09/18 21:22:08 INFO tool.CodeGenTool: Beginning code generation
18/09/18 21:22:08 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM EMP_PHOTO AS t WHERE 1=0
.
.
.
18/09/18 21:22:29 INFO mapreduce.ImportJobBase: Transferred 378.6914 KB in 19.1963 seconds (19.7273 KB/sec)
18/09/18 21:22:29 INFO mapreduce.ImportJobBase: Retrieved 8 records.

In HDFS the files are as below:
[hduser@master ~]$ hadoop fs -ls /user/hduser/EMP_PHOTO
Found 2 items
-rw-r--r--   2 hduser supergroup          0 2018-09-18 21:22 /user/hduser/EMP_PHOTO/_SUCCESS
-rw-r--r--   2 hduser supergroup     387780 2018-09-18 21:22 /user/hduser/EMP_PHOTO/part-m-00000.avro
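To double-check what Sqoop actually wrote, the embedded Avro schema can be dumped with avro-tools (the jar name/version below is an assumption; use whichever avro-tools jar is available locally):

# Pull the file out of HDFS and inspect it
# (avro-tools-1.8.1.jar is an assumed path/version):
hadoop fs -get /user/hduser/EMP_PHOTO/part-m-00000.avro .
java -jar avro-tools-1.8.1.jar getschema part-m-00000.avro
java -jar avro-tools-1.8.1.jar tojson part-m-00000.avro | head -1

getschema shows how the PICTURE BLOB column was mapped on the Avro side.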

Command and error for the export:
sqoop export --connect jdbc:db2://192.168.1.6:50000/SAMPLE --username MLING -P --table EMP_PHOTO_HDFS -m 1 --export-dir /user/hduser/EMP_PHOTO/ ;

18/09/18 21:23:03 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
18/09/18 21:23:03 INFO manager.SqlManager: Using default fetchSize of 1000
18/09/18 21:23:03 INFO tool.CodeGenTool: Beginning code generation
18/09/18 21:23:04 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM EMP_PHOTO_HDFS AS t WHERE 1=0
.
.
.
18/09/18 21:23:09 INFO mapreduce.Job: Running job: job_1537282972351_0013
18/09/18 21:23:16 INFO mapreduce.Job: Job job_1537282972351_0013 running in uber mode : false
18/09/18 21:23:16 INFO mapreduce.Job: map 0% reduce 0%
18/09/18 21:23:22 INFO mapreduce.Job: Task Id : attempt_1537282972351_0013_m_000000_0, Status : FAILED
Error: java.lang.UnsupportedOperationException: BlobRef not supported
    at org.apache.sqoop.avro.AvroUtil.fromAvro(AvroUtil.java:146)
    at org.apache.sqoop.avro.AvroUtil.fromAvro(AvroUtil.java:170)
    at org.apache.sqoop.mapreduce.GenericRecordExportMapper.toSqoopRecord(GenericRecordExportMapper.java:94)
    at org.apache.sqoop.mapreduce.AvroExportMapper.map(AvroExportMapper.java:36)
    at org.apache.sqoop.mapreduce.AvroExportMapper.map(AvroExportMapper.java:30)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
18/09/18 21:39:44 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 33.9824 seconds (0 bytes/sec)
18/09/18 21:39:44 INFO mapreduce.ExportJobBase: Exported 0 records.
18/09/18 21:39:44 ERROR tool.ExportTool: Error during export: Export job failed!
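From the stack trace, AvroUtil.fromAvro throws as soon as it reaches the BLOB column, so it looks like Sqoop 1.4.6 simply does not support LOB columns on Avro export. The only partial workaround I could think of is restricting the export to the non-LOB columns; --columns is a documented sqoop-export argument, though I have not confirmed that it bypasses the BlobRef conversion:

# Sketch: export only the non-BLOB columns
# (whether this avoids the BlobRef path on Avro input is unconfirmed):
sqoop export \
  --connect jdbc:db2://192.168.1.6:50000/SAMPLE \
  --username MLING -P \
  --table EMP_PHOTO_HDFS \
  --columns "EMPNO,PHOTO_FORMAT" \
  --export-dir /user/hduser/EMP_PHOTO \
  -m 1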

I also tried the import with the SequenceFile format and then exporting it, but that failed as well; logs below:
18/09/18 21:39:13 INFO mapreduce.Job: Running job: job_1537282972351_0015
18/09/18 21:39:22 INFO mapreduce.Job: Job job_1537282972351_0015 running in uber mode : false
18/09/18 21:39:22 INFO mapreduce.Job: map 0% reduce 0%
18/09/18 21:39:26 INFO mapreduce.Job: Task Id : attempt_1537282972351_0015_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.io.IOException: WritableName can't load class: EMP_PHOTO
    at org.apache.hadoop.io.SequenceFile$Reader.getValueClass(SequenceFile.java:2033)
    at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1963)
    at org.apache.hadoop.io.SequenceFile$Reader.initialize(SequenceFile.java:1813)
    at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1762)
    at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1776)
    at org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.initialize(SequenceFileRecordReader.java:54)
    at org.apache.sqoop.mapreduce.CombineShimRecordReader.initialize(CombineShimRecordReader.java:76)
    at org.apache.sqoop.mapreduce.CombineFileRecordReader.initialize(CombineFileRecordReader.java:64)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:548)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:786)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.IOException: WritableName can't load class: EMP_PHOTO
    at org.apache.hadoop.io.WritableName.getClass(WritableName.java:77)
    at org.apache.hadoop.io.SequenceFile$Reader.getValueClass(SequenceFile.java:2031)
    ... 15 more
Caused by: java.lang.ClassNotFoundException: Class EMP_PHOTO not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
    at org.apache.hadoop.io.WritableName.getClass(WritableName.java:75)
    ... 16 more
18/09/18 21:43:28 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 33.2002 seconds (0 bytes/sec)
18/09/18 21:43:28 INFO mapreduce.ExportJobBase: Exported 0 records.
18/09/18 21:43:28 ERROR tool.ExportTool: Error during export: Export job failed!
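For the SequenceFile case, the ClassNotFoundException suggests the export job cannot load the EMP_PHOTO record class that Sqoop generated at import time. Sqoop documents --class-name and --jar-file for reusing generated code, so re-supplying that jar may be the fix; the jar path below is an assumption based on Sqoop's default /tmp/sqoop-<user>/compile/ output directory (the import log prints the real location in its "Writing jar file" line):

# Sketch: point the export at the record class generated during import
# (the compile-dir hash is a placeholder; take the real path from the import log):
sqoop export \
  --connect jdbc:db2://192.168.1.6:50000/SAMPLE \
  --username MLING -P \
  --table EMP_PHOTO_HDFS \
  --export-dir /user/hduser/EMP_PHOTO \
  --class-name EMP_PHOTO \
  --jar-file /tmp/sqoop-hduser/compile/<hash>/EMP_PHOTO.jar \
  -m 1

Is re-supplying the generated jar the right direction here?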

Could someone clarify where I am going wrong?
Thanks in advance.