Sqoop import fails with Avro format

#1

When I tried to import a table as an avrodatafile, I got the error below. Please let me know if anyone has an idea about this error.

sqoop import --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" --username retail_dba --password itversity --as-avrodatafile --target-dir /user/raghavendrakumars/sqoop_import/categories --table categories

The error I am getting is:

Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

16/11/29 14:32:32 INFO mapreduce.Job: Task Id : attempt_1480307771710_0246_m_000002_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

16/11/29 14:32:32 INFO mapreduce.Job: Task Id : attempt_1480307771710_0246_m_000001_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

16/11/29 14:32:32 INFO mapreduce.Job: Task Id : attempt_1480307771710_0246_m_000003_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/11/29 14:32:35 INFO mapreduce.Job: Task Id : attempt_1480307771710_0246_m_000002_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/11/29 14:32:35 INFO mapreduce.Job: Task Id : attempt_1480307771710_0246_m_000001_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/11/29 14:32:35 INFO mapreduce.Job: Task Id : attempt_1480307771710_0246_m_000000_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/11/29 14:32:36 INFO mapreduce.Job: Task Id : attempt_1480307771710_0246_m_000003_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/11/29 14:32:39 INFO mapreduce.Job: Task Id : attempt_1480307771710_0246_m_000002_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/11/29 14:32:39 INFO mapreduce.Job: Task Id : attempt_1480307771710_0246_m_000001_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/11/29 14:32:40 INFO mapreduce.Job: Task Id : attempt_1480307771710_0246_m_000003_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/11/29 14:32:40 INFO mapreduce.Job: Task Id : attempt_1480307771710_0246_m_000000_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/11/29 14:32:44 INFO mapreduce.Job: map 100% reduce 0%
16/11/29 14:32:45 INFO mapreduce.Job: Job job_1480307771710_0246 failed with state FAILED due to: Task failed task_1480307771710_0246_m_000002
Job failed as tasks failed. failedMaps:1 failedReduces:0

16/11/29 14:32:46 INFO mapreduce.Job: Counters: 12
Job Counters
Failed map tasks=13
Killed map tasks=3
Launched map tasks=16
Other local map tasks=16
Total time spent by all maps in occupied slots (ms)=37440
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=37440
Total vcore-milliseconds taken by all map tasks=37440
Total megabyte-milliseconds taken by all map tasks=38338560

#2

You have to use the -Dmapreduce.job.user.classpath.first parameter. For example:

sqoop import -Dmapreduce.job.user.classpath.first=true \
  --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" \
  --username retail_dba \
  --password itversity \
  --table departments \
  --warehouse-dir /apps/hive/warehouse/gnanaprakasam.db \
  --delete-target-dir \
  --as-avrodatafile
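
This property makes the jars submitted with the job take precedence over the cluster's own copies on the task classpath, which is what this kind of error usually points at: ReflectData.addLogicalTypeConversion only exists in newer Avro releases, so an older Avro jar on the cluster classpath is likely being picked up first. As a sketch, applying the same flag to the categories import from the first post (host, credentials, and target directory copied from there) would look like:

# hypothetical: post #1's import with the user-classpath-first fix added
sqoop import -Dmapreduce.job.user.classpath.first=true \
  --connect "jdbc:mysql://nn01.itversity.com:3306/retail_db" \
  --username retail_dba \
  --password itversity \
  --table categories \
  --target-dir /user/raghavendrakumars/sqoop_import/categories \
  --as-avrodatafile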

#3

Yeah… just saw another thread on the same topic.

#4

When I use the option -Dmapreduce.job.user.classpath.first=true, I get four files with the .avro extension, such as
/user/mangleeswaran/departments_delimiter/part-m-00000.avro

But we are supposed to get a .avsc file as per Durga's explanation of Avro data.

#5

@mangleeswaran, the .avsc schema file will be generated in your home directory; the .avro data files (one per mapper) will be generated in your import target directory/warehouse directory.
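
If you want to verify both locations, a quick check (paths taken from the earlier posts; the local home directory is assumed to be where the sqoop command was run) could look like:

# schema file generated locally, in the directory where sqoop was invoked
ls ~/*.avsc

# Avro data files in HDFS, one part file per mapper
hdfs dfs -ls /user/mangleeswaran/departments_delimiter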

#6

@pramodvspk: do you mean in /user/mangleeswaran?

#7

@pramodvspk: Got it :slight_smile: it is in /home/mangleeswaran.
