Sqoop Import error - Yashwanth 232

Sqoop import command used:
sqoop import --connect jdbc:mysql://ms.itversity.com:3306/retail_export --username retail_user --password itversity --table Customers --where "state='CA'" --compress --compression-codec org.apache.hadoop.io.compress.SnappyCodec --delete-target-dir --target-dir /user/yashwanth232/mydata/avrodata --fields-terminated-by '|' --as-avrodatafile

Once I execute it, I get the error below. Can you let me know how I can write the data using Avro?

19/12/29 04:53:54 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.5.0-292
19/12/29 04:53:54 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/12/29 04:53:54 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/12/29 04:53:54 INFO tool.CodeGenTool: Beginning code generation
19/12/29 04:53:54 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Customers AS t LIMIT 1
19/12/29 04:53:54 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Customers AS t LIMIT 1
19/12/29 04:53:54 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.6.5.0-292/hadoop-mapreduce
Note: /tmp/sqoop-yashwanth232/compile/e50bfc50ff5176abc4340b905bfa13d6/Customers.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
19/12/29 04:53:56 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-yashwanth232/compile/e50bfc50ff5176abc4340b905bfa13d6/Customers.jar
19/12/29 04:53:58 INFO tool.ImportTool: Destination directory /user/yashwanth232/mydata/avrodata deleted.
19/12/29 04:53:58 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/12/29 04:53:58 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/12/29 04:53:58 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/12/29 04:53:58 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/12/29 04:53:58 INFO mapreduce.ImportJobBase: Beginning import of Customers
19/12/29 04:53:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Customers AS t LIMIT 1
19/12/29 04:53:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Customers AS t LIMIT 1
19/12/29 04:53:58 INFO mapreduce.DataDrivenImportJob: Writing Avro schema file: /tmp/sqoop-yashwanth232/compile/e50bfc50ff5176abc4340b905bfa13d6/Customers.avsc
19/12/29 04:53:58 INFO client.RMProxy: Connecting to ResourceManager at rm01.itversity.com/172.16.1.106:8050
19/12/29 04:53:58 INFO client.AHSProxy: Connecting to Application History server at rm01.itversity.com/172.16.1.106:10200
19/12/29 04:54:05 INFO db.DBInputFormat: Using read commited transaction isolation
19/12/29 04:54:05 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(cust_no), MAX(cust_no) FROM Customers WHERE ( state='CA' )
19/12/29 04:54:05 INFO mapreduce.JobSubmitter: number of splits:1
19/12/29 04:54:06 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1565300265360_44341
19/12/29 04:54:06 INFO impl.YarnClientImpl: Submitted application application_1565300265360_44341
19/12/29 04:54:06 INFO mapreduce.Job: The url to track the job: http://rm01.itversity.com:19088/proxy/application_1565300265360_44341/
19/12/29 04:54:06 INFO mapreduce.Job: Running job: job_1565300265360_44341
19/12/29 04:54:13 INFO mapreduce.Job: Job job_1565300265360_44341 running in uber mode : false
19/12/29 04:54:13 INFO mapreduce.Job: map 0% reduce 0%
19/12/29 04:54:19 INFO mapreduce.Job: Task Id : attempt_1565300265360_44341_m_000000_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.
19/12/29 04:54:23 INFO mapreduce.Job: Task Id : attempt_1565300265360_44341_m_000000_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.
19/12/29 04:54:27 INFO mapreduce.Job: Task Id : attempt_1565300265360_44341_m_000000_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.
19/12/29 04:54:34 INFO mapreduce.Job: map 100% reduce 0%
19/12/29 04:54:35 INFO mapreduce.Job: Job job_1565300265360_44341 failed with state FAILED due to: Task failed task_1565300265360_44341_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
19/12/29 04:54:35 INFO mapreduce.Job: Counters: 11
Job Counters
Failed map tasks=4
Launched map tasks=4
Other local map tasks=4
Total time spent by all maps in occupied slots (ms)=27076
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=13538
Total vcore-milliseconds taken by all map tasks=13538
Total megabyte-milliseconds taken by all map tasks=27725824
Map-Reduce Framework
CPU time spent (ms)=0
Physical memory (bytes) snapshot=0
Virtual memory (bytes) snapshot=0
19/12/29 04:54:35 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
19/12/29 04:54:35 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 37.4217 seconds (0 bytes/sec)
19/12/29 04:54:35 INFO mapreduce.ImportJobBase: Retrieved 0 records.
19/12/29 04:54:35 ERROR tool.ImportTool: Error during import: Import job failed!
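For what it's worth, the repeated `Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V` line is a `NoSuchMethodError`: that method exists only in Avro 1.8.x, while the Avro jar on the Hadoop task classpath is older, so the map tasks die on the version conflict rather than on anything in the command itself. A commonly reported workaround on HDP 2.6 (an assumption — verify it on your cluster) is to tell the MapReduce job to prefer the user-supplied classpath via the generic `-D` option, which must appear immediately after `import`:

```shell
# Sketch of the same import with the Avro-classpath workaround applied.
# -Dmapreduce.job.user.classpath.first=true makes the task JVM load
# Sqoop's bundled Avro 1.8.x ahead of the cluster's older Avro jar.
# Note: --fields-terminated-by is dropped because it has no effect on
# Avro output (Avro is a binary, schema-based format, not delimited text).
sqoop import \
  -Dmapreduce.job.user.classpath.first=true \
  --connect jdbc:mysql://ms.itversity.com:3306/retail_export \
  --username retail_user \
  --password itversity \
  --table Customers \
  --where "state='CA'" \
  --compress \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --delete-target-dir \
  --target-dir /user/yashwanth232/mydata/avrodata \
  --as-avrodatafile
```

Also, per the `WARN tool.BaseSqoopTool` line in your own log, consider replacing `--password itversity` with `-P` so the password is prompted for instead of exposed on the command line.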