Sqoop import --as-avrodatafile is not working in the lab environment

#1

@itversity,

When I try to do a Sqoop import as an Avro data file, I get the error below in the lab environment.

Error:

19/07/02 16:35:18 INFO impl.YarnClientImpl: Submitted application application_1561702578426_1726
19/07/02 16:35:18 INFO mapreduce.Job: The url to track the job: http://rm01.itversity.com:19088/proxy/application_1561702578426_1726/
19/07/02 16:35:18 INFO mapreduce.Job: Running job: job_1561702578426_1726
19/07/02 16:35:25 INFO mapreduce.Job: Job job_1561702578426_1726 running in uber mode : false
19/07/02 16:35:25 INFO mapreduce.Job: map 0% reduce 0%
19/07/02 16:35:30 INFO mapreduce.Job: Task Id : attempt_1561702578426_1726_m_000000_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.
19/07/02 16:35:32 INFO mapreduce.Job: Task Id : attempt_1561702578426_1726_m_000002_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.
19/07/02 16:35:32 INFO mapreduce.Job: Task Id : attempt_1561702578426_1726_m_000001_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.
19/07/02 16:35:32 INFO mapreduce.Job: Task Id : attempt_1561702578426_1726_m_000003_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.
19/07/02 16:35:34 INFO mapreduce.Job: Task Id : attempt_1561702578426_1726_m_000000_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.

19/07/02 16:35:36 INFO mapreduce.Job: Task Id : attempt_1561702578426_1726_m_000003_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.

19/07/02 16:35:36 INFO mapreduce.Job: Task Id : attempt_1561702578426_1726_m_000002_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.

19/07/02 16:35:36 INFO mapreduce.Job: Task Id : attempt_1561702578426_1726_m_000001_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.

19/07/02 16:35:38 INFO mapreduce.Job: Task Id : attempt_1561702578426_1726_m_000000_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.

19/07/02 16:35:40 INFO mapreduce.Job: Task Id : attempt_1561702578426_1726_m_000003_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.

19/07/02 16:35:40 INFO mapreduce.Job: Task Id : attempt_1561702578426_1726_m_000002_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.

19/07/02 16:35:41 INFO mapreduce.Job: Task Id : attempt_1561702578426_1726_m_000001_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
19/07/02 16:35:43 INFO mapreduce.Job: map 100% reduce 0%
19/07/02 16:35:44 INFO mapreduce.Job: Job job_1561702578426_1726 failed with state FAILED due to: Task failed task_1561702578426_1726_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

19/07/02 16:35:44 INFO mapreduce.Job: Counters: 12
Job Counters
Failed map tasks=13
Killed map tasks=3
Launched map tasks=15
Other local map tasks=15
Total time spent by all maps in occupied slots (ms)=84284
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=42142
Total vcore-milliseconds taken by all map tasks=42142
Total megabyte-milliseconds taken by all map tasks=86306816
Map-Reduce Framework
CPU time spent (ms)=0
Physical memory (bytes) snapshot=0
Virtual memory (bytes) snapshot=0
19/07/02 16:35:44 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
19/07/02 16:35:44 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 34.8424 seconds (0 bytes/sec)
19/07/02 16:35:44 INFO mapreduce.ImportJobBase: Retrieved 0 records.
19/07/02 16:35:44 ERROR tool.ImportTool: Error during import: Import job failed!


#2

@Divya_R

Please use the command below:

sqoop import -Dmapreduce.job.user.classpath.first=true --connect jdbc:mysql://ms.itversity.com:3306/retail_db --username retail_user --password itversity --table orders --target-dir "/user/divisarojar/arun/problem1/orders" --as-avrodatafile --compress --compression-codec snappy

instead of the command you were using:

sqoop import --connect jdbc:mysql://ms.itversity.com:3306/retail_db --username retail_user --password itversity --table orders --target-dir "/user/divisarojar/arun/problem1/orders" --as-avrodatafile --compress --compression-codec snappy
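The error in your log (org.apache.avro.reflect.ReflectData.addLogicalTypeConversion) typically indicates an Avro version conflict: Sqoop ships a newer Avro jar than the one on the cluster classpath, and -Dmapreduce.job.user.classpath.first=true tells the map tasks to prefer the user (Sqoop) jars. Once the import succeeds, you can roughly verify the output as shown below; the target directory is the one from the command above, and the avro-tools jar name/location is only an assumption, so use whatever is available on your gateway node.

# List the Avro part files written by the import
hdfs dfs -ls /user/divisarojar/arun/problem1/orders

# Optionally pull one part file down and inspect its schema with avro-tools
# (avro-tools jar name/location is an assumption; adjust to your cluster)
hdfs dfs -get /user/divisarojar/arun/problem1/orders/part-m-00000.avro .
java -jar avro-tools-1.8.1.jar getschema part-m-00000.avro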

#3

@Ramesh1 and @itversity

sqoop import \
  -Dmapreduce.job.user.classpath.first=true \
  --connect jdbc:mysql://ms.itverstiy.com:3306/retail_db \
  --username retail_user \
  --password itversity \
  --table orders \
  --as-avrodatafile \
  --compress \
  --compression-codec "snappy" \
  --target-dir "/user/divisarojar/arun/problem1/orders"

Today when I ran the Sqoop import I got the error below; please take a look.

19/07/09 16:34:56 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.5.0-292
19/07/09 16:34:56 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/07/09 16:34:56 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/07/09 16:34:56 INFO tool.CodeGenTool: Beginning code generation
19/07/09 16:34:56 ERROR manager.SqlManager: Error executing statement: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1121)
at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:357)
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2484)
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2521)
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2306)
at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:839)
at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:49)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:421)
at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:350)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:904)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:328)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1853)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1653)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:488)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
Caused by: java.net.UnknownHostException: ms.itverstiy.com: unknown error
at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
at java.net.InetAddress.getAllByName0(InetAddress.java:1276)
at java.net.InetAddress.getAllByName(InetAddress.java:1192)
at java.net.InetAddress.getAllByName(InetAddress.java:1126)
at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:249)
at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:307)
… 33 more
19/07/09 16:34:56 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1659)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:488)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)


#4

There is a spelling mistake in the connection string: ms.itverstiy.com should be ms.itversity.com. Try the command below and let us know.

--connect jdbc:mysql://ms.itversity.com:3306/retail_db
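A quick way to catch this kind of typo before launching a full import is to check that the host resolves and that the JDBC connection works. Something along these lines should do it from the gateway node (availability of these commands there is an assumption):

# Check that the MySQL host name resolves
nslookup ms.itversity.com

# Test the JDBC connection and credentials with a trivial query before importing
sqoop eval --connect jdbc:mysql://ms.itversity.com:3306/retail_db --username retail_user --password itversity --query "SELECT count(*) FROM orders"

sqoop eval simply runs the query and prints the result, so it fails fast on connection problems without starting a MapReduce job.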


#5

Oops, my bad. Thanks for the response, @annapurna.


closed #6