Sqoop Export Not Working


#1

Hi,
I tried using the below query:
sqoop export \
--username retail_dba \
--password itversity \
--connect jdbc:mysql://nn01.itversity:3306/retail_export \
--export-dir /user/hive/warehouse/samandav.db/products_hive/ \
--input-fields-terminated-by '\001' \
--input-null-non-string "null" \
--input-null-string "null" \
--update-mode allowinsert \
--update-key product_id \
--columns "product_id, product_category_id, product_name, product_description, product_price, product_imaage, product_grade, product_sentiment" --table products_external_saranya --driver com.mysql.jdbc.Driver;

It fails with the error message:
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/05/10 13:46:23 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
18/05/10 13:46:23 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/05/10 13:46:23 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
18/05/10 13:46:23 INFO manager.SqlManager: Using default fetchSize of 1000
18/05/10 13:46:24 ERROR manager.SqlManager: Error executing statement: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1121)
at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:357)
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2484)
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2521)
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2306)
at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:839)
at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:49)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:421)
at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:350)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:904)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
at org.apache.sqoop.manager.SqlManager.getColumnNamesForRawQuery(SqlManager.java:151)
at org.apache.sqoop.manager.SqlManager.getColumnNames(SqlManager.java:116)
at org.apache.sqoop.manager.ConnManager.configureDbOutputColumns(ConnManager.java:733)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:96)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
Caused by: java.net.UnknownHostException: nn01.itversity: unknown error
at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
at java.net.InetAddress.getAllByName0(InetAddress.java:1276)
at java.net.InetAddress.getAllByName(InetAddress.java:1192)
at java.net.InetAddress.getAllByName(InetAddress.java:1126)
at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:249)
at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:307)
... 28 more
18/05/10 13:46:24 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
java.lang.NullPointerException
at org.apache.sqoop.manager.SqlManager.filterSpecifiedColumnNames(SqlManager.java:129)
at org.apache.sqoop.manager.SqlManager.getColumnNames(SqlManager.java:116)
at org.apache.sqoop.manager.ConnManager.configureDbOutputColumns(ConnManager.java:733)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:96)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
Could anyone please help me with this?


#2

Hi,
'.com' is missing in your command:
--connect jdbc:mysql://nn01.itversity:3306/retail_export

Try with
--connect jdbc:mysql://nn01.itversity.com:3306/retail_export
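
The "Caused by: java.net.UnknownHostException: nn01.itversity" in your stack trace points to the same thing. If you want a quick sanity check before rerunning the export, something like the below from the gateway node should confirm the host resolves and the MySQL port is reachable (hostname and port are taken from your --connect string; just a sketch, and your environment may not have both tools installed):

# Should resolve nn01.itversity.com to an IP address
nslookup nn01.itversity.com

# Optionally confirm port 3306 is reachable (-z only tests the connection)
nc -zv nn01.itversity.com 3306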

Thanks
Aparna


#3

I have changed my query as shown below:
sqoop export --connect jdbc:mysql://nn01.itversity.com:3306/retail_export --username retail_dba --password itversity --export-dir /apps/hive/warehouse/samandav.db/products_hive/ --update-mode allowinsert --update-key product_id --table products_external_saranya

I am still getting the error below:
Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/05/10 16:40:10 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
18/05/10 16:40:11 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/05/10 16:40:11 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
18/05/10 16:40:11 INFO manager.SqlManager: Using default fetchSize of 1000
18/05/10 16:40:12 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM products_external_saranya AS t WHERE 1=0
18/05/10 16:40:12 INFO tool.CodeGenTool: Beginning code generation
18/05/10 16:40:12 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM products_external_saranya AS t WHERE 1=0
18/05/10 16:40:12 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM products_external_saranya AS t WHERE 1=0
18/05/10 16:40:12 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce
Note: /tmp/sqoop-saranyamandava/compile/f3608a39e5c377ee70f6377ac2e58e15/products_external_saranya.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/05/10 16:40:13 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-saranyamandava/compile/f3608a39e5c377ee70f6377ac2e58e15/products_external_saranya.jar
18/05/10 16:40:13 ERROR tool.ExportTool: Error during export: Mixed update/insert is not supported against the target database yet
[saranyamandava@gw03 ~]$ sqoop export --connect jdbc:mysql://nn01.itversity.com:3306/retail_export --username retail_dba --password itversity --export-dir /apps/hive/warehouse/samandav.db/products_hive/ --update-mode allowinsert --update-key product_id --table products_external_saranya
Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/05/10 16:40:33 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
18/05/10 16:40:33 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/05/10 16:40:33 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/05/10 16:40:33 INFO tool.CodeGenTool: Beginning code generation
18/05/10 16:40:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM products_external_saranya AS t LIMIT 1
18/05/10 16:40:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM products_external_saranya AS t LIMIT 1
18/05/10 16:40:34 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce
Note: /tmp/sqoop-saranyamandava/compile/2db344041d88572deb33ec12ffde2661/products_external_saranya.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/05/10 16:40:35 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-saranyamandava/compile/2db344041d88572deb33ec12ffde2661/products_external_saranya.jar
18/05/10 16:40:35 WARN manager.MySQLManager: MySQL Connector upsert functionality is using INSERT ON
18/05/10 16:40:35 WARN manager.MySQLManager: DUPLICATE KEY UPDATE clause that relies on table's unique key.
18/05/10 16:40:35 WARN manager.MySQLManager: Insert/update distinction is therefore independent on column
18/05/10 16:40:35 WARN manager.MySQLManager: names specified in --update-key parameter. Please see MySQL
18/05/10 16:40:35 WARN manager.MySQLManager: documentation for additional limitations.
18/05/10 16:40:35 INFO mapreduce.ExportJobBase: Beginning export of products_external_saranya
18/05/10 16:40:38 INFO impl.TimelineClientImpl: Timeline service address: http://rm01.itversity.com:8188/ws/v1/timeline/
18/05/10 16:40:38 INFO client.RMProxy: Connecting to ResourceManager at rm01.itversity.com/172.16.1.106:8050
18/05/10 16:40:38 INFO client.AHSProxy: Connecting to Application History server at rm01.itversity.com/172.16.1.106:10200
18/05/10 16:40:48 INFO input.FileInputFormat: Total input paths to process : 2
18/05/10 16:40:48 INFO input.FileInputFormat: Total input paths to process : 2
18/05/10 16:40:49 INFO mapreduce.JobSubmitter: number of splits:4
18/05/10 16:40:49 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1525279861629_5823
18/05/10 16:40:50 INFO impl.YarnClientImpl: Submitted application application_1525279861629_5823
18/05/10 16:40:50 INFO mapreduce.Job: The url to track the job: http://rm01.itversity.com:19288/proxy/application_1525279861629_5823/
18/05/10 16:40:50 INFO mapreduce.Job: Running job: job_1525279861629_5823
18/05/10 16:40:56 INFO mapreduce.Job: Job job_1525279861629_5823 running in uber mode : false
18/05/10 16:40:56 INFO mapreduce.Job: map 0% reduce 0%
18/05/10 16:41:01 INFO mapreduce.Job: map 100% reduce 0%
18/05/10 16:41:02 INFO mapreduce.Job: Job job_1525279861629_5823 failed with state FAILED due to: Task failed task_1525279861629_5823_m_000003
Job failed as tasks failed. failedMaps:1 failedReduces:0

18/05/10 16:41:02 INFO mapreduce.Job: Counters: 32
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=161337
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=327
HDFS: Number of bytes written=0
HDFS: Number of read operations=7
HDFS: Number of large read operations=0
HDFS: Number of write operations=0
Job Counters
Failed map tasks=1
Killed map tasks=2
Launched map tasks=4
Rack-local map tasks=4
Total time spent by all maps in occupied slots (ms)=22886
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=11443
Total vcore-milliseconds taken by all map tasks=11443
Total megabyte-milliseconds taken by all map tasks=23435264
Map-Reduce Framework
Map input records=0
Map output records=0
Input split bytes=268
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=40
CPU time spent (ms)=540
Physical memory (bytes) snapshot=221339648
Virtual memory (bytes) snapshot=3702120448
Total committed heap usage (bytes)=217055232
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=0
18/05/10 16:41:02 INFO mapreduce.ExportJobBase: Transferred 327 bytes in 24.9473 seconds (13.1076 bytes/sec)
18/05/10 16:41:02 INFO mapreduce.ExportJobBase: Exported 0 records.
18/05/10 16:41:02 ERROR mapreduce.ExportJobBase: Export job failed!
18/05/10 16:41:02 ERROR tool.ExportTool: Error during export: Export job failed!


#4

Hi,
I tried your query with a few changes under my login; it worked and inserted two records into products_external_saranya, which you can check.

Kindly check the modified query below.

sqoop export \
--connect jdbc:mysql://nn01.itversity.com:3306/retail_export \
--username retail_dba \
--password itversity \
--export-dir /apps/hive/warehouse/samandav.db/products_hive/ \
--input-fields-terminated-by '\001' \
--input-null-non-string "null" \
--input-null-string "null" \
--update-mode allowinsert \
--update-key product_id \
--columns "product_id,product_category_id,product_name,product_description,product_price,product_impage,product_grade,product_sentiment" \
--table products_external_saranya
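
If you want to confirm the rows landed, a quick check from the gateway node (assuming the mysql client is available there; host, database, and credentials are the same ones used in the export above, and -p will prompt for the password instead of putting it on the command line):

mysql -h nn01.itversity.com -u retail_dba -p retail_export \
  -e "SELECT COUNT(*) FROM products_external_saranya;"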

Please check and try again. Hope it works now.

Thanks
Aparna


#5

Your second query is fine, but since the column order in the source and target tables is different, you need to specify the --columns option in the export command, with the columns in the same order as in the source table and no spaces between the column names.
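
If you are not sure of the order on either side, you can describe both tables and compare (a sketch, assuming the hive and mysql clients are available on the gateway; the database names, host, and credentials are the ones from the commands above):

# Column order of the Hive source table
hive -e "DESCRIBE samandav.products_hive;"

# Column order of the MySQL target table (-p prompts for the password)
mysql -h nn01.itversity.com -u retail_dba -p retail_export \
  -e "DESCRIBE products_external_saranya;"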

Please try again and let me know if it works.

Thanks
Aparna