Sqoop export error!

#1

I'm getting the error below when I try the last question in the HDPCD practice exam. The question is quoted after the error message. Please help me solve this issue.

[root@sandbox ~]# sqoop export --connect jdbc:mysql://sandbox.hortonworks.com:3306/flightinfo --username root --password '' --export-dir /user/horton/weather --table weather --input-fields-terminated-by ',' -m 1;
Warning: /usr/hdp/2.4.0.0-169/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/12/25 17:59:56 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.0.0-169
16/12/25 17:59:56 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/12/25 17:59:56 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/12/25 17:59:56 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.0.0-169/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.0.0-169/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/12/25 17:59:57 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM weather AS t LIMIT 1
16/12/25 17:59:57 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@6eb58296 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@6eb58296 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:934)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)
at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2735)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1899)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1524)
at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:3003)
at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:602)
at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:445)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:286)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1845)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
16/12/25 17:59:57 ERROR tool.ExportTool: Encountered IOException running export job: java.io.IOException: No columns to generate for ClassWriter

TASK 10

Sqoop Export

Put the local file /home/hortonworks/datasets/flightdelays/sfo_weather.csv into HDFS in a new directory named /user/hortonworks/weather/
Note there is a MySQL database named flightinfo on the namenode machine. It contains a table named weather with the following schema:
+---------------+--------------+------+-----+---------+-------+
| Field         | Type         | Null | Key | Default | Extra |
+---------------+--------------+------+-----+---------+-------+
| station       | varchar(100) | YES  |     | NULL    |       |
| year          | int(11)      | YES  |     | NULL    |       |
| month         | int(11)      | YES  |     | NULL    |       |
| dayofmonth    | int(11)      | YES  |     | NULL    |       |
| precipitation | int(11)      | YES  |     | NULL    |       |
| maxtemp       | int(11)      | YES  |     | NULL    |       |
| mintemp       | int(11)      | YES  |     | NULL    |       |
+---------------+--------------+------+-----+---------+-------+
Use Sqoop to export the weather directory in HDFS to the weather table in MySQL on port 3306 on the namenode machine. The username for MySQL is root and the password is hadoop.
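
For reference, the first part of the task (loading the file into HDFS) can be done with something like the following. This is a sketch: the target directory matches my export command above rather than the task's /user/hortonworks/weather/, so adjust the path as needed.

# create the target directory and upload the CSV (paths assumed as above)
hdfs dfs -mkdir -p /user/horton/weather
hdfs dfs -put /home/hortonworks/datasets/flightdelays/sfo_weather.csv /user/horton/weather/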

#2

In your export command, replace sandbox.hortonworks.com with namenode, since that is the alias given for the namenode machine.
I.e. jdbc:mysql://namenode:3306/…
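
For example, keeping everything else from the original command and swapping only the host (note the task also says the MySQL password is hadoop, so the empty password may need changing too):

sqoop export --connect jdbc:mysql://namenode:3306/flightinfo --username root --password '' --export-dir /user/horton/weather --table weather --input-fields-terminated-by ',' -m 1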

#3

Hi, I don't think that is the issue: the namenode alias is valid in the AWS EC2 environment, whereas I downloaded the dataset and uploaded it to a Hortonworks sandbox VM on my local machine. I validated the connection using the command below.

sqoop eval --connect jdbc:mysql://sandbox.hortonworks.com/ --username root --password "" --query "show databases";

The above statement works without any issue; the error occurs only when I try to export. So I think the issue is something else.
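
To narrow it down further, the same kind of check can be scoped to the target database and table (a sketch using the same credentials as above):

sqoop eval --connect jdbc:mysql://sandbox.hortonworks.com:3306/flightinfo --username root --password "" --query "DESCRIBE weather"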

#4

Hi, the issue was resolved by adding --driver com.mysql.jdbc.Driver to the command. Thanks for your support.

sqoop export --connect jdbc:mysql://localhost/flightinfo --table weather --export-dir /user/horton/weather --input-fields-terminated-by ',' --username root --password "" --driver com.mysql.jdbc.Driver --direct;
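
For anyone hitting the same error: as far as I understand, specifying --driver makes Sqoop fall back to its generic JDBC connection manager instead of the MySQL-specific one, and it is the MySQL manager's streaming result set that collides with the metadata query during code generation. The same fallback can also be requested explicitly; a sketch (connection-manager class name per the Sqoop docs, --direct omitted since it targets the MySQL-specific connector):

sqoop export --connect jdbc:mysql://localhost/flightinfo --table weather --export-dir /user/horton/weather --input-fields-terminated-by ',' --username root --password "" --connection-manager org.apache.sqoop.manager.GenericJdbcManager --driver com.mysql.jdbc.Driver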

#5

Nice. Please mark it as the solution.
