Hello Friends
I am new to Big Data and Sqoop; I have just started learning this wonderful technology.
I am getting an error while loading data from MySQL into HDFS. I am trying to move a whole database of 20 tables. All of the tables imported except one, so I dug into that table and found that one of its columns has the data type TINYINT(1). Here is the command I am running:
sqoop import \
  --connect jdbc:mysql://localhost.localdomain:3306/databasename \
  --username root \
  -P \
  --table sales_orders
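
(For what it's worth, this is roughly how I checked the column types on the MySQL side; databasename and sales_orders are the same as above:)

# List the columns and their types to spot the TINYINT(1) column
mysql -u root -p -e "SHOW COLUMNS FROM sales_orders;" databasename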
I read this about TINYINT(1) on the Apache Sqoop website:
MySQL: Import of TINYINT(1) from MySQL behaves strangely
Problem: Sqoop is treating TINYINT(1) columns as booleans, which is for example causing issues with HIVE import. This is because by default the MySQL JDBC connector maps the TINYINT(1) to java.sql.Types.BIT, which Sqoop by default maps to Boolean.
Solution: A more clean solution is to force MySQL JDBC Connector to stop converting TINYINT(1) to java.sql.Types.BIT by adding tinyInt1isBit=false into your JDBC path (to create something like jdbc:mysql://localhost/test?tinyInt1isBit=false). Another solution would be to explicitly override the column mapping for the datatype TINYINT(1) column. For example, if the column name is foo, then pass the following option to Sqoop during import: --map-column-hive foo=tinyint. In the case of non-Hive imports to HDFS, use --map-column-java foo=integer.
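
To be concrete, here is roughly how I understood applying each suggestion to my import, using foo as a stand-in for the actual column name, as in the doc's example:

# Suggestion 1: stop the MySQL JDBC driver from mapping TINYINT(1) to BIT
sqoop import \
  --connect "jdbc:mysql://localhost.localdomain:3306/databasename?tinyInt1isBit=false" \
  --username root \
  -P \
  --table sales_orders

# Suggestion 2: override the Java type mapping for the TINYINT(1) column
# (note: I believe --map-column-java expects the Java type name, i.e. Integer
# with a capital I, rather than the lowercase "integer" shown in the docs)
sqoop import \
  --connect jdbc:mysql://localhost.localdomain:3306/databasename \
  --username root \
  -P \
  --table sales_orders \
  --map-column-java foo=Integer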
I tried both of those suggestions (roughly as above), but neither worked for me.
Can anyone help me out with this?