Loading Sequence File data into hive table created using stored as sequence file failing

I'm importing data from MySQL to HDFS as sequence files using the Sqoop import command below:

sqoop import --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" --username retail_dba --password cloudera --table orders --target-dir /user/cloudera/sqoop_import_seq/orders --as-sequencefile --lines-terminated-by '\n' --fields-terminated-by ','

Then I create the Hive table with the command below:
create table orders_seq(order_id int, order_date string, order_customer_id int, order_status string) ROW FORMAT DELIMITED FIELDS TERMINATED BY '|' STORED AS SEQUENCEFILE

But when I try to load the sequence data produced by the first command into the Hive table with:
hive> LOAD DATA INPATH '/user/cloudera/sqoop_import_seq/orders' INTO TABLE orders_seq;

it fails with the error below:
Loading data to table practice.orders_seq
Failed with exception java.lang.RuntimeException: java.io.IOException: WritableName can't load class: orders
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask

Where am I going wrong? Already wasted 6 hours trying to find the issue on my own :frowning:

@TheIndianYoutuber
Please refer below link:
Load Sqoop Sequence Files in Hive


Hi,

Try importing as a sequence file without specifying the line and field terminators. A sequence-file import is a binary-format import where the records are stored as key/value pairs with metadata, so text delimiters don't apply. I'm not sure, but I believe the line and field terminators are not right here; specifying `--as-sequencefile` alone should suffice and handle the format.
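To make the suggestion concrete, here is a sketch of the same import with the terminator flags dropped (same connection string, table, and target directory as in your command; paths and credentials are from your post, the rest is unchanged):

```shell
# Sequence files are binary key/value records, so no line/field terminators:
sqoop import \
  --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
  --username retail_dba --password cloudera \
  --table orders \
  --target-dir /user/cloudera/sqoop_import_seq/orders \
  --as-sequencefile
```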

Thanks.

I tried all the possible combinations: without delimiters, and with different delimiters too… none of them worked :frowning:

@TheIndianYoutuber
With delimiters or without makes no difference, because the serialization Sqoop writes into sequence files and the SerDe Hive uses to read SEQUENCEFILE tables are different. The error `WritableName can't load class: orders` means Hive cannot find the Sqoop-generated record class `orders` that the file's keys/values were serialized with. That's why you need the Hive-Sqoop SerDe; refer to the link above for reference.
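As background on where that `orders` class comes from: Sqoop generates a Java record class per imported table, and `sqoop codegen` can regenerate its jar so it can be put on Hive's classpath. A hedged sketch (the `--outdir`/`--bindir` paths are illustrative, and adding the jar alone may not be sufficient without the SerDe from the link):

```shell
# Regenerate the Sqoop record class jar for the orders table:
sqoop codegen \
  --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
  --username retail_dba --password cloudera \
  --table orders \
  --outdir /tmp/sqoop_codegen --bindir /tmp/sqoop_codegen

# Then, inside the Hive session, make the class visible before loading:
# hive> ADD JAR /tmp/sqoop_codegen/orders.jar;
```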

Sure :slight_smile: Thanks a lot for the link… will check it out.