Issue Reading sequence file in Spark

spark-shell
sqoop
spark

#1

Hi,

I have an issue reading a sequence file after doing a Sqoop import. Below is the list of steps I have done.

When reading the sequence file, Spark reports that the type products is not found. I am not sure how to use the products.jar file created during the Sqoop import to read the sequence file.

I ran a Sqoop import and saved the data as a sequence file. Below is the command I used.

sqoop import --connect jdbc:mysql://nn01.itversity.com/retail_db \
  --username retail_dba \
  --password itversity \
  --table products \
  --as-sequencefile \
  --target-dir '/user/satishp38/sqoop201806/products_import_using_assequencefile' \
  --delete-target-dir
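(For context: Sqoop generates a products.java and a compiled products.jar as part of the import, and by default the jar lands under a per-run temp directory. In case it helps, here is a sketch of how the jar can be located or regenerated into a fixed location; the output directories below are my own choice, not part of the import above.)

```shell
# The jar generated during the import usually lands in a per-run temp dir
# (the hash-named subdirectory varies between runs):
ls /tmp/sqoop-$USER/compile/*/products.jar

# Alternatively, regenerate the class/jar into a fixed directory:
sqoop codegen --connect jdbc:mysql://nn01.itversity.com/retail_db \
  --username retail_dba \
  --password itversity \
  --table products \
  --outdir /tmp/sqoop-gen \
  --bindir /tmp/sqoop-gen
```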

hadoop fs -cat /user/satishp38/sqoop201806/products_import_using_assequencefile/part-00000 | head -10

SEQ!org.apache.hadoop.io.LongWritableproducts

I have the products.jar that was created when the Sqoop import ran, and I have moved this jar file to HDFS.

Location for jar file: /user/satishp38/Data/products.jar
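From what I have read, spark-shell's --jars option expects a path on the local filesystem rather than on HDFS, so my plan is roughly the following (the local /tmp path is my assumption):

```shell
# Pull the Sqoop-generated jar from HDFS back to the local filesystem
hadoop fs -get /user/satishp38/Data/products.jar /tmp/products.jar

# Start spark-shell with the jar on the classpath so the generated
# `products` class can be resolved
spark-shell --jars /tmp/products.jar
```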

Now I am trying to read this sequence file in Spark:

import org.apache.hadoop.io._

val seqraw = sc.sequenceFile[LongWritable, products]("/user/satishp38/sqoop201806/products_import_using_assequencefile/")
<console>:30: error: not found: type products
val seqraw = sc.sequenceFile[LongWritable, products]("/user/satishp38/sqoop201806/products_import_using_assequencefile/")
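Assuming the root cause is just that the generated class is not on the spark-shell classpath, this is the read I expect to work once the shell is started with a local copy of products.jar (e.g. spark-shell --jars /tmp/products.jar). Converting each value to a String straight away is my own addition, since Hadoop reuses the Writable objects and the Sqoop-generated class is not serializable:

```scala
import org.apache.hadoop.io.LongWritable

// `products` is the class Sqoop generated for the table; it only resolves
// when products.jar is on the spark-shell classpath.
val seqraw = sc.sequenceFile[LongWritable, products](
  "/user/satishp38/sqoop201806/products_import_using_assequencefile/")

// Convert to plain Strings before collecting or caching: Hadoop reuses the
// Writable instances, and the generated class is not Serializable.
val rows = seqraw.map { case (_, value) => value.toString }
rows.take(10).foreach(println)
```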

I have gone through the itversity forums and could not find a solution for reading a Sqoop-imported sequence file.