How to read a sequence file compressed with BZip2Codec

How do I read a sequence file that was compressed with BZip2Codec into Spark?

I am able to create the RDD using:

scala> val custSeqRDD = sc.sequenceFile("/user/cloudera/sqoop_import/customers_seq_bzip2", classOf[LongWritable], classOf[customers])
custSeqRDD: org.apache.spark.rdd.RDD[(org.apache.hadoop.io.LongWritable, customers)] = /user/cloudera/sqoop_import/customers_seq_bzip2 HadoopRDD[0] at sequenceFile at <console>:33

The call above succeeds because the RDD is created lazily, but invoking an action on it throws an error:

scala> custSeqRDD.take(4).foreach(println)
17/06/22 19:41:17 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 192.168.247.131, executor 1): java.lang.RuntimeException: java.io.IOException: WritableName can't load class: customers
at org.apache.hadoop.io.SequenceFile$Reader.getValueClass(SequenceFile.java:2107)
at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:2037)
at org.apache.hadoop.io.SequenceFile$Reader.initialize(SequenceFile.java:1878)
at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1827)
at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1841)
at org.apache.hadoop.mapred.SequenceFileRecordReader.<init>(SequenceFileRecordReader.java:49)
at org.apache.hadoop.mapred.SequenceFileInputFormat.getRecordReader(SequenceFileInputFormat.java:64)
at org.apache.spark.rdd.HadoopRDD$$anon$1.<init>(HadoopRDD.scala:240)
at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:211)
at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:101)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:242)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: WritableName can't load class: customers
at org.apache.hadoop.io.WritableName.getClass(WritableName.java:77)
at org.apache.hadoop.io.SequenceFile$Reader.getValueClass(SequenceFile.java:2105)
... 17 more
Caused by: java.lang.ClassNotFoundException: Class customers not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
at org.apache.hadoop.io.WritableName.getClass(WritableName.java:75)
... 18 more
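For context on what the trace is saying: a sequence file records its key and value class names in its header, and Hadoop's WritableName resolves the recorded value-class name (here `customers`, the Sqoop-generated record class) by reflection when the file is opened. The ClassNotFoundException means that class is not on the executor classpath; the BZip2 compression itself is handled transparently and is not the problem. A minimal sketch of the lookup that fails, in plain Scala without Spark:

```scala
// Sketch of the reflection lookup Hadoop performs when it reads the
// value-class name out of a sequence file's header. "customers" stands in
// for the class name Sqoop recorded in this file's header.
def resolve(name: String): Option[Class[_]] =
  try Some(Class.forName(name))
  catch { case _: ClassNotFoundException => None }

println(resolve("customers"))                  // None unless the generated jar is on the classpath
println(resolve("java.lang.String").isDefined) // true: a class the JVM can see
```

In practice this usually means the jar containing the Sqoop-generated `customers` class (the output of `sqoop codegen`, path hypothetical) has to be passed to the shell, e.g. `spark-shell --jars /path/to/customers.jar`, so that both the driver and the executors can instantiate it.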
