Problem reading an Avro file in Spark

Hi Folks,
@itversity @perraju ,

Could you please help me with handling the Avro file format in Spark?
I am not able to read an Avro file in Spark.
br,
jonuchauhan

Please refer to the post below.
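For anyone hitting the same Avro issue, here is a minimal sketch of reading an Avro file in a Spark 1.x shell, assuming the Databricks spark-avro package is on the classpath (the package version and the input path below are only illustrative):

// Launch the shell with the Avro data source on the classpath, e.g.:
//   spark-shell --packages com.databricks:spark-avro_2.10:2.0.1   (version is illustrative)

// Read the Avro file into a DataFrame through the spark-avro data source.
val deptAvro = sqlContext.read.format("com.databricks.spark.avro").load("/user/jonuchauhan/avro/dep")   // illustrative path
deptAvro.show(5)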

Hi @N_Chakote,

Thanks a lot for your quick reply; it works now.

One more question: I am able to read a sequence file, but while accessing the data it gives me an error that it is not able to load the class: departments.

Br,
Jonu Chauhan

You mean to say you can read it in Spark with sc.sequenceFile? Are you able to preview the data using take(5)? Can you please elaborate on what you mean by accessing, i.e., what operation are you doing after reading the file?

Yes, when I use take(5) it gives the error: unable to load the class departments.
@N_Chakote
@pratyush04
@itversity
I am using the below command:
val dep = sc.sequenceFile("/user/jonuchauhan/seq/dep", classOf[String], classOf[String])

Please help me with the sequence file scenario. Reading it is easy, but how do I access the data? Could you give me some example code, for example using the departments file at my location?

Br,
Jonu Chauhan

You need to import
import org.apache.hadoop.io._
Then confirm that /user/jonuchauhan/seq/dep has data and that it is in SequenceFile format.
Then to preview, you need to run
sc.sequenceFile("/user/jonuchauhan/seq/dep", classOf[Text], classOf[Text]).map(rec => rec.toString()).collect().foreach(println)
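For context, a minimal round-trip sketch (the path is illustrative) showing how a (Text, Text) sequence file gets written from Spark in the first place, and why classOf[Text] is what you pass back when reading:

import org.apache.hadoop.io._

// saveAsSequenceFile on an RDD of (String, String) pairs stores the keys and
// values as org.apache.hadoop.io.Text in the sequence file header.
val pairs = sc.parallelize(Seq(("1", "Fitness"), ("2", "Footwear")))
pairs.saveAsSequenceFile("/user/jonuchauhan/seq/demo")   // illustrative path

// Reading back: the classes passed here match what the file header records.
val back = sc.sequenceFile("/user/jonuchauhan/seq/demo", classOf[Text], classOf[Text])
back.map(rec => rec.toString()).collect().foreach(println)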

Thanks

Thanks for the quick reply, @N_Chakote. I'll try it, but I guess Spark is down right now.

Br,
jonu chauhan

Hi @N_Chakote,
@itversity

Can somebody help me with this case? I am still getting the same error.

scala> val dep = sc.sequenceFile("/user/jonuchauhan/seq/dep", classOf[Text], classOf[Text]);
dep: org.apache.spark.rdd.RDD[(org.apache.hadoop.io.Text, org.apache.hadoop.io.Text)] = /user/jonuchauhan/seq/dep HadoopRDD[2] at sequenceFile at <console>:36
scala> dep.map(rec => rec.toString()).collect().foreach(println);
17/04/03 06:50:09 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 1)
java.lang.RuntimeException: java.io.IOException: WritableName can't load class: departments;

I have imported
import org.apache.hadoop.mapreduce.lib.output._
import org.apache.hadoop.io._
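
For reference, the "WritableName can't load class: departments" error usually means the sequence file's header records a custom Writable class named departments (for example, a class generated by a Sqoop --as-sequencefile import), and that class is not on Spark's classpath; the classOf[...] arguments passed to sc.sequenceFile do not override what is stored in the file. A small sketch (the part-file path is illustrative) for checking which classes the file was actually written with:

import org.apache.hadoop.fs.Path
import org.apache.hadoop.io.SequenceFile

// Open one part file directly and print the key/value class names stored in its
// header; these are the classes that must be loadable when Spark deserializes
// the records.
val reader = new SequenceFile.Reader(sc.hadoopConfiguration, SequenceFile.Reader.file(new Path("/user/jonuchauhan/seq/dep/part-m-00000")))   // illustrative part file
println(reader.getKeyClassName)
println(reader.getValueClassName)
reader.close()

If the value class turns out to be a generated departments class, the jar containing that class has to be put on the classpath (for example via spark-shell --jars) before the file can be read as-is.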