org.apache.spark.SparkException: Job aborted due to stage failure

Hi All,
I created the products RDD and ran the command below, but the job is aborting. I have attached a screenshot. Please let me know how to resolve this issue.

products.map( rec => (rec.split(",")(4).toFloat,rec)).sortByKey(false).take(5).foreach(println)

The error says there is an issue with the data ("empty String").

This is a code issue. You have to provide the entire script.

Also, there is a record with a data issue that is causing this. Details are provided in the itversity.com course content and are also mentioned in the video.

Thanks @itversity. Can you please let me know the video number (out of 88)?

@Giri - if you are using a comma-delimited file, try filtering out product_id 685. One of its fields has a comma in it, so splitting on commas breaks the record.
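A minimal sketch of that workaround: instead of hard-coding the bad product_id, you can drop any record whose price field does not parse as a float. This assumes `products` is an `RDD[String]` of comma-delimited lines with the price in column 4 (0-based); the use of `scala.util.Try` is my own suggestion, not from the course content.

```scala
import scala.util.Try

val topByPrice = products
  // keep only records whose 5th field parses cleanly as a Float;
  // this drops rows where an embedded comma (e.g. product_id 685)
  // shifts the columns and leaves an empty/invalid price field
  .filter(rec => Try(rec.split(",")(4).toFloat).isSuccess)
  .map(rec => (rec.split(",")(4).toFloat, rec))
  .sortByKey(ascending = false)
  .take(5)

topByPrice.foreach(println)
```

Parsing with `Try` is more robust than filtering a single known-bad id, since it also survives any other malformed rows in the file.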

Thanks @gnanaprakasam, it worked.

@Giri - Good to hear, please close the topic.