Facing an issue creating a DataFrame

Hi guys, when I am trying to make a DataFrame from case classes I am getting an error like this:

:38: error: value toDF is not a member of org.apache.spark.rdd.RDD[U]
possible cause: maybe a semicolon is missing before `value toDF'?
}).toDF

I have also created the SQLContext and imported the implicits:

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._

But I am still facing the same issue. I am giving my code below; your help would be highly appreciated. Thanks in advance.

scala>case class Orders(
order_id : Int,
order_date : String,
order_no : Int,
order_status : String)
defined class Orders

scala> val ordersDF = sc.textFile("file:///home/cloudera/Public/orders").
map(rec => {
r = rec.split(",")
Orders(r(0).toInt, r(1) , r(2).toInt , r(3))
}).toDF

The command below is working:

val ordersDF = sc.textFile("/home/cloudera/orders").
map(rec => {
val r = rec.split(",")
Orders(r(0).toInt, r(1) , r(2).toInt , r(3))
}).toDF

In your code, just add the "val" keyword before r = rec.split(","), since this is Scala 🙂
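
For reference, here is a sketch of your code with just that one change applied (same Orders case class and file path as in your question, not re-tested here):

val ordersDF = sc.textFile("file:///home/cloudera/Public/orders").
     map(rec => {
       val r = rec.split(",")   // the missing `val` that declares r
       Orders(r(0).toInt, r(1), r(2).toInt, r(3))
     }).toDF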

Hope this is more readable:

// define case class
case class Orders(
     order_id : Int,
     order_date : String,
     order_no : Int,
     order_status : String)

// create SQLContext object
val sqlContext = new org.apache.spark.sql.SQLContext(sc)

// import spark SQL implicit
import sqlContext.implicits._

// create ordersDF
val ordersDF = sc.textFile("file:///home/cloudera/Public/orders").
     map(_.split(",")).
     map(r => Orders(r(0).toInt, r(1), r(2).toInt, r(3))).
     toDF
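
Once ordersDF is created, you can quickly sanity-check it (assuming the file loaded as expected):

     // print the inferred schema and look at a few rows
     ordersDF.printSchema
     ordersDF.show(5)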

Cheers,
AVR

Thank you so much. Some silly mistakes on my part.