SQLContext issue in Spark


#1

I am getting an error while using SQLContext (please find the attached screenshot).

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{DataFrame, SQLContext}

object OrderRevenue {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().
      setAppName("order Revenue for ").
      setMaster("local")  // plain ASCII quotes; “smart” quotes will not compile
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // read.format("csv") already returns a DataFrame, so toDF() is not needed;
    // reuse the sqlContext created above instead of constructing a second one
    val df: DataFrame = sqlContext.read.
      format("csv").
      option("header", "true").
      load("C:/Users/Rajesh/Desktop/LCA_FY2010.csv")
    df.write.parquet("C:/Users/Rajesh/Sales.parquet")
  }
}

![Capture|690x192](upload://htcUgQmhjGVDVVtzuLlhX5qgd9t.png)
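As a side note, if you are on Spark 2.0 or later, `SQLContext` has been superseded by `SparkSession`, which wraps both `SparkContext` and `SQLContext`. A minimal sketch of the same job using that entry point (the file paths are the ones from the post above; the object name is just a placeholder, and this is untested against your data):

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object OrderRevenueSession {
  def main(args: Array[String]): Unit = {
    // SparkSession is the single entry point in Spark 2.x+
    val spark = SparkSession.builder().
      appName("Order Revenue").
      master("local[*]").
      getOrCreate()

    // spark.read ... csv(...) returns a DataFrame directly
    val df: DataFrame = spark.read.
      option("header", "true").
      csv("C:/Users/Rajesh/Desktop/LCA_FY2010.csv")

    df.write.parquet("C:/Users/Rajesh/Sales.parquet")

    spark.stop()
  }
}
```

This avoids creating a `SparkContext` and `SQLContext` separately, and `spark.read.csv` handles the CSV format without the `format("CSV")` call.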

#2

What is the error you are facing?


#3

Please find it here.