Unable to load CSV file in Spark SQL

I am unable to load a CSV file using Spark SQL; I am getting a ClassNotFoundException.

I have used the below command:

 val csvsrc = sqlContext.read.format("com.databricks.spark.csv").
      | option("header", "true").
      | option("inferSchema", "true").
      | load("/user/esakkipillai/sparkSql/Resources/cars.csv")

Please refer to the screenshot below.

Please let me know what I am missing.

Try importing com.databricks.spark.csv._ before running the script.
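As a minimal sketch of that suggestion inside spark-shell (assuming the spark-csv jar is already on the classpath; the path and options are the ones from the post above):

```scala
// Inside spark-shell: bring the spark-csv implicits into scope,
// then read the CSV through the databricks data source.
import com.databricks.spark.csv._

val csvsrc = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "true")       // first line holds column names
  .option("inferSchema", "true")  // extra pass to guess column types
  .load("/user/esakkipillai/sparkSql/Resources/cars.csv")
```

Note that the import alone only adds convenience implicits; if the spark-csv jar itself is not on the classpath, the ClassNotFoundException will persist.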

Hi @N_Chakote,

Thanks for the quick response.

I have tried importing it, but I am getting the below error.

Any other possible solution?

Kindly post your spark-shell launch command, e.g. spark-shell --master yarn … Are you adding the spark-csv package while launching? And are you doing this on the labs or a VM?

Can you please try the below?

import com.databricks.spark.csv._
val csvsrc = sqlContext.read.format("csv").
     | option("header", "true").
     | option("inferSchema", "true").
     | load("/user/esakkipillai/sparkSql/Resources/cars.csv")

@esak Try import org.apache.spark.sql.SQLContext and then use your command. It should work.

@N_Chakote, I am using the itversity lab.

I have started the Spark shell using the below command:

spark-shell --conf spark.ui.port=22322 --master yarn-client

In the Spark shell I try to import the below:

import com.databricks.spark.csv._

Please let me know if we need to add any additional jars. If so, please share the commands as well.


While launching spark-shell, use --packages com.databricks:spark-csv_2.11:1.3.0 and then import.
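Putting the pieces together, a sketch of the full launch command (assuming the lab has internet access to Maven Central; the UI port and package version are taken from the posts above):

```shell
# Launch spark-shell with the spark-csv package added to the classpath;
# Spark resolves and downloads the jar from Maven Central at startup.
spark-shell --conf spark.ui.port=22322 --master yarn-client \
  --packages com.databricks:spark-csv_2.11:1.3.0
```

Once the shell starts with the package resolved, the original sqlContext.read.format("com.databricks.spark.csv") command should work without a ClassNotFoundException.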

Thanks @N_Chakote, I am able to load the databricks jar after launching the Spark shell using the above command.
