Not able to read a text file in spark-shell



Hello everyone, I'm trying to use the spark-shell console directly to read a file from my local file system as a DataFrame in Spark. Unfortunately, I don't know the right way to read it. Could you please give me a hand?

My code:

//Define the class to map customers coming from the data input
case class customer(cusid: Int, name: String, city: String, province: String, postalcode: String)

//spark context
val sqlContext = new org.apache.spark.sql.SQLContext(sc)

//load the file info
val df ="txt").load("/home/ingenieroandresangel/scalascripts/customer.txt")


java.lang.ClassNotFoundException: Failed to find data source: txt. Please find packages at
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:77)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:102)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:109)

Caused by: java.lang.ClassNotFoundException: txt.DefaultSource
at java.lang.ClassLoader.loadClass(
at java.lang.ClassLoader.loadClass(
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)

I have tried :

val df ="text").load("/home/ingenieroandresangel/scalascripts/customer.txt")

But I get the same error.

Thanks so much, guys.


Spark by default assumes the file system is HDFS. If you want to read from the local file system, read it as below :slight_smile:

sc.textFile("file:///path to the file/")
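Note that `sc.textFile` returns an `RDD[String]`, so each line still has to be parsed into the case class before it becomes a DataFrame. Here is a minimal sketch of that parsing step; the comma-separated layout of `customer.txt` and the sample lines are assumptions, since the question doesn't show the file's contents:

```scala
// Assumption: one customer per line, comma-separated, fields matching
// the case class from the question.
case class Customer(cusid: Int, name: String, city: String, province: String, postalcode: String)

// Parse one line into a Customer; .trim guards against stray whitespace.
def parseCustomer(line: String): Customer = {
  val f = line.split(",").map(_.trim)
  Customer(f(0).toInt, f(1), f(2), f(3), f(4))
}

// In spark-shell the same function would be applied per line, e.g.:
//   import sqlContext.implicits._
//   val df = sc.textFile("file:///home/ingenieroandresangel/scalascripts/customer.txt")
//              .map(parseCustomer)
//              .toDF()

// Stand-in for sc.textFile(...) so this sketch runs outside Spark
// (the sample rows are made up for illustration):
val lines = List("1,Alice,Toronto,ON,M5V 1A1", "2,Bob,Vancouver,BC,V6B 2W9")
val customers =
println(customers.head)
```

With `import sqlContext.implicits._` in scope, `.toDF()` on the mapped RDD gives the DataFrame the question was after.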