I started Spark from the terminal using this command (note that each configuration property needs its own `--conf` flag):

spark-shell --conf spark.ui.port=22322 --conf spark.port.maxRetries=100 --master yarn-client
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
Spark context available as sc.
SQL context available as sqlContext.
I'm trying to read a file with this:

val c = sc.textfile("/home/samspark/Sam/largedeck.txt")
but instead I get this error:
:27: error: value textfile is not a member of org.apache.spark.SparkContext
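The cause of this error is capitalization: Scala is case-sensitive, and the method on `org.apache.spark.SparkContext` is `textFile` (camelCase), not `textfile`. A minimal sketch of the corrected call, using the same path from the question:

```scala
// Correct method name: textFile (camelCase). sc is the SparkContext
// that spark-shell creates automatically.
val c = sc.textFile("/home/samspark/Sam/largedeck.txt")

// Note: with --master yarn-client, a bare path like the one above is
// resolved against HDFS by default. To read from the local filesystem
// instead, prefix the path with file:// (the file must then be present
// on the driver and on every executor node):
// val local = sc.textFile("file:///home/samspark/Sam/largedeck.txt")

// textFile is lazy; an action such as count() actually triggers the read.
c.count()
```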