Error when trying the code below


I am getting the following error when trying the code below. Please help me figure out where I am going wrong…
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
val conf = new SparkConf().setAppName("Simple Application").setMaster("yarn-client")
val sc = new SparkContext(conf)
val orders = sc.textFile("/public/retail_db/orders")
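Note that `sc.textFile` is lazy: it only records the path, so a stage failure like the one below surfaces only when an action runs against the RDD. A minimal way to force the read in the same session (hypothetical follow-up, not part of the original snippet):

```scala
// sc.textFile only records the path; nothing is read yet (lazy evaluation).
val orders = sc.textFile("/public/retail_db/orders")

// An action such as first() or count() triggers the actual read on the
// executors, which is where a missing or unreadable path shows up as a
// task failure like the one reported below.
orders.first()
```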

ERROR TaskSetManager: Task 0 in stage 1.0 failed 4 times; aborting job
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, most recent failure: Lost task 0.3 in stage 1.0 (TID 7, wn05.itversity

Thank you





I have tried this code in the labs and it works fine. Are you running the code on your local system?
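If you are running outside the lab cluster, `setMaster("yarn-client")` will fail because there is no YARN to connect to, and the HDFS path `/public/retail_db/orders` will not exist. A minimal local-mode sketch to isolate the problem (the file path here is a placeholder you would replace with a file on your own machine):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// local[*] runs Spark in-process on all cores; no YARN or HDFS required
val conf = new SparkConf().setAppName("Simple Application").setMaster("local[*]")
val sc = new SparkContext(conf)

// file:// forces the local filesystem; replace with a real path on your machine
val orders = sc.textFile("file:///path/to/orders")
println(orders.count())

sc.stop()
```

If this runs cleanly on your machine but the original code fails, the issue is the cluster environment (master URL or HDFS path), not the code itself.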