Unable to write to local file system

I am unable to write any file to the local file system from the spark2 shell (the code follows the logs).

Please advise urgently.

19/08/13 12:08:02 ERROR SparkHadoopWriter: Aborting job job_20190813120801_0010.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 8 in stage 2.0 failed 4 times, most recent failure: Lost task 8.3 in stage 2.0 (TID 16, wn02.itversity.com, executor 4): java.io.IOException: Mkdirs failed to create file:/home/vramakrishnan3/mytemp/text/_temporary/0/_temporary/attempt_20190813120801_0010_m_000008_3 (exists=false, cwd=file:/hadoop/yarn/local/usercache/vramakrishnan3/appcache/application_1565300265360_1918/container_1565300265360_1918_01_000005)

Code executed:

import spark.implicits._
import org.apache.spark.sql._

val inputFileContentRDD = sc.textFile("/public/crime/csv")
val header = inputFileContentRDD.first
val inpDataWithoutHeaderRDD = inputFileContentRDD.filter(line => line != header)

val inpDataWithoutHeaderRDDTop1000 = sc.parallelize(inpDataWithoutHeaderRDD.take(1000).toList)

val crimeItemsRDD = inpDataWithoutHeaderRDDTop1000.map(line => {
  // Split each line once instead of once per field
  val fields = line.split(",")
  (fields(0).toInt, fields(1), fields(2), fields(4))
})

val crimeItemDF = crimeItemsRDD.toDF("id", "case_number", "date", "type")

crimeItemDF.write.format("json").mode(SaveMode.Overwrite).save("file:///home/vramakrishnan3/mytemp/json")
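For context on the `Mkdirs failed` error: the save runs on the YARN executors, so a `file:///` path is resolved on each worker node's local disk, where `/home/vramakrishnan3/mytemp` likely does not exist. One workaround, assuming the result is small enough to fit in driver memory, is to collect the rows to the driver and write them there with plain Scala I/O. A minimal sketch of the driver-side write (the sample rows stand in for `crimeItemDF.collect()`, and the JSON-lines layout mirrors what Spark's json writer produces):

```scala
import java.nio.file.{Files, Paths}
import scala.collection.JavaConverters._

// Hypothetical sample rows standing in for crimeItemDF.collect()
val rows = List((1, "HY100", "01/01/2015", "THEFT"))

// Render each row as one JSON object per line (JSON-lines format)
val jsonLines = rows.map { case (id, caseNumber, date, crimeType) =>
  s"""{"id":$id,"case_number":"$caseNumber","date":"$date","type":"$crimeType"}"""
}

// Write on the driver, where the local home directory actually exists
val out = Files.createTempFile("crime", ".json")
Files.write(out, jsonLines.asJava)
```

Alternatively, saving to an HDFS path and copying it down with `hdfs dfs -get` avoids the local-filesystem issue entirely.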
