Spark Streaming fails with "Unable to load YARN support"

apache-spark

#1

I launched the sbt console and ran the commands below:

import org.apache.spark.SparkConf
import org.apache.spark.streaming._
val conf = new SparkConf().setAppName("streaming").setMaster("yarn-client")
val ssc = new StreamingContext(conf, Seconds(10))

When I try to create the StreamingContext, I get the following error:

org.apache.spark.SparkException: Unable to load YARN support

My build.sbt looks like this:

name := "retail"
version := "1.0"
scalaVersion := "2.10.6"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.3"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.6.3"

Can you please help?

Thanks
Rahul Gangadharan


#2

@Rahul_Gangadharan

You have to set the master to local instead of yarn-client. Your build.sbt only pulls in spark-core and spark-streaming, so the YARN support classes (from the spark-yarn module) are not on the sbt console's classpath, which is why Spark fails with "Unable to load YARN support".
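
Here is a minimal sketch of the corrected session. Note that local[2] is my suggestion rather than what was posted: the Spark Streaming docs recommend local[n] with n greater than 1 when running locally, since each receiver occupies one thread.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Run locally with 2 threads: one can serve a receiver, one does the processing.
// "local[2]" is a suggested value, not from the original post.
val conf = new SparkConf()
  .setAppName("streaming")
  .setMaster("local[2]")

val ssc = new StreamingContext(conf, Seconds(10))

If you really do need yarn-client, then (as far as I know) you would also have to add the spark-yarn module to build.sbt (org.apache.spark % "spark-yarn_2.10" % "1.6.3") and have HADOOP_CONF_DIR or YARN_CONF_DIR pointing at your cluster configuration.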


#3

@Sunil_Itversity: Thanks!!! That worked. Appreciate your help!


closed #4