Error in SparkStreaming


#1

Hi Team,
I am practicing Spark Streaming, and during execution I get the error below. I have included a write-up of the commands and the file being used. The error says "Unable to load YARN support".

build.sbt has the following code:

name := "retail"
version := "1.0"
scalaVersion := "2.10.6"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.3"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.6.3"

Step 2:
sbt console (runs just fine)

Steps 3, 4, and 5 run fine, but step 6 fails with the error below. PLEASE HELP
3 import org.apache.spark.streaming._
4 import org.apache.spark.SparkConf
5 val conf = new SparkConf().setAppName("Streaming word count").setMaster("yarn-client")
6 val ssc = new StreamingContext(conf, Seconds(10))

******** Error

scala> val ssc = new StreamingContext(conf, Seconds(10))
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/07/10 07:49:47 INFO SparkContext: Running Spark version 1.6.3
18/07/10 07:49:48 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/07/10 07:49:49 INFO SecurityManager: Changing view acls to: niranjanalkari
18/07/10 07:49:49 INFO SecurityManager: Changing modify acls to: niranjanalkari
18/07/10 07:49:49 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(niranjanalkari); users with modify permissions: Set(niranjanalkari)
18/07/10 07:49:49 INFO Utils: Successfully started service 'sparkDriver' on port 43030.
18/07/10 07:49:50 INFO Slf4jLogger: Slf4jLogger started
18/07/10 07:49:50 INFO Remoting: Starting remoting
18/07/10 07:49:50 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@172.16.1.109:57347]
18/07/10 07:49:50 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 57347.
18/07/10 07:49:50 INFO SparkEnv: Registering MapOutputTracker
18/07/10 07:49:50 INFO SparkEnv: Registering BlockManagerMaster
18/07/10 07:49:50 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-fd419b2c-555a-4d06-98e9-6a9c6d65c3d6
18/07/10 07:49:50 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
18/07/10 07:49:50 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Unable to load YARN support
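
For context: this error typically means Spark's YARN support classes are not on the sbt console classpath. If running against a YARN cluster really is the goal, one possible fix is to add the matching spark-yarn artifact to build.sbt. This is a sketch, assuming the same Spark and Scala versions as above and a cluster with the Hadoop configuration (HADOOP_CONF_DIR) available; it is not verified against this particular setup:

```scala
// Possible build.sbt addition: pulls in the org.apache.spark.deploy.yarn classes
// that the "Unable to load YARN support" error reports as missing.
libraryDependencies += "org.apache.spark" % "spark-yarn_2.10" % "1.6.3"
```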


#2

Hi @Niranjan_Alkari

You have to use local mode instead of yarn-client, since the sbt console here does not have YARN support on its classpath.

So change setMaster("yarn-client") to

setMaster("local[2]")

(use at least two threads; with plain "local" there is only one thread, and a streaming receiver would occupy it, leaving none for processing). It will work fine.
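
Putting it together, the whole exercise can run in local mode for a quick sanity check. A minimal sketch, where the socket source host and port are placeholder assumptions (e.g. fed by `nc -lk 9999` on the same machine):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// local[2]: at least two threads -- one for the receiver, one for processing
val conf = new SparkConf()
  .setAppName("Streaming word count")
  .setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(10))

// Hypothetical source: text lines arriving on localhost:9999
val lines = ssc.socketTextStream("localhost", 9999)
val counts = lines.flatMap(_.split("\\s+")).map((_, 1)).reduceByKey(_ + _)
counts.print()

ssc.start()
ssc.awaitTermination()
```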