Spark Streaming: SparkConf is not a member of package org.apache.spark

#1

Hi,

I am getting the below error while importing the package:
[screenshot of the error: SparkConf is not a member of package org.apache.spark]
Could someone please help me?

Thanks,

#2

This happens whenever spark-shell is not launched successfully, or when the Spark core library is not included in a self-contained application. Can you check whether spark-shell has launched successfully? Or let me know how you launched it.
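
For example, once spark-shell is up, this import should succeed at the prompt (if it fails even there, the shell did not start with the Spark classpath):

scala> import org.apache.spark.SparkConf
import org.apache.spark.SparkConf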

#3

Hi Jagjit,

I am able to launch spark-shell successfully using the below command.
spark-shell --master yarn --conf spark.ui.port=12456
[screenshot of the spark-shell session]

Also, how can we include the Spark core library in a self-contained application?
Could you please provide the steps for this?

#4

Add this line to build.sbt:
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.3"
(1.6.3 is an old version; you can use a newer one such as 2.4.1. Note that Spark 2.4.x is built for Scala 2.11/2.12, so the artifact would then be spark-core_2.11 or spark-core_2.12.)
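
For reference, a minimal build.sbt might look like the sketch below; the project name, Scala version, and Spark version here are assumptions and should be matched to your cluster:

// build.sbt: minimal sketch; name and versions are placeholders
name := "spark-app"
version := "1.0"
scalaVersion := "2.11.12"
// %% appends the Scala binary version, so this resolves to spark-core_2.11
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.1"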

Then run the sbt package command from the Scala project directory; it will download all the required Spark jars.
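
Once the dependency is in place, the app can build its own SparkConf instead of relying on the shell. A minimal self-contained sketch (the SimpleApp object name and the job itself are just placeholders):

import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // App name is arbitrary; the master is normally supplied by spark-submit
    val conf = new SparkConf().setAppName("SimpleApp")
    val sc = new SparkContext(conf)
    // Trivial job just to confirm the classpath is correct
    val evens = sc.parallelize(1 to 100).filter(_ % 2 == 0).count()
    println(s"Even count: $evens")
    sc.stop()
  }
}

After sbt package, submit the jar with something like spark-submit --class SimpleApp --master yarn target/scala-2.11/spark-app_2.11-1.0.jar (the jar path follows from the assumed build.sbt above).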
