Not able to create SparkContext object - Getting error


#1

I am getting the error shown below:

scala> import org.apache.spark.{SparkContext,SparkConf}
import org.apache.spark.{SparkContext, SparkConf}

scala> val conf = new SparkConf().setMaster("local").setAppName("Daily Revenue")
conf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@25a2401f

scala> val sc = new SparkContext(conf)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/04/03 20:50:26 INFO SparkContext: Running Spark version 1.6.3
java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at org.apache.spark.util.TimeStampedWeakValueHashMap.&lt;init&gt;(TimeStampedWeakValueHashMap.scala:42)
at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:298)
… 42 elided
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
… 44 more

scala>


#2

Which version of Scala are you using? With Spark 1.6.x, you need Scala 2.10. That NoClassDefFoundError on scala.collection.GenTraversableOnce$class is the classic symptom of a Scala binary-compatibility mismatch: the Spark 1.6.3 jars are compiled against Scala 2.10, and the class layout changed in newer Scala releases.
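
For a quick check, print the Scala version inside the same REPL before constructing the SparkContext; with Spark 1.6.x it should report 2.10.x:

scala> util.Properties.versionString

If you build a project instead of working in the REPL, the same rule applies to the build file. A minimal sketch of a build.sbt pinning compatible versions (the exact patch versions here are illustrative assumptions, not your setup):

// build.sbt -- pin the Scala binary version that Spark 1.6.3 was built for
scalaVersion := "2.10.6"

// %% appends the Scala binary version, resolving to spark-core_2.10
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.3"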


Do you want to avoid these incompatibility issues and accelerate your preparation for certification? Use our Big Data cluster.

  • Click here for access to a state-of-the-art 13-node Hadoop and Spark cluster