I am using STS (Spring Tool Suite). I launched a Scala prompt by choosing "Create Scala Interpreter in demo-spark-scala-app" and then executed the following commands line by line (shown below). When I ran `val sc = new SparkContext(conf)` to create the SparkContext, I got this error:

"16/12/08 16:08:08 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 449314816 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration."
scala> import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import com.typesafe.config.ConfigFactory
import org.apache.hadoop.fs._
scala> val conf = new SparkConf().setAppName("Average Revenue -Daily").setMaster("local")
conf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@7fac631b
scala> val sc = new SparkContext(conf)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/12/08 16:08:04 INFO SparkContext: Running Spark version 2.0.2
16/12/08 16:08:05 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/12/08 16:08:05 WARN Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 192.168.44.133 instead (on interface ens33)
16/12/08 16:08:05 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
16/12/08 16:08:06 INFO SecurityManager: Changing view acls to: fedora
16/12/08 16:08:06 INFO SecurityManager: Changing modify acls to: fedora
16/12/08 16:08:06 INFO SecurityManager: Changing view acls groups to:
16/12/08 16:08:06 INFO SecurityManager: Changing modify acls groups to:
16/12/08 16:08:06 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(fedora); groups with view permissions: Set(); users with modify permissions: Set(fedora); groups with modify permissions: Set()
16/12/08 16:08:07 INFO Utils: Successfully started service 'sparkDriver' on port 36177.
16/12/08 16:08:07 INFO SparkEnv: Registering MapOutputTracker
16/12/08 16:08:08 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 449314816 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:212)
at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:194)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:308)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:165)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:420)
at $line5.$read$$iw$$iw$$iw$$iw$.&lt;init&gt;(&lt;console&gt;:18)
at $line5.$read$$iw$$iw$$iw$$iw$.&lt;clinit&gt;(&lt;console&gt;)
at $line5.$eval$.$print$lzycompute(&lt;console&gt;:7)
at $line5.$eval$.$print(&lt;console&gt;:6)
at $line5.$eval.$print(&lt;console&gt;)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:415)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:923)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:74)
at scala.tools.nsc.MainGenericRunner.run$1(MainGenericRunner.scala:87)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:98)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:103)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
16/12/08 16:08:08 INFO SparkContext: Successfully stopped SparkContext
java.lang.IllegalArgumentException: System memory 449314816 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:212)
at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:194)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:308)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:165)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:420)
… 32 elided
scala>
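Based on the error message, this is how I understand the suggested fix would look in code. This is only a sketch: since the interpreter's JVM is already running, its heap (-Xmx) is fixed, so setting spark.driver.memory on the SparkConf may not actually take effect here.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// The threshold 471859200 bytes is 450 MB: Spark 2.x reserves 300 MB and
// requires at least 1.5x that amount of usable heap on the driver
// (449314816 bytes, roughly 428 MB, falls just short of it).
val conf = new SparkConf()
  .setAppName("Average Revenue -Daily")
  .setMaster("local")
  // Request more driver memory; in an already-launched REPL JVM this
  // setting alone may be ignored, and the heap must be raised instead.
  .set("spark.driver.memory", "1g")

val sc = new SparkContext(conf)
```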
I have also made some changes based on suggestions I found on Google, such as edits to STS.ini and to the spark-defaults.conf file, but nothing changed. I have uploaded a screenshot here as well.
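For reference, the kind of entry I understand spark-defaults.conf should contain (the 1g value is just an example, not something the error message prescribes):

```
spark.driver.memory    1g
```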
Please help.