# Spark program executed in spark-shell
scala> sc.parallelize(List(5,4,6,8,9)).collect.foreach(println)
16/12/16 15:38:53 INFO spark.SparkContext: Starting job: collect at <console>:22
16/12/16 15:38:53 INFO scheduler.DAGScheduler: Got job 0 (collect at <console>:22) with 1 output partitions
16/12/16 15:38:53 INFO scheduler.DAGScheduler: Final stage: ResultStage 0(collect at <console>:22)
16/12/16 15:38:53 INFO scheduler.DAGScheduler: Parents of final stage: List()
16/12/16 15:38:53 INFO scheduler.DAGScheduler: Missing parents: List()
16/12/16 15:38:53 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (ParallelCollectionRDD[0] at parallelize at <console>:22), which has no missing parents
16/12/16 15:38:53 INFO storage.MemoryStore: ensureFreeSpace(1272) called with curMem=0, maxMem=560497950
16/12/16 15:38:53 INFO storage.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1272.0 B, free 534.5 MB)
16/12/16 15:38:53 INFO storage.MemoryStore: ensureFreeSpace(843) called with curMem=1272, maxMem=560497950
16/12/16 15:38:53 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 843.0 B, free 534.5 MB)
16/12/16 15:38:53 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:35249 (size: 843.0 B, free: 534.5 MB)
16/12/16 15:38:53 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:861
16/12/16 15:38:53 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (ParallelCollectionRDD[0] at parallelize at <console>:22)
16/12/16 15:38:53 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
16/12/16 15:38:53 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0,PROCESS_LOCAL, 2045 bytes)
16/12/16 15:38:53 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
16/12/16 15:38:53 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 918 bytes result sent to driver
16/12/16 15:38:53 INFO scheduler.DAGScheduler: ResultStage 0 (collect at <console>:22) finished in 0.085 s
16/12/16 15:38:53 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 72 ms on localhost (1/1)
16/12/16 15:38:53 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/12/16 15:38:53 INFO scheduler.DAGScheduler: Job 0 finished: collect at <console>:22, took 0.414255 s
5
4
6
8
9
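# The flood of INFO lines above comes from Spark's default log4j settings, not
# from the program itself. A quick way to quiet them inside the shell is
# SparkContext.setLogLevel (a sketch, assuming Spark 1.4 or later, where this
# method is available):
scala> sc.setLogLevel("WARN")
# After this, only warnings and errors are logged, so a rerun of the same
# parallelize/collect line prints just the five values without the scheduler
# chatter.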
# Contents of build.sbt and .classpath

# build.sbt
name := "Spark Logs Analyzer with Spark 2.0.1"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.1"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
# .classpath
<classpath>
  <classpathentry kind="src" path="src\main\scala"/>
  <classpathentry kind="con" path="org.scala-ide.sdt.launching.SCALA_CONTAINER"/>
  <classpathentry sourcepath="C:\Users\farhan.misarwala\.ivy2\cache\mysql\mysql-connector-java\srcs\mysql-connector-java-5.1.24-sources.jar" kind="lib" path="C:\Users\farhan.misarwala\.ivy2\cache\mysql\mysql-connector-java\jars\mysql-connector-java-5.1.24.jar"/>
  <classpathentry sourcepath="C:\Users\farhan.misarwala\.ivy2\cache\org.apache.spark\spark-core_2.11\srcs\spark-core_2.11-1.6.2-sources.jar" kind="lib" path="C:\Users\farhan.misarwala\.ivy2\cache\org.apache.spark\spark-core_2.11\jars\spark-core_2.11-1.6.2.jar"/>
</classpath>