Trying to execute a program in Scala IDE using Eclipse

apache-spark
scala

#1

Hi,

I am trying to run the Scala program below:

package com.devinline.spark

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD.rddToPairRDDFunctions

object WordCount {
  def main(args: Array[String]): Unit = {

    // Start the Spark context
    val conf = new SparkConf()
      .setAppName("WordCount1")
      .setMaster("local")
    val sc = new SparkContext(conf)

    // Read an example file into a test RDD
    val test = sc.textFile("C:/Users/solas/Desktop/smaa.txt")

    test.flatMap { line =>          // for each line
      line.split(" ")               // split the line word by word
    }
      .map { word =>                // for each word
        (word, 1)                   // return a key/value tuple, with the word as key and 1 as value
      }
      .reduceByKey(_ + _)           // sum all of the values with the same key
      .saveAsTextFile("output.txt") // save to a text file

    // Stop the Spark context
    sc.stop()
  }
}
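
As a side note, saveAsTextFile("output.txt") seems to create a directory named output.txt containing part files, and to fail if that directory already exists. For quick local checks, I think the counts could also be collected and printed to the console instead of written to disk; a minimal sketch, assuming the same input file (WordCountConsole is just a placeholder name):

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object WordCountConsole {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("WordCountConsole").setMaster("local"))

    sc.textFile("C:/Users/solas/Desktop/smaa.txt")
      .flatMap(line => line.split(" "))  // split each line into words
      .map(word => (word, 1))            // pair each word with a count of 1
      .reduceByKey(_ + _)                // sum the counts per word
      .collect()                         // bring results to the driver (fine for a small file)
      .foreach(println)                  // print each (word, count) pair

    sc.stop()
  }
}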

I am receiving the following error:

Exception in thread "main" java.lang.IllegalArgumentException: Not enough arguments: missing class name.
	at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(CommandBuilderUtils.java:222)
	at org.apache.spark.launcher.Main.main(Main.java:51)
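
From the stack trace, the exception seems to come from Spark's launcher (org.apache.spark.launcher.Main), which refuses to start when it is not given the name of a class to run, so I suspect my Eclipse run configuration is invoking Spark's launcher rather than the WordCount object itself. For reference, here is a rough sketch of how a main class gets supplied when using the launcher API directly; the jar path is a placeholder and this assumes SPARK_HOME is set (it is not code from the tutorial):

import org.apache.spark.launcher.SparkLauncher

object LaunchWordCount {
  def main(args: Array[String]): Unit = {
    val process = new SparkLauncher()
      .setAppResource("target/wordcount.jar")        // placeholder path to the application jar
      .setMainClass("com.devinline.spark.WordCount") // the main class the launcher complains is missing
      .setMaster("local")
      .launch()                                      // spawns spark-submit as a child process
    process.waitFor()                                // wait for the application to finish
  }
}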

Here is the link I followed for setting up and running the program:

http://www.devinline.com/2016/01/apache-spark-setup-in-eclipse-scala-ide.html

Can anyone help me figure out what is causing this error?

Do I need to include anything in main?

Any help would be appreciated. @mayank2711 @vinodnerella @BaLu_SaI @dgadiraju