Externalize properties - typesafe config

Originally published at: http://www.itversity.com/topic/externalize-properties-typesafe-config/

So far we have been running the job in local mode, and hardcoded at that. The execution mode typically differs between environments, so it is a good idea to externalize it (using a 3rd-party library – typesafe config). Update build.sbt with the dependency for typesafe config. Create application.properties under src/main/resources. To run the application in local…
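As a minimal sketch of the setup the excerpt describes (the dependency version, property keys, and environment names below are illustrative assumptions, not taken from the original):

```scala
// build.sbt – add the typesafe config dependency (version is an assumption)
libraryDependencies += "com.typesafe" % "config" % "1.2.1"
```

```properties
# src/main/resources/application.properties – one execution mode per environment
# (the key name "execution.mode" and the environment prefixes are assumptions)
dev.execution.mode = local
prod.execution.mode = yarn-client
```

The first program argument (e.g. `dev`) then selects which environment's properties are read at runtime.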

I have followed the full video and ran the job using spark-submit:

spark-submit --class wordcount.WordCount /home/hduser/IdeaProjects/sands/target/scala-2.10/sands_2.10-1.0.jar dev /home/hduser/data/try.txt /home/hduser/data/make

but I am getting the error java.lang.ClassNotFoundException: wordcount.WordCount.
I am attaching the screenshot.

The code I wrote in IntelliJ is as follows:
// spark-submit is given the class wordcount.WordCount, so the object must
// live in the wordcount package – a missing package declaration is a common
// cause of the ClassNotFoundException above
package wordcount

import com.typesafe.config.{Config, ConfigFactory}
import org.apache.hadoop.fs._
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {

  def main(args: Array[String]) = {
    val executionEnvironment = args(0)
    val props: Config = ConfigFactory.load()
    val conf = new SparkConf().
      setAppName("Word Count").
      // assumes application.properties defines e.g. dev.execution.mode = local
      setMaster(props.getConfig(executionEnvironment).getString("execution.mode"))
    val sc = new SparkContext(conf)

    val fs = FileSystem.get(sc.hadoopConfiguration)

    val inputPath = args(1)
    val outputPath = args(2)

    if (!fs.exists(new Path(inputPath))) {
      println("Input path does not exist")
    } else {
      if (fs.exists(new Path(outputPath)))
        fs.delete(new Path(outputPath), true)
      sc.textFile(inputPath).
        flatMap(rec => rec.split(" ")).
        // replace(",", "") discards punctuation marks while performing word count
        map(rec => (rec.replace(",", ""), 1)).
        reduceByKey(_ + _).
        map(rec => rec.productIterator.mkString("\t")).
        // alternative: map(rec => rec._1 + "\t" + rec._2).
        saveAsTextFile(outputPath)
    }
  }
}

Please help me solve it.