Scala code execution error in IntelliJ IDEA on local machine

import org.apache.spark.{SparkConf, SparkContext}

object wcc {

  def main(args: Array[String]) = {

    val conf = new SparkConf().setAppName("word count").setMaster("local")
    val sc = new SparkContext(conf)
    val lines = sc.textFile("file:///Users/kumar/Desktop/abc.txt")
    val linesapp = lines.flatMap(line => line.split(" ")).map(x => (x, 1))
    val wc = linesapp.reduceByKey((x, y) => x + y)
  }
}



Error is:

objc[7261]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/bin/java (0x10f3624c0) and /Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/jre/lib/libinstrument.dylib (0x10f3f54e0). One of the two will be used. Which one is undefined.
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:682)
at org.apache.spark.SparkConf$.<init>(SparkConf.scala:539)
at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
at org.apache.spark.SparkConf.set(SparkConf.scala:72)
at org.apache.spark.SparkConf.setAppName(SparkConf.scala:87)
at wcc$.main(wcc.scala:11)
at wcc.main(wcc.scala)
Caused by: java.lang.ClassNotFoundException: scala.Product$class
at java.lang.ClassLoader.loadClass(
at sun.misc.Launcher$AppClassLoader.loadClass(
at java.lang.ClassLoader.loadClass(
… 7 more

Process finished with exit code 1

Which build tool are you using to import library dependencies?
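For reference, a NoClassDefFoundError (or ClassNotFoundException) on scala/Product$class almost always indicates a Scala binary version mismatch: scala.Product$class exists in Scala 2.11 and earlier but was removed in 2.12, so this error typically appears when Spark jars built for Scala 2.10/2.11 run against a 2.12 scala-library. Keeping scalaVersion and the Spark dependency aligned in build.sbt avoids it. A minimal sketch, assuming sbt and Spark 1.6.x (name and versions are illustrative):

```sbt
// scalaVersion and the Spark artifact must target the same Scala binary version;
// %% appends "_2.11" to the artifact name automatically, keeping them in sync.
name := "wcc"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.2"
```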

Hi, I am trying to develop a sample word count program in Scala in IntelliJ locally, but when using arguments it gives me an error. Could you please help me with this?
import org.apache.spark.{SparkConf, SparkContext}
object wcount {
  def main(arg: Array[String]): Unit = {
    val sConf = new SparkConf().setMaster("local").setAppName("Word Count")
    val sc = new SparkContext(sConf)
    System.setProperty("hadoop.home.dir", "C:/winutils")
    // val loadfile = sc.textFile("file:///C:/Users/Atyant/IdeaProjects/WordCount/wordcount.txt")
    val loadfile = sc.textFile(args(0))
    loadfile.flatMap(rec => rec.split(" ")).map(rec => (rec, 1)).
      reduceByKey((agg, value) => agg + value).saveAsTextFile("file:///C:/Users/Atyant/IdeaProjects/WordCount/wcop")
  }
}

Error :-
Error:(11, 27) not found: value args
val loadfile=sc.textFile(args(0))

Sbt file:

name := "WordCount"

version := "1.0"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.2"

Are you passing the file name as an argument while executing?

Yes, in Run => Edit Configurations I am passing the file name under Program arguments. The main problem is that the code itself does not accept args(0). Do I need to add a reference for args or something similar, since it is not able to recognize the keyword args? The whole code works when I hardcode the file name.

You are passing 'arg' as the parameter to the main function, but using the variable 'args' while loading the text file.

Yes, you are right; the issue was purely due to this. Many thanks.
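To make the fix concrete, renaming the parameter (or the usage) so both sides agree is enough. Below is a plain-Scala sketch of the same flatMap/map/reduceByKey pipeline, runnable without Spark; the object and method names are illustrative, not from the original post:

```scala
object LocalWordCount {
  // Parameter is named args, matching its use in the body --
  // main(arg: ...) with args(0) inside fails to compile.
  def main(args: Array[String]): Unit =
    println(wordCount(Seq("to be or not to be")))

  // Collections analogue of the Spark pipeline:
  // flatMap -> map -> reduceByKey becomes flatMap -> map -> groupBy/sum.
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(rec => rec.split(" "))     // split each line into words
      .map(rec => (rec, 1))               // pair each word with a count of 1
      .groupBy(_._1)                      // group by word (the "shuffle" step)
      .map { case (word, pairs) => (word, pairs.map(_._2).sum) } // agg + value
}
```

Running this locally verifies the counting logic before pointing the real job at a file path from args(0).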

I also have one more issue. In the office I tried to set up the environment with IntelliJ, and when I create a new project with Scala and sbt, the project tries to resolve the dependencies and I get the following error:

Error:Error while importing SBT project:

Server access Error: Connection timed out: connect url=

unresolved dependency: org.fusesource.jansi#jansi;1.11: not found
Error during sbt execution: Error retrieving required libraries
(see C:\Users\jainaty\.sbt\boot\update.log for complete log)
Error: Could not retrieve jansi 1.11
Java HotSpot™ 64-Bit Server VM warning: ignoring option MaxPermSize=384M; support was removed in 8.0

See complete log in C:\Users\jainaty\AppData\Local\JetBrains\IntelliJ IDEA Community Edition 2016.2.5\logs\sbt.last.log

Do you have any insights into this issue? It looks to me like HTTPS cannot connect from the IDE, whereas if I browse from the internet directly I am able to download the files.
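One common cause: the browser picks up the OS proxy settings, but sbt's JVM does not, so behind a corporate proxy the resolver times out even though the URLs open fine in a browser. A workaround sketch is to pass the proxy to the sbt launcher as JVM flags, e.g. in IntelliJ's SBT settings under VM parameters (the exact settings path varies by IDE version; host and port below are placeholders, not real values):

```
# Placeholder proxy host/port -- replace with your actual corporate proxy
-Dhttp.proxyHost=proxy.example.com
-Dhttp.proxyPort=8080
-Dhttps.proxyHost=proxy.example.com
-Dhttps.proxyPort=8080
```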

My sbt file:

name := "wc"

version := "1.0"

//scalaVersion := "2.12.1"
scalaVersion := "2.10.4"
//libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.2"
//libraryDependencies += ("org.scala-lang" % "scala-library").%("2.12.1")
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.2"

Looks like the library dependency is wrong. Put the line below in your build and try to run:
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.6.2"
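For context, % uses the artifact name verbatim while %% appends the project's Scala binary version, so under scalaVersion := "2.10.4" the following two lines resolve to the same artifact (a sketch, versions illustrative):

```sbt
// Equivalent when scalaVersion := "2.10.4": %% expands to "_2.10"
libraryDependencies += "org.apache.spark" %  "spark-core_2.10" % "1.6.2"
libraryDependencies += "org.apache.spark" %% "spark-core"      % "1.6.2"
```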

I think it's because of firewalls. You can try the workaround provided here -