Running the jar (created in IntelliJ) on a Spark cluster - issue

Hi,
Here is my build.sbt

name := "IntelliJWorkspace"

version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.2"
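
As a side note, one common way to keep signed dependency jars out of your application jar in the first place is to mark Spark as "provided", since the cluster supplies Spark classes at runtime. A minimal sketch of that build.sbt change (an assumption on my part, not something from the original post):

```scala
// Hypothetical build.sbt tweak: scope spark-core as "provided" so it is not
// bundled into the packaged jar -- the cluster's own Spark installation
// supplies these classes when the job runs under spark-submit.
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.2" % "provided"
```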

Scala Program

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("scala spark").setMaster(args(0))
    val sc = new SparkContext(conf)
    val i = List(1, 2, 3, 4, 5)
    val dataRDD = sc.parallelize(i)
    dataRDD.saveAsTextFile(args(1))
  }
}

The jar is created with IntelliJ, and I get the below error when running it with spark-submit:

[root@pocd-hdp250-manager ~]# spark-submit --class "SimpleApp" \
--master yarn \
--executor-memory 512m \
--total-executor-cores 1 \
SimpleApp.jar SimpleApp.jar
Exception in thread "main" java.lang.SecurityException: Invalid signature file digest for Manifest main attributes
at sun.security.util.SignatureFileVerifier.processImpl(SignatureFileVerifier.java:284)
at sun.security.util.SignatureFileVerifier.process(SignatureFileVerifier.java:238)
at java.util.jar.JarVerifier.processEntry(JarVerifier.java:316)
at java.util.jar.JarVerifier.update(JarVerifier.java:228)
at java.util.jar.JarFile.initializeVerifier(JarFile.java:383)
at java.util.jar.JarFile.getInputStream(JarFile.java:450)
at sun.misc.JarIndex.getJarIndex(JarIndex.java:137)
at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:839)
at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:831)
at java.security.AccessController.doPrivileged(Native Method)
at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:830)
at sun.misc.URLClassPath$JarLoader.&lt;init&gt;(URLClassPath.java:803)
at sun.misc.URLClassPath$3.run(URLClassPath.java:530)
at sun.misc.URLClassPath$3.run(URLClassPath.java:520)
at java.security.AccessController.doPrivileged(Native Method)
at sun.misc.URLClassPath.getLoader(URLClassPath.java:519)
at sun.misc.URLClassPath.getLoader(URLClassPath.java:492)
at sun.misc.URLClassPath.getNextLoader(URLClassPath.java:457)
at sun.misc.URLClassPath.getResource(URLClassPath.java:211)
at java.net.URLClassLoader$1.run(URLClassLoader.java:365)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:175)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

@Seshi, you have to pass the required arguments, such as the master value and the output path, after the jar name.

Sorry. I got the solution.

zip -d &lt;your-jar&gt;.jar 'META-INF/*.RSA' 'META-INF/*.DSA' 'META-INF/*.SF'

My spark-submit call was actually correct, the same as Durga explained. After running the above command on the jar, it's sorted.
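
For anyone hitting the same error: it happens when signed dependency jars get merged into your application jar, leaving stale META-INF signature files that no longer match the repackaged contents. Stripping them with zip works after the fact; if you build a fat jar with the sbt-assembly plugin instead, you can discard those files at build time. A sketch, assuming sbt-assembly is enabled in your project (this config is not from the original post):

```scala
// Hypothetical build.sbt snippet, assuming the sbt-assembly plugin is added
// to project/plugins.sbt. Discarding META-INF entries from dependency jars
// while merging prevents the assembled jar from carrying stale .SF/.DSA/.RSA
// signature files that trigger the SecurityException at launch.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}
```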

spark-submit --class "SimpleApp" \
--master yarn \
--executor-memory 512m \
--total-executor-cores 1 \
simple-scala_2.11-1.0.jar yarn-client /user/root/simpleappoutput