How to Write Unit Test cases for Spark and Scala

apache-spark

#1

Hi, I am trying to write Spark and Scala test cases with Maven.
Please find the code below:

import org.scalatest.FunSuite
import org.scalatest.BeforeAndAfterEach
import org.apache.spark.sql.SparkSession

class WordCountTest extends FunSuite with SparkSessionObject with BeforeAndAfterEach {

  var spark: SparkSession = _

  override def beforeEach(): Unit = {
    spark = SparkSession.builder().appName("Test").master("local").getOrCreate()
  }

  test("Testing RDD") {
    val data = spark.sparkContext.textFile("Inputs/inputFile2.txt")
    assert(data.count() != 0)
  }

  override def afterEach(): Unit = {
    spark.stop()
  }
}

When I run this as a JUnit test in Scala IDE, it throws "No tests found with test runner 'Scala JUnit 5'".

Could anyone help me sort this out?
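For anyone hitting the same error: JUnit-based runners (such as the one in Scala IDE) do not discover plain ScalaTest suites on their own; the suite usually needs to be annotated with ScalaTest's `JUnitRunner`. A minimal sketch (assuming `scalatest` and `junit` are on the Maven test classpath; the class name here is illustrative):

```scala
import org.junit.runner.RunWith
import org.scalatest.FunSuite
import org.scalatest.junit.JUnitRunner

// The @RunWith annotation is what allows a JUnit runner to
// discover and execute this ScalaTest suite.
@RunWith(classOf[JUnitRunner])
class DiscoveryTest extends FunSuite {
  test("sanity check") {
    assert(1 + 1 == 2)
  }
}
```

Note that the annotation lives on the suite class itself, so each ScalaTest class you want to run through JUnit needs it. Running the suite via the "JUnit 4" launcher (rather than "JUnit 5") is also worth checking, since `JUnitRunner` is a JUnit 4 runner.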

