Find Wordcount using Accumulator

apache-spark
scala

#1

Hi All,

I was trying to find the word count using an accumulator. Is my approach correct or not?
val wordcount = sc.accumulator(0)
val mapfile = lines.flatMap(rec => rec.split(","))
                   .map(rec => (rec, 1))
                   .reduceByKey((x, y) => x + y)
                   .foreach(rec => wordcount += 1)
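Note that because the accumulator is incremented once per key *after* `reduceByKey`, the snippet counts distinct words, not total word occurrences. A minimal runnable sketch of both variants is below; it assumes a local `SparkContext` and a small hard-coded dataset in place of your `lines` RDD (swap in `sc.textFile(...)` for real input). The `sc.accumulator` call is the Spark 1.6 API; in Spark 2.x use `sc.longAccumulator` instead.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object AccumulatorWordCount {
  def main(args: Array[String]): Unit = {
    // Local context for illustration only; assumed setup, not your cluster config.
    val sc = new SparkContext(
      new SparkConf().setAppName("AccumulatorWordCount").setMaster("local[*]"))

    // Sample data standing in for your `lines` RDD.
    val lines = sc.parallelize(Seq("spark,scala,spark", "hadoop,scala"))

    // Total word count: increment once for EVERY word emitted by flatMap.
    val totalWords = sc.accumulator(0) // Spark 1.6 API; sc.longAccumulator in 2.x
    lines.flatMap(_.split(",")).foreach(_ => totalWords += 1)

    // Distinct word count: increment once per key after reduceByKey,
    // which is what the snippet in the question actually computes.
    val distinctWords = sc.accumulator(0)
    lines.flatMap(_.split(","))
         .map((_, 1))
         .reduceByKey(_ + _)
         .foreach(_ => distinctWords += 1)

    println(s"total = ${totalWords.value}, distinct = ${distinctWords.value}")
    sc.stop()
  }
}
```

One caveat worth knowing: accumulators updated inside actions like `foreach` are only guaranteed to be applied once per task on success, but speculative or retried tasks can double-count in older Spark versions, so accumulators are best treated as a debugging aid rather than the primary way to compute a word count (`rdd.count()` on the reduced RDD is the usual approach).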

Thanks & Regards,
Balaji
