Scala: Spark Error

apache-spark
spark-shell
#1

scala> val lines = sc.parallelize(List(1, 2, 3))
lines: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[3] at parallelize at <console>:29

scala> val linelengths = lines.map(s => s.length)
<console>:31: error: value length is not a member of Int
       val linelengths = lines.map(s => s.length)
                                          ^

scala>

I am new to Spark and Scala.
Could anyone please help me understand what this error is all about?
Regards,
Pooja


#2

Please follow this link to learn Scala and Spark. What you are doing is not correct.


#3

@poojakurwai 1. You have created a list of type Int.
2. You are trying to get the length of each value. However, in Scala the length method is defined on String, not on Int, so the error occurs because you are calling length on an Int. Try creating a list of Strings and then computing the lengths.
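For what it's worth, here is a minimal sketch (the variable names are just examples) contrasting the two cases: calling length on an RDD[String], which works, and applying a numeric transformation to an RDD[Int] like the one in the original post. It assumes sc is the SparkContext that spark-shell provides.

// length is defined on String, so map it over an RDD[String]
val words = sc.parallelize(List("spark", "scala", "rdd"))
val wordLengths = words.map(s => s.length)   // RDD[Int] of string lengths
wordLengths.collect()                        // Array(5, 5, 3)

// For an RDD[Int], use a numeric transformation instead of length
val numbers = sc.parallelize(List(1, 2, 3))
val doubled = numbers.map(n => n * 2)        // RDD[Int]
doubled.collect()                            // Array(2, 4, 6)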
