RDD not found: Weird Error

bigdatalabs

#1

scala> val ordersItems = sc.textFile("/public/retail_db/order_items")
17/11/13 00:28:54 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 336.5 KB, free 701.3 KB)
17/11/13 00:28:54 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 28.3 KB, free 729.6 KB)
17/11/13 00:28:54 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 172.16.1.100:48240 (size: 28.3 KB, free: 511.1 MB)
17/11/13 00:28:54 INFO SparkContext: Created broadcast 1 from textFile at <console>:27
ordersItems: org.apache.spark.rdd.RDD[String] = /public/retail_db/order_items MapPartitionsRDD[3] at textFile at <console>:27

scala> orderItems
<console>:26: error: not found: value orderItems
orderItems

I am trying to read the RDD back. Even though it is created successfully, referencing it gives an error.
Does anyone have any idea?


#2

It is not a weird error; there is a typo. The RDD was defined as ordersItems but referenced as orderItems. You just need to be careful with variable names.
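
For reference, here is a minimal sketch of the corrected session (assuming the same spark-shell session and file path as in the original post); the only change is referring to the RDD by the exact name it was defined with:

scala> val ordersItems = sc.textFile("/public/retail_db/order_items")
scala> ordersItems.take(5).foreach(println)   // works: the name matches the definition above
scala> orderItems                              // fails: no value named orderItems was ever defined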


#3

Sorry for that… yes :frowning: