Spark joins are not working

apache-spark

#1

scala> val ordersjoin = ordersMap.join(orderitemsMap)
ordersjoin: org.apache.spark.rdd.RDD[(Int, (String, Double))] = MapPartitionsRDD[66] at join at <console>:37

ordersjoin.count returns 0, even though both input RDDs are non-empty:

ordersMap.count returns 30455
orderitemsMap.count returns 172198

I have also tried rightOuterJoin, leftOuterJoin, and fullOuterJoin :frowning_face:

Can someone please help me with this?
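
As a diagnostic, assuming ordersMap and orderitemsMap have the types the REPL output shows (RDD[(Int, String)] and RDD[(Int, Double)]), a join can only come back empty if the two sides share no keys, so a quick sketch to check in the spark-shell:

// Peek at a few keys from each side to see whether they look comparable at all.
ordersMap.keys.take(5).foreach(println)
orderitemsMap.keys.take(5).foreach(println)

// Count how many distinct keys the two RDDs have in common.
// If this prints 0, the RDDs are keyed on different columns.
val commonKeys = ordersMap.keys.distinct.intersection(orderitemsMap.keys.distinct)
println(s"common keys: ${commonKeys.count}")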


#2

Can you share the code that generates ordersMap and orderitemsMap?
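
For reference, a hypothetical sketch of how these two pair RDDs are often built from the retail_db orders and order_items files (the paths and column positions below are assumptions, not the original poster's code). Keying order_items by the wrong column is the usual reason such a join comes back empty:

// Hypothetical paths and column layout, assuming the retail_db sample dataset.
val orders = sc.textFile("/data/retail_db/orders")
val orderItems = sc.textFile("/data/retail_db/order_items")

// Key orders by order_id (field 0); keep the order status as the value.
val ordersMap = orders.map { line =>
  val f = line.split(",")
  (f(0).toInt, f(3))
}

// Key order_items by order_item_order_id (field 1), NOT order_item_id (field 0);
// keying by field 0 is a common mistake that leaves the join with no matching keys.
val orderitemsMap = orderItems.map { line =>
  val f = line.split(",")
  (f(1).toInt, f(4).toDouble)
}

ordersMap.join(orderitemsMap).take(5).foreach(println)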