Data Frames: Wrong Data Displaying

Hello all,

While running the script below I ran into a memory leak issue. Can anyone please help me resolve it?

joinedDF.groupBy(from_unixtime(col("order_date") / 1000), col("order_status")).agg(round(sum(col("order_item_subtotal")), 2), countDistinct(col("order_id"))).show

17/06/07 08:26:20 WARN memory.TaskMemoryManager: leak 8.3 MB memory from org.apache.spark.unsafe.map.BytesToBytesMap@428bdb0c
17/06/07 08:26:20 ERROR executor.Executor: Managed memory leak detected; size = 8650752 bytes, TID = 1439

Your code works fine; I did not get any error. But do you know the reason for wrapping the column in col() and dividing it by 1000?
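On the division by 1000: this is presumably because order_date is stored as an epoch timestamp in milliseconds, while Spark's from_unixtime expects seconds, so the value must be divided by 1000 before conversion; col() simply builds a Column reference so the arithmetic is applied per row. A minimal Python sketch of the same unit conversion, using a hypothetical sample timestamp:

```python
from datetime import datetime, timezone

# Hypothetical order_date value stored as epoch MILLISECONDS
order_date_ms = 1374735600000

# from_unixtime expects SECONDS since the epoch, hence the / 1000
order_date_s = order_date_ms / 1000

# Equivalent of from_unixtime: milliseconds -> seconds -> timestamp string
print(datetime.fromtimestamp(order_date_s, tz=timezone.utc)
      .strftime("%Y-%m-%d %H:%M:%S"))
```

Without the division, from_unixtime would interpret the millisecond value as seconds and produce a date tens of thousands of years in the future, which is likely the "wrong data" referred to in the title.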