Type cast DataFrame columns dynamically in Scala



I have a DataFrame in which every column is a string, as shown by `df.show()` below:

| Column  | value      | TypeCast      |
|---------|------------|---------------|
| Amount1 | 100.8945   | numeric(10,3) |
| date    | 2099-12-31 | timestamp     |
| amount2 | 10.48416   | numeric(15,4) |
| amount3 | 12103306   | float         |

I need to cast each `df("value")` according to the data type given in `df("TypeCast")` for that row. For example, 100.8945 should be cast as numeric(10,3), i.e. 100.894, and so on. Please help me here.
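One possible approach, sketched below under a few assumptions: the distinct `TypeCast` values are few enough to collect to the driver, and `numeric(p,s)` is taken to correspond to Spark SQL's `decimal(p,s)` type (Spark has no `numeric` type). Because a single DataFrame column must have one type, the per-row cast result is converted back to a string; the function name `castByTypeColumn` and the output column `casted_value` are made up for illustration.

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions._

// Sketch: build a chained when/otherwise over the distinct TypeCast values,
// casting "value" to each target type and back to string so all rows fit
// in one column.
def castByTypeColumn(df: DataFrame): DataFrame = {
  // Collect the distinct target types (assumed to be a small set).
  val types = df.select("TypeCast").distinct.collect.map(_.getString(0))

  val castedCol = types.foldLeft(lit(null).cast("string")) { (acc, t) =>
    // Map "numeric(p,s)" to Spark's "decimal(p,s)"; other names
    // ("timestamp", "float", ...) are passed to cast() as-is.
    val sparkType = t.replace("numeric", "decimal")
    when(col("TypeCast") === t,
         col("value").cast(sparkType).cast("string")).otherwise(acc)
  }

  df.withColumn("casted_value", castedCol)
}
```

With the sample data above, the row `Amount1 / 100.8945 / numeric(10,3)` would yield a `casted_value` of the string form of `decimal(10,3)`, i.e. 100.894 (note Spark rounds half-up on decimal casts, rather than truncating). If a `TypeCast` value is not a valid Spark type name, `cast` will fail at analysis time, so the type strings may need validation first.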
