Type cast DataFrame columns dynamically in Scala

scala
dataframes
spark

#1

All,

I have a DataFrame in which all columns are strings, as shown by `df.show()` below:

| Column  | value      | TypeCast      |
|---------|------------|---------------|
| Amount1 | 100.8945   | numeric(10,3) |
| date    | 2099-12-31 | timestamp     |
| amount2 | 10.48416   | numeric(15,4) |
| amount3 | 12103306   | float         |

I need to cast `df("value")` according to the data type given in `df("TypeCast")` for that row. For example, 100.8945 should be cast to numeric(10,3), i.e. 100.894, and so on. Please help me here.
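One possible approach, sketched below under a few assumptions: since `Column.cast` needs a literal type (not another column), you can collect the distinct `TypeCast` values and build a chained `when`/`otherwise` expression, one branch per type. The example maps `numeric(p,s)` to Spark's `decimal(p,s)` spelling, and casts each result back to string so all branches share one column type; the object and column names here are illustrative, not from the original post.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DynamicCastSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("DynamicCastSketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Sample data matching the table in the question.
    val df = Seq(
      ("Amount1", "100.8945",   "numeric(10,3)"),
      ("date",    "2099-12-31", "timestamp"),
      ("amount2", "10.48416",   "numeric(15,4)"),
      ("amount3", "12103306",   "float")
    ).toDF("Column", "value", "TypeCast")

    // Spark spells fixed-precision numerics as decimal(p,s).
    def toSparkType(t: String): String = t.replace("numeric", "decimal")

    // Collect the distinct target types, then fold them into one
    // CASE WHEN ... expression: for each type, cast value to that type
    // and back to string so every branch yields the same column type.
    val types = df.select("TypeCast").distinct.as[String].collect()
    val castedCol = types.foldLeft(lit(null).cast("string")) { (acc, t) =>
      when(col("TypeCast") === t,
        col("value").cast(toSparkType(t)).cast("string")
      ).otherwise(acc)
    }

    df.withColumn("casted", castedCol).show(false)
    spark.stop()
  }
}
```

With this sketch, `100.8945` under `numeric(10,3)` comes back as the rounded decimal, the timestamp row is parsed as a timestamp, and any unrecognized type string leaves `casted` null. If the set of types is fixed and small, a hand-written `expr("CASE WHEN ... END")` would work equally well.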