How to run a Scala script in spark-shell


#1

Hi all,

I tried to run a Scala script with an args option, but spark-shell is not accepting the input args. Here are the ways I tried.

Code (note: `object abc()` is not valid Scala; an object is declared without parentheses):

object abc {
  def main(args: Array[String]): Unit = {
    println(args(0))
  }
}

Executed commands:

spark-shell -I abc.scala "Prasad" --> it throws an error
:load abc.scala, then abc.main("Prasad") --> it throws an error (main expects an Array[String], so the call needs to be abc.main(Array("Prasad")))

Those are the two approaches I tried. Could you please guide me on how to pass an argument and read it in my Scala script?
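For context, one pattern I have seen suggested is that spark-shell does not forward positional command-line arguments to a `:load`-ed script, so the value is passed some other way, e.g. through a JVM system property, and `main` is invoked explicitly with an `Array[String]`. A minimal sketch under that assumption (the property name `script.arg` is just an example, not a Spark-defined key):

```scala
// abc.scala -- load in spark-shell with `:load abc.scala`
object abc {
  def main(args: Array[String]): Unit = {
    // spark-shell does not pass positional args to a loaded script,
    // so fall back to a JVM system property when args is empty
    // (the property name "script.arg" is a made-up example)
    val name =
      if (args.nonEmpty) args(0)
      else sys.props.getOrElse("script.arg", "no-arg")
    println(name)
  }
}

// after :load, main must be called with an Array[String], not a bare String:
abc.main(Array("Prasad"))  // prints: Prasad
```

If the property route is used, it could be set when launching the shell, e.g. `spark-shell --driver-java-options "-Dscript.arg=Prasad"`, and then the script calls `abc.main(Array())`.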

Thanks,
Prasad