Refresh Impala metadata from Spark



In Spark, I am adding a partition to HDFS for a Hive external table.

I am able to refresh the Hive metadata so that the new partition is registered in the Hive metastore.
But how can I refresh the Impala metadata as well, i.e., how can I run `REFRESH tablename` or `INVALIDATE METADATA tablename` from Spark?
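For context, the Hive-side step the question describes can be sketched as follows. This is a hypothetical example: the table name, partition column, and HDFS path are placeholders, and in a real job the SQL string would be passed to `spark.sql()` on a Hive-enabled `SparkSession`.

```python
# Sketch of registering a freshly written HDFS partition with the Hive
# metastore from Spark SQL. All names below are placeholders.
partition_sql = (
    "ALTER TABLE mydb.mytable "
    "ADD IF NOT EXISTS PARTITION (dt='2020-01-01') "
    "LOCATION 'hdfs:///data/mytable/dt=2020-01-01'"
)

# With a Hive-enabled SparkSession this would be executed as:
#   spark.sql(partition_sql)
print(partition_sql)
```

After this statement runs, Hive sees the partition, but Impala's cached catalog does not, which is what prompts the question.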

Could you please let me know?


I don't think it's possible to run that command from Spark; it's better to run it in impala-shell first and then start using Spark. From spark-shell you can't run other shell commands. In the case of Hive there is no need to refresh metadata explicitly, because it reads the metastore directly by default.
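The impala-shell step suggested above can be sketched like this. This is a minimal sketch, assuming impala-shell is installed on an edge node; the host, port, and table name are placeholders, and the script only invokes the binary if it is actually on the PATH.

```python
import shutil
import subprocess

# Placeholder connection details -- adjust to your cluster.
IMPALAD_HOST = "impalad-host:21000"
TABLE = "mydb.mytable"

# impala-shell's -i flag selects the impalad to connect to,
# and -q runs a single query non-interactively.
cmd = ["impala-shell", "-i", IMPALAD_HOST, "-q", f"REFRESH {TABLE}"]

if shutil.which("impala-shell"):
    # On an Impala edge node, this refreshes the table's cached metadata.
    subprocess.run(cmd, check=True)
else:
    # On a plain Spark node, just show the command to run elsewhere.
    print("impala-shell not found; run on an Impala edge node:")
    print(" ".join(cmd))
```

`REFRESH` picks up new files and partitions for a known table; `INVALIDATE METADATA` is the heavier option for when the table itself was created or changed outside Impala.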