Spark Cache - Application scope

cloudera
apache-spark
scala
spark-sql

#1

We have a Spark Streaming application in which we need to cache/persist one static dataset.

This static data is pulled from Cassandra the first time; subsequent requests should not go to the database, they should be served from the cache.

However, this caching is not working: Spark is re-caching the data for every message, so executor memory keeps piling up.

Is there a better way to implement this caching? We need to put this data in application scope, so that it is available for the life of the SparkContext.
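One common way to get this "application scope" behavior is a driver-side singleton that loads and caches the DataFrame lazily, exactly once, and is then reused by every micro-batch. The sketch below assumes the Spark Cassandra Connector is on the classpath; the keyspace and table names (`my_keyspace`, `my_table`) are placeholders, not taken from the original post.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

// Sketch of an application-scoped cache for a static Cassandra table.
// Loaded once per driver JVM; reused across all streaming micro-batches.
object StaticData {
  @transient private var cached: DataFrame = _

  def get(spark: SparkSession): DataFrame = synchronized {
    if (cached == null) {
      cached = spark.read
        .format("org.apache.spark.sql.cassandra")          // Spark Cassandra Connector source
        .options(Map("keyspace" -> "my_keyspace",          // placeholder names
                     "table"    -> "my_table"))
        .load()
        .cache()
      cached.count() // force materialization so the cache is populated eagerly
    }
    cached
  }
}
```

Inside the streaming logic (e.g. in `foreachRDD` or a `foreachBatch` handler), call `StaticData.get(spark)` instead of reading from Cassandra per message; the read and `cache()` then happen only on the first batch, and later batches hit the in-memory copy.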

Thanks for your help,

Murthy