Where can I see Spark configuration properties?

Hi

I want to check the values of different Spark parameters in the config file, such as spark.local.dir, spark.storage.memoryFraction, and spark.shuffle.memoryFraction.

By default, Spark does not ship a config file with those properties set. You have to set them yourself in spark/conf/spark-defaults.conf (Spark provides a spark-defaults.conf.template in the conf directory that you can copy and rename), like below:

spark.yarn.queue jobs
spark.history.ui.port 18080
spark.eventLog.dir hdfs:///user/centos/spark-history
spark.eventLog.enabled true
spark.history.fs.logDirectory hdfs:///user/centos/spark-history
spark.history.provider org.apache.spark.deploy.history.FsHistoryProvider
spark.history.fs.cleaner.enabled true
spark.history.fs.cleaner.interval 1d
spark.history.fs.cleaner.maxAge 3d
spark.history.retainedApplications 10
spark.driver.memory 1024M
spark.executor.memory 1024M
spark.executor.instances 2
spark.executor.cores 2
spark.cassandra.input.consistency.level QUORUM
spark.cassandra.input.split.size_in_mb 1024
spark.yarn.maxAppAttempts 5
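
To confirm which of these values are actually in effect for a running application, you can read them back in spark-shell (or check the Environment tab of the Spark UI, which lists every property picked up from spark-defaults.conf). A minimal sketch, assuming a spark-shell session; the /tmp fallback is just an illustrative default, not a recommendation:

// List every explicitly set property as (key, value) pairs
spark.sparkContext.getConf.getAll.sorted.foreach { case (k, v) =>
  println(s"$k = $v")
}

// Look up a single property, with a fallback if it was never set
val localDir = spark.sparkContext.getConf.get("spark.local.dir", "/tmp")
println(s"spark.local.dir = $localDir")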

For more config properties, see the link below :smile:
https://spark.apache.org/docs/latest/configuration.html
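
If you only need different values for a single job, the same properties can also be set per application instead of globally, either with --conf key=value on spark-submit or programmatically on a SparkConf before the session is created. A rough sketch (the app name and values here are placeholders):

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Per-application settings; anything set here overrides spark-defaults.conf
val conf = new SparkConf()
  .setAppName("config-demo")               // placeholder name
  .set("spark.executor.memory", "1024m")
  .set("spark.executor.cores", "2")
  .set("spark.yarn.maxAppAttempts", "5")

val spark = SparkSession.builder().config(conf).getOrCreate()

Values set directly on the SparkConf take the highest precedence, followed by flags passed to spark-submit, then the entries in spark-defaults.conf.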