Spark-submit job not running with given executors

apache-spark


I am running a spark-submit job on Spark 1.6.2. My spark-submit command is below:

spark-submit --class Crime \
  --conf spark.ui.port=23145 \
  --packages com.databricks:spark-csv_2.10:1.5.0 \
  --num-executors 6 \
  --executor-cores 3 \
  --executor-memory 2G \
  projects_2.11-0.1.jar /user/nikkhiel123/project /user/nikkhiel123/projectoutput prod

However, the job does not run with 6 executors; it runs with the default 2 executors. If I run the same job with the command below, it does run with 6 executors:

spark-submit --class Crime \
  --conf spark.ui.port=23145 \
  --conf spark.executor.instances=6 \
  --conf spark.executor.cores=3 \
  --conf spark.executor.memory=2g \
  --packages com.databricks:spark-csv_2.10:1.5.0 \
  projects_2.11-0.1.jar /user/nikkhiel123/project /user/nikkhiel123/projectoutput prod
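For reference, spark-submit has a --verbose flag that prints the arguments it parsed and the final Spark properties before launching the application, which can show whether --num-executors was actually picked up. A sketch of how that might look (the --master yarn flag is an assumption here, since --num-executors only applies in YARN mode):

```shell
# Sketch, not from the original post: add --verbose to see the parsed
# arguments and resolved Spark properties in spark-submit's output.
# --master yarn is assumed; --num-executors has no effect outside YARN.
spark-submit --verbose \
  --class Crime \
  --master yarn \
  --num-executors 6 \
  --executor-cores 3 \
  --executor-memory 2G \
  projects_2.11-0.1.jar /user/nikkhiel123/project /user/nikkhiel123/projectoutput prod
```

In the verbose output, look for spark.executor.instances in the "Spark properties" section: if it is missing or 2, the flag was not applied as expected.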

Can someone kindly tell me where I am going wrong?

Thanks.