Spark version in LAB

I just registered myself for lab access.
I have two questions:

  1. In the lab, when I execute pyspark, the console says multiple versions are installed and version 1.6.3 gets initialized by default.
     How can I upgrade to Spark 2.4.6?
  2. According to the latest update for the exam, the cluster will be preloaded with Spark 2.4 and CDH6. Does that mean the Hadoop version should be 2.7?

Hi @monika_arora,

To launch Spark 2.x from the terminal, use the command below:

pyspark2 --master yarn --conf spark.ui.port=0

CDH 6.x ships with Hadoop 3.x, so the Hadoop version should be 3.x.
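If it helps, you can confirm which versions are actually active on the lab gateway. This is a sketch assuming the CDH Spark 2 parcel's `pyspark2` wrapper and the standard `hadoop` CLI are on your PATH:

```shell
# Print the Spark version that the pyspark2 wrapper launches
# (assumes the CDH Spark 2 parcel is installed on this node)
pyspark2 --version

# Print the Hadoop build; on CDH 6.x this reports a 3.x release
hadoop version
```

Running these before the exam removes any guesswork about which Spark and Hadoop builds the cluster is preloaded with.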


Hi @Shubham_Maurya1,
Thank you for your answer.

But Spark 2.4.6 doesn’t come with Hadoop 3.x.
The highest version it is prebuilt for is Hadoop 2.7.
I am using the below link to download