I just registered for lab access.
I have two questions:
- In the lab, when I execute pyspark, the console says multiple versions are installed and version 1.6.3 is initialized by default. How can I switch to Spark 2.4.6?
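A minimal sketch of what I have been trying, assuming a CDH-style install where the Spark 2 parcel ships separate "2"-suffixed entry points (pyspark2, spark2-submit) alongside the default Spark 1.6 commands; the exact command names on the exam cluster are an assumption:

```shell
# Check whether a Spark 2 shell is on the PATH (assumed name: pyspark2,
# as installed by the CDH Spark 2 parcel; may differ on the lab cluster)
if command -v pyspark2 >/dev/null 2>&1; then
  echo "Spark 2 shell available: pyspark2"
else
  echo "pyspark2 not on PATH; check whether the Spark 2 parcel/service is installed"
fi
```

If pyspark2 is present, running it instead of pyspark should start the 2.x shell without touching the default 1.6.3 install.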
- According to the latest exam update, the cluster will be preloaded with Spark 2.4 and CDH 6. Does that mean the Hadoop version should be 1.7?