Unable to access Hive and Pyspark



I am trying to connect to Hive and Pyspark using the web console but am getting the errors below.

[gunjanbarot@gw02 ~]$ hive
bash: hive: command not found…
[gunjanbarot@gw02 ~]$ pyspark
pyspark is not found, please check if spark is installed

I was able to access both this morning, but suddenly started getting these errors.


We are looking into the issue and will post an update once it is resolved.


@vinodnerella I can see a notice that the labs are back to normal, but I still can't run sqoop or spark. It says: bash: spark: command not found…


@Srikapardhi The issue has been resolved in the labs.

To launch spark-shell, use the command below, substituting an unused five-digit port:

spark-shell --master yarn --conf spark.ui.port=<Five_Digit_port>

For example:

spark-shell --master yarn --conf spark.ui.port=12335
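Since the original question was about pyspark, the same port override should work for the pyspark shell as well. This is a sketch assuming Spark is on the PATH and YARN is configured as the cluster manager; the port number is just an example and should be any unused five-digit port:

```shell
# Launch the PySpark shell on YARN with an explicit Spark UI port
# (12336 is an arbitrary example; pick any unused five-digit port)
pyspark --master yarn --conf spark.ui.port=12336
```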

Here is a sample sqoop command, and a useful resource for sqoop: https://www.youtube.com/watch?v=Q8hIVuRWbvk&list=PLf0swTFhTI8qZPHwjy4Hrh_NUlhLku5cx

sqoop eval \
  --connect "jdbc:mysql://ms.itversity.com/retail_db" \
  --username retail_user \
  --password itversity \
  --query "SELECT * FROM departments"