Submitting a Spark job with Apache Livy

apache-spark

#1

Hi, does anyone know how to submit a Spark job using the Apache Livy interface?
If you have any knowledge about Apache Livy, please share it. Thank you.


#2

@rpammidi.rao,

I have configured Hue & Jupyter notebooks with Spark through the Livy interface and triggered jobs that way, rather than through the Livy REST API directly. If you need any assistance, please refer to the blog link below:
https://hortonworks.com/blog/livy-a-rest-interface-for-apache-spark/


#3

Thanks for responding, Ravi. I know the above link, but I also need the configuration and code: how do you do the Spark job submission in real time? If you have any screenshots, please send them. Thank you.


#4

@rpammidi.rao,

In real time we don't need Livy for submitting Spark jobs; we have 'spark-submit' and the Oozie scheduler.
Livy is for the REST API and for Hue & Jupyter notebooks. I can share details of the Hue & Jupyter setup, as I've already done it, but for that you need administration rights on the cluster. Just let me know.
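
For comparison, here is a rough sketch of submitting the stock SparkPi example with spark-submit, wrapped in Python only so it can be scripted; the master, deploy mode, memory, and jar path are placeholders to adjust for your cluster:

    import subprocess

    # Rough spark-submit submission of the stock SparkPi example.
    # --master, --deploy-mode, memory, and the jar path are placeholders.
    subprocess.run(
        [
            "spark-submit",
            "--class", "org.apache.spark.examples.SparkPi",
            "--master", "yarn",
            "--deploy-mode", "cluster",
            "--executor-memory", "20g",
            "/path/to/examples.jar",
            "2000",
        ],
        check=True,
    )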


#5

Install the Livy server and configure it so that it points to the cluster.

Check out the examples:
https://livy.incubator.apache.org/examples/

  1. Form a JSON structure with the required job parameters:
    {
      "className": "org.apache.spark.examples.SparkPi",
      "executorMemory": "20g",
      "args": ["2000"],
      "file": "/path/to/examples.jar"
    }
  2. Specify the master and deploy mode in the livy.conf file (the livy.spark.master and livy.spark.deploy-mode settings).
  3. To submit the SparkPi application to the Livy server, issue a POST /batches request, as sketched in the Python example below.
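
To make step 3 concrete, here is a minimal Python sketch that submits the batch and polls it until it reaches a terminal state. It is illustrative only: the endpoint http://localhost:8998 (Livy's default port) is an assumption, and /path/to/examples.jar must be a path the Livy server can read.

    import json
    import time

    import requests

    LIVY_URL = "http://localhost:8998"  # assumed Livy endpoint; 8998 is the default port

    # Step 1: the job parameters as a JSON-serializable dict
    payload = {
        "className": "org.apache.spark.examples.SparkPi",
        "executorMemory": "20g",
        "args": ["2000"],
        "file": "/path/to/examples.jar",
    }

    # Step 3: submit the batch with POST /batches
    resp = requests.post(
        LIVY_URL + "/batches",
        data=json.dumps(payload),
        headers={"Content-Type": "application/json"},
    )
    resp.raise_for_status()
    batch_id = resp.json()["id"]
    print("Submitted batch", batch_id)

    # Poll GET /batches/{id} until the job finishes
    while True:
        state = requests.get(LIVY_URL + "/batches/" + str(batch_id)).json()["state"]
        print("Batch state:", state)
        if state in ("success", "dead", "killed"):
            break
        time.sleep(5)

If the job fails, a GET /batches/{batchId}/log request returns the driver log for debugging.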